I have a file containing a list of hosts, say /tmp/hostlist. Each host requires a username and password to log in, and in my bash script I use the expect command to log in to every host, run a command, and exit, like below:
#!/bin/bash

export file="/tmp/hostlist"
export cmd="hostname -f"

script=$(cat <<'END_OF_SCRIPT'
set timeout 120
set f $::env(file)
set fh [open $f r]
while {[gets $fh server_name] != -1} {
    spawn sudo -i -u rcuser ssh $server_name
    send "\r"
    expect "$server_name login:"
    send "root\r"
    expect "Password:"
    send "passwd\r"
    send "$::env(cmd)\r"
    send "exit\r"
    expect eof
}
close $fh
END_OF_SCRIPT
)

VAR=$(expect -c "$script")
echo "$VAR" > /tmp/outexp
I am able to handle parallel execution like this:
while IFS= read -r i
do
    (
        export server_name=`echo $i`
        echo "connecting to $i"
        expect -c "$script"
        echo "Job completed on $i"
        echo "-----------------------------------"
    ) >> "${log}_${i}" 2>&1 &
done < "$file"
wait

while IFS= read -r i
do
    cat "${log}_${i}"
    rm "${log}_${i}"
done < "$file" > "$log2"
wait
Is there a way to limit the number of hosts processed at a time? For example, if the file contains 1000 hosts, I would like to run them in batches of 100 at a time until all 1000 hosts are done.
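To illustrate the behaviour I am after, here is a rough sketch (reusing the same $script, $file and $log variables as above; the batch and count variables are just hypothetical names I made up): launch the background jobs as before, but wait after every 100 so that at most 100 hosts are being worked on at any moment.

batch=100    # example batch size from above
count=0
while IFS= read -r i
do
    (
        export server_name=`echo $i`
        echo "connecting to $i"
        expect -c "$script"
        echo "Job completed on $i"
    ) >> "${log}_${i}" 2>&1 &
    count=$((count + 1))
    # once a full batch of jobs has been launched, wait for all of them
    if [ "$count" -ge "$batch" ]; then
        wait
        count=0
    fi
done < "$file"
wait    # wait for the final, possibly partial, batch

I realise this waits for the slowest host in each batch before starting the next one; if there is a way to keep a steady 100 running (for example with wait -n on bash 4.3+, or some other tool), that would be even better.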