Parallelism

I've written a simple bit of BASH for running jobs in parallel. It's not a replacement for GNU Parallel (which has far more features!), but works well enough for simple jobs or butchering into another script.

Setup

Set defaults, parse command line arguments and show usage.

ITERATIONS=20
WORKERS=10

USAGE="$0 [--iterations num] [--workers num] COMMAND"

while [ $# -gt 0 ]; do
    case "$1" in
    (--iterations) ITERATIONS=$2; shift;;
    (--workers) WORKERS=$2; shift;;
    (--help) echo $USAGE; exit 0;;
    (--) shift; break;;
    (*)  break;;
    esac
    shift
done

WORKER_COMMAND=${@-"sleep 2"}

MAX_WORKERS_FILE=$(mktemp -t parallel.workers.XXXXX) || exit 1
echo $WORKERS > $MAX_WORKERS_FILE

echo "Starting $ITERATIONS iterations of '$WORKER_COMMAND' with $WORKERS parallel workers."
echo "Workers can be altered at runtime by editing $MAX_WORKERS_FILE"
echo "Press any key to continue..."
read -n 1 c

Core

Worker orchestration relies on the jobs command returning the number of child processes spawned in the background by the parent script.
for ((i=1; i<=$ITERATIONS; i++)); do
    while (true); do
        MAX_WORKERS=$(cat $MAX_WORKERS_FILE)
        WORKERS=$(jobs -r | wc -l)
        echo -e "\e[91mWorkers: $WORKERS / $MAX_WORKERS\e[39m"
        [[ $WORKERS -lt $MAX_WORKERS]] && break || sleep 1
    done
    echo "$i: $WORKER_COMMAND"
    eval "$WORKER_COMMAND &"
done

Tidy Up

Remove the dynamic worker configuration file and exit with a nice summary.

unlink $MAX_WORKERS_FILE
echo -e "Completed $((i-1)) iterations of '$WORKER_COMMAND'"
exit 0

Examples

Example usage:

# ./parallel.sh --iterations 100 --workers 20 sleep 1

To test with random intervals, create a script to sleep for between 2 and 10 seconds:

# echo -e '#!/bin/bash\nsleep $((2 + (RANDOM%9)))' > /tmp/testParallel.sh

Then use parallel.sh to invoke it 200 times with 50 workers:

# ./parallel.sh --iterations 200 --workers 50 /tmp/testParallel.sh

Code on GitHub