Add parallel execution for bash script #64
Conversation
did you test this, and the generate steps don't walk over each other?
Running that right now. The only issue I found is that the minecraft generate needs to run first, because neoforge and forge check that the mc version exists.
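The ordering constraint described above can be sketched like this; `generate` here is a hypothetical stand-in for the script's real generate commands, which will differ:

```shell
#!/bin/bash
# Sketch of the ordering constraint; "generate" is a hypothetical
# placeholder for the script's actual generate steps.
generate() {
    echo "generating $1"
    echo "$1 done"
}

generate minecraft      # must finish first: forge and neoforge
                        # check that the mc versions already exist
generate forge &        # the remaining generators are independent,
generate neoforge &     # so they can run in the background
wait                    # block until both parallel jobs finish
echo "all generate steps finished"
```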
something about wait doesn't work right now |
Signed-off-by: Trial97 <alexandru.tripon97@gmail.com>
perhaps it needs to wait on specific PIDs?

```bash
#!/bin/bash

# Track exit status of multiple processes
declare -A process_status

run_and_track() {
    local name=$1
    shift
    "$@" &
    process_status[$!]="$name"
}

# Start processes
run_and_track "Task 1 (sleep 2)" sleep 2
run_and_track "Task 2 (sleep 1)" sleep 1
run_and_track "Task 3 (sleep 3)" sleep 3

echo "Background processes started: ${!process_status[@]}"

# Collect results
for pid in "${!process_status[@]}"; do
    wait "$pid"
    status=$?
    echo "${process_status[$pid]} (PID $pid) exited with status: $status"
done

process_status=()
echo "All processes complete"
```
Nah, this PR actually works: the issue was the neoforge cache stuff, and I marked it as draft so it wouldn't make debugging harder.
And right now the only issue I have with it is that I can't Ctrl+C it.
tracking the PIDs should let you trap the signal and kill them
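A minimal sketch of that idea, reusing the `run_and_track`/`process_status` pattern from the snippet above (the task names and `sleep` commands are placeholders, not the script's real steps):

```shell
#!/bin/bash
# Track background PIDs so a Ctrl+C (SIGINT) can be forwarded to them.
declare -A process_status

run_and_track() {
    local name=$1
    shift
    "$@" &
    process_status[$!]="$name"
}

# On SIGINT/SIGTERM, kill every tracked child before exiting.
cleanup() {
    echo "interrupted, killing: ${!process_status[@]}"
    kill "${!process_status[@]}" 2>/dev/null
    exit 130
}
trap cleanup INT TERM

run_and_track "Task 1" sleep 1
run_and_track "Task 2" sleep 1

for pid in "${!process_status[@]}"; do
    wait "$pid"
done
echo "all tasks finished"
```

The `trap` runs `cleanup` when the script itself receives the signal, so the children don't keep running after the parent exits.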
It did not work that way in my testing. But that's enough meta for today; I will give it a shot another time.
Maybe this would be reason enough to rewrite the script in Python? That way we could just use a single multiprocessing Pool for all update steps, as well as a Pool for all generate steps.
If we make forge and neoforge concurrent, why not do the same for the bash script?