
🧹 Improve distfiles fetch logic to support curl/fetch in addition to wget #288

Open
segin wants to merge 1 commit into master from fix-distfiles-fetching-5611211459002387432

Conversation

@segin
Owner

@segin segin commented Apr 8, 2026

🎯 What: Refactored the distfiles_fetch routine in util/build-deps.sh to check for and utilize multiple download commands (curl, wget, fetch) instead of failing when wget is absent.
💡 Why: Enhances script portability across diverse environments (e.g., minimal Linux containers, macOS, BSD variants) where curl or fetch might be the default downloader.
✅ Verification: Passed ./util/build-deps.sh download phase tests simulating missing utilities, and passed the project's overall test suite.
✨ Result: build-deps.sh now robustly determines the best available download tool to fetch missing dependencies.


PR created automatically by Jules for task 5611211459002387432 started by @segin

…allback to curl and fetch when downloading dependencies in build-deps.sh if wget is unavailable. This improves the robustness and portability of the fetching system.

Co-authored-by: segin <480709+segin@users.noreply.github.com>
@google-labs-jules
Contributor

👋 Jules, reporting for duty! I'm here to lend a hand with this pull request.

When you start a review, I'll add a 👀 emoji to each comment to let you know I've read it. I'll focus on feedback directed at me and will do my best to stay out of conversations between you and other bots or reviewers to keep the noise down.

I'll push a commit with your requested changes shortly after. Please note there might be a delay between these steps, but rest assured I'm on the job!

For more direct control, you can switch me to Reactive Mode. When this mode is on, I will only act on comments where you specifically mention me with @jules. You can find this option in the Pull Request section of your global Jules UI settings. You can always switch back!

New to Jules? Learn more at jules.google/docs.


For security, I will only act on instructions from the user who triggered this task.

Copilot AI review requested due to automatic review settings April 8, 2026 23:57

Copilot AI left a comment


Pull request overview

This PR refactors util/build-deps.sh’s distfiles_fetch routine to be more portable by selecting an available downloader (curl, wget, or fetch) instead of assuming wget exists.

Changes:

  • Added downloader selection logic using command -v checks for curl, wget, and fetch.
  • Implemented curl/wget/fetch-specific download commands, with a clear failure path if none are available.
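The `command -v` selection pattern described in the changes above can be sketched as a small POSIX sh helper. This is a sketch only; the helper name `first_available` is hypothetical and does not appear in the PR, which inlines the checks in an if/elif chain instead:

```shell
#!/bin/sh
# Sketch of downloader selection via `command -v` checks.
# The helper name `first_available` is hypothetical, not part of the PR.
first_available() {
    for tool in "$@"; do
        if command -v "$tool" >/dev/null 2>&1; then
            printf '%s\n' "$tool"
            return 0
        fi
    done
    return 1
}

# Pick the first downloader present on this system, in preference order.
first_available curl wget fetch || echo "no downloader found"
```

The preference order mirrors the if/elif chain in the patch: curl first, then wget, then fetch, with a failure path when none is installed.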


Comment thread util/build-deps.sh
elif command -v wget >/dev/null 2>&1; then
wget -c -O ${distfile} ${DISTS[$package]}
elif command -v fetch >/dev/null 2>&1; then
fetch -o ${distfile} ${DISTS[$package]}

Copilot AI Apr 8, 2026


The comment above says the download should use a continue/resume option ("-c"), but the fetch branch currently downloads without a resume flag. On BSD/macOS with large distfiles this can force a full re-download after an interrupted fetch. Consider using fetch's equivalent continue option (e.g., -C where supported) to match the intended behavior of the other branches.

Suggested change
fetch -o ${distfile} ${DISTS[$package]}
fetch -C -o ${distfile} ${DISTS[$package]}

Copilot uses AI. Check for mistakes.
Comment thread util/build-deps.sh
Comment on lines +175 to +179
curl -f -L -C - -o ${distfile} ${DISTS[$package]}
elif command -v wget >/dev/null 2>&1; then
wget -c -O ${distfile} ${DISTS[$package]}
elif command -v fetch >/dev/null 2>&1; then
fetch -o ${distfile} ${DISTS[$package]}

Copilot AI Apr 8, 2026


The downloader invocations pass ${distfile} and ${DISTS[$package]} unquoted. If ${DISTDIR}/${HOME} contains spaces or glob characters, this can break downloads or write to an unexpected path. Quoting these parameters (and generally avoiding word-splitting) would make the fetch logic more robust.
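The word-splitting hazard this comment describes can be seen with a minimal POSIX sh example (the path below is made up for illustration):

```shell
#!/bin/sh
# Demonstrates why unquoted expansions break on paths with spaces.
# The path is hypothetical, chosen only to contain a space.
distfile="dist dir/foo.tar.gz"

set -- ${distfile}      # unquoted: field-splits into two arguments
echo "unquoted: $# args"

set -- "${distfile}"    # quoted: remains a single argument
echo "quoted: $# args"
```

With the default IFS, the unquoted expansion yields two arguments ("dist" and "dir/foo.tar.gz"), so a downloader invoked this way would write to the wrong path or treat the second word as an extra URL.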


@gemini-code-assist gemini-code-assist Bot left a comment


Code Review

The pull request enhances the distfiles_fetch function in util/build-deps.sh by implementing a more robust file downloading mechanism that sequentially attempts to use curl, wget, or fetch, exiting if no suitable tool is found. The review suggests improving this logic by double-quoting variables in the download commands for better robustness and adding the -r flag to the fetch command to enable resuming interrupted downloads, ensuring consistent behavior across all download tools.

Comment thread util/build-deps.sh
Comment on lines +174 to +183
if command -v curl >/dev/null 2>&1; then
curl -f -L -C - -o ${distfile} ${DISTS[$package]}
elif command -v wget >/dev/null 2>&1; then
wget -c -O ${distfile} ${DISTS[$package]}
elif command -v fetch >/dev/null 2>&1; then
fetch -o ${distfile} ${DISTS[$package]}
else
notice "No suitable download tool (curl, wget, or fetch) found! Bailing!"
exit 1
fi

Severity: medium

To improve the robustness and consistency of the download logic, I recommend the following improvements:

  1. Quoting: Double-quote ${distfile} and ${DISTS[$package]} to prevent issues with word splitting or globbing if paths or URLs contain spaces. This is a standard best practice for shell scripts to ensure robustness across different environments.
  2. Fetch Resume: Add the -r flag to the fetch command. This enables resuming interrupted downloads, matching the behavior of curl -C - and wget -c as intended by the comment on line 172.
Suggested change
if command -v curl >/dev/null 2>&1; then
curl -f -L -C - -o ${distfile} ${DISTS[$package]}
elif command -v wget >/dev/null 2>&1; then
wget -c -O ${distfile} ${DISTS[$package]}
elif command -v fetch >/dev/null 2>&1; then
fetch -o ${distfile} ${DISTS[$package]}
else
notice "No suitable download tool (curl, wget, or fetch) found! Bailing!"
exit 1
fi
if command -v curl >/dev/null 2>&1; then
curl -f -L -C - -o "${distfile}" "${DISTS[$package]}"
elif command -v wget >/dev/null 2>&1; then
wget -c -O "${distfile}" "${DISTS[$package]}"
elif command -v fetch >/dev/null 2>&1; then
fetch -r -o "${distfile}" "${DISTS[$package]}"
else
notice "No suitable download tool (curl, wget, or fetch) found! Bailing!"
exit 1
fi

