🧹 Improve distfiles fetch logic to support curl/fetch in addition to wget#288
Conversation
…allback to curl and fetch when downloading dependencies in build-deps.sh if wget is unavailable. This improves the robustness and portability of the fetching system. Co-authored-by: segin <480709+segin@users.noreply.github.com>
👋 Jules, reporting for duty! I'm here to lend a hand with this pull request. When you start a review, I'll add a 👀 emoji to each comment to let you know I've read it. I'll focus on feedback directed at me and will do my best to stay out of conversations between you and other bots or reviewers to keep the noise down. I'll push a commit with your requested changes shortly after. Please note there might be a delay between these steps, but rest assured I'm on the job! For more direct control, you can switch me to Reactive Mode. When this mode is on, I will only act on comments where you specifically mention me. New to Jules? Learn more at jules.google/docs. For security, I will only act on instructions from the user who triggered this task.
Pull request overview
This PR refactors util/build-deps.sh’s distfiles_fetch routine to be more portable by selecting an available downloader (curl, wget, or fetch) instead of assuming wget exists.
Changes:
- Added downloader selection logic using `command -v` checks for `curl`, `wget`, and `fetch`.
- Implemented `curl`/`wget`/`fetch`-specific download commands, with a clear failure path if none are available.
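The selection pattern described above can be sketched as a small standalone helper. This is a hypothetical illustration of the `command -v` probing idea, not code from the PR; the function name `pick_downloader` is invented here.

```shell
#!/bin/sh
# Hypothetical sketch: return the first available downloader from a
# preference-ordered list, or fail if none is installed.
pick_downloader() {
    for tool in curl wget fetch; do
        if command -v "$tool" >/dev/null 2>&1; then
            printf '%s\n' "$tool"
            return 0
        fi
    done
    return 1
}

pick_downloader || echo "no downloader found"
```

`command -v` is the POSIX-specified way to test for a command's availability, which is why it works the same under bash, dash, and BSD `sh`.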
```sh
elif command -v wget >/dev/null 2>&1; then
    wget -c -O ${distfile} ${DISTS[$package]}
elif command -v fetch >/dev/null 2>&1; then
    fetch -o ${distfile} ${DISTS[$package]}
```
The comment above says the download should use a continue/resume option ("-c"), but the fetch branch currently downloads without a resume flag. On BSD/macOS with large distfiles this can force a full re-download after an interrupted fetch. Consider using fetch's resume option (`-r` on FreeBSD fetch) to match the intended behavior of the other branches.
```diff
-fetch -o ${distfile} ${DISTS[$package]}
+fetch -r -o ${distfile} ${DISTS[$package]}
```
```sh
    curl -f -L -C - -o ${distfile} ${DISTS[$package]}
elif command -v wget >/dev/null 2>&1; then
    wget -c -O ${distfile} ${DISTS[$package]}
elif command -v fetch >/dev/null 2>&1; then
    fetch -o ${distfile} ${DISTS[$package]}
```
The downloader invocations pass ${distfile} and ${DISTS[$package]} unquoted. If ${DISTDIR}/${HOME} contains spaces or glob characters, this can break downloads or write to an unexpected path. Quoting these parameters (and generally avoiding word-splitting) would make the fetch logic more robust.
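The word-splitting hazard this comment describes is easy to demonstrate. The following standalone sketch (the path is made up for illustration) shows how an unquoted expansion of a path containing a space splits into two arguments, while the quoted form stays intact:

```shell
#!/bin/sh
# A distfile path containing a space: unquoted expansion word-splits it,
# so a download command would receive two bogus arguments instead of one path.
distfile="/tmp/dist dir/foo.tar.gz"

set -- ${distfile}      # unquoted: splits on the space
unquoted_argc=$#

set -- "${distfile}"    # quoted: remains a single argument
quoted_argc=$#

echo "unquoted: $unquoted_argc args, quoted: $quoted_argc args"
```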
Code Review
The pull request enhances the distfiles_fetch function in util/build-deps.sh by implementing a more robust file downloading mechanism that sequentially attempts to use curl, wget, or fetch, exiting if no suitable tool is found. The review suggests improving this logic by double-quoting variables in the download commands for better robustness and adding the -r flag to the fetch command to enable resuming interrupted downloads, ensuring consistent behavior across all download tools.
```sh
if command -v curl >/dev/null 2>&1; then
    curl -f -L -C - -o ${distfile} ${DISTS[$package]}
elif command -v wget >/dev/null 2>&1; then
    wget -c -O ${distfile} ${DISTS[$package]}
elif command -v fetch >/dev/null 2>&1; then
    fetch -o ${distfile} ${DISTS[$package]}
else
    notice "No suitable download tool (curl, wget, or fetch) found! Bailing!"
    exit 1
fi
```
To improve the robustness and consistency of the download logic, I recommend the following improvements:
- Quoting: Double-quote `${distfile}` and `${DISTS[$package]}` to prevent issues with word splitting or globbing if paths or URLs contain spaces. This is a standard best practice for shell scripts to ensure robustness across different environments.
- Fetch Resume: Add the `-r` flag to the `fetch` command. This enables resuming interrupted downloads, matching the behavior of `curl -C -` and `wget -c` as intended by the comment on line 172.
```diff
 if command -v curl >/dev/null 2>&1; then
-    curl -f -L -C - -o ${distfile} ${DISTS[$package]}
+    curl -f -L -C - -o "${distfile}" "${DISTS[$package]}"
 elif command -v wget >/dev/null 2>&1; then
-    wget -c -O ${distfile} ${DISTS[$package]}
+    wget -c -O "${distfile}" "${DISTS[$package]}"
 elif command -v fetch >/dev/null 2>&1; then
-    fetch -o ${distfile} ${DISTS[$package]}
+    fetch -r -o "${distfile}" "${DISTS[$package]}"
 else
     notice "No suitable download tool (curl, wget, or fetch) found! Bailing!"
     exit 1
 fi
```
🎯 What: Refactored the `distfiles_fetch` routine in `util/build-deps.sh` to check for and utilize multiple download commands (`curl`, `wget`, `fetch`) instead of failing when `wget` is absent.
💡 Why: Enhances script portability across diverse environments (e.g., minimal Linux containers, macOS, BSD variants) where `curl` or `fetch` might be the default downloader.
✅ Verification: Passed `./util/build-deps.sh` download-phase tests simulating missing utilities, and passed the project's overall test suite.
✨ Result: `build-deps.sh` now robustly determines the best download tool available to fetch missing dependencies.

PR created automatically by Jules for task 5611211459002387432 started by @segin