🧹 Improve distfiles fetching system in util/build-deps.sh#285
… downloading distfiles. The new `fetch_url` function provides fallback logic to download files using `curl`, `wget`, or `fetch` (with appropriate flags) instead of hardcoding `wget`. It properly handles error returns and enables the script to run seamlessly on systems lacking `wget`. Co-authored-by: segin <480709+segin@users.noreply.github.com>
👋 Jules, reporting for duty! I'm here to lend a hand with this pull request. When you start a review, I'll add a 👀 emoji to each comment to let you know I've read it. I'll focus on feedback directed at me and will do my best to stay out of conversations between you and other bots or reviewers to keep the noise down. I'll push a commit with your requested changes shortly after. Please note there might be a delay between these steps, but rest assured I'm on the job! For more direct control, you can switch me to Reactive Mode. When this mode is on, I will only act on comments where you specifically mention me. New to Jules? Learn more at jules.google/docs. For security, I will only act on instructions from the user who triggered this task.
Code Review
This pull request introduces a `fetch_url` utility function to provide a more robust and flexible way to download dependency files using `curl`, `wget`, or `fetch`. It also updates the `distfiles_fetch` logic to use this helper and adds error handling for failed downloads. Feedback was provided to include the `-r` flag for the `fetch` command to ensure consistent support for resuming interrupted downloads across all supported tools.
```sh
    elif command -v wget >/dev/null 2>&1; then
        wget -c -O "$out" "$url"
    elif command -v fetch >/dev/null 2>&1; then
        fetch -o "$out" "$url"
```
The `fetch` command (commonly used on FreeBSD) is missing the flag to resume an interrupted download, which makes it inconsistent with the `curl` and `wget` implementations in this function. Adding the `-r` (or `--restart`) flag will ensure that partial downloads are resumed rather than overwritten, matching the behavior described in the comments and the implementation of the other tools.
Suggested change:

```diff
-        fetch -o "$out" "$url"
+        fetch -r -o "$out" "$url"
```
Pull request overview
Refactors the dependency distfile download logic in `util/build-deps.sh` by introducing a downloader wrapper function, aiming to reduce coupling to `wget` and improve failure handling when downloads can't be retrieved.
Changes:
- Added `fetch_url()` to download via `curl`, `wget`, or `fetch` depending on availability.
- Updated `distfiles_fetch()` to use `fetch_url()` and to fail fast when the download step fails.
```sh
    if command -v curl >/dev/null 2>&1; then
        curl -f -L -C - -o "$out" "$url"
    elif command -v wget >/dev/null 2>&1; then
        wget -c -O "$out" "$url"
    elif command -v fetch >/dev/null 2>&1; then
        fetch -o "$out" "$url"
    else
        notice "No download tool found. Please install curl, wget, or fetch."
        return 1
    fi
```
`fetch_url` selects the first available tool but does not actually fall back to the next tool if the chosen one fails (e.g., `curl` is installed but fails due to TLS/CA/config issues). If the intent is a real fallback mechanism, consider trying `wget`/`fetch` when `curl` exits non-zero (or adjust the PR description to clarify it's only a tool-selection preference).
Suggested change:

```diff
-    if command -v curl >/dev/null 2>&1; then
-        curl -f -L -C - -o "$out" "$url"
-    elif command -v wget >/dev/null 2>&1; then
-        wget -c -O "$out" "$url"
-    elif command -v fetch >/dev/null 2>&1; then
-        fetch -o "$out" "$url"
-    else
-        notice "No download tool found. Please install curl, wget, or fetch."
-        return 1
-    fi
+    local found_tool=0
+    local status=1
+    if command -v curl >/dev/null 2>&1; then
+        found_tool=1
+        curl -f -L -C - -o "$out" "$url" && return 0
+        status=$?
+    fi
+    if command -v wget >/dev/null 2>&1; then
+        found_tool=1
+        wget -c -O "$out" "$url" && return 0
+        status=$?
+    fi
+    if command -v fetch >/dev/null 2>&1; then
+        found_tool=1
+        fetch -o "$out" "$url" && return 0
+        status=$?
+    fi
+    if [ ${found_tool} -eq 0 ]; then
+        notice "No download tool found. Please install curl, wget, or fetch."
+        return 1
+    fi
+    return ${status}
```
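The `&& return 0` / `status=$?` pattern in this suggestion is what lets one tool's failure fall through to the next candidate. A minimal standalone sketch of that control flow, with hypothetical stand-in functions (`first_tool`, `second_tool`) rather than the real downloaders:

```shell
#!/bin/sh
# Hypothetical sketch of the fall-through pattern from the suggested change.
# first_tool fails (as curl might on a TLS error); second_tool succeeds.
first_tool()  { return 35; }  # stand-in for a failing downloader
second_tool() { return 0; }   # stand-in for a working downloader

try_all() {
    status=1
    # If the command succeeds, return immediately; otherwise record its
    # exit code and move on to the next candidate.
    first_tool && return 0
    status=$?
    second_tool && return 0
    status=$?
    return $status
}

if try_all; then
    result="succeeded via fallback"
else
    result="all tools failed with status $?"
fi
echo "$result"
```

Because `&&` short-circuits, a failing command skips the `return 0`, its exit code is still in `$?` for the `status=$?` line, and the function only reports the last failure when every candidate has been tried.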
```diff
@@ -171,7 +187,10 @@ distfiles_fetch () {
 # If the file exists but didn't pass check, or doesn't exist, we download it using -c (continue)
```
The comment says downloads use "-c (continue)", but the `fetch` branch in `fetch_url` doesn't request resume/continue behavior. Either update the comment to match reality, or add the appropriate resume flag for `fetch` (if supported) so behavior is consistent across download tools.
Suggested change:

```diff
-# If the file exists but didn't pass check, or doesn't exist, we download it using -c (continue)
+# If the file exists but didn't pass check, or doesn't exist, download it again.
```
🛠️ What: Introduced a `fetch_url` function to `util/build-deps.sh` and updated `distfiles_fetch` to use it.

💡 Why: The dependency fetching script was tightly coupled to `wget`. Using a fallback mechanism ensures broader compatibility across POSIX systems (e.g., macOS often defaults to `curl`, FreeBSD uses `fetch`). It also properly surfaces failures if a URL isn't found (e.g., HTTP 404).

✅ Verification: Tested the updated `build-deps.sh` locally with `zsh`, confirming it successfully attempts `curl` and accurately fails out if a 404 page is encountered instead of swallowing the error.

PR created automatically by Jules for task 4133156286357211395 started by @segin
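The fail-fast behavior described above hinges on `curl -f` (`--fail`), which makes `curl` exit non-zero on an HTTP error instead of saving the error page; exit code 22 is curl's "HTTP page not retrieved" status. A minimal sketch of how `distfiles_fetch` can detect that, using a hypothetical `download_step` stand-in rather than the real `fetch_url`:

```shell
#!/bin/sh
# Hypothetical sketch of the fail-fast check: download_step stands in for
# fetch_url, returning 22 the way `curl -f` does on an HTTP 404.
download_step() {
    return 22
}

if download_step; then
    result="fetched"
else
    # $? still holds the downloader's exit code here, so the caller can
    # report the failure instead of silently keeping a bad error page.
    result="download failed with status $?"
fi
echo "$result"
```

This is why the 404 case surfaces as a script failure rather than being swallowed: the non-zero exit propagates to the `if`, and the caller can abort before a checksum step ever sees an HTML error page.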