From 8d1b7eab1d700967c1173c5c99918aedc7ac1c0b Mon Sep 17 00:00:00 2001 From: Vaishnavi-Modi Date: Wed, 15 Apr 2026 17:41:40 -0400 Subject: [PATCH 01/11] refactor: simplify mapping schema and add guidance generation to onboard MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit component-repository-mappings.json: - Flatten schema: components now have a 'repos' array instead of nested 'repositories' object + 'container_to_repo_mapping' object - Containers move onto the repo that builds them (more logical) - Remove prose-only fields: branch_strategy, cve_fix_workflow, protected_branches, repository_type, monorepo_packages - Keep essential fields: url, type, default_branch, active_branches, containers, subcomponent (optional), build_location (optional) - File size reduced from ~30KB to ~20KB onboard.md: - Updated to use new simplified schema when adding components - Added Step 5: generate .cve-fix/examples.md for each repo by analyzing CVE PR history (titles, branches, files, co-upgrades, don'ts) — same approach as /guidance.generate --cve-only - Examples file included in the onboarding PR alongside mapping update Co-Authored-By: Claude Sonnet 4.6 (1M context) --- .../cve-fixer/.claude/commands/onboard.md | 253 ++-- .../component-repository-mappings.json | 1198 ++++------------- 2 files changed, 439 insertions(+), 1012 deletions(-) diff --git a/workflows/cve-fixer/.claude/commands/onboard.md b/workflows/cve-fixer/.claude/commands/onboard.md index c5d8a32..7b17342 100644 --- a/workflows/cve-fixer/.claude/commands/onboard.md +++ b/workflows/cve-fixer/.claude/commands/onboard.md @@ -2,10 +2,9 @@ ## Purpose -Guides a team through adding their component and repositories to -`component-repository-mappings.json` and opens a PR to the CVE fixer -workflow repository. The Jira component name must exactly match what is -used in Jira — this is validated against the Jira API during onboarding. +Guides a team through adding their component to `component-repository-mappings.json` +and generating `.cve-fix/examples.md` guidance files for each repo. Opens a single PR +to `ambient-code/workflows` containing both the mapping update and the guidance files. ## Process @@ -13,28 +12,22 @@ used in Jira — this is validated against the Jira API during onboarding. Ask the user for the following, one question at a time: - a. **Jira component name** — must match exactly what appears in Jira - (case-sensitive). Example: `"AI Evaluations"`, `"llm-d"`, `"AutoML"` + a. **Jira component name** — must match exactly what appears in Jira (case-sensitive). + Example: `"AI Evaluations"`, `"llm-d"`, `"AutoML"` - b. **Repos** — for each repo the user wants to add, collect: + b. **Repos** — for each repo: - GitHub URL (e.g. `https://github.com/org/repo`) - Repo type: `upstream`, `midstream`, or `downstream` - - Subcomponent name (optional — only needed if the component has multiple - distinct container chains, e.g. `"inference-scheduler"`, `"autoscaler"`) + - Container image names built from this repo (e.g. `rhoai/odh-container-rhel9`). + Leave empty if unknown — can be added later. + - Subcomponent name (optional — only if this component has multiple distinct + container chains, e.g. `"inference-scheduler"`, `"autoscaler"`) - c. **Container image names** (optional) — the `rhoai/odh-*-rhel9` container - images that map to each repo. These can be left empty and added later. - - Collect all repos before proceeding. Ask: "Do you have more repos to add? - (yes/no)" after each repo until the user is done. + Ask "Do you have more repos to add? (yes/no)" after each one. 2. **Validate Jira Component Name** - Use MCP if available (preferred), otherwise fall back to curl. - Follow the same MCP-first pattern as `cve.find`: - - Try `ToolSearch: select:mcp__mcp-atlassian__jira_search` first - - If found, use it to search: `component = "${COMPONENT_NAME}" AND labels = SecurityTracking` - - If not found, fall back to curl: + Use MCP if available (`select:mcp__mcp-atlassian__jira_search`), otherwise curl: ```bash JIRA_BASE_URL="https://redhat.atlassian.net" @@ -50,96 +43,145 @@ used in Jira — this is validated against the Jira API during onboarding. ISSUE_COUNT=$(echo "$RESULT" | jq '.issues | length') ``` - - If component returns results → confirmed, proceed - - If 0 results → warn user: "No CVE issues found for component - '${COMPONENT_NAME}' with SecurityTracking label. The component name - must match exactly what Jira uses. Do you want to proceed anyway? (yes/no)" - - If Jira credentials not available → skip validation and proceed with a note + - Results found → confirmed, proceed + - 0 results → warn: "No CVE issues found — component name must match Jira exactly. Proceed anyway? (yes/no)" + - Credentials unavailable → skip validation, proceed with a note 3. **Auto-discover Branch Information** - For each GitHub repo provided, fetch branch info automatically: + For each repo, fetch branch info from GitHub: ```bash - for REPO_URL in "${REPOS[@]}"; do + for REPO_URL in "${REPO_URLS[@]}"; do REPO_FULL=$(echo "$REPO_URL" | sed 's|https://github.com/||') - # Verify repo exists gh api repos/${REPO_FULL} --jq '.full_name' 2>/dev/null || { - echo "⚠️ Repo not found: ${REPO_URL}" - continue + echo "⚠️ Repo not found: ${REPO_URL}"; continue } - # Get default branch DEFAULT_BRANCH=$(gh api repos/${REPO_FULL} --jq '.default_branch') - # Get active release branches (rhoai-*, release/*, odh-*, stable) ACTIVE_BRANCHES=$(gh api repos/${REPO_FULL}/branches --paginate \ -q '.[].name' 2>/dev/null | \ grep -E '^(rhoai-[0-9]|release/|odh-[0-9]|stable)' | \ - sort -V | tail -5) # keep 5 most recent + sort -V | tail -5) - echo " ${REPO_FULL}: default=${DEFAULT_BRANCH}, active=[${ACTIVE_BRANCHES}]" + echo "${REPO_FULL}: default=${DEFAULT_BRANCH}, active=[${ACTIVE_BRANCHES}]" done ``` - Show the discovered info to the user and ask for confirmation or corrections. + Show discovered info and ask the user to confirm or correct. 4. **Build Mapping Entry** - Construct the JSON entry following the existing schema: + Construct the simplified JSON entry: ```json { - "": { - "container_to_repo_mapping": { - "": "" - }, - "repositories": { - "": { - "github_url": "https://github.com/", - "default_branch": "", - "active_release_branches": ["", ""], - "branch_strategy": "TBD — to be updated by component team", - "repo_type": "upstream|midstream|downstream", - "subcomponent": "", - "cve_fix_workflow": { - "primary_target": "", - "backport_targets": "" - } - } + "repos": [ + { + "url": "https://github.com/org/repo", + "type": "upstream|midstream|downstream", + "default_branch": "main", + "active_branches": ["rhoai-3.4"], + "containers": ["rhoai/odh-container-rhel9"], + "subcomponent": "optional-name" } - } + ] } ``` - - Omit `subcomponent` if the user didn't provide one - - Omit `container_to_repo_mapping` entries if no containers were provided - - Show the generated JSON to the user and ask: "Does this look correct? (yes/no/edit)" + - Omit `containers` if none provided + - Omit `subcomponent` if not needed + - Show the entry to the user: "Does this look correct? (yes/no/edit)" + +5. **Generate `.cve-fix/examples.md` Guidance** + + For each repo, analyze recent CVE fix PRs and generate a `.cve-fix/examples.md` + file that teaches the CVE fixer workflow how to create PRs matching this repo's + conventions. This follows the same approach as the `/guidance.generate --cve-only` + command. + + ```bash + for REPO_URL in "${REPO_URLS[@]}"; do + REPO_FULL=$(echo "$REPO_URL" | sed 's|https://github.com/||') + echo "Analyzing CVE PRs in ${REPO_FULL}..." + + # Fetch recent merged PRs and filter for CVE-related ones + CVE_PRS=$(gh pr list --repo "$REPO_FULL" --state merged --limit 100 \ + --json number,title,headRefName,body,files,mergedAt \ + --jq '[.[] | select( + (.title | test("CVE-[0-9]{4}-[0-9]+|GHSA-|[Ss]ecurity:|fix\\(cve\\)"; "i")) or + (.headRefName | test("fix/cve-|dependabot/|renovate/"; "i")) + )]' 2>/dev/null) + + CVE_COUNT=$(echo "$CVE_PRS" | jq 'length') + echo " Found ${CVE_COUNT} CVE-related merged PRs" + + # Also check recently closed PRs for rejection patterns + CLOSED_PRS=$(gh pr list --repo "$REPO_FULL" --state closed --limit 30 \ + --json number,title,headRefName,reviews \ + --jq '[.[] | select( + (.title | test("CVE-[0-9]{4}-[0-9]+|GHSA-|[Ss]ecurity:"; "i")) and + (.reviews | map(select(.state == "CHANGES_REQUESTED")) | length > 0) + )]' 2>/dev/null) + done + ``` + + Extract and synthesize patterns: + - **Title conventions**: what format does the repo use? + - **Branch naming**: what pattern are fix branches named? + - **Files changed together**: which files appear together in CVE fixes? + - **Co-upgrade patterns**: when package X is bumped, is Y also bumped? + - **PR description patterns**: what sections are consistently included? + - **Don'ts**: patterns from rejected/closed PRs + + Generate `.cve-fix/examples.md` for each repo: + + ```markdown + + + ## Titles + - (N/M merged PRs) + + ## Branches + - (N/M merged PRs) + + ## Files + - (N/M merged PRs) + + ## Co-upgrades + - When bumping X, also update Y (N/M merged PRs) + + ## PR Description + - + + ## Don'ts + - ❌ + ``` + + If fewer than 3 CVE PRs exist, include: -5. **Set Up Workflows Repository** + ```markdown + + ``` - The mapping file lives in `ambient-code/workflows`. Check write access and - fork if needed: +6. **Set Up Workflows Repository** ```bash WORKFLOWS_REPO="ambient-code/workflows" FORK_USER=$(gh api user --jq '.login' 2>/dev/null) - # Check write access PUSH_ACCESS=$(gh api repos/${WORKFLOWS_REPO} --jq '.permissions.push' 2>/dev/null) if [ "$PUSH_ACCESS" = "true" ]; then - # Clone directly git clone "https://github.com/${WORKFLOWS_REPO}.git" /tmp/workflows-onboard REMOTE="origin" PR_HEAD_PREFIX="" else - # Fork the repo - echo "No write access to ${WORKFLOWS_REPO} — forking..." + echo "No write access — forking ${WORKFLOWS_REPO}..." gh repo fork "$WORKFLOWS_REPO" --clone=false 2>/dev/null || true - - # Sync fork with upstream main gh repo sync "${FORK_USER}/workflows" --source "$WORKFLOWS_REPO" --branch main git clone "https://github.com/${FORK_USER}/workflows.git" /tmp/workflows-onboard cd /tmp/workflows-onboard @@ -149,27 +191,23 @@ used in Jira — this is validated against the Jira API during onboarding. fi ``` -6. **Apply the Mapping Change** +7. **Apply All Changes** ```bash cd /tmp/workflows-onboard - git checkout -b "onboard/${COMPONENT_NAME_SLUG}" + BRANCH_NAME="onboard/${COMPONENT_NAME_SLUG}" + git checkout -b "$BRANCH_NAME" MAPPING_FILE="workflows/cve-fixer/component-repository-mappings.json" - # Write the new component JSON to a temp file first to avoid shell injection - # (component names or JSON values may contain quotes, backslashes, or newlines) + # Add component to mapping file echo "$NEW_COMPONENT_JSON" > /tmp/new_component.json - TODAY=$(date +%Y-%m-%d) python3 - "$MAPPING_FILE" "$COMPONENT_NAME" /tmp/new_component.json "$TODAY" <<'PYEOF' import json, sys - mapping_file = sys.argv[1] - component_name = sys.argv[2] - new_component_file = sys.argv[3] - today = sys.argv[4] + mapping_file, component_name, new_component_file, today = sys.argv[1:] with open(mapping_file) as f: data = json.load(f) @@ -183,76 +221,77 @@ used in Jira — this is validated against the Jira API during onboarding. with open(mapping_file, "w") as f: json.dump(data, f, indent=2, ensure_ascii=False) f.write("\n") - - print(f"Added component: {component_name}") PYEOF rm -f /tmp/new_component.json - - # Validate JSON python3 -m json.tool "$MAPPING_FILE" > /dev/null && echo "✅ JSON valid" - git add "$MAPPING_FILE" + + # Add .cve-fix/examples.md for each repo + for i in "${!REPO_URLS[@]}"; do + REPO_FULL=$(echo "${REPO_URLS[$i]}" | sed 's|https://github.com/||') + EXAMPLES_DIR="workflows/cve-fixer/.cve-fix/$(echo "$REPO_FULL" | tr '/' '-')" + mkdir -p "$EXAMPLES_DIR" + echo "${GENERATED_EXAMPLES[$i]}" > "${EXAMPLES_DIR}/examples.md" + git add "${EXAMPLES_DIR}/examples.md" + done + git commit -m "feat: onboard ${COMPONENT_NAME} to CVE fixer workflow - Add component-to-repository mapping for ${COMPONENT_NAME}: - $(echo "${REPOS[@]}" | tr ' ' '\n' | sed 's/^/- /') + - Add ${COMPONENT_NAME} to component-repository-mappings.json + - Generate .cve-fix/examples.md guidance for each repo Co-Authored-By: Claude Sonnet 4.6 (1M context) " - git push "$REMOTE" "onboard/${COMPONENT_NAME_SLUG}" + + git push "$REMOTE" "$BRANCH_NAME" ``` -7. **Create Pull Request** +8. **Create Pull Request** ```bash gh pr create \ --repo "$WORKFLOWS_REPO" \ --base main \ - --head "${PR_HEAD_PREFIX}onboard/${COMPONENT_NAME_SLUG}" \ + --head "${PR_HEAD_PREFIX}${BRANCH_NAME}" \ --title "feat: onboard ${COMPONENT_NAME} to CVE fixer workflow" \ - --body "$(cat <" + 🤖 Generated by /onboard" ``` -8. **Cleanup** +9. **Cleanup** ```bash rm -rf /tmp/workflows-onboard ``` -## Usage Examples +## Usage ```bash -/onboard +/onboard # fully interactive ``` -The command is fully interactive — it will guide you through each question. - ## Notes -- The Jira component name is case-sensitive and must match exactly +- Jira component name is case-sensitive and must match exactly - Branch info is auto-discovered from GitHub — review and correct if needed -- Container image mappings can be added later by re-running `/onboard` or opening a PR directly -- If you don't have write access to `ambient-code/workflows`, the command will automatically fork the repo and open a PR from your fork +- Container image names can be added later by editing the mapping or re-running `/onboard` +- Generated `.cve-fix/examples.md` improves over time — run `/guidance.update` after more CVE PRs are merged +- Fork of `ambient-code/workflows` is created automatically if you lack write access diff --git a/workflows/cve-fixer/component-repository-mappings.json b/workflows/cve-fixer/component-repository-mappings.json index b2f8d56..4c90be2 100644 --- a/workflows/cve-fixer/component-repository-mappings.json +++ b/workflows/cve-fixer/component-repository-mappings.json @@ -1,1145 +1,555 @@ { "components": { - "AI Core Dashboard": { - "container_to_repo_mapping": { - "odh-dashboard-container": "opendatahub-io/odh-dashboard", - "rhoai/odh-dashboard-rhel8": "opendatahub-io/odh-dashboard", - "rhoai/odh-dashboard-rhel9": "opendatahub-io/odh-dashboard", - "rhoai/odh-mod-arch-gen-ai-rhel9": "opendatahub-io/odh-dashboard", - "rhoai/odh-mod-arch-model-registry-rhel9": "opendatahub-io/odh-dashboard", - "mod-arch-maas": "opendatahub-io/odh-dashboard" - }, - "repositories": { - "opendatahub-io/odh-dashboard": { - "github_url": "https://github.com/opendatahub-io/odh-dashboard", - "default_branch": "main", - "protected_branches": [ - "main", - "rhoai-release", - "odh-release" - ], - "active_release_branches": [ - "v2.29.0-fixes", - "v2.28.0-fixes", - "v2.27.0-fixes" - ], - "branch_strategy": "Fix in main → auto-propagates to stable → rhoai (every 2 hours). Manual cherry-pick to release branches during code freeze.", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "Active vX.X.X-fixes branches for released versions", - "automation": "Auto-sync every 2 hours (main → stable → rhoai)", - "manual_intervention": "Cherry-pick during code freeze or for patch releases" - }, - "repository_type": "monorepo", - "monorepo_packages": { - "packages/gen-ai": "Builds odh-mod-arch-gen-ai container", - "packages/model-registry": "Builds odh-mod-arch-modular-architecture container", - "packages/maas": "Builds mod-arch-maas container", - "packages/kserve": "KServe UI module", - "packages/model-serving": "Model serving UI module" - } - } - } - }, "Model as a Service": { - "container_to_repo_mapping": { - "rhoai/odh-maas-api-rhel9": "opendatahub-io/models-as-a-service" - }, - "repositories": { - "opendatahub-io/models-as-a-service": { - "github_url": "https://github.com/opendatahub-io/models-as-a-service", + "repos": [ + { + "url": "https://github.com/opendatahub-io/models-as-a-service", + "type": "midstream", "default_branch": "main", - "active_release_branches": [ + "active_branches": [ "stable", "rhoai", "v0.1.x" ], - "branch_strategy": "Fix in main. stable and rhoai are release snapshots — backport manually as needed. v0.1.x is a separate release branch with independent commits.", - "repo_type": "upstream", + "containers": [ + "rhoai/odh-maas-api-rhel9" + ], "subcomponent": "maas-api", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "stable, rhoai, v0.1.x (manual cherry-pick)" - }, "build_location": "maas-api/" }, - "red-hat-data-services/models-as-a-service": { - "github_url": "https://github.com/red-hat-data-services/models-as-a-service", + { + "url": "https://github.com/red-hat-data-services/models-as-a-service", + "type": "downstream", "default_branch": "main", - "active_release_branches": [ + "active_branches": [ "rhoai-3.3", "rhoai-3.4", "rhoai-3.4-ea.1", "rhoai-3.4-ea.2" ], - "branch_strategy": "Fork of upstream. RHOAI release branches follow pattern rhoai-X.Y.", - "repo_type": "downstream", "subcomponent": "maas-api", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "rhoai-3.3, rhoai-3.4, rhoai-3.4-ea.1, rhoai-3.4-ea.2" - }, "build_location": "maas-api/" } - } - }, - "Model Serving": { - "container_to_repo_mapping": { - "odh-modelmesh-runtime-adapter": "opendatahub-io/modelmesh-runtime-adapter", - "rhoai/odh-modelmesh-runtime-adapter-rhel8": "opendatahub-io/modelmesh-runtime-adapter", - "rhoai/odh-modelmesh-runtime-adapter-rhel9": "opendatahub-io/modelmesh-runtime-adapter", - "odh-model-controller": "opendatahub-io/odh-model-controller", - "odh-mm-rest-proxy": "opendatahub-io/odh-model-controller", - "rhoai/odh-model-controller-rhel8": "opendatahub-io/odh-model-controller", - "rhoai/odh-model-controller-rhel9": "opendatahub-io/odh-model-controller", - "rhoai/odh-kserve-controller-rhel9": "opendatahub-io/kserve", - "rhoai/odh-kserve-storage-initializer-rhel9": "opendatahub-io/kserve", - "rhoai/odh-kserve-agent-rhel9": "opendatahub-io/kserve-agent", - "rhoai/odh-kserve-router-rhel9": "opendatahub-io/kserve-router", - "rhoai/odh-llm-d-inference-scheduler-rhel9": "opendatahub-io/llm-d-inference-scheduler", - "rhoai/odh-modelmesh-serving-controller-rhel8": "opendatahub-io/modelmesh" - }, - "repositories": { - "opendatahub-io/modelmesh-runtime-adapter": { - "github_url": "https://github.com/opendatahub-io/modelmesh-runtime-adapter", - "default_branch": "main", - "protected_branches": [], - "active_release_branches": [], - "branch_strategy": "TBD - needs investigation", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "TBD", - "automation": "Unknown", - "manual_intervention": "Unknown" - } - }, - "opendatahub-io/odh-model-controller": { - "github_url": "https://github.com/opendatahub-io/odh-model-controller", - "default_branch": "main", - "protected_branches": [], - "active_release_branches": [], - "branch_strategy": "TBD - needs investigation", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "TBD", - "automation": "Unknown", - "manual_intervention": "Unknown" - } - }, - "opendatahub-io/kserve": { - "default_branch": "main", - "protected_branches": [], - "active_release_branches": [], - "branch_strategy": "TBD - needs investigation", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "TBD", - "automation": "Unknown", - "manual_intervention": "Unknown" - }, - "github_url": "https://github.com/opendatahub-io/kserve" - }, - "opendatahub-io/kserve-agent": { - "default_branch": "main", - "protected_branches": [], - "active_release_branches": [], - "branch_strategy": "TBD - needs investigation", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "TBD", - "automation": "Unknown", - "manual_intervention": "Unknown" - }, - "github_url": "https://github.com/opendatahub-io/kserve-agent" - }, - "opendatahub-io/kserve-router": { - "default_branch": "main", - "protected_branches": [], - "active_release_branches": [], - "branch_strategy": "TBD - needs investigation", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "TBD", - "automation": "Unknown", - "manual_intervention": "Unknown" - }, - "github_url": "https://github.com/opendatahub-io/kserve-router" - }, - "opendatahub-io/llm-d-inference-scheduler": { - "default_branch": "main", - "protected_branches": [], - "active_release_branches": [], - "branch_strategy": "TBD - needs investigation", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "TBD", - "automation": "Unknown", - "manual_intervention": "Unknown" - }, - "github_url": "https://github.com/opendatahub-io/llm-d-inference-scheduler" - }, - "opendatahub-io/modelmesh": { - "default_branch": "main", - "protected_branches": [], - "active_release_branches": [], - "branch_strategy": "TBD - needs investigation", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "TBD", - "automation": "Unknown", - "manual_intervention": "Unknown" - }, - "github_url": "https://github.com/opendatahub-io/modelmesh" - } - } - }, - "Notebooks Images": { - "container_to_repo_mapping": { - "rhoai/odh-pipeline-runtime-tensorflow-cuda-py312-rhel9": "opendatahub-io/workbench-images", - "rhoai/odh-pipeline-runtime-tensorflow-rocm-py312-rhel9": "opendatahub-io/workbench-images", - "rhoai/odh-workbench-jupyter-tensorflow-cuda-py312-rhel9": "opendatahub-io/workbench-images", - "rhoai/odh-workbench-jupyter-tensorflow-rocm-py312-rhel9": "opendatahub-io/workbench-images", - "rhoai/odh-workbench-jupyter-pytorch-cuda-py312-rhel9": "opendatahub-io/workbench-images", - "rhoai/odh-pipeline-runtime-pytorch-cuda-py312-rhel9": "opendatahub-io/workbench-images", - "rhoai/odh-workbench-jupyter-pytorch-rocm-py312-rhel9": "opendatahub-io/workbench-images", - "rhoai/odh-pipeline-runtime-pytorch-rocm-py312-rhel9": "opendatahub-io/workbench-images", - "rhoai/odh-workbench-codeserver-py312-rhel9": "opendatahub-io/workbench-images", - "rhoai/odh-workbench-jupyter-datascience-py312-rhel9": "opendatahub-io/workbench-images", - "rhoai/odh-pipeline-runtime-datascience-py312-rhel9": "opendatahub-io/workbench-images", - "rhoai/odh-workbench-jupyter-minimal-py312-rhel9": "opendatahub-io/workbench-images", - "rhoai/odh-workbench-jupyter-trustyai-py312-rhel9": "opendatahub-io/workbench-images", - "rhoai/odh-pipeline-runtime-minimal-py312-rhel9": "opendatahub-io/workbench-images" - }, - "repositories": { - "opendatahub-io/workbench-images": { - "default_branch": "main", - "protected_branches": [], - "active_release_branches": [], - "branch_strategy": "TBD - needs investigation", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "TBD", - "automation": "Unknown", - "manual_intervention": "Unknown" - }, - "github_url": "https://github.com/opendatahub-io/workbench-images" - } - } - }, - "AI Pipelines": { - "container_to_repo_mapping": { - "odh-ml-pipelines-driver-container": "opendatahub-io/data-science-pipelines", - "odh-ml-pipelines-api-server-v2-container": "opendatahub-io/data-science-pipelines", - "odh-ml-pipelines-launcher-container": "opendatahub-io/data-science-pipelines", - "odh-ml-pipelines-persistenceagent-container": "opendatahub-io/data-science-pipelines", - "odh-ml-pipelines-scheduledworkflow-container": "opendatahub-io/data-science-pipelines", - "odh-ml-pipelines-cache-container": "opendatahub-io/data-science-pipelines", - "odh-ml-pipelines-api-server-container": "opendatahub-io/data-science-pipelines", - "odh-data-science-pipelines-runtime-container": "opendatahub-io/data-science-pipelines", - "odh-data-science-pipelines-runtime-generic-container": "opendatahub-io/data-science-pipelines", - "odh-ml-pipelines-viewercontroller-argoworkflow-container": "opendatahub-io/data-science-pipelines", - "rhoai/odh-data-science-pipelines-operator-controller-rhel8": "opendatahub-io/data-science-pipelines-operator", - "odh-data-science-pipelines-argo-argoexec-container": "argoproj/argo-workflows", - "odh-data-science-pipelines-argo-workflowcontroller-container": "argoproj/argo-workflows" - }, - "repositories": { - "opendatahub-io/data-science-pipelines": { - "default_branch": "main", - "protected_branches": [], - "active_release_branches": [], - "branch_strategy": "TBD - needs investigation", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "TBD", - "automation": "Unknown", - "manual_intervention": "Unknown" - }, - "github_url": "https://github.com/opendatahub-io/data-science-pipelines" - }, - "opendatahub-io/data-science-pipelines-operator": { - "default_branch": "main", - "protected_branches": [], - "active_release_branches": [], - "branch_strategy": "TBD - needs investigation", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "TBD", - "automation": "Unknown", - "manual_intervention": "Unknown" - }, - "github_url": "https://github.com/opendatahub-io/data-science-pipelines-operator" - }, - "argoproj/argo-workflows": { - "default_branch": "main", - "protected_branches": [], - "active_release_branches": [], - "branch_strategy": "External dependency - not managed by OpenDataHub", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "N/A", - "automation": "N/A", - "excluded_from_automation": true, - "manual_intervention": "Monitor upstream releases and update dependency version" - }, - "notes": "Third-party dependency managed by Argo project. Excluded from automation - track upstream fixes only.", - "github_url": "https://github.com/argoproj/argo-workflows" - } - } - }, - "Notebooks Server": { - "container_to_repo_mapping": { - "rhoai/odh-notebook-controller-rhel8": "opendatahub-io/kubeflow", - "rhoai/odh-kf-notebook-controller-rhel8": "opendatahub-io/kubeflow", - "rhoai/odh-kf-notebook-controller-rhel9": "opendatahub-io/kubeflow", - "rhoai/odh-notebook-controller-rhel9": "opendatahub-io/kubeflow" - }, - "repositories": { - "opendatahub-io/kubeflow": { - "default_branch": "main", - "protected_branches": [], - "active_release_branches": [], - "branch_strategy": "TBD - needs investigation", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "TBD", - "automation": "Unknown", - "manual_intervention": "Unknown" - }, - "github_url": "https://github.com/opendatahub-io/kubeflow" - } - } - }, - "Training Kubeflow": { - "container_to_repo_mapping": { - "rhoai/odh-training-operator-rhel8": "opendatahub-io/training-operator", - "rhoai/odh-training-operator-rhel9": "opendatahub-io/training-operator", - "rhoai/odh-notebook-controller-rhel8": "opendatahub-io/notebooks", - "rhoai/odh-kf-notebook-controller-rhel8": "opendatahub-io/notebooks", - "rhoai/odh-notebook-controller-rhel9": "opendatahub-io/notebooks", - "rhoai/odh-kf-notebook-controller-rhel9": "opendatahub-io/notebooks", - "rhoai/odh-kuberay-operator-controller-rhel9": "opendatahub-io/kuberay-operator-controller", - "rhoai/odh-codeflare-operator-rhel8": "opendatahub-io/codeflare-operator", - "rhoai/odh-codeflare-operator-rhel9": "opendatahub-io/codeflare-operator" - }, - "repositories": { - "opendatahub-io/training-operator": { - "default_branch": "main", - "protected_branches": [], - "active_release_branches": [], - "branch_strategy": "TBD - needs investigation", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "TBD", - "automation": "Unknown", - "manual_intervention": "Unknown" - }, - "github_url": "https://github.com/opendatahub-io/training-operator" - }, - "opendatahub-io/notebooks": { - "default_branch": "main", - "protected_branches": [], - "active_release_branches": [], - "branch_strategy": "TBD - needs investigation", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "TBD", - "automation": "Unknown", - "manual_intervention": "Unknown" - }, - "github_url": "https://github.com/opendatahub-io/notebooks" - }, - "opendatahub-io/kuberay-operator-controller": { - "default_branch": "main", - "protected_branches": [], - "active_release_branches": [], - "branch_strategy": "TBD - needs investigation", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "TBD", - "automation": "Unknown", - "manual_intervention": "Unknown" - }, - "github_url": "https://github.com/opendatahub-io/kuberay-operator-controller" - }, - "opendatahub-io/codeflare-operator": { - "default_branch": "main", - "protected_branches": [], - "active_release_branches": [], - "branch_strategy": "TBD - needs investigation", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "TBD", - "automation": "Unknown", - "manual_intervention": "Unknown" - }, - "github_url": "https://github.com/opendatahub-io/codeflare-operator" - } - } + ] }, "llm-d": { - "container_to_repo_mapping": { - "rhoai/odh-llm-d-inference-scheduler-rhel9": "opendatahub-io/llm-d-inference-scheduler", - "rhoai/odh-llm-d-routing-sidecar-rhel9": "red-hat-data-services/llm-d-routing-sidecar", - "rhoai/odh-workload-variant-autoscaler-controller-rhel9": "opendatahub-io/workload-variant-autoscaler" - }, - "repositories": { - "llm-d/llm-d-inference-scheduler": { - "github_url": "https://github.com/llm-d/llm-d-inference-scheduler", + "repos": [ + { + "url": "https://github.com/llm-d/llm-d-inference-scheduler", + "type": "upstream", "default_branch": "main", - "active_release_branches": [ + "active_branches": [ "release-0.5", "release-0.6" ], - "branch_strategy": "Fix in main. Release branches follow pattern release-X.Y.", - "repo_type": "upstream", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "release-0.5, release-0.6" - }, "subcomponent": "inference-scheduler" }, - "opendatahub-io/llm-d-inference-scheduler": { - "github_url": "https://github.com/opendatahub-io/llm-d-inference-scheduler", + { + "url": "https://github.com/opendatahub-io/llm-d-inference-scheduler", + "type": "midstream", "default_branch": "main", - "active_release_branches": [ + "active_branches": [ "release-0.2", "release-0.3.1", "release-v0.4", "stable-2.x" ], - "branch_strategy": "Fork of upstream llm-d/llm-d-inference-scheduler. Synced via sync branches. ODH release branches via Konflux replicator.", - "repo_type": "midstream", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "release-0.2, release-0.3.1, release-v0.4, stable-2.x" - }, + "containers": [ + "rhoai/odh-llm-d-inference-scheduler-rhel9" + ], "subcomponent": "inference-scheduler" }, - "red-hat-data-services/llm-d-inference-scheduler": { - "github_url": "https://github.com/red-hat-data-services/llm-d-inference-scheduler", + { + "url": "https://github.com/red-hat-data-services/llm-d-inference-scheduler", + "type": "downstream", "default_branch": "main", - "active_release_branches": [ + "active_branches": [ "rhoai-3.3", "rhoai-3.4", "rhoai-3.4-ea.1", "rhoai-3.4-ea.2" ], - "branch_strategy": "Fork of midstream. RHOAI release branches follow pattern rhoai-X.Y.", - "repo_type": "downstream", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "rhoai-3.3, rhoai-3.4, rhoai-3.4-ea.1, rhoai-3.4-ea.2" - }, "subcomponent": "inference-scheduler" }, - "red-hat-data-services/llm-d-routing-sidecar": { - "github_url": "https://github.com/red-hat-data-services/llm-d-routing-sidecar", + { + "url": "https://github.com/red-hat-data-services/llm-d-routing-sidecar", + "type": "downstream", "default_branch": "main", - "active_release_branches": [ + "active_branches": [ "rhoai-2.25", "rhoai-3.0", "rhoai-3.2" ], - "branch_strategy": "Fork of upstream (now archived). Downstream only — upstream code migrated into llm-d-inference-scheduler. No branches beyond rhoai-3.2.", - "repo_type": "downstream", - "notes": "Upstream llm-d/llm-d-routing-sidecar is archived; code moved to llm-d-inference-scheduler (cmd/pd_sidecar). This downstream repo may be phased out in future releases.", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "rhoai-2.25, rhoai-3.0, rhoai-3.2" - }, - "subcomponent": "routing-sidecar" + "containers": [ + "rhoai/odh-llm-d-routing-sidecar-rhel9" + ], + "subcomponent": "routing-sidecar", + "notes": "Upstream llm-d/llm-d-routing-sidecar is archived; code moved to llm-d-inference-scheduler (cmd/pd_sidecar). This downstream repo may be phased out in future releases." }, - "llm-d-incubation/batch-gateway": { - "github_url": "https://github.com/llm-d-incubation/batch-gateway", + { + "url": "https://github.com/llm-d-incubation/batch-gateway", + "type": "upstream", "default_branch": "main", - "active_release_branches": [], - "branch_strategy": "Fix in main. No formal release branching documented.", - "repo_type": "upstream", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "None" - }, + "active_branches": [], "subcomponent": "batch-gateway" }, - "opendatahub-io/batch-gateway": { - "github_url": "https://github.com/opendatahub-io/batch-gateway", + { + "url": "https://github.com/opendatahub-io/batch-gateway", + "type": "midstream", "default_branch": "main", - "active_release_branches": [ + "active_branches": [ "release-v0.5" ], - "branch_strategy": "Fork of upstream llm-d-incubation/batch-gateway.", - "repo_type": "midstream", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "release-v0.5" - }, "subcomponent": "batch-gateway" }, - "red-hat-data-services/batch-gateway": { - "github_url": "https://github.com/red-hat-data-services/batch-gateway", + { + "url": "https://github.com/red-hat-data-services/batch-gateway", + "type": "downstream", "default_branch": "main", - "active_release_branches": [ + "active_branches": [ "rhoai-3.4", "rhoai-3.4-ea.1", "rhoai-3.4-ea.2" ], - "branch_strategy": "Fork of midstream. RHOAI release branches follow pattern rhoai-X.Y.", - "repo_type": "downstream", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "rhoai-3.4, rhoai-3.4-ea.1, rhoai-3.4-ea.2" - }, "subcomponent": "batch-gateway" }, - "llm-d/llm-d-workload-variant-autoscaler": { - "github_url": "https://github.com/llm-d/llm-d-workload-variant-autoscaler", + { + "url": "https://github.com/llm-d/llm-d-workload-variant-autoscaler", + "type": "upstream", "default_branch": "main", - "active_release_branches": [ + "active_branches": [ "release-0.4.2" ], - "branch_strategy": "Fix in main. Release branches follow pattern release-X.Y.Z.", - "repo_type": "upstream", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "release-0.4.2" - }, "subcomponent": "autoscaler" }, - "opendatahub-io/workload-variant-autoscaler": { - "github_url": "https://github.com/opendatahub-io/workload-variant-autoscaler", + { + "url": "https://github.com/opendatahub-io/workload-variant-autoscaler", + "type": "midstream", "default_branch": "main", - "active_release_branches": [ + "active_branches": [ "release-v0.5" ], - "branch_strategy": "Fork of upstream llm-d/llm-d-workload-variant-autoscaler. Note: repo name differs from upstream (no llm-d- prefix).", - "repo_type": "midstream", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "release-v0.5" - }, + "containers": [ + "rhoai/odh-workload-variant-autoscaler-controller-rhel9" + ], "subcomponent": "autoscaler" }, - "red-hat-data-services/workload-variant-autoscaler": { - "github_url": "https://github.com/red-hat-data-services/workload-variant-autoscaler", + { + "url": "https://github.com/red-hat-data-services/workload-variant-autoscaler", + "type": "downstream", "default_branch": "main", - "active_release_branches": [ + "active_branches": [ "rhoai-3.4", "rhoai-3.4-ea.1", "rhoai-3.4-ea.2" ], - "branch_strategy": "Fork of midstream. RHOAI release branches follow pattern rhoai-X.Y.", - "repo_type": "downstream", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "rhoai-3.4, rhoai-3.4-ea.1, rhoai-3.4-ea.2" - }, "subcomponent": "autoscaler" } - } + ] }, "AI Evaluations": { - "container_to_repo_mapping": { - "rhoai/odh-ta-lmes-driver-rhel9": "opendatahub-io/trustyai-service-operator", - "rhoai/odh-ta-lmes-job-rhel9": "opendatahub-io/lm-evaluation-harness", - "rhoai/odh-trustyai-ragas-lls-provider-dsp-rhel9": "opendatahub-io/llama-stack-provider-ragas", - "rhoai/odh-eval-hub-rhel9": "opendatahub-io/eval-hub", - "rhoai/odh-trustyai-garak-lls-provider-dsp-rhel9": "opendatahub-io/llama-stack-provider-trustyai-garak" - }, - "repositories": { - "eval-hub/eval-hub": { - "github_url": "https://github.com/eval-hub/eval-hub", + "repos": [ + { + "url": "https://github.com/eval-hub/eval-hub", + "type": "upstream", "default_branch": "main", - "active_release_branches": [], - "branch_strategy": "Fix in main. Feature branches follow pattern feature/name or fix/issue.", - "repo_type": "upstream", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "None" - }, + "active_branches": [], "subcomponent": "eval-hub" }, - "eval-hub/eval-hub-sdk": { - "github_url": "https://github.com/eval-hub/eval-hub-sdk", + { + "url": "https://github.com/eval-hub/eval-hub-sdk", + "type": "upstream", "default_branch": "main", - "active_release_branches": [], - "branch_strategy": "Fix in main.", - "repo_type": "upstream", - "notes": "No midstream/downstream forks exist yet.", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "None" - }, - "subcomponent": "eval-hub-sdk" + "active_branches": [], + "subcomponent": "eval-hub-sdk", + "notes": "No midstream/downstream forks exist yet." }, - "eval-hub/eval-hub-contrib": { - "github_url": "https://github.com/eval-hub/eval-hub-contrib", + { + "url": "https://github.com/eval-hub/eval-hub-contrib", + "type": "upstream", "default_branch": "main", - "active_release_branches": [], - "branch_strategy": "Fix in main.", - "repo_type": "upstream", - "notes": "No midstream/downstream forks exist yet.", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "None" - }, - "subcomponent": "eval-hub-contrib" + "active_branches": [], + "subcomponent": "eval-hub-contrib", + "notes": "No midstream/downstream forks exist yet." }, - "trustyai-explainability/llama-stack-provider-trustyai-garak": { - "github_url": "https://github.com/trustyai-explainability/llama-stack-provider-trustyai-garak", + { + "url": "https://github.com/trustyai-explainability/llama-stack-provider-trustyai-garak", + "type": "upstream", "default_branch": "main", - "active_release_branches": [], - "branch_strategy": "Fix in main.", - "repo_type": "upstream", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "None" - }, + "active_branches": [], "subcomponent": "trustyai-garak" }, - "trustyai-explainability/trustyai-service-operator": { - "github_url": "https://github.com/trustyai-explainability/trustyai-service-operator", + { + "url": "https://github.com/trustyai-explainability/trustyai-service-operator", + "type": "upstream", "default_branch": "main", - "active_release_branches": [ + "active_branches": [ "release/1.37.0", "release/1.38.0" ], - "branch_strategy": "Fix in main. Release branches follow pattern release/X.Y.Z.", - "repo_type": "upstream", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "release/1.37.0, release/1.38.0" - }, "subcomponent": "trustyai-service-operator" }, - "opendatahub-io/eval-hub": { - "github_url": "https://github.com/opendatahub-io/eval-hub", + { + "url": "https://github.com/opendatahub-io/eval-hub", + "type": "midstream", "default_branch": "main", - "active_release_branches": [ + "active_branches": [ "release/odh-3.4", "stable" ], - "branch_strategy": "Fork of upstream eval-hub/eval-hub.", - "repo_type": "midstream", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "release/odh-3.4, stable" - }, + "containers": [ + "rhoai/odh-eval-hub-rhel9" + ], "subcomponent": "eval-hub" }, - "opendatahub-io/lm-evaluation-harness": { - "github_url": "https://github.com/opendatahub-io/lm-evaluation-harness", + { + "url": "https://github.com/opendatahub-io/lm-evaluation-harness", + "type": "midstream", "default_branch": "main", - "active_release_branches": [ + "active_branches": [ "release/odh-3.3", "release/odh-3.4", "release/odh-3.4-ea2", "release/odh-3.5" ], - "branch_strategy": "ODH fork. Release branches follow pattern release/odh-X.Y.", - "repo_type": "midstream", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "release/odh-3.3, release/odh-3.4, release/odh-3.4-ea2, release/odh-3.5" - }, + "containers": [ + "rhoai/odh-ta-lmes-job-rhel9" + ], "subcomponent": "lm-evaluation-harness" }, - "opendatahub-io/llama-stack-provider-trustyai-garak": { - "github_url": "https://github.com/opendatahub-io/llama-stack-provider-trustyai-garak", + { + "url": "https://github.com/opendatahub-io/llama-stack-provider-trustyai-garak", + "type": "midstream", "default_branch": "main", - "active_release_branches": [ + "active_branches": [ "release/odh-3.4", "stable" ], - "branch_strategy": "Fork of upstream trustyai-explainability/llama-stack-provider-trustyai-garak.", - "repo_type": "midstream", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "release/odh-3.4, stable" - }, + "containers": [ + "rhoai/odh-trustyai-garak-lls-provider-dsp-rhel9" + ], "subcomponent": "trustyai-garak" }, - "opendatahub-io/trustyai-service-operator": { - "github_url": "https://github.com/opendatahub-io/trustyai-service-operator", + { + "url": "https://github.com/opendatahub-io/trustyai-service-operator", + "type": "midstream", "default_branch": "main", - "active_release_branches": [ + "active_branches": [ "release/odh-3.3", "release/odh-3.4", "release/odh-3.4-ea2" ], - "branch_strategy": "Fork of upstream trustyai-explainability/trustyai-service-operator. Release branches follow pattern release/odh-X.Y.", - "repo_type": "midstream", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "release/odh-3.3, release/odh-3.4, release/odh-3.4-ea2" - }, + "containers": [ + "rhoai/odh-ta-lmes-driver-rhel9" + ], "subcomponent": "trustyai-service-operator" }, - "red-hat-data-services/eval-hub": { - "github_url": "https://github.com/red-hat-data-services/eval-hub", + { + "url": "https://github.com/red-hat-data-services/eval-hub", + "type": "downstream", "default_branch": "main", - "active_release_branches": [ + "active_branches": [ "rhoai-3.4", "rhoai-3.4-ea.1", "rhoai-3.4-ea.2" ], - "branch_strategy": "Fork of midstream. RHOAI release branches follow pattern rhoai-X.Y.", - "repo_type": "downstream", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "rhoai-3.4, rhoai-3.4-ea.1, rhoai-3.4-ea.2" - }, "subcomponent": "eval-hub" }, - "red-hat-data-services/lm-evaluation-harness": { - "github_url": "https://github.com/red-hat-data-services/lm-evaluation-harness", + { + "url": "https://github.com/red-hat-data-services/lm-evaluation-harness", + "type": "downstream", "default_branch": "main", - "active_release_branches": [ + "active_branches": [ "rhoai-3.3", "rhoai-3.4", "rhoai-3.4-ea.1", "rhoai-3.4-ea.2" ], - "branch_strategy": "Fork of midstream. RHOAI release branches follow pattern rhoai-X.Y.", - "repo_type": "downstream", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "rhoai-3.3, rhoai-3.4, rhoai-3.4-ea.1, rhoai-3.4-ea.2" - }, "subcomponent": "lm-evaluation-harness" }, - "red-hat-data-services/llama-stack-provider-trustyai-garak": { - "github_url": "https://github.com/red-hat-data-services/llama-stack-provider-trustyai-garak", + { + "url": "https://github.com/red-hat-data-services/llama-stack-provider-trustyai-garak", + "type": "downstream", "default_branch": "main", - "active_release_branches": [ + "active_branches": [ "rhoai-3.3", "rhoai-3.4", "rhoai-3.4-ea.1", "rhoai-3.4-ea.2" ], - "branch_strategy": "Fork of midstream. RHOAI release branches follow pattern rhoai-X.Y.", - "repo_type": "downstream", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "rhoai-3.3, rhoai-3.4, rhoai-3.4-ea.1, rhoai-3.4-ea.2" - }, "subcomponent": "trustyai-garak" }, - "red-hat-data-services/trustyai-service-operator": { - "github_url": "https://github.com/red-hat-data-services/trustyai-service-operator", + { + "url": "https://github.com/red-hat-data-services/trustyai-service-operator", + "type": "downstream", "default_branch": "main", - "active_release_branches": [ + "active_branches": [ "rhoai-3.3", "rhoai-3.4", "rhoai-3.4-ea.1", "rhoai-3.4-ea.2" ], - "branch_strategy": "Fork of midstream. RHOAI release branches follow pattern rhoai-X.Y.", - "repo_type": "downstream", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "rhoai-3.3, rhoai-3.4, rhoai-3.4-ea.1, rhoai-3.4-ea.2" - }, "subcomponent": "trustyai-service-operator" }, - "trustyai-explainability/llama-stack-provider-ragas": { - "github_url": "https://github.com/trustyai-explainability/llama-stack-provider-ragas", + { + "url": "https://github.com/trustyai-explainability/llama-stack-provider-ragas", + "type": "upstream", "default_branch": "main", - "active_release_branches": [ + "active_branches": [ "release/0.4.x", "release/0.5.x" ], - "branch_strategy": "Fix in main. Release branches follow pattern release/X.Y.x.", - "repo_type": "upstream", - "subcomponent": "trustyai-ragas", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "release/0.4.x, release/0.5.x" - } + "subcomponent": "trustyai-ragas" }, - "opendatahub-io/llama-stack-provider-ragas": { - "github_url": "https://github.com/opendatahub-io/llama-stack-provider-ragas", + { + "url": "https://github.com/opendatahub-io/llama-stack-provider-ragas", + "type": "midstream", "default_branch": "main", - "active_release_branches": [ + "active_branches": [ "release/odh-3.3", "release/odh-3.4-ea2", "stable" ], - "branch_strategy": "Fork of upstream trustyai-explainability/llama-stack-provider-ragas. Release branches follow pattern release/odh-X.Y.", - "repo_type": "midstream", - "subcomponent": "trustyai-ragas", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "release/odh-3.3, release/odh-3.4-ea2, stable" - } + "containers": [ + "rhoai/odh-trustyai-ragas-lls-provider-dsp-rhel9" + ], + "subcomponent": "trustyai-ragas" }, - "red-hat-data-services/llama-stack-provider-ragas": { - "github_url": "https://github.com/red-hat-data-services/llama-stack-provider-ragas", + { + "url": "https://github.com/red-hat-data-services/llama-stack-provider-ragas", + "type": "downstream", "default_branch": "main", - "active_release_branches": [ + "active_branches": [ "rhoai-3.3", "rhoai-3.4", "rhoai-3.4-ea.1", "rhoai-3.4-ea.2" ], - "branch_strategy": "Fork of midstream. RHOAI release branches follow pattern rhoai-X.Y.", - "repo_type": "downstream", - "subcomponent": "trustyai-ragas", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "rhoai-3.3, rhoai-3.4, rhoai-3.4-ea.1, rhoai-3.4-ea.2" - } + "subcomponent": "trustyai-ragas" } - } + ] }, "AutoML": { - "container_to_repo_mapping": { - "managed-open-data-hub/odh-automl-rhel9": "red-hat-data-services/pipelines-components" - }, - "repositories": { - "kubeflow/pipelines-components": { - "github_url": "https://github.com/kubeflow/pipelines-components", + "repos": [ + { + "url": "https://github.com/kubeflow/pipelines-components", + "type": "upstream", "default_branch": "main", - "active_release_branches": [], - "branch_strategy": "Fix in main. No formal release branching documented.", - "repo_type": "upstream", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "None" - } + "active_branches": [] }, - "opendatahub-io/pipelines-components": { - "github_url": "https://github.com/opendatahub-io/pipelines-components", + { + "url": "https://github.com/opendatahub-io/pipelines-components", + "type": "midstream", "default_branch": "main", - "active_release_branches": [], - "branch_strategy": "Fork of upstream kubeflow/pipelines-components.", - "repo_type": "midstream", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "None" - } + "active_branches": [] }, - "red-hat-data-services/pipelines-components": { - "github_url": "https://github.com/red-hat-data-services/pipelines-components", + { + "url": "https://github.com/red-hat-data-services/pipelines-components", + "type": "downstream", "default_branch": "main", - "active_release_branches": [ + "active_branches": [ "rhoai-3.4" ], - "branch_strategy": "Fork of midstream. RHOAI release branches follow pattern rhoai-X.Y.", - "repo_type": "downstream", - "notes": "Monorepo containing both AutoML (components/automl/) and AutoRAG (components/autorag/) components.", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "rhoai-3.4" - } + "containers": [ + "managed-open-data-hub/odh-automl-rhel9" + ], + "notes": "Monorepo containing both AutoML (components/automl/) and AutoRAG (components/autorag/) components." } - } + ] }, "AutoRAG": { - "container_to_repo_mapping": {}, - "repositories": { - "kubeflow/pipelines-components": { - "github_url": "https://github.com/kubeflow/pipelines-components", + "repos": [ + { + "url": "https://github.com/kubeflow/pipelines-components", + "type": "upstream", "default_branch": "main", - "active_release_branches": [], - "branch_strategy": "Fix in main. No formal release branching documented.", - "repo_type": "upstream", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "None" - } + "active_branches": [] }, - "opendatahub-io/pipelines-components": { - "github_url": "https://github.com/opendatahub-io/pipelines-components", + { + "url": "https://github.com/opendatahub-io/pipelines-components", + "type": "midstream", "default_branch": "main", - "active_release_branches": [], - "branch_strategy": "Fork of upstream kubeflow/pipelines-components.", - "repo_type": "midstream", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "None" - } + "active_branches": [] }, - "red-hat-data-services/pipelines-components": { - "github_url": "https://github.com/red-hat-data-services/pipelines-components", + { + "url": "https://github.com/red-hat-data-services/pipelines-components", + "type": "downstream", "default_branch": "main", - "active_release_branches": [ + "active_branches": [ "rhoai-3.4" ], - "branch_strategy": "Fork of midstream. RHOAI release branches follow pattern rhoai-X.Y.", - "repo_type": "downstream", - "notes": "Monorepo containing both AutoML (components/automl/) and AutoRAG (components/autorag/) components.", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "rhoai-3.4" - } + "notes": "Monorepo containing both AutoML (components/automl/) and AutoRAG (components/autorag/) components." }, - "IBM/ai4rag": { - "github_url": "https://github.com/IBM/ai4rag", + { + "url": "https://github.com/IBM/ai4rag", + "type": "upstream", "default_branch": "main", - "active_release_branches": [], - "branch_strategy": "Python package upstream. CVEs in ai4rag manifest as container CVEs in pipelines-components — fix by updating ai4rag version there.", - "repo_type": "upstream", - "notes": "No containerization — distributed as a Python package. No ODH/RHDS forks exist. Excluded from automation; track upstream releases and update dependency version in pipelines-components.", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "N/A", - "excluded_from_automation": true - } + "active_branches": [], + "notes": "No containerization — distributed as a Python package. No ODH/RHDS forks exist. Excluded from automation; track upstream releases and update dependency version in pipelines-components." } - } + ] }, "Observability": { - "container_to_repo_mapping": { - "rhacm2/multicluster-observability-rhel9-operator": "stolostron/multicluster-observability-operator", - "rhacm2/acm-multicluster-observability-addon-rhel9": "stolostron/multicluster-observability-addon", - "rhacm2/kube-state-metrics-rhel9": "stolostron/kube-state-metrics", - "rhacm2/observatorium-rhel9": "stolostron/observatorium", - "rhacm2/observatorium-operator-rhel9": "stolostron/observatorium-operator", - "rhacm2/thanos-rhel9": "stolostron/thanos", - "rhacm2/thanos-receive-controller-rhel9": "stolostron/thanos-receive-controller", - "rhacm2/prometheus-alertmanager-rhel9": "stolostron/prometheus-alertmanager", - "rhacm2/prometheus-rhel9": "stolostron/prometheus", - "rhacm2/prometheus-operator-rhel9": "stolostron/prometheus-operator", - "rhacm2/node-exporter-rhel9": "stolostron/node-exporter", - "rhacm2/kube-rbac-proxy-rhel9": "stolostron/kube-rbac-proxy", - "rhacm2/acm-grafana-rhel9": "stolostron/grafana", - "rhacm2/memcached-exporter-rhel9": "stolostron/memcached-exporter" - }, - "repositories": { - "stolostron/multicluster-observability-operator": { - "github_url": "https://github.com/stolostron/multicluster-observability-operator", + "repos": [ + { + "url": "https://github.com/stolostron/multicluster-observability-operator", + "type": "upstream", "default_branch": "main", - "active_release_branches": [ + "active_branches": [ "release-2.16", "release-2.15", "release-2.14", "release-2.13" ], - "branch_strategy": "Fix in main, backport to active release branches (release-2.13 through release-2.16)", - "repo_type": "golang", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "release-2.16, release-2.15, release-2.14, release-2.13" - }, - "test_command": "make lint", - "build_command": "make build", - "notes": "Go project. Run 'go mod tidy' after dependency updates. CI config in .github/workflows/" + "containers": [ + "rhacm2/multicluster-observability-rhel9-operator" + ] }, - "stolostron/multicluster-observability-addon": { - "github_url": "https://github.com/stolostron/multicluster-observability-addon", + { + "url": "https://github.com/stolostron/multicluster-observability-addon", + "type": "upstream", "default_branch": "main", - "active_release_branches": [ + "active_branches": [ "release-2.16", "release-2.15", "release-2.14", "release-2.13" ], - "branch_strategy": "Fix in main, backport to active release branches (release-2.13 through release-2.16)", - "repo_type": "golang", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "release-2.16, release-2.15, release-2.14, release-2.13" - }, - "test_command": "make lint", - "build_command": "make addon", - "special_requirements": [ - "Uses bingo for tool management (.bingo/Variables.mk)", - "Different golangci-lint versions per branch (v2.0.2 on release-2.14, v2.5.0 on release-2.16+)", - "May require 'replace' directives for transitive dependency issues (e.g., go.opentelemetry.io/contrib/otelconf)" - ], - "notes": "Go project with OpenTelemetry dependencies. Run 'make deps' to verify go.mod/go.sum completeness." + "containers": [ + "rhacm2/acm-multicluster-observability-addon-rhel9" + ] }, - "stolostron/kube-state-metrics": { - "github_url": "https://github.com/stolostron/kube-state-metrics", + { + "url": "https://github.com/stolostron/kube-state-metrics", + "type": "upstream", "default_branch": "release-2.17", - "active_release_branches": [ + "active_branches": [ "release-2.16", "release-2.15", "release-2.14", "release-2.13" ], - "branch_strategy": "Release branches only (no main branch used for CVE fixes). Fix in latest release branch first.", - "repo_type": "golang", - "cve_fix_workflow": { - "primary_target": "release-2.17", - "backport_targets": "release-2.16, release-2.15, release-2.14, release-2.13" - }, - "test_command": "make test", - "build_command": "make build", - "notes": "No main branch - work directly on release branches" + "containers": [ + "rhacm2/kube-state-metrics-rhel9" + ] }, - "stolostron/observatorium": { - "github_url": "https://github.com/stolostron/observatorium", + { + "url": "https://github.com/stolostron/observatorium", + "type": "upstream", "default_branch": "main", - "active_release_branches": [ + "active_branches": [ "release-2.16", "release-2.15", "release-2.14", "release-2.13" ], - "branch_strategy": "Fix in main, backport to active release branches (release-2.13 through release-2.16)", - "repo_type": "golang", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "release-2.16, release-2.15, release-2.14, release-2.13" - }, - "test_command": "make test", - "build_command": "make build", - "special_requirements": [ - "Uses 'replace' directives in go.mod for dependency pinning", - "API compatibility: prometheus/common version upgrades may require code changes (e.g., version.NewCollector removed in v0.63.0)", - "Vendor directory excluded in .gitignore - CI runs 'go mod vendor' during build" - ], - "notes": "Go project. Check main.go for API usage when upgrading prometheus/common or similar packages." + "containers": [ + "rhacm2/observatorium-rhel9" + ] }, - "stolostron/observatorium-operator": { - "github_url": "https://github.com/stolostron/observatorium-operator", + { + "url": "https://github.com/stolostron/observatorium-operator", + "type": "upstream", "default_branch": "main", - "active_release_branches": [ + "active_branches": [ "release-2.16", "release-2.15", "release-2.14", "release-2.13" ], - "branch_strategy": "Fix in main, backport to active release branches (release-2.13 through release-2.16)", - "repo_type": "golang", - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "release-2.16, release-2.15, release-2.14, release-2.13" - }, - "test_command": "make test", - "build_command": "make build", - "notes": "Go project. Verify dependency usage before applying CVE fixes." + "containers": [ + "rhacm2/observatorium-operator-rhel9" + ] }, - "stolostron/thanos": { - "github_url": "https://github.com/stolostron/thanos", + { + "url": "https://github.com/stolostron/thanos", + "type": "upstream", "default_branch": "release-2.17", - "active_release_branches": [ + "active_branches": [ "release-2.16", "release-2.15", "release-2.14", "release-2.13" ], - "branch_strategy": "Release branches (release-2.13 through release-2.17). Fix in latest release branch first.", - "repo_type": "golang", - "cve_fix_workflow": { - "primary_target": "release-2.17", - "backport_targets": "release-2.16, release-2.15, release-2.14, release-2.13" - }, - "test_command": "make test", - "build_command": "make build", - "notes": "Go project. Thanos fork." + "containers": [ + "rhacm2/thanos-rhel9" + ] }, - "stolostron/thanos-receive-controller": { - "github_url": "https://github.com/stolostron/thanos-receive-controller", + { + "url": "https://github.com/stolostron/thanos-receive-controller", + "type": "upstream", "default_branch": "release-2.17", - "active_release_branches": [ + "active_branches": [ "release-2.16", "release-2.15", "release-2.14", "release-2.13" ], - "branch_strategy": "Release branches only. Fix in latest release branch first.", - "repo_type": "golang", - "cve_fix_workflow": { - "primary_target": "release-2.17", - "backport_targets": "release-2.16, release-2.15, release-2.14, release-2.13" - }, - "test_command": "make test", - "build_command": "make build", - "special_requirements": [ - "CI configuration in .github/env (golang-version setting)", - "Go version upgrades require updating both go.mod AND .github/env", - "golangci-lint must be built with Go version >= project's Go version" - ], - "notes": "Update .github/env golang-version when upgrading Go version in go.mod" + "containers": [ + "rhacm2/thanos-receive-controller-rhel9" + ] }, - "stolostron/prometheus-alertmanager": { - "github_url": "https://github.com/stolostron/prometheus-alertmanager", + { + "url": "https://github.com/stolostron/prometheus-alertmanager", + "type": "upstream", "default_branch": "release-2.17", - "active_release_branches": [ + "active_branches": [ "release-2.16", "release-2.15", "release-2.14", "release-2.13" ], - "branch_strategy": "Release branches (release-2.13 through release-2.17). Fix in latest release branch first.", - "repo_type": "golang", - "cve_fix_workflow": { - "primary_target": "release-2.17", - "backport_targets": "release-2.16, release-2.15, release-2.14, release-2.13" - }, - "test_command": "make test", - "build_command": "make build", - "special_requirements": [ - "CI configuration in .github/workflows/golangci-lint.yml", - "Go version upgrades require updating both go.mod AND .github/workflows/.yml", - "golangci-lint version pinning: use 'version: latest' for Go 1.24+ compatibility", - "Workflow scope required on GitHub PAT to modify .github/workflows/.yml files" - ], - "notes": "Update .github/workflows/golangci-lint.yml go-version when upgrading Go version. Use 'version: latest' for golangci-lint." + "containers": [ + "rhacm2/prometheus-alertmanager-rhel9" + ] }, - "stolostron/prometheus": { - "github_url": "https://github.com/stolostron/prometheus", + { + "url": "https://github.com/stolostron/prometheus", + "type": "upstream", "default_branch": "release-2.17", - "active_release_branches": [ + "active_branches": [ "release-2.16", "release-2.15", "release-2.14", "release-2.13" ], - "branch_strategy": "Release branches (release-2.13 through release-2.17). Fix in latest release branch first.", - "repo_type": "golang", - "cve_fix_workflow": { - "primary_target": "release-2.17", - "backport_targets": "release-2.16, release-2.15, release-2.14, release-2.13" - }, - "test_command": "make test", - "build_command": "make build", - "notes": "Go project. Standard prometheus fork." + "containers": [ + "rhacm2/prometheus-rhel9" + ] }, - "stolostron/prometheus-operator": { - "github_url": "https://github.com/stolostron/prometheus-operator", + { + "url": "https://github.com/stolostron/prometheus-operator", + "type": "upstream", "default_branch": "release-2.17", - "active_release_branches": [ + "active_branches": [ "release-2.16", "release-2.15", "release-2.14", "release-2.13" ], - "branch_strategy": "Release branches (release-2.13 through release-2.17). Fix in latest release branch first.", - "repo_type": "golang", - "cve_fix_workflow": { - "primary_target": "release-2.17", - "backport_targets": "release-2.16, release-2.15, release-2.14, release-2.13" - }, - "test_command": "make test", - "build_command": "make build", - "notes": "Go project. Kubernetes operator for Prometheus." + "containers": [ + "rhacm2/prometheus-operator-rhel9" + ] }, - "stolostron/node-exporter": { - "github_url": "https://github.com/stolostron/node-exporter", + { + "url": "https://github.com/stolostron/node-exporter", + "type": "upstream", "default_branch": "release-2.17", - "active_release_branches": [ + "active_branches": [ "release-2.16", "release-2.15", "release-2.14", "release-2.13" ], - "branch_strategy": "Release branches (release-2.13 through release-2.17). Fix in latest release branch first.", - "repo_type": "golang", - "cve_fix_workflow": { - "primary_target": "release-2.17", - "backport_targets": "release-2.16, release-2.15, release-2.14, release-2.13" - }, - "test_command": "make test", - "build_command": "make build", - "notes": "Go project. Prometheus node exporter." + "containers": [ + "rhacm2/node-exporter-rhel9" + ] }, - "stolostron/kube-rbac-proxy": { - "github_url": "https://github.com/stolostron/kube-rbac-proxy", + { + "url": "https://github.com/stolostron/kube-rbac-proxy", + "type": "upstream", "default_branch": "release-2.17", - "active_release_branches": [ + "active_branches": [ "release-2.16", "release-2.15", "release-2.14", @@ -1150,66 +560,44 @@ "backplane-2.7", "backplane-2.6" ], - "branch_strategy": "Release branches (release-2.13 through release-2.17). Fix in latest release branch first. Backplane branches (backplane-2.6 through backplane-2.10). Different branch naming pattern from other observability repos.", - "repo_type": "golang", - "cve_fix_workflow": { - "primary_target": "release-2.17", - "backport_targets": "release-2.16, release-2.15, release-2.14, release-2.13, backplane-2.10, backplane-2.9, backplane-2.8, backplane-2.7, backplane-2.6" - }, - "test_command": "make test-unit", - "build_command": "make build", - "special_requirements": [ - "Different branch naming: backplane-X.Y in addtion to release-X.Y", - "May require k8s.io/klog/v2 compatibility updates when upgrading grpc", - "Go version upgrades may be required (e.g., grpc v1.79.3 requires Go 1.24.0)", - "Some branches may need downgrading to consistent versions (e.g., backplane-2.9 and 2.10 had grpc v1.80.0, downgraded to v1.79.3 for consistency)" - ], - "notes": "Older branches (backplane-2.6, 2.7) use Go 1.23-1.24 with grpc v1.56.3. Newer branches (2.9, 2.10) had newer versions but were standardized to v1.79.3." + "containers": [ + "rhacm2/kube-rbac-proxy-rhel9" + ] }, - "stolostron/grafana": { - "github_url": "https://github.com/stolostron/grafana", + { + "url": "https://github.com/stolostron/grafana", + "type": "upstream", "default_branch": "release-2.17", - "active_release_branches": [ + "active_branches": [ "release-2.16", "release-2.15", "release-2.14", "release-2.13" ], - "branch_strategy": "Release branches (release-2.13 through release-2.17). Fix in latest release branch first.", - "repo_type": "golang", - "cve_fix_workflow": { - "primary_target": "release-2.17", - "backport_targets": "release-2.16, release-2.15, release-2.14, release-2.13" - }, - "test_command": "make test", - "build_command": "make build", - "notes": "Go project. Grafana fork." + "containers": [ + "rhacm2/acm-grafana-rhel9" + ] }, - "stolostron/memcached-exporter": { - "github_url": "https://github.com/stolostron/memcached-exporter", + { + "url": "https://github.com/stolostron/memcached-exporter", + "type": "upstream", "default_branch": "release-2.17", - "active_release_branches": [ + "active_branches": [ "release-2.16", "release-2.15", "release-2.14", "release-2.13" ], - "branch_strategy": "Release branches (release-2.13 through release-2.17). Fix in latest release branch first.", - "repo_type": "golang", - "cve_fix_workflow": { - "primary_target": "release-2.17", - "backport_targets": "release-2.16, release-2.15, release-2.14, release-2.13" - }, - "test_command": "make test", - "build_command": "make build", - "notes": "Go project. Memcached exporter." + "containers": [ + "rhacm2/memcached-exporter-rhel9" + ] } - } + ] } }, "metadata": { "description": "Component to repository and branch mappings for CVE fix workflow automation", "purpose": "Maps Jira components to GitHub repositories and their branch strategies for automated CVE patching", - "last_updated": "2026-03-29" + "last_updated": "2026-04-16" } } From c61d1eed0eb639b75f4a989cb955ab8f3a24f25f Mon Sep 17 00:00:00 2001 From: Vaishnavi-Modi Date: Wed, 15 Apr 2026 17:47:30 -0400 Subject: [PATCH 02/11] feat: auto-discover container names from Jira pscomponent: labels in /onboard Instead of asking users to provide container image names manually, query Jira for pscomponent: labels on existing CVE issues and extract the container names automatically. Each Jira CVE ticket has labels like: pscomponent:rhoai/odh-container-rhel9 These are collected, deduplicated, and assigned to the downstream repo in the mapping entry. No manual input needed for containers. If Jira is unavailable or no pscomponent: labels exist, the containers field is omitted and can be added later. Co-Authored-By: Claude Sonnet 4.6 (1M context) --- .../cve-fixer/.claude/commands/onboard.md | 70 ++++++++++++++++--- 1 file changed, 59 insertions(+), 11 deletions(-) diff --git a/workflows/cve-fixer/.claude/commands/onboard.md b/workflows/cve-fixer/.claude/commands/onboard.md index 7b17342..4dda3c6 100644 --- a/workflows/cve-fixer/.claude/commands/onboard.md +++ b/workflows/cve-fixer/.claude/commands/onboard.md @@ -18,14 +18,18 @@ to `ambient-code/workflows` containing both the mapping update and the guidance b. **Repos** — for each repo: - GitHub URL (e.g. `https://github.com/org/repo`) - Repo type: `upstream`, `midstream`, or `downstream` - - Container image names built from this repo (e.g. `rhoai/odh-container-rhel9`). - Leave empty if unknown — can be added later. - Subcomponent name (optional — only if this component has multiple distinct container chains, e.g. `"inference-scheduler"`, `"autoscaler"`) + Container image names are auto-discovered from Jira in Step 2 — no need to provide them manually. + Ask "Do you have more repos to add? (yes/no)" after each one. -2. **Validate Jira Component Name** +2. **Validate Jira Component and Auto-discover Container Images** + + Query Jira to both validate the component name AND extract container image names + from `pscomponent:` labels on existing CVE issues. This avoids asking the user + to provide container names manually. Use MCP if available (`select:mcp__mcp-atlassian__jira_search`), otherwise curl: @@ -38,14 +42,35 @@ to `ambient-code/workflows` containing both the mapping update and the guidance RESULT=$(curl -s -X GET --connect-timeout 10 --max-time 15 \ -H "Authorization: Basic ${AUTH}" \ -H "Content-Type: application/json" \ - "${JIRA_BASE_URL}/rest/api/3/search/jql?jql=${ENCODED}&maxResults=1&fields=key,summary") + "${JIRA_BASE_URL}/rest/api/3/search/jql?jql=${ENCODED}&maxResults=100&fields=key,summary,labels") ISSUE_COUNT=$(echo "$RESULT" | jq '.issues | length') ``` - - Results found → confirmed, proceed - 0 results → warn: "No CVE issues found — component name must match Jira exactly. Proceed anyway? (yes/no)" - - Credentials unavailable → skip validation, proceed with a note + - Credentials unavailable → skip, proceed with a note (containers must be added manually later) + + **Extract container image names from `pscomponent:` labels:** + + ```bash + # Each Jira issue has labels like "pscomponent:rhoai/odh-container-rhel9" + # Collect all unique container names across all issues + DISCOVERED_CONTAINERS=$(echo "$RESULT" | jq -r ' + .issues[].fields.labels[] + | select(startswith("pscomponent:")) + | ltrimstr("pscomponent:") + ' | sort -u) + + if [ -n "$DISCOVERED_CONTAINERS" ]; then + echo "✅ Auto-discovered container images from Jira:" + echo "$DISCOVERED_CONTAINERS" | sed 's/^/ - /' + else + echo "ℹ️ No pscomponent: labels found — containers must be added manually later" + fi + ``` + + Store `DISCOVERED_CONTAINERS` for use in Step 4 to populate the `containers` field + on the correct repo entry. 3. **Auto-discover Branch Information** @@ -74,24 +99,47 @@ to `ambient-code/workflows` containing both the mapping update and the guidance 4. **Build Mapping Entry** + Assign the auto-discovered containers to the correct repos. Containers from Jira + typically map to the **downstream** repo (they are built from `red-hat-data-services/*`). + For upstream and midstream repos, leave `containers` empty unless the user specifies otherwise. + + ```bash + # Match each discovered container to the repo that builds it + # Containers from pscomponent: labels belong to the downstream repo + for REPO_URL in "${REPO_URLS[@]}"; do + REPO_TYPE="${REPO_TYPES[$i]}" + if [ "$REPO_TYPE" = "downstream" ]; then + REPO_CONTAINERS="$DISCOVERED_CONTAINERS" + else + REPO_CONTAINERS="" # upstream/midstream don't build the RHOAI container directly + fi + done + ``` + Construct the simplified JSON entry: ```json { "repos": [ { - "url": "https://github.com/org/repo", - "type": "upstream|midstream|downstream", + "url": "https://github.com/org/upstream-repo", + "type": "upstream", + "default_branch": "main", + "active_branches": ["release-0.6"] + }, + { + "url": "https://github.com/org/downstream-repo", + "type": "downstream", "default_branch": "main", "active_branches": ["rhoai-3.4"], - "containers": ["rhoai/odh-container-rhel9"], - "subcomponent": "optional-name" + "containers": ["rhoai/odh-container-rhel9"] } ] } ``` - - Omit `containers` if none provided + - Containers auto-populated from Jira `pscomponent:` labels on downstream repo + - If no containers discovered, omit the field (can be added later) - Omit `subcomponent` if not needed - Show the entry to the user: "Does this look correct? (yes/no/edit)" From 3f3944fa725993f0b01f8c537d3e3334a0dc682b Mon Sep 17 00:00:00 2001 From: Vaishnavi-Modi Date: Wed, 15 Apr 2026 18:05:41 -0400 Subject: [PATCH 03/11] feat: add CVE fixer dashboard Standalone dashboard (similar style to shepard) that tracks: - Fix PRs opened / merged - Unique CVEs the workflow attempted to fix - Per-component breakdown - Components onboarded Scripts: - scripts/collect-data.js: scans onboarded repos from mapping file, finds fix/cve-* PRs, aggregates metrics, pushes data.json to repo Dashboard (public/index.html): - Overview: stat cards + timeline chart + component/status charts - Fix PRs: full table with status, CVE, component, repo, dates - CVEs: list of unique CVEs with PR counts and component breakdown - Components: per-component stat cards PatternFly + Chart.js, same visual style as shepard dashboard. No dependency on or mixing with any other dashboard. Co-Authored-By: Claude Sonnet 4.6 (1M context) --- workflows/cve-fixer/dashboard/README.md | 28 ++ .../cve-fixer/dashboard/public/index.html | 351 ++++++++++++++++++ .../dashboard/scripts/collect-data.js | 235 ++++++++++++ .../cve-fixer/dashboard/scripts/package.json | 12 + 4 files changed, 626 insertions(+) create mode 100644 workflows/cve-fixer/dashboard/README.md create mode 100644 workflows/cve-fixer/dashboard/public/index.html create mode 100644 workflows/cve-fixer/dashboard/scripts/collect-data.js create mode 100644 workflows/cve-fixer/dashboard/scripts/package.json diff --git a/workflows/cve-fixer/dashboard/README.md b/workflows/cve-fixer/dashboard/README.md new file mode 100644 index 0000000..fa4c134 --- /dev/null +++ b/workflows/cve-fixer/dashboard/README.md @@ -0,0 +1,28 @@ +# CVE Fixer Dashboard + +Tracks fix PRs opened/merged, unique CVEs addressed, and components onboarded across all repos in the CVE fixer workflow. + +## Setup + +```bash +cd scripts +GITHUB_TOKEN= WORKFLOW_REPO=angaduom/workflows node collect-data.js +``` + +This scans all repos in `component-repository-mappings.json`, finds CVE fix PRs (branch `fix/cve-*` or title matching `CVE-YYYY-XXXXX`), and pushes `data.json` to the repo. + +## Serve locally + +```bash +cd public +python3 -m http.server 8080 +# open http://localhost:8080 +``` + +## GitHub Pages + +Enable GitHub Pages for the `workflows/cve-fixer/dashboard/public/` path to host the dashboard publicly. + +## Schedule + +Run `collect-data.js` on a schedule (daily cron, GitHub Actions) to keep the dashboard current. diff --git a/workflows/cve-fixer/dashboard/public/index.html b/workflows/cve-fixer/dashboard/public/index.html new file mode 100644 index 0000000..00d42bd --- /dev/null +++ b/workflows/cve-fixer/dashboard/public/index.html @@ -0,0 +1,351 @@ + + + + + + CVE Fixer Dashboard + + + + + +
+ + + + + +
+
Loading dashboard data...
+ +
+ +
+ + + + diff --git a/workflows/cve-fixer/dashboard/scripts/collect-data.js b/workflows/cve-fixer/dashboard/scripts/collect-data.js new file mode 100644 index 0000000..a9e7781 --- /dev/null +++ b/workflows/cve-fixer/dashboard/scripts/collect-data.js @@ -0,0 +1,235 @@ +#!/usr/bin/env node +/** + * collect-data.js + * Scans all repos in component-repository-mappings.json for CVE fix PRs, + * aggregates metrics, and pushes the resulting JSON to the workflow repo. + * + * Metrics collected: + * - Fix PRs opened (branch matches fix/cve-* or title matches Security: Fix CVE-*) + * - Fix PRs merged + * - Unique CVEs attempted + * - Per-component breakdown + * - Components onboarded + * + * Usage: + * GITHUB_TOKEN= WORKFLOW_REPO=angaduom/workflows node collect-data.js + */ + +const CVE_BRANCH_RE = /^fix\/cve-(\d{4}-\d{4,7})/i; +const CVE_TITLE_RE = /CVE-(\d{4}-\d{4,7})/gi; +const WORKFLOW_LABEL = 'cve-fixer-automated'; + +// --------------- GitHub helpers --------------- + +function githubHeaders() { + const token = process.env.GITHUB_TOKEN; + if (!token) throw new Error('GITHUB_TOKEN is not set'); + return { + Authorization: `token ${token}`, + Accept: 'application/vnd.github.v3+json', + 'Content-Type': 'application/json', + 'User-Agent': 'cve-fixer-dashboard', + }; +} + +async function githubGet(url) { + const resp = await fetch(url, { headers: githubHeaders(), signal: AbortSignal.timeout(20000) }); + if (!resp.ok) throw new Error(`GET ${url}: ${resp.status} ${resp.statusText}`); + return resp.json(); +} + +async function githubGetAll(url) { + let results = []; + let page = 1; + while (true) { + const sep = url.includes('?') ? '&' : '?'; + const data = await githubGet(`${url}${sep}per_page=100&page=${page}`); + if (!Array.isArray(data) || data.length === 0) break; + results = results.concat(data); + if (data.length < 100) break; + page++; + } + return results; +} + +async function pushFileToRepo(filePath, content, message) { + const repo = process.env.WORKFLOW_REPO || 'angaduom/workflows'; + const url = `https://api.github.com/repos/${repo}/contents/${filePath}`; + + let sha; + try { + const existing = await githubGet(url); + sha = existing.sha; + } catch (_) { /* file doesn't exist yet */ } + + const body = { + message, + content: Buffer.from(JSON.stringify(content, null, 2)).toString('base64'), + }; + if (sha) body.sha = sha; + + const resp = await fetch(url, { + method: 'PUT', + headers: githubHeaders(), + body: JSON.stringify(body), + signal: AbortSignal.timeout(20000), + }); + + if (!resp.ok) { + const text = await resp.text(); + throw new Error(`PUT ${filePath}: ${resp.status} ${text}`); + } + console.log(`Pushed ${filePath} to ${repo}`); +} + +// --------------- Load mapping file --------------- + +async function loadMapping() { + const repo = process.env.WORKFLOW_REPO || 'angaduom/workflows'; + const url = `https://api.github.com/repos/${repo}/contents/workflows/cve-fixer/component-repository-mappings.json`; + const file = await githubGet(url); + const raw = Buffer.from(file.content, 'base64').toString('utf8'); + return JSON.parse(raw); +} + +// --------------- CVE ID extraction --------------- + +function extractCveId(text) { + const match = text.match(/CVE-(\d{4}-\d{4,7})/i); + return match ? `CVE-${match[1]}` : null; +} + +function isCvePr(pr) { + return CVE_BRANCH_RE.test(pr.head?.ref || '') || + CVE_TITLE_RE.test(pr.title || '') || + (pr.labels || []).some(l => l.name === WORKFLOW_LABEL); +} + +// --------------- Scan repos --------------- + +async function scanRepo(repoFullName, componentName) { + console.log(` Scanning ${repoFullName}...`); + const prs = []; + + try { + // Get all PRs (open + closed) with fix/cve-* branch pattern + const allPrs = await githubGetAll( + `https://api.github.com/repos/${repoFullName}/pulls?state=all` + ); + + for (const pr of allPrs) { + if (!isCvePr(pr)) continue; + + const cveId = extractCveId(pr.head?.ref || '') || extractCveId(pr.title || ''); + prs.push({ + number: pr.number, + title: pr.title, + url: pr.html_url, + state: pr.state, + merged: !!pr.merged_at, + merged_at: pr.merged_at ? pr.merged_at.substring(0, 10) : null, + created_at: pr.created_at ? pr.created_at.substring(0, 10) : null, + cve_id: cveId, + component: componentName, + repo: repoFullName, + branch: pr.head?.ref || '', + }); + } + } catch (err) { + console.warn(` Warning: could not scan ${repoFullName}: ${err.message}`); + } + + return prs; +} + +// --------------- Aggregate metrics --------------- + +function aggregate(allPrs, components) { + const uniqueCves = new Set(allPrs.map(p => p.cve_id).filter(Boolean)); + const mergedPrs = allPrs.filter(p => p.merged); + const openPrs = allPrs.filter(p => p.state === 'open'); + + // By component + const byComponent = {}; + for (const comp of components) { + const compPrs = allPrs.filter(p => p.component === comp); + const compCves = new Set(compPrs.map(p => p.cve_id).filter(Boolean)); + byComponent[comp] = { + total_prs: compPrs.length, + merged_prs: compPrs.filter(p => p.merged).length, + open_prs: compPrs.filter(p => p.state === 'open').length, + unique_cves: compCves.size, + cve_ids: [...compCves].sort(), + }; + } + + // Day-by-day timeline (PRs opened per day) + const byDate = {}; + for (const pr of allPrs) { + const date = pr.created_at; + if (!date) continue; + if (!byDate[date]) byDate[date] = { opened: 0, merged: 0 }; + byDate[date].opened++; + if (pr.merged && pr.merged_at) { + if (!byDate[pr.merged_at]) byDate[pr.merged_at] = { opened: 0, merged: 0 }; + byDate[pr.merged_at].merged++; + } + } + + const timeline = Object.entries(byDate) + .sort(([a], [b]) => a.localeCompare(b)) + .map(([date, counts]) => ({ date, ...counts })); + + return { + last_updated: new Date().toISOString(), + summary: { + total_prs: allPrs.length, + open_prs: openPrs.length, + merged_prs: mergedPrs.length, + unique_cves: uniqueCves.size, + components_onboarded: components.length, + cve_ids: [...uniqueCves].sort(), + }, + by_component: byComponent, + timeline, + all_prs: allPrs.sort((a, b) => (b.created_at || '').localeCompare(a.created_at || '')), + }; +} + +// --------------- main --------------- + +async function main() { + console.log('Loading component mapping...'); + const mapping = await loadMapping(); + const components = Object.keys(mapping.components); + console.log(`Found ${components.length} components: ${components.join(', ')}`); + + const allPrs = []; + for (const [compName, comp] of Object.entries(mapping.components)) { + console.log(`\nComponent: ${compName}`); + for (const repo of comp.repos || []) { + const repoFullName = repo.url.replace('https://github.com/', ''); + const prs = await scanRepo(repoFullName, compName); + allPrs.push(...prs); + console.log(` Found ${prs.length} CVE PRs`); + } + } + + console.log(`\nTotal CVE PRs found: ${allPrs.length}`); + + const data = aggregate(allPrs, components); + console.log(`Summary: ${data.summary.total_prs} PRs, ${data.summary.merged_prs} merged, ${data.summary.unique_cves} unique CVEs`); + + await pushFileToRepo( + 'workflows/cve-fixer/dashboard/public/data.json', + data, + `chore: update CVE fixer dashboard data [skip ci]` + ); + + console.log('\nDone!'); +} + +main().catch(err => { + console.error('Failed:', err.message); + process.exit(1); +}); diff --git a/workflows/cve-fixer/dashboard/scripts/package.json b/workflows/cve-fixer/dashboard/scripts/package.json new file mode 100644 index 0000000..d0b6cd3 --- /dev/null +++ b/workflows/cve-fixer/dashboard/scripts/package.json @@ -0,0 +1,12 @@ +{ + "name": "cve-fixer-dashboard-collector", + "version": "1.0.0", + "description": "Collects CVE fixer workflow metrics and pushes to dashboard", + "main": "collect-data.js", + "scripts": { + "collect": "node collect-data.js" + }, + "engines": { + "node": ">=18" + } +} From 511aab551ef0c03c2a3fb8667204387daeb7dc6e Mon Sep 17 00:00:00 2001 From: Vaishnavi-Modi Date: Wed, 15 Apr 2026 18:07:18 -0400 Subject: [PATCH 04/11] =?UTF-8?q?revert:=20remove=20dashboard=20=E2=80=94?= =?UTF-8?q?=20moving=20to=20angaduom/shepard=20repo=20instead?= MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Co-Authored-By: Claude Sonnet 4.6 (1M context) --- workflows/cve-fixer/dashboard/README.md | 28 -- .../cve-fixer/dashboard/public/index.html | 351 ------------------ .../dashboard/scripts/collect-data.js | 235 ------------ .../cve-fixer/dashboard/scripts/package.json | 12 - 4 files changed, 626 deletions(-) delete mode 100644 workflows/cve-fixer/dashboard/README.md delete mode 100644 workflows/cve-fixer/dashboard/public/index.html delete mode 100644 workflows/cve-fixer/dashboard/scripts/collect-data.js delete mode 100644 workflows/cve-fixer/dashboard/scripts/package.json diff --git a/workflows/cve-fixer/dashboard/README.md b/workflows/cve-fixer/dashboard/README.md deleted file mode 100644 index fa4c134..0000000 --- a/workflows/cve-fixer/dashboard/README.md +++ /dev/null @@ -1,28 +0,0 @@ -# CVE Fixer Dashboard - -Tracks fix PRs opened/merged, unique CVEs addressed, and components onboarded across all repos in the CVE fixer workflow. - -## Setup - -```bash -cd scripts -GITHUB_TOKEN= WORKFLOW_REPO=angaduom/workflows node collect-data.js -``` - -This scans all repos in `component-repository-mappings.json`, finds CVE fix PRs (branch `fix/cve-*` or title matching `CVE-YYYY-XXXXX`), and pushes `data.json` to the repo. - -## Serve locally - -```bash -cd public -python3 -m http.server 8080 -# open http://localhost:8080 -``` - -## GitHub Pages - -Enable GitHub Pages for the `workflows/cve-fixer/dashboard/public/` path to host the dashboard publicly. - -## Schedule - -Run `collect-data.js` on a schedule (daily cron, GitHub Actions) to keep the dashboard current. diff --git a/workflows/cve-fixer/dashboard/public/index.html b/workflows/cve-fixer/dashboard/public/index.html deleted file mode 100644 index 00d42bd..0000000 --- a/workflows/cve-fixer/dashboard/public/index.html +++ /dev/null @@ -1,351 +0,0 @@ - - - - - - CVE Fixer Dashboard - - - - - -
- - - - - -
-
Loading dashboard data...
- -
- -
- - - - diff --git a/workflows/cve-fixer/dashboard/scripts/collect-data.js b/workflows/cve-fixer/dashboard/scripts/collect-data.js deleted file mode 100644 index a9e7781..0000000 --- a/workflows/cve-fixer/dashboard/scripts/collect-data.js +++ /dev/null @@ -1,235 +0,0 @@ -#!/usr/bin/env node -/** - * collect-data.js - * Scans all repos in component-repository-mappings.json for CVE fix PRs, - * aggregates metrics, and pushes the resulting JSON to the workflow repo. - * - * Metrics collected: - * - Fix PRs opened (branch matches fix/cve-* or title matches Security: Fix CVE-*) - * - Fix PRs merged - * - Unique CVEs attempted - * - Per-component breakdown - * - Components onboarded - * - * Usage: - * GITHUB_TOKEN= WORKFLOW_REPO=angaduom/workflows node collect-data.js - */ - -const CVE_BRANCH_RE = /^fix\/cve-(\d{4}-\d{4,7})/i; -const CVE_TITLE_RE = /CVE-(\d{4}-\d{4,7})/gi; -const WORKFLOW_LABEL = 'cve-fixer-automated'; - -// --------------- GitHub helpers --------------- - -function githubHeaders() { - const token = process.env.GITHUB_TOKEN; - if (!token) throw new Error('GITHUB_TOKEN is not set'); - return { - Authorization: `token ${token}`, - Accept: 'application/vnd.github.v3+json', - 'Content-Type': 'application/json', - 'User-Agent': 'cve-fixer-dashboard', - }; -} - -async function githubGet(url) { - const resp = await fetch(url, { headers: githubHeaders(), signal: AbortSignal.timeout(20000) }); - if (!resp.ok) throw new Error(`GET ${url}: ${resp.status} ${resp.statusText}`); - return resp.json(); -} - -async function githubGetAll(url) { - let results = []; - let page = 1; - while (true) { - const sep = url.includes('?') ? '&' : '?'; - const data = await githubGet(`${url}${sep}per_page=100&page=${page}`); - if (!Array.isArray(data) || data.length === 0) break; - results = results.concat(data); - if (data.length < 100) break; - page++; - } - return results; -} - -async function pushFileToRepo(filePath, content, message) { - const repo = process.env.WORKFLOW_REPO || 'angaduom/workflows'; - const url = `https://api.github.com/repos/${repo}/contents/${filePath}`; - - let sha; - try { - const existing = await githubGet(url); - sha = existing.sha; - } catch (_) { /* file doesn't exist yet */ } - - const body = { - message, - content: Buffer.from(JSON.stringify(content, null, 2)).toString('base64'), - }; - if (sha) body.sha = sha; - - const resp = await fetch(url, { - method: 'PUT', - headers: githubHeaders(), - body: JSON.stringify(body), - signal: AbortSignal.timeout(20000), - }); - - if (!resp.ok) { - const text = await resp.text(); - throw new Error(`PUT ${filePath}: ${resp.status} ${text}`); - } - console.log(`Pushed ${filePath} to ${repo}`); -} - -// --------------- Load mapping file --------------- - -async function loadMapping() { - const repo = process.env.WORKFLOW_REPO || 'angaduom/workflows'; - const url = `https://api.github.com/repos/${repo}/contents/workflows/cve-fixer/component-repository-mappings.json`; - const file = await githubGet(url); - const raw = Buffer.from(file.content, 'base64').toString('utf8'); - return JSON.parse(raw); -} - -// --------------- CVE ID extraction --------------- - -function extractCveId(text) { - const match = text.match(/CVE-(\d{4}-\d{4,7})/i); - return match ? `CVE-${match[1]}` : null; -} - -function isCvePr(pr) { - return CVE_BRANCH_RE.test(pr.head?.ref || '') || - CVE_TITLE_RE.test(pr.title || '') || - (pr.labels || []).some(l => l.name === WORKFLOW_LABEL); -} - -// --------------- Scan repos --------------- - -async function scanRepo(repoFullName, componentName) { - console.log(` Scanning ${repoFullName}...`); - const prs = []; - - try { - // Get all PRs (open + closed) with fix/cve-* branch pattern - const allPrs = await githubGetAll( - `https://api.github.com/repos/${repoFullName}/pulls?state=all` - ); - - for (const pr of allPrs) { - if (!isCvePr(pr)) continue; - - const cveId = extractCveId(pr.head?.ref || '') || extractCveId(pr.title || ''); - prs.push({ - number: pr.number, - title: pr.title, - url: pr.html_url, - state: pr.state, - merged: !!pr.merged_at, - merged_at: pr.merged_at ? pr.merged_at.substring(0, 10) : null, - created_at: pr.created_at ? pr.created_at.substring(0, 10) : null, - cve_id: cveId, - component: componentName, - repo: repoFullName, - branch: pr.head?.ref || '', - }); - } - } catch (err) { - console.warn(` Warning: could not scan ${repoFullName}: ${err.message}`); - } - - return prs; -} - -// --------------- Aggregate metrics --------------- - -function aggregate(allPrs, components) { - const uniqueCves = new Set(allPrs.map(p => p.cve_id).filter(Boolean)); - const mergedPrs = allPrs.filter(p => p.merged); - const openPrs = allPrs.filter(p => p.state === 'open'); - - // By component - const byComponent = {}; - for (const comp of components) { - const compPrs = allPrs.filter(p => p.component === comp); - const compCves = new Set(compPrs.map(p => p.cve_id).filter(Boolean)); - byComponent[comp] = { - total_prs: compPrs.length, - merged_prs: compPrs.filter(p => p.merged).length, - open_prs: compPrs.filter(p => p.state === 'open').length, - unique_cves: compCves.size, - cve_ids: [...compCves].sort(), - }; - } - - // Day-by-day timeline (PRs opened per day) - const byDate = {}; - for (const pr of allPrs) { - const date = pr.created_at; - if (!date) continue; - if (!byDate[date]) byDate[date] = { opened: 0, merged: 0 }; - byDate[date].opened++; - if (pr.merged && pr.merged_at) { - if (!byDate[pr.merged_at]) byDate[pr.merged_at] = { opened: 0, merged: 0 }; - byDate[pr.merged_at].merged++; - } - } - - const timeline = Object.entries(byDate) - .sort(([a], [b]) => a.localeCompare(b)) - .map(([date, counts]) => ({ date, ...counts })); - - return { - last_updated: new Date().toISOString(), - summary: { - total_prs: allPrs.length, - open_prs: openPrs.length, - merged_prs: mergedPrs.length, - unique_cves: uniqueCves.size, - components_onboarded: components.length, - cve_ids: [...uniqueCves].sort(), - }, - by_component: byComponent, - timeline, - all_prs: allPrs.sort((a, b) => (b.created_at || '').localeCompare(a.created_at || '')), - }; -} - -// --------------- main --------------- - -async function main() { - console.log('Loading component mapping...'); - const mapping = await loadMapping(); - const components = Object.keys(mapping.components); - console.log(`Found ${components.length} components: ${components.join(', ')}`); - - const allPrs = []; - for (const [compName, comp] of Object.entries(mapping.components)) { - console.log(`\nComponent: ${compName}`); - for (const repo of comp.repos || []) { - const repoFullName = repo.url.replace('https://github.com/', ''); - const prs = await scanRepo(repoFullName, compName); - allPrs.push(...prs); - console.log(` Found ${prs.length} CVE PRs`); - } - } - - console.log(`\nTotal CVE PRs found: ${allPrs.length}`); - - const data = aggregate(allPrs, components); - console.log(`Summary: ${data.summary.total_prs} PRs, ${data.summary.merged_prs} merged, ${data.summary.unique_cves} unique CVEs`); - - await pushFileToRepo( - 'workflows/cve-fixer/dashboard/public/data.json', - data, - `chore: update CVE fixer dashboard data [skip ci]` - ); - - console.log('\nDone!'); -} - -main().catch(err => { - console.error('Failed:', err.message); - process.exit(1); -}); diff --git a/workflows/cve-fixer/dashboard/scripts/package.json b/workflows/cve-fixer/dashboard/scripts/package.json deleted file mode 100644 index d0b6cd3..0000000 --- a/workflows/cve-fixer/dashboard/scripts/package.json +++ /dev/null @@ -1,12 +0,0 @@ -{ - "name": "cve-fixer-dashboard-collector", - "version": "1.0.0", - "description": "Collects CVE fixer workflow metrics and pushes to dashboard", - "main": "collect-data.js", - "scripts": { - "collect": "node collect-data.js" - }, - "engines": { - "node": ">=18" - } -} From 4b734c99e4104046bbc94db48c6f810f8b1649c6 Mon Sep 17 00:00:00 2001 From: Vaishnavi-Modi Date: Thu, 16 Apr 2026 13:51:51 -0400 Subject: [PATCH 05/11] feat: add Observability, simplified mapping, PR labels and Jira ID tracking component-repository-mappings.json: - Apply simplified schema (repos[] instead of nested repositories/container_to_repo_mapping) - Remove unused components (AI Core Dashboard, Model Serving, Notebooks Images, AI Pipelines, Notebooks Server, Training Kubeflow) - Fix repo types: opendatahub-io=midstream, red-hat-data-services=downstream, others=upstream - Add Observability component (14 stolostron repos with ACM containers) from PR #103 converted to new simplified schema cve.fix.md: - Add --label cve-fixer-automated to every gh pr create call with graceful fallback if label doesn't exist in the target repo - Allow both plain and linked Jira issue IDs in PR body (both are fine) - Add note that Jira IDs are required for dashboard tracking Co-Authored-By: Claude Sonnet 4.6 (1M context) --- .../cve-fixer/.claude/commands/cve.fix.md | 20 +++++++++++++------ 1 file changed, 14 insertions(+), 6 deletions(-) diff --git a/workflows/cve-fixer/.claude/commands/cve.fix.md b/workflows/cve-fixer/.claude/commands/cve.fix.md index e3cd5c7..8000789 100644 --- a/workflows/cve-fixer/.claude/commands/cve.fix.md +++ b/workflows/cve-fixer/.claude/commands/cve.fix.md @@ -1173,12 +1173,13 @@ the fix requires additional changes beyond a version bump." - Risk assessment table - Links to CVE advisories - **Jira issue references**: List the extracted Jira issue IDs as plain text WITHOUT hyperlinks - - ✅ Correct: `Resolves: RHOAIENG-17794, RHOAIENG-16619, RHOAIENG-16616` - - ❌ Wrong: `Resolves: [RHOAIENG-17794](https://redhat.atlassian.net/browse/RHOAIENG-17794)` - - ❌ Wrong: `Multiple RHOAIENG issues for CVE-2024-21538 across different release branches` - - Do NOT create markdown links for Jira issues - - Do NOT use generic descriptions - list the ACTUAL issue IDs - - Just list the issue IDs separated by commas + - ✅ Correct (plain): `Resolves: RHOAIENG-17794, RHOAIENG-16619` + - ✅ Correct (linked): `Resolves: [RHOAIENG-17794](https://redhat.atlassian.net/browse/RHOAIENG-17794)` + - ✅ Correct for ACM: `Resolves: ACM-32577, ACM-32578` + - ❌ Wrong: `Multiple issues for CVE-2024-21538 across different release branches` (no IDs) + - ❌ Wrong: omitting Jira IDs entirely + - Always include the actual issue IDs — the dashboard scans PR bodies to correlate + PRs with CVEs, so missing IDs break tracking - **CREATE** the PR using GitHub CLI (with fallback to GitHub API): ```bash # Prepare PR body @@ -1241,9 +1242,16 @@ EOF ) PR_URL=$(gh pr create \ + --base \ + --title "Security: Fix CVE-YYYY-XXXXX ()" \ + --body "$PR_BODY" \ + --label "cve-fixer-automated" 2>/dev/null || \ + gh pr create \ --base \ --title "Security: Fix CVE-YYYY-XXXXX ()" \ --body "$PR_BODY") + # Note: --label silently fails if the label doesn't exist in the repo. + # The fallback without --label ensures PR is always created. # Enable automerge if --automerge flag was passed and PR was created successfully if [ "$AUTOMERGE" = "true" ] && [ -n "$PR_URL" ] && [ "$PR_URL" != "null" ]; then From 7a62e16083e9619ca146ab545d207d10e2f2f354 Mon Sep 17 00:00:00 2001 From: Vaishnavi-Modi Date: Thu, 16 Apr 2026 13:53:26 -0400 Subject: [PATCH 06/11] fix: use generic PROJ-XXXXX in Jira ID examples, not project-specific Co-Authored-By: Claude Sonnet 4.6 (1M context) --- workflows/cve-fixer/.claude/commands/cve.fix.md | 8 ++++---- 1 file changed, 4 insertions(+), 4 deletions(-) diff --git a/workflows/cve-fixer/.claude/commands/cve.fix.md b/workflows/cve-fixer/.claude/commands/cve.fix.md index 8000789..19bf401 100644 --- a/workflows/cve-fixer/.claude/commands/cve.fix.md +++ b/workflows/cve-fixer/.claude/commands/cve.fix.md @@ -1173,10 +1173,10 @@ the fix requires additional changes beyond a version bump." - Risk assessment table - Links to CVE advisories - **Jira issue references**: List the extracted Jira issue IDs as plain text WITHOUT hyperlinks - - ✅ Correct (plain): `Resolves: RHOAIENG-17794, RHOAIENG-16619` - - ✅ Correct (linked): `Resolves: [RHOAIENG-17794](https://redhat.atlassian.net/browse/RHOAIENG-17794)` - - ✅ Correct for ACM: `Resolves: ACM-32577, ACM-32578` - - ❌ Wrong: `Multiple issues for CVE-2024-21538 across different release branches` (no IDs) + - ✅ Correct (plain): `Resolves: PROJ-12345` + - ✅ Correct (linked): `Resolves: [PROJ-12345](https://redhat.atlassian.net/browse/PROJ-12345)` + - ✅ Multiple issues: `Resolves: PROJ-12345, PROJ-12346` (when the same CVE has multiple tickets) + - ❌ Wrong: generic description with no IDs - ❌ Wrong: omitting Jira IDs entirely - Always include the actual issue IDs — the dashboard scans PR bodies to correlate PRs with CVEs, so missing IDs break tracking From 6c662bfa7fcd55a34ec7803a6c53d397238f7c75 Mon Sep 17 00:00:00 2001 From: Vaishnavi-Modi Date: Thu, 16 Apr 2026 13:59:36 -0400 Subject: [PATCH 07/11] fix: add HTML marker to PR body so dashboard can detect workflow PRs without label If --label cve-fixer-automated fails (label doesn't exist in repo), in the PR body acts as a reliable fallback. The dashboard collector checks branch name, title, label, OR body marker. Co-Authored-By: Claude Sonnet 4.6 (1M context) --- workflows/cve-fixer/.claude/commands/cve.fix.md | 1 + 1 file changed, 1 insertion(+) diff --git a/workflows/cve-fixer/.claude/commands/cve.fix.md b/workflows/cve-fixer/.claude/commands/cve.fix.md index 19bf401..6c9808d 100644 --- a/workflows/cve-fixer/.claude/commands/cve.fix.md +++ b/workflows/cve-fixer/.claude/commands/cve.fix.md @@ -1238,6 +1238,7 @@ This PR fixes **CVE-YYYY-XXXXX** by upgrading from X.X.X to Y.Y.Y. --- 🤖 Generated by CVE Fixer Workflow + EOF ) From d3170e214b246fb60c7739d9cb8f57b0bfcda1eb Mon Sep 17 00:00:00 2001 From: Vaishnavi-Modi Date: Thu, 16 Apr 2026 18:35:40 -0400 Subject: [PATCH 08/11] feat: onboard command now checks if component is already onboarded MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit Three modes: - Mode A (fully onboarded): ask user if they want to add repos or just regenerate examples.md — skips mapping update if examples only - Mode B (partially onboarded): collect new repos, merge with existing component entry, regenerate examples for all repos - Mode C (new): full onboard flow as before Detects mode by checking component-repository-mappings.json before asking any questions. Co-Authored-By: Claude Sonnet 4.6 (1M context) --- .../cve-fixer/.claude/commands/onboard.md | 47 +++++++++++++++++-- 1 file changed, 44 insertions(+), 3 deletions(-) diff --git a/workflows/cve-fixer/.claude/commands/onboard.md b/workflows/cve-fixer/.claude/commands/onboard.md index 4dda3c6..e3185c4 100644 --- a/workflows/cve-fixer/.claude/commands/onboard.md +++ b/workflows/cve-fixer/.claude/commands/onboard.md @@ -8,14 +8,49 @@ to `ambient-code/workflows` containing both the mapping update and the guidance ## Process -1. **Collect Component Information** +1. **Determine Mode — Check Existing Mapping** + + Before asking any questions, load `component-repository-mappings.json` and check + whether the component is already onboarded: + + ```bash + # Find mapping file + if [ -f "component-repository-mappings.json" ]; then + MAPPING_FILE="component-repository-mappings.json" + elif [ -f "workflows/cve-fixer/component-repository-mappings.json" ]; then + MAPPING_FILE="workflows/cve-fixer/component-repository-mappings.json" + fi + + # Ask component name first + # Then check if it exists in the mapping + EXISTING=$(jq -r --arg name "$COMPONENT_NAME" \ + 'if .components[$name] then "found" else "not_found" end' "$MAPPING_FILE" 2>/dev/null) + ``` + + **Three modes based on what exists:** + + **Mode A — Component already fully onboarded** (`EXISTING == "found"` and user confirms repos are correct): + - Show existing repos from the mapping + - Ask: "Your repos are already mapped. Do you want to: (1) Add new repos (2) Just regenerate examples.md guidance files (3) Cancel?" + - If option 2: skip to Step 5 (Generate examples.md only) — no mapping change needed + - If option 1: continue to collect new repos to add + + **Mode B — Component exists, adding repos** (selected option 1 above): + - Collect only the NEW repos to add + - Merge with existing repos in the mapping + - Regenerate examples.md for all repos + + **Mode C — New component** (`EXISTING == "not_found"`): + - Full onboard flow — collect component name, repos, validate Jira, etc. + +1a. **Collect Component Information** (Modes B and C only) Ask the user for the following, one question at a time: a. **Jira component name** — must match exactly what appears in Jira (case-sensitive). Example: `"AI Evaluations"`, `"llm-d"`, `"AutoML"` - b. **Repos** — for each repo: + b. **Repos** — for each repo to add: - GitHub URL (e.g. `https://github.com/org/repo`) - Repo type: `upstream`, `midstream`, or `downstream` - Subcomponent name (optional — only if this component has multiple distinct @@ -142,6 +177,8 @@ to `ambient-code/workflows` containing both the mapping update and the guidance - If no containers discovered, omit the field (can be added later) - Omit `subcomponent` if not needed - Show the entry to the user: "Does this look correct? (yes/no/edit)" + - **Mode A (examples only)**: skip this step entirely — no mapping changes + - **Mode B (adding repos)**: merge new repos into existing component entry 5. **Generate `.cve-fix/examples.md` Guidance** @@ -239,13 +276,17 @@ to `ambient-code/workflows` containing both the mapping update and the guidance fi ``` -7. **Apply All Changes** +7. **Apply Changes (mode-dependent)** ```bash cd /tmp/workflows-onboard BRANCH_NAME="onboard/${COMPONENT_NAME_SLUG}" git checkout -b "$BRANCH_NAME" + # Mode A: examples.md only — skip mapping update entirely + # Mode B: merge new repos into existing component entry + # Mode C: add new component entry + MAPPING_FILE="workflows/cve-fixer/component-repository-mappings.json" # Add component to mapping file From f8efd9449e850337fa2f2d8cff7eccb22ac31c4f Mon Sep 17 00:00:00 2001 From: Vaishnavi-Modi Date: Thu, 16 Apr 2026 19:23:33 -0400 Subject: [PATCH 09/11] fix: examples.md PRs go to component repos, not ambient-code/workflows MIME-Version: 1.0 Content-Type: text/plain; charset=UTF-8 Content-Transfer-Encoding: 8bit The mapping update PR goes to ambient-code/workflows (correct). The .cve-fix/examples.md files go as separate PRs to each component repo (e.g. stolostron/multicluster-observability-operator), not to the workflows repo. Two separate PRs created per /onboard run: 1. ambient-code/workflows ← mapping update only 2. Each component repo ← .cve-fix/examples.md only Co-Authored-By: Claude Sonnet 4.6 (1M context) --- .../cve-fixer/.claude/commands/onboard.md | 83 +++++++++++++++---- 1 file changed, 68 insertions(+), 15 deletions(-) diff --git a/workflows/cve-fixer/.claude/commands/onboard.md b/workflows/cve-fixer/.claude/commands/onboard.md index e3185c4..61ca001 100644 --- a/workflows/cve-fixer/.claude/commands/onboard.md +++ b/workflows/cve-fixer/.claude/commands/onboard.md @@ -316,26 +316,16 @@ to `ambient-code/workflows` containing both the mapping update and the guidance python3 -m json.tool "$MAPPING_FILE" > /dev/null && echo "✅ JSON valid" git add "$MAPPING_FILE" - # Add .cve-fix/examples.md for each repo - for i in "${!REPO_URLS[@]}"; do - REPO_FULL=$(echo "${REPO_URLS[$i]}" | sed 's|https://github.com/||') - EXAMPLES_DIR="workflows/cve-fixer/.cve-fix/$(echo "$REPO_FULL" | tr '/' '-')" - mkdir -p "$EXAMPLES_DIR" - echo "${GENERATED_EXAMPLES[$i]}" > "${EXAMPLES_DIR}/examples.md" - git add "${EXAMPLES_DIR}/examples.md" - done - git commit -m "feat: onboard ${COMPONENT_NAME} to CVE fixer workflow - - Add ${COMPONENT_NAME} to component-repository-mappings.json - - Generate .cve-fix/examples.md guidance for each repo + Add ${COMPONENT_NAME} to component-repository-mappings.json Co-Authored-By: Claude Sonnet 4.6 (1M context) " git push "$REMOTE" "$BRANCH_NAME" ``` -8. **Create Pull Request** +8. **Create PR to `ambient-code/workflows`** (mapping update only) ```bash gh pr create \ @@ -360,12 +350,72 @@ to `ambient-code/workflows` containing both the mapping update and the guidance - [ ] Verify Jira component name matches exactly - [ ] Verify repo URLs and active branch names - [ ] Add container image names if missing - - [ ] Review generated examples.md files 🤖 Generated by /onboard" ``` -9. **Cleanup** +9. **Open separate PRs to each component repo** with `.cve-fix/examples.md` + + The guidance files go to the COMPONENT repos themselves, not to `ambient-code/workflows`. + For each repo in the component: + + ```bash + for i in "${!REPO_URLS[@]}"; do + REPO_FULL=$(echo "${REPO_URLS[$i]}" | sed 's|https://github.com/||') + REPO_DIR="/tmp/onboard-${REPO_FULL//\//-}" + + # Check write access / fork if needed + PUSH_ACCESS=$(gh api repos/${REPO_FULL} --jq '.permissions.push' 2>/dev/null) + FORK_USER=$(gh api user --jq '.login' 2>/dev/null) + + if [ "$PUSH_ACCESS" != "true" ]; then + gh repo fork "$REPO_FULL" --clone=false 2>/dev/null || true + gh repo sync "${FORK_USER}/$(echo $REPO_FULL | cut -d/ -f2)" --source "$REPO_FULL" --branch main + git clone "https://github.com/${FORK_USER}/$(echo $REPO_FULL | cut -d/ -f2).git" "$REPO_DIR" + REPO_REMOTE="origin" + PR_HEAD="${FORK_USER}:add-cve-fix-guidance" + else + git clone "https://github.com/${REPO_FULL}.git" "$REPO_DIR" + REPO_REMOTE="origin" + PR_HEAD="add-cve-fix-guidance" + fi + + cd "$REPO_DIR" + git checkout -b add-cve-fix-guidance + mkdir -p .cve-fix + echo "${GENERATED_EXAMPLES[$i]}" > .cve-fix/examples.md + git add .cve-fix/examples.md + git commit -m "chore: add CVE fixer guidance file + +Generated by /onboard — teaches the CVE fixer workflow how to create +fix PRs matching this repo's conventions. + +Co-Authored-By: Claude Sonnet 4.6 (1M context) " + git push "$REPO_REMOTE" add-cve-fix-guidance + + gh pr create \ + --repo "$REPO_FULL" \ + --base main \ + --head "$PR_HEAD" \ + --title "chore: add .cve-fix/examples.md guidance for CVE fixer workflow" \ + --body "Adds \`.cve-fix/examples.md\` so the CVE fixer workflow knows how to +create fix PRs matching this repo's conventions (branch naming, files that +change together, co-upgrades, etc.). + +Generated by \`/onboard\` based on analysis of ${CVE_COUNT} merged CVE PRs. + +🤖 Generated by /onboard" + + cd /tmp + rm -rf "$REPO_DIR" + done + ``` + + **This is separate from the workflows PR** — each component repo gets its own PR + with just the `.cve-fix/examples.md` file. The reviewer merges it into their repo, + and the CVE fixer will use it automatically on the next run. + +10. **Cleanup** ```bash rm -rf /tmp/workflows-onboard @@ -383,4 +433,7 @@ to `ambient-code/workflows` containing both the mapping update and the guidance - Branch info is auto-discovered from GitHub — review and correct if needed - Container image names can be added later by editing the mapping or re-running `/onboard` - Generated `.cve-fix/examples.md` improves over time — run `/guidance.update` after more CVE PRs are merged -- Fork of `ambient-code/workflows` is created automatically if you lack write access +- **Two separate PRs are created**: + 1. PR to `ambient-code/workflows` — adds the component to the mapping file + 2. Separate PRs to each component repo — adds `.cve-fix/examples.md` guidance files +- Fork of the target repo is created automatically if you lack write access to it From 4dab8d12aa75f12c1131d1ee44d953b0d2234deb Mon Sep 17 00:00:00 2001 From: Vaishnavi-Modi Date: Thu, 16 Apr 2026 19:28:29 -0400 Subject: [PATCH 10/11] fix: address all CodeRabbit comments on PR 108 cve.find.md: - Update subcomponent jq query for new simplified schema: .repos[].containers instead of container_to_repo_mapping/repositories cve.fix.md: - Fix misleading comment: --label exits non-zero (not silent), fallback exists for that reason; 2>/dev/null only suppresses the label error onboard.md: - Use __ as directory separator (not -) to avoid org/repo-name vs org-repo/name collision ambiguity - Use printf '%s\n' instead of echo for writing generated markdown (echo interprets backslashes, corrupts code fences and regexes) - Make co-author attribution version-agnostic: Claude instead of Claude Sonnet 4.6 (1M context) component-repository-mappings.json: - All repos already have correct types (no unknown values exist) Co-Authored-By: Claude --- workflows/cve-fixer/.claude/commands/cve.find.md | 10 +++++----- workflows/cve-fixer/.claude/commands/cve.fix.md | 5 +++-- workflows/cve-fixer/.claude/commands/onboard.md | 8 ++++---- 3 files changed, 12 insertions(+), 11 deletions(-) diff --git a/workflows/cve-fixer/.claude/commands/cve.find.md b/workflows/cve-fixer/.claude/commands/cve.find.md index 7dd87d6..14c6f51 100644 --- a/workflows/cve-fixer/.claude/commands/cve.find.md +++ b/workflows/cve-fixer/.claude/commands/cve.find.md @@ -148,12 +148,12 @@ Report: artifacts/cve-fixer/find/cve-issues-20260226-145018.md # Append subcomponent filter if provided if [ -n "$SUBCOMPONENT" ] && [ -n "$MAPPING_FILE" ] && [ -f "$MAPPING_FILE" ]; then - # Reverse lookup: find ALL containers whose primary repo has matching subcomponent + # Reverse lookup: find ALL containers on repos with matching subcomponent (new schema) PSCOMPONENTS=$(jq -r --arg comp "$COMPONENT_NAME" --arg sub "$SUBCOMPONENT" ' - .components[$comp] as $c | - $c.container_to_repo_mapping | to_entries[] | - select($c.repositories[.value].subcomponent == $sub) | - "pscomponent:" + .key + .components[$comp].repos[] | + select(.subcomponent == $sub) | + .containers[]? | + "pscomponent:" + . ' "$MAPPING_FILE") if [ -n "$PSCOMPONENTS" ]; then diff --git a/workflows/cve-fixer/.claude/commands/cve.fix.md b/workflows/cve-fixer/.claude/commands/cve.fix.md index 6c9808d..27d1d28 100644 --- a/workflows/cve-fixer/.claude/commands/cve.fix.md +++ b/workflows/cve-fixer/.claude/commands/cve.fix.md @@ -1251,8 +1251,9 @@ EOF --base \ --title "Security: Fix CVE-YYYY-XXXXX ()" \ --body "$PR_BODY") - # Note: --label silently fails if the label doesn't exist in the repo. - # The fallback without --label ensures PR is always created. + # Note: gh pr create --label exits non-zero if the label doesn't exist. + # The fallback (without --label) ensures PR is always created even if labelling fails. + # 2>/dev/null suppresses the label-not-found error from the first attempt. # Enable automerge if --automerge flag was passed and PR was created successfully if [ "$AUTOMERGE" = "true" ] && [ -n "$PR_URL" ] && [ "$PR_URL" != "null" ]; then diff --git a/workflows/cve-fixer/.claude/commands/onboard.md b/workflows/cve-fixer/.claude/commands/onboard.md index 61ca001..8253dd5 100644 --- a/workflows/cve-fixer/.claude/commands/onboard.md +++ b/workflows/cve-fixer/.claude/commands/onboard.md @@ -320,7 +320,7 @@ to `ambient-code/workflows` containing both the mapping update and the guidance Add ${COMPONENT_NAME} to component-repository-mappings.json - Co-Authored-By: Claude Sonnet 4.6 (1M context) " + Co-Authored-By: Claude " git push "$REMOTE" "$BRANCH_NAME" ``` @@ -362,7 +362,7 @@ to `ambient-code/workflows` containing both the mapping update and the guidance ```bash for i in "${!REPO_URLS[@]}"; do REPO_FULL=$(echo "${REPO_URLS[$i]}" | sed 's|https://github.com/||') - REPO_DIR="/tmp/onboard-${REPO_FULL//\//-}" + REPO_DIR="/tmp/onboard-${REPO_FULL//\//__}" # Check write access / fork if needed PUSH_ACCESS=$(gh api repos/${REPO_FULL} --jq '.permissions.push' 2>/dev/null) @@ -383,14 +383,14 @@ to `ambient-code/workflows` containing both the mapping update and the guidance cd "$REPO_DIR" git checkout -b add-cve-fix-guidance mkdir -p .cve-fix - echo "${GENERATED_EXAMPLES[$i]}" > .cve-fix/examples.md + printf '%s\n' "${GENERATED_EXAMPLES[$i]}" > .cve-fix/examples.md git add .cve-fix/examples.md git commit -m "chore: add CVE fixer guidance file Generated by /onboard — teaches the CVE fixer workflow how to create fix PRs matching this repo's conventions. -Co-Authored-By: Claude Sonnet 4.6 (1M context) " +Co-Authored-By: Claude " git push "$REPO_REMOTE" add-cve-fix-guidance gh pr create \ From 894193c9714e9f92e956ab3c3a7189370f2004cb Mon Sep 17 00:00:00 2001 From: Vaishnavi-Modi Date: Thu, 16 Apr 2026 19:31:53 -0400 Subject: [PATCH 11/11] fix: update cve.fix.md to use new simplified mapping schema - Step 3.1: look up container in repos[].containers[] (not container_to_repo_mapping) - Step 3.2: iterate .components[X].repos[] (not repositories object) - Example JSON updated from old nested structure to new flat repos[] array Co-Authored-By: Claude --- .../cve-fixer/.claude/commands/cve.fix.md | 55 ++++++++----------- 1 file changed, 24 insertions(+), 31 deletions(-) diff --git a/workflows/cve-fixer/.claude/commands/cve.fix.md b/workflows/cve-fixer/.claude/commands/cve.fix.md index 27d1d28..2ed87d2 100644 --- a/workflows/cve-fixer/.claude/commands/cve.fix.md +++ b/workflows/cve-fixer/.claude/commands/cve.fix.md @@ -132,22 +132,21 @@ Summary: **3.1: Use container to scope repos (preferred)** If a `CONTAINER` was extracted in Step 1: - - Look up `CONTAINER` in `container_to_repo_mapping` for the component - - **If container not found in mapping**: + - Search all repos in `.components[COMPONENT].repos[]` for one whose `.containers[]` includes `CONTAINER` + - **If container not found**: - Log a warning: "⚠️ Container [CONTAINER] not in mapping — may be a new container not yet registered. Processing all component repos." - Fall back to processing all repos in the component (scan in Step 5 filters irrelevant ones) - - **If container found**: gives the **primary repo** (e.g., `opendatahub-io/workload-variant-autoscaler`) - - Check if the primary repo has a `subcomponent` field in the `repositories` section + - **If container found**: note which repo it belongs to, read its `subcomponent` field - **If `subcomponent` is defined**: collect all repos in the component with the same `subcomponent` value — this is the chain (upstream + midstream + downstream) - - **If `subcomponent` is not defined**: process ALL repositories in the component (safe fallback — the CVE scan in Step 5 will filter out repos where the CVE doesn't exist) + - **If `subcomponent` is not defined**: process ALL repos in the component (safe fallback — the CVE scan in Step 5 will filter out repos where the CVE doesn't exist) - **This ensures only the repos relevant to that specific container get PRs** — not repos belonging to other subcomponents - Example: `rhoai/odh-workload-variant-autoscaler-controller-rhel9` → primary repo `opendatahub-io/workload-variant-autoscaler` → `subcomponent: autoscaler` → only process `llm-d/llm-d-workload-variant-autoscaler`, `opendatahub-io/workload-variant-autoscaler`, `red-hat-data-services/workload-variant-autoscaler`. + Example: `rhoai/odh-workload-variant-autoscaler-controller-rhel9` found in repo with `subcomponent: autoscaler` → only process `llm-d/llm-d-workload-variant-autoscaler`, `opendatahub-io/workload-variant-autoscaler`, `red-hat-data-services/workload-variant-autoscaler`. **3.2: Fallback — use all repos** If no `CONTAINER` was extracted (summary doesn't match expected pattern): - - Process ALL repositories listed under the component + - Process all entries in `.components[COMPONENT].repos[]` - The CVE scan in Step 5 acts as the safety net — it will skip repos where the CVE doesn't exist - Log a warning: "⚠️ Could not extract container from summary — processing all component repos" @@ -1485,37 +1484,31 @@ After completing this phase: - Filter the repository list to only those that contain the CVE - **Multi-Repository Support**: A single component can map to MULTIPLE repositories - Common pattern: an **upstream** repo (e.g., `opendatahub-io/models-as-a-service`) and one or more **downstream** repos (e.g., `red-hat-data-services/models-as-a-service`) - - Each repository has its own `default_branch`, `cve_fix_workflow`, and `repo_type` - - The `repo_type` field can be `"upstream"` or `"downstream"` to indicate the relationship - - When fixing CVEs, iterate through ALL repositories for the component and apply fixes to each one independently - - Downstream repos often track different branches (e.g., `rhoai-3.0`) than upstream (`main`) - - Each repository gets its own clone directory, feature branch, verification, test run, and PR -- **Mapping File Structure**: + - Each repo entry has its own `default_branch`, `active_branches`, and `type` + - The `type` field is `"upstream"`, `"midstream"`, or `"downstream"` + - When fixing CVEs, iterate through ALL repos for the component and apply fixes to each one independently + - Downstream repos often track different branches (e.g., `rhoai-3.4`) than upstream (`main`) + - Each repo gets its own clone directory, feature branch, verification, test run, and PR +- **Mapping File Structure** (simplified schema): ```json { "components": { "Component Name": { - "container_to_repo_mapping": { ... }, - "repositories": { - "org/repo-upstream": { + "repos": [ + { + "url": "https://github.com/org/upstream-repo", + "type": "upstream", "default_branch": "main", - "active_release_branches": [...], - "cve_fix_workflow": { - "primary_target": "main", - "backport_targets": "..." - }, - "repo_type": "upstream" + "active_branches": ["release-0.6"], + "containers": ["rhoai/odh-container-rhel9"] }, - "org/repo-downstream": { - "default_branch": "rhoai-3.0", - "active_release_branches": ["rhoai-3.0"], - "cve_fix_workflow": { - "primary_target": "rhoai-3.0", - "backport_targets": "rhoai-3.0" - }, - "repo_type": "downstream" + { + "url": "https://github.com/org/downstream-repo", + "type": "downstream", + "default_branch": "main", + "active_branches": ["rhoai-3.4", "rhoai-3.4-ea.2"] } - } + ] } } }