diff --git a/workflows/cve-fixer/.claude/commands/cve.find.md b/workflows/cve-fixer/.claude/commands/cve.find.md
index 5087acc6..f30e71b4 100644
--- a/workflows/cve-fixer/.claude/commands/cve.find.md
+++ b/workflows/cve-fixer/.claude/commands/cve.find.md
@@ -31,10 +31,11 @@ Report: artifacts/cve-fixer/find/cve-issues-20260226-145018.md
 ## Process

 1. **Parse Arguments and Flags**
-   - Parse the command arguments for both the component name and optional flags
+   - Parse the command arguments for the component name, an optional subcomponent, and optional flags
    - **Supported flags:**
      - `--ignore-resolved` — Exclude issues with Jira status "Resolved" from results
-   - The component name is the first argument that is not a flag
+   - The component name is the first non-flag argument
+   - The subcomponent is the second non-flag argument (optional)
    - If component is not provided, ask the user to type the component name
      - **IMPORTANT**: Let the user type the component name freely as text input
      - **DO NOT** provide multiple-choice options or suggestions

@@ -42,6 +43,14 @@ Report: artifacts/cve-fixer/find/cve-issues-20260226-145018.md
   - Store the `--ignore-resolved` flag as a boolean for use in step 3
+
+  **Examples:**
+  ```bash
+  /cve.find llm-d                               # all llm-d CVEs
+  /cve.find llm-d autoscaler                    # only autoscaler CVEs
+  /cve.find llm-d autoscaler --ignore-resolved
+  /cve.find "AI Evaluations" trustyai-ragas
+  ```
+
 2.
**Check JIRA API Token (REQUIRED - User Setup)**
   - **This is the ONLY thing the user must configure manually before proceeding**

@@ -120,6 +129,28 @@ Report: artifacts/cve-fixer/find/cve-issues-20260226-145018.md
   # Build JQL query
   JQL="component = \"${COMPONENT_NAME}\" AND summary ~ \"CVE*\" AND labels = SecurityTracking"

+  # Append subcomponent filter if provided
+  if [ -n "$SUBCOMPONENT" ] && [ -n "$MAPPING_FILE" ] && [ -f "$MAPPING_FILE" ]; then
+    # Reverse lookup: find ALL containers whose primary repo has a matching subcomponent
+    PSCOMPONENTS=$(jq -r --arg comp "$COMPONENT_NAME" --arg sub "$SUBCOMPONENT" '
+      .components[$comp] as $c |
+      $c.container_to_repo_mapping | to_entries[] |
+      select($c.repositories[.value].subcomponent == $sub) |
+      "pscomponent:" + .key
+    ' "$MAPPING_FILE")
+
+    if [ -n "$PSCOMPONENTS" ]; then
+      # Build the OR clause for all matching containers. Note: paste -sd
+      # cycles through delimiter CHARACTERS one at a time, so it cannot join
+      # with the multi-character string " OR "; join with awk instead.
+      LABEL_FILTERS=$(echo "$PSCOMPONENTS" | \
+        awk '{printf "%slabels = \"%s\"", sep, $0; sep=" OR "}')
+      JQL="${JQL} AND (${LABEL_FILTERS})"
+      echo "Filtering by subcomponent '${SUBCOMPONENT}': ${PSCOMPONENTS}"
+    else
+      echo "⚠️ Subcomponent '${SUBCOMPONENT}' not found in mapping for '${COMPONENT_NAME}' — running without subcomponent filter"
+    fi
+  fi
+
   # Append resolved filter if --ignore-resolved flag was provided
   if [ "$IGNORE_RESOLVED" = "true" ]; then
     JQL="${JQL} AND status not in (\"Resolved\")"

diff --git a/workflows/cve-fixer/.claude/commands/cve.fix.md b/workflows/cve-fixer/.claude/commands/cve.fix.md
index 4de528a6..d2bd3de6 100644
--- a/workflows/cve-fixer/.claude/commands/cve.fix.md
+++ b/workflows/cve-fixer/.claude/commands/cve.fix.md
@@ -40,6 +40,7 @@ Summary:
   - Fetch the issue details from Jira API
   - Extract CVE ID from the issue summary
   - Extract component name from the issue
+  - Extract container name from the issue summary (see below)
   - Proceed with this single CVE
   - Skip the `/cve.find` output lookup

@@ -49,10 +50,34 @@ Summary:
   - Extract CVE IDs with their status from the
     markdown file
   - Filter for CVEs where `Status: Open` (unfixed vulnerabilities)
   - Extract component name from the find output (e.g., "AI Core Dashboard")
+  - Extract container name from each issue summary (see below)
   - Collect ALL open CVEs (no filtering)
   - Proceed with all open CVEs found
-  - **Result**: A list of CVEs to fix with their associated Jira issues and components
+
+  **Extracting container and package from Jira summary (both options)**
+
+  Jira summaries follow the pattern:
+  ```
+  CVE-YYYY-XXXXX <container>: <package>: <description>
+  ```
+  Example:
+  ```
+  CVE-2025-66418 rhoai/odh-llm-d-routing-sidecar-rhel9: urllib3: Unbounded decompression
+  ```
+
+  Parse each summary to extract:
+  - **Container name**: the `rhoai/odh-*-rhel9` token (or similar) between the CVE ID and the first colon
+  - **Package name**: the token after the first colon (e.g., `urllib3`, `grpc-go`, `aiohttp`)
+
+  ```bash
+  SUMMARY="CVE-2025-66418 rhoai/odh-llm-d-routing-sidecar-rhel9: urllib3: Unbounded decompression"
+  # grep -P rejects variable-length lookbehinds such as (?<=CVE-[0-9]+-[0-9]+ ),
+  # so anchor the match with \K (which resets the match start) instead.
+  CONTAINER=$(echo "$SUMMARY" | grep -oP 'CVE-[0-9]+-[0-9]+ \K[\w/.-]+(?=:)')
+  PACKAGE=$(echo "$SUMMARY" | grep -oP '(?<=: )[\w.@/_-]+(?=:)')
+  ```
+
+  Store `CONTAINER` and `PACKAGE` per CVE for use in Steps 3 and 5.
+
+  **Result**: A list of CVEs to fix with their associated Jira issues, components, containers, and package names

 2. **Load Component-Repository Mapping**
   - Use `component-repository-mappings.json` from workspace root

@@ -92,23 +117,42 @@ Summary:
   - Proceed with mapped repository as documented below

 3.
**Identify Target Repositories**
-   - Get list of ALL repositories from the mapping for the component
-   - **IMPORTANT**: A single component may map to MULTIPLE repositories (e.g., an upstream repo and one or more downstream repos)
-   - Each repository entry may have a `repo_type` field indicating `"upstream"` or `"downstream"`
-   - For each repository, gather:
-     - Repository name (e.g., "opendatahub-io/odh-dashboard")
-     - Default branch (e.g., "main")
-     - Active release branches (e.g., ["v2.29.0-fixes", "v2.28.0-fixes", "rhoai-3.0"])
-     - Primary target branch for CVE fixes (from `cve_fix_workflow.primary_target`)
-     - Backport targets from cve_fix_workflow
-     - Repository type (monorepo vs single package)
-     - Repo type: upstream or downstream (from `repo_type` field, defaults to upstream if absent)
-   - Create initial list of ALL candidate repositories for the fix
-   - **Multi-repo strategy**: When a component has both upstream and downstream repos:
-     - Fix upstream first, then apply the same fix to downstream repos
-     - Each repo gets its own clone, branch, PR, and verification cycle
-     - The fix in downstream repos may be a cherry-pick or re-application of the upstream fix
-   - Steps 4 through 11 are repeated for EACH repository in the list
+
+   **3.1: Use container to scope repos (preferred)**
+
+   If a `CONTAINER` was extracted in Step 1:
+   - Look up `CONTAINER` in `container_to_repo_mapping` for the component
+   - **If container not found in mapping**:
+     - Log a warning: "⚠️ Container [CONTAINER] not in mapping — may be a new container not yet registered. Processing all component repos."
+     - Fall back to processing all repos in the component (the scan in Step 5 filters out irrelevant ones)
+   - **If container found**: the mapping yields the **primary repo** (e.g., `opendatahub-io/workload-variant-autoscaler`)
+     - Check if the primary repo has a `subcomponent` field in the `repositories` section
+     - **If `subcomponent` is defined**: collect all repos in the component with the same `subcomponent` value — this is the chain (upstream + midstream + downstream)
+     - **If `subcomponent` is not defined**: process ALL repositories in the component (safe fallback — the CVE scan in Step 5 will filter out repos where the CVE doesn't exist)
+   - **This ensures only the repos relevant to that specific container get PRs** — not repos belonging to other subcomponents
+
+   Example: `rhoai/odh-workload-variant-autoscaler-controller-rhel9` → primary repo `opendatahub-io/workload-variant-autoscaler` → `subcomponent: autoscaler` → only process `llm-d/llm-d-workload-variant-autoscaler`, `opendatahub-io/workload-variant-autoscaler`, and `red-hat-data-services/workload-variant-autoscaler`.
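The 3.1 lookup above can be sketched with `jq`. This is a minimal sketch, not the workflow's actual implementation: the inlined sample mapping, component, and container values are illustrative, while the field names (`container_to_repo_mapping`, `repositories`, `subcomponent`) follow `component-repository-mappings.json` as extended in this diff.

```shell
# Step 3.1 sketch: container -> primary repo -> subcomponent -> repo chain.
# Sample mapping inlined for illustration; real runs read the mapping file.
MAPPING_FILE=$(mktemp)
cat > "$MAPPING_FILE" <<'EOF'
{"components": {"llm-d": {
  "container_to_repo_mapping": {
    "rhoai/odh-workload-variant-autoscaler-controller-rhel9": "opendatahub-io/workload-variant-autoscaler"
  },
  "repositories": {
    "llm-d/llm-d-workload-variant-autoscaler": {"subcomponent": "autoscaler"},
    "opendatahub-io/workload-variant-autoscaler": {"subcomponent": "autoscaler"},
    "red-hat-data-services/workload-variant-autoscaler": {"subcomponent": "autoscaler"},
    "red-hat-data-services/llm-d-routing-sidecar": {"subcomponent": "routing-sidecar"}
  }
}}}
EOF

COMPONENT="llm-d"
CONTAINER="rhoai/odh-workload-variant-autoscaler-controller-rhel9"

# Container -> primary repo (empty string if the container is unmapped)
PRIMARY=$(jq -r --arg c "$COMPONENT" --arg k "$CONTAINER" \
  '.components[$c].container_to_repo_mapping[$k] // empty' "$MAPPING_FILE")

if [ -n "$PRIMARY" ]; then
  # Primary repo -> subcomponent (empty string if not defined)
  SUB=$(jq -r --arg c "$COMPONENT" --arg r "$PRIMARY" \
    '.components[$c].repositories[$r].subcomponent // empty' "$MAPPING_FILE")
fi

if [ -n "${SUB:-}" ]; then
  # Collect the whole chain sharing that subcomponent
  REPOS=$(jq -r --arg c "$COMPONENT" --arg s "$SUB" \
    '.components[$c].repositories | to_entries[]
     | select(.value.subcomponent == $s) | .key' "$MAPPING_FILE")
else
  # Fallback: no container match or no subcomponent -> all component repos
  REPOS=$(jq -r --arg c "$COMPONENT" \
    '.components[$c].repositories | keys[]' "$MAPPING_FILE")
fi

echo "$REPOS"
rm -f "$MAPPING_FILE"
```

Running this prints only the three `autoscaler` repos; the `routing-sidecar` repo is excluded, which is exactly the scoping 3.1 describes.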
+
+   **3.2: Fallback — use all repos**
+
+   If no `CONTAINER` was extracted (summary doesn't match the expected pattern):
+   - Process ALL repositories listed under the component
+   - The CVE scan in Step 5 acts as the safety net — it will skip repos where the CVE doesn't exist
+   - Log a warning: "⚠️ Could not extract container from summary — processing all component repos"
+
+   **3.3: For each target repo, gather:**
+   - Repository name (e.g., "opendatahub-io/odh-dashboard")
+   - Default branch (e.g., "main")
+   - Active release branches (e.g., ["v2.29.0-fixes", "v2.28.0-fixes", "rhoai-3.0"])
+   - Primary target branch for CVE fixes (from `cve_fix_workflow.primary_target`)
+   - Backport targets from `cve_fix_workflow`
+   - Repository type (monorepo vs single package)
+   - Repo type: upstream, midstream, or downstream (from the `repo_type` field; defaults to upstream if absent)
+
+   **Multi-repo strategy**: When a container chain has upstream, midstream, and downstream repos:
+   - Fix upstream first, then apply the same fix to midstream and downstream
+   - Each repo gets its own clone, branch, PR, and verification cycle
+   - Steps 4 through 11 are repeated for EACH repository in the list

 4. **Clone or Use Existing Repository**
   - Always use `/tmp` for repository operations with unique dirs per repo

@@ -234,18 +278,47 @@ Summary:
   **5.2: Analyze Scan Results**
   - Check if the target CVE appears in the scan results
-  - **If CVE has already been fixed (not present in scan results)**:
-    - **DO NOT create a PR** — the vulnerability is already resolved
+  - **If CVE found in scan** → proceed with the fix (confirmed vulnerable)
+  - **If CVE NOT found in scan** → do NOT skip immediately; instead run Step 5.2.1 below
+
+  **5.2.1: Package version check (when scan does not find CVE)**
+
+  Container-level CVEs may not be detected by source-level scanners because the vulnerable
+  package may be installed via RPM, a transitive dependency, or a base image layer rather
+  than declared directly in the manifest.
If the scan returns no result, check the package
+  version directly:
+
+  ```bash
+  # Use PACKAGE extracted from the Jira summary in Step 1 (e.g., "urllib3", "grpc-go")
+
+  # Python — check requirements files
+  grep -i "${PACKAGE}" requirements*.txt setup.py pyproject.toml 2>/dev/null
+
+  # Go — check go.mod
+  grep -i "${PACKAGE}" go.mod 2>/dev/null
+
+  # Node — check package.json
+  grep -i "${PACKAGE}" package.json 2>/dev/null
+  ```
+
+  **Interpret results:**
+  - **Package found at a version** → compare against the CVE's affected version range
+    - If the version is in the affected range → proceed with the fix
+    - If the version is already patched → mark as already fixed (see below)
+  - **Package not found in any manifest** → it may be transitive or RPM-installed
+    - **Do NOT blindly add a direct dependency** — this can cause version conflicts or unnecessary bloat
+    - Instead, document the situation and create a PR with guidance:
+      - **Go**: bump the transitive module with `go get module@fixed-version`, or pin it via a `replace` directive in go.mod — do so only if intentional
+      - **Python**: adding to requirements.txt may conflict with what pip resolves transitively; prefer updating the parent package that pulls it in
+      - **Node**: use npm `overrides` to force a safe version without adding a direct dep
+    - Include a note in the PR: "⚠️ Package not found directly in manifests — may be a transitive or RPM-installed dependency. Manual review required to confirm the right fix approach."
+  - **Both scan AND version check find nothing** → mark as already fixed:
+    - **DO NOT create a PR**
+    - **Print to stdout**: "✅ CVE-YYYY-XXXXX is already fixed in [repository] ([branch]). No action needed."
-  - **Document in artifacts**: Create a brief note in `artifacts/cve-fixer/fixes/already-fixed-CVE-YYYY-XXXXX.md` with:
-    - CVE ID
-    - Repository and branch checked
-    - Scan results showing CVE is not present
-    - Timestamp of verification
-    - Note that Jira ticket may need manual closure
-  - **Move to next CVE**: Skip all remaining steps for this CVE and proceed to the next one
-  - **Note**: The Jira ticket may still be open — this is an issue management task, not a code fix task
-  - Only proceed with remaining steps for CVEs that are confirmed as current vulnerabilities in the scan
+    - **Document in artifacts**: `artifacts/cve-fixer/fixes/already-fixed-CVE-YYYY-XXXXX.md`
+    - **Note**: Jira ticket may need manual closure
+
+  - Only skip the CVE entirely when BOTH the scan AND the direct package check find no evidence of the vulnerability

  **5.3: Check for Existing Open PRs**

diff --git a/workflows/cve-fixer/component-repository-mappings.json b/workflows/cve-fixer/component-repository-mappings.json
index 62e6fca1..2ea8163a 100644
--- a/workflows/cve-fixer/component-repository-mappings.json
+++ b/workflows/cve-fixer/component-repository-mappings.json
@@ -60,7 +60,8 @@ }, "build_location": "maas-api/", "notes": "Upstream repository. Contains maas-api Go application. Builds using Dockerfile.konflux for Red Hat builds.", - "repo_type": "upstream" + "repo_type": "upstream", + "subcomponent": "maas-api" }, "red-hat-data-services/models-as-a-service": { "github_url": "https://github.com/red-hat-data-services/models-as-a-service", @@ -78,7 +79,8 @@ }, "build_location": "maas-api/", "notes": "Downstream Red Hat release repository for maas-api.
Fixes from upstream should be backported to rhoai-3.0 branch.", - "repo_type": "downstream" + "repo_type": "downstream", + "subcomponent": "maas-api" } } }, @@ -394,7 +396,8 @@ "cve_fix_workflow": { "primary_target": "main", "backport_targets": "release-0.5, release-0.6" - } + }, + "subcomponent": "inference-scheduler" }, "opendatahub-io/llm-d-inference-scheduler": { "github_url": "https://github.com/opendatahub-io/llm-d-inference-scheduler", @@ -410,7 +413,8 @@ "cve_fix_workflow": { "primary_target": "main", "backport_targets": "release-0.2, release-0.3.1, release-v0.4, stable-2.x" - } + }, + "subcomponent": "inference-scheduler" }, "red-hat-data-services/llm-d-inference-scheduler": { "github_url": "https://github.com/red-hat-data-services/llm-d-inference-scheduler", @@ -426,7 +430,8 @@ "cve_fix_workflow": { "primary_target": "main", "backport_targets": "rhoai-3.3, rhoai-3.4, rhoai-3.4-ea.1, rhoai-3.4-ea.2" - } + }, + "subcomponent": "inference-scheduler" }, "red-hat-data-services/llm-d-routing-sidecar": { "github_url": "https://github.com/red-hat-data-services/llm-d-routing-sidecar", @@ -442,7 +447,8 @@ "cve_fix_workflow": { "primary_target": "main", "backport_targets": "rhoai-2.25, rhoai-3.0, rhoai-3.2" - } + }, + "subcomponent": "routing-sidecar" }, "llm-d-incubation/batch-gateway": { "github_url": "https://github.com/llm-d-incubation/batch-gateway", @@ -453,7 +459,8 @@ "cve_fix_workflow": { "primary_target": "main", "backport_targets": "None" - } + }, + "subcomponent": "batch-gateway" }, "opendatahub-io/batch-gateway": { "github_url": "https://github.com/opendatahub-io/batch-gateway", @@ -466,7 +473,8 @@ "cve_fix_workflow": { "primary_target": "main", "backport_targets": "release-v0.5" - } + }, + "subcomponent": "batch-gateway" }, "red-hat-data-services/batch-gateway": { "github_url": "https://github.com/red-hat-data-services/batch-gateway", @@ -481,7 +489,8 @@ "cve_fix_workflow": { "primary_target": "main", "backport_targets": "rhoai-3.4, rhoai-3.4-ea.1, 
rhoai-3.4-ea.2" - } + }, + "subcomponent": "batch-gateway" }, "llm-d/llm-d-workload-variant-autoscaler": { "github_url": "https://github.com/llm-d/llm-d-workload-variant-autoscaler", @@ -494,7 +503,8 @@ "cve_fix_workflow": { "primary_target": "main", "backport_targets": "release-0.4.2" - } + }, + "subcomponent": "autoscaler" }, "opendatahub-io/workload-variant-autoscaler": { "github_url": "https://github.com/opendatahub-io/workload-variant-autoscaler", @@ -507,7 +517,8 @@ "cve_fix_workflow": { "primary_target": "main", "backport_targets": "release-v0.5" - } + }, + "subcomponent": "autoscaler" }, "red-hat-data-services/workload-variant-autoscaler": { "github_url": "https://github.com/red-hat-data-services/workload-variant-autoscaler", @@ -522,14 +533,18 @@ "cve_fix_workflow": { "primary_target": "main", "backport_targets": "rhoai-3.4, rhoai-3.4-ea.1, rhoai-3.4-ea.2" - } + }, + "subcomponent": "autoscaler" } } }, "AI Evaluations": { "container_to_repo_mapping": { "rhoai/odh-ta-lmes-driver-rhel9": "opendatahub-io/trustyai-service-operator", - "rhoai/odh-ta-lmes-job-rhel9": "opendatahub-io/lm-evaluation-harness" + "rhoai/odh-ta-lmes-job-rhel9": "opendatahub-io/lm-evaluation-harness", + "rhoai/odh-trustyai-ragas-lls-provider-dsp-rhel9": "opendatahub-io/llama-stack-provider-ragas", + "rhoai/odh-eval-hub-rhel9": "opendatahub-io/eval-hub", + "rhoai/odh-trustyai-garak-lls-provider-dsp-rhel9": "opendatahub-io/llama-stack-provider-trustyai-garak" }, "repositories": { "eval-hub/eval-hub": { @@ -541,7 +556,8 @@ "cve_fix_workflow": { "primary_target": "main", "backport_targets": "None" - } + }, + "subcomponent": "eval-hub" }, "eval-hub/eval-hub-sdk": { "github_url": "https://github.com/eval-hub/eval-hub-sdk", @@ -553,7 +569,8 @@ "cve_fix_workflow": { "primary_target": "main", "backport_targets": "None" - } + }, + "subcomponent": "eval-hub-sdk" }, "eval-hub/eval-hub-contrib": { "github_url": "https://github.com/eval-hub/eval-hub-contrib", @@ -565,7 +582,8 @@ "cve_fix_workflow": { 
"primary_target": "main", "backport_targets": "None" - } + }, + "subcomponent": "eval-hub-contrib" }, "trustyai-explainability/llama-stack-provider-trustyai-garak": { "github_url": "https://github.com/trustyai-explainability/llama-stack-provider-trustyai-garak", @@ -576,7 +594,8 @@ "cve_fix_workflow": { "primary_target": "main", "backport_targets": "None" - } + }, + "subcomponent": "trustyai-garak" }, "trustyai-explainability/trustyai-service-operator": { "github_url": "https://github.com/trustyai-explainability/trustyai-service-operator", @@ -590,7 +609,8 @@ "cve_fix_workflow": { "primary_target": "main", "backport_targets": "release/1.37.0, release/1.38.0" - } + }, + "subcomponent": "trustyai-service-operator" }, "opendatahub-io/eval-hub": { "github_url": "https://github.com/opendatahub-io/eval-hub", @@ -604,7 +624,8 @@ "cve_fix_workflow": { "primary_target": "main", "backport_targets": "release/odh-3.4, stable" - } + }, + "subcomponent": "eval-hub" }, "opendatahub-io/lm-evaluation-harness": { "github_url": "https://github.com/opendatahub-io/lm-evaluation-harness", @@ -620,7 +641,8 @@ "cve_fix_workflow": { "primary_target": "main", "backport_targets": "release/odh-3.3, release/odh-3.4, release/odh-3.4-ea2, release/odh-3.5" - } + }, + "subcomponent": "lm-evaluation-harness" }, "opendatahub-io/llama-stack-provider-trustyai-garak": { "github_url": "https://github.com/opendatahub-io/llama-stack-provider-trustyai-garak", @@ -634,7 +656,8 @@ "cve_fix_workflow": { "primary_target": "main", "backport_targets": "release/odh-3.4, stable" - } + }, + "subcomponent": "trustyai-garak" }, "opendatahub-io/trustyai-service-operator": { "github_url": "https://github.com/opendatahub-io/trustyai-service-operator", @@ -649,7 +672,8 @@ "cve_fix_workflow": { "primary_target": "main", "backport_targets": "release/odh-3.3, release/odh-3.4, release/odh-3.4-ea2" - } + }, + "subcomponent": "trustyai-service-operator" }, "red-hat-data-services/eval-hub": { "github_url": 
"https://github.com/red-hat-data-services/eval-hub", @@ -664,7 +688,8 @@ "cve_fix_workflow": { "primary_target": "main", "backport_targets": "rhoai-3.4, rhoai-3.4-ea.1, rhoai-3.4-ea.2" - } + }, + "subcomponent": "eval-hub" }, "red-hat-data-services/lm-evaluation-harness": { "github_url": "https://github.com/red-hat-data-services/lm-evaluation-harness", @@ -680,7 +705,8 @@ "cve_fix_workflow": { "primary_target": "main", "backport_targets": "rhoai-3.3, rhoai-3.4, rhoai-3.4-ea.1, rhoai-3.4-ea.2" - } + }, + "subcomponent": "lm-evaluation-harness" }, "red-hat-data-services/llama-stack-provider-trustyai-garak": { "github_url": "https://github.com/red-hat-data-services/llama-stack-provider-trustyai-garak", @@ -696,7 +722,8 @@ "cve_fix_workflow": { "primary_target": "main", "backport_targets": "rhoai-3.3, rhoai-3.4, rhoai-3.4-ea.1, rhoai-3.4-ea.2" - } + }, + "subcomponent": "trustyai-garak" }, "red-hat-data-services/trustyai-service-operator": { "github_url": "https://github.com/red-hat-data-services/trustyai-service-operator", @@ -712,6 +739,152 @@ "cve_fix_workflow": { "primary_target": "main", "backport_targets": "rhoai-3.3, rhoai-3.4, rhoai-3.4-ea.1, rhoai-3.4-ea.2" + }, + "subcomponent": "trustyai-service-operator" + }, + "trustyai-explainability/llama-stack-provider-ragas": { + "github_url": "https://github.com/trustyai-explainability/llama-stack-provider-ragas", + "default_branch": "main", + "active_release_branches": [ + "release/0.4.x", + "release/0.5.x" + ], + "branch_strategy": "Fix in main. 
Release branches follow pattern release/X.Y.x.", + "repo_type": "upstream", + "subcomponent": "trustyai-ragas", + "cve_fix_workflow": { + "primary_target": "main", + "backport_targets": "release/0.4.x, release/0.5.x" + } + }, + "opendatahub-io/llama-stack-provider-ragas": { + "github_url": "https://github.com/opendatahub-io/llama-stack-provider-ragas", + "default_branch": "main", + "active_release_branches": [ + "release/odh-3.3", + "release/odh-3.4-ea2", + "stable" + ], + "branch_strategy": "Fork of upstream trustyai-explainability/llama-stack-provider-ragas. Release branches follow pattern release/odh-X.Y.", + "repo_type": "midstream", + "subcomponent": "trustyai-ragas", + "cve_fix_workflow": { + "primary_target": "main", + "backport_targets": "release/odh-3.3, release/odh-3.4-ea2, stable" + } + }, + "red-hat-data-services/llama-stack-provider-ragas": { + "github_url": "https://github.com/red-hat-data-services/llama-stack-provider-ragas", + "default_branch": "main", + "active_release_branches": [ + "rhoai-3.3", + "rhoai-3.4", + "rhoai-3.4-ea.1", + "rhoai-3.4-ea.2" + ], + "branch_strategy": "Fork of midstream. RHOAI release branches follow pattern rhoai-X.Y.", + "repo_type": "downstream", + "subcomponent": "trustyai-ragas", + "cve_fix_workflow": { + "primary_target": "main", + "backport_targets": "rhoai-3.3, rhoai-3.4, rhoai-3.4-ea.1, rhoai-3.4-ea.2" + } + } + } + }, + "AutoML": { + "container_to_repo_mapping": { + "managed-open-data-hub/odh-automl-rhel9": "red-hat-data-services/pipelines-components" + }, + "repositories": { + "kubeflow/pipelines-components": { + "github_url": "https://github.com/kubeflow/pipelines-components", + "default_branch": "main", + "active_release_branches": [], + "branch_strategy": "Fix in main. 
No formal release branching documented.", + "repo_type": "upstream", + "cve_fix_workflow": { + "primary_target": "main", + "backport_targets": "None" + } + }, + "opendatahub-io/pipelines-components": { + "github_url": "https://github.com/opendatahub-io/pipelines-components", + "default_branch": "main", + "active_release_branches": [], + "branch_strategy": "Fork of upstream kubeflow/pipelines-components.", + "repo_type": "midstream", + "cve_fix_workflow": { + "primary_target": "main", + "backport_targets": "None" + } + }, + "red-hat-data-services/pipelines-components": { + "github_url": "https://github.com/red-hat-data-services/pipelines-components", + "default_branch": "main", + "active_release_branches": [ + "rhoai-3.4" + ], + "branch_strategy": "Fork of midstream. RHOAI release branches follow pattern rhoai-X.Y.", + "repo_type": "downstream", + "notes": "Monorepo containing both AutoML (components/automl/) and AutoRAG (components/autorag/) components.", + "cve_fix_workflow": { + "primary_target": "main", + "backport_targets": "rhoai-3.4" + } + } + } + }, + "AutoRAG": { + "container_to_repo_mapping": {}, + "repositories": { + "kubeflow/pipelines-components": { + "github_url": "https://github.com/kubeflow/pipelines-components", + "default_branch": "main", + "active_release_branches": [], + "branch_strategy": "Fix in main. 
No formal release branching documented.", + "repo_type": "upstream", + "cve_fix_workflow": { + "primary_target": "main", + "backport_targets": "None" + } + }, + "opendatahub-io/pipelines-components": { + "github_url": "https://github.com/opendatahub-io/pipelines-components", + "default_branch": "main", + "active_release_branches": [], + "branch_strategy": "Fork of upstream kubeflow/pipelines-components.", + "repo_type": "midstream", + "cve_fix_workflow": { + "primary_target": "main", + "backport_targets": "None" + } + }, + "red-hat-data-services/pipelines-components": { + "github_url": "https://github.com/red-hat-data-services/pipelines-components", + "default_branch": "main", + "active_release_branches": [ + "rhoai-3.4" + ], + "branch_strategy": "Fork of midstream. RHOAI release branches follow pattern rhoai-X.Y.", + "repo_type": "downstream", + "notes": "Monorepo containing both AutoML (components/automl/) and AutoRAG (components/autorag/) components.", + "cve_fix_workflow": { + "primary_target": "main", + "backport_targets": "rhoai-3.4" + } + }, + "IBM/ai4rag": { + "github_url": "https://github.com/IBM/ai4rag", + "default_branch": "main", + "active_release_branches": [], + "branch_strategy": "Python package upstream. CVEs in ai4rag manifest as container CVEs in pipelines-components \u2014 fix by updating ai4rag version there.", + "repo_type": "upstream", + "notes": "No containerization \u2014 distributed as a Python package. No ODH/RHDS forks exist. Excluded from automation; track upstream releases and update dependency version in pipelines-components.", + "cve_fix_workflow": { + "primary_target": "main", + "backport_targets": "N/A", + "excluded_from_automation": true } } }
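The "compare against the CVE's affected version range" check in Step 5.2.1 of cve.fix.md can be sketched with `sort -V`. A minimal sketch under stated assumptions: `check_vulnerable` is a hypothetical helper (not part of the workflow), the version values are illustrative, and `sort -V` orders only plain dotted versions; pre-release tags like `1.0.0-rc1` would need a real version parser.

```shell
# Hypothetical helper for Step 5.2.1: is the installed version older than
# the version the CVE is fixed in? Relies on GNU sort -V version ordering.
check_vulnerable() {
  installed="$1"; fixed_in="$2"
  if [ "$installed" = "$fixed_in" ]; then
    echo "already fixed"
    return
  fi
  # The version that sorts first is the older of the two
  oldest=$(printf '%s\n%s\n' "$installed" "$fixed_in" | sort -V | head -n1)
  if [ "$oldest" = "$installed" ]; then
    echo "vulnerable"
  else
    echo "already fixed"
  fi
}

check_vulnerable "1.26.5" "2.5.0"   # urllib3-style versions (illustrative)
check_vulnerable "2.5.0" "2.5.0"
```

Per 5.2.1, "vulnerable" means proceed with the fix, while "already fixed" routes to the already-fixed path (no PR, artifact note, possible manual Jira closure).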