  AI Platform Core Components
  AIPCC-11363

Enable Standalone Garak Package on the Red Hat AI Python Index for Independent AI Safety Testing



      Problem Statement

      Currently, Garak (the AI model vulnerability scanning and red-teaming framework) is only consumable within Red Hat AI through the llama-stack-provider-trustyai-garak container image, which runs as a KFP (Kubeflow Pipelines) component inside the Llama Stack distribution. While this integrated approach works well for the EvalHub/Chatterbox workflow (RHAISTRAT-1178), it prevents users from leveraging Garak as a standalone tool for independent security testing outside of KFP or Llama Stack.

      Field teams, security engineers, and customers need the ability to run pip install garak directly from the Red Hat AI Python Index and execute Garak evaluations independently — without requiring a full Llama Stack deployment or KFP pipeline setup. This is critical for:

      • Security-first workflows: Security teams often need to evaluate models before they are deployed into Llama Stack or any orchestration layer.
      • CI/CD integration: Teams want to incorporate Garak scans into their existing CI/CD pipelines without taking a dependency on KFP.
      • Notebook-based evaluation: Data scientists and AI engineers working in RHOAI workbenches should be able to pip install garak and run evaluations directly in their notebooks.
      • Air-gapped environments: Customers in regulated industries need trusted, supply-chain-verified packages available from the Red Hat AI Python Index rather than pulling from upstream PyPI.
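      For the CI/CD and notebook workflows above, a standalone install reduces to pip install garak followed by a CLI invocation. The sketch below is a hypothetical helper (not part of Garak) that assembles such an invocation for a CI job or notebook cell; --model_type, --model_name, and --probes are Garak's documented CLI flags, but the wrapper itself is illustrative:

```python
import shlex

# Hypothetical helper: build a standalone Garak CLI invocation that a
# CI job or notebook cell can hand to subprocess.run(). No Llama Stack
# or KFP involved -- just the pip-installed garak package.
def build_garak_cmd(model_type: str, model_name: str, probes: list[str]) -> list[str]:
    return [
        "python", "-m", "garak",
        "--model_type", model_type,    # e.g. "huggingface", "openai"
        "--model_name", model_name,
        "--probes", ",".join(probes),  # comma-separated probe list
    ]

cmd = build_garak_cmd("huggingface", "gpt2", ["encoding", "promptinject"])
print(shlex.join(cmd))
```

      The wrapper only builds the argument list, so it can be unit-tested and reused in pipelines without a model or GPU present.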

      Business Alignment

      Garak maps directly to both the OWASP LLM Top 10 and the AVID (AI Vulnerability Database) taxonomy, making it a core component of Red Hat AI's safety story. Providing it as a standalone, trusted package:

      • Strengthens Red Hat's positioning as a leader in enterprise AI safety
      • Enables broader adoption beyond users who have deployed Llama Stack
      • Aligns with the AI Safety Starter Kit initiative to make safety tools accessible and well-documented
      • Supports the principle that safety controls should be platform-native and easy to adopt, not buried behind complex deployment prerequisites

      Current Situation

      • Garak from NVIDIA is already an approved wheel on the RH AI Python Index (AIPCC-10008).
      • The TrustyAI midstream fork with Chatterbox security testing logic exists at trustyai-explainability/garak (branch: automated-red-teaming).
      • For Summit 2026, Garak execution is scoped to KFP pipelines via the container image (RHAIRFE-1310).
      • Post-summit, we want to make the standalone package available with Chatterbox capabilities included.

      Proposed Solution

      1. Onboard the midstream Garak fork (with Chatterbox security testing capabilities) as a standalone Python package on the Red Hat AI Python Index, so users can run pip install garak from the trusted index.
      2. Ensure the standalone package includes the Chatterbox taxonomy and attack strategies developed in the midstream fork.
      3. Provide documentation and examples for running Garak independently — in notebooks, CI/CD pipelines, and from the command line — without requiring Llama Stack or KFP.
      4. Maintain compatibility with the KFP-based workflow so both consumption paths (standalone and integrated) coexist.
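      For the notebook path in step 3, a run's findings could be summarized directly from Garak's JSONL report output. This is a sketch under stated assumptions: the field names ("entry_type", "probe", "passed", "total") are illustrative and should be checked against the report schema the shipped package actually emits.

```python
import json
from collections import defaultdict

# Sketch: roll a Garak JSONL report up into per-probe pass rates.
# Field names ("entry_type", "probe", "passed", "total") are assumed
# for illustration; verify against the package's real report format.
def summarize_report(jsonl_lines):
    totals = defaultdict(lambda: {"passed": 0, "total": 0})
    for line in jsonl_lines:
        entry = json.loads(line)
        if entry.get("entry_type") != "eval":
            continue  # skip config/init entries in the report stream
        probe = entry["probe"]
        totals[probe]["passed"] += entry["passed"]
        totals[probe]["total"] += entry["total"]
    return {p: c["passed"] / c["total"] for p, c in totals.items() if c["total"]}

sample = [
    json.dumps({"entry_type": "eval", "probe": "encoding", "passed": 9, "total": 10}),
    json.dumps({"entry_type": "eval", "probe": "promptinject", "passed": 4, "total": 8}),
]
print(summarize_report(sample))  # {'encoding': 0.9, 'promptinject': 0.5}
```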

      Success Criteria

      • Users can run pip install garak from the Red Hat AI Python Index and get a working, supply-chain-verified package with Chatterbox capabilities.
      • The standalone package can probe models for vulnerabilities mapped to OWASP LLM Top 10 and AVID taxonomy without requiring Llama Stack or KFP infrastructure.
      • Documentation exists for standalone usage (CLI, notebook, CI/CD integration).
      • The existing KFP/Llama Stack integration continues to work alongside the standalone package.
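      To make the CI/CD criterion concrete, a pipeline could gate deployment on per-probe pass rates from a standalone scan. The gate below is purely illustrative (the threshold and helper are not part of Garak); it shows the shape of wiring a scan result into an existing CI job via an exit code:

```python
# Hypothetical CI gate: fail the pipeline when any probe's pass rate
# drops below a team-chosen threshold. Not a Garak API -- only a sketch
# of plugging a standalone scan into existing CI/CD.
def gate(pass_rates: dict[str, float], threshold: float = 0.8) -> int:
    failures = {p: r for p, r in pass_rates.items() if r < threshold}
    for probe, rate in sorted(failures.items()):
        print(f"FAIL {probe}: pass rate {rate:.0%} < {threshold:.0%}")
    return 1 if failures else 0  # nonzero exit code fails the CI job

exit_code = gate({"encoding": 0.9, "promptinject": 0.5})
print("exit:", exit_code)  # exit: 1
```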


              Assignee: Unassigned
              Reporter: Adel Zaalouk (azaalouk)
              Votes: 0
              Watchers: 1
