Network Edge / NE-2050

Support for Gateway API Inference Extension


    • Product / Portfolio Work
    • OCPSTRAT-1757 Support for Gateway API Inference Extensions
    • 0% To Do, 0% In Progress, 100% Done

      The purpose of this task is to deliver support for the Gateway API Inference Extension (GIE) as part of the platform-level Gateway API implementation, in order to meet the requirements of Red Hat OpenShift AI (RHOAI) 3.0.

      Specifically, this means we need to update the platform-level OSSM version in the Cluster Ingress Operator (CIO) to a version that includes this support, and enable the "ENABLE_GATEWAY_API_INFERENCE_EXTENSION" Pilot environment variable to turn the feature on.
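
      As a rough illustration only (not the actual CIO code), the Go sketch below shows how a controller could ensure the feature flag is set on the istiod Deployment. The helper name and the assumption that Pilot runs in the container named "discovery" are illustrative assumptions.

        // Sketch: ensure the GIE feature flag is set on the istiod Deployment.
        // The helper name and the "discovery" container lookup are assumptions,
        // not the actual cluster-ingress-operator implementation.
        package gatewayapi

        import (
            appsv1 "k8s.io/api/apps/v1"
            corev1 "k8s.io/api/core/v1"
        )

        func ensureInferenceExtensionEnv(istiod *appsv1.Deployment) {
            const flag = "ENABLE_GATEWAY_API_INFERENCE_EXTENSION"
            for i := range istiod.Spec.Template.Spec.Containers {
                c := &istiod.Spec.Template.Spec.Containers[i]
                if c.Name != "discovery" { // istiod's Pilot container
                    continue
                }
                // Update the flag in place if it already exists...
                for j, e := range c.Env {
                    if e.Name == flag {
                        c.Env[j].Value = "true"
                        return
                    }
                }
                // ...otherwise append it.
                c.Env = append(c.Env, corev1.EnvVar{Name: flag, Value: "true"})
            }
        }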

      This is a deliverable on top of OSSM-9656, which blocks this issue.

      Additional Notes

      The CRD resources such as InferencePool are NOT in scope for this issue; RHOAI will be managing those themselves, at least for GA, so as to maintain flexibility and control at the outset. We will revisit whether to move those CRDs into core later, as they mature, but that is not part of this effort.
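
      Because RHOAI owns the InferencePool CRDs, any platform-side logic that depends on them would need to check for their presence rather than assume it. The sketch below is a hypothetical example using controller-runtime; the CRD name (and the inference.networking.x-k8s.io group it assumes) may differ between GIE releases.

        // Sketch: detect whether the InferencePool CRD has been installed
        // (e.g. by RHOAI) before acting on resources that reference it.
        // The CRD name below assumes the GIE v1alpha2 API group and is
        // illustrative, not a confirmed constant in the operator.
        package gatewayapi

        import (
            "context"

            apiextensionsv1 "k8s.io/apiextensions-apiserver/pkg/apis/apiextensions/v1"
            apierrors "k8s.io/apimachinery/pkg/api/errors"
            "k8s.io/apimachinery/pkg/types"
            "sigs.k8s.io/controller-runtime/pkg/client"
        )

        func inferencePoolCRDExists(ctx context.Context, c client.Client) (bool, error) {
            crd := &apiextensionsv1.CustomResourceDefinition{}
            err := c.Get(ctx, types.NamespacedName{Name: "inferencepools.inference.networking.x-k8s.io"}, crd)
            if apierrors.IsNotFound(err) {
                return false, nil
            }
            return err == nil, err
        }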

              Miciah Masters (mmasters1@redhat.com)
              Shane Utt (rh-ee-sutt)
              Ishmam Amin
              Jesse Dohmann
