Issue Type: Epic
Resolution: Done
Priority: Blocker
Fix Version/s: 4.19.z, 4.20
Summary: Support for Gateway API Inference Extension
Work Type: Product / Portfolio Work
Progress: 0% To Do, 0% In Progress, 100% Done
The purpose of this task is to deliver support for the Gateway API Inference Extension (GIE) as part of the platform-level Gateway API implementation, in order to meet the requirements of Red Hat OpenShift AI (RHOAI) 3.0.
Specifically, this means updating the platform-level OSSM version in the Cluster Ingress Operator (CIO) to a version that includes this support, and enabling the ENABLE_GATEWAY_API_INFERENCE_EXTENSION Pilot environment variable to turn the feature on.
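As a rough illustration of where that flag lives, the sketch below sets the environment variable on the Istio control plane (Pilot) using upstream Helm-style values. The resource kind, API version, and field path here are assumptions for illustration; the actual wiring the CIO ships with OSSM may differ.

```yaml
# Hypothetical sketch: enabling the Gateway API Inference Extension
# on the Istio control plane (Pilot). Field paths follow upstream
# Istio Helm values; the actual OSSM/CIO integration may differ.
apiVersion: sailoperator.io/v1
kind: Istio
metadata:
  name: default
spec:
  values:
    pilot:
      env:
        ENABLE_GATEWAY_API_INFERENCE_EXTENSION: "true"
```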
This is a deliverable on top of OSSM-9656, which blocks this issue.
Additional Notes
The CRD resources such as InferencePool are NOT in scope for this issue; RHOAI will manage those themselves, at least for GA, so as to maintain flexibility and control at the outset. We will revisit whether to move those CRDs into core later, as they mature, but that is not part of this effort.
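For context only (since the CRDs themselves are out of scope here), an InferencePool in the upstream inference extension looks roughly like the following. The API version, field names, and all names shown are illustrative, tracking the upstream alpha API, and may have changed.

```yaml
# Out-of-scope context: a hypothetical InferencePool as defined by the
# upstream Gateway API Inference Extension (alpha API; fields may change).
apiVersion: inference.networking.x-k8s.io/v1alpha2
kind: InferencePool
metadata:
  name: example-pool            # illustrative name
spec:
  targetPortNumber: 8000        # port the model-serving pods listen on
  selector:
    app: example-model-server   # labels selecting the serving pods
  extensionRef:
    name: example-endpoint-picker  # endpoint-picker extension service
```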