Type: Task
Resolution: Done
Priority: Major
Sprint: RHDHPAI Sprint 3270
Task Description
Once we get ODH / Kubeflow / KServe inference service deployment and reconciliation working reliably again with RHDHPAI-575, this task will cap at a day or two of work: take the CLI usage from the RHDHPAI-64 demo from Dec 2024 and
- replicate it in a controller
- bypass GitHub and use our AI plugin from RHDHPAI-506 and RHDHPAI-505 to push the model metadata (a rough sketch follows below)
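To make the controller step concrete, here is a minimal, hypothetical sketch (not the actual RHDHPAI controller): a controller-runtime reconciler that watches KServe InferenceServices and POSTs basic model metadata to the bridge's Model Catalog endpoint instead of committing catalog-info.yaml to GitHub. The bridge URL, endpoint path, and payload fields are assumptions for illustration only.

```go
// Hypothetical sketch only: watch KServe InferenceServices and push basic
// model metadata to an assumed Model Catalog bridge endpoint.
package main

import (
	"bytes"
	"context"
	"encoding/json"
	"net/http"

	"k8s.io/apimachinery/pkg/apis/meta/v1/unstructured"
	"k8s.io/apimachinery/pkg/runtime/schema"
	ctrl "sigs.k8s.io/controller-runtime"
	"sigs.k8s.io/controller-runtime/pkg/client"
)

// GVK for KServe inference services; watched as unstructured to avoid
// importing the KServe Go types in this sketch.
var isvcGVK = schema.GroupVersionKind{
	Group: "serving.kserve.io", Version: "v1beta1", Kind: "InferenceService",
}

type reconciler struct {
	client.Client
	bridgeURL string // assumed Model Catalog REST endpoint, not the real path
}

func (r *reconciler) Reconcile(ctx context.Context, req ctrl.Request) (ctrl.Result, error) {
	isvc := &unstructured.Unstructured{}
	isvc.SetGroupVersionKind(isvcGVK)
	if err := r.Get(ctx, req.NamespacedName, isvc); err != nil {
		return ctrl.Result{}, client.IgnoreNotFound(err)
	}

	// Minimal payload; the real schema comes from RHDHPAI-506 / RHDHPAI-505
	// and the bridge design doc.
	payload, err := json.Marshal(map[string]string{
		"name":      isvc.GetName(),
		"namespace": isvc.GetNamespace(),
	})
	if err != nil {
		return ctrl.Result{}, err
	}
	resp, err := http.Post(r.bridgeURL, "application/json", bytes.NewReader(payload))
	if err != nil {
		return ctrl.Result{}, err
	}
	resp.Body.Close()
	return ctrl.Result{}, nil
}

func main() {
	mgr, err := ctrl.NewManager(ctrl.GetConfigOrDie(), ctrl.Options{})
	if err != nil {
		panic(err)
	}
	watched := &unstructured.Unstructured{}
	watched.SetGroupVersionKind(isvcGVK)
	if err := ctrl.NewControllerManagedBy(mgr).
		For(watched).
		Complete(&reconciler{
			Client:    mgr.GetClient(),
			bridgeURL: "http://localhost:7007/api/model-catalog/models", // assumption
		}); err != nil {
		panic(err)
	}
	if err := mgr.Start(ctrl.SetupSignalHandler()); err != nil {
		panic(err)
	}
}
```

In the RHDHPAI-64 demo this flow was driven from the CLI; the sketch just moves the same push into a reconcile loop so it runs whenever an InferenceService changes.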
We'll have:
- ODH + model registry and our Model Catalog HTTP REST endpoint running on OCP
- the plugin and controller running out of IDEs
This is a very early form of the bridge depicted in https://miro.com/app/board/uXjVLkD_hyQ=/ and detailed in https://docs.google.com/document/d/1kvQ1MUNYLjYifRYaPvIJ-vPqzvXMby1nytwuooayWzU/edit?tab=t.0. It will POC our designs there and provide a video to our stakeholders, from RHOAI Dev to the PMs and BUs of both teams, to give an early preview of what it could look like.
Waiting on the Jira from John to sort out building the image for the AI plugin and getting dynamic loading of the plugin working.
Major bits that will most likely still be missing relative to our longer-term designs:
- move off of the catalog-info.yaml format to the new JSON array format as the unit of exchange
- move to more robust forms of data store for the bridge, like GitHub / DBs / PVs, rather than the short-term use of a ConfigMap mounted into the bridge (see the sketch after this list)
- we need https://issues.redhat.com/browse/RHDHPAI-577 in order to run the plugins out of RHDH running on OCP; once we have that, it then makes sense to create manifests to run the controller in OCP
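As a rough illustration of that short-term data store, here is a hypothetical sketch of the bridge reading its model entries from a JSON array in a file mounted from a ConfigMap. The struct fields and the mount path are assumptions, not the agreed JSON array exchange format.

```go
// Hypothetical sketch of the short-term bridge data store: a JSON array of
// model entries kept in a ConfigMap and mounted into the bridge container.
package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// ModelEntry is illustrative only; the real unit-of-exchange schema is being
// defined as part of the bridge design (replacing catalog-info.yaml).
type ModelEntry struct {
	Name         string `json:"name"`
	Namespace    string `json:"namespace"`
	InferenceURL string `json:"inferenceUrl,omitempty"`
}

func main() {
	// Assumed mount path of the ConfigMap inside the bridge pod.
	data, err := os.ReadFile("/etc/model-catalog/models.json")
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	var models []ModelEntry
	if err := json.Unmarshal(data, &models); err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	for _, m := range models {
		fmt.Printf("%s/%s -> %s\n", m.Namespace, m.Name, m.InferenceURL)
	}
}
```

Swapping this file read for GitHub, a DB, or a PV later would only change how the JSON array is loaded, which is why that move is listed above as a longer-term item.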
- depends on
  - RHIDP-10095 Configure OCI/Dynamic plugin build for proof of concept model catalog plugin (To Do)
  - RHIDP-10560 get ODH Model Registry deploy and reconciliation with KServe inference services working again (Closed)
- links to