Type: Feature
Resolution: Unresolved
Priority: Critical
Fix Version: rhelai-1.5
Goal:
We need to make two sets of models available to be pulled into RHEL AI and RHOAI: the models we are validating on the vLLM-ent midstream for inference and accuracy, and the models we are validating as teacher/student models in InstructLab flows on RHEL AI and RHOAI.
Scope:
- Requires:
- Model List: https://docs.google.com/spreadsheets/d/1NGPhJV0pk7jYuAFOHk7aWPomX7Svb_-Xa-OVUVtpNbM/edit?gid=1505755754#gid=1505755754
- Store each model as a ModelCar artifact in Quay, labelled Teacher/Student/Judge/Inference (if possible; otherwise we can cross-tag on HF)
- [for 1.5] Enable the model to be downloaded and served via “ilab model download” and “ilab model serve” by referencing the ModelCar artifact in Quay
- Expose packaged models in RHEL AI and list them on the CLI
- Expose packaged models in RHOAI and surface them in the UI
- FUTURE GOAL: [for 2.x post-llamastack] Same as above but ilab commands TBD
- See llama model list
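One way the labelling requirement in the scope above could be satisfied is a role-to-reference mapping that encodes Teacher/Student/Judge/Inference into the artifact tag. A minimal sketch, assuming a hypothetical Quay naming scheme (the org name, repo layout, and role-suffixed tags below are placeholders, not the real registry layout):

```python
# Sketch: map validated models to role-labelled ModelCar references in Quay.
# All registry paths, model names, and role tags are hypothetical placeholders.

ROLES = {"teacher", "student", "judge", "inference"}

def modelcar_ref(org: str, model: str, role: str, version: str = "latest") -> str:
    """Build an OCI reference for a role-labelled ModelCar artifact."""
    if role not in ROLES:
        raise ValueError(f"unknown role: {role!r}")
    # Encode the role in the tag so RHEL AI / RHOAI tooling can filter on it.
    return f"quay.io/{org}/modelcar-{model}:{version}-{role}"

# Example: a teacher-model reference (placeholder names).
print(modelcar_ref("example-org", "mixtral-8x7b", "teacher"))
```

If Quay labels cannot carry the role, the same mapping could drive cross-tagging on HF instead, as the scope item anticipates.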
Acceptance Criteria:
- Enable the model to be downloaded and served via “ilab model download” and “ilab model serve” by referencing the ModelCar artifact in Quay
- Functionally test that the models pulled from Quay run in InstructLab student and teacher contexts
- Functionally test that the models pulled from Quay run in vLLM on RHEL AI and RHOAI for inference
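The download/serve criterion above implies a pull-and-serve flow along these lines. This is a hedged sketch only: the Quay repository path and model name are placeholders, and the `--repository`/`--model-path` usage follows the documented `ilab` OCI download syntax, which should be verified against the shipped CLI. The commands are echoed (dry-run) rather than executed:

```shell
#!/bin/sh
# Hypothetical ModelCar reference in Quay (placeholder path, not a real repo).
MODELCAR="docker://quay.io/example-org/modelcar-granite-7b-lab:latest"

# Dry-run: print the commands a user would run to pull and serve the model.
echo "ilab model download --repository ${MODELCAR}"
echo "ilab model serve --model-path ~/.cache/instructlab/models/granite-7b-lab"
```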
Links:
- blocks: RHELAI-3557 RHEL AI Third-Party Model Validation Deliverables for Summit '25 (In Progress)
- is depended on by: RHELAI-3557 RHEL AI Third-Party Model Validation Deliverables for Summit '25 (In Progress)