Story
Resolution: Done
Critical
None
None
None
5
False
False
RHDHPAI Sprint 3266, RHDHPAI Sprint 3267
Story (Required)
As a developer, I would like to create a Catalog Component from a Software Template that deploys a Model Server only.
This would allow me to use the Model Server instance together with my other Software Templates, for example.
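A template like this could be sketched as a Backstage Software Template definition. This is a minimal illustration, not the actual template for this story: the template name, skeleton path, and parameter names are assumptions, while the scaffolder actions (`fetch:template`, `publish:github`, `catalog:register`) are standard Backstage actions.

```yaml
# Hypothetical model-server-only Software Template sketch.
# metadata.name, the skeleton path, and parameter names are illustrative.
apiVersion: scaffolder.backstage.io/v1beta3
kind: Template
metadata:
  name: model-server-template
  title: Model Server
  description: Deploys a Model Server without a bundled AI application
spec:
  type: service
  parameters:
    - title: Model Server settings
      required:
        - name
        - repoUrl
      properties:
        name:
          type: string
          description: Name of the Model Server component
        repoUrl:
          type: string
          ui:field: RepoUrlPicker
  steps:
    - id: fetch
      name: Fetch skeleton
      action: fetch:template
      input:
        url: ./skeleton   # holds the Deployment, Service and Route manifests
        values:
          name: ${{ parameters.name }}
    - id: publish
      name: Publish repository
      action: publish:github
      input:
        repoUrl: ${{ parameters.repoUrl }}
    - id: register
      name: Register in catalog
      action: catalog:register
      input:
        repoContentsUrl: ${{ steps.publish.output.repoContentsUrl }}
        catalogInfoPath: /catalog-info.yaml
```

Note that, per the Background, the steps stop at publishing and catalog registration; no CI/CD pipeline steps are included.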
Background (Required)
Create a Software Template that does not bundle the AI application with the Model Server. The Model Server Deployment should be created along with a Service and an endpoint Route so that users can access it.
Since no application is bundled, we don't need to provide CI/CD pipelines with it.
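The Deployment, Service, and Route described above could look roughly like the following skeleton manifests. This is a sketch under assumptions: the resource names, container image, and port are placeholders, and the Route uses the OpenShift `route.openshift.io/v1` API since the story mentions an endpoint route.

```yaml
# Illustrative skeleton manifests for a standalone Model Server.
# Names, the image reference, and the port are placeholders.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: model-server
spec:
  replicas: 1
  selector:
    matchLabels:
      app: model-server
  template:
    metadata:
      labels:
        app: model-server
    spec:
      containers:
        - name: model-server
          image: quay.io/example/model-server:latest  # placeholder image
          ports:
            - containerPort: 8080
---
# Service exposing the model server inside the cluster.
apiVersion: v1
kind: Service
metadata:
  name: model-server
spec:
  selector:
    app: model-server
  ports:
    - port: 8080
      targetPort: 8080
---
# OpenShift Route providing the external endpoint for users.
apiVersion: route.openshift.io/v1
kind: Route
metadata:
  name: model-server
spec:
  to:
    kind: Service
    name: model-server
  port:
    targetPort: 8080
```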
Out of scope
<Defines what is not included in this story>
Approach (Required)
<Description of the general technical path on how to achieve the goal of the story. Include details like json schema, class definitions>
Dependencies
<Describes what this story depends on. Dependent Stories and EPICs should be linked to the story.>
Acceptance Criteria (Required)
<Describe edge cases to consider when implementing the story and defining tests>
<Provides a required and minimum list of acceptance tests for this story. More is expected as the engineer implements this story>
documentation updates (design docs, release notes, etc.)
demo needed
SOP required
education module update (Filled by DEVAI team only)
R&D label required (Filled by DEVAI team only)
Done Checklist
Code is completed, reviewed, documented and checked in
Unit and integration test automation has been delivered and is running cleanly in the continuous integration/staging/canary environment
Continuous Delivery pipeline(s) is able to proceed with new code included
Customer facing documentation, API docs, design docs etc. are produced/updated, reviewed and published
Acceptance criteria are met
If the Grafana dashboard is updated, ensure the corresponding SOP is updated as well
- is depended on by: RHIDP-10643 Host the granite-3-8b model in VLLM (Closed)
- relates to: RHIDP-10643 Host the granite-3-8b model in VLLM (Closed)