Type: Story
Resolution: Unresolved
Priority: Normal
Testable
In https://access.redhat.com/documentation/en-us/red_hat_openshift_ai_self-managed/2-latest/html/serving_models/serving-large-models_serving-large-models#accessing-inference-endpoints-for-models-deployed-on-single-model-serving-platform_serving-large-models, we document only one way to access the inference endpoint of a deployed model: via the Projects page.
In the current (2.9) dashboard layout, the endpoint is also accessible from the Model Serving page. Consider documenting this approach as well.