- Bug
- Resolution: Done
- Blocker
- Testable: Yes
- 1.20.0-z
Description of problem:
When configuring a server for the model serving feature, the user can choose not to tick "Require token authentication" (this is actually the default). In that case, after deploying a model, it is not possible to get any inference from the provided endpoint: requests receive the OpenShift "Application is not available" error page as a response.
Prerequisites (if any, like setup, operators/versions):
latest model serving live build
Steps to Reproduce:
- Create DSP
- Configure Server (without token authentication)
- Deploy model
- Try to reach the inference endpoint (see the request sketch below)
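A minimal sketch of the request used in the last step, assuming a KServe-v2-style REST endpoint; the route URL, model name, and input payload are placeholders and must be replaced with the values from your own deployment:

```python
import requests

# Hypothetical route and model name; substitute the values from your deployment.
INFER_URL = "https://example-route.apps.cluster.example.com/v2/models/example-model/infer"

# Minimal KServe-v2-style payload; adjust the inputs to match the deployed model.
payload = {
    "inputs": [
        {"name": "input-0", "shape": [1, 4], "datatype": "FP32",
         "data": [[0.1, 0.2, 0.3, 0.4]]}
    ]
}

resp = requests.post(INFER_URL, json=payload, verify=False, timeout=30)
print(resp.status_code)
# With token authentication disabled, the body is the OpenShift
# "Application is not available" HTML error page instead of a JSON inference response.
print(resp.text[:200])
```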
Actual results:
The OpenShift "Application is not available" error page is received as a response from the inference endpoint
Expected results:
Any user is able to reach the inference endpoint and get an inference result back without using any token
Reproducibility (Always/Intermittent/Only Once):
Always
Build Details:
Workaround:
Enable token authentication and include the token in requests to the inference endpoint (see the sketch below)
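A sketch of the workaround request, under the same assumptions as above; the route URL, model name, token source, and payload are placeholders, not values from this report:

```python
import requests

# Hypothetical values; use the route and token from your own deployment.
INFER_URL = "https://example-route.apps.cluster.example.com/v2/models/example-model/infer"
TOKEN = "<token copied from the model server's service account secret>"

payload = {
    "inputs": [
        {"name": "input-0", "shape": [1, 4], "datatype": "FP32",
         "data": [[0.1, 0.2, 0.3, 0.4]]}
    ]
}

# Passing the token in the Authorization header lets the request through
# while token authentication is enabled on the server.
resp = requests.post(
    INFER_URL,
    json=payload,
    headers={"Authorization": f"Bearer {TOKEN}"},
    verify=False,
    timeout=30,
)
print(resp.status_code, resp.json())
```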