Feature
Resolution: Unresolved
Undefined
None
Proposed title of this feature request
OpenTelemetry Lightspeed Integration
What is the nature and description of the request?
OpenShift Lightspeed is a generative AI-based virtual assistant integrated into the OpenShift web console. Using an English natural-language interface, OpenShift Lightspeed answers questions related to OpenShift and layered OpenShift offerings. Using Red Hat's extensive experience in OpenShift and mission-critical applications, OpenShift Lightspeed assists with troubleshooting and investigating cluster resources.
Given the complexity of this software, its potentially high operating cost, and the difficulty of troubleshooting it, both users and Lightspeed developers need an observability platform that is easy to configure and set up and that lets them observe Lightspeed operations. The goal is to streamline metrics, logs, and traces so that Lightspeed usage telemetry can be extracted, stored, visualized, and exported in various formats.
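As a rough illustration of what this could look like on the instrumentation side, the sketch below (a hypothetical example, not the actual Lightspeed implementation; the collector endpoint, service name, metric name, and attributes are assumptions) shows usage telemetry being exported over OTLP with the OpenTelemetry Python SDK:

```python
# Hypothetical sketch: emitting Lightspeed usage metrics over OTLP.
# Endpoint, service name, metric name, and attributes are illustrative only.
from opentelemetry import metrics
from opentelemetry.sdk.metrics import MeterProvider
from opentelemetry.sdk.metrics.export import PeriodicExportingMetricReader
from opentelemetry.sdk.resources import Resource
from opentelemetry.exporter.otlp.proto.grpc.metric_exporter import OTLPMetricExporter

# Push metrics to an OTLP receiver, e.g. a collector from the
# Red Hat build of OpenTelemetry, which can store, transform, or forward them.
exporter = OTLPMetricExporter(endpoint="http://otel-collector:4317", insecure=True)
reader = PeriodicExportingMetricReader(exporter)
provider = MeterProvider(
    resource=Resource.create({"service.name": "lightspeed-service"}),
    metric_readers=[reader],
)
metrics.set_meter_provider(provider)

meter = metrics.get_meter("lightspeed.usage")
questions_total = meter.create_counter(
    "lightspeed.questions.total",
    description="Questions answered by Lightspeed",
)
# Record one answered question, tagged with the LLM provider that served it.
questions_total.add(1, {"llm.provider": "openai"})
```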
Why does the customer need this? (List the business requirements)
Data scientists who run workloads in the cloud need to obtain GPU data to control costs. This data is not available today. Even once it is made available, customers want to leverage OpenShift to send the data to many platforms: on-premises, self-supported, and/or observability solutions. The business requirements of this feature are:
- Lightspeed works with three different large language model (LLM) providers: IBM WatsonX, Microsoft Azure OpenAI, and OpenAI.
- Provide a way to read, transform, export and store Lightspeed data in OTLP format
- Provide a dashboard in the OpenShift console that surfaces the relevant Lightspeed and LLM information as defined in the OpenTelemetry standard.
- This includes detecting hallucinations and capturing text quality, response length, word count, and similar signals (read more here); a sketch of how such attributes could be recorded follows this list.
- Document this integration as part of the integrations framework of this outcome: OBSDA-914
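As a rough sketch of the LLM-related attributes mentioned above, the example below records a single model call as a span using attribute names along the lines of the OpenTelemetry generative AI semantic conventions. Those conventions are still evolving, so the attribute names, endpoint, token counts, and the custom lightspeed.* attribute are assumptions for illustration only:

```python
# Hypothetical sketch: tracing one LLM call with GenAI-style span attributes.
from opentelemetry import trace
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.exporter.otlp.proto.grpc.trace_exporter import OTLPSpanExporter

provider = TracerProvider()
provider.add_span_processor(
    BatchSpanProcessor(OTLPSpanExporter(endpoint="http://otel-collector:4317", insecure=True))
)
trace.set_tracer_provider(provider)
tracer = trace.get_tracer("lightspeed.llm")

# One model invocation as a span, so a console dashboard can break usage
# down by provider, model, and token counts.
with tracer.start_as_current_span("chat gpt-4") as span:
    span.set_attribute("gen_ai.system", "openai")          # or "watsonx", "az.ai.openai"
    span.set_attribute("gen_ai.request.model", "gpt-4")
    span.set_attribute("gen_ai.usage.input_tokens", 812)   # example values
    span.set_attribute("gen_ai.usage.output_tokens", 153)
    # Quality signals such as hallucination scores or word counts would need
    # custom attributes; this name is an assumption, not a defined convention.
    span.set_attribute("lightspeed.response.word_count", 112)
```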
List any affected packages or components.
- Red Hat build of OpenTelemetry
Further reading