OpenShift Service Mesh / OSSM-12225 [TP] Enhance AI Chatbot: Context Awareness & UX Improvements (Phase 2) / OSSM-12252

Implement Streaming Responses for Chatbot to Display Real-Time MCP Tool Execution Status


    • Type: Sub-task
    • Resolution: Unresolved
    • Priority: Undefined
    • Component: Kiali
      Context

      Currently, the Kiali Chatbot operates on a synchronous request/response model. When a user sends a message, the UI waits (showing a generic loading state) while the backend performs complex operations using MCP tools (e.g., querying Prometheus, fetching Namespace info). This latency creates a "black box" experience in which the user cannot tell whether the system is hung or still working.

      Goal

      Update the Chatbot architecture to use streaming responses, allowing the backend to push intermediate status updates (e.g., "Querying Graph API...", "Analyzing metrics...") to the UI in real time before the final answer is generated.
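      The exact wire format is not yet specified in this ticket; below is a minimal TypeScript sketch of what the streamed events could look like, assuming an SSE (text/event-stream) framing. The event names (status, tool_start, message) come from the acceptance criteria; the field names (text, tool, delta) and the formatSSE helper are illustrative assumptions, not a final API.

```typescript
// Illustrative event shapes for the chat stream (field names are assumptions).
// Each frame would be serialized as a Server-Sent Event:
//   event: <type>\ndata: <json>\n\n
type ChatStreamEvent =
  | { type: "status"; text: string }      // generic task started
  | { type: "tool_start"; tool: string }  // an MCP tool was invoked
  | { type: "message"; delta: string };   // a chunk of the final LLM answer

// Serialize one event into SSE wire format (hypothetical helper).
function formatSSE(ev: ChatStreamEvent): string {
  const { type, ...data } = ev;
  return `event: ${type}\ndata: ${JSON.stringify(data)}\n\n`;
}
```

      For example, invoking an MCP tool named GetNodeMetrics would produce the frame `event: tool_start` followed by `data: {"tool":"GetNodeMetrics"}`, which the UI can map directly to a status line.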

      User Story

      As a Kiali user interacting with the Chatbot, I want to see real-time status updates indicating which tools the bot is currently accessing, so that I understand the bot's thought process and know the request is being actively processed.

      Acceptance Criteria

      1. Backend Streaming:
        • The Backend API must be updated to return a streamed response (e.g., Transfer-Encoding: chunked or text/event-stream) instead of a single JSON blob.
        • The Backend must emit events for specific lifecycle moments:
          • status: When starting a new generic task.
          • tool_start: Specifically when an MCP tool is invoked (e.g., "Calling GetNodeMetrics").
          • message: The final content chunks of the LLM response.
      2. Frontend Stream Consumption:
        • The UI client must be updated to read the stream incrementally (using response.body.getReader()).
        • The UI must parse the incoming chunks to distinguish between a "Status Update" and the "Final Answer."
      3. UI/UX Updates:
        • While the bot is "thinking," display a dynamic status line (e.g., small text above or below the chat bubble) showing the current action (e.g., "Loading Namespace info...").
        • Once the final message begins streaming, the status line should disappear or change to "Generating answer...".
        • Handle errors gracefully (if the stream cuts off, the UI should not freeze).
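      The frontend criteria above can be sketched as follows, assuming the SSE-style framing mentioned in the backend criteria. parseFrames is the testable core (it splits buffered text into complete frames and keeps any incomplete tail); consumeStream shows how it would plug into `response.body.getReader()`. Names and types here are illustrative, not the final Kiali implementation.

```typescript
type ParsedEvent = { event: string; data: string };

// Split buffered text into complete SSE frames (separated by a blank line).
// Returns the parsed events plus any trailing text that is still incomplete,
// so a chunk that ends mid-frame is carried over to the next read.
function parseFrames(buffer: string): { events: ParsedEvent[]; rest: string } {
  const events: ParsedEvent[] = [];
  const parts = buffer.split("\n\n");
  const rest = parts.pop() ?? ""; // last piece may be an incomplete frame
  for (const frame of parts) {
    let event = "message"; // SSE default event name
    const dataLines: string[] = [];
    for (const line of frame.split("\n")) {
      if (line.startsWith("event:")) event = line.slice(6).trim();
      else if (line.startsWith("data:")) dataLines.push(line.slice(5).trim());
    }
    if (dataLines.length) events.push({ event, data: dataLines.join("\n") });
  }
  return { events, rest };
}

// Illustrative consumption loop: pass response.body.getReader() as `reader`.
// The structural type keeps this sketch independent of DOM typings.
async function consumeStream(
  reader: { read(): Promise<{ done: boolean; value?: Uint8Array }> },
  onEvent: (e: ParsedEvent) => void
): Promise<void> {
  const decoder = new TextDecoder();
  let buffer = "";
  for (;;) {
    const { done, value } = await reader.read();
    if (done) break; // stream ended; UI should finalize, not freeze
    buffer += decoder.decode(value, { stream: true });
    const { events, rest } = parseFrames(buffer);
    buffer = rest;
    events.forEach(onEvent);
  }
}
```

      In onEvent, the UI would render status and tool_start frames as the dynamic status line and append message frames to the chat bubble; wrapping the loop in try/catch (or handling reader.read() rejection) covers the "stream cuts off" criterion.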

              agutierr@redhat.com Alberto Jesus Gutierrez Juanes