Type: Sub-task
Resolution: Unresolved
Context
Currently, the Kiali Chatbot operates on a synchronous request/response model. When a user sends a message, the UI waits (showing only a generic loading state) while the backend performs complex operations using MCP tools (e.g., querying Prometheus or fetching Namespace info). This latency creates a "black box" experience: the user cannot tell whether the system is hung or still working.
Goal
Update the Chatbot architecture to use streaming responses, allowing the backend to push intermediate status updates (e.g., "Querying Graph API...", "Analyzing metrics...") to the UI in real time before the final answer is generated.
User Story
As a Kiali user interacting with the Chatbot, I want to see real-time status updates indicating which tools the bot is currently accessing, so that I understand the bot's thought process and know the request is being processed actively.
Acceptance Criteria
- Backend Streaming:
  - The backend API must be updated to return a streamed response (e.g., Transfer-Encoding: chunked or text/event-stream) instead of a single JSON blob.
  - The backend must emit events for specific lifecycle moments:
    - status: when starting a new generic task.
    - tool_start: when an MCP tool is invoked (e.g., "Calling GetNodeMetrics").
    - message: the final content chunks of the LLM response.
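The event framing above could be expressed as Server-Sent Events. A minimal TypeScript sketch for illustration only: the event names mirror the criteria, but the payload field names (`text`, `tool`, `chunk`) and the `frameSSE` helper are assumptions, not the agreed wire format.

```typescript
// Event types the backend would emit over the stream (payload shapes are assumptions).
type ChatEvent =
  | { type: "status"; text: string }
  | { type: "tool_start"; tool: string }
  | { type: "message"; chunk: string };

// Frame one event in text/event-stream format: "event: <type>\ndata: <json>\n\n"
function frameSSE(event: ChatEvent): string {
  const { type, ...payload } = event;
  return `event: ${type}\ndata: ${JSON.stringify(payload)}\n\n`;
}

// Example: announcing an MCP tool invocation.
const frame = frameSSE({ type: "tool_start", tool: "GetNodeMetrics" });
// frame === 'event: tool_start\ndata: {"tool":"GetNodeMetrics"}\n\n'
```

Using the standard SSE `event:`/`data:` fields keeps the frontend parser trivial and lets `message` be the default event type if no `event:` line is present.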
- Frontend Stream Consumption:
  - The UI client must be updated to read the stream incrementally (using response.body.getReader()).
  - The UI must parse the incoming chunks to distinguish between a "Status Update" and the "Final Answer."
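The incremental read-and-parse loop could look like the sketch below. It assumes the SSE-style framing from the backend criteria; the parser buffers partial chunks until a full frame (terminated by a blank line) arrives, since getReader() delivers arbitrary byte boundaries.

```typescript
// Incremental SSE parser: feed it decoded text chunks as they arrive;
// it returns the complete events found and buffers any remainder.
interface ParsedEvent { type: string; data: unknown; }

function makeSSEParser() {
  let buffer = "";
  return (chunk: string): ParsedEvent[] => {
    buffer += chunk;
    const events: ParsedEvent[] = [];
    let sep: number;
    // A blank line ("\n\n") terminates one SSE frame.
    while ((sep = buffer.indexOf("\n\n")) !== -1) {
      const frame = buffer.slice(0, sep);
      buffer = buffer.slice(sep + 2);
      let type = "message"; // SSE default event type
      let data = "";
      for (const line of frame.split("\n")) {
        if (line.startsWith("event: ")) type = line.slice(7);
        else if (line.startsWith("data: ")) data += line.slice(6);
      }
      events.push({ type, data: data ? JSON.parse(data) : null });
    }
    return events;
  };
}

// Usage with fetch in the browser (appendAnswerChunk/showStatusLine are
// hypothetical UI callbacks):
//
// const reader = response.body!.getReader();
// const decoder = new TextDecoder();
// const parse = makeSSEParser();
// for (;;) {
//   const { done, value } = await reader.read();
//   if (done) break;
//   for (const ev of parse(decoder.decode(value, { stream: true }))) {
//     if (ev.type === "message") appendAnswerChunk(ev.data);
//     else showStatusLine(ev.data); // "status" or "tool_start"
//   }
// }
```

Note the `{ stream: true }` option on TextDecoder.decode: without it, a multi-byte UTF-8 character split across two chunks would be corrupted.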
- UI/UX Updates:
  - While the bot is "thinking," display a dynamic status line (e.g., small text above or below the chat bubble) showing the current action (e.g., "Loading Namespace info...").
  - Once the final message begins streaming, the status line should disappear or change to "Generating answer...".
  - Handle errors gracefully: if the stream cuts off, the UI must not freeze.
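The status-line behavior above amounts to a small state machine. A sketch, assuming the event types from the backend criteria; the phase names and the `stream_error` event are illustrative, not part of the agreed design:

```typescript
// Chat status-line states (names are assumptions for illustration).
type UIState =
  | { phase: "idle" }
  | { phase: "thinking"; statusText: string } // dynamic status line visible
  | { phase: "generating" }                   // show "Generating answer..."
  | { phase: "error"; reason: string };

function nextState(state: UIState, event: { type: string; data?: any }): UIState {
  switch (event.type) {
    case "status":
      return { phase: "thinking", statusText: event.data.text };
    case "tool_start":
      return { phase: "thinking", statusText: `Calling ${event.data.tool}...` };
    case "message":
      // First answer chunk: the status line is replaced by the answer state.
      return { phase: "generating" };
    case "stream_error":
      // Hypothetical event raised by the UI when reader.read() rejects or
      // the stream ends unexpectedly, so the UI never stays stuck "thinking".
      return { phase: "error", reason: event.data?.reason ?? "stream interrupted" };
    default:
      return state;
  }
}
```

Routing stream failures through the same transition function is one way to satisfy the "UI must not freeze" criterion: every possible event, including an aborted stream, lands the UI in a renderable state.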
blocks: OSSM-12499 Support Lightspeed Provider
Status: Backlog