Type: Task
Resolution: Unresolved
Priority: Major
Task Description
- Investigate RHOAI Agentic Samples/Application feasibility
- What do they have, and do we have to wait for them? (Gabe already has a ready application: https://github.com/opendatahub-io/agents/tree/main/examples/langchain-langgraph)
- What are the alternatives? Is it possible to wrap LCS? LCS already has:
    - Endpoints
    - Llama Stack
    - MCP support (future consideration)
    - RAG integration
    - No Streamlit frontend, however; one would need to be added if LCS is used as the backend
- Is it possible to wrap the LS plugin?
- We can also consider Fanis' Llama Stack Streamlit template: https://github.com/redhat-ai-dev/llama-stack-template/blob/main/templates/llama-stack-agent/content/llama_stack_agent.py
- We will also need to investigate performance: async vs. sync handling, since MCP tool calls sometimes take more than 2 minutes to respond.
    - Gabe's sample application is async.
    - Fanis' Llama Stack Streamlit template is sync (but we could work to make it async).
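To illustrate the async-vs-sync concern above, here is a minimal sketch using plain `asyncio` with simulated tool latencies (this is not the actual MCP client or either sample application; the tool names and durations are made up for the demo). It shows why an async agent loop matters when a single tool call can take minutes: overlapping calls cost roughly the longest call, while a sync-style loop pays the sum.

```python
import asyncio
import time

async def slow_tool_call(name: str, seconds: float) -> str:
    """Stand-in for a long-running MCP tool call (durations scaled down)."""
    await asyncio.sleep(seconds)
    return f"{name}: done"

async def run_concurrently() -> tuple[list[str], float]:
    """Async pattern: calls overlap, total time is roughly max(durations)."""
    start = time.perf_counter()
    results = await asyncio.gather(
        slow_tool_call("search", 0.2),
        slow_tool_call("rag_query", 0.2),
        slow_tool_call("summarize", 0.2),
    )
    return list(results), time.perf_counter() - start

async def run_sequentially() -> tuple[list[str], float]:
    """Sync-style pattern: each call blocks the next, total is sum(durations)."""
    start = time.perf_counter()
    results = [
        await slow_tool_call("search", 0.2),
        await slow_tool_call("rag_query", 0.2),
        await slow_tool_call("summarize", 0.2),
    ]
    return results, time.perf_counter() - start

if __name__ == "__main__":
    _, t_async = asyncio.run(run_concurrently())
    _, t_sync = asyncio.run(run_sequentially())
    print(f"concurrent: {t_async:.2f}s, sequential: {t_sync:.2f}s")
```

The same trade-off applies to a Streamlit frontend: a sync backend call blocks the whole request for the duration of the slowest tool, which is why making Fanis' template async is worth the effort.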