Task
Resolution: Done
Major
Task Description (Required)
We are working on the RHOAI Agentic Sample in the branch https://github.com/thepetk/langgraph-agents/tree/lls-rag-exp/examples/langchain-langgraph
- It would be nice to have a toggle for sync vs async, so users can choose the behaviour they want.
- Currently the default endpoint in the sample (localhost:5000) is sync; the old async behaviour is still available at localhost:5000/classic.
- Preference is to have a Streamlit UI, which brings several benefits:
- It helps us make the sample more uv/Python-oriented
- We can toggle between sync and async behaviour
- We can run an asyncio event loop
- Async Approach Ideas
- We can have a sidebar table where submissions are listed
- Once we have the Streamlit app, we can push the changes from https://github.com/thepetk/langgraph-agents/tree/lls-rag-exp/examples/langchain-langgraph to https://github.com/redhat-ai-dev/llama-stack-agentic-sample
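The sync/async toggle described above could be sketched roughly as follows. This is only a sketch: `call_agent_sync` and `call_agent_async` are hypothetical stand-ins for the real calls to localhost:5000 and localhost:5000/classic, and the Streamlit wiring (`st.toggle`, sidebar table) is only hinted at in comments:

```python
import asyncio
import time


# Hypothetical stand-ins for the real agent calls; the actual sample would
# talk to localhost:5000 (sync) or localhost:5000/classic (async).
def call_agent_sync(prompt: str) -> str:
    time.sleep(0.01)  # simulate a blocking tool call
    return f"sync:{prompt}"


async def call_agent_async(prompt: str) -> str:
    await asyncio.sleep(0.01)  # simulate a non-blocking tool call
    return f"async:{prompt}"


def run_agent(prompt: str, use_async: bool) -> str:
    """Dispatch to sync or async behaviour based on a UI toggle.

    In the Streamlit app, use_async would come from something like
    st.toggle("Async mode"), and submissions could be appended to a
    sidebar table (st.sidebar.dataframe) as they complete.
    """
    if use_async:
        # Streamlit scripts are themselves synchronous, so the coroutine
        # is driven with asyncio.run() from the script body.
        return asyncio.run(call_agent_async(prompt))
    return call_agent_sync(prompt)
```

The point of centralising the dispatch in one function is that the rest of the UI stays identical whichever mode the user picks.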
[OLD DESC BELOW]
We can also consider Fanis' Llama Stack Streamlit Template https://github.com/redhat-ai-dev/llama-stack-template/blob/main/templates/llama-stack-agent/content/llama_stack_agent.py
We will also need to investigate performance (async vs sync), as MCP tool calls sometimes take more than 2 minutes to respond.
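One way to quantify the async-vs-sync difference is to time several simulated tool calls issued concurrently with `asyncio.gather`. Here `mcp_tool_call` is a hypothetical stub (the real calls can exceed 2 minutes), but the shape of the comparison carries over:

```python
import asyncio
import time


# Hypothetical stub for a slow MCP tool call.
async def mcp_tool_call(name: str, delay: float = 0.05) -> str:
    await asyncio.sleep(delay)  # stands in for waiting on the tool server
    return name


async def run_concurrent(names: list[str]) -> list[str]:
    # gather() overlaps the waits, so total time is roughly the slowest
    # single call rather than the sum of all calls.
    return await asyncio.gather(*(mcp_tool_call(n) for n in names))


start = time.perf_counter()
results = asyncio.run(run_concurrent(["search", "fetch", "summarize"]))
elapsed = time.perf_counter() - start
# Three 0.05 s calls finish in roughly 0.05 s concurrently,
# versus roughly 0.15 s if run back to back synchronously.
```

The same instrumentation (swapping the stub for the real client) would tell us how much of the 2-minute latency is actually parallelisable.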
- Gabe's sample application is async
- Fanis' Llama Stack Streamlit template is sync (but we can work to make it async)