Feature
Resolution: Unresolved
Normal
Feature Overview
Implement Agentic RAG with Llama Stack, with capabilities for:
- multilingual RAG
- vision retrieval
- interaction with the Granite 3.x vision model
Goals
- Basic Agentic RAG flow
- Multi-turn interaction before and after retrieval
- Support for English and non-English/multilingual queries
- Use of a vector DB through Llama Stack
- The vector DB should support multimodal RAG
- Interaction with the Granite 3.x vision model
Stretch goal:
- Hybrid Search/Hybrid Retrieval
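The basic Agentic RAG flow above (an agent deciding whether to retrieve before answering, across multiple turns) can be sketched as a minimal loop. Retrieval and generation are stubbed out so only the control flow is visible; the function names and the trivial "decide to retrieve" rule are illustrative, not Llama Stack APIs:

```python
# Sketch of the agentic RAG control flow, with retrieval and the model
# stubbed out.  Stub names (retrieve, generate) are illustrative only.

def retrieve(query, vector_db):
    """Stub retrieval: return chunks sharing at least one word with the query."""
    words = set(query.lower().split())
    return [c for c in vector_db if words & set(c.lower().split())]

def generate(query, context, history):
    """Stub generation: a real model call goes here."""
    if context:
        return f"Based on {len(context)} chunk(s): {context[0]}"
    return "I need more detail -- could you rephrase?"

def agentic_rag_turn(query, vector_db, history):
    """One turn: the agent decides whether to retrieve before answering."""
    needs_retrieval = len(query.split()) > 2  # trivial stand-in for the agent's decision
    context = retrieve(query, vector_db) if needs_retrieval else []
    answer = generate(query, context, history)
    history.append((query, answer))
    return answer

# Multi-turn interaction before and after retrieval:
db = ["Granite is a family of IBM models.", "Llama Stack exposes a RAG tool."]
history = []
print(agentic_rag_turn("hi", db, history))                                # no retrieval
print(agentic_rag_turn("what is the Granite model family", db, history))  # retrieval
```

In the real implementation, `generate` becomes a model inference call and the retrieval decision is made by the agent itself (e.g., via tool calling) rather than a hard-coded rule.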
Requirements
- Implementation of Agentic RAG using the Llama Stack API
- Llama Stack Client SDK
- Llama Stack RAG Tool Provider
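As a sketch of how these requirements fit together with the llama-stack-client Python SDK: the call names below (`vector_dbs.register`, `tool_runtime.rag_tool.insert`, `builtin::rag/knowledge_search`), the embedding model, and the model id are assumptions based on recent SDK releases and may differ by version. It registers a vector DB over Llama Stack, ingests documents through the RAG tool provider, and attaches the RAG tool to an agent:

```python
# Intended flow via the llama-stack-client SDK.  All identifiers
# (vector DB id, embedding model, model name) are placeholders.

VECTOR_DB_ID = "multilingual-docs"  # hypothetical id

def rag_tool_config(vector_db_ids):
    """Tool entry that attaches the built-in RAG tool to an agent."""
    return {
        "name": "builtin::rag/knowledge_search",
        "args": {"vector_db_ids": list(vector_db_ids)},
    }

def main():
    # Requires a running Llama Stack server; call main() against one.
    from llama_stack_client import LlamaStackClient
    from llama_stack_client.lib.agents.agent import Agent

    client = LlamaStackClient(base_url="http://localhost:8321")

    # Register a vector DB over Llama Stack (embedding model is an assumption).
    client.vector_dbs.register(
        vector_db_id=VECTOR_DB_ID,
        embedding_model="all-MiniLM-L6-v2",
        embedding_dimension=384,
    )

    # Ingest documents through the RAG tool provider.
    client.tool_runtime.rag_tool.insert(
        vector_db_id=VECTOR_DB_ID,
        chunk_size_in_tokens=512,
        documents=[{
            "document_id": "doc-1",
            "content": "Example multilingual content.",
            "mime_type": "text/plain",
            "metadata": {},
        }],
    )

    # Agent with the RAG tool; the model id is a placeholder.
    agent = Agent(
        client,
        model="granite-3.2-vision",
        instructions="Answer using retrieved context.",
        tools=[rag_tool_config([VECTOR_DB_ID])],
    )
    session_id = agent.create_session("rag-demo")
    agent.create_turn(
        session_id=session_id,
        messages=[{"role": "user", "content": "What does doc-1 say?"}],
    )
```

Multimodal ingestion for vision retrieval would follow the same `rag_tool.insert` path with image content, subject to the chosen vector DB provider supporting multimodal embeddings.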
Done
<your text here>
Use Cases
<your text here>
Out of Scope
<your text here>
Documentation Considerations
<your text here>
Questions to Answer
<your text here>