Type: Story
Resolution: Done
Story
As a developer, I want to understand how we can run different LLM models using open-source servers so that the frontend plugin can talk to LLM models.
Background
Dependencies and Blockers
QE impacted work
Documentation impacted work
Acceptance Criteria
- Should try out different LLM servers.
- Should understand the OpenAI API specification and the different APIs that can be consumed to power the chat interface plugin (see the request sketch at the end of this section).
- Should explore the authentication methods that can be used to connect to LLM servers.
Upstream documentation updates (design docs, release notes, etc.)
Technical enablement / Demo
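
As a reference for the exploration above, below is a minimal sketch of how the frontend plugin could call a chat completions endpoint exposed by an open-source LLM server; vLLM, the llama.cpp server, and Ollama all expose OpenAI-compatible /v1/chat/completions routes. The base URL, model name, and bearer-token handling here are assumptions for illustration only, not a decided design; actual values depend on how the server is deployed and secured.

```typescript
// Minimal sketch: call an OpenAI-compatible chat completions endpoint on a
// locally running open-source LLM server. BASE_URL, MODEL, and API_KEY are
// placeholder assumptions; some local servers accept any (or no) API key.
const BASE_URL = process.env.LLM_BASE_URL ?? "http://localhost:8000/v1";
const MODEL = process.env.LLM_MODEL ?? "example-model"; // hypothetical model id
const API_KEY = process.env.LLM_API_KEY ?? "";

interface ChatMessage {
  role: "system" | "user" | "assistant";
  content: string;
}

async function chatCompletion(messages: ChatMessage[]): Promise<string> {
  const response = await fetch(`${BASE_URL}/chat/completions`, {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      // Bearer-token auth is one of the authentication methods to evaluate.
      ...(API_KEY ? { Authorization: `Bearer ${API_KEY}` } : {}),
    },
    body: JSON.stringify({ model: MODEL, messages }),
  });
  if (!response.ok) {
    throw new Error(`LLM server returned ${response.status}: ${await response.text()}`);
  }
  // OpenAI-style responses carry the reply in choices[0].message.content.
  const data = (await response.json()) as {
    choices: { message: { content: string } }[];
  };
  return data.choices[0].message.content;
}

// Example usage: a single-turn request the chat interface plugin might make.
chatCompletion([{ role: "user", content: "Hello, which model are you?" }])
  .then((reply) => console.log(reply))
  .catch((err) => console.error(err));
```

For servers that enforce authentication, the same Authorization header pattern applies; static API keys, OAuth2/OIDC-issued tokens, or an authenticating reverse proxy in front of the server are the kinds of options this story should evaluate.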