• Type: Story
    • Resolution: Done
    • Priority: Major
    • 1.8.0
    • DEVAI Sprint 3281

      Story (Required)

      As a user of LCS, I should be able to cache RAG (referenced_documents) responses from the LLM and fetch the data back from cache storage.

      Background (Required)

      When you query using streaming_query, the response includes RAG data (document link and title). We need to save this information alongside the response text.
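
      A minimal sketch of what would need to be cached, assuming hypothetical names (ReferencedDocument, CacheEntry, doc_url, doc_title); the actual LCS types may differ:

{code:python}
# Sketch only: class and field names are assumptions, not the actual LCS types.
from dataclasses import dataclass, field


@dataclass
class ReferencedDocument:
    """One RAG reference (doc link and title) returned by streaming_query."""
    doc_url: str
    doc_title: str


@dataclass
class CacheEntry:
    """Response text stored together with its RAG references, so a cache hit
    can return both the answer and its referenced_documents."""
    response: str
    referenced_documents: list[ReferencedDocument] = field(default_factory=list)
{code}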

      Out of scope

      If LCS does not have an in-memory implementation for a given cache type, we do not need to implement caching for it; only cover the cache types currently implemented in LCS.

      Approach (Required)

      Ensure unit tests are also included as part of the PR; see the sketch below.
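
      For example, a round-trip test along these lines could be included. The InMemoryCache class, keys, and entry layout below are stand-ins, not the real LCS cache API:

{code:python}
# Hedged test sketch: InMemoryCache and the entry layout are stand-ins,
# not the real LCS cache API.
class InMemoryCache:
    """Toy dict-backed cache used only for this sketch."""

    def __init__(self):
        self._store = {}

    def insert(self, key, value):
        self._store[key] = value

    def get(self, key):
        return self._store.get(key)


def test_rag_references_survive_cache_round_trip():
    cache = InMemoryCache()
    entry = {
        "response": "LLM answer text",
        "referenced_documents": [
            {"doc_title": "Install guide", "doc_url": "https://example.com/install"},
        ],
    }

    cache.insert("conversation-1", entry)
    cached = cache.get("conversation-1")

    assert cached["response"] == "LLM answer text"
    assert cached["referenced_documents"][0]["doc_url"] == "https://example.com/install"
{code}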

      Dependencies

      <Describes what this story depends on. Dependent Stories and EPICs should be linked to the story.>

      Acceptance Criteria (Required)

      <Describe edge cases to consider when implementing the story and defining tests>

      <Provides a required and minimum list of acceptance tests for this story. More is expected as the engineer implements this story>

      Documentation updates (design docs, release notes, etc.)
      Demo needed
      SOP required
      Education module update (filled by RHDHPAI team only)
      R&D label required (filled by RHDHPAI team only)

      Done Checklist

      Code is completed, reviewed, documented, and checked in
      Unit and integration test automation has been delivered and is running cleanly in the continuous integration/staging/canary environments
      Continuous Delivery pipeline(s) can proceed with the new code included
      Customer-facing documentation, API docs, design docs, etc. are produced/updated, reviewed, and published
      Acceptance criteria are met
      If the Grafana dashboard is updated, ensure the corresponding SOP is updated as well

              mfaisal2 Maysun Faisal
              RHDH AI