Red Hat Enterprise Linux AI / RHELAI-3861

Implement scheduler module for background tasks for LLS

    • Type: Task
    • Resolution: Unresolved
    • Sprint: Sprint 1

      Goal: 


      • When a job is running, LLS should run it in the background. For inline providers, this means some form of async or threaded scheduler for the workloads; for remote providers, most of the work is done remotely, and LLS only needs maintenance jobs such as job status propagation (see the sketch below for the remote case).
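      For the remote case, a minimal sketch of such a maintenance job is shown below. The function name (propagate_status), the fetch_remote_status/on_update callables, and the poll interval are illustrative assumptions, not part of any existing LLS API.

```python
# Hedged sketch: for a remote provider, the background work is mostly a maintenance
# loop that propagates job status from the remote service back into LLS.
# propagate_status, fetch_remote_status, and on_update are illustrative names only.
import asyncio
from typing import Awaitable, Callable


async def propagate_status(
    fetch_remote_status: Callable[[], Awaitable[str]],
    on_update: Callable[[str], None],
    poll_interval: float = 10.0,
) -> None:
    """Periodically poll the remote service and publish the latest job status."""
    while True:
        status = await fetch_remote_status()
        on_update(status)
        if status in ("completed", "failed", "cancelled"):
            break
        await asyncio.sleep(poll_interval)
```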

       

      Acceptance Criteria:


      • Providers have access to a scheduler module that accepts an async function to execute in the background and lets providers monitor the status of the handler execution and manage it (for example, cancel it); a minimal sketch follows this list.
      • The module serves multiple providers at the same time.
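      The sketch below illustrates what such a shared module could look like. The names (Scheduler, JobStatus, schedule, status, cancel) are assumptions for illustration only, not the actual llama-stack API from the linked PR.

```python
# Hedged sketch only: Scheduler, JobStatus, schedule(), status(), and cancel() are
# illustrative names, not the actual llama-stack API.
import asyncio
import enum
import uuid
from typing import Any, Awaitable, Callable, Dict


class JobStatus(enum.Enum):
    SCHEDULED = "scheduled"
    RUNNING = "running"
    COMPLETED = "completed"
    FAILED = "failed"
    CANCELLED = "cancelled"


class Scheduler:
    """Single shared scheduler that multiple providers can register jobs with."""

    def __init__(self) -> None:
        self._tasks: Dict[str, asyncio.Task] = {}
        self._status: Dict[str, JobStatus] = {}

    def schedule(self, provider_id: str, handler: Callable[[], Awaitable[Any]]) -> str:
        """Accept an async handler, run it in the background, and return a job id."""
        job_id = f"{provider_id}:{uuid.uuid4()}"
        self._status[job_id] = JobStatus.SCHEDULED

        async def _run() -> None:
            self._status[job_id] = JobStatus.RUNNING
            try:
                await handler()
                self._status[job_id] = JobStatus.COMPLETED
            except asyncio.CancelledError:
                self._status[job_id] = JobStatus.CANCELLED
                raise
            except Exception:
                self._status[job_id] = JobStatus.FAILED

        self._tasks[job_id] = asyncio.get_running_loop().create_task(_run())
        return job_id

    def status(self, job_id: str) -> JobStatus:
        """Allow providers to monitor the state of their handler execution."""
        return self._status[job_id]

    def cancel(self, job_id: str) -> None:
        """Request cancellation of a scheduled or running job."""
        self._tasks[job_id].cancel()
```

      In this sketch the handlers run on the server's event loop; CPU-heavy inline workloads would more likely be dispatched to a thread or process pool, which is exactly the async-versus-threaded design decision this task needs to settle.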

       

      The initial implementation is here: https://github.com/meta-llama/llama-stack/pull/1437, though it will need more work to support additional actions (such as cancellation) as well as state persistence; a rough sketch of one possible persistence approach follows.
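      As an illustration of the state-persistence gap only: the snippet below keeps job status in a local SQLite file so it survives a server restart. The JobStateStore name, the table schema, and the default path are made up for this sketch and are not part of the linked PR.

```python
# Hedged sketch of one possible state-persistence approach: persist job status to
# a local SQLite file so it can be reloaded after a restart.
import sqlite3
from typing import Optional


class JobStateStore:
    def __init__(self, path: str = "jobs.db") -> None:
        self._conn = sqlite3.connect(path)
        self._conn.execute(
            "CREATE TABLE IF NOT EXISTS jobs (job_id TEXT PRIMARY KEY, status TEXT)"
        )

    def save(self, job_id: str, status: str) -> None:
        """Upsert the latest status for a job."""
        self._conn.execute(
            "INSERT INTO jobs (job_id, status) VALUES (?, ?) "
            "ON CONFLICT(job_id) DO UPDATE SET status = excluded.status",
            (job_id, status),
        )
        self._conn.commit()

    def load(self, job_id: str) -> Optional[str]:
        """Return the last persisted status, or None if the job is unknown."""
        row = self._conn.execute(
            "SELECT status FROM jobs WHERE job_id = ?", (job_id,)
        ).fetchone()
        return row[0] if row else None
```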

              Assignee: Unassigned
              Reporter: Ihar Hrachyshka (ihrachys)
