  Red Hat Enterprise Linux AI / RHELAI-3914

Update InstructLab to require `vllm>=0.8.0,<0.9.0`


    • Type: Feature
    • Resolution: Done
    • Priority: Undefined
    • Fix Version: rhelai-1.5
    • Component: DevOps

      Feature Overview
      We want to allow 0.8.0 <= vLLM < 0.9.0 upstream to enable RHEL AI 1.5 to utilize vLLM 0.8.3 with CUDA accelerators. See relevant cards:

      Goals

      Requirements:

      N/A - this issue just tracks the upstream updates needed so that our downstream processes can build wheels and containers against `vllm==0.8.3`. (The only exception is Intel Gaudi 3 accelerators, which must stay on `vllm==0.6.6.post1`.)
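
      For illustration only (not part of the requirements), the two specifiers can be evaluated with the `packaging` library to confirm which vLLM releases each one admits; the variable names below are made up for the example:

      ```python
      # Minimal sketch, assuming the `packaging` library is available; the names
      # here are illustrative and not taken from InstructLab's requirements files.
      from packaging.specifiers import SpecifierSet
      from packaging.version import Version

      cuda_range = SpecifierSet(">=0.8.0,<0.9.0")  # new range for CUDA builds
      gaudi_pin = SpecifierSet("==0.6.6.post1")    # Intel Gaudi 3 exception

      for release in ("0.6.6.post1", "0.7.3", "0.8.0", "0.8.3", "0.9.0"):
          v = Version(release)
          print(f"{release:>11}  cuda_range={v in cuda_range}  gaudi_pin={v in gaudi_pin}")
      ```

      Running this shows that 0.8.3 (the targeted build) and any later 0.8.z release satisfy the new range, while 0.7.3 and 0.9.0 do not; only the Gaudi pin admits 0.6.6.post1.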

      Done - Acceptance Criteria:

      • Replace the `vllm==0.7.3` version pin with `vllm>=0.8.0,<0.9.0` so that the latest Z-stream release of vLLM v0.8 can theoretically be consumed for CUDA builds
      • CI is green after updating the vLLM range, ensuring that the upstream bits are compatible with this new vLLM version (a sketch of such a version check follows this list)
      • The 2 vLLM CVEs linked above are remediated
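
      As a rough sketch only (the actual CI jobs are not shown in this issue), a check like the following could assert that the installed vLLM wheel falls inside the new range; the `packaging` dependency and the error message are assumptions:

      ```python
      # Hypothetical sanity check; the range and message are illustrative.
      from importlib.metadata import version

      from packaging.specifiers import SpecifierSet
      from packaging.version import Version

      allowed = SpecifierSet(">=0.8.0,<0.9.0")
      installed = Version(version("vllm"))
      assert installed in allowed, f"vllm {installed} is outside {allowed}"
      print(f"vllm {installed} satisfies {allowed}")
      ```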

              Assignee / Reporter: Courtney Pacheco (cpacheco@redhat.com)