Red Hat Enterprise Linux AI
RHELAI-2252: RHELAI 1.3 CUDA no longer has vllm-flash-attn


Epic Link: RHELAI-1676 - NVIDIA - All dependencies for RHEL AI 1.3

Description:
The package vllm-flash-attn is no longer built and shipped for the CUDA variant. Is this an expected change or a bug?
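
As a quick reproduction check, one can verify from inside the RHEL AI 1.3 CUDA environment whether the package is still importable. This is a minimal sketch, assuming the package exposes the module name "vllm_flash_attn" (the name used by the upstream PyPI package); the RHEL AI build may differ.

    # Minimal sketch: report whether vllm-flash-attn is installed in this
    # environment. Assumes the module name "vllm_flash_attn"; adjust if the
    # RHEL AI CUDA build ships it under a different name.
    import importlib.util

    spec = importlib.util.find_spec("vllm_flash_attn")
    if spec is None:
        print("vllm_flash_attn not found in this environment")
    else:
        print(f"vllm_flash_attn found at {spec.origin}")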

fdupont@redhat.com Fabien Dupont
cheimes@redhat.com Christian Heimes
Votes: 0
Watchers: 2
