- Story
- Resolution: Done
- Major
- None
vLLM for Gaudi needs `vllm-hpu-extension @ git+https://github.com/HabanaAI/vllm-hpu-extension.git@d4b0d38`, as pinned in https://github.com/HabanaAI/vllm-fork/blob/v0.6.6.post1%2BGaudi-1.20.0/requirements-hpu.txt.
The PyPI package is not up to date, so we need to build the package from git, and we should talk to Intel about long-term maintenance.
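Until the PyPI release catches up, the dependency can be pinned to the git commit directly in a requirements file. A minimal fragment, assuming the commit hash and repository URL from the links above (the `requirements.txt` filename is an assumption; pip accepts PEP 508 direct references like this in requirements files):

```
# requirements.txt fragment: build vllm-hpu-extension from the pinned git commit
# instead of the stale PyPI release
vllm-hpu-extension @ git+https://github.com/HabanaAI/vllm-hpu-extension.git@d4b0d38
```

Installing with `pip install -r requirements.txt` then clones the repository and builds the package from source at that commit.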