• torch-2.7
    • Done
    • AIPCC-1394 upgrade platform to support torch 2.7
    • 0% To Do, 0% In Progress, 100% Done

      Build a new version of torch, torch-2.7 for CUDA, ROCm, and Gaudi. The CUDA work should
      begin prior to ROCm or Gaudi. The CUDA work should be staged in a branch in the builder
      repository for ROCm and Gaudi to consume. This way work can be done in parallel for all
      three vendor variants.
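The staged-branch flow described above could look roughly like the following sketch. It uses a throwaway local repository, and the repository path and branch names (torch-2.7-cuda, torch-2.7-rocm) are illustrative assumptions, not the real builder repository or its branch naming:

```shell
# Illustrative only: a throwaway local repo stands in for the builder repo.
set -e
repo=$(mktemp -d)
git -C "$repo" init -q -b main
git -C "$repo" -c user.email=ci@example.com -c user.name=ci \
    commit -q --allow-empty -m "base"

# Stage the CUDA work on its own branch first:
git -C "$repo" checkout -q -b torch-2.7-cuda
git -C "$repo" -c user.email=ci@example.com -c user.name=ci \
    commit -q --allow-empty -m "torch 2.7 CUDA build changes"

# ROCm (and, when supported, Gaudi) work branches off the staged CUDA
# branch, so the vendor variants can proceed in parallel:
git -C "$repo" checkout -q -b torch-2.7-rocm torch-2.7-cuda
git -C "$repo" rev-list --count HEAD   # prints 2: both commits reachable
```

Branching the ROCm work off the staged CUDA branch (rather than off main) is what lets the ROCm variant consume the CUDA changes before they merge.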

      We are initially targeting the versions listed in the parent Feature; at minimum we will aim for:

      torch-2.7 [2]
      ROCM_VERSION=6.3.4 [2]
      CUDA_VERSION=12.8.1 [3]
      NVIDIA_DRIVER_VERSION=570.124.06 [3]
      NVIDIA_REQUIRE_CUDA=cuda>=12.8 [3]
      triton-3.3 [4]
      vllm-0.9 [5]

      [1] https://en.wikipedia.org/wiki/CUDA#GPUs_supported
      [2] https://pytorch.org/get-started/locally/
      [3] https://docs.nvidia.com/cuda/cuda-toolkit-release-notes/index.html#
      [4] https://pytorch.org/blog/pytorch-2-7/
      [5] https://github.com/vllm-project/vllm/blob/main/pyproject.toml
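The fixed version pins above can be captured in a small sanity check. The sketch below is a hypothetical helper, not part of the builder tooling: TARGETS restates the Epic's minimum versions (the NVIDIA_REQUIRE_CUDA constraint is a driver requirement rather than a pin, so it is left out), and check_pins flags anything installed that does not match the targeted prefix.

```python
# Minimum version targets restated from this Epic's description.
TARGETS = {
    "torch": "2.7",
    "ROCM_VERSION": "6.3.4",
    "CUDA_VERSION": "12.8.1",
    "NVIDIA_DRIVER_VERSION": "570.124.06",
    "triton": "3.3",
    "vllm": "0.9",
}

def check_pins(installed):
    """Return the targets whose installed version does not start with the
    targeted version prefix; a missing entry counts as a mismatch."""
    return {
        name: (installed.get(name), want)
        for name, want in TARGETS.items()
        if not (installed.get(name) or "").startswith(want)
    }

# Example: everything matches except triton, which is one minor behind.
print(check_pins({
    "torch": "2.7.0",
    "ROCM_VERSION": "6.3.4",
    "CUDA_VERSION": "12.8.1",
    "NVIDIA_DRIVER_VERSION": "570.124.06",
    "triton": "3.2.0",
    "vllm": "0.9.1",
}))  # -> {'triton': ('3.2.0', '3.3')}
```

A prefix match is used deliberately: the Epic pins minor versions (torch-2.7, triton-3.3), so any patch release within that minor is acceptable.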

      Intel's Gaudi does not currently support torch-2.7 according to

      https://docs.habana.ai/en/latest/Support_Matrix/Support_Matrix.html

      and as such, no Gaudi work is required for this Epic.

              prarit@redhat.com Prarit Bhargava
              Ali Raza, Emilien Macchi