AI Platform Core Components / AIPCC-9810

builder: onnxscript package update request


    • Status: To Do

      πŸ“‹ New Package Request for AIPCC

      This form is for requesting new Python packages to be built and added to the AIPCC Package Index.

      1. Requester Information

      • Team / Project: ODH Notebooks / OpenDataHub Workbench Images

      2. Package Details

      • Package Name: onnxscript

      3. Build & Platform Requirements

      • Hardware Variants:
        Select all additional hardware platforms where this package is needed. A CPU-only variant is built by default for all requests.
      • [x] CUDA (NVIDIA GPU)
      • [x] ROCm (AMD GPU)
      • [ ] TPU (Google TPU)
      • [ ] Gaudi (Intel Gaudi)
      • [ ] Spyre (IBM)
      • [ ] Other

      4. Justification & Testing

      • Business Justification:

      Summary: As of PyTorch 2.9+, onnxscript is no longer installed by default and must be explicitly added as a dependency for ONNX export functionality to work.
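      In practice this means any image that relies on ONNX export must now declare the dependency itself, e.g. in its requirements file (sketch only; no version pin is implied by this ticket):

      ```
      # requirements.txt (sketch): onnxscript must be listed explicitly
      # alongside torch for torch.onnx.export() to work on PyTorch 2.9+.
      torch
      onnxscript
      ```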

      Background Investigation:

      Our CI tests for PyTorch notebook images started failing with:

      ERROR: test_convert_to_onnx (TestPytorchNotebook.test_convert_to_onnx)
      ModuleNotFoundError: No module named 'onnxscript'
      

      The failure occurs when calling torch.onnx.export() in the test notebook at jupyter/rocm/pytorch/ubi9-python-3.12/test/test_notebook.ipynb.

      Root cause: PyTorch does not declare onnxscript as a required dependency. The torch.onnx module defers its onnxscript imports to runtime (lazy import) rather than importing at initialization; this was an intentional design decision so that torch.onnx degrades gracefully when onnxscript is absent. See: https://github.com/pytorch/pytorch/issues/103764
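      The lazy-import behavior described above can be sketched with a generic optional-dependency pattern (an illustration of the technique, not PyTorch's actual code; `export_to_onnx` is a hypothetical name):

      ```python
      import importlib
      import importlib.util


      def export_to_onnx(model):
          """Sketch of a lazy optional-dependency check, similar in spirit
          to how torch.onnx defers its onnxscript import to call time."""
          if importlib.util.find_spec("onnxscript") is None:
              # Importing the parent package still succeeds; the error only
              # surfaces when the exporter is actually invoked.
              raise ModuleNotFoundError(
                  "No module named 'onnxscript'; install it to use ONNX export"
              )
          onnxscript = importlib.import_module("onnxscript")
          return onnxscript  # real code would build and export the graph here
      ```

      Because the check runs at call time, the missing module is only detected inside the test notebooks when torch.onnx.export() is invoked, not when torch itself is imported.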

      Verification of AIPCC index availability: We checked all three AIPCC indexes and confirmed that onnxscript is not available in any of them.

      Affected images: All PyTorch-based notebook and runtime images:

        • jupyter/pytorch/ubi9-python-3.12
        • jupyter/pytorch+llmcompressor/ubi9-python-3.12
        • jupyter/rocm/pytorch/ubi9-python-3.12
        • runtimes/pytorch/ubi9-python-3.12
        • runtimes/pytorch+llmcompressor/ubi9-python-3.12
        • runtimes/rocm-pytorch/ubi9-python-3.12

      User impact: Without onnxscript, users cannot export PyTorch models to ONNX format using torch.onnx.export(), which is a common workflow for model deployment and interoperability.
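      For reference, a minimal export that exercises this path (a sketch assuming torch is installed; on PyTorch 2.9+ this call raises the ModuleNotFoundError above unless onnxscript is also present):

      ```python
      import torch
      import torch.nn as nn

      # Tiny model purely for illustration.
      model = nn.Linear(4, 2)
      dummy_input = torch.randn(1, 4)

      # On PyTorch 2.9+ the exporter needs onnxscript at this point.
      torch.onnx.export(model, (dummy_input,), "model.onnx")
      ```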

      • Release target: RHOAI 3.4 GA
      • (Optional) Testing Requirements:

      Default import test is sufficient:

      import onnxscript
      

      Additionally, the following validates the integration with PyTorch:

      import torch
      import torch.onnx
      
      # If onnxscript is properly installed, this import chain should work
      from torch.onnx._internal.exporter import _compat
      

              emacchi@redhat.com Emilien Macchi
              jdanek@redhat.com Jiri DanΔ›k
              Einat Pacifici, Reshmi Aravind
              Frank's Team