Performance and Scale for AI Platforms / PSAP-997

Power measurement for AI/ML inference


    • 0% To Do, 0% In Progress, 100% Done

      Epic Goal

      • Measure, monitor, and characterize the power usage and performance of AI/ML inference workloads such as MLPerf Inference (see the sketch below)
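
      The measurement workflow will most likely lean on cluster monitoring. As a rough illustration (not a deliverable of this epic), the sketch below pulls per-container power figures out of a cluster Prometheus endpoint; the metric name (kepler_container_joules_total), the container_name label, the Prometheus route, and the token handling are all illustrative assumptions.

{code:python}
# Minimal sketch, assuming a power exporter such as Kepler publishes a
# per-container joules counter to the cluster Prometheus. The metric name,
# labels, route, and token handling below are illustrative, not prescribed
# by this epic.
import os
import requests

PROM_URL = os.environ.get(
    "PROM_URL",
    "https://prometheus-k8s-openshift-monitoring.apps.example.com")
TOKEN = os.environ.get("PROM_TOKEN", "")

# rate() over a joules counter yields joules/second, i.e. watts;
# averaged here over the last 5 minutes and summed per container.
QUERY = 'sum by (container_name) (rate(kepler_container_joules_total[5m]))'

resp = requests.get(
    f"{PROM_URL}/api/v1/query",
    params={"query": QUERY},
    headers={"Authorization": f"Bearer {TOKEN}"} if TOKEN else {},
    verify=False,  # lab clusters often use self-signed certificates
    timeout=30,
)
resp.raise_for_status()

for item in resp.json()["data"]["result"]:
    container = item["metric"].get("container_name", "<unknown>")
    watts = float(item["value"][1])
    print(f"{container}: {watts:.2f} W")
{code}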

      Why is this important?

      • Gain experience and develop workflows for measuring the power usage of workloads running on OpenShift Container Platform (OCP).
      • Publish a blog post about running inference workloads on OpenShift, building on Anisha's internship project.

      Acceptance Criteria

      Open Questions

      Done Checklist



              dagray@redhat.com David Gray
