  Performance and Scale for AI Platforms
  PSAP-997

Power measurement for AI/ML inference


    Status: To Do

      Epic Goal

      • Measure, monitor, and characterize the power usage and performance of AI/ML inference workloads such as MLPerf Inference (see the sketch below)
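
      A characterization like this typically pulls power metrics out of the cluster monitoring stack. The sketch below shows one possible shape of that query, assuming a power exporter such as Kepler feeds per-container energy counters into the cluster Prometheus; the metric name (kepler_container_joules_total), the querier URL, and the bearer token are placeholders to adapt to the actual cluster.

{code:python}
#!/usr/bin/env python3
"""Minimal sketch: query per-container power from the cluster Prometheus.

Assumes a power exporter such as Kepler is deployed and exposes a
cumulative per-container energy counter; the metric name, URL, and
token below are placeholders, not confirmed values for this epic.
"""

import os

import requests

# Placeholders -- set these for the actual cluster.
PROM_URL = os.environ.get("PROM_URL", "https://thanos-querier.example.com")
TOKEN = os.environ.get("PROM_TOKEN", "<bearer-token>")

# Average power (watts) per container over the last 5 minutes,
# derived from a cumulative joules counter via rate().
QUERY = "sum by (container_name) (rate(kepler_container_joules_total[5m]))"

resp = requests.get(
    f"{PROM_URL}/api/v1/query",
    params={"query": QUERY},
    headers={"Authorization": f"Bearer {TOKEN}"},
    verify=False,  # lab clusters often use self-signed certs; tighten elsewhere
    timeout=30,
)
resp.raise_for_status()

# Prometheus instant-query responses carry results under data.result,
# each with a label set and a [timestamp, value] pair.
for result in resp.json()["data"]["result"]:
    container = result["metric"].get("container_name", "<unknown>")
    watts = float(result["value"][1])
    print(f"{container}: {watts:.2f} W")
{code}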

      Why is this important?

      • Gain experience and develop workflows for measuring the power usage of workloads running on OCP.
      • Publish a blog post about running inference workloads on OpenShift, building on Anisha's internship project.

      Acceptance Criteria

      Open questions

      Done Checklist


            Assignee: David Gray (dagray@redhat.com)
            Reporter: David Gray (dagray@redhat.com)
