Type: Feature
Resolution: Unresolved
Priority: Major
Category: Strategic Product Work
Parent: OCPSTRAT-1692 AI Workloads for OpenShift
Progress: 100% To Do, 0% In Progress, 0% Done
Overview
OpenShift's Dynamic Resource Allocation (DRA) for AI Workloads is designed to optimize the management of specialized hardware such as GPUs, which is critical for running AI/ML and large language model (LLM) inference workloads. Instead of requesting devices as fixed counts exposed by device plugins, workloads describe the hardware they need through resource claims, allowing the scheduler to match pods to available accelerators, scale resources with demand, and keep expensive hardware efficiently utilized.
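
To illustrate the claim-based request model described above, the following Go sketch builds a DRA ResourceClaim asking for a single GPU. It is a minimal example, not the feature's final design: it assumes the upstream resource.k8s.io/v1beta1 API group (Kubernetes 1.32-era DRA) and a hypothetical "gpu.nvidia.com" DeviceClass published by the GPU driver; both may differ on a given cluster.

// Sketch: constructing a DRA ResourceClaim for one GPU.
// Assumptions: resource.k8s.io/v1beta1 API and a "gpu.nvidia.com" DeviceClass.
package main

import (
	"encoding/json"
	"fmt"

	resourcev1beta1 "k8s.io/api/resource/v1beta1"
	metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
)

func main() {
	// A ResourceClaim requests devices by class and count, rather than
	// pinning a fixed extended-resource quantity on each container.
	claim := &resourcev1beta1.ResourceClaim{
		ObjectMeta: metav1.ObjectMeta{
			Name:      "llm-inference-gpu",
			Namespace: "ai-workloads",
		},
		Spec: resourcev1beta1.ResourceClaimSpec{
			Devices: resourcev1beta1.DeviceClaim{
				Requests: []resourcev1beta1.DeviceRequest{{
					Name:            "gpu",
					DeviceClassName: "gpu.nvidia.com", // assumed DeviceClass name
					AllocationMode:  resourcev1beta1.DeviceAllocationModeExactCount,
					Count:           1,
				}},
			},
		},
	}

	// Print the claim so it can be inspected or applied with kubectl.
	out, _ := json.MarshalIndent(claim, "", "  ")
	fmt.Println(string(out))
}

A pod would then reference a claim like this (or a ResourceClaimTemplate) from its spec, and the scheduler places the pod only on a node where the requested device can be allocated.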