Project: OpenShift Container Platform (OCP) Strategy
Issue: OCPSTRAT-2490

Support "Red Hat AI Inference Server" as LLM provider in OLS


    • Product / Portfolio Work
    • Parent: OCPSTRAT-2123 OpenShift Lightspeed 2.0

      Feature Overview (aka. Goal Summary)  

      Red Hat has three inference servers:
      1) RHOAI
      2) RHEL AI
      3) Red Hat AI Inference Server

      OLS has aligned itself with two of these (RHOAI and RHEL AI). We also want to add support for Red Hat AI Inference Server as an LLM provider in OLS.
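
      As a rough illustration of what this support might look like, below is a minimal sketch of an OLSConfig provider entry. This is an assumption-laden example, not the final design: the provider `name`, the `url`, the secret name, and the model name are all hypothetical, and the `type` value assumes Red Hat AI Inference Server exposes a vLLM-based, OpenAI-compatible endpoint (the actual provider `type` for RHAIIS would be defined by this feature).

      ```yaml
      # Hypothetical sketch only -- field values are placeholders, not the agreed design.
      apiVersion: ols.openshift.io/v1alpha1
      kind: OLSConfig
      metadata:
        name: cluster
      spec:
        llm:
          providers:
            - name: rhaiis                       # hypothetical provider name
              type: openai                       # assumption: OpenAI-compatible vLLM endpoint
              url: https://rhaiis.example.com/v1 # hypothetical endpoint URL
              credentialsSecretRef:
                name: rhaiis-api-key             # hypothetical secret holding the API token
              models:
                - name: granite-3-8b-instruct    # hypothetical model name
      ```

      In practice this would mirror how the existing RHOAI and RHEL AI providers are configured, with a dedicated provider type (and any RHAIIS-specific validation) added by this feature.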

              gausingh@redhat.com Gaurav Singh