Red Hat Internal Developer Platform
RHIDP-11685

Upstream engagement on how to map AI Model Servers to the Backstage catalog

    • upstream-map-ai-model-catalog
    • Parent: RHDHPLAN-944 - [Spike] investigations, code pocs, upstream engagements for connector upstream plugin
    • Status: To Do
    • QE Needed, Docs Needed, TE Needed, Customer Facing, PX Needed
    • 67% To Do, 33% In Progress, 0% Done

      EPIC Goal

      Get upstream consensus on how we map AI Model Servers and AI Models in the Backstage catalog.

      Background/Feature Origin

       

      Reviewing the equivalent discussion on mapping MCP servers to the catalog (https://github.com/backstage/backstage/issues/32062) surfaced several points that resonate here:

      • the similarities between MCP servers and AI Model Servers
      • the simpler approach the Spotify maintainers took of focusing on the API entity
      • the increased flexibility in what can be set for the type field of the API entity
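      For illustration only, an API-entity-based mapping along the lines of the MCP discussion could look like the sketch below. The type value, field contents, and placement of connection details are all assumptions; deciding the actual mapping is the point of this epic.

      ```yaml
      # Hypothetical sketch only -- not a decided upstream convention.
      apiVersion: backstage.io/v1alpha1
      kind: API
      metadata:
        name: example-model-server
        description: Illustrative AI Model Server entry (placeholder values)
      spec:
        # "model-server" is a placeholder; the API entity's type field is
        # free-form, which is the flexibility noted in the list above.
        type: model-server
        lifecycle: experimental
        owner: example-team
        definition: |
          # Where endpoint/connection details would live (here, in
          # annotations, or elsewhere) is exactly what needs consensus.
          url: https://models.example.com/v1
      ```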

      During discussion of this RFC in the Jan 20 Framework SIG, in some exchanges between Ben Lambert and me, he cracked the door open to modeling "other AI stuff," and I broached the subject of modeling AI Model Servers.

      Nothing definitive arose from the details exchanged.

      We were given references to the BEP process at https://github.com/backstage/backstage/tree/master/beps

      Clear next steps are to open an RFC and most likely follow up with a BEP to reach consensus on what the mapping into the catalog should be.

      Ideally, the MCP Server RFC declares its choice for modeling MCP Servers soon, and we base our proposal on that, augmenting only as necessary.

      Why is this important?

      Consensus on how the mapping to the catalog occurs will best facilitate contributions from anyone in the community who would like to import from any of the many available model registries, while still providing a consistent experience for Backstage users and allowing maximum re-use of components built in either the core or community plugin ecosystems.

      User Scenarios

      • exposing TechDocs or FastAPI docs for AI models (though this may be untenable for an initial release given upstream vs. downstream dependencies)
      • exposing URL references and related connection information in the catalog in a way that can be leveraged by Backstage templates
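      As a sketch of the second scenario, connection information could be surfaced via metadata annotations that a software template then reads. The annotation keys below are made up for illustration; no such keys are defined upstream.

      ```yaml
      # Hypothetical annotations only -- keys and values are placeholders.
      metadata:
        name: example-model
        annotations:
          example.ai/inference-url: https://models.example.com/v1/chat
          example.ai/registry-url: https://registry.example.com/models/example-model
      ```

      Annotations have the advantage of being readable by templates and plugins without changes to the core entity schema, though whether annotations or spec fields are the right home is part of what the RFC needs to settle.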

      Dependencies (internal and external)

      • Backstage maintainer cycles

      Acceptance Criteria

      • upstream RFC and BEP submitted
        • SIG participation / self-advocacy to at least get a sense of when final decisions will land

              gmontero@redhat.com Gabe Montero
              RHDH AI