Red Hat Internal Developer Platform
RHIDP-7829

Include rhdh-local in the release process (tagging/branching)

    • Type: Epic
    • Resolution: Done
    • Priority: Normal
    • Fix Version/s: 1.8.0, 1.6.1
    • Component/s: Build, Release, RHDH Local
    • Epic Name: Include rhdh-local in the release process
    • Size: M
    • Status: In Progress
    • Parent Link: RHDHPLAN-237 - Add an RHDH release task to default RHDH Local to the latest RHDH Release
    • Flags: QE Needed, Docs Needed, TE Needed, Customer Facing, PX Needed
    • Epic Progress: 0% To Do, 0% In Progress, 100% Done

      Goal

      PM would like the rhdh-local repo to get updates every time we do a .0 release, so that the OOTB default points to the same current release as the helm and operator releases.
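
      A minimal sketch of what such a release task could look like, assuming rhdh-local pins its default image through an RHDH_IMAGE variable in a .env file (the variable name, file layout, and the quay.io/rhdh/rhdh-hub-rhel9 image are assumptions for illustration, not details confirmed by this epic):

      ```python
      import re
      import subprocess
      from pathlib import Path

      def update_default_image(version: str, env_file: Path = Path(".env")) -> None:
          """Point the OOTB default at the image for a new .0 release."""
          image = f"quay.io/rhdh/rhdh-hub-rhel9:{version}"  # assumed image name
          text = env_file.read_text()
          # Rewrite whatever the RHDH_IMAGE variable currently points at.
          text = re.sub(r"^RHDH_IMAGE=.*$", f"RHDH_IMAGE={image}", text, flags=re.M)
          env_file.write_text(text)

      def tag_release(version: str) -> None:
          """Mirror the tagging/branching done for the helm and operator releases."""
          subprocess.run(["git", "checkout", "-b", f"release-{version}"], check=True)
          subprocess.run(["git", "commit", "-am", f"chore: default to RHDH {version}"], check=True)
          subprocess.run(["git", "tag", version], check=True)

      if __name__ == "__main__":
          # A 1.8.0 release would bump the floating default to :1.8.
          update_default_image("1.8")
          tag_release("1.8.0")
      ```

      Whether the default should be a floating tag like :1.8 or a pinned digest is one of the unknowns below.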

      Background/Feature Origin

      Thread: https://redhat-internal.slack.com/archives/C04CUSD4JSG/p1749735545824589

      Why is this important?

      Makes rhdh-local easier to use OOTB.
      Related issue: RHIDP-1351 (arm64 builds, upstream and/or downstream)

      Requirements

      Unknowns

      • Should we use pinned digests, as we do in helm and operator (updated with every .z release), or is a floating tag like :1.6 good enough for this local installation (updated only on .0 releases)? See the digest sketch after this list.
      • Is linking to quay.io/rhdh acceptable, or should we default to using only the registry.redhat.io GA releases?
      • Should we remove the mention of the unsupported arm64 community builds if the goal is to make rhdh-local a fully GA-supported offering?
      • Do we want to start publishing the same registry.redhat.io 1.y.z tags to quay.io/rhdh, so that we can point to 1.7.2 instead of just 1.7?
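
      For the pinned-digest option, a release job could resolve the floating tag to an immutable digest reference with skopeo, along these lines (a sketch only; the image name is an example, and whether it lives on quay.io/rhdh or registry.redhat.io is exactly the open question above):

      ```python
      import json
      import subprocess

      def resolve_digest(image: str) -> str:
          """Resolve a floating tag (e.g. :1.6) to a pinned digest reference."""
          # `skopeo inspect` prints image metadata, including the manifest digest.
          out = subprocess.run(
              ["skopeo", "inspect", f"docker://{image}"],
              check=True, capture_output=True, text=True,
          ).stdout
          digest = json.loads(out)["Digest"]  # "sha256:..."
          repo = image.rsplit(":", 1)[0]
          return f"{repo}@{digest}"

      # Floating tag: moves on every .z release, so rhdh-local only needs a
      # bump on .0 releases. Pinned digest: immutable, so rhdh-local would
      # need an explicit update for every .z release, as helm/operator do.
      print(resolve_digest("quay.io/rhdh/rhdh-hub-rhel9:1.6"))
      ```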

              Assignee: Nick Boldt (nickboldt)
              Reporter: Nick Boldt (nickboldt)
              Team: RHIDP - Cope
              Votes: 0
              Watchers: 2
