• Type: Epic
    • Status: To Do
    • Resolution: Unresolved
    • Priority: Critical
    • Fix Version: 1.9.0
    • Components: Build, Catalog
    • Summary: refactor rhdh-plugin-catalog sync-midstream.sh
    • Epic Link: RHDHPLAN-232 - Productization: Plugin Catalog / Extensions Marketplace (1.9)
    • Labels: QE Needed, Docs Needed, TE Needed, Customer Facing, PX Needed

      In the new year I might have to step back from all this and start a NEW script that follows the most logical order of operations, so that we:

      • store some sort of digest of the status of the post-clone-cleanup, so we have a state from which to determine "do I need to clone again?" beyond just the contents of source.json, which isn't sufficient anymore
      • clone each repo or SPECIFIC repos (to avoid doing 40 clones if you only want 1) - my --force-clone isn't doing what I want, see previous point
      • avoid re-cloning backstage when we don't have to, while still being able to resolve the backstage manifest.json (and therefore backstage: refs) without a fresh clone
      • generate the manifest.json for each workspace so we can use that to remove workspace: refs
      • perform the plugin exports, using the package-list.yaml to filter which ones we want to export
      • keep the dist-dynamic content we need (embedded, package.json, yarn.lock), delete the rest
      • delete examples/, packages/{app,backend}, and any plugins/* we didn't export (like the -common ones)
      • commit changes to the GL repo
      • trigger rebuilds using the generatePipelineruns.sh --trigger command
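      The first two bullets could be sketched roughly like this. This is a minimal sketch, not the current sync-midstream.sh: the `.sync-state` directory, the `post-clone-cleanup.sh` name, and the choice of hash inputs are all assumptions.

      ```shell
      #!/bin/bash
      # Sketch: decide whether a repo needs re-cloning by comparing a stored
      # digest of its post-clone-cleanup state, instead of relying on
      # source.json alone. File names here are hypothetical.
      set -euo pipefail

      STATE_DIR=".sync-state"
      mkdir -p "$STATE_DIR"

      # Digest everything that defines the desired post-clone state: the
      # source.json contents plus the cleanup script itself.
      compute_digest() {
        { cat source.json post-clone-cleanup.sh 2>/dev/null || true; } \
          | sha256sum | cut -d' ' -f1
      }

      needs_clone() {
        local repo="$1" want have
        want=$(compute_digest)
        have=$(cat "$STATE_DIR/$repo.digest" 2>/dev/null || echo "none")
        [[ "$want" != "$have" ]]
      }

      record_state() {
        compute_digest > "$STATE_DIR/$1.digest"
      }

      # Clone only the repos named on the command line, not all 40.
      for repo in "$@"; do
        if needs_clone "$repo"; then
          echo "[$repo] state changed; would re-clone and re-run cleanup"
          # git clone ... && ./post-clone-cleanup.sh && record_state "$repo"
        else
          echo "[$repo] up to date; skipping clone"
        fi
      done
      ```

      The point of the digest is that it gives --force-clone-style logic something concrete to compare against, which source.json alone no longer provides.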

      That might end up being several scripts in a trenchcoat (so to speak), but it would make sync-midstream.sh much easier to read if it's just calling out to other scripts.

      It would also help with single-workspace updates.
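      The "scripts in a trenchcoat" shape could look like the sketch below. Every step is a stub function standing in for a hypothetical sub-script (clone-sources.sh, export-plugins.sh, and so on); only generatePipelineruns.sh --trigger exists in the current tooling.

      ```shell
      #!/bin/bash
      # Sketch: sync-midstream.sh as a thin driver over single-purpose steps.
      # Each stub echoes what the real (hypothetical) sub-script would do.
      set -euo pipefail

      clone_sources()     { echo "clone (state-aware): $*"; }
      resolve_manifests() { echo "resolve backstage:/workspace: refs: $*"; }
      export_plugins()    { echo "export (package-list.yaml filter): $*"; }
      prune_workspaces()  { echo "prune examples/, app, backend, unexported plugins: $*"; }
      commit_changes()    { echo "commit to GL repo: $*"; }
      trigger_rebuilds()  { echo "would run: ./generatePipelineruns.sh --trigger"; }

      sync_midstream() {
        # $@ = specific workspaces; empty means all of them
        clone_sources "$@"
        resolve_manifests "$@"
        export_plugins "$@"
        prune_workspaces "$@"
        commit_changes "$@"
        trigger_rebuilds
      }

      sync_midstream "$@"
      ```

      Because each step takes an explicit workspace list, a single-workspace update is just `sync_midstream some-workspace` instead of a full resync.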

      Another thing to consider is that we might have to commit the output of `npx rhdh-cli plugin package` to a new repo:

      • do all the exports (including generating the .oci files) in a GL pipeline, non-hermetically;
      • commit those artifacts to a new GL repo (better performance, and the rhdh-plugin-catalog repo remains the SOURCE of truth for sources, not binaries)
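      The commit step for that artifact repo could be sketched as below. The repo URL, the `publish_artifacts` name, and the flat directory layout are all assumptions, not an existing script.

      ```shell
      #!/bin/bash
      # Sketch: publish prebuilt plugin artifacts (the .oci output of
      # `npx rhdh-cli plugin package`) to a separate GL repo so that
      # rhdh-plugin-catalog itself stays source-only.
      set -euo pipefail

      publish_artifacts() {
        local src_dir="$1"
        local repo="${ARTIFACT_REPO:?set ARTIFACT_REPO to the artifact repo URL}"
        local workdir
        workdir=$(mktemp -d)

        git clone -q "$repo" "$workdir"
        # Copy only built artifacts; sources never land in this repo.
        find "$src_dir" -name '*.oci' -exec cp {} "$workdir/" \;

        (
          cd "$workdir"
          git add -A
          # Commit and push only when something actually changed.
          if ! git diff --cached --quiet; then
            git -c user.name=sync-bot -c user.email=sync-bot@example.com \
              commit -qm "chore: publish plugin .oci artifacts"
            git push -q origin HEAD
          fi
        )
      }

      # When invoked directly: publish-artifacts.sh <dir-with-oci-files>
      if [[ $# -gt 0 && -n "${ARTIFACT_REPO:-}" ]]; then
        publish_artifacts "$1"
      fi
      ```

      The change-detection guard matters here: re-running the pipeline with identical artifacts should produce no new commit, so the artifact repo's history only grows when the exports actually change.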

      That might make the SBOM creation happier, but as a consequence it would mean we're not really building from source IN Konflux; we'd just be fetching sources and copying previously-built artifacts stored in GL into the scratch images as OCI artifacts.

        • Assignee: Nick Boldt (nickboldt)
        • Reporter: Nick Boldt (nickboldt)
        • RHDH Cope
        • Votes: 0
        • Watchers: 0