
Details

    • Epic
    • Resolution: Done
    • Component Readiness
    • To Do

    Description

      This epic will track the work to implement the various phases of David Eads' proposed Component Readiness project. The project will expose a dashboard in Sippy showing whether each individual component in the product is passing sufficiently for release.

      The top level view of the dashboard will use components as defined by Jira OCPBUGS components. Each component will show Red or Green for each NURP (network/upgrade/release/platform) combination we care about. This provides the highest level view on which components are ready for release/healthy.
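      As an illustrative sketch only (the actual criteria are what this epic will define), a component's cell for one NURP could be rolled up to Red/Green from its per-test pass rates, here assuming a hypothetical flat 95% threshold:

```python
def component_cell_color(test_pass_rates, threshold=95.0):
    """Roll one component's per-test pass rates for a single NURP up to a cell color.

    Illustrative rule: the cell is Green only if every test meets the
    threshold. The real criteria (thresholds, regression detection, etc.)
    are to be decided as part of this epic.
    """
    return "Green" if all(rate >= threshold for rate in test_pass_rates) else "Red"
```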

      Drilling down into a Component will show a second level chart based on the Capabilities/Features of that Component, again with a column for each NURP we care about.

      And drilling down into a Capability of a Component presents the final and third level chart, listing each test that relates to that Component, with a column for each NURP we care about. Cells in this table should list the pass percentage for the test on that NURP over some sampling of recent job runs.
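      A minimal sketch of that cell computation, assuming job results arrive as (test name, NURP tuple, passed) records over the sampling window; the record shape is illustrative, not Sippy's actual schema:

```python
from collections import defaultdict

def pass_percentages(results):
    """Aggregate raw test results into a pass percentage per (test, NURP) cell.

    `results` is an iterable of (test_name, nurp, passed) tuples, where
    `nurp` is a (network, upgrade, release, platform) tuple.
    """
    passes = defaultdict(int)
    totals = defaultdict(int)
    for test_name, nurp, passed in results:
        totals[(test_name, nurp)] += 1
        if passed:
            passes[(test_name, nurp)] += 1
    return {
        key: 100.0 * passes[key] / total
        for key, total in totals.items()
    }
```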

      The goals are to help teams self-manage the readiness and reliability of their components, and to assist patch managers and TRT.

      Design Considerations

      Milestones

      M1 - Proof of Concept

      Port David's PoC to Sippy. Establish relevant data views over Sippy data and a first draft of the APIs and UI.

      Simplistic mapping of test to capability/component via existing sig tags in test names. This will be refined in M2.
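      A hypothetical sketch of that simplistic sig-tag mapping; the regex and the tag-to-component table below are illustrative placeholders, not the real OCPBUGS component list:

```python
import re

# Illustrative tag-to-component table, not the actual OCPBUGS mapping.
SIG_TO_COMPONENT = {
    "sig-network": "Networking",
    "sig-storage": "Storage",
    "sig-auth": "apiserver-auth",
}

# Matches the "[sig-...]" tag embedded in openshift-tests style test names.
SIG_RE = re.compile(r"\[(sig-[a-z-]+)\]")

def component_for_test(test_name):
    match = SIG_RE.search(test_name)
    if match:
        return SIG_TO_COMPONENT.get(match.group(1), "Unknown")
    return "Unknown"
```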

      Initially based purely on test pass rate over last week.

      M2 - Component Readiness of Master Branch

      Determine how we'll flexibly map tests to components and capabilities. The capabilities layering must not require test name changes; embedding components in test names is already risky and causes a lot of problems for tooling, because a renamed test is effectively a new test and all of its history is lost. This mapping needs to be easy to maintain.
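      One possible shape for such a mapping, sketched as ordered pattern rules that match existing test names so nothing ever has to be renamed; the rules, components, and capabilities below are illustrative assumptions:

```python
import re

# Hypothetical externally maintained rules: first match wins, and rules
# match current test names, so no test rename (and no history loss) is needed.
MAPPING_RULES = [
    # (regex on test name, component, capability) -- illustrative values
    (r"\[sig-network\].*NetworkPolicy", "Networking", "NetworkPolicy"),
    (r"\[sig-network\]",                "Networking", "Core"),
]

def classify(test_name):
    for pattern, component, capability in MAPPING_RULES:
        if re.search(pattern, test_name):
            return component, capability
    return None  # unmapped: surfaced so someone can extend the rules
```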

      Complete the mapping of all tests, or as many as possible, to components and features.

      Provide a mechanism for temporarily adjusting pass thresholds, with an audit trail and built-in expiration. We expect this to be code-based, but other options are on the table. It must be easy to maintain for both individual tests and groups of tests.
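      A hypothetical sketch of the code-based option, where each override carries its own audit trail and a hard expiration date after which the default threshold applies again; every name and value here is illustrative:

```python
from dataclasses import dataclass
from datetime import date

DEFAULT_THRESHOLD = 95.0  # illustrative default, not a decided value

@dataclass
class ThresholdOverride:
    test_pattern: str   # substring applying to a test or a group of tests
    threshold: float
    reason: str         # audit trail: justification / bug link
    owner: str
    expires: date       # built-in expiration

OVERRIDES = [
    ThresholdOverride(
        test_pattern="[sig-network] NetworkPolicy",
        threshold=80.0,
        reason="known flake while a fix lands (illustrative entry)",
        owner="team-networking",
        expires=date(2023, 6, 1),
    ),
]

def threshold_for(test_name, today):
    """Return the first unexpired matching override, else the default."""
    for override in OVERRIDES:
        if override.test_pattern in test_name and today <= override.expires:
            return override.threshold
    return DEFAULT_THRESHOLD
```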

      M3 - Component Readiness of Z-Streams

      Once master is operational, work to display readiness for z-stream releases to help ease or eliminate the patch manager role. The z-stream baseline is the master readiness at the time we ship the .0 release. This will require some kind of snapshot of master at release date, possibly using the Sippy snapshot functionality we already have (or similar).

      M4 - Component Readiness for Payloads

      If we obtain a way from Test Platform to run extensive testing against specific payloads, whether nightlies, CI, or PR /payload runs, this will allow us to request X runs against a given payload over some period of time.

      Once complete, we could build and display the component readiness for a specific payload, eliminating the temporal gap between master component readiness and a payload we intend to release at a specific time. This should also be available for pull-request /payload runs, which would help greatly with landing difficult rebases such as Kube and RHEL.

      People

        Assignee: Unassigned
        Reporter: rhn-engineering-dgoodwin (Devan Goodwin)
