OpenShift Cloud / OCPCLOUD-1707

Add Gomega Assertion Annotations to Test Suite in cluster-api-actuator-pkg Repository


      Epic Goal

      • Add assertion annotations to all the test cases in the cluster-api-actuator-pkg repository so that test failure reasons are more clearly represented in the test logs.

      Why is this important?

      • Assertion annotations are a feature of Gomega that give developers an opportunity to create rich failure messages for tests. These annotations are helpful to provide context on where a test has failed, and the related data that might help in fixing the error.
      • The cluster-api-actuator-pkg repository contains six packages with Gomega assertions that could be annotated. These packages fall into three sizes (large, medium, and small) according to the number of assertions each contains. All tests should be annotated to improve the accuracy and speed of debugging efforts.

      Scenarios

      When running the test suite, it is very common to see errors like this:

      [Feature:Machines] Autoscaler should
      /go/src/github.com/openshift/cluster-api-actuator-pkg/pkg/autoscaler/autoscaler.go:159
        use a ClusterAutoscaler that has 100 maximum total nodes count
        /go/src/github.com/openshift/cluster-api-actuator-pkg/pkg/autoscaler/autoscaler.go:206
          It scales from/to zero [It]
          /go/src/github.com/openshift/cluster-api-actuator-pkg/pkg/autoscaler/autoscaler.go:250
      
          Timed out after 180.002s.
          Expected
              <bool>: false
          to be true
      
      /go/src/github.com/openshift/cluster-api-actuator-pkg/pkg/autoscaler/autoscaler.go:294

      This provides almost no context for a reviewer on how to interpret the failure and what might have happened. With more context, a reviewer will understand which part of the test failed and whether there is specific data that can be correlated with the failure (e.g., a machine or node name).

      Acceptance Criteria

      • CI - MUST be running successfully with tests automated
      • All assertions listed in the attached document have failure annotations.
      • Update contributing docs to include explicit advice about adding annotations when creating new tests.

      Dependencies (internal and external)

      1.  

      Previous Work (Optional):

      1. cluster-api-actuator-pkg testing introspection
      2. Gomega documentation on Annotating Assertions

      Open questions:

      Done Checklist

      • CI - CI is running, tests are automated and merged.
      • DEV - Code and tests merged: <link to meaningful PR or GitHub Issue>
      • QE - Test plans in Polarion: <link or reference to Polarion>
      • QE - Automated tests merged: <link or reference to automated tests>
      • DOC - Documentation merged: <link to meaningful PR>

              mimccune@redhat.com Michael McCune