Type: Story
Resolution: Done
Priority: Critical
We need a document that defines how to use Polarion, to reduce confusion and noise.
Requirements:
We have Polarion test runs in which we can see information such as the OSS version, cloud platform, type of tests, etc. (see proposals below for more details).
TBD:
We can run queries to extract data from test runs, e.g.:
- How often do we see failures for Maistra on OCP 4.12?
- What is the percentage of tests we ran on ROSA in the last 3 months vs. on PSI?
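For illustration only, such questions could translate into queries along these lines (the field names are placeholders for the custom fields proposed below, and the exact Polarion query syntax is TBD):

    COMPONENT:Maistra AND OCP_VERSION:4.12 AND status:failed
    PLATFORM:ROSA AND created:[NOW-3MONTHS TO NOW]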
Here is a STRAWMAN proposal for the team to further develop.
[Jul 18, 2023]
To make things easier to understand, we have two proposed options (the team can further improve / modify what is described below):
1. Put all information in the name of the test run, with no custom fields
Name like:
OSSM2.4.0_nightly07182023cypressUI_disconnected_OCP413.4_x86_OVN
2. Put all information in the name of the test run AND in custom fields
Same as #1 above, but each field is also added as a custom field to allow custom queries like:
select all test runs for OSSM 2.4.0 and cypress and nightly builds and in (Jun and Jul) and FIPS and OCP 4.13 and IBM P and networking=OVN
Doing #1 first in 3Q 2023 and then adding custom field queries later (1Q 2024) is also an option. A sketch of such a structured name follows below.
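As an illustration of option #1, here is a minimal sketch in Python of building and parsing such a name (the field order, field names, and the "_" separator are assumptions, not a final convention):

    # Sketch only: field order, names, and the "_" separator are assumptions.
    FIELDS = ["version", "build_type", "date", "suite", "kind",
              "ocp_version", "architecture", "network"]

    def build_test_run_name(info: dict) -> str:
        return "_".join(info[field] for field in FIELDS)

    def parse_test_run_name(name: str) -> dict:
        return dict(zip(FIELDS, name.split("_")))

    # build_test_run_name({"version": "OSSM2.4.0", "build_type": "nightly",
    #                      "date": "07182023", "suite": "cypressUI",
    #                      "kind": "disconnected", "ocp_version": "OCP413.4",
    #                      "architecture": "x86", "network": "OVN"})
    # -> "OSSM2.4.0_nightly_07182023_cypressUI_disconnected_OCP413.4_x86_OVN"

Note that a purely name-based scheme only parses cleanly if field values never contain the separator (e.g. an ARCHITECTURE value like "IBM P" would have to be normalized to something like "IBMP"), which is one argument for also storing the fields as queryable custom fields, as in option #2.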
[Jul 6, 2023]
In the Polarion uploader form:
- add custom fields (a validation sketch follows this list):
  - mandatory:
    - TEST_TYPE [nightly, sprintly, regression]
    - TEST_PLAN
    - COMPONENT (at test run level) [Istio / Kiali Cypress / Kiali E2E]
    - KIND [disconnected, default, FIPS] (for future data analysis)
    - ARCHITECTURE [x86, IBM P, IBM Z, ARM]
  - optional (team to decide):
    - OCP_VERSION
    - KIND [disconnected, default, FIPS]
    - OCP_NETWORK_TYPE [OVN, SDN]
    - FLAVOR [IPI / UPI]
- remove (from current form): TESTSUB_TYPE1
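A minimal sketch (assuming Python tooling around the uploader form; the allowed values simply mirror the lists above) of validating the mandatory fields before an upload is accepted:

    # Sketch: enforce the mandatory field split proposed above.
    MANDATORY_FIELDS = {
        "TEST_TYPE": {"nightly", "sprintly", "regression"},
        "TEST_PLAN": None,  # required, but free-form
        "COMPONENT": {"Istio", "Kiali Cypress", "Kiali E2E"},
        "KIND": {"disconnected", "default", "FIPS"},
        "ARCHITECTURE": {"x86", "IBM P", "IBM Z", "ARM"},
    }

    def validate_upload_fields(fields: dict) -> list:
        """Return a list of validation errors (empty list means OK)."""
        errors = []
        for name, allowed in MANDATORY_FIELDS.items():
            if name not in fields:
                errors.append("missing mandatory field: " + name)
            elif allowed is not None and fields[name] not in allowed:
                errors.append(name + "=" + fields[name] + " is not an allowed value")
        return errors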
In the uploader:
- specify the test plan when creating the test run
- auto-detect and collect information on TEST_PLAN, COMPONENT, TEST_SUBTYPE1, OCP_VERSION, KIND, OCP_NETWORK_TYPE & FLAVOR, and add it to the test run name
- tests that require a retry but eventually pass will be reported as passed (see the sketch after this list)
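For the retry rule, a minimal sketch (assuming the uploader sees each attempt as a separate junit result; function and type names are hypothetical):

    # Sketch: collapse retried attempts into one verdict per test.
    # A test that failed first but passed on a retry is reported as "passed".
    def final_verdicts(attempts):
        # attempts: [(test_id, "passed" | "failed"), ...] in execution order
        verdicts = {}
        for test_id, outcome in attempts:
            if outcome == "passed" or test_id not in verdicts:
                verdicts[test_id] = outcome
        return verdicts

    # final_verdicts([("t1", "failed"), ("t1", "passed"), ("t2", "failed")])
    # -> {"t1": "passed", "t2": "failed"}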
Pipelines:
- generate XML reports and gather information on the required fields mentioned above (a sketch of stamping this metadata into the reports follows below)
- add a field to indicate whether results should be uploaded to Polarion (e.g. results of manually triggered tests run for investigation may not need to be uploaded)
- for selected test runs (e.g. all tests run for z-stream release testing), upload results to the SAME test run (the objective is to have one set of data instead of a big table of repeated test runs)
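To make the upload step concrete, here is a sketch of stamping the Polarion metadata into a generated junit XML report (the polarion-* property names follow the common xUnit-importer convention but must be verified against the importer we actually use):

    import xml.etree.ElementTree as ET

    # Sketch: inject Polarion importer properties into an existing junit XML.
    # Property names (polarion-testrun-id, polarion-custom-*) are assumptions
    # to be verified against our xUnit importer version.
    def stamp_polarion_properties(report_path, testrun_id, custom_fields):
        tree = ET.parse(report_path)
        # Attach at the report root; adjust to whichever suite element
        # our importer expects the properties under.
        props = ET.SubElement(tree.getroot(), "properties")
        ET.SubElement(props, "property",
                      name="polarion-testrun-id", value=testrun_id)
        for field, value in custom_fields.items():
            ET.SubElement(props, "property",
                          name="polarion-custom-" + field, value=value)
        tree.write(report_path)

Reusing the same testrun_id across the jobs of one z-stream release testing round is what lets results accumulate in a single test run; the "should this be uploaded" flag then simply gates whether the pipeline calls this step at all.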