Story
Resolution: Unresolved
Product / Portfolio Work
Ref: Symptoms / observations / job analysis feature
Context: This step automates applying symptom labels to job runs once they finish running. It extends previous work in LoadJobRunData that identified symptoms and applied labels.
Action Items:
Within the job intake cloud function (LoadJobRunData), when finished.json is created:
- check for label files under artifacts/job_labels
- if any exist, update the prow job results to display the labels that those files record
- keep in mind that label files can be updated retroactively; ideally a static accordion insert (like the ones for intervals or risk analysis) would load the bucket files dynamically to display the unique labels that have been applied, along with their descriptions (a nice way to link the individual matches would be welcome, but users can always inspect the bucket files themselves); see the sketch after this list
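
A minimal sketch of the finished.json hook, assuming a background GCS object-finalize trigger. The OnFinished name, the JobLabel file shape, and the dedup-by-name step are all illustrative assumptions, not the existing LoadJobRunData code:

package jobintake

import (
	"context"
	"encoding/json"
	"path"

	"cloud.google.com/go/storage"
	"google.golang.org/api/iterator"
)

// GCSEvent is the payload delivered by a GCS object-finalize trigger.
type GCSEvent struct {
	Bucket string `json:"bucket"`
	Name   string `json:"name"`
}

// JobLabel is a hypothetical shape for one label file; the real format
// is whatever the symptom-detection step writes.
type JobLabel struct {
	Name        string `json:"name"`
	Description string `json:"description"`
}

// OnFinished collects the unique labels written under the job run's
// artifacts/job_labels/ prefix once finished.json appears.
func OnFinished(ctx context.Context, e GCSEvent) error {
	if path.Base(e.Name) != "finished.json" {
		return nil // only react to job completion
	}
	client, err := storage.NewClient(ctx)
	if err != nil {
		return err
	}
	defer client.Close()

	// finished.json sits at the job run root, so the label files are
	// at <run>/artifacts/job_labels/ in the same bucket.
	prefix := path.Dir(e.Name) + "/artifacts/job_labels/"
	bkt := client.Bucket(e.Bucket)
	it := bkt.Objects(ctx, &storage.Query{Prefix: prefix})

	unique := map[string]JobLabel{} // dedup: one entry per label name
	for {
		attrs, err := it.Next()
		if err == iterator.Done {
			break
		}
		if err != nil {
			return err
		}
		r, err := bkt.Object(attrs.Name).NewReader(ctx)
		if err != nil {
			return err
		}
		var l JobLabel
		if err := json.NewDecoder(r).Decode(&l); err == nil {
			unique[l.Name] = l
		}
		r.Close()
	}
	// TODO: surface `unique` in the prow job results display.
	return nil
}

Because label files can land retroactively, the accordion's data path should re-list the prefix at view time rather than caching whatever this hook saw at intake.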
As much as possible, the logic for writing labels should live in the sippy codebase (and be imported by the cloud function), so that other tools doing the same work in different contexts can reuse it.
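
One way that shared home could look, sketched under the assumption of a new sippy package; the joblabels package path, the Dir constant, and the Write helper are hypothetical names, not existing sippy code:

// Hypothetical shared package, e.g. github.com/openshift/sippy/pkg/joblabels,
// imported by the cloud function and by any other tool that writes labels.
package joblabels

import (
	"context"
	"encoding/json"

	"cloud.google.com/go/storage"
)

// Dir is the job-run-relative prefix where label files are stored.
const Dir = "artifacts/job_labels/"

// Label mirrors the hypothetical file shape sketched above.
type Label struct {
	Name        string `json:"name"`
	Description string `json:"description"`
}

// Write stores one label file for a job run; jobRunPath is the GCS
// prefix of the run (the directory that holds finished.json).
func Write(ctx context.Context, bkt *storage.BucketHandle, jobRunPath string, l Label) error {
	w := bkt.Object(jobRunPath + "/" + Dir + l.Name + ".json").NewWriter(ctx)
	if err := json.NewEncoder(w).Encode(l); err != nil {
		w.Close()
		return err
	}
	return w.Close() // the upload is only durable once Close succeeds
}

Keeping the prefix constant and the file shape in one importable package is what lets the cloud function, the labeler, and the display path agree on conventions without copy-pasting them.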