• Type: Feature
    • Resolution: Unresolved
    • Priority: Critical
    • Product / Portfolio Work
    • 100% To Do, 0% In Progress, 0% Done

      Goal Summary:

      Image vulnerability scanning is the most used feature of ACS, yet users are often confused about how image scans work: Why are scans not running? Why are they failing? How often are they executed? What secrets are used? And so on.

      Understanding and troubleshooting image scans is a cumbersome process. The docs have only high-level information, the UI has static notes, and to troubleshoot, users have to inspect many logs, which will not always contain the necessary details (especially at the default Info logging level).

      To make it clear when, where, and how an image was scanned, and what was scanned, we will introduce a Scan Audit History that, for a given image, will hold the recent history of executed scans and associated details, such as the following (a sketch of a possible event record appears after this list):

      • what requested the scan?
      • the image integrations attempted/used
      • where/when/what pulled the image metadata (Central? Sensor? the secured cluster?)
      • where/when/what executed the scan (delegated? handled by Central? Scanner V2? Scanner V4?)
      • where/when/what errors occurred (did pulling metadata fail? did contacting the scanner fail? did errors occur that ACS recovered from? etc.)
      • where/when/what signatures were pulled and whether they were valid
      • was a new scan executed, or were results retrieved from the cache/database?
      • etc.
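
      As an illustration only, a single audit event record might look like the
      following Go sketch; every name here is hypothetical, and the real events
      and fields are to be defined during design review:

        package scanaudit

        import "time"

        // ScanAuditEvent is a hypothetical record of one step in an image
        // scan. All field names are illustrative, not a committed contract.
        type ScanAuditEvent struct {
            ImageFullName string    // e.g. "quay.io/org/app:1.2.3"
            Timestamp     time.Time // when the step occurred
            Requestor     string    // what triggered the scan (admission check, reprocessing, user request, ...)
            Component     string    // where the step ran (Central, Sensor, secured cluster)
            Integration   string    // image integration attempted/used, if any
            Scanner       string    // e.g. "scanner-v2" or "scanner-v4", empty if not applicable
            Step          string    // e.g. "pull-metadata", "execute-scan", "verify-signatures", "cache-hit"
            Err           string    // error message, empty on success (recovered errors are also recorded)
        }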

      Goals and expected user outcomes:

      This increased visibility will reduce the current guesswork and assumptions, ideally leading to a better understanding of the product and fewer support interactions (and/or more efficient ones); in turn, this may increase confidence in the product.

      From the UI and the API, a user will be able to obtain the history of recent scans for any image.
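
      For illustration, the API surface (whose contract is part of the design
      work below) could be as small as a single lookup; the service and method
      names here are placeholders, reusing the hypothetical ScanAuditEvent
      record sketched above:

        package scanaudit

        import "context"

        // ScanAuditService is a hypothetical API for retrieving scan audit
        // history; its shape is a placeholder pending the real API contract.
        type ScanAuditService interface {
            // GetImageScanAudit returns up to limit recent audit events for
            // the image with the given ID, newest first.
            GetImageScanAudit(ctx context.Context, imageID string, limit int) ([]ScanAuditEvent, error)
        }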

      Acceptance Criteria:

      WIP

      • Design created / reviewed (including defining events to capture and API contracts)
      • UI/UX designed / implemented
      • Events captured and stored (see the storage sketch after this list)
      • ...
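
      As a sketch of the "events captured and stored" item, assuming a simple
      bounded per-image history (the retention policy and persistence layer are
      open design questions; a real implementation would persist to Central's
      database rather than memory):

        package scanaudit

        import "sync"

        // auditStore keeps the N most recent events per image in memory,
        // purely to illustrate bounded retention of audit history.
        type auditStore struct {
            mu     sync.Mutex
            events map[string][]ScanAuditEvent // keyed by image ID
            limit  int
        }

        func newAuditStore(limit int) *auditStore {
            return &auditStore{events: make(map[string][]ScanAuditEvent), limit: limit}
        }

        // Record appends an event for an image and drops the oldest events
        // once the per-image limit is exceeded.
        func (s *auditStore) Record(imageID string, ev ScanAuditEvent) {
            s.mu.Lock()
            defer s.mu.Unlock()
            evs := append(s.events[imageID], ev)
            if len(evs) > s.limit {
                evs = evs[len(evs)-s.limit:]
            }
            s.events[imageID] = evs
        }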

      Success Criteria or KPIs measured:

      A list of specific, measurable criteria that will be used to determine if the feature is successful. Include key performance indicators (KPIs) or other metrics. Initial completion during Refinement status.

      <enter success criteria and/or KPIs here>

      Use Cases (Optional):

      Include use case diagrams, main success scenarios, alternative flow scenarios together with user type/persona. Initial completion during Refinement status.

      <your text here>

      Out of Scope (Optional):

      High-level list of items that are out of scope. Initial completion during Refinement status.

      <your text here>

        1. hackathon-1.png (480 kB, David Caravello)
        2. hackathon-2.png (469 kB, David Caravello)
        3. hackathon-3.png (314 kB, David Caravello)
        4. hackathon-4.png (283 kB, David Caravello)
        5. hackathon-5.png (620 kB, David Caravello)
        6. hackathon-6.png (463 kB, David Caravello)

              dcaravel David Caravello
              Shubha Badve
              ACS Scanner
              Votes: 1
              Watchers: 8
