Red Hat OpenShift Data Science / RHODS-8175

Create a POC for having logs pushed to S3 via artifact script


    • Type: Spike
    • Resolution: Done
    • Priority: Normal
    • Fix Version/s: RHODS_1.26.0_GA
    • Component/s: Pipelines
    • Story Points: 2
    • Sprint: ML Ops Sprint 1.28, ML Ops Sprint 1.29

      Description of problem:

      When archiveLogs is set to true, Data Science Pipelines should collect the logs of all pipeline tasks and store them in S3. However, when running a pipeline with that parameter enabled, the following error appears in the log:

      failed to create task run pod "iris-pipeline-4bd96-data-prep": pods "iris-pipeline-4bd96-data-prep-pod" is forbidden: unable to validate against any security context constraint: [provider "anyuid": Forbidden: not usable by user or serviceaccount, provider "pipelines-scc": Forbidden: not usable by user or serviceaccount, spec.volumes[17]: Invalid value: "hostPath": hostPath volumes are not allowed to be used, spec.volumes[18]: Invalid value: "hostPath": hostPath volumes are not allowed to be used, spec.volumes[19]: Invalid value: "hostPath": hostPath volumes are not allowed to be used, spec.volumes[20]: Invalid value: "hostPath": hostPath volumes are not allowed to be used, provider "restricted": Forbidden: not usable by user or serviceaccount, provider "nonroot-v2": Forbidden: not usable by user or serviceaccount, provider "hostmount-anyuid": Forbidden: not usable by user or serviceaccount, provider "machine-api-termination-handler": Forbidden: not usable by user or serviceaccount, provider "hostnetwork-v2": Forbidden: not usable by user or serviceaccount, provider "hostnetwork": Forbidden: not usable by user or serviceaccount, provider "hostaccess": Forbidden: not usable by user or serviceaccount, provider "node-exporter": Forbidden: not usable by user or serviceaccount, provider "privileged": Forbidden: not usable by user or serviceaccount]. Maybe invalid TaskSpec
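
      The task pod is rejected because it requests four hostPath volumes (spec.volumes[17] through [20]), which the log-archiving step uses to read container logs from the node, and no SCC usable by the pipeline service account allows hostPath. The attached allow-hostpath-scc.yaml and scc-rb.yaml presumably take roughly the following shape; the SCC name, binding names, namespace, and the pipeline-runner-sample service account below are illustrative placeholders, not confirmed contents of the attachments:

      apiVersion: security.openshift.io/v1
      kind: SecurityContextConstraints
      metadata:
        name: allow-hostpath            # placeholder name
      allowHostDirVolumePlugin: true    # hostPath is what the rejected pod needs
      runAsUser:
        type: RunAsAny
      seLinuxContext:
        type: RunAsAny
      fsGroup:
        type: RunAsAny
      supplementalGroups:
        type: RunAsAny
      volumes:
        - hostPath
        - configMap
        - emptyDir
        - secret
        - persistentVolumeClaim
        - projected
      ---
      # A role granting "use" of the SCC, bound to the pipeline service account
      apiVersion: rbac.authorization.k8s.io/v1
      kind: ClusterRole
      metadata:
        name: use-allow-hostpath-scc
      rules:
        - apiGroups: ["security.openshift.io"]
          resources: ["securitycontextconstraints"]
          resourceNames: ["allow-hostpath"]
          verbs: ["use"]
      ---
      apiVersion: rbac.authorization.k8s.io/v1
      kind: RoleBinding
      metadata:
        name: allow-hostpath-scc-rb
        namespace: my-data-science-project   # placeholder
      roleRef:
        apiGroup: rbac.authorization.k8s.io
        kind: ClusterRole
        name: use-allow-hostpath-scc
      subjects:
        - kind: ServiceAccount
          name: pipeline-runner-sample       # placeholder; the SA the task runs as
          namespace: my-data-science-project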

      Prerequisites (if any, like setup, operators/versions):

      Deploy a DataSciencePipelinesApplication CR with the archiveLogs parameter set to true
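
      A minimal manifest along these lines should satisfy the prerequisite (field names follow the DSPO v1alpha1 API as we understand it; the name, namespace, and MinIO image are illustrative placeholders):

      apiVersion: datasciencepipelinesapplications.opendatahub.io/v1alpha1
      kind: DataSciencePipelinesApplication
      metadata:
        name: sample                     # placeholder
        namespace: my-data-science-project
      spec:
        apiServer:
          archiveLogs: true              # the parameter under test
        database:
          mariaDB:
            deploy: true                 # operator-managed database
        objectStorage:
          minio:
            deploy: true                 # operator-managed MinIO standing in for S3
            image: quay.io/opendatahub/minio:RELEASE.2019-08-14T20-37-41Z-license-compliance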

      Steps to Reproduce:

      1. Deploy a DataSciencePipelinesApplication CR with the archiveLogs parameter set to true
      2. Create a sample pipeline (the iris-pipeline sample can be used)
      3. Create a pipeline run from the previously created pipeline

      Actual results:

      The pipeline run fails with the SCC validation error shown above

      Expected results:

      The pipeline should run, and all task logs should be stored in S3

      As a part of the workaround for this, the task is to create a POC for having logs pushed to S3 via the artifact script in a dev cluster; a sketch of the pieces involved follows.
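
      For context, the operator injects an artifact script whose push_log helper copies the task's container log from the node and pushes it to S3 through the same path as other artifacts; a POC can supply a modified script via the DSPA's artifactScriptConfigMap field. The sketch below is reconstructed from memory of the default script and is illustrative only: the ConfigMap name is a placeholder, and the environment variables (ARTIFACT_ENDPOINT, ARTIFACT_BUCKET, PIPELINERUN, PIPELINETASK, PODNAME, NAMESPACE) are assumed to be injected by the operator.

      apiVersion: v1
      kind: ConfigMap
      metadata:
        name: custom-artifact-script     # placeholder
      data:
        artifact_script: |-
          #!/usr/bin/env sh
          # Pushes a file to the pipeline's S3 bucket under the run/task prefix
          push_artifact() {
              if [ -f "$2" ]; then
                  tar -cvzf $1.tgz $2
                  aws --endpoint-url ${ARTIFACT_ENDPOINT} s3 cp $1.tgz s3://${ARTIFACT_BUCKET}/artifacts/${PIPELINERUN}/${PIPELINETASK}/$1.tgz
              else
                  echo "$2 does not exist, skipping artifact tracking for $1"
              fi
          }
          # Collects this step's container log from the node and pushes it;
          # reading /var/log/containers is why the pod asks for hostPath volumes
          push_log() {
              cat /var/log/containers/${PODNAME}*${NAMESPACE}*step-main*.log > step-main.log
              push_artifact main-log step-main.log
          }

      The ConfigMap would then be referenced from the DSPA spec with something like:

      spec:
        apiServer:
          archiveLogs: true
          artifactScriptConfigMap:
            name: custom-artifact-script
            key: artifact_script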

      Attachments:

        1. scc-rb.yaml (0.3 kB, Dharmit Dalvi)
        2. allow-hostpath-scc.yaml (0.8 kB, Dharmit Dalvi)

              Assignee: Dharmit Dalvi (Inactive)
              Reporter: Ricardo Martinelli (Inactive)
              Humair Khan
              Jorge Garcia Oncins
              Votes: 0
              Watchers: 4
