Description of problem:
When archiveLogs is set to true, Data Science Pipelines should collect the logs of all pipeline tasks and store them in S3. However, when running a pipeline with that parameter enabled, the following error is shown in the log:
failed to create task run pod "iris-pipeline-4bd96-data-prep": pods "iris-pipeline-4bd96-data-prep-pod" is forbidden: unable to validate against any security context constraint: [provider "anyuid": Forbidden: not usable by user or serviceaccount, provider "pipelines-scc": Forbidden: not usable by user or serviceaccount, spec.volumes[17]: Invalid value: "hostPath": hostPath volumes are not allowed to be used, spec.volumes[18]: Invalid value: "hostPath": hostPath volumes are not allowed to be used, spec.volumes[19]: Invalid value: "hostPath": hostPath volumes are not allowed to be used, spec.volumes[20]: Invalid value: "hostPath": hostPath volumes are not allowed to be used, provider "restricted": Forbidden: not usable by user or serviceaccount, provider "nonroot-v2": Forbidden: not usable by user or serviceaccount, provider "hostmount-anyuid": Forbidden: not usable by user or serviceaccount, provider "machine-api-termination-handler": Forbidden: not usable by user or serviceaccount, provider "hostnetwork-v2": Forbidden: not usable by user or serviceaccount, provider "hostnetwork": Forbidden: not usable by user or serviceaccount, provider "hostaccess": Forbidden: not usable by user or serviceaccount, provider "node-exporter": Forbidden: not usable by user or serviceaccount, provider "privileged": Forbidden: not usable by user or serviceaccount]. Maybe invalid TaskSpec
Prerequisites (if any, like setup, operators/versions):
Deploy a DataSciencePipelinesApplication CR with the archiveLogs parameter set to true
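A minimal sketch of creating such a CR with the Kubernetes Python client. It assumes the archiveLogs flag lives under spec.apiServer and that the CR belongs to the datasciencepipelinesapplications.opendatahub.io/v1alpha1 group/version; the namespace name is illustrative and the remaining required spec fields (object storage, database) are omitted here:

from kubernetes import client, config

config.load_kube_config()  # or config.load_incluster_config() when run inside the cluster

# Assumed shape of the DSPA CR; only the field under test is shown.
dspa = {
    "apiVersion": "datasciencepipelinesapplications.opendatahub.io/v1alpha1",
    "kind": "DataSciencePipelinesApplication",
    "metadata": {"name": "sample", "namespace": "dspa-test"},
    "spec": {
        "apiServer": {
            "archiveLogs": True,  # parameter under test (location in the spec is an assumption)
        },
        # objectStorage / database configuration omitted for brevity
    },
}

client.CustomObjectsApi().create_namespaced_custom_object(
    group="datasciencepipelinesapplications.opendatahub.io",
    version="v1alpha1",
    namespace="dspa-test",
    plural="datasciencepipelinesapplications",
    body=dspa,
)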
Steps to Reproduce:
- Deploy a DataSciencePipelinesApplication CR with the archiveLogs parameter set to true
- Create a sample pipeline (iris-pipeline can be used)
- Create a pipeline run from the previously created pipeline (see the SDK sketch below)
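A minimal sketch of steps 2 and 3 with the KFP SDK, assuming a compiled iris-pipeline.yaml package and a reachable DSPA API route; the host URL, token placeholder, and experiment/run names are illustrative only:

import kfp

# Route host and bearer token are placeholders for a dev cluster.
client = kfp.Client(
    host="https://ds-pipeline-sample-dspa-test.apps.example.com",
    existing_token="<openshift-bearer-token>",
)

# Step 2: register the sample pipeline from a compiled package.
pipeline = client.upload_pipeline("iris-pipeline.yaml", pipeline_name="iris-pipeline")

# Step 3: create a run from the pipeline that was just registered.
experiment = client.create_experiment("archive-logs-test")
run = client.run_pipeline(
    experiment_id=experiment.id,
    job_name="iris-pipeline-archive-logs-run",
    pipeline_id=pipeline.id,
)
print(run.id)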
Actual results:
The pipeline run fails: the task run pod is rejected by SCC validation because its spec requests hostPath volumes, so no task pods are created.
Expected results:
The pipeline should run successfully, and the logs of all tasks should be stored in S3
As part of the workaround for this, the task is to create a PoC that pushes task logs to S3 via an artifact script in a dev cluster (a sketch follows).
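Since the failure comes from the SCC rejecting hostPath volumes, a PoC could push logs from inside the step container instead. Below is a hedged sketch, not the actual workaround: a small script an artifact/exit step might run to upload the step's log file to S3-compatible storage. The environment variable names, log path, bucket, and key layout are assumptions for a dev cluster.

import os
import boto3

def push_log_to_s3(log_path: str, bucket: str, key: str) -> None:
    """Upload a single task log file to S3-compatible object storage."""
    s3 = boto3.client(
        "s3",
        endpoint_url=os.environ.get("S3_ENDPOINT"),  # e.g. the DSPA object store route
        aws_access_key_id=os.environ["AWS_ACCESS_KEY_ID"],
        aws_secret_access_key=os.environ["AWS_SECRET_ACCESS_KEY"],
    )
    s3.upload_file(log_path, bucket, key)

if __name__ == "__main__":
    # Illustrative values only; a real script would derive these from the
    # pipeline run name and task name.
    push_log_to_s3(
        log_path="/var/log/containers/step-main.log",
        bucket="mlpipeline",
        key="artifacts/iris-pipeline/data-prep/main.log",
    )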