Extract /home/jenkins/oadp-e2e-qe.tar.gz to /alabama/cspi
Extract /home/jenkins/oadp-apps-deployer.tar.gz to /alabama/oadpApps
Extract /home/jenkins/mtc-python-client.tar.gz to /alabama/pyclient
Create and populate /tmp/test-settings...
Login as Kubeadmin to the test cluster at https://api.ci-op-24wp7hk6-2e88b.cspilp.interop.ccitredhat.com:6443...
WARNING: Using insecure TLS client config. Setting this option is not supported!
Login successful.
You have access to 70 projects, the list has been suppressed. You can list all projects with 'oc projects'
Using project "default".
Create virtual environment and install required packages...
Collecting ansible_runner
  Downloading ansible_runner-2.3.4-py3-none-any.whl (81 kB)
Collecting six
  Downloading six-1.16.0-py2.py3-none-any.whl (11 kB)
Collecting pyyaml
  Downloading PyYAML-6.0.1-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (705 kB)
Collecting python-daemon
  Downloading python_daemon-3.0.1-py3-none-any.whl (31 kB)
Collecting pexpect>=4.5
  Downloading pexpect-4.9.0-py2.py3-none-any.whl (63 kB)
Collecting packaging
  Downloading packaging-23.2-py3-none-any.whl (53 kB)
Collecting ptyprocess>=0.5
  Downloading ptyprocess-0.7.0-py2.py3-none-any.whl (13 kB)
Collecting lockfile>=0.10
  Downloading lockfile-0.12.2-py2.py3-none-any.whl (13 kB)
Collecting setuptools>=62.4.0
  Downloading setuptools-69.0.2-py3-none-any.whl (819 kB)
Collecting docutils
  Downloading docutils-0.20.1-py3-none-any.whl (572 kB)
Installing collected packages: setuptools, ptyprocess, lockfile, docutils, six, pyyaml, python-daemon, pexpect, packaging, ansible-runner
  Attempting uninstall: setuptools
    Found existing installation: setuptools 57.4.0
    Uninstalling setuptools-57.4.0:
      Successfully uninstalled setuptools-57.4.0
Successfully installed ansible-runner-2.3.4 docutils-0.20.1 lockfile-0.12.2 packaging-23.2 pexpect-4.9.0 ptyprocess-0.7.0 python-daemon-3.0.1 pyyaml-6.0.1 setuptools-69.0.2 six-1.16.0
WARNING: You are using pip version 21.2.3; however, version 23.3.1 is available.
You should consider upgrading via the '/alabama/venv/bin/python3 -m pip install --upgrade pip' command.
Processing /alabama/oadpApps
  DEPRECATION: A future pip version will change local packages to be built in-place without first copying to a temporary directory. We recommend you use --use-feature=in-tree-build to test your packages with this new behavior before it becomes the default. pip 21.3 will remove support for this functionality. You can find discussion regarding this at https://github.com/pypa/pip/issues/7555.
  Installing build dependencies: started
  Installing build dependencies: finished with status 'done'
  Getting requirements to build wheel: started
  Getting requirements to build wheel: finished with status 'done'
  Preparing wheel metadata: started
  Preparing wheel metadata: finished with status 'done'
Building wheels for collected packages: ocpdeployer
  Building wheel for ocpdeployer (PEP 517): started
  Building wheel for ocpdeployer (PEP 517): finished with status 'done'
  Created wheel for ocpdeployer: filename=ocpdeployer-0.0.1-py2.py3-none-any.whl size=199073 sha256=3e307f558eb00ce7c117e57c149623c06771a3d285f69dd7964fd2083c756b69
  Stored in directory: /tmp/pip-ephem-wheel-cache-ty046_gm/wheels/ea/a6/87/9d98fc51cc395d30bd2147a7e53e3f5a2e80044d9dd9e64977
Successfully built ocpdeployer
Installing collected packages: ocpdeployer
WARNING: Value for scheme.platlib does not match.
Please report this to distutils: /tmp/pip-target-9xtuc0ny/lib64/python sysconfig: /tmp/pip-target-9xtuc0ny/lib/python WARNING: Additional context: user = False home = '/tmp/pip-target-9xtuc0ny' root = None prefix = None Successfully installed ocpdeployer-0.0.1 WARNING: You are using pip version 21.2.3; however, version 23.3.1 is available. You should consider upgrading via the '/alabama/venv/bin/python3 -m pip install --upgrade pip' command. Processing /alabama/pyclient DEPRECATION: A future pip version will change local packages to be built in-place without first copying to a temporary directory. We recommend you use --use-feature=in-tree-build to test your packages with this new behavior before it becomes the default. pip 21.3 will remove support for this functionality. You can find discussion regarding this at https://github.com/pypa/pip/issues/7555. Collecting suds-py3 Downloading suds_py3-1.4.5.0-py3-none-any.whl (298 kB) Collecting requests Downloading requests-2.31.0-py3-none-any.whl (62 kB) Collecting jinja2 Downloading Jinja2-3.1.2-py3-none-any.whl (133 kB) Collecting kubernetes==11.0.0 Downloading kubernetes-11.0.0-py3-none-any.whl (1.5 MB) Collecting openshift==0.11.2 Downloading openshift-0.11.2.tar.gz (19 kB) Requirement already satisfied: setuptools>=21.0.0 in /alabama/venv/lib/python3.10/site-packages (from kubernetes==11.0.0->mtc==0.0.1) (69.0.2) Collecting websocket-client!=0.40.0,!=0.41.*,!=0.42.*,>=0.32.0 Downloading websocket_client-1.7.0-py3-none-any.whl (58 kB) Requirement already satisfied: pyyaml>=3.12 in /alabama/venv/lib/python3.10/site-packages (from kubernetes==11.0.0->mtc==0.0.1) (6.0.1) Collecting requests-oauthlib Downloading requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB) Collecting urllib3>=1.24.2 Downloading urllib3-2.1.0-py3-none-any.whl (104 kB) Requirement already satisfied: six>=1.9.0 in /alabama/venv/lib/python3.10/site-packages (from kubernetes==11.0.0->mtc==0.0.1) (1.16.0) Collecting google-auth>=1.0.1 Downloading google_auth-2.24.0-py2.py3-none-any.whl (183 kB) Collecting python-dateutil>=2.5.3 Downloading python_dateutil-2.8.2-py2.py3-none-any.whl (247 kB) Collecting certifi>=14.05.14 Downloading certifi-2023.11.17-py3-none-any.whl (162 kB) Collecting python-string-utils Downloading python_string_utils-1.0.0-py3-none-any.whl (26 kB) Collecting ruamel.yaml>=0.15 Downloading ruamel.yaml-0.18.5-py3-none-any.whl (116 kB) Collecting cachetools<6.0,>=2.0.0 Downloading cachetools-5.3.2-py3-none-any.whl (9.3 kB) Collecting rsa<5,>=3.1.4 Downloading rsa-4.9-py3-none-any.whl (34 kB) Collecting pyasn1-modules>=0.2.1 Downloading pyasn1_modules-0.3.0-py2.py3-none-any.whl (181 kB) Collecting pyasn1<0.6.0,>=0.4.6 Downloading pyasn1-0.5.1-py2.py3-none-any.whl (84 kB) Collecting ruamel.yaml.clib>=0.2.7 Downloading ruamel.yaml.clib-0.2.8-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl (526 kB) Collecting MarkupSafe>=2.0 Downloading MarkupSafe-2.1.3-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (25 kB) Collecting idna<4,>=2.5 Downloading idna-3.6-py3-none-any.whl (61 kB) Collecting charset-normalizer<4,>=2 Downloading charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (142 kB) Collecting oauthlib>=3.0.0 Downloading oauthlib-3.2.2-py3-none-any.whl (151 kB) Using legacy 'setup.py install' for mtc, since package 'wheel' is not installed. Using legacy 'setup.py install' for openshift, since package 'wheel' is not installed. 
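For reference, the preparation phase above boils down to a few shell steps. A minimal sketch, assuming each tarball unpacks directly into its target directory and that the 'oc' login shown earlier has already been performed (the exact flags the CI wrapper uses are not visible in this log):

  # Unpack the e2e test code, the sample-app deployer and the MTC python client
  mkdir -p /alabama/cspi /alabama/oadpApps /alabama/pyclient
  tar -xzf /home/jenkins/oadp-e2e-qe.tar.gz -C /alabama/cspi
  tar -xzf /home/jenkins/oadp-apps-deployer.tar.gz -C /alabama/oadpApps
  tar -xzf /home/jenkins/mtc-python-client.tar.gz -C /alabama/pyclient

  # Create the virtual environment and install the Python dependencies
  python3 -m venv /alabama/venv
  source /alabama/venv/bin/activate
  pip install ansible_runner        # pulls six, pyyaml, pexpect, packaging, ...
  pip install /alabama/oadpApps     # builds the local ocpdeployer wheel
  pip install /alabama/pyclient     # installs mtc plus the kubernetes/openshift clients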
Installing collected packages: urllib3, pyasn1, idna, charset-normalizer, certifi, rsa, requests, pyasn1-modules, oauthlib, cachetools, websocket-client, ruamel.yaml.clib, requests-oauthlib, python-dateutil, MarkupSafe, google-auth, ruamel.yaml, python-string-utils, kubernetes, jinja2, suds-py3, openshift, mtc
    Running setup.py install for openshift: started
    Running setup.py install for openshift: finished with status 'done'
    Running setup.py install for mtc: started
    Running setup.py install for mtc: finished with status 'done'
Successfully installed MarkupSafe-2.1.3 cachetools-5.3.2 certifi-2023.11.17 charset-normalizer-3.3.2 google-auth-2.24.0 idna-3.6 jinja2-3.1.2 kubernetes-11.0.0 mtc-0.0.1 oauthlib-3.2.2 openshift-0.11.2 pyasn1-0.5.1 pyasn1-modules-0.3.0 python-dateutil-2.8.2 python-string-utils-1.0.0 requests-2.31.0 requests-oauthlib-1.3.1 rsa-4.9 ruamel.yaml-0.18.5 ruamel.yaml.clib-0.2.8 suds-py3-1.4.5.0 urllib3-2.1.0 websocket-client-1.7.0
WARNING: You are using pip version 21.2.3; however, version 23.3.1 is available.
You should consider upgrading via the '/alabama/venv/bin/python3 -m pip install --upgrade pip' command.
Executing tests...
+ readonly 'RED=\e[31m'
+ RED='\e[31m'
+ readonly 'BLUE=\033[34m'
+ BLUE='\033[34m'
+ readonly 'CLEAR=\e[39m'
+ CLEAR='\e[39m'
++ oc get infrastructures cluster -o 'jsonpath={.status.platform}'
++ awk '{print tolower($0)}'
+ CLOUD_PROVIDER=aws
+ [[ aws == *-arm* ]]
+ E2E_TIMEOUT_MULTIPLIER=2
+ export NAMESPACE=openshift-adp
+ NAMESPACE=openshift-adp
+ export PROVIDER=aws
+ PROVIDER=aws
++ echo aws
++ awk '{print tolower($0)}'
+ BACKUP_LOCATION=aws
+ export BACKUP_LOCATION=aws
+ BACKUP_LOCATION=aws
+ export BUCKET=ci-op-24wp7hk6-interopoadp
+ BUCKET=ci-op-24wp7hk6-interopoadp
+ OADP_CREDS_FILE=/tmp/test-settings/credentials
+++ readlink -f /alabama/cspi/test_settings/scripts/test_runner.sh
++ dirname /alabama/cspi/test_settings/scripts/test_runner.sh
+ readonly SCRIPT_DIR=/alabama/cspi/test_settings/scripts
+ SCRIPT_DIR=/alabama/cspi/test_settings/scripts
++ cd /alabama/cspi/test_settings/scripts
++ git rev-parse --show-toplevel
+ readonly TOP_DIR=/alabama/cspi
+ TOP_DIR=/alabama/cspi
+ echo /alabama/cspi
/alabama/cspi
+ TESTS_FOLDER=e2e
++ oc get nodes -o 'jsonpath={.items[*].metadata.labels.kubernetes\.io/arch}'
++ tr ' ' '\n'
++ sort -u
++ xargs
+ export NODES_ARCHITECTURE=amd64
+ NODES_ARCHITECTURE=amd64
++ oc get ns openshift-storage
++ echo false
+ OPENSHIFT_STORAGE=false
+ '[' false == true ']'
++ oc get storageclass -o 'jsonpath={.items[?(@.metadata.annotations.storageclass\.kubernetes\.io/is-default-class=='\''true'\'')].metadata.name}'
+ DEFAULT_SC=gp3-csi
+ export STORAGE_CLASS=gp3-csi
+ STORAGE_CLASS=gp3-csi
+ '[' -n gp3-csi ']'
+ '[' gp3-csi '!=' gp3-csi ']'
+ export STORAGE_CLASS_OPENSHIFT_STORAGE=gp3-csi
+ STORAGE_CLASS_OPENSHIFT_STORAGE=gp3-csi
+ echo 'Using the StorageClass for openshift-storage: gp3-csi'
Using the StorageClass for openshift-storage: gp3-csi
+ [[ amd64 != \a\m\d\6\4 ]]
+ TEST_FILTER='!// || (// && !exclude_aws && (!/target/ || target_aws) ) '
+ [[ aws =~ aws ]]
++ oc config current-context
++ awk -F / '{print $2}'
+ SETTINGS_TMP=/alabama/cspi/output_files/api-ci-op-24wp7hk6-2e88b-cspilp-interop-ccitredhat-com:6443
+ mkdir -p /alabama/cspi/output_files/api-ci-op-24wp7hk6-2e88b-cspilp-interop-ccitredhat-com:6443
++ oc get authentication cluster -o 'jsonpath={.spec.serviceAccountIssuer}'
+ IS_OIDC=
+ '[' '!' -z ']'
+ [[ aws == \a\w\s ]]
+ export PROVIDER=aws
+ PROVIDER=aws
+ export CREDS_SECRET_REF=cloud-credentials
+ CREDS_SECRET_REF=cloud-credentials
++ oc get infrastructures cluster -o 'jsonpath={.status.platformStatus.aws.region}' --allow-missing-template-keys=false
+ export REGION=us-east-1
+ REGION=us-east-1
+ settings_script=aws_settings.sh
+ '[' aws == aws-sts ']'
+ BUCKET=ci-op-24wp7hk6-interopoadp
+ TMP_DIR=/alabama/cspi/output_files/api-ci-op-24wp7hk6-2e88b-cspilp-interop-ccitredhat-com:6443
+ source /alabama/cspi/test_settings/scripts/aws_settings.sh
++ cat
++ [[ aws == \a\w\s ]]
++ cat
++ echo -e '\n }\n}'
+++ cat /alabama/cspi/output_files/api-ci-op-24wp7hk6-2e88b-cspilp-interop-ccitredhat-com:6443/settings.json
++ x='{
  "metadata": {
    "namespace": "openshift-adp"
  },
  "spec": {
    "configuration":{
      "velero":{
        "defaultPlugins": [ "openshift", "aws" ]
      }
    },
    "backupLocations": [
      {
        "velero": {
          "provider": "aws",
          "default": true,
          "config": { "region": "us-east-1" },
          "credential":{ "name": "cloud-credentials", "key": "cloud" },
          "objectStorage":{ "bucket": "ci-op-24wp7hk6-interopoadp" }
        }
      }
    ] ,
    "snapshotLocations": [
      {
        "velero": {
          "provider": "aws",
          "config": { "profile": "default", "region": "us-east-1" }
        }
      }
    ]
  }
}'
++ echo '{ "metadata": { "namespace": "openshift-adp" }, "spec": { "configuration":{ "velero":{ "defaultPlugins": [ "openshift", "aws" ] } }, "backupLocations": [ { "velero": { "provider": "aws", "default": true, "config": { "region": "us-east-1" }, "credential":{ "name": "cloud-credentials", "key": "cloud" }, "objectStorage":{ "bucket": "ci-op-24wp7hk6-interopoadp" } } } ] , "snapshotLocations": [ { "velero": { "provider": "aws", "config": { "profile": "default", "region": "us-east-1" } } } ] } }'
++ grep -o '^[^#]*'
+ FILE_SETTINGS_NAME=settings.json
+ printf '\033[34mGenerated settings file under /alabama/cspi/output_files/api-ci-op-24wp7hk6-2e88b-cspilp-interop-ccitredhat-com:6443/settings.json\e[39m\n'
Generated settings file under /alabama/cspi/output_files/api-ci-op-24wp7hk6-2e88b-cspilp-interop-ccitredhat-com:6443/settings.json
+ cat /alabama/cspi/output_files/api-ci-op-24wp7hk6-2e88b-cspilp-interop-ccitredhat-com:6443/settings.json
++ oc get volumesnapshotclass -o name
+ for i in $(oc get volumesnapshotclass -o name)
+ oc annotate volumesnapshotclass.snapshot.storage.k8s.io/csi-aws-vsc snapshot.storage.kubernetes.io/is-default-class-
volumesnapshotclass.snapshot.storage.k8s.io/csi-aws-vsc annotate
++ ./e2e/must-gather/get-latest-build.sh
+ oc get configmaps -n default must-gather-image
+ UPSTREAM_VERSION=99.0.0
++ oc get OperatorCondition -n openshift-adp -o 'jsonpath={.items[*].metadata.name}'
++ awk -F v '{print $2}'
+ OADP_VERSION=1.3.0
+ '[' -z 1.3.0 ']'
+ '[' 1.3.0 == 99.0.0 ']'
++ oc get sub redhat-oadp-operator -n openshift-adp -o 'jsonpath={.spec.source}'
+ OADP_REPO=redhat-operators
+ '[' -z redhat-operators ']'
+ '[' redhat-operators == redhat-operators ']'
+ REGISTRY_PATH=registry.redhat.io/oadp/oadp-mustgather-rhel9:
+ TAG=1.3.0
+ export MUST_GATHER_BUILD=registry.redhat.io/oadp/oadp-mustgather-rhel9:1.3.0
+ MUST_GATHER_BUILD=registry.redhat.io/oadp/oadp-mustgather-rhel9:1.3.0
+ echo registry.redhat.io/oadp/oadp-mustgather-rhel9:1.3.0
+ export MUST_GATHER_BUILD=registry.redhat.io/oadp/oadp-mustgather-rhel9:1.3.0
+ MUST_GATHER_BUILD=registry.redhat.io/oadp/oadp-mustgather-rhel9:1.3.0
+ '[' -z registry.redhat.io/oadp/oadp-mustgather-rhel9:1.3.0 ']'
+ export NUM_OF_OADP_INSTANCES=1
+ NUM_OF_OADP_INSTANCES=1
+ ginkgo run --flake-attempts=3 -p --nodes=1 -mod=mod
e2e/ --show-node-events -- -credentials_file=/tmp/test-settings/credentials -oadp_namespace=openshift-adp -settings=/alabama/cspi/output_files/api-ci-op-24wp7hk6-2e88b-cspilp-interop-ccitredhat-com:6443/settings.json -must_gather_image=registry.redhat.io/oadp/oadp-mustgather-rhel9:1.3.0 -timeout_multiplier=2 --ginkgo.junit-report=/alabama/cspi/e2e/junit_report.xml '--ginkgo.label-filter=!// || (// && !exclude_aws && (!/target/ || target_aws) ) ' --ginkgo.focus=test-upstream Ginkgo detected a version mismatch between the Ginkgo CLI and the version of Ginkgo imported by your packages: Ginkgo CLI Version: 2.13.2 Mismatched package versions found: 2.11.0 used by e2e Ginkgo will continue to attempt to run but you may see errors (including flag parsing errors) and should either update your go.mod or your version of the Ginkgo CLI to match. To install the matching version of the CLI run go install github.com/onsi/ginkgo/v2/ginkgo from a path that contains a go.mod file. Alternatively you can use go run github.com/onsi/ginkgo/v2/ginkgo from a path that contains a go.mod file to invoke the matching version of the Ginkgo CLI. If you are attempting to test multiple packages that each have a different version of the Ginkgo library with a single Ginkgo CLI that is currently unsupported.  go: downloading github.com/openshift/oadp-operator v1.0.2-0.20231003152846-925e5cc16370 go: downloading github.com/onsi/ginkgo/v2 v2.11.0 go: downloading github.com/onsi/gomega v1.27.8 go: downloading github.com/vmware-tanzu/velero v1.12.0 go: downloading k8s.io/api v0.25.6 go: downloading k8s.io/apimachinery v0.25.6 go: downloading k8s.io/client-go v0.25.6 go: downloading github.com/andygrunwald/go-jira v1.16.0 go: downloading sigs.k8s.io/controller-runtime v0.12.2 go: downloading github.com/operator-framework/api v0.14.1-0.20220413143725-33310d6154f3 go: downloading k8s.io/utils v0.0.0-20230115233650-391b47cb4029 go: downloading github.com/backube/volsync v0.7.0 go: downloading github.com/google/uuid v1.3.0 go: downloading github.com/konveyor/volume-snapshot-mover v0.0.0-20230510202920-14b0e79dbaaa go: downloading github.com/kubernetes-csi/external-snapshotter/client/v4 v4.2.0 go: downloading github.com/openshift/api v0.0.0-20230414143018-3367bc7e6ac7 go: downloading github.com/openshift/client-go v0.0.0-20211209144617-7385dd6338e3 go: downloading k8s.io/kubectl v0.25.6 go: downloading github.com/apenella/go-ansible v1.1.5 go: downloading github.com/fatih/structs v1.1.0 go: downloading github.com/golang-jwt/jwt/v4 v4.5.0 go: downloading github.com/google/go-querystring v1.1.0 go: downloading github.com/pkg/errors v0.9.1 go: downloading github.com/trivago/tgo v1.0.7 go: downloading github.com/google/go-cmp v0.5.9 go: downloading github.com/go-logr/logr v1.2.4 go: downloading github.com/evanphx/json-patch v5.6.0+incompatible go: downloading github.com/sirupsen/logrus v1.9.0 go: downloading github.com/gogo/protobuf v1.3.2 go: downloading github.com/google/gofuzz v1.2.0 go: downloading gopkg.in/inf.v0 v0.9.1 go: downloading k8s.io/klog/v2 v2.90.0 go: downloading sigs.k8s.io/structured-merge-diff/v4 v4.2.3 go: downloading github.com/imdario/mergo v0.3.13 go: downloading github.com/spf13/pflag v1.0.5 go: downloading golang.org/x/term v0.10.0 go: downloading golang.org/x/net v0.12.0 go: downloading gopkg.in/yaml.v3 v3.0.1 go: downloading golang.org/x/sys v0.10.0 go: downloading sigs.k8s.io/json v0.0.0-20221116044647-bc3834ca7abd go: downloading k8s.io/kube-openapi v0.0.0-20230118215034-64b6bb138190 go: downloading 
golang.org/x/time v0.3.0 go: downloading github.com/blang/semver/v4 v4.0.0 go: downloading github.com/golang/groupcache v0.0.0-20210331224755-41bb18bfe9da go: downloading github.com/golang/protobuf v1.5.3 go: downloading github.com/google/gnostic v0.6.9 go: downloading gopkg.in/yaml.v2 v2.4.0 go: downloading github.com/json-iterator/go v1.1.12 go: downloading github.com/davecgh/go-spew v1.1.1 go: downloading golang.org/x/oauth2 v0.7.0 go: downloading github.com/apenella/go-common-utils/data v0.0.0-20210528133155-34ba915e28c8 go: downloading github.com/apenella/go-common-utils/error v0.0.0-20210528133155-34ba915e28c8 go: downloading sigs.k8s.io/yaml v1.3.0 go: downloading golang.org/x/text v0.11.0 go: downloading google.golang.org/protobuf v1.30.0 go: downloading github.com/moby/spdystream v0.2.0 go: downloading github.com/modern-go/concurrent v0.0.0-20180306012644-bacd9c7ef1dd go: downloading github.com/modern-go/reflect2 v1.0.2 go: downloading github.com/spf13/cobra v1.6.1 go: downloading k8s.io/cli-runtime v0.25.6 go: downloading k8s.io/component-base v0.26.1 go: downloading github.com/munnerz/goautoneg v0.0.0-20191010083416-a7dc8b61c822 go: downloading github.com/jonboulle/clockwork v0.2.2 go: downloading k8s.io/component-helpers v0.26.1 go: downloading github.com/daviddengcn/go-colortext v1.0.0 go: downloading github.com/liggitt/tabwriter v0.0.0-20181228230101-89fcab3d43de go: downloading github.com/docker/distribution v2.8.1+incompatible go: downloading github.com/moby/term v0.0.0-20220808134915-39b0c02b01ae go: downloading github.com/fvbommel/sortorder v1.0.1 go: downloading sigs.k8s.io/kustomize/kustomize/v4 v4.5.7 go: downloading sigs.k8s.io/kustomize/kyaml v0.13.9 go: downloading github.com/lithammer/dedent v1.1.0 go: downloading k8s.io/metrics v0.26.1 go: downloading github.com/chai2010/gettext-go v1.0.2 go: downloading github.com/MakeNowJust/heredoc v1.0.0 go: downloading github.com/mitchellh/go-wordwrap v1.0.0 go: downloading github.com/russross/blackfriday v1.5.2 go: downloading github.com/emicklei/go-restful/v3 v3.10.1 go: downloading github.com/go-openapi/swag v0.22.3 go: downloading github.com/go-openapi/jsonreference v0.20.2 go: downloading github.com/exponent-io/jsonpath v0.0.0-20151013193312-d6023ce2651d go: downloading github.com/fatih/camelcase v1.0.0 go: downloading sigs.k8s.io/kustomize/api v0.12.1 go: downloading k8s.io/apiextensions-apiserver v0.26.1 go: downloading github.com/bombsimon/logrusr/v3 v3.0.0 go: downloading github.com/prometheus/client_golang v1.15.0 go: downloading k8s.io/kube-aggregator v0.19.12 go: downloading github.com/mxk/go-flowrate v0.0.0-20140419014527-cca7078d478f go: downloading github.com/opencontainers/go-digest v1.0.0 go: downloading github.com/go-openapi/jsonpointer v0.19.6 go: downloading github.com/mailru/easyjson v0.7.7 go: downloading github.com/hashicorp/go-plugin v1.4.3 go: downloading google.golang.org/grpc v1.54.0 go: downloading github.com/robfig/cron v1.1.0 go: downloading github.com/gobwas/glob v0.2.3 go: downloading github.com/hashicorp/go-hclog v1.2.0 go: downloading github.com/gregjones/httpcache v0.0.0-20180305231024-9cad4c3443a7 go: downloading github.com/peterbourgon/diskv v2.0.1+incompatible go: downloading github.com/josharian/intern v1.0.0 go: downloading github.com/mitchellh/go-testing-interface v1.0.0 go: downloading github.com/hashicorp/yamux v0.0.0-20180604194846-3520598351bb go: downloading github.com/oklog/run v1.0.0 go: downloading github.com/prometheus/client_model v0.3.0 go: downloading 
github.com/prometheus/common v0.42.0 go: downloading github.com/Azure/azure-sdk-for-go v67.2.0+incompatible go: downloading github.com/Azure/go-autorest/autorest v0.11.27 go: downloading github.com/Azure/go-autorest/autorest/azure/auth v0.5.8 go: downloading github.com/aws/aws-sdk-go v1.44.253 go: downloading github.com/Azure/go-autorest v14.2.0+incompatible go: downloading github.com/joho/godotenv v1.3.0 go: downloading github.com/beorn7/perks v1.0.1 go: downloading github.com/cespare/xxhash/v2 v2.2.0 go: downloading github.com/prometheus/procfs v0.9.0 go: downloading gomodules.xyz/jsonpatch/v2 v2.2.0 go: downloading github.com/fatih/color v1.15.0 go: downloading github.com/mattn/go-isatty v0.0.17 go: downloading github.com/kopia/kopia v0.13.0 go: downloading github.com/google/btree v1.0.1 go: downloading github.com/Azure/go-autorest/logger v0.2.1 go: downloading github.com/Azure/go-autorest/tracing v0.6.0 go: downloading github.com/Azure/go-autorest/autorest/adal v0.9.20 go: downloading github.com/Azure/go-autorest/autorest/azure/cli v0.4.2 go: downloading github.com/dimchansky/utfbom v1.1.1 go: downloading github.com/matttproud/golang_protobuf_extensions v1.0.4 go: downloading github.com/go-errors/errors v1.0.1 go: downloading github.com/fsnotify/fsnotify v1.6.0 go: downloading github.com/mattn/go-colorable v0.1.13 go: downloading github.com/monochromegane/go-gitignore v0.0.0-20200626010858-205db1a8cc00 go: downloading github.com/xlab/treeprint v1.1.0 go: downloading github.com/Azure/go-autorest/autorest/date v0.3.0 go: downloading github.com/mitchellh/go-homedir v1.1.0 go: downloading golang.org/x/crypto v0.11.0 go: downloading github.com/google/shlex v0.0.0-20191202100458-e7afc7fbc510 go: downloading google.golang.org/genproto v0.0.0-20230410155749-daa745c078e1 go: downloading go.uber.org/zap v1.24.0 go: downloading github.com/klauspost/compress v1.16.5 go: downloading github.com/klauspost/pgzip v1.2.5 go: downloading github.com/pierrec/lz4 v2.6.1+incompatible go: downloading go.uber.org/multierr v1.11.0 go: downloading github.com/gofrs/flock v0.8.1 go: downloading golang.org/x/sync v0.3.0 go: downloading go.opentelemetry.io/otel v1.14.0 go: downloading go.opentelemetry.io/otel/trace v1.14.0 go: downloading github.com/edsrzf/mmap-go v1.1.0 go: downloading github.com/zeebo/blake3 v0.2.3 go: downloading golang.org/x/exp v0.0.0-20230213192124-5e25df0256eb go: downloading github.com/Azure/azure-sdk-for-go/sdk/storage/azblob v0.3.0 go: downloading cloud.google.com/go/storage v1.30.1 go: downloading google.golang.org/api v0.120.0 go: downloading github.com/minio/minio-go/v7 v7.0.52 go: downloading cloud.google.com/go v0.110.0 go: downloading github.com/chmduquesne/rollinghash v4.0.0+incompatible go: downloading go.starlark.net v0.0.0-20201006213952-227f4aabceb5 go: downloading github.com/natefinch/atomic v1.0.1 go: downloading github.com/klauspost/reedsolomon v1.11.7 go: downloading go.uber.org/atomic v1.10.0 go: downloading cloud.google.com/go/compute/metadata v0.2.3 go: downloading cloud.google.com/go/compute v1.19.0 go: downloading cloud.google.com/go/iam v0.13.0 go: downloading github.com/googleapis/gax-go/v2 v2.8.0 go: downloading gopkg.in/ini.v1 v1.67.0 go: downloading github.com/minio/md5-simd v1.1.2 go: downloading github.com/minio/sha256-simd v1.0.0 go: downloading github.com/klauspost/cpuid/v2 v2.2.4 go: downloading github.com/go-logr/stdr v1.2.2 go: downloading go.opencensus.io v0.24.0 go: downloading golang.org/x/xerrors v0.0.0-20220907171357-04be3eba64a2 go: downloading 
github.com/rs/xid v1.4.0 go: downloading github.com/jmespath/go-jmespath v0.4.0 go: downloading github.com/googleapis/enterprise-certificate-proxy v0.2.3 go: downloading github.com/google/s2a-go v0.1.2 go: downloading google.golang.org/appengine v1.6.7 go: downloading github.com/Azure/go-autorest/autorest/validation v0.2.0 go: downloading github.com/Azure/go-autorest/autorest/to v0.3.0 go: downloading github.com/Azure/azure-sdk-for-go/sdk/azcore v0.21.1 go: downloading github.com/Azure/azure-sdk-for-go/sdk/internal v0.8.3 2023/12/04 17:16:12 Setting up clients I1204 17:16:13.805776 22324 request.go:690] Waited for 1.032401041s due to client-side throttling, not priority and fairness, request: GET:https://api.ci-op-24wp7hk6-2e88b.cspilp.interop.ccitredhat.com:6443/apis/policy/v1?timeout=32s 2023/12/04 17:16:15 Getting default StorageClass... Run the command: oc get sc 2023/12/04 17:16:15 Got default StorageClass gp3-csi 2023/12/04 17:16:15 oc get sc NAME PROVISIONER RECLAIMPOLICY VOLUMEBINDINGMODE ALLOWVOLUMEEXPANSION AGE gp2-csi ebs.csi.aws.com Delete WaitForFirstConsumer true 29m gp3-csi (default) ebs.csi.aws.com Delete WaitForFirstConsumer true 29m 2023/12/04 17:16:15 Using velero prefix: velero-e2e-d4b1a7d5-92c8-11ee-b39d-0a580a838148 Running Suite: OADP E2E Suite - /alabama/cspi/e2e ================================================= Random Seed: 1701710123 Will run 7 of 124 specs ------------------------------ [SynchronizedBeforeSuite]  /alabama/cspi/e2e/e2e_suite_test.go:72 2023/12/04 17:16:15 Update resource allocation for the manager pod in the OADP namespace: openshift-adp 2023/12/04 17:16:15 Get the OADP ClusterServiceVersion in the namespace: openshift-adp 2023/12/04 17:16:18 Successfully got the OADP ClusterServiceVersion name: oadp-operator.v1.3.0 2023/12/04 17:16:20 Successfully updated the OADP ClusterServiceVersion name: oadp-operator.v1.3.0 [SynchronizedBeforeSuite] PASSED [5.277 seconds] ------------------------------ SSSSS ------------------------------ Backup hooks tests Pre exec hook [tc-id:OADP-92][test-upstream] Cassandra app with Restic /alabama/cspi/e2e/hooks/backup_hooks.go:110 2023/12/04 17:16:20 Delete all downloadrequest No download requests are found STEP: Create DPA CR @ 12/04/23 17:16:20.753 Updating resource allocations for NodeAgent because running tests in parallel Updating resource allocations for Velero because running tests in parallel 2023/12/04 17:16:20 { "metadata": { "name": "ts-dpa", "namespace": "openshift-adp", "uid": "29a25562-3ab6-4844-a714-f5882c3ce1dd", "resourceVersion": "38379", "generation": 1, "creationTimestamp": "2023-12-04T17:16:20Z", "managedFields": [ { "manager": "e2e.test", "operation": "Update", "apiVersion": "oadp.openshift.io/v1alpha1", "time": "2023-12-04T17:16:20Z", "fieldsType": "FieldsV1", "fieldsV1": { "f:spec": { ".": {}, "f:backupLocations": {}, "f:configuration": { ".": {}, "f:nodeAgent": { ".": {}, "f:enable": {}, "f:podConfig": { ".": {}, "f:resourceAllocations": { ".": {}, "f:requests": { ".": {}, "f:cpu": {}, "f:memory": {} } } }, "f:uploaderType": {} }, "f:velero": { ".": {}, "f:defaultPlugins": {}, "f:podConfig": { ".": {}, "f:resourceAllocations": { ".": {}, "f:requests": { ".": {}, "f:cpu": {}, "f:memory": {} } } } } }, "f:podDnsConfig": {}, "f:snapshotLocations": {} } } } ] }, "spec": { "backupLocations": [ { "velero": { "provider": "aws", "config": { "region": "us-east-1" }, "credential": { "name": "cloud-credentials", "key": "cloud" }, "objectStorage": { "bucket": "ci-op-24wp7hk6-interopoadp", "prefix": 
"velero-e2e-d7d4c03a-92c8-11ee-b39d-0a580a838148" }, "default": true } } ], "snapshotLocations": [], "podDnsConfig": {}, "configuration": { "velero": { "defaultPlugins": [ "openshift", "aws", "kubevirt" ], "podConfig": { "resourceAllocations": { "requests": { "cpu": "100m", "memory": "64Mi" } } } }, "nodeAgent": { "enable": true, "podConfig": { "resourceAllocations": { "requests": { "cpu": "100m", "memory": "64Mi" } } }, "uploaderType": "restic" } }, "features": null }, "status": {} } Delete all the backups that remained in the phase InProgress Deleting backup CRs in progress Deletion of backup CRs in progress completed Delete all the restores that remained in the phase InProgress Deleting restore CRs in progress Deletion of restore CRs in progress completed STEP: Verify DPA CR setup @ 12/04/23 17:16:20.788 2023/12/04 17:16:20 Waiting for velero pod to be running 2023/12/04 17:16:25 pod: velero-5cf59669b7-g2hk8 is not yet running with status: {Pending [{Initialized False 0001-01-01 00:00:00 +0000 UTC 2023-12-04 17:16:20 +0000 UTC ContainersNotInitialized containers with incomplete status: [openshift-velero-plugin velero-plugin-for-aws kubevirt-velero-plugin]} {Ready False 0001-01-01 00:00:00 +0000 UTC 2023-12-04 17:16:20 +0000 UTC ContainersNotReady containers with unready status: [velero]} {ContainersReady False 0001-01-01 00:00:00 +0000 UTC 2023-12-04 17:16:20 +0000 UTC ContainersNotReady containers with unready status: [velero]} {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2023-12-04 17:16:20 +0000 UTC }] 10.0.95.106 [] 2023-12-04 17:16:20 +0000 UTC [{openshift-velero-plugin {&ContainerStateWaiting{Reason:PodInitializing,Message:,} nil nil} {nil nil nil} false 0 registry.redhat.io/oadp/oadp-velero-plugin-rhel9@sha256:98264ebcc6950f6f240a547740260e9755ef757cec336ab6b5e8bba4d75e9502 0xc0012b27fb} {velero-plugin-for-aws {&ContainerStateWaiting{Reason:PodInitializing,Message:,} nil nil} {nil nil nil} false 0 registry.redhat.io/oadp/oadp-velero-plugin-for-aws-rhel9@sha256:32309eaa2e565b349f2806c6bc6f834876a64cd106e48044e982ed6925d5c6bf 0xc0012b27fc} {kubevirt-velero-plugin {&ContainerStateWaiting{Reason:PodInitializing,Message:,} nil nil} {nil nil nil} false 0 registry.redhat.io/oadp/oadp-kubevirt-velero-plugin-rhel9@sha256:b49ab89e7bc68b9e4e83fbbf33e215a339c28cf48df8a2ed3e8440f35f22b6a6 0xc0012b27fd}] [{velero {&ContainerStateWaiting{Reason:PodInitializing,Message:,} nil nil} {nil nil nil} false 0 registry.redhat.io/oadp/oadp-velero-rhel9@sha256:06482afcea65eff184901c63cbe9fc6e5b3172d23857301ad4e7b4daf362e79c 0xc0012b2836}] Burstable []} 2023/12/04 17:16:30 pod: velero-5cf59669b7-g2hk8 is not yet running with status: {Pending [{Initialized False 0001-01-01 00:00:00 +0000 UTC 2023-12-04 17:16:20 +0000 UTC ContainersNotInitialized containers with incomplete status: [velero-plugin-for-aws kubevirt-velero-plugin]} {Ready False 0001-01-01 00:00:00 +0000 UTC 2023-12-04 17:16:20 +0000 UTC ContainersNotReady containers with unready status: [velero]} {ContainersReady False 0001-01-01 00:00:00 +0000 UTC 2023-12-04 17:16:20 +0000 UTC ContainersNotReady containers with unready status: [velero]} {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2023-12-04 17:16:20 +0000 UTC }] 10.0.95.106 10.131.0.22 [{10.131.0.22}] 2023-12-04 17:16:20 +0000 UTC [{openshift-velero-plugin {nil nil &ContainerStateTerminated{ExitCode:0,Signal:0,Reason:Completed,Message:,StartedAt:2023-12-04 17:16:28 +0000 UTC,FinishedAt:2023-12-04 17:16:28 +0000 
UTC,ContainerID:cri-o://88b753eff2938bc21c81a0626e24787bc5de0f02c86903229448817da66d2d7e,}} {nil nil nil} true 0 registry.redhat.io/oadp/oadp-velero-plugin-rhel9@sha256:98264ebcc6950f6f240a547740260e9755ef757cec336ab6b5e8bba4d75e9502 a27d719be33680a0083b7bef21a83ebb0f94eb61a9562bc9950fb479ae35057a cri-o://88b753eff2938bc21c81a0626e24787bc5de0f02c86903229448817da66d2d7e 0xc000f59619} {velero-plugin-for-aws {&ContainerStateWaiting{Reason:PodInitializing,Message:,} nil nil} {nil nil nil} false 0 registry.redhat.io/oadp/oadp-velero-plugin-for-aws-rhel9@sha256:32309eaa2e565b349f2806c6bc6f834876a64cd106e48044e982ed6925d5c6bf 0xc000f5961a} {kubevirt-velero-plugin {&ContainerStateWaiting{Reason:PodInitializing,Message:,} nil nil} {nil nil nil} false 0 registry.redhat.io/oadp/oadp-kubevirt-velero-plugin-rhel9@sha256:b49ab89e7bc68b9e4e83fbbf33e215a339c28cf48df8a2ed3e8440f35f22b6a6 0xc000f5961b}] [{velero {&ContainerStateWaiting{Reason:PodInitializing,Message:,} nil nil} {nil nil nil} false 0 registry.redhat.io/oadp/oadp-velero-rhel9@sha256:06482afcea65eff184901c63cbe9fc6e5b3172d23857301ad4e7b4daf362e79c 0xc000f59646}] Burstable []} 2023/12/04 17:16:35 pod: velero-5cf59669b7-g2hk8 is not yet running with status: {Pending [{Initialized False 0001-01-01 00:00:00 +0000 UTC 2023-12-04 17:16:20 +0000 UTC ContainersNotInitialized containers with incomplete status: [kubevirt-velero-plugin]} {Ready False 0001-01-01 00:00:00 +0000 UTC 2023-12-04 17:16:20 +0000 UTC ContainersNotReady containers with unready status: [velero]} {ContainersReady False 0001-01-01 00:00:00 +0000 UTC 2023-12-04 17:16:20 +0000 UTC ContainersNotReady containers with unready status: [velero]} {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2023-12-04 17:16:20 +0000 UTC }] 10.0.95.106 10.131.0.22 [{10.131.0.22}] 2023-12-04 17:16:20 +0000 UTC [{openshift-velero-plugin {nil nil &ContainerStateTerminated{ExitCode:0,Signal:0,Reason:Completed,Message:,StartedAt:2023-12-04 17:16:28 +0000 UTC,FinishedAt:2023-12-04 17:16:28 +0000 UTC,ContainerID:cri-o://88b753eff2938bc21c81a0626e24787bc5de0f02c86903229448817da66d2d7e,}} {nil nil nil} true 0 registry.redhat.io/oadp/oadp-velero-plugin-rhel9@sha256:98264ebcc6950f6f240a547740260e9755ef757cec336ab6b5e8bba4d75e9502 a27d719be33680a0083b7bef21a83ebb0f94eb61a9562bc9950fb479ae35057a cri-o://88b753eff2938bc21c81a0626e24787bc5de0f02c86903229448817da66d2d7e 0xc000f59d09} {velero-plugin-for-aws {nil nil &ContainerStateTerminated{ExitCode:0,Signal:0,Reason:Completed,Message:,StartedAt:2023-12-04 17:16:32 +0000 UTC,FinishedAt:2023-12-04 17:16:32 +0000 UTC,ContainerID:cri-o://5ba474e49f611809232b7ab692ab1fe0a9f194a3a75ce72b25c8551f203e9461,}} {nil nil nil} true 0 registry.redhat.io/oadp/oadp-velero-plugin-for-aws-rhel9@sha256:32309eaa2e565b349f2806c6bc6f834876a64cd106e48044e982ed6925d5c6bf 9c953b830e58ee14db469d2b88e4fda9406f0e5f4b0f88c7913cd9a4bc7b0641 cri-o://5ba474e49f611809232b7ab692ab1fe0a9f194a3a75ce72b25c8551f203e9461 0xc000f59d19} {kubevirt-velero-plugin {&ContainerStateWaiting{Reason:PodInitializing,Message:,} nil nil} {nil nil nil} false 0 registry.redhat.io/oadp/oadp-kubevirt-velero-plugin-rhel9@sha256:b49ab89e7bc68b9e4e83fbbf33e215a339c28cf48df8a2ed3e8440f35f22b6a6 0xc000f59d1a}] [{velero {&ContainerStateWaiting{Reason:PodInitializing,Message:,} nil nil} {nil nil nil} false 0 registry.redhat.io/oadp/oadp-velero-rhel9@sha256:06482afcea65eff184901c63cbe9fc6e5b3172d23857301ad4e7b4daf362e79c 0xc000f59d36}] Burstable []} 2023/12/04 17:16:40 Wait for DPA status.condition.reason to be 'Completed' and and 
message to be 'Reconcile complete' STEP: Prepare backup resources, depending on the volumes backup type @ 12/04/23 17:16:40.844 2023/12/04 17:16:40 Checking for correct number of running NodeAgent pods... STEP: Installing application for case cassandra-hooks-e2e @ 12/04/23 17:16:40.857 [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] /usr/local/lib/python3.10/site-packages/urllib3/connectionpool.py:1013: InsecureRequestWarning: Unverified HTTPS request is being made to host 'api.ci-op-24wp7hk6-2e88b.cspilp.interop.ccitredhat.com'. Adding certificate verification is strongly advised. See: https://urllib3.readthedocs.io/en/1.26.x/advanced-usage.html#ssl-warnings warnings.warn( TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-cassandra : Check namespace] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-cassandra : Create namespace] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-cassandra : Add scc privileged to service account] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-cassandra : Create a service object required to provide network identity] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-cassandra : Create a statefulset with the existing yaml] *** changed: [localhost] FAILED - RETRYING: [localhost]: Check pods status (30 retries left). FAILED - RETRYING: [localhost]: Check pods status (29 retries left). FAILED - RETRYING: [localhost]: Check pods status (28 retries left). FAILED - RETRYING: [localhost]: Check pods status (27 retries left). TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-cassandra : Check pods status] *** ok: [localhost] FAILED - RETRYING: [localhost]: Wait until all cassandra node are ready (Status=Up and State=Normal) (30 retries left). FAILED - RETRYING: [localhost]: Wait until all cassandra node are ready (Status=Up and State=Normal) (29 retries left). FAILED - RETRYING: [localhost]: Wait until all cassandra node are ready (Status=Up and State=Normal) (28 retries left). 
FAILED - RETRYING: [localhost]: Wait until all cassandra node are ready (Status=Up and State=Normal) (27 retries left). FAILED - RETRYING: [localhost]: Wait until all cassandra node are ready (Status=Up and State=Normal) (26 retries left). FAILED - RETRYING: [localhost]: Wait until all cassandra node are ready (Status=Up and State=Normal) (25 retries left). FAILED - RETRYING: [localhost]: Wait until all cassandra node are ready (Status=Up and State=Normal) (24 retries left). FAILED - RETRYING: [localhost]: Wait until all cassandra node are ready (Status=Up and State=Normal) (23 retries left). FAILED - RETRYING: [localhost]: Wait until all cassandra node are ready (Status=Up and State=Normal) (22 retries left). FAILED - RETRYING: [localhost]: Wait until all cassandra node are ready (Status=Up and State=Normal) (21 retries left). FAILED - RETRYING: [localhost]: Wait until all cassandra node are ready (Status=Up and State=Normal) (20 retries left). FAILED - RETRYING: [localhost]: Wait until all cassandra node are ready (Status=Up and State=Normal) (19 retries left). FAILED - RETRYING: [localhost]: Wait until all cassandra node are ready (Status=Up and State=Normal) (18 retries left). FAILED - RETRYING: [localhost]: Wait until all cassandra node are ready (Status=Up and State=Normal) (17 retries left). FAILED - RETRYING: [localhost]: Wait until all cassandra node are ready (Status=Up and State=Normal) (16 retries left). TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-cassandra : Wait until all cassandra node are ready (Status=Up and State=Normal)] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-cassandra : Populate the DB with some sample data] *** changed: [localhost] PLAY RECAP ********************************************************************* localhost : ok=18  changed=9  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2023/12/04 17:19:38 2023-12-04 17:16:42,278 p=22364 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:16:42,278 p=22364 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:16:42,489 p=22364 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:16:42,489 p=22364 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:16:42,705 p=22364 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:16:42,706 p=22364 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:16:42,720 p=22364 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:16:42,720 p=22364 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:16:43,021 p=22364 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:16:43,021 p=22364 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:16:43,045 p=22364 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:16:43,046 p=22364 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:16:43,067 p=22364 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:16:43,067 p=22364 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:16:43,077 p=22364 u=1008320000 n=ansible | PLAY [Execute Task] 
************************************************************ 2023-12-04 17:16:43,601 p=22364 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:16:43,601 p=22364 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:16:44,304 p=22364 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-cassandra : Check namespace] *** 2023-12-04 17:16:44,304 p=22364 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:16:44,720 p=22364 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-cassandra : Create namespace] *** 2023-12-04 17:16:44,721 p=22364 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:16:44,972 p=22364 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-cassandra : Add scc privileged to service account] *** 2023-12-04 17:16:44,972 p=22364 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:16:45,693 p=22364 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-cassandra : Create a service object required to provide network identity] *** 2023-12-04 17:16:45,693 p=22364 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:16:46,278 p=22364 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-cassandra : Create a statefulset with the existing yaml] *** 2023-12-04 17:16:46,278 p=22364 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:17:09,395 p=22364 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-cassandra : Check pods status] *** 2023-12-04 17:17:09,396 p=22364 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:19:35,401 p=22364 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-cassandra : Wait until all cassandra node are ready (Status=Up and State=Normal)] *** 2023-12-04 17:19:35,401 p=22364 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:19:38,696 p=22364 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-cassandra : Populate the DB with some sample data] *** 2023-12-04 17:19:38,696 p=22364 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:19:38,807 p=22364 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:19:38,807 p=22364 u=1008320000 n=ansible | localhost : ok=18 changed=9 unreachable=0 failed=0 skipped=6 rescued=0 ignored=0 STEP: Verify Application deployment @ 12/04/23 17:19:38.849 [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. 
Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-cassandra : Check pods status] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-cassandra : Wait until all cassandra node are ready (Status=Up and State=Normal)] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-cassandra : Verify Data was populated] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-cassandra : copy] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-cassandra : Verify Data Integrity] *** changed: [localhost] PLAY RECAP ********************************************************************* localhost : ok=15  changed=7  unreachable=0 failed=0 skipped=9  rescued=0 ignored=0 2023/12/04 17:19:48 2023-12-04 17:19:40,518 p=23261 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:19:40,518 p=23261 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:19:40,723 p=23261 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:19:40,723 p=23261 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:19:40,939 p=23261 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:19:40,939 p=23261 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:19:40,952 p=23261 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:19:40,952 p=23261 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:19:41,249 p=23261 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:19:41,249 p=23261 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:19:41,272 p=23261 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:19:41,273 p=23261 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:19:41,294 p=23261 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 
17:19:41,294 p=23261 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:19:41,308 p=23261 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:19:41,815 p=23261 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:19:41,815 p=23261 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:19:42,794 p=23261 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-cassandra : Check pods status] *** 2023-12-04 17:19:42,794 p=23261 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:19:47,100 p=23261 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-cassandra : Wait until all cassandra node are ready (Status=Up and State=Normal)] *** 2023-12-04 17:19:47,101 p=23261 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:19:48,279 p=23261 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-cassandra : Verify Data was populated] *** 2023-12-04 17:19:48,279 p=23261 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:19:48,728 p=23261 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-cassandra : copy] *** 2023-12-04 17:19:48,728 p=23261 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:19:48,896 p=23261 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-cassandra : Verify Data Integrity] *** 2023-12-04 17:19:48,896 p=23261 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:19:48,907 p=23261 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:19:48,907 p=23261 u=1008320000 n=ansible | localhost : ok=15 changed=7 unreachable=0 failed=0 skipped=9 rescued=0 ignored=0 STEP: Creating backup cassandra-hooks-e2e @ 12/04/23 17:19:48.953 2023/12/04 17:19:48 Wait until backup cassandra-hooks-e2e is completed backup phase: InProgress backup phase: InProgress backup phase: Completed STEP: Verify backup cassandra-hooks-e2e has completed successfully @ 12/04/23 17:20:49.007 2023/12/04 17:20:49 Backup for case cassandra-hooks-e2e succeeded STEP: Verify pre-backup hooks were executed; verify that the cassandra app is quiesced and it's not possible to connect to the DB server @ 12/04/23 17:20:49.066 2023/12/04 17:20:49 Using Must-gather image: registry.redhat.io/oadp/oadp-mustgather-rhel9:1.3.0 2023/12/04 17:20:49 Cleaning app [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. 
Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-cassandra : Remove namespace test-oadp-92] *** changed: [localhost] PLAY RECAP ********************************************************************* localhost : ok=11  changed=4  unreachable=0 failed=0 skipped=13  rescued=0 ignored=0 2023/12/04 17:21:12 2023-12-04 17:20:50,617 p=23566 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:20:50,618 p=23566 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:20:50,818 p=23566 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:20:50,819 p=23566 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:20:51,020 p=23566 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:20:51,020 p=23566 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:20:51,032 p=23566 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:20:51,032 p=23566 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:20:51,267 p=23566 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:20:51,267 p=23566 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:20:51,289 p=23566 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:20:51,289 p=23566 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:20:51,308 p=23566 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:20:51,308 p=23566 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:20:51,317 p=23566 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:20:51,807 p=23566 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:20:51,807 p=23566 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:21:12,545 p=23566 u=1008320000 n=ansible | TASK 
[/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-cassandra : Remove namespace test-oadp-92] *** 2023-12-04 17:21:12,545 p=23566 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:21:12,786 p=23566 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:21:12,787 p=23566 u=1008320000 n=ansible | localhost : ok=11 changed=4 unreachable=0 failed=0 skipped=13 rescued=0 ignored=0 2023/12/04 17:21:12 Cleaning setup resources for the backup • [292.095 seconds] ------------------------------ S ------------------------------ Backup restore tests Application backup [tc-id:OADP-198][test-upstream][smoke] Different labels selector: Backup and Restore with multiple matched labels [orLabelSelectors] [labels] /alabama/cspi/e2e/app_backup/backup_restore_labels.go:46 2023/12/04 17:21:12 Delete all downloadrequest No download requests are found STEP: Create DPA CR @ 12/04/23 17:21:12.853 Updating resource allocations for Velero because running tests in parallel 2023/12/04 17:21:12 { "metadata": { "name": "ts-dpa", "namespace": "openshift-adp", "uid": "658c0c5e-7561-412f-898a-7fe32675b74e", "resourceVersion": "40722", "generation": 1, "creationTimestamp": "2023-12-04T17:21:12Z", "managedFields": [ { "manager": "e2e.test", "operation": "Update", "apiVersion": "oadp.openshift.io/v1alpha1", "time": "2023-12-04T17:21:12Z", "fieldsType": "FieldsV1", "fieldsV1": { "f:spec": { ".": {}, "f:backupLocations": {}, "f:configuration": { ".": {}, "f:velero": { ".": {}, "f:defaultPlugins": {}, "f:podConfig": { ".": {}, "f:resourceAllocations": { ".": {}, "f:requests": { ".": {}, "f:cpu": {}, "f:memory": {} } } } } }, "f:podDnsConfig": {}, "f:snapshotLocations": {} } } } ] }, "spec": { "backupLocations": [ { "velero": { "provider": "aws", "config": { "region": "us-east-1" }, "credential": { "name": "cloud-credentials", "key": "cloud" }, "objectStorage": { "bucket": "ci-op-24wp7hk6-interopoadp", "prefix": "velero-e2e-d7d4c03a-92c8-11ee-b39d-0a580a838148" }, "default": true } } ], "snapshotLocations": [], "podDnsConfig": {}, "configuration": { "velero": { "defaultPlugins": [ "openshift", "aws", "kubevirt", "csi" ], "podConfig": { "resourceAllocations": { "requests": { "cpu": "100m", "memory": "64Mi" } } } } }, "features": null }, "status": {} } Delete all the backups that remained in the phase InProgress Deleting backup CRs in progress Deletion of backup CRs in progress completed Delete all the restores that remained in the phase InProgress Deleting restore CRs in progress Deletion of restore CRs in progress completed STEP: Verify DPA CR setup @ 12/04/23 17:21:12.914 2023/12/04 17:21:12 Waiting for velero pod to be running 2023/12/04 17:21:12 Wait for DPA status.condition.reason to be 'Completed' and and message to be 'Reconcile complete' 2023/12/04 17:21:12 { "metadata": { "name": "ts-dpa", "namespace": "openshift-adp", "uid": "658c0c5e-7561-412f-898a-7fe32675b74e", "resourceVersion": "40722", "generation": 1, "creationTimestamp": "2023-12-04T17:21:12Z", "managedFields": [ { "manager": "e2e.test", "operation": "Update", "apiVersion": "oadp.openshift.io/v1alpha1", "time": "2023-12-04T17:21:12Z", "fieldsType": "FieldsV1", "fieldsV1": { "f:spec": { ".": {}, "f:backupLocations": {}, "f:configuration": { ".": {}, "f:velero": { ".": {}, "f:defaultPlugins": {}, "f:podConfig": { ".": {}, "f:resourceAllocations": { ".": {}, "f:requests": { ".": {}, "f:cpu": {}, "f:memory": {} } } } } }, "f:podDnsConfig": {}, "f:snapshotLocations": {} } } } ] }, 
"spec": { "backupLocations": [ { "velero": { "provider": "aws", "config": { "region": "us-east-1" }, "credential": { "name": "cloud-credentials", "key": "cloud" }, "objectStorage": { "bucket": "ci-op-24wp7hk6-interopoadp", "prefix": "velero-e2e-d7d4c03a-92c8-11ee-b39d-0a580a838148" }, "default": true } } ], "snapshotLocations": [], "podDnsConfig": {}, "configuration": { "velero": { "defaultPlugins": [ "openshift", "aws", "kubevirt", "csi" ], "podConfig": { "resourceAllocations": { "requests": { "cpu": "100m", "memory": "64Mi" } } } } }, "features": null }, "status": {} } STEP: Prepare backup resources, depending on the volumes backup type @ 12/04/23 17:21:17.937 Run the command: oc get ns openshift-storage &> /dev/null && echo true || echo false 2023/12/04 17:21:18 The 'openshift-storage' namespace does not exist 2023/12/04 17:21:18 Using default CSI driver based on infrastructure: ebs.csi.aws.com 2023/12/04 17:21:18 Snapclass 'example-snapclass' doesn't exist, creating 2023/12/04 17:21:18 Setting new default StorageClass 'gp2-csi' Run the command: oc get sc 2023/12/04 17:21:18 oc get sc NAME PROVISIONER RECLAIMPOLICY VOLUMEBINDINGMODE ALLOWVOLUMEEXPANSION AGE gp2-csi (default) ebs.csi.aws.com Delete WaitForFirstConsumer true 35m gp3-csi ebs.csi.aws.com Delete WaitForFirstConsumer true 35m STEP: Installing application for case mysql198 @ 12/04/23 17:21:18.132 [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check namespace test-oadp-198] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Create namespace] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Deploy a mysql pod] *** changed: [localhost] FAILED - RETRYING: [localhost]: Check pod status (30 retries left). FAILED - RETRYING: [localhost]: Check pod status (29 retries left). FAILED - RETRYING: [localhost]: Check pod status (28 retries left). FAILED - RETRYING: [localhost]: Check pod status (27 retries left). 
TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check pod status] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Copy mysql provision script to pod] *** changed: [localhost] FAILED - RETRYING: [localhost]: Wait until service ready for connections (30 retries left). TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Wait until service ready for connections] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Provision the mysql database] *** changed: [localhost] PLAY RECAP ********************************************************************* localhost : ok=17  changed=8  unreachable=0 failed=0 skipped=4  rescued=0 ignored=0 2023/12/04 17:21:51 2023-12-04 17:21:19,427 p=23814 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:21:19,428 p=23814 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:21:19,634 p=23814 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:21:19,634 p=23814 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:21:19,828 p=23814 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:21:19,828 p=23814 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:21:19,841 p=23814 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:21:19,841 p=23814 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:21:20,088 p=23814 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:21:20,088 p=23814 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:21:20,114 p=23814 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:21:20,114 p=23814 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:21:20,136 p=23814 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:21:20,136 p=23814 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:21:20,147 p=23814 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:21:20,637 p=23814 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:21:20,637 p=23814 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:21:21,356 p=23814 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check namespace test-oadp-198] *** 2023-12-04 17:21:21,356 p=23814 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:21:21,672 p=23814 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Create namespace] *** 2023-12-04 17:21:21,672 p=23814 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:21:22,461 p=23814 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Deploy a mysql pod] *** 2023-12-04 17:21:22,461 p=23814 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:21:45,019 p=23814 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check pod status] *** 2023-12-04 17:21:45,019 p=23814 
u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:21:45,427 p=23814 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Copy mysql provision script to pod] *** 2023-12-04 17:21:45,427 p=23814 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:21:50,913 p=23814 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Wait until service ready for connections] *** 2023-12-04 17:21:50,913 p=23814 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:21:51,449 p=23814 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Provision the mysql database] *** 2023-12-04 17:21:51,449 p=23814 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:21:51,516 p=23814 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:21:51,517 p=23814 u=1008320000 n=ansible | localhost : ok=17 changed=8 unreachable=0 failed=0 skipped=4 rescued=0 ignored=0 STEP: Verify Application deployment @ 12/04/23 17:21:51.565 [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check labels pod status] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Wait until service ready for connections] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Query the inserted data] *** changed: [localhost] PLAY RECAP ********************************************************************* localhost : ok=13  changed=5  unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 2023/12/04 17:21:55 2023-12-04 17:21:52,934 p=24256 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:21:52,935 p=24256 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:21:53,140 p=24256 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:21:53,140 p=24256 u=1008320000 n=ansible | changed: [localhost] 
2023-12-04 17:21:53,357 p=24256 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:21:53,358 p=24256 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:21:53,370 p=24256 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:21:53,370 p=24256 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:21:53,630 p=24256 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:21:53,630 p=24256 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:21:53,653 p=24256 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:21:53,653 p=24256 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:21:53,674 p=24256 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:21:53,674 p=24256 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:21:53,684 p=24256 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:21:54,175 p=24256 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:21:54,175 p=24256 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:21:55,087 p=24256 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check labels pod status] *** 2023-12-04 17:21:55,088 p=24256 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:21:55,347 p=24256 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Wait until service ready for connections] *** 2023-12-04 17:21:55,348 p=24256 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:21:55,680 p=24256 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Query the inserted data] *** 2023-12-04 17:21:55,680 p=24256 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:21:55,691 p=24256 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:21:55,691 p=24256 u=1008320000 n=ansible | localhost : ok=13 changed=5 unreachable=0 failed=0 skipped=8 rescued=0 ignored=0 STEP: Creating backup mysql198-85f38a45-92c9-11ee-b39d-0a580a838148 @ 12/04/23 17:21:55.743 2023/12/04 17:21:55 Wait until backup mysql198-85f38a45-92c9-11ee-b39d-0a580a838148 is completed backup phase: InProgress backup phase: WaitingForPluginOperationsPartiallyFailed backup phase: WaitingForPluginOperationsPartiallyFailed backup phase: PartiallyFailed STEP: Verify backup mysql198-85f38a45-92c9-11ee-b39d-0a580a838148 has completed successfully @ 12/04/23 17:23:15.791 [FAILED] in [It] - /alabama/cspi/test_common/backup_restore_case.go:125 @ 12/04/23 17:23:15.839 2023/12/04 17:23:15 Using Must-gather image: registry.redhat.io/oadp/oadp-mustgather-rhel9:1.3.0 STEP: Get the failed spec name @ 12/04/23 17:23:15.839 2023/12/04 17:23:15 The failed spec name is: Backup restore tests Application backup [tc-id:OADP-198][test-upstream][smoke] Different labels selector: Backup and Restore with multiple matched labels [orLabelSelectors] STEP: Create a folder for all must-gather files if it doesn't exists already @ 12/04/23 17:23:15.839 2023/12/04 17:23:15 The folder logs does not exists, creating new folder with the name: 
logs STEP: Create a folder for the failed spec if it doesn't exists already @ 12/04/23 17:23:15.839 2023/12/04 17:23:15 The folder logs/It_Backup_restore_tests_Application_backup_tc-id_OADP-198_test-upstream_smoke_Different_labels_selector_Backup_and_Restore_with_multiple_matched_labels_orLabelSelectors_labels does not exists, creating new folder with the name: logs/It_Backup_restore_tests_Application_backup_tc-id_OADP-198_test-upstream_smoke_Different_labels_selector_Backup_and_Restore_with_multiple_matched_labels_orLabelSelectors_labels STEP: Run must-gather because the spec failed @ 12/04/23 17:23:15.839 2023/12/04 17:23:15 [adm must-gather --dest-dir logs/It_Backup_restore_tests_Application_backup_tc-id_OADP-198_test-upstream_smoke_Different_labels_selector_Backup_and_Restore_with_multiple_matched_labels_orLabelSelectors_labels --image registry.redhat.io/oadp/oadp-mustgather-rhel9:1.3.0] STEP: Find must-gather folder and rename it to a shorter more readable name @ 12/04/23 17:23:50.232 2023/12/04 17:23:50 Failed to find must-gather folder 2023/12/04 17:23:50 rename logs/It_Backup_restore_tests_Application_backup_tc-id_OADP-198_test-upstream_smoke_Different_labels_selector_Backup_and_Restore_with_multiple_matched_labels_orLabelSelectors_labels/must-gather: no such file or directory [FAILED] in [JustAfterEach] - /alabama/cspi/lib/must_gather_helpers.go:111 @ 12/04/23 17:23:50.233 2023/12/04 17:23:50 The backup operation was not successful. Removing the namespace finalizers 2023/12/04 17:23:50 Cleaning app [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Remove namespace test-oadp-198] *** changed: [localhost] PLAY RECAP ********************************************************************* localhost : ok=11  changed=4  unreachable=0 failed=0 skipped=10  rescued=0 ignored=0 2023/12/04 17:24:09 2023-12-04 17:23:51,688 p=24561 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:23:51,688 p=24561 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:23:51,911 p=24561 u=1008320000 n=ansible | TASK [Get cluster 
endpoint] **************************************************** 2023-12-04 17:23:51,911 p=24561 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:23:52,141 p=24561 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:23:52,141 p=24561 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:23:52,154 p=24561 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:23:52,155 p=24561 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:23:52,401 p=24561 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:23:52,401 p=24561 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:23:52,423 p=24561 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:23:52,424 p=24561 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:23:52,444 p=24561 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:23:52,444 p=24561 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:23:52,454 p=24561 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:23:52,969 p=24561 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:23:52,969 p=24561 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:24:08,756 p=24561 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Remove namespace test-oadp-198] *** 2023-12-04 17:24:08,757 p=24561 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:24:08,961 p=24561 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:24:08,961 p=24561 u=1008320000 n=ansible | localhost : ok=11 changed=4 unreachable=0 failed=0 skipped=10 rescued=0 ignored=0 2023/12/04 17:24:09 Cleaning setup resources for the backup 2023/12/04 17:24:09 Setting new default StorageClass 'gp3-csi' Run the command: oc get sc 2023/12/04 17:24:09 oc get sc NAME PROVISIONER RECLAIMPOLICY VOLUMEBINDINGMODE ALLOWVOLUMEEXPANSION AGE gp2-csi ebs.csi.aws.com Delete WaitForFirstConsumer true 37m gp3-csi (default) ebs.csi.aws.com Delete WaitForFirstConsumer true 37m 2023/12/04 17:24:09 Deleting VolumeSnapshotClass 'example-snapclass' Attempt #1 Failed. 
Retrying ↺ @ 12/04/23 17:24:09.142 2023/12/04 17:24:09 Delete all downloadrequest cassandra-hooks-e2e-d7d844cf-92c8-11ee-b39d-0a580a838148-2faf1ba1-2725-42f5-8ef4-7386921c2249 cassandra-hooks-e2e-d7d844cf-92c8-11ee-b39d-0a580a838148-ca00b5ff-4bfa-4dde-8bcd-bbec4095538c mysql198-85f38a45-92c9-11ee-b39d-0a580a838148-1ad0a923-5541-474f-87ad-de9bf86117f9 mysql198-85f38a45-92c9-11ee-b39d-0a580a838148-bfea09de-47a0-40dd-8692-7f6946ee82a9 mysql198-85f38a45-92c9-11ee-b39d-0a580a838148-e0db19bb-0413-4c2c-a66d-f054f1e40964 mysql198-85f38a45-92c9-11ee-b39d-0a580a838148-ea1c970e-86ca-4af6-880a-5c47ab0bf675 STEP: Create DPA CR @ 12/04/23 17:24:09.222 Updating resource allocations for Velero because running tests in parallel 2023/12/04 17:24:09 { "metadata": { "name": "ts-dpa", "namespace": "openshift-adp", "uid": "160c8b5b-9cbc-4845-a1fe-1ca881c0cced", "resourceVersion": "42843", "generation": 1, "creationTimestamp": "2023-12-04T17:24:09Z", "managedFields": [ { "manager": "e2e.test", "operation": "Update", "apiVersion": "oadp.openshift.io/v1alpha1", "time": "2023-12-04T17:24:09Z", "fieldsType": "FieldsV1", "fieldsV1": { "f:spec": { ".": {}, "f:backupLocations": {}, "f:configuration": { ".": {}, "f:velero": { ".": {}, "f:defaultPlugins": {}, "f:podConfig": { ".": {}, "f:resourceAllocations": { ".": {}, "f:requests": { ".": {}, "f:cpu": {}, "f:memory": {} } } } } }, "f:podDnsConfig": {}, "f:snapshotLocations": {} } } } ] }, "spec": { "backupLocations": [ { "velero": { "provider": "aws", "config": { "region": "us-east-1" }, "credential": { "name": "cloud-credentials", "key": "cloud" }, "objectStorage": { "bucket": "ci-op-24wp7hk6-interopoadp", "prefix": "velero-e2e-d7d4c03a-92c8-11ee-b39d-0a580a838148" }, "default": true } } ], "snapshotLocations": [], "podDnsConfig": {}, "configuration": { "velero": { "defaultPlugins": [ "openshift", "aws", "kubevirt", "csi" ], "podConfig": { "resourceAllocations": { "requests": { "cpu": "100m", "memory": "64Mi" } } } } }, "features": null }, "status": {} } Delete all the backups that remained in the phase InProgress Deleting backup CRs in progress Deletion of backup CRs in progress completed Delete all the restores that remained in the phase InProgress Deleting restore CRs in progress Deletion of restore CRs in progress completed STEP: Verify DPA CR setup @ 12/04/23 17:24:09.246 2023/12/04 17:24:09 Waiting for velero pod to be running 2023/12/04 17:24:09 Wait for DPA status.condition.reason to be 'Completed' and and message to be 'Reconcile complete' 2023/12/04 17:24:09 { "metadata": { "name": "ts-dpa", "namespace": "openshift-adp", "uid": "160c8b5b-9cbc-4845-a1fe-1ca881c0cced", "resourceVersion": "42843", "generation": 1, "creationTimestamp": "2023-12-04T17:24:09Z", "managedFields": [ { "manager": "e2e.test", "operation": "Update", "apiVersion": "oadp.openshift.io/v1alpha1", "time": "2023-12-04T17:24:09Z", "fieldsType": "FieldsV1", "fieldsV1": { "f:spec": { ".": {}, "f:backupLocations": {}, "f:configuration": { ".": {}, "f:velero": { ".": {}, "f:defaultPlugins": {}, "f:podConfig": { ".": {}, "f:resourceAllocations": { ".": {}, "f:requests": { ".": {}, "f:cpu": {}, "f:memory": {} } } } } }, "f:podDnsConfig": {}, "f:snapshotLocations": {} } } } ] }, "spec": { "backupLocations": [ { "velero": { "provider": "aws", "config": { "region": "us-east-1" }, "credential": { "name": "cloud-credentials", "key": "cloud" }, "objectStorage": { "bucket": "ci-op-24wp7hk6-interopoadp", "prefix": "velero-e2e-d7d4c03a-92c8-11ee-b39d-0a580a838148" }, "default": true } } ], 
"snapshotLocations": [], "podDnsConfig": {}, "configuration": { "velero": { "defaultPlugins": [ "openshift", "aws", "kubevirt", "csi" ], "podConfig": { "resourceAllocations": { "requests": { "cpu": "100m", "memory": "64Mi" } } } } }, "features": null }, "status": {} } STEP: Prepare backup resources, depending on the volumes backup type @ 12/04/23 17:24:14.263 2023/12/04 17:24:14 Snapclass 'example-snapclass' doesn't exist, creating 2023/12/04 17:24:14 Setting new default StorageClass 'gp2-csi' Run the command: oc get sc 2023/12/04 17:24:14 oc get sc NAME PROVISIONER RECLAIMPOLICY VOLUMEBINDINGMODE ALLOWVOLUMEEXPANSION AGE gp2-csi (default) ebs.csi.aws.com Delete WaitForFirstConsumer true 37m gp3-csi ebs.csi.aws.com Delete WaitForFirstConsumer true 37m STEP: Installing application for case mysql198 @ 12/04/23 17:24:14.378 [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check namespace test-oadp-198] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Create namespace] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Deploy a mysql pod] *** changed: [localhost] FAILED - RETRYING: [localhost]: Check pod status (30 retries left). FAILED - RETRYING: [localhost]: Check pod status (29 retries left). FAILED - RETRYING: [localhost]: Check pod status (28 retries left). 
TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check pod status] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Copy mysql provision script to pod] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Wait until service ready for connections] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Provision the mysql database] *** changed: [localhost] PLAY RECAP ********************************************************************* localhost : ok=17  changed=8  unreachable=0 failed=0 skipped=4  rescued=0 ignored=0 2023/12/04 17:24:37 2023-12-04 17:24:15,731 p=24803 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:24:15,732 p=24803 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:24:15,948 p=24803 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:24:15,948 p=24803 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:24:16,158 p=24803 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:24:16,158 p=24803 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:24:16,170 p=24803 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:24:16,170 p=24803 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:24:16,421 p=24803 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:24:16,421 p=24803 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:24:16,449 p=24803 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:24:16,450 p=24803 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:24:16,469 p=24803 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:24:16,469 p=24803 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:24:16,480 p=24803 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:24:16,995 p=24803 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:24:16,996 p=24803 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:24:17,769 p=24803 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check namespace test-oadp-198] *** 2023-12-04 17:24:17,770 p=24803 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:24:18,100 p=24803 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Create namespace] *** 2023-12-04 17:24:18,100 p=24803 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:24:18,935 p=24803 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Deploy a mysql pod] *** 2023-12-04 17:24:18,935 p=24803 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:24:36,171 p=24803 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check pod status] *** 2023-12-04 17:24:36,172 p=24803 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:24:36,558 p=24803 u=1008320000 
n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Copy mysql provision script to pod] *** 2023-12-04 17:24:36,558 p=24803 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:24:36,834 p=24803 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Wait until service ready for connections] *** 2023-12-04 17:24:36,834 p=24803 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:24:37,331 p=24803 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Provision the mysql database] *** 2023-12-04 17:24:37,331 p=24803 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:24:37,400 p=24803 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:24:37,400 p=24803 u=1008320000 n=ansible | localhost : ok=17 changed=8 unreachable=0 failed=0 skipped=4 rescued=0 ignored=0 STEP: Verify Application deployment @ 12/04/23 17:24:37.442 [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check labels pod status] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Wait until service ready for connections] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Query the inserted data] *** changed: [localhost] PLAY RECAP ********************************************************************* localhost : ok=13  changed=5  unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 2023/12/04 17:24:41 2023-12-04 17:24:38,953 p=25201 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:24:38,953 p=25201 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:24:39,156 p=25201 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:24:39,156 p=25201 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:24:39,359 p=25201 u=1008320000 n=ansible | TASK [Get current admin token] 
************************************************* 2023-12-04 17:24:39,359 p=25201 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:24:39,371 p=25201 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:24:39,371 p=25201 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:24:39,621 p=25201 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:24:39,621 p=25201 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:24:39,645 p=25201 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:24:39,645 p=25201 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:24:39,664 p=25201 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:24:39,664 p=25201 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:24:39,675 p=25201 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:24:40,176 p=25201 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:24:40,176 p=25201 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:24:41,066 p=25201 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check labels pod status] *** 2023-12-04 17:24:41,066 p=25201 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:24:41,325 p=25201 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Wait until service ready for connections] *** 2023-12-04 17:24:41,326 p=25201 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:24:41,651 p=25201 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Query the inserted data] *** 2023-12-04 17:24:41,651 p=25201 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:24:41,664 p=25201 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:24:41,664 p=25201 u=1008320000 n=ansible | localhost : ok=13 changed=5 unreachable=0 failed=0 skipped=8 rescued=0 ignored=0 STEP: Creating backup mysql198-ef0921de-92c9-11ee-b39d-0a580a838148 @ 12/04/23 17:24:41.714 2023/12/04 17:24:41 Wait until backup mysql198-ef0921de-92c9-11ee-b39d-0a580a838148 is completed backup phase: InProgress backup phase: WaitingForPluginOperationsPartiallyFailed backup phase: WaitingForPluginOperationsPartiallyFailed backup phase: PartiallyFailed STEP: Verify backup mysql198-ef0921de-92c9-11ee-b39d-0a580a838148 has completed successfully @ 12/04/23 17:26:01.803 [FAILED] in [It] - /alabama/cspi/test_common/backup_restore_case.go:125 @ 12/04/23 17:26:01.859 2023/12/04 17:26:01 Using Must-gather image: registry.redhat.io/oadp/oadp-mustgather-rhel9:1.3.0 STEP: Get the failed spec name @ 12/04/23 17:26:01.859 2023/12/04 17:26:01 The failed spec name is: Backup restore tests Application backup [tc-id:OADP-198][test-upstream][smoke] Different labels selector: Backup and Restore with multiple matched labels [orLabelSelectors] STEP: Create a folder for all must-gather files if it doesn't exists already @ 12/04/23 17:26:01.859 STEP: Create a folder for the failed spec if it doesn't exists already @ 12/04/23 17:26:01.859 STEP: Run must-gather because the spec failed @ 12/04/23 17:26:01.859 2023/12/04 
17:26:01 [adm must-gather --dest-dir logs/It_Backup_restore_tests_Application_backup_tc-id_OADP-198_test-upstream_smoke_Different_labels_selector_Backup_and_Restore_with_multiple_matched_labels_orLabelSelectors_labels --image registry.redhat.io/oadp/oadp-mustgather-rhel9:1.3.0] STEP: Find must-gather folder and rename it to a shorter more readable name @ 12/04/23 17:26:12.346 2023/12/04 17:26:12 Failed to find must-gather folder 2023/12/04 17:26:12 rename logs/It_Backup_restore_tests_Application_backup_tc-id_OADP-198_test-upstream_smoke_Different_labels_selector_Backup_and_Restore_with_multiple_matched_labels_orLabelSelectors_labels/must-gather: no such file or directory [FAILED] in [JustAfterEach] - /alabama/cspi/lib/must_gather_helpers.go:111 @ 12/04/23 17:26:12.346 2023/12/04 17:26:12 The backup operation was not successful. Removing the namespace finalizers 2023/12/04 17:26:12 Cleaning app [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Remove namespace test-oadp-198] *** changed: [localhost] PLAY RECAP ********************************************************************* localhost : ok=11  changed=4  unreachable=0 failed=0 skipped=10  rescued=0 ignored=0 2023/12/04 17:26:26 2023-12-04 17:26:13,774 p=25508 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:26:13,775 p=25508 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:26:13,985 p=25508 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:26:13,986 p=25508 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:26:14,203 p=25508 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:26:14,204 p=25508 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:26:14,219 p=25508 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:26:14,219 p=25508 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:26:14,482 p=25508 u=1008320000 n=ansible | TASK 
[Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:26:14,482 p=25508 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:26:14,507 p=25508 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:26:14,508 p=25508 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:26:14,529 p=25508 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:26:14,529 p=25508 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:26:14,541 p=25508 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:26:15,059 p=25508 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:26:15,059 p=25508 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:26:25,889 p=25508 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Remove namespace test-oadp-198] *** 2023-12-04 17:26:25,889 p=25508 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:26:26,087 p=25508 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:26:26,087 p=25508 u=1008320000 n=ansible | localhost : ok=11 changed=4 unreachable=0 failed=0 skipped=10 rescued=0 ignored=0 2023/12/04 17:26:26 Cleaning setup resources for the backup 2023/12/04 17:26:26 Setting new default StorageClass 'gp3-csi' Run the command: oc get sc 2023/12/04 17:26:26 oc get sc NAME PROVISIONER RECLAIMPOLICY VOLUMEBINDINGMODE ALLOWVOLUMEEXPANSION AGE gp2-csi ebs.csi.aws.com Delete WaitForFirstConsumer true 40m gp3-csi (default) ebs.csi.aws.com Delete WaitForFirstConsumer true 40m 2023/12/04 17:26:26 Deleting VolumeSnapshotClass 'example-snapclass' Attempt #2 Failed. 
Retrying ↺ @ 12/04/23 17:26:26.265 2023/12/04 17:26:26 Delete all downloadrequest cassandra-hooks-e2e-d7d844cf-92c8-11ee-b39d-0a580a838148-26b148a8-40d9-4119-bb5e-7f5dc3d3e3e4 cassandra-hooks-e2e-d7d844cf-92c8-11ee-b39d-0a580a838148-f94f8de4-0af7-4d27-a4b9-51457b315153 mysql198-85f38a45-92c9-11ee-b39d-0a580a838148-1d72604a-6e15-40d1-8c39-4202408209ef mysql198-85f38a45-92c9-11ee-b39d-0a580a838148-1dff3725-78dc-47b1-ac37-1a7c236183bc mysql198-85f38a45-92c9-11ee-b39d-0a580a838148-4281da54-609c-42ac-857f-d243d5edeef1 mysql198-85f38a45-92c9-11ee-b39d-0a580a838148-d36d7ea3-6126-47a9-b851-2d3200b8d4fd mysql198-ef0921de-92c9-11ee-b39d-0a580a838148-0a68697e-81c9-4bc1-bba3-52b3cd7cb3a4 mysql198-ef0921de-92c9-11ee-b39d-0a580a838148-14e181b6-445e-4654-b821-2a34fd95ce7c mysql198-ef0921de-92c9-11ee-b39d-0a580a838148-7a14e9e0-19ae-4162-a83f-048e7b537bcf mysql198-ef0921de-92c9-11ee-b39d-0a580a838148-9745db06-826b-4bff-b1ff-7f69ccf360f7 STEP: Create DPA CR @ 12/04/23 17:26:26.391 Updating resource allocations for Velero because running tests in parallel 2023/12/04 17:26:26 { "metadata": { "name": "ts-dpa", "namespace": "openshift-adp", "uid": "12c35b4a-4775-40af-84c6-a775805fc1e6", "resourceVersion": "44844", "generation": 1, "creationTimestamp": "2023-12-04T17:26:26Z", "managedFields": [ { "manager": "e2e.test", "operation": "Update", "apiVersion": "oadp.openshift.io/v1alpha1", "time": "2023-12-04T17:26:26Z", "fieldsType": "FieldsV1", "fieldsV1": { "f:spec": { ".": {}, "f:backupLocations": {}, "f:configuration": { ".": {}, "f:velero": { ".": {}, "f:defaultPlugins": {}, "f:podConfig": { ".": {}, "f:resourceAllocations": { ".": {}, "f:requests": { ".": {}, "f:cpu": {}, "f:memory": {} } } } } }, "f:podDnsConfig": {}, "f:snapshotLocations": {} } } } ] }, "spec": { "backupLocations": [ { "velero": { "provider": "aws", "config": { "region": "us-east-1" }, "credential": { "name": "cloud-credentials", "key": "cloud" }, "objectStorage": { "bucket": "ci-op-24wp7hk6-interopoadp", "prefix": "velero-e2e-d7d4c03a-92c8-11ee-b39d-0a580a838148" }, "default": true } } ], "snapshotLocations": [], "podDnsConfig": {}, "configuration": { "velero": { "defaultPlugins": [ "openshift", "aws", "kubevirt", "csi" ], "podConfig": { "resourceAllocations": { "requests": { "cpu": "100m", "memory": "64Mi" } } } } }, "features": null }, "status": {} } Delete all the backups that remained in the phase InProgress Deleting backup CRs in progress Deletion of backup CRs in progress completed Delete all the restores that remained in the phase InProgress Deleting restore CRs in progress Deletion of restore CRs in progress completed STEP: Verify DPA CR setup @ 12/04/23 17:26:26.416 2023/12/04 17:26:26 Waiting for velero pod to be running 2023/12/04 17:26:26 Wait for DPA status.condition.reason to be 'Completed' and and message to be 'Reconcile complete' 2023/12/04 17:26:26 { "metadata": { "name": "ts-dpa", "namespace": "openshift-adp", "uid": "12c35b4a-4775-40af-84c6-a775805fc1e6", "resourceVersion": "44844", "generation": 1, "creationTimestamp": "2023-12-04T17:26:26Z", "managedFields": [ { "manager": "e2e.test", "operation": "Update", "apiVersion": "oadp.openshift.io/v1alpha1", "time": "2023-12-04T17:26:26Z", "fieldsType": "FieldsV1", "fieldsV1": { "f:spec": { ".": {}, "f:backupLocations": {}, "f:configuration": { ".": {}, "f:velero": { ".": {}, "f:defaultPlugins": {}, "f:podConfig": { ".": {}, "f:resourceAllocations": { ".": {}, "f:requests": { ".": {}, "f:cpu": {}, "f:memory": {} } } } } }, "f:podDnsConfig": {}, "f:snapshotLocations": {} } } } 
] }, "spec": { "backupLocations": [ { "velero": { "provider": "aws", "config": { "region": "us-east-1" }, "credential": { "name": "cloud-credentials", "key": "cloud" }, "objectStorage": { "bucket": "ci-op-24wp7hk6-interopoadp", "prefix": "velero-e2e-d7d4c03a-92c8-11ee-b39d-0a580a838148" }, "default": true } } ], "snapshotLocations": [], "podDnsConfig": {}, "configuration": { "velero": { "defaultPlugins": [ "openshift", "aws", "kubevirt", "csi" ], "podConfig": { "resourceAllocations": { "requests": { "cpu": "100m", "memory": "64Mi" } } } } }, "features": null }, "status": {} } STEP: Prepare backup resources, depending on the volumes backup type @ 12/04/23 17:26:31.436 2023/12/04 17:26:31 Snapclass 'example-snapclass' doesn't exist, creating 2023/12/04 17:26:31 Setting new default StorageClass 'gp2-csi' Run the command: oc get sc 2023/12/04 17:26:31 oc get sc NAME PROVISIONER RECLAIMPOLICY VOLUMEBINDINGMODE ALLOWVOLUMEEXPANSION AGE gp2-csi (default) ebs.csi.aws.com Delete WaitForFirstConsumer true 40m gp3-csi ebs.csi.aws.com Delete WaitForFirstConsumer true 40m STEP: Installing application for case mysql198 @ 12/04/23 17:26:31.573 [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check namespace test-oadp-198] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Create namespace] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Deploy a mysql pod] *** changed: [localhost] FAILED - RETRYING: [localhost]: Check pod status (30 retries left). FAILED - RETRYING: [localhost]: Check pod status (29 retries left). TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check pod status] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Copy mysql provision script to pod] *** changed: [localhost] FAILED - RETRYING: [localhost]: Wait until service ready for connections (30 retries left). 
TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Wait until service ready for connections] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Provision the mysql database] *** changed: [localhost] PLAY RECAP ********************************************************************* localhost : ok=17  changed=8  unreachable=0 failed=0 skipped=4  rescued=0 ignored=0 2023/12/04 17:26:54 2023-12-04 17:26:32,976 p=25750 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:26:32,977 p=25750 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:26:33,198 p=25750 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:26:33,198 p=25750 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:26:33,414 p=25750 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:26:33,414 p=25750 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:26:33,427 p=25750 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:26:33,427 p=25750 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:26:33,680 p=25750 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:26:33,680 p=25750 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:26:33,702 p=25750 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:26:33,702 p=25750 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:26:33,721 p=25750 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:26:33,721 p=25750 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:26:33,730 p=25750 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:26:34,241 p=25750 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:26:34,242 p=25750 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:26:35,076 p=25750 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check namespace test-oadp-198] *** 2023-12-04 17:26:35,076 p=25750 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:26:35,451 p=25750 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Create namespace] *** 2023-12-04 17:26:35,451 p=25750 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:26:36,336 p=25750 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Deploy a mysql pod] *** 2023-12-04 17:26:36,336 p=25750 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:26:48,142 p=25750 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check pod status] *** 2023-12-04 17:26:48,143 p=25750 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:26:48,547 p=25750 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Copy mysql provision script to pod] *** 2023-12-04 17:26:48,547 p=25750 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:26:54,078 p=25750 u=1008320000 
n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Wait until service ready for connections] *** 2023-12-04 17:26:54,078 p=25750 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:26:54,617 p=25750 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Provision the mysql database] *** 2023-12-04 17:26:54,617 p=25750 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:26:54,683 p=25750 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:26:54,683 p=25750 u=1008320000 n=ansible | localhost : ok=17 changed=8 unreachable=0 failed=0 skipped=4 rescued=0 ignored=0 STEP: Verify Application deployment @ 12/04/23 17:26:54.732 [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check labels pod status] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Wait until service ready for connections] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Query the inserted data] *** changed: [localhost] PLAY RECAP ********************************************************************* localhost : ok=13  changed=5  unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 2023/12/04 17:26:58 2023-12-04 17:26:56,112 p=26160 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:26:56,112 p=26160 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:26:56,351 p=26160 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:26:56,351 p=26160 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:26:56,600 p=26160 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:26:56,600 p=26160 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:26:56,614 p=26160 u=1008320000 n=ansible | TASK [set_fact] 
**************************************************************** 2023-12-04 17:26:56,614 p=26160 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:26:56,893 p=26160 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:26:56,893 p=26160 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:26:56,917 p=26160 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:26:56,917 p=26160 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:26:56,939 p=26160 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:26:56,939 p=26160 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:26:56,955 p=26160 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:26:57,458 p=26160 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:26:57,459 p=26160 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:26:58,355 p=26160 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check labels pod status] *** 2023-12-04 17:26:58,355 p=26160 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:26:58,601 p=26160 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Wait until service ready for connections] *** 2023-12-04 17:26:58,602 p=26160 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:26:58,911 p=26160 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Query the inserted data] *** 2023-12-04 17:26:58,911 p=26160 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:26:58,922 p=26160 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:26:58,922 p=26160 u=1008320000 n=ansible | localhost : ok=13 changed=5 unreachable=0 failed=0 skipped=8 rescued=0 ignored=0 STEP: Creating backup mysql198-40c490ac-92ca-11ee-b39d-0a580a838148 @ 12/04/23 17:26:58.976 2023/12/04 17:26:58 Wait until backup mysql198-40c490ac-92ca-11ee-b39d-0a580a838148 is completed backup phase: InProgress backup phase: WaitingForPluginOperationsPartiallyFailed backup phase: WaitingForPluginOperationsPartiallyFailed backup phase: PartiallyFailed STEP: Verify backup mysql198-40c490ac-92ca-11ee-b39d-0a580a838148 has completed successfully @ 12/04/23 17:28:19.052 [FAILED] in [It] - /alabama/cspi/test_common/backup_restore_case.go:125 @ 12/04/23 17:28:19.106 2023/12/04 17:28:19 Using Must-gather image: registry.redhat.io/oadp/oadp-mustgather-rhel9:1.3.0 STEP: Get the failed spec name @ 12/04/23 17:28:19.106 2023/12/04 17:28:19 The failed spec name is: Backup restore tests Application backup [tc-id:OADP-198][test-upstream][smoke] Different labels selector: Backup and Restore with multiple matched labels [orLabelSelectors] STEP: Create a folder for all must-gather files if it doesn't exists already @ 12/04/23 17:28:19.106 STEP: Create a folder for the failed spec if it doesn't exists already @ 12/04/23 17:28:19.106 STEP: Run must-gather because the spec failed @ 12/04/23 17:28:19.106 2023/12/04 17:28:19 [adm must-gather --dest-dir 
logs/It_Backup_restore_tests_Application_backup_tc-id_OADP-198_test-upstream_smoke_Different_labels_selector_Backup_and_Restore_with_multiple_matched_labels_orLabelSelectors_labels --image registry.redhat.io/oadp/oadp-mustgather-rhel9:1.3.0] STEP: Find must-gather folder and rename it to a shorter more readable name @ 12/04/23 17:28:29.55 2023/12/04 17:28:29 Failed to find must-gather folder 2023/12/04 17:28:29 rename logs/It_Backup_restore_tests_Application_backup_tc-id_OADP-198_test-upstream_smoke_Different_labels_selector_Backup_and_Restore_with_multiple_matched_labels_orLabelSelectors_labels/must-gather: no such file or directory [FAILED] in [JustAfterEach] - /alabama/cspi/lib/must_gather_helpers.go:111 @ 12/04/23 17:28:29.55 2023/12/04 17:28:29 The backup operation was not successful. Removing the namespace finalizers 2023/12/04 17:28:29 Cleaning app [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Remove namespace test-oadp-198] *** changed: [localhost] PLAY RECAP ********************************************************************* localhost : ok=11  changed=4  unreachable=0 failed=0 skipped=10  rescued=0 ignored=0 2023/12/04 17:28:48 2023-12-04 17:28:31,014 p=26461 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:28:31,014 p=26461 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:28:31,259 p=26461 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:28:31,260 p=26461 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:28:31,520 p=26461 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:28:31,520 p=26461 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:28:31,533 p=26461 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:28:31,533 p=26461 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:28:31,787 p=26461 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from 
cluster_info] **********************
2023-12-04 17:28:31,787 p=26461 u=1008320000 n=ansible | ok: [localhost]
2023-12-04 17:28:31,809 p=26461 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************
2023-12-04 17:28:31,809 p=26461 u=1008320000 n=ansible | ok: [localhost]
2023-12-04 17:28:31,828 p=26461 u=1008320000 n=ansible | TASK [set_fact] ****************************************************************
2023-12-04 17:28:31,829 p=26461 u=1008320000 n=ansible | ok: [localhost]
2023-12-04 17:28:31,839 p=26461 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************
2023-12-04 17:28:32,358 p=26461 u=1008320000 n=ansible | TASK [Gathering Facts] *********************************************************
2023-12-04 17:28:32,358 p=26461 u=1008320000 n=ansible | ok: [localhost]
2023-12-04 17:28:48,135 p=26461 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Remove namespace test-oadp-198] ***
2023-12-04 17:28:48,135 p=26461 u=1008320000 n=ansible | changed: [localhost]
2023-12-04 17:28:48,316 p=26461 u=1008320000 n=ansible | PLAY RECAP *********************************************************************
2023-12-04 17:28:48,316 p=26461 u=1008320000 n=ansible | localhost : ok=11 changed=4 unreachable=0 failed=0 skipped=10 rescued=0 ignored=0
2023/12/04 17:28:48 Cleaning setup resources for the backup
2023/12/04 17:28:48 Setting new default StorageClass 'gp3-csi'
Run the command: oc get sc
2023/12/04 17:28:48 oc get sc
NAME                PROVISIONER       RECLAIMPOLICY   VOLUMEBINDINGMODE      ALLOWVOLUMEEXPANSION   AGE
gp2-csi             ebs.csi.aws.com   Delete          WaitForFirstConsumer   true                   42m
gp3-csi (default)   ebs.csi.aws.com   Delete          WaitForFirstConsumer   true                   42m
2023/12/04 17:28:48 Deleting VolumeSnapshotClass 'example-snapclass'
• [FAILED] [455.633 seconds]
Backup restore tests Application backup [It] [tc-id:OADP-198][test-upstream][smoke] Different labels selector: Backup and Restore with multiple matched labels [orLabelSelectors] [labels]
/alabama/cspi/e2e/app_backup/backup_restore_labels.go:46
[FAILED] backup phase is: PartiallyFailed; expected: Completed
validation errors: []
velero failure logs: [
velero container contains "level=error" in line#158: time="2023-12-04T17:26:46Z" level=error msg="Current BackupStorageLocations available/unavailable/unknown: 0/0/1)" controller=backup-storage-location logSource="/remote-source/velero/app/pkg/controller/backup_storage_location_controller.go:194"
velero container contains "level=error" in line#1611: time="2023-12-04T17:27:29Z" level=error msg=0 backup=openshift-adp/mysql198-40c490ac-92ca-11ee-b39d-0a580a838148 logSource="/remote-source/velero/app/pkg/controller/backup_controller.go:729"
]
Expected
  : PartiallyFailed
to equal
  : Completed
In [It] at: /alabama/cspi/test_common/backup_restore_case.go:125 @ 12/04/23 17:28:19.106
There were additional failures detected.
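The first velero error above is the usual culprit for a PartiallyFailed phase: the BackupStorageLocation never reported Available (0/0/1). A minimal manual triage sketch, assuming the standard openshift-adp namespace and the Velero CRDs that OADP installs (the resource names are the generic ones, not confirmed from this run):

# Why is the BackupStorageLocation unavailable?
oc -n openshift-adp get backupstoragelocations.velero.io
oc -n openshift-adp describe backupstoragelocations.velero.io

# Phase and errors recorded on the Backup CR, plus the velero pod log
oc -n openshift-adp get backups.velero.io mysql198-40c490ac-92ca-11ee-b39d-0a580a838148 -o jsonpath='{.status.phase}{"\n"}'
oc -n openshift-adp logs deployment/velero | grep 'level=error'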
To view them in detail run ginkgo -vv ------------------------------ S ------------------------------ Backup restore tests Application backup [tc-id:OADP-200][test-upstream] Different labels selector: Backup and Restore with multiple matched multiple labels under (matchLabels) [labels] /alabama/cspi/e2e/app_backup/backup_restore_labels.go:85 2023/12/04 17:28:48 Delete all downloadrequest cassandra-hooks-e2e-d7d844cf-92c8-11ee-b39d-0a580a838148-58d652f3-4a3f-4f1c-b59f-0d3b2f7b0734 cassandra-hooks-e2e-d7d844cf-92c8-11ee-b39d-0a580a838148-5b2bbf13-23ef-4885-8a11-cbfb349ecf10 mysql198-40c490ac-92ca-11ee-b39d-0a580a838148-4e18a8fc-a5a0-4c1f-9380-f882c40a2db0 mysql198-40c490ac-92ca-11ee-b39d-0a580a838148-639b8c74-fa58-48db-86aa-ec4a1e47bc28 mysql198-40c490ac-92ca-11ee-b39d-0a580a838148-85a7d0fd-020d-428d-b1ef-d24bfee6f3bb mysql198-40c490ac-92ca-11ee-b39d-0a580a838148-be6196b2-d990-4e11-90b1-b75fd5e4adeb mysql198-85f38a45-92c9-11ee-b39d-0a580a838148-3890e7ea-72b6-403a-bdda-f3e7abd49c2a mysql198-85f38a45-92c9-11ee-b39d-0a580a838148-98810682-c03b-4b8a-85e2-a4dc05a8a1b7 mysql198-85f38a45-92c9-11ee-b39d-0a580a838148-a7fbbc26-c832-433a-8fb3-367b69f5fb6c mysql198-85f38a45-92c9-11ee-b39d-0a580a838148-f382e398-cd64-4e30-b2b2-54264169ece1 mysql198-ef0921de-92c9-11ee-b39d-0a580a838148-236fe8f0-3c3e-4003-841b-fe4603e60bd3 mysql198-ef0921de-92c9-11ee-b39d-0a580a838148-5ecca1fd-105a-429a-9520-30824de133b4 mysql198-ef0921de-92c9-11ee-b39d-0a580a838148-9c2e10f0-e4f2-4166-ab1f-c0f390566f1c mysql198-ef0921de-92c9-11ee-b39d-0a580a838148-cc3d7c20-94ea-404b-9f4d-2c463550c9ff STEP: Create DPA CR @ 12/04/23 17:28:48.603 Updating resource allocations for Velero because running tests in parallel 2023/12/04 17:28:48 { "metadata": { "name": "ts-dpa", "namespace": "openshift-adp", "uid": "724d1d57-85f7-4bc6-bb35-ffb7b6a1623b", "resourceVersion": "46710", "generation": 1, "creationTimestamp": "2023-12-04T17:28:48Z", "managedFields": [ { "manager": "e2e.test", "operation": "Update", "apiVersion": "oadp.openshift.io/v1alpha1", "time": "2023-12-04T17:28:48Z", "fieldsType": "FieldsV1", "fieldsV1": { "f:spec": { ".": {}, "f:backupLocations": {}, "f:configuration": { ".": {}, "f:velero": { ".": {}, "f:defaultPlugins": {}, "f:podConfig": { ".": {}, "f:resourceAllocations": { ".": {}, "f:requests": { ".": {}, "f:cpu": {}, "f:memory": {} } } } } }, "f:podDnsConfig": {}, "f:snapshotLocations": {} } } } ] }, "spec": { "backupLocations": [ { "velero": { "provider": "aws", "config": { "region": "us-east-1" }, "credential": { "name": "cloud-credentials", "key": "cloud" }, "objectStorage": { "bucket": "ci-op-24wp7hk6-interopoadp", "prefix": "velero-e2e-d7d4c03a-92c8-11ee-b39d-0a580a838148" }, "default": true } } ], "snapshotLocations": [], "podDnsConfig": {}, "configuration": { "velero": { "defaultPlugins": [ "openshift", "aws", "kubevirt", "csi" ], "podConfig": { "resourceAllocations": { "requests": { "cpu": "100m", "memory": "64Mi" } } } } }, "features": null }, "status": {} } Delete all the backups that remained in the phase InProgress Deleting backup CRs in progress Deletion of backup CRs in progress completed Delete all the restores that remained in the phase InProgress Deleting restore CRs in progress Deletion of restore CRs in progress completed STEP: Verify DPA CR setup @ 12/04/23 17:28:48.628 2023/12/04 17:28:48 Waiting for velero pod to be running 2023/12/04 17:28:48 Wait for DPA status.condition.reason to be 'Completed' and and message to be 'Reconcile complete' 2023/12/04 17:28:48 { "metadata": { "name": "ts-dpa", "namespace": 
"openshift-adp", "uid": "724d1d57-85f7-4bc6-bb35-ffb7b6a1623b", "resourceVersion": "46710", "generation": 1, "creationTimestamp": "2023-12-04T17:28:48Z", "managedFields": [ { "manager": "e2e.test", "operation": "Update", "apiVersion": "oadp.openshift.io/v1alpha1", "time": "2023-12-04T17:28:48Z", "fieldsType": "FieldsV1", "fieldsV1": { "f:spec": { ".": {}, "f:backupLocations": {}, "f:configuration": { ".": {}, "f:velero": { ".": {}, "f:defaultPlugins": {}, "f:podConfig": { ".": {}, "f:resourceAllocations": { ".": {}, "f:requests": { ".": {}, "f:cpu": {}, "f:memory": {} } } } } }, "f:podDnsConfig": {}, "f:snapshotLocations": {} } } } ] }, "spec": { "backupLocations": [ { "velero": { "provider": "aws", "config": { "region": "us-east-1" }, "credential": { "name": "cloud-credentials", "key": "cloud" }, "objectStorage": { "bucket": "ci-op-24wp7hk6-interopoadp", "prefix": "velero-e2e-d7d4c03a-92c8-11ee-b39d-0a580a838148" }, "default": true } } ], "snapshotLocations": [], "podDnsConfig": {}, "configuration": { "velero": { "defaultPlugins": [ "openshift", "aws", "kubevirt", "csi" ], "podConfig": { "resourceAllocations": { "requests": { "cpu": "100m", "memory": "64Mi" } } } } }, "features": null }, "status": {} } STEP: Prepare backup resources, depending on the volumes backup type @ 12/04/23 17:28:53.642 Run the command: oc get ns openshift-storage &> /dev/null && echo true || echo false 2023/12/04 17:28:53 The 'openshift-storage' namespace does not exist 2023/12/04 17:28:53 Using default CSI driver based on infrastructure: ebs.csi.aws.com 2023/12/04 17:28:53 Snapclass 'example-snapclass' doesn't exist, creating 2023/12/04 17:28:53 Setting new default StorageClass 'gp2-csi' Run the command: oc get sc 2023/12/04 17:28:53 oc get sc NAME PROVISIONER RECLAIMPOLICY VOLUMEBINDINGMODE ALLOWVOLUMEEXPANSION AGE gp2-csi (default) ebs.csi.aws.com Delete WaitForFirstConsumer true 42m gp3-csi ebs.csi.aws.com Delete WaitForFirstConsumer true 42m STEP: Installing application for case mysql @ 12/04/23 17:28:53.854 [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. 
Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check namespace test-oadp-200] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Create namespace] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Deploy a mysql pod] *** changed: [localhost] FAILED - RETRYING: [localhost]: Check pod status (30 retries left). FAILED - RETRYING: [localhost]: Check pod status (29 retries left). TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check pod status] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Copy mysql provision script to pod] *** changed: [localhost] FAILED - RETRYING: [localhost]: Wait until service ready for connections (30 retries left). 
TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Wait until service ready for connections] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Provision the mysql database] *** changed: [localhost] PLAY RECAP ********************************************************************* localhost : ok=17  changed=8  unreachable=0 failed=0 skipped=4  rescued=0 ignored=0 2023/12/04 17:29:16 2023-12-04 17:28:55,224 p=26723 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:28:55,224 p=26723 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:28:55,447 p=26723 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:28:55,447 p=26723 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:28:55,654 p=26723 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:28:55,654 p=26723 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:28:55,666 p=26723 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:28:55,666 p=26723 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:28:55,908 p=26723 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:28:55,908 p=26723 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:28:55,931 p=26723 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:28:55,931 p=26723 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:28:55,950 p=26723 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:28:55,950 p=26723 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:28:55,960 p=26723 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:28:56,475 p=26723 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:28:56,475 p=26723 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:28:57,196 p=26723 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check namespace test-oadp-200] *** 2023-12-04 17:28:57,196 p=26723 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:28:57,509 p=26723 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Create namespace] *** 2023-12-04 17:28:57,509 p=26723 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:28:58,336 p=26723 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Deploy a mysql pod] *** 2023-12-04 17:28:58,336 p=26723 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:29:10,034 p=26723 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check pod status] *** 2023-12-04 17:29:10,035 p=26723 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:29:10,434 p=26723 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Copy mysql provision script to pod] *** 2023-12-04 17:29:10,434 p=26723 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:29:15,924 p=26723 u=1008320000 
n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Wait until service ready for connections] *** 2023-12-04 17:29:15,924 p=26723 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:29:16,431 p=26723 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Provision the mysql database] *** 2023-12-04 17:29:16,431 p=26723 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:29:16,498 p=26723 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:29:16,498 p=26723 u=1008320000 n=ansible | localhost : ok=17 changed=8 unreachable=0 failed=0 skipped=4 rescued=0 ignored=0 STEP: Verify Application deployment @ 12/04/23 17:29:16.553 [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check labels pod status] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Wait until service ready for connections] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Query the inserted data] *** changed: [localhost] PLAY RECAP ********************************************************************* localhost : ok=13  changed=5  unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 2023/12/04 17:29:20 2023-12-04 17:29:17,992 p=27141 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:29:17,992 p=27141 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:29:18,221 p=27141 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:29:18,221 p=27141 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:29:18,441 p=27141 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:29:18,442 p=27141 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:29:18,455 p=27141 u=1008320000 n=ansible | TASK [set_fact] 
**************************************************************** 2023-12-04 17:29:18,456 p=27141 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:29:18,710 p=27141 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:29:18,711 p=27141 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:29:18,734 p=27141 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:29:18,734 p=27141 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:29:18,756 p=27141 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:29:18,756 p=27141 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:29:18,766 p=27141 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:29:19,284 p=27141 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:29:19,284 p=27141 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:29:20,204 p=27141 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check labels pod status] *** 2023-12-04 17:29:20,204 p=27141 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:29:20,464 p=27141 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Wait until service ready for connections] *** 2023-12-04 17:29:20,464 p=27141 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:29:20,791 p=27141 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Query the inserted data] *** 2023-12-04 17:29:20,791 p=27141 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:29:20,802 p=27141 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:29:20,802 p=27141 u=1008320000 n=ansible | localhost : ok=13 changed=5 unreachable=0 failed=0 skipped=8 rescued=0 ignored=0 STEP: Creating backup mysql-9586dd40-92ca-11ee-b39d-0a580a838148 @ 12/04/23 17:29:20.843 2023/12/04 17:29:20 Wait until backup mysql-9586dd40-92ca-11ee-b39d-0a580a838148 is completed backup phase: InProgress backup phase: WaitingForPluginOperationsPartiallyFailed backup phase: WaitingForPluginOperationsPartiallyFailed backup phase: WaitingForPluginOperationsPartiallyFailed backup phase: PartiallyFailed STEP: Verify backup mysql-9586dd40-92ca-11ee-b39d-0a580a838148 has completed successfully @ 12/04/23 17:31:00.93 [FAILED] in [It] - /alabama/cspi/test_common/backup_restore_case.go:125 @ 12/04/23 17:31:00.982 2023/12/04 17:31:00 Using Must-gather image: registry.redhat.io/oadp/oadp-mustgather-rhel9:1.3.0 STEP: Get the failed spec name @ 12/04/23 17:31:00.982 2023/12/04 17:31:00 The failed spec name is: Backup restore tests Application backup [tc-id:OADP-200][test-upstream] Different labels selector: Backup and Restore with multiple matched multiple labels under (matchLabels) STEP: Create a folder for all must-gather files if it doesn't exists already @ 12/04/23 17:31:00.982 STEP: Create a folder for the failed spec if it doesn't exists already @ 12/04/23 17:31:00.982 2023/12/04 17:31:00 The folder logs/It_Backup_restore_tests_Application_backup_tc-id_OADP-200_test-upstream_Different_labels_selector_Backup_and_Restore_with_multiple_matched_multiple_labels_under_(matchLabels)_labels does not exists, 
creating new folder with the name: logs/It_Backup_restore_tests_Application_backup_tc-id_OADP-200_test-upstream_Different_labels_selector_Backup_and_Restore_with_multiple_matched_multiple_labels_under_(matchLabels)_labels STEP: Run must-gather because the spec failed @ 12/04/23 17:31:00.982 2023/12/04 17:31:00 [adm must-gather --dest-dir logs/It_Backup_restore_tests_Application_backup_tc-id_OADP-200_test-upstream_Different_labels_selector_Backup_and_Restore_with_multiple_matched_multiple_labels_under_(matchLabels)_labels --image registry.redhat.io/oadp/oadp-mustgather-rhel9:1.3.0] STEP: Find must-gather folder and rename it to a shorter more readable name @ 12/04/23 17:31:11.454 2023/12/04 17:31:11 Failed to find must-gather folder 2023/12/04 17:31:11 rename logs/It_Backup_restore_tests_Application_backup_tc-id_OADP-200_test-upstream_Different_labels_selector_Backup_and_Restore_with_multiple_matched_multiple_labels_under_(matchLabels)_labels/must-gather: no such file or directory [FAILED] in [JustAfterEach] - /alabama/cspi/lib/must_gather_helpers.go:111 @ 12/04/23 17:31:11.454 2023/12/04 17:31:11 The backup operation was not successful. Removing the namespace finalizers 2023/12/04 17:31:11 Cleaning app [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Remove namespace test-oadp-200] *** changed: [localhost] PLAY RECAP ********************************************************************* localhost : ok=11  changed=4  unreachable=0 failed=0 skipped=10  rescued=0 ignored=0 2023/12/04 17:31:29 2023-12-04 17:31:12,809 p=27446 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:31:12,809 p=27446 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:31:13,016 p=27446 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:31:13,016 p=27446 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:31:13,230 p=27446 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:31:13,231 p=27446 
u=1008320000 n=ansible | changed: [localhost]
2023-12-04 17:31:13,244 p=27446 u=1008320000 n=ansible | TASK [set_fact] ****************************************************************
2023-12-04 17:31:13,244 p=27446 u=1008320000 n=ansible | ok: [localhost]
2023-12-04 17:31:13,493 p=27446 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] **********************
2023-12-04 17:31:13,493 p=27446 u=1008320000 n=ansible | ok: [localhost]
2023-12-04 17:31:13,515 p=27446 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************
2023-12-04 17:31:13,515 p=27446 u=1008320000 n=ansible | ok: [localhost]
2023-12-04 17:31:13,534 p=27446 u=1008320000 n=ansible | TASK [set_fact] ****************************************************************
2023-12-04 17:31:13,534 p=27446 u=1008320000 n=ansible | ok: [localhost]
2023-12-04 17:31:13,544 p=27446 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************
2023-12-04 17:31:14,037 p=27446 u=1008320000 n=ansible | TASK [Gathering Facts] *********************************************************
2023-12-04 17:31:14,037 p=27446 u=1008320000 n=ansible | ok: [localhost]
2023-12-04 17:31:29,775 p=27446 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Remove namespace test-oadp-200] ***
2023-12-04 17:31:29,775 p=27446 u=1008320000 n=ansible | changed: [localhost]
2023-12-04 17:31:29,950 p=27446 u=1008320000 n=ansible | PLAY RECAP *********************************************************************
2023-12-04 17:31:29,950 p=27446 u=1008320000 n=ansible | localhost : ok=11 changed=4 unreachable=0 failed=0 skipped=10 rescued=0 ignored=0
2023/12/04 17:31:29 Cleaning setup resources for the backup
2023/12/04 17:31:29 Setting new default StorageClass 'gp3-csi'
Run the command: oc get sc
2023/12/04 17:31:30 oc get sc
NAME                PROVISIONER       RECLAIMPOLICY   VOLUMEBINDINGMODE      ALLOWVOLUMEEXPANSION   AGE
gp2-csi             ebs.csi.aws.com   Delete          WaitForFirstConsumer   true                   45m
gp3-csi (default)   ebs.csi.aws.com   Delete          WaitForFirstConsumer   true                   45m
2023/12/04 17:31:30 Deleting VolumeSnapshotClass 'example-snapclass'
Attempt #1 Failed.
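The cleanup above hands the default StorageClass back to gp3-csi before the retry. If a run dies between attempts, the same toggle can be applied by hand with the standard is-default-class annotation; a sketch, assuming only the two EBS CSI classes listed by 'oc get sc' above:

# demote gp2-csi, promote gp3-csi as the cluster default
oc patch storageclass gp2-csi -p '{"metadata":{"annotations":{"storageclass.kubernetes.io/is-default-class":"false"}}}'
oc patch storageclass gp3-csi -p '{"metadata":{"annotations":{"storageclass.kubernetes.io/is-default-class":"true"}}}'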
Retrying ↺ @ 12/04/23 17:31:30.121 2023/12/04 17:31:30 Delete all downloadrequest cassandra-hooks-e2e-d7d844cf-92c8-11ee-b39d-0a580a838148-7ad4a941-a130-4f82-ae89-1bc29cc9d846 cassandra-hooks-e2e-d7d844cf-92c8-11ee-b39d-0a580a838148-9c7b0687-e678-432b-a0e4-36b7ba1332a8 mysql-9586dd40-92ca-11ee-b39d-0a580a838148-1e9e704d-450d-4def-8147-ae4a6e83f829 mysql-9586dd40-92ca-11ee-b39d-0a580a838148-22d39ab4-f2f5-4032-9984-81916d066709 mysql-9586dd40-92ca-11ee-b39d-0a580a838148-b152ae08-37aa-49d9-84aa-c8832c5ef35e mysql-9586dd40-92ca-11ee-b39d-0a580a838148-c4664cc5-0598-4cf8-a9a2-741758795033 mysql198-40c490ac-92ca-11ee-b39d-0a580a838148-2e005ace-8372-40c5-a840-30ad546087bf mysql198-40c490ac-92ca-11ee-b39d-0a580a838148-5416d350-ed09-46e3-bd42-0f8ce6793e91 mysql198-40c490ac-92ca-11ee-b39d-0a580a838148-aa142bc3-dec3-4fd9-a51e-b1f847067cbf mysql198-40c490ac-92ca-11ee-b39d-0a580a838148-adfa7195-c616-4a82-a8fb-a4e8d60d9e61 mysql198-85f38a45-92c9-11ee-b39d-0a580a838148-2c204634-e2a3-4216-9998-aadc727fcf49 mysql198-85f38a45-92c9-11ee-b39d-0a580a838148-38099388-2025-4af7-9d5c-b42dae37bf9b mysql198-85f38a45-92c9-11ee-b39d-0a580a838148-59b52609-db61-4960-ae87-c2a41bac5e70 mysql198-85f38a45-92c9-11ee-b39d-0a580a838148-9e565045-75f5-4f0b-814e-378195ebbd7d mysql198-ef0921de-92c9-11ee-b39d-0a580a838148-1d1a8ee7-6fb8-4bc3-ac0e-313ca5adb342 mysql198-ef0921de-92c9-11ee-b39d-0a580a838148-62936c37-0c8e-44c9-833d-8fe4375c0db2 mysql198-ef0921de-92c9-11ee-b39d-0a580a838148-74ea79fc-cac2-40be-aa1b-e1256c3c0ef9 mysql198-ef0921de-92c9-11ee-b39d-0a580a838148-e21ea0c8-9106-4379-9c3d-d62a1f111752 STEP: Create DPA CR @ 12/04/23 17:31:30.314 Updating resource allocations for Velero because running tests in parallel 2023/12/04 17:31:30 { "metadata": { "name": "ts-dpa", "namespace": "openshift-adp", "uid": "c837039c-f1b5-41ba-a90e-547cf3ec0a2c", "resourceVersion": "48668", "generation": 1, "creationTimestamp": "2023-12-04T17:31:30Z", "managedFields": [ { "manager": "e2e.test", "operation": "Update", "apiVersion": "oadp.openshift.io/v1alpha1", "time": "2023-12-04T17:31:30Z", "fieldsType": "FieldsV1", "fieldsV1": { "f:spec": { ".": {}, "f:backupLocations": {}, "f:configuration": { ".": {}, "f:velero": { ".": {}, "f:defaultPlugins": {}, "f:podConfig": { ".": {}, "f:resourceAllocations": { ".": {}, "f:requests": { ".": {}, "f:cpu": {}, "f:memory": {} } } } } }, "f:podDnsConfig": {}, "f:snapshotLocations": {} } } } ] }, "spec": { "backupLocations": [ { "velero": { "provider": "aws", "config": { "region": "us-east-1" }, "credential": { "name": "cloud-credentials", "key": "cloud" }, "objectStorage": { "bucket": "ci-op-24wp7hk6-interopoadp", "prefix": "velero-e2e-d7d4c03a-92c8-11ee-b39d-0a580a838148" }, "default": true } } ], "snapshotLocations": [], "podDnsConfig": {}, "configuration": { "velero": { "defaultPlugins": [ "openshift", "aws", "kubevirt", "csi" ], "podConfig": { "resourceAllocations": { "requests": { "cpu": "100m", "memory": "64Mi" } } } } }, "features": null }, "status": {} } Delete all the backups that remained in the phase InProgress Deleting backup CRs in progress Deletion of backup CRs in progress completed Delete all the restores that remained in the phase InProgress Deleting restore CRs in progress Deletion of restore CRs in progress completed STEP: Verify DPA CR setup @ 12/04/23 17:31:30.338 2023/12/04 17:31:30 Waiting for velero pod to be running 2023/12/04 17:31:30 Wait for DPA status.condition.reason to be 'Completed' and and message to be 'Reconcile complete' 2023/12/04 17:31:30 { "metadata": { "name": "ts-dpa", 
"namespace": "openshift-adp", "uid": "c837039c-f1b5-41ba-a90e-547cf3ec0a2c", "resourceVersion": "48668", "generation": 1, "creationTimestamp": "2023-12-04T17:31:30Z", "managedFields": [ { "manager": "e2e.test", "operation": "Update", "apiVersion": "oadp.openshift.io/v1alpha1", "time": "2023-12-04T17:31:30Z", "fieldsType": "FieldsV1", "fieldsV1": { "f:spec": { ".": {}, "f:backupLocations": {}, "f:configuration": { ".": {}, "f:velero": { ".": {}, "f:defaultPlugins": {}, "f:podConfig": { ".": {}, "f:resourceAllocations": { ".": {}, "f:requests": { ".": {}, "f:cpu": {}, "f:memory": {} } } } } }, "f:podDnsConfig": {}, "f:snapshotLocations": {} } } } ] }, "spec": { "backupLocations": [ { "velero": { "provider": "aws", "config": { "region": "us-east-1" }, "credential": { "name": "cloud-credentials", "key": "cloud" }, "objectStorage": { "bucket": "ci-op-24wp7hk6-interopoadp", "prefix": "velero-e2e-d7d4c03a-92c8-11ee-b39d-0a580a838148" }, "default": true } } ], "snapshotLocations": [], "podDnsConfig": {}, "configuration": { "velero": { "defaultPlugins": [ "openshift", "aws", "kubevirt", "csi" ], "podConfig": { "resourceAllocations": { "requests": { "cpu": "100m", "memory": "64Mi" } } } } }, "features": null }, "status": {} } STEP: Prepare backup resources, depending on the volumes backup type @ 12/04/23 17:31:35.355 2023/12/04 17:31:35 Snapclass 'example-snapclass' doesn't exist, creating 2023/12/04 17:31:35 Setting new default StorageClass 'gp2-csi' Run the command: oc get sc 2023/12/04 17:31:35 oc get sc NAME PROVISIONER RECLAIMPOLICY VOLUMEBINDINGMODE ALLOWVOLUMEEXPANSION AGE gp2-csi (default) ebs.csi.aws.com Delete WaitForFirstConsumer true 45m gp3-csi ebs.csi.aws.com Delete WaitForFirstConsumer true 45m STEP: Installing application for case mysql @ 12/04/23 17:31:35.483 [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. 
Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check namespace test-oadp-200] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Create namespace] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Deploy a mysql pod] *** changed: [localhost] FAILED - RETRYING: [localhost]: Check pod status (30 retries left). FAILED - RETRYING: [localhost]: Check pod status (29 retries left). TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check pod status] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Copy mysql provision script to pod] *** changed: [localhost] FAILED - RETRYING: [localhost]: Wait until service ready for connections (30 retries left). 
TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Wait until service ready for connections] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Provision the mysql database] *** changed: [localhost] PLAY RECAP ********************************************************************* localhost : ok=17  changed=8  unreachable=0 failed=0 skipped=4  rescued=0 ignored=0 2023/12/04 17:31:57 2023-12-04 17:31:36,872 p=27689 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:31:36,872 p=27689 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:31:37,071 p=27689 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:31:37,071 p=27689 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:31:37,270 p=27689 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:31:37,270 p=27689 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:31:37,282 p=27689 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:31:37,282 p=27689 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:31:37,522 p=27689 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:31:37,522 p=27689 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:31:37,544 p=27689 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:31:37,544 p=27689 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:31:37,563 p=27689 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:31:37,563 p=27689 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:31:37,573 p=27689 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:31:38,060 p=27689 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:31:38,061 p=27689 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:31:38,744 p=27689 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check namespace test-oadp-200] *** 2023-12-04 17:31:38,744 p=27689 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:31:39,039 p=27689 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Create namespace] *** 2023-12-04 17:31:39,039 p=27689 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:31:39,836 p=27689 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Deploy a mysql pod] *** 2023-12-04 17:31:39,837 p=27689 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:31:51,474 p=27689 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check pod status] *** 2023-12-04 17:31:51,474 p=27689 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:31:51,857 p=27689 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Copy mysql provision script to pod] *** 2023-12-04 17:31:51,858 p=27689 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:31:57,332 p=27689 u=1008320000 
n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Wait until service ready for connections] *** 2023-12-04 17:31:57,333 p=27689 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:31:57,815 p=27689 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Provision the mysql database] *** 2023-12-04 17:31:57,815 p=27689 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:31:57,878 p=27689 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:31:57,878 p=27689 u=1008320000 n=ansible | localhost : ok=17 changed=8 unreachable=0 failed=0 skipped=4 rescued=0 ignored=0 STEP: Verify Application deployment @ 12/04/23 17:31:57.917 [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check labels pod status] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Wait until service ready for connections] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Query the inserted data] *** changed: [localhost] PLAY RECAP ********************************************************************* localhost : ok=13  changed=5  unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 2023/12/04 17:32:02 2023-12-04 17:31:59,249 p=28108 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:31:59,249 p=28108 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:31:59,458 p=28108 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:31:59,458 p=28108 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:31:59,682 p=28108 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:31:59,682 p=28108 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:31:59,700 p=28108 u=1008320000 n=ansible | TASK [set_fact] 
**************************************************************** 2023-12-04 17:31:59,701 p=28108 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:31:59,963 p=28108 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:31:59,963 p=28108 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:31:59,985 p=28108 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:31:59,985 p=28108 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:32:00,003 p=28108 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:32:00,004 p=28108 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:32:00,013 p=28108 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:32:00,503 p=28108 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:32:00,503 p=28108 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:32:01,422 p=28108 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check labels pod status] *** 2023-12-04 17:32:01,422 p=28108 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:32:01,672 p=28108 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Wait until service ready for connections] *** 2023-12-04 17:32:01,672 p=28108 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:32:01,982 p=28108 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Query the inserted data] *** 2023-12-04 17:32:01,982 p=28108 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:32:01,993 p=28108 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:32:01,993 p=28108 u=1008320000 n=ansible | localhost : ok=13 changed=5 unreachable=0 failed=0 skipped=8 rescued=0 ignored=0 STEP: Creating backup mysql-f5e14ee1-92ca-11ee-b39d-0a580a838148 @ 12/04/23 17:32:02.032 2023/12/04 17:32:02 Wait until backup mysql-f5e14ee1-92ca-11ee-b39d-0a580a838148 is completed backup phase: InProgress backup phase: WaitingForPluginOperationsPartiallyFailed backup phase: WaitingForPluginOperationsPartiallyFailed backup phase: PartiallyFailed STEP: Verify backup mysql-f5e14ee1-92ca-11ee-b39d-0a580a838148 has completed successfully @ 12/04/23 17:33:22.119 [FAILED] in [It] - /alabama/cspi/test_common/backup_restore_case.go:125 @ 12/04/23 17:33:22.166 2023/12/04 17:33:22 Using Must-gather image: registry.redhat.io/oadp/oadp-mustgather-rhel9:1.3.0 STEP: Get the failed spec name @ 12/04/23 17:33:22.166 2023/12/04 17:33:22 The failed spec name is: Backup restore tests Application backup [tc-id:OADP-200][test-upstream] Different labels selector: Backup and Restore with multiple matched multiple labels under (matchLabels) STEP: Create a folder for all must-gather files if it doesn't exists already @ 12/04/23 17:33:22.166 STEP: Create a folder for the failed spec if it doesn't exists already @ 12/04/23 17:33:22.166 STEP: Run must-gather because the spec failed @ 12/04/23 17:33:22.166 2023/12/04 17:33:22 [adm must-gather --dest-dir 
logs/It_Backup_restore_tests_Application_backup_tc-id_OADP-200_test-upstream_Different_labels_selector_Backup_and_Restore_with_multiple_matched_multiple_labels_under_(matchLabels)_labels --image registry.redhat.io/oadp/oadp-mustgather-rhel9:1.3.0] STEP: Find must-gather folder and rename it to a shorter more readable name @ 12/04/23 17:33:32.659 2023/12/04 17:33:32 Failed to find must-gather folder 2023/12/04 17:33:32 rename logs/It_Backup_restore_tests_Application_backup_tc-id_OADP-200_test-upstream_Different_labels_selector_Backup_and_Restore_with_multiple_matched_multiple_labels_under_(matchLabels)_labels/must-gather: no such file or directory [FAILED] in [JustAfterEach] - /alabama/cspi/lib/must_gather_helpers.go:111 @ 12/04/23 17:33:32.659 2023/12/04 17:33:32 The backup operation was not successful. Removing the namespace finalizers 2023/12/04 17:33:32 Cleaning app [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Remove namespace test-oadp-200] *** changed: [localhost] PLAY RECAP ********************************************************************* localhost : ok=11  changed=4  unreachable=0 failed=0 skipped=10  rescued=0 ignored=0 2023/12/04 17:33:51 2023-12-04 17:33:34,047 p=28416 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:33:34,047 p=28416 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:33:34,255 p=28416 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:33:34,255 p=28416 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:33:34,457 p=28416 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:33:34,458 p=28416 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:33:34,470 p=28416 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:33:34,470 p=28416 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:33:34,701 p=28416 u=1008320000 n=ansible | TASK [Extract kubernetes minor 
version from cluster_info] **********************
2023-12-04 17:33:34,702 p=28416 u=1008320000 n=ansible | ok: [localhost]
2023-12-04 17:33:34,724 p=28416 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************
2023-12-04 17:33:34,724 p=28416 u=1008320000 n=ansible | ok: [localhost]
2023-12-04 17:33:34,745 p=28416 u=1008320000 n=ansible | TASK [set_fact] ****************************************************************
2023-12-04 17:33:34,745 p=28416 u=1008320000 n=ansible | ok: [localhost]
2023-12-04 17:33:34,754 p=28416 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************
2023-12-04 17:33:35,251 p=28416 u=1008320000 n=ansible | TASK [Gathering Facts] *********************************************************
2023-12-04 17:33:35,251 p=28416 u=1008320000 n=ansible | ok: [localhost]
2023-12-04 17:33:50,987 p=28416 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Remove namespace test-oadp-200] ***
2023-12-04 17:33:50,987 p=28416 u=1008320000 n=ansible | changed: [localhost]
2023-12-04 17:33:51,169 p=28416 u=1008320000 n=ansible | PLAY RECAP *********************************************************************
2023-12-04 17:33:51,169 p=28416 u=1008320000 n=ansible | localhost : ok=11 changed=4 unreachable=0 failed=0 skipped=10 rescued=0 ignored=0
2023/12/04 17:33:51 Cleaning setup resources for the backup
2023/12/04 17:33:51 Setting new default StorageClass 'gp3-csi'
Run the command: oc get sc
2023/12/04 17:33:51 oc get sc
NAME                PROVISIONER       RECLAIMPOLICY   VOLUMEBINDINGMODE      ALLOWVOLUMEEXPANSION   AGE
gp2-csi             ebs.csi.aws.com   Delete          WaitForFirstConsumer   true                   47m
gp3-csi (default)   ebs.csi.aws.com   Delete          WaitForFirstConsumer   true                   47m
2023/12/04 17:33:51 Deleting VolumeSnapshotClass 'example-snapclass'
Attempt #2 Failed.
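Every attempt recreates and then deletes the 'example-snapclass' VolumeSnapshotClass. Velero's CSI support only uses a snapshot class that carries its selector label, so the object being created is presumably along these lines (a sketch; the driver comes from the ebs.csi.aws.com default reported earlier, the rest of the manifest is an assumption):

cat <<'EOF' | oc apply -f -
apiVersion: snapshot.storage.k8s.io/v1
kind: VolumeSnapshotClass
metadata:
  name: example-snapclass
  labels:
    velero.io/csi-volumesnapshot-class: "true"   # lets the Velero CSI plugin pick this class
driver: ebs.csi.aws.com
deletionPolicy: Retain
EOF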
Retrying ↺ @ 12/04/23 17:33:51.324 2023/12/04 17:33:51 Delete all downloadrequest cassandra-hooks-e2e-d7d844cf-92c8-11ee-b39d-0a580a838148-3b0d5722-150e-4b9b-abad-be5921789da5 cassandra-hooks-e2e-d7d844cf-92c8-11ee-b39d-0a580a838148-902e37a2-9ebd-40e4-a246-2a435553a687 mysql-9586dd40-92ca-11ee-b39d-0a580a838148-013169c7-e68e-44c4-8582-2318cb2d563e mysql-9586dd40-92ca-11ee-b39d-0a580a838148-7de74e36-23dc-4c9f-839e-e55ba31d5abf mysql-9586dd40-92ca-11ee-b39d-0a580a838148-8185b740-1fcf-4b84-8669-af9ace0117fa mysql-9586dd40-92ca-11ee-b39d-0a580a838148-dbafc9e6-49d1-4e24-963b-fec1a06cc8c5 mysql-f5e14ee1-92ca-11ee-b39d-0a580a838148-5a7fea93-c6df-4ab3-9dc3-02e2bf52c742 mysql-f5e14ee1-92ca-11ee-b39d-0a580a838148-800c3eba-4766-40a8-bc33-45372b640896 mysql-f5e14ee1-92ca-11ee-b39d-0a580a838148-9bcd93bc-0ca1-486d-b03f-da921b706b81 mysql-f5e14ee1-92ca-11ee-b39d-0a580a838148-dd5c5e05-eb70-4d85-9b14-977420d76d7d mysql198-40c490ac-92ca-11ee-b39d-0a580a838148-b05ccd1c-6ed5-4805-af25-5a69ee3fa970 mysql198-40c490ac-92ca-11ee-b39d-0a580a838148-c4f1e219-2cfa-46eb-b64b-eea7f25cdc39 mysql198-40c490ac-92ca-11ee-b39d-0a580a838148-f04be213-1b0e-419d-b4db-1afce0e632a7 mysql198-40c490ac-92ca-11ee-b39d-0a580a838148-f631dcaa-c4ff-4f3c-a12b-c9f3d2d104fc mysql198-85f38a45-92c9-11ee-b39d-0a580a838148-5a26e5a6-4e18-4258-9d9e-3f2197ec1eb4 mysql198-85f38a45-92c9-11ee-b39d-0a580a838148-88282858-5f89-4284-a4b1-c62a688308d7 mysql198-85f38a45-92c9-11ee-b39d-0a580a838148-9cae2bc4-4bb8-470b-8531-a8b3247e889c mysql198-85f38a45-92c9-11ee-b39d-0a580a838148-d5f33396-ced7-4d37-a789-e6309f41b252 mysql198-ef0921de-92c9-11ee-b39d-0a580a838148-6d2de23d-5757-4306-9ac6-1ea339467fac mysql198-ef0921de-92c9-11ee-b39d-0a580a838148-9564fa91-6981-471a-be20-0dc4d64014b6 mysql198-ef0921de-92c9-11ee-b39d-0a580a838148-b83c58a2-5ba4-4f40-8eb3-4777ff36bba3 mysql198-ef0921de-92c9-11ee-b39d-0a580a838148-c90e987d-b5c6-4859-99b9-fcbc633fa156 STEP: Create DPA CR @ 12/04/23 17:33:51.529 Updating resource allocations for Velero because running tests in parallel 2023/12/04 17:33:51 { "metadata": { "name": "ts-dpa", "namespace": "openshift-adp", "uid": "f2f0b16c-2e35-4d93-8662-b51cf9145270", "resourceVersion": "50844", "generation": 1, "creationTimestamp": "2023-12-04T17:33:51Z", "managedFields": [ { "manager": "e2e.test", "operation": "Update", "apiVersion": "oadp.openshift.io/v1alpha1", "time": "2023-12-04T17:33:51Z", "fieldsType": "FieldsV1", "fieldsV1": { "f:spec": { ".": {}, "f:backupLocations": {}, "f:configuration": { ".": {}, "f:velero": { ".": {}, "f:defaultPlugins": {}, "f:podConfig": { ".": {}, "f:resourceAllocations": { ".": {}, "f:requests": { ".": {}, "f:cpu": {}, "f:memory": {} } } } } }, "f:podDnsConfig": {}, "f:snapshotLocations": {} } } } ] }, "spec": { "backupLocations": [ { "velero": { "provider": "aws", "config": { "region": "us-east-1" }, "credential": { "name": "cloud-credentials", "key": "cloud" }, "objectStorage": { "bucket": "ci-op-24wp7hk6-interopoadp", "prefix": "velero-e2e-d7d4c03a-92c8-11ee-b39d-0a580a838148" }, "default": true } } ], "snapshotLocations": [], "podDnsConfig": {}, "configuration": { "velero": { "defaultPlugins": [ "openshift", "aws", "kubevirt", "csi" ], "podConfig": { "resourceAllocations": { "requests": { "cpu": "100m", "memory": "64Mi" } } } } }, "features": null }, "status": {} } Delete all the backups that remained in the phase InProgress Deleting backup CRs in progress Deletion of backup CRs in progress completed Delete all the restores that remained in the phase InProgress Deleting restore CRs in progress 
Deletion of restore CRs in progress completed STEP: Verify DPA CR setup @ 12/04/23 17:33:51.553 2023/12/04 17:33:51 Waiting for velero pod to be running 2023/12/04 17:33:51 Wait for DPA status.condition.reason to be 'Completed' and and message to be 'Reconcile complete' 2023/12/04 17:33:51 { "metadata": { "name": "ts-dpa", "namespace": "openshift-adp", "uid": "f2f0b16c-2e35-4d93-8662-b51cf9145270", "resourceVersion": "50844", "generation": 1, "creationTimestamp": "2023-12-04T17:33:51Z", "managedFields": [ { "manager": "e2e.test", "operation": "Update", "apiVersion": "oadp.openshift.io/v1alpha1", "time": "2023-12-04T17:33:51Z", "fieldsType": "FieldsV1", "fieldsV1": { "f:spec": { ".": {}, "f:backupLocations": {}, "f:configuration": { ".": {}, "f:velero": { ".": {}, "f:defaultPlugins": {}, "f:podConfig": { ".": {}, "f:resourceAllocations": { ".": {}, "f:requests": { ".": {}, "f:cpu": {}, "f:memory": {} } } } } }, "f:podDnsConfig": {}, "f:snapshotLocations": {} } } } ] }, "spec": { "backupLocations": [ { "velero": { "provider": "aws", "config": { "region": "us-east-1" }, "credential": { "name": "cloud-credentials", "key": "cloud" }, "objectStorage": { "bucket": "ci-op-24wp7hk6-interopoadp", "prefix": "velero-e2e-d7d4c03a-92c8-11ee-b39d-0a580a838148" }, "default": true } } ], "snapshotLocations": [], "podDnsConfig": {}, "configuration": { "velero": { "defaultPlugins": [ "openshift", "aws", "kubevirt", "csi" ], "podConfig": { "resourceAllocations": { "requests": { "cpu": "100m", "memory": "64Mi" } } } } }, "features": null }, "status": {} } STEP: Prepare backup resources, depending on the volumes backup type @ 12/04/23 17:33:56.579 2023/12/04 17:33:56 Snapclass 'example-snapclass' doesn't exist, creating 2023/12/04 17:33:56 Setting new default StorageClass 'gp2-csi' Run the command: oc get sc 2023/12/04 17:33:56 oc get sc NAME PROVISIONER RECLAIMPOLICY VOLUMEBINDINGMODE ALLOWVOLUMEEXPANSION AGE gp2-csi (default) ebs.csi.aws.com Delete WaitForFirstConsumer true 47m gp3-csi ebs.csi.aws.com Delete WaitForFirstConsumer true 47m STEP: Installing application for case mysql @ 12/04/23 17:33:56.699 [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. 
Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check namespace test-oadp-200] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Create namespace] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Deploy a mysql pod] *** changed: [localhost] FAILED - RETRYING: [localhost]: Check pod status (30 retries left). FAILED - RETRYING: [localhost]: Check pod status (29 retries left). FAILED - RETRYING: [localhost]: Check pod status (28 retries left). 
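A note on the backup-resource preparation logged just before this playbook run: the harness flips the cluster default StorageClass to gp2-csi and creates a VolumeSnapshotClass named example-snapclass for the ebs.csi.aws.com driver. A minimal sketch of the equivalent manual steps with oc follows; the deletionPolicy value and the velero.io/csi-volumesnapshot-class label are assumptions drawn from common Velero CSI setups, not values shown in this log.

# Make gp2-csi the default StorageClass and clear the flag on gp3-csi
oc patch storageclass gp3-csi -p '{"metadata":{"annotations":{"storageclass.kubernetes.io/is-default-class":"false"}}}'
oc patch storageclass gp2-csi -p '{"metadata":{"annotations":{"storageclass.kubernetes.io/is-default-class":"true"}}}'

# Create the VolumeSnapshotClass the test expects (driver taken from the EBS CSI provisioner shown in 'oc get sc')
oc apply -f - <<'EOF'
apiVersion: snapshot.storage.k8s.io/v1
kind: VolumeSnapshotClass
metadata:
  name: example-snapclass
  labels:
    velero.io/csi-volumesnapshot-class: "true"   # label Velero's CSI support commonly selects on (assumed)
driver: ebs.csi.aws.com
deletionPolicy: Retain                            # assumed; the test's actual policy is not shown in the log
EOF

The cleanup phase seen earlier in the log reverses the first step, making gp3-csi the default again and deleting the VolumeSnapshotClass.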
TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check pod status] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Copy mysql provision script to pod] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Wait until service ready for connections] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Provision the mysql database] *** changed: [localhost] PLAY RECAP ********************************************************************* localhost : ok=17  changed=8  unreachable=0 failed=0 skipped=4  rescued=0 ignored=0 2023/12/04 17:34:19 2023-12-04 17:33:58,072 p=28658 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:33:58,072 p=28658 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:33:58,277 p=28658 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:33:58,278 p=28658 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:33:58,500 p=28658 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:33:58,501 p=28658 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:33:58,513 p=28658 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:33:58,513 p=28658 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:33:58,754 p=28658 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:33:58,755 p=28658 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:33:58,777 p=28658 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:33:58,777 p=28658 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:33:58,796 p=28658 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:33:58,796 p=28658 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:33:58,806 p=28658 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:33:59,296 p=28658 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:33:59,297 p=28658 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:33:59,974 p=28658 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check namespace test-oadp-200] *** 2023-12-04 17:33:59,974 p=28658 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:34:00,276 p=28658 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Create namespace] *** 2023-12-04 17:34:00,276 p=28658 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:34:01,077 p=28658 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Deploy a mysql pod] *** 2023-12-04 17:34:01,078 p=28658 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:34:18,204 p=28658 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check pod status] *** 2023-12-04 17:34:18,204 p=28658 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:34:18,608 p=28658 u=1008320000 
n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Copy mysql provision script to pod] *** 2023-12-04 17:34:18,608 p=28658 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:34:18,854 p=28658 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Wait until service ready for connections] *** 2023-12-04 17:34:18,854 p=28658 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:34:19,348 p=28658 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Provision the mysql database] *** 2023-12-04 17:34:19,349 p=28658 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:34:19,412 p=28658 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:34:19,413 p=28658 u=1008320000 n=ansible | localhost : ok=17 changed=8 unreachable=0 failed=0 skipped=4 rescued=0 ignored=0 STEP: Verify Application deployment @ 12/04/23 17:34:19.457 [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check labels pod status] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Wait until service ready for connections] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Query the inserted data] *** changed: [localhost] PLAY RECAP ********************************************************************* localhost : ok=13  changed=5  unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 2023/12/04 17:34:23 2023-12-04 17:34:20,774 p=29054 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:34:20,774 p=29054 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:34:20,978 p=29054 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:34:20,978 p=29054 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:34:21,176 p=29054 u=1008320000 n=ansible | TASK [Get current admin token] 
************************************************* 2023-12-04 17:34:21,177 p=29054 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:34:21,189 p=29054 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:34:21,189 p=29054 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:34:21,430 p=29054 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:34:21,431 p=29054 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:34:21,452 p=29054 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:34:21,453 p=29054 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:34:21,471 p=29054 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:34:21,472 p=29054 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:34:21,481 p=29054 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:34:21,984 p=29054 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:34:21,984 p=29054 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:34:22,930 p=29054 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check labels pod status] *** 2023-12-04 17:34:22,930 p=29054 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:34:23,183 p=29054 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Wait until service ready for connections] *** 2023-12-04 17:34:23,183 p=29054 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:34:23,506 p=29054 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Query the inserted data] *** 2023-12-04 17:34:23,506 p=29054 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:34:23,517 p=29054 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:34:23,517 p=29054 u=1008320000 n=ansible | localhost : ok=13 changed=5 unreachable=0 failed=0 skipped=8 rescued=0 ignored=0 STEP: Creating backup mysql-4a0b13b9-92cb-11ee-b39d-0a580a838148 @ 12/04/23 17:34:23.558 2023/12/04 17:34:23 Wait until backup mysql-4a0b13b9-92cb-11ee-b39d-0a580a838148 is completed backup phase: InProgress backup phase: WaitingForPluginOperations backup phase: Completed STEP: Verify backup mysql-4a0b13b9-92cb-11ee-b39d-0a580a838148 has completed successfully @ 12/04/23 17:35:23.617 2023/12/04 17:35:23 Backup for case mysql succeeded STEP: Delete the appplication resources mysql @ 12/04/23 17:35:23.671 [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. 
Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Remove namespace test-oadp-200] *** changed: [localhost] PLAY RECAP ********************************************************************* localhost : ok=11  changed=4  unreachable=0 failed=0 skipped=10  rescued=0 ignored=0 2023/12/04 17:35:42 2023-12-04 17:35:24,950 p=29323 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:35:24,950 p=29323 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:35:25,168 p=29323 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:35:25,168 p=29323 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:35:25,387 p=29323 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:35:25,387 p=29323 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:35:25,401 p=29323 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:35:25,401 p=29323 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:35:25,645 p=29323 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:35:25,645 p=29323 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:35:25,667 p=29323 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:35:25,667 p=29323 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:35:25,686 p=29323 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:35:25,687 p=29323 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:35:25,696 p=29323 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:35:26,210 p=29323 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:35:26,210 p=29323 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:35:41,906 p=29323 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : 
Remove namespace test-oadp-200] *** 2023-12-04 17:35:41,906 p=29323 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:35:42,084 p=29323 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:35:42,084 p=29323 u=1008320000 n=ansible | localhost : ok=11 changed=4 unreachable=0 failed=0 skipped=10 rescued=0 ignored=0 2023/12/04 17:35:42 Creating restore mysql-4a0b13b9-92cb-11ee-b39d-0a580a838148 for case mysql-4a0b13b9-92cb-11ee-b39d-0a580a838148 STEP: Create restore mysql-4a0b13b9-92cb-11ee-b39d-0a580a838148 from backup mysql-4a0b13b9-92cb-11ee-b39d-0a580a838148 @ 12/04/23 17:35:42.122 2023/12/04 17:35:42 Wait until restore mysql-4a0b13b9-92cb-11ee-b39d-0a580a838148 is complete restore phase: Completed STEP: Verify restore mysql-4a0b13b9-92cb-11ee-b39d-0a580a838148has completed successfully @ 12/04/23 17:35:52.149 STEP: Verify Application restore @ 12/04/23 17:35:52.153 [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] FAILED - RETRYING: [localhost]: Check labels pod status (30 retries left). 
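The restore step logged above creates a Restore CR in openshift-adp whose name matches the backup, then polls its phase until it reports Completed. A minimal hand-run equivalent with oc, assuming only the fields visible in this log (the test harness may set additional spec fields):

oc apply -f - <<'EOF'
apiVersion: velero.io/v1
kind: Restore
metadata:
  name: mysql-4a0b13b9-92cb-11ee-b39d-0a580a838148
  namespace: openshift-adp
spec:
  backupName: mysql-4a0b13b9-92cb-11ee-b39d-0a580a838148
EOF

# Poll the phase until it prints Completed, mirroring the 'restore phase:' lines emitted by the harness
oc -n openshift-adp get restore mysql-4a0b13b9-92cb-11ee-b39d-0a580a838148 -o jsonpath='{.status.phase}'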
TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check labels pod status] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Wait until service ready for connections] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Query the inserted data] *** changed: [localhost] PLAY RECAP ********************************************************************* localhost : ok=13  changed=5  unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 2023/12/04 17:36:01 2023-12-04 17:35:53,576 p=29526 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:35:53,576 p=29526 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:35:53,775 p=29526 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:35:53,775 p=29526 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:35:53,973 p=29526 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:35:53,973 p=29526 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:35:53,985 p=29526 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:35:53,985 p=29526 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:35:54,247 p=29526 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:35:54,247 p=29526 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:35:54,272 p=29526 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:35:54,272 p=29526 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:35:54,293 p=29526 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:35:54,293 p=29526 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:35:54,302 p=29526 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:35:54,800 p=29526 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:35:54,800 p=29526 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:36:01,213 p=29526 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check labels pod status] *** 2023-12-04 17:36:01,213 p=29526 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:36:01,495 p=29526 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Wait until service ready for connections] *** 2023-12-04 17:36:01,495 p=29526 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:36:01,813 p=29526 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Query the inserted data] *** 2023-12-04 17:36:01,813 p=29526 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:36:01,824 p=29526 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:36:01,824 p=29526 u=1008320000 n=ansible | localhost : ok=13 changed=5 unreachable=0 failed=0 skipped=8 rescued=0 ignored=0 2023/12/04 17:36:01 Using Must-gather image: registry.redhat.io/oadp/oadp-mustgather-rhel9:1.3.0 2023/12/04 17:36:01 
Cleaning app [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Remove namespace test-oadp-200] *** changed: [localhost] PLAY RECAP ********************************************************************* localhost : ok=11  changed=4  unreachable=0 failed=0 skipped=10  rescued=0 ignored=0 2023/12/04 17:36:20 2023-12-04 17:36:03,167 p=29808 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:36:03,167 p=29808 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:36:03,371 p=29808 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:36:03,372 p=29808 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:36:03,572 p=29808 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:36:03,573 p=29808 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:36:03,585 p=29808 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:36:03,585 p=29808 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:36:03,834 p=29808 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:36:03,834 p=29808 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:36:03,855 p=29808 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:36:03,856 p=29808 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:36:03,876 p=29808 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:36:03,876 p=29808 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:36:03,886 p=29808 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:36:04,412 p=29808 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:36:04,412 p=29808 u=1008320000 
n=ansible | ok: [localhost] 2023-12-04 17:36:20,175 p=29808 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Remove namespace test-oadp-200] *** 2023-12-04 17:36:20,175 p=29808 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:36:20,379 p=29808 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:36:20,380 p=29808 u=1008320000 n=ansible | localhost : ok=11 changed=4 unreachable=0 failed=0 skipped=10 rescued=0 ignored=0 2023/12/04 17:36:20 Cleaning setup resources for the backup 2023/12/04 17:36:20 Setting new default StorageClass 'gp3-csi' Run the command: oc get sc 2023/12/04 17:36:20 oc get sc NAME PROVISIONER RECLAIMPOLICY VOLUMEBINDINGMODE ALLOWVOLUMEEXPANSION AGE gp2-csi ebs.csi.aws.com Delete WaitForFirstConsumer true 50m gp3-csi (default) ebs.csi.aws.com Delete WaitForFirstConsumer true 50m 2023/12/04 17:36:20 Deleting VolumeSnapshotClass 'example-snapclass' ↺ [FLAKEY TEST - TOOK 3 ATTEMPTS TO PASS] [452.113 seconds] ------------------------------ S ------------------------------ Backup restore tests Application backup [tc-id:OADP-210][test-upstream] Different labels selector: verify that labelSelector and orLabelSelectors cannot co-exist [labels] /alabama/cspi/e2e/app_backup/backup_restore_labels.go:219 2023/12/04 17:36:20 Delete all downloadrequest No download requests are found STEP: Create DPA CR @ 12/04/23 17:36:20.618 Updating resource allocations for Velero because running tests in parallel 2023/12/04 17:36:20 { "metadata": { "name": "ts-dpa", "namespace": "openshift-adp", "uid": "33adbe1a-6c4e-496a-9d28-42297e446745", "resourceVersion": "52933", "generation": 1, "creationTimestamp": "2023-12-04T17:36:20Z", "managedFields": [ { "manager": "e2e.test", "operation": "Update", "apiVersion": "oadp.openshift.io/v1alpha1", "time": "2023-12-04T17:36:20Z", "fieldsType": "FieldsV1", "fieldsV1": { "f:spec": { ".": {}, "f:backupLocations": {}, "f:configuration": { ".": {}, "f:velero": { ".": {}, "f:defaultPlugins": {}, "f:podConfig": { ".": {}, "f:resourceAllocations": { ".": {}, "f:requests": { ".": {}, "f:cpu": {}, "f:memory": {} } } } } }, "f:podDnsConfig": {}, "f:snapshotLocations": {} } } } ] }, "spec": { "backupLocations": [ { "velero": { "provider": "aws", "config": { "region": "us-east-1" }, "credential": { "name": "cloud-credentials", "key": "cloud" }, "objectStorage": { "bucket": "ci-op-24wp7hk6-interopoadp", "prefix": "velero-e2e-d7d4c03a-92c8-11ee-b39d-0a580a838148" }, "default": true } } ], "snapshotLocations": [], "podDnsConfig": {}, "configuration": { "velero": { "defaultPlugins": [ "openshift", "aws", "kubevirt", "csi" ], "podConfig": { "resourceAllocations": { "requests": { "cpu": "100m", "memory": "64Mi" } } } } }, "features": null }, "status": {} } Delete all the backups that remained in the phase InProgress Deleting backup CRs in progress Deletion of backup CRs in progress completed Delete all the restores that remained in the phase InProgress Deleting restore CRs in progress Deletion of restore CRs in progress completed STEP: Verify DPA CR setup @ 12/04/23 17:36:20.664 2023/12/04 17:36:20 Waiting for velero pod to be running 2023/12/04 17:36:20 Wait for DPA status.condition.reason to be 'Completed' and and message to be 'Reconcile complete' 2023/12/04 17:36:20 { "metadata": { "name": "ts-dpa", "namespace": "openshift-adp", "uid": "33adbe1a-6c4e-496a-9d28-42297e446745", "resourceVersion": "52933", "generation": 1, "creationTimestamp": 
"2023-12-04T17:36:20Z", "managedFields": [ { "manager": "e2e.test", "operation": "Update", "apiVersion": "oadp.openshift.io/v1alpha1", "time": "2023-12-04T17:36:20Z", "fieldsType": "FieldsV1", "fieldsV1": { "f:spec": { ".": {}, "f:backupLocations": {}, "f:configuration": { ".": {}, "f:velero": { ".": {}, "f:defaultPlugins": {}, "f:podConfig": { ".": {}, "f:resourceAllocations": { ".": {}, "f:requests": { ".": {}, "f:cpu": {}, "f:memory": {} } } } } }, "f:podDnsConfig": {}, "f:snapshotLocations": {} } } } ] }, "spec": { "backupLocations": [ { "velero": { "provider": "aws", "config": { "region": "us-east-1" }, "credential": { "name": "cloud-credentials", "key": "cloud" }, "objectStorage": { "bucket": "ci-op-24wp7hk6-interopoadp", "prefix": "velero-e2e-d7d4c03a-92c8-11ee-b39d-0a580a838148" }, "default": true } } ], "snapshotLocations": [], "podDnsConfig": {}, "configuration": { "velero": { "defaultPlugins": [ "openshift", "aws", "kubevirt", "csi" ], "podConfig": { "resourceAllocations": { "requests": { "cpu": "100m", "memory": "64Mi" } } } } }, "features": null }, "status": {} } STEP: Prepare backup resources, depending on the volumes backup type @ 12/04/23 17:36:25.697 Run the command: oc get ns openshift-storage &> /dev/null && echo true || echo false 2023/12/04 17:36:25 The 'openshift-storage' namespace does not exist 2023/12/04 17:36:25 Using default CSI driver based on infrastructure: ebs.csi.aws.com 2023/12/04 17:36:25 Snapclass 'example-snapclass' doesn't exist, creating 2023/12/04 17:36:25 Setting new default StorageClass 'gp2-csi' Run the command: oc get sc 2023/12/04 17:36:25 oc get sc NAME PROVISIONER RECLAIMPOLICY VOLUMEBINDINGMODE ALLOWVOLUMEEXPANSION AGE gp2-csi (default) ebs.csi.aws.com Delete WaitForFirstConsumer true 50m gp3-csi ebs.csi.aws.com Delete WaitForFirstConsumer true 50m STEP: Creating backup mysql-a302d8a9-92cb-11ee-b39d-0a580a838148 @ 12/04/23 17:36:25.924 2023/12/04 17:36:25 Wait until backup mysql-a302d8a9-92cb-11ee-b39d-0a580a838148 is completed backup phase: FailedValidation STEP: Verify backup mysql-a302d8a9-92cb-11ee-b39d-0a580a838148 has completed with validation error @ 12/04/23 17:36:45.969 2023/12/04 17:36:45 Backup for case mysql completed with validation error as expected STEP: Verify backup failed with the expected validation error message @ 12/04/23 17:36:45.986 2023/12/04 17:36:45 Using Must-gather image: registry.redhat.io/oadp/oadp-mustgather-rhel9:1.3.0 2023/12/04 17:36:45 Cleaning setup resources for the backup 2023/12/04 17:36:45 Setting new default StorageClass 'gp3-csi' Run the command: oc get sc 2023/12/04 17:36:46 oc get sc NAME PROVISIONER RECLAIMPOLICY VOLUMEBINDINGMODE ALLOWVOLUMEEXPANSION AGE gp2-csi ebs.csi.aws.com Delete WaitForFirstConsumer true 50m gp3-csi (default) ebs.csi.aws.com Delete WaitForFirstConsumer true 50m 2023/12/04 17:36:46 Deleting VolumeSnapshotClass 'example-snapclass' • [25.530 seconds] ------------------------------ SSS ------------------------------ Backup restore tests Application backup [bug-id:OADP-1077] [test-upstream] [smoke] MySQL application with Restic [mr-check] /alabama/cspi/e2e/app_backup/backup_restore.go:48 2023/12/04 17:36:46 Delete all downloadrequest No download requests are found STEP: Create DPA CR @ 12/04/23 17:36:46.449 Updating resource allocations for NodeAgent because running tests in parallel Updating resource allocations for Velero because running tests in parallel 2023/12/04 17:36:46 { "metadata": { "name": "ts-dpa", "namespace": "openshift-adp", "uid": 
"e374ff33-ee38-4dce-aed5-14d1f8f38675", "resourceVersion": "53226", "generation": 1, "creationTimestamp": "2023-12-04T17:36:46Z", "managedFields": [ { "manager": "e2e.test", "operation": "Update", "apiVersion": "oadp.openshift.io/v1alpha1", "time": "2023-12-04T17:36:46Z", "fieldsType": "FieldsV1", "fieldsV1": { "f:spec": { ".": {}, "f:backupLocations": {}, "f:configuration": { ".": {}, "f:nodeAgent": { ".": {}, "f:enable": {}, "f:podConfig": { ".": {}, "f:resourceAllocations": { ".": {}, "f:requests": { ".": {}, "f:cpu": {}, "f:memory": {} } } }, "f:uploaderType": {} }, "f:velero": { ".": {}, "f:defaultPlugins": {}, "f:podConfig": { ".": {}, "f:resourceAllocations": { ".": {}, "f:requests": { ".": {}, "f:cpu": {}, "f:memory": {} } } } } }, "f:podDnsConfig": {}, "f:snapshotLocations": {} } } } ] }, "spec": { "backupLocations": [ { "velero": { "provider": "aws", "config": { "region": "us-east-1" }, "credential": { "name": "cloud-credentials", "key": "cloud" }, "objectStorage": { "bucket": "ci-op-24wp7hk6-interopoadp", "prefix": "velero-e2e-d7d4c03a-92c8-11ee-b39d-0a580a838148" }, "default": true } } ], "snapshotLocations": [], "podDnsConfig": {}, "configuration": { "velero": { "defaultPlugins": [ "openshift", "aws", "kubevirt" ], "podConfig": { "resourceAllocations": { "requests": { "cpu": "100m", "memory": "64Mi" } } } }, "nodeAgent": { "enable": true, "podConfig": { "resourceAllocations": { "requests": { "cpu": "100m", "memory": "64Mi" } } }, "uploaderType": "restic" } }, "features": null }, "status": {} } Delete all the backups that remained in the phase InProgress Deleting backup CRs in progress Deletion of backup CRs in progress completed Delete all the restores that remained in the phase InProgress Deleting restore CRs in progress Deletion of restore CRs in progress completed STEP: Verify DPA CR setup @ 12/04/23 17:36:46.477 2023/12/04 17:36:46 Waiting for velero pod to be running 2023/12/04 17:36:46 pod: velero-66c9b58556-4q8nd is not yet running with status: {Succeeded [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2023-12-04 17:36:24 +0000 UTC PodCompleted } {Ready False 0001-01-01 00:00:00 +0000 UTC 2023-12-04 17:36:46 +0000 UTC PodCompleted } {ContainersReady False 0001-01-01 00:00:00 +0000 UTC 2023-12-04 17:36:46 +0000 UTC PodCompleted } {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2023-12-04 17:36:20 +0000 UTC }] 10.0.64.104 [] 2023-12-04 17:36:20 +0000 UTC [{openshift-velero-plugin {nil nil &ContainerStateTerminated{ExitCode:0,Signal:0,Reason:Completed,Message:,StartedAt:2023-12-04 17:36:21 +0000 UTC,FinishedAt:2023-12-04 17:36:21 +0000 UTC,ContainerID:cri-o://f2304c0df1641bf770d23778c7e743687106666cccf3c168d95e47cea7505051,}} {nil nil nil} true 0 registry.redhat.io/oadp/oadp-velero-plugin-rhel9@sha256:98264ebcc6950f6f240a547740260e9755ef757cec336ab6b5e8bba4d75e9502 a27d719be33680a0083b7bef21a83ebb0f94eb61a9562bc9950fb479ae35057a cri-o://f2304c0df1641bf770d23778c7e743687106666cccf3c168d95e47cea7505051 0xc001406299} {velero-plugin-for-aws {nil nil &ContainerStateTerminated{ExitCode:0,Signal:0,Reason:Completed,Message:,StartedAt:2023-12-04 17:36:22 +0000 UTC,FinishedAt:2023-12-04 17:36:22 +0000 UTC,ContainerID:cri-o://25744932e004233a50f57ede3e193f43266cece9173bd6815560728f0da159b7,}} {nil nil nil} true 0 registry.redhat.io/oadp/oadp-velero-plugin-for-aws-rhel9@sha256:32309eaa2e565b349f2806c6bc6f834876a64cd106e48044e982ed6925d5c6bf 9c953b830e58ee14db469d2b88e4fda9406f0e5f4b0f88c7913cd9a4bc7b0641 cri-o://25744932e004233a50f57ede3e193f43266cece9173bd6815560728f0da159b7 
0xc0014062f9} {kubevirt-velero-plugin {nil nil &ContainerStateTerminated{ExitCode:0,Signal:0,Reason:Completed,Message:,StartedAt:2023-12-04 17:36:23 +0000 UTC,FinishedAt:2023-12-04 17:36:23 +0000 UTC,ContainerID:cri-o://59c6fab2cec6f83a9be8949487383861a6a1893b0bfe096d91da0d6657ca66ba,}} {nil nil nil} true 0 registry.redhat.io/oadp/oadp-kubevirt-velero-plugin-rhel9@sha256:b49ab89e7bc68b9e4e83fbbf33e215a339c28cf48df8a2ed3e8440f35f22b6a6 691a4d608f0798d31f3f9916532817effc5fe93bcfa2409da069a8a1c128f0c8 cri-o://59c6fab2cec6f83a9be8949487383861a6a1893b0bfe096d91da0d6657ca66ba 0xc001406879} {velero-plugin-for-csi {nil nil &ContainerStateTerminated{ExitCode:0,Signal:0,Reason:Completed,Message:,StartedAt:2023-12-04 17:36:24 +0000 UTC,FinishedAt:2023-12-04 17:36:24 +0000 UTC,ContainerID:cri-o://82396299ec7dba09f1cbcc872e0cfd9aea60b2f435d87dcd3cd2e2bcd8b8dc8f,}} {nil nil nil} true 0 registry.redhat.io/oadp/oadp-velero-plugin-for-csi-rhel9@sha256:4a4e93abdeecc3647620dd9a17c5a3fe833659f6c0f2058a44b38da1a75b5aad 1d2287a478f54fa0f9cca68baffda82f7eb045de5fcab2b9e3b1f8547b2d182a cri-o://82396299ec7dba09f1cbcc872e0cfd9aea60b2f435d87dcd3cd2e2bcd8b8dc8f 0xc001406889}] [{velero {nil nil &ContainerStateTerminated{ExitCode:0,Signal:0,Reason:Completed,Message:,StartedAt:2023-12-04 17:36:25 +0000 UTC,FinishedAt:2023-12-04 17:36:46 +0000 UTC,ContainerID:cri-o://f433e467a54aa23e409b87a9337aeffcc1566f0efe4f7b3244f76f5ad534a2dc,}} {nil nil nil} false 0 registry.redhat.io/oadp/oadp-velero-rhel9@sha256:06482afcea65eff184901c63cbe9fc6e5b3172d23857301ad4e7b4daf362e79c a97767c21761d43f42ade7602f71338fe9623c667536f99fef24f182fcc6d0f0 cri-o://f433e467a54aa23e409b87a9337aeffcc1566f0efe4f7b3244f76f5ad534a2dc 0xc001406899}] Burstable []} 2023/12/04 17:36:51 pod: velero-5cf59669b7-sxgc9 is not yet running with status: {Pending [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2023-12-04 17:36:50 +0000 UTC } {Ready False 0001-01-01 00:00:00 +0000 UTC 2023-12-04 17:36:46 +0000 UTC ContainersNotReady containers with unready status: [velero]} {ContainersReady False 0001-01-01 00:00:00 +0000 UTC 2023-12-04 17:36:46 +0000 UTC ContainersNotReady containers with unready status: [velero]} {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2023-12-04 17:36:46 +0000 UTC }] 10.0.64.104 10.128.2.33 [{10.128.2.33}] 2023-12-04 17:36:46 +0000 UTC [{openshift-velero-plugin {nil nil &ContainerStateTerminated{ExitCode:0,Signal:0,Reason:Completed,Message:,StartedAt:2023-12-04 17:36:47 +0000 UTC,FinishedAt:2023-12-04 17:36:47 +0000 UTC,ContainerID:cri-o://1878283e6b3dd8399e5d70f5f4c68ba3220476d61b4d96ed38f4e0ea3db2739e,}} {nil nil nil} true 0 registry.redhat.io/oadp/oadp-velero-plugin-rhel9@sha256:98264ebcc6950f6f240a547740260e9755ef757cec336ab6b5e8bba4d75e9502 a27d719be33680a0083b7bef21a83ebb0f94eb61a9562bc9950fb479ae35057a cri-o://1878283e6b3dd8399e5d70f5f4c68ba3220476d61b4d96ed38f4e0ea3db2739e 0xc0010da479} {velero-plugin-for-aws {nil nil &ContainerStateTerminated{ExitCode:0,Signal:0,Reason:Completed,Message:,StartedAt:2023-12-04 17:36:49 +0000 UTC,FinishedAt:2023-12-04 17:36:49 +0000 UTC,ContainerID:cri-o://ad22893652b80adc57375bad23340df9716c4f7630bf03cee171e6161b8d007d,}} {nil nil nil} true 0 registry.redhat.io/oadp/oadp-velero-plugin-for-aws-rhel9@sha256:32309eaa2e565b349f2806c6bc6f834876a64cd106e48044e982ed6925d5c6bf 9c953b830e58ee14db469d2b88e4fda9406f0e5f4b0f88c7913cd9a4bc7b0641 cri-o://ad22893652b80adc57375bad23340df9716c4f7630bf03cee171e6161b8d007d 0xc0010da489} {kubevirt-velero-plugin {nil nil 
&ContainerStateTerminated{ExitCode:0,Signal:0,Reason:Completed,Message:,StartedAt:2023-12-04 17:36:50 +0000 UTC,FinishedAt:2023-12-04 17:36:50 +0000 UTC,ContainerID:cri-o://de5aaa51ab80e0eb661e6fb0ee25b985d0e23bc2accf83e5fb08d3d7062035a9,}} {nil nil nil} true 0 registry.redhat.io/oadp/oadp-kubevirt-velero-plugin-rhel9@sha256:b49ab89e7bc68b9e4e83fbbf33e215a339c28cf48df8a2ed3e8440f35f22b6a6 691a4d608f0798d31f3f9916532817effc5fe93bcfa2409da069a8a1c128f0c8 cri-o://de5aaa51ab80e0eb661e6fb0ee25b985d0e23bc2accf83e5fb08d3d7062035a9 0xc0010da499}] [{velero {&ContainerStateWaiting{Reason:PodInitializing,Message:,} nil nil} {nil nil nil} false 0 registry.redhat.io/oadp/oadp-velero-rhel9@sha256:06482afcea65eff184901c63cbe9fc6e5b3172d23857301ad4e7b4daf362e79c 0xc0010da4af}] Burstable []} 2023/12/04 17:36:56 Wait for DPA status.condition.reason to be 'Completed' and and message to be 'Reconcile complete' STEP: Prepare backup resources, depending on the volumes backup type @ 12/04/23 17:36:56.505 2023/12/04 17:36:56 Checking for correct number of running NodeAgent pods... STEP: Installing application for case mysql @ 12/04/23 17:36:56.516 [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check namespace test-oadp-1077] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Create namespace] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Deploy a mysql pod] *** changed: [localhost] FAILED - RETRYING: [localhost]: Check pod status (30 retries left). FAILED - RETRYING: [localhost]: Check pod status (29 retries left). 
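For reference, the DPA spec dumped a few lines above (the Restic variant with the node agent enabled) rendered as the YAML one would apply by hand. Field values are copied from the JSON in this log; only the kind and apiVersion header is supplied from the OADP CRD, and the bucket/prefix values are specific to this CI run.

oc apply -f - <<'EOF'
apiVersion: oadp.openshift.io/v1alpha1
kind: DataProtectionApplication
metadata:
  name: ts-dpa
  namespace: openshift-adp
spec:
  backupLocations:
    - velero:
        provider: aws
        default: true
        config:
          region: us-east-1
        credential:
          name: cloud-credentials
          key: cloud
        objectStorage:
          bucket: ci-op-24wp7hk6-interopoadp
          prefix: velero-e2e-d7d4c03a-92c8-11ee-b39d-0a580a838148
  configuration:
    velero:
      defaultPlugins: [openshift, aws, kubevirt]
      podConfig:
        resourceAllocations:
          requests: {cpu: 100m, memory: 64Mi}
    nodeAgent:
      enable: true
      uploaderType: restic
      podConfig:
        resourceAllocations:
          requests: {cpu: 100m, memory: 64Mi}
EOF

With uploaderType set to restic and no csi entry in defaultPlugins, volume data for this case goes through file-system backup via the node agent rather than CSI snapshots, which is consistent with the harness checking for running NodeAgent pods before installing the application.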
TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check pod status] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Copy mysql provision script to pod] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Wait until service ready for connections] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Provision the mysql database] *** changed: [localhost] PLAY RECAP ********************************************************************* localhost : ok=17  changed=8  unreachable=0 failed=0 skipped=4  rescued=0 ignored=0 2023/12/04 17:37:13 2023-12-04 17:36:57,838 p=30090 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:36:57,838 p=30090 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:36:58,035 p=30090 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:36:58,035 p=30090 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:36:58,232 p=30090 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:36:58,232 p=30090 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:36:58,244 p=30090 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:36:58,244 p=30090 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:36:58,495 p=30090 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:36:58,495 p=30090 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:36:58,518 p=30090 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:36:58,518 p=30090 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:36:58,538 p=30090 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:36:58,539 p=30090 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:36:58,549 p=30090 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:36:59,037 p=30090 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:36:59,037 p=30090 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:36:59,761 p=30090 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check namespace test-oadp-1077] *** 2023-12-04 17:36:59,761 p=30090 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:37:00,099 p=30090 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Create namespace] *** 2023-12-04 17:37:00,099 p=30090 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:37:00,867 p=30090 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Deploy a mysql pod] *** 2023-12-04 17:37:00,867 p=30090 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:37:12,524 p=30090 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check pod status] *** 2023-12-04 17:37:12,524 p=30090 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:37:12,911 p=30090 u=1008320000 
n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Copy mysql provision script to pod] *** 2023-12-04 17:37:12,912 p=30090 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:37:13,170 p=30090 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Wait until service ready for connections] *** 2023-12-04 17:37:13,170 p=30090 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:37:13,675 p=30090 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Provision the mysql database] *** 2023-12-04 17:37:13,675 p=30090 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:37:13,747 p=30090 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:37:13,747 p=30090 u=1008320000 n=ansible | localhost : ok=17 changed=8 unreachable=0 failed=0 skipped=4 rescued=0 ignored=0 STEP: Verify Application deployment @ 12/04/23 17:37:13.793 [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check mysql pod status] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Wait until service ready for connections] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Query the inserted data] *** changed: [localhost] PLAY RECAP ********************************************************************* localhost : ok=13  changed=5  unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 2023/12/04 17:37:18 2023-12-04 17:37:15,206 p=30475 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:37:15,206 p=30475 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:37:15,414 p=30475 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:37:15,414 p=30475 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:37:15,634 p=30475 u=1008320000 n=ansible | TASK [Get current admin token] 
************************************************* 2023-12-04 17:37:15,634 p=30475 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:37:15,648 p=30475 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:37:15,649 p=30475 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:37:15,892 p=30475 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:37:15,892 p=30475 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:37:15,916 p=30475 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:37:15,916 p=30475 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:37:15,938 p=30475 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:37:15,938 p=30475 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:37:15,948 p=30475 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:37:16,464 p=30475 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:37:16,465 p=30475 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:37:17,399 p=30475 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check mysql pod status] *** 2023-12-04 17:37:17,399 p=30475 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:37:17,685 p=30475 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Wait until service ready for connections] *** 2023-12-04 17:37:17,685 p=30475 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:37:18,005 p=30475 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Query the inserted data] *** 2023-12-04 17:37:18,005 p=30475 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:37:18,016 p=30475 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:37:18,016 p=30475 u=1008320000 n=ansible | localhost : ok=13 changed=5 unreachable=0 failed=0 skipped=8 rescued=0 ignored=0 STEP: Creating backup mysql-b26b3fc2-92cb-11ee-b39d-0a580a838148 @ 12/04/23 17:37:18.058 2023/12/04 17:37:18 Wait until backup mysql-b26b3fc2-92cb-11ee-b39d-0a580a838148 is completed backup phase: InProgress backup phase: InProgress backup phase: InProgress backup phase: Completed STEP: Verify backup mysql-b26b3fc2-92cb-11ee-b39d-0a580a838148 has completed successfully @ 12/04/23 17:38:38.111 2023/12/04 17:38:38 Backup for case mysql succeeded STEP: Delete the appplication resources mysql @ 12/04/23 17:38:38.147 [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. 
Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Remove namespace test-oadp-1077] *** changed: [localhost] PLAY RECAP ********************************************************************* localhost : ok=11  changed=4  unreachable=0 failed=0 skipped=10  rescued=0 ignored=0 2023/12/04 17:38:56 2023-12-04 17:38:39,478 p=30744 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:38:39,478 p=30744 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:38:39,676 p=30744 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:38:39,677 p=30744 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:38:39,901 p=30744 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:38:39,901 p=30744 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:38:39,913 p=30744 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:38:39,913 p=30744 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:38:40,171 p=30744 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:38:40,171 p=30744 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:38:40,194 p=30744 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:38:40,194 p=30744 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:38:40,214 p=30744 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:38:40,215 p=30744 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:38:40,224 p=30744 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:38:40,743 p=30744 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:38:40,743 p=30744 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:38:56,580 p=30744 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql 
: Remove namespace test-oadp-1077] *** 2023-12-04 17:38:56,580 p=30744 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:38:56,797 p=30744 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:38:56,797 p=30744 u=1008320000 n=ansible | localhost : ok=11 changed=4 unreachable=0 failed=0 skipped=10 rescued=0 ignored=0 2023/12/04 17:38:56 Creating restore mysql-b26b3fc2-92cb-11ee-b39d-0a580a838148 for case mysql-b26b3fc2-92cb-11ee-b39d-0a580a838148 STEP: Create restore mysql-b26b3fc2-92cb-11ee-b39d-0a580a838148 from backup mysql-b26b3fc2-92cb-11ee-b39d-0a580a838148 @ 12/04/23 17:38:56.843 2023/12/04 17:38:56 Wait until restore mysql-b26b3fc2-92cb-11ee-b39d-0a580a838148 is complete restore phase: InProgress restore phase: InProgress restore phase: InProgress restore phase: InProgress restore phase: InProgress restore phase: InProgress restore phase: InProgress restore phase: InProgress restore phase: InProgress restore phase: Completed STEP: Verify restore mysql-b26b3fc2-92cb-11ee-b39d-0a580a838148has completed successfully @ 12/04/23 17:40:36.924 STEP: Verify Application restore @ 12/04/23 17:40:36.928 [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check mysql pod status] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Wait until service ready for connections] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Query the inserted data] *** changed: [localhost] PLAY RECAP ********************************************************************* localhost : ok=13  changed=5  unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 2023/12/04 17:40:41 2023-12-04 17:40:38,384 p=30946 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:40:38,384 p=30946 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:40:38,595 p=30946 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:40:38,595 
p=30946 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:40:38,818 p=30946 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:40:38,818 p=30946 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:40:38,832 p=30946 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:40:38,832 p=30946 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:40:39,080 p=30946 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:40:39,081 p=30946 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:40:39,106 p=30946 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:40:39,106 p=30946 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:40:39,130 p=30946 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:40:39,130 p=30946 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:40:39,144 p=30946 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:40:39,690 p=30946 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:40:39,690 p=30946 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:40:40,697 p=30946 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Check mysql pod status] *** 2023-12-04 17:40:40,697 p=30946 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:40:41,006 p=30946 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Wait until service ready for connections] *** 2023-12-04 17:40:41,007 p=30946 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:40:41,355 p=30946 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Query the inserted data] *** 2023-12-04 17:40:41,356 p=30946 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:40:41,367 p=30946 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:40:41,367 p=30946 u=1008320000 n=ansible | localhost : ok=13 changed=5 unreachable=0 failed=0 skipped=8 rescued=0 ignored=0 2023/12/04 17:40:41 Using Must-gather image: registry.redhat.io/oadp/oadp-mustgather-rhel9:1.3.0 2023/12/04 17:40:41 Cleaning app [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. 
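The must-gather image noted just above is what the suite falls back on to collect OADP diagnostics when a case needs debugging. A minimal sketch of invoking that image by hand (the destination directory is illustrative, not taken from the log):

# Collect OADP/Velero diagnostics with the image recorded in the log.
# --dest-dir is optional; it only keeps the gathered artifacts in one place.
oc adm must-gather \
  --image=registry.redhat.io/oadp/oadp-mustgather-rhel9:1.3.0 \
  --dest-dir=./oadp-must-gather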
Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql : Remove namespace test-oadp-1077] *** changed: [localhost] PLAY RECAP ********************************************************************* localhost : ok=11  changed=4  unreachable=0 failed=0 skipped=10  rescued=0 ignored=0 2023/12/04 17:40:55 2023-12-04 17:40:42,918 p=31212 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:40:42,918 p=31212 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:40:43,140 p=31212 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:40:43,141 p=31212 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:40:43,375 p=31212 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:40:43,375 p=31212 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:40:43,389 p=31212 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:40:43,389 p=31212 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:40:43,672 p=31212 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:40:43,672 p=31212 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:40:43,705 p=31212 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:40:43,705 p=31212 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:40:43,737 p=31212 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:40:43,737 p=31212 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:40:43,749 p=31212 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:40:44,270 p=31212 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:40:44,270 p=31212 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:40:55,065 p=31212 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-mysql 
: Remove namespace test-oadp-1077] *** 2023-12-04 17:40:55,065 p=31212 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:40:55,247 p=31212 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:40:55,247 p=31212 u=1008320000 n=ansible | localhost : ok=11 changed=4 unreachable=0 failed=0 skipped=10 rescued=0 ignored=0 2023/12/04 17:40:55 Cleaning setup resources for the backup • [249.182 seconds] ------------------------------ SS ------------------------------ Backup restore tests Application backup [tc-id:OADP-122] [test-upstream] [skip-disconnected] Django application with BSL&CSI [exclude_aro-4] /alabama/cspi/e2e/app_backup/backup_restore.go:93 2023/12/04 17:40:55 Delete all downloadrequest No download requests are found STEP: Create DPA CR @ 12/04/23 17:40:55.315 Updating resource allocations for Velero because running tests in parallel 2023/12/04 17:40:55 { "metadata": { "name": "ts-dpa", "namespace": "openshift-adp", "uid": "123ae579-f7d1-44a3-878c-3b521678f4f0", "resourceVersion": "55417", "generation": 1, "creationTimestamp": "2023-12-04T17:40:55Z", "managedFields": [ { "manager": "e2e.test", "operation": "Update", "apiVersion": "oadp.openshift.io/v1alpha1", "time": "2023-12-04T17:40:55Z", "fieldsType": "FieldsV1", "fieldsV1": { "f:spec": { ".": {}, "f:backupLocations": {}, "f:configuration": { ".": {}, "f:velero": { ".": {}, "f:defaultPlugins": {}, "f:podConfig": { ".": {}, "f:resourceAllocations": { ".": {}, "f:requests": { ".": {}, "f:cpu": {}, "f:memory": {} } } } } }, "f:podDnsConfig": {}, "f:snapshotLocations": {} } } } ] }, "spec": { "backupLocations": [ { "velero": { "provider": "aws", "config": { "region": "us-east-1" }, "credential": { "name": "cloud-credentials", "key": "cloud" }, "objectStorage": { "bucket": "ci-op-24wp7hk6-interopoadp", "prefix": "velero-e2e-d7d4c03a-92c8-11ee-b39d-0a580a838148" }, "default": true } } ], "snapshotLocations": [], "podDnsConfig": {}, "configuration": { "velero": { "defaultPlugins": [ "openshift", "aws", "kubevirt", "csi" ], "podConfig": { "resourceAllocations": { "requests": { "cpu": "100m", "memory": "64Mi" } } } } }, "features": null }, "status": {} } Delete all the backups that remained in the phase InProgress Deleting backup CRs in progress Deletion of backup CRs in progress completed Delete all the restores that remained in the phase InProgress Deleting restore CRs in progress Deletion of restore CRs in progress completed STEP: Verify DPA CR setup @ 12/04/23 17:40:55.379 2023/12/04 17:40:55 Waiting for velero pod to be running 2023/12/04 17:40:55 Wait for DPA status.condition.reason to be 'Completed' and and message to be 'Reconcile complete' 2023/12/04 17:40:55 { "metadata": { "name": "ts-dpa", "namespace": "openshift-adp", "uid": "123ae579-f7d1-44a3-878c-3b521678f4f0", "resourceVersion": "55417", "generation": 1, "creationTimestamp": "2023-12-04T17:40:55Z", "managedFields": [ { "manager": "e2e.test", "operation": "Update", "apiVersion": "oadp.openshift.io/v1alpha1", "time": "2023-12-04T17:40:55Z", "fieldsType": "FieldsV1", "fieldsV1": { "f:spec": { ".": {}, "f:backupLocations": {}, "f:configuration": { ".": {}, "f:velero": { ".": {}, "f:defaultPlugins": {}, "f:podConfig": { ".": {}, "f:resourceAllocations": { ".": {}, "f:requests": { ".": {}, "f:cpu": {}, "f:memory": {} } } } } }, "f:podDnsConfig": {}, "f:snapshotLocations": {} } } } ] }, "spec": { "backupLocations": [ { "velero": { "provider": "aws", "config": { "region": "us-east-1" }, "credential": { 
"name": "cloud-credentials", "key": "cloud" }, "objectStorage": { "bucket": "ci-op-24wp7hk6-interopoadp", "prefix": "velero-e2e-d7d4c03a-92c8-11ee-b39d-0a580a838148" }, "default": true } } ], "snapshotLocations": [], "podDnsConfig": {}, "configuration": { "velero": { "defaultPlugins": [ "openshift", "aws", "kubevirt", "csi" ], "podConfig": { "resourceAllocations": { "requests": { "cpu": "100m", "memory": "64Mi" } } } } }, "features": null }, "status": {} } STEP: Prepare backup resources, depending on the volumes backup type @ 12/04/23 17:41:00.425 Run the command: oc get ns openshift-storage &> /dev/null && echo true || echo false 2023/12/04 17:41:00 The 'openshift-storage' namespace does not exist 2023/12/04 17:41:00 Using default CSI driver based on infrastructure: ebs.csi.aws.com 2023/12/04 17:41:00 Snapclass 'example-snapclass' doesn't exist, creating 2023/12/04 17:41:00 Setting new default StorageClass 'gp2-csi' Run the command: oc get sc 2023/12/04 17:41:00 oc get sc NAME PROVISIONER RECLAIMPOLICY VOLUMEBINDINGMODE ALLOWVOLUMEEXPANSION AGE gp2-csi (default) ebs.csi.aws.com Delete WaitForFirstConsumer true 54m gp3-csi ebs.csi.aws.com Delete WaitForFirstConsumer true 54m STEP: Installing application for case django-persistent @ 12/04/23 17:41:00.654 [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Check namespace test-oadp-122] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Create namespace test-oadp-122] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Create the mtc test django psql persistent template] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Create openshift django psql persistent application from openshift templates] *** changed: [localhost] PLAY RECAP ********************************************************************* localhost : ok=14  changed=6  unreachable=0 failed=0 skipped=9  rescued=0 ignored=0 2023/12/04 17:41:06 2023-12-04 17:41:02,210 p=31457 u=1008320000 n=ansible | TASK [Remove all the contents 
from the file] *********************************** 2023-12-04 17:41:02,210 p=31457 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:41:02,438 p=31457 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:41:02,438 p=31457 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:41:02,672 p=31457 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:41:02,672 p=31457 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:41:02,685 p=31457 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:41:02,685 p=31457 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:41:02,955 p=31457 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:41:02,955 p=31457 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:41:02,978 p=31457 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:41:02,978 p=31457 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:41:03,001 p=31457 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:41:03,001 p=31457 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:41:03,012 p=31457 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:41:03,548 p=31457 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:41:03,548 p=31457 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:41:04,294 p=31457 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Check namespace test-oadp-122] *** 2023-12-04 17:41:04,294 p=31457 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:41:04,628 p=31457 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Create namespace test-oadp-122] *** 2023-12-04 17:41:04,629 p=31457 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:41:05,450 p=31457 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Create the mtc test django psql persistent template] *** 2023-12-04 17:41:05,450 p=31457 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:41:05,890 p=31457 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Create openshift django psql persistent application from openshift templates] *** 2023-12-04 17:41:05,890 p=31457 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:41:06,052 p=31457 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:41:06,053 p=31457 u=1008320000 n=ansible | localhost : ok=14 changed=6 unreachable=0 failed=0 skipped=9 rescued=0 ignored=0 STEP: Verify Application deployment @ 12/04/23 17:41:06.095 [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. 
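The "Prepare backup resources" step above created the 'example-snapclass' VolumeSnapshotClass and switched the default StorageClass to gp2-csi before installing django-persistent. A rough equivalent with plain oc, assuming the Velero CSI plugin's label-based snapshot-class selection and the standard default-class annotation (the Retain deletionPolicy is an assumption, not taken from the log):

# VolumeSnapshotClass for the AWS EBS CSI driver, labelled so the Velero CSI
# plugin can select it when snapshotting CSI-backed PVCs.
cat <<'EOF' | oc apply -f -
apiVersion: snapshot.storage.k8s.io/v1
kind: VolumeSnapshotClass
metadata:
  name: example-snapclass
  labels:
    velero.io/csi-volumesnapshot-class: "true"
driver: ebs.csi.aws.com
deletionPolicy: Retain
EOF

# Make gp2-csi the default StorageClass and clear the flag on gp3-csi,
# matching the 'oc get sc' output shown above.
oc patch storageclass gp2-csi -p '{"metadata":{"annotations":{"storageclass.kubernetes.io/is-default-class":"true"}}}'
oc patch storageclass gp3-csi -p '{"metadata":{"annotations":{"storageclass.kubernetes.io/is-default-class":"false"}}}'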
Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] FAILED - RETRYING: [localhost]: Check postgresql pod status (30 retries left). FAILED - RETRYING: [localhost]: Check postgresql pod status (29 retries left). FAILED - RETRYING: [localhost]: Check postgresql pod status (28 retries left). FAILED - RETRYING: [localhost]: Check postgresql pod status (27 retries left). FAILED - RETRYING: [localhost]: Check postgresql pod status (26 retries left). TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Check postgresql pod status] *** ok: [localhost] FAILED - RETRYING: [localhost]: Check application pod status (30 retries left). FAILED - RETRYING: [localhost]: Check application pod status (29 retries left). FAILED - RETRYING: [localhost]: Check application pod status (28 retries left). FAILED - RETRYING: [localhost]: Check application pod status (27 retries left). FAILED - RETRYING: [localhost]: Check application pod status (26 retries left). FAILED - RETRYING: [localhost]: Check application pod status (25 retries left). FAILED - RETRYING: [localhost]: Check application pod status (24 retries left). 
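The FAILED - RETRYING messages above are the role polling the postgresql and application pods until they report ready, with up to 30 attempts per check. A hedged shell equivalent of that loop (interval and namespace handling are illustrative):

# Poll until every pod in the application namespace reports Ready, mirroring
# the role's 30-retry behaviour.
NS=test-oadp-122
for attempt in $(seq 1 30); do
  if oc -n "$NS" wait pod --all --for=condition=Ready --timeout=10s >/dev/null 2>&1; then
    echo "all pods in $NS are ready"
    break
  fi
  echo "attempt $attempt: pods in $NS not ready yet"
  sleep 10
done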
TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Check application pod status] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Get route] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Access the html file] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : set_fact] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Get num visits up to now] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Print num of visits] *** ok: [localhost] => {  "msg": "PASS: # of visits should be 1; actual 1" } PLAY RECAP ********************************************************************* localhost : ok=17  changed=3  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2023/12/04 17:42:18 2023-12-04 17:41:07,545 p=31737 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:41:07,545 p=31737 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:41:07,778 p=31737 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:41:07,778 p=31737 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:41:08,003 p=31737 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:41:08,003 p=31737 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:41:08,017 p=31737 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:41:08,017 p=31737 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:41:08,315 p=31737 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:41:08,315 p=31737 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:41:08,343 p=31737 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:41:08,343 p=31737 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:41:08,365 p=31737 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:41:08,365 p=31737 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:41:08,375 p=31737 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:41:08,919 p=31737 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:41:08,919 p=31737 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:41:37,555 p=31737 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Check postgresql pod status] *** 2023-12-04 17:41:37,555 p=31737 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:42:17,086 p=31737 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Check application pod status] *** 2023-12-04 17:42:17,087 p=31737 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:42:17,884 p=31737 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Get route] *** 2023-12-04 17:42:17,884 p=31737 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:42:18,316 p=31737 u=1008320000 
n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Access the html file] *** 2023-12-04 17:42:18,316 p=31737 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:42:18,343 p=31737 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : set_fact] *** 2023-12-04 17:42:18,344 p=31737 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:42:18,635 p=31737 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Get num visits up to now] *** 2023-12-04 17:42:18,635 p=31737 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:42:18,681 p=31737 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Print num of visits] *** 2023-12-04 17:42:18,681 p=31737 u=1008320000 n=ansible | ok: [localhost] => { "msg": "PASS: # of visits should be 1; actual 1" } 2023-12-04 17:42:18,694 p=31737 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:42:18,694 p=31737 u=1008320000 n=ansible | localhost : ok=17 changed=3 unreachable=0 failed=0 skipped=6 rescued=0 ignored=0 STEP: Creating backup django-persistent-46c068ad-92cc-11ee-b39d-0a580a838148 @ 12/04/23 17:42:18.752 2023/12/04 17:42:18 Wait until backup django-persistent-46c068ad-92cc-11ee-b39d-0a580a838148 is completed backup phase: Completed STEP: Verify backup django-persistent-46c068ad-92cc-11ee-b39d-0a580a838148 has completed successfully @ 12/04/23 17:42:38.776 2023/12/04 17:42:38 Backup for case django-persistent succeeded STEP: Delete the appplication resources django-persistent @ 12/04/23 17:42:38.831 [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. 
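The backup django-persistent-46c068ad-92cc-11ee-b39d-0a580a838148 above is created and then polled until its phase reaches Completed. The same field can be read directly with oc, assuming the default openshift-adp install namespace:

# Velero Backup CRs live in the OADP namespace; the suite polls .status.phase
# until it becomes Completed (or a failure phase).
oc -n openshift-adp get backup django-persistent-46c068ad-92cc-11ee-b39d-0a580a838148 \
  -o jsonpath='{.status.phase}{"\n"}'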
Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Remove namespace test-oadp-122] *** changed: [localhost] PLAY RECAP ********************************************************************* localhost : ok=11  changed=4  unreachable=0 failed=0 skipped=12  rescued=0 ignored=0 2023/12/04 17:42:57 2023-12-04 17:42:40,229 p=32153 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:42:40,229 p=32153 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:42:40,451 p=32153 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:42:40,451 p=32153 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:42:40,651 p=32153 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:42:40,651 p=32153 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:42:40,667 p=32153 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:42:40,667 p=32153 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:42:40,925 p=32153 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:42:40,926 p=32153 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:42:40,952 p=32153 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:42:40,952 p=32153 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:42:40,974 p=32153 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:42:40,974 p=32153 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:42:40,984 p=32153 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:42:41,490 p=32153 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:42:41,490 p=32153 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:42:57,244 p=32153 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django 
: Remove namespace test-oadp-122] *** 2023-12-04 17:42:57,244 p=32153 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:42:57,496 p=32153 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:42:57,496 p=32153 u=1008320000 n=ansible | localhost : ok=11 changed=4 unreachable=0 failed=0 skipped=12 rescued=0 ignored=0 2023/12/04 17:42:57 Creating restore django-persistent-46c068ad-92cc-11ee-b39d-0a580a838148 for case django-persistent-46c068ad-92cc-11ee-b39d-0a580a838148 STEP: Create restore django-persistent-46c068ad-92cc-11ee-b39d-0a580a838148 from backup django-persistent-46c068ad-92cc-11ee-b39d-0a580a838148 @ 12/04/23 17:42:57.541 2023/12/04 17:42:57 Wait until restore django-persistent-46c068ad-92cc-11ee-b39d-0a580a838148 is complete restore phase: InProgress restore phase: Completed STEP: Verify restore django-persistent-46c068ad-92cc-11ee-b39d-0a580a838148has completed successfully @ 12/04/23 17:43:17.596 STEP: Verify Application restore @ 12/04/23 17:43:17.6 [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] FAILED - RETRYING: [localhost]: Check postgresql pod status (30 retries left). FAILED - RETRYING: [localhost]: Check postgresql pod status (29 retries left). FAILED - RETRYING: [localhost]: Check postgresql pod status (28 retries left). TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Check postgresql pod status] *** ok: [localhost] FAILED - RETRYING: [localhost]: Check application pod status (30 retries left). FAILED - RETRYING: [localhost]: Check application pod status (29 retries left). FAILED - RETRYING: [localhost]: Check application pod status (28 retries left). 
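The restore above is created from the backup of the same name and polled until it reports Completed. A minimal sketch of the equivalent velero.io/v1 Restore CR plus the phase check (only the name and spec.backupName are taken from the log; everything else is left to API defaults):

# Minimal Restore CR: restore everything captured by the named backup.
cat <<'EOF' | oc apply -f -
apiVersion: velero.io/v1
kind: Restore
metadata:
  name: django-persistent-46c068ad-92cc-11ee-b39d-0a580a838148
  namespace: openshift-adp
spec:
  backupName: django-persistent-46c068ad-92cc-11ee-b39d-0a580a838148
EOF

# Poll the phase the same way the test does.
oc -n openshift-adp get restore django-persistent-46c068ad-92cc-11ee-b39d-0a580a838148 \
  -o jsonpath='{.status.phase}{"\n"}'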
TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Check application pod status] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Get route] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Access the html file] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : set_fact] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Get num visits up to now] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Print num of visits] *** ok: [localhost] => {  "msg": "PASS: # of visits should be 2; actual 2" } PLAY RECAP ********************************************************************* localhost : ok=17  changed=3  unreachable=0 failed=0 skipped=6  rescued=0 ignored=0 2023/12/04 17:43:56 2023-12-04 17:43:19,073 p=32359 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:43:19,073 p=32359 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:43:19,284 p=32359 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:43:19,284 p=32359 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:43:19,495 p=32359 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:43:19,495 p=32359 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:43:19,508 p=32359 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:43:19,508 p=32359 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:43:19,764 p=32359 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:43:19,764 p=32359 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:43:19,788 p=32359 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:43:19,788 p=32359 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:43:19,810 p=32359 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:43:19,810 p=32359 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:43:19,820 p=32359 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:43:20,329 p=32359 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:43:20,330 p=32359 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:43:37,897 p=32359 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Check postgresql pod status] *** 2023-12-04 17:43:37,897 p=32359 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:43:55,235 p=32359 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Check application pod status] *** 2023-12-04 17:43:55,236 p=32359 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:43:55,981 p=32359 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Get route] *** 2023-12-04 17:43:55,981 p=32359 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:43:56,431 p=32359 u=1008320000 
n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Access the html file] *** 2023-12-04 17:43:56,431 p=32359 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:43:56,465 p=32359 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : set_fact] *** 2023-12-04 17:43:56,465 p=32359 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:43:56,811 p=32359 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Get num visits up to now] *** 2023-12-04 17:43:56,811 p=32359 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:43:56,855 p=32359 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Print num of visits] *** 2023-12-04 17:43:56,855 p=32359 u=1008320000 n=ansible | ok: [localhost] => { "msg": "PASS: # of visits should be 2; actual 2" } 2023-12-04 17:43:56,869 p=32359 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:43:56,869 p=32359 u=1008320000 n=ansible | localhost : ok=17 changed=3 unreachable=0 failed=0 skipped=6 rescued=0 ignored=0 2023/12/04 17:43:56 Using Must-gather image: registry.redhat.io/oadp/oadp-mustgather-rhel9:1.3.0 2023/12/04 17:43:56 Reset number of visits to 0 2023/12/04 17:43:56 Cleaning app [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Remove namespace test-oadp-122] *** changed: [localhost] PLAY RECAP ********************************************************************* localhost : ok=11  changed=4  unreachable=0 failed=0 skipped=12  rescued=0 ignored=0 2023/12/04 17:44:15 2023-12-04 17:43:58,413 p=32696 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:43:58,413 p=32696 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:43:58,621 p=32696 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:43:58,621 p=32696 u=1008320000 n=ansible | changed: [localhost] 
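The cleanup above removes the test-oadp-122 namespace through the same ocp-django role that installed the application. Reduced to its essential operation, that step amounts to deleting the namespace and waiting for it to disappear (the flags shown are illustrative):

# Remove the application namespace and block until it is fully gone, so the
# next case starts from a clean cluster state.
oc delete namespace test-oadp-122 --ignore-not-found --wait=true --timeout=5m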
2023-12-04 17:43:58,840 p=32696 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:43:58,840 p=32696 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:43:58,852 p=32696 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:43:58,852 p=32696 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:43:59,132 p=32696 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:43:59,132 p=32696 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:43:59,159 p=32696 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:43:59,159 p=32696 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:43:59,179 p=32696 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:43:59,180 p=32696 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:43:59,189 p=32696 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:43:59,720 p=32696 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:43:59,720 p=32696 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:44:15,497 p=32696 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-django : Remove namespace test-oadp-122] *** 2023-12-04 17:44:15,497 p=32696 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:44:15,746 p=32696 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:44:15,746 p=32696 u=1008320000 n=ansible | localhost : ok=11 changed=4 unreachable=0 failed=0 skipped=12 rescued=0 ignored=0 2023/12/04 17:44:15 Cleaning setup resources for the backup 2023/12/04 17:44:15 Setting new default StorageClass 'gp3-csi' Run the command: oc get sc 2023/12/04 17:44:15 oc get sc NAME PROVISIONER RECLAIMPOLICY VOLUMEBINDINGMODE ALLOWVOLUMEEXPANSION AGE gp2-csi ebs.csi.aws.com Delete WaitForFirstConsumer true 57m gp3-csi (default) ebs.csi.aws.com Delete WaitForFirstConsumer true 57m 2023/12/04 17:44:15 Deleting VolumeSnapshotClass 'example-snapclass' • [200.639 seconds] ------------------------------ SSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSS ------------------------------ Incremental backup restore tests Incremental restore pod count [test-upstream] [bug-id:OADP-1077] Todolist app with Restic - policy: none /alabama/cspi/e2e/incremental_restore/backup_restore_incremental.go:107 2023/12/04 17:44:16 Delete all downloadrequest No download requests are found STEP: Create DPA CR @ 12/04/23 17:44:16.232 Updating resource allocations for NodeAgent because running tests in parallel Updating resource allocations for Velero because running tests in parallel 2023/12/04 17:44:16 { "metadata": { "name": "ts-dpa", "namespace": "openshift-adp", "uid": "9fd6c779-71b1-43c0-8172-245f1329a626", "resourceVersion": "57849", "generation": 1, "creationTimestamp": "2023-12-04T17:44:16Z", "managedFields": [ { "manager": "e2e.test", "operation": "Update", "apiVersion": "oadp.openshift.io/v1alpha1", "time": "2023-12-04T17:44:16Z", "fieldsType": "FieldsV1", "fieldsV1": { "f:spec": { ".": {}, "f:backupLocations": {}, "f:configuration": { ".": {}, "f:nodeAgent": { ".": {}, "f:enable": {}, "f:podConfig": { ".": {}, 
"f:resourceAllocations": { ".": {}, "f:requests": { ".": {}, "f:cpu": {}, "f:memory": {} } } }, "f:uploaderType": {} }, "f:velero": { ".": {}, "f:defaultPlugins": {}, "f:podConfig": { ".": {}, "f:resourceAllocations": { ".": {}, "f:requests": { ".": {}, "f:cpu": {}, "f:memory": {} } } } } }, "f:podDnsConfig": {}, "f:snapshotLocations": {} } } } ] }, "spec": { "backupLocations": [ { "velero": { "provider": "aws", "config": { "region": "us-east-1" }, "credential": { "name": "cloud-credentials", "key": "cloud" }, "objectStorage": { "bucket": "ci-op-24wp7hk6-interopoadp", "prefix": "velero-e2e-d7d4c03a-92c8-11ee-b39d-0a580a838148" }, "default": true } } ], "snapshotLocations": [], "podDnsConfig": {}, "configuration": { "velero": { "defaultPlugins": [ "openshift", "aws", "kubevirt" ], "podConfig": { "resourceAllocations": { "requests": { "cpu": "100m", "memory": "64Mi" } } } }, "nodeAgent": { "enable": true, "podConfig": { "resourceAllocations": { "requests": { "cpu": "100m", "memory": "64Mi" } } }, "uploaderType": "restic" } }, "features": null }, "status": {} } Delete all the backups that remained in the phase InProgress Deleting backup CRs in progress Deletion of backup CRs in progress completed Delete all the restores that remained in the phase InProgress Deleting restore CRs in progress Deletion of restore CRs in progress completed STEP: Verify DPA CR setup @ 12/04/23 17:44:16.272 2023/12/04 17:44:16 Waiting for velero pod to be running 2023/12/04 17:44:16 pod: velero-66c9b58556-xmbnb is not yet running with status: {Succeeded [{Initialized True 0001-01-01 00:00:00 +0000 UTC 2023-12-04 17:41:02 +0000 UTC PodCompleted } {Ready False 0001-01-01 00:00:00 +0000 UTC 2023-12-04 17:44:16 +0000 UTC PodCompleted } {ContainersReady False 0001-01-01 00:00:00 +0000 UTC 2023-12-04 17:44:16 +0000 UTC PodCompleted } {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2023-12-04 17:40:55 +0000 UTC }] 10.0.64.104 [] 2023-12-04 17:40:55 +0000 UTC [{openshift-velero-plugin {nil nil &ContainerStateTerminated{ExitCode:0,Signal:0,Reason:Completed,Message:,StartedAt:2023-12-04 17:40:57 +0000 UTC,FinishedAt:2023-12-04 17:40:57 +0000 UTC,ContainerID:cri-o://58e649bf8802be357eafa8fb5ac5cffae4517232bc3a7a403385c26952e11477,}} {nil nil nil} true 0 registry.redhat.io/oadp/oadp-velero-plugin-rhel9@sha256:98264ebcc6950f6f240a547740260e9755ef757cec336ab6b5e8bba4d75e9502 a27d719be33680a0083b7bef21a83ebb0f94eb61a9562bc9950fb479ae35057a cri-o://58e649bf8802be357eafa8fb5ac5cffae4517232bc3a7a403385c26952e11477 0xc000a97479} {velero-plugin-for-aws {nil nil &ContainerStateTerminated{ExitCode:0,Signal:0,Reason:Completed,Message:,StartedAt:2023-12-04 17:40:58 +0000 UTC,FinishedAt:2023-12-04 17:40:58 +0000 UTC,ContainerID:cri-o://0fa81bba7a4dee654f91e633aa6471a3bb94e8574e858229ea90e0623b247315,}} {nil nil nil} true 0 registry.redhat.io/oadp/oadp-velero-plugin-for-aws-rhel9@sha256:32309eaa2e565b349f2806c6bc6f834876a64cd106e48044e982ed6925d5c6bf 9c953b830e58ee14db469d2b88e4fda9406f0e5f4b0f88c7913cd9a4bc7b0641 cri-o://0fa81bba7a4dee654f91e633aa6471a3bb94e8574e858229ea90e0623b247315 0xc000a97489} {kubevirt-velero-plugin {nil nil &ContainerStateTerminated{ExitCode:0,Signal:0,Reason:Completed,Message:,StartedAt:2023-12-04 17:41:00 +0000 UTC,FinishedAt:2023-12-04 17:41:00 +0000 UTC,ContainerID:cri-o://9b3477b62feaae72881f579ad00229463387c12592e546d0336352e9fbfdc0eb,}} {nil nil nil} true 0 registry.redhat.io/oadp/oadp-kubevirt-velero-plugin-rhel9@sha256:b49ab89e7bc68b9e4e83fbbf33e215a339c28cf48df8a2ed3e8440f35f22b6a6 
691a4d608f0798d31f3f9916532817effc5fe93bcfa2409da069a8a1c128f0c8 cri-o://9b3477b62feaae72881f579ad00229463387c12592e546d0336352e9fbfdc0eb 0xc000a97499} {velero-plugin-for-csi {nil nil &ContainerStateTerminated{ExitCode:0,Signal:0,Reason:Completed,Message:,StartedAt:2023-12-04 17:41:02 +0000 UTC,FinishedAt:2023-12-04 17:41:02 +0000 UTC,ContainerID:cri-o://f6e6ba97085dddd531aacae7b84bd3dee2e1497be7f0035f15ab1ec36312ba86,}} {nil nil nil} true 0 registry.redhat.io/oadp/oadp-velero-plugin-for-csi-rhel9@sha256:4a4e93abdeecc3647620dd9a17c5a3fe833659f6c0f2058a44b38da1a75b5aad 1d2287a478f54fa0f9cca68baffda82f7eb045de5fcab2b9e3b1f8547b2d182a cri-o://f6e6ba97085dddd531aacae7b84bd3dee2e1497be7f0035f15ab1ec36312ba86 0xc000a974a9}] [{velero {nil nil &ContainerStateTerminated{ExitCode:0,Signal:0,Reason:Completed,Message:,StartedAt:2023-12-04 17:41:03 +0000 UTC,FinishedAt:2023-12-04 17:44:16 +0000 UTC,ContainerID:cri-o://19ee48e1cc8d4fb89de2c1c46f0252e691e7c83204bce9815e7461c06cead3f1,}} {nil nil nil} false 0 registry.redhat.io/oadp/oadp-velero-rhel9@sha256:06482afcea65eff184901c63cbe9fc6e5b3172d23857301ad4e7b4daf362e79c a97767c21761d43f42ade7602f71338fe9623c667536f99fef24f182fcc6d0f0 cri-o://19ee48e1cc8d4fb89de2c1c46f0252e691e7c83204bce9815e7461c06cead3f1 0xc000a974b9}] Burstable []} 2023/12/04 17:44:21 pod: velero-5cf59669b7-55tzf is not yet running with status: {Pending [{Initialized False 0001-01-01 00:00:00 +0000 UTC 2023-12-04 17:44:16 +0000 UTC ContainersNotInitialized containers with incomplete status: [kubevirt-velero-plugin]} {Ready False 0001-01-01 00:00:00 +0000 UTC 2023-12-04 17:44:16 +0000 UTC ContainersNotReady containers with unready status: [velero]} {ContainersReady False 0001-01-01 00:00:00 +0000 UTC 2023-12-04 17:44:16 +0000 UTC ContainersNotReady containers with unready status: [velero]} {PodScheduled True 0001-01-01 00:00:00 +0000 UTC 2023-12-04 17:44:16 +0000 UTC }] 10.0.64.104 10.128.2.45 [{10.128.2.45}] 2023-12-04 17:44:16 +0000 UTC [{openshift-velero-plugin {nil nil &ContainerStateTerminated{ExitCode:0,Signal:0,Reason:Completed,Message:,StartedAt:2023-12-04 17:44:18 +0000 UTC,FinishedAt:2023-12-04 17:44:18 +0000 UTC,ContainerID:cri-o://4788a7c7cdfe332d4ec322a54f428bf46fbd7f015c94970ed67f81f16552f646,}} {nil nil nil} true 0 registry.redhat.io/oadp/oadp-velero-plugin-rhel9@sha256:98264ebcc6950f6f240a547740260e9755ef757cec336ab6b5e8bba4d75e9502 a27d719be33680a0083b7bef21a83ebb0f94eb61a9562bc9950fb479ae35057a cri-o://4788a7c7cdfe332d4ec322a54f428bf46fbd7f015c94970ed67f81f16552f646 0xc000732ea9} {velero-plugin-for-aws {nil nil &ContainerStateTerminated{ExitCode:0,Signal:0,Reason:Completed,Message:,StartedAt:2023-12-04 17:44:20 +0000 UTC,FinishedAt:2023-12-04 17:44:20 +0000 UTC,ContainerID:cri-o://4f0d43ba99f72442216e0e8fb3f3060ee9bbceac6120b34a34c391f7113d8c63,}} {nil nil nil} true 0 registry.redhat.io/oadp/oadp-velero-plugin-for-aws-rhel9@sha256:32309eaa2e565b349f2806c6bc6f834876a64cd106e48044e982ed6925d5c6bf 9c953b830e58ee14db469d2b88e4fda9406f0e5f4b0f88c7913cd9a4bc7b0641 cri-o://4f0d43ba99f72442216e0e8fb3f3060ee9bbceac6120b34a34c391f7113d8c63 0xc000732eb9} {kubevirt-velero-plugin {&ContainerStateWaiting{Reason:PodInitializing,Message:,} nil nil} {nil nil nil} false 0 registry.redhat.io/oadp/oadp-kubevirt-velero-plugin-rhel9@sha256:b49ab89e7bc68b9e4e83fbbf33e215a339c28cf48df8a2ed3e8440f35f22b6a6 0xc000732eba}] [{velero {&ContainerStateWaiting{Reason:PodInitializing,Message:,} nil nil} {nil nil nil} false 0 
registry.redhat.io/oadp/oadp-velero-rhel9@sha256:06482afcea65eff184901c63cbe9fc6e5b3172d23857301ad4e7b4daf362e79c 0xc000733086}] Burstable []} 2023/12/04 17:44:26 Wait for DPA status.condition.reason to be 'Completed' and and message to be 'Reconcile complete' STEP: Installing application for case todolist-backup @ 12/04/23 17:44:26.3 [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Check namespace todolist-mariadb-restic] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Create namespace todolist-mariadb-restic] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Ensure namespace todolist-mariadb-restic is present] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Deploy todolist-mysql application] *** changed: [localhost] FAILED - RETRYING: [localhost]: Check mysql pod status (30 retries left). FAILED - RETRYING: [localhost]: Check mysql pod status (29 retries left). TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Check mysql pod status] *** ok: [localhost] FAILED - RETRYING: [localhost]: Check todolist pod status (30 retries left). FAILED - RETRYING: [localhost]: Check todolist pod status (29 retries left). FAILED - RETRYING: [localhost]: Check todolist pod status (28 retries left). 
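The DataProtectionApplication dumped as JSON above for this incremental-restore case enables the node agent with the restic uploader alongside the Velero pod settings. Rendered as YAML it corresponds roughly to the following sketch (values copied from the JSON dump; this is not the operator's canonical example):

# DPA matching the spec shown in the log: AWS backup location plus restic node agent.
cat <<'EOF' | oc apply -f -
apiVersion: oadp.openshift.io/v1alpha1
kind: DataProtectionApplication
metadata:
  name: ts-dpa
  namespace: openshift-adp
spec:
  backupLocations:
    - velero:
        provider: aws
        default: true
        config:
          region: us-east-1
        credential:
          name: cloud-credentials
          key: cloud
        objectStorage:
          bucket: ci-op-24wp7hk6-interopoadp
          prefix: velero-e2e-d7d4c03a-92c8-11ee-b39d-0a580a838148
  configuration:
    velero:
      defaultPlugins:
        - openshift
        - aws
        - kubevirt
      podConfig:
        resourceAllocations:
          requests:
            cpu: 100m
            memory: 64Mi
    nodeAgent:
      enable: true
      uploaderType: restic
      podConfig:
        resourceAllocations:
          requests:
            cpu: 100m
            memory: 64Mi
EOF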
TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Check todolist pod status] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Wait until service is ready for connections] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Wait until todolist API server starts] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Add additional items todo list] *** changed: [localhost] Pausing for 30 seconds TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Wait for 30 seconds] *** ok: [localhost] PLAY RECAP ********************************************************************* localhost : ok=20  changed=8  unreachable=0 failed=0 skipped=9  rescued=0 ignored=0 2023/12/04 17:45:22 2023-12-04 17:44:27,708 p=32921 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:44:27,708 p=32921 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:44:27,930 p=32921 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:44:27,930 p=32921 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:44:28,157 p=32921 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:44:28,157 p=32921 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:44:28,170 p=32921 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:44:28,170 p=32921 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:44:28,431 p=32921 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:44:28,432 p=32921 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:44:28,458 p=32921 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:44:28,458 p=32921 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:44:28,478 p=32921 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:44:28,479 p=32921 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:44:28,488 p=32921 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:44:29,001 p=32921 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:44:29,001 p=32921 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:44:29,784 p=32921 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Check namespace todolist-mariadb-restic] *** 2023-12-04 17:44:29,785 p=32921 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:44:30,178 p=32921 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Create namespace todolist-mariadb-restic] *** 2023-12-04 17:44:30,179 p=32921 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:44:30,787 p=32921 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Ensure namespace todolist-mariadb-restic is present] *** 2023-12-04 17:44:30,787 p=32921 u=1008320000 
n=ansible | ok: [localhost] 2023-12-04 17:44:31,766 p=32921 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Deploy todolist-mysql application] *** 2023-12-04 17:44:31,766 p=32921 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:44:39,589 p=32921 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Check mysql pod status] *** 2023-12-04 17:44:39,590 p=32921 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:44:50,996 p=32921 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Check todolist pod status] *** 2023-12-04 17:44:50,996 p=32921 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:44:51,266 p=32921 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Wait until service is ready for connections] *** 2023-12-04 17:44:51,266 p=32921 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:44:51,531 p=32921 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Wait until todolist API server starts] *** 2023-12-04 17:44:51,531 p=32921 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:44:51,835 p=32921 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Add additional items todo list] *** 2023-12-04 17:44:51,835 p=32921 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:44:51,861 p=32921 u=1008320000 n=ansible | Pausing for 30 seconds 2023-12-04 17:45:21,894 p=32921 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Wait for 30 seconds] *** 2023-12-04 17:45:21,895 p=32921 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:45:22,064 p=32921 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:45:22,065 p=32921 u=1008320000 n=ansible | localhost : ok=20 changed=8 unreachable=0 failed=0 skipped=9 rescued=0 ignored=0 STEP: Verify Application deployment @ 12/04/23 17:45:22.111 [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. 
Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Check mysql pod is running] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Wait until mysql service ready for connections] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Check todolist pod is running] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Wait until todolist API server starts] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Obtain todolist route] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Find 1st database item] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Find the string in incomplete items] *** ok: [localhost] PLAY RECAP ********************************************************************* localhost : ok=17  changed=5  unreachable=0 failed=0 skipped=12  rescued=0 ignored=0 2023/12/04 17:45:28 2023-12-04 17:45:23,618 p=33376 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:45:23,618 p=33376 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:45:23,857 p=33376 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:45:23,857 p=33376 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:45:24,088 p=33376 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:45:24,089 p=33376 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:45:24,104 p=33376 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:45:24,105 p=33376 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:45:24,405 p=33376 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:45:24,405 p=33376 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:45:24,431 p=33376 
u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:45:24,431 p=33376 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:45:24,456 p=33376 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:45:24,456 p=33376 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:45:24,467 p=33376 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:45:24,993 p=33376 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:45:24,993 p=33376 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:45:26,028 p=33376 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Check mysql pod is running] *** 2023-12-04 17:45:26,028 p=33376 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:45:26,343 p=33376 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Wait until mysql service ready for connections] *** 2023-12-04 17:45:26,343 p=33376 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:45:26,991 p=33376 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Check todolist pod is running] *** 2023-12-04 17:45:26,991 p=33376 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:45:27,258 p=33376 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Wait until todolist API server starts] *** 2023-12-04 17:45:27,258 p=33376 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:45:28,045 p=33376 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Obtain todolist route] *** 2023-12-04 17:45:28,045 p=33376 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:45:28,445 p=33376 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Find 1st database item] *** 2023-12-04 17:45:28,445 p=33376 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:45:28,727 p=33376 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Find the string in incomplete items] *** 2023-12-04 17:45:28,728 p=33376 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:45:28,739 p=33376 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:45:28,739 p=33376 u=1008320000 n=ansible | localhost : ok=17 changed=5 unreachable=0 failed=0 skipped=12 rescued=0 ignored=0 STEP: Prepare backup resources, depending on the volumes backup type @ 12/04/23 17:45:28.795 2023/12/04 17:45:28 Checking for correct number of running NodeAgent pods... 
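For reference, the backup created in the next step is a Velero Backup custom resource that the suite polls until status.phase reports Completed. A minimal sketch of such a CR is shown below; the exact spec submitted by the suite is not visible in this log, so the OADP install namespace (openshift-adp) and the filesystem-backup flag (suggested by the NodeAgent check above) are assumptions.

apiVersion: velero.io/v1
kind: Backup
metadata:
  name: todolist-backup-e9c37b6f-92cc-11ee-b39d-0a580a838148
  namespace: openshift-adp              # assumed OADP install namespace
spec:
  includedNamespaces:
    - todolist-mariadb-restic           # application namespace deployed above
  defaultVolumesToFsBackup: true        # assumed: file-system backup through the node agent (restic/kopia)
  storageLocation: default              # assumed backup storage location name

The "backup phase: Completed" entries that follow correspond to polling .status.phase of this object until it reaches Completed.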
STEP: Creating backup todolist-backup-e9c37b6f-92cc-11ee-b39d-0a580a838148 @ 12/04/23 17:45:28.805 2023/12/04 17:45:28 Wait until backup todolist-backup-e9c37b6f-92cc-11ee-b39d-0a580a838148 is completed backup phase: Completed STEP: Verify backup todolist-backup-e9c37b6f-92cc-11ee-b39d-0a580a838148 has completed successfully @ 12/04/23 17:45:48.825 2023/12/04 17:45:48 Backup for case todolist-backup succeeded STEP: Scale application @ 12/04/23 17:45:48.862 2023/12/04 17:45:48 Scaling deployment 'todolist' to 2 replicas 2023/12/04 17:45:48 Deployment updated successfully 2023/12/04 17:45:48 number of running pods: 1 2023/12/04 17:45:53 number of running pods: 1 2023/12/04 17:45:58 number of running pods: 1 2023/12/04 17:46:03 number of running pods: 1 2023/12/04 17:46:08 Application reached target number of replicas: 2 STEP: Prepare backup resources, depending on the volumes backup type @ 12/04/23 17:46:08.907 2023/12/04 17:46:08 Checking for correct number of running NodeAgent pods... STEP: Creating backup todolist-backup-01ac3767-92cd-11ee-b39d-0a580a838148 @ 12/04/23 17:46:08.923 2023/12/04 17:46:08 Wait until backup todolist-backup-01ac3767-92cd-11ee-b39d-0a580a838148 is completed backup phase: Completed STEP: Verify backup todolist-backup-01ac3767-92cd-11ee-b39d-0a580a838148 has completed successfully @ 12/04/23 17:46:28.947 2023/12/04 17:46:29 Backup for case todolist-backup succeeded STEP: Cleanup application and restore 1st backup @ 12/04/23 17:46:29.003 STEP: Delete the application resources todolist-backup @ 12/04/23 17:46:29.003 [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Remove namespace todolist-mariadb-restic] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Remove todolist-mariadb-restic SCC] *** changed: [localhost] PLAY RECAP ********************************************************************* localhost : ok=12  changed=5  unreachable=0 failed=0 skipped=17  rescued=0 ignored=0
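Right after this cleanup the suite restores the first backup (see the "Creating restore todolist-backup-e9c37b6f-92cc-11ee-b39d-0a580a838148" entries that follow). A minimal sketch of the corresponding Velero Restore CR, with the names taken from the log and the remaining fields assumed:

apiVersion: velero.io/v1
kind: Restore
metadata:
  name: todolist-backup-e9c37b6f-92cc-11ee-b39d-0a580a838148
  namespace: openshift-adp              # assumed OADP install namespace
spec:
  backupName: todolist-backup-e9c37b6f-92cc-11ee-b39d-0a580a838148
  restorePVs: true                      # assumed: also restore the MariaDB volume data

The "restore phase: InProgress ... Completed" entries below are the suite polling .status.phase of this Restore until it finishes.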
2023/12/04 17:46:48 2023-12-04 17:46:30,487 p=33701 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:46:30,487 p=33701 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:46:30,751 p=33701 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:46:30,751 p=33701 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:46:30,973 p=33701 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:46:30,973 p=33701 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:46:30,987 p=33701 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:46:30,987 p=33701 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:46:31,269 p=33701 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:46:31,269 p=33701 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:46:31,293 p=33701 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:46:31,294 p=33701 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:46:31,315 p=33701 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:46:31,315 p=33701 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:46:31,327 p=33701 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:46:31,870 p=33701 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:46:31,870 p=33701 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:46:47,684 p=33701 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Remove namespace todolist-mariadb-restic] *** 2023-12-04 17:46:47,684 p=33701 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:46:48,435 p=33701 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Remove todolist-mariadb-restic SCC] *** 2023-12-04 17:46:48,435 p=33701 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:46:48,771 p=33701 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:46:48,772 p=33701 u=1008320000 n=ansible | localhost : ok=12 changed=5 unreachable=0 failed=0 skipped=17 rescued=0 ignored=0 2023/12/04 17:46:48 Creating restore todolist-backup-e9c37b6f-92cc-11ee-b39d-0a580a838148 for case todolist-backup-e9c37b6f-92cc-11ee-b39d-0a580a838148 STEP: Create restore todolist-backup-e9c37b6f-92cc-11ee-b39d-0a580a838148 from backup todolist-backup-e9c37b6f-92cc-11ee-b39d-0a580a838148 @ 12/04/23 17:46:48.816 2023/12/04 17:46:48 Wait until restore todolist-backup-e9c37b6f-92cc-11ee-b39d-0a580a838148 is complete restore phase: InProgress restore phase: InProgress restore phase: InProgress restore phase: Completed STEP: Verify restore todolist-backup-e9c37b6f-92cc-11ee-b39d-0a580a838148 has completed successfully @ 12/04/23 17:47:28.9 STEP: Verify Application restore @ 12/04/23 17:47:28.906 [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty,
Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Check mysql pod is running] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Wait until mysql service ready for connections] *** changed: [localhost] FAILED - RETRYING: [localhost]: Check todolist pod is running (30 retries left). TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Check todolist pod is running] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Wait until todolist API server starts] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Obtain todolist route] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Find 1st database item] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Find the string in incomplete items] *** ok: [localhost] PLAY RECAP ********************************************************************* localhost : ok=17  changed=5  unreachable=0 failed=0 skipped=12  rescued=0 ignored=0 2023/12/04 17:47:41 2023-12-04 17:47:30,291 p=33923 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:47:30,292 p=33923 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:47:30,515 p=33923 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:47:30,516 p=33923 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:47:30,731 p=33923 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:47:30,731 p=33923 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:47:30,744 p=33923 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:47:30,744 p=33923 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:47:31,017 p=33923 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:47:31,017 
p=33923 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:47:31,042 p=33923 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:47:31,042 p=33923 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:47:31,063 p=33923 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:47:31,063 p=33923 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:47:31,073 p=33923 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:47:31,592 p=33923 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:47:31,593 p=33923 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:47:32,704 p=33923 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Check mysql pod is running] *** 2023-12-04 17:47:32,704 p=33923 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:47:32,998 p=33923 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Wait until mysql service ready for connections] *** 2023-12-04 17:47:32,998 p=33923 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:47:39,281 p=33923 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Check todolist pod is running] *** 2023-12-04 17:47:39,282 p=33923 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:47:39,546 p=33923 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Wait until todolist API server starts] *** 2023-12-04 17:47:39,546 p=33923 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:47:40,392 p=33923 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Obtain todolist route] *** 2023-12-04 17:47:40,392 p=33923 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:47:40,867 p=33923 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Find 1st database item] *** 2023-12-04 17:47:40,867 p=33923 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:47:41,171 p=33923 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Find the string in incomplete items] *** 2023-12-04 17:47:41,171 p=33923 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:47:41,184 p=33923 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:47:41,184 p=33923 u=1008320000 n=ansible | localhost : ok=17 changed=5 unreachable=0 failed=0 skipped=12 rescued=0 ignored=0 2023/12/04 17:47:41 Application reached target number of replicas: 1 STEP: Restore 2nd backup with existingResourcePolicy: none @ 12/04/23 17:47:41.252 2023/12/04 17:47:41 Creating restore todolist-backup-01ac3767-92cd-11ee-b39d-0a580a838148 for case todolist-backup-01ac3767-92cd-11ee-b39d-0a580a838148 STEP: Create restore todolist-backup-01ac3767-92cd-11ee-b39d-0a580a838148 from backup todolist-backup-01ac3767-92cd-11ee-b39d-0a580a838148 @ 12/04/23 17:47:41.252 2023/12/04 17:47:41 Wait until restore todolist-backup-01ac3767-92cd-11ee-b39d-0a580a838148 is complete restore phase: InProgress restore phase: Completed
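The second restore above is created with existingResourcePolicy: none, so Velero skips resources that already exist in the cluster rather than patching them (the other accepted value, update, attempts to patch existing resources to match the backup). As a sketch, the only difference from the earlier Restore CR is the extra spec field; everything outside spec.backupName is assumed as before:

spec:
  backupName: todolist-backup-01ac3767-92cd-11ee-b39d-0a580a838148
  existingResourcePolicy: none          # leave already-existing resources untouched; 'update' would patch them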
STEP: Verify restore todolist-backup-01ac3767-92cd-11ee-b39d-0a580a838148 has completed successfully @ 12/04/23 17:48:01.289 STEP: Verify Application restore @ 12/04/23 17:48:01.294 [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, only localhost is available. Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Check mysql pod is running] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Wait until mysql service ready for connections] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Check todolist pod is running] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Wait until todolist API server starts] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Obtain todolist route] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Find 1st database item] *** ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Find the string in incomplete items] *** ok: [localhost] PLAY RECAP ********************************************************************* localhost : ok=17  changed=5  unreachable=0 failed=0 skipped=12  rescued=0 ignored=0 2023/12/04 17:48:07 2023-12-04 17:48:02,634 p=34264 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:48:02,634 p=34264 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:48:02,834 p=34264 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:48:02,834 p=34264 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:48:03,039 p=34264 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:48:03,039 p=34264 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:48:03,051 p=34264 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04
17:48:03,051 p=34264 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:48:03,328 p=34264 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:48:03,328 p=34264 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:48:03,351 p=34264 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:48:03,351 p=34264 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:48:03,371 p=34264 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:48:03,371 p=34264 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:48:03,380 p=34264 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:48:03,901 p=34264 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 2023-12-04 17:48:03,901 p=34264 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:48:05,006 p=34264 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Check mysql pod is running] *** 2023-12-04 17:48:05,007 p=34264 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:48:05,277 p=34264 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Wait until mysql service ready for connections] *** 2023-12-04 17:48:05,277 p=34264 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:48:05,927 p=34264 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Check todolist pod is running] *** 2023-12-04 17:48:05,927 p=34264 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:48:06,183 p=34264 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Wait until todolist API server starts] *** 2023-12-04 17:48:06,183 p=34264 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:48:06,987 p=34264 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Obtain todolist route] *** 2023-12-04 17:48:06,987 p=34264 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:48:07,380 p=34264 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Find 1st database item] *** 2023-12-04 17:48:07,381 p=34264 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:48:07,648 p=34264 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Find the string in incomplete items] *** 2023-12-04 17:48:07,648 p=34264 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:48:07,659 p=34264 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:48:07,659 p=34264 u=1008320000 n=ansible | localhost : ok=17 changed=5 unreachable=0 failed=0 skipped=12 rescued=0 ignored=0 2023/12/04 17:48:07 Application reached target number of replicas: 1 2023/12/04 17:48:07 Using Must-gather image: registry.redhat.io/oadp/oadp-mustgather-rhel9:1.3.0 2023/12/04 17:48:07 Cleaning setup resources for the backup 2023/12/04 17:48:07 Cleaning setup resources for the backup 2023/12/04 17:48:07 Cleaning app [WARNING]: No inventory was parsed, only implicit localhost is available [WARNING]: provided hosts list is empty, 
only localhost is available. Note that the implicit localhost does not match 'all' [WARNING]: Found variable using reserved name: namespace PLAY [localhost] *************************************************************** TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [include_vars] ************************************************************ ok: [localhost] TASK [Remove all the contents from the file] *********************************** changed: [localhost] TASK [Get cluster endpoint] **************************************************** changed: [localhost] TASK [Get current admin token] ************************************************* changed: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] TASK [Extract kubernetes minor version from cluster_info] ********************** ok: [localhost] TASK [Map kubernetes minor to ocp releases] ************************************ ok: [localhost] TASK [set_fact] **************************************************************** ok: [localhost] PLAY [Execute Task] ************************************************************ TASK [Gathering Facts] ********************************************************* ok: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Remove namespace todolist-mariadb-restic] *** changed: [localhost] TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Remove todolist-mariadb-restic SCC] *** changed: [localhost] PLAY RECAP ********************************************************************* localhost : ok=12  changed=5  unreachable=0 failed=0 skipped=17  rescued=0 ignored=0 2023/12/04 17:48:27 2023-12-04 17:48:09,089 p=34593 u=1008320000 n=ansible | TASK [Remove all the contents from the file] *********************************** 2023-12-04 17:48:09,089 p=34593 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:48:09,307 p=34593 u=1008320000 n=ansible | TASK [Get cluster endpoint] **************************************************** 2023-12-04 17:48:09,308 p=34593 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:48:09,537 p=34593 u=1008320000 n=ansible | TASK [Get current admin token] ************************************************* 2023-12-04 17:48:09,537 p=34593 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:48:09,550 p=34593 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:48:09,550 p=34593 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:48:09,848 p=34593 u=1008320000 n=ansible | TASK [Extract kubernetes minor version from cluster_info] ********************** 2023-12-04 17:48:09,848 p=34593 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:48:09,872 p=34593 u=1008320000 n=ansible | TASK [Map kubernetes minor to ocp releases] ************************************ 2023-12-04 17:48:09,872 p=34593 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:48:09,896 p=34593 u=1008320000 n=ansible | TASK [set_fact] **************************************************************** 2023-12-04 17:48:09,896 p=34593 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:48:09,909 p=34593 u=1008320000 n=ansible | PLAY [Execute Task] ************************************************************ 2023-12-04 17:48:10,478 p=34593 u=1008320000 n=ansible | TASK [Gathering Facts] ********************************************************* 
2023-12-04 17:48:10,478 p=34593 u=1008320000 n=ansible | ok: [localhost] 2023-12-04 17:48:26,252 p=34593 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Remove namespace todolist-mariadb-restic] *** 2023-12-04 17:48:26,252 p=34593 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:48:27,072 p=34593 u=1008320000 n=ansible | TASK [/alabama/cspi/sample-applications/ocpdeployer/ansible/roles/ocp-todolist-mariadb : Remove todolist-mariadb-restic SCC] *** 2023-12-04 17:48:27,073 p=34593 u=1008320000 n=ansible | changed: [localhost] 2023-12-04 17:48:27,448 p=34593 u=1008320000 n=ansible | PLAY RECAP ********************************************************************* 2023-12-04 17:48:27,448 p=34593 u=1008320000 n=ansible | localhost : ok=12 changed=5 unreachable=0 failed=0 skipped=17 rescued=0 ignored=0 • [251.597 seconds] ------------------------------ SSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSSS ------------------------------ [SynchronizedAfterSuite]  /alabama/cspi/e2e/e2e_suite_test.go:159 2023/12/04 17:48:27 Deleting Velero CR [SynchronizedAfterSuite] PASSED [0.010 seconds] ------------------------------ [DeferCleanup (Suite)]  /alabama/cspi/utils/subscription/subscription.go:216 I1204 17:48:28.604682 22324 request.go:690] Waited for 1.045531337s due to client-side throttling, not priority and fairness, request: GET:https://api.ci-op-24wp7hk6-2e88b.cspilp.interop.ccitredhat.com:6443/apis/autoscaling.openshift.io/v1?timeout=32s 2023/12/04 17:48:30 Successfully got the OADP ClusterServiceVersion name: oadp-operator.v1.3.0 2023/12/04 17:48:32 Successfully updated the OADP ClusterServiceVersion name: oadp-operator.v1.3.0 [DeferCleanup (Suite)] PASSED [5.259 seconds] ------------------------------ [ReportAfterSuite] Autogenerated ReportAfterSuite for --junit-report autogenerated by Ginkgo [ReportAfterSuite] PASSED [0.009 seconds] ------------------------------ Summarizing 1 Failure: [FAIL] Backup restore tests Application backup [It] [tc-id:OADP-198][test-upstream][smoke] Different labels selector: Backup and Restore with multiple matched labels [orLabelSelectors] [labels] /alabama/cspi/test_common/backup_restore_case.go:125 Ran 7 of 124 Specs in 1937.336 seconds FAIL! -- 6 Passed | 1 Failed | 1 Flaked | 0 Pending | 117 Skipped --- FAIL: TestOADPE2E (1937.35s) FAIL Ginkgo ran 1 suite in 33m9.183702748s Test Suite Failed Copying /alabama/cspi/e2e/junit_report.xml to /logs/artifacts/junit_oadp_interop_results.xml...
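The single failure summarized above is the [orLabelSelectors] spec, which exercises the Backup.spec.orLabelSelectors field: a resource is included in the backup if it matches any one of the listed label selectors. A sketch of the kind of spec that test drives, with hypothetical names and label values:

apiVersion: velero.io/v1
kind: Backup
metadata:
  name: multiple-matched-labels-backup  # hypothetical name
  namespace: openshift-adp              # assumed OADP install namespace
spec:
  orLabelSelectors:                     # include resources matching ANY of these selectors
    - matchLabels:
        app: todolist                   # hypothetical label
    - matchLabels:
        app: mysql                      # hypothetical label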