Moving oc binary to /usr/bin/oc Extracting cluster data, mtc-apps-deployer, and mtc-python-client. ./ ./aws/ ./aws/mtc-source-4bze/ ./aws/mtc-source-4bze/auth/ ./aws/mtc-source-4bze/auth/kubeconfig ./aws/mtc-source-4bze/auth/kubeadmin-password ./aws/mtc-source-4bze/cluster_data.yaml ./aws/mtc-target-nzvv/ ./aws/mtc-target-nzvv/auth/ ./aws/mtc-target-nzvv/auth/kubeconfig ./aws/mtc-target-nzvv/auth/kubeadmin-password ./aws/mtc-target-nzvv/cluster_data.yaml Creating Python virtual environment Installing mtc-apps-deployer and mtc-python-client. Collecting pytest (from -r /mtc-e2e-qev2/requirements.txt (line 1)) Downloading pytest-8.1.1-py3-none-any.whl.metadata (7.6 kB) Collecting pytest-rerunfailures (from -r /mtc-e2e-qev2/requirements.txt (line 2)) Downloading pytest_rerunfailures-14.0-py3-none-any.whl.metadata (18 kB) Collecting ansible==4.4.0 (from -r /mtc-e2e-qev2/requirements.txt (line 3)) Downloading ansible-4.4.0.tar.gz (35.4 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 35.4/35.4 MB 1.7 MB/s eta 0:00:00 Installing build dependencies: started Installing build dependencies: finished with status 'done' Getting requirements to build wheel: started Getting requirements to build wheel: finished with status 'done' Installing backend dependencies: started Installing backend dependencies: finished with status 'done' Preparing metadata (pyproject.toml): started Preparing metadata (pyproject.toml): finished with status 'done' Collecting ansible-runner (from -r /mtc-e2e-qev2/requirements.txt (line 4)) Downloading ansible_runner-2.3.6-py3-none-any.whl.metadata (3.5 kB) Collecting jmespath (from -r /mtc-e2e-qev2/requirements.txt (line 5)) Downloading jmespath-1.0.1-py3-none-any.whl.metadata (7.6 kB) Collecting requests (from -r /mtc-e2e-qev2/requirements.txt (line 6)) Downloading requests-2.31.0-py3-none-any.whl.metadata (4.6 kB) Collecting pre-commit (from -r /mtc-e2e-qev2/requirements.txt (line 7)) Downloading pre_commit-3.7.0-py2.py3-none-any.whl.metadata (1.3 kB) Collecting ansible-core<2.12,>=2.11.3 (from ansible==4.4.0->-r /mtc-e2e-qev2/requirements.txt (line 3)) Downloading ansible-core-2.11.12.tar.gz (7.1 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 7.1/7.1 MB 5.1 MB/s eta 0:00:00 Installing build dependencies: started Installing build dependencies: finished with status 'done' Getting requirements to build wheel: started Getting requirements to build wheel: finished with status 'done' Installing backend dependencies: started Installing backend dependencies: finished with status 'done' Preparing metadata (pyproject.toml): started Preparing metadata (pyproject.toml): finished with status 'done' Collecting iniconfig (from pytest->-r /mtc-e2e-qev2/requirements.txt (line 1)) Downloading iniconfig-2.0.0-py3-none-any.whl.metadata (2.6 kB) Collecting packaging (from pytest->-r /mtc-e2e-qev2/requirements.txt (line 1)) Downloading packaging-24.0-py3-none-any.whl.metadata (3.2 kB) Collecting pluggy<2.0,>=1.4 (from pytest->-r /mtc-e2e-qev2/requirements.txt (line 1)) Downloading pluggy-1.4.0-py3-none-any.whl.metadata (4.3 kB) Collecting pexpect>=4.5 (from ansible-runner->-r /mtc-e2e-qev2/requirements.txt (line 4)) Downloading pexpect-4.9.0-py2.py3-none-any.whl.metadata (2.5 kB) Collecting python-daemon (from ansible-runner->-r /mtc-e2e-qev2/requirements.txt (line 4)) Downloading python_daemon-3.0.1-py3-none-any.whl.metadata (2.2 kB) Collecting pyyaml (from ansible-runner->-r /mtc-e2e-qev2/requirements.txt (line 4)) Downloading 
PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (2.1 kB) Collecting six (from ansible-runner->-r /mtc-e2e-qev2/requirements.txt (line 4)) Downloading six-1.16.0-py2.py3-none-any.whl.metadata (1.8 kB) Collecting charset-normalizer<4,>=2 (from requests->-r /mtc-e2e-qev2/requirements.txt (line 6)) Downloading charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (33 kB) Collecting idna<4,>=2.5 (from requests->-r /mtc-e2e-qev2/requirements.txt (line 6)) Downloading idna-3.6-py3-none-any.whl.metadata (9.9 kB) Collecting urllib3<3,>=1.21.1 (from requests->-r /mtc-e2e-qev2/requirements.txt (line 6)) Downloading urllib3-2.2.1-py3-none-any.whl.metadata (6.4 kB) Collecting certifi>=2017.4.17 (from requests->-r /mtc-e2e-qev2/requirements.txt (line 6)) Downloading certifi-2024.2.2-py3-none-any.whl.metadata (2.2 kB) Collecting cfgv>=2.0.0 (from pre-commit->-r /mtc-e2e-qev2/requirements.txt (line 7)) Downloading cfgv-3.4.0-py2.py3-none-any.whl.metadata (8.5 kB) Collecting identify>=1.0.0 (from pre-commit->-r /mtc-e2e-qev2/requirements.txt (line 7)) Downloading identify-2.5.35-py2.py3-none-any.whl.metadata (4.4 kB) Collecting nodeenv>=0.11.1 (from pre-commit->-r /mtc-e2e-qev2/requirements.txt (line 7)) Downloading nodeenv-1.8.0-py2.py3-none-any.whl.metadata (21 kB) Collecting virtualenv>=20.10.0 (from pre-commit->-r /mtc-e2e-qev2/requirements.txt (line 7)) Downloading virtualenv-20.25.1-py3-none-any.whl.metadata (4.4 kB) Collecting jinja2 (from ansible-core<2.12,>=2.11.3->ansible==4.4.0->-r /mtc-e2e-qev2/requirements.txt (line 3)) Downloading Jinja2-3.1.3-py3-none-any.whl.metadata (3.3 kB) Collecting cryptography (from ansible-core<2.12,>=2.11.3->ansible==4.4.0->-r /mtc-e2e-qev2/requirements.txt (line 3)) Downloading cryptography-42.0.5-cp39-abi3-manylinux_2_28_x86_64.whl.metadata (5.3 kB) Collecting resolvelib<0.6.0,>=0.5.3 (from ansible-core<2.12,>=2.11.3->ansible==4.4.0->-r /mtc-e2e-qev2/requirements.txt (line 3)) Downloading resolvelib-0.5.4-py2.py3-none-any.whl.metadata (3.7 kB) Collecting setuptools (from nodeenv>=0.11.1->pre-commit->-r /mtc-e2e-qev2/requirements.txt (line 7)) Using cached setuptools-69.2.0-py3-none-any.whl.metadata (6.3 kB) Collecting ptyprocess>=0.5 (from pexpect>=4.5->ansible-runner->-r /mtc-e2e-qev2/requirements.txt (line 4)) Downloading ptyprocess-0.7.0-py2.py3-none-any.whl.metadata (1.3 kB) Collecting distlib<1,>=0.3.7 (from virtualenv>=20.10.0->pre-commit->-r /mtc-e2e-qev2/requirements.txt (line 7)) Downloading distlib-0.3.8-py2.py3-none-any.whl.metadata (5.1 kB) Collecting filelock<4,>=3.12.2 (from virtualenv>=20.10.0->pre-commit->-r /mtc-e2e-qev2/requirements.txt (line 7)) Downloading filelock-3.13.3-py3-none-any.whl.metadata (2.8 kB) Collecting platformdirs<5,>=3.9.1 (from virtualenv>=20.10.0->pre-commit->-r /mtc-e2e-qev2/requirements.txt (line 7)) Downloading platformdirs-4.2.0-py3-none-any.whl.metadata (11 kB) Collecting docutils (from python-daemon->ansible-runner->-r /mtc-e2e-qev2/requirements.txt (line 4)) Downloading docutils-0.20.1-py3-none-any.whl.metadata (2.8 kB) Collecting lockfile>=0.10 (from python-daemon->ansible-runner->-r /mtc-e2e-qev2/requirements.txt (line 4)) Downloading lockfile-0.12.2-py2.py3-none-any.whl.metadata (2.4 kB) Collecting cffi>=1.12 (from cryptography->ansible-core<2.12,>=2.11.3->ansible==4.4.0->-r /mtc-e2e-qev2/requirements.txt (line 3)) Downloading cffi-1.16.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (1.5 kB) Collecting MarkupSafe>=2.0 
(from jinja2->ansible-core<2.12,>=2.11.3->ansible==4.4.0->-r /mtc-e2e-qev2/requirements.txt (line 3)) Downloading MarkupSafe-2.1.5-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl.metadata (3.0 kB) Collecting pycparser (from cffi>=1.12->cryptography->ansible-core<2.12,>=2.11.3->ansible==4.4.0->-r /mtc-e2e-qev2/requirements.txt (line 3)) Downloading pycparser-2.22-py3-none-any.whl.metadata (943 bytes) Downloading pytest-8.1.1-py3-none-any.whl (337 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 337.4/337.4 kB 5.6 MB/s eta 0:00:00 Downloading pytest_rerunfailures-14.0-py3-none-any.whl (12 kB) Downloading ansible_runner-2.3.6-py3-none-any.whl (81 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 81.6/81.6 kB 7.2 MB/s eta 0:00:00 Downloading jmespath-1.0.1-py3-none-any.whl (20 kB) Downloading requests-2.31.0-py3-none-any.whl (62 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 62.6/62.6 kB 6.1 MB/s eta 0:00:00 Downloading pre_commit-3.7.0-py2.py3-none-any.whl (204 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 204.2/204.2 kB 5.0 MB/s eta 0:00:00 Downloading certifi-2024.2.2-py3-none-any.whl (163 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 163.8/163.8 kB 13.5 MB/s eta 0:00:00 Downloading cfgv-3.4.0-py2.py3-none-any.whl (7.2 kB) Downloading charset_normalizer-3.3.2-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (140 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 140.3/140.3 kB 12.6 MB/s eta 0:00:00 Downloading identify-2.5.35-py2.py3-none-any.whl (98 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 98.9/98.9 kB 8.7 MB/s eta 0:00:00 Downloading idna-3.6-py3-none-any.whl (61 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 61.6/61.6 kB 5.7 MB/s eta 0:00:00 Downloading nodeenv-1.8.0-py2.py3-none-any.whl (22 kB) Downloading packaging-24.0-py3-none-any.whl (53 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 53.5/53.5 kB 4.6 MB/s eta 0:00:00 Downloading pexpect-4.9.0-py2.py3-none-any.whl (63 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 63.8/63.8 kB 6.2 MB/s eta 0:00:00 Downloading pluggy-1.4.0-py3-none-any.whl (20 kB) Downloading PyYAML-6.0.1-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (757 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 757.7/757.7 kB 41.2 MB/s eta 0:00:00 Downloading urllib3-2.2.1-py3-none-any.whl (121 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 121.1/121.1 kB 3.1 MB/s eta 0:00:00 Downloading virtualenv-20.25.1-py3-none-any.whl (3.8 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 3.8/3.8 MB 62.0 MB/s eta 0:00:00 Downloading iniconfig-2.0.0-py3-none-any.whl (5.9 kB) Downloading python_daemon-3.0.1-py3-none-any.whl (31 kB) Downloading six-1.16.0-py2.py3-none-any.whl (11 kB) Downloading distlib-0.3.8-py2.py3-none-any.whl (468 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 468.9/468.9 kB 28.4 MB/s eta 0:00:00 Downloading filelock-3.13.3-py3-none-any.whl (11 kB) Downloading lockfile-0.12.2-py2.py3-none-any.whl (13 kB) Downloading platformdirs-4.2.0-py3-none-any.whl (17 kB) Downloading ptyprocess-0.7.0-py2.py3-none-any.whl (13 kB) Downloading resolvelib-0.5.4-py2.py3-none-any.whl (12 kB) Using cached setuptools-69.2.0-py3-none-any.whl (821 kB) Downloading cryptography-42.0.5-cp39-abi3-manylinux_2_28_x86_64.whl (4.6 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 4.6/4.6 MB 69.6 MB/s eta 0:00:00 Downloading docutils-0.20.1-py3-none-any.whl (572 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 572.7/572.7 kB 18.1 MB/s eta 0:00:00 Downloading Jinja2-3.1.3-py3-none-any.whl (133 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 133.2/133.2 kB 10.7 MB/s eta 0:00:00 Downloading 
cffi-1.16.0-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (464 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 464.8/464.8 kB 30.8 MB/s eta 0:00:00 Downloading MarkupSafe-2.1.5-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (28 kB) Downloading pycparser-2.22-py3-none-any.whl (117 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 117.6/117.6 kB 8.9 MB/s eta 0:00:00 Building wheels for collected packages: ansible, ansible-core Building wheel for ansible (pyproject.toml): started Building wheel for ansible (pyproject.toml): still running... Building wheel for ansible (pyproject.toml): finished with status 'done' Created wheel for ansible: filename=ansible-4.4.0-py3-none-any.whl size=58221786 sha256=c3b94960fed2f8c58208be3f235fb67b2d9b793d793544c6c1f47e383d1039e2 Stored in directory: /alabama/.cache/pip/wheels/a9/4b/4c/a8a1d660e94dd77f527528cb79fb998e7fecd5caf8366e1e0d Building wheel for ansible-core (pyproject.toml): started Building wheel for ansible-core (pyproject.toml): finished with status 'done' Created wheel for ansible-core: filename=ansible_core-2.11.12-py3-none-any.whl size=1960954 sha256=b18501892e641809d2a91342712de07d47b971611582ad3736720714d36a0d23 Stored in directory: /alabama/.cache/pip/wheels/3d/3e/04/62bf38af3a3bb2162e12579c66440d1800e11e1f42572ff9d0 Successfully built ansible ansible-core Installing collected packages: resolvelib, ptyprocess, lockfile, distlib, urllib3, six, setuptools, pyyaml, pycparser, pluggy, platformdirs, pexpect, packaging, MarkupSafe, jmespath, iniconfig, idna, identify, filelock, docutils, charset-normalizer, cfgv, certifi, virtualenv, requests, python-daemon, pytest, nodeenv, jinja2, cffi, pytest-rerunfailures, pre-commit, cryptography, ansible-runner, ansible-core, ansible Successfully installed MarkupSafe-2.1.5 ansible-4.4.0 ansible-core-2.11.12 ansible-runner-2.3.6 certifi-2024.2.2 cffi-1.16.0 cfgv-3.4.0 charset-normalizer-3.3.2 cryptography-42.0.5 distlib-0.3.8 docutils-0.20.1 filelock-3.13.3 identify-2.5.35 idna-3.6 iniconfig-2.0.0 jinja2-3.1.3 jmespath-1.0.1 lockfile-0.12.2 nodeenv-1.8.0 packaging-24.0 pexpect-4.9.0 platformdirs-4.2.0 pluggy-1.4.0 pre-commit-3.7.0 ptyprocess-0.7.0 pycparser-2.22 pytest-8.1.1 pytest-rerunfailures-14.0 python-daemon-3.0.1 pyyaml-6.0.1 requests-2.31.0 resolvelib-0.5.4 setuptools-65.5.0 six-1.16.0 urllib3-2.2.1 virtualenv-20.25.1 Processing /mtc-apps-deployer Installing build dependencies: started Installing build dependencies: finished with status 'done' Getting requirements to build wheel: started Getting requirements to build wheel: finished with status 'done' Preparing metadata (pyproject.toml): started Preparing metadata (pyproject.toml): finished with status 'done' Building wheels for collected packages: ocpdeployer Building wheel for ocpdeployer (pyproject.toml): started Building wheel for ocpdeployer (pyproject.toml): finished with status 'done' Created wheel for ocpdeployer: filename=ocpdeployer-0.0.1-py2.py3-none-any.whl size=87303 sha256=a430dfeaa10607f27dc2b8fa9cd2934cf81bdaf421d3e30b6eaa549417ec9e4a Stored in directory: /alabama/.cache/pip/wheels/a5/d5/1e/0e1b7c5d0565ce18da090e556f5635431975eb81d49e457e4e Successfully built ocpdeployer Installing collected packages: ocpdeployer Successfully installed ocpdeployer-0.0.1 Processing /mtc-python-client Installing build dependencies: started Installing build dependencies: finished with status 'done' Getting requirements to build wheel: started Getting requirements to build wheel: finished with status 'done' Installing backend dependencies: 
started Installing backend dependencies: finished with status 'done' Preparing metadata (pyproject.toml): started Preparing metadata (pyproject.toml): finished with status 'done' Collecting suds-py3 (from mtc==0.0.1) Downloading suds_py3-1.4.5.0-py3-none-any.whl.metadata (778 bytes) Requirement already satisfied: requests in /mtc-e2e-qev2/venv/lib/python3.11/site-packages (from mtc==0.0.1) (2.31.0) Requirement already satisfied: jinja2 in /mtc-e2e-qev2/venv/lib/python3.11/site-packages (from mtc==0.0.1) (3.1.3) Collecting kubernetes==11.0.0 (from mtc==0.0.1) Downloading kubernetes-11.0.0-py3-none-any.whl.metadata (1.5 kB) Collecting openshift==0.11.2 (from mtc==0.0.1) Downloading openshift-0.11.2.tar.gz (19 kB) Installing build dependencies: started Installing build dependencies: finished with status 'done' Getting requirements to build wheel: started Getting requirements to build wheel: finished with status 'done' Installing backend dependencies: started Installing backend dependencies: finished with status 'done' Preparing metadata (pyproject.toml): started Preparing metadata (pyproject.toml): finished with status 'done' Requirement already satisfied: certifi>=14.05.14 in /mtc-e2e-qev2/venv/lib/python3.11/site-packages (from kubernetes==11.0.0->mtc==0.0.1) (2024.2.2) Requirement already satisfied: six>=1.9.0 in /mtc-e2e-qev2/venv/lib/python3.11/site-packages (from kubernetes==11.0.0->mtc==0.0.1) (1.16.0) Collecting python-dateutil>=2.5.3 (from kubernetes==11.0.0->mtc==0.0.1) Downloading python_dateutil-2.9.0.post0-py2.py3-none-any.whl.metadata (8.4 kB) Requirement already satisfied: setuptools>=21.0.0 in /mtc-e2e-qev2/venv/lib/python3.11/site-packages (from kubernetes==11.0.0->mtc==0.0.1) (65.5.0) Requirement already satisfied: pyyaml>=3.12 in /mtc-e2e-qev2/venv/lib/python3.11/site-packages (from kubernetes==11.0.0->mtc==0.0.1) (6.0.1) Collecting google-auth>=1.0.1 (from kubernetes==11.0.0->mtc==0.0.1) Downloading google_auth-2.29.0-py2.py3-none-any.whl.metadata (4.7 kB) Collecting websocket-client!=0.40.0,!=0.41.*,!=0.42.*,>=0.32.0 (from kubernetes==11.0.0->mtc==0.0.1) Downloading websocket_client-1.7.0-py3-none-any.whl.metadata (7.9 kB) Collecting requests-oauthlib (from kubernetes==11.0.0->mtc==0.0.1) Downloading requests_oauthlib-2.0.0-py2.py3-none-any.whl.metadata (11 kB) Requirement already satisfied: urllib3>=1.24.2 in /mtc-e2e-qev2/venv/lib/python3.11/site-packages (from kubernetes==11.0.0->mtc==0.0.1) (2.2.1) Collecting python-string-utils (from openshift==0.11.2->mtc==0.0.1) Downloading python_string_utils-1.0.0-py3-none-any.whl.metadata (12 kB) Collecting ruamel.yaml>=0.15 (from openshift==0.11.2->mtc==0.0.1) Downloading ruamel.yaml-0.18.6-py3-none-any.whl.metadata (23 kB) Requirement already satisfied: MarkupSafe>=2.0 in /mtc-e2e-qev2/venv/lib/python3.11/site-packages (from jinja2->mtc==0.0.1) (2.1.5) Requirement already satisfied: charset-normalizer<4,>=2 in /mtc-e2e-qev2/venv/lib/python3.11/site-packages (from requests->mtc==0.0.1) (3.3.2) Requirement already satisfied: idna<4,>=2.5 in /mtc-e2e-qev2/venv/lib/python3.11/site-packages (from requests->mtc==0.0.1) (3.6) Collecting cachetools<6.0,>=2.0.0 (from google-auth>=1.0.1->kubernetes==11.0.0->mtc==0.0.1) Downloading cachetools-5.3.3-py3-none-any.whl.metadata (5.3 kB) Collecting pyasn1-modules>=0.2.1 (from google-auth>=1.0.1->kubernetes==11.0.0->mtc==0.0.1) Downloading pyasn1_modules-0.4.0-py3-none-any.whl.metadata (3.4 kB) Collecting rsa<5,>=3.1.4 (from google-auth>=1.0.1->kubernetes==11.0.0->mtc==0.0.1) Downloading 
rsa-4.9-py3-none-any.whl.metadata (4.2 kB) Collecting ruamel.yaml.clib>=0.2.7 (from ruamel.yaml>=0.15->openshift==0.11.2->mtc==0.0.1) Downloading ruamel.yaml.clib-0.2.8-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl.metadata (2.2 kB) Collecting oauthlib>=3.0.0 (from requests-oauthlib->kubernetes==11.0.0->mtc==0.0.1) Downloading oauthlib-3.2.2-py3-none-any.whl.metadata (7.5 kB) Collecting pyasn1<0.7.0,>=0.4.6 (from pyasn1-modules>=0.2.1->google-auth>=1.0.1->kubernetes==11.0.0->mtc==0.0.1) Downloading pyasn1-0.6.0-py2.py3-none-any.whl.metadata (8.3 kB) Downloading kubernetes-11.0.0-py3-none-any.whl (1.5 MB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 1.5/1.5 MB 14.8 MB/s eta 0:00:00 Downloading suds_py3-1.4.5.0-py3-none-any.whl (298 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 298.8/298.8 kB 7.0 MB/s eta 0:00:00 Downloading google_auth-2.29.0-py2.py3-none-any.whl (189 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 189.2/189.2 kB 2.3 MB/s eta 0:00:00 Downloading python_dateutil-2.9.0.post0-py2.py3-none-any.whl (229 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 229.9/229.9 kB 573.5 kB/s eta 0:00:00 Downloading ruamel.yaml-0.18.6-py3-none-any.whl (117 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 117.8/117.8 kB 969.3 kB/s eta 0:00:00 Downloading websocket_client-1.7.0-py3-none-any.whl (58 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 58.5/58.5 kB 258.1 kB/s eta 0:00:00 Downloading python_string_utils-1.0.0-py3-none-any.whl (26 kB) Downloading requests_oauthlib-2.0.0-py2.py3-none-any.whl (24 kB) Downloading cachetools-5.3.3-py3-none-any.whl (9.3 kB) Downloading oauthlib-3.2.2-py3-none-any.whl (151 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 151.7/151.7 kB 5.7 MB/s eta 0:00:00 Downloading pyasn1_modules-0.4.0-py3-none-any.whl (181 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 181.2/181.2 kB 2.9 MB/s eta 0:00:00 Downloading rsa-4.9-py3-none-any.whl (34 kB) Downloading ruamel.yaml.clib-0.2.8-cp311-cp311-manylinux_2_17_x86_64.manylinux2014_x86_64.manylinux_2_24_x86_64.whl (544 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 544.0/544.0 kB 4.4 MB/s eta 0:00:00 Downloading pyasn1-0.6.0-py2.py3-none-any.whl (85 kB) ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 85.3/85.3 kB 1.5 MB/s eta 0:00:00 Building wheels for collected packages: mtc, openshift Building wheel for mtc (pyproject.toml): started Building wheel for mtc (pyproject.toml): finished with status 'done' Created wheel for mtc: filename=mtc-0.0.1-py3-none-any.whl size=30973 sha256=5d4dbe689888d89eb14c07b9d1ff5ae072b3aff51e0aa5672cfbd56532017461 Stored in directory: /alabama/.cache/pip/wheels/e1/98/5c/be40f505fcb26ada945b520c235f77481ebe8ad452170000f0 Building wheel for openshift (pyproject.toml): started Building wheel for openshift (pyproject.toml): finished with status 'done' Created wheel for openshift: filename=openshift-0.11.2-py3-none-any.whl size=19905 sha256=4751286c37467b9f79f8b827d3c7917e7d64b30e4900120c967685b8bfdb1f78 Stored in directory: /alabama/.cache/pip/wheels/56/d5/ca/4237e0b01d25fb1a13f79aaaacf9958447f6dbe824a39cc089 Successfully built mtc openshift Installing collected packages: suds-py3, websocket-client, ruamel.yaml.clib, python-string-utils, python-dateutil, pyasn1, oauthlib, cachetools, ruamel.yaml, rsa, requests-oauthlib, pyasn1-modules, google-auth, kubernetes, openshift, mtc Successfully installed cachetools-5.3.3 google-auth-2.29.0 kubernetes-11.0.0 mtc-0.0.1 oauthlib-3.2.2 openshift-0.11.2 pyasn1-0.6.0 pyasn1-modules-0.4.0 python-dateutil-2.9.0.post0 python-string-utils-1.0.0 requests-oauthlib-2.0.0 
rsa-4.9 ruamel.yaml-0.18.6 ruamel.yaml.clib-0.2.8 suds-py3-1.4.5.0 websocket-client-1.7.0 Logging into source cluster. Login successful. You have access to 71 projects, the list has been suppressed. You can list all projects with 'oc projects' Using project "default". Executing tests. ============================= test session starts ============================== platform linux -- Python 3.11.8, pytest-8.1.1, pluggy-1.4.0 rootdir: /mtc-e2e-qev2 configfile: pytest.ini plugins: rerunfailures-14.0 collected 6 items mtc-e2e-qev2/mtc_tests/tests/test_interop.py::test_mtc_87_interop -------------------------------- live log setup -------------------------------- 19:28:54.754 - INFO: Removing namespace fixture: longnameprojecttest-longnameprojecttest-longnameprojecttest-123 from cluster source-cluster 19:28:55.455 - INFO: Removing namespace fixture: longnameprojecttest-longnameprojecttest-longnameprojecttest-123 from cluster host 19:28:56.222 - INFO: Removing app [ocp-attached-pvc] in namespace [longnameprojecttest-longnameprojecttest-longnameprojecttest-123] from cluster [host] 19:28:56.222 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:28:56.840 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:28:56.844 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 19:28:56.844 - INFO: the implicit localhost does not match 'all' 19:28:56.953 - INFO: [WARNING]: Found variable using reserved name: namespace 19:28:56.953 - INFO: 19:28:56.954 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:28:58.044 - INFO: 19:28:58.044 - INFO: TASK [Gathering Facts] ********************************************************* 19:28:58.044 - INFO: ok: [localhost] 19:28:58.068 - INFO: 19:28:58.068 - INFO: TASK [include_vars] ************************************************************ 19:28:58.068 - INFO: ok: [localhost] 19:28:59.893 - INFO: 19:28:59.893 - INFO: TASK [ocp-attached-pvc : Remove namespace longnameprojecttest-longnameprojecttest-longnameprojecttest-123] *** 19:28:59.893 - INFO: ok: [localhost] 19:29:01.915 - INFO: 19:29:01.915 - INFO: TASK [Remove Namespace longnameprojecttest-longnameprojecttest-longnameprojecttest-123] *** 19:29:01.915 - INFO: ok: [localhost] 19:29:01.930 - INFO: 19:29:01.930 - INFO: PLAY RECAP ********************************************************************* 19:29:01.931 - INFO: localhost : ok=4  changed=0 unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:29:02.093 - DEBUG: Removed private data directory: /tmp/tmpma1d3h2i 19:29:02.094 - INFO: Removing app [ocp-attached-pvc] in namespace [longnameprojecttest-longnameprojecttest-longnameprojecttest-123] from cluster [source-cluster] 19:29:02.094 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:29:02.693 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:29:02.696 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. 
Note that 19:29:02.696 - INFO: the implicit localhost does not match 'all' 19:29:02.796 - INFO: [WARNING]: Found variable using reserved name: namespace 19:29:02.796 - INFO: 19:29:02.796 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:29:03.781 - INFO: 19:29:03.781 - INFO: TASK [Gathering Facts] ********************************************************* 19:29:03.781 - INFO: ok: [localhost] 19:29:03.801 - INFO: 19:29:03.801 - INFO: TASK [include_vars] ************************************************************ 19:29:03.801 - INFO: ok: [localhost] 19:29:05.774 - INFO: 19:29:05.774 - INFO: TASK [ocp-attached-pvc : Remove namespace longnameprojecttest-longnameprojecttest-longnameprojecttest-123] *** 19:29:05.775 - INFO: ok: [localhost] 19:29:07.818 - INFO: 19:29:07.819 - INFO: TASK [Remove Namespace longnameprojecttest-longnameprojecttest-longnameprojecttest-123] *** 19:29:07.819 - INFO: ok: [localhost] 19:29:07.831 - INFO: 19:29:07.831 - INFO: PLAY RECAP ********************************************************************* 19:29:07.831 - INFO: localhost : ok=4  changed=0 unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:29:07.984 - DEBUG: Removed private data directory: /tmp/tmp6he6y7vg 19:29:07.984 - INFO: Deploying app [ocp-attached-pvc] in namespace [longnameprojecttest-longnameprojecttest-longnameprojecttest-123] in cluster [source-cluster] 19:29:07.985 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/deploy.yml 19:29:08.598 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:29:08.601 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 19:29:08.602 - INFO: the implicit localhost does not match 'all' 19:29:08.721 - INFO: [WARNING]: Found variable using reserved name: namespace 19:29:08.721 - INFO: 19:29:08.721 - INFO: PLAY [Deploy Application] ****************************************************** 19:29:09.886 - INFO: 19:29:09.886 - INFO: TASK [Gathering Facts] ********************************************************* 19:29:09.886 - INFO: ok: [localhost] 19:29:09.938 - INFO: 19:29:09.939 - INFO: TASK [include_vars] ************************************************************ 19:29:09.939 - INFO: ok: [localhost] 19:29:12.502 - INFO: 19:29:12.503 - INFO: TASK [ocp-attached-pvc : Check namespace] ************************************** 19:29:12.503 - INFO: ok: [localhost] 19:29:13.459 - INFO: 19:29:13.460 - INFO: TASK [ocp-attached-pvc : Create namespace] ************************************* 19:29:13.460 - INFO: changed: [localhost] 19:29:15.896 - INFO: 19:29:15.897 - INFO: TASK [ocp-attached-pvc : Create the pvc-attached application resources] ******** 19:29:15.897 - INFO: fatal: [localhost]: FAILED! 
=> {"changed": false, "error": 422, "msg": "Failed to create object: b'{\"kind\":\"Status\",\"apiVersion\":\"v1\",\"metadata\":{},\"status\":\"Failure\",\"message\":\"Deployment.apps \\\\\"attached-pvc\\\\\" is invalid: spec.template.spec.containers[0].restartPolicy: Forbidden: may not be set for non-init containers\",\"reason\":\"Invalid\",\"details\":{\"name\":\"attached-pvc\",\"group\":\"apps\",\"kind\":\"Deployment\",\"causes\":[{\"reason\":\"FieldValueForbidden\",\"message\":\"Forbidden: may not be set for non-init containers\",\"field\":\"spec.template.spec.containers[0].restartPolicy\"}]},\"code\":422}\\n'", "reason": "Unprocessable Entity", "status": 422} 19:29:15.901 - INFO: 19:29:15.901 - INFO: PLAY RECAP ********************************************************************* 19:29:15.901 - INFO: localhost : ok=4  changed=1  unreachable=0 failed=1  skipped=1  rescued=0 ignored=0 19:29:16.111 - DEBUG: Removed private data directory: /tmp/tmpbna9xll7 19:29:16.209 - INFO: Migplan test-interop-migplan has been created 19:29:16.210 - INFO: Migplan IDIM: False 19:29:16.210 - INFO: Migplan IDVM: False 19:29:16.271 - INFO: Waiting for Ready status... 19:29:26.285 - INFO: Waiting for Ready status... 19:29:36.300 - INFO: The migration plan is Ready. ------------------------------ live log logreport ------------------------------ 19:29:36.301 - INFO: -------------------------------- live log call --------------------------------- 19:29:36.302 - INFO: Migrating from ns:longnameprojecttest-longnameprojecttest-longnameprojecttest-123 in cluster:source-cluster to ns:longnameprojecttest-longnameprojecttest-longnameprojecttest-123 in cluster:host 19:29:36.302 - INFO: Migplan test-interop-migplan. Wait until ready 19:29:36.316 - INFO: The migration plan is Ready. 19:29:36.316 - INFO: MIGPLAN READY 19:29:46.391 - INFO: RIGHT WARNINGS IN MIGPLAN: [{'category': 'Warn', 'lastTransitionTime': '2024-04-02T19:29:23Z', 'message': 'Namespaces [longnameprojecttest-longnameprojecttest-longnameprojecttest-123] exceed 59 characters and no destination cluster route subdomain was configured. Direct Volume Migration may fail if you do not set `cluster_subdomain` value on the `MigrationController` CR.', 'reason': 'LengthExceeded', 'status': 'True', 'type': 'NamespaceLengthExceeded'}] 19:29:46.391 - INFO: EXECUTE MIGRATION 19:29:46.441 - INFO: Not started. Waiting... 19:29:56.456 - INFO: Step: 14/49. Waiting... 19:30:06.471 - INFO: Step: 14/49. Waiting... 19:30:16.487 - INFO: Step: 14/49. Waiting... 19:30:26.502 - INFO: Step: 41/49. Waiting... 19:30:36.518 - INFO: Finished. 19:30:36.518 - INFO: VALIDATE APPLICATION 19:30:36.518 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/validate.yml 19:30:37.108 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:30:37.112 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. 
Note that 19:30:37.112 - INFO: the implicit localhost does not match 'all' 19:30:37.213 - INFO: [WARNING]: Found variable using reserved name: namespace 19:30:37.214 - INFO: 19:30:37.214 - INFO: PLAY [Validate application] **************************************************** 19:30:38.149 - INFO: 19:30:38.150 - INFO: TASK [Gathering Facts] ********************************************************* 19:30:38.150 - INFO: ok: [localhost] 19:30:38.169 - INFO: 19:30:38.170 - INFO: TASK [include_vars] ************************************************************ 19:30:38.170 - INFO: ok: [localhost] 19:30:40.117 - INFO: FAILED - RETRYING: Check pod status (40 retries left). 19:30:46.699 - INFO: FAILED - RETRYING: Check pod status (39 retries left). 19:30:53.285 - INFO: FAILED - RETRYING: Check pod status (38 retries left). 19:30:59.847 - INFO: FAILED - RETRYING: Check pod status (37 retries left). 19:31:06.464 - INFO: FAILED - RETRYING: Check pod status (36 retries left). 19:31:13.037 - INFO: FAILED - RETRYING: Check pod status (35 retries left). 19:31:19.671 - INFO: FAILED - RETRYING: Check pod status (34 retries left). 19:31:26.262 - INFO: FAILED - RETRYING: Check pod status (33 retries left). 19:31:32.839 - INFO: FAILED - RETRYING: Check pod status (32 retries left). 19:31:39.417 - INFO: FAILED - RETRYING: Check pod status (31 retries left). 19:31:46.044 - INFO: FAILED - RETRYING: Check pod status (30 retries left). 19:31:52.659 - INFO: FAILED - RETRYING: Check pod status (29 retries left). 19:31:59.291 - INFO: FAILED - RETRYING: Check pod status (28 retries left). 19:32:05.889 - INFO: FAILED - RETRYING: Check pod status (27 retries left). 19:32:12.476 - INFO: FAILED - RETRYING: Check pod status (26 retries left). 19:32:19.027 - INFO: FAILED - RETRYING: Check pod status (25 retries left). 19:32:25.593 - INFO: FAILED - RETRYING: Check pod status (24 retries left). 19:32:32.162 - INFO: FAILED - RETRYING: Check pod status (23 retries left). 19:32:38.789 - INFO: FAILED - RETRYING: Check pod status (22 retries left). 19:32:45.386 - INFO: FAILED - RETRYING: Check pod status (21 retries left). 19:32:51.967 - INFO: FAILED - RETRYING: Check pod status (20 retries left). 19:32:58.547 - INFO: FAILED - RETRYING: Check pod status (19 retries left). 19:33:05.130 - INFO: FAILED - RETRYING: Check pod status (18 retries left). 19:33:11.716 - INFO: FAILED - RETRYING: Check pod status (17 retries left). 19:33:18.308 - INFO: FAILED - RETRYING: Check pod status (16 retries left). 19:33:24.878 - INFO: FAILED - RETRYING: Check pod status (15 retries left). 19:33:31.455 - INFO: FAILED - RETRYING: Check pod status (14 retries left). 19:33:38.036 - INFO: FAILED - RETRYING: Check pod status (13 retries left). 19:33:44.596 - INFO: FAILED - RETRYING: Check pod status (12 retries left). 19:33:51.221 - INFO: FAILED - RETRYING: Check pod status (11 retries left). 19:33:57.823 - INFO: FAILED - RETRYING: Check pod status (10 retries left). 19:34:04.415 - INFO: FAILED - RETRYING: Check pod status (9 retries left). 19:34:10.993 - INFO: FAILED - RETRYING: Check pod status (8 retries left). 19:34:17.606 - INFO: FAILED - RETRYING: Check pod status (7 retries left). 19:34:24.172 - INFO: FAILED - RETRYING: Check pod status (6 retries left). 19:34:30.780 - INFO: FAILED - RETRYING: Check pod status (5 retries left). 19:34:37.353 - INFO: FAILED - RETRYING: Check pod status (4 retries left). 19:34:43.915 - INFO: FAILED - RETRYING: Check pod status (3 retries left). 19:34:50.516 - INFO: FAILED - RETRYING: Check pod status (2 retries left). 
19:34:57.079 - INFO: FAILED - RETRYING: Check pod status (1 retries left).
19:35:03.717 - INFO:
19:35:03.717 - INFO: TASK [ocp-attached-pvc : Check pod status] *************************************
19:35:03.718 - INFO: fatal: [localhost]: FAILED! => {"api_found": true, "attempts": 40, "changed": false, "resources": []}
19:35:03.719 - INFO:
19:35:03.719 - INFO: PLAY RECAP *********************************************************************
19:35:03.719 - INFO: localhost : ok=2  changed=0 unreachable=0 failed=1  skipped=7  rescued=0 ignored=0
19:35:03.874 - DEBUG: Removed private data directory: /tmp/tmpgfaexoyy
FAILED
------------------------------ live log logreport ------------------------------
19:35:03.991 - INFO:
migplan =
apps = []
migrated_namespaces = 'longnameprojecttest-longnameprojecttest-longnameprojecttest-123:longnameprojecttest-longnameprojecttest-longnameprojecttest-123'
src_cluster =
tgt_cluster =
pytestconfig = <_pytest.config.Config object at 0x7f84215e7710>

    @pytest.mark.app(
        app_id="ocp-attached-pvc", app_namespace="longnameprojecttest-longnameprojecttest-longnameprojecttest-123"
    )
    @pytest.mark.migrated_namespaces(
        "longnameprojecttest-longnameprojecttest-longnameprojecttest-123:longnameprojecttest-longnameprojecttest-longnameprojecttest-123"
    )
    def test_mtc_87_interop(migplan, apps, migrated_namespaces, src_cluster, tgt_cluster, pytestconfig):
        tgt_namespace = get_target_namespace_from_namespace(migrated_namespaces)
        src_namespace = get_source_namespace_from_namespace(migrated_namespaces)
        app = apps[0]
        logger.info(
            "Migrating from ns:{0} in cluster:{1} to ns:{2} in cluster:{3}".format(
                src_namespace, src_cluster.name, tgt_namespace, tgt_cluster.name
            )
        )
        logger.info("Migplan {0}. Wait until ready".format(migplan.name))
        ready = migplan.wait_until_ready()
        # Assert that plan is in Ready status
        assert ready, "Migplan must be ready:{0}".format(pretty(migplan.definition))
        logger.info("MIGPLAN READY")
        idvm = pytestconfig.getoption("--idvm")
        if not idvm:
            migplan.refresh()
            has_warnings = migplan.wait_until_warnings()
            assert has_warnings, "There should be a warning {0}".format(pretty(migplan.definition))
            warning_type_expected = "NamespaceLengthExceeded"
            warnings = migplan.get_warnings(warning_type=warning_type_expected)
            assert len(warnings) == 1, "There should be a {1} warning {0}".format(
                pretty(migplan.definition), warning_type_expected
            )
            logger.info("RIGHT WARNINGS IN MIGPLAN: {0}".format(warnings))
        logger.info("EXECUTE MIGRATION")
        migmigration = migplan.migrate(quiesce=True)
        success = migmigration.wait_until_success()
        assert success, "The migration should succeed. {0}".format(pretty(migmigration.definition))
        migrated_app = get_migrated_app(app, tgt_cluster, tgt_namespace)
        logger.info("VALIDATE APPLICATION")
        ok = migrated_app.validate()
>       assert ok, "The application should be validated OK in the target cluster"
E       AssertionError: The application should be validated OK in the target cluster
E       assert False

mtc-e2e-qev2/mtc_tests/tests/test_interop.py:61: AssertionError
------------------------------ live log teardown -------------------------------
19:35:03.991 - INFO: Deleting Migplan test-interop-migplan...
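Root cause of the failure above: the deploy playbook for ocp-attached-pvc was rejected with HTTP 422 earlier in this run (task "Create the pvc-attached application resources"), so no Deployment, and therefore no pod, ever existed for the "Check pod status" task to find. The API server refused the object because restartPolicy was set on a regular container; in a Deployment it belongs on the pod template spec, where it must be "Always". The sketch below is illustrative only, since the real application template is not part of this log; the image, labels, and the strip_container_restart_policy helper are assumptions.

```python
# Illustrative only: valid vs. forbidden placement of restartPolicy in a Deployment.
# The actual ocp-attached-pvc template is not shown in this log.
deployment = {
    "apiVersion": "apps/v1",
    "kind": "Deployment",
    "metadata": {"name": "attached-pvc"},
    "spec": {
        "replicas": 1,
        "selector": {"matchLabels": {"app": "attached-pvc"}},
        "template": {
            "metadata": {"labels": {"app": "attached-pvc"}},
            "spec": {
                "restartPolicy": "Always",  # valid here (pod level); Deployments require "Always"
                "containers": [
                    {
                        "name": "attached-pvc",
                        "image": "example.registry/attached-pvc:latest",  # placeholder image
                        # "restartPolicy": "Always",  # forbidden for non-init containers -> 422 FieldValueForbidden
                    }
                ],
            },
        },
    },
}


def strip_container_restart_policy(manifest: dict) -> dict:
    """Hypothetical helper: drop the container-level field the API server rejects."""
    for container in manifest["spec"]["template"]["spec"].get("containers", []):
        container.pop("restartPolicy", None)
    return manifest
```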
19:35:14.126 - INFO: Removing app in namespace [longnameprojecttest-longnameprojecttest-longnameprojecttest-123] from cluster [host] 19:35:14.126 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:35:14.720 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:35:14.724 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 19:35:14.724 - INFO: the implicit localhost does not match 'all' 19:35:14.824 - INFO: [WARNING]: Found variable using reserved name: namespace 19:35:14.825 - INFO: 19:35:14.825 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:35:15.750 - INFO: 19:35:15.751 - INFO: TASK [Gathering Facts] ********************************************************* 19:35:15.751 - INFO: ok: [localhost] 19:35:15.771 - INFO: 19:35:15.771 - INFO: TASK [include_vars] ************************************************************ 19:35:15.771 - INFO: ok: [localhost] 19:35:27.575 - INFO: 19:35:27.576 - INFO: TASK [ocp-attached-pvc : Remove namespace longnameprojecttest-longnameprojecttest-longnameprojecttest-123] *** 19:35:27.576 - INFO: changed: [localhost] 19:35:29.462 - INFO: 19:35:29.463 - INFO: TASK [Remove Namespace longnameprojecttest-longnameprojecttest-longnameprojecttest-123] *** 19:35:29.463 - INFO: ok: [localhost] 19:35:29.476 - INFO: 19:35:29.476 - INFO: PLAY RECAP ********************************************************************* 19:35:29.476 - INFO: localhost : ok=4  changed=1  unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:35:29.629 - DEBUG: Removed private data directory: /tmp/tmpcpwbteol 19:35:29.630 - INFO: Removing app in namespace [longnameprojecttest-longnameprojecttest-longnameprojecttest-123] from cluster [source-cluster] 19:35:29.630 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:35:30.217 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:35:30.220 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. 
Note that 19:35:30.220 - INFO: the implicit localhost does not match 'all' 19:35:30.320 - INFO: [WARNING]: Found variable using reserved name: namespace 19:35:30.320 - INFO: 19:35:30.321 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:35:31.248 - INFO: 19:35:31.248 - INFO: TASK [Gathering Facts] ********************************************************* 19:35:31.248 - INFO: ok: [localhost] 19:35:31.268 - INFO: 19:35:31.268 - INFO: TASK [include_vars] ************************************************************ 19:35:31.268 - INFO: ok: [localhost] 19:35:48.067 - INFO: 19:35:48.067 - INFO: TASK [ocp-attached-pvc : Remove namespace longnameprojecttest-longnameprojecttest-longnameprojecttest-123] *** 19:35:48.067 - INFO: changed: [localhost] 19:35:49.913 - INFO: 19:35:49.913 - INFO: TASK [Remove Namespace longnameprojecttest-longnameprojecttest-longnameprojecttest-123] *** 19:35:49.913 - INFO: ok: [localhost] 19:35:49.926 - INFO: 19:35:49.926 - INFO: PLAY RECAP ********************************************************************* 19:35:49.926 - INFO: localhost : ok=4  changed=1  unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:35:50.075 - DEBUG: Removed private data directory: /tmp/tmp4wgohcq6 19:35:50.076 - INFO: Removing namespace fixture: longnameprojecttest-longnameprojecttest-longnameprojecttest-123 from cluster source-cluster 19:35:50.122 - INFO: Removing namespace fixture: longnameprojecttest-longnameprojecttest-longnameprojecttest-123 from cluster host ------------------------------ live log logreport ------------------------------ 19:35:50.165 - INFO: mtc-e2e-qev2/mtc_tests/tests/test_interop.py::test_mtc_98_interop -------------------------------- live log setup -------------------------------- 19:35:53.728 - INFO: Removing namespace fixture: ocp-41968-nsmap-1 from cluster source-cluster 19:35:54.386 - INFO: Removing namespace fixture: ocp-41968-nsmap-1 from cluster host 19:35:55.131 - INFO: Removing app [empty-namespace] in namespace [ocp-41968-nsmap-3] from cluster [host] 19:35:55.145 - INFO: Waiting for resources to be deleted 19:35:55.160 - INFO: Removing app [empty-namespace] in namespace [ocp-41968-nsmap-3] from cluster [source-cluster] 19:35:55.175 - INFO: Waiting for resources to be deleted 19:35:55.189 - INFO: Deploying app [empty-namespace] in namespace [ocp-41968-nsmap-3] in cluster [source-cluster] 19:35:55.203 - DEBUG: Running always new-project with --skip-config-write=true flag to avoid overriding the configuration 19:35:55.203 - DEBUG: Running always with --insecure-skip-tls-verify flag to avoid "Unable to connect to the server: x509: certificate signed by unknown authority" 19:35:55.414 - INFO: Deployed namespace ocp-41968-nsmap-3 in host https://api.mtc-source-4bze.cspilp.interop.ccitredhat.com:6443 19:35:55.415 - INFO: Removing app [empty-namespace] in namespace [ocp-41968-nsmap-2] from cluster [host] 19:35:55.430 - INFO: Waiting for resources to be deleted 19:35:55.445 - INFO: Removing app [empty-namespace] in namespace [ocp-41968-nsmap-2] from cluster [source-cluster] 19:35:55.460 - INFO: Waiting for resources to be deleted 19:35:55.475 - INFO: Deploying app [empty-namespace] in namespace [ocp-41968-nsmap-2] in cluster [source-cluster] 19:35:55.489 - DEBUG: Running always new-project with --skip-config-write=true flag to avoid overriding the configuration 19:35:55.489 - DEBUG: Running always with --insecure-skip-tls-verify flag to avoid "Unable to connect to the server: x509: certificate signed by unknown 
authority" 19:35:55.658 - INFO: Deployed namespace ocp-41968-nsmap-2 in host https://api.mtc-source-4bze.cspilp.interop.ccitredhat.com:6443 19:35:55.659 - INFO: Removing app [empty-namespace] in namespace [ocp-41968-nsmap-1] from cluster [host] 19:35:55.673 - INFO: Waiting for resources to be deleted 19:35:55.688 - INFO: Removing app [empty-namespace] in namespace [ocp-41968-nsmap-1] from cluster [source-cluster] 19:35:55.703 - INFO: Waiting for resources to be deleted 19:35:55.717 - INFO: Deploying app [empty-namespace] in namespace [ocp-41968-nsmap-1] in cluster [source-cluster] 19:35:55.731 - DEBUG: Running always new-project with --skip-config-write=true flag to avoid overriding the configuration 19:35:55.731 - DEBUG: Running always with --insecure-skip-tls-verify flag to avoid "Unable to connect to the server: x509: certificate signed by unknown authority" 19:35:55.973 - INFO: Deployed namespace ocp-41968-nsmap-1 in host https://api.mtc-source-4bze.cspilp.interop.ccitredhat.com:6443 19:35:56.056 - INFO: Migplan test-interop-migplan has been created 19:35:56.057 - INFO: Migplan IDIM: False 19:35:56.057 - INFO: Migplan IDVM: False 19:35:56.115 - INFO: Waiting for Ready status... 19:36:06.129 - INFO: Waiting for Ready status... 19:36:16.143 - INFO: The migration plan is Ready. ------------------------------ live log logreport ------------------------------ 19:36:16.143 - INFO: -------------------------------- live log call --------------------------------- 19:36:16.144 - INFO: Migrating from ns:ocp-41968-nsmap-1 in cluster:source-cluster to ns:ocp-41968-nsmap-1 in cluster:host 19:36:16.144 - INFO: Migplan test-interop-migplan. Wait until ready 19:36:16.159 - INFO: The migration plan is Ready. 19:36:16.159 - INFO: MIGPLAN READY 19:36:16.159 - INFO: CHECKING FOR WARNINGS 19:36:16.159 - INFO: NO WARNINGS FOUND 19:36:16.159 - INFO: FIRST TIME PATCHING WITH WRONG 19:36:26.220 - INFO: { "apiVersion": "migration.openshift.io/v1alpha1", "kind": "MigPlan", "metadata": { "annotations": { "kubectl.kubernetes.io/last-applied-configuration": "{\"apiVersion\":\"migration.openshift.io/v1alpha1\",\"kind\":\"MigPlan\",\"metadata\":{\"name\":\"test-interop-migplan\",\"namespace\":\"openshift-migration\"},\"spec\":{\"destMigClusterRef\":{\"name\":\"host\",\"namespace\":\"openshift-migration\"},\"migStorageRef\":{\"name\":\"minio.automation.test\",\"namespace\":\"openshift-migration\"},\"namespaces\":[\"ocp-41968-nsmap-1\",\"ocp-41968-nsmap-2:ocp-41968-nsmap-1\"],\"refresh\":true,\"srcMigClusterRef\":{\"name\":\"source-cluster\",\"namespace\":\"openshift-migration\"}}}", "migration.openshift.io/selected-migplan-type": "full", "openshift.io/touch": "4a0d3be6-f128-11ee-bfbf-0a580a830017" }, "creationTimestamp": "2024-04-02T19:35:56Z", "generation": 5, "labels": { "controller-tools.k8s.io": "1.0" }, "managedFields": [ { "apiVersion": "migration.openshift.io/v1alpha1", "fieldsType": "FieldsV1", "fieldsV1": { "f:metadata": { "f:annotations": { ".": {}, "f:kubectl.kubernetes.io/last-applied-configuration": {}, "f:migration.openshift.io/selected-migplan-type": {} }, "f:labels": { ".": {}, "f:controller-tools.k8s.io": {} } }, "f:spec": { ".": {}, "f:destMigClusterRef": { ".": {}, "f:name": {}, "f:namespace": {} }, "f:migStorageRef": { ".": {}, "f:name": {}, "f:namespace": {} }, "f:namespaces": {}, "f:srcMigClusterRef": { ".": {}, "f:name": {}, "f:namespace": {} } } }, "manager": "OpenAPI-Generator", "operation": "Update", "time": "2024-04-02T19:36:16Z" }, { "apiVersion": "migration.openshift.io/v1alpha1", 
"fieldsType": "FieldsV1", "fieldsV1": { "f:metadata": { "f:annotations": { "f:openshift.io/touch": {} } }, "f:status": { ".": {}, "f:conditions": {}, "f:destStorageClasses": {}, "f:excludedResources": {}, "f:observedDigest": {}, "f:srcStorageClasses": {} } }, "manager": "manager", "operation": "Update", "time": "2024-04-02T19:36:23Z" } ], "name": "test-interop-migplan", "namespace": "openshift-migration", "resourceVersion": "43277", "uid": "4225e85b-2ab6-465b-886b-eae8a601003e" }, "spec": { "destMigClusterRef": { "name": "host", "namespace": "openshift-migration" }, "migStorageRef": { "name": "minio.automation.test", "namespace": "openshift-migration" }, "namespaces": [ "ocp-41968-nsmap-1", "ocp-41968-nsmap-2:ocp-41968-nsmap-1" ], "srcMigClusterRef": { "name": "source-cluster", "namespace": "openshift-migration" } }, "status": { "conditions": [ { "category": "Required", "lastTransitionTime": "2024-04-02T19:36:11Z", "message": "The `persistentVolumes` list has been updated with discovered PVs.", "reason": "Done", "status": "True", "type": "PvsDiscovered" }, { "category": "Required", "lastTransitionTime": "2024-04-02T19:36:11Z", "message": "The storage resources have been created.", "reason": "Done", "status": "True", "type": "StorageEnsured" }, { "category": "Critical", "lastTransitionTime": "2024-04-02T19:36:16Z", "message": "Duplicate destination cluster namespaces [ocp-41968-nsmap-1] in migplan.", "reason": "DuplicateNamespaces", "status": "True", "type": "DuplicateNamespaceOnDestinationCluster" } ], "destStorageClasses": [ { "accessModes": [ "ReadWriteOnce" ], "name": "gp2-csi", "provisioner": "ebs.csi.aws.com" }, { "accessModes": [ "ReadWriteOnce" ], "default": true, "name": "gp3-csi", "provisioner": "ebs.csi.aws.com" } ], "excludedResources": [ "imagetags", "templateinstances", "clusterserviceversions", "packagemanifests", "subscriptions", "servicebrokers", "servicebindings", "serviceclasses", "serviceinstances", "serviceplans", "operatorgroups", "events", "events.events.k8s.io", "rolebindings.authorization.openshift.io" ], "observedDigest": "81de117904227c9004f544bc042b92e27e9df8b0766cab00e66f1dcb5618a44d", "srcStorageClasses": [ { "accessModes": [ "ReadWriteOnce" ], "name": "gp2-csi", "provisioner": "ebs.csi.aws.com" }, { "accessModes": [ "ReadWriteOnce" ], "default": true, "name": "gp3-csi", "provisioner": "ebs.csi.aws.com" } ] } } 19:36:26.220 - INFO: SECOND TIME PATCHING CORRECT 19:36:36.281 - INFO: Migplan test-interop-migplan. Wait until ready 19:36:36.295 - INFO: The migration plan is Ready. 
19:36:36.296 - INFO: MIGPLAN READY 19:36:36.296 - INFO: CHECKING FOR WARNINGS 19:36:36.296 - INFO: NO WARNINGS FOUND 19:36:36.296 - INFO: THIRD TIME PATCHING WRONG AGAIN 19:36:46.355 - INFO: { "apiVersion": "migration.openshift.io/v1alpha1", "kind": "MigPlan", "metadata": { "annotations": { "kubectl.kubernetes.io/last-applied-configuration": "{\"apiVersion\":\"migration.openshift.io/v1alpha1\",\"kind\":\"MigPlan\",\"metadata\":{\"name\":\"test-interop-migplan\",\"namespace\":\"openshift-migration\"},\"spec\":{\"destMigClusterRef\":{\"name\":\"host\",\"namespace\":\"openshift-migration\"},\"migStorageRef\":{\"name\":\"minio.automation.test\",\"namespace\":\"openshift-migration\"},\"namespaces\":[\"ocp-41968-nsmap-1:ocp-41968-nsmap-a\",\"ocp-41968-nsmap-2:ocp-41968-nsmap-a\"],\"refresh\":true,\"srcMigClusterRef\":{\"name\":\"source-cluster\",\"namespace\":\"openshift-migration\"}}}", "migration.openshift.io/selected-migplan-type": "full", "openshift.io/touch": "55f110a3-f128-11ee-bfbf-0a580a830017" }, "creationTimestamp": "2024-04-02T19:35:56Z", "generation": 9, "labels": { "controller-tools.k8s.io": "1.0" }, "managedFields": [ { "apiVersion": "migration.openshift.io/v1alpha1", "fieldsType": "FieldsV1", "fieldsV1": { "f:metadata": { "f:annotations": { ".": {}, "f:kubectl.kubernetes.io/last-applied-configuration": {}, "f:migration.openshift.io/selected-migplan-type": {} }, "f:labels": { ".": {}, "f:controller-tools.k8s.io": {} } }, "f:spec": { ".": {}, "f:destMigClusterRef": { ".": {}, "f:name": {}, "f:namespace": {} }, "f:migStorageRef": { ".": {}, "f:name": {}, "f:namespace": {} }, "f:namespaces": {}, "f:srcMigClusterRef": { ".": {}, "f:name": {}, "f:namespace": {} } } }, "manager": "OpenAPI-Generator", "operation": "Update", "time": "2024-04-02T19:36:36Z" }, { "apiVersion": "migration.openshift.io/v1alpha1", "fieldsType": "FieldsV1", "fieldsV1": { "f:metadata": { "f:annotations": { "f:openshift.io/touch": {} } }, "f:status": { ".": {}, "f:conditions": {}, "f:destStorageClasses": {}, "f:excludedResources": {}, "f:observedDigest": {}, "f:srcStorageClasses": {} } }, "manager": "manager", "operation": "Update", "time": "2024-04-02T19:36:43Z" } ], "name": "test-interop-migplan", "namespace": "openshift-migration", "resourceVersion": "43436", "uid": "4225e85b-2ab6-465b-886b-eae8a601003e" }, "spec": { "destMigClusterRef": { "name": "host", "namespace": "openshift-migration" }, "migStorageRef": { "name": "minio.automation.test", "namespace": "openshift-migration" }, "namespaces": [ "ocp-41968-nsmap-1:ocp-41968-nsmap-a", "ocp-41968-nsmap-2:ocp-41968-nsmap-a" ], "srcMigClusterRef": { "name": "source-cluster", "namespace": "openshift-migration" } }, "status": { "conditions": [ { "category": "Required", "lastTransitionTime": "2024-04-02T19:36:11Z", "message": "The `persistentVolumes` list has been updated with discovered PVs.", "reason": "Done", "status": "True", "type": "PvsDiscovered" }, { "category": "Required", "lastTransitionTime": "2024-04-02T19:36:11Z", "message": "The storage resources have been created.", "reason": "Done", "status": "True", "type": "StorageEnsured" }, { "category": "Critical", "lastTransitionTime": "2024-04-02T19:36:36Z", "message": "Duplicate destination cluster namespaces [ocp-41968-nsmap-a] in migplan.", "reason": "DuplicateNamespaces", "status": "True", "type": "DuplicateNamespaceOnDestinationCluster" } ], "destStorageClasses": [ { "accessModes": [ "ReadWriteOnce" ], "name": "gp2-csi", "provisioner": "ebs.csi.aws.com" }, { "accessModes": [ "ReadWriteOnce" ], "default": 
true, "name": "gp3-csi", "provisioner": "ebs.csi.aws.com" } ], "excludedResources": [ "imagetags", "templateinstances", "clusterserviceversions", "packagemanifests", "subscriptions", "servicebrokers", "servicebindings", "serviceclasses", "serviceinstances", "serviceplans", "operatorgroups", "events", "events.events.k8s.io", "rolebindings.authorization.openshift.io" ], "observedDigest": "6d46ee7080d20beb45908ec43740ef6ee7e8153a09c2288816312936a9d4a15d", "srcStorageClasses": [ { "accessModes": [ "ReadWriteOnce" ], "name": "gp2-csi", "provisioner": "ebs.csi.aws.com" }, { "accessModes": [ "ReadWriteOnce" ], "default": true, "name": "gp3-csi", "provisioner": "ebs.csi.aws.com" } ] } } 19:36:46.355 - INFO: FOURTH TIME PATCHING RIGHT AGAIN 19:36:56.415 - INFO: Migplan test-interop-migplan. Wait until ready 19:36:56.429 - INFO: The migration plan is Ready. 19:36:56.429 - INFO: MIGPLAN READY 19:36:56.429 - INFO: CHECKING FOR WARNINGS 19:36:56.429 - INFO: NO WARNINGS FOUND PASSED------------------------------ live log logreport ------------------------------ 19:36:56.430 - INFO: ------------------------------ live log teardown ------------------------------- 19:36:56.430 - INFO: Deleting Migplan test-interop-migplan... 19:37:06.538 - INFO: Removing app in namespace [ocp-41968-nsmap-3] from cluster [host] 19:37:06.555 - INFO: Waiting for resources to be deleted 19:37:06.569 - INFO: Removing app in namespace [ocp-41968-nsmap-3] from cluster [source-cluster] 19:37:06.585 - INFO: Removing namespace: ocp-41968-nsmap-3 19:37:06.606 - INFO: Waiting for resources to be deleted 19:37:13.723 - INFO: Removing app in namespace [ocp-41968-nsmap-2] from cluster [host] 19:37:13.738 - INFO: Waiting for resources to be deleted 19:37:13.753 - INFO: Removing app in namespace [ocp-41968-nsmap-2] from cluster [source-cluster] 19:37:13.767 - INFO: Removing namespace: ocp-41968-nsmap-2 19:37:13.785 - INFO: Waiting for resources to be deleted 19:37:20.907 - INFO: Removing app in namespace [ocp-41968-nsmap-1] from cluster [host] 19:37:20.925 - INFO: Waiting for resources to be deleted 19:37:20.939 - INFO: Removing app in namespace [ocp-41968-nsmap-1] from cluster [source-cluster] 19:37:20.954 - INFO: Removing namespace: ocp-41968-nsmap-1 19:37:20.971 - INFO: Waiting for resources to be deleted 19:37:28.105 - INFO: Removing namespace fixture: ocp-41968-nsmap-1 from cluster source-cluster 19:37:28.119 - INFO: Removing namespace fixture: ocp-41968-nsmap-1 from cluster host ------------------------------ live log logreport ------------------------------ 19:37:28.134 - INFO: mtc-e2e-qev2/mtc_tests/tests/test_interop.py::test_mtc_101_interop -------------------------------- live log setup -------------------------------- 19:37:31.762 - INFO: Removing namespace fixture: ocp-41965-a from cluster source-cluster 19:37:32.448 - INFO: Removing namespace fixture: ocp-41965-anew from cluster host 19:37:33.184 - INFO: Removing namespace fixture: ocp-41965-b from cluster source-cluster 19:37:33.198 - INFO: Removing namespace fixture: ocp-41965-bnew from cluster host 19:37:33.213 - INFO: Removing app [ocp-attached-pvc] in namespace [ocp-41965-b] from cluster [host] 19:37:33.214 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:37:33.803 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:37:33.807 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. 
Note that 19:37:33.807 - INFO: the implicit localhost does not match 'all' 19:37:33.907 - INFO: [WARNING]: Found variable using reserved name: namespace 19:37:33.908 - INFO: 19:37:33.908 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:37:34.849 - INFO: 19:37:34.849 - INFO: TASK [Gathering Facts] ********************************************************* 19:37:34.849 - INFO: ok: [localhost] 19:37:34.869 - INFO: 19:37:34.869 - INFO: TASK [include_vars] ************************************************************ 19:37:34.869 - INFO: ok: [localhost] 19:37:36.645 - INFO: 19:37:36.645 - INFO: TASK [ocp-attached-pvc : Remove namespace ocp-41965-b] ************************* 19:37:36.646 - INFO: ok: [localhost] 19:37:38.540 - INFO: 19:37:38.540 - INFO: TASK [Remove Namespace ocp-41965-b] ******************************************** 19:37:38.541 - INFO: ok: [localhost] 19:37:38.553 - INFO: 19:37:38.553 - INFO: PLAY RECAP ********************************************************************* 19:37:38.553 - INFO: localhost : ok=4  changed=0 unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:37:38.705 - DEBUG: Removed private data directory: /tmp/tmp243we_l8 19:37:38.706 - INFO: Removing app [ocp-attached-pvc] in namespace [ocp-41965-b] from cluster [source-cluster] 19:37:38.706 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:37:39.297 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:37:39.300 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 19:37:39.300 - INFO: the implicit localhost does not match 'all' 19:37:39.403 - INFO: [WARNING]: Found variable using reserved name: namespace 19:37:39.403 - INFO: 19:37:39.403 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:37:40.323 - INFO: 19:37:40.323 - INFO: TASK [Gathering Facts] ********************************************************* 19:37:40.323 - INFO: ok: [localhost] 19:37:40.343 - INFO: 19:37:40.344 - INFO: TASK [include_vars] ************************************************************ 19:37:40.344 - INFO: ok: [localhost] 19:37:42.054 - INFO: 19:37:42.054 - INFO: TASK [ocp-attached-pvc : Remove namespace ocp-41965-b] ************************* 19:37:42.054 - INFO: ok: [localhost] 19:37:43.868 - INFO: 19:37:43.868 - INFO: TASK [Remove Namespace ocp-41965-b] ******************************************** 19:37:43.868 - INFO: ok: [localhost] 19:37:43.881 - INFO: 19:37:43.881 - INFO: PLAY RECAP ********************************************************************* 19:37:43.881 - INFO: localhost : ok=4  changed=0 unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:37:44.034 - DEBUG: Removed private data directory: /tmp/tmp48j7rtsi 19:37:44.035 - INFO: Deploying app [ocp-attached-pvc] in namespace [ocp-41965-b] in cluster [source-cluster] 19:37:44.035 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/deploy.yml 19:37:44.624 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:37:44.627 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. 
Note that 19:37:44.627 - INFO: the implicit localhost does not match 'all' 19:37:44.729 - INFO: [WARNING]: Found variable using reserved name: namespace 19:37:44.729 - INFO: 19:37:44.729 - INFO: PLAY [Deploy Application] ****************************************************** 19:37:45.644 - INFO: 19:37:45.645 - INFO: TASK [Gathering Facts] ********************************************************* 19:37:45.645 - INFO: ok: [localhost] 19:37:45.666 - INFO: 19:37:45.666 - INFO: TASK [include_vars] ************************************************************ 19:37:45.666 - INFO: ok: [localhost] 19:37:47.387 - INFO: 19:37:47.387 - INFO: TASK [ocp-attached-pvc : Check namespace] ************************************** 19:37:47.387 - INFO: ok: [localhost] 19:37:48.103 - INFO: 19:37:48.103 - INFO: TASK [ocp-attached-pvc : Create namespace] ************************************* 19:37:48.103 - INFO: changed: [localhost] 19:37:49.879 - INFO: 19:37:49.879 - INFO: TASK [ocp-attached-pvc : Create the pvc-attached application resources] ******** 19:37:49.879 - INFO: fatal: [localhost]: FAILED! => {"changed": false, "error": 422, "msg": "Failed to create object: b'{\"kind\":\"Status\",\"apiVersion\":\"v1\",\"metadata\":{},\"status\":\"Failure\",\"message\":\"Deployment.apps \\\\\"attached-pvc\\\\\" is invalid: spec.template.spec.containers[0].restartPolicy: Forbidden: may not be set for non-init containers\",\"reason\":\"Invalid\",\"details\":{\"name\":\"attached-pvc\",\"group\":\"apps\",\"kind\":\"Deployment\",\"causes\":[{\"reason\":\"FieldValueForbidden\",\"message\":\"Forbidden: may not be set for non-init containers\",\"field\":\"spec.template.spec.containers[0].restartPolicy\"}]},\"code\":422}\\n'", "reason": "Unprocessable Entity", "status": 422} 19:37:49.880 - INFO: 19:37:49.880 - INFO: PLAY RECAP ********************************************************************* 19:37:49.880 - INFO: localhost : ok=4  changed=1  unreachable=0 failed=1  skipped=1  rescued=0 ignored=0 19:37:50.035 - DEBUG: Removed private data directory: /tmp/tmpjmbo6x4i 19:37:50.036 - INFO: Removing app [ocp-attached-pvc] in namespace [ocp-41965-a] from cluster [host] 19:37:50.036 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:37:50.619 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:37:50.623 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. 
Note that 19:37:50.623 - INFO: the implicit localhost does not match 'all' 19:37:50.723 - INFO: [WARNING]: Found variable using reserved name: namespace 19:37:50.724 - INFO: 19:37:50.724 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:37:51.645 - INFO: 19:37:51.646 - INFO: TASK [Gathering Facts] ********************************************************* 19:37:51.646 - INFO: ok: [localhost] 19:37:51.666 - INFO: 19:37:51.666 - INFO: TASK [include_vars] ************************************************************ 19:37:51.666 - INFO: ok: [localhost] 19:37:53.488 - INFO: 19:37:53.489 - INFO: TASK [ocp-attached-pvc : Remove namespace ocp-41965-a] ************************* 19:37:53.489 - INFO: ok: [localhost] 19:37:55.383 - INFO: 19:37:55.384 - INFO: TASK [Remove Namespace ocp-41965-a] ******************************************** 19:37:55.384 - INFO: ok: [localhost] 19:37:55.396 - INFO: 19:37:55.397 - INFO: PLAY RECAP ********************************************************************* 19:37:55.397 - INFO: localhost : ok=4  changed=0 unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:37:55.547 - DEBUG: Removed private data directory: /tmp/tmpolfxr357 19:37:55.548 - INFO: Removing app [ocp-attached-pvc] in namespace [ocp-41965-a] from cluster [source-cluster] 19:37:55.548 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:37:56.140 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:37:56.144 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 19:37:56.144 - INFO: the implicit localhost does not match 'all' 19:37:56.247 - INFO: [WARNING]: Found variable using reserved name: namespace 19:37:56.247 - INFO: 19:37:56.247 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:37:57.166 - INFO: 19:37:57.166 - INFO: TASK [Gathering Facts] ********************************************************* 19:37:57.166 - INFO: ok: [localhost] 19:37:57.186 - INFO: 19:37:57.186 - INFO: TASK [include_vars] ************************************************************ 19:37:57.186 - INFO: ok: [localhost] 19:37:58.947 - INFO: 19:37:58.947 - INFO: TASK [ocp-attached-pvc : Remove namespace ocp-41965-a] ************************* 19:37:58.947 - INFO: ok: [localhost] 19:38:00.782 - INFO: 19:38:00.783 - INFO: TASK [Remove Namespace ocp-41965-a] ******************************************** 19:38:00.783 - INFO: ok: [localhost] 19:38:00.796 - INFO: 19:38:00.796 - INFO: PLAY RECAP ********************************************************************* 19:38:00.796 - INFO: localhost : ok=4  changed=0 unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:38:00.950 - DEBUG: Removed private data directory: /tmp/tmp7t1_jacf 19:38:00.951 - INFO: Deploying app [ocp-attached-pvc] in namespace [ocp-41965-a] in cluster [source-cluster] 19:38:00.951 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/deploy.yml 19:38:01.543 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:38:01.546 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. 
Note that 19:38:01.547 - INFO: the implicit localhost does not match 'all' 19:38:01.649 - INFO: [WARNING]: Found variable using reserved name: namespace 19:38:01.649 - INFO: 19:38:01.649 - INFO: PLAY [Deploy Application] ****************************************************** 19:38:02.564 - INFO: 19:38:02.564 - INFO: TASK [Gathering Facts] ********************************************************* 19:38:02.564 - INFO: ok: [localhost] 19:38:02.584 - INFO: 19:38:02.584 - INFO: TASK [include_vars] ************************************************************ 19:38:02.584 - INFO: ok: [localhost] 19:38:04.318 - INFO: 19:38:04.318 - INFO: TASK [ocp-attached-pvc : Check namespace] ************************************** 19:38:04.318 - INFO: ok: [localhost] 19:38:04.958 - INFO: 19:38:04.958 - INFO: TASK [ocp-attached-pvc : Create namespace] ************************************* 19:38:04.958 - INFO: changed: [localhost] 19:38:06.705 - INFO: 19:38:06.705 - INFO: TASK [ocp-attached-pvc : Create the pvc-attached application resources] ******** 19:38:06.706 - INFO: fatal: [localhost]: FAILED! => {"changed": false, "error": 422, "msg": "Failed to create object: b'{\"kind\":\"Status\",\"apiVersion\":\"v1\",\"metadata\":{},\"status\":\"Failure\",\"message\":\"Deployment.apps \\\\\"attached-pvc\\\\\" is invalid: spec.template.spec.containers[0].restartPolicy: Forbidden: may not be set for non-init containers\",\"reason\":\"Invalid\",\"details\":{\"name\":\"attached-pvc\",\"group\":\"apps\",\"kind\":\"Deployment\",\"causes\":[{\"reason\":\"FieldValueForbidden\",\"message\":\"Forbidden: may not be set for non-init containers\",\"field\":\"spec.template.spec.containers[0].restartPolicy\"}]},\"code\":422}\\n'", "reason": "Unprocessable Entity", "status": 422} 19:38:06.707 - INFO: 19:38:06.707 - INFO: PLAY RECAP ********************************************************************* 19:38:06.707 - INFO: localhost : ok=4  changed=1  unreachable=0 failed=1  skipped=1  rescued=0 ignored=0 19:38:06.863 - DEBUG: Removed private data directory: /tmp/tmpoz1hi301 19:38:06.948 - INFO: Migplan test-interop-migplan has been created 19:38:06.948 - INFO: Migplan IDIM: False 19:38:06.948 - INFO: Migplan IDVM: False 19:38:07.005 - INFO: Waiting for Ready status... 19:38:17.020 - INFO: Waiting for Ready status... 19:38:27.035 - INFO: The migration plan is Ready. ------------------------------ live log logreport ------------------------------ 19:38:27.049 - INFO: -------------------------------- live log call --------------------------------- 19:38:27.049 - INFO: Migrating from ns:ocp-41965-b in cluster:source-cluster to ns:ocp-41965-bnew in cluster:host 19:38:27.049 - INFO: Migrating from ns:ocp-41965-a in cluster:source-cluster to ns:ocp-41965-anew in cluster:host 19:38:27.049 - INFO: Migplan test-interop-migplan. Wait until ready 19:38:27.062 - INFO: The migration plan is Ready. 19:38:27.062 - INFO: MIGPLAN READY 19:38:27.063 - INFO: NO WARNINGS IN MIGPLAN 19:38:27.063 - INFO: EXECUTE MIGRATION 19:38:27.108 - INFO: Not started. Waiting... 19:38:37.122 - INFO: Step: 18/49. Waiting... 19:38:47.137 - INFO: Step: 18/49. Waiting... 19:38:57.155 - INFO: Step: 41/49. Waiting... 19:39:07.190 - INFO: Step: 41/49. Waiting... 19:39:17.205 - INFO: Step: 41/49. Waiting... 19:39:27.219 - INFO: Step: 43/49. Waiting... 19:39:37.235 - INFO: Finished. 
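The setup failures above, in TASK [ocp-attached-pvc : Create the pvc-attached application resources], are API-side 422 rejections: the rendered Deployment sets restartPolicy on a regular container, and Kubernetes only accepts a container-level restartPolicy on init containers, so restart behaviour for ordinary containers has to live at pod level (spec.template.spec.restartPolicy). A minimal sketch of the offending fragment and one way to strip the field before applying the manifest; the manifest values and the helper below are illustrative assumptions, not part of the ocpdeployer roles:

    # Illustrative fragment: the field path matches the 422 error above,
    # spec.template.spec.containers[0].restartPolicy.
    import copy

    deployment = {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": "attached-pvc"},
        "spec": {
            "template": {
                "spec": {
                    "restartPolicy": "Always",  # pod-level field: accepted
                    "containers": [{
                        "name": "attached-pvc",
                        "image": "registry.example.com/busybox:latest",  # placeholder image
                        "restartPolicy": "Always",  # container-level: rejected with 422
                    }],
                }
            }
        },
    }

    def strip_container_restart_policy(manifest: dict) -> dict:
        """Return a copy with restartPolicy removed from regular (non-init) containers."""
        fixed = copy.deepcopy(manifest)
        for container in fixed["spec"]["template"]["spec"].get("containers", []):
            container.pop("restartPolicy", None)
        return fixed

    print(strip_container_restart_policy(deployment))

The same rejection shows up again later in this run for the longpvc-test Deployment during the ocp-longpvcname setup.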
19:39:37.235 - INFO: VALIDATE APPLICATION 19:39:37.235 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/validate.yml 19:39:37.831 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:39:37.834 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 19:39:37.835 - INFO: the implicit localhost does not match 'all' 19:39:37.936 - INFO: [WARNING]: Found variable using reserved name: namespace 19:39:37.937 - INFO: 19:39:37.937 - INFO: PLAY [Validate application] **************************************************** 19:39:38.863 - INFO: 19:39:38.863 - INFO: TASK [Gathering Facts] ********************************************************* 19:39:38.863 - INFO: ok: [localhost] 19:39:38.883 - INFO: 19:39:38.883 - INFO: TASK [include_vars] ************************************************************ 19:39:38.883 - INFO: ok: [localhost] 19:39:40.816 - INFO: FAILED - RETRYING: Check pod status (40 retries left). 19:39:47.442 - INFO: FAILED - RETRYING: Check pod status (39 retries left). 19:39:54.054 - INFO: FAILED - RETRYING: Check pod status (38 retries left). 19:40:00.704 - INFO: FAILED - RETRYING: Check pod status (37 retries left). 19:40:07.323 - INFO: FAILED - RETRYING: Check pod status (36 retries left). 19:40:13.945 - INFO: FAILED - RETRYING: Check pod status (35 retries left). 19:40:20.621 - INFO: FAILED - RETRYING: Check pod status (34 retries left). 19:40:27.264 - INFO: FAILED - RETRYING: Check pod status (33 retries left). 19:40:33.894 - INFO: FAILED - RETRYING: Check pod status (32 retries left). 19:40:40.577 - INFO: FAILED - RETRYING: Check pod status (31 retries left). 19:40:47.223 - INFO: FAILED - RETRYING: Check pod status (30 retries left). 19:40:53.808 - INFO: FAILED - RETRYING: Check pod status (29 retries left). 19:41:00.454 - INFO: FAILED - RETRYING: Check pod status (28 retries left). 19:41:07.113 - INFO: FAILED - RETRYING: Check pod status (27 retries left). 19:41:13.769 - INFO: FAILED - RETRYING: Check pod status (26 retries left). 19:41:20.372 - INFO: FAILED - RETRYING: Check pod status (25 retries left). 19:41:27.022 - INFO: FAILED - RETRYING: Check pod status (24 retries left). 19:41:33.638 - INFO: FAILED - RETRYING: Check pod status (23 retries left). 19:41:40.249 - INFO: FAILED - RETRYING: Check pod status (22 retries left). 19:41:46.888 - INFO: FAILED - RETRYING: Check pod status (21 retries left). 19:41:53.526 - INFO: FAILED - RETRYING: Check pod status (20 retries left). 19:42:00.111 - INFO: FAILED - RETRYING: Check pod status (19 retries left). 19:42:06.684 - INFO: FAILED - RETRYING: Check pod status (18 retries left). 19:42:13.271 - INFO: FAILED - RETRYING: Check pod status (17 retries left). 19:42:19.861 - INFO: FAILED - RETRYING: Check pod status (16 retries left). 19:42:26.481 - INFO: FAILED - RETRYING: Check pod status (15 retries left). 19:42:33.108 - INFO: FAILED - RETRYING: Check pod status (14 retries left). 19:42:39.741 - INFO: FAILED - RETRYING: Check pod status (13 retries left). 19:42:46.301 - INFO: FAILED - RETRYING: Check pod status (12 retries left). 19:42:52.908 - INFO: FAILED - RETRYING: Check pod status (11 retries left). 19:42:59.486 - INFO: FAILED - RETRYING: Check pod status (10 retries left). 19:43:06.060 - INFO: FAILED - RETRYING: Check pod status (9 retries left). 19:43:12.619 - INFO: FAILED - RETRYING: Check pod status (8 retries left). 19:43:19.214 - INFO: FAILED - RETRYING: Check pod status (7 retries left). 
19:43:25.774 - INFO: FAILED - RETRYING: Check pod status (6 retries left).
19:43:32.347 - INFO: FAILED - RETRYING: Check pod status (5 retries left).
19:43:38.914 - INFO: FAILED - RETRYING: Check pod status (4 retries left).
19:43:45.489 - INFO: FAILED - RETRYING: Check pod status (3 retries left).
19:43:52.120 - INFO: FAILED - RETRYING: Check pod status (2 retries left).
19:43:58.799 - INFO: FAILED - RETRYING: Check pod status (1 retries left).
19:44:05.668 - INFO:
19:44:05.669 - INFO: TASK [ocp-attached-pvc : Check pod status] *************************************
19:44:05.669 - INFO: fatal: [localhost]: FAILED! => {"api_found": true, "attempts": 40, "changed": false, "resources": []}
19:44:05.670 - INFO:
19:44:05.670 - INFO: PLAY RECAP *********************************************************************
19:44:05.670 - INFO: localhost : ok=2  changed=0 unreachable=0 failed=1  skipped=7  rescued=0 ignored=0
19:44:05.834 - DEBUG: Removed private data directory: /tmp/tmpdf7fc7c6
FAILED
------------------------------ live log logreport ------------------------------
19:44:05.844 - INFO:
mtc =
migplan =
apps = [, ]
migrationcontroller =
migrated_namespaces = ['ocp-41965-a:ocp-41965-anew', 'ocp-41965-b:ocp-41965-bnew']
src_cluster =
tgt_cluster =

@pytest.mark.app(app_id="ocp-attached-pvc", app_namespace="ocp-41965-a")
@pytest.mark.app(app_id="ocp-attached-pvc", app_namespace="ocp-41965-b")
@pytest.mark.migrated_namespaces(["ocp-41965-a:ocp-41965-anew", "ocp-41965-b:ocp-41965-bnew"])
def test_mtc_101_interop(
    mtc, migplan, apps, migrationcontroller, migrated_namespaces, src_cluster, tgt_cluster
):
    app1 = apps[0]
    app2 = apps[1]
    src_namespace1 = app1.namespace
    tgt_namespace1 = get_app_target_namespace_from_namespaces(app1, migrated_namespaces)
    src_namespace2 = app2.namespace
    tgt_namespace2 = get_app_target_namespace_from_namespaces(app2, migrated_namespaces)
    logger.info(
        "Migrating {0} from ns:{1} in cluster:{2} to ns:{3} in cluster:{4}".format(
            app1, src_namespace1, src_cluster.name, tgt_namespace1, tgt_cluster.name
        )
    )
    logger.info(
        "Migrating {0} from ns:{1} in cluster:{2} to ns:{3} in cluster:{4}".format(
            app2, src_namespace2, src_cluster.name, tgt_namespace2, tgt_cluster.name
        )
    )
    logger.info("Migplan {0}. Wait until ready".format(migplan.name))
    ready = migplan.wait_until_ready()
    # Assert that plan is in Ready status
    assert ready, "Migplan must be ready:{0}".format(pretty(migplan.definition))
    logger.info("MIGPLAN READY")
    # Assert that there are no warnings in the migplan
    assert len(migplan.get_warnings()) == 0, "There should be no warnings {0}".format(pretty(migplan.definition))
    logger.info("NO WARNINGS IN MIGPLAN")
    logger.info("EXECUTE MIGRATION")
    migmigration = migplan.migrate(quiesce=True)
    success = migmigration.wait_until_success(timeout=500)
    assert success, "The migration should succeed. {0}".format(pretty(migmigration.definition))
    migrated_app1 = get_migrated_app(app1, tgt_cluster, tgt_namespace1)
    migrated_app2 = get_migrated_app(app2, tgt_cluster, tgt_namespace2)
    logger.info("VALIDATE APPLICATION {}".format(migrated_app1))
    ok = migrated_app1.validate()
>   assert ok, "The application {} should be validated OK in the target cluster".format(migrated_app1)
E   AssertionError: The application should be validated OK in the target cluster
E   assert False

mtc-e2e-qev2/mtc_tests/tests/test_interop.py:259: AssertionError
------------------------------ live log teardown -------------------------------
19:44:05.845 - INFO: Deleting Migplan test-interop-migplan...
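The AssertionError above is the downstream effect of those 422 rejections: the attached-pvc Deployment never existed in the source namespaces, so after the otherwise successful migration the validate playbook's Check pod status task burns through its 40 retries against an empty namespace, returns "resources": [], and migrated_app1.validate() comes back False. Roughly what that check boils down to, as a sketch against the Kubernetes Python client (the function name, phase check, and retry cadence are assumptions, not the actual validate.yml task):

    # Sketch: poll a namespace until a Running pod appears, mirroring the
    # 40-attempt "Check pod status" loop seen above.
    import time
    from kubernetes import client, config

    def wait_for_running_pod(namespace: str, attempts: int = 40, delay: float = 6.0) -> bool:
        config.load_kube_config()              # uses the current kubeconfig context
        core = client.CoreV1Api()
        for _ in range(attempts):
            pods = core.list_namespaced_pod(namespace).items
            if any(p.status.phase == "Running" for p in pods):
                return True
            time.sleep(delay)
        return False                           # -> "resources": [] in the Ansible output

    if __name__ == "__main__":
        # Target namespace of the failed validation above.
        print(wait_for_running_pod("ocp-41965-anew"))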
19:44:15.991 - INFO: Removing app in namespace [ocp-41965-b] from cluster [host] 19:44:15.992 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:44:16.645 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:44:16.648 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 19:44:16.649 - INFO: the implicit localhost does not match 'all' 19:44:16.760 - INFO: [WARNING]: Found variable using reserved name: namespace 19:44:16.761 - INFO: 19:44:16.761 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:44:17.720 - INFO: 19:44:17.720 - INFO: TASK [Gathering Facts] ********************************************************* 19:44:17.720 - INFO: ok: [localhost] 19:44:17.740 - INFO: 19:44:17.740 - INFO: TASK [include_vars] ************************************************************ 19:44:17.740 - INFO: ok: [localhost] 19:44:19.536 - INFO: 19:44:19.536 - INFO: TASK [ocp-attached-pvc : Remove namespace ocp-41965-b] ************************* 19:44:19.536 - INFO: ok: [localhost] 19:44:21.559 - INFO: 19:44:21.560 - INFO: TASK [Remove Namespace ocp-41965-b] ******************************************** 19:44:21.560 - INFO: ok: [localhost] 19:44:21.578 - INFO: 19:44:21.579 - INFO: PLAY RECAP ********************************************************************* 19:44:21.579 - INFO: localhost : ok=4  changed=0 unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:44:21.750 - DEBUG: Removed private data directory: /tmp/tmpzne_g86e 19:44:21.751 - INFO: Removing app in namespace [ocp-41965-b] from cluster [source-cluster] 19:44:21.751 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:44:22.494 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:44:22.498 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. 
Note that 19:44:22.498 - INFO: the implicit localhost does not match 'all' 19:44:22.604 - INFO: [WARNING]: Found variable using reserved name: namespace 19:44:22.605 - INFO: 19:44:22.605 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:44:23.611 - INFO: 19:44:23.611 - INFO: TASK [Gathering Facts] ********************************************************* 19:44:23.611 - INFO: ok: [localhost] 19:44:23.634 - INFO: 19:44:23.634 - INFO: TASK [include_vars] ************************************************************ 19:44:23.634 - INFO: ok: [localhost] 19:44:40.567 - INFO: 19:44:40.567 - INFO: TASK [ocp-attached-pvc : Remove namespace ocp-41965-b] ************************* 19:44:40.567 - INFO: changed: [localhost] 19:44:42.464 - INFO: 19:44:42.465 - INFO: TASK [Remove Namespace ocp-41965-b] ******************************************** 19:44:42.465 - INFO: ok: [localhost] 19:44:42.477 - INFO: 19:44:42.477 - INFO: PLAY RECAP ********************************************************************* 19:44:42.477 - INFO: localhost : ok=4  changed=1  unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:44:42.629 - DEBUG: Removed private data directory: /tmp/tmprle4ew6a 19:44:42.630 - INFO: Removing app in namespace [ocp-41965-a] from cluster [host] 19:44:42.630 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:44:43.217 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:44:43.220 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 19:44:43.221 - INFO: the implicit localhost does not match 'all' 19:44:43.321 - INFO: [WARNING]: Found variable using reserved name: namespace 19:44:43.321 - INFO: 19:44:43.321 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:44:44.244 - INFO: 19:44:44.244 - INFO: TASK [Gathering Facts] ********************************************************* 19:44:44.244 - INFO: ok: [localhost] 19:44:44.268 - INFO: 19:44:44.269 - INFO: TASK [include_vars] ************************************************************ 19:44:44.269 - INFO: ok: [localhost] 19:44:46.048 - INFO: 19:44:46.048 - INFO: TASK [ocp-attached-pvc : Remove namespace ocp-41965-a] ************************* 19:44:46.048 - INFO: ok: [localhost] 19:44:47.936 - INFO: 19:44:47.936 - INFO: TASK [Remove Namespace ocp-41965-a] ******************************************** 19:44:47.936 - INFO: ok: [localhost] 19:44:47.949 - INFO: 19:44:47.949 - INFO: PLAY RECAP ********************************************************************* 19:44:47.949 - INFO: localhost : ok=4  changed=0 unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:44:48.100 - DEBUG: Removed private data directory: /tmp/tmpx2wjsq4p 19:44:48.100 - INFO: Removing app in namespace [ocp-41965-a] from cluster [source-cluster] 19:44:48.100 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:44:48.687 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:44:48.690 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. 
Note that 19:44:48.690 - INFO: the implicit localhost does not match 'all' 19:44:48.791 - INFO: [WARNING]: Found variable using reserved name: namespace 19:44:48.791 - INFO: 19:44:48.792 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:44:49.714 - INFO: 19:44:49.714 - INFO: TASK [Gathering Facts] ********************************************************* 19:44:49.714 - INFO: ok: [localhost] 19:44:49.734 - INFO: 19:44:49.734 - INFO: TASK [include_vars] ************************************************************ 19:44:49.734 - INFO: ok: [localhost] 19:45:06.523 - INFO: 19:45:06.524 - INFO: TASK [ocp-attached-pvc : Remove namespace ocp-41965-a] ************************* 19:45:06.524 - INFO: changed: [localhost] 19:45:08.351 - INFO: 19:45:08.351 - INFO: TASK [Remove Namespace ocp-41965-a] ******************************************** 19:45:08.351 - INFO: ok: [localhost] 19:45:08.364 - INFO: 19:45:08.364 - INFO: PLAY RECAP ********************************************************************* 19:45:08.364 - INFO: localhost : ok=4  changed=1  unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:45:08.512 - DEBUG: Removed private data directory: /tmp/tmpnk2maa2r 19:45:08.513 - INFO: Removing namespace fixture: ocp-41965-a from cluster source-cluster 19:45:08.557 - INFO: Removing namespace fixture: ocp-41965-anew from cluster host 19:45:08.616 - INFO: Waiting for namespace fixture to be deleted 19:45:14.714 - INFO: Removing namespace fixture: ocp-41965-b from cluster source-cluster 19:45:14.730 - INFO: Removing namespace fixture: ocp-41965-bnew from cluster host 19:45:14.760 - INFO: Waiting for namespace fixture to be deleted ------------------------------ live log logreport ------------------------------ 19:45:21.885 - INFO: mtc-e2e-qev2/mtc_tests/tests/test_interop.py::test_mtc_116_interop -------------------------------- live log setup -------------------------------- 19:45:25.414 - INFO: Removing namespace fixture: op-37589-rollbacksuccess from cluster source-cluster 19:45:26.082 - INFO: Removing namespace fixture: op-37589-rollbacksuccess from cluster host 19:45:26.770 - INFO: Removing app [nginx-j2] in namespace [op-37589-rollbacksuccess] from cluster [host] 19:45:26.783 - INFO: Waiting for resources to be deleted 19:45:26.797 - INFO: Removing app [nginx-j2] in namespace [op-37589-rollbacksuccess] from cluster [source-cluster] 19:45:26.812 - INFO: Waiting for resources to be deleted 19:45:26.827 - INFO: Deploying app [nginx-j2] in namespace [op-37589-rollbacksuccess] in cluster [source-cluster] 19:45:26.841 - DEBUG: Running always new-project with --skip-config-write=true flag to avoid overriding the configuration 19:45:26.841 - DEBUG: Running always with --insecure-skip-tls-verify flag to avoid "Unable to connect to the server: x509: certificate signed by unknown authority" 19:45:27.036 - INFO: Deployed namespace op-37589-rollbacksuccess in host https://api.mtc-source-4bze.cspilp.interop.ccitredhat.com:6443 19:45:27.055 - INFO: Namespace properly created and "Active": op-37589-rollbacksuccess 19:45:27.055 - INFO: Deploying nginx application in namespace in cluster https://api.mtc-source-4bze.cspilp.interop.ccitredhat.com:6443 19:45:27.055 - DEBUG: Rendering template: nginxpv/deployment.yml.j2 19:45:27.056 - DEBUG: Using vars: app_name: nginx app_namespace: op-37589-rollbacksuccess deployment_api: apps/v1 html_accessmode: ReadWriteOnce logs_accessmode: ReadWriteOnce storage_class: default storage_size: 1Gi 19:45:42.384 - DEBUG: Requesting get: 
http://my-nginx-op-37589-rollbacksuccess.apps.mtc-source-4bze.cspilp.interop.ccitredhat.com 19:45:42.759 - DEBUG: Requesting get: http://my-nginx-op-37589-rollbacksuccess.apps.mtc-source-4bze.cspilp.interop.ccitredhat.com 19:45:42.784 - INFO: Validating nginx application in cluster https://api.mtc-source-4bze.cspilp.interop.ccitredhat.com:6443 19:45:43.142 - DEBUG: Validated that 1 errors were reported in the errors log file 19:45:43.542 - DEBUG: Validated that 1 errors and 1 success were reported in the access log file 19:45:43.630 - INFO: Migplan test-interop-migplan has been created 19:45:43.631 - INFO: Migplan IDIM: False 19:45:43.631 - INFO: Migplan IDVM: False 19:45:43.694 - INFO: Waiting for Ready status... 19:45:53.710 - INFO: Waiting for Ready status... 19:46:03.725 - INFO: The migration plan is Ready. ------------------------------ live log logreport ------------------------------ 19:46:03.741 - INFO: -------------------------------- live log call --------------------------------- 19:46:03.741 - INFO: Migrating from ns:op-37589-rollbacksuccess in cluster:source-cluster to ns:op-37589-rollbacksuccess in cluster:host 19:46:03.741 - INFO: Migplan test-interop-migplan. Wait until ready 19:46:03.757 - INFO: The migration plan is Ready. 19:46:03.757 - INFO: MIGPLAN READY 19:46:03.757 - INFO: EXECUTE STAGE 19:46:03.809 - INFO: Step: 2/38. Waiting... 19:46:13.860 - INFO: Step: 32/38. Waiting... 19:46:23.875 - INFO: Step: 32/38. Waiting... 19:46:33.892 - INFO: Step: 32/38. Waiting... 19:46:43.907 - INFO: Step: 32/38. Waiting... 19:46:53.922 - INFO: Finished. 19:46:53.923 - INFO: CHECK PVC 19:46:53.923 - DEBUG: Running always with --insecure-skip-tls-verify flag to avoid "Unable to connect to the server: x509: certificate signed by unknown authority" 19:46:54.208 - INFO: EXECUTE MIGRATION 19:46:54.260 - INFO: Not started. Waiting... 19:47:04.275 - INFO: Step: 18/49. Waiting... 19:47:14.296 - INFO: Step: 38/49. Waiting... 19:47:24.312 - INFO: Step: 38/49. Waiting... 19:47:34.327 - INFO: Step: 38/49. Waiting... 19:47:44.342 - INFO: Step: 43/49. Waiting... 19:47:54.358 - INFO: Finished. 19:47:54.359 - INFO: VALIDATE APPLICATION 19:47:54.359 - INFO: Validating migrated nginx application in cluster https://api.mtc-target-nzvv.cspilp.interop.ccitredhat.com:6443 19:47:54.423 - DEBUG: Requesting get: http://my-nginx-op-37589-rollbacksuccess.apps.mtc-target-nzvv.cspilp.interop.ccitredhat.com 19:47:54.928 - DEBUG: Validated that 1 errors were reported in the errors log file 19:47:55.328 - DEBUG: Validated that 1 errors and 2 success were reported in the access log file 19:47:55.328 - INFO: EXECUTE ROLLBACK 19:47:55.379 - INFO: Step: 2/11. Waiting... 19:48:05.395 - INFO: Step: 8/11. Waiting... 19:48:15.433 - INFO: Step: 9/11. Waiting... 19:48:25.448 - INFO: Finished. 19:48:25.449 - INFO: VALIDATE APPLICATION 19:48:25.449 - INFO: Validating migrated nginx application in cluster https://api.mtc-source-4bze.cspilp.interop.ccitredhat.com:6443 19:48:25.519 - DEBUG: Requesting get: http://my-nginx-op-37589-rollbacksuccess.apps.mtc-source-4bze.cspilp.interop.ccitredhat.com 19:48:25.946 - DEBUG: Validated that 1 errors were reported in the errors log file 19:48:26.342 - DEBUG: Validated that 1 errors and 2 success were reported in the access log file PASSED------------------------------ live log logreport ------------------------------ 19:48:26.343 - INFO: ------------------------------ live log teardown ------------------------------- 19:48:26.343 - INFO: Deleting Migplan test-interop-migplan... 
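The nginx validations in the passing rollback test above ("Requesting get: http://my-nginx-...", followed by the error-log and access-log counts) amount to hitting the exposed route and confirming the hit is recorded. A minimal sketch of the HTTP half of that check, using the requests library from the suite's requirements; the log-file accounting done by the real validation code is omitted:

    # Sketch: request the migrated nginx route, mirroring the
    # "Requesting get: ..." entries above.
    import requests

    def check_route(url: str, timeout: float = 10.0) -> bool:
        try:
            resp = requests.get(url, timeout=timeout)
        except requests.RequestException:
            return False
        return resp.status_code == 200

    if __name__ == "__main__":
        # Route URL copied from the rollback validation above.
        url = ("http://my-nginx-op-37589-rollbacksuccess"
               ".apps.mtc-source-4bze.cspilp.interop.ccitredhat.com")
        print("reachable" if check_route(url) else "unreachable")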
19:48:36.441 - INFO: Removing app in namespace [op-37589-rollbacksuccess] from cluster [host] 19:48:36.458 - INFO: Removing namespace: op-37589-rollbacksuccess 19:48:36.473 - INFO: Waiting for resources to be deleted 19:48:42.569 - INFO: Removing app in namespace [op-37589-rollbacksuccess] from cluster [source-cluster] 19:48:42.586 - INFO: Removing namespace: op-37589-rollbacksuccess 19:48:42.605 - INFO: Waiting for resources to be deleted 19:48:50.755 - INFO: Removing namespace fixture: op-37589-rollbacksuccess from cluster source-cluster 19:48:50.771 - INFO: Removing namespace fixture: op-37589-rollbacksuccess from cluster host ------------------------------ live log logreport ------------------------------ 19:48:50.786 - INFO: mtc-e2e-qev2/mtc_tests/tests/test_interop.py::test_mtc_147_interop -------------------------------- live log setup -------------------------------- 19:48:54.449 - INFO: Removing namespace fixture: ocp-longpvcname from cluster source-cluster 19:48:55.149 - INFO: Removing namespace fixture: ocp-longpvcname from cluster host 19:48:55.903 - INFO: Removing app [ocp-longpvcname] in namespace [ocp-longpvcname] from cluster [host] 19:48:55.904 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:48:56.525 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:48:56.528 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 19:48:56.529 - INFO: the implicit localhost does not match 'all' 19:48:56.633 - INFO: [WARNING]: Found variable using reserved name: namespace 19:48:56.634 - INFO: 19:48:56.634 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:48:57.598 - INFO: 19:48:57.598 - INFO: TASK [Gathering Facts] ********************************************************* 19:48:57.598 - INFO: ok: [localhost] 19:48:57.618 - INFO: 19:48:57.619 - INFO: TASK [include_vars] ************************************************************ 19:48:57.619 - INFO: ok: [localhost] 19:48:59.476 - INFO: 19:48:59.476 - INFO: TASK [ocp-longpvcname : Remove namespace ocp-longpvcname] ********************** 19:48:59.476 - INFO: ok: [localhost] 19:49:01.379 - INFO: 19:49:01.380 - INFO: TASK [Remove Namespace ocp-longpvcname] **************************************** 19:49:01.380 - INFO: ok: [localhost] 19:49:01.393 - INFO: 19:49:01.393 - INFO: PLAY RECAP ********************************************************************* 19:49:01.394 - INFO: localhost : ok=4  changed=0 unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:49:01.553 - DEBUG: Removed private data directory: /tmp/tmprur9uphr 19:49:01.554 - INFO: Removing app [ocp-longpvcname] in namespace [ocp-longpvcname] from cluster [source-cluster] 19:49:01.554 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:49:02.174 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:49:02.178 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. 
Note that 19:49:02.178 - INFO: the implicit localhost does not match 'all' 19:49:02.282 - INFO: [WARNING]: Found variable using reserved name: namespace 19:49:02.282 - INFO: 19:49:02.282 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:49:03.246 - INFO: 19:49:03.247 - INFO: TASK [Gathering Facts] ********************************************************* 19:49:03.247 - INFO: ok: [localhost] 19:49:03.267 - INFO: 19:49:03.267 - INFO: TASK [include_vars] ************************************************************ 19:49:03.267 - INFO: ok: [localhost] 19:49:05.095 - INFO: 19:49:05.095 - INFO: TASK [ocp-longpvcname : Remove namespace ocp-longpvcname] ********************** 19:49:05.096 - INFO: ok: [localhost] 19:49:07.077 - INFO: 19:49:07.078 - INFO: TASK [Remove Namespace ocp-longpvcname] **************************************** 19:49:07.078 - INFO: ok: [localhost] 19:49:07.090 - INFO: 19:49:07.091 - INFO: PLAY RECAP ********************************************************************* 19:49:07.091 - INFO: localhost : ok=4  changed=0 unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:49:07.249 - DEBUG: Removed private data directory: /tmp/tmpo0zum77j 19:49:07.250 - INFO: Deploying app [ocp-longpvcname] in namespace [ocp-longpvcname] in cluster [source-cluster] 19:49:07.250 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/deploy.yml 19:49:07.859 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:49:07.863 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 19:49:07.863 - INFO: the implicit localhost does not match 'all' 19:49:07.967 - INFO: [WARNING]: Found variable using reserved name: namespace 19:49:07.967 - INFO: 19:49:07.967 - INFO: PLAY [Deploy Application] ****************************************************** 19:49:08.902 - INFO: 19:49:08.902 - INFO: TASK [Gathering Facts] ********************************************************* 19:49:08.902 - INFO: ok: [localhost] 19:49:08.923 - INFO: 19:49:08.923 - INFO: TASK [include_vars] ************************************************************ 19:49:08.923 - INFO: ok: [localhost] 19:49:10.728 - INFO: 19:49:10.728 - INFO: TASK [ocp-longpvcname : Check namespace] *************************************** 19:49:10.728 - INFO: ok: [localhost] 19:49:11.376 - INFO: 19:49:11.376 - INFO: TASK [ocp-longpvcname : Create namespace] ************************************** 19:49:11.376 - INFO: changed: [localhost] 19:49:13.292 - INFO: 19:49:13.292 - INFO: TASK [ocp-longpvcname : Create the longpvcname application resources] ********** 19:49:13.292 - INFO: fatal: [localhost]: FAILED! 
=> {"changed": false, "error": 422, "msg": "Failed to create object: b'{\"kind\":\"Status\",\"apiVersion\":\"v1\",\"metadata\":{},\"status\":\"Failure\",\"message\":\"Deployment.apps \\\\\"longpvc-test\\\\\" is invalid: spec.template.spec.containers[0].restartPolicy: Forbidden: may not be set for non-init containers\",\"reason\":\"Invalid\",\"details\":{\"name\":\"longpvc-test\",\"group\":\"apps\",\"kind\":\"Deployment\",\"causes\":[{\"reason\":\"FieldValueForbidden\",\"message\":\"Forbidden: may not be set for non-init containers\",\"field\":\"spec.template.spec.containers[0].restartPolicy\"}]},\"code\":422}\\n'", "reason": "Unprocessable Entity", "status": 422} 19:49:13.293 - INFO: 19:49:13.293 - INFO: PLAY RECAP ********************************************************************* 19:49:13.293 - INFO: localhost : ok=4  changed=1  unreachable=0 failed=1  skipped=1  rescued=0 ignored=0 19:49:13.449 - DEBUG: Removed private data directory: /tmp/tmp_rnvkfnp 19:49:13.534 - INFO: Migplan test-interop-migplan has been created 19:49:13.535 - INFO: Migplan IDIM: False 19:49:13.535 - INFO: Migplan IDVM: False 19:49:13.595 - INFO: Waiting for Ready status... 19:49:23.612 - INFO: Waiting for Ready status... 19:49:33.627 - INFO: The migration plan is Ready. ------------------------------ live log logreport ------------------------------ 19:49:33.628 - INFO: -------------------------------- live log call --------------------------------- 19:49:33.629 - INFO: Migrating from ns:ocp-longpvcname in cluster:source-cluster to ns:ocp-longpvcname in cluster:host 19:49:33.629 - INFO: Migplan test-interop-migplan. Wait until ready 19:49:33.644 - INFO: The migration plan is Ready. 19:49:33.644 - INFO: MIGPLAN READY 19:49:33.644 - INFO: NO WARNINGS IN MIGPLAN 19:49:33.644 - INFO: EXECUTE MIGRATION 19:49:33.697 - INFO: Not started. Waiting... 19:49:43.712 - INFO: Step: 18/49. Waiting... 19:49:53.728 - INFO: Step: 41/49. Waiting... 19:50:03.745 - INFO: Step: 41/49. Waiting... 19:50:13.760 - INFO: Step: 41/49. Waiting... 19:50:23.819 - INFO: Step: 41/49. Waiting... 19:50:33.834 - INFO: Finished. 19:50:33.834 - INFO: VALIDATE APPLICATION 19:50:33.834 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/validate.yml 19:50:34.431 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:50:34.434 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 19:50:34.435 - INFO: the implicit localhost does not match 'all' 19:50:34.535 - INFO: [WARNING]: Found variable using reserved name: namespace 19:50:34.535 - INFO: 19:50:34.535 - INFO: PLAY [Validate application] **************************************************** 19:50:35.480 - INFO: 19:50:35.480 - INFO: TASK [Gathering Facts] ********************************************************* 19:50:35.480 - INFO: ok: [localhost] 19:50:35.501 - INFO: 19:50:35.501 - INFO: TASK [include_vars] ************************************************************ 19:50:35.501 - INFO: ok: [localhost] 19:50:37.616 - INFO: FAILED - RETRYING: Check pod status (40 retries left). 19:50:44.332 - INFO: FAILED - RETRYING: Check pod status (39 retries left). 19:50:51.020 - INFO: FAILED - RETRYING: Check pod status (38 retries left). 19:50:57.704 - INFO: FAILED - RETRYING: Check pod status (37 retries left). 19:51:04.404 - INFO: FAILED - RETRYING: Check pod status (36 retries left). 19:51:11.116 - INFO: FAILED - RETRYING: Check pod status (35 retries left). 
19:51:17.820 - INFO: FAILED - RETRYING: Check pod status (34 retries left). 19:51:24.476 - INFO: FAILED - RETRYING: Check pod status (33 retries left). 19:51:31.129 - INFO: FAILED - RETRYING: Check pod status (32 retries left). 19:51:37.789 - INFO: FAILED - RETRYING: Check pod status (31 retries left). 19:51:44.488 - INFO: FAILED - RETRYING: Check pod status (30 retries left). 19:51:51.206 - INFO: FAILED - RETRYING: Check pod status (29 retries left). 19:51:57.875 - INFO: FAILED - RETRYING: Check pod status (28 retries left). 19:52:04.490 - INFO: FAILED - RETRYING: Check pod status (27 retries left). 19:52:11.154 - INFO: FAILED - RETRYING: Check pod status (26 retries left). 19:52:17.793 - INFO: FAILED - RETRYING: Check pod status (25 retries left). 19:52:24.422 - INFO: FAILED - RETRYING: Check pod status (24 retries left). 19:52:31.100 - INFO: FAILED - RETRYING: Check pod status (23 retries left). 19:52:37.784 - INFO: FAILED - RETRYING: Check pod status (22 retries left). 19:52:44.378 - INFO: FAILED - RETRYING: Check pod status (21 retries left). 19:52:51.034 - INFO: FAILED - RETRYING: Check pod status (20 retries left). 19:52:57.633 - INFO: FAILED - RETRYING: Check pod status (19 retries left). 19:53:04.272 - INFO: FAILED - RETRYING: Check pod status (18 retries left). 19:53:10.858 - INFO: FAILED - RETRYING: Check pod status (17 retries left). 19:53:17.433 - INFO: FAILED - RETRYING: Check pod status (16 retries left). 19:53:24.071 - INFO: FAILED - RETRYING: Check pod status (15 retries left). 19:53:30.638 - INFO: FAILED - RETRYING: Check pod status (14 retries left). 19:53:37.206 - INFO: FAILED - RETRYING: Check pod status (13 retries left). 19:53:43.795 - INFO: FAILED - RETRYING: Check pod status (12 retries left). 19:53:50.409 - INFO: FAILED - RETRYING: Check pod status (11 retries left). 19:53:57.067 - INFO: FAILED - RETRYING: Check pod status (10 retries left). 19:54:03.727 - INFO: FAILED - RETRYING: Check pod status (9 retries left). 19:54:10.404 - INFO: FAILED - RETRYING: Check pod status (8 retries left). 19:54:17.056 - INFO: FAILED - RETRYING: Check pod status (7 retries left). 19:54:23.681 - INFO: FAILED - RETRYING: Check pod status (6 retries left). 19:54:30.347 - INFO: FAILED - RETRYING: Check pod status (5 retries left). 19:54:36.950 - INFO: FAILED - RETRYING: Check pod status (4 retries left). 19:54:43.577 - INFO: FAILED - RETRYING: Check pod status (3 retries left). 19:54:50.186 - INFO: FAILED - RETRYING: Check pod status (2 retries left). 19:54:56.754 - INFO: FAILED - RETRYING: Check pod status (1 retries left). 19:55:03.331 - INFO: 19:55:03.331 - INFO: TASK [ocp-longpvcname : Check pod status] ************************************** 19:55:03.332 - INFO: fatal: [localhost]: FAILED! 
=> {"api_found": true, "attempts": 40, "changed": false, "resources": []} 19:55:03.333 - INFO: 19:55:03.333 - INFO: PLAY RECAP ********************************************************************* 19:55:03.333 - INFO: localhost : ok=2  changed=0 unreachable=0 failed=1  skipped=7  rescued=0 ignored=0 19:55:03.487 - DEBUG: Removed private data directory: /tmp/tmp6n4ci2u2 FAILED------------------------------ live log logreport ------------------------------ 19:55:03.496 - INFO: mtc = migplan = apps = [] migrated_namespaces = 'ocp-longpvcname:ocp-longpvcname' src_cluster = tgt_cluster = @pytest.mark.app(app_id="ocp-longpvcname", app_namespace="ocp-longpvcname") @pytest.mark.migrated_namespaces("ocp-longpvcname:ocp-longpvcname") def test_mtc_147_interop(mtc, migplan, apps, migrated_namespaces, src_cluster, tgt_cluster): tgt_namespace = get_target_namespace_from_namespace(migrated_namespaces) src_namespace = get_source_namespace_from_namespace(migrated_namespaces) app = apps[0] logger.info( "Migrating from ns:{0} in cluster:{1} to ns:{2} in cluster:{3}".format( src_namespace, src_cluster.name, tgt_namespace, tgt_cluster.name ) ) logger.info("Migplan {0}. Wait until ready".format(migplan.name)) ready = migplan.wait_until_ready() # Assert that plan is in Ready status assert ready, "Migplan must be ready:{0}".format(pretty(migplan.definition)) logger.info("MIGPLAN READY") # Assert that there are no warnings in the migplan assert len(migplan.get_warnings()) == 0, "There should be no warnings {0}".format(pretty(migplan.definition)) logger.info("NO WARNINGS IN MIGPLAN") logger.info("EXECUTE MIGRATION") migmigration = migplan.migrate(quiesce=True) success = migmigration.wait_until_success(timeout=600) assert success, "The migration should succeed. {0}".format(pretty(migmigration.definition)) migrated_app = get_migrated_app(app, tgt_cluster, tgt_namespace) logger.info("VALIDATE APPLICATION") ok = migrated_app.validate() > assert ok, "The application should be validated OK in the target cluster" E AssertionError: The application should be validated OK in the target cluster E assert False mtc-e2e-qev2/mtc_tests/tests/test_interop.py:363: AssertionError ------------------------------ live log teardown ------------------------------- 19:55:03.497 - INFO: Deleting Migplan test-interop-migplan... 19:55:13.636 - INFO: Removing app in namespace [ocp-longpvcname] from cluster [host] 19:55:13.636 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:55:14.234 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:55:14.237 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. 
Note that 19:55:14.237 - INFO: the implicit localhost does not match 'all' 19:55:14.341 - INFO: [WARNING]: Found variable using reserved name: namespace 19:55:14.341 - INFO: 19:55:14.341 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:55:15.266 - INFO: 19:55:15.266 - INFO: TASK [Gathering Facts] ********************************************************* 19:55:15.266 - INFO: ok: [localhost] 19:55:15.286 - INFO: 19:55:15.287 - INFO: TASK [include_vars] ************************************************************ 19:55:15.287 - INFO: ok: [localhost] 19:55:27.072 - INFO: 19:55:27.072 - INFO: TASK [ocp-longpvcname : Remove namespace ocp-longpvcname] ********************** 19:55:27.072 - INFO: changed: [localhost] 19:55:28.988 - INFO: 19:55:28.989 - INFO: TASK [Remove Namespace ocp-longpvcname] **************************************** 19:55:28.989 - INFO: ok: [localhost] 19:55:29.001 - INFO: 19:55:29.002 - INFO: PLAY RECAP ********************************************************************* 19:55:29.002 - INFO: localhost : ok=4  changed=1  unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:55:29.156 - DEBUG: Removed private data directory: /tmp/tmpifynvvlc 19:55:29.157 - INFO: Removing app in namespace [ocp-longpvcname] from cluster [source-cluster] 19:55:29.157 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:55:29.754 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:55:29.757 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 19:55:29.757 - INFO: the implicit localhost does not match 'all' 19:55:29.858 - INFO: [WARNING]: Found variable using reserved name: namespace 19:55:29.859 - INFO: 19:55:29.859 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:55:30.785 - INFO: 19:55:30.786 - INFO: TASK [Gathering Facts] ********************************************************* 19:55:30.786 - INFO: ok: [localhost] 19:55:30.806 - INFO: 19:55:30.806 - INFO: TASK [include_vars] ************************************************************ 19:55:30.806 - INFO: ok: [localhost] 19:55:47.619 - INFO: 19:55:47.619 - INFO: TASK [ocp-longpvcname : Remove namespace ocp-longpvcname] ********************** 19:55:47.619 - INFO: changed: [localhost] 19:55:49.498 - INFO: 19:55:49.498 - INFO: TASK [Remove Namespace ocp-longpvcname] **************************************** 19:55:49.498 - INFO: ok: [localhost] 19:55:49.511 - INFO: 19:55:49.511 - INFO: PLAY RECAP ********************************************************************* 19:55:49.511 - INFO: localhost : ok=4  changed=1  unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:55:49.662 - DEBUG: Removed private data directory: /tmp/tmp4g666zub 19:55:49.663 - INFO: Removing namespace fixture: ocp-longpvcname from cluster source-cluster 19:55:49.705 - INFO: Removing namespace fixture: ocp-longpvcname from cluster host ------------------------------ live log logreport ------------------------------ 19:55:49.748 - INFO: mtc-e2e-qev2/mtc_tests/tests/test_interop.py::test_mtc_172_interop -------------------------------- live log setup -------------------------------- 19:55:53.410 - INFO: Removing namespace fixture: ocp-41820-configmap-source from cluster source-cluster 19:55:54.075 - INFO: Removing namespace fixture: ocp-41820-configmap-target from cluster host 19:55:54.763 - INFO: Removing namespace fixture: ocp-41820-django-source from cluster 
source-cluster 19:55:54.777 - INFO: Removing namespace fixture: ocp-41820-django-target from cluster host 19:55:54.790 - INFO: Removing namespace fixture: ocp-41820-nginx-source from cluster source-cluster 19:55:54.805 - INFO: Removing namespace fixture: ocp-41820-nginx-target from cluster host 19:55:54.818 - INFO: Removing app [nginx-j2] in namespace [ocp-41820-nginx-source] from cluster [host] 19:55:54.831 - INFO: Waiting for resources to be deleted 19:55:54.845 - INFO: Removing app [nginx-j2] in namespace [ocp-41820-nginx-source] from cluster [source-cluster] 19:55:54.859 - INFO: Waiting for resources to be deleted 19:55:54.874 - INFO: Deploying app [nginx-j2] in namespace [ocp-41820-nginx-source] in cluster [source-cluster] 19:55:54.888 - DEBUG: Running always new-project with --skip-config-write=true flag to avoid overriding the configuration 19:55:54.888 - DEBUG: Running always with --insecure-skip-tls-verify flag to avoid "Unable to connect to the server: x509: certificate signed by unknown authority" 19:55:55.075 - INFO: Deployed namespace ocp-41820-nginx-source in host https://api.mtc-source-4bze.cspilp.interop.ccitredhat.com:6443 19:55:55.095 - INFO: Namespace properly created and "Active": ocp-41820-nginx-source 19:55:55.095 - INFO: Deploying nginx application in namespace in cluster https://api.mtc-source-4bze.cspilp.interop.ccitredhat.com:6443 19:55:55.095 - DEBUG: Rendering template: nginxpv/deployment.yml.j2 19:55:55.096 - DEBUG: Using vars: app_name: nginx app_namespace: ocp-41820-nginx-source deployment_api: apps/v1 html_accessmode: ReadWriteOnce logs_accessmode: ReadWriteOnce storage_class: default storage_size: 1Gi 19:56:10.441 - DEBUG: Requesting get: http://my-nginx-ocp-41820-nginx-source.apps.mtc-source-4bze.cspilp.interop.ccitredhat.com 19:56:10.871 - DEBUG: Requesting get: http://my-nginx-ocp-41820-nginx-source.apps.mtc-source-4bze.cspilp.interop.ccitredhat.com 19:56:10.895 - INFO: Validating nginx application in cluster https://api.mtc-source-4bze.cspilp.interop.ccitredhat.com:6443 19:56:11.251 - DEBUG: Validated that 1 errors were reported in the errors log file 19:56:11.649 - DEBUG: Validated that 1 errors and 1 success were reported in the access log file 19:56:11.650 - INFO: Removing app [ocp-django] in namespace [ocp-41820-django-source] from cluster [host] 19:56:11.650 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:56:12.246 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:56:12.249 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. 
Note that 19:56:12.249 - INFO: the implicit localhost does not match 'all' 19:56:12.353 - INFO: [WARNING]: Found variable using reserved name: namespace 19:56:12.354 - INFO: 19:56:12.354 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:56:13.294 - INFO: 19:56:13.294 - INFO: TASK [Gathering Facts] ********************************************************* 19:56:13.294 - INFO: ok: [localhost] 19:56:13.315 - INFO: 19:56:13.315 - INFO: TASK [include_vars] ************************************************************ 19:56:13.315 - INFO: ok: [localhost] 19:56:15.069 - INFO: 19:56:15.069 - INFO: TASK [ocp-django : Remove namespace ocp-41820-django-source] ******************* 19:56:15.069 - INFO: ok: [localhost] 19:56:16.952 - INFO: 19:56:16.952 - INFO: TASK [Remove Namespace ocp-41820-django-source] ******************************** 19:56:16.952 - INFO: ok: [localhost] 19:56:16.965 - INFO: 19:56:16.965 - INFO: PLAY RECAP ********************************************************************* 19:56:16.965 - INFO: localhost : ok=4  changed=0 unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:56:17.120 - DEBUG: Removed private data directory: /tmp/tmpslgd_azt 19:56:17.121 - INFO: Removing app [ocp-django] in namespace [ocp-41820-django-source] from cluster [source-cluster] 19:56:17.121 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:56:17.711 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:56:17.714 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 19:56:17.714 - INFO: the implicit localhost does not match 'all' 19:56:17.815 - INFO: [WARNING]: Found variable using reserved name: namespace 19:56:17.816 - INFO: 19:56:17.816 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:56:18.757 - INFO: 19:56:18.758 - INFO: TASK [Gathering Facts] ********************************************************* 19:56:18.758 - INFO: ok: [localhost] 19:56:18.778 - INFO: 19:56:18.779 - INFO: TASK [include_vars] ************************************************************ 19:56:18.779 - INFO: ok: [localhost] 19:56:20.509 - INFO: 19:56:20.509 - INFO: TASK [ocp-django : Remove namespace ocp-41820-django-source] ******************* 19:56:20.509 - INFO: ok: [localhost] 19:56:22.315 - INFO: 19:56:22.316 - INFO: TASK [Remove Namespace ocp-41820-django-source] ******************************** 19:56:22.316 - INFO: ok: [localhost] 19:56:22.329 - INFO: 19:56:22.329 - INFO: PLAY RECAP ********************************************************************* 19:56:22.329 - INFO: localhost : ok=4  changed=0 unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:56:22.487 - DEBUG: Removed private data directory: /tmp/tmp_dx48zl9 19:56:22.488 - INFO: Deploying app [ocp-django] in namespace [ocp-41820-django-source] in cluster [source-cluster] 19:56:22.488 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/deploy.yml 19:56:23.095 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:56:23.099 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. 
Note that 19:56:23.099 - INFO: the implicit localhost does not match 'all' 19:56:23.200 - INFO: [WARNING]: Found variable using reserved name: namespace 19:56:23.200 - INFO: 19:56:23.200 - INFO: PLAY [Deploy Application] ****************************************************** 19:56:24.127 - INFO: 19:56:24.127 - INFO: TASK [Gathering Facts] ********************************************************* 19:56:24.127 - INFO: ok: [localhost] 19:56:24.148 - INFO: 19:56:24.148 - INFO: TASK [include_vars] ************************************************************ 19:56:24.148 - INFO: ok: [localhost] 19:56:25.924 - INFO: 19:56:25.924 - INFO: TASK [ocp-django : Check namespace] ******************************************** 19:56:25.924 - INFO: ok: [localhost] 19:56:26.550 - INFO: 19:56:26.550 - INFO: TASK [ocp-django : Create namespace] ******************************************* 19:56:26.550 - INFO: changed: [localhost] 19:56:27.669 - INFO: 19:56:27.669 - INFO: TASK [ocp-django : Create the mtc test django psql persistent template] ******** 19:56:27.669 - INFO: changed: [localhost] 19:56:28.477 - INFO: 19:56:28.477 - INFO: TASK [ocp-django : Create openshift django psql persisten application from openshift templates] *** 19:56:28.477 - INFO: ok: [localhost] 19:56:30.092 - INFO: FAILED - RETRYING: Check postgresql pod status (60 retries left). 19:56:36.672 - INFO: FAILED - RETRYING: Check postgresql pod status (59 retries left). 19:56:43.263 - INFO: FAILED - RETRYING: Check postgresql pod status (58 retries left). 19:56:49.845 - INFO: FAILED - RETRYING: Check postgresql pod status (57 retries left). 19:56:56.412 - INFO: FAILED - RETRYING: Check postgresql pod status (56 retries left). 19:57:02.987 - INFO: 19:57:02.987 - INFO: TASK [ocp-django : Check postgresql pod status] ******************************** 19:57:02.987 - INFO: ok: [localhost] 19:57:04.593 - INFO: FAILED - RETRYING: Check application pod status (60 retries left). 19:57:11.152 - INFO: FAILED - RETRYING: Check application pod status (59 retries left). 19:57:17.734 - INFO: FAILED - RETRYING: Check application pod status (58 retries left). 19:57:24.327 - INFO: FAILED - RETRYING: Check application pod status (57 retries left). 19:57:30.912 - INFO: FAILED - RETRYING: Check application pod status (56 retries left). 19:57:37.543 - INFO: FAILED - RETRYING: Check application pod status (55 retries left). 
19:57:44.162 - INFO: 19:57:44.162 - INFO: TASK [ocp-django : Check application pod status] ******************************* 19:57:44.162 - INFO: ok: [localhost] 19:57:45.820 - INFO: 19:57:45.820 - INFO: TASK [ocp-django : Get route] ************************************************** 19:57:45.820 - INFO: ok: [localhost] 19:57:46.563 - INFO: 19:57:46.563 - INFO: TASK [ocp-django : Access the html file] *************************************** 19:57:46.563 - INFO: ok: [localhost] => (item=1) 19:57:47.147 - INFO: ok: [localhost] => (item=2) 19:57:47.734 - INFO: ok: [localhost] => (item=3) 19:57:48.309 - INFO: ok: [localhost] => (item=4) 19:57:48.885 - INFO: ok: [localhost] => (item=5) 19:57:48.907 - INFO: 19:57:48.907 - INFO: PLAY RECAP ********************************************************************* 19:57:48.907 - INFO: localhost : ok=10  changed=2  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 19:57:49.064 - DEBUG: Removed private data directory: /tmp/tmp2m410kig 19:57:49.065 - INFO: Removing app [ocp-configmap] in namespace [ocp-41820-configmap-source] from cluster [host] 19:57:49.066 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:57:49.662 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:57:49.665 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 19:57:49.665 - INFO: the implicit localhost does not match 'all' 19:57:49.766 - INFO: [WARNING]: Found variable using reserved name: namespace 19:57:49.766 - INFO: 19:57:49.766 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:57:50.696 - INFO: 19:57:50.697 - INFO: TASK [Gathering Facts] ********************************************************* 19:57:50.697 - INFO: ok: [localhost] 19:57:50.718 - INFO: 19:57:50.718 - INFO: TASK [include_vars] ************************************************************ 19:57:50.718 - INFO: ok: [localhost] 19:57:52.467 - INFO: 19:57:52.468 - INFO: TASK [ocp-configmap : Remove namespace ocp-41820-configmap-source] ************* 19:57:52.468 - INFO: ok: [localhost] 19:57:54.304 - INFO: 19:57:54.304 - INFO: TASK [Remove Namespace ocp-41820-configmap-source] ***************************** 19:57:54.304 - INFO: ok: [localhost] 19:57:54.317 - INFO: 19:57:54.317 - INFO: PLAY RECAP ********************************************************************* 19:57:54.317 - INFO: localhost : ok=4  changed=0 unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:57:54.469 - DEBUG: Removed private data directory: /tmp/tmpxlp06ee3 19:57:54.469 - INFO: Removing app [ocp-configmap] in namespace [ocp-41820-configmap-source] from cluster [source-cluster] 19:57:54.470 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:57:55.061 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:57:55.064 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. 
Note that 19:57:55.064 - INFO: the implicit localhost does not match 'all' 19:57:55.167 - INFO: [WARNING]: Found variable using reserved name: namespace 19:57:55.167 - INFO: 19:57:55.167 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:57:56.096 - INFO: 19:57:56.096 - INFO: TASK [Gathering Facts] ********************************************************* 19:57:56.096 - INFO: ok: [localhost] 19:57:56.116 - INFO: 19:57:56.116 - INFO: TASK [include_vars] ************************************************************ 19:57:56.116 - INFO: ok: [localhost] 19:57:57.886 - INFO: 19:57:57.886 - INFO: TASK [ocp-configmap : Remove namespace ocp-41820-configmap-source] ************* 19:57:57.887 - INFO: ok: [localhost] 19:57:59.757 - INFO: 19:57:59.758 - INFO: TASK [Remove Namespace ocp-41820-configmap-source] ***************************** 19:57:59.758 - INFO: ok: [localhost] 19:57:59.771 - INFO: 19:57:59.771 - INFO: PLAY RECAP ********************************************************************* 19:57:59.771 - INFO: localhost : ok=4  changed=0 unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:57:59.925 - DEBUG: Removed private data directory: /tmp/tmpsaicwhyp 19:57:59.926 - INFO: Deploying app [ocp-configmap] in namespace [ocp-41820-configmap-source] in cluster [source-cluster] 19:57:59.926 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/deploy.yml 19:58:00.513 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:58:00.517 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 19:58:00.517 - INFO: the implicit localhost does not match 'all' 19:58:00.617 - INFO: [WARNING]: Found variable using reserved name: namespace 19:58:00.617 - INFO: 19:58:00.617 - INFO: PLAY [Deploy Application] ****************************************************** 19:58:01.549 - INFO: 19:58:01.550 - INFO: TASK [Gathering Facts] ********************************************************* 19:58:01.550 - INFO: ok: [localhost] 19:58:01.570 - INFO: 19:58:01.570 - INFO: TASK [include_vars] ************************************************************ 19:58:01.570 - INFO: ok: [localhost] 19:58:03.381 - INFO: 19:58:03.381 - INFO: TASK [ocp-configmap : Check namespace] ***************************************** 19:58:03.381 - INFO: ok: [localhost] 19:58:04.029 - INFO: 19:58:04.029 - INFO: TASK [ocp-configmap : Create namespace] **************************************** 19:58:04.029 - INFO: changed: [localhost] 19:58:05.848 - INFO: 19:58:05.848 - INFO: TASK [ocp-configmap : Deploy configuration map] ******************************** 19:58:05.848 - INFO: changed: [localhost] 19:58:07.513 - INFO: 19:58:07.513 - INFO: TASK [ocp-configmap : Deploy redis appliation] ********************************* 19:58:07.513 - INFO: changed: [localhost] 19:58:07.632 - INFO: 19:58:07.632 - INFO: PLAY RECAP ********************************************************************* 19:58:07.632 - INFO: localhost : ok=6  changed=3  unreachable=0 failed=0 skipped=5  rescued=0 ignored=0 19:58:07.788 - DEBUG: Removed private data directory: /tmp/tmpjj_4jzjt 19:58:07.914 - INFO: Migplan test-interop-migplan has been created 19:58:07.915 - INFO: Migplan IDIM: False 19:58:07.915 - INFO: Migplan IDVM: False 19:58:08.003 - INFO: Waiting for Ready status... 19:58:18.017 - INFO: Waiting for Ready status... 19:58:28.031 - INFO: The migration plan is Ready. 
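The repeated "Waiting for Ready status..." lines come from the test harness polling the MigPlan custom resource until its Ready condition is reported. A minimal sketch of that polling pattern, assuming the kubernetes Python client and the usual MTC CRD coordinates (group migration.openshift.io, version v1alpha1, plural migplans); the function name and defaults are illustrative, not the actual mtc-python-client API:

import time
from kubernetes import client, config

def wait_until_migplan_ready(name, namespace="openshift-migration", timeout=300, interval=10):
    """Poll the MigPlan CR until a Ready condition appears or the timeout expires."""
    config.load_kube_config()  # relies on the exported KUBECONFIG for the host cluster
    api = client.CustomObjectsApi()
    deadline = time.time() + timeout
    while time.time() < deadline:
        plan = api.get_namespaced_custom_object(
            group="migration.openshift.io", version="v1alpha1",
            namespace=namespace, plural="migplans", name=name)
        conditions = plan.get("status", {}).get("conditions", [])
        if any(c.get("type") == "Ready" and c.get("status") == "True" for c in conditions):
            return True
        print("Waiting for Ready status...")
        time.sleep(interval)
    return False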
------------------------------ live log logreport ------------------------------ 19:58:28.032 - INFO: -------------------------------- live log call --------------------------------- 19:58:28.032 - INFO: Migrating from ns:ocp-41820-nginx-source in cluster:source-cluster to ns:ocp-41820-nginx-target in cluster:host 19:58:28.032 - INFO: Migrating from ns:ocp-41820-django-source in cluster:source-cluster to ns:ocp-41820-django-target in cluster:host 19:58:28.032 - INFO: Migrating from ns:ocp-41820-configmap-source in cluster:source-cluster to ns:ocp-41820-configmap-target in cluster:host 19:58:28.032 - INFO: Migplan test-interop-migplan. Wait until ready 19:58:28.046 - INFO: The migration plan is Ready. 19:58:28.046 - INFO: MIGPLAN READY 19:58:28.046 - INFO: NO WARNINGS IN MIGPLAN 19:58:28.046 - INFO: EXECUTE MIGRATION 19:58:28.093 - INFO: Not started. Waiting... 19:58:38.106 - INFO: Step: 18/49. Waiting... 19:58:48.121 - INFO: Step: 18/49. Waiting... 19:58:58.135 - INFO: Step: 38/49. Waiting... 19:59:08.149 - INFO: Step: 38/49. Waiting... 19:59:18.162 - INFO: Step: 38/49. Waiting... 19:59:28.178 - INFO: Step: 38/49. Waiting... 19:59:38.192 - INFO: Step: 38/49. Waiting... 19:59:48.206 - INFO: Step: 43/49. Waiting... 19:59:58.221 - INFO: Finished. 19:59:58.221 - INFO: VALIDATE APPLICATION 19:59:58.221 - INFO: Validating migrated nginx application in cluster https://api.mtc-target-nzvv.cspilp.interop.ccitredhat.com:6443 20:00:07.334 - DEBUG: Requesting get: http://my-nginx-ocp-41820-nginx-target.apps.mtc-target-nzvv.cspilp.interop.ccitredhat.com 20:00:07.790 - DEBUG: Validated that 1 errors were reported in the errors log file 20:00:08.193 - DEBUG: Validated that 1 errors and 2 success were reported in the access log file 20:00:08.193 - INFO: VALIDATE APPLICATION 20:00:08.193 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/validate.yml 20:00:08.786 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 20:00:08.789 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 20:00:08.790 - INFO: the implicit localhost does not match 'all' 20:00:08.888 - INFO: [WARNING]: Found variable using reserved name: namespace 20:00:08.889 - INFO: 20:00:08.889 - INFO: PLAY [Validate application] **************************************************** 20:00:09.810 - INFO: 20:00:09.811 - INFO: TASK [Gathering Facts] ********************************************************* 20:00:09.811 - INFO: ok: [localhost] 20:00:09.831 - INFO: 20:00:09.831 - INFO: TASK [include_vars] ************************************************************ 20:00:09.831 - INFO: ok: [localhost] 20:00:11.782 - INFO: FAILED - RETRYING: Check postgresql pod status (60 retries left). 20:00:18.448 - INFO: 20:00:18.448 - INFO: TASK [ocp-django : Check postgresql pod status] ******************************** 20:00:18.448 - INFO: ok: [localhost] 20:00:20.123 - INFO: FAILED - RETRYING: Check application pod status (60 retries left). 20:00:26.741 - INFO: FAILED - RETRYING: Check application pod status (59 retries left). 
20:00:33.386 - INFO: 20:00:33.386 - INFO: TASK [ocp-django : Check application pod status] ******************************* 20:00:33.386 - INFO: ok: [localhost] 20:00:35.068 - INFO: 20:00:35.068 - INFO: TASK [ocp-django : Get route] ************************************************** 20:00:35.068 - INFO: ok: [localhost] 20:00:35.797 - INFO: 20:00:35.798 - INFO: TASK [ocp-django : Access the html file] *************************************** 20:00:35.798 - INFO: ok: [localhost] => (item=1) 20:00:36.378 - INFO: ok: [localhost] => (item=2) 20:00:36.945 - INFO: ok: [localhost] => (item=3) 20:00:37.529 - INFO: ok: [localhost] => (item=4) 20:00:38.099 - INFO: ok: [localhost] => (item=5) 20:00:38.121 - INFO: 20:00:38.121 - INFO: PLAY RECAP ********************************************************************* 20:00:38.121 - INFO: localhost : ok=6  changed=0 unreachable=0 failed=0 skipped=5  rescued=0 ignored=0 20:00:38.279 - DEBUG: Removed private data directory: /tmp/tmp8n0eyp6b 20:00:38.279 - INFO: VALIDATE APPLICATION 20:00:38.280 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/validate.yml 20:00:38.869 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 20:00:38.872 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 20:00:38.873 - INFO: the implicit localhost does not match 'all' 20:00:38.971 - INFO: [WARNING]: Found variable using reserved name: namespace 20:00:38.971 - INFO: 20:00:38.972 - INFO: PLAY [Validate application] **************************************************** 20:00:39.906 - INFO: 20:00:39.906 - INFO: TASK [Gathering Facts] ********************************************************* 20:00:39.907 - INFO: ok: [localhost] 20:00:39.927 - INFO: 20:00:39.927 - INFO: TASK [include_vars] ************************************************************ 20:00:39.927 - INFO: ok: [localhost] 20:00:41.854 - INFO: 20:00:41.854 - INFO: TASK [ocp-configmap : Check config map] **************************************** 20:00:41.854 - INFO: ok: [localhost] 20:00:43.590 - INFO: 20:00:43.590 - INFO: TASK [ocp-configmap : Check redis app] ***************************************** 20:00:43.590 - INFO: ok: [localhost] 20:00:44.524 - INFO: 20:00:44.524 - INFO: TASK [ocp-configmap : Verify max memory configuration] ************************* 20:00:44.524 - INFO: changed: [localhost] 20:00:45.423 - INFO: 20:00:45.424 - INFO: TASK [ocp-configmap : Verify memory policy configuration] ********************** 20:00:45.424 - INFO: changed: [localhost] 20:00:45.442 - INFO: 20:00:45.442 - INFO: PLAY RECAP ********************************************************************* 20:00:45.442 - INFO: localhost : ok=6  changed=2  unreachable=0 failed=0 skipped=5  rescued=0 ignored=0 20:00:45.594 - DEBUG: Removed private data directory: /tmp/tmpiqsulpgd PASSED------------------------------ live log logreport ------------------------------ 20:00:45.595 - INFO: ------------------------------ live log teardown ------------------------------- 20:00:45.595 - INFO: Deleting Migplan test-interop-migplan... 
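The django validation above finishes by exercising the migrated application's Route over HTTP ("Get route" followed by five "Access the html file" requests), and the earlier nginx check GETs its route and inspects the access/error logs. A minimal sketch of such a route check, assuming the requests library already installed from requirements.txt; the helper name and URL are placeholders:

import requests

def check_route(url, attempts=5, timeout=10):
    """Return True only if every GET against the migrated app's route answers 200."""
    for item in range(1, attempts + 1):
        resp = requests.get(url, timeout=timeout)
        print("item={0} status={1}".format(item, resp.status_code))
        if resp.status_code != 200:
            return False
    return True

# Example (placeholder host): check_route("http://my-nginx-ocp-41820-nginx-target.apps.<cluster-domain>")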
20:00:55.708 - INFO: Removing app in namespace [ocp-41820-nginx-source] from cluster [host] 20:00:55.725 - INFO: Waiting for resources to be deleted 20:00:55.739 - INFO: Removing app in namespace [ocp-41820-nginx-source] from cluster [source-cluster] 20:00:55.786 - INFO: Removing namespace: ocp-41820-nginx-source 20:00:55.805 - INFO: Waiting for resources to be deleted 20:01:09.022 - INFO: Removing app in namespace [ocp-41820-django-source] from cluster [host] 20:01:09.022 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 20:01:09.622 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 20:01:09.625 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 20:01:09.626 - INFO: the implicit localhost does not match 'all' 20:01:09.731 - INFO: [WARNING]: Found variable using reserved name: namespace 20:01:09.731 - INFO: 20:01:09.731 - INFO: PLAY [Execute Application Cleanup] ********************************************* 20:01:10.667 - INFO: 20:01:10.668 - INFO: TASK [Gathering Facts] ********************************************************* 20:01:10.668 - INFO: ok: [localhost] 20:01:10.688 - INFO: 20:01:10.688 - INFO: TASK [include_vars] ************************************************************ 20:01:10.688 - INFO: ok: [localhost] 20:01:12.476 - INFO: 20:01:12.476 - INFO: TASK [ocp-django : Remove namespace ocp-41820-django-source] ******************* 20:01:12.476 - INFO: ok: [localhost] 20:01:14.270 - INFO: 20:01:14.270 - INFO: TASK [Remove Namespace ocp-41820-django-source] ******************************** 20:01:14.270 - INFO: ok: [localhost] 20:01:14.284 - INFO: 20:01:14.284 - INFO: PLAY RECAP ********************************************************************* 20:01:14.284 - INFO: localhost : ok=4  changed=0 unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 20:01:14.439 - DEBUG: Removed private data directory: /tmp/tmphj4gsuq7 20:01:14.440 - INFO: Removing app in namespace [ocp-41820-django-source] from cluster [source-cluster] 20:01:14.440 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 20:01:15.042 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 20:01:15.045 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. 
Note that 20:01:15.045 - INFO: the implicit localhost does not match 'all' 20:01:15.149 - INFO: [WARNING]: Found variable using reserved name: namespace 20:01:15.149 - INFO: 20:01:15.149 - INFO: PLAY [Execute Application Cleanup] ********************************************* 20:01:16.081 - INFO: 20:01:16.081 - INFO: TASK [Gathering Facts] ********************************************************* 20:01:16.081 - INFO: ok: [localhost] 20:01:16.101 - INFO: 20:01:16.102 - INFO: TASK [include_vars] ************************************************************ 20:01:16.102 - INFO: ok: [localhost] 20:01:32.933 - INFO: 20:01:32.933 - INFO: TASK [ocp-django : Remove namespace ocp-41820-django-source] ******************* 20:01:32.933 - INFO: changed: [localhost] 20:01:34.785 - INFO: 20:01:34.785 - INFO: TASK [Remove Namespace ocp-41820-django-source] ******************************** 20:01:34.785 - INFO: ok: [localhost] 20:01:34.798 - INFO: 20:01:34.798 - INFO: PLAY RECAP ********************************************************************* 20:01:34.798 - INFO: localhost : ok=4  changed=1  unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 20:01:34.947 - DEBUG: Removed private data directory: /tmp/tmpkvbj94fz 20:01:34.948 - INFO: Removing app in namespace [ocp-41820-configmap-source] from cluster [host] 20:01:34.948 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 20:01:35.538 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 20:01:35.541 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 20:01:35.541 - INFO: the implicit localhost does not match 'all' 20:01:35.641 - INFO: [WARNING]: Found variable using reserved name: namespace 20:01:35.642 - INFO: 20:01:35.642 - INFO: PLAY [Execute Application Cleanup] ********************************************* 20:01:36.574 - INFO: 20:01:36.575 - INFO: TASK [Gathering Facts] ********************************************************* 20:01:36.575 - INFO: ok: [localhost] 20:01:36.595 - INFO: 20:01:36.595 - INFO: TASK [include_vars] ************************************************************ 20:01:36.595 - INFO: ok: [localhost] 20:01:38.337 - INFO: 20:01:38.337 - INFO: TASK [ocp-configmap : Remove namespace ocp-41820-configmap-source] ************* 20:01:38.337 - INFO: ok: [localhost] 20:01:40.144 - INFO: 20:01:40.144 - INFO: TASK [Remove Namespace ocp-41820-configmap-source] ***************************** 20:01:40.144 - INFO: ok: [localhost] 20:01:40.157 - INFO: 20:01:40.158 - INFO: PLAY RECAP ********************************************************************* 20:01:40.158 - INFO: localhost : ok=4  changed=0 unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 20:01:40.312 - DEBUG: Removed private data directory: /tmp/tmpuq6_uqe4 20:01:40.313 - INFO: Removing app in namespace [ocp-41820-configmap-source] from cluster [source-cluster] 20:01:40.313 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 20:01:40.910 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 20:01:40.913 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. 
Note that 20:01:40.914 - INFO: the implicit localhost does not match 'all' 20:01:41.014 - INFO: [WARNING]: Found variable using reserved name: namespace 20:01:41.015 - INFO: 20:01:41.015 - INFO: PLAY [Execute Application Cleanup] ********************************************* 20:01:41.938 - INFO: 20:01:41.939 - INFO: TASK [Gathering Facts] ********************************************************* 20:01:41.939 - INFO: ok: [localhost] 20:01:41.959 - INFO: 20:01:41.959 - INFO: TASK [include_vars] ************************************************************ 20:01:41.959 - INFO: ok: [localhost] 20:01:53.759 - INFO: 20:01:53.759 - INFO: TASK [ocp-configmap : Remove namespace ocp-41820-configmap-source] ************* 20:01:53.760 - INFO: changed: [localhost] 20:01:55.618 - INFO: 20:01:55.619 - INFO: TASK [Remove Namespace ocp-41820-configmap-source] ***************************** 20:01:55.619 - INFO: ok: [localhost] 20:01:55.631 - INFO: 20:01:55.631 - INFO: PLAY RECAP ********************************************************************* 20:01:55.631 - INFO: localhost : ok=4  changed=1  unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 20:01:55.786 - DEBUG: Removed private data directory: /tmp/tmpy849al34 20:01:55.787 - INFO: Removing namespace fixture: ocp-41820-configmap-source from cluster source-cluster 20:01:55.803 - INFO: Removing namespace fixture: ocp-41820-configmap-target from cluster host 20:01:55.837 - INFO: Waiting for namespace fixture to be deleted 20:02:02.957 - INFO: Removing namespace fixture: ocp-41820-django-source from cluster source-cluster 20:02:02.974 - INFO: Removing namespace fixture: ocp-41820-django-target from cluster host 20:02:03.006 - INFO: Waiting for namespace fixture to be deleted 20:02:15.204 - INFO: Removing namespace fixture: ocp-41820-nginx-source from cluster source-cluster 20:02:15.221 - INFO: Removing namespace fixture: ocp-41820-nginx-target from cluster host 20:02:15.252 - INFO: Waiting for namespace fixture to be deleted
------------------------------ live log logreport ------------------------------
20:02:27.454 - INFO:
=================================== FAILURES ===================================
_____________________________ test_mtc_87_interop ______________________________

migplan = 
apps = []
migrated_namespaces = 'longnameprojecttest-longnameprojecttest-longnameprojecttest-123:longnameprojecttest-longnameprojecttest-longnameprojecttest-123'
src_cluster = 
tgt_cluster = 
pytestconfig = <_pytest.config.Config object at 0x7f84215e7710>

@pytest.mark.app(
    app_id="ocp-attached-pvc",
    app_namespace="longnameprojecttest-longnameprojecttest-longnameprojecttest-123"
)
@pytest.mark.migrated_namespaces(
    "longnameprojecttest-longnameprojecttest-longnameprojecttest-123:longnameprojecttest-longnameprojecttest-longnameprojecttest-123"
)
def test_mtc_87_interop(migplan, apps, migrated_namespaces, src_cluster, tgt_cluster, pytestconfig):
    tgt_namespace = get_target_namespace_from_namespace(migrated_namespaces)
    src_namespace = get_source_namespace_from_namespace(migrated_namespaces)
    app = apps[0]
    logger.info(
        "Migrating from ns:{0} in cluster:{1} to ns:{2} in cluster:{3}".format(
            src_namespace, src_cluster.name, tgt_namespace, tgt_cluster.name
        )
    )
    logger.info("Migplan {0}. Wait until ready".format(migplan.name))
    ready = migplan.wait_until_ready()
    # Assert that plan is in Ready status
    assert ready, "Migplan must be ready:{0}".format(pretty(migplan.definition))
    logger.info("MIGPLAN READY")
    idvm = pytestconfig.getoption("--idvm")
    if not idvm:
        migplan.refresh()
        has_warnings = migplan.wait_until_warnings()
        assert has_warnings, "There should be a warning {0}".format(pretty(migplan.definition))
        warning_type_expected = "NamespaceLengthExceeded"
        warnings = migplan.get_warnings(warning_type=warning_type_expected)
        assert len(warnings) == 1, "There should be a {1} warning {0}".format(
            pretty(migplan.definition), warning_type_expected
        )
        logger.info("RIGHT WARNINGS IN MIGPLAN: {0}".format(warnings))
    logger.info("EXECUTE MIGRATION")
    migmigration = migplan.migrate(quiesce=True)
    success = migmigration.wait_until_success()
    assert success, "The migration should succeed. {0}".format(pretty(migmigration.definition))
    migrated_app = get_migrated_app(app, tgt_cluster, tgt_namespace)
    logger.info("VALIDATE APPLICATION")
    ok = migrated_app.validate()
>   assert ok, "The application should be validated OK in the target cluster"
E   AssertionError: The application should be validated OK in the target cluster
E   assert False

mtc-e2e-qev2/mtc_tests/tests/test_interop.py:61: AssertionError
------------------------------ Captured log setup ------------------------------
19:28:54.754 - INFO: Removing namespace fixture: longnameprojecttest-longnameprojecttest-longnameprojecttest-123 from cluster source-cluster 19:28:55.455 - INFO: Removing namespace fixture: longnameprojecttest-longnameprojecttest-longnameprojecttest-123 from cluster host 19:28:56.222 - INFO: Removing app [ocp-attached-pvc] in namespace [longnameprojecttest-longnameprojecttest-longnameprojecttest-123] from cluster [host] 19:28:56.222 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:28:56.840 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:28:56.844 - INFO: [WARNING]: provided hosts list is empty, only localhost is available.
Note that 19:28:56.844 - INFO: the implicit localhost does not match 'all' 19:28:56.953 - INFO: [WARNING]: Found variable using reserved name: namespace 19:28:56.953 - INFO: 19:28:56.954 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:28:58.044 - INFO: 19:28:58.044 - INFO: TASK [Gathering Facts] ********************************************************* 19:28:58.044 - INFO: ok: [localhost] 19:28:58.068 - INFO: 19:28:58.068 - INFO: TASK [include_vars] ************************************************************ 19:28:58.068 - INFO: ok: [localhost] 19:28:59.893 - INFO: 19:28:59.893 - INFO: TASK [ocp-attached-pvc : Remove namespace longnameprojecttest-longnameprojecttest-longnameprojecttest-123] *** 19:28:59.893 - INFO: ok: [localhost] 19:29:01.915 - INFO: 19:29:01.915 - INFO: TASK [Remove Namespace longnameprojecttest-longnameprojecttest-longnameprojecttest-123] *** 19:29:01.915 - INFO: ok: [localhost] 19:29:01.930 - INFO: 19:29:01.930 - INFO: PLAY RECAP ********************************************************************* 19:29:01.931 - INFO: localhost : ok=4  changed=0 unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:29:02.093 - DEBUG: Removed private data directory: /tmp/tmpma1d3h2i 19:29:02.094 - INFO: Removing app [ocp-attached-pvc] in namespace [longnameprojecttest-longnameprojecttest-longnameprojecttest-123] from cluster [source-cluster] 19:29:02.094 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:29:02.693 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:29:02.696 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 19:29:02.696 - INFO: the implicit localhost does not match 'all' 19:29:02.796 - INFO: [WARNING]: Found variable using reserved name: namespace 19:29:02.796 - INFO: 19:29:02.796 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:29:03.781 - INFO: 19:29:03.781 - INFO: TASK [Gathering Facts] ********************************************************* 19:29:03.781 - INFO: ok: [localhost] 19:29:03.801 - INFO: 19:29:03.801 - INFO: TASK [include_vars] ************************************************************ 19:29:03.801 - INFO: ok: [localhost] 19:29:05.774 - INFO: 19:29:05.774 - INFO: TASK [ocp-attached-pvc : Remove namespace longnameprojecttest-longnameprojecttest-longnameprojecttest-123] *** 19:29:05.775 - INFO: ok: [localhost] 19:29:07.818 - INFO: 19:29:07.819 - INFO: TASK [Remove Namespace longnameprojecttest-longnameprojecttest-longnameprojecttest-123] *** 19:29:07.819 - INFO: ok: [localhost] 19:29:07.831 - INFO: 19:29:07.831 - INFO: PLAY RECAP ********************************************************************* 19:29:07.831 - INFO: localhost : ok=4  changed=0 unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:29:07.984 - DEBUG: Removed private data directory: /tmp/tmp6he6y7vg 19:29:07.984 - INFO: Deploying app [ocp-attached-pvc] in namespace [longnameprojecttest-longnameprojecttest-longnameprojecttest-123] in cluster [source-cluster] 19:29:07.985 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/deploy.yml 19:29:08.598 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:29:08.601 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. 
Note that 19:29:08.602 - INFO: the implicit localhost does not match 'all' 19:29:08.721 - INFO: [WARNING]: Found variable using reserved name: namespace 19:29:08.721 - INFO: 19:29:08.721 - INFO: PLAY [Deploy Application] ****************************************************** 19:29:09.886 - INFO: 19:29:09.886 - INFO: TASK [Gathering Facts] ********************************************************* 19:29:09.886 - INFO: ok: [localhost] 19:29:09.938 - INFO: 19:29:09.939 - INFO: TASK [include_vars] ************************************************************ 19:29:09.939 - INFO: ok: [localhost] 19:29:12.502 - INFO: 19:29:12.503 - INFO: TASK [ocp-attached-pvc : Check namespace] ************************************** 19:29:12.503 - INFO: ok: [localhost] 19:29:13.459 - INFO: 19:29:13.460 - INFO: TASK [ocp-attached-pvc : Create namespace] ************************************* 19:29:13.460 - INFO: changed: [localhost] 19:29:15.896 - INFO: 19:29:15.897 - INFO: TASK [ocp-attached-pvc : Create the pvc-attached application resources] ******** 19:29:15.897 - INFO: fatal: [localhost]: FAILED! => {"changed": false, "error": 422, "msg": "Failed to create object: b'{\"kind\":\"Status\",\"apiVersion\":\"v1\",\"metadata\":{},\"status\":\"Failure\",\"message\":\"Deployment.apps \\\\\"attached-pvc\\\\\" is invalid: spec.template.spec.containers[0].restartPolicy: Forbidden: may not be set for non-init containers\",\"reason\":\"Invalid\",\"details\":{\"name\":\"attached-pvc\",\"group\":\"apps\",\"kind\":\"Deployment\",\"causes\":[{\"reason\":\"FieldValueForbidden\",\"message\":\"Forbidden: may not be set for non-init containers\",\"field\":\"spec.template.spec.containers[0].restartPolicy\"}]},\"code\":422}\\n'", "reason": "Unprocessable Entity", "status": 422} 19:29:15.901 - INFO: 19:29:15.901 - INFO: PLAY RECAP ********************************************************************* 19:29:15.901 - INFO: localhost : ok=4  changed=1  unreachable=0 failed=1  skipped=1  rescued=0 ignored=0 19:29:16.111 - DEBUG: Removed private data directory: /tmp/tmpbna9xll7 19:29:16.209 - INFO: Migplan test-interop-migplan has been created 19:29:16.210 - INFO: Migplan IDIM: False 19:29:16.210 - INFO: Migplan IDVM: False 19:29:16.271 - INFO: Waiting for Ready status... 19:29:26.285 - INFO: Waiting for Ready status... 19:29:36.300 - INFO: The migration plan is Ready. ------------------------------ Captured log call ------------------------------- 19:29:36.302 - INFO: Migrating from ns:longnameprojecttest-longnameprojecttest-longnameprojecttest-123 in cluster:source-cluster to ns:longnameprojecttest-longnameprojecttest-longnameprojecttest-123 in cluster:host 19:29:36.302 - INFO: Migplan test-interop-migplan. Wait until ready 19:29:36.316 - INFO: The migration plan is Ready. 19:29:36.316 - INFO: MIGPLAN READY 19:29:46.391 - INFO: RIGHT WARNINGS IN MIGPLAN: [{'category': 'Warn', 'lastTransitionTime': '2024-04-02T19:29:23Z', 'message': 'Namespaces [longnameprojecttest-longnameprojecttest-longnameprojecttest-123] exceed 59 characters and no destination cluster route subdomain was configured. Direct Volume Migration may fail if you do not set `cluster_subdomain` value on the `MigrationController` CR.', 'reason': 'LengthExceeded', 'status': 'True', 'type': 'NamespaceLengthExceeded'}] 19:29:46.391 - INFO: EXECUTE MIGRATION 19:29:46.441 - INFO: Not started. Waiting... 19:29:56.456 - INFO: Step: 14/49. Waiting... 19:30:06.471 - INFO: Step: 14/49. Waiting... 19:30:16.487 - INFO: Step: 14/49. Waiting... 19:30:26.502 - INFO: Step: 41/49. Waiting... 
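The "Create the pvc-attached application resources" task fails because the API server rejects the Deployment: restartPolicy is a pod-level field, and Kubernetes only accepts a container-level restartPolicy on init containers (the sidecar-container feature on recent clusters), so setting it on a regular container produces exactly this FieldValueForbidden 422 response. A minimal pod-template sketch of the distinction; the image and names below are placeholders, not the actual ocp-attached-pvc template:

# Shape of the pod template inside the Deployment, expressed as a Python dict.
pod_template = {
    "spec": {
        "restartPolicy": "Always",  # valid here: pod-level field (Deployments require Always)
        "containers": [
            {
                "name": "attached-pvc",                  # placeholder container name
                "image": "quay.io/example/app:latest",   # placeholder image
                # "restartPolicy": "Always",             # forbidden on a regular container;
                #                                        # this is the field the 422 points at
            }
        ],
    }
}

Dropping the container-level field from the application template (or moving it to an init container if a sidecar is intended) would let the deploy task proceed.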
19:30:36.518 - INFO: Finished. 19:30:36.518 - INFO: VALIDATE APPLICATION 19:30:36.518 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/validate.yml 19:30:37.108 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:30:37.112 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 19:30:37.112 - INFO: the implicit localhost does not match 'all' 19:30:37.213 - INFO: [WARNING]: Found variable using reserved name: namespace 19:30:37.214 - INFO: 19:30:37.214 - INFO: PLAY [Validate application] **************************************************** 19:30:38.149 - INFO: 19:30:38.150 - INFO: TASK [Gathering Facts] ********************************************************* 19:30:38.150 - INFO: ok: [localhost] 19:30:38.169 - INFO: 19:30:38.170 - INFO: TASK [include_vars] ************************************************************ 19:30:38.170 - INFO: ok: [localhost] 19:30:40.117 - INFO: FAILED - RETRYING: Check pod status (40 retries left). 19:30:46.699 - INFO: FAILED - RETRYING: Check pod status (39 retries left). 19:30:53.285 - INFO: FAILED - RETRYING: Check pod status (38 retries left). 19:30:59.847 - INFO: FAILED - RETRYING: Check pod status (37 retries left). 19:31:06.464 - INFO: FAILED - RETRYING: Check pod status (36 retries left). 19:31:13.037 - INFO: FAILED - RETRYING: Check pod status (35 retries left). 19:31:19.671 - INFO: FAILED - RETRYING: Check pod status (34 retries left). 19:31:26.262 - INFO: FAILED - RETRYING: Check pod status (33 retries left). 19:31:32.839 - INFO: FAILED - RETRYING: Check pod status (32 retries left). 19:31:39.417 - INFO: FAILED - RETRYING: Check pod status (31 retries left). 19:31:46.044 - INFO: FAILED - RETRYING: Check pod status (30 retries left). 19:31:52.659 - INFO: FAILED - RETRYING: Check pod status (29 retries left). 19:31:59.291 - INFO: FAILED - RETRYING: Check pod status (28 retries left). 19:32:05.889 - INFO: FAILED - RETRYING: Check pod status (27 retries left). 19:32:12.476 - INFO: FAILED - RETRYING: Check pod status (26 retries left). 19:32:19.027 - INFO: FAILED - RETRYING: Check pod status (25 retries left). 19:32:25.593 - INFO: FAILED - RETRYING: Check pod status (24 retries left). 19:32:32.162 - INFO: FAILED - RETRYING: Check pod status (23 retries left). 19:32:38.789 - INFO: FAILED - RETRYING: Check pod status (22 retries left). 19:32:45.386 - INFO: FAILED - RETRYING: Check pod status (21 retries left). 19:32:51.967 - INFO: FAILED - RETRYING: Check pod status (20 retries left). 19:32:58.547 - INFO: FAILED - RETRYING: Check pod status (19 retries left). 19:33:05.130 - INFO: FAILED - RETRYING: Check pod status (18 retries left). 19:33:11.716 - INFO: FAILED - RETRYING: Check pod status (17 retries left). 19:33:18.308 - INFO: FAILED - RETRYING: Check pod status (16 retries left). 19:33:24.878 - INFO: FAILED - RETRYING: Check pod status (15 retries left). 19:33:31.455 - INFO: FAILED - RETRYING: Check pod status (14 retries left). 19:33:38.036 - INFO: FAILED - RETRYING: Check pod status (13 retries left). 19:33:44.596 - INFO: FAILED - RETRYING: Check pod status (12 retries left). 19:33:51.221 - INFO: FAILED - RETRYING: Check pod status (11 retries left). 19:33:57.823 - INFO: FAILED - RETRYING: Check pod status (10 retries left). 19:34:04.415 - INFO: FAILED - RETRYING: Check pod status (9 retries left). 19:34:10.993 - INFO: FAILED - RETRYING: Check pod status (8 retries left). 19:34:17.606 - INFO: FAILED - RETRYING: Check pod status (7 retries left). 
19:34:24.172 - INFO: FAILED - RETRYING: Check pod status (6 retries left). 19:34:30.780 - INFO: FAILED - RETRYING: Check pod status (5 retries left). 19:34:37.353 - INFO: FAILED - RETRYING: Check pod status (4 retries left). 19:34:43.915 - INFO: FAILED - RETRYING: Check pod status (3 retries left). 19:34:50.516 - INFO: FAILED - RETRYING: Check pod status (2 retries left). 19:34:57.079 - INFO: FAILED - RETRYING: Check pod status (1 retries left). 19:35:03.717 - INFO: 19:35:03.717 - INFO: TASK [ocp-attached-pvc : Check pod status] ************************************* 19:35:03.718 - INFO: fatal: [localhost]: FAILED! => {"api_found": true, "attempts": 40, "changed": false, "resources": []} 19:35:03.719 - INFO: 19:35:03.719 - INFO: PLAY RECAP ********************************************************************* 19:35:03.719 - INFO: localhost : ok=2  changed=0 unreachable=0 failed=1  skipped=7  rescued=0 ignored=0 19:35:03.874 - DEBUG: Removed private data directory: /tmp/tmpgfaexoyy ---------------------------- Captured log teardown ----------------------------- 19:35:03.991 - INFO: Deleting Migplan test-interop-migplan... 19:35:14.126 - INFO: Removing app in namespace [longnameprojecttest-longnameprojecttest-longnameprojecttest-123] from cluster [host] 19:35:14.126 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:35:14.720 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:35:14.724 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 19:35:14.724 - INFO: the implicit localhost does not match 'all' 19:35:14.824 - INFO: [WARNING]: Found variable using reserved name: namespace 19:35:14.825 - INFO: 19:35:14.825 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:35:15.750 - INFO: 19:35:15.751 - INFO: TASK [Gathering Facts] ********************************************************* 19:35:15.751 - INFO: ok: [localhost] 19:35:15.771 - INFO: 19:35:15.771 - INFO: TASK [include_vars] ************************************************************ 19:35:15.771 - INFO: ok: [localhost] 19:35:27.575 - INFO: 19:35:27.576 - INFO: TASK [ocp-attached-pvc : Remove namespace longnameprojecttest-longnameprojecttest-longnameprojecttest-123] *** 19:35:27.576 - INFO: changed: [localhost] 19:35:29.462 - INFO: 19:35:29.463 - INFO: TASK [Remove Namespace longnameprojecttest-longnameprojecttest-longnameprojecttest-123] *** 19:35:29.463 - INFO: ok: [localhost] 19:35:29.476 - INFO: 19:35:29.476 - INFO: PLAY RECAP ********************************************************************* 19:35:29.476 - INFO: localhost : ok=4  changed=1  unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:35:29.629 - DEBUG: Removed private data directory: /tmp/tmpcpwbteol 19:35:29.630 - INFO: Removing app in namespace [longnameprojecttest-longnameprojecttest-longnameprojecttest-123] from cluster [source-cluster] 19:35:29.630 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:35:30.217 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:35:30.220 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. 
Note that 19:35:30.220 - INFO: the implicit localhost does not match 'all' 19:35:30.320 - INFO: [WARNING]: Found variable using reserved name: namespace 19:35:30.320 - INFO: 19:35:30.321 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:35:31.248 - INFO: 19:35:31.248 - INFO: TASK [Gathering Facts] ********************************************************* 19:35:31.248 - INFO: ok: [localhost] 19:35:31.268 - INFO: 19:35:31.268 - INFO: TASK [include_vars] ************************************************************ 19:35:31.268 - INFO: ok: [localhost] 19:35:48.067 - INFO: 19:35:48.067 - INFO: TASK [ocp-attached-pvc : Remove namespace longnameprojecttest-longnameprojecttest-longnameprojecttest-123] *** 19:35:48.067 - INFO: changed: [localhost] 19:35:49.913 - INFO: 19:35:49.913 - INFO: TASK [Remove Namespace longnameprojecttest-longnameprojecttest-longnameprojecttest-123] *** 19:35:49.913 - INFO: ok: [localhost] 19:35:49.926 - INFO: 19:35:49.926 - INFO: PLAY RECAP ********************************************************************* 19:35:49.926 - INFO: localhost : ok=4  changed=1  unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:35:50.075 - DEBUG: Removed private data directory: /tmp/tmp4wgohcq6 19:35:50.076 - INFO: Removing namespace fixture: longnameprojecttest-longnameprojecttest-longnameprojecttest-123 from cluster source-cluster 19:35:50.122 - INFO: Removing namespace fixture: longnameprojecttest-longnameprojecttest-longnameprojecttest-123 from cluster host
_____________________________ test_mtc_101_interop _____________________________

mtc = 
migplan = 
apps = [, ]
migrationcontroller = 
migrated_namespaces = ['ocp-41965-a:ocp-41965-anew', 'ocp-41965-b:ocp-41965-bnew']
src_cluster = 
tgt_cluster = 

@pytest.mark.app(app_id="ocp-attached-pvc", app_namespace="ocp-41965-a")
@pytest.mark.app(app_id="ocp-attached-pvc", app_namespace="ocp-41965-b")
@pytest.mark.migrated_namespaces(["ocp-41965-a:ocp-41965-anew", "ocp-41965-b:ocp-41965-bnew"])
def test_mtc_101_interop(
    mtc, migplan, apps, migrationcontroller, migrated_namespaces, src_cluster, tgt_cluster
):
    app1 = apps[0]
    app2 = apps[1]
    src_namespace1 = app1.namespace
    tgt_namespace1 = get_app_target_namespace_from_namespaces(app1, migrated_namespaces)
    src_namespace2 = app2.namespace
    tgt_namespace2 = get_app_target_namespace_from_namespaces(app2, migrated_namespaces)
    logger.info(
        "Migrating {0} from ns:{1} in cluster:{2} to ns:{3} in cluster:{4}".format(
            app1, src_namespace1, src_cluster.name, tgt_namespace1, tgt_cluster.name
        )
    )
    logger.info(
        "Migrating {0} from ns:{1} in cluster:{2} to ns:{3} in cluster:{4}".format(
            app2, src_namespace2, src_cluster.name, tgt_namespace2, tgt_cluster.name
        )
    )
    logger.info("Migplan {0}. Wait until ready".format(migplan.name))
    ready = migplan.wait_until_ready()
    # Assert that plan is in Ready status
    assert ready, "Migplan must be ready:{0}".format(pretty(migplan.definition))
    logger.info("MIGPLAN READY")
    # Assert that there are no warnings in the migplan
    assert len(migplan.get_warnings()) == 0, "There should be no warnings {0}".format(pretty(migplan.definition))
    logger.info("NO WARNINGS IN MIGPLAN")
    logger.info("EXECUTE MIGRATION")
    migmigration = migplan.migrate(quiesce=True)
    success = migmigration.wait_until_success(timeout=500)
    assert success, "The migration should succeed. {0}".format(pretty(migmigration.definition))
    migrated_app1 = get_migrated_app(app1, tgt_cluster, tgt_namespace1)
    migrated_app2 = get_migrated_app(app2, tgt_cluster, tgt_namespace2)
    logger.info("VALIDATE APPLICATION {}".format(migrated_app1))
    ok = migrated_app1.validate()
>   assert ok, "The application {} should be validated OK in the target cluster".format(migrated_app1)
E   AssertionError: The application should be validated OK in the target cluster
E   assert False

mtc-e2e-qev2/mtc_tests/tests/test_interop.py:259: AssertionError
------------------------------ Captured log setup ------------------------------
19:37:31.762 - INFO: Removing namespace fixture: ocp-41965-a from cluster source-cluster 19:37:32.448 - INFO: Removing namespace fixture: ocp-41965-anew from cluster host 19:37:33.184 - INFO: Removing namespace fixture: ocp-41965-b from cluster source-cluster 19:37:33.198 - INFO: Removing namespace fixture: ocp-41965-bnew from cluster host 19:37:33.213 - INFO: Removing app [ocp-attached-pvc] in namespace [ocp-41965-b] from cluster [host] 19:37:33.214 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:37:33.803 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:37:33.807 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 19:37:33.807 - INFO: the implicit localhost does not match 'all' 19:37:33.907 - INFO: [WARNING]: Found variable using reserved name: namespace 19:37:33.908 - INFO: 19:37:33.908 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:37:34.849 - INFO: 19:37:34.849 - INFO: TASK [Gathering Facts] ********************************************************* 19:37:34.849 - INFO: ok: [localhost] 19:37:34.869 - INFO: 19:37:34.869 - INFO: TASK [include_vars] ************************************************************ 19:37:34.869 - INFO: ok: [localhost] 19:37:36.645 - INFO: 19:37:36.645 - INFO: TASK [ocp-attached-pvc : Remove namespace ocp-41965-b] ************************* 19:37:36.646 - INFO: ok: [localhost] 19:37:38.540 - INFO: 19:37:38.540 - INFO: TASK [Remove Namespace ocp-41965-b] ******************************************** 19:37:38.541 - INFO: ok: [localhost] 19:37:38.553 - INFO: 19:37:38.553 - INFO: PLAY RECAP ********************************************************************* 19:37:38.553 - INFO: localhost : ok=4  changed=0 unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:37:38.705 - DEBUG: Removed private data directory: /tmp/tmp243we_l8 19:37:38.706 - INFO: Removing app [ocp-attached-pvc] in namespace [ocp-41965-b] from cluster [source-cluster] 19:37:38.706 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:37:39.297 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:37:39.300 - INFO: [WARNING]: provided hosts list is empty, only localhost is available.
Note that 19:37:39.300 - INFO: the implicit localhost does not match 'all' 19:37:39.403 - INFO: [WARNING]: Found variable using reserved name: namespace 19:37:39.403 - INFO: 19:37:39.403 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:37:40.323 - INFO: 19:37:40.323 - INFO: TASK [Gathering Facts] ********************************************************* 19:37:40.323 - INFO: ok: [localhost] 19:37:40.343 - INFO: 19:37:40.344 - INFO: TASK [include_vars] ************************************************************ 19:37:40.344 - INFO: ok: [localhost] 19:37:42.054 - INFO: 19:37:42.054 - INFO: TASK [ocp-attached-pvc : Remove namespace ocp-41965-b] ************************* 19:37:42.054 - INFO: ok: [localhost] 19:37:43.868 - INFO: 19:37:43.868 - INFO: TASK [Remove Namespace ocp-41965-b] ******************************************** 19:37:43.868 - INFO: ok: [localhost] 19:37:43.881 - INFO: 19:37:43.881 - INFO: PLAY RECAP ********************************************************************* 19:37:43.881 - INFO: localhost : ok=4  changed=0 unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:37:44.034 - DEBUG: Removed private data directory: /tmp/tmp48j7rtsi 19:37:44.035 - INFO: Deploying app [ocp-attached-pvc] in namespace [ocp-41965-b] in cluster [source-cluster] 19:37:44.035 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/deploy.yml 19:37:44.624 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:37:44.627 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 19:37:44.627 - INFO: the implicit localhost does not match 'all' 19:37:44.729 - INFO: [WARNING]: Found variable using reserved name: namespace 19:37:44.729 - INFO: 19:37:44.729 - INFO: PLAY [Deploy Application] ****************************************************** 19:37:45.644 - INFO: 19:37:45.645 - INFO: TASK [Gathering Facts] ********************************************************* 19:37:45.645 - INFO: ok: [localhost] 19:37:45.666 - INFO: 19:37:45.666 - INFO: TASK [include_vars] ************************************************************ 19:37:45.666 - INFO: ok: [localhost] 19:37:47.387 - INFO: 19:37:47.387 - INFO: TASK [ocp-attached-pvc : Check namespace] ************************************** 19:37:47.387 - INFO: ok: [localhost] 19:37:48.103 - INFO: 19:37:48.103 - INFO: TASK [ocp-attached-pvc : Create namespace] ************************************* 19:37:48.103 - INFO: changed: [localhost] 19:37:49.879 - INFO: 19:37:49.879 - INFO: TASK [ocp-attached-pvc : Create the pvc-attached application resources] ******** 19:37:49.879 - INFO: fatal: [localhost]: FAILED! 
=> {"changed": false, "error": 422, "msg": "Failed to create object: b'{\"kind\":\"Status\",\"apiVersion\":\"v1\",\"metadata\":{},\"status\":\"Failure\",\"message\":\"Deployment.apps \\\\\"attached-pvc\\\\\" is invalid: spec.template.spec.containers[0].restartPolicy: Forbidden: may not be set for non-init containers\",\"reason\":\"Invalid\",\"details\":{\"name\":\"attached-pvc\",\"group\":\"apps\",\"kind\":\"Deployment\",\"causes\":[{\"reason\":\"FieldValueForbidden\",\"message\":\"Forbidden: may not be set for non-init containers\",\"field\":\"spec.template.spec.containers[0].restartPolicy\"}]},\"code\":422}\\n'", "reason": "Unprocessable Entity", "status": 422} 19:37:49.880 - INFO: 19:37:49.880 - INFO: PLAY RECAP ********************************************************************* 19:37:49.880 - INFO: localhost : ok=4  changed=1  unreachable=0 failed=1  skipped=1  rescued=0 ignored=0 19:37:50.035 - DEBUG: Removed private data directory: /tmp/tmpjmbo6x4i 19:37:50.036 - INFO: Removing app [ocp-attached-pvc] in namespace [ocp-41965-a] from cluster [host] 19:37:50.036 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:37:50.619 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:37:50.623 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 19:37:50.623 - INFO: the implicit localhost does not match 'all' 19:37:50.723 - INFO: [WARNING]: Found variable using reserved name: namespace 19:37:50.724 - INFO: 19:37:50.724 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:37:51.645 - INFO: 19:37:51.646 - INFO: TASK [Gathering Facts] ********************************************************* 19:37:51.646 - INFO: ok: [localhost] 19:37:51.666 - INFO: 19:37:51.666 - INFO: TASK [include_vars] ************************************************************ 19:37:51.666 - INFO: ok: [localhost] 19:37:53.488 - INFO: 19:37:53.489 - INFO: TASK [ocp-attached-pvc : Remove namespace ocp-41965-a] ************************* 19:37:53.489 - INFO: ok: [localhost] 19:37:55.383 - INFO: 19:37:55.384 - INFO: TASK [Remove Namespace ocp-41965-a] ******************************************** 19:37:55.384 - INFO: ok: [localhost] 19:37:55.396 - INFO: 19:37:55.397 - INFO: PLAY RECAP ********************************************************************* 19:37:55.397 - INFO: localhost : ok=4  changed=0 unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:37:55.547 - DEBUG: Removed private data directory: /tmp/tmpolfxr357 19:37:55.548 - INFO: Removing app [ocp-attached-pvc] in namespace [ocp-41965-a] from cluster [source-cluster] 19:37:55.548 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:37:56.140 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:37:56.144 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. 
Note that 19:37:56.144 - INFO: the implicit localhost does not match 'all' 19:37:56.247 - INFO: [WARNING]: Found variable using reserved name: namespace 19:37:56.247 - INFO: 19:37:56.247 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:37:57.166 - INFO: 19:37:57.166 - INFO: TASK [Gathering Facts] ********************************************************* 19:37:57.166 - INFO: ok: [localhost] 19:37:57.186 - INFO: 19:37:57.186 - INFO: TASK [include_vars] ************************************************************ 19:37:57.186 - INFO: ok: [localhost] 19:37:58.947 - INFO: 19:37:58.947 - INFO: TASK [ocp-attached-pvc : Remove namespace ocp-41965-a] ************************* 19:37:58.947 - INFO: ok: [localhost] 19:38:00.782 - INFO: 19:38:00.783 - INFO: TASK [Remove Namespace ocp-41965-a] ******************************************** 19:38:00.783 - INFO: ok: [localhost] 19:38:00.796 - INFO: 19:38:00.796 - INFO: PLAY RECAP ********************************************************************* 19:38:00.796 - INFO: localhost : ok=4  changed=0 unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:38:00.950 - DEBUG: Removed private data directory: /tmp/tmp7t1_jacf 19:38:00.951 - INFO: Deploying app [ocp-attached-pvc] in namespace [ocp-41965-a] in cluster [source-cluster] 19:38:00.951 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/deploy.yml 19:38:01.543 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:38:01.546 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 19:38:01.547 - INFO: the implicit localhost does not match 'all' 19:38:01.649 - INFO: [WARNING]: Found variable using reserved name: namespace 19:38:01.649 - INFO: 19:38:01.649 - INFO: PLAY [Deploy Application] ****************************************************** 19:38:02.564 - INFO: 19:38:02.564 - INFO: TASK [Gathering Facts] ********************************************************* 19:38:02.564 - INFO: ok: [localhost] 19:38:02.584 - INFO: 19:38:02.584 - INFO: TASK [include_vars] ************************************************************ 19:38:02.584 - INFO: ok: [localhost] 19:38:04.318 - INFO: 19:38:04.318 - INFO: TASK [ocp-attached-pvc : Check namespace] ************************************** 19:38:04.318 - INFO: ok: [localhost] 19:38:04.958 - INFO: 19:38:04.958 - INFO: TASK [ocp-attached-pvc : Create namespace] ************************************* 19:38:04.958 - INFO: changed: [localhost] 19:38:06.705 - INFO: 19:38:06.705 - INFO: TASK [ocp-attached-pvc : Create the pvc-attached application resources] ******** 19:38:06.706 - INFO: fatal: [localhost]: FAILED! 
=> {"changed": false, "error": 422, "msg": "Failed to create object: b'{\"kind\":\"Status\",\"apiVersion\":\"v1\",\"metadata\":{},\"status\":\"Failure\",\"message\":\"Deployment.apps \\\\\"attached-pvc\\\\\" is invalid: spec.template.spec.containers[0].restartPolicy: Forbidden: may not be set for non-init containers\",\"reason\":\"Invalid\",\"details\":{\"name\":\"attached-pvc\",\"group\":\"apps\",\"kind\":\"Deployment\",\"causes\":[{\"reason\":\"FieldValueForbidden\",\"message\":\"Forbidden: may not be set for non-init containers\",\"field\":\"spec.template.spec.containers[0].restartPolicy\"}]},\"code\":422}\\n'", "reason": "Unprocessable Entity", "status": 422} 19:38:06.707 - INFO: 19:38:06.707 - INFO: PLAY RECAP ********************************************************************* 19:38:06.707 - INFO: localhost : ok=4  changed=1  unreachable=0 failed=1  skipped=1  rescued=0 ignored=0 19:38:06.863 - DEBUG: Removed private data directory: /tmp/tmpoz1hi301 19:38:06.948 - INFO: Migplan test-interop-migplan has been created 19:38:06.948 - INFO: Migplan IDIM: False 19:38:06.948 - INFO: Migplan IDVM: False 19:38:07.005 - INFO: Waiting for Ready status... 19:38:17.020 - INFO: Waiting for Ready status... 19:38:27.035 - INFO: The migration plan is Ready. ------------------------------ Captured log call ------------------------------- 19:38:27.049 - INFO: Migrating from ns:ocp-41965-b in cluster:source-cluster to ns:ocp-41965-bnew in cluster:host 19:38:27.049 - INFO: Migrating from ns:ocp-41965-a in cluster:source-cluster to ns:ocp-41965-anew in cluster:host 19:38:27.049 - INFO: Migplan test-interop-migplan. Wait until ready 19:38:27.062 - INFO: The migration plan is Ready. 19:38:27.062 - INFO: MIGPLAN READY 19:38:27.063 - INFO: NO WARNINGS IN MIGPLAN 19:38:27.063 - INFO: EXECUTE MIGRATION 19:38:27.108 - INFO: Not started. Waiting... 19:38:37.122 - INFO: Step: 18/49. Waiting... 19:38:47.137 - INFO: Step: 18/49. Waiting... 19:38:57.155 - INFO: Step: 41/49. Waiting... 19:39:07.190 - INFO: Step: 41/49. Waiting... 19:39:17.205 - INFO: Step: 41/49. Waiting... 19:39:27.219 - INFO: Step: 43/49. Waiting... 19:39:37.235 - INFO: Finished. 19:39:37.235 - INFO: VALIDATE APPLICATION 19:39:37.235 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/validate.yml 19:39:37.831 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:39:37.834 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 19:39:37.835 - INFO: the implicit localhost does not match 'all' 19:39:37.936 - INFO: [WARNING]: Found variable using reserved name: namespace 19:39:37.937 - INFO: 19:39:37.937 - INFO: PLAY [Validate application] **************************************************** 19:39:38.863 - INFO: 19:39:38.863 - INFO: TASK [Gathering Facts] ********************************************************* 19:39:38.863 - INFO: ok: [localhost] 19:39:38.883 - INFO: 19:39:38.883 - INFO: TASK [include_vars] ************************************************************ 19:39:38.883 - INFO: ok: [localhost] 19:39:40.816 - INFO: FAILED - RETRYING: Check pod status (40 retries left). 19:39:47.442 - INFO: FAILED - RETRYING: Check pod status (39 retries left). 19:39:54.054 - INFO: FAILED - RETRYING: Check pod status (38 retries left). 19:40:00.704 - INFO: FAILED - RETRYING: Check pod status (37 retries left). 19:40:07.323 - INFO: FAILED - RETRYING: Check pod status (36 retries left). 
19:40:13.945 - INFO: FAILED - RETRYING: Check pod status (35 retries left). 19:40:20.621 - INFO: FAILED - RETRYING: Check pod status (34 retries left). 19:40:27.264 - INFO: FAILED - RETRYING: Check pod status (33 retries left). 19:40:33.894 - INFO: FAILED - RETRYING: Check pod status (32 retries left). 19:40:40.577 - INFO: FAILED - RETRYING: Check pod status (31 retries left). 19:40:47.223 - INFO: FAILED - RETRYING: Check pod status (30 retries left). 19:40:53.808 - INFO: FAILED - RETRYING: Check pod status (29 retries left). 19:41:00.454 - INFO: FAILED - RETRYING: Check pod status (28 retries left). 19:41:07.113 - INFO: FAILED - RETRYING: Check pod status (27 retries left). 19:41:13.769 - INFO: FAILED - RETRYING: Check pod status (26 retries left). 19:41:20.372 - INFO: FAILED - RETRYING: Check pod status (25 retries left). 19:41:27.022 - INFO: FAILED - RETRYING: Check pod status (24 retries left). 19:41:33.638 - INFO: FAILED - RETRYING: Check pod status (23 retries left). 19:41:40.249 - INFO: FAILED - RETRYING: Check pod status (22 retries left). 19:41:46.888 - INFO: FAILED - RETRYING: Check pod status (21 retries left). 19:41:53.526 - INFO: FAILED - RETRYING: Check pod status (20 retries left). 19:42:00.111 - INFO: FAILED - RETRYING: Check pod status (19 retries left). 19:42:06.684 - INFO: FAILED - RETRYING: Check pod status (18 retries left). 19:42:13.271 - INFO: FAILED - RETRYING: Check pod status (17 retries left). 19:42:19.861 - INFO: FAILED - RETRYING: Check pod status (16 retries left). 19:42:26.481 - INFO: FAILED - RETRYING: Check pod status (15 retries left). 19:42:33.108 - INFO: FAILED - RETRYING: Check pod status (14 retries left). 19:42:39.741 - INFO: FAILED - RETRYING: Check pod status (13 retries left). 19:42:46.301 - INFO: FAILED - RETRYING: Check pod status (12 retries left). 19:42:52.908 - INFO: FAILED - RETRYING: Check pod status (11 retries left). 19:42:59.486 - INFO: FAILED - RETRYING: Check pod status (10 retries left). 19:43:06.060 - INFO: FAILED - RETRYING: Check pod status (9 retries left). 19:43:12.619 - INFO: FAILED - RETRYING: Check pod status (8 retries left). 19:43:19.214 - INFO: FAILED - RETRYING: Check pod status (7 retries left). 19:43:25.774 - INFO: FAILED - RETRYING: Check pod status (6 retries left). 19:43:32.347 - INFO: FAILED - RETRYING: Check pod status (5 retries left). 19:43:38.914 - INFO: FAILED - RETRYING: Check pod status (4 retries left). 19:43:45.489 - INFO: FAILED - RETRYING: Check pod status (3 retries left). 19:43:52.120 - INFO: FAILED - RETRYING: Check pod status (2 retries left). 19:43:58.799 - INFO: FAILED - RETRYING: Check pod status (1 retries left). 19:44:05.668 - INFO: 19:44:05.669 - INFO: TASK [ocp-attached-pvc : Check pod status] ************************************* 19:44:05.669 - INFO: fatal: [localhost]: FAILED! => {"api_found": true, "attempts": 40, "changed": false, "resources": []} 19:44:05.670 - INFO: 19:44:05.670 - INFO: PLAY RECAP ********************************************************************* 19:44:05.670 - INFO: localhost : ok=2  changed=0 unreachable=0 failed=1  skipped=7  rescued=0 ignored=0 19:44:05.834 - DEBUG: Removed private data directory: /tmp/tmpdf7fc7c6 ---------------------------- Captured log teardown ----------------------------- 19:44:05.845 - INFO: Deleting Migplan test-interop-migplan... 
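The exhausted "Check pod status" retries end with {"api_found": true, "attempts": 40, "resources": []}: the API query itself works, but no application pods ever appear because the deploy step already failed with the 422 error above, so the validation playbook gives up after 40 attempts. A minimal Python sketch of an equivalent poll, assuming the kubernetes client; the label selector and timings are placeholders:

import time
from kubernetes import client, config

def wait_for_running_pods(namespace, label_selector="app=attached-pvc", retries=40, delay=6):
    """Approximate the repeated 'Check pod status' task: poll for Running pods in the namespace."""
    config.load_kube_config()
    core = client.CoreV1Api()
    for _ in range(retries):
        pods = core.list_namespaced_pod(namespace, label_selector=label_selector).items
        if pods and all(p.status.phase == "Running" for p in pods):
            return True
        time.sleep(delay)
    return False  # comparable to the task failing with "attempts": 40 and an empty resources list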
19:44:15.991 - INFO: Removing app in namespace [ocp-41965-b] from cluster [host] 19:44:15.992 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:44:16.645 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:44:16.648 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 19:44:16.649 - INFO: the implicit localhost does not match 'all' 19:44:16.760 - INFO: [WARNING]: Found variable using reserved name: namespace 19:44:16.761 - INFO: 19:44:16.761 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:44:17.720 - INFO: 19:44:17.720 - INFO: TASK [Gathering Facts] ********************************************************* 19:44:17.720 - INFO: ok: [localhost] 19:44:17.740 - INFO: 19:44:17.740 - INFO: TASK [include_vars] ************************************************************ 19:44:17.740 - INFO: ok: [localhost] 19:44:19.536 - INFO: 19:44:19.536 - INFO: TASK [ocp-attached-pvc : Remove namespace ocp-41965-b] ************************* 19:44:19.536 - INFO: ok: [localhost] 19:44:21.559 - INFO: 19:44:21.560 - INFO: TASK [Remove Namespace ocp-41965-b] ******************************************** 19:44:21.560 - INFO: ok: [localhost] 19:44:21.578 - INFO: 19:44:21.579 - INFO: PLAY RECAP ********************************************************************* 19:44:21.579 - INFO: localhost : ok=4  changed=0 unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:44:21.750 - DEBUG: Removed private data directory: /tmp/tmpzne_g86e 19:44:21.751 - INFO: Removing app in namespace [ocp-41965-b] from cluster [source-cluster] 19:44:21.751 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:44:22.494 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:44:22.498 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. 
Note that 19:44:22.498 - INFO: the implicit localhost does not match 'all' 19:44:22.604 - INFO: [WARNING]: Found variable using reserved name: namespace 19:44:22.605 - INFO: 19:44:22.605 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:44:23.611 - INFO: 19:44:23.611 - INFO: TASK [Gathering Facts] ********************************************************* 19:44:23.611 - INFO: ok: [localhost] 19:44:23.634 - INFO: 19:44:23.634 - INFO: TASK [include_vars] ************************************************************ 19:44:23.634 - INFO: ok: [localhost] 19:44:40.567 - INFO: 19:44:40.567 - INFO: TASK [ocp-attached-pvc : Remove namespace ocp-41965-b] ************************* 19:44:40.567 - INFO: changed: [localhost] 19:44:42.464 - INFO: 19:44:42.465 - INFO: TASK [Remove Namespace ocp-41965-b] ******************************************** 19:44:42.465 - INFO: ok: [localhost] 19:44:42.477 - INFO: 19:44:42.477 - INFO: PLAY RECAP ********************************************************************* 19:44:42.477 - INFO: localhost : ok=4  changed=1  unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:44:42.629 - DEBUG: Removed private data directory: /tmp/tmprle4ew6a 19:44:42.630 - INFO: Removing app in namespace [ocp-41965-a] from cluster [host] 19:44:42.630 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:44:43.217 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:44:43.220 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 19:44:43.221 - INFO: the implicit localhost does not match 'all' 19:44:43.321 - INFO: [WARNING]: Found variable using reserved name: namespace 19:44:43.321 - INFO: 19:44:43.321 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:44:44.244 - INFO: 19:44:44.244 - INFO: TASK [Gathering Facts] ********************************************************* 19:44:44.244 - INFO: ok: [localhost] 19:44:44.268 - INFO: 19:44:44.269 - INFO: TASK [include_vars] ************************************************************ 19:44:44.269 - INFO: ok: [localhost] 19:44:46.048 - INFO: 19:44:46.048 - INFO: TASK [ocp-attached-pvc : Remove namespace ocp-41965-a] ************************* 19:44:46.048 - INFO: ok: [localhost] 19:44:47.936 - INFO: 19:44:47.936 - INFO: TASK [Remove Namespace ocp-41965-a] ******************************************** 19:44:47.936 - INFO: ok: [localhost] 19:44:47.949 - INFO: 19:44:47.949 - INFO: PLAY RECAP ********************************************************************* 19:44:47.949 - INFO: localhost : ok=4  changed=0 unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:44:48.100 - DEBUG: Removed private data directory: /tmp/tmpx2wjsq4p 19:44:48.100 - INFO: Removing app in namespace [ocp-41965-a] from cluster [source-cluster] 19:44:48.100 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:44:48.687 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:44:48.690 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. 
Note that 19:44:48.690 - INFO: the implicit localhost does not match 'all' 19:44:48.791 - INFO: [WARNING]: Found variable using reserved name: namespace 19:44:48.791 - INFO: 19:44:48.792 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:44:49.714 - INFO: 19:44:49.714 - INFO: TASK [Gathering Facts] ********************************************************* 19:44:49.714 - INFO: ok: [localhost] 19:44:49.734 - INFO: 19:44:49.734 - INFO: TASK [include_vars] ************************************************************ 19:44:49.734 - INFO: ok: [localhost] 19:45:06.523 - INFO: 19:45:06.524 - INFO: TASK [ocp-attached-pvc : Remove namespace ocp-41965-a] ************************* 19:45:06.524 - INFO: changed: [localhost] 19:45:08.351 - INFO: 19:45:08.351 - INFO: TASK [Remove Namespace ocp-41965-a] ******************************************** 19:45:08.351 - INFO: ok: [localhost] 19:45:08.364 - INFO: 19:45:08.364 - INFO: PLAY RECAP ********************************************************************* 19:45:08.364 - INFO: localhost : ok=4  changed=1  unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:45:08.512 - DEBUG: Removed private data directory: /tmp/tmpnk2maa2r 19:45:08.513 - INFO: Removing namespace fixture: ocp-41965-a from cluster source-cluster 19:45:08.557 - INFO: Removing namespace fixture: ocp-41965-anew from cluster host 19:45:08.616 - INFO: Waiting for namespace fixture to be deleted 19:45:14.714 - INFO: Removing namespace fixture: ocp-41965-b from cluster source-cluster 19:45:14.730 - INFO: Removing namespace fixture: ocp-41965-bnew from cluster host 19:45:14.760 - INFO: Waiting for namespace fixture to be deleted
_____________________________ test_mtc_147_interop _____________________________

mtc =
migplan =
apps = []
migrated_namespaces = 'ocp-longpvcname:ocp-longpvcname'
src_cluster =
tgt_cluster =

    @pytest.mark.app(app_id="ocp-longpvcname", app_namespace="ocp-longpvcname")
    @pytest.mark.migrated_namespaces("ocp-longpvcname:ocp-longpvcname")
    def test_mtc_147_interop(mtc, migplan, apps, migrated_namespaces, src_cluster, tgt_cluster):
        tgt_namespace = get_target_namespace_from_namespace(migrated_namespaces)
        src_namespace = get_source_namespace_from_namespace(migrated_namespaces)
        app = apps[0]
        logger.info(
            "Migrating from ns:{0} in cluster:{1} to ns:{2} in cluster:{3}".format(
                src_namespace, src_cluster.name, tgt_namespace, tgt_cluster.name
            )
        )
        logger.info("Migplan {0}. Wait until ready".format(migplan.name))
        ready = migplan.wait_until_ready()
        # Assert that plan is in Ready status
        assert ready, "Migplan must be ready:{0}".format(pretty(migplan.definition))
        logger.info("MIGPLAN READY")
        # Assert that there are no warnings in the migplan
        assert len(migplan.get_warnings()) == 0, "There should be no warnings {0}".format(pretty(migplan.definition))
        logger.info("NO WARNINGS IN MIGPLAN")
        logger.info("EXECUTE MIGRATION")
        migmigration = migplan.migrate(quiesce=True)
        success = migmigration.wait_until_success(timeout=600)
        assert success, "The migration should succeed. {0}".format(pretty(migmigration.definition))
        migrated_app = get_migrated_app(app, tgt_cluster, tgt_namespace)
        logger.info("VALIDATE APPLICATION")
        ok = migrated_app.validate()
>       assert ok, "The application should be validated OK in the target cluster"
E       AssertionError: The application should be validated OK in the target cluster
E       assert False

mtc-e2e-qev2/mtc_tests/tests/test_interop.py:363: AssertionError
------------------------------ Captured log setup ------------------------------
19:48:54.449 - INFO: Removing namespace fixture: ocp-longpvcname from cluster source-cluster 19:48:55.149 - INFO: Removing namespace fixture: ocp-longpvcname from cluster host 19:48:55.903 - INFO: Removing app [ocp-longpvcname] in namespace [ocp-longpvcname] from cluster [host] 19:48:55.904 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:48:56.525 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:48:56.528 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 19:48:56.529 - INFO: the implicit localhost does not match 'all' 19:48:56.633 - INFO: [WARNING]: Found variable using reserved name: namespace 19:48:56.634 - INFO: 19:48:56.634 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:48:57.598 - INFO: 19:48:57.598 - INFO: TASK [Gathering Facts] ********************************************************* 19:48:57.598 - INFO: ok: [localhost] 19:48:57.618 - INFO: 19:48:57.619 - INFO: TASK [include_vars] ************************************************************ 19:48:57.619 - INFO: ok: [localhost] 19:48:59.476 - INFO: 19:48:59.476 - INFO: TASK [ocp-longpvcname : Remove namespace ocp-longpvcname] ********************** 19:48:59.476 - INFO: ok: [localhost] 19:49:01.379 - INFO: 19:49:01.380 - INFO: TASK [Remove Namespace ocp-longpvcname] **************************************** 19:49:01.380 - INFO: ok: [localhost] 19:49:01.393 - INFO: 19:49:01.393 - INFO: PLAY RECAP ********************************************************************* 19:49:01.394 - INFO: localhost : ok=4  changed=0 unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:49:01.553 - DEBUG: Removed private data directory: /tmp/tmprur9uphr 19:49:01.554 - INFO: Removing app [ocp-longpvcname] in namespace [ocp-longpvcname] from cluster [source-cluster] 19:49:01.554 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:49:02.174 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:49:02.178 - INFO: [WARNING]: provided hosts list is empty, only localhost is available.
Note that 19:49:02.178 - INFO: the implicit localhost does not match 'all' 19:49:02.282 - INFO: [WARNING]: Found variable using reserved name: namespace 19:49:02.282 - INFO: 19:49:02.282 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:49:03.246 - INFO: 19:49:03.247 - INFO: TASK [Gathering Facts] ********************************************************* 19:49:03.247 - INFO: ok: [localhost] 19:49:03.267 - INFO: 19:49:03.267 - INFO: TASK [include_vars] ************************************************************ 19:49:03.267 - INFO: ok: [localhost] 19:49:05.095 - INFO: 19:49:05.095 - INFO: TASK [ocp-longpvcname : Remove namespace ocp-longpvcname] ********************** 19:49:05.096 - INFO: ok: [localhost] 19:49:07.077 - INFO: 19:49:07.078 - INFO: TASK [Remove Namespace ocp-longpvcname] **************************************** 19:49:07.078 - INFO: ok: [localhost] 19:49:07.090 - INFO: 19:49:07.091 - INFO: PLAY RECAP ********************************************************************* 19:49:07.091 - INFO: localhost : ok=4  changed=0 unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:49:07.249 - DEBUG: Removed private data directory: /tmp/tmpo0zum77j 19:49:07.250 - INFO: Deploying app [ocp-longpvcname] in namespace [ocp-longpvcname] in cluster [source-cluster] 19:49:07.250 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/deploy.yml 19:49:07.859 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:49:07.863 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 19:49:07.863 - INFO: the implicit localhost does not match 'all' 19:49:07.967 - INFO: [WARNING]: Found variable using reserved name: namespace 19:49:07.967 - INFO: 19:49:07.967 - INFO: PLAY [Deploy Application] ****************************************************** 19:49:08.902 - INFO: 19:49:08.902 - INFO: TASK [Gathering Facts] ********************************************************* 19:49:08.902 - INFO: ok: [localhost] 19:49:08.923 - INFO: 19:49:08.923 - INFO: TASK [include_vars] ************************************************************ 19:49:08.923 - INFO: ok: [localhost] 19:49:10.728 - INFO: 19:49:10.728 - INFO: TASK [ocp-longpvcname : Check namespace] *************************************** 19:49:10.728 - INFO: ok: [localhost] 19:49:11.376 - INFO: 19:49:11.376 - INFO: TASK [ocp-longpvcname : Create namespace] ************************************** 19:49:11.376 - INFO: changed: [localhost] 19:49:13.292 - INFO: 19:49:13.292 - INFO: TASK [ocp-longpvcname : Create the longpvcname application resources] ********** 19:49:13.292 - INFO: fatal: [localhost]: FAILED! 
=> {"changed": false, "error": 422, "msg": "Failed to create object: b'{\"kind\":\"Status\",\"apiVersion\":\"v1\",\"metadata\":{},\"status\":\"Failure\",\"message\":\"Deployment.apps \\\\\"longpvc-test\\\\\" is invalid: spec.template.spec.containers[0].restartPolicy: Forbidden: may not be set for non-init containers\",\"reason\":\"Invalid\",\"details\":{\"name\":\"longpvc-test\",\"group\":\"apps\",\"kind\":\"Deployment\",\"causes\":[{\"reason\":\"FieldValueForbidden\",\"message\":\"Forbidden: may not be set for non-init containers\",\"field\":\"spec.template.spec.containers[0].restartPolicy\"}]},\"code\":422}\\n'", "reason": "Unprocessable Entity", "status": 422} 19:49:13.293 - INFO: 19:49:13.293 - INFO: PLAY RECAP ********************************************************************* 19:49:13.293 - INFO: localhost : ok=4  changed=1  unreachable=0 failed=1  skipped=1  rescued=0 ignored=0 19:49:13.449 - DEBUG: Removed private data directory: /tmp/tmp_rnvkfnp 19:49:13.534 - INFO: Migplan test-interop-migplan has been created 19:49:13.535 - INFO: Migplan IDIM: False 19:49:13.535 - INFO: Migplan IDVM: False 19:49:13.595 - INFO: Waiting for Ready status... 19:49:23.612 - INFO: Waiting for Ready status... 19:49:33.627 - INFO: The migration plan is Ready. ------------------------------ Captured log call ------------------------------- 19:49:33.629 - INFO: Migrating from ns:ocp-longpvcname in cluster:source-cluster to ns:ocp-longpvcname in cluster:host 19:49:33.629 - INFO: Migplan test-interop-migplan. Wait until ready 19:49:33.644 - INFO: The migration plan is Ready. 19:49:33.644 - INFO: MIGPLAN READY 19:49:33.644 - INFO: NO WARNINGS IN MIGPLAN 19:49:33.644 - INFO: EXECUTE MIGRATION 19:49:33.697 - INFO: Not started. Waiting... 19:49:43.712 - INFO: Step: 18/49. Waiting... 19:49:53.728 - INFO: Step: 41/49. Waiting... 19:50:03.745 - INFO: Step: 41/49. Waiting... 19:50:13.760 - INFO: Step: 41/49. Waiting... 19:50:23.819 - INFO: Step: 41/49. Waiting... 19:50:33.834 - INFO: Finished. 19:50:33.834 - INFO: VALIDATE APPLICATION 19:50:33.834 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/validate.yml 19:50:34.431 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:50:34.434 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 19:50:34.435 - INFO: the implicit localhost does not match 'all' 19:50:34.535 - INFO: [WARNING]: Found variable using reserved name: namespace 19:50:34.535 - INFO: 19:50:34.535 - INFO: PLAY [Validate application] **************************************************** 19:50:35.480 - INFO: 19:50:35.480 - INFO: TASK [Gathering Facts] ********************************************************* 19:50:35.480 - INFO: ok: [localhost] 19:50:35.501 - INFO: 19:50:35.501 - INFO: TASK [include_vars] ************************************************************ 19:50:35.501 - INFO: ok: [localhost] 19:50:37.616 - INFO: FAILED - RETRYING: Check pod status (40 retries left). 19:50:44.332 - INFO: FAILED - RETRYING: Check pod status (39 retries left). 19:50:51.020 - INFO: FAILED - RETRYING: Check pod status (38 retries left). 19:50:57.704 - INFO: FAILED - RETRYING: Check pod status (37 retries left). 19:51:04.404 - INFO: FAILED - RETRYING: Check pod status (36 retries left). 19:51:11.116 - INFO: FAILED - RETRYING: Check pod status (35 retries left). 19:51:17.820 - INFO: FAILED - RETRYING: Check pod status (34 retries left). 
19:51:24.476 - INFO: FAILED - RETRYING: Check pod status (33 retries left). 19:51:31.129 - INFO: FAILED - RETRYING: Check pod status (32 retries left). 19:51:37.789 - INFO: FAILED - RETRYING: Check pod status (31 retries left). 19:51:44.488 - INFO: FAILED - RETRYING: Check pod status (30 retries left). 19:51:51.206 - INFO: FAILED - RETRYING: Check pod status (29 retries left). 19:51:57.875 - INFO: FAILED - RETRYING: Check pod status (28 retries left). 19:52:04.490 - INFO: FAILED - RETRYING: Check pod status (27 retries left). 19:52:11.154 - INFO: FAILED - RETRYING: Check pod status (26 retries left). 19:52:17.793 - INFO: FAILED - RETRYING: Check pod status (25 retries left). 19:52:24.422 - INFO: FAILED - RETRYING: Check pod status (24 retries left). 19:52:31.100 - INFO: FAILED - RETRYING: Check pod status (23 retries left). 19:52:37.784 - INFO: FAILED - RETRYING: Check pod status (22 retries left). 19:52:44.378 - INFO: FAILED - RETRYING: Check pod status (21 retries left). 19:52:51.034 - INFO: FAILED - RETRYING: Check pod status (20 retries left). 19:52:57.633 - INFO: FAILED - RETRYING: Check pod status (19 retries left). 19:53:04.272 - INFO: FAILED - RETRYING: Check pod status (18 retries left). 19:53:10.858 - INFO: FAILED - RETRYING: Check pod status (17 retries left). 19:53:17.433 - INFO: FAILED - RETRYING: Check pod status (16 retries left). 19:53:24.071 - INFO: FAILED - RETRYING: Check pod status (15 retries left). 19:53:30.638 - INFO: FAILED - RETRYING: Check pod status (14 retries left). 19:53:37.206 - INFO: FAILED - RETRYING: Check pod status (13 retries left). 19:53:43.795 - INFO: FAILED - RETRYING: Check pod status (12 retries left). 19:53:50.409 - INFO: FAILED - RETRYING: Check pod status (11 retries left). 19:53:57.067 - INFO: FAILED - RETRYING: Check pod status (10 retries left). 19:54:03.727 - INFO: FAILED - RETRYING: Check pod status (9 retries left). 19:54:10.404 - INFO: FAILED - RETRYING: Check pod status (8 retries left). 19:54:17.056 - INFO: FAILED - RETRYING: Check pod status (7 retries left). 19:54:23.681 - INFO: FAILED - RETRYING: Check pod status (6 retries left). 19:54:30.347 - INFO: FAILED - RETRYING: Check pod status (5 retries left). 19:54:36.950 - INFO: FAILED - RETRYING: Check pod status (4 retries left). 19:54:43.577 - INFO: FAILED - RETRYING: Check pod status (3 retries left). 19:54:50.186 - INFO: FAILED - RETRYING: Check pod status (2 retries left). 19:54:56.754 - INFO: FAILED - RETRYING: Check pod status (1 retries left). 19:55:03.331 - INFO: 19:55:03.331 - INFO: TASK [ocp-longpvcname : Check pod status] ************************************** 19:55:03.332 - INFO: fatal: [localhost]: FAILED! => {"api_found": true, "attempts": 40, "changed": false, "resources": []} 19:55:03.333 - INFO: 19:55:03.333 - INFO: PLAY RECAP ********************************************************************* 19:55:03.333 - INFO: localhost : ok=2  changed=0 unreachable=0 failed=1  skipped=7  rescued=0 ignored=0 19:55:03.487 - DEBUG: Removed private data directory: /tmp/tmp6n4ci2u2 ---------------------------- Captured log teardown ----------------------------- 19:55:03.497 - INFO: Deleting Migplan test-interop-migplan... 
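Editor's note: the empty pod list here is a downstream symptom. The setup step earlier failed with HTTP 422 because the longpvc-test Deployment sets restartPolicy on a regular container; Kubernetes only accepts a container-level restartPolicy on init containers, while at the pod level it is always allowed. A hedged illustration of the offending shape (image and labels are placeholders, not taken from the test data):

    # Minimal illustration of the 422 above: restartPolicy belongs to the pod spec,
    # not to entries of spec.template.spec.containers (only init containers may carry it).
    deployment = {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": "longpvc-test"},
        "spec": {
            "selector": {"matchLabels": {"app": "longpvc-test"}},
            "template": {
                "metadata": {"labels": {"app": "longpvc-test"}},
                "spec": {
                    "restartPolicy": "Always",  # valid at the pod level
                    "containers": [
                        {
                            "name": "app",
                            "image": "example/image:latest",  # placeholder
                            # "restartPolicy": "Always",  # rejected: FieldValueForbidden
                        }
                    ],
                },
            },
        },
    }

Dropping the container-level field (or moving it to the pod spec) removes the FieldValueForbidden cause quoted in the error.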
19:55:13.636 - INFO: Removing app in namespace [ocp-longpvcname] from cluster [host] 19:55:13.636 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:55:14.234 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:55:14.237 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 19:55:14.237 - INFO: the implicit localhost does not match 'all' 19:55:14.341 - INFO: [WARNING]: Found variable using reserved name: namespace 19:55:14.341 - INFO: 19:55:14.341 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:55:15.266 - INFO: 19:55:15.266 - INFO: TASK [Gathering Facts] ********************************************************* 19:55:15.266 - INFO: ok: [localhost] 19:55:15.286 - INFO: 19:55:15.287 - INFO: TASK [include_vars] ************************************************************ 19:55:15.287 - INFO: ok: [localhost] 19:55:27.072 - INFO: 19:55:27.072 - INFO: TASK [ocp-longpvcname : Remove namespace ocp-longpvcname] ********************** 19:55:27.072 - INFO: changed: [localhost] 19:55:28.988 - INFO: 19:55:28.989 - INFO: TASK [Remove Namespace ocp-longpvcname] **************************************** 19:55:28.989 - INFO: ok: [localhost] 19:55:29.001 - INFO: 19:55:29.002 - INFO: PLAY RECAP ********************************************************************* 19:55:29.002 - INFO: localhost : ok=4  changed=1  unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:55:29.156 - DEBUG: Removed private data directory: /tmp/tmpifynvvlc 19:55:29.157 - INFO: Removing app in namespace [ocp-longpvcname] from cluster [source-cluster] 19:55:29.157 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:55:29.754 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:55:29.757 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. 
Note that 19:55:29.757 - INFO: the implicit localhost does not match 'all' 19:55:29.858 - INFO: [WARNING]: Found variable using reserved name: namespace 19:55:29.859 - INFO: 19:55:29.859 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:55:30.785 - INFO: 19:55:30.786 - INFO: TASK [Gathering Facts] ********************************************************* 19:55:30.786 - INFO: ok: [localhost] 19:55:30.806 - INFO: 19:55:30.806 - INFO: TASK [include_vars] ************************************************************ 19:55:30.806 - INFO: ok: [localhost] 19:55:47.619 - INFO: 19:55:47.619 - INFO: TASK [ocp-longpvcname : Remove namespace ocp-longpvcname] ********************** 19:55:47.619 - INFO: changed: [localhost] 19:55:49.498 - INFO: 19:55:49.498 - INFO: TASK [Remove Namespace ocp-longpvcname] **************************************** 19:55:49.498 - INFO: ok: [localhost] 19:55:49.511 - INFO: 19:55:49.511 - INFO: PLAY RECAP ********************************************************************* 19:55:49.511 - INFO: localhost : ok=4  changed=1  unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:55:49.662 - DEBUG: Removed private data directory: /tmp/tmp4g666zub 19:55:49.663 - INFO: Removing namespace fixture: ocp-longpvcname from cluster source-cluster 19:55:49.705 - INFO: Removing namespace fixture: ocp-longpvcname from cluster host =============================== warnings summary =============================== mtc_tests/tests/test_interop.py: 150 warnings /mtc-e2e-qev2/venv/lib/python3.11/site-packages/kubernetes/client/rest.py:308: DeprecationWarning: HTTPResponse.getheaders() is deprecated and will be removed in urllib3 v2.1.0. Instead access HTTPResponse.headers directly. self.headers = http_resp.getheaders() -- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html ==================================== PASSES ==================================== _____________________________ test_mtc_98_interop ______________________________ ------------------------------ Captured log setup ------------------------------ 19:35:53.728 - INFO: Removing namespace fixture: ocp-41968-nsmap-1 from cluster source-cluster 19:35:54.386 - INFO: Removing namespace fixture: ocp-41968-nsmap-1 from cluster host 19:35:55.131 - INFO: Removing app [empty-namespace] in namespace [ocp-41968-nsmap-3] from cluster [host] 19:35:55.145 - INFO: Waiting for resources to be deleted 19:35:55.160 - INFO: Removing app [empty-namespace] in namespace [ocp-41968-nsmap-3] from cluster [source-cluster] 19:35:55.175 - INFO: Waiting for resources to be deleted 19:35:55.189 - INFO: Deploying app [empty-namespace] in namespace [ocp-41968-nsmap-3] in cluster [source-cluster] 19:35:55.203 - DEBUG: Running always new-project with --skip-config-write=true flag to avoid overriding the configuration 19:35:55.203 - DEBUG: Running always with --insecure-skip-tls-verify flag to avoid "Unable to connect to the server: x509: certificate signed by unknown authority" 19:35:55.414 - INFO: Deployed namespace ocp-41968-nsmap-3 in host https://api.mtc-source-4bze.cspilp.interop.ccitredhat.com:6443 19:35:55.415 - INFO: Removing app [empty-namespace] in namespace [ocp-41968-nsmap-2] from cluster [host] 19:35:55.430 - INFO: Waiting for resources to be deleted 19:35:55.445 - INFO: Removing app [empty-namespace] in namespace [ocp-41968-nsmap-2] from cluster [source-cluster] 19:35:55.460 - INFO: Waiting for resources to be deleted 19:35:55.475 - INFO: Deploying app [empty-namespace] in namespace 
[ocp-41968-nsmap-2] in cluster [source-cluster] 19:35:55.489 - DEBUG: Running always new-project with --skip-config-write=true flag to avoid overriding the configuration 19:35:55.489 - DEBUG: Running always with --insecure-skip-tls-verify flag to avoid "Unable to connect to the server: x509: certificate signed by unknown authority" 19:35:55.658 - INFO: Deployed namespace ocp-41968-nsmap-2 in host https://api.mtc-source-4bze.cspilp.interop.ccitredhat.com:6443 19:35:55.659 - INFO: Removing app [empty-namespace] in namespace [ocp-41968-nsmap-1] from cluster [host] 19:35:55.673 - INFO: Waiting for resources to be deleted 19:35:55.688 - INFO: Removing app [empty-namespace] in namespace [ocp-41968-nsmap-1] from cluster [source-cluster] 19:35:55.703 - INFO: Waiting for resources to be deleted 19:35:55.717 - INFO: Deploying app [empty-namespace] in namespace [ocp-41968-nsmap-1] in cluster [source-cluster] 19:35:55.731 - DEBUG: Running always new-project with --skip-config-write=true flag to avoid overriding the configuration 19:35:55.731 - DEBUG: Running always with --insecure-skip-tls-verify flag to avoid "Unable to connect to the server: x509: certificate signed by unknown authority" 19:35:55.973 - INFO: Deployed namespace ocp-41968-nsmap-1 in host https://api.mtc-source-4bze.cspilp.interop.ccitredhat.com:6443 19:35:56.056 - INFO: Migplan test-interop-migplan has been created 19:35:56.057 - INFO: Migplan IDIM: False 19:35:56.057 - INFO: Migplan IDVM: False 19:35:56.115 - INFO: Waiting for Ready status... 19:36:06.129 - INFO: Waiting for Ready status... 19:36:16.143 - INFO: The migration plan is Ready. ------------------------------ Captured log call ------------------------------- 19:36:16.144 - INFO: Migrating from ns:ocp-41968-nsmap-1 in cluster:source-cluster to ns:ocp-41968-nsmap-1 in cluster:host 19:36:16.144 - INFO: Migplan test-interop-migplan. Wait until ready 19:36:16.159 - INFO: The migration plan is Ready. 
19:36:16.159 - INFO: MIGPLAN READY 19:36:16.159 - INFO: CHECKING FOR WARNINGS 19:36:16.159 - INFO: NO WARNINGS FOUND 19:36:16.159 - INFO: FIRST TIME PATCHING WITH WRONG 19:36:26.220 - INFO: { "apiVersion": "migration.openshift.io/v1alpha1", "kind": "MigPlan", "metadata": { "annotations": { "kubectl.kubernetes.io/last-applied-configuration": "{\"apiVersion\":\"migration.openshift.io/v1alpha1\",\"kind\":\"MigPlan\",\"metadata\":{\"name\":\"test-interop-migplan\",\"namespace\":\"openshift-migration\"},\"spec\":{\"destMigClusterRef\":{\"name\":\"host\",\"namespace\":\"openshift-migration\"},\"migStorageRef\":{\"name\":\"minio.automation.test\",\"namespace\":\"openshift-migration\"},\"namespaces\":[\"ocp-41968-nsmap-1\",\"ocp-41968-nsmap-2:ocp-41968-nsmap-1\"],\"refresh\":true,\"srcMigClusterRef\":{\"name\":\"source-cluster\",\"namespace\":\"openshift-migration\"}}}", "migration.openshift.io/selected-migplan-type": "full", "openshift.io/touch": "4a0d3be6-f128-11ee-bfbf-0a580a830017" }, "creationTimestamp": "2024-04-02T19:35:56Z", "generation": 5, "labels": { "controller-tools.k8s.io": "1.0" }, "managedFields": [ { "apiVersion": "migration.openshift.io/v1alpha1", "fieldsType": "FieldsV1", "fieldsV1": { "f:metadata": { "f:annotations": { ".": {}, "f:kubectl.kubernetes.io/last-applied-configuration": {}, "f:migration.openshift.io/selected-migplan-type": {} }, "f:labels": { ".": {}, "f:controller-tools.k8s.io": {} } }, "f:spec": { ".": {}, "f:destMigClusterRef": { ".": {}, "f:name": {}, "f:namespace": {} }, "f:migStorageRef": { ".": {}, "f:name": {}, "f:namespace": {} }, "f:namespaces": {}, "f:srcMigClusterRef": { ".": {}, "f:name": {}, "f:namespace": {} } } }, "manager": "OpenAPI-Generator", "operation": "Update", "time": "2024-04-02T19:36:16Z" }, { "apiVersion": "migration.openshift.io/v1alpha1", "fieldsType": "FieldsV1", "fieldsV1": { "f:metadata": { "f:annotations": { "f:openshift.io/touch": {} } }, "f:status": { ".": {}, "f:conditions": {}, "f:destStorageClasses": {}, "f:excludedResources": {}, "f:observedDigest": {}, "f:srcStorageClasses": {} } }, "manager": "manager", "operation": "Update", "time": "2024-04-02T19:36:23Z" } ], "name": "test-interop-migplan", "namespace": "openshift-migration", "resourceVersion": "43277", "uid": "4225e85b-2ab6-465b-886b-eae8a601003e" }, "spec": { "destMigClusterRef": { "name": "host", "namespace": "openshift-migration" }, "migStorageRef": { "name": "minio.automation.test", "namespace": "openshift-migration" }, "namespaces": [ "ocp-41968-nsmap-1", "ocp-41968-nsmap-2:ocp-41968-nsmap-1" ], "srcMigClusterRef": { "name": "source-cluster", "namespace": "openshift-migration" } }, "status": { "conditions": [ { "category": "Required", "lastTransitionTime": "2024-04-02T19:36:11Z", "message": "The `persistentVolumes` list has been updated with discovered PVs.", "reason": "Done", "status": "True", "type": "PvsDiscovered" }, { "category": "Required", "lastTransitionTime": "2024-04-02T19:36:11Z", "message": "The storage resources have been created.", "reason": "Done", "status": "True", "type": "StorageEnsured" }, { "category": "Critical", "lastTransitionTime": "2024-04-02T19:36:16Z", "message": "Duplicate destination cluster namespaces [ocp-41968-nsmap-1] in migplan.", "reason": "DuplicateNamespaces", "status": "True", "type": "DuplicateNamespaceOnDestinationCluster" } ], "destStorageClasses": [ { "accessModes": [ "ReadWriteOnce" ], "name": "gp2-csi", "provisioner": "ebs.csi.aws.com" }, { "accessModes": [ "ReadWriteOnce" ], "default": true, "name": "gp3-csi", "provisioner": 
"ebs.csi.aws.com" } ], "excludedResources": [ "imagetags", "templateinstances", "clusterserviceversions", "packagemanifests", "subscriptions", "servicebrokers", "servicebindings", "serviceclasses", "serviceinstances", "serviceplans", "operatorgroups", "events", "events.events.k8s.io", "rolebindings.authorization.openshift.io" ], "observedDigest": "81de117904227c9004f544bc042b92e27e9df8b0766cab00e66f1dcb5618a44d", "srcStorageClasses": [ { "accessModes": [ "ReadWriteOnce" ], "name": "gp2-csi", "provisioner": "ebs.csi.aws.com" }, { "accessModes": [ "ReadWriteOnce" ], "default": true, "name": "gp3-csi", "provisioner": "ebs.csi.aws.com" } ] } } 19:36:26.220 - INFO: SECOND TIME PATCHING CORRECT 19:36:36.281 - INFO: Migplan test-interop-migplan. Wait until ready 19:36:36.295 - INFO: The migration plan is Ready. 19:36:36.296 - INFO: MIGPLAN READY 19:36:36.296 - INFO: CHECKING FOR WARNINGS 19:36:36.296 - INFO: NO WARNINGS FOUND 19:36:36.296 - INFO: THIRD TIME PATCHING WRONG AGAIN 19:36:46.355 - INFO: { "apiVersion": "migration.openshift.io/v1alpha1", "kind": "MigPlan", "metadata": { "annotations": { "kubectl.kubernetes.io/last-applied-configuration": "{\"apiVersion\":\"migration.openshift.io/v1alpha1\",\"kind\":\"MigPlan\",\"metadata\":{\"name\":\"test-interop-migplan\",\"namespace\":\"openshift-migration\"},\"spec\":{\"destMigClusterRef\":{\"name\":\"host\",\"namespace\":\"openshift-migration\"},\"migStorageRef\":{\"name\":\"minio.automation.test\",\"namespace\":\"openshift-migration\"},\"namespaces\":[\"ocp-41968-nsmap-1:ocp-41968-nsmap-a\",\"ocp-41968-nsmap-2:ocp-41968-nsmap-a\"],\"refresh\":true,\"srcMigClusterRef\":{\"name\":\"source-cluster\",\"namespace\":\"openshift-migration\"}}}", "migration.openshift.io/selected-migplan-type": "full", "openshift.io/touch": "55f110a3-f128-11ee-bfbf-0a580a830017" }, "creationTimestamp": "2024-04-02T19:35:56Z", "generation": 9, "labels": { "controller-tools.k8s.io": "1.0" }, "managedFields": [ { "apiVersion": "migration.openshift.io/v1alpha1", "fieldsType": "FieldsV1", "fieldsV1": { "f:metadata": { "f:annotations": { ".": {}, "f:kubectl.kubernetes.io/last-applied-configuration": {}, "f:migration.openshift.io/selected-migplan-type": {} }, "f:labels": { ".": {}, "f:controller-tools.k8s.io": {} } }, "f:spec": { ".": {}, "f:destMigClusterRef": { ".": {}, "f:name": {}, "f:namespace": {} }, "f:migStorageRef": { ".": {}, "f:name": {}, "f:namespace": {} }, "f:namespaces": {}, "f:srcMigClusterRef": { ".": {}, "f:name": {}, "f:namespace": {} } } }, "manager": "OpenAPI-Generator", "operation": "Update", "time": "2024-04-02T19:36:36Z" }, { "apiVersion": "migration.openshift.io/v1alpha1", "fieldsType": "FieldsV1", "fieldsV1": { "f:metadata": { "f:annotations": { "f:openshift.io/touch": {} } }, "f:status": { ".": {}, "f:conditions": {}, "f:destStorageClasses": {}, "f:excludedResources": {}, "f:observedDigest": {}, "f:srcStorageClasses": {} } }, "manager": "manager", "operation": "Update", "time": "2024-04-02T19:36:43Z" } ], "name": "test-interop-migplan", "namespace": "openshift-migration", "resourceVersion": "43436", "uid": "4225e85b-2ab6-465b-886b-eae8a601003e" }, "spec": { "destMigClusterRef": { "name": "host", "namespace": "openshift-migration" }, "migStorageRef": { "name": "minio.automation.test", "namespace": "openshift-migration" }, "namespaces": [ "ocp-41968-nsmap-1:ocp-41968-nsmap-a", "ocp-41968-nsmap-2:ocp-41968-nsmap-a" ], "srcMigClusterRef": { "name": "source-cluster", "namespace": "openshift-migration" } }, "status": { "conditions": [ { "category": 
"Required", "lastTransitionTime": "2024-04-02T19:36:11Z", "message": "The `persistentVolumes` list has been updated with discovered PVs.", "reason": "Done", "status": "True", "type": "PvsDiscovered" }, { "category": "Required", "lastTransitionTime": "2024-04-02T19:36:11Z", "message": "The storage resources have been created.", "reason": "Done", "status": "True", "type": "StorageEnsured" }, { "category": "Critical", "lastTransitionTime": "2024-04-02T19:36:36Z", "message": "Duplicate destination cluster namespaces [ocp-41968-nsmap-a] in migplan.", "reason": "DuplicateNamespaces", "status": "True", "type": "DuplicateNamespaceOnDestinationCluster" } ], "destStorageClasses": [ { "accessModes": [ "ReadWriteOnce" ], "name": "gp2-csi", "provisioner": "ebs.csi.aws.com" }, { "accessModes": [ "ReadWriteOnce" ], "default": true, "name": "gp3-csi", "provisioner": "ebs.csi.aws.com" } ], "excludedResources": [ "imagetags", "templateinstances", "clusterserviceversions", "packagemanifests", "subscriptions", "servicebrokers", "servicebindings", "serviceclasses", "serviceinstances", "serviceplans", "operatorgroups", "events", "events.events.k8s.io", "rolebindings.authorization.openshift.io" ], "observedDigest": "6d46ee7080d20beb45908ec43740ef6ee7e8153a09c2288816312936a9d4a15d", "srcStorageClasses": [ { "accessModes": [ "ReadWriteOnce" ], "name": "gp2-csi", "provisioner": "ebs.csi.aws.com" }, { "accessModes": [ "ReadWriteOnce" ], "default": true, "name": "gp3-csi", "provisioner": "ebs.csi.aws.com" } ] } } 19:36:46.355 - INFO: FOURTH TIME PATCHING RIGHT AGAIN 19:36:56.415 - INFO: Migplan test-interop-migplan. Wait until ready 19:36:56.429 - INFO: The migration plan is Ready. 19:36:56.429 - INFO: MIGPLAN READY 19:36:56.429 - INFO: CHECKING FOR WARNINGS 19:36:56.429 - INFO: NO WARNINGS FOUND ---------------------------- Captured log teardown ----------------------------- 19:36:56.430 - INFO: Deleting Migplan test-interop-migplan... 
19:37:06.538 - INFO: Removing app in namespace [ocp-41968-nsmap-3] from cluster [host] 19:37:06.555 - INFO: Waiting for resources to be deleted 19:37:06.569 - INFO: Removing app in namespace [ocp-41968-nsmap-3] from cluster [source-cluster] 19:37:06.585 - INFO: Removing namespace: ocp-41968-nsmap-3 19:37:06.606 - INFO: Waiting for resources to be deleted 19:37:13.723 - INFO: Removing app in namespace [ocp-41968-nsmap-2] from cluster [host] 19:37:13.738 - INFO: Waiting for resources to be deleted 19:37:13.753 - INFO: Removing app in namespace [ocp-41968-nsmap-2] from cluster [source-cluster] 19:37:13.767 - INFO: Removing namespace: ocp-41968-nsmap-2 19:37:13.785 - INFO: Waiting for resources to be deleted 19:37:20.907 - INFO: Removing app in namespace [ocp-41968-nsmap-1] from cluster [host] 19:37:20.925 - INFO: Waiting for resources to be deleted 19:37:20.939 - INFO: Removing app in namespace [ocp-41968-nsmap-1] from cluster [source-cluster] 19:37:20.954 - INFO: Removing namespace: ocp-41968-nsmap-1 19:37:20.971 - INFO: Waiting for resources to be deleted 19:37:28.105 - INFO: Removing namespace fixture: ocp-41968-nsmap-1 from cluster source-cluster 19:37:28.119 - INFO: Removing namespace fixture: ocp-41968-nsmap-1 from cluster host _____________________________ test_mtc_116_interop _____________________________ ------------------------------ Captured log setup ------------------------------ 19:45:25.414 - INFO: Removing namespace fixture: op-37589-rollbacksuccess from cluster source-cluster 19:45:26.082 - INFO: Removing namespace fixture: op-37589-rollbacksuccess from cluster host 19:45:26.770 - INFO: Removing app [nginx-j2] in namespace [op-37589-rollbacksuccess] from cluster [host] 19:45:26.783 - INFO: Waiting for resources to be deleted 19:45:26.797 - INFO: Removing app [nginx-j2] in namespace [op-37589-rollbacksuccess] from cluster [source-cluster] 19:45:26.812 - INFO: Waiting for resources to be deleted 19:45:26.827 - INFO: Deploying app [nginx-j2] in namespace [op-37589-rollbacksuccess] in cluster [source-cluster] 19:45:26.841 - DEBUG: Running always new-project with --skip-config-write=true flag to avoid overriding the configuration 19:45:26.841 - DEBUG: Running always with --insecure-skip-tls-verify flag to avoid "Unable to connect to the server: x509: certificate signed by unknown authority" 19:45:27.036 - INFO: Deployed namespace op-37589-rollbacksuccess in host https://api.mtc-source-4bze.cspilp.interop.ccitredhat.com:6443 19:45:27.055 - INFO: Namespace properly created and "Active": op-37589-rollbacksuccess 19:45:27.055 - INFO: Deploying nginx application in namespace in cluster https://api.mtc-source-4bze.cspilp.interop.ccitredhat.com:6443 19:45:27.055 - DEBUG: Rendering template: nginxpv/deployment.yml.j2 19:45:27.056 - DEBUG: Using vars: app_name: nginx app_namespace: op-37589-rollbacksuccess deployment_api: apps/v1 html_accessmode: ReadWriteOnce logs_accessmode: ReadWriteOnce storage_class: default storage_size: 1Gi 19:45:42.384 - DEBUG: Requesting get: http://my-nginx-op-37589-rollbacksuccess.apps.mtc-source-4bze.cspilp.interop.ccitredhat.com 19:45:42.759 - DEBUG: Requesting get: http://my-nginx-op-37589-rollbacksuccess.apps.mtc-source-4bze.cspilp.interop.ccitredhat.com 19:45:42.784 - INFO: Validating nginx application in cluster https://api.mtc-source-4bze.cspilp.interop.ccitredhat.com:6443 19:45:43.142 - DEBUG: Validated that 1 errors were reported in the errors log file 19:45:43.542 - DEBUG: Validated that 1 errors and 1 success were reported in the access log file 
19:45:43.630 - INFO: Migplan test-interop-migplan has been created 19:45:43.631 - INFO: Migplan IDIM: False 19:45:43.631 - INFO: Migplan IDVM: False 19:45:43.694 - INFO: Waiting for Ready status... 19:45:53.710 - INFO: Waiting for Ready status... 19:46:03.725 - INFO: The migration plan is Ready. ------------------------------ Captured log call ------------------------------- 19:46:03.741 - INFO: Migrating from ns:op-37589-rollbacksuccess in cluster:source-cluster to ns:op-37589-rollbacksuccess in cluster:host 19:46:03.741 - INFO: Migplan test-interop-migplan. Wait until ready 19:46:03.757 - INFO: The migration plan is Ready. 19:46:03.757 - INFO: MIGPLAN READY 19:46:03.757 - INFO: EXECUTE STAGE 19:46:03.809 - INFO: Step: 2/38. Waiting... 19:46:13.860 - INFO: Step: 32/38. Waiting... 19:46:23.875 - INFO: Step: 32/38. Waiting... 19:46:33.892 - INFO: Step: 32/38. Waiting... 19:46:43.907 - INFO: Step: 32/38. Waiting... 19:46:53.922 - INFO: Finished. 19:46:53.923 - INFO: CHECK PVC 19:46:53.923 - DEBUG: Running always with --insecure-skip-tls-verify flag to avoid "Unable to connect to the server: x509: certificate signed by unknown authority" 19:46:54.208 - INFO: EXECUTE MIGRATION 19:46:54.260 - INFO: Not started. Waiting... 19:47:04.275 - INFO: Step: 18/49. Waiting... 19:47:14.296 - INFO: Step: 38/49. Waiting... 19:47:24.312 - INFO: Step: 38/49. Waiting... 19:47:34.327 - INFO: Step: 38/49. Waiting... 19:47:44.342 - INFO: Step: 43/49. Waiting... 19:47:54.358 - INFO: Finished. 19:47:54.359 - INFO: VALIDATE APPLICATION 19:47:54.359 - INFO: Validating migrated nginx application in cluster https://api.mtc-target-nzvv.cspilp.interop.ccitredhat.com:6443 19:47:54.423 - DEBUG: Requesting get: http://my-nginx-op-37589-rollbacksuccess.apps.mtc-target-nzvv.cspilp.interop.ccitredhat.com 19:47:54.928 - DEBUG: Validated that 1 errors were reported in the errors log file 19:47:55.328 - DEBUG: Validated that 1 errors and 2 success were reported in the access log file 19:47:55.328 - INFO: EXECUTE ROLLBACK 19:47:55.379 - INFO: Step: 2/11. Waiting... 19:48:05.395 - INFO: Step: 8/11. Waiting... 19:48:15.433 - INFO: Step: 9/11. Waiting... 19:48:25.448 - INFO: Finished. 19:48:25.449 - INFO: VALIDATE APPLICATION 19:48:25.449 - INFO: Validating migrated nginx application in cluster https://api.mtc-source-4bze.cspilp.interop.ccitredhat.com:6443 19:48:25.519 - DEBUG: Requesting get: http://my-nginx-op-37589-rollbacksuccess.apps.mtc-source-4bze.cspilp.interop.ccitredhat.com 19:48:25.946 - DEBUG: Validated that 1 errors were reported in the errors log file 19:48:26.342 - DEBUG: Validated that 1 errors and 2 success were reported in the access log file ---------------------------- Captured log teardown ----------------------------- 19:48:26.343 - INFO: Deleting Migplan test-interop-migplan... 
19:48:36.441 - INFO: Removing app in namespace [op-37589-rollbacksuccess] from cluster [host] 19:48:36.458 - INFO: Removing namespace: op-37589-rollbacksuccess 19:48:36.473 - INFO: Waiting for resources to be deleted 19:48:42.569 - INFO: Removing app in namespace [op-37589-rollbacksuccess] from cluster [source-cluster] 19:48:42.586 - INFO: Removing namespace: op-37589-rollbacksuccess 19:48:42.605 - INFO: Waiting for resources to be deleted 19:48:50.755 - INFO: Removing namespace fixture: op-37589-rollbacksuccess from cluster source-cluster 19:48:50.771 - INFO: Removing namespace fixture: op-37589-rollbacksuccess from cluster host _____________________________ test_mtc_172_interop _____________________________ ------------------------------ Captured log setup ------------------------------ 19:55:53.410 - INFO: Removing namespace fixture: ocp-41820-configmap-source from cluster source-cluster 19:55:54.075 - INFO: Removing namespace fixture: ocp-41820-configmap-target from cluster host 19:55:54.763 - INFO: Removing namespace fixture: ocp-41820-django-source from cluster source-cluster 19:55:54.777 - INFO: Removing namespace fixture: ocp-41820-django-target from cluster host 19:55:54.790 - INFO: Removing namespace fixture: ocp-41820-nginx-source from cluster source-cluster 19:55:54.805 - INFO: Removing namespace fixture: ocp-41820-nginx-target from cluster host 19:55:54.818 - INFO: Removing app [nginx-j2] in namespace [ocp-41820-nginx-source] from cluster [host] 19:55:54.831 - INFO: Waiting for resources to be deleted 19:55:54.845 - INFO: Removing app [nginx-j2] in namespace [ocp-41820-nginx-source] from cluster [source-cluster] 19:55:54.859 - INFO: Waiting for resources to be deleted 19:55:54.874 - INFO: Deploying app [nginx-j2] in namespace [ocp-41820-nginx-source] in cluster [source-cluster] 19:55:54.888 - DEBUG: Running always new-project with --skip-config-write=true flag to avoid overriding the configuration 19:55:54.888 - DEBUG: Running always with --insecure-skip-tls-verify flag to avoid "Unable to connect to the server: x509: certificate signed by unknown authority" 19:55:55.075 - INFO: Deployed namespace ocp-41820-nginx-source in host https://api.mtc-source-4bze.cspilp.interop.ccitredhat.com:6443 19:55:55.095 - INFO: Namespace properly created and "Active": ocp-41820-nginx-source 19:55:55.095 - INFO: Deploying nginx application in namespace in cluster https://api.mtc-source-4bze.cspilp.interop.ccitredhat.com:6443 19:55:55.095 - DEBUG: Rendering template: nginxpv/deployment.yml.j2 19:55:55.096 - DEBUG: Using vars: app_name: nginx app_namespace: ocp-41820-nginx-source deployment_api: apps/v1 html_accessmode: ReadWriteOnce logs_accessmode: ReadWriteOnce storage_class: default storage_size: 1Gi 19:56:10.441 - DEBUG: Requesting get: http://my-nginx-ocp-41820-nginx-source.apps.mtc-source-4bze.cspilp.interop.ccitredhat.com 19:56:10.871 - DEBUG: Requesting get: http://my-nginx-ocp-41820-nginx-source.apps.mtc-source-4bze.cspilp.interop.ccitredhat.com 19:56:10.895 - INFO: Validating nginx application in cluster https://api.mtc-source-4bze.cspilp.interop.ccitredhat.com:6443 19:56:11.251 - DEBUG: Validated that 1 errors were reported in the errors log file 19:56:11.649 - DEBUG: Validated that 1 errors and 1 success were reported in the access log file 19:56:11.650 - INFO: Removing app [ocp-django] in namespace [ocp-41820-django-source] from cluster [host] 19:56:11.650 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:56:12.246 - INFO: 
[WARNING]: No inventory was parsed, only implicit localhost is available 19:56:12.249 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 19:56:12.249 - INFO: the implicit localhost does not match 'all' 19:56:12.353 - INFO: [WARNING]: Found variable using reserved name: namespace 19:56:12.354 - INFO: 19:56:12.354 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:56:13.294 - INFO: 19:56:13.294 - INFO: TASK [Gathering Facts] ********************************************************* 19:56:13.294 - INFO: ok: [localhost] 19:56:13.315 - INFO: 19:56:13.315 - INFO: TASK [include_vars] ************************************************************ 19:56:13.315 - INFO: ok: [localhost] 19:56:15.069 - INFO: 19:56:15.069 - INFO: TASK [ocp-django : Remove namespace ocp-41820-django-source] ******************* 19:56:15.069 - INFO: ok: [localhost] 19:56:16.952 - INFO: 19:56:16.952 - INFO: TASK [Remove Namespace ocp-41820-django-source] ******************************** 19:56:16.952 - INFO: ok: [localhost] 19:56:16.965 - INFO: 19:56:16.965 - INFO: PLAY RECAP ********************************************************************* 19:56:16.965 - INFO: localhost : ok=4  changed=0 unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:56:17.120 - DEBUG: Removed private data directory: /tmp/tmpslgd_azt 19:56:17.121 - INFO: Removing app [ocp-django] in namespace [ocp-41820-django-source] from cluster [source-cluster] 19:56:17.121 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:56:17.711 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:56:17.714 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 19:56:17.714 - INFO: the implicit localhost does not match 'all' 19:56:17.815 - INFO: [WARNING]: Found variable using reserved name: namespace 19:56:17.816 - INFO: 19:56:17.816 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:56:18.757 - INFO: 19:56:18.758 - INFO: TASK [Gathering Facts] ********************************************************* 19:56:18.758 - INFO: ok: [localhost] 19:56:18.778 - INFO: 19:56:18.779 - INFO: TASK [include_vars] ************************************************************ 19:56:18.779 - INFO: ok: [localhost] 19:56:20.509 - INFO: 19:56:20.509 - INFO: TASK [ocp-django : Remove namespace ocp-41820-django-source] ******************* 19:56:20.509 - INFO: ok: [localhost] 19:56:22.315 - INFO: 19:56:22.316 - INFO: TASK [Remove Namespace ocp-41820-django-source] ******************************** 19:56:22.316 - INFO: ok: [localhost] 19:56:22.329 - INFO: 19:56:22.329 - INFO: PLAY RECAP ********************************************************************* 19:56:22.329 - INFO: localhost : ok=4  changed=0 unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:56:22.487 - DEBUG: Removed private data directory: /tmp/tmp_dx48zl9 19:56:22.488 - INFO: Deploying app [ocp-django] in namespace [ocp-41820-django-source] in cluster [source-cluster] 19:56:22.488 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/deploy.yml 19:56:23.095 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:56:23.099 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. 
Note that 19:56:23.099 - INFO: the implicit localhost does not match 'all' 19:56:23.200 - INFO: [WARNING]: Found variable using reserved name: namespace 19:56:23.200 - INFO: 19:56:23.200 - INFO: PLAY [Deploy Application] ****************************************************** 19:56:24.127 - INFO: 19:56:24.127 - INFO: TASK [Gathering Facts] ********************************************************* 19:56:24.127 - INFO: ok: [localhost] 19:56:24.148 - INFO: 19:56:24.148 - INFO: TASK [include_vars] ************************************************************ 19:56:24.148 - INFO: ok: [localhost] 19:56:25.924 - INFO: 19:56:25.924 - INFO: TASK [ocp-django : Check namespace] ******************************************** 19:56:25.924 - INFO: ok: [localhost] 19:56:26.550 - INFO: 19:56:26.550 - INFO: TASK [ocp-django : Create namespace] ******************************************* 19:56:26.550 - INFO: changed: [localhost] 19:56:27.669 - INFO: 19:56:27.669 - INFO: TASK [ocp-django : Create the mtc test django psql persistent template] ******** 19:56:27.669 - INFO: changed: [localhost] 19:56:28.477 - INFO: 19:56:28.477 - INFO: TASK [ocp-django : Create openshift django psql persisten application from openshift templates] *** 19:56:28.477 - INFO: ok: [localhost] 19:56:30.092 - INFO: FAILED - RETRYING: Check postgresql pod status (60 retries left). 19:56:36.672 - INFO: FAILED - RETRYING: Check postgresql pod status (59 retries left). 19:56:43.263 - INFO: FAILED - RETRYING: Check postgresql pod status (58 retries left). 19:56:49.845 - INFO: FAILED - RETRYING: Check postgresql pod status (57 retries left). 19:56:56.412 - INFO: FAILED - RETRYING: Check postgresql pod status (56 retries left). 19:57:02.987 - INFO: 19:57:02.987 - INFO: TASK [ocp-django : Check postgresql pod status] ******************************** 19:57:02.987 - INFO: ok: [localhost] 19:57:04.593 - INFO: FAILED - RETRYING: Check application pod status (60 retries left). 19:57:11.152 - INFO: FAILED - RETRYING: Check application pod status (59 retries left). 19:57:17.734 - INFO: FAILED - RETRYING: Check application pod status (58 retries left). 19:57:24.327 - INFO: FAILED - RETRYING: Check application pod status (57 retries left). 19:57:30.912 - INFO: FAILED - RETRYING: Check application pod status (56 retries left). 19:57:37.543 - INFO: FAILED - RETRYING: Check application pod status (55 retries left). 
19:57:44.162 - INFO: 19:57:44.162 - INFO: TASK [ocp-django : Check application pod status] ******************************* 19:57:44.162 - INFO: ok: [localhost] 19:57:45.820 - INFO: 19:57:45.820 - INFO: TASK [ocp-django : Get route] ************************************************** 19:57:45.820 - INFO: ok: [localhost] 19:57:46.563 - INFO: 19:57:46.563 - INFO: TASK [ocp-django : Access the html file] *************************************** 19:57:46.563 - INFO: ok: [localhost] => (item=1) 19:57:47.147 - INFO: ok: [localhost] => (item=2) 19:57:47.734 - INFO: ok: [localhost] => (item=3) 19:57:48.309 - INFO: ok: [localhost] => (item=4) 19:57:48.885 - INFO: ok: [localhost] => (item=5) 19:57:48.907 - INFO: 19:57:48.907 - INFO: PLAY RECAP ********************************************************************* 19:57:48.907 - INFO: localhost : ok=10  changed=2  unreachable=0 failed=0 skipped=1  rescued=0 ignored=0 19:57:49.064 - DEBUG: Removed private data directory: /tmp/tmp2m410kig 19:57:49.065 - INFO: Removing app [ocp-configmap] in namespace [ocp-41820-configmap-source] from cluster [host] 19:57:49.066 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:57:49.662 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:57:49.665 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 19:57:49.665 - INFO: the implicit localhost does not match 'all' 19:57:49.766 - INFO: [WARNING]: Found variable using reserved name: namespace 19:57:49.766 - INFO: 19:57:49.766 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:57:50.696 - INFO: 19:57:50.697 - INFO: TASK [Gathering Facts] ********************************************************* 19:57:50.697 - INFO: ok: [localhost] 19:57:50.718 - INFO: 19:57:50.718 - INFO: TASK [include_vars] ************************************************************ 19:57:50.718 - INFO: ok: [localhost] 19:57:52.467 - INFO: 19:57:52.468 - INFO: TASK [ocp-configmap : Remove namespace ocp-41820-configmap-source] ************* 19:57:52.468 - INFO: ok: [localhost] 19:57:54.304 - INFO: 19:57:54.304 - INFO: TASK [Remove Namespace ocp-41820-configmap-source] ***************************** 19:57:54.304 - INFO: ok: [localhost] 19:57:54.317 - INFO: 19:57:54.317 - INFO: PLAY RECAP ********************************************************************* 19:57:54.317 - INFO: localhost : ok=4  changed=0 unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:57:54.469 - DEBUG: Removed private data directory: /tmp/tmpxlp06ee3 19:57:54.469 - INFO: Removing app [ocp-configmap] in namespace [ocp-41820-configmap-source] from cluster [source-cluster] 19:57:54.470 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml 19:57:55.061 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:57:55.064 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. 
Note that 19:57:55.064 - INFO: the implicit localhost does not match 'all' 19:57:55.167 - INFO: [WARNING]: Found variable using reserved name: namespace 19:57:55.167 - INFO: 19:57:55.167 - INFO: PLAY [Execute Application Cleanup] ********************************************* 19:57:56.096 - INFO: 19:57:56.096 - INFO: TASK [Gathering Facts] ********************************************************* 19:57:56.096 - INFO: ok: [localhost] 19:57:56.116 - INFO: 19:57:56.116 - INFO: TASK [include_vars] ************************************************************ 19:57:56.116 - INFO: ok: [localhost] 19:57:57.886 - INFO: 19:57:57.886 - INFO: TASK [ocp-configmap : Remove namespace ocp-41820-configmap-source] ************* 19:57:57.887 - INFO: ok: [localhost] 19:57:59.757 - INFO: 19:57:59.758 - INFO: TASK [Remove Namespace ocp-41820-configmap-source] ***************************** 19:57:59.758 - INFO: ok: [localhost] 19:57:59.771 - INFO: 19:57:59.771 - INFO: PLAY RECAP ********************************************************************* 19:57:59.771 - INFO: localhost : ok=4  changed=0 unreachable=0 failed=0 skipped=8  rescued=0 ignored=0 19:57:59.925 - DEBUG: Removed private data directory: /tmp/tmpsaicwhyp 19:57:59.926 - INFO: Deploying app [ocp-configmap] in namespace [ocp-41820-configmap-source] in cluster [source-cluster] 19:57:59.926 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/deploy.yml 19:58:00.513 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available 19:58:00.517 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that 19:58:00.517 - INFO: the implicit localhost does not match 'all' 19:58:00.617 - INFO: [WARNING]: Found variable using reserved name: namespace 19:58:00.617 - INFO: 19:58:00.617 - INFO: PLAY [Deploy Application] ****************************************************** 19:58:01.549 - INFO: 19:58:01.550 - INFO: TASK [Gathering Facts] ********************************************************* 19:58:01.550 - INFO: ok: [localhost] 19:58:01.570 - INFO: 19:58:01.570 - INFO: TASK [include_vars] ************************************************************ 19:58:01.570 - INFO: ok: [localhost] 19:58:03.381 - INFO: 19:58:03.381 - INFO: TASK [ocp-configmap : Check namespace] ***************************************** 19:58:03.381 - INFO: ok: [localhost] 19:58:04.029 - INFO: 19:58:04.029 - INFO: TASK [ocp-configmap : Create namespace] **************************************** 19:58:04.029 - INFO: changed: [localhost] 19:58:05.848 - INFO: 19:58:05.848 - INFO: TASK [ocp-configmap : Deploy configuration map] ******************************** 19:58:05.848 - INFO: changed: [localhost] 19:58:07.513 - INFO: 19:58:07.513 - INFO: TASK [ocp-configmap : Deploy redis appliation] ********************************* 19:58:07.513 - INFO: changed: [localhost] 19:58:07.632 - INFO: 19:58:07.632 - INFO: PLAY RECAP ********************************************************************* 19:58:07.632 - INFO: localhost : ok=6  changed=3  unreachable=0 failed=0 skipped=5  rescued=0 ignored=0 19:58:07.788 - DEBUG: Removed private data directory: /tmp/tmpjj_4jzjt 19:58:07.914 - INFO: Migplan test-interop-migplan has been created 19:58:07.915 - INFO: Migplan IDIM: False 19:58:07.915 - INFO: Migplan IDVM: False 19:58:08.003 - INFO: Waiting for Ready status... 19:58:18.017 - INFO: Waiting for Ready status... 19:58:28.031 - INFO: The migration plan is Ready. 
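Editor's note: the nginx fixture in this setup is rendered from nginxpv/deployment.yml.j2 with the vars logged at 19:55:55 before being applied to the source cluster. A minimal sketch of that rendering step, assuming a local templates directory (the loader path is ours; the variable names and values come straight from the log):

    # Render the nginx deployment template with the vars logged above; applying the
    # resulting manifest to the source cluster is left out of this sketch.
    from jinja2 import Environment, FileSystemLoader

    env = Environment(loader=FileSystemLoader("templates"))  # assumed template root
    manifest = env.get_template("nginxpv/deployment.yml.j2").render(
        app_name="nginx",
        app_namespace="ocp-41820-nginx-source",
        deployment_api="apps/v1",
        html_accessmode="ReadWriteOnce",
        logs_accessmode="ReadWriteOnce",
        storage_class="default",
        storage_size="1Gi",
    )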
------------------------------ Captured log call -------------------------------
19:58:28.032 - INFO: Migrating from ns:ocp-41820-nginx-source in cluster:source-cluster to ns:ocp-41820-nginx-target in cluster:host
19:58:28.032 - INFO: Migrating from ns:ocp-41820-django-source in cluster:source-cluster to ns:ocp-41820-django-target in cluster:host
19:58:28.032 - INFO: Migrating from ns:ocp-41820-configmap-source in cluster:source-cluster to ns:ocp-41820-configmap-target in cluster:host
19:58:28.032 - INFO: Migplan test-interop-migplan. Wait until ready
19:58:28.046 - INFO: The migration plan is Ready.
19:58:28.046 - INFO: MIGPLAN READY
19:58:28.046 - INFO: NO WARNINGS IN MIGPLAN
19:58:28.046 - INFO: EXECUTE MIGRATION
19:58:28.093 - INFO: Not started. Waiting...
19:58:38.106 - INFO: Step: 18/49. Waiting...
19:58:48.121 - INFO: Step: 18/49. Waiting...
19:58:58.135 - INFO: Step: 38/49. Waiting...
19:59:08.149 - INFO: Step: 38/49. Waiting...
19:59:18.162 - INFO: Step: 38/49. Waiting...
19:59:28.178 - INFO: Step: 38/49. Waiting...
19:59:38.192 - INFO: Step: 38/49. Waiting...
19:59:48.206 - INFO: Step: 43/49. Waiting...
19:59:58.221 - INFO: Finished.
19:59:58.221 - INFO: VALIDATE APPLICATION
19:59:58.221 - INFO: Validating migrated nginx application in cluster https://api.mtc-target-nzvv.cspilp.interop.ccitredhat.com:6443
20:00:07.334 - DEBUG: Requesting get: http://my-nginx-ocp-41820-nginx-target.apps.mtc-target-nzvv.cspilp.interop.ccitredhat.com
20:00:07.790 - DEBUG: Validated that 1 errors were reported in the errors log file
20:00:08.193 - DEBUG: Validated that 1 errors and 2 success were reported in the access log file
20:00:08.193 - INFO: VALIDATE APPLICATION
20:00:08.193 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/validate.yml
20:00:08.786 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available
20:00:08.789 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that
20:00:08.790 - INFO: the implicit localhost does not match 'all'
20:00:08.888 - INFO: [WARNING]: Found variable using reserved name: namespace
20:00:08.889 - INFO: PLAY [Validate application] ****************************************************
20:00:09.811 - INFO: TASK [Gathering Facts] *********************************************************
20:00:09.811 - INFO: ok: [localhost]
20:00:09.831 - INFO: TASK [include_vars] ************************************************************
20:00:09.831 - INFO: ok: [localhost]
20:00:11.782 - INFO: FAILED - RETRYING: Check postgresql pod status (60 retries left).
20:00:18.448 - INFO: TASK [ocp-django : Check postgresql pod status] ********************************
20:00:18.448 - INFO: ok: [localhost]
20:00:20.123 - INFO: FAILED - RETRYING: Check application pod status (60 retries left).
20:00:26.741 - INFO: FAILED - RETRYING: Check application pod status (59 retries left).
20:00:33.386 - INFO: TASK [ocp-django : Check application pod status] *******************************
20:00:33.386 - INFO: ok: [localhost]
20:00:35.068 - INFO: TASK [ocp-django : Get route] **************************************************
20:00:35.068 - INFO: ok: [localhost]
20:00:35.798 - INFO: TASK [ocp-django : Access the html file] ***************************************
20:00:35.798 - INFO: ok: [localhost] => (item=1)
20:00:36.378 - INFO: ok: [localhost] => (item=2)
20:00:36.945 - INFO: ok: [localhost] => (item=3)
20:00:37.529 - INFO: ok: [localhost] => (item=4)
20:00:38.099 - INFO: ok: [localhost] => (item=5)
20:00:38.121 - INFO: PLAY RECAP *********************************************************************
20:00:38.121 - INFO: localhost : ok=6 changed=0 unreachable=0 failed=0 skipped=5 rescued=0 ignored=0
20:00:38.279 - DEBUG: Removed private data directory: /tmp/tmp8n0eyp6b
20:00:38.279 - INFO: VALIDATE APPLICATION
20:00:38.280 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/validate.yml
20:00:38.869 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available
20:00:38.872 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that
20:00:38.873 - INFO: the implicit localhost does not match 'all'
20:00:38.971 - INFO: [WARNING]: Found variable using reserved name: namespace
20:00:38.972 - INFO: PLAY [Validate application] ****************************************************
20:00:39.906 - INFO: TASK [Gathering Facts] *********************************************************
20:00:39.907 - INFO: ok: [localhost]
20:00:39.927 - INFO: TASK [include_vars] ************************************************************
20:00:39.927 - INFO: ok: [localhost]
20:00:41.854 - INFO: TASK [ocp-configmap : Check config map] ****************************************
20:00:41.854 - INFO: ok: [localhost]
20:00:43.590 - INFO: TASK [ocp-configmap : Check redis app] *****************************************
20:00:43.590 - INFO: ok: [localhost]
20:00:44.524 - INFO: TASK [ocp-configmap : Verify max memory configuration] *************************
20:00:44.524 - INFO: changed: [localhost]
20:00:45.424 - INFO: TASK [ocp-configmap : Verify memory policy configuration] **********************
20:00:45.424 - INFO: changed: [localhost]
20:00:45.442 - INFO: PLAY RECAP *********************************************************************
20:00:45.442 - INFO: localhost : ok=6 changed=2 unreachable=0 failed=0 skipped=5 rescued=0 ignored=0
20:00:45.594 - DEBUG: Removed private data directory: /tmp/tmpiqsulpgd
---------------------------- Captured log teardown -----------------------------
20:00:45.595 - INFO: Deleting Migplan test-interop-migplan...
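Each "Execute playbook: ... / Removed private data directory: /tmp/..." pair above corresponds to one ansible-runner invocation against a throwaway private data directory. A minimal sketch of that pattern, assuming the ansible-runner Python API; the helper name and its arguments are illustrative rather than the suite's actual wrapper code.

import tempfile

import ansible_runner

def run_ocpdeployer_playbook(playbook_path: str, extra_vars: dict) -> bool:
    # Run the playbook inside a temporary private data directory, which is removed
    # when the context manager exits (the "Removed private data directory" DEBUG lines).
    with tempfile.TemporaryDirectory() as private_data_dir:
        result = ansible_runner.run(
            private_data_dir=private_data_dir,
            playbook=playbook_path,
            extravars=extra_vars,
        )
    return result.status == "successful" and result.rc == 0

The repeated "[WARNING]: Found variable using reserved name: namespace" lines suggest the ocpdeployer playbooks receive the target namespace as an extra variable literally named namespace, which Ansible reserves; renaming that variable in the roles would silence the warning.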
20:00:55.708 - INFO: Removing app in namespace [ocp-41820-nginx-source] from cluster [host]
20:00:55.725 - INFO: Waiting for resources to be deleted
20:00:55.739 - INFO: Removing app in namespace [ocp-41820-nginx-source] from cluster [source-cluster]
20:00:55.786 - INFO: Removing namespace: ocp-41820-nginx-source
20:00:55.805 - INFO: Waiting for resources to be deleted
20:01:09.022 - INFO: Removing app in namespace [ocp-41820-django-source] from cluster [host]
20:01:09.022 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml
20:01:09.622 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available
20:01:09.625 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that
20:01:09.626 - INFO: the implicit localhost does not match 'all'
20:01:09.731 - INFO: [WARNING]: Found variable using reserved name: namespace
20:01:09.731 - INFO: PLAY [Execute Application Cleanup] *********************************************
20:01:10.668 - INFO: TASK [Gathering Facts] *********************************************************
20:01:10.668 - INFO: ok: [localhost]
20:01:10.688 - INFO: TASK [include_vars] ************************************************************
20:01:10.688 - INFO: ok: [localhost]
20:01:12.476 - INFO: TASK [ocp-django : Remove namespace ocp-41820-django-source] *******************
20:01:12.476 - INFO: ok: [localhost]
20:01:14.270 - INFO: TASK [Remove Namespace ocp-41820-django-source] ********************************
20:01:14.270 - INFO: ok: [localhost]
20:01:14.284 - INFO: PLAY RECAP *********************************************************************
20:01:14.284 - INFO: localhost : ok=4 changed=0 unreachable=0 failed=0 skipped=8 rescued=0 ignored=0
20:01:14.439 - DEBUG: Removed private data directory: /tmp/tmphj4gsuq7
20:01:14.440 - INFO: Removing app in namespace [ocp-41820-django-source] from cluster [source-cluster]
20:01:14.440 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml
20:01:15.042 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available
20:01:15.045 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that
20:01:15.045 - INFO: the implicit localhost does not match 'all'
20:01:15.149 - INFO: [WARNING]: Found variable using reserved name: namespace
20:01:15.149 - INFO: PLAY [Execute Application Cleanup] *********************************************
20:01:16.081 - INFO: TASK [Gathering Facts] *********************************************************
20:01:16.081 - INFO: ok: [localhost]
20:01:16.102 - INFO: TASK [include_vars] ************************************************************
20:01:16.102 - INFO: ok: [localhost]
20:01:32.933 - INFO: TASK [ocp-django : Remove namespace ocp-41820-django-source] *******************
20:01:32.933 - INFO: changed: [localhost]
20:01:34.785 - INFO: TASK [Remove Namespace ocp-41820-django-source] ********************************
20:01:34.785 - INFO: ok: [localhost]
20:01:34.798 - INFO: PLAY RECAP *********************************************************************
20:01:34.798 - INFO: localhost : ok=4 changed=1 unreachable=0 failed=0 skipped=8 rescued=0 ignored=0
20:01:34.947 - DEBUG: Removed private data directory: /tmp/tmpkvbj94fz
20:01:34.948 - INFO: Removing app in namespace [ocp-41820-configmap-source] from cluster [host]
20:01:34.948 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml
20:01:35.538 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available
20:01:35.541 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that
20:01:35.541 - INFO: the implicit localhost does not match 'all'
20:01:35.641 - INFO: [WARNING]: Found variable using reserved name: namespace
20:01:35.642 - INFO: PLAY [Execute Application Cleanup] *********************************************
20:01:36.575 - INFO: TASK [Gathering Facts] *********************************************************
20:01:36.575 - INFO: ok: [localhost]
20:01:36.595 - INFO: TASK [include_vars] ************************************************************
20:01:36.595 - INFO: ok: [localhost]
20:01:38.337 - INFO: TASK [ocp-configmap : Remove namespace ocp-41820-configmap-source] *************
20:01:38.337 - INFO: ok: [localhost]
20:01:40.144 - INFO: TASK [Remove Namespace ocp-41820-configmap-source] *****************************
20:01:40.144 - INFO: ok: [localhost]
20:01:40.158 - INFO: PLAY RECAP *********************************************************************
20:01:40.158 - INFO: localhost : ok=4 changed=0 unreachable=0 failed=0 skipped=8 rescued=0 ignored=0
20:01:40.312 - DEBUG: Removed private data directory: /tmp/tmpuq6_uqe4
20:01:40.313 - INFO: Removing app in namespace [ocp-41820-configmap-source] from cluster [source-cluster]
20:01:40.313 - INFO: Execute playbook: /mtc-e2e-qev2/venv/lib/python3.11/site-packages/ocpdeployer/ansible/remove.yml
20:01:40.910 - INFO: [WARNING]: No inventory was parsed, only implicit localhost is available
20:01:40.913 - INFO: [WARNING]: provided hosts list is empty, only localhost is available. Note that
20:01:40.914 - INFO: the implicit localhost does not match 'all'
20:01:41.014 - INFO: [WARNING]: Found variable using reserved name: namespace
20:01:41.015 - INFO: PLAY [Execute Application Cleanup] *********************************************
20:01:41.939 - INFO: TASK [Gathering Facts] *********************************************************
20:01:41.939 - INFO: ok: [localhost]
20:01:41.959 - INFO: TASK [include_vars] ************************************************************
20:01:41.959 - INFO: ok: [localhost]
20:01:53.759 - INFO: TASK [ocp-configmap : Remove namespace ocp-41820-configmap-source] *************
20:01:53.760 - INFO: changed: [localhost]
20:01:55.619 - INFO: TASK [Remove Namespace ocp-41820-configmap-source] *****************************
20:01:55.619 - INFO: ok: [localhost]
20:01:55.631 - INFO: PLAY RECAP *********************************************************************
20:01:55.631 - INFO: localhost : ok=4 changed=1 unreachable=0 failed=0 skipped=8 rescued=0 ignored=0
20:01:55.786 - DEBUG: Removed private data directory: /tmp/tmpy849al34
20:01:55.787 - INFO: Removing namespace fixture: ocp-41820-configmap-source from cluster source-cluster
20:01:55.803 - INFO: Removing namespace fixture: ocp-41820-configmap-target from cluster host
20:01:55.837 - INFO: Waiting for namespace fixture to be deleted
20:02:02.957 - INFO: Removing namespace fixture: ocp-41820-django-source from cluster source-cluster
20:02:02.974 - INFO: Removing namespace fixture: ocp-41820-django-target from cluster host
20:02:03.006 - INFO: Waiting for namespace fixture to be deleted
20:02:15.204 - INFO: Removing namespace fixture: ocp-41820-nginx-source from cluster source-cluster
20:02:15.221 - INFO: Removing namespace fixture: ocp-41820-nginx-target from cluster host
20:02:15.252 - INFO: Waiting for namespace fixture to be deleted
-------------- generated xml file: /mtc-e2e-qev2/junit-report.xml --------------
=========================== short test summary info ============================
PASSED mtc-e2e-qev2/mtc_tests/tests/test_interop.py::test_mtc_98_interop
PASSED mtc-e2e-qev2/mtc_tests/tests/test_interop.py::test_mtc_116_interop
PASSED mtc-e2e-qev2/mtc_tests/tests/test_interop.py::test_mtc_172_interop
FAILED mtc-e2e-qev2/mtc_tests/tests/test_interop.py::test_mtc_87_interop - AssertionError: The application should be validated OK in the target cluster assert False
FAILED mtc-e2e-qev2/mtc_tests/tests/test_interop.py::test_mtc_101_interop - AssertionError: The application should be validated OK in the target cluster assert False
FAILED mtc-e2e-qev2/mtc_tests/tests/test_interop.py::test_mtc_147_interop - AssertionError: The application should be validated OK in the target cluster assert False
============ 3 failed, 3 passed, 150 warnings in 2019.77s (0:33:39) ============
Copying /mtc-e2e-qev2/junit-report.xml to /logs/artifacts/junit_mtc_interop_results.xml...
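The three FAILED entries all report the same assertion message. A hypothetical reconstruction of the assertion shape behind that message is shown below; the real code in test_interop.py is not part of this log, and the helper name and route URL are placeholders. It uses the requests library the suite already depends on.

import requests

def application_responds(route_url: str, timeout: int = 30) -> bool:
    # Hit the migrated application's route and treat any non-200 answer
    # (or a connection error) as a failed validation.
    try:
        return requests.get(route_url, timeout=timeout).status_code == 200
    except requests.RequestException:
        return False

def test_migrated_application_validated():
    # Placeholder route; the real tests derive it from the target cluster's Route object.
    ok = application_responds("http://my-app.apps.example.com")
    # On failure pytest renders exactly the summary seen above:
    #   AssertionError: The application should be validated OK in the target cluster
    #   assert False
    assert ok, "The application should be validated OK in the target cluster"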