Triggered by Paul Wayper GitLab Merge Request #243: insights-platform/tasks_init_just_api => master
Running as SYSTEM
[EnvInject] - Loading node environment variables.
Building remotely on ci-int-jenkins-slave-07-rhel7 (rhel7) in workspace /var/lib/jenkins/workspace/insights-platform-advisor-backend-pr-check
The recommended git tool is: NONE
using credential f4718b82-0cb2-47ef-aea2-dadae3f050a5
Wiping out workspace first.
Cloning the remote Git repository
Cloning repository git@gitlab.cee.redhat.com:insights-platform/advisor-backend.git
 > git init /var/lib/jenkins/workspace/insights-platform-advisor-backend-pr-check # timeout=10
Fetching upstream changes from git@gitlab.cee.redhat.com:insights-platform/advisor-backend.git
 > git --version # timeout=10
 > git --version # 'git version 1.8.3.1'
using GIT_SSH to set credentials devtools-bot-sd-jenkins-ssh
[INFO] Currently running in a labeled security context
[INFO] Currently SELinux is 'enforcing' on the host
 > /usr/bin/chcon --type=ssh_home_t /var/lib/jenkins/workspace/insights-platform-advisor-backend-pr-check@tmp/jenkins-gitclient-ssh5950915550905242001.key
 > git fetch --tags --progress git@gitlab.cee.redhat.com:insights-platform/advisor-backend.git +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url git@gitlab.cee.redhat.com:insights-platform/advisor-backend.git # timeout=10
 > git config --add remote.origin.fetch +refs/heads/*:refs/remotes/origin/* # timeout=10
 > git config remote.origin.url git@gitlab.cee.redhat.com:insights-platform/advisor-backend.git # timeout=10
Fetching upstream changes from git@gitlab.cee.redhat.com:insights-platform/advisor-backend.git
using GIT_SSH to set credentials devtools-bot-sd-jenkins-ssh
[INFO] Currently running in a labeled security context
[INFO] Currently SELinux is 'enforcing' on the host
 > /usr/bin/chcon --type=ssh_home_t /var/lib/jenkins/workspace/insights-platform-advisor-backend-pr-check@tmp/jenkins-gitclient-ssh335711753949659108.key
 > git fetch --tags --progress git@gitlab.cee.redhat.com:insights-platform/advisor-backend.git +refs/heads/*:refs/remotes/origin/* +refs/merge-requests/243/head:refs/remotes/origin/merge-requests/243 # timeout=10
 > git rev-parse 2009b2450bffdd09dbac44821e79141db5672a19^{commit} # timeout=10
 > git branch -a -v --no-abbrev --contains 2009b2450bffdd09dbac44821e79141db5672a19 # timeout=10
Merging Revision 2009b2450bffdd09dbac44821e79141db5672a19 (origin/merge-requests/243, origin/tasks_init_just_api) to origin/master, UserMergeOptions{mergeRemote='origin', mergeTarget='master', mergeStrategy='DEFAULT', fastForwardMode='FF'}
 > git rev-parse origin/master^{commit} # timeout=10
 > git config core.sparsecheckout # timeout=10
 > git checkout -f origin/master # timeout=10
 > git remote # timeout=10
 > git config --get remote.origin.url # timeout=10
using GIT_SSH to set credentials devtools-bot-sd-jenkins-ssh
[INFO] Currently running in a labeled security context
[INFO] Currently SELinux is 'enforcing' on the host
 > /usr/bin/chcon --type=ssh_home_t /var/lib/jenkins/workspace/insights-platform-advisor-backend-pr-check@tmp/jenkins-gitclient-ssh2080842885040389809.key
 > git merge --ff 2009b2450bffdd09dbac44821e79141db5672a19 # timeout=10
 > git rev-parse HEAD^{commit} # timeout=10
Seen branch in repository origin/Always_200_system_PUT
Seen branch in repository origin/PathwaysCategoryFilterFix
Seen branch in repository origin/acks_no_branch_id
Seen branch in repository origin/advisor_spectacular
Seen branch in repository origin/always_200_classic_get
Seen branch in repository origin/always_200_classic_get_GATED
Seen branch in repository origin/always_200_v1_system_put
Seen branch in repository origin/apply_security_patches_dockerfile
Seen branch in repository origin/backend_build_test
Seen branch in repository origin/backend_build_test2
Seen branch in repository origin/ben-test-10
Seen branch in repository origin/ben-test-11
Seen branch in repository origin/ben-test-12
Seen branch in repository origin/ben-test-13
Seen branch in repository origin/ben-test-14
Seen branch in repository origin/ben-test-15
Seen branch in repository origin/ben-test-16
Seen branch in repository origin/ben-test-17
Seen branch in repository origin/ben-test-18
Seen branch in repository origin/ben-test-19
Seen branch in repository origin/ben-test-20
Seen branch in repository origin/ben-test-21
Seen branch in repository origin/ben-test-22
Seen branch in repository origin/ben-test-23
Seen branch in repository origin/ben-test-24
Seen branch in repository origin/ben-test-7
Seen branch in repository origin/ben-test-8
Seen branch in repository origin/bent-test-3
Seen branch in repository origin/bent-test-5
Seen branch in repository origin/bent-test-9
Seen branch in repository origin/bent-test-dont-merge
Seen branch in repository origin/bent-test2
Seen branch in repository origin/better_api_rule_acks
Seen branch in repository origin/bonfire_in_pipenv
Seen branch in repository origin/build_test
Seen branch in repository origin/container_init_in_script
Seen branch in repository origin/dan_build_test
Seen branch in repository origin/disable_show_satellite_hosts
Seen branch in repository origin/docker_build_cache_flush
Seen branch in repository origin/export_needs_rhel_version
Seen branch in repository origin/filter_branch_by_owner
Seen branch in repository origin/fix_boto3_in_watchtower
Seen branch in repository origin/fix_playbook_accept_header
Seen branch in repository origin/gunicorn
Seen branch in repository origin/gunicorn_again
Seen branch in repository origin/handle_multiple_acks_correctly
Seen branch in repository origin/impacted_date_too
Seen branch in repository origin/internal
Seen branch in repository origin/master
Seen branch in repository origin/merge-requests/243
Seen branch in repository origin/more_debugging_on_external_requests
Seen branch in repository origin/new_rhel_versions
Seen branch in repository origin/pathways_remove_id_in
Seen branch in repository origin/permissions_ldap_in_multiple_groups
Seen branch in repository origin/post-results
Seen branch in repository origin/python38_test
Seen branch in repository origin/recent_reports_for_rule
Seen branch in repository origin/remove_dnf_update
Seen branch in repository origin/remove_system_create_validation
Seen branch in repository origin/revert_recent_host_query_changes
Seen branch in repository origin/revert_satcompat_system_status
Seen branch in repository origin/sat_compat_add_system_delete
Seen branch in repository origin/sat_compat_groups_404
Seen branch in repository origin/sat_compat_ignore_show_satellite
Seen branch in repository origin/sat_compat_systems_v1_no_system_empty_object
Seen branch in repository origin/sat_compat_systems_v1_retrieve_404
Seen branch in repository origin/sat_compat_topic_name_case_insensitive
Seen branch in repository origin/service_fix_report_host_obj
Seen branch in repository origin/simpler_systems_reports
Seen branch in repository origin/tasks_init
Seen branch in repository origin/tasks_init_dan
Seen branch in repository origin/tasks_init_just_api
Seen branch in repository origin/tasks_migrations_init
Seen branch in repository origin/tasks_squashed
Seen branch in repository origin/test_bonfire_43
Seen branch in repository origin/test_bonfire_changes
Seen branch in repository origin/test_bonfire_v4
Seen branch in repository origin/test_new_bonfire
Seen branch in repository origin/test_new_oc_wrapper
Seen branch in repository origin/test_pathway_incident_count
Seen branch in repository origin/test_pr2
Seen branch in repository origin/test_pr_tag_naming
Seen branch in repository origin/trigger_report_hooks_report_list_rework
Seen branch in repository origin/update_pipenv_202202
Seen branch in repository origin/update_sat_compat_readme
Seen branch in repository origin/v1_systems_no_upload_required
Seen branch in repository origin/vacuum_db_no_systems_fix
Seen branch in repository origin/vwildner/add-iqe-tasks
Seen 88 remote branches
 > git show-ref --tags -d # timeout=10
Checking out Revision 2009b2450bffdd09dbac44821e79141db5672a19 (origin/merge-requests/243, origin/tasks_init_just_api, origin/master)
 > git config core.sparsecheckout # timeout=10
 > git checkout -f 2009b2450bffdd09dbac44821e79141db5672a19 # timeout=10
Commit message: "Reduce poll time to 10 seconds (speedier SIGTERM response)"
 > git rev-list --no-walk 2009b2450bffdd09dbac44821e79141db5672a19 # timeout=10
 > git rev-list --no-walk 2009b2450bffdd09dbac44821e79141db5672a19 # timeout=10
 > git rev-list --no-walk 2009b2450bffdd09dbac44821e79141db5672a19 # timeout=10
Retrieving secret: insights-cicd/ephemeral-bot-svc-account
Retrieving secret: app-sre/creds/app-interface/production/basic-auth
Retrieving secret: insights-cicd/rh-registry-pull
Retrieving secret: app-sre/quay/cloudservices-push
Retrieving secret: insights-cicd/insightsdroid-github
Retrieving secret: insights-cicd/quay/cloudservices
Retrieving secret: insights/secrets/qe/global/sonarqube
Retrieving secret: insights/secrets/qe/global/gitlab-sa
Retrieving secret: insights/secrets/qe/global/ibutsu
[insights-platform-advisor-backend-pr-check] $ /bin/sh -xe /tmp/jenkins6803277079360783208.sh
+ ./pr_check.sh
Red Hat Enterprise Linux Server release 7.9 (Maipo)
++ export DOCKER_CONFIG=/var/lib/jenkins/workspace/insights-platform-advisor-backend-pr-check/.docker
++ DOCKER_CONFIG=/var/lib/jenkins/workspace/insights-platform-advisor-backend-pr-check/.docker
++ rm -fr /var/lib/jenkins/workspace/insights-platform-advisor-backend-pr-check/.docker
++ mkdir /var/lib/jenkins/workspace/insights-platform-advisor-backend-pr-check/.docker
++ export KUBECONFIG_DIR=/var/lib/jenkins/workspace/insights-platform-advisor-backend-pr-check/.kube
++ KUBECONFIG_DIR=/var/lib/jenkins/workspace/insights-platform-advisor-backend-pr-check/.kube
++ export KUBECONFIG=/var/lib/jenkins/workspace/insights-platform-advisor-backend-pr-check/.kube/config
++ KUBECONFIG=/var/lib/jenkins/workspace/insights-platform-advisor-backend-pr-check/.kube/config
++ rm -fr /var/lib/jenkins/workspace/insights-platform-advisor-backend-pr-check/.kube
++ mkdir /var/lib/jenkins/workspace/insights-platform-advisor-backend-pr-check/.kube
++ set +x
Collecting pip
  Using cached https://files.pythonhosted.org/packages/a4/6d/6463d49a933f547439d6b5b98b46af8742cc03ae83543e4d7688c2420f8b/pip-21.3.1-py3-none-any.whl
Collecting setuptools<58
  Using cached https://files.pythonhosted.org/packages/4b/b9/71687c5d2034c863db1db2e0704f5e27581ff3cb44d7f293968c5e08ceb3/setuptools-57.5.0-py3-none-any.whl
Collecting wheel
  Using cached https://files.pythonhosted.org/packages/27/d6/003e593296a85fd6ed616ed962795b2f87709c3eee2bca4f6d0fe55c6d00/wheel-0.37.1-py2.py3-none-any.whl
Installing collected packages: pip, setuptools, wheel
  Found existing installation: pip 9.0.3
    Uninstalling pip-9.0.3:
      Successfully uninstalled pip-9.0.3
  Found existing installation: setuptools 39.2.0
    Uninstalling setuptools-39.2.0:
      Successfully uninstalled setuptools-39.2.0
Successfully installed pip-21.3.1 setuptools-57.5.0 wheel-0.37.1
You are using pip version 21.3.1, however version 22.1.1 is available.
You should consider upgrading via the 'pip install --upgrade pip' command.
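The `++` trace above shows pr_check.sh giving the job a private docker credential store and kubeconfig inside the workspace before any logins happen. A minimal standalone sketch of that isolation step follows; `WORKSPACE` is an assumption here, standing in for the Jenkins-provided workspace path:

```shell
# Sketch of the config isolation traced above; WORKSPACE is hypothetical
# (Jenkins exports the real workspace path to the job).
WORKSPACE="${WORKSPACE:-$(mktemp -d)}"

# Private docker credential store, so concurrent jobs on the same slave
# cannot overwrite each other's quay.io / registry.redhat.io logins.
export DOCKER_CONFIG="$WORKSPACE/.docker"
rm -fr "$DOCKER_CONFIG"
mkdir "$DOCKER_CONFIG"

# Likewise a private kubeconfig for the oc login that follows.
export KUBECONFIG_DIR="$WORKSPACE/.kube"
export KUBECONFIG="$KUBECONFIG_DIR/config"
rm -fr "$KUBECONFIG_DIR"
mkdir "$KUBECONFIG_DIR"
```

Because both variables are exported, every later `docker` and `oc` invocation in the same shell picks up the per-job directories automatically.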
Collecting crc-bonfire>=4.1.1
  Using cached crc_bonfire-4.2.0-py3-none-any.whl (52 kB)
Collecting multidict
  Using cached multidict-5.2.0-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (159 kB)
Collecting click>=7.1.2
  Using cached click-8.0.4-py3-none-any.whl (97 kB)
Collecting tabulate
  Using cached tabulate-0.8.9-py3-none-any.whl (25 kB)
Collecting python-dotenv
  Using cached python_dotenv-0.20.0-py3-none-any.whl (17 kB)
Collecting cached-property
  Using cached cached_property-1.5.2-py2.py3-none-any.whl (7.6 kB)
Collecting chardet<4.0,>=2.0
  Using cached chardet-3.0.4-py2.py3-none-any.whl (133 kB)
Collecting PyYAML
  Using cached PyYAML-6.0-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (603 kB)
Collecting wait-for
  Using cached wait_for-1.2.0-py2.py3-none-any.whl (10 kB)
Collecting ocviapy>=0.3.2
  Using cached ocviapy-1.0.1-py3-none-any.whl (10 kB)
Collecting gql==3.0.0a6
  Using cached gql-3.0.0a6-py2.py3-none-any.whl
Collecting sh
  Using cached sh-1.14.2-py2.py3-none-any.whl (40 kB)
Collecting requests
  Using cached requests-2.27.1-py2.py3-none-any.whl (63 kB)
Collecting junitparser
  Using cached junitparser-2.5.0-py2.py3-none-any.whl (10 kB)
Collecting app-common-python>=0.1.6
  Using cached app_common_python-0.2.1-py3-none-any.whl (4.6 kB)
Collecting yarl<2.0,>=1.6
  Using cached yarl-1.7.2-cp36-cp36m-manylinux_2_5_x86_64.manylinux1_x86_64.manylinux_2_12_x86_64.manylinux2010_x86_64.whl (270 kB)
Collecting graphql-core<3.2,>=3.1.5
  Using cached graphql_core-3.1.7-py3-none-any.whl (189 kB)
Collecting importlib-metadata
  Using cached importlib_metadata-4.8.3-py3-none-any.whl (17 kB)
Collecting anytree
  Using cached anytree-2.8.0-py2.py3-none-any.whl (41 kB)
Collecting kubernetes
  Using cached kubernetes-23.6.0-py2.py3-none-any.whl (1.5 MB)
Collecting future
  Using cached future-0.18.2-py3-none-any.whl
Collecting certifi>=2017.4.17
  Using cached certifi-2022.5.18.1-py3-none-any.whl (155 kB)
Collecting urllib3<1.27,>=1.21.1
  Using cached urllib3-1.26.9-py2.py3-none-any.whl (138 kB)
Collecting charset-normalizer~=2.0.0
  Using cached charset_normalizer-2.0.12-py3-none-any.whl (39 kB)
Collecting idna<4,>=2.5
  Using cached idna-3.3-py3-none-any.whl (61 kB)
Collecting parsedatetime>=2.5
  Using cached parsedatetime-2.6-py3-none-any.whl (42 kB)
Collecting typing-extensions>=3.7.4
  Using cached typing_extensions-4.1.1-py3-none-any.whl (26 kB)
Collecting six>=1.9.0
  Using cached six-1.16.0-py2.py3-none-any.whl (11 kB)
Collecting zipp>=0.5
  Using cached zipp-3.6.0-py3-none-any.whl (5.3 kB)
Collecting google-auth>=1.0.1
  Using cached google_auth-2.6.6-py2.py3-none-any.whl (156 kB)
Collecting requests-oauthlib
  Using cached requests_oauthlib-1.3.1-py2.py3-none-any.whl (23 kB)
Collecting python-dateutil>=2.5.3
  Using cached python_dateutil-2.8.2-py2.py3-none-any.whl (247 kB)
Collecting websocket-client!=0.40.0,!=0.41.*,!=0.42.*,>=0.32.0
  Using cached websocket_client-1.3.1-py3-none-any.whl (54 kB)
Requirement already satisfied: setuptools>=21.0.0 in ./.bonfire_venv/lib/python3.6/site-packages (from kubernetes->ocviapy>=0.3.2->crc-bonfire>=4.1.1) (57.5.0)
Collecting cachetools<6.0,>=2.0.0
  Using cached cachetools-4.2.4-py3-none-any.whl (10 kB)
Collecting rsa<5,>=3.1.4
  Using cached rsa-4.8-py3-none-any.whl (39 kB)
Collecting pyasn1-modules>=0.2.1
  Using cached pyasn1_modules-0.2.8-py2.py3-none-any.whl (155 kB)
Collecting oauthlib>=3.0.0
  Using cached oauthlib-3.2.0-py3-none-any.whl (151 kB)
Collecting pyasn1<0.5.0,>=0.4.6
  Using cached pyasn1-0.4.8-py2.py3-none-any.whl (77 kB)
Installing collected packages: urllib3, pyasn1, idna, charset-normalizer, certifi, six, rsa, requests, pyasn1-modules, oauthlib, cachetools, zipp, websocket-client, typing-extensions, requests-oauthlib, PyYAML, python-dateutil, parsedatetime, multidict, google-auth, yarl, wait-for, sh, kubernetes, importlib-metadata, graphql-core, future, anytree, tabulate, python-dotenv, ocviapy, junitparser, gql, click, chardet, cached-property, app-common-python, crc-bonfire
Successfully installed PyYAML-6.0 anytree-2.8.0 app-common-python-0.2.1 cached-property-1.5.2 cachetools-4.2.4 certifi-2022.5.18.1 chardet-3.0.4 charset-normalizer-2.0.12 click-8.0.4 crc-bonfire-4.2.0 future-0.18.2 google-auth-2.6.6 gql-3.0.0a6 graphql-core-3.1.7 idna-3.3 importlib-metadata-4.8.3 junitparser-2.5.0 kubernetes-23.6.0 multidict-5.2.0 oauthlib-3.2.0 ocviapy-1.0.1 parsedatetime-2.6 pyasn1-0.4.8 pyasn1-modules-0.2.8 python-dateutil-2.8.2 python-dotenv-0.20.0 requests-2.27.1 requests-oauthlib-1.3.1 rsa-4.8 sh-1.14.2 six-1.16.0 tabulate-0.8.9 typing-extensions-4.1.1 urllib3-1.26.9 wait-for-1.2.0 websocket-client-1.3.1 yarl-1.7.2 zipp-3.6.0
Cloning into '/var/lib/jenkins/workspace/insights-platform-advisor-backend-pr-check/.bonfire'...
++ docker login -u=**** -p=**** quay.io
Login Succeeded
++ docker login '-u=****' -p=**** registry.redhat.io
Login Succeeded
++ set +x
W0525 11:18:44.839191 7355 loader.go:221] Config not found: /var/lib/jenkins/workspace/insights-platform-advisor-backend-pr-check/.kube/config
W0525 11:18:44.915066 7355 loader.go:221] Config not found: /var/lib/jenkins/workspace/insights-platform-advisor-backend-pr-check/.kube/config
W0525 11:18:44.915101 7355 loader.go:221] Config not found: /var/lib/jenkins/workspace/insights-platform-advisor-backend-pr-check/.kube/config
Logged into "****" as "system:serviceaccount:ephemeral-base:ephemeral-bot" using the token provided.
You have access to 167 projects, the list has been suppressed. You can list all projects with 'oc projects'
Using project "".
++ docker login -u=**** -p=**** quay.io
Login Succeeded
++ docker login '-u=****' -p=**** registry.redhat.io
Login Succeeded
++ set +x
checking if image 'quay.io/cloudservices/advisor-backend:pr-243-2009b24' already exists in quay.io...
++ docker build -t quay.io/cloudservices/advisor-backend:pr-243-2009b24 /var/lib/jenkins/workspace/insights-platform-advisor-backend-pr-check -f /var/lib/jenkins/workspace/insights-platform-advisor-backend-pr-check/Dockerfile
Sending build context to Docker daemon 68.39 MB
Step 1/17 : FROM registry.access.redhat.com/ubi8/ubi-minimal:latest
 ---> 08c1631d50a3
Step 2/17 : USER 0
 ---> Using cache
 ---> 6b71520fa2c4
Step 3/17 : RUN FULL_RHEL=$(microdnf repolist --enabled | grep rhel-8) ; if [ -z "$FULL_RHEL" ] ; then rpm -Uvh http://mirror.centos.org/centos/8-stream/BaseOS/x86_64/os/Packages/centos-stream-repos-8-4.el8.noarch.rpm http://mirror.centos.org/centos/8-stream/BaseOS/x86_64/os/Packages/centos-gpg-keys-8-4.el8.noarch.rpm && sed -i 's/^\(enabled.*\)/\1\npriority=200/;' /etc/yum.repos.d/CentOS*.repo ; fi
 ---> Using cache
 ---> 7e59ebfa4d4a
Step 4/17 : RUN microdnf module enable postgresql:13 nodejs:16 python38:3.8 && microdnf install --setopt=tsflags=nodocs -y postgresql nodejs python38 && microdnf upgrade -y && microdnf clean all
 ---> Using cache
 ---> cb904820d732
Step 5/17 : ENV APP_ROOT /opt/app-root/src
 ---> Using cache
 ---> 3c54211f044c
Step 6/17 : WORKDIR $APP_ROOT
 ---> Using cache
 ---> 0ea9e410efbd
Step 7/17 : RUN mkdir /{.config,.local} && chgrp -R 0 /{.config,.local} $APP_ROOT && chmod -R g=u /{.config,.local} $APP_ROOT
 ---> Using cache
 ---> 8f524e1ef85a
Step 8/17 : COPY package.json ./
 ---> Using cache
 ---> d4c59e9e6ba2
Step 9/17 : RUN npm install
 ---> Using cache
 ---> 49db0c5c805d
Step 10/17 : COPY Pipfile* ./
 ---> Using cache
 ---> df5255bec60a
Step 11/17 : RUN python -m pip install --upgrade pip && python -m pip install pipenv && pipenv install --system --dev
 ---> Using cache
 ---> d18e7ad253d8
Step 12/17 : USER 1001
 ---> Using cache
 ---> 6a239de22c73
Step 13/17 : COPY .flake8 .coveragerc refresh_db.sh container_init.sh ./
 ---> Using cache
 ---> f19785b372b3
Step 14/17 : COPY api api
 ---> Using cache
 ---> 18926c9d8e2e
Step 15/17 : COPY service service
 ---> Using cache
 ---> cbbc5d7797e7
Step 16/17 : EXPOSE 8000
 ---> Using cache
 ---> 5f052746f3f8
Step 17/17 : LABEL quay.expires-after 3d
 ---> Using cache
 ---> e20349f6b230
Successfully built e20349f6b230
++ set +x
++ docker push quay.io/cloudservices/advisor-backend:pr-243-2009b24
The push refers to a repository [quay.io/cloudservices/advisor-backend]
d24ddcb5c5dd: Preparing
2cbbe1e314d1: Preparing
557787263ad0: Preparing
791059901329: Preparing
d22b4d5cf71e: Preparing
5b6b75b0f8e6: Preparing
05edc5b3a8f2: Preparing
f2673c237fb7: Preparing
651c6b14aedf: Preparing
7443f40260af: Preparing
386aadb581ce: Preparing
e34e3bdec276: Preparing
dff9f8de74c0: Preparing
05edc5b3a8f2: Waiting
386aadb581ce: Waiting
e34e3bdec276: Waiting
f2673c237fb7: Waiting
dff9f8de74c0: Waiting
651c6b14aedf: Waiting
7443f40260af: Waiting
5b6b75b0f8e6: Waiting
791059901329: Layer already exists
d24ddcb5c5dd: Layer already exists
d22b4d5cf71e: Layer already exists
2cbbe1e314d1: Layer already exists
557787263ad0: Layer already exists
f2673c237fb7: Layer already exists
05edc5b3a8f2: Layer already exists
651c6b14aedf: Layer already exists
5b6b75b0f8e6: Layer already exists
7443f40260af: Layer already exists
386aadb581ce: Layer already exists
e34e3bdec276: Layer already exists
dff9f8de74c0: Layer already exists
pr-243-2009b24: digest: sha256:e57c3ee4f578f565d6b79f7d4ca47981c53cd6d0fb302c96e48fd9c43149deea size: 3041
++ set +x
0844d065473917dac107df7e001cdc8beca7a787f071df9dbaef4f92314bc6bd
====================================
=== Installing Pip Dependencies ====
====================================
Warning: the environment variable LANG is not set!
We recommend setting this in ~/.profile (or equivalent) for proper expected behavior.
Creating a virtualenv for this project...
Pipfile: /opt/app-root/src/Pipfile
Using /usr/bin/python3.8 (3.8.12) to create virtualenv...
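The image built and pushed above is tagged `pr-243-2009b24`, i.e. `pr-` plus the merge-request number plus the first seven characters of the merged commit SHA. A hedged reconstruction of that naming scheme follows; the variable names are assumptions for illustration, not taken from pr_check.sh:

```shell
# Hypothetical sketch of the tag derivation; both values are read
# straight from the log above (MR #243, revision 2009b245...).
PR_NUMBER=243
COMMIT_SHA=2009b2450bffdd09dbac44821e79141db5672a19

# pr-<MR number>-<7-char short SHA>
IMAGE_TAG="pr-${PR_NUMBER}-${COMMIT_SHA:0:7}"
echo "$IMAGE_TAG"   # pr-243-2009b24
```

Tying the tag to the commit makes the pre-push existence check ("checking if image ... already exists in quay.io") cheap: a rebuild of the same revision can reuse the published image, and the `quay.expires-after 3d` label lets stale PR images age out automatically.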
created virtual environment CPython3.8.12.final.0-64 in 424ms
  creator CPython3Posix(dest=/.local/share/virtualenvs/src-dqYAXZ28, clear=False, no_vcs_ignore=False, global=False)
  seeder FromAppData(download=False, pip=bundle, setuptools=bundle, wheel=bundle, via=copy, app_data_dir=/.local/share/virtualenv)
    added seed packages: pip==22.0.4, setuptools==62.1.0, wheel==0.37.1
  activators BashActivator,CShellActivator,FishActivator,NushellActivator,PowerShellActivator,PythonActivator
✔ Successfully created virtual environment!
Virtualenv location: /.local/share/virtualenvs/src-dqYAXZ28
Installing dependencies from Pipfile.lock (0d1ddf)...
🐍 ▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉▉ 113/113 — 00:00:58
To activate this project's virtualenv, run pipenv shell.
Alternatively, run a command inside the virtualenv with pipenv run.
====================================
=== Running Lint Tests ====
====================================
====================================
=== Running API Tests ====
====================================
Found 577 test(s).
Creating test database for alias 'default'...
System check identified no issues (0 silenced).
Running tests...
----------------------------------------------------------------------
.................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................................
---------------------------------------------------------------------- Ran 577 tests in 55.658s OK Generating XML reports... Destroying test database for alias 'default'... /var/lib/jenkins/workspace/insights-platform-advisor-backend-pr-check ==================================== === Running Service Tests ==== ==================================== ============================= test session starts ============================== platform linux -- Python 3.8.12, pytest-7.1.2, pluggy-1.0.0 -- /.local/share/virtualenvs/src-dqYAXZ28/bin/python cachedir: .pytest_cache rootdir: /opt/app-root/src plugins: cov-3.0.0, django-4.5.2, mock-3.7.0, server-fixtures-1.7.0, shutil-1.7.0, Faker-13.7.0 collecting ... %3|1653477656.400|FAIL|rdkafka#producer-1| [thrd:localhost:9092/bootstrap]: localhost:9092/bootstrap: Connect to ipv4#127.0.0.1:9092 failed: Connection refused (after 0ms in state CONNECT) %3|1653477656.408|FAIL|rdkafka#consumer-2| [thrd:localhost:9092/bootstrap]: localhost:9092/bootstrap: Connect to ipv4#127.0.0.1:9092 failed: Connection refused (after 0ms in state CONNECT) Not running in openshift collected 33 items service/tests/test_advisor_service.py::test_previous_upload Operations to perform: Apply all migrations: api Running migrations: Applying api.0001_squashed_initial... OK Applying api.0002_remove_currentreport_unused_indexes... OK Applying api.0003_host_add_account... OK Applying api.0004_host_account_required... OK Applying api.0005_subset_model... OK Applying api.0006_host_add_branch_id... OK Applying api.0007_use_django_jsonfield... OK Applying api.0008_ack_add_remote_branch... OK Applying api.0009_delete_hosttag... OK Applying api.0010_add_playbooks_fields... OK Applying api.0011_make_display_name_nullable... OK Applying api.0012_field_help_text... OK Applying api.0013_host_one2one_inventoryhost... OK Applying api.0014_host_remove_inventory_fields_1... 
OK Applying api.0015_remove_host_rhel_version...%3|1653477657.400|FAIL|rdkafka#producer-1| [thrd:localhost:9092/bootstrap]: localhost:9092/bootstrap: Connect to ipv6#[::1]:9092 failed: Connection refused (after 0ms in state CONNECT) OK Applying api.0016_auto_20210329_1403...%3|1653477657.407|FAIL|rdkafka#consumer-2| [thrd:localhost:9092/bootstrap]: localhost:9092/bootstrap: Connect to ipv6#[::1]:9092 failed: Connection refused (after 0ms in state CONNECT) OK Applying api.0017_delete_null_play_playbooks... OK Applying api.0018_add_pathway_component... OK Applying api.0019_add_slug_to_pathways... OK Applying api.0020_ack_no_branch_id... OK Applying api.0021_alter_dailyhitgroup_id... OK Applying api.0022_add_org_id_columns... OK Creating test database for alias 'default' ('test_insightsapi')... Installed 153 object(s) from 8 fixture(s) PASSED service/tests/test_advisor_service.py::test_autoacks_for_new_account Installed 153 object(s) from 8 fixture(s) PASSED service/tests/test_advisor_service.py::test_handle_engine_results Installed 153 object(s) from 8 fixture(s) PASSED service/tests/test_advisor_service.py::test_satellite_handle_engine_results Installed 153 object(s) from 8 fixture(s) PASSED service/tests/test_advisor_service.py::test_handle_engine_results_bad_keys Installed 153 object(s) from 8 fixture(s) PASSED service/tests/test_advisor_service.py::test_generate_webhook_msgs_new_report Installed 153 object(s) from 8 fixture(s) PASSED service/tests/test_advisor_service.py::test_generate_webhook_msgs_resolved_report Installed 153 object(s) from 8 fixture(s) PASSED service/tests/test_advisor_service.py::test_db_one_failure Installed 153 object(s) from 8 fixture(s) PASSED service/tests/test_advisor_service.py::test_db_repeated_failure Installed 153 object(s) from 8 fixture(s) PASSED service/tests/test_advisor_service.py::test_handle_rule_hits Installed 153 object(s) from 8 fixture(s) PASSED service/tests/test_advisor_service.py::test_db_reports_bad_upload PASSED 
service/tests/test_advisor_service.py::test_db_reports_bad_upload_source PASSED
service/tests/test_advisor_service.py::test_db_reports_upload_source_exception PASSED
service/tests/test_advisor_service.py::test_handle_inventory_delete_event
Installed 153 object(s) from 8 fixture(s)
PASSED
service/tests/test_advisor_service.py::test_bad_db_inventory_upload_delete
Installed 153 object(s) from 8 fixture(s)
PASSED
service/tests/test_advisor_service.py::test_bad_db_inventory_current_report_delete
Installed 153 object(s) from 8 fixture(s)
PASSED
service/tests/test_advisor_service.py::test_bad_db_inventory_hostack_delete
Installed 153 object(s) from 8 fixture(s)
PASSED
service/tests/test_advisor_service.py::test_non_rhel_system_filtering
Installed 153 object(s) from 8 fixture(s)
PASSED
service/tests/test_advisor_service.py::test_executor_unhandled_exception PASSED
service/tests/test_advisor_service.py::test_consume_upload PASSED
service/tests/test_advisor_service.py::test_consume_exception_in_process_archive PASSED
service/tests/test_advisor_service.py::test_consume_error PASSED
service/tests/test_advisor_service.py::test_consume_partition_eof_error PASSED
service/tests/test_advisor_service.py::test_subscribe_and_teardown PASSED
service/tests/test_advisor_service.py::test_sigterm_shutdown
[11:21:06] INFO [insights-advisor-service.:816] [140053786800384 MainThread] [771] Starting Insights Advisor Service
[11:21:08] INFO [insights-advisor-service.terminate:46] [140053786800384 MainThread] [771] SIGTERM received, triggering shutdown
Not running in openshift
PASSED
service/tests/test_advisor_service.py::test_sigterm_shutdown_failed_cloudwatch_setup
[11:21:09] INFO [insights-advisor-service.:816] [140458584387840 MainThread] [787] Starting Insights Advisor Service
[11:21:38] INFO [insights-advisor-service.terminate:46] [140458584387840 MainThread] [787] SIGTERM received, triggering shutdown
Not running in openshift
/.local/share/virtualenvs/src-dqYAXZ28/lib/python3.8/site-packages/watchtower/__init__.py:341: WatchtowerWarning: Failed to deliver logs: Could not connect to the endpoint URL: "https://logs.bogus.amazonaws.com/"
  warnings.warn("Failed to deliver logs: {}".format(e), WatchtowerWarning)
/.local/share/virtualenvs/src-dqYAXZ28/lib/python3.8/site-packages/watchtower/__init__.py:345: WatchtowerWarning: Failed to deliver logs: None
  warnings.warn("Failed to deliver logs: {}".format(response), WatchtowerWarning)
PASSED
service/tests/test_advisor_service.py::test_our_log_formatter
Not running in openshift
{"component": "insights-advisor-service", "@timestamp": "2022-05-25T11:22:10.760Z", "@version": 1, "source_host": "622b5ffc728f", "name": "insights-advisor-service", "args": [], "levelname": "INFO", "levelno": 20, "pathname": "service/service.py", "filename": "service.py", "module": "service", "stack_info": null, "lineno": 816, "funcName": "", "created": 1653477730.760785, "msecs": 760.7851028442383, "relativeCreated": 1266.007423400879, "thread": 139847501144320, "threadName": "MainThread", "processName": "MainProcess", "process": 804, "message": "Starting Insights Advisor Service"}
{"component": "insights-advisor-service", "@timestamp": "2022-05-25T11:22:12.761Z", "@version": 1, "source_host": "622b5ffc728f", "name": "insights-advisor-service", "args": [], "levelname": "INFO", "levelno": 20, "pathname": "service/service.py", "filename": "service.py", "module": "service", "stack_info": null, "lineno": 46, "funcName": "terminate", "created": 1653477732.761487, "msecs": 761.4870071411133, "relativeCreated": 3266.709327697754, "thread": 139847501144320, "threadName": "MainThread", "processName": "MainProcess", "process": 804, "message": "SIGTERM received, triggering shutdown"}
PASSED
service/tests/test_advisor_service.py::test_handle_rule_hits_path PASSED
service/tests/test_advisor_service.py::test_handle_rule_hits_missing_keys PASSED
service/tests/test_advisor_service.py::test_prometheus PASSED
service/tests/test_advisor_service.py::test_handle_inventory_event_path PASSED
service/tests/test_advisor_service.py::test_handle_inventory_event_missing_type PASSED
service/tests/test_advisor_service.py::test_handle_inventory_event_missing_delete_keys PASSED

=============================== warnings summary ===============================
../../../.local/share/virtualenvs/src-dqYAXZ28/lib/python3.8/site-packages/django/conf/__init__.py:229
  /.local/share/virtualenvs/src-dqYAXZ28/lib/python3.8/site-packages/django/conf/__init__.py:229: RemovedInDjango50Warning: The USE_L10N setting is deprecated. Starting with Django 5.0, localized formatting of data will always be enabled. For example Django will display numbers and dates using the format of the current locale.
    warnings.warn(USE_L10N_DEPRECATED_MSG, RemovedInDjango50Warning)

-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
----------- generated xml file: /opt/app-root/src/junit-service.xml ------------
---------- coverage: platform linux, python 3.8.12-final-0 -----------
Coverage HTML written to dir htmlcov
=================== 33 passed, 1 warning in 78.10s (0:01:18) ===================
Destroying test database for alias 'default' ('test_insightsapi')...
=====================================
==== ✔ SUCCESS: PASSED TESTS ====
=====================================
d801ba6ddcb77d03557b96968dfdfb0cc5f80f3b9f7d3c091745db5f85081257
622b5ffc728f0c2783a06d5c5febc365ce6e99f2d443687b59308b4711881311
advisor-test-pr-243-2009b24
/var/lib/jenkins/workspace/insights-platform-advisor-backend-pr-check/.bonfire/cicd/bin/oc_wrapper
++ export BONFIRE_NS_REQUESTER=insights-platform-advisor-backend-pr-check-1031
++ BONFIRE_NS_REQUESTER=insights-platform-advisor-backend-pr-check-1031
+++ bonfire namespace reserve
2022-05-25 11:22:14 [ WARNING] [ MainThread] split() requires a non-empty pattern match.
2022-05-25 11:22:14 [ INFO] [ MainThread] Attempting to reserve a namespace...
2022-05-25 11:22:15 [ INFO] [ MainThread] Checking for existing reservations for 'insights-platform-advisor-backend-pr-check-1031'
2022-05-25 11:22:15 [ INFO] [ MainThread] processing namespace reservation
2022-05-25 11:22:15 [ INFO] [ MainThread] running (pid 9430): oc apply -f -
2022-05-25 11:22:16 [ INFO] [ pid-9430] |stdout| namespacereservation.cloud.redhat.com/bonfire-reservation-a1c81def created
2022-05-25 11:22:16 [ INFO] [ MainThread] waiting for reservation 'bonfire-reservation-a1c81def' to get picked up by operator
2022-05-25 11:22:17 [ INFO] [ MainThread] namespace 'ephemeral-3r0qps' is reserved by 'insights-platform-advisor-backend-pr-check-1031' for '1h'
++ export NAMESPACE=ephemeral-3r0qps
++ NAMESPACE=ephemeral-3r0qps
++ SMOKE_NAMESPACE=ephemeral-3r0qps
++ bonfire deploy advisor --source=appsre --ref-env insights-production --set-template-ref advisor-backend=2009b2450bffdd09dbac44821e79141db5672a19 --set-image-tag quay.io/cloudservices/advisor-backend=pr-243-2009b24 --namespace ephemeral-3r0qps --timeout 600 --optional-deps-method hybrid
2022-05-25 11:22:18 [ WARNING] [ MainThread] split() requires a non-empty pattern match.
2022-05-25 11:22:19 [ INFO] [ MainThread] processing app templates...
2022-05-25 11:22:19 [ INFO] [ MainThread] reading config from: /var/lib/jenkins/.config/bonfire/config.yaml
2022-05-25 11:22:19 [ INFO] [ MainThread] fetching apps config using source: appsre, target env: insights-ephemeral
2022-05-25 11:22:20 [ INFO] [ MainThread] local app configuration overrides found for: ['my_custom_app']
2022-05-25 11:22:20 [ INFO] [ MainThread] subbing app template refs/image tags using environment: insights-production
2022-05-25 11:22:21 [ INFO] [ MainThread] processing app 'advisor'
2022-05-25 11:22:21 [ INFO] [ MainThread] processing component advisor-frontend
2022-05-25 11:22:22 [ INFO] [ MainThread] fetch succeeded for ref 'master'
2022-05-25 11:22:22 [ INFO] [ MainThread] ignoring component advisor-frontend, user opted to disable frontend deployments
2022-05-25 11:22:22 [ INFO] [ MainThread] processing component advisor-backend
2022-05-25 11:22:22 [ INFO] [ MainThread] component: 'advisor-backend' overriding template ref to '2009b2450bffdd09dbac44821e79141db5672a19'
2022-05-25 11:22:22 [ INFO] [ MainThread] replaced 5 occurence(s) of image tag for image 'quay.io/cloudservices/advisor-backend'
2022-05-25 11:22:22 [ INFO] [ MainThread] processing component ingress
2022-05-25 11:22:22 [ INFO] [ MainThread] processing component puptoo
2022-05-25 11:22:22 [ INFO] [ MainThread] processing component storage-broker
2022-05-25 11:22:23 [ INFO] [ MainThread] processing component engine
2022-05-25 11:22:23 [ INFO] [ MainThread] processing component rbac
2022-05-25 11:22:23 [ INFO] [ MainThread] processing component host-inventory
2022-05-25 11:22:24 [ INFO] [ MainThread] processing component xjoin-search
2022-05-25 11:22:24 [ INFO] [ MainThread] fetch succeeded for ref 'master'
2022-05-25 11:22:24 [ INFO] [ MainThread] applying app configs...
2022-05-25 11:22:24 [ INFO] [ MainThread] running (pid 9740): oc apply -f - -n ephemeral-3r0qps
2022-05-25 11:22:24 [ INFO] [ pid-9740] |stdout| clowdapp.cloud.redhat.com/advisor-backend created
2022-05-25 11:22:24 [ INFO] [ pid-9740] |stdout| clowdjobinvocation.cloud.redhat.com/content-deploy-43edf77 created
2022-05-25 11:22:24 [ INFO] [ pid-9740] |stdout| service/insights-advisor-api created
2022-05-25 11:22:24 [ INFO] [ pid-9740] |stdout| configmap/floorplan created
2022-05-25 11:22:25 [ INFO] [ pid-9740] |stdout| clowdapp.cloud.redhat.com/ingress created
2022-05-25 11:22:25 [ INFO] [ pid-9740] |stdout| clowdapp.cloud.redhat.com/puptoo created
2022-05-25 11:22:25 [ INFO] [ pid-9740] |stdout| clowdapp.cloud.redhat.com/storage-broker created
2022-05-25 11:22:25 [ INFO] [ pid-9740] |stdout| configmap/storage-broker-map created
2022-05-25 11:22:25 [ INFO] [ pid-9740] |stdout| clowdapp.cloud.redhat.com/engine created
2022-05-25 11:22:25 [ INFO] [ pid-9740] |stdout| configmap/insights-engine created
2022-05-25 11:22:25 [ INFO] [ pid-9740] |stdout| clowdapp.cloud.redhat.com/rbac created
2022-05-25 11:22:25 [ INFO] [ pid-9740] |stdout| configmap/rbac-env created
2022-05-25 11:22:25 [ INFO] [ pid-9740] |stdout| configmap/rbac-config created
2022-05-25 11:22:25 [ INFO] [ pid-9740] |stdout| configmap/model-access-permissions created
2022-05-25 11:22:25 [ INFO] [ pid-9740] |stdout| secret/rbac-psks created
2022-05-25 11:22:25 [ INFO] [ pid-9740] |stderr| Warning: resource secrets/insights-rbac is missing the kubectl.kubernetes.io/last-applied-configuration annotation which is required by oc apply. oc apply should only be used on resources created declaratively by either oc create --save-config or oc apply. The missing annotation will be patched automatically.
2022-05-25 11:22:25 [ INFO] [ pid-9740] |stdout| secret/insights-rbac configured
2022-05-25 11:22:25 [ INFO] [ pid-9740] |stdout| secret/rbac-secret created
2022-05-25 11:22:25 [ INFO] [ pid-9740] |stdout| service/rbac created
2022-05-25 11:22:25 [ INFO] [ pid-9740] |stdout| clowdapp.cloud.redhat.com/host-inventory created
2022-05-25 11:22:25 [ INFO] [ pid-9740] |stdout| clowdjobinvocation.cloud.redhat.com/events-topic-rebuilder-123456 created
2022-05-25 11:22:25 [ INFO] [ pid-9740] |stdout| service/insights-inventory created
2022-05-25 11:22:25 [ INFO] [ pid-9740] |stdout| configmap/floorplan configured
2022-05-25 11:22:25 [ INFO] [ pid-9740] |stdout| clowdapp.cloud.redhat.com/xjoin-search created
2022-05-25 11:22:25 [ INFO] [ pid-9740] |stdout| service/xjoin-search created
2022-05-25 11:22:35 [ INFO] [ pid-9740] |stdout| elasticsearch.elasticsearch.k8s.elastic.co/xjoin-elasticsearch created
2022-05-25 11:22:35 [ INFO] [ pid-9740] |stdout| xjoinpipeline.xjoin.cloud.redhat.com/xjoinpipeline created
2022-05-25 11:22:35 [ INFO] [ pid-9740] |stderr| Warning: resource networkpolicies/allow-from-xjoin-operator-namespace is missing the kubectl.kubernetes.io/last-applied-configuration annotation which is required by oc apply. oc apply should only be used on resources created declaratively by either oc create --save-config or oc apply. The missing annotation will be patched automatically.
2022-05-25 11:22:35 [ INFO] [ pid-9740] |stdout| networkpolicy.networking.k8s.io/allow-from-xjoin-operator-namespace configured
2022-05-25 11:22:35 [ INFO] [ pid-9740] |stdout| configmap/xjoin created
2022-05-25 11:22:35 [ INFO] [ MainThread] waiting on resources for max of 600sec...
2022-05-25 11:22:35 [ INFO] [ MainThread] [clowdenvironment/env-ephemeral-3r0qps] waiting up to 600sec for resource to be 'ready'
2022-05-25 11:22:37 [ INFO] [ MainThread] [clowdenvironment/env-ephemeral-3r0qps] owned resource deployment/env-ephemeral-3r0qps-featureflags is ready!
2022-05-25 11:22:37 [ INFO] [ MainThread] [clowdenvironment/env-ephemeral-3r0qps] owned resource deployment/env-ephemeral-3r0qps-keycloak is ready!
2022-05-25 11:22:37 [ INFO] [ MainThread] [clowdenvironment/env-ephemeral-3r0qps] owned resource deployment/env-ephemeral-3r0qps-mbop is ready!
2022-05-25 11:22:37 [ INFO] [ MainThread] [clowdenvironment/env-ephemeral-3r0qps] owned resource deployment/env-ephemeral-3r0qps-minio is ready!
2022-05-25 11:22:37 [ INFO] [ MainThread] [clowdenvironment/env-ephemeral-3r0qps] owned resource deployment/env-ephemeral-3r0qps-mocktitlements is ready!
2022-05-25 11:22:37 [ INFO] [ MainThread] [clowdenvironment/env-ephemeral-3r0qps] owned resource deployment/featureflags-db is ready!
2022-05-25 11:22:37 [ INFO] [ MainThread] [clowdenvironment/env-ephemeral-3r0qps] resource is ready!
2022-05-25 11:22:37 [ INFO] [ thread-27] [clowdapp/advisor-backend] waiting up to 598sec for resource to be 'ready'
2022-05-25 11:22:37 [ INFO] [ thread-28] [clowdapp/engine] waiting up to 598sec for resource to be 'ready'
2022-05-25 11:22:37 [ INFO] [ thread-29] [clowdapp/host-inventory] waiting up to 598sec for resource to be 'ready'
2022-05-25 11:22:37 [ INFO] [ thread-30] [clowdapp/ingress] found owned resource deployment/ingress-service, not yet ready
2022-05-25 11:22:37 [ INFO] [ thread-30] [clowdapp/ingress] waiting up to 598sec for resource to be 'ready'
2022-05-25 11:22:37 [ INFO] [ thread-31] [clowdapp/puptoo] waiting up to 598sec for resource to be 'ready'
2022-05-25 11:22:37 [ INFO] [ thread-32] [clowdapp/rbac] waiting up to 598sec for resource to be 'ready'
2022-05-25 11:22:37 [ INFO] [ thread-33] [clowdapp/storage-broker] waiting up to 598sec for resource to be 'ready'
2022-05-25 11:22:37 [ INFO] [ thread-34] [clowdapp/xjoin-search] waiting up to 598sec for resource to be 'ready'
2022-05-25 11:22:51 [ INFO] [ thread-28] [clowdapp/engine] found owned resource deployment/engine-processor, not yet ready
2022-05-25 11:22:51 [ INFO] [ thread-31] [clowdapp/puptoo] found owned resource deployment/puptoo-processor, not yet ready
2022-05-25 11:22:51 [ INFO] [ thread-28] [clowdapp/engine] found owned resource deployment/engine-redis, not yet ready
2022-05-25 11:22:51 [ INFO] [ thread-33] [clowdapp/storage-broker] found owned resource deployment/storage-broker-processor, not yet ready
2022-05-25 11:22:59 [ INFO] [ thread-29] [clowdapp/host-inventory] found owned resource deployment/host-inventory-db, not yet ready
2022-05-25 11:22:59 [ INFO] [ thread-29] [clowdapp/host-inventory] found owned resource deployment/host-inventory-mq-p1, not yet ready
2022-05-25 11:22:59 [ INFO] [ thread-29] [clowdapp/host-inventory] found owned resource deployment/host-inventory-mq-pmin, not yet ready
2022-05-25 11:22:59 [ INFO] [ thread-29] [clowdapp/host-inventory] found owned resource deployment/host-inventory-mq-sp, not yet ready
2022-05-25 11:22:59 [ INFO] [ thread-29] [clowdapp/host-inventory] found owned resource deployment/host-inventory-service, not yet ready
2022-05-25 11:22:59 [ INFO] [ thread-34] [clowdapp/xjoin-search] found owned resource deployment/xjoin-search-api, not yet ready
2022-05-25 11:22:59 [ INFO] [ thread-32] [clowdapp/rbac] found owned resource deployment/rbac-db, not yet ready
2022-05-25 11:22:59 [ INFO] [ thread-32] [clowdapp/rbac] found owned resource deployment/rbac-redis, not yet ready
2022-05-25 11:22:59 [ INFO] [ thread-32] [clowdapp/rbac] owned resource deployment/rbac-scheduler-service is ready!
2022-05-25 11:22:59 [ INFO] [ thread-32] [clowdapp/rbac] found owned resource deployment/rbac-service, not yet ready
2022-05-25 11:22:59 [ INFO] [ thread-32] [clowdapp/rbac] owned resource deployment/rbac-worker-service is ready!
2022-05-25 11:23:07 [ INFO] [ thread-29] [clowdapp/host-inventory] owned resource deployment/host-inventory-mq-pmin is ready!
2022-05-25 11:23:07 [ INFO] [ thread-29] [clowdapp/host-inventory] owned resource deployment/host-inventory-mq-sp is ready!
2022-05-25 11:23:15 [ INFO] [ thread-33] [clowdapp/storage-broker] owned resource deployment/storage-broker-processor is ready!
2022-05-25 11:23:15 [ INFO] [ thread-28] [clowdapp/engine] owned resource deployment/engine-processor is ready!
2022-05-25 11:23:15 [ INFO] [ thread-30] [clowdapp/ingress] owned resource deployment/ingress-service is ready!
2022-05-25 11:23:15 [ INFO] [ thread-33] [clowdapp/storage-broker] resource is ready!
2022-05-25 11:23:15 [ INFO] [ thread-30] [clowdapp/ingress] resource is ready!
2022-05-25 11:23:22 [ INFO] [ thread-29] [clowdapp/host-inventory] owned resource deployment/host-inventory-service is ready!
2022-05-25 11:23:22 [ INFO] [ thread-31] [clowdapp/puptoo] owned resource deployment/puptoo-processor is ready!
2022-05-25 11:23:23 [ INFO] [ thread-31] [clowdapp/puptoo] resource is ready!
2022-05-25 11:23:37 [ INFO] [ thread-34] [clowdapp/xjoin-search] waiting 538sec longer
2022-05-25 11:23:37 [ INFO] [ thread-29] [clowdapp/host-inventory] waiting 538sec longer
2022-05-25 11:23:37 [ INFO] [ thread-28] [clowdapp/engine] waiting 538sec longer
2022-05-25 11:23:37 [ INFO] [ thread-32] [clowdapp/rbac] waiting 538sec longer
2022-05-25 11:23:37 [ INFO] [ thread-27] [clowdapp/advisor-backend] waiting 538sec longer
2022-05-25 11:23:46 [ INFO] [ thread-28] [clowdapp/engine] owned resource deployment/engine-redis is ready!
2022-05-25 11:23:46 [ INFO] [ thread-27] [clowdapp/advisor-backend] found owned resource deployment/advisor-backend-api, not yet ready
2022-05-25 11:23:46 [ INFO] [ thread-27] [clowdapp/advisor-backend] found owned resource deployment/advisor-backend-db, not yet ready
2022-05-25 11:23:46 [ INFO] [ thread-27] [clowdapp/advisor-backend] found owned resource deployment/advisor-backend-service, not yet ready
2022-05-25 11:23:46 [ INFO] [ thread-27] [clowdapp/advisor-backend] owned resource deployment/advisor-backend-tasks-service is ready!
2022-05-25 11:23:46 [ INFO] [ thread-32] [clowdapp/rbac] owned resource deployment/rbac-db is ready!
2022-05-25 11:23:46 [ INFO] [ thread-32] [clowdapp/rbac] owned resource deployment/rbac-redis is ready!
2022-05-25 11:23:46 [ INFO] [ thread-29] [clowdapp/host-inventory] owned resource deployment/host-inventory-db is ready!
2022-05-25 11:23:47 [ INFO] [ thread-28] [clowdapp/engine] resource is ready!
2022-05-25 11:24:10 [ INFO] [ thread-34] [clowdapp/xjoin-search] owned resource deployment/xjoin-search-api is ready!
2022-05-25 11:24:10 [ INFO] [ thread-34] [clowdapp/xjoin-search] resource is ready!
2022-05-25 11:24:18 [ INFO] [ thread-27] [clowdapp/advisor-backend] owned resource deployment/advisor-backend-tasks-service is ready!
2022-05-25 11:24:26 [ INFO] [ thread-29] [clowdapp/host-inventory] owned resource deployment/host-inventory-mq-p1 is ready!
2022-05-25 11:24:26 [ INFO] [ thread-29] [clowdapp/host-inventory] resource is ready!
2022-05-25 11:24:34 [ INFO] [ thread-27] [clowdapp/advisor-backend] owned resource deployment/advisor-backend-db is ready!
2022-05-25 11:24:37 [ INFO] [ thread-32] [clowdapp/rbac] waiting 478sec longer
2022-05-25 11:24:37 [ INFO] [ thread-27] [clowdapp/advisor-backend] waiting 478sec longer
2022-05-25 11:24:49 [ INFO] [ thread-27] [clowdapp/advisor-backend] owned resource deployment/advisor-backend-tasks-service is ready!
2022-05-25 11:24:57 [ INFO] [ thread-27] [clowdapp/advisor-backend] owned resource deployment/advisor-backend-service is ready!
2022-05-25 11:25:05 [ INFO] [ thread-32] [clowdapp/rbac] owned resource deployment/rbac-service is ready!
2022-05-25 11:25:06 [ INFO] [ thread-32] [clowdapp/rbac] resource is ready!
2022-05-25 11:25:13 [ INFO] [ thread-27] [clowdapp/advisor-backend] owned resource deployment/advisor-backend-api is ready!
2022-05-25 11:25:14 [ INFO] [ thread-27] [clowdapp/advisor-backend] resource is ready!
2022-05-25 11:25:14 [ INFO] [ MainThread] all resources being monitored reached 'ready' state
2022-05-25 11:25:14 [ INFO] [ thread-273] [deployment/env-ephemeral-3r0qps-7221b7f8-connect] resource is ready!
2022-05-25 11:25:14 [ INFO] [ thread-274] [deployment/env-ephemeral-3r0qps-7221b7f8-entity-operator] resource is ready!
2022-05-25 11:25:14 [ INFO] [ thread-275] [deployment/prometheus-operator] resource is ready!
2022-05-25 11:25:14 [ INFO] [ thread-276] [statefulset/env-ephemeral-3r0qps-7221b7f8-kafka] owned resource pod/env-ephemeral-3r0qps-7221b7f8-kafka-0 is ready!
2022-05-25 11:25:14 [ INFO] [ thread-277] [statefulset/env-ephemeral-3r0qps-7221b7f8-zookeeper] owned resource pod/env-ephemeral-3r0qps-7221b7f8-zookeeper-0 is ready!
2022-05-25 11:25:14 [ INFO] [ thread-278] [statefulset/prometheus-env-ephemeral-3r0qps] owned resource pod/prometheus-env-ephemeral-3r0qps-0 is ready!
2022-05-25 11:25:14 [ INFO] [ thread-276] [statefulset/env-ephemeral-3r0qps-7221b7f8-kafka] resource is ready!
2022-05-25 11:25:14 [ INFO] [ thread-279] [clowdjobinvocation/content-deploy-43edf77] waiting up to 440sec for resource to be 'ready'
2022-05-25 11:25:14 [ INFO] [ thread-277] [statefulset/env-ephemeral-3r0qps-7221b7f8-zookeeper] resource is ready!
2022-05-25 11:25:14 [ INFO] [ thread-280] [clowdjobinvocation/events-topic-rebuilder-123456] resource is ready!
2022-05-25 11:25:14 [ INFO] [ thread-278] [statefulset/prometheus-env-ephemeral-3r0qps] resource is ready!
2022-05-25 11:25:14 [ INFO] [ thread-281] [kafkaconnect/env-ephemeral-3r0qps-7221b7f8] owned resource deployment/env-ephemeral-3r0qps-7221b7f8-connect is ready!
2022-05-25 11:25:14 [ INFO] [ thread-282] [kafka/env-ephemeral-3r0qps-7221b7f8] owned resource deployment/env-ephemeral-3r0qps-7221b7f8-entity-operator is ready!
2022-05-25 11:25:14 [ INFO] [ thread-283] [xjoinpipeline/xjoinpipeline] resource is ready!
2022-05-25 11:25:14 [ INFO] [ thread-284] [statefulset/xjoin-elasticsearch-es-default] owned resource pod/xjoin-elasticsearch-es-default-0 is ready!
2022-05-25 11:25:14 [ INFO] [ thread-285] [cyndipipeline/advisor] resource is ready!
2022-05-25 11:25:14 [ INFO] [ thread-281] [kafkaconnect/env-ephemeral-3r0qps-7221b7f8] resource is ready!
2022-05-25 11:25:14 [ INFO] [ thread-282] [kafka/env-ephemeral-3r0qps-7221b7f8] owned resource statefulset/env-ephemeral-3r0qps-7221b7f8-kafka is ready!
2022-05-25 11:25:14 [ INFO] [ thread-284] [statefulset/xjoin-elasticsearch-es-default] resource is ready!
2022-05-25 11:25:14 [ INFO] [ thread-282] [kafka/env-ephemeral-3r0qps-7221b7f8] owned resource statefulset/env-ephemeral-3r0qps-7221b7f8-zookeeper is ready!
2022-05-25 11:25:14 [ INFO] [ thread-282] [kafka/env-ephemeral-3r0qps-7221b7f8] resource is ready!
2022-05-25 11:26:10 [ INFO] [ thread-279] [clowdjobinvocation/content-deploy-43edf77] resource is ready!
2022-05-25 11:26:10 [ INFO] [ MainThread] all resources being monitored reached 'ready' state
2022-05-25 11:26:10 [ INFO] [ MainThread] successfully deployed to namespace 'ephemeral-3r0qps'
ephemeral-3r0qps
++ set +x
Running: docker pull quay.io/cloudservices/mc:latest
Trying to pull repository quay.io/cloudservices/mc ...
latest: Pulling from quay.io/cloudservices/mc
Digest: sha256:9e1cb5cea27d4f54e74d9770cbf6a698c4ca264b3492801b8900f2c768a0a152
Status: Image is up to date for quay.io/cloudservices/mc:latest
+++ bonfire deploy-iqe-cji advisor-backend --marker 'advisor_smoke or advisor_api_smoke or advisor_service_smoke' --filter ''\'''\''' --image-tag ''\'''\''' --requirements ''\'''\''' --requirements-priority critical,high,medium,low --test-importance ''\'''\''' --plugins advisor --env clowder_smoke --cji-name advisor-backend --namespace ephemeral-3r0qps
2022-05-25 11:26:10 [ WARNING] [ MainThread] split() requires a non-empty pattern match.
2022-05-25 11:26:11 [ INFO] [ MainThread] processing IQE ClowdJobInvocation
2022-05-25 11:26:12 [ INFO] [ MainThread] running (pid 15405): oc apply -f - -n ephemeral-3r0qps
2022-05-25 11:26:12 [ INFO] [ pid-15405] |stdout| clowdjobinvocation.cloud.redhat.com/advisor-backend created
2022-05-25 11:26:12 [ INFO] [ MainThread] waiting on CJI 'advisor-backend' for max of 600sec...
2022-05-25 11:26:12 [ INFO] [ MainThread] waiting for Job to appear owned by CJI 'advisor-backend'
2022-05-25 11:26:12 [ INFO] [ MainThread] found Job 'advisor-backend-iqe-otm4jk9' created by CJI 'advisor-backend', now waiting for pod to appear
2022-05-25 11:26:12 [ INFO] [ MainThread] found pod 'advisor-backend-iqe-otm4jk9--1-p6zt7' associated with CJI 'advisor-backend', now waiting for pod to be 'running'
2022-05-25 11:26:13 [ INFO] [ MainThread] [pod/advisor-backend-iqe-otm4jk9--1-p6zt7] waiting up to 599sec for resource to be 'ready'
2022-05-25 11:26:18 [ INFO] [ MainThread] [pod/advisor-backend-iqe-otm4jk9--1-p6zt7] resource is ready!
2022-05-25 11:26:18 [ INFO] [ MainThread] pod 'advisor-backend-iqe-otm4jk9--1-p6zt7' related to CJI 'advisor-backend' in ns 'ephemeral-3r0qps' is running
++ POD=advisor-backend-iqe-otm4jk9--1-p6zt7
++ set +x
++ oc_wrapper wait --timeout=30m --for=condition=JobInvocationComplete -n ephemeral-3r0qps cji/advisor-backend
Build timed out (after 30 minutes). Marking the build as failed.
Build was aborted
Recording test results
WARNING:bonfire.openshift:Ignoring immutable field errors
Conflicting option (with another plugin?): `--long-running`
Conflicting option (with another plugin?): `--run-excluded`
Conflicting option (with another plugin?): `--long-running`
Conflicting option (with another plugin?): `--run-excluded`
Conflicting option (with another plugin?): `--dnf-compare`
Conflicting option (with another plugin?): `--remediations`
Conflicting option (with another plugin?): `--vmaas-prod-check`
Conflicting option (with another plugin?): `--local`
⛔ ERROR: Could not run "pip search iqe" to precheck package versions
⚠️ WARNING: installed packages mismatches global constraints, to remediate run:
$ iqe update
Name    Pinned    Installed
------  --------  -----------
pip     22.1      21.3.1
Conflicting option (with another plugin?): `--long-running`
Conflicting option (with another plugin?): `--run-excluded`
Conflicting option (with another plugin?): `--long-running`
Conflicting option (with another plugin?): `--run-excluded`
Conflicting option (with another plugin?): `--dnf-compare`
Conflicting option (with another plugin?): `--remediations`
Conflicting option (with another plugin?): `--vmaas-prod-check`
Conflicting option (with another plugin?): `--local`
Conflicting option (with another plugin?): `--long-running`
Conflicting option (with another plugin?): `--run-excluded`
Conflicting option (with another plugin?): `--long-running`
Conflicting option (with another plugin?): `--run-excluded`
Conflicting option (with another plugin?): `--dnf-compare`
Conflicting option (with another plugin?): `--remediations`
Conflicting option (with another plugin?): `--vmaas-prod-check`
Conflicting option (with another plugin?): `--local`
/iqe_venv/lib/python3.8/site-packages/iqe/base/settings/_find_files.py:69: IQEDeprecationWarning: local LOCAL conf path not used
  warnings.warn("local LOCAL conf path not used", IQEDeprecationWarning)
/iqe_venv/lib/python3.8/site-packages/iqe/base/settings/_find_files.py:69: IQEDeprecationWarning: local LOCAL conf path not used
  warnings.warn("local LOCAL conf path not used", IQEDeprecationWarning)
/iqe_venv/lib/python3.8/site-packages/iqe/base/settings/vault_loading.py:28: DeprecationWarning: Call to deprecated function 'auth_approle'. This method will be removed in version '0.12.0' Please use the 'login' method on the 'hvac.api.auth_methods.approle' class moving forward.
  client.auth_approle(
/iqe_venv/lib/python3.8/site-packages/iqe/base/settings/vault_loading.py:28: DeprecationWarning: Call to deprecated function 'auth_approle'. This method will be removed in version '0.12.0' Please use the 'login' method on the 'hvac.api.auth_methods.approle' class moving forward.
  client.auth_approle(
/iqe_venv/lib/python3.8/site-packages/hvac/v1/__init__.py:526: DeprecationWarning: Call to deprecated function 'renew_self_token'. This method will be removed in version '1.0.0' Please use the 'renew_self' method on the 'hvac.api.auth_methods.token' class moving forward.
  return self.renew_self_token(increment=increment, wrap_ttl=wrap_ttl)
/iqe_venv/lib/python3.8/site-packages/hvac/v1/__init__.py:526: DeprecationWarning: Call to deprecated function 'renew_self_token'. This method will be removed in version '1.0.0' Please use the 'renew_self' method on the 'hvac.api.auth_methods.token' class moving forward.
  return self.renew_self_token(increment=increment, wrap_ttl=wrap_ttl)
============================= test session starts ==============================
platform linux -- Python 3.8.12, pytest-6.2.5, py-1.11.0, pluggy-1.0.0
Default Appliance hostname: cloud.redhat.com
Default Appliance path: /
rootdir: /tmp/tmpgmpp2fbu, testpaths: /iqe_venv/lib/python3.8/site-packages/iqe_advisor
plugins: anyio-3.6.1, hypothesis-6.46.3, iqe-core-22.5.16.1, iqe-metadata-linting-22.4.28.0, iqe-requirements-2022.5.4.1, cov-3.0.0, forked-1.4.0, ibutsu-2.0.2, polarion-collect-0.25.0, report-parameters-0.6.0, subtests-0.6.0, xdist-2.5.0, schemathesis-3.15.2, Faker-13.11.1, iqe-advisor-frontend-plugin-22.5.18.0, iqe-advisor-plugin-22.5.4.0, iqe-compliance-plugin-22.5.18.0, iqe-host-inventory-frontend-plugin-22.5.10.0, iqe-host-inventory-plugin-22.5.17.0, iqe-ingress-plugin-22.5.23.0, iqe-mq-plugin-0.3.0, iqe-patchman-plugin-22.5.24.0, iqe-platform-ui-plugin-2022.5.24.0, iqe-rbac-plugin-22.5.12.0, iqe-remediations-frontend-plugin-22.5.19.0, iqe-remediations-plugin-22.5.13.0, iqe-vm-plugin-22.3.15.0, iqe-vmaas-plugin-22.3.22.0, iqe-vulnerability-plugin-22.5.19.0, html-3.1.1, lazy-fixture-0.6.3, metadata-2.0.1, mock-3.7.0, ordering-0.6, rerunfailures-10.2, requests-mock-1.9.3
gw0 I / gw1 I
[gw0] Python 3.8.12 (default, Sep 16 2021, 10:46:05) -- [GCC 8.5.0 20210514 (Red Hat 8.5.0-3)]
[gw1] Python 3.8.12 (default, Sep 16 2021, 10:46:05) -- [GCC 8.5.0 20210514 (Red Hat 8.5.0-3)]
gw0 [20] / gw1 [20]
scheduling tests via LoadScheduling

tests/api/test_export.py::test_advisor_export_filters[total_risk-val0-True]
tests/api/test_export.py::test_advisor_export_matches_status
[gw0] [ 5%] ERROR tests/api/test_export.py::test_advisor_export_matches_status
[gw1] [ 10%] ERROR tests/api/test_export.py::test_advisor_export_filters[total_risk-val0-True]
tests/api/test_export.py::test_advisor_export_filters[res_risk-val1-True]
tests/api/test_export.py::test_advisor_export_filters[impact-val2-True]
[gw0] [ 15%] ERROR tests/api/test_export.py::test_advisor_export_filters[res_risk-val1-True]
tests/api/test_export.py::test_advisor_export_filters[likelihood-val3-True]
[gw1] [ 20%] ERROR tests/api/test_export.py::test_advisor_export_filters[impact-val2-True]
tests/api/test_export.py::test_advisor_export_filters[category-val4-True]
[gw1] [ 25%] ERROR tests/api/test_export.py::test_advisor_export_filters[category-val4-True]
tests/api/test_export.py::test_advisor_export_filter_combined[filter_options0]
[gw0] [ 30%] ERROR tests/api/test_export.py::test_advisor_export_filters[likelihood-val3-True]
tests/api/test_export.py::test_advisor_export_filter_combined[filter_options1]
[gw1] [ 35%] ERROR tests/api/test_export.py::test_advisor_export_filter_combined[filter_options0]
tests/api/test_export.py::test_advisor_export_filter_combined[filter_options3]
[gw0] [ 40%] ERROR tests/api/test_export.py::test_advisor_export_filter_combined[filter_options1]
tests/api/test_export.py::test_advisor_export_filter_combined[filter_options2]
[gw1] [ 45%] ERROR tests/api/test_export.py::test_advisor_export_filter_combined[filter_options3]
tests/api/test_export.py::test_advisor_export_filter_combined[filter_options4]
[gw0] [ 50%] ERROR tests/api/test_export.py::test_advisor_export_filter_combined[filter_options2]
tests/api/test_export.py::test_advisor_export_filter_combined[filter_options5]
[gw1] [ 55%] ERROR tests/api/test_export.py::test_advisor_export_filter_combined[filter_options4]
tests/api/test_host_acks.py::test_ack_single_system_api
[gw0] [ 60%] ERROR tests/api/test_export.py::test_advisor_export_filter_combined[filter_options5]
tests/api/test_host_delete.py::test_host_delete
[gw1] [ 65%] FAILED tests/api/test_host_acks.py::test_ack_single_system_api
tests/api/test_host_delete.py::test_batch_host_delete
[gw0] [ 70%] FAILED tests/api/test_host_delete.py::test_host_delete
tests/api/test_ratings.py::test_rating_create
[gw0] [ 75%] PASSED tests/api/test_ratings.py::test_rating_create
[Checks API]
No suitable checks publisher found.
WARNING:bonfire.openshift:Ignoring immutable field errors
Archiving artifacts
Terminated
+++ teardown ERR
+++ local CAPTURED_SIGNAL=ERR
+++ add_cicd_bin_to_path
+++ command -v oc_wrapper
/var/lib/jenkins/workspace/insights-platform-advisor-backend-pr-check/.bonfire/cicd/bin/oc_wrapper
+++ set +x
------------------------
----- TEARING DOWN -----
------------------------
Tear down operation triggered by signal: ERR
Running teardown for ns: ephemeral-3r0qps
Errors or failures detected, collecting K8s artifacts
Collecting container logs...
sent [/var/lib/jenkins/workspace/insights-platform-advisor-backend-pr-check/artifacts/k8s_artifacts/ephemeral-3r0qps/logs/advisor-backend-api-7dddf56cbf-hvc2g_advisor-backend-api.log] to splunk in 17 events
Finished: FAILURE