Project Quay / PROJQUAY-5039

Failed to upgrade OMR 1.2.9 to 1.3.0


    • Type: Bug
    • Resolution: Done
    • Priority: Blocker
    • Fix Version: omr-v1.3.0
    • Affects Version: omr-v1.3.0
    • Component: OMR

      Description of problem:

      Failed to upgrade OMR 1.2.9 to 1.3.0

      Version-Release number of selected component (if applicable):

      1. $ cat /etc/redhat-release 
        Red Hat Enterprise Linux release 8.8 Beta (Ootpa)
      2. $ uname -a
        Linux preserve-quay-rhel8-8-primary-1 4.18.0-452.el8.x86_64 #1 SMP Mon Jan 23 16:48:33 EST 2023 x86_64 x86_64 x86_64 GNU/Linux
      3. $ podman --version
        podman version 4.3.1
      4. OMR images:
         registry-proxy.engineering.redhat.com/rh-osbs/openshift-mirror-registry-rhel8@sha256:a90abdc95d0f9103467080f1a52c09372936ddc0e9cb98899363021e5f0681eb
         registry-proxy.engineering.redhat.com/rh-osbs/openshift-mirror-registry-rhel8:v1.3.0-5

      How reproducible:

      Always

      Steps:

      1. Install OMR 1.2.9, download tarball from https://developers.redhat.com/content-gateway/rest/mirror/pub/openshift-v4/clients/mirror-registry/latest/mirror-registry.tar.gz
      2. Upgrade OMR to 1.3.0
      $ ./mirror-registry upgrade --quayHostname 10.0.79.108 --quayRoot /opt/omr-upg --ssh-key /home/cloud-user/omr/upg/openshift-qe.pem --targetHostname 10.0.79.108 --targetUsername cloud-user -v

      Actual results:

         __   __
        /  \ /  \     ______   _    _     __   __   __
       / /\ / /\ \   /  __  \ | |  | |   /  \  \ \ / /
      / /  / /  \ \  | |  | | | |  | |  / /\ \  \   /
      \ \  \ \  / /  | |__| | | |__| | / ____ \  | |
       \ \/ \ \/ /   \_  ___/  \____/ /_/    \_\ |_|
        \__/ \__/      \ \__
                        \___\ by Red Hat
       Build, Store, and Distribute your Containers
          
      INFO[2023-02-02 03:36:01] Upgrade has begun                            
      DEBU[2023-02-02 03:36:01] Ansible Execution Environment Image: quay.io/quay/mirror-registry-ee:latest 
      DEBU[2023-02-02 03:36:01] Pause Image: registry.access.redhat.com/ubi8/pause:8.7-6 
      DEBU[2023-02-02 03:36:01] Quay Image: registry.redhat.io/quay/quay-rhel8:v3.8.1 
      DEBU[2023-02-02 03:36:01] Redis Image: registry.redhat.io/rhel8/redis-6:1-92.1669834635 
      DEBU[2023-02-02 03:36:01] Postgres Image: registry.redhat.io/rhel8/postgresql-10:1-203.1669834630 
      INFO[2023-02-02 03:36:01] Found execution environment at /home/cloud-user/omr/execution-environment.tar 
      INFO[2023-02-02 03:36:01] Loading execution environment from execution-environment.tar 
      DEBU[2023-02-02 03:36:01] Importing execution enviornment with command: /bin/bash -c /usr/bin/podman image import \
                          --change 'ENV PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin' \
                          --change 'ENV HOME=/home/runner' \
                          --change 'ENV container=oci' \
                          --change 'ENTRYPOINT=["entrypoint"]' \
                          --change 'WORKDIR=/runner' \
                          --change 'EXPOSE=6379' \
                          --change 'VOLUME=/runner' \
                          --change 'CMD ["ansible-runner", "run", "/runner"]' \
                          - quay.io/quay/mirror-registry-ee:latest < /home/cloud-user/omr/execution-environment.tar 
      Getting image source signatures
      Copying blob cd1ab1ccf63e skipped: already exists  
      Copying config 6dff260654 done  
      Writing manifest to image destination
      Storing signatures
      sha256:6dff2606542fdc8de6e47bf8c2c8aec67a7b713908a7f10ee6afdd59a05033a9
      INFO[2023-02-02 03:36:03] Found SSH key at /home/cloud-user/omr/upg/openshift-qe.pem 
      INFO[2023-02-02 03:36:03] Attempting to set SELinux rules on /home/cloud-user/omr/upg/openshift-qe.pem 
      INFO[2023-02-02 03:36:03] Found image archive at /home/cloud-user/omr/image-archive.tar 
      INFO[2023-02-02 03:36:03] Attempting to set SELinux rules on image archive 
      INFO[2023-02-02 03:36:03] Running upgrade playbook. This may take some time. To see playbook output run the installer with -v (verbose) flag. 
      DEBU[2023-02-02 03:36:03] Running command: podman run --rm --interactive --tty --workdir /runner/project --net host -v /home/cloud-user/omr/image-archive.tar:/runner/image-archive.tar -v /home/cloud-user/omr/upg/openshift-qe.pem:/runner/env/ssh_key -e RUNNER_OMIT_EVENTS=False -e RUNNER_ONLY_FAILED_EVENTS=False -e ANSIBLE_HOST_KEY_CHECKING=False -e ANSIBLE_CONFIG=/runner/project/ansible.cfg -e ANSIBLE_NOCOLOR=false --quiet --name ansible_runner_instance quay.io/quay/mirror-registry-ee:latest ansible-playbook -i cloud-user@10.0.79.108, --private-key /runner/env/ssh_key -e "quay_image=registry.redhat.io/quay/quay-rhel8:v3.8.1 quay_version=v3.8.1 redis_image=registry.redhat.io/rhel8/redis-6:1-92.1669834635 postgres_image=registry.redhat.io/rhel8/postgresql-10:1-203.1669834630 pause_image=registry.access.redhat.com/ubi8/pause:8.7-6 quay_hostname=10.0.79.108:8443 local_install=false quay_root=/opt/omr-upg quay_storage=quay-storage pg_storage=pg-storage" upgrade_mirror_appliance.yml

      PLAY [Upgrade Mirror Appliance] ******************************************************************************************************************

      TASK [Gathering Facts] ***************************************************************************************************************************
      ok: [cloud-user@10.0.79.108]

      TASK [mirror_appliance : Autodetect Image Archive] ***********************************************************************************************
      included: /runner/project/roles/mirror_appliance/tasks/autodetect-image-archive.yaml for cloud-user@10.0.79.108

      TASK [mirror_appliance : Checking for Image Archive] *********************************************************************************************
      ok: [cloud-user@10.0.79.108 -> localhost]

      TASK [mirror_appliance : Create install directory for image-archive.tar dest] ********************************************************************
      ok: [cloud-user@10.0.79.108]

      TASK [mirror_appliance : Copy Images if /runner/image-archive.tar exists] ************************************************************************
      changed: [cloud-user@10.0.79.108]

      TASK [mirror_appliance : Unpack Images if /runner/image-archive.tar exists] **********************************************************************
      changed: [cloud-user@10.0.79.108]

      TASK [mirror_appliance : Loading Redis if redis.tar exists] **************************************************************************************
      changed: [cloud-user@10.0.79.108]

      TASK [mirror_appliance : Loading Quay if quay.tar exists] ****************************************************************************************
      changed: [cloud-user@10.0.79.108]

      TASK [mirror_appliance : Loading Postgres if postgres.tar exists] ********************************************************************************
      changed: [cloud-user@10.0.79.108]

      TASK [mirror_appliance : Upgrade Quay Pod Service] ***********************************************************************************************
      included: /runner/project/roles/mirror_appliance/tasks/upgrade-pod-service.yaml for cloud-user@10.0.79.108

      TASK [mirror_appliance : Copy Quay Pod systemd service file] *************************************************************************************
      fatal: [cloud-user@10.0.79.108]: FAILED! => {"changed": false, "checksum": "b635063a57ec3cabf9248276b7fba5929ad0a85e", "msg": "Destination directory /home/cloud-user/.config/systemd/user does not exist"}

      PLAY RECAP ***************************************************************************************************************************************
      cloud-user@10.0.79.108     : ok=10   changed=5    unreachable=0    failed=1    skipped=0    rescued=0    ignored=0

      ERRO[2023-02-02 03:38:27] An error occurred: exit status 2
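      The fatal task fails inside Ansible's copy step, which reports an error rather than creating a missing destination directory. The precondition can be simulated locally (the scratch path is hypothetical, chosen to mirror the /home/cloud-user/.config/systemd/user path from the message above):

```shell
# Simulate the failing precondition: the destination directory for the
# systemd unit file does not exist until something creates it.
scratch=$(mktemp -d)
dest="$scratch/.config/systemd/user"

[ -d "$dest" ] && echo "dest exists" || echo "dest missing"   # prints "dest missing"

# Creating the directory first (what the playbook would need to do, or a
# manual workaround on the target host) removes the failure condition.
mkdir -p "$dest"
[ -d "$dest" ] && echo "dest exists" || echo "dest missing"   # prints "dest exists"

rm -rf "$scratch"
```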


      Expected results:

      OMR should upgrade from 1.2.9 to 1.3.0 successfully.

      Additional info:
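      A possible workaround (an assumption based on the error message, not a verified fix): create the per-user systemd directory on the target host as the target user before re-running the upgrade.

```shell
# Run as cloud-user on the target host (10.0.79.108). The path is taken
# from the "Destination directory ... does not exist" message; creating it
# lets the "Copy Quay Pod systemd service file" task find its destination.
mkdir -p "$HOME/.config/systemd/user"
```

      A longer-term fix would be for the upgrade playbook to create this directory itself (e.g. an Ansible file task with state: directory) before the copy task runs.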

              Assignee: jonathankingfc (Jonathan King)
              Reporter: rhn-support-dyan (Dongbo Yan)