• Type: Bug
    • Priority: Major
    • Resolution: Won't Do
    • Affects: quay.io

      User on quay.io is unable to pull an openshift release image using its digest.

       

      podman pull quay.io/rh_ee_aguidi/mirror/openshift-release-dev/ocp-v4.0-art-dev@sha256:f36e139f75b179ffe40f5a234a0cef3f0a051cc38cbde4b262fb2d96606acc06
      Trying to pull quay.io/rh_ee_aguidi/mirror/openshift-release-dev/ocp-v4.0-art-dev@sha256:f36e139f75b179ffe40f5a234a0cef3f0a051cc38cbde4b262fb2d96606acc06...
      Error: initializing source docker://quay.io/rh_ee_aguidi/mirror/openshift-release-dev/ocp-v4.0-art-dev@sha256:f36e139f75b179ffe40f5a234a0cef3f0a051cc38cbde4b262fb2d96606acc06: reading manifest sha256:f36e139f75b179ffe40f5a234a0cef3f0a051cc38cbde4b262fb2d96606acc06 in quay.io/rh_ee_aguidi/mirror/openshift-release-dev/ocp-v4.0-art-dev: manifest unknown

            [PROJQUAY-7456] Unable to pull image by digest

            Marcus Kok added a comment -

            rhn-support-dseals They should be able to see tags in the Quay UI if they have them. Otherwise, if they have access to the Quay DB, they can query for tags belonging to the manifest digest, something along the lines of:

            select manifest.digest, tag.id, tag.hidden, tag.lifetime_start_ms, tag.lifetime_end_ms from manifest inner join tag on manifest.id=tag.manifest_id where manifest.digest='sha256:f36e139f75b179ffe40f5a234a0cef3f0a051cc38cbde4b262fb2d96606acc06' 

            Garbage collection logs can be found in the Quay container logs; the relevant lines usually begin with "gcworker".
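To illustrate what that diagnostic query returns, here is a minimal sketch against an in-memory SQLite database. The table and column names follow the query above; the two-table schema and the sample row are illustrative stand-ins, not Quay's real schema or data:

```python
# Run the diagnostic join from the comment above against an in-memory
# SQLite database with a simplified manifest/tag schema (illustrative only).
import sqlite3

DIGEST = "sha256:f36e139f75b179ffe40f5a234a0cef3f0a051cc38cbde4b262fb2d96606acc06"

conn = sqlite3.connect(":memory:")
conn.executescript(f"""
CREATE TABLE manifest (id INTEGER PRIMARY KEY, digest TEXT);
CREATE TABLE tag (
    id INTEGER PRIMARY KEY,
    manifest_id INTEGER,
    hidden INTEGER,
    lifetime_start_ms INTEGER,
    lifetime_end_ms INTEGER
);
-- Sample data: one manifest whose only tag is a hidden temporary tag
-- that has already expired (lifetime_end_ms in the past).
INSERT INTO manifest VALUES (1, '{DIGEST}');
INSERT INTO tag VALUES (10, 1, 1, 1700000000000, 1700003600000);
""")

rows = conn.execute(
    "SELECT manifest.digest, tag.id, tag.hidden, "
    "tag.lifetime_start_ms, tag.lifetime_end_ms "
    "FROM manifest INNER JOIN tag ON manifest.id = tag.manifest_id "
    "WHERE manifest.digest = ?",
    (DIGEST,),
).fetchall()

for digest, tag_id, hidden, start_ms, end_ms in rows:
    print(digest, tag_id, hidden, start_ms, end_ms)
```

A tag row whose lifetime_end_ms is in the past (as in this sample) is exactly the pattern to look for when a pull by digest starts failing.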


            Daniel Seals added a comment - - edited

            marckok
            The customer has tried using v1 and v2 and encounters the same issue.
            How can we check for image tags in the Quay registry, and how do we check logs for garbage collection events?


            Marcus Kok added a comment -

            Hi rhn-support-dseals, can you confirm whether the customer has tags present on the images they are trying to pull? The problem occurring here is that images with no tag have a lifetime of about an hour before they expire, which sounds like what your customer is running into. As far as I know, oc-mirror v1 creates fake tags, so if they do have tags and/or are using oc-mirror v1, this might not be the same issue.


            Daniel Seals added a comment - - edited

            Hi marckok
            I would like to reopen this bug as it looks like the customer is hitting this issue and they are pushing for a fix.

            They are using Quay standalone and have tried both oc-mirror v1 and v2.
            The mirror completes without any errors.
            Pods/containers pull images from the mirror; then, seemingly at random, the next time the pod/container is started the image pull fails with "manifest unknown".
            They then use skopeo to copy the failing image into their mirror.
            The pod then starts because it can pull the image, but the next time the pod is started the image pull fails with "manifest unknown" again.

            Let me know what additional info you need to confirm they are hitting this issue.


            Marcus Kok added a comment -

            Implementing a fix for this issue would require an overhaul of Quay's GC process. Closing this, since there are not enough use cases at the moment to justify the amount of work it requires.


            Marcus Kok added a comment -

            Images without a tag are given a temporary tag that expires within an hour of being pushed to Quay. This is why an image can be pulled by its digest shortly after pushing, as shown above, but after an hour a "manifest unknown" error is returned.

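The expiry behavior described above can be sketched as a small check over the tag columns from the earlier SQL query (lifetime_start_ms / lifetime_end_ms). The one-hour constant and the helper names are illustrative assumptions, not Quay's actual implementation:

```python
# Sketch of the temporary-tag expiry behavior described above.
# Quay's real GC logic is more involved; the one-hour window and these
# helper names are illustrative assumptions.
TEMP_TAG_LIFETIME_MS = 60 * 60 * 1000  # ~1 hour, per the comment above


def temp_tag_lifetime(push_time_ms: int) -> tuple[int, int]:
    """Lifetime window of the hidden temporary tag on an untagged push."""
    return push_time_ms, push_time_ms + TEMP_TAG_LIFETIME_MS


def is_pullable_by_digest(tags: list[dict], now_ms: int) -> bool:
    """A manifest resolves only while at least one of its tags is alive
    (lifetime_end_ms unset or still in the future)."""
    return any(
        t["lifetime_end_ms"] is None or t["lifetime_end_ms"] > now_ms
        for t in tags
    )


# Untagged push at t=0: only a hidden temporary tag exists.
start, end = temp_tag_lifetime(push_time_ms=0)
tags = [{"hidden": True, "lifetime_start_ms": start, "lifetime_end_ms": end}]

print(is_pullable_by_digest(tags, now_ms=30 * 60 * 1000))      # → True (shortly after push)
print(is_pullable_by_digest(tags, now_ms=2 * 60 * 60 * 1000))  # → False ("manifest unknown")
```

This also matches the reported workaround pattern: re-copying the image with skopeo recreates a temporary tag, so pulls succeed again until that tag expires roughly an hour later.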

            Marcus Kok added a comment -

            Copied a similar release image to my own quay.io repository by digest only. Tags were not present in the repo, and I confirmed that I could pull by digest. However, after an unknown amount of time the same error appeared on my image.

            podman pull quay.io/marckok/ocp-release@sha256:ff8da47400514becde9c242ae2a6f98e86eddc520755e773ba18a4abc4879bc4
            
            Trying to pull quay.io/marckok/ocp-release@sha256:ff8da47400514becde9c242ae2a6f98e86eddc520755e773ba18a4abc4879bc4...
            Getting image source signatures
            Copying blob sha256:883bfb5b6f91e5d9f2f9949db41d86978c25fae4b6db6bddb8142b2476a486b1
            Copying blob sha256:490c628ee8c3d2bc4bcba8b2cef58acdcb0b9f84b4224bd41d6c54768be4cdcf
            Copying blob sha256:f90c4920e095dc91c490dd9ed7920d18e0327ddedcf5e10d2887e80ccae94fd7
            Copying blob sha256:ea1a555838aa985b6823242d49cf5eeb7bb609f94affa0684a471946275ce76a
            Copying blob sha256:2e7683e69b403abde2f54d7d30a28c2680b537e212fbf588c16fe1f5f1dbe996
            Copying config sha256:c838295c6b144fda12bdc302a7d75e1579a9020fdb2f47fe0047bc7fd495efd8
            Writing manifest to image destination
            c838295c6b144fda12bdc302a7d75e1579a9020fdb2f47fe0047bc7fd495efd8
            
            podman pull quay.io/marckok/ocp-release@sha256:ff8da47400514becde9c242ae2a6f98e86eddc520755e773ba18a4abc4879bc4
            
            Trying to pull quay.io/marckok/ocp-release@sha256:ff8da47400514becde9c242ae2a6f98e86eddc520755e773ba18a4abc4879bc4...
            Error: initializing source docker://quay.io/marckok/ocp-release@sha256:ff8da47400514becde9c242ae2a6f98e86eddc520755e773ba18a4abc4879bc4: reading manifest sha256:ff8da47400514becde9c242ae2a6f98e86eddc520755e773ba18a4abc4879bc4 in quay.io/marckok/ocp-release: manifest unknown

             

            Additionally, I confirmed that the digest sha256:f36e139f75b179ffe40f5a234a0cef3f0a051cc38cbde4b262fb2d96606acc06 is present in the database, meaning that the "manifest unknown" error should not be happening. More investigation is needed into what is happening on Quay that changes the behavior like this.


              rhn-support-dseals Daniel Seals
              marckok Marcus Kok