Red Hat OpenShift Dev Spaces (formerly CodeReady Workspaces) · CRW-4352

code-rhel8: RH internal pulp repos now require https


      [2023-05-03T05:11:04.516Z] [1/2] STEP 7/16: RUN yum -y -q update     && yum install -y libsecret-devel libsecret curl make cmake gcc gcc-c++ python2 git git-core-doc openssh less libX11-devel libxkbfile-devel libxkbfile libxkbcommon bash tar gzip rsync patch     && yum -y clean all && rm -rf /var/cache/yum     && npm install -g yarn@1.22.17
      
      [2023-05-03T05:11:05.075Z] time="2023-05-03T01:11:03-04:00" level=warning msg="Failed to mount subscriptions, skipping entry in /usr/share/containers/mounts.conf: getting host subscription data: failed to read subscriptions from \"/usr/share/rhel/secrets\": open /usr/share/rhel/secrets/rhsm/rhsm.conf: permission denied"
      
      [2023-05-03T05:11:06.012Z] Error: Failed to download metadata for repo 'rhel-8-for-appstream-rpms-pulp': Cannot download repomd.xml: Cannot download repodata/repomd.xml: All mirrors were tried
      

      This is failing for all the code-rhel8 container builds on s390x and ppc64le. Not sure whether the dynamic x86_64 node simply takes longer to start up (so the parallel tasks on Z and P fail first), or whether this is a Z/P-specific problem.

      Also not sure why the error mentions `rhel-8-for-appstream-rpms-pulp` when the values in the content sets yaml are:
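If the internal pulp mirrors now only answer over TLS, one possible workaround (a sketch, not necessarily the right fix if the repo definitions are injected by the build system rather than baked into the image; the path is the yum default and the hostname in the comment is illustrative) is to rewrite any plain-http baseurls before the first yum call in the Dockerfile:

```shell
# Sketch: force https on any http baseurl in the container's yum repo
# definitions before running yum. /etc/yum.repos.d is yum's default
# repo directory; run this as an earlier RUN step in the Dockerfile.
sed -i 's|^baseurl=http://|baseurl=https://|' /etc/yum.repos.d/*.repo
```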

      x86_64:
      - rhel-8-for-x86_64-baseos-rpms
      - rhel-8-for-x86_64-appstream-rpms
      s390x:
      - rhel-8-for-s390x-baseos-rpms
      - rhel-8-for-s390x-appstream-rpms
      ppc64le:
      - rhel-8-for-ppc64le-baseos-rpms
      - rhel-8-for-ppc64le-appstream-rpms
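To see how the content-set names map to the repo id in the error (the `-pulp` suffix suggests the build system rewrites them), one way to list the repo ids the container actually resolves is to dump the section headers of its yum repo files (a sketch, run inside the failing build container):

```shell
# List the repo ids ([...] section names) defined in the container's
# yum config, to compare against the content_sets.yml entries.
grep -h '^\[' /etc/yum.repos.d/*.repo | tr -d '[]'
```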

            nickboldt Nick Boldt
