ACM-23897

Wrong current version displayed for nodepool on the Upgrade version popup



      Description of problem:

      After creating a new hosted cluster via the hub console, selecting the `4.18.23` release image but then editing `spec.release.image` in the YAML to `quay.io/openshift-release-dev/ocp-release:4.18.14-multi` as the desired version, with everything else left at defaults, the resulting deployed cluster reports the expected 4.18.14 version for the control plane and node pool. However, the Kubernetes version of the nodes does not match the OpenShift release version, as reported from the hosted cluster command line:

      $ oc get clusterversion
      NAME      VERSION   AVAILABLE   PROGRESSING   SINCE   STATUS
      version   4.18.14   True        False         28m     Cluster version is 4.18.14
      $ oc get nodes
      NAME                      STATUS   ROLES    AGE   VERSION
      upgradetest-xs94t-68vhb   Ready    worker   38m   v1.31.11
      upgradetest-xs94t-sfcxf   Ready    worker   38m   v1.31.11 
      

      The expected Kubernetes version for OCP 4.18.14 is v1.31.8.
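
      As a cross-check, the Kubernetes version bundled with a release can be read from the release payload itself (a sketch, assuming access to the release image used above):

      $ oc adm release info quay.io/openshift-release-dev/ocp-release:4.18.14-multi

      The component versions listed in the output include the bundled kubernetes version, which per the expectation above should be 1.31.8 for this release.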

      Furthermore, when attempting to upgrade the cluster, the Upgrade version popup shows the correct current version for the control plane, but shows `4.18.23` as the current version for the nodepool.
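
      The NodePool resource on the hub can also be inspected directly to compare against the popup (a sketch; the `clusters` namespace and the `upgradetest` nodepool name are assumptions based on the defaults and the node names above):

      $ oc get nodepool -n clusters upgradetest -o jsonpath='{.spec.release.image}{"\n"}{.status.version}{"\n"}'

      If both fields report 4.18.14, the `4.18.23` shown in the popup would point at the console's version lookup rather than the NodePool resource itself.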

      Version-Release number of selected component (if applicable):

      Hub cluster: OCP 4.18.22

      RHACM: 2.14.0

      MCE: 2.9.0

      How reproducible:

      Every time

      Steps to Reproduce:

      1. From the hub console, create a new hosted cluster
      2. Provide a unique cluster name
      3. Select the "OpenShift 4.18.23" option for the release version
      4. Edit `.spec.release.image` in the YAML to a valid 4.18.14 image target
      5. Click next and specify a unique nodepool name
      6. Leave all other options as defaults and click next, next and create to start creating the cluster
      7. Once created, check the cluster version, node versions, and the current version on the upgrade panel; they do not match (one way to compare them from the hub is sketched below)
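
      A quick hub-side comparison for step 7 (a sketch, assuming the default `clusters` namespace for the hosted cluster resources):

      $ oc get hostedcluster,nodepool -n clusters

      The VERSION columns printed for both resources can then be compared with the value shown on the Upgrade version popup and with `oc get nodes` inside the hosted cluster.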

      Actual results:

      According to the cluster overview on the hub cluster, the control plane and nodepool of the newly installed hosted cluster both appear to be installed at version 4.18.14, but the actual nodepool version does not match.

      Expected results:

      The control plane and nodepool versions should actually match, and the cluster overview and Upgrade version popup should report the real versions.

      Additional info:
