Red Hat OpenStack Services on OpenShift
OSPRH-88

As a cloud operator, I want to be able to update my nova deployment to a minor release without workload impact.


    • Type: Epic
    • Resolution: Duplicate
    • Priority: Major
    • Fix Version: rhos-18.0.0
    • Component: nova-operator
    • Epic Name: nova minor upgrade support
    • OSPRH-120 - Compute Engineering Backlog
    • Committed
    • Proposed
    • 0% To Do, 0% In Progress, 100% Done
    • 2024Q1
    • Compute

      Note: minor updates should not contain any DB model changes,
      so we could choose to do a blue/green update if we wanted to keep the control plane active, or we can take a short downtime by scaling the StatefulSet to 0, updating the image, and scaling back out (see the sketch below).
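
      A minimal sketch of the scale-to-zero path, assuming a client-go clientset and illustrative names (an "openstack" namespace, a "nova-api" StatefulSet, three replicas) that are not taken from the operator:

      import (
          "context"

          metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
          "k8s.io/client-go/kubernetes"
      )

      func updateWithDowntime(ctx context.Context, cs kubernetes.Interface, newImage string) error {
          client := cs.AppsV1().StatefulSets("openstack")

          // 1. fetch the StatefulSet and scale it to 0 so no old-version pods
          //    keep running; this is the short control-plane downtime.
          sts, err := client.Get(ctx, "nova-api", metav1.GetOptions{})
          if err != nil {
              return err
          }
          zero := int32(0)
          sts.Spec.Replicas = &zero
          if sts, err = client.Update(ctx, sts, metav1.UpdateOptions{}); err != nil {
              return err
          }

          // (a real controller would wait here for status.replicas to reach 0)

          // 2. swap in the new minor-release image and scale back out.
          sts.Spec.Template.Spec.Containers[0].Image = newImage
          replicas := int32(3) // illustrative replica count
          sts.Spec.Replicas = &replicas
          _, err = client.Update(ctx, sts, metav1.UpdateOptions{})
          return err
      }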

      We could also look at using the RollingUpdate update strategy for this, as sketched below:
      https://kubernetes.io/docs/concepts/workloads/controllers/statefulset/#update-strategies
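
      A minimal sketch of what that looks like on the StatefulSet spec, using the upstream appsv1 types (the Partition value is illustrative and the wiring into the operator's reconcile logic is omitted):

      import appsv1 "k8s.io/api/apps/v1"

      func withRollingUpdate(sts *appsv1.StatefulSet) {
          // Partition 0 rolls every pod; a higher value would let us canary the
          // new image on the highest-ordinal pods first.
          partition := int32(0)
          sts.Spec.UpdateStrategy = appsv1.StatefulSetUpdateStrategy{
              Type: appsv1.RollingUpdateStatefulSetStrategyType,
              RollingUpdate: &appsv1.RollingUpdateStatefulSetStrategy{
                  Partition: &partition,
              },
          }
      }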

      For major upgrades we cannot use the RollingUpdate strategy, as we must scale in all instances of the old container before scaling out the new ones.

      Schema migrations (db sync) can always be done before the containers are updated.
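
      A minimal sketch of running that migration as a one-shot Job ahead of the image swap; the Job name, namespace and config wiring are placeholders, while "nova-manage db sync" is the standard upstream command:

      import (
          batchv1 "k8s.io/api/batch/v1"
          corev1 "k8s.io/api/core/v1"
          metav1 "k8s.io/apimachinery/pkg/apis/meta/v1"
      )

      func dbSyncJob(newImage string) *batchv1.Job {
          return &batchv1.Job{
              ObjectMeta: metav1.ObjectMeta{Name: "nova-db-sync", Namespace: "openstack"},
              Spec: batchv1.JobSpec{
                  Template: corev1.PodTemplateSpec{
                      Spec: corev1.PodSpec{
                          RestartPolicy: corev1.RestartPolicyOnFailure,
                          Containers: []corev1.Container{{
                              Name:    "nova-db-sync",
                              Image:   newImage, // the new minor-release image
                              // nova.conf / DB credential mounts omitted for brevity
                              Command: []string{"nova-manage", "db", "sync"},
                          }},
                      },
                  },
              },
          }
      }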

      As is true for all nova updates/upgrades, the compute nodes should be updated after the other services. This is less important for a minor update since, in theory, it is not required: RPC changes should never happen in a minor update. However, it is still good practice to update the computes at the same time as, or after, the other services.

      This epic should span the end-to-end minor update process, including the computes (libvirt and ironic).

            ksambor@redhat.com Kamil Sambor
            smooney@redhat.com Sean Mooney
            rhos-dfg-compute
            Votes: 0
            Watchers: 7
