- Bug
- Resolution: Not a Bug
- Major
- rhos-18.0.14 FR 4
- rhos-connectivity-vans
- Bug Fix
- Refinement
- Important
To Reproduce
Steps to reproduce the behavior:
Run `oc apply -f 0010-ctlplane-XXX-xxx.yaml.txt` to enable the Octavia services on a running RHOSO environment, in accordance with the documentation.
However, this did not set `amphoraImageContainerImage` in the live OpenStack control plane.
Expected behavior
Octavia is enabled on the running environment and a load balancer can be created.
Bug impact
The customer missed the deadline to deliver this service on time.
Additional context
The customer is patching the control plane with the file 0010-ctlplane-XXX-xxx.yaml.txt [1], which fails to set the value in the live control plane CR [2]; as a result, load balancer creation goes into an error state [3].
[1]
oc apply -f 0010-ctlplane-XXX-xxx.yaml.txt
[2]
oc get oscp openstackcontrolplane -o yaml
..
amphoraImageContainerImage: "" <<<<<<
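The empty field above can also be checked from a saved copy of the live CR. A minimal sketch (the inline JSON extract is hypothetical and stands in for the real CR dump; on a live cluster the `oc get` command in the comment would produce the real data):

```python
import json

# Hypothetical minimal extract of the live OpenStackControlPlane CR.
# On a live cluster, roughly equivalent to:
#   oc get oscp -o jsonpath='{.items[0].spec.octavia.template.amphoraImageContainerImage}'
oscp = json.loads(
    '{"spec": {"octavia": {"template": {"amphoraImageContainerImage": ""}}}}'
)

img = oscp["spec"]["octavia"]["template"].get("amphoraImageContainerImage")
if not img:
    # Matches what was observed in [2]: the value never reached the live CR.
    print("amphoraImageContainerImage is empty or unset")
```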
[3]
0020-must-gather.local.6588570511425689740.tar.gz/must-gather.local.6588570511425689740/registry-redhat-io-rhoso-operators-openstack-must-gather-rhel9-sha256-78583fbcc49f84954142d8b96976a8c5b68e7b333c79a6c25f64b707a0642a44/namespaces/openstack/pods/octavia-worker-nlsmt/logs/octavia-worker.log
2026-02-05 06:50:35.732 34 WARNING octavia.controller.worker.v2.controller_worker [-] Flow 'octavia-create-loadbalancer-flow' (ca738064-6327-4a2c-9088-d81f0a616436) transitioned into state 'REVERTED' from state 'RUNNING'
2026-02-05 06:50:35.732 34 ERROR oslo_messaging.rpc.server [-] Exception during message handling: octavia.common.exceptions.ComputeBuildException: Failed to build compute instance due to: Failed to retrieve image with amphora-image tag.
2026-02-05 06:50:35.732 34 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
2026-02-05 06:50:35.732 34 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/octavia/compute/drivers/nova_driver.py", line 114, in build
2026-02-05 06:50:35.732 34 ERROR oslo_messaging.rpc.server     image_id = self.image_driver.get_image_id_by_tag(
2026-02-05 06:50:35.732 34 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/octavia/image/drivers/glance_driver.py", line 61, in get_image_id_by_tag
2026-02-05 06:50:35.732 34 ERROR oslo_messaging.rpc.server     raise exceptions.ImageGetException(tag=image_tag)
2026-02-05 06:50:35.732 34 ERROR oslo_messaging.rpc.server octavia.common.exceptions.ImageGetException: Failed to retrieve image with amphora-image tag.
2026-02-05 06:50:35.732 34 ERROR oslo_messaging.rpc.server
2026-02-05 06:50:35.732 34 ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred:
2026-02-05 06:50:35.732 34 ERROR oslo_messaging.rpc.server
2026-02-05 06:50:35.732 34 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
2026-02-05 06:50:35.732 34 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
- Snippet from 0010-ctlplane-XXX-xxxx.yaml.txt; the customer claims the same file applied correctly in another environment.
octavia:
  enabled: true
  template:                 <<<<<<<<<<<<<<<<<<<
    amphoraImageContainerImage: aaa.bbb.ccc.ddd.int:8443/rhoso/octavia-amphora-image-rhel9   <<<<<<<<<<<<
    databaseInstance: "openstack"
    lbMgmtNetwork:
      availabilityZones:
      - octavia-az
  template:                 <<<<<<<<<<<<<<<<<<<
    octaviaHousekeeping:
      networkAttachments:
      - octavia
    octaviaHealthManager:
      networkAttachments:
      - octavia
    octaviaWorker:
      networkAttachments:
      - octavia
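Note the two `template:` keys flagged with arrows in the snippet above. If they sit at the same mapping level in the customer's file, many parsers silently keep only the last duplicate key (behavior varies; some loaders reject duplicates outright), which would discard the block containing `amphoraImageContainerImage`. A sketch of that mechanism using Python's stdlib `json` loader, which has the same last-key-wins behavior (the structure below is an assumption modeled on the snippet, not the customer's actual file):

```python
import json

# Duplicate "template" keys at the same level: json.loads keeps only the
# last one, so the first block (with amphoraImageContainerImage) is dropped.
doc = json.loads("""
{
  "octavia": {
    "enabled": true,
    "template": {"amphoraImageContainerImage": "aaa.bbb.ccc.ddd.int:8443/rhoso/octavia-amphora-image-rhel9"},
    "template": {"octaviaWorker": {"networkAttachments": ["octavia"]}}
  }
}
""")

print("amphoraImageContainerImage" in doc["octavia"]["template"])  # → False
```

If the duplicate-key hypothesis holds, it would explain both the empty `amphoraImageContainerImage` in the live CR and why the operator never uploaded a tagged amphora image.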
- Captured output of `oc logs -f deployment/openstack-operator-controller-manager -n octavia` from the working and the non-working environments, to compare the logs from the deployment phase.
egrep Warning 0050-octavia-operator-nonworking-log | head -n 1
2026-02-12T10:33:54.207Z INFO Controllers.OctaviaAmphoraController Warning: Could not create flavor profile. Amphora image might be missing or not tagged correctly. Skipping configuration of octavia flavor profile amphora-4vcpus and octavia flavor amphora-4vcpus. {"controller": "octaviaamphoracontroller", "controllerGroup": "octavia.openstack.org", "controllerKind": "OctaviaAmphoraController", "OctaviaAmphoraController": {"name":"octavia-healthmanager","namespace":"openstack"}, "namespace": "openstack", "name": "octavia-healthmanager", "reconcileID": "a4a7f5e1-64d9-44be-8705-4a602ef2ba07"}
egrep Warning 0060-octavia-operator-woking-log || echo $?
1
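The same comparison can be scripted when triaging larger log dumps. A minimal sketch (the helper name is mine; the inline samples stand in for the contents of 0050-octavia-operator-nonworking-log and 0060-octavia-operator-woking-log):

```python
# One line copied from the non-working operator log above; the working log
# produced no Warning lines at all.
nonworking_log = (
    "2026-02-12T10:33:54.207Z INFO Controllers.OctaviaAmphoraController "
    "Warning: Could not create flavor profile. Amphora image might be missing "
    "or not tagged correctly.\n"
)
working_log = ""

def count_warnings(log_text: str) -> int:
    """Count lines mentioning 'Warning', mirroring `egrep Warning <file>`."""
    return sum(1 for line in log_text.splitlines() if "Warning" in line)

print(count_warnings(nonworking_log), count_warnings(working_log))  # → 1 0
```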
- Complete list of files for analysis
[pgodwin@supportshell-1 04367758]$ ls -lt
total 3336
-rw-rw-rw-+ 1 yank yank 2258421 Feb 12 11:16 0060-octavia-operator-woking-log
-rw-rw-rw-+ 1 yank yank 1048544 Feb 12 11:16 0050-octavia-operator-nonworking-log
-rw-rw-rw-+ 1 yank yank   79337 Feb  9 08:51 0030-oscp.yaml.txt
drwxrwxrwx+ 3 yank yank      59 Feb  5 14:06 0020-must-gather.local.6588570511425689740.tar.gz
-rw-rw-rw-+ 1 yank yank   23823 Feb  5 11:48 0010-ctlplane-XXX-xxx.yaml.txt
[supportshell-1.sush-001.prod.us-west-2.aws.redhat.com] [14:25:44+0000]
[pgodwin@supportshell-1 04367758]$
- No amphora images.
[kni@ngpesxaicbast01 Secrets]$ oc rsh openstackclient openstack image list --tag amphora-image-vert
[kni@ngpesxaicbast01 Secrets]$ oc rsh openstackclient openstack image list --tag amphora-image
[kni@ngpesxaicbast01 Secrets]$

- Pods and the Octavia status look good.

[xxx@XXX ~]$ oc get pod | grep octavia
octavia-api-74f54c6f55-srpfd   2/2   Running   0   29m
octavia-healthmanager-2gwpk    1/1   Running   1   7d4h
octavia-healthmanager-4qnln    1/1   Running   2   7d4h
octavia-healthmanager-66xsz    1/1   Running   1   7d4h
octavia-housekeeping-hrjpr     1/1   Running   2   7d4h
octavia-housekeeping-lqsl4     1/1   Running   1   7d4h
octavia-housekeeping-wzzsp     1/1   Running   1   7d4h
octavia-rsyslog-7vd6s          1/1   Running   2   7d4h
octavia-rsyslog-hdmsc          1/1   Running   1   7d4h
octavia-rsyslog-jfsgk          1/1   Running   1   7d4h
octavia-worker-5t7d6           1/1   Running   1   7d4h
octavia-worker-dsrf5           1/1   Running   1   7d4h
octavia-worker-nlsmt           1/1   Running   2   7d4h
[xxx@XXX ~]$ oc get octavia
NAME      STATUS   MESSAGE
octavia   True     Setup complete
- I asked for the output of `oc get OpenStackVersion -o jsonpath='{.items[0].status.containerImages}' | jq .` in association with Jira [https://issues.redhat.com/browse/OSPRH-19495]; the Octavia images are:

"octaviaAPIImage": "registry.redhat.io/rhoso/openstack-octavia-api-rhel9@sha256:90457c67d6124e0d443bb0dd327ff81099bad7b0776753c9e276abe71869132d",
"octaviaApacheImage": "registry.redhat.io/ubi9/httpd-24:latest@sha256:8536169e5537fe6c330eba814248abdcf39cdd8f7e7336034d74e6fda9544050",
"octaviaHealthmanagerImage": "registry.redhat.io/rhoso/openstack-octavia-health-manager-rhel9@sha256:292570f35f6afc939f71a35ae7eb9b9b34a99d4da9f2021c560f7eb46a193d21",
"octaviaHousekeepingImage": "registry.redhat.io/rhoso/openstack-octavia-housekeeping-rhel9@sha256:0d7a1c7cdef0cfb186d16d4a83419e93247a92312bf5b517e14b8275ad2187e3",
"octaviaRsyslogImage": "registry.redhat.io/rhoso/openstack-rsyslog-rhel9@sha256:0baac32d2ceb8b4baa568ec928b0d0829dc0778a579b1e92e6c9286b942f4bb9",
"octaviaWorkerImage": "registry.redhat.io/rhoso/openstack-octavia-worker-rhel9@sha256:0dd360bc62afe8b459953beac8c141768ac99d6640efb34075e1bd873a72c3c6",
- I am checking with the customer whether they are able to pull the image xxxx.yyy.zzz.llll.int:8443/rhoso/octavia-amphora-image-rhel9, and what this image is.