Red Hat OpenStack Services on OpenShift: OSPRH-27046

"ValueError: Circular reference detected" error occurs after reimage_timeout_per_gb timeout


    • Type: Bug
    • Resolution: Unresolved
    • Priority: Undefined
    • Affects Version: rhos-18.0.z
    • Component: openstack-nova
    • Severity: Important

      To Reproduce

      Steps to reproduce the behavior:

      1. Deploy RHOSO 18.0
      2. Create an instance from a volume
        $ openstack volume create --image cirros062 --bootable --size 50 cirros_vol
        $ openstack server create --volume cirros_vol testserver ... 
      3. Rebuild the instance using a qcow2 image.
        $ openstack --os-compute-api-version 2.95 server rebuild --image rhel94-qcow2 --reimage-boot-volume testserver
      4. nova-compute shows the following error after the reimage_timeout_per_gb timeout is reached
        [root@oso-compute-1 ~]# journalctl -u edpm_nova_compute -f
          :
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.415 2 WARNING nova.compute.manager [None req-aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa - - default default] [instance: aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa] Timeout waiting for ['volume-reimaged-aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa'] for instance with vm_state active and task_state rebuilding. Event states are: volume-reimaged-aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa: timed out after 20.00 seconds: eventlet.timeout.Timeout: 20 seconds
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: /usr/lib/python3.9/site-packages/oslo_serialization/jsonutils.py:188: UserWarning: Cannot convert <Timeout at 0x7f7c704a4580 seconds=20> to primitive, will raise ValueError instead of warning in version 3.0
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]:   warnings.warn("Cannot convert %r to primitive, will raise ValueError "
          :
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server [None req-aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa - - default default] Exception during message handling: ValueError: Circular reference detected
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/utils.py", line 1453, in decorated_function
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 203, in decorated_function
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 3848, in rebuild_instance
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     self._do_rebuild_instance_with_claim(
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 3934, in _do_rebuild_instance_with_claim
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     self._do_rebuild_instance(
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 4126, in _do_rebuild_instance
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     self._rebuild_default_impl(**kwargs)
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 3715, in _rebuild_default_impl
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     self._rebuild_volume_backed_instance(
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 3638, in _rebuild_volume_backed_instance
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     self.volume_api.reimage_volume(
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib64/python3.9/contextlib.py", line 126, in __exit__
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     next(self.gen)
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 559, in wait_for_instance_event
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     self._wait_for_instance_events(
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 471, in _wait_for_instance_events
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     actual_event = event.wait()
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 436, in wait
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     instance_event = self.event.wait()
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/eventlet/event.py", line 125, in wait
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     result = hub.switch()
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/eventlet/hubs/hub.py", line 313, in switch
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     return self.greenlet.switch()
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server eventlet.timeout.Timeout: 20 seconds
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred:
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server Traceback (most recent call last):
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/server.py", line 244, in inner
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     return func(*args, **kwargs)
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 71, in wrapped
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     _emit_versioned_exception_notification(
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     self.force_reraise()
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     raise self.value
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/exception_wrapper.py", line 63, in wrapped
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 186, in decorated_function
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     LOG.warning("Failed to revert task state for instance. "
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 227, in __exit__
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     self.force_reraise()
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     raise self.value
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/manager.py", line 157, in decorated_function
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/utils.py", line 1453, in decorated_function
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/compute/utils.py", line 1414, in __exit__
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     objects.InstanceActionEvent.event_finish_with_failure(
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/objects/base.py", line 355, in wrapper
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     return fn.__get__(None, obj)(*args, **kwargs)
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_versionedobjects/base.py", line 175, in wrapper
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     result = cls.indirection_api.object_class_action_versions(
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/nova/conductor/rpcapi.py", line 240, in object_class_action_versions
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     return cctxt.call(context, 'object_class_action_versions',
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/rpc/client.py", line 190, in call
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     result = self.transport._send(
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 123, in _send
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     return self._driver.send(target, ctxt, message,
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 689, in send
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     return self._send(target, ctxt, message, wait_for_reply, timeout,
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 644, in _send
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     msg = rpc_common.serialize_msg(msg)
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 292, in serialize_msg
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     _MESSAGE_KEY: jsonutils.dumps(raw_msg)}
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.9/site-packages/oslo_serialization/jsonutils.py", line 210, in dumps
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     return json.dumps(obj, default=default, **kwargs)
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib64/python3.9/json/__init__.py", line 234, in dumps
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     return cls(
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib64/python3.9/json/encoder.py", line 199, in encode
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     chunks = self.iterencode(o, _one_shot=True)
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server   File "/usr/lib64/python3.9/json/encoder.py", line 257, in iterencode
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server     return _iterencode(o, 0)
        Feb 26 00:35:32 oso-compute-1 nova_compute[5053]: 2026-02-26 00:35:32.765 2 ERROR oslo_messaging.rpc.server ValueError: Circular reference detected
      5. The volume status is stuck in "reserved", and the instance cannot boot because the volume is not attached to it
        [root@util ~]# oc rsh -n openstack openstackclient openstack volume list
        +--------------------------------------+------------+----------+------+-------------+
        | ID                                   | Name       | Status   | Size | Attached to |
        +--------------------------------------+------------+----------+------+-------------+
        | aaaaaaaa-aaaa-aaaa-aaaa-aaaaaaaaaaaa | cirros_vol | reserved |   50 |             |
        +--------------------------------------+------------+----------+------+-------------+ 

      This issue tends to occur when the Glance image format is qcow2, because a qcow2 image is usually smaller than its raw equivalent, and the timeout value is reimage_timeout_per_gb * image_size. [1]

      If the timeout does not occur, decreasing reimage_timeout_per_gb in nova.conf on the compute nodes makes the issue reproducible.

      This issue occurs regardless of the cinder/glance backends.

      [1] https://github.com/openstack/nova/blob/unmaintained/2024.1/nova/compute/manager.py#L3629
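      To illustrate why small qcow2 images hit the limit first, here is a minimal sketch of the timeout scaling. The helper name and the rounding are assumptions for illustration; the real logic lives in nova/compute/manager.py at the line referenced in [1], and the 20 s/GB figure is the documented default of reimage_timeout_per_gb.

      ```python
      import math

      GiB = 1024 ** 3

      def reimage_timeout(image_size_bytes, reimage_timeout_per_gb=20):
          """Timeout in seconds: seconds-per-GB times the image size in GB.

          Hypothetical sketch; assumes the size is rounded up to whole GBs
          with a 1 GB floor.
          """
          size_gb = max(1, math.ceil(image_size_bytes / GiB))
          return reimage_timeout_per_gb * size_gb

      # A sub-1 GB qcow2 image gets only 20 s (consistent with the
      # "timed out after 20.00 seconds" log line above), while a 50 GB
      # raw image would get 1000 s.
      print(reimage_timeout(300 * 1024 ** 2))  # 20
      print(reimage_timeout(50 * GiB))         # 1000
      ```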

       

      Expected behavior

      • The rebuild process is cleanly reverted after the reimage_timeout_per_gb timeout is reached

      Bug impact

      • The instance and volume are left broken and cannot be recovered

      Known workaround

      • The issue can be avoided by increasing reimage_timeout_per_gb
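      As a sketch of the workaround (the value 120 below is an arbitrary example, not a tuned recommendation), this amounts to raising the option in the [DEFAULT] section of nova.conf on the compute nodes:

      ```ini
      [DEFAULT]
      # Seconds allowed per GB of image size while Cinder reimages the
      # volume; the default is 20. Raising it avoids the premature timeout.
      reimage_timeout_per_gb = 120
      ```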

      Additional context

      I found a similar ticket https://issues.redhat.com/browse/OSPRH-10277, but it was closed with "Won't do".

              Assignee: Unassigned
              Reporter: Yamato Tanaka (rhn-support-yatanaka)
              Team: rhos-workloads-compute