Red Hat OpenStack Services on OpenShift
OSPRH-27288

[uni][adoption][FR5 content] verification to start VM fails


    • Type: Bug
    • Resolution: Duplicate
    • Priority: Blocker
    • Component: data-plane-adoption
    • Severity: Critical

      Noticed while testing the fix for https://issues.redhat.com/browse/OSPCIX-1213 with uni01alpha adoption and FR5 content: https://gitlab.cee.redhat.com/ci-framework/ci-framework-testproject/-/merge_requests/1877

       

      The following task fails:

      TASK [dataplane_adoption : verify if the Compute services can start the existing test VM instance] ***
      FAILED - RETRYING: [localhost]: verify if the Compute services can start the existing test VM instance (60 retries left).
      FAILED - RETRYING: [localhost]: verify if the Compute services can start the existing test VM instance (59 retries left).
      FAILED - RETRYING: [localhost]: verify if the Compute services can start the existing test VM instance (58 retries left) 

      Checking on the compute node showed SELinux-related issues:

      Mar 06 02:37:54 osp-compute-uni01alpha-0 virtlogd[355991]: 2026-03-06 02:37:54.811+0000: 355991: error : main:710 : Can't load config file: Failed to open file '/etc/libvirt/virtlogd.conf': Permission denied: /etc/libvirt/virtlogd.conf 

      audit.log

      type=AVC msg=audit(1772791661.407:40073): avc:  denied  { dac_read_search } for  pid=172289 comm="virtlogd" capability=2  scontext=system_u:system_r:virtlogd_t:s0-s0:c0.c1023 tcontext=system_u:system_r:virtlogd_t:s0-s0:c0.c1023 tclass=capability permissive=0
      type=AVC msg=audit(1772791661.407:40073): avc:  denied  { read } for  pid=172289 comm="virtlogd" name="virtlogd.conf" dev="sda4" ino=352321670 scontext=system_u:system_r:virtlogd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:container_file_t:s0 tclass=file permissive=0 
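The mislabel is visible directly in the AVC record. A minimal sketch (assuming GNU sed; the sample record is copied from the excerpt above) that pulls out the denied permission, the process, and the target label:

```shell
# Sample AVC record copied verbatim from the audit.log excerpt above.
avc='type=AVC msg=audit(1772791661.407:40073): avc:  denied  { read } for  pid=172289 comm="virtlogd" name="virtlogd.conf" dev="sda4" ino=352321670 scontext=system_u:system_r:virtlogd_t:s0-s0:c0.c1023 tcontext=unconfined_u:object_r:container_file_t:s0 tclass=file permissive=0'

# Extract: <comm> denied "<perm>" on target labeled <tcontext>.
result=$(echo "$avc" | sed -E 's/.*denied +\{ ([^}]+) \}.*comm="([^"]+)".*tcontext=([^ ]+).*/\2 denied "\1" on target labeled \3/')
echo "$result"
# -> virtlogd denied "read" on target labeled unconfined_u:object_r:container_file_t:s0
```

The target label container_file_t (rather than the virtlogd_etc_t that restorecon later restores) is what blocks the confined virtlogd_t domain from reading its own config file.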

      After the following forced relabel, the virtlogd service was able to start:

      [root@osp-compute-uni01alpha-0 ~]# sudo restorecon -v /etc/libvirt/virtlogd.conf
      /etc/libvirt/virtlogd.conf not reset as customized by admin to unconfined_u:object_r:container_file_t:s0
      [root@osp-compute-uni01alpha-0 ~]# sudo restorecon -vF /etc/libvirt/virtlogd.conf
      Relabeled /etc/libvirt/virtlogd.conf from unconfined_u:object_r:container_file_t:s0 to system_u:object_r:virtlogd_etc_t:s0 

      Then starting the VM failed with:

      : libvirt.libvirtError: Cannot create daemon common directory '/run/libvirt/common': Not a directory
      2026-03-06 10:14:24.186 211948 ERROR nova.virt.libvirt.guest Traceback (most recent call last):
      2026-03-06 10:14:24.186 211948 ERROR nova.virt.libvirt.guest   File "/usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py", line 165, in launch
      2026-03-06 10:14:24.186 211948 ERROR nova.virt.libvirt.guest     return self._domain.createWithFlags(flags)
      2026-03-06 10:14:24.186 211948 ERROR nova.virt.libvirt.guest   File "/usr/lib/python3.9/site-packages/eventlet/tpool.py", line 193, in doit
      2026-03-06 10:14:24.186 211948 ERROR nova.virt.libvirt.guest     result = proxy_call(self._autowrap, f, *args, **kwargs)
      2026-03-06 10:14:24.186 211948 ERROR nova.virt.libvirt.guest   File "/usr/lib/python3.9/site-packages/eventlet/tpool.py", line 151, in proxy_call
      2026-03-06 10:14:24.186 211948 ERROR nova.virt.libvirt.guest     rv = execute(f, *args, **kwargs)
      2026-03-06 10:14:24.186 211948 ERROR nova.virt.libvirt.guest   File "/usr/lib/python3.9/site-packages/eventlet/tpool.py", line 132, in execute
      2026-03-06 10:14:24.186 211948 ERROR nova.virt.libvirt.guest     six.reraise(c, e, tb)
      2026-03-06 10:14:24.186 211948 ERROR nova.virt.libvirt.guest   File "/usr/lib/python3.9/site-packages/six.py", line 709, in reraise
      2026-03-06 10:14:24.186 211948 ERROR nova.virt.libvirt.guest     raise value
      2026-03-06 10:14:24.186 211948 ERROR nova.virt.libvirt.guest   File "/usr/lib/python3.9/site-packages/eventlet/tpool.py", line 86, in tworker
      2026-03-06 10:14:24.186 211948 ERROR nova.virt.libvirt.guest     rv = meth(*args, **kwargs)
      2026-03-06 10:14:24.186 211948 ERROR nova.virt.libvirt.guest   File "/usr/lib64/python3.9/site-packages/libvirt.py", line 1415, in createWithFlags
      2026-03-06 10:14:24.186 211948 ERROR nova.virt.libvirt.guest     raise libvirtError('virDomainCreateWithFlags() failed')
      2026-03-06 10:14:24.186 211948 ERROR nova.virt.libvirt.guest libvirt.libvirtError: Cannot create daemon common directory '/run/libvirt/common': Not a directory
      2026-03-06 10:14:24.186 211948 ERROR nova.virt.libvirt.guest
      2026-03-06 10:14:24.191 211948 ERROR nova.virt.libvirt.driver [None req-a941c73b-ba5b-4e3f-a9dc-1a8c4f87f10a 1ebb462ab43e4b20aa5382d205ffba13 9a7fd0d1fc9b4ea59b0264c76e6687c3 - - default default] [instance: 208459d5-34f6-4342-bb33-2a9b09eae3c9] Failed to start libvirt guest: libvirt.libvirtError: Cannot create daemon common directory '/run/libvirt/common': Not a directory

      audit.log

      type=AVC msg=audit(1772792032.497:52122): avc:  denied  { getattr } for  pid=192467 comm="virtlogd" path="/run/libvirt/common" dev="tmpfs" ino=3098 scontext=system_u:system_r:virtlogd_t:s0-s0:c0.c1023 tcontext=system_u:object_r:container_file_t:s0 tclass=dir permissive=0
      type=AVC msg=audit(1772792630.507:52572): avc:  denied  { dac_read_search } for  pid=215110 comm="virtlogd" capability=2  scontext=system_u:system_r:virtlogd_t:s0-s0:c0.c1023 tcontext=system_u:system_r:virtlogd_t:s0-s0:c0.c1023 tclass=capability permissive=0
      type=AVC msg=audit(1772792630.507:52573): avc:  denied  { dac_read_search } for  pid=215110 comm="virtlogd" capability=2  scontext=system_u:system_r:virtlogd_t:s0-s0:c0.c1023 tcontext=system_u:system_r:virtlogd_t:s0-s0:c0.c1023 tclass=capability permissive=0
      type=AVC msg=audit(1772792633.099:52578): avc:  denied  { getattr } for  pid=215110 comm="virtlogd" path="/run/libvirt/common" dev="tmpfs" ino=3098 scontext=system_u:system_r:virtlogd_t:s0-s0:c0.c1023 tcontext=system_u:object_r:container_file_t:s0 tclass=dir permissive=0
      type=AVC msg=audit(1772792648.574:52594): avc:  denied  { dac_read_search } for  pid=215215 comm="virtlogd" capability=2  scontext=system_u:system_r:virtlogd_t:s0-s0:c0.c1023 tcontext=system_u:system_r:virtlogd_t:s0-s0:c0.c1023 tclass=capability permissive=0
      type=AVC msg=audit(1772792648.574:52595): avc:  denied  { dac_read_search } for  pid=215215 comm="virtlogd" capability=2  scontext=system_u:system_r:virtlogd_t:s0-s0:c0.c1023 tcontext=system_u:system_r:virtlogd_t:s0-s0:c0.c1023 tclass=capability permissive=0
      type=AVC msg=audit(1772792651.219:52600): avc:  denied  { create } for  pid=215215 comm="virtlogd" name="common" scontext=system_u:system_r:virtlogd_t:s0-s0:c0.c1023 tcontext=system_u:object_r:virt_var_run_t:s0 tclass=dir permissive=0
      type=AVC msg=audit(1772792651.219:52601): avc:  denied  { create } for  pid=215215 comm="virtlogd" name="common" scontext=system_u:system_r:virtlogd_t:s0-s0:c0.c1023 tcontext=system_u:object_r:virt_var_run_t:s0 tclass=dir permissive=0

      Some other AVC denials were noticed as well:

      type=AVC msg=audit(1772791702.371:42416): avc:  denied  { search } for  pid=179291 comm="iscsid" name="iscsi" dev="sda4" ino=192940096 scontext=system_u:system_r:iscsid_t:s0 tcontext=system_u:object_r:container_file_t:s0 tclass=dir permissive=0
      type=AVC msg=audit(1772791702.372:42417): avc:  denied  { search } for  pid=179291 comm="iscsid" name="iscsi" dev="sda4" ino=192940096 scontext=system_u:system_r:iscsid_t:s0 tcontext=system_u:object_r:container_file_t:s0 tclass=dir permissive=0
      type=AVC msg=audit(1772791702.372:42418): avc:  denied  { search } for  pid=179291 comm="iscsid" name="iscsi" dev="sda4" ino=192940096 scontext=system_u:system_r:iscsid_t:s0 tcontext=system_u:object_r:container_file_t:s0 tclass=dir permissive=0
      type=AVC msg=audit(1772791702.372:42419): avc:  denied  { search } for  pid=179291 comm="iscsid" name="iscsi" dev="sda4" ino=226494339 scontext=system_u:system_r:iscsid_t:s0 tcontext=system_u:object_r:container_file_t:s0 tclass=dir permissive=0
      type=AVC msg=audit(1772791702.372:42420): avc:  denied  { search } for  pid=179291 comm="iscsid" name="iscsi" dev="sda4" ino=226494339 scontext=system_u:system_r:iscsid_t:s0 tcontext=system_u:object_r:container_file_t:s0 tclass=dir permissive=0
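For triage it can help to reduce the AVC noise to unique (process, permission) pairs. A hedged sketch, with abbreviated copies of the records above inlined so the example is self-contained; on the node itself you would feed in /var/log/audit/audit.log or the output of `ausearch -m AVC` instead:

```shell
# Abbreviated AVC records from this issue, inlined for illustration only.
cat > /tmp/avc_sample.log <<'EOF'
type=AVC msg=audit(1772792630.507:52572): avc:  denied  { dac_read_search } for  pid=215110 comm="virtlogd" capability=2
type=AVC msg=audit(1772792651.219:52600): avc:  denied  { create } for  pid=215215 comm="virtlogd" name="common" tclass=dir
type=AVC msg=audit(1772791702.371:42416): avc:  denied  { search } for  pid=179291 comm="iscsid" name="iscsi" tclass=dir
EOF

# One line per unique "<comm> <permission>" pair (assumes GNU sed).
summary=$(sed -nE 's/.*denied +\{ ([^}]+) \}.*comm="([^"]+)".*/\2 \1/p' /tmp/avc_sample.log | sort -u)
echo "$summary"
```

This makes it easy to see at a glance that both virtlogd and iscsid are being denied against container_file_t-labeled paths, which points at a labeling regression rather than a single bad file.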
      

      After switching SELinux to permissive mode (sudo setenforce 0), the denials stopped blocking and the VM was able to start.

      So this looks like a regression; it needs to be checked with the Adoption and Compute teams for FR5 blocker consideration.

              Assignee: Unassigned
              Reporter: Yatin Karel (ykarel@redhat.com)
              Team: rhos-dfg-upgrades