Bug
Resolution: Obsolete
rhel-8.6.0.z
Important
Customer Escalated
rhel-sst-virtualization
ssg_virtualization
Red Hat Virtualization
x86_64
What were you trying to do that didn't work?
On several RHV hypervisors, we have seen the memory usage of libvirtd grow indefinitely, in some cases reaching more than 11 GB of RSS:
USER PID %CPU %MEM VSZ RSS TTY STAT START TIME COMMAND
root 3781 0.1 4.5 13710484 11940980 ? - Nov25 53:34 /usr/sbin/libvirtd
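For reference, a minimal sketch of how the growth can be tracked over time (the one-minute interval and the log path are arbitrary choices, not taken from the case):

  # Append a timestamped RSS sample (in KiB) for libvirtd once a minute.
  while true; do
      echo "$(date -Is) $(ps -o rss= -C libvirtd)" >> /var/tmp/libvirtd-rss.log
      sleep 60
  done

Diffing or plotting the samples makes it easier to see whether the growth is steady or correlated with specific operations.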
Please provide the package NVR for which the bug is seen:
RHEL 8.6 EUS with virt:rhel module
libvirt-8.0.0-5.10.module+el8.6.0+18949+ba4ca8a3.x86_64
libvirt-client-8.0.0-5.10.module+el8.6.0+18949+ba4ca8a3.x86_64
libvirt-daemon-8.0.0-5.10.module+el8.6.0+18949+ba4ca8a3.x86_64
libvirt-daemon-config-network-8.0.0-5.10.module+el8.6.0+18949+ba4ca8a3.x86_64
libvirt-daemon-config-nwfilter-8.0.0-5.10.module+el8.6.0+18949+ba4ca8a3.x86_64
libvirt-daemon-driver-interface-8.0.0-5.10.module+el8.6.0+18949+ba4ca8a3.x86_64
libvirt-daemon-driver-network-8.0.0-5.10.module+el8.6.0+18949+ba4ca8a3.x86_64
libvirt-daemon-driver-nodedev-8.0.0-5.10.module+el8.6.0+18949+ba4ca8a3.x86_64
libvirt-daemon-driver-nwfilter-8.0.0-5.10.module+el8.6.0+18949+ba4ca8a3.x86_64
libvirt-daemon-driver-qemu-8.0.0-5.10.module+el8.6.0+18949+ba4ca8a3.x86_64
libvirt-daemon-driver-secret-8.0.0-5.10.module+el8.6.0+18949+ba4ca8a3.x86_64
libvirt-daemon-driver-storage-8.0.0-5.10.module+el8.6.0+18949+ba4ca8a3.x86_64
libvirt-daemon-driver-storage-core-8.0.0-5.10.module+el8.6.0+18949+ba4ca8a3.x86_64
libvirt-daemon-driver-storage-disk-8.0.0-5.10.module+el8.6.0+18949+ba4ca8a3.x86_64
libvirt-daemon-driver-storage-gluster-8.0.0-5.10.module+el8.6.0+18949+ba4ca8a3.x86_64
libvirt-daemon-driver-storage-iscsi-8.0.0-5.10.module+el8.6.0+18949+ba4ca8a3.x86_64
libvirt-daemon-driver-storage-iscsi-direct-8.0.0-5.10.module+el8.6.0+18949+ba4ca8a3.x86_64
libvirt-daemon-driver-storage-logical-8.0.0-5.10.module+el8.6.0+18949+ba4ca8a3.x86_64
libvirt-daemon-driver-storage-mpath-8.0.0-5.10.module+el8.6.0+18949+ba4ca8a3.x86_64
libvirt-daemon-driver-storage-rbd-8.0.0-5.10.module+el8.6.0+18949+ba4ca8a3.x86_64
libvirt-daemon-driver-storage-scsi-8.0.0-5.10.module+el8.6.0+18949+ba4ca8a3.x86_64
libvirt-daemon-kvm-8.0.0-5.10.module+el8.6.0+18949+ba4ca8a3.x86_64
libvirt-libs-8.0.0-5.10.module+el8.6.0+18949+ba4ca8a3.x86_64
libvirt-lock-sanlock-8.0.0-5.10.module+el8.6.0+18949+ba4ca8a3.x86_64
python3-libvirt-8.0.0-1.1.module+el8.6.0+16381+3abc475c.x86_64
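(For completeness, the list above can be regenerated on an affected host with a query along these lines; the exact invocation is an assumption, not taken from the case:)

  # List the installed libvirt and python3-libvirt NVRs.
  rpm -qa 'libvirt*' 'python3-libvirt*' | sort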
How reproducible:
Only observed in the customer's environment, on several hypervisors. Not reproduced locally.
Steps to reproduce
- ???
Expected results
No memory leak: libvirtd memory usage should remain stable over time.
Actual results
Memory usage keeps growing; in recent tests, libvirtd reached 3 GB of RSS within 3 days.
We have captured valgrind reports from 3 different hosts. Some symbols are missing, so I am not sure whether they will be enough to identify the source of the leak.
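For context, one way such a report can be captured (a sketch, assuming libvirtd is stopped as a service and run in the foreground under valgrind; paths and options are illustrative and may differ from how the attached reports were actually collected):

  # Stop the managed service, then run the daemon under valgrind.
  systemctl stop libvirtd
  valgrind --leak-check=full --show-leak-kinds=definite \
           --log-file=/var/tmp/libvirtd-valgrind.%p.log /usr/sbin/libvirtd
  # Let it run until memory has grown noticeably, then terminate libvirtd
  # (SIGTERM) so valgrind can write its leak summary to the log file.

Installing the matching libvirt debuginfo packages on the host should resolve most of the missing symbols in future runs.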