RHEL / RHEL-117154

[RHEL9] Cannot free VG sanlock, lvmlockd is not in use.


    • Bug
    • Resolution: Unresolved
    • Priority: Normal
    • rhel-9.8
    • rhel-9.7
    • lvm2
    • lvm2-2.03.33-1.el9
    • Moderate
    • rhel-storage-lvm
    • Release Note Not Required
    • x86_64

      I was playing around with cleanup outside the scope of our normal tests and ran into a deadlock while attempting to clean up sanlock VGs. This happened even after a clean reboot and lock start, which is even more concerning.

      kernel-6.12.0-92.el10    BUILT: Wed Jun  4 05:09:21 PM CEST 2025
      lvm2-2.03.32-1.el10    BUILT: Tue May  6 12:41:07 PM CEST 2025
      lvm2-libs-2.03.32-1.el10    BUILT: Tue May  6 12:41:07 PM CEST 2025
      sanlock-4.0.0-1.el10    BUILT: Thu May  1 04:31:33 PM CEST 2025
      sanlock-lib-4.0.0-1.el10    BUILT: Thu May  1 04:31:33 PM CEST 2025
       
       
       
      [root@virt-492 ~]# pvscan
        Reading VG global without a lock.
        PV /dev/sdd3   VG global          lvm2 [11.66 GiB / 11.41 GiB free]
        Reading VG pv_shuffle_A without a lock.
        PV /dev/sdd2   VG pv_shuffle_A    lvm2 [11.66 GiB / 11.41 GiB free]
        PV /dev/sdb2   VG pv_shuffle_A    lvm2 [11.66 GiB / 11.66 GiB free]
        PV /dev/sda3   VG pv_shuffle_A    lvm2 [11.66 GiB / 11.66 GiB free]
        PV /dev/sdb3   VG pv_shuffle_A    lvm2 [11.66 GiB / 11.66 GiB free]
        PV /dev/sdf1   VG pv_shuffle_A    lvm2 [11.66 GiB / 11.66 GiB free]
        PV /dev/sde1   VG pv_shuffle_A    lvm2 [11.66 GiB / 11.66 GiB free]
        Reading VG pv_shuffle_B without a lock.
        PV /dev/sde2   VG pv_shuffle_B    lvm2 [11.66 GiB / 11.41 GiB free]
        PV /dev/sdc3   VG pv_shuffle_B    lvm2 [11.66 GiB / 11.66 GiB free]
        PV /dev/sdc1   VG pv_shuffle_B    lvm2 [11.66 GiB / 11.66 GiB free]
        PV /dev/sdb1   VG pv_shuffle_B    lvm2 [11.66 GiB / 11.66 GiB free]
        PV /dev/sdf3   VG pv_shuffle_B    lvm2 [11.66 GiB / 11.66 GiB free]
        PV /dev/sdc2   VG pv_shuffle_B    lvm2 [11.66 GiB / 11.66 GiB free]
        PV /dev/vda3   VG rhel_virt-492   lvm2 [<9.00 GiB / 0    free]
        PV /dev/sda2                      lvm2 [<11.67 GiB]
        PV /dev/sdd1                      lvm2 [<11.67 GiB]
        PV /dev/sde3                      lvm2 [<11.67 GiB]
        PV /dev/sdf2                      lvm2 [<11.67 GiB]
        Total: 18 [207.24 GiB] / in use: 14 [<160.58 GiB] / in no VG: 4 [<46.67 GiB]
       
      [root@virt-492 ~]# vgchange --lockstart pv_shuffle_B
        VG pv_shuffle_B starting sanlock lockspace
        Starting locking.  Waiting for sanlock may take a few seconds to 3 min...
      [root@virt-492 ~]# vgchange --lockstart pv_shuffle_A
        VG pv_shuffle_A starting sanlock lockspace
        Starting locking.  Waiting for sanlock may take a few seconds to 3 min...
       
      [root@virt-492 ~]# vgremove -f pv_shuffle_B
        Global lock failed: check that global lockspace is started
      [root@virt-492 ~]# vgremove -f pv_shuffle_A
        Global lock failed: check that global lockspace is started
       
      ### Wait over 3 min....
       
      [root@virt-492 ~]# vgremove -f pv_shuffle_B
        Global lock failed: check that global lockspace is started
      [root@virt-492 ~]# vgremove --nolocking pv_shuffle_B
        Cannot free VG sanlock, lvmlockd is not in use.
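At this point it is worth confirming what the lock daemons themselves think is running before retrying vgremove. A hedged diagnostic sketch (not part of the original report): "sanlock client status" and "lvmlockctl --info" are the standard status queries for the two daemons; the loop below is guarded so it degrades gracefully on hosts where the tools are not installed.

```shell
# Query both lock daemons for their view of active lockspaces.
# Guarded: skips cleanly if sanlock/lvm2-lockd tooling is absent.
status=""
for cmd in "sanlock client status" "lvmlockctl --info"; do
    if command -v "${cmd%% *}" >/dev/null 2>&1; then
        echo "== $cmd =="
        $cmd 2>&1 || true          # daemon may be down; keep going
        status="$status [$cmd:present]"
    else
        echo "== $cmd == (not installed)"
        status="$status [$cmd:missing]"
    fi
done
```

A healthy setup should show the "lvm_global" lockspace in both outputs; its absence matches the "Global lock failed" errors above.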
       
      [root@virt-492 ~]# systemctl status sanlock
      ● sanlock.service - Shared Storage Lease Manager
           Loaded: loaded (/usr/lib/systemd/system/sanlock.service; disabled; preset: disabled)
           Active: active (running) since Thu 2025-08-07 22:28:10 CEST; 14min ago
       Invocation: 7275dfcd113246b3ba7dbd0fac9c16b0
             Docs: man:sanlock(8)
          Process: 4281 ExecStart=/usr/sbin/sanlock daemon (code=exited, status=0/SUCCESS)
         Main PID: 4291 (sanlock)
            Tasks: 8 (limit: 25065)
           Memory: 22.2M (peak: 24M)
              CPU: 194ms
           CGroup: /system.slice/sanlock.service
                   ├─4291 /usr/sbin/sanlock daemon
                   └─4293 /usr/sbin/sanlock daemon
       
      Aug 07 22:28:10 virt-492.cluster-qe.lab.eng.brq.redhat.com systemd[1]: Starting sanlock.service - Shared Storage Lease Manager...
      Aug 07 22:28:10 virt-492.cluster-qe.lab.eng.brq.redhat.com systemd[1]: Started sanlock.service - Shared Storage Lease Manager.
      Aug 07 22:28:10 virt-492.cluster-qe.lab.eng.brq.redhat.com sanlock[4291]: sanlock daemon started 4.0.0 host 36a94baf-f142-47f5-8bdc-8f38a0d7ccb8.virt-492.cl (virt-492.cluster-qe.lab.eng.brq.redhat.com)
      
      [root@virt-492 ~]# ps -ef | grep lvmlockd
      root        4721       1  0 22:36 ?        00:00:00 /usr/sbin/lvmlockd --foreground
       
      [root@virt-492 ~]# systemctl status lvmlockd
      ● lvmlockd.service - LVM lock daemon
           Loaded: loaded (/usr/lib/systemd/system/lvmlockd.service; disabled; preset: disabled)
           Active: active (running) since Thu 2025-08-07 22:36:14 CEST; 6min ago
       Invocation: dedb57b089d742058698505a5c25521f
             Docs: man:lvmlockd(8)
         Main PID: 4721 (lvmlockd)
            Tasks: 5 (limit: 25065)
           Memory: 9.2M (peak: 9.7M)
              CPU: 42ms
           CGroup: /system.slice/lvmlockd.service
                   └─4721 /usr/sbin/lvmlockd --foreground
       
      Aug 07 22:36:14 virt-492.cluster-qe.lab.eng.brq.redhat.com systemd[1]: Starting lvmlockd.service - LVM lock daemon...
      Aug 07 22:36:14 virt-492.cluster-qe.lab.eng.brq.redhat.com (lvmlockd)[4721]: lvmlockd.service: Referenced but unset environment variable evaluates to an empty string: OPTIONS
      Aug 07 22:36:14 virt-492.cluster-qe.lab.eng.brq.redhat.com lvmlockd[4721]: [D] creating /run/lvm/lvmlockd.socket
      Aug 07 22:36:14 virt-492.cluster-qe.lab.eng.brq.redhat.com lvmlockd[4721]: 1754598974 lvmlockd started
      Aug 07 22:36:14 virt-492.cluster-qe.lab.eng.brq.redhat.com systemd[1]: Started lvmlockd.service - LVM lock daemon.
      
      # This is the VG that hosted the global lock prior to the reboot, but I would think it shouldn't be needed post reboot
      [root@virt-492 ~]# vgchange --lockstart global
        VG global starting sanlock lockspace
        Starting locking.  Waiting for sanlock may take a few seconds to 3 min...
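For context on why the "global" VG matters here: with the sanlock lock manager the global lock is stored inside one VG, and if that VG is absent or its lockspace is not started, every command that needs the global lock fails as shown above. lvmlockctl provides --gl-enable to designate another started sanlock VG as the global-lock holder; in this sketch the command is printed rather than executed, and "pv_shuffle_B" is just an illustrative choice of VG.

```shell
# Hedged sketch: designate a different started sanlock VG to host the
# global lock. Printed, not run — enabling the global lock in the wrong
# VG can leave two global locks in existence.
vg="pv_shuffle_B"                      # illustrative VG name
gl_cmd="lvmlockctl --gl-enable $vg"
echo "would run: $gl_cmd"
```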
       
      [root@virt-492 ~]# pvscan
        PV /dev/sdd3   VG global          lvm2 [11.66 GiB / 11.41 GiB free]
        PV /dev/sdd2   VG pv_shuffle_A    lvm2 [11.66 GiB / 11.41 GiB free]
        PV /dev/sdb2   VG pv_shuffle_A    lvm2 [11.66 GiB / 11.66 GiB free]
        PV /dev/sda3   VG pv_shuffle_A    lvm2 [11.66 GiB / 11.66 GiB free]
        PV /dev/sdb3   VG pv_shuffle_A    lvm2 [11.66 GiB / 11.66 GiB free]
        PV /dev/sdf1   VG pv_shuffle_A    lvm2 [11.66 GiB / 11.66 GiB free]
        PV /dev/sde1   VG pv_shuffle_A    lvm2 [11.66 GiB / 11.66 GiB free]
        PV /dev/sde2   VG pv_shuffle_B    lvm2 [11.66 GiB / 11.41 GiB free]
        PV /dev/sdc3   VG pv_shuffle_B    lvm2 [11.66 GiB / 11.66 GiB free]
        PV /dev/sdc1   VG pv_shuffle_B    lvm2 [11.66 GiB / 11.66 GiB free]
        PV /dev/sdb1   VG pv_shuffle_B    lvm2 [11.66 GiB / 11.66 GiB free]
        PV /dev/sdf3   VG pv_shuffle_B    lvm2 [11.66 GiB / 11.66 GiB free]
        PV /dev/sdc2   VG pv_shuffle_B    lvm2 [11.66 GiB / 11.66 GiB free]
        PV /dev/vda3   VG rhel_virt-492   lvm2 [<9.00 GiB / 0    free]
        PV /dev/sda2                      lvm2 [<11.67 GiB]
        PV /dev/sdd1                      lvm2 [<11.67 GiB]
        PV /dev/sde3                      lvm2 [<11.67 GiB]
        PV /dev/sdf2                      lvm2 [<11.67 GiB]
        Total: 18 [207.24 GiB] / in use: 14 [<160.58 GiB] / in no VG: 4 [<46.67 GiB]
       
      [root@virt-492 ~]# vgremove --nolocking pv_shuffle_A
        Cannot free VG sanlock, lvmlockd is not in use.
      [root@virt-492 ~]# vgremove --nolocking pv_shuffle_B
        Cannot free VG sanlock, lvmlockd is not in use.
       
       
      [root@virt-492 ~]# vgremove pv_shuffle_B
      ^C
      [DEADLOCK]
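One possible escape hatch, flagged as an assumption to verify against the lvmlockd(8)/vgremove(8) man pages on the installed build: recent lvm2 accepts "--lockopt force" to override lockd locking during removal, whereas --nolocking trips the "Cannot free VG sanlock, lvmlockd is not in use" check seen earlier. The sketch prints the command rather than executing it, since VG removal is destructive.

```shell
# Hedged workaround sketch (verify --lockopt support on the target build).
# Printed, not run: vgremove is destructive.
vg="pv_shuffle_B"                      # illustrative VG name
rm_cmd="vgremove --lockopt force $vg"
echo "would run: $rm_cmd"
```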
       
       
       
      Aug  7 22:28:10 virt-492 systemd[1]: Started sanlock.service - Shared Storage Lease Manager.
      Aug  7 22:28:10 virt-492 sanlock[4291]: sanlock daemon started 4.0.0 host 36a94baf-f142-47f5-8bdc-8f38a0d7ccb8.virt-492.cl (virt-492.cluster-qe.lab.eng.brq.redhat.com)
      Aug  7 22:28:10 virt-492 systemd[1]: Started wdmd.service - Watchdog Multiplexing Daemon.
      Aug  7 22:28:10 virt-492 wdmd[4305]: wdmd started S0 H0 G179 using /dev/watchdog0 "i6300ESB timer"
      Aug  7 22:28:10 virt-492 systemd[1]: pmie_farm_check.service: Deactivated successfully.
      Aug  7 22:31:12 virt-492 systemd[1017]: Created slice background.slice - User Background Tasks Slice.
      Aug  7 22:31:12 virt-492 systemd[1017]: Starting systemd-tmpfiles-clean.service - Cleanup of User's Temporary Files and Directories...
      Aug  7 22:31:12 virt-492 systemd[1017]: Finished systemd-tmpfiles-clean.service - Cleanup of User's Temporary Files and Directories.
      Aug  7 22:36:14 virt-492 systemd[1]: Starting lvmlockd.service - LVM lock daemon...
      Aug  7 22:36:14 virt-492 (lvmlockd)[4721]: lvmlockd.service: Referenced but unset environment variable evaluates to an empty string: OPTIONS
      Aug  7 22:36:14 virt-492 lvmlockd[4721]: [D] creating /run/lvm/lvmlockd.socket
      Aug  7 22:36:14 virt-492 lvmlockd[4721]: 1754598974 lvmlockd started
      Aug  7 22:36:14 virt-492 systemd[1]: Started lvmlockd.service - LVM lock daemon.
      Aug  7 22:37:44 virt-492 wdmd[4305]: /dev/watchdog0 open with timeout 60
      Aug  7 22:40:23 virt-492 systemd[1]: Starting systemd-tmpfiles-clean.service - Cleanup of Temporary Directories...
      Aug  7 22:40:23 virt-492 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
      Aug  7 22:40:23 virt-492 systemd[1]: Finished systemd-tmpfiles-clean.service - Cleanup of Temporary Directories.
      Aug  7 22:48:39 virt-492 sanlock[4291]: 2025-08-07 22:48:39 1395 [4307]: cmd_release 2,10,4721 no resource
      Aug  7 22:48:39 virt-492 lvmlockd[4721]: 1754599719 lvm_pv_shuffle_B:VGLK unlock_san release rename error -1
      Aug  7 22:48:39 virt-492 lvmlockd[4721]: 1754599719 lvm_pv_shuffle_B:VGLK clear_locks free 1 drop 0 lm unlock error -221
      Aug  7 22:48:39 virt-492 sanlock[4291]: 2025-08-07 22:48:39 1396 [4291]: s1 kill 4721 sig 15 count 1
      Aug  7 22:48:40 virt-492 sanlock[4291]: 2025-08-07 22:48:40 1397 [4291]: s1 kill 4721 sig 15 count 2
      Aug  7 22:48:41 virt-492 sanlock[4291]: 2025-08-07 22:48:41 1398 [4291]: s1 kill 4721 sig 15 count 3
      Aug  7 22:48:42 virt-492 sanlock[4291]: 2025-08-07 22:48:42 1399 [4291]: s1 kill 4721 sig 15 count 4
      Aug  7 22:48:43 virt-492 sanlock[4291]: 2025-08-07 22:48:43 1400 [4291]: s1 kill 4721 sig 15 count 5
      Aug  7 22:48:44 virt-492 sanlock[4291]: 2025-08-07 22:48:44 1401 [4291]: s1 kill 4721 sig 15 count 6
      Aug  7 22:48:45 virt-492 sanlock[4291]: 2025-08-07 22:48:45 1402 [4291]: s1 kill 4721 sig 15 count 7
      Aug  7 22:48:46 virt-492 sanlock[4291]: 2025-08-07 22:48:46 1403 [4291]: s1 kill 4721 sig 15 count 8
      Aug  7 22:48:47 virt-492 sanlock[4291]: 2025-08-07 22:48:47 1404 [4291]: s1 kill 4721 sig 15 count 9
      Aug  7 22:48:48 virt-492 sanlock[4291]: 2025-08-07 22:48:48 1405 [4291]: s1 kill 4721 sig 15 count 10
      Aug  7 22:48:49 virt-492 sanlock[4291]: 2025-08-07 22:48:49 1406 [4291]: s1 kill 4721 sig 15 count 11
      Aug  7 22:48:50 virt-492 sanlock[4291]: 2025-08-07 22:48:50 1407 [4291]: s1 kill 4721 sig 15 count 12
      Aug  7 22:48:51 virt-492 sanlock[4291]: 2025-08-07 22:48:51 1408 [4291]: s1 kill 4721 sig 15 count 13
      Aug  7 22:48:52 virt-492 sanlock[4291]: 2025-08-07 22:48:52 1409 [4291]: s1 kill 4721 sig 15 count 14
      Aug  7 22:48:53 virt-492 sanlock[4291]: 2025-08-07 22:48:53 1410 [4291]: s1 kill 4721 sig 15 count 15
      Aug  7 22:48:54 virt-492 sanlock[4291]: 2025-08-07 22:48:54 1411 [4291]: s1 kill 4721 sig 15 count 16
      Aug  7 22:48:55 virt-492 sanlock[4291]: 2025-08-07 22:48:55 1412 [4291]: s1 kill 4721 sig 15 count 17
      Aug  7 22:48:56 virt-492 sanlock[4291]: 2025-08-07 22:48:56 1413 [4291]: s1 kill 4721 sig 15 count 18
      Aug  7 22:48:57 virt-492 sanlock[4291]: 2025-08-07 22:48:57 1414 [4291]: s1 kill 4721 sig 15 count 19
      Aug  7 22:48:58 virt-492 sanlock[4291]: 2025-08-07 22:48:58 1415 [4291]: s1 kill 4721 sig 15 count 20
      Aug  7 22:48:59 virt-492 sanlock[4291]: 2025-08-07 22:48:59 1416 [4291]: s1 kill 4721 sig 15 count 21
      Aug  7 22:49:00 virt-492 sanlock[4291]: 2025-08-07 22:49:00 1417 [4291]: s1 kill 4721 sig 15 count 22
      Aug  7 22:49:01 virt-492 sanlock[4291]: 2025-08-07 22:49:01 1418 [4291]: s1 kill 4721 sig 15 count 23
      Aug  7 22:49:02 virt-492 sanlock[4291]: 2025-08-07 22:49:02 1419 [4291]: s1 kill 4721 sig 15 count 24
      Aug  7 22:49:03 virt-492 sanlock[4291]: 2025-08-07 22:49:03 1420 [4291]: s1 kill 4721 sig 15 count 25
      [...]
      Aug  7 22:50:02 virt-492 sanlock[4291]: 2025-08-07 22:50:02 1479 [4291]: s1 kill 4721 sig 15 count 84
      Aug  7 22:50:03 virt-492 sanlock[4291]: 2025-08-07 22:50:03 1480 [4291]: s1 kill 4721 sig 15 count 85
      Aug  7 22:50:04 virt-492 sanlock[4291]: 2025-08-07 22:50:04 1481 [4291]: s1 kill 4721 sig 15 count 86
      Aug  7 22:50:05 virt-492 sanlock[4291]: 2025-08-07 22:50:05 1482 [4291]: s1 kill 4721 sig 15 count 87
      Aug  7 22:50:06 virt-492 sanlock[4291]: 2025-08-07 22:50:06 1483 [4291]: s1 kill 4721 sig 15 count 88
      Aug  7 22:50:07 virt-492 sanlock[4291]: 2025-08-07 22:50:07 1484 [4291]: s1 kill 4721 sig 15 count 89
      Aug  7 22:50:08 virt-492 sanlock[4291]: 2025-08-07 22:50:08 1485 [4291]: s1 kill 4721 sig 15 count 90
      Aug  7 22:50:09 virt-492 sanlock[4291]: 2025-08-07 22:50:09 1486 [4291]: s1 kill 4721 sig 15 count 91
      Aug  7 22:50:10 virt-492 sanlock[4291]: 2025-08-07 22:50:10 1487 [4291]: s1 kill 4721 sig 15 count 92
      Aug  7 22:50:11 virt-492 sanlock[4291]: 2025-08-07 22:50:11 1488 [4291]: s1 kill 4721 sig 15 count 93
      Aug  7 22:50:12 virt-492 sanlock[4291]: 2025-08-07 22:50:12 1489 [4291]: s1 kill 4721 sig 15 count 94
      Aug  7 22:50:13 virt-492 sanlock[4291]: 2025-08-07 22:50:13 1490 [4291]: s1 kill 4721 sig 15 count 95
      Aug  7 22:50:14 virt-492 sanlock[4291]: 2025-08-07 22:50:14 1491 [4291]: s1 kill 4721 sig 15 count 96
      Aug  7 22:50:15 virt-492 sanlock[4291]: 2025-08-07 22:50:15 1492 [4291]: s1 kill 4721 sig 15 count 97
      Aug  7 22:50:16 virt-492 sanlock[4291]: 2025-08-07 22:50:16 1493 [4291]: s1 kill 4721 sig 15 count 98
      Aug  7 22:50:17 virt-492 sanlock[4291]: 2025-08-07 22:50:17 1494 [4291]: s1 kill 4721 sig 15 count 99
      Aug  7 22:50:18 virt-492 sanlock[4291]: 2025-08-07 22:50:18 1495 [4291]: s1 kill 4721 sig 15 count 100
      Aug  7 22:50:18 virt-492 sanlock[4291]: 2025-08-07 22:50:18 1495 [4291]: s1 killing pids stuck 1
      Aug  7 22:50:25 virt-492 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
       
       
       
       
      [root@virt-492 ~]# vgremove -vvvv  pv_shuffle_B
      23:12:49.121779 vgremove[7288] lvmcmdline.c:3145  Version: 2.03.32(2-RHEL10) (2025-05-05)
      23:12:49.122020 vgremove[7288] lvmcmdline.c:3146  Parsing: vgremove -vvvv pv_shuffle_B
      23:12:49.122064 vgremove[7288] lvmcmdline.c:2010  Recognised command vgremove_general (id 175 / enum 154).
      23:12:49.122129 vgremove[7288] device_mapper/libdm-config.c:1172  global/use_lvmpolld not found in config: defaulting to 1
      23:12:49.122171 vgremove[7288] filters/filter-type.c:61  LVM type filter initialised.
      23:12:49.122318 vgremove[7288] filters/filter-deviceid.c:66  deviceid filter initialised.
      23:12:49.122346 vgremove[7288] device_mapper/libdm-config.c:1172  devices/sysfs_scan not found in config: defaulting to 1
      23:12:49.122391 vgremove[7288] filters/filter-sysfs.c:99  Sysfs filter initialised.
      23:12:49.122406 vgremove[7288] device_mapper/libdm-config.c:1172  devices/scan_lvs not found in config: defaulting to 0
      23:12:49.122415 vgremove[7288] filters/filter-usable.c:113  Usable device filter initialised (scan_lvs 0).
      23:12:49.122439 vgremove[7288] device_mapper/libdm-config.c:1172  devices/multipath_component_detection not found in config: defaulting to 1
      23:12:49.122449 vgremove[7288] filters/filter-mpath.c:87  mpath filter initialised.
      23:12:49.122459 vgremove[7288] filters/filter-partitioned.c:68  Partitioned filter initialised.
      23:12:49.122473 vgremove[7288] filters/filter-signature.c:88  signature filter initialised.
      23:12:49.122482 vgremove[7288] device_mapper/libdm-config.c:1172  devices/md_component_detection not found in config: defaulting to 1
      23:12:49.122492 vgremove[7288] filters/filter-md.c:149  MD filter initialised.
      23:12:49.122501 vgremove[7288] device_mapper/libdm-config.c:1172  devices/fw_raid_component_detection not found in config: defaulting to 0
      23:12:49.122511 vgremove[7288] filters/filter-composite.c:98  Composite filter initialised.
      23:12:49.122521 vgremove[7288] device_mapper/libdm-config.c:1172  devices/ignore_suspended_devices not found in config: defaulting to 0
      23:12:49.122531 vgremove[7288] device_mapper/libdm-config.c:1172  devices/ignore_lvm_mirrors not found in config: defaulting to 1
      23:12:49.122540 vgremove[7288] filters/filter-persistent.c:188  Persistent filter initialised.
      23:12:49.122551 vgremove[7288] device_mapper/libdm-config.c:1172  devices/scan_lvs not found in config: defaulting to 0
      23:12:49.122740 vgremove[7288] device_mapper/libdm-config.c:1172  devices/allow_mixed_block_sizes not found in config: defaulting to 0
      23:12:49.122778 vgremove[7288] device_mapper/libdm-config.c:1073  devices/hints not found in config: defaulting to "all"
      23:12:49.122925 vgremove[7288] device_mapper/libdm-config.c:1073  activation/activation_mode not found in config: defaulting to "degraded"
      23:12:49.122962 vgremove[7288] device_mapper/libdm-config.c:1172  metadata/record_lvs_history not found in config: defaulting to 0
      23:12:49.123057 vgremove[7288] device_mapper/libdm-config.c:1073  devices/search_for_devnames not found in config: defaulting to "all"
      23:12:49.123104 vgremove[7288] device_mapper/libdm-config.c:1100  activation/reserved_stack not found in config: defaulting to 64
      23:12:49.123209 vgremove[7288] device_mapper/libdm-config.c:1100  activation/reserved_memory not found in config: defaulting to 8192
      23:12:49.123245 vgremove[7288] device_mapper/libdm-config.c:1100  activation/process_priority not found in config: defaulting to -18
      23:12:49.123345 vgremove[7288] lvmcmdline.c:3220  DEGRADED MODE. Incomplete RAID LVs will be processed.
      23:12:49.123394 vgremove[7288] device_mapper/libdm-config.c:1172  activation/monitoring not found in config: defaulting to 1
      23:12:49.123487 vgremove[7288] lvmcmdline.c:3226  Processing command: vgremove -vvvv pv_shuffle_B
      23:12:49.123522 vgremove[7288] lvmcmdline.c:3227  Command pid: 7288
      23:12:49.123631 vgremove[7288] lvmcmdline.c:3228  System ID: 
      23:12:49.123646 vgremove[7288] lvmcmdline.c:3231  O_DIRECT will be used
      23:12:49.123656 vgremove[7288] device_mapper/libdm-config.c:1100  global/locking_type not found in config: defaulting to 1
      23:12:49.123678 vgremove[7288] device_mapper/libdm-config.c:1172  global/wait_for_locks not found in config: defaulting to 1
      23:12:49.123688 vgremove[7288] locking/locking.c:142  File locking settings: readonly:0 sysinit:0 ignorelockingfailure:0 global/metadata_read_only:0 global/wait_for_locks:1.
      23:12:49.123702 vgremove[7288] device_mapper/libdm-config.c:1172  global/prioritise_write_locks not found in config: defaulting to 1
      23:12:49.123712 vgremove[7288] device_mapper/libdm-config.c:1073  global/locking_dir not found in config: defaulting to "/run/lock/lvm"
      23:12:49.129961 vgremove[7288] device_mapper/libdm-common.c:988  Preparing SELinux context for /run/lock/lvm to system_u:object_r:lvm_lock_t:s0.
      23:12:49.130118 vgremove[7288] device_mapper/libdm-common.c:991  Resetting SELinux context to default value.
      23:12:49.130182 vgremove[7288] device_mapper/libdm-config.c:1172  devices/md_component_detection not found in config: defaulting to 1
      23:12:49.130218 vgremove[7288] device_mapper/libdm-config.c:1073  devices/md_component_checks not found in config: defaulting to "auto"
      23:12:49.130255 vgremove[7288] lvmcmdline.c:3046  Using md_component_checks auto use_full_md_check 0
      23:12:49.130292 vgremove[7288] device_mapper/libdm-config.c:1073  devices/multipath_wwids_file not found in config: defaulting to "/etc/multipath/wwids"
      23:12:49.130410 vgremove[7288] device/dev-mpath.c:255  multipath wwids read 0 from /etc/multipath/wwids
      23:12:49.130540 vgremove[7288] daemon-client.c:30  /run/lvm/lvmlockd.socket: Opening daemon socket to lvmlockd for protocol lvmlockd version 1.
      23:12:49.130730 vgremove[7288] daemon-client.c:50  Sending daemon lvmlockd: hello
      23:12:49.131181 vgremove[7288] locking/lvmlockd.c:99  Successfully connected to lvmlockd on fd 3.
      23:12:49.131243 vgremove[7288] misc/lvm-flock.c:228  Locking /run/lock/lvm/P_global WB
      23:12:49.131600 vgremove[7288] device_mapper/libdm-common.c:988  Preparing SELinux context for /run/lock/lvm/P_global to system_u:object_r:lvm_lock_t:s0.
      23:12:49.131682 vgremove[7288] misc/lvm-flock.c:113  _do_flock /run/lock/lvm/P_global:aux WB
      23:12:49.131896 vgremove[7288] misc/lvm-flock.c:113  _do_flock /run/lock/lvm/P_global WB
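The -vvvv trace above ends blocked in _do_flock on /run/lock/lvm/P_global, i.e. another process still holds the local flock. A hedged sketch for locating the holder: lslocks (util-linux) and /proc/locks are both standard interfaces, though the exact output format varies by host.

```shell
# Find which process holds the flock that vgremove is blocked on.
# Falls back to the raw /proc/locks table if lslocks is unavailable.
lockfile=/run/lock/lvm/P_global
if command -v lslocks >/dev/null 2>&1; then
    holders=$(lslocks 2>/dev/null | grep -F "$lockfile" || true)
else
    holders=$(grep -i flock /proc/locks || true)
fi
if [ -n "$holders" ]; then
    echo "$holders"
else
    echo "no holder found for $lockfile"
fi
```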
      

              Corey Marthaler (cmarthal@redhat.com)
              David Teigland
              Angana Chakraborty
              Votes: 0
              Watchers: 10