- Bug
- Resolution: Unresolved
- Major
- rhel-10.0.beta
- libblockdev-3.1.0-6.el10
- None
- None
- 1
- rhel-sst-storage-management
- ssg_filesystems_storage_and_HA
- 21
- 23
- 1
- Dev ack
- False
- None
- Red Hat Enterprise Linux
- System Roles Sprint 3
- None
What were you trying to do that didn't work?

Please provide the package NVRs for which the bug is seen:
- rhel-system-roles-1.79.0-0.3.el10.noarch
- kernel 6.10.0-0.rc4.11.el10.x86_64
- python3-blivet-3.10.0-4.el10.noarch
How reproducible:

Steps to reproduce:
- ansible-playbook -vv -i host tests_lvm_pool_pv_grow.yml

Expected results:
The tests_lvm_pool_pv_grow.yml playbook completes successfully; the PV on /dev/nvme0n1 is grown to fill the whole device.

Actual results:
The "Verify each PV size" assertion fails because the PV was not resized to fill the device:
skipping: [localhost] => {"changed": false, "false_condition": "storage_test_pool.type == 'lvm' and storage_test_pool.raid_level", "skip_r}

TASK [Check the type of each PV] **********************************************************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/test-verify-pool-members.yml:59
ok: [localhost] => (item=/dev/nvme0n1) => { "ansible_loop_var": "pv", "changed": false, "msg": "All assertions passed", "pv": "/dev/nvme0n1" }

TASK [Check that blivet supports PV grow to fill] *****************************************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/test-verify-pool-members.yml:73
ok: [localhost] => {"changed": false, "rc": 0, "stderr": "", "stderr_lines": [], "stdout": "True\n", "stdout_lines": ["True"]}

TASK [Verify that PVs fill the whole devices when they should] ****************************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/test-verify-pool-members.yml:82
included: /usr/share/ansible/roles/rhel-system-roles.storage/tests/verify-pool-member-pvsize.yml for localhost => (item=/dev/nvme0n1)

TASK [Get actual PV size] *****************************************************************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/verify-pool-member-pvsize.yml:2
ok: [localhost] => {"changed": false, "cmd": ["pvs", "--noheadings", "--nosuffix", "--units", "b", "-o", "SIZE", "/dev/nvme0n1"], "delta":}

TASK [Convert blkinfo size to bytes] ******************************************************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/verify-pool-member-pvsize.yml:7
ok: [localhost] => {"bytes": 1000190509056, "changed": false, "lvm": "931g", "parted": "931GiB", "size": "931 GiB"}

TASK [Verify each PV size] ****************************************************************************************************************
task path: /usr/share/ansible/roles/rhel-system-roles.storage/tests/verify-pool-member-pvsize.yml:12
fatal: [localhost]: FAILED! => {
    "assertion": "(dev_size.bytes - actual_pv_size.stdout | int) | abs / actual_pv_size.stdout | int < 0.04",
    "changed": false,
    "evaluated_to": false,
    "msg": "PV resize failure; size difference too big (device size: 1000190509056) (actual PV size: 3766484992)"
}

PLAY RECAP ********************************************************************************************************************************
localhost : ok=70 changed=2 unreachable=0 failed=1 skipped=31 rescued=0 ignored=0

[root@smicro-s110p-01 tests]# lsblk
NAME                              MAJ:MIN RM   SIZE RO TYPE MOUNTPOINTS
sda                                 8:0    0 447.1G  0 disk
├─sda1                              8:1    0   600M  0 part /boot/efi
├─sda2                              8:2    0     1G  0 part /boot
└─sda3                              8:3    0 445.5G  0 part
  ├─rhel_smicro--s110p--01-root   253:0    0    70G  0 lvm  /
  ├─rhel_smicro--s110p--01-swap   253:1    0  31.3G  0 lvm  [SWAP]
  └─rhel_smicro--s110p--01-home   253:2    0 344.3G  0 lvm  /home
nvme0n1                           259:0    0 931.5G  0 disk
├─foo-test2                       253:3    0     3G  0 lvm  /opt/test2
└─foo-test1                       253:4    0   520M  0 lvm  /opt/test1
nvme1n1                           259:1    0 931.5G  0 disk
nvme5n1                           259:2    0 931.5G  0 disk
nvme3n1                           259:3    0 931.5G  0 disk
nvme2n1                           259:4    0 931.5G  0 disk
nvme4n1                           259:5    0 931.5G  0 disk
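For reference, the failing "Verify each PV size" assertion compares the device size against the PV size with a 4% relative tolerance. A minimal Python sketch of that check, plugging in the two sizes reported in the failure message above, shows why it trips:

```python
# Sketch of the role's "Verify each PV size" tolerance check,
# using the device and PV sizes reported in the failure above.
device_size_bytes = 1000190509056   # bytes reported for /dev/nvme0n1 (from the log)
actual_pv_size_bytes = 3766484992   # PV size reported by pvs (from the log)

# The test asserts the relative difference is under 4%:
relative_diff = abs(device_size_bytes - actual_pv_size_bytes) / actual_pv_size_bytes
print(f"relative difference: {relative_diff:.2f}")
print("PASS" if relative_diff < 0.04 else "FAIL: size difference too big")
```

The relative difference evaluates to roughly 264, i.e. the PV is still about 3.5 GiB while the device is about 931 GiB, so the PV was never grown to fill the device.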
Links to:
- RHBA-2024:132121 libblockdev bug fix and enhancement update