Issue Type: Bug
Resolution: Unresolved
Priority: Normal
Pool Team: rhel-sst-csi-client-tools
Description of problem:
The virt-who config fails to get mapping info in rhsm.log on a Nutanix environment with Prism Flavor "central".
How reproducible:
100%
Is this issue a regression from an earlier version:
Yes, this is a regression; the issue does not exist in Satellite 6.15.1.
Version:
Satellite 6.16 stream snapshot 65.0
virt-who-1.31.26-1.el9.noarch
katello-4.13.0-0.1.master.el9sat.noarch
rubygem-foreman_virt_who_configure-0.5.22-1.el9sat.noarch
rubygem-hammer_cli_foreman_virt_who_configure-0.1.0-1.el9sat.noarch
Steps to Reproduce:
1. Create a virt-who config with Hypervisor Type "Nutanix AHV (ahv)" and Prism Flavor "element" (an illustrative sketch of the deployed configuration file follows the element mapping output below).
2. Deploy the virt-who config and check the mapping info
[root@ip-10-0-167-170 yum.repos.d]# hammer virt-who-config deploy --id 1 --organization-id 1
== [1/5] Installing virt-who ==
Running unlocking of package versions
================================================================================
Unlock packages: [OK]
--------------------------------------------------------------------------------
Running ForemanMaintain::Scenario
================================================================================
Install packages: [OK]
--------------------------------------------------------------------------------
Running locking of package versions
================================================================================
Lock packages: [OK]
--------------------------------------------------------------------------------
== [2/5] Encrypting password ==
== [3/5] Creating virt-who configuration ==
== [4/5] Creating sysconfig virt-who configuration ==
== [5/5] Enabling and restarting the virt-who service ==
Created symlink /etc/systemd/system/multi-user.target.wants/virt-who.service → /usr/lib/systemd/system/virt-who.service.
== Finished ==
Finished successfully
Check the mapping info in /var/log/rhsm/rhsm.log:
2024-07-10 02:09:27,364 [virtwho.destination_-3959399538730922996 DEBUG] MainProcess(25301):Thread-3 @subscriptionmanager.py:hypervisorCheckIn:210 - Host-to-guest mapping being sent to 'Default_Organization': {
    "hypervisors": [
        {
            "hypervisorId": {
                "hypervisorId": "NTNX-3c8c4069-A"
            },
            "name": "NTNX-3c8c4069-A",
            "guestIds": [
                {
                    "guestId": "129d8e57-b4fc-4d95-ad33-5aa6ec6fb146",
                    "state": 1,
                    "attributes": {
                        "virtWhoType": "ahv",
                        "active": 1
                    }
                }
            ],
            "facts": {
                "cpu.cpu_socket(s)": "6",
                "hypervisor.type": "kKvm",
                "hypervisor.version": "Nutanix 20190916.276",
                "dmi.system.uuid": "1c1b19c9-988c-4b86-a2b2-658fded10ccb",
                "hypervisor.cluster": "virtwho-test"
            }
        }
    ]
}
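For reference, the deploy step writes the virt-who configuration under /etc/virt-who.d/ (the log later names the config section 'virt-who-config-1'). The snippet below is only an illustrative sketch of what the element-flavor file may look like, not the exact generated content: the file name, the placeholder server/username/password values, the hypervisor_id choice, and the assumption that foreman_virt_who_configure expresses the Prism flavor through virt-who's prism_central option are all unverified here.

# /etc/virt-who.d/virt-who-config-1.conf -- illustrative sketch only, values are placeholders
[virt-who-config-1]
type=ahv
hypervisor_id=hostname
owner=Default_Organization
server=<Prism address>
username=<prism user>
encrypted_password=<written by the deploy step>
# Prism Flavor "element"; step 3 below corresponds to switching this to true
prism_central=false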
3. Change the Prism Flavor to "central", deploy the virt-who config again, and then check the mapping info (a sketch of the suspected failure path follows the log output below).
[root@ip-10-0-167-170 yum.repos.d]# hammer virt-who-config deploy --id 1 --organization-id 1
== [1/5] Installing virt-who ==
Running unlocking of package versions
================================================================================
Unlock packages: [OK]
--------------------------------------------------------------------------------
Running ForemanMaintain::Scenario
================================================================================
Install packages: [OK]
--------------------------------------------------------------------------------
Running locking of package versions
================================================================================
Lock packages: [OK]
--------------------------------------------------------------------------------
== [2/5] Encrypting password ==
== [3/5] Creating virt-who configuration ==
== [4/5] Creating sysconfig virt-who configuration ==
== [5/5] Enabling and restarting the virt-who service ==
== Finished ==
Finished successfully
Check the mapping info in /var/log/rhsm/rhsm.log:
2024-07-10 02:10:05,788 [virtwho.main DEBUG] MainProcess(25301):MainThread @executor.py:terminate:272 - virt-who is shutting down
2024-07-10 02:10:06,704 [virtwho.main DEBUG] MainProcess(25301):Thread-2 @virt.py:run:541 - Thread 'virt-who-config-1' terminated
2024-07-10 02:10:06,730 [virtwho.destination_-3959399538730922996 DEBUG] MainProcess(25301):Thread-3 @virt.py:run:541 - Thread 'destination_-3959399538730922996' terminated
2024-07-10 02:10:06,958 [virtwho.rhsm_log DEBUG] MainProcess(25578):MainThread @config.py:init_config:1582 - [global]: Value for "status" not set, using default: False
2024-07-10 02:10:06,958 [virtwho.rhsm_log DEBUG] MainProcess(25578):MainThread @config.py:init_config:1582 - [global]: Value for "json" not set, using default: False
2024-07-10 02:10:06,958 [virtwho.rhsm_log DEBUG] MainProcess(25578):MainThread @config.py:init_config:1582 - [global]: Value for "print_" not set, using default: False
2024-07-10 02:10:06,959 [virtwho.rhsm_log DEBUG] MainProcess(25578):MainThread @config.py:init_config:1582 - [global]: Value for "log_per_config" not set, using default: False
2024-07-10 02:10:06,959 [virtwho.rhsm_log DEBUG] MainProcess(25578):MainThread @config.py:init_config:1582 - [global]: Value for "configs" not set, using default: []
2024-07-10 02:10:06,959 [virtwho.rhsm_log DEBUG] MainProcess(25578):MainThread @config.py:init_config:1582 - [global]: Value for "reporter_id" not set, using default: ip-10-0-167-170.rhos-01.prod.psi.rdu2.redhat.com-479677c5067f22a170172499189cb777
2024-07-10 02:10:06,959 [virtwho.rhsm_log DEBUG] MainProcess(25578):MainThread @config.py:init_config:1582 - [global]: Value for "log_file" not set, using default: rhsm.log
2024-07-10 02:10:06,959 [virtwho.rhsm_log DEBUG] MainProcess(25578):MainThread @config.py:init_config:1582 - [global]: Value for "log_dir" not set, using default: /var/log/rhsm
2024-07-10 02:10:06,959 [virtwho.rhsm_log DEBUG] MainProcess(25578):MainThread @config.py:init_config:1582 - [virt-who-config-1]: Value for "is_hypervisor" not set, using default: True
2024-07-10 02:10:06,959 [virtwho.rhsm_log DEBUG] MainProcess(25578):MainThread @config.py:init_config:1582 - [virt-who-config-1]: Value for "ahv_internal_debug" not set, using default: False
2024-07-10 02:10:06,959 [virtwho.rhsm_log INFO] MainProcess(25578):MainThread @executor.py:__init__:52 - Using config named 'virt-who-config-1'
2024-07-10 02:10:06,959 [virtwho.rhsm_log INFO] MainProcess(25578):MainThread @main.py:main:136 - Using configuration "virt-who-config-1" ("ahv" mode)
2024-07-10 02:10:06,960 [virtwho.rhsm_log INFO] MainProcess(25578):MainThread @main.py:main:139 - Using reporter_id='ip-10-0-167-170.rhos-01.prod.psi.rdu2.redhat.com-479677c5067f22a170172499189cb777'
2024-07-10 02:10:06,960 [virtwho.main DEBUG] MainProcess(25578):MainThread @executor.py:run:233 - Starting infinite loop with 7200 seconds interval
2024-07-10 02:10:06,968 [rhsm.https DEBUG] MainProcess(25578):MainThread @https.py:<module>:57 - Using standard libs to provide httplib and ssl
2024-07-10 02:10:06,974 [virtwho.main DEBUG] MainProcess(25578):Thread-2 @virt.py:run:513 - Thread 'virt-who-config-1' started
2024-07-10 02:10:06,976 [virtwho.destination_4753028580417423715 DEBUG] MainProcess(25578):Thread-3 @virt.py:run:513 - Thread 'destination_4753028580417423715' started
2024-07-10 02:10:08,534 [virtwho.main INFO] MainProcess(25578):Thread-2 @ahv_interface.py:get_vm_entities:403 - Getting the list of available VM entities
2024-07-10 02:10:09,152 [virtwho.main DEBUG] MainProcess(25578):Thread-2 @ahv_interface.py:get_vm_entities:447 - Next vm list call has this body: {'length': 20, 'offset': 1}
2024-07-10 02:10:09,615 [virtwho.main DEBUG] MainProcess(25578):Thread-2 @ahv_interface.py:get_vm_entities:433 - Gathered all VM entities
2024-07-10 02:10:09,616 [virtwho.main INFO] MainProcess(25578):Thread-2 @ahv_interface.py:get_vm_entities:449 - Total number of vms uuids found and saved for processing 1
2024-07-10 02:10:09,616 [virtwho.main DEBUG] MainProcess(25578):Thread-2 @ahv_interface.py:get_host_uuid_from_vm:568 - Host UUID 1c1b19c9-988c-4b86-a2b2-658fded10ccb found for VM: 129d8e57-b4fc-4d95-ad33-5aa6ec6fb146
2024-07-10 02:10:10,096 [virtwho.main DEBUG] MainProcess(25578):Thread-2 @ahv.py:get_host_guest_mapping_v3:123 - Host '1c1b19c9-988c-4b86-a2b2-658fded10ccb' doesn't have hypervisor_id property
2024-07-10 02:10:10,096 [virtwho.main INFO] MainProcess(25578):Thread-2 @virt.py:_send_data:1191 - Report for config "virt-who-config-1" gathered, placing in datastore
2024-07-10 02:10:10,981 [rhsm.connection DEBUG] MainProcess(25578):Thread-3 @connection.py:__init__:242 - Environment variable NO_PROXY= will be used
2024-07-10 02:10:10,981 [virtwho.destination_4753028580417423715 DEBUG] MainProcess(25578):Thread-3 @subscriptionmanager.py:_connect:147 - Authenticating with RHSM username virt_who_reporter_1
2024-07-10 02:10:10,981 [virtwho.destination_4753028580417423715 INFO] MainProcess(25578):Thread-3 @subscriptionmanager.py:_connect:158 - X-Correlation-ID: cbca7f86be5e4604a1635ce09c65704b
2024-07-10 02:10:10,982 [rhsm.connection DEBUG] MainProcess(25578):Thread-3 @connection.py:__init__:626 - Creating new BaseRestLib instance
2024-07-10 02:10:10,982 [rhsm.connection DEBUG] MainProcess(25578):Thread-3 @connection.py:__init__:368 - Connection built: host=ip-10-0-167-170.rhos-01.prod.psi.rdu2.redhat.com port=443 handler=/rhsm auth=basic username=virt_who_reporter_1
2024-07-10 02:10:10,982 [rhsm.connection DEBUG] MainProcess(25578):Thread-3 @connection.py:_request:1150 - Making request: GET /rhsm/status/
2024-07-10 02:10:10,982 [rhsm.connection DEBUG] MainProcess(25578):Thread-3 @connection.py:_create_connection:759 - Creating new connection
2024-07-10 02:10:10,991 [rhsm.connection DEBUG] MainProcess(25578):Thread-3 @connection.py:_create_connection:824 - Created connection: <ssl.SSLSocket fd=9, family=AddressFamily.AF_INET, type=SocketKind.SOCK_STREAM, proto=6, laddr=('10.0.167.170', 56664), raddr=('10.0.167.170', 443)>
2024-07-10 02:10:11,029 [rhsm.connection DEBUG] MainProcess(25578):Thread-3 @connection.py:_update_smoothed_response_time:1230 - Response time: 0.00010728836059570312, Smoothed response time: 0.00010728836059570312
2024-07-10 02:10:11,029 [rhsm.connection DEBUG] MainProcess(25578):Thread-3 @connection.py:_request:1184 - Response: status=200, request="GET /rhsm/status/"
2024-07-10 02:10:11,029 [rhsm.connection DEBUG] MainProcess(25578):Thread-3 @connection.py:_request:1193 - HTTP header 'Connection' not included in response
2024-07-10 02:10:11,030 [rhsm.connection DEBUG] MainProcess(25578):Thread-3 @connection.py:_request:1150 - Making request: GET /rhsm/status
2024-07-10 02:10:11,030 [rhsm.connection DEBUG] MainProcess(25578):Thread-3 @connection.py:_create_connection:756 - Reusing connection: <ssl.SSLSocket fd=9, family=AddressFamily.AF_INET, type=SocketKind.SOCK_STREAM, proto=6, laddr=('10.0.167.170', 56664), raddr=('10.0.167.170', 443)>
2024-07-10 02:10:11,062 [rhsm.connection DEBUG] MainProcess(25578):Thread-3 @connection.py:_update_smoothed_response_time:1230 - Response time: 0.00010752677917480469, Smoothed response time: 0.00010731220245361327
2024-07-10 02:10:11,062 [rhsm.connection DEBUG] MainProcess(25578):Thread-3 @connection.py:_request:1184 - Response: status=200, request="GET /rhsm/status"
2024-07-10 02:10:11,063 [rhsm.connection DEBUG] MainProcess(25578):Thread-3 @connection.py:_request:1188 - Server wants to keep connection
2024-07-10 02:10:11,063 [rhsm.connection DEBUG] MainProcess(25578):Thread-3 @connection.py:_request:1204 - Connection timeout: 15 is used from 'Keep-Alive' HTTP header
2024-07-10 02:10:11,063 [rhsm.connection DEBUG] MainProcess(25578):Thread-3 @connection.py:_request:1207 - Max number of requests: 99 is used from 'Keep-Alive' HTTP header
2024-07-10 02:10:11,063 [rhsm.connection DEBUG] MainProcess(25578):Thread-3 @connection.py:_load_manager_capabilities:1483 - Server has the following capabilities: ['instance_multiplier', 'derived_product', 'vcpu', 'cert_v3', 'hypervisors_heartbeat', 'remove_by_pool_id', 'syspurpose', 'storage_band', 'cores', 'multi_environment', 'hypervisors_async', 'org_level_content_access', 'guest_limit', 'ram', 'batch_bind', 'combined_reporting']
2024-07-10 02:10:11,064 [rhsm.connection DEBUG] MainProcess(25578):Thread-3 @connection.py:_request:1150 - Making request: PUT /rhsm/hypervisors/Default_Organization/heartbeat?reporter_id=ip-10-0-167-170.rhos-01.prod.psi.rdu2.redhat.com-479677c5067f22a170172499189cb777
2024-07-10 02:10:11,064 [rhsm.connection DEBUG] MainProcess(25578):Thread-3 @connection.py:_create_connection:756 - Reusing connection: <ssl.SSLSocket fd=9, family=AddressFamily.AF_INET, type=SocketKind.SOCK_STREAM, proto=6, laddr=('10.0.167.170', 56664), raddr=('10.0.167.170', 443)>
2024-07-10 02:10:11,234 [rhsm.connection DEBUG] MainProcess(25578):Thread-3 @connection.py:_update_smoothed_response_time:1230 - Response time: 9.584426879882812e-05, Smoothed response time: 0.00010616540908813476
2024-07-10 02:10:11,234 [rhsm.connection DEBUG] MainProcess(25578):Thread-3 @connection.py:_request:1184 - Response: status=200, request="PUT /rhsm/hypervisors/Default_Organization/heartbeat?reporter_id=ip-10-0-167-170.rhos-01.prod.psi.rdu2.redhat.com-479677c5067f22a170172499189cb777"
2024-07-10 02:10:11,234 [rhsm.connection DEBUG] MainProcess(25578):Thread-3 @connection.py:_request:1188 - Server wants to keep connection
2024-07-10 02:10:11,234 [rhsm.connection DEBUG] MainProcess(25578):Thread-3 @connection.py:_request:1204 - Connection timeout: 15 is used from 'Keep-Alive' HTTP header
2024-07-10 02:10:11,235 [rhsm.connection DEBUG] MainProcess(25578):Thread-3 @connection.py:_request:1207 - Max number of requests: 98 is used from 'Keep-Alive' HTTP header
2024-07-10 02:10:11,235 [virtwho.destination_4753028580417423715 INFO] MainProcess(25578):Thread-3 @virt.py:_send_data:797 - Hosts-to-guests mapping for config "virt-who-config-1": 0 hypervisors and 0 guests found
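The two lines that matter in the central-flavor log are "Host '1c1b19c9-988c-4b86-a2b2-658fded10ccb' doesn't have hypervisor_id property" from ahv.py:get_host_guest_mapping_v3 and the final "0 hypervisors and 0 guests found": in central mode the host record apparently lacks the field virt-who uses as the hypervisor ID, so the host is dropped and the submitted report is empty. The Python below is only a minimal sketch of that failure mode under this assumption; it is not the virt-who source, and build_mapping, hypervisor_id_key, and the sample data are made up for illustration.

# Minimal sketch (not virt-who code): a mapping builder that drops any host
# whose API record lacks the key used as the hypervisor ID. If the Prism
# Central payload names that field differently from Prism Element, every
# host is skipped and the report ends up with 0 hypervisors and 0 guests.
from typing import Dict, List


def build_mapping(hosts: List[Dict], guests_by_host: Dict[str, List[str]],
                  hypervisor_id_key: str = "serial") -> List[Dict]:
    mapping = []
    for host in hosts:
        hypervisor_id = host.get(hypervisor_id_key)
        if hypervisor_id is None:
            # Mirrors the observed DEBUG message:
            # "Host '<uuid>' doesn't have hypervisor_id property"
            print(f"Host '{host.get('uuid')}' doesn't have hypervisor_id property")
            continue  # host is silently dropped from the report
        mapping.append({
            "hypervisorId": {"hypervisorId": hypervisor_id},
            "guestIds": guests_by_host.get(host["uuid"], []),
        })
    return mapping


# A central-style record without the expected key yields an empty mapping,
# matching the "0 hypervisors and 0 guests found" line above.
hosts = [{"uuid": "1c1b19c9-988c-4b86-a2b2-658fded10ccb"}]
guests = {"1c1b19c9-988c-4b86-a2b2-658fded10ccb": ["129d8e57-b4fc-4d95-ad33-5aa6ec6fb146"]}
print(build_mapping(hosts, guests))  # -> []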
Actual behavior:
The virt-who config fails to get mapping info in rhsm.log on a Nutanix environment with Prism Flavor "central"; the report contains 0 hypervisors and 0 guests.
Expected behavior:
The virt-who config gets the mapping info in rhsm.log on a Nutanix environment with Prism Flavor "central", just as it does with Prism Flavor "element".
Business Impact / Additional info: