Issue Type: Bug
Resolution: Won't Do
Priority: Normal
Fix Version: rhel-9.2.0
Pool Team: rhel-net-drivers-1
Severity: Unspecified
Docs: If docs needed, set a value
+++ This bug was initially created as a clone of Bug #2148122 +++
Description of problem:
All of the mvapich2 benchmarks fail with return code 134 or 1 when tested on MLX5 ROCE devices with bonding.
This is a regression from RHEL-9.1.0, where no such issue was encountered and all benchmarks passed.
Version-Release number of selected component (if applicable):
Clients: rdma-dev-20
Servers: rdma-dev-19
DISTRO=RHEL-9.2.0-20221122.2
+ [22-11-26 08:18:32] cat /etc/redhat-release
Red Hat Enterprise Linux release 9.2 Beta (Plow)
+ [22-11-26 08:18:32] uname -a
Linux rdma-dev-20.rdma.lab.eng.rdu2.redhat.com 5.14.0-197.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Wed Nov 16 14:31:27 EST 2022 x86_64 x86_64 x86_64 GNU/Linux
+ [22-11-26 08:18:32] cat /proc/cmdline
BOOT_IMAGE=(hd0,msdos1)/vmlinuz-5.14.0-197.el9.x86_64 root=UUID=538888d6-9151-4709-96e7-45de731ccb83 ro intel_idle.max_cstate=0 processor.max_cstate=0 intel_iommu=on iommu=on console=tty0 rd_NO_PLYMOUTH crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M resume=UUID=54a6b4ee-5516-4124-a3e4-0cc7620c182d console=ttyS1,115200n81
+ [22-11-26 08:18:32] rpm -q rdma-core linux-firmware
rdma-core-41.0-3.el9.x86_64
linux-firmware-20221012-128.el9.noarch
+ [22-11-26 08:18:32] tail /sys/class/infiniband/mlx5_2/fw_ver /sys/class/infiniband/mlx5_3/fw_ver /sys/class/infiniband/mlx5_bond_0/fw_ver
==> /sys/class/infiniband/mlx5_2/fw_ver <==
12.28.2006
==> /sys/class/infiniband/mlx5_3/fw_ver <==
12.28.2006
==> /sys/class/infiniband/mlx5_bond_0/fw_ver <==
14.31.1014
+ [22-11-26 08:18:32] lspci
+ [22-11-26 08:18:32] grep -i -e ethernet -e infiniband -e omni -e ConnectX
01:00.0 Ethernet controller: Broadcom Inc. and subsidiaries NetXtreme BCM5720 Gigabit Ethernet PCIe
01:00.1 Ethernet controller: Broadcom Inc. and subsidiaries NetXtreme BCM5720 Gigabit Ethernet PCIe
02:00.0 Ethernet controller: Broadcom Inc. and subsidiaries NetXtreme BCM5720 Gigabit Ethernet PCIe
02:00.1 Ethernet controller: Broadcom Inc. and subsidiaries NetXtreme BCM5720 Gigabit Ethernet PCIe
04:00.0 Ethernet controller: Mellanox Technologies MT27710 Family [ConnectX-4 Lx]
04:00.1 Ethernet controller: Mellanox Technologies MT27710 Family [ConnectX-4 Lx]
82:00.0 Infiniband controller: Mellanox Technologies MT27700 Family [ConnectX-4]
82:00.1 Infiniband controller: Mellanox Technologies MT27700 Family [ConnectX-4]
Installed:
mpitests-mvapich2-5.8-1.el9.x86_64 mvapich2-2.3.6-3.el9.x86_64
How reproducible:
100%
Steps to Reproduce:
1. Bring up the RDMA hosts mentioned above with the RHEL-9.2.0 build
2. Set up the RDMA hosts for the mvapich2 benchmark tests
3. Run one of the mvapich2 benchmarks with the "mpirun" command, as follows (a sketch of the hostfile layout follows the command):
timeout --preserve-status --kill-after=5m 3m mpirun -hostfile /root/hfile_one_core -np 2 mpitests-IMB-MPI1 PingPong -time 1.5
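For context, /root/hfile_one_core is the MPI hostfile passed to mpirun. The file itself is not attached to this report, so the following is only a minimal sketch of the assumed layout (one host per line, one rank per host, hostnames taken from the client/server pair listed above):

rdma-dev-19.rdma.lab.eng.rdu2.redhat.com
rdma-dev-20.rdma.lab.eng.rdu2.redhat.com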
Actual results:
+ [22-11-23 12:46:14] timeout --preserve-status --kill-after=5m 3m mpirun -hostfile /root/hfile_one_core -np 2 mpitests-IMB-MPI1 PingPong -time 1.5
[rdma-virt-03.rdma.lab.eng.rdu2.redhat.com:mpi_rank_1][rdma_find_network_type] Warning: Detected active HCAs in different subnets (mlx5_1 in subnet 1 and mlx5_0 in subnet 3). This can cause hangs. Plese use MV2_IBA_HCA to select appropriate HCAs or force rail sharing.
*** buffer overflow detected ***: mpitests-IMB-MPI1 terminated
[rdma-virt-03.rdma.lab.eng.rdu2.redhat.com:mpi_rank_1][error_sighandler] Caught error: Aborted (signal 6)
*** buffer overflow detected ***: mpitests-IMB-MPI1 terminated
===================================================================================
= BAD TERMINATION OF ONE OF YOUR APPLICATION PROCESSES
= PID 59810 RUNNING AT 172.31.45.203
= EXIT CODE: 134
= CLEANING UP REMAINING PROCESSES
= YOU CAN IGNORE THE BELOW CLEANUP MESSAGES
===================================================================================
[proxy:0:0@rdma-virt-02.rdma.lab.eng.rdu2.redhat.com] HYD_pmcd_pmip_control_cmd_cb (pm/pmiserv/pmip_cb.c:911): assert (!closed) failed
[proxy:0:0@rdma-virt-02.rdma.lab.eng.rdu2.redhat.com] HYDT_dmxu_poll_wait_for_event (tools/demux/demux_poll.c:76): callback returned error status
[proxy:0:0@rdma-virt-02.rdma.lab.eng.rdu2.redhat.com] main (pm/pmiserv/pmip.c:202): demux engine error waiting for event
YOUR APPLICATION TERMINATED WITH THE EXIT STRING: Aborted (signal 6)
This typically refers to a problem with your application.
Please see the FAQ page for debugging suggestions
+ [22-11-23 12:46:14] __MPI_check_result 134 mpitests-mvapich2 IMB-MPI1 PingPong mpirun /root/hfile_one_core
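Note: the rdma_find_network_type warning above points at MV2_IBA_HCA. A hedged sketch of re-running the same benchmark pinned to a single HCA follows; the device name mlx5_bond_0 is taken from the fw_ver listing above, and this is only the knob the warning refers to, not a verified workaround for the buffer-overflow abort:

# assumption: export MV2_IBA_HCA to both ranks via Hydra's -genv so they agree on one HCA/subnet
timeout --preserve-status --kill-after=5m 3m mpirun -genv MV2_IBA_HCA mlx5_bond_0 -hostfile /root/hfile_one_core -np 2 mpitests-IMB-MPI1 PingPong -time 1.5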
Expected results:
Normal completion of the benchmark tests with the expected statistics output.
Example of a normal result for the above benchmark on RHEL-9.1.0:
+ [22-11-26 07:46:41] timeout --preserve-status --kill-after=5m 3m mpirun -hostfile /root/hfile_one_core -np 2 mpitests-IMB-MPI1 PingPong -time 1.5
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:mpi_rank_0][rdma_param_handle_heterogeneity] All nodes involved in the job were detected to be homogeneous in terms of processors and interconnects. Setting MV2_HOMOGENEOUS_CLUSTER=1 can improve job startup performance on such systems. The following link has more details on enhancing job startup performance. http://mvapich.cse.ohio-state.edu/performance/job-startup/.
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:mpi_rank_0][rdma_param_handle_heterogeneity] To suppress this warning, please set MV2_SUPPRESS_JOB_STARTUP_PERFORMANCE_WARNING to 1
#----------------------------------------------------------------
#    Intel(R) MPI Benchmarks 2021.3, MPI-1 part
#----------------------------------------------------------------
# Date                  : Sat Nov 26 07:46:41 2022
# Machine               : x86_64
# System                : Linux
# Release               : 5.14.0-162.6.1.el9_1.x86_64
# Version               : #1 SMP PREEMPT_DYNAMIC Fri Sep 30 07:36:03 EDT 2022
# MPI Version           : 3.1
# MPI Thread Environment:

# Calling sequence was:

# mpitests-IMB-MPI1 PingPong -time 1.5

# Minimum message length in bytes:   0
# Maximum message length in bytes:   4194304
#
# MPI_Datatype                   :   MPI_BYTE
# MPI_Datatype for reductions    :   MPI_FLOAT
# MPI_Op                         :   MPI_SUM
#
# List of Benchmarks to run:

# PingPong

#---------------------------------------------------
# Benchmarking PingPong
# #processes = 2
#---------------------------------------------------
       #bytes #repetitions      t[usec]   Mbytes/sec
0 1000 1.33 0.00
1 1000 1.39 0.72
2 1000 1.39 1.44
4 1000 1.38 2.90
8 1000 1.37 5.83
16 1000 1.42 11.30
32 1000 1.42 22.60
64 1000 1.42 44.99
128 1000 1.50 85.06
256 1000 2.32 110.47
512 1000 2.37 216.18
1024 1000 2.62 390.22
2048 1000 2.89 707.54
4096 1000 3.91 1048.16
8192 1000 5.00 1638.25
16384 1000 6.54 2503.76
32768 1000 8.11 4040.48
65536 640 11.03 5939.29
131072 320 16.81 7797.95
262144 160 28.75 9117.72
524288 80 52.03 10077.15
1048576 40 98.62 10632.61
2097152 20 191.40 10956.77
4194304 10 376.95 11126.90
# All processes entering MPI_Finalize
[0] Failed to dealloc pd (Device or resource busy)
[1] Failed to dealloc pd (Device or resource busy)
[0] 16 at [0x000055f1767a1e20], src/mpid/ch3/src/mpid_rma.c[182]
[0] 8 at [0x000055f1771a9ed0], src/mpi/comm/create_2level_comm.c[1523]
[0] 8 at [0x000055f1771aa190], src/util/procmap/local_proc.c[93]
[0] 8 at [0x000055f1771aa0e0], src/util/procmap/local_proc.c[92]
[0] 24 at [0x000055f1771a9e10], src/mpi/group/grouputil.c[74]
[0] 8 at [0x000055f1771a9d60], src/mpi/comm/create_2level_comm.c[1481]
[0] 128 at [0x000055f1771a9c40], src/mpi/coll/ch3_shmem_coll.c[4501]
[0] 8 at [0x000055f1771aa320], src/util/procmap/local_proc.c[93]
[0] 8 at [0x000055f1771aa270], src/util/procmap/local_proc.c[92]
[0] 8 at [0x000055f1771a9b90], src/mpi/comm/create_2level_comm.c[942]
[0] 8 at [0x000055f1771a9ae0], src/mpi/comm/create_2level_comm.c[940]
[0] 1024 at [0x000055f1771a9640], src/mpi/coll/ch3_shmem_coll.c[5254]
[0] 8 at [0x000055f177120ed0], src/mpi/coll/ch3_shmem_coll.c[5249]
[0] 312 at [0x000055f1771a9460], src/mpi/coll/ch3_shmem_coll.c[5201]
[0] 264 at [0x000055f1771a92b0], src/mpi/coll/ch3_shmem_coll.c[5150]
[0] 8 at [0x000055f177120e20], src/mpi/comm/create_2level_comm.c[2103]
[0] 8 at [0x000055f177120d70], src/mpi/comm/create_2level_comm.c[2095]
[0] 8 at [0x000055f1771a9200], src/util/procmap/local_proc.c[93]
[0] 8 at [0x000055f1771a9150], src/util/procmap/local_proc.c[92]
[0] 16 at [0x000055f177168e30], src/mpi/group/grouputil.c[74]
[0] 8 at [0x000055f1771a90a0], src/util/procmap/local_proc.c[93]
[0] 8 at [0x000055f177120f80], src/util/procmap/local_proc.c[92]
[0] 24 at [0x000055f177168f90], src/mpi/group/grouputil.c[74]
[0] 8 at [0x000055f177168ee0], src/mpi/comm/create_2level_comm.c[1998]
[0] 8 at [0x000055f177168d80], src/mpi/comm/create_2level_comm.c[1974]
[0] 2048 at [0x000055f1771684e0], src/mpi/comm/create_2level_comm.c[1961]
[0] 24 at [0x000055f1771208a0], src/mpi/group/grouputil.c[74]
[0] 8 at [0x000055f177120ab0], src/util/procmap/local_proc.c[93]
[0] 8 at [0x000055f177120a00], src/util/procmap/local_proc.c[92]
[0] 8 at [0x000055f1767a1290], src/mpid/ch3/src/mpid_rma.c[182]
[0] 8 at [0x000055f1767a11e0], src/mpid/ch3/src/mpid_rma.c[182]
[0] 8 at [0x000055f176810f70], src/mpid/ch3/src/mpid_rma.c[182]
[0] 8 at [0x000055f176810ec0], src/mpid/ch3/src/mpid_rma.c[182]
[0] 8 at [0x000055f17695bf50], src/mpid/ch3/src/mpid_rma.c[182]
[0] 8 at [0x000055f1768105e0], src/mpid/ch3/src/mpid_rma.c[182]
[0] 504 at [0x000055f17695b410], src/mpi/comm/commutil.c[342]
[0] 32 at [0x000055f17695b350], src/mpid/ch3/src/mpid_vc.c[111]
[1] 8 at [0x0000564cba2d8880], src/mpi/comm/create_2level_comm.c[1523]
[1] 8 at [0x0000564cba2d8cd0], src/util/procmap/local_proc.c[93]
[1] 8 at [0x0000564cba2d8a90], src/util/procmap/local_proc.c[92]
[1] 24 at [0x0000564cba2d87c0], src/mpi/group/grouputil.c[74]
[1] 8 at [0x0000564cba2d8710], src/mpi/comm/create_2level_comm.c[1481]
[1] 128 at [0x0000564cba2d8540], src/mpi/coll/ch3_shmem_coll.c[4484]
[1] 8 at [0x0000564cba2d8c20], src/util/procmap/local_proc.c[93]
[1] 8 at [0x0000564cba2d8b70], src/util/procmap/local_proc.c[92]
[1] 8 at [0x0000564cba297f70], src/mpi/comm/create_2level_comm.c[942]
[1] 8 at [0x0000564cba297ec0], src/mpi/comm/create_2level_comm.c[940]
[1] 1024 at [0x0000564cba2d80a0], src/mpi/coll/ch3_shmem_coll.c[5254]
[1] 8 at [0x0000564cba297700], src/mpi/coll/ch3_shmem_coll.c[5249]
[1] 312 at [0x0000564cba297ce0], src/mpi/coll/ch3_shmem_coll.c[5201]
[1] 264 at [0x0000564cba297b30], src/mpi/coll/ch3_shmem_coll.c[5150]
[1] 8 at [0x0000564cba297650], src/mpi/comm/create_2level_comm.c[2103]
[1] 8 at [0x0000564cba2975a0], src/mpi/comm/create_2level_comm.c[2095]
[1] 8 at [0x0000564cba297a80], src/util/procmap/local_proc.c[93]
[1] 8 at [0x0000564cba2979d0], src/util/procmap/local_proc.c[92]
[1] 24 at [0x0000564cba297910], src/mpid/ch3/src/mpid_vc.c[111]
[1] 16 at [0x0000564cba24ff10], src/mpi/group/grouputil.c[74]
[1] 8 at [0x0000564cba297860], src/util/procmap/local_proc.c[93]
[1] 8 at [0x0000564cba2977b0], src/util/procmap/local_proc.c[92]
[1] 24 at [0x0000564cba2974e0], src/mpi/group/grouputil.c[74]
[1] 8 at [0x0000564cba24ffc0], src/mpi/comm/create_2level_comm.c[1998]
[1] 8 at [0x0000564cba24faf0], src/mpi/comm/create_2level_comm.c[1974]
[1] 2048 at [0x0000564cb9a8b7a0], src/mpi/comm/create_2level_comm.c[1961]
[1] 24 at [0x0000564cba24f9b0], src/mpi/group/grouputil.c[74]
[1] 8 at [0x0000564cba24fc50], src/util/procmap/local_proc.c[93]
[1] 8 at [0x0000564cba24fba0], src/util/procmap/local_proc.c[92]
[1] 8 at [0x0000564cb98d0530], src/mpid/ch3/src/mpid_rma.c[182]
[1] 8 at [0x0000564cb98d0480], src/mpid/ch3/src/mpid_rma.c[182]
[1] 8 at [0x0000564cb993ff70], src/mpid/ch3/src/mpid_rma.c[182]
[1] 8 at [0x0000564cb993fec0], src/mpid/ch3/src/mpid_rma.c[182]
[1] 8 at [0x0000564cb993f880], src/mpid/ch3/src/mpid_rma.c[182]
[1] 8 at [0x0000564cb993f7d0], src/mpid/ch3/src/mpid_rma.c[182]
[1] 504 at [0x0000564cb9a8a640], src/mpi/comm/commutil.c[342]
[1] 32 at [0x0000564cb9a8a580], src/mpid/ch3/src/mpid_vc.c[111]
+ [22-11-26 07:46:41] __MPI_check_result 0 mpitests-mvapich2 IMB-MPI1 PingPong mpirun /root/hfile_one_core
+ [22-11-26 07:46:41] '[' 6 -ne 6 ']'
+ [22-11-26 07:46:41] local status=0
+ [22-11-26 07:46:41] local pkg=mvapich2
+ [22-11-26 07:46:41] local benchmark=IMB-MPI1
++ [22-11-26 07:46:41] basename PingPong
+ [22-11-26 07:46:41] local app=PingPong
+ [22-11-26 07:46:41] app=PingPong
+ [22-11-26 07:46:41] local cmd=mpirun
++ [22-11-26 07:46:41] basename /root/hfile_one_core
+ [22-11-26 07:46:41] local hfile=hfile_one_core
+ [22-11-26 07:46:41] hfile=one_core
+ [22-11-26 07:46:41] RQA_check_result -r 0 -t 'mvapich2 IMB-MPI1 PingPong mpirun one_core'
+ [22-11-26 07:46:41] local test_pass=0
+ [22-11-26 07:46:41] local test_skip=777
+ [22-11-26 07:46:41] test 4 -gt 0
+ [22-11-26 07:46:41] case $1 in
+ [22-11-26 07:46:41] local rc=0
+ [22-11-26 07:46:41] shift
+ [22-11-26 07:46:41] shift
+ [22-11-26 07:46:41] test 2 -gt 0
+ [22-11-26 07:46:41] case $1 in
+ [22-11-26 07:46:41] local 'msg=mvapich2 IMB-MPI1 PingPong mpirun one_core'
+ [22-11-26 07:46:41] shift
+ [22-11-26 07:46:41] shift
+ [22-11-26 07:46:41] test 0 -gt 0
+ [22-11-26 07:46:41] '[' -z 0 -o -z 'mvapich2 IMB-MPI1 PingPong mpirun one_core' ']'
+ [22-11-26 07:46:41] '[' -z '' ']'
+ [22-11-26 07:46:41] source /etc/environment
++ [22-11-26 07:46:41] TEST_RESULT_FILE=
+ [22-11-26 07:46:41] '[' -z '' ']'
+ [22-11-26 07:46:41] __RQA_create_test_result_file
++ [22-11-26 07:46:41] rpm -q --queryformat '%{NAME}-%{VERSION}-%{RELEASE}' rdma-core
+ [22-11-26 07:46:42] local rdma_version=rdma-core-41.0-3.el9
++ [22-11-26 07:46:42] echo /kernel/infiniband/mpi/mvapich2/client
++ [22-11-26 07:46:42] sed -r 's/\/kernel\/infiniband\///;s/\/server//;s/\/client//;s/\/standalone//'
+ [22-11-26 07:46:42] TEST_NAME=mpi/mvapich2
+ [22-11-26 07:46:42] [[ ! -z rdma-dev-19 ]]
+ [22-11-26 07:46:42] [[ ! -z rdma-dev-20 ]]
+ [22-11-26 07:46:42] hosts_list=rdma-dev-19/rdma-dev-20
+ [22-11-26 07:46:42] local _bkrjb=
++ [22-11-26 07:46:42] grep JOBID= /etc/motd
++ [22-11-26 07:46:42] sed s/JOBID=/J:/g
++ [22-11-26 07:46:42] head -1
++ [22-11-26 07:46:42] tr -d ' '
+ [22-11-26 07:46:42] _bkrjb=J:7281055
+ [22-11-26 07:46:42] [[ ! -z J:7281055 ]]
+ [22-11-26 07:46:42] _bkrjb=' & Beaker job J:7281055'
++ [22-11-26 07:46:42] echo results_mpi/mvapich2.txt
++ [22-11-26 07:46:42] sed 's/\//-/g'
+ [22-11-26 07:46:42] local result_filename=results_mpi-mvapich2.txt
++ [22-11-26 07:46:42] mktemp -d
+ [22-11-26 07:46:42] TEST_RESULT_FILE=/tmp/tmp.gN3ZkSJOUt/results_mpi-mvapich2.txt
+ [22-11-26 07:46:42] echo 'mpi/mvapich2 test results on rdma-dev-19/rdma-dev-20 & Beaker job J:7281055:'
++ [22-11-26 07:46:42] uname -r
+ [22-11-26 07:46:42] echo '5.14.0-162.6.1.el9_1.x86_64, rdma-core-41.0-3.el9, mlx5, roce.45, ConnectX-4 Lx & mlx5_bond_0'
+ [22-11-26 07:46:42] echo ' Result | Status | Test'
+ [22-11-26 07:46:42] echo ' -------------------------------------------------'
+ [22-11-26 07:46:42] sed -i /TEST_RESULT_FILE/d /etc/environment
+ [22-11-26 07:46:42] echo TEST_RESULT_FILE=/tmp/tmp.gN3ZkSJOUt/results_mpi-mvapich2.txt
+ [22-11-26 07:46:42] source /etc/environment
++ [22-11-26 07:46:42] TEST_RESULT_FILE=/tmp/tmp.gN3ZkSJOUt/results_mpi-mvapich2.txt
+ [22-11-26 07:46:42] '[' 0 -eq 0 ']'
+ [22-11-26 07:46:42] local test_result=PASS
+ [22-11-26 07:46:42] printf '%10s | %6s | %s\n' PASS 0 'mvapich2 IMB-MPI1 PingPong mpirun one_core'
+ [22-11-26 07:46:42] set +x
--------------------------------------------------------
- TEST RESULT FOR mvapich2
- Test: mvapich2 IMB-MPI1 PingPong mpirun one_core
- Result: PASS
- Return: 0
--------------------------------------------------------
Additional info:
Is blocked by: RHEL-6177 "[RHEL8.8] all mvapich2 benchmarks fail on MLX5 ROCE when run on bonding devices" (status: Closed)