Description of problem:

All openmpi benchmarks fail on all RDMA devices with all RHEL-8.9 builds. This started happening around July 7th. RHEL-8.9 builds that used to pass all openmpi benchmarks on most RDMA devices (e.g. mlx5 ib0, ib1, roce) are now all failing. However, all openmpi benchmarks still pass with RHEL-8.8.0.

Version-Release number of selected component (if applicable):

All RHEL-8.9 builds appear to show this failure behavior on all openmpi benchmarks.

How reproducible:
100%

Steps to Reproduce:
1. See https://beaker-archive.hosts.prod.psi.bos.redhat.com/beaker-logs/2023/07/80456/8045680/14195192/162600969/760525155/resultoutputfile.log — it contains the openmpi benchmark testing for four RDMA devices: cxgb4 iw, hfi1 opa, mlx5 ib0, and mlx5 ib1.

Actual results:

Here is one group of RDMA devices that failed the openmpi benchmarks in the RDMA sanity test.

RHEL-8.9.0-20230619.20 sanity test results on rdma-perf-06/rdma-perf-07 & Beaker job J:8048367:
4.18.0-497.el8.x86_64, rdma-core-44.0-2.el8.1, cxgb4, iw, T62100-LP-CR & cxgb4_0
Result | Status | Test
---------+--------+------------------------------------
FAIL | 1 | openmpi mpitests-IMB-MPI1 PingPong
FAIL | 1 | openmpi mpitests-IMB-IO S_Read_indv
FAIL | 1 | openmpi mpitests-IMB-EXT Window
FAIL | 1 | openmpi mpitests-osu_get_bw

sanity test results on rdma-perf-06/rdma-perf-07 & Beaker job J:8048367:
4.18.0-497.el8.x86_64, rdma-core-44.0-2.el8.1, hfi1, opa0, Omni-Path & hfi1_0
Result | Status | Test
---------+--------+------------------------------------
FAIL | 1 | openmpi mpitests-IMB-MPI1 PingPong
FAIL | 1 | openmpi mpitests-IMB-IO S_Read_indv
FAIL | 1 | openmpi mpitests-IMB-EXT Window
FAIL | 1 | openmpi mpitests-osu_get_bw

sanity test results on rdma-perf-06/rdma-perf-07 & Beaker job J:8048367:
4.18.0-497.el8.x86_64, rdma-core-44.0-2.el8.1, mlx5, ib0, ConnectX-5 & mlx5_0
Result | Status | Test
---------+--------+------------------------------------
FAIL | 1 | openmpi mpitests-IMB-MPI1 PingPong
FAIL | 1 | openmpi mpitests-IMB-IO S_Read_indv
FAIL | 1 | openmpi mpitests-IMB-EXT Window
FAIL | 1 | openmpi mpitests-osu_get_bw

sanity test results on rdma-perf-06/rdma-perf-07 & Beaker job J:8048367:
4.18.0-497.el8.x86_64, rdma-core-44.0-2.el8.1, mlx5, ib1, ConnectX-5 & mlx5_1
Result | Status | Test
---------+--------+------------------------------------
FAIL | 1 | openmpi mpitests-IMB-MPI1 PingPong
FAIL | 1 | openmpi mpitests-IMB-IO S_Read_indv
FAIL | 1 | openmpi mpitests-IMB-EXT Window
FAIL | 1 | openmpi mpitests-osu_get_bw

Expected results:

Here are results for the same build, RHEL-8.9.0-20230619.20, on the same RDMA devices.

sanity test results on rdma-perf-06/rdma-perf-07 & Beaker job J:8041295:
4.18.0-497.el8.x86_64, rdma-core-44.0-2.el8.1, cxgb4, iw, T62100-LP-CR & cxgb4_0
Result | Status | Test
---------+--------+------------------------------------
PASS | 0 | openmpi mpitests-IMB-MPI1 PingPong
PASS | 0 | openmpi mpitests-IMB-IO S_Read_indv
PASS | 0 | openmpi mpitests-IMB-EXT Window
PASS | 0 | openmpi mpitests-osu_get_bw

sanity test results on rdma-perf-06/rdma-perf-07 & Beaker job J:8041295:
4.18.0-497.el8.x86_64, rdma-core-44.0-2.el8.1, hfi1, opa0, Omni-Path & hfi1_0
Result | Status | Test
---------+--------+------------------------------------
PASS | 0 | openmpi mpitests-IMB-MPI1 PingPong
PASS | 0 | openmpi mpitests-IMB-IO S_Read_indv
PASS | 0 | openmpi mpitests-IMB-EXT Window
PASS | 0 | openmpi mpitests-osu_get_bw

sanity test results on rdma-perf-06/rdma-perf-07 & Beaker job J:8041295:
4.18.0-497.el8.x86_64, rdma-core-44.0-2.el8.1, mlx5, ib0, ConnectX-5 & mlx5_0
Result | Status | Test
---------+--------+------------------------------------
PASS | 0 | openmpi mpitests-IMB-MPI1 PingPong
PASS | 0 | openmpi mpitests-IMB-IO S_Read_indv
PASS | 0 | openmpi mpitests-IMB-EXT Window
PASS | 0 | openmpi mpitests-osu_get_bw

sanity test results on rdma-perf-06/rdma-perf-07 & Beaker job J:8041295:
4.18.0-497.el8.x86_64, rdma-core-44.0-2.el8.1, mlx5, ib1, ConnectX-5 & mlx5_1
Result | Status | Test
---------+--------+------------------------------------
PASS | 0 | openmpi mpitests-IMB-MPI1 PingPong
PASS | 0 | openmpi mpitests-IMB-IO S_Read_indv
PASS | 0 | openmpi mpitests-IMB-EXT Window
PASS | 0 | openmpi mpitests-osu_get_bw

All failures show the same errors, as shown below.

+ [23-07-09 00:35:06] timeout 3m /usr/lib64/openmpi/bin/mpirun --allow-run-as-root --map-by node -mca btl_openib_warn_nonexistent_if 0 -mca btl_openib_if_include cxgb4_0:1 -mca mtl '^psm2,psm,ofi' -mca btl '^openib' --mca mtl_base_verbose 100 --mca btl_base_verbose 100 -mca pml ucx -mca osc ucx -x UCX_NET_DEVICES=cxgb4_iw --mca osc_ucx_verbose 100 --mca pml_ucx_verbose 100 -hostfile /root/hfile_one_core -np 2 /usr/lib64/openmpi/bin/mpitests-IMB-MPI1 PingPong
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] mca: base: components_register: registering framework btl components
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] mca: base: components_register: found loaded component ofi
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] mca: base: components_register: component ofi register function successful
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] mca: base: components_register: found loaded component self
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] mca: base: components_register: component self register function successful
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] mca: base: components_register: found loaded component sm
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] mca: base: components_register: found loaded component tcp
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] mca: base: components_register: component tcp register function successful
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] mca: base: components_register: found loaded component usnic
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] mca: base: components_register: component usnic register function successful
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] mca: base: components_register: found loaded component vader
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] mca: base: components_register: component vader register function successful
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] mca: base: components_open: opening btl components
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] mca: base: components_open: found loaded component ofi
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] mca: base: components_open: component ofi open function successful
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] mca: base: components_open: found loaded component self
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] mca: base: components_open: component self open function successful
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] mca: base: components_open: found loaded component tcp
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] mca: base: components_open: component tcp open function successful
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] mca: base: components_open: found loaded component usnic
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] mca: base: components_open: component usnic open function successful
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] mca: base: components_open: found loaded component vader
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] mca: base: components_open: component vader open function successful
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] select: initializing btl component ofi
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] mca: base: components_register: registering framework btl components
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] mca: base: components_register: found loaded component ofi
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] mca: base: components_register: component ofi register function successful
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] mca: base: components_register: found loaded component self
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] mca: base: components_register: component self register function successful
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] mca: base: components_register: found loaded component sm
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] mca: base: components_register: found loaded component tcp
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] mca: base: components_register: component tcp register function successful
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] mca: base: components_register: found loaded component usnic
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] mca: base: components_register: component usnic register function successful
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] mca: base: components_register: found loaded component vader
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] mca: base: components_register: component vader register function successful
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] mca: base: components_open: opening btl components
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] mca: base: components_open: found loaded component ofi
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] mca: base: components_open: component ofi open function successful
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] mca: base: components_open: found loaded component self
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] mca: base: components_open: component self open function successful
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] mca: base: components_open: found loaded component tcp
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] mca: base: components_open: component tcp open function successful
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] mca: base: components_open: found loaded component usnic
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] mca: base: components_open: component usnic open function successful
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] mca: base: components_open: found loaded component vader
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] mca: base: components_open: component vader open function successful
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] select: initializing btl component ofi
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
rdma-perf-07.rdma.lab.eng.rdu2.redhat.com.112673Wrong pkey 0x8001, please use PSM2_PKEY to specify a valid pkey (err=23)
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
rdma-perf-07.rdma.lab.eng.rdu2.redhat.com.112673Wrong pkey 0x8001, please use PSM2_PKEY to specify a valid pkey (err=23)
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
rdma-perf-07.rdma.lab.eng.rdu2.redhat.com.112673Wrong pkey 0x8001, please use PSM2_PKEY to specify a valid pkey (err=23)
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 95
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 95
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] select: init of component ofi returned success
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] select: initializing btl component self
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] select: init of component self returned success
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] select: initializing btl component tcp
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl: tcp: Searching for exclude address+prefix: 127.0.0.1 / 8
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl: tcp: Found match: 127.0.0.1 (lo)
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl: tcp: Using interface: sppp
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: Attempting to bind to AF_INET port 1024
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: Successfully bound to AF_INET port 1024
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: my listening v4 socket is 0.0.0.0:1024
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: examining interface bnxt_roce
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: using ipv6 interface bnxt_roce
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: examining interface cxgb4_iw
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: using ipv6 interface cxgb4_iw
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: examining interface mlx5_ib0
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: using ipv6 interface mlx5_ib0
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: examining interface mlx5_ib1
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: using ipv6 interface mlx5_ib1
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: examining interface hfi1_opa0
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: using ipv6 interface hfi1_opa0
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: examining interface bnxt_roce.43
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: using ipv6 interface bnxt_roce.43
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: examining interface bnxt_roce.45
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: using ipv6 interface bnxt_roce.45
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: examining interface cxgb4_iw.51
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: using ipv6 interface cxgb4_iw.51
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: examining interface cxgb4_iw.52
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: using ipv6 interface cxgb4_iw.52
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: examining interface hfi1_opa0.8024
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: using ipv6 interface hfi1_opa0.8024
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: examining interface hfi1_opa0.8022
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: using ipv6 interface hfi1_opa0.8022
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: examining interface mlx5_ib0.8002
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: using ipv6 interface mlx5_ib0.8002
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: examining interface mlx5_ib0.8012
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: using ipv6 interface mlx5_ib0.8012
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: examining interface mlx5_ib0.8006
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: using ipv6 interface mlx5_ib0.8006
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: examining interface mlx5_ib0.8010
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: using ipv6 interface mlx5_ib0.8010
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: examining interface mlx5_ib0.8014
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: using ipv6 interface mlx5_ib0.8014
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: examining interface mlx5_ib0.8004
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: using ipv6 interface mlx5_ib0.8004
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: examining interface mlx5_ib1.8009
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: using ipv6 interface mlx5_ib1.8009
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: examining interface mlx5_ib1.8013
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: using ipv6 interface mlx5_ib1.8013
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: examining interface mlx5_ib1.8011
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: using ipv6 interface mlx5_ib1.8011
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: examining interface mlx5_ib1.8005
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: using ipv6 interface mlx5_ib1.8005
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: examining interface mlx5_ib1.8003
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: using ipv6 interface mlx5_ib1.8003
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: examining interface mlx5_ib1.8007
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:tcp: using ipv6 interface mlx5_ib1.8007
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] select: init of component tcp returned success
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] select: initializing btl component usnic
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] btl:usnic: disqualifiying myself due to fi_getinfo(3) failure: No data available (-61)
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] select: init of component usnic returned failure
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] mca: base: close: component usnic closed
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] mca: base: close: unloading component usnic
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] select: initializing btl component vader
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] select: init of component vader returned failure
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] mca: base: close: component vader closed
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] mca: base: close: unloading component vader
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] common_ucx.c:174 using OPAL memory hooks as external events
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] pml_ucx.c:197 mca_pml_ucx_open: UCX version 1.14.1
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] common_ucx.c:333 self/memory: did not match transport list
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] common_ucx.c:333 tcp/cxgb4_iw: did not match transport list
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] common_ucx.c:333 sysv/memory: did not match transport list
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] common_ucx.c:333 posix/memory: did not match transport list
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] common_ucx.c:337 support level is none
--------------------------------------------------------------------------
No components were able to be opened in the pml framework.
This typically means that either no components of this type were
installed, or none of the installed components can be loaded.
Sometimes this means that shared libraries required by these
components are unable to be found/loaded.
Host: rdma-perf-07
Framework: pml
--------------------------------------------------------------------------
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112673] PML ucx cannot be selected
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 95
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 95
rdma-perf-06.rdma.lab.eng.rdu2.redhat.com.117196Wrong pkey 0x8001, please use PSM2_PKEY to specify a valid pkey (err=23)
rdma-perf-06.rdma.lab.eng.rdu2.redhat.com.117196Wrong pkey 0x8001, please use PSM2_PKEY to specify a valid pkey (err=23)
rdma-perf-06.rdma.lab.eng.rdu2.redhat.com.117196Wrong pkey 0x8001, please use PSM2_PKEY to specify a valid pkey (err=23)
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] select: init of component ofi returned success
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] select: initializing btl component self
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] select: init of component self returned success
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] select: initializing btl component tcp
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl: tcp: Searching for exclude address+prefix: 127.0.0.1 / 8
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl: tcp: Found match: 127.0.0.1 (lo)
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl: tcp: Using interface: sppp
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: Attempting to bind to AF_INET port 1024
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: Successfully bound to AF_INET port 1024
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: my listening v4 socket is 0.0.0.0:1024
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: examining interface qede_roce
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: using ipv6 interface qede_roce
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: examining interface cxgb4_iw
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: using ipv6 interface cxgb4_iw
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: examining interface idrac
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: using ipv6 interface idrac
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: examining interface hfi1_opa0
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: using ipv6 interface hfi1_opa0
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: examining interface cxgb4_iw.51
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: using ipv6 interface cxgb4_iw.51
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: examining interface cxgb4_iw.52
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: using ipv6 interface cxgb4_iw.52
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: examining interface hfi1_opa0.8024
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: using ipv6 interface hfi1_opa0.8024
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: examining interface hfi1_opa0.8022
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: using ipv6 interface hfi1_opa0.8022
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: examining interface qede_roce.45
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: using ipv6 interface qede_roce.45
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: examining interface qede_roce.43
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: using ipv6 interface qede_roce.43
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: examining interface mlx5_ib0
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: using ipv6 interface mlx5_ib0
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: examining interface mlx5_ib1
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: using ipv6 interface mlx5_ib1
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: examining interface mlx5_ib1.8009
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: using ipv6 interface mlx5_ib1.8009
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: examining interface mlx5_ib1.8013
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: using ipv6 interface mlx5_ib1.8013
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: examining interface mlx5_ib1.8011
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: using ipv6 interface mlx5_ib1.8011
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: examining interface mlx5_ib1.8005
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: using ipv6 interface mlx5_ib1.8005
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: examining interface mlx5_ib1.8003
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: using ipv6 interface mlx5_ib1.8003
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: examining interface mlx5_ib1.8007
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:tcp: using ipv6 interface mlx5_ib1.8007
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] select: init of component tcp returned success
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] select: initializing btl component usnic
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] btl:usnic: disqualifiying myself due to fi_getinfo(3) failure: No data available (-61)
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] select: init of component usnic returned failure
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] mca: base: close: component usnic closed
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] mca: base: close: unloading component usnic
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] select: initializing btl component vader
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] select: init of component vader returned failure
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] mca: base: close: component vader closed
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] mca: base: close: unloading component vader
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] common_ucx.c:174 using OPAL memory hooks as external events
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] pml_ucx.c:197 mca_pml_ucx_open: UCX version 1.14.1
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] common_ucx.c:333 self/memory: did not match transport list
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] common_ucx.c:333 tcp/cxgb4_iw: did not match transport list
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] common_ucx.c:333 sysv/memory: did not match transport list
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] common_ucx.c:333 posix/memory: did not match transport list
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] common_ucx.c:337 support level is none
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:117196] PML ucx cannot be selected
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112665] 1 more process has sent help message help-mca-base.txt / find-available:none found
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112665] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
+ [23-07-09 00:35:11] mpi_return=1
+ [23-07-09 00:35:11] RQA_check_result -r 1 -t 'openmpi mpitests-IMB-MPI1 PingPong'

Additional info:

The following is from the same build, RHEL-8.9.0-20230619.20, and the same RDMA devices as shown above:
https://beaker-archive.hosts.prod.psi.bos.redhat.com/beaker-logs/2023/07/80412/8041295/14188487/162547717/760201064/resultoutputfile.log

The same openmpi mpitests-IMB-MPI1 PingPong test:

+ [23-07-06 13:58:41] timeout 3m /usr/lib64/openmpi/bin/mpirun --allow-run-as-root --map-by node -mca btl_openib_warn_nonexistent_if 0 -mca btl_openib_if_include cxgb4_0:1 -mca mtl '^psm2,psm,ofi' -mca btl '^openib' --mca mtl_base_verbose 100 --mca btl_base_verbose 100 -mca pml ucx -mca osc ucx -x UCX_NET_DEVICES=cxgb4_iw --mca osc_ucx_verbose 100 --mca pml_ucx_verbose 100 -hostfile /root/hfile_one_core -np 2 /usr/lib64/openmpi/bin/mpitests-IMB-MPI1 PingPong
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] mca: base: components_register: registering framework btl components
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] mca: base: components_register: found loaded component ofi
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] mca: base: components_register: component ofi register function successful
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] mca: base: components_register: found loaded component self
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] mca: base: components_register: component self register function successful
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] mca: base: components_register: found loaded component sm
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] mca: base: components_register: found loaded component tcp
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] mca: base: components_register: component tcp register function successful
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] mca: base: components_register: found loaded component usnic
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] mca: base: components_register: component usnic register function successful
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] mca: base: components_register: found loaded component vader
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] mca: base: components_register: component vader register function successful
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] mca: base: components_open: opening btl components
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] mca: base: components_open: found loaded component ofi
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] mca: base: components_open: component ofi open function successful
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] mca: base: components_open: found loaded component self
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] mca: base: components_open: component self open function successful
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] mca: base: components_open: found loaded component tcp
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] mca: base: components_open: component tcp open function successful
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] mca: base: components_open: found loaded component usnic
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] mca: base: components_open: component usnic open function successful
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] mca: base: components_open: found loaded component vader
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] mca: base: components_open: component vader open function successful
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] select: initializing btl component ofi
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] mca: base: components_register: registering framework btl components
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] mca: base: components_register: found loaded component ofi
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] mca: base: components_register: component ofi register function successful
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] mca: base: components_register: found loaded component self
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] mca: base: components_register: component self register function successful
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] mca: base: components_register: found loaded component sm
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] mca: base: components_register: found loaded component tcp
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] mca: base: components_register: component tcp register function successful
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] mca: base: components_register: found loaded component usnic
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] mca: base: components_register: component usnic register function successful
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] mca: base: components_register: found loaded component vader
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] mca: base: components_register: component vader register function successful
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] mca: base: components_open: opening btl components
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] mca: base: components_open: found loaded component ofi
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] mca: base: components_open: component ofi open function successful
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] mca: base: components_open: found loaded component self
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] mca: base: components_open: component self open function successful
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] mca: base: components_open: found loaded component tcp
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] mca: base: components_open: component tcp open function successful
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] mca: base: components_open: found loaded component usnic
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] mca: base: components_open: component usnic open function successful
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] mca: base: components_open: found loaded component vader
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] mca: base: components_open: component vader open function successful
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] select: initializing btl component ofi
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
rdma-perf-07.rdma.lab.eng.rdu2.redhat.com.112960Wrong pkey 0x8001, please use PSM2_PKEY to specify a valid pkey (err=23)
rdma-perf-07.rdma.lab.eng.rdu2.redhat.com.112960Wrong pkey 0x8001, please use PSM2_PKEY to specify a valid pkey (err=23)
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
rdma-perf-07.rdma.lab.eng.rdu2.redhat.com.112960Wrong pkey 0x8001, please use PSM2_PKEY to specify a valid pkey (err=23)
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] select: init of component ofi returned success
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] select: initializing btl component self
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] select: init of component self returned success
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] select: initializing btl component tcp
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl: tcp: Searching for exclude address+prefix: 127.0.0.1 / 8
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl: tcp: Found match: 127.0.0.1 (lo)
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: Attempting to bind to AF_INET port 1024
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: Successfully bound to AF_INET port 1024
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: my listening v4 socket is 0.0.0.0:1024
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: examining interface bnxt_roce
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: using ipv6 interface bnxt_roce
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: examining interface cxgb4_iw
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: using ipv6 interface cxgb4_iw
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: examining interface mlx5_ib0
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: using ipv6 interface mlx5_ib0
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: examining interface mlx5_ib1
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: using ipv6 interface mlx5_ib1
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: examining interface hfi1_opa0
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: using ipv6 interface hfi1_opa0
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: examining interface bnxt_roce.43
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: using ipv6 interface bnxt_roce.43
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: examining interface bnxt_roce.45
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: using ipv6 interface bnxt_roce.45
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: examining interface cxgb4_iw.51
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: using ipv6 interface cxgb4_iw.51
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: examining interface cxgb4_iw.52
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: using ipv6 interface cxgb4_iw.52
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: examining interface hfi1_opa0.8024
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: using ipv6 interface hfi1_opa0.8024
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: examining interface hfi1_opa0.8022
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: using ipv6 interface hfi1_opa0.8022
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: examining interface mlx5_ib0.8002
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: using ipv6 interface mlx5_ib0.8002
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: examining interface mlx5_ib0.8012
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: using ipv6 interface mlx5_ib0.8012
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: examining interface mlx5_ib0.8006
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: using ipv6 interface mlx5_ib0.8006
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: examining interface mlx5_ib0.8010
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: using ipv6 interface mlx5_ib0.8010
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: examining interface mlx5_ib0.8014
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: using ipv6 interface mlx5_ib0.8014
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: examining interface mlx5_ib0.8004
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: using ipv6 interface mlx5_ib0.8004
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: examining interface mlx5_ib1.8009
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: using ipv6 interface mlx5_ib1.8009
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: examining interface mlx5_ib1.8013
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: using ipv6 interface mlx5_ib1.8013
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: examining interface mlx5_ib1.8011
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: using ipv6 interface mlx5_ib1.8011
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: examining interface mlx5_ib1.8005
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: using ipv6 interface mlx5_ib1.8005
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: examining interface mlx5_ib1.8003
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: using ipv6 interface mlx5_ib1.8003
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: examining interface mlx5_ib1.8007
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:tcp: using ipv6 interface mlx5_ib1.8007
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] select: init of component tcp returned success
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] select: initializing btl component usnic
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] btl:usnic: disqualifiying myself due to fi_getinfo(3) failure: No data available (-61)
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] select: init of component usnic returned failure
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] mca: base: close: component usnic closed
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] mca: base: close: unloading component usnic
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] select: initializing btl component vader
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] select: init of component vader returned failure
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] mca: base: close: component vader closed
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] mca: base: close: unloading component vader
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 95
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 95
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] pml_ucx.c:197 mca_pml_ucx_open: UCX version 1.14.1
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] pml_ucx.c:289 mca_pml_ucx_init
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] pml_ucx.c:114 Pack remote worker address, size 38
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] pml_ucx.c:114 Pack local worker address, size 141
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] pml_ucx.c:351 created ucp context 0x55be70103280, worker 0x55be7013a240
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 22
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 95
[create_qp:2753]create qp: failed on ibv_cmd_create_qp with 95
rdma-perf-06.rdma.lab.eng.rdu2.redhat.com.116966Wrong pkey 0x8001, please use PSM2_PKEY to specify a valid pkey (err=23)
rdma-perf-06.rdma.lab.eng.rdu2.redhat.com.116966Wrong pkey 0x8001, please use PSM2_PKEY to specify a valid pkey (err=23)
rdma-perf-06.rdma.lab.eng.rdu2.redhat.com.116966Wrong pkey 0x8001, please use PSM2_PKEY to specify a valid pkey (err=23)
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] select: init of component ofi returned success
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] select: initializing btl component self
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] select: init of component self returned success
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] select: initializing btl component tcp
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl: tcp: Searching for exclude address+prefix: 127.0.0.1 / 8
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl: tcp: Found match: 127.0.0.1 (lo)
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: Attempting to bind to AF_INET port 1024
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: Successfully bound to AF_INET port 1024
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: my listening v4 socket is 0.0.0.0:1024
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: examining interface qede_roce
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: using ipv6 interface qede_roce
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: examining interface cxgb4_iw
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: using ipv6 interface cxgb4_iw
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: examining interface idrac
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: using ipv6 interface idrac
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: examining interface hfi1_opa0
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: using ipv6 interface hfi1_opa0
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: examining interface cxgb4_iw.51
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: using ipv6 interface cxgb4_iw.51
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: examining interface cxgb4_iw.52
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: using ipv6 interface cxgb4_iw.52
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: examining interface hfi1_opa0.8024
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: using ipv6 interface hfi1_opa0.8024
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: examining interface hfi1_opa0.8022
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: using ipv6 interface hfi1_opa0.8022
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: examining interface qede_roce.45
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: using ipv6 interface qede_roce.45
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: examining interface qede_roce.43
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: using ipv6 interface qede_roce.43
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: examining interface mlx5_ib0
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: using ipv6 interface mlx5_ib0
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: examining interface mlx5_ib1
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: using ipv6 interface mlx5_ib1
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: examining interface mlx5_ib1.8009
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: using ipv6 interface mlx5_ib1.8009
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: examining interface mlx5_ib1.8013
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: using ipv6 interface mlx5_ib1.8013
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: examining interface mlx5_ib1.8011
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: using ipv6 interface mlx5_ib1.8011
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: examining interface mlx5_ib1.8005
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: using ipv6 interface mlx5_ib1.8005
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: examining interface mlx5_ib1.8003
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: using ipv6 interface mlx5_ib1.8003
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: examining interface mlx5_ib1.8007
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:tcp: using ipv6 interface mlx5_ib1.8007
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] select: init of component tcp returned success
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] select: initializing btl component usnic
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] btl:usnic: disqualifiying myself due to fi_getinfo(3) failure: No data available (-61)
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] select: init of component usnic returned failure
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] mca: base: close: component usnic closed
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] mca: base: close: unloading component usnic
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] select: initializing btl component vader
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] select: init of component vader returned failure
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] mca: base: close: component vader closed
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] mca: base: close: unloading component vader
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] pml_ucx.c:197 mca_pml_ucx_open: UCX version 1.14.1
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] pml_ucx.c:289 mca_pml_ucx_init
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] pml_ucx.c:114 Pack remote worker address, size 38
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] pml_ucx.c:114 Pack local worker address, size 141
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] pml_ucx.c:351 created ucp context 0x56268567af40, worker 0x5626856d9c00
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] pml_ucx.c:182 Got proc 0 address, size 141
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] pml_ucx.c:411 connecting to proc. 0
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] pml_ucx.c:182 Got proc 1 address, size 141
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] pml_ucx.c:411 connecting to proc. 1
#----------------------------------------------------------------
# Intel(R) MPI Benchmarks 2021.3, MPI-1 part
#----------------------------------------------------------------
# Date                  : Thu Jul 6 13:58:43 2023
# Machine               : x86_64
# System                : Linux
# Release               : 4.18.0-497.el8.x86_64
# Version               : #1 SMP Sat Jun 10 09:07:40 EDT 2023
# MPI Version           : 3.1
# MPI Thread Environment:
# Calling sequence was:
# /usr/lib64/openmpi/bin/mpitests-IMB-MPI1 PingPong
# Minimum message length in bytes: 0
# Maximum message length in bytes: 4194304
#
# MPI_Datatype                   : MPI_BYTE
# MPI_Datatype for reductions    : MPI_FLOAT
# MPI_Op                         : MPI_SUM
#
#
# List of Benchmarks to run:
# PingPong
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] pml_ucx.c:182 Got proc 1 address, size 38
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] pml_ucx.c:411 connecting to proc. 1
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] pml_ucx.c:182 Got proc 0 address, size 38
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] pml_ucx.c:411 connecting to proc. 0
#---------------------------------------------------
# Benchmarking PingPong
# #processes = 2
#---------------------------------------------------
       #bytes #repetitions      t[usec]   Mbytes/sec
            0         1000        14.26         0.00
            1         1000        13.20         0.08
            2         1000        12.83         0.16
            4         1000        13.14         0.30
            8         1000        12.95         0.62
           16         1000        12.66         1.26
           32         1000        12.87         2.49
           64         1000        12.86         4.98
          128         1000        13.01         9.84
          256         1000        13.01        19.67
          512         1000        13.76        37.22
         1024         1000        13.86        73.91
         2048         1000        14.31       143.17
         4096         1000        15.93       257.06
         8192         1000        20.99       390.36
        16384         1000        21.68       755.58
        32768         1000        28.62      1144.90
        65536          640        38.98      1681.46
       131072          320        65.57      1999.08
       262144          160       100.57      2606.48
       524288           80       135.98      3855.69
      1048576           40       197.97      5296.59
      2097152           20       316.05      6635.45
      4194304           10       564.65      7428.21
# All processes entering MPI_Finalize
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] common_ucx.c:240 disconnecting from rank 0
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] common_ucx.c:240 disconnecting from rank 1
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] common_ucx.c:204 waiting for 1 disconnect requests
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] common_ucx.c:204 waiting for 0 disconnect requests
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] common_ucx.c:240 disconnecting from rank 0
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] common_ucx.c:204 waiting for 1 disconnect requests
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] common_ucx.c:240 disconnecting from rank 1
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] common_ucx.c:204 waiting for 0 disconnect requests
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] pml_ucx.c:367 mca_pml_ucx_cleanup
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] pml_ucx.c:367 mca_pml_ucx_cleanup
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] pml_ucx.c:268 mca_pml_ucx_close
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] pml_ucx.c:268 mca_pml_ucx_close
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] mca: base: close: component ofi closed
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] mca: base: close: unloading component ofi
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] mca: base: close: component ofi closed
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] mca: base: close: unloading component ofi
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] mca: base: close: component self closed
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] mca: base: close: unloading component self
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] mca: base: close: component tcp closed
[rdma-perf-06.rdma.lab.eng.rdu2.redhat.com:116966] mca: base: close: unloading component tcp
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] mca: base: close: component self closed
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] mca: base: close: unloading component self
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] mca: base: close: component tcp closed
[rdma-perf-07.rdma.lab.eng.rdu2.redhat.com:112960] mca: base: close: unloading component tcp
+ [23-07-06 13:58:45] mpi_return=0
+ [23-07-06 13:58:45] RQA_check_result -r 0 -t 'openmpi mpitests-IMB-MPI1 PingPong'
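As a triage aid (not part of the original test run): the repeated `create qp: failed on ibv_cmd_create_qp with 22` / `with 95` messages print raw Linux errno values, which can be decoded with a short Python sketch:

```python
import errno
import os

# Decode the errno values reported by the create_qp failures in the
# log above, using the standard Linux errno numbering.
assert errno.EINVAL == 22      # likely invalid QP attributes rejected by the kernel
assert errno.EOPNOTSUPP == 95  # operation not supported by this device/driver

for code in (22, 95):
    print(f"errno {code}: {os.strerror(code)}")
```

So 22 is EINVAL and 95 is EOPNOTSUPP; the `err=23` in the "Wrong pkey" messages is a PSM2 error code, not an errno, and is not covered by this mapping.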
Installed:
  mpitests-openmpi-5.8-1.el8.x86_64
  openmpi-2:4.1.1-5.el8.x86_64
  openmpi-devel-2:4.1.1-5.el8.x86_64
  openmpi-java-2:4.1.1-5.el8.x86_64
  openmpi-java-devel-2:4.1.1-5.el8.x86_64

mpi-openmpi test results on rdma-dev-19/rdma-dev-20 & Beaker job J:8112820:
4.18.0-504.el8.x86_64, rdma-core-46.0-1.el8.1, mlx5, ib0, ConnectX-4 & mlx5_2
 Result | Status | Test
---------+--------+------------------------------------
PASS | 0 | openmpi IMB-MPI1 PingPong mpirun one_core
PASS | 0 | openmpi IMB-MPI1 PingPing mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Sendrecv mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Exchange mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Bcast mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Allgather mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Allgatherv mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Gather mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Gatherv mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Scatter mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Scatterv mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Alltoall mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Alltoallv mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Reduce mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Reduce_scatter mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Allreduce mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Barrier mpirun one_core
PASS | 0 | openmpi IMB-IO S_Write_indv mpirun one_core
PASS | 0 | openmpi IMB-IO S_Read_indv mpirun one_core
PASS | 0 | openmpi IMB-IO S_Write_expl mpirun one_core
PASS | 0 | openmpi IMB-IO S_Read_expl mpirun one_core
PASS | 0 | openmpi IMB-IO P_Write_indv mpirun one_core
PASS | 0 | openmpi IMB-IO P_Read_indv mpirun one_core
PASS | 0 | openmpi IMB-IO P_Write_expl mpirun one_core
PASS | 0 | openmpi IMB-IO P_Read_expl mpirun one_core
PASS | 0 | openmpi IMB-IO P_Write_shared mpirun one_core
PASS | 0 | openmpi IMB-IO P_Read_shared mpirun one_core
PASS | 0 | openmpi IMB-IO P_Write_priv mpirun one_core
PASS | 0 | openmpi IMB-IO P_Read_priv mpirun one_core
PASS | 0 | openmpi IMB-IO C_Write_indv mpirun one_core
PASS | 0 | openmpi IMB-IO C_Read_indv mpirun one_core
PASS | 0 | openmpi IMB-IO C_Write_expl mpirun one_core
PASS | 0 | openmpi IMB-IO C_Read_expl mpirun one_core
PASS | 0 | openmpi IMB-IO C_Write_shared mpirun one_core
PASS | 0 | openmpi IMB-IO C_Read_shared mpirun one_core
PASS | 0 | openmpi IMB-EXT Window mpirun one_core
PASS | 0 | openmpi IMB-EXT Unidir_Put mpirun one_core
PASS | 0 | openmpi IMB-EXT Unidir_Get mpirun one_core
PASS | 0 | openmpi IMB-EXT Bidir_Get mpirun one_core
PASS | 0 | openmpi IMB-EXT Bidir_Put mpirun one_core
PASS | 0 | openmpi IMB-EXT Accumulate mpirun one_core
PASS | 0 | openmpi IMB-NBC Ibcast mpirun one_core
PASS | 0 | openmpi IMB-NBC Iallgather mpirun one_core
PASS | 0 | openmpi IMB-NBC Iallgatherv mpirun one_core
PASS | 0 | openmpi IMB-NBC Igather mpirun one_core
PASS | 0 | openmpi IMB-NBC Igatherv mpirun one_core
PASS | 0 | openmpi IMB-NBC Iscatter mpirun one_core
PASS | 0 | openmpi IMB-NBC Iscatterv mpirun one_core
PASS | 0 | openmpi IMB-NBC Ialltoall mpirun one_core
PASS | 0 | openmpi IMB-NBC Ialltoallv mpirun one_core
PASS | 0 | openmpi IMB-NBC Ireduce mpirun one_core
PASS | 0 | openmpi IMB-NBC Ireduce_scatter mpirun one_core
PASS | 0 | openmpi IMB-NBC Iallreduce mpirun one_core
PASS | 0 | openmpi IMB-NBC Ibarrier mpirun one_core
PASS | 0 | openmpi IMB-RMA Unidir_put mpirun one_core
PASS | 0 | openmpi IMB-RMA Unidir_get mpirun one_core
PASS | 0 | openmpi IMB-RMA Bidir_put mpirun one_core
PASS | 0 | openmpi IMB-RMA Bidir_get mpirun one_core
PASS | 0 | openmpi IMB-RMA One_put_all mpirun one_core
PASS | 0 | openmpi IMB-RMA One_get_all mpirun one_core
PASS | 0 | openmpi IMB-RMA All_put_all mpirun one_core
PASS | 0 | openmpi IMB-RMA All_get_all mpirun one_core
PASS | 0 | openmpi IMB-RMA Put_local mpirun one_core
PASS | 0 | openmpi IMB-RMA Put_all_local mpirun one_core
PASS | 0 | openmpi IMB-RMA Exchange_put mpirun one_core
PASS | 0 | openmpi IMB-RMA Exchange_get mpirun one_core
PASS | 0 | openmpi IMB-RMA Accumulate mpirun one_core
PASS | 0 | openmpi IMB-RMA Get_accumulate mpirun one_core
PASS | 0 | openmpi IMB-RMA Fetch_and_op mpirun one_core
PASS | 0 | openmpi IMB-RMA Compare_and_swap mpirun one_core
PASS | 0 | openmpi IMB-RMA Get_local mpirun one_core
PASS | 0 | openmpi IMB-RMA Get_all_local mpirun one_core
PASS | 0 | openmpi OSU acc_latency mpirun one_core
PASS | 0 | openmpi OSU allgather mpirun one_core
PASS | 0 | openmpi OSU allgatherv mpirun one_core
PASS | 0 | openmpi OSU allreduce mpirun one_core
PASS | 0 | openmpi OSU alltoall mpirun one_core
PASS | 0 | openmpi OSU alltoallv mpirun one_core
PASS | 0 | openmpi OSU barrier mpirun one_core
PASS | 0 | openmpi OSU bcast mpirun one_core
PASS | 0 | openmpi OSU bibw mpirun one_core
PASS | 0 | openmpi OSU bw mpirun one_core
PASS | 0 | openmpi OSU cas_latency mpirun one_core
PASS | 0 | openmpi OSU fop_latency mpirun one_core
PASS | 0 | openmpi OSU gather mpirun one_core
PASS | 0 | openmpi OSU gatherv mpirun one_core
PASS | 0 | openmpi OSU get_acc_latency mpirun one_core
PASS | 0 | openmpi OSU get_bw mpirun one_core
PASS | 0 | openmpi OSU get_latency mpirun one_core
PASS | 0 | openmpi OSU hello mpirun one_core
PASS | 0 | openmpi OSU iallgather mpirun one_core
PASS | 0 | openmpi OSU iallgatherv mpirun one_core
PASS | 0 | openmpi OSU iallreduce mpirun one_core
PASS | 0 | openmpi OSU ialltoall mpirun one_core
PASS | 0 | openmpi OSU ialltoallv mpirun one_core
PASS | 0 | openmpi OSU ialltoallw mpirun one_core
PASS | 0 | openmpi OSU ibarrier mpirun one_core
PASS | 0 | openmpi OSU ibcast mpirun one_core
PASS | 0 | openmpi OSU igather mpirun one_core
PASS | 0 | openmpi OSU igatherv mpirun one_core
PASS | 0 | openmpi OSU init mpirun one_core
PASS | 0 | openmpi OSU ireduce mpirun one_core
PASS | 0 | openmpi OSU iscatter mpirun one_core
PASS | 0 | openmpi OSU iscatterv mpirun one_core
PASS | 0 | openmpi OSU latency mpirun one_core
PASS | 0 | openmpi OSU latency_mp mpirun one_core
PASS | 0 | openmpi OSU mbw_mr mpirun one_core
PASS | 0 | openmpi OSU multi_lat mpirun one_core
PASS | 0 | openmpi OSU put_bibw mpirun one_core
PASS | 0 | openmpi OSU put_bw mpirun one_core
PASS | 0 | openmpi OSU put_latency mpirun one_core
PASS | 0 | openmpi OSU reduce mpirun one_core
PASS | 0 | openmpi OSU reduce_scatter mpirun one_core
PASS | 0 | openmpi OSU scatter mpirun one_core
PASS | 0 | openmpi OSU scatterv mpirun one_core
PASS | 0 | NON-ROOT IMB-MPI1 PingPong
Checking for failures and known issues: no test failures

mpi-openmpi test results on rdma-dev-19/rdma-dev-20 & Beaker job J:8112820:
4.18.0-504.el8.x86_64, rdma-core-46.0-1.el8.1, mlx5, ib1, ConnectX-4 & mlx5_3
 Result | Status | Test
---------+--------+------------------------------------
PASS | 0 | openmpi IMB-MPI1 PingPong mpirun one_core
PASS | 0 | openmpi IMB-MPI1 PingPing mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Sendrecv mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Exchange mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Bcast mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Allgather mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Allgatherv mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Gather mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Gatherv mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Scatter mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Scatterv mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Alltoall mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Alltoallv mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Reduce mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Reduce_scatter mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Allreduce mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Barrier mpirun one_core
PASS | 0 | openmpi IMB-IO S_Write_indv mpirun one_core
PASS | 0 | openmpi IMB-IO S_Read_indv mpirun one_core
PASS | 0 | openmpi IMB-IO S_Write_expl mpirun one_core
PASS | 0 | openmpi IMB-IO S_Read_expl mpirun one_core
PASS | 0 | openmpi IMB-IO P_Write_indv mpirun one_core
PASS | 0 | openmpi IMB-IO P_Read_indv mpirun one_core
PASS | 0 | openmpi IMB-IO P_Write_expl mpirun one_core
PASS | 0 | openmpi IMB-IO P_Read_expl mpirun one_core
PASS | 0 | openmpi IMB-IO P_Write_shared mpirun one_core
PASS | 0 | openmpi IMB-IO P_Read_shared mpirun one_core
PASS | 0 | openmpi IMB-IO P_Write_priv mpirun one_core
PASS | 0 | openmpi IMB-IO P_Read_priv mpirun one_core
PASS | 0 | openmpi IMB-IO C_Write_indv mpirun one_core
PASS | 0 | openmpi IMB-IO C_Read_indv mpirun one_core
PASS | 0 | openmpi IMB-IO C_Write_expl mpirun one_core
PASS | 0 | openmpi IMB-IO C_Read_expl mpirun one_core
PASS | 0 | openmpi IMB-IO C_Write_shared mpirun one_core
PASS | 0 | openmpi IMB-IO C_Read_shared mpirun one_core
PASS | 0 | openmpi IMB-EXT Window mpirun one_core
PASS | 0 | openmpi IMB-EXT Unidir_Put mpirun one_core
PASS | 0 | openmpi IMB-EXT Unidir_Get mpirun one_core
PASS | 0 | openmpi IMB-EXT Bidir_Get mpirun one_core
PASS | 0 | openmpi IMB-EXT Bidir_Put mpirun one_core
PASS | 0 | openmpi IMB-EXT Accumulate mpirun one_core
PASS | 0 | openmpi IMB-NBC Ibcast mpirun one_core
PASS | 0 | openmpi IMB-NBC Iallgather mpirun one_core
PASS | 0 | openmpi IMB-NBC Iallgatherv mpirun one_core
PASS | 0 | openmpi IMB-NBC Igather mpirun one_core
PASS | 0 | openmpi IMB-NBC Igatherv mpirun one_core
PASS | 0 | openmpi IMB-NBC Iscatter mpirun one_core
PASS | 0 | openmpi IMB-NBC Iscatterv mpirun one_core
PASS | 0 | openmpi IMB-NBC Ialltoall mpirun one_core
PASS | 0 | openmpi IMB-NBC Ialltoallv mpirun one_core
PASS | 0 | openmpi IMB-NBC Ireduce mpirun one_core
PASS | 0 | openmpi IMB-NBC Ireduce_scatter mpirun one_core
PASS | 0 | openmpi IMB-NBC Iallreduce mpirun one_core
PASS | 0 | openmpi IMB-NBC Ibarrier mpirun one_core
PASS | 0 | openmpi IMB-RMA Unidir_put mpirun one_core
PASS | 0 | openmpi IMB-RMA Unidir_get mpirun one_core
PASS | 0 | openmpi IMB-RMA Bidir_put mpirun one_core
PASS | 0 | openmpi IMB-RMA Bidir_get mpirun one_core
PASS | 0 | openmpi IMB-RMA One_put_all mpirun one_core
PASS | 0 | openmpi IMB-RMA One_get_all mpirun one_core
PASS | 0 | openmpi IMB-RMA All_put_all mpirun one_core
PASS | 0 | openmpi IMB-RMA All_get_all mpirun one_core
PASS | 0 | openmpi IMB-RMA Put_local mpirun one_core
PASS | 0 | openmpi IMB-RMA Put_all_local mpirun one_core
PASS | 0 | openmpi IMB-RMA Exchange_put mpirun one_core
PASS | 0 | openmpi IMB-RMA Exchange_get mpirun one_core
PASS | 0 | openmpi IMB-RMA Accumulate mpirun one_core
PASS | 0 | openmpi IMB-RMA Get_accumulate mpirun one_core
PASS | 0 | openmpi IMB-RMA Fetch_and_op mpirun one_core
PASS | 0 | openmpi IMB-RMA Compare_and_swap mpirun one_core
PASS | 0 | openmpi IMB-RMA Get_local mpirun one_core
PASS | 0 | openmpi IMB-RMA Get_all_local mpirun one_core
PASS | 0 | openmpi OSU acc_latency mpirun one_core
PASS | 0 | openmpi OSU allgather mpirun one_core
PASS | 0 | openmpi OSU allgatherv mpirun one_core
PASS | 0 | openmpi OSU allreduce mpirun one_core
PASS | 0 | openmpi OSU alltoall mpirun one_core
PASS | 0 | openmpi OSU alltoallv mpirun one_core
PASS | 0 | openmpi OSU barrier mpirun one_core
PASS | 0 | openmpi OSU bcast mpirun one_core
PASS | 0 | openmpi OSU bibw mpirun one_core
PASS | 0 | openmpi OSU bw mpirun one_core
PASS | 0 | openmpi OSU cas_latency mpirun one_core
PASS | 0 | openmpi OSU fop_latency mpirun one_core
PASS | 0 | openmpi OSU gather mpirun one_core
PASS | 0 | openmpi OSU gatherv mpirun one_core
PASS | 0 | openmpi OSU get_acc_latency mpirun one_core
PASS | 0 | openmpi OSU get_bw mpirun one_core
PASS | 0 | openmpi OSU get_latency mpirun one_core
PASS | 0 | openmpi OSU hello mpirun one_core
PASS | 0 | openmpi OSU iallgather mpirun one_core
PASS | 0 | openmpi OSU iallgatherv mpirun one_core
PASS | 0 | openmpi OSU iallreduce mpirun one_core
PASS | 0 | openmpi OSU ialltoall mpirun one_core
PASS | 0 | openmpi OSU ialltoallv mpirun one_core
PASS | 0 | openmpi OSU ialltoallw mpirun one_core
PASS | 0 | openmpi OSU ibarrier mpirun one_core
PASS | 0 | openmpi OSU ibcast mpirun one_core
PASS | 0 | openmpi OSU igather mpirun one_core
PASS | 0 | openmpi OSU igatherv mpirun one_core
PASS | 0 | openmpi OSU init mpirun one_core
PASS | 0 | openmpi OSU ireduce mpirun one_core
PASS | 0 | openmpi OSU iscatter mpirun one_core
PASS | 0 | openmpi OSU iscatterv mpirun one_core
PASS | 0 | openmpi OSU latency mpirun one_core
PASS | 0 | openmpi OSU latency_mp mpirun one_core
PASS | 0 | openmpi OSU mbw_mr mpirun one_core
PASS | 0 | openmpi OSU multi_lat mpirun one_core
PASS | 0 | openmpi OSU put_bibw mpirun one_core
PASS | 0 | openmpi OSU put_bw mpirun one_core
PASS | 0 | openmpi OSU put_latency mpirun one_core
PASS | 0 | openmpi OSU reduce mpirun one_core
PASS | 0 | openmpi OSU reduce_scatter mpirun one_core
PASS | 0 | openmpi OSU scatter mpirun one_core
PASS | 0 | openmpi OSU scatterv mpirun one_core
PASS | 0 | NON-ROOT IMB-MPI1 PingPong
Checking for failures and known issues: no test failures

mpi-openmpi test results on rdma-dev-19/rdma-dev-20 & Beaker job J:8112820:
4.18.0-504.el8.x86_64, rdma-core-46.0-1.el8.1, mlx5, roce.45, ConnectX-4 Lx & mlx5_bond_0
 Result | Status | Test
---------+--------+------------------------------------
PASS | 0 | openmpi IMB-MPI1 PingPong mpirun one_core
PASS | 0 | openmpi IMB-MPI1 PingPing mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Sendrecv mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Exchange mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Bcast mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Allgather mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Allgatherv mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Gather mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Gatherv mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Scatter mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Scatterv mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Alltoall mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Alltoallv mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Reduce mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Reduce_scatter mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Allreduce mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Barrier mpirun one_core
PASS | 0 | openmpi IMB-IO S_Write_indv mpirun one_core
PASS | 0 | openmpi IMB-IO S_Read_indv mpirun one_core
PASS | 0 | openmpi IMB-IO S_Write_expl mpirun one_core
PASS | 0 | openmpi IMB-IO S_Read_expl mpirun one_core
PASS | 0 | openmpi IMB-IO P_Write_indv mpirun one_core
PASS | 0 | openmpi IMB-IO P_Read_indv mpirun one_core
PASS | 0 | openmpi IMB-IO P_Write_expl mpirun one_core
PASS | 0 | openmpi IMB-IO P_Read_expl mpirun one_core
PASS | 0 | openmpi IMB-IO P_Write_shared mpirun one_core
PASS | 0 | openmpi IMB-IO P_Read_shared mpirun one_core
PASS | 0 | openmpi IMB-IO P_Write_priv mpirun one_core
PASS | 0 | openmpi IMB-IO P_Read_priv mpirun one_core
PASS | 0 | openmpi IMB-IO C_Write_indv mpirun one_core
PASS | 0 | openmpi IMB-IO C_Read_indv mpirun one_core
PASS | 0 | openmpi IMB-IO C_Write_expl mpirun one_core
PASS | 0 | openmpi IMB-IO C_Read_expl mpirun one_core
PASS | 0 | openmpi IMB-IO C_Write_shared mpirun one_core
PASS | 0 | openmpi IMB-IO C_Read_shared mpirun one_core
PASS | 0 | openmpi IMB-EXT Window mpirun one_core
PASS | 0 | openmpi IMB-EXT Unidir_Put mpirun one_core
PASS | 0 | openmpi IMB-EXT Unidir_Get mpirun one_core
PASS | 0 | openmpi IMB-EXT Bidir_Get mpirun one_core
PASS | 0 | openmpi IMB-EXT Bidir_Put mpirun one_core
PASS | 0 | openmpi IMB-EXT Accumulate mpirun one_core
PASS | 0 | openmpi IMB-NBC Ibcast mpirun one_core
PASS | 0 | openmpi IMB-NBC Iallgather mpirun one_core
PASS | 0 | openmpi IMB-NBC Iallgatherv mpirun one_core
PASS | 0 | openmpi IMB-NBC Igather mpirun one_core
PASS | 0 | openmpi IMB-NBC Igatherv mpirun one_core
PASS | 0 | openmpi IMB-NBC Iscatter mpirun one_core
PASS | 0 | openmpi IMB-NBC Iscatterv mpirun one_core
PASS | 0 | openmpi IMB-NBC Ialltoall mpirun one_core
PASS | 0 | openmpi IMB-NBC Ialltoallv mpirun one_core
PASS | 0 | openmpi IMB-NBC Ireduce mpirun one_core
PASS | 0 | openmpi IMB-NBC Ireduce_scatter mpirun one_core
PASS | 0 | openmpi IMB-NBC Iallreduce mpirun one_core
PASS | 0 | openmpi IMB-NBC Ibarrier mpirun one_core
PASS | 0 | openmpi IMB-RMA Unidir_put mpirun
one_core PASS | 0 | openmpi IMB-RMA Unidir_get mpirun one_core PASS | 0 | openmpi IMB-RMA Bidir_put mpirun one_core PASS | 0 | openmpi IMB-RMA Bidir_get mpirun one_core PASS | 0 | openmpi IMB-RMA One_put_all mpirun one_core PASS | 0 | openmpi IMB-RMA One_get_all mpirun one_core PASS | 0 | openmpi IMB-RMA All_put_all mpirun one_core PASS | 0 | openmpi IMB-RMA All_get_all mpirun one_core PASS | 0 | openmpi IMB-RMA Put_local mpirun one_core PASS | 0 | openmpi IMB-RMA Put_all_local mpirun one_core PASS | 0 | openmpi IMB-RMA Exchange_put mpirun one_core PASS | 0 | openmpi IMB-RMA Exchange_get mpirun one_core PASS | 0 | openmpi IMB-RMA Accumulate mpirun one_core PASS | 0 | openmpi IMB-RMA Get_accumulate mpirun one_core PASS | 0 | openmpi IMB-RMA Fetch_and_op mpirun one_core PASS | 0 | openmpi IMB-RMA Compare_and_swap mpirun one_core PASS | 0 | openmpi IMB-RMA Get_local mpirun one_core PASS | 0 | openmpi IMB-RMA Get_all_local mpirun one_core PASS | 0 | openmpi OSU acc_latency mpirun one_core PASS | 0 | openmpi OSU allgather mpirun one_core PASS | 0 | openmpi OSU allgatherv mpirun one_core PASS | 0 | openmpi OSU allreduce mpirun one_core PASS | 0 | openmpi OSU alltoall mpirun one_core PASS | 0 | openmpi OSU alltoallv mpirun one_core PASS | 0 | openmpi OSU barrier mpirun one_core PASS | 0 | openmpi OSU bcast mpirun one_core PASS | 0 | openmpi OSU bibw mpirun one_core PASS | 0 | openmpi OSU bw mpirun one_core PASS | 0 | openmpi OSU cas_latency mpirun one_core PASS | 0 | openmpi OSU fop_latency mpirun one_core PASS | 0 | openmpi OSU gather mpirun one_core PASS | 0 | openmpi OSU gatherv mpirun one_core PASS | 0 | openmpi OSU get_acc_latency mpirun one_core PASS | 0 | openmpi OSU get_bw mpirun one_core PASS | 0 | openmpi OSU get_latency mpirun one_core PASS | 0 | openmpi OSU hello mpirun one_core PASS | 0 | openmpi OSU iallgather mpirun one_core PASS | 0 | openmpi OSU iallgatherv mpirun one_core PASS | 0 | openmpi OSU iallreduce mpirun one_core PASS | 0 | openmpi OSU ialltoall 
mpirun one_core PASS | 0 | openmpi OSU ialltoallv mpirun one_core PASS | 0 | openmpi OSU ialltoallw mpirun one_core PASS | 0 | openmpi OSU ibarrier mpirun one_core PASS | 0 | openmpi OSU ibcast mpirun one_core PASS | 0 | openmpi OSU igather mpirun one_core PASS | 0 | openmpi OSU igatherv mpirun one_core PASS | 0 | openmpi OSU init mpirun one_core PASS | 0 | openmpi OSU ireduce mpirun one_core PASS | 0 | openmpi OSU iscatter mpirun one_core PASS | 0 | openmpi OSU iscatterv mpirun one_core PASS | 0 | openmpi OSU latency mpirun one_core PASS | 0 | openmpi OSU latency_mp mpirun one_core PASS | 0 | openmpi OSU mbw_mr mpirun one_core PASS | 0 | openmpi OSU multi_lat mpirun one_core PASS | 0 | openmpi OSU put_bibw mpirun one_core PASS | 0 | openmpi OSU put_bw mpirun one_core PASS | 0 | openmpi OSU put_latency mpirun one_core PASS | 0 | openmpi OSU reduce mpirun one_core PASS | 0 | openmpi OSU reduce_scatter mpirun one_core PASS | 0 | openmpi OSU scatter mpirun one_core PASS | 0 | openmpi OSU scatterv mpirun one_core PASS | 0 | NON-ROOT IMB-MPI1 PingPong
Moving to verified as tests with openmpi-4.1.1-5.el8 passed.

$ grep DISTRO /etc/motd | uniq | tr -d " "
DISTRO=RHEL-8.9.0-20230809.19
$ cat /etc/redhat-release
Red Hat Enterprise Linux release 8.9 Beta (Ootpa)
$ uname -r
4.18.0-508.el8.x86_64
$ rpm -qa | grep -E "rdma-core|openmpi|ucx"
rdma-core-devel-46.0-1.el8.1.x86_64
openmpi-4.1.1-5.el8.x86_64
mpitests-openmpi-7.1-2.el8.1.x86_64
ucx-1.14.1-1.el8.1.x86_64
openmpi-devel-4.1.1-5.el8.x86_64
rdma-core-46.0-1.el8.1.x86_64
$

mpi-openmpi test results on rdma-perf-02/rdma-perf-03 & Beaker job J:8170807: 4.18.0-508.el8.x86_64, rdma-core-46.0-1.el8.1, mlx5, ib0, ConnectX-5 & mlx5_0

 Result  | Status | Test
---------+--------+------------------------------------
PASS | 0 | openmpi IMB-MPI1 PingPong mpirun one_core
PASS | 0 | openmpi IMB-MPI1 PingPing mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Sendrecv mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Exchange mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Bcast mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Allgather mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Allgatherv mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Gather mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Gatherv mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Scatter mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Scatterv mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Alltoall mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Alltoallv mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Reduce mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Reduce_scatter mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Allreduce mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Barrier mpirun one_core
PASS | 0 | openmpi IMB-IO S_Write_indv mpirun one_core
PASS | 0 | openmpi IMB-IO S_Read_indv mpirun one_core
PASS | 0 | openmpi IMB-IO S_Write_expl mpirun one_core
PASS | 0 | openmpi IMB-IO S_Read_expl mpirun one_core
PASS | 0 | openmpi IMB-IO P_Write_indv mpirun one_core
PASS | 0 | openmpi IMB-IO P_Read_indv mpirun one_core
PASS | 0 | openmpi IMB-IO P_Write_expl mpirun one_core
PASS | 0 | openmpi IMB-IO P_Read_expl mpirun one_core
PASS | 0 | openmpi IMB-IO P_Write_shared mpirun one_core
PASS | 0 | openmpi IMB-IO P_Read_shared mpirun one_core
PASS | 0 | openmpi IMB-IO P_Write_priv mpirun one_core
PASS | 0 | openmpi IMB-IO P_Read_priv mpirun one_core
PASS | 0 | openmpi IMB-IO C_Write_indv mpirun one_core
PASS | 0 | openmpi IMB-IO C_Read_indv mpirun one_core
PASS | 0 | openmpi IMB-IO C_Write_expl mpirun one_core
PASS | 0 | openmpi IMB-IO C_Read_expl mpirun one_core
PASS | 0 | openmpi IMB-IO C_Write_shared mpirun one_core
PASS | 0 | openmpi IMB-IO C_Read_shared mpirun one_core
PASS | 0 | openmpi IMB-EXT Window mpirun one_core
PASS | 0 | openmpi IMB-EXT Unidir_Put mpirun one_core
PASS | 0 | openmpi IMB-EXT Unidir_Get mpirun one_core
PASS | 0 | openmpi IMB-EXT Bidir_Get mpirun one_core
PASS | 0 | openmpi IMB-EXT Bidir_Put mpirun one_core
PASS | 0 | openmpi IMB-EXT Accumulate mpirun one_core
PASS | 0 | openmpi IMB-NBC Ibcast mpirun one_core
PASS | 0 | openmpi IMB-NBC Iallgather mpirun one_core
PASS | 0 | openmpi IMB-NBC Iallgatherv mpirun one_core
PASS | 0 | openmpi IMB-NBC Igather mpirun one_core
PASS | 0 | openmpi IMB-NBC Igatherv mpirun one_core
PASS | 0 | openmpi IMB-NBC Iscatter mpirun one_core
PASS | 0 | openmpi IMB-NBC Iscatterv mpirun one_core
PASS | 0 | openmpi IMB-NBC Ialltoall mpirun one_core
PASS | 0 | openmpi IMB-NBC Ialltoallv mpirun one_core
PASS | 0 | openmpi IMB-NBC Ireduce mpirun one_core
PASS | 0 | openmpi IMB-NBC Ireduce_scatter mpirun one_core
PASS | 0 | openmpi IMB-NBC Iallreduce mpirun one_core
PASS | 0 | openmpi IMB-NBC Ibarrier mpirun one_core
PASS | 0 | openmpi IMB-RMA Unidir_put mpirun one_core
PASS | 0 | openmpi IMB-RMA Unidir_get mpirun one_core
PASS | 0 | openmpi IMB-RMA Bidir_put mpirun one_core
PASS | 0 | openmpi IMB-RMA Bidir_get mpirun one_core
PASS | 0 | openmpi IMB-RMA One_put_all mpirun one_core
PASS | 0 | openmpi IMB-RMA One_get_all mpirun one_core
PASS | 0 | openmpi IMB-RMA All_put_all mpirun one_core
PASS | 0 | openmpi IMB-RMA All_get_all mpirun one_core
PASS | 0 | openmpi IMB-RMA Put_local mpirun one_core
PASS | 0 | openmpi IMB-RMA Put_all_local mpirun one_core
PASS | 0 | openmpi IMB-RMA Exchange_put mpirun one_core
PASS | 0 | openmpi IMB-RMA Exchange_get mpirun one_core
PASS | 0 | openmpi IMB-RMA Accumulate mpirun one_core
PASS | 0 | openmpi IMB-RMA Get_accumulate mpirun one_core
PASS | 0 | openmpi IMB-RMA Fetch_and_op mpirun one_core
PASS | 0 | openmpi IMB-RMA Compare_and_swap mpirun one_core
PASS | 0 | openmpi IMB-RMA Get_local mpirun one_core
PASS | 0 | openmpi IMB-RMA Get_all_local mpirun one_core
PASS | 0 | openmpi OSU acc_latency mpirun one_core
PASS | 0 | openmpi OSU allgather mpirun one_core
PASS | 0 | openmpi OSU allgatherv mpirun one_core
PASS | 0 | openmpi OSU allreduce mpirun one_core
PASS | 0 | openmpi OSU alltoall mpirun one_core
PASS | 0 | openmpi OSU alltoallv mpirun one_core
PASS | 0 | openmpi OSU alltoallw mpirun one_core
PASS | 0 | openmpi OSU barrier mpirun one_core
PASS | 0 | openmpi OSU bcast mpirun one_core
PASS | 0 | openmpi OSU bibw mpirun one_core
PASS | 0 | openmpi OSU bibw_persistent mpirun one_core
PASS | 0 | openmpi OSU bw mpirun one_core
PASS | 0 | openmpi OSU bw_persistent mpirun one_core
PASS | 0 | openmpi OSU cas_latency mpirun one_core
PASS | 0 | openmpi OSU fop_latency mpirun one_core
PASS | 0 | openmpi OSU gather mpirun one_core
PASS | 0 | openmpi OSU gatherv mpirun one_core
PASS | 0 | openmpi OSU get_acc_latency mpirun one_core
PASS | 0 | openmpi OSU get_bw mpirun one_core
PASS | 0 | openmpi OSU get_latency mpirun one_core
PASS | 0 | openmpi OSU hello mpirun one_core
PASS | 0 | openmpi OSU iallgather mpirun one_core
PASS | 0 | openmpi OSU iallgatherv mpirun one_core
PASS | 0 | openmpi OSU iallreduce mpirun one_core
PASS | 0 | openmpi OSU ialltoall mpirun one_core
PASS | 0 | openmpi OSU ialltoallv mpirun one_core
PASS | 0 | openmpi OSU ialltoallw mpirun one_core
PASS | 0 | openmpi OSU ibarrier mpirun one_core
PASS | 0 | openmpi OSU ibcast mpirun one_core
PASS | 0 | openmpi OSU igather mpirun one_core
PASS | 0 | openmpi OSU igatherv mpirun one_core
PASS | 0 | openmpi OSU ineighbor_allgather mpirun one_core
PASS | 0 | openmpi OSU ineighbor_allgatherv mpirun one_core
PASS | 0 | openmpi OSU ineighbor_alltoall mpirun one_core
PASS | 0 | openmpi OSU ineighbor_alltoallv mpirun one_core
PASS | 0 | openmpi OSU ineighbor_alltoallw mpirun one_core
PASS | 0 | openmpi OSU init mpirun one_core
PASS | 0 | openmpi OSU ireduce mpirun one_core
PASS | 0 | openmpi OSU ireduce_scatter mpirun one_core
PASS | 0 | openmpi OSU iscatter mpirun one_core
PASS | 0 | openmpi OSU iscatterv mpirun one_core
PASS | 0 | openmpi OSU latency mpirun one_core
PASS | 0 | openmpi OSU latency_mp mpirun one_core
PASS | 0 | openmpi OSU latency_persistent mpirun one_core
PASS | 0 | openmpi OSU mbw_mr mpirun one_core
PASS | 0 | openmpi OSU multi_lat mpirun one_core
PASS | 0 | openmpi OSU neighbor_allgather mpirun one_core
PASS | 0 | openmpi OSU neighbor_allgatherv mpirun one_core
PASS | 0 | openmpi OSU neighbor_alltoall mpirun one_core
PASS | 0 | openmpi OSU neighbor_alltoallv mpirun one_core
PASS | 0 | openmpi OSU neighbor_alltoallw mpirun one_core
PASS | 0 | openmpi OSU put_bibw mpirun one_core
PASS | 0 | openmpi OSU put_bw mpirun one_core
PASS | 0 | openmpi OSU put_latency mpirun one_core
PASS | 0 | openmpi OSU reduce mpirun one_core
PASS | 0 | openmpi OSU reduce_scatter mpirun one_core
PASS | 0 | openmpi OSU scatter mpirun one_core
PASS | 0 | openmpi OSU scatterv mpirun one_core
PASS | 0 | NON-ROOT IMB-MPI1 PingPong

Checking for failures and known issues: no test failures

mpi-openmpi test results on rdma-perf-02/rdma-perf-03 & Beaker job J:8170807: 4.18.0-508.el8.x86_64, rdma-core-46.0-1.el8.1, mlx5, roce.45, ConnectX-5 & mlx5_1

 Result  | Status | Test
---------+--------+------------------------------------
PASS | 0 | openmpi IMB-MPI1 PingPong mpirun one_core
PASS | 0 | openmpi IMB-MPI1 PingPing mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Sendrecv mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Exchange mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Bcast mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Allgather mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Allgatherv mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Gather mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Gatherv mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Scatter mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Scatterv mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Alltoall mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Alltoallv mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Reduce mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Reduce_scatter mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Allreduce mpirun one_core
PASS | 0 | openmpi IMB-MPI1 Barrier mpirun one_core
PASS | 0 | openmpi IMB-IO S_Write_indv mpirun one_core
PASS | 0 | openmpi IMB-IO S_Read_indv mpirun one_core
PASS | 0 | openmpi IMB-IO S_Write_expl mpirun one_core
PASS | 0 | openmpi IMB-IO S_Read_expl mpirun one_core
PASS | 0 | openmpi IMB-IO P_Write_indv mpirun one_core
PASS | 0 | openmpi IMB-IO P_Read_indv mpirun one_core
PASS | 0 | openmpi IMB-IO P_Write_expl mpirun one_core
PASS | 0 | openmpi IMB-IO P_Read_expl mpirun one_core
PASS | 0 | openmpi IMB-IO P_Write_shared mpirun one_core
PASS | 0 | openmpi IMB-IO P_Read_shared mpirun one_core
PASS | 0 | openmpi IMB-IO P_Write_priv mpirun one_core
PASS | 0 | openmpi IMB-IO P_Read_priv mpirun one_core
PASS | 0 | openmpi IMB-IO C_Write_indv mpirun one_core
PASS | 0 | openmpi IMB-IO C_Read_indv mpirun one_core
PASS | 0 | openmpi IMB-IO C_Write_expl mpirun one_core
PASS | 0 | openmpi IMB-IO C_Read_expl mpirun one_core
PASS | 0 | openmpi IMB-IO C_Write_shared mpirun one_core
PASS | 0 | openmpi IMB-IO C_Read_shared mpirun one_core
PASS | 0 | openmpi IMB-EXT Window mpirun one_core
PASS | 0 | openmpi IMB-EXT Unidir_Put mpirun one_core
PASS | 0 | openmpi IMB-EXT Unidir_Get mpirun one_core
PASS | 0 | openmpi IMB-EXT Bidir_Get mpirun one_core
PASS | 0 | openmpi IMB-EXT Bidir_Put mpirun one_core
PASS | 0 | openmpi IMB-EXT Accumulate mpirun one_core
PASS | 0 | openmpi IMB-NBC Ibcast mpirun one_core
PASS | 0 | openmpi IMB-NBC Iallgather mpirun one_core
PASS | 0 | openmpi IMB-NBC Iallgatherv mpirun one_core
PASS | 0 | openmpi IMB-NBC Igather mpirun one_core
PASS | 0 | openmpi IMB-NBC Igatherv mpirun one_core
PASS | 0 | openmpi IMB-NBC Iscatter mpirun one_core
PASS | 0 | openmpi IMB-NBC Iscatterv mpirun one_core
PASS | 0 | openmpi IMB-NBC Ialltoall mpirun one_core
PASS | 0 | openmpi IMB-NBC Ialltoallv mpirun one_core
PASS | 0 | openmpi IMB-NBC Ireduce mpirun one_core
PASS | 0 | openmpi IMB-NBC Ireduce_scatter mpirun one_core
PASS | 0 | openmpi IMB-NBC Iallreduce mpirun one_core
PASS | 0 | openmpi IMB-NBC Ibarrier mpirun one_core
PASS | 0 | openmpi IMB-RMA Unidir_put mpirun one_core
PASS | 0 | openmpi IMB-RMA Unidir_get mpirun one_core
PASS | 0 | openmpi IMB-RMA Bidir_put mpirun one_core
PASS | 0 | openmpi IMB-RMA Bidir_get mpirun one_core
PASS | 0 | openmpi IMB-RMA One_put_all mpirun one_core
PASS | 0 | openmpi IMB-RMA One_get_all mpirun one_core
PASS | 0 | openmpi IMB-RMA All_put_all mpirun one_core
PASS | 0 | openmpi IMB-RMA All_get_all mpirun one_core
PASS | 0 | openmpi IMB-RMA Put_local mpirun one_core
PASS | 0 | openmpi IMB-RMA Put_all_local mpirun one_core
PASS | 0 | openmpi IMB-RMA Exchange_put mpirun one_core
PASS | 0 | openmpi IMB-RMA Exchange_get mpirun one_core
PASS | 0 | openmpi IMB-RMA Accumulate mpirun one_core
PASS | 0 | openmpi IMB-RMA Get_accumulate mpirun one_core
PASS | 0 | openmpi IMB-RMA Fetch_and_op mpirun one_core
PASS | 0 | openmpi IMB-RMA Compare_and_swap mpirun one_core
PASS | 0 | openmpi IMB-RMA Get_local mpirun one_core
PASS | 0 | openmpi IMB-RMA Get_all_local mpirun one_core
PASS | 0 | openmpi OSU acc_latency mpirun one_core
PASS | 0 | openmpi OSU allgather mpirun one_core
PASS | 0 | openmpi OSU allgatherv mpirun one_core
PASS | 0 | openmpi OSU allreduce mpirun one_core
PASS | 0 | openmpi OSU alltoall mpirun one_core
PASS | 0 | openmpi OSU alltoallv mpirun one_core
PASS | 0 | openmpi OSU alltoallw mpirun one_core
PASS | 0 | openmpi OSU barrier mpirun one_core
PASS | 0 | openmpi OSU bcast mpirun one_core
PASS | 0 | openmpi OSU bibw mpirun one_core
PASS | 0 | openmpi OSU bibw_persistent mpirun one_core
PASS | 0 | openmpi OSU bw mpirun one_core
PASS | 0 | openmpi OSU bw_persistent mpirun one_core
PASS | 0 | openmpi OSU cas_latency mpirun one_core
PASS | 0 | openmpi OSU fop_latency mpirun one_core
PASS | 0 | openmpi OSU gather mpirun one_core
PASS | 0 | openmpi OSU gatherv mpirun one_core
PASS | 0 | openmpi OSU get_acc_latency mpirun one_core
PASS | 0 | openmpi OSU get_bw mpirun one_core
PASS | 0 | openmpi OSU get_latency mpirun one_core
PASS | 0 | openmpi OSU hello mpirun one_core
PASS | 0 | openmpi OSU iallgather mpirun one_core
PASS | 0 | openmpi OSU iallgatherv mpirun one_core
PASS | 0 | openmpi OSU iallreduce mpirun one_core
PASS | 0 | openmpi OSU ialltoall mpirun one_core
PASS | 0 | openmpi OSU ialltoallv mpirun one_core
PASS | 0 | openmpi OSU ialltoallw mpirun one_core
PASS | 0 | openmpi OSU ibarrier mpirun one_core
PASS | 0 | openmpi OSU ibcast mpirun one_core
PASS | 0 | openmpi OSU igather mpirun one_core
PASS | 0 | openmpi OSU igatherv mpirun one_core
PASS | 0 | openmpi OSU ineighbor_allgather mpirun one_core
PASS | 0 | openmpi OSU ineighbor_allgatherv mpirun one_core
PASS | 0 | openmpi OSU ineighbor_alltoall mpirun one_core
PASS | 0 | openmpi OSU ineighbor_alltoallv mpirun one_core
PASS | 0 | openmpi OSU ineighbor_alltoallw mpirun one_core
PASS | 0 | openmpi OSU init mpirun one_core
PASS | 0 | openmpi OSU ireduce mpirun one_core
PASS | 0 | openmpi OSU ireduce_scatter mpirun one_core
PASS | 0 | openmpi OSU iscatter mpirun one_core
PASS | 0 | openmpi OSU iscatterv mpirun one_core
PASS | 0 | openmpi OSU latency mpirun one_core
PASS | 0 | openmpi OSU latency_mp mpirun one_core
PASS | 0 | openmpi OSU latency_persistent mpirun one_core
PASS | 0 | openmpi OSU mbw_mr mpirun one_core
PASS | 0 | openmpi OSU multi_lat mpirun one_core
PASS | 0 | openmpi OSU neighbor_allgather mpirun one_core
PASS | 0 | openmpi OSU neighbor_allgatherv mpirun one_core
PASS | 0 | openmpi OSU neighbor_alltoall mpirun one_core
PASS | 0 | openmpi OSU neighbor_alltoallv mpirun one_core
PASS | 0 | openmpi OSU neighbor_alltoallw mpirun one_core
PASS | 0 | openmpi OSU put_bibw mpirun one_core
PASS | 0 | openmpi OSU put_bw mpirun one_core
PASS | 0 | openmpi OSU put_latency mpirun one_core
PASS | 0 | openmpi OSU reduce mpirun one_core
PASS | 0 | openmpi OSU reduce_scatter mpirun one_core
PASS | 0 | openmpi OSU scatter mpirun one_core
PASS | 0 | openmpi OSU scatterv mpirun one_core
PASS | 0 | NON-ROOT IMB-MPI1 PingPong

Checking for failures and known issues: no test failures
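The "Checking for failures and known issues" step above amounts to scanning the "Result | Status | Test" table for rows with a non-zero status. A minimal sketch of that check (a hypothetical helper for illustration, not the actual Beaker/test-harness code):

```python
def find_failures(table_text):
    """Return the Test column of every row whose Status is non-zero."""
    failures = []
    for line in table_text.splitlines():
        parts = [p.strip() for p in line.split("|")]
        # A data row has exactly three fields and a numeric status;
        # the header and separator lines are skipped by this check.
        if len(parts) == 3 and parts[1].isdigit():
            result, status, test = parts
            if status != "0":
                failures.append(test)
    return failures

sample = """\
PASS | 0 | openmpi IMB-MPI1 PingPong mpirun one_core
FAIL | 1 | openmpi mpitests-osu_get_bw
"""
print(find_failures(sample))  # -> ['openmpi mpitests-osu_get_bw']
```

Running this over the passing tables above would return an empty list, matching the "no test failures" verdicts reported for each device.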