Bug 2216042

Summary:           [RHEL-9.3] openmpi test fail in RDMA sanity test when tested
Product:           Red Hat Enterprise Linux 9
Reporter:          Brian Chae <bchae>
Component:         openmpi
Assignee:          Kamal Heib <kheib>
Status:            VERIFIED
QA Contact:        Brian Chae <bchae>
Severity:          unspecified
Priority:          unspecified
Version:           9.3
CC:                kheib, mschmidt, rdma-dev-team, tmichael, zguo
Target Milestone:  rc
Keywords:          Triaged
Hardware:          Unspecified
OS:                Unspecified
Fixed In Version:  openmpi-4.1.1-7.el9
Type:              Bug

Description Brian Chae 2023-06-19 21:36:37 UTC
Description of problem:

All openmpi test cases fail in the RDMA sanity test when run on MLX5 IB/RoCE devices, starting with the RHEL-9.3.0-20230616.31 build.

      FAIL |      1 | openmpi mpitests-IMB-MPI1 PingPong
      FAIL |      1 | openmpi mpitests-IMB-IO S_Read_indv
      FAIL |      1 | openmpi mpitests-IMB-EXT Window
      FAIL |      1 | openmpi mpitests-osu_get_bw

This is a regression from the RHEL-9.3.0-20230615.41 build, in which all of the above openmpi test cases passed.

      PASS |      0 | openmpi mpitests-IMB-MPI1 PingPong
      PASS |      0 | openmpi mpitests-IMB-IO S_Read_indv
      PASS |      0 | openmpi mpitests-IMB-EXT Window
      PASS |      0 | openmpi mpitests-osu_get_bw



Version-Release number of selected component (if applicable):

Clients: rdma-dev-20
Servers: rdma-dev-19

DISTRO=RHEL-9.3.0-20230616.31

+ [23-06-19 16:09:39] cat /etc/redhat-release
Red Hat Enterprise Linux release 9.3 Beta (Plow)

+ [23-06-19 16:09:39] uname -a
Linux rdma-dev-20.rdma.lab.eng.rdu2.redhat.com 5.14.0-327.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Tue Jun 13 19:49:22 EDT 2023 x86_64 x86_64 x86_64 GNU/Linux

+ [23-06-19 16:09:39] cat /proc/cmdline
BOOT_IMAGE=(hd0,msdos1)/vmlinuz-5.14.0-327.el9.x86_64 root=UUID=40c67501-acf0-404f-979e-aae413fad56e ro intel_idle.max_cstate=0 processor.max_cstate=0 intel_iommu=on iommu=on console=tty0 rd_NO_PLYMOUTH crashkernel=1G-4G:192M,4G-64G:256M,64G-:512M resume=UUID=14257324-2a80-4d43-bbea-1948f9f0a77c console=ttyS1,115200n81

+ [23-06-19 16:09:39] rpm -q rdma-core linux-firmware
rdma-core-46.0-1.el9.x86_64
linux-firmware-20230404-134.el9.noarch

+ [23-06-19 16:09:39] tail /sys/class/infiniband/mlx5_2/fw_ver /sys/class/infiniband/mlx5_3/fw_ver /sys/class/infiniband/mlx5_bond_0/fw_ver
==> /sys/class/infiniband/mlx5_2/fw_ver <==
12.28.2006

==> /sys/class/infiniband/mlx5_3/fw_ver <==
12.28.2006

==> /sys/class/infiniband/mlx5_bond_0/fw_ver <==
14.31.1014
+ [23-06-19 16:09:39] lspci
+ [23-06-19 16:09:39] grep -i -e ethernet -e infiniband -e omni -e ConnectX
01:00.0 Ethernet controller: Broadcom Inc. and subsidiaries NetXtreme BCM5720 Gigabit Ethernet PCIe
01:00.1 Ethernet controller: Broadcom Inc. and subsidiaries NetXtreme BCM5720 Gigabit Ethernet PCIe
02:00.0 Ethernet controller: Broadcom Inc. and subsidiaries NetXtreme BCM5720 Gigabit Ethernet PCIe
02:00.1 Ethernet controller: Broadcom Inc. and subsidiaries NetXtreme BCM5720 Gigabit Ethernet PCIe
04:00.0 Ethernet controller: Mellanox Technologies MT27710 Family [ConnectX-4 Lx]
04:00.1 Ethernet controller: Mellanox Technologies MT27710 Family [ConnectX-4 Lx]
82:00.0 Infiniband controller: Mellanox Technologies MT27700 Family [ConnectX-4]
82:00.1 Infiniband controller: Mellanox Technologies MT27700 Family [ConnectX-4]

=======================================
Installed:
  mpitests-openmpi-5.8-1.el9.x86_64         openmpi-1:4.1.5-1.el9.x86_64      
=======================================
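Note that the failing compose carries openmpi 4.1.5-1.el9, a rebase from the 4.1.1 builds in earlier composes (see Additional info below). A quick, hedged way to confirm what changed on an installed host, using standard rpm queries:

  # Show the installed NVR and the most recent changelog entries,
  # which identify the rebase or patch that landed in this build.
  rpm -q openmpi
  rpm -q --changelog openmpi | head -n 20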



How reproducible:
100%

Steps to Reproduce:
1. Refer to the following RDMA sanity test logs.

https://beaker-archive.hosts.prod.psi.bos.redhat.com/beaker-logs/2023/06/79905/7990509/14116853/162007739/757634526/resultoutputfile.log

https://beaker-archive.hosts.prod.psi.bos.redhat.com/beaker-logs/2023/06/79904/7990462/14116804/162007516/757633371/resultoutputfile.log


2. timeout 3m /usr/lib64/openmpi/bin/mpirun --allow-run-as-root --map-by node -mca btl_openib_warn_nonexistent_if 0 -mca btl_openib_if_include mlx5_0:1 -mca btl '^openib' -mca pml ucx -mca btl_openib_cpc_include rdmacm -mca btl_openib_receive_queues P,65536,256,192,128 -mca btl_openib_allow_ib 1 --mca btl_base_verbose 100 --mca pml_ob1_verbose 100 -mca osc ucx -x UCX_NET_DEVICES=mlx5_roce.45 --mca osc_ucx_verbose 100 --mca pml_ucx_verbose 100 -hostfile /root/hfile_one_core -np 2 /usr/lib64/openmpi/bin/mpitests-IMB-MPI1 PingPong


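Before rerunning the command in step 2, it may be worth verifying that the device and port names passed via the MCA options and UCX_NET_DEVICES actually exist on both hosts. A minimal sketch, assuming the libibverbs-utils and ucx packages are installed:

  # List the RDMA devices visible to libibverbs; names such as mlx5_0
  # must match what is passed to btl_openib_if_include.
  ibv_devices

  # Show port state and link layer (InfiniBand vs. Ethernet/RoCE) for
  # one device; substitute the device under test.
  ibv_devinfo -d mlx5_0

  # Dump the transport/device pairs UCX can open; the device named in
  # UCX_NET_DEVICES must be usable here for the ucx PML to initialize.
  ucx_info -d | grep -e Transport -e Device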

Actual results:

+ [23-06-19 16:00:49] timeout 3m /usr/lib64/openmpi/bin/mpirun --allow-run-as-root --map-by node -mca btl_openib_warn_nonexistent_if 0 -mca btl_openib_if_include mlx5_0:1 -mca btl '^openib' -mca pml ucx -mca btl_openib_cpc_include rdmacm -mca btl_openib_receive_queues P,65536,256,192,128 -mca btl_openib_allow_ib 1 --mca btl_base_verbose 100 --mca pml_ob1_verbose 100 -mca osc ucx -x UCX_NET_DEVICES=mlx5_roce.45 --mca osc_ucx_verbose 100 --mca pml_ucx_verbose 100 -hostfile /root/hfile_one_core -np 2 /usr/lib64/openmpi/bin/mpitests-IMB-MPI1 PingPong
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] mca: base: components_register: registering framework btl components
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] mca: base: components_register: found loaded component ofi
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] mca: base: components_register: component ofi register function successful
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] mca: base: components_register: found loaded component self
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] mca: base: components_register: component self register function successful
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] mca: base: components_register: found loaded component sm
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] mca: base: components_register: found loaded component tcp
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] mca: base: components_register: component tcp register function successful
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] mca: base: components_register: found loaded component usnic
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] mca: base: components_register: component usnic register function successful
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] mca: base: components_register: found loaded component vader
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] mca: base: components_register: component vader register function successful
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] mca: base: components_open: opening btl components
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] mca: base: components_open: found loaded component ofi
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] mca: base: components_open: component ofi open function successful
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] mca: base: components_open: found loaded component self
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] mca: base: components_open: component self open function successful
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] mca: base: components_open: found loaded component tcp
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] mca: base: components_open: component tcp open function successful
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] mca: base: components_open: found loaded component usnic
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] mca: base: components_open: component usnic open function successful
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] mca: base: components_open: found loaded component vader
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] mca: base: components_open: component vader open function successful
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] select: initializing btl component ofi
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] mca: base: components_register: registering framework btl components
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] mca: base: components_register: found loaded component ofi
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] mca: base: components_register: component ofi register function successful
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] mca: base: components_register: found loaded component self
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] mca: base: components_register: component self register function successful
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] mca: base: components_register: found loaded component sm
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] mca: base: components_register: found loaded component tcp
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] mca: base: components_register: component tcp register function successful
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] mca: base: components_register: found loaded component usnic
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] mca: base: components_register: component usnic register function successful
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] mca: base: components_register: found loaded component vader
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] mca: base: components_register: component vader register function successful
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] mca: base: components_open: opening btl components
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] mca: base: components_open: found loaded component ofi
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] mca: base: components_open: component ofi open function successful
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] mca: base: components_open: found loaded component self
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] mca: base: components_open: component self open function successful
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] mca: base: components_open: found loaded component tcp
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] mca: base: components_open: component tcp open function successful
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] mca: base: components_open: found loaded component usnic
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] mca: base: components_open: component usnic open function successful
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] mca: base: components_open: found loaded component vader
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] mca: base: components_open: component vader open function successful
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] select: initializing btl component ofi
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] select: init of component ofi returned success
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] select: initializing btl component self
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] select: init of component self returned success
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] select: initializing btl component tcp
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl: tcp: Searching for exclude address+prefix: 127.0.0.1 / 8
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl: tcp: Found match: 127.0.0.1 (lo)
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl: tcp: Using interface: sppp 
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: Attempting to bind to AF_INET port 1024
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: Successfully bound to AF_INET port 1024
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: my listening v4 socket is 0.0.0.0:1024
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: examining interface mlx5_roce
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: using ipv6 interface mlx5_roce
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: examining interface mlx5_ib0
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: using ipv6 interface mlx5_ib0
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: examining interface mlx5_ib1
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: using ipv6 interface mlx5_ib1
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: examining interface mlx5_ib0.8002
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: using ipv6 interface mlx5_ib0.8002
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: examining interface mlx5_ib0.8012
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: using ipv6 interface mlx5_ib0.8012
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: examining interface mlx5_ib0.8010
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: using ipv6 interface mlx5_ib0.8010
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: examining interface mlx5_ib0.8004
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: using ipv6 interface mlx5_ib0.8004
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: examining interface mlx5_ib1.8009
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: using ipv6 interface mlx5_ib1.8009
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: examining interface mlx5_ib1.8013
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: using ipv6 interface mlx5_ib1.8013
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: examining interface mlx5_ib1.8011
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: using ipv6 interface mlx5_ib1.8011
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: examining interface mlx5_ib1.8005
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: using ipv6 interface mlx5_ib1.8005
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: examining interface mlx5_ib1.8003
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: using ipv6 interface mlx5_ib1.8003
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: examining interface mlx5_ib1.8007
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: using ipv6 interface mlx5_ib1.8007
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: examining interface mlx5_roce.45
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: using ipv6 interface mlx5_roce.45
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: examining interface mlx5_roce.43
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: using ipv6 interface mlx5_roce.43
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: examining interface lab-bridge0
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:tcp: using ipv6 interface lab-bridge0
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] select: init of component tcp returned success
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] select: initializing btl component usnic
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] btl:usnic: disqualifiying myself due to fi_getinfo(3) failure: No data available (-61)
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] select: init of component usnic returned failure
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] mca: base: close: component usnic closed
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] mca: base: close: unloading component usnic
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] select: initializing btl component vader
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] select: init of component vader returned failure
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] mca: base: close: component vader closed
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] mca: base: close: unloading component vader
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] select: init of component ofi returned success
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] select: initializing btl component self
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] select: init of component self returned success
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] select: initializing btl component tcp
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl: tcp: Searching for exclude address+prefix: 127.0.0.1 / 8
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl: tcp: Found match: 127.0.0.1 (lo)
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl: tcp: Using interface: sppp 
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: Attempting to bind to AF_INET port 1024
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: Successfully bound to AF_INET port 1024
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: my listening v4 socket is 0.0.0.0:1024
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: examining interface mlx5_roce
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: using ipv6 interface mlx5_roce
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: examining interface mlx5_ib0
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: using ipv6 interface mlx5_ib0
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: examining interface mlx5_ib1
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: using ipv6 interface mlx5_ib1
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: examining interface mlx5_ib0.8002
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: using ipv6 interface mlx5_ib0.8002
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: examining interface mlx5_ib0.8012
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: using ipv6 interface mlx5_ib0.8012
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: examining interface mlx5_ib0.8010
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: using ipv6 interface mlx5_ib0.8010
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: examining interface mlx5_ib0.8004
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: using ipv6 interface mlx5_ib0.8004
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: examining interface mlx5_ib1.8009
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: using ipv6 interface mlx5_ib1.8009
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: examining interface mlx5_ib1.8013
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: using ipv6 interface mlx5_ib1.8013
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: examining interface mlx5_ib1.8011
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: using ipv6 interface mlx5_ib1.8011
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: examining interface mlx5_ib1.8005
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: using ipv6 interface mlx5_ib1.8005
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: examining interface mlx5_ib1.8003
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: using ipv6 interface mlx5_ib1.8003
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: examining interface mlx5_ib1.8007
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: using ipv6 interface mlx5_ib1.8007
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: examining interface mlx5_roce.45
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: using ipv6 interface mlx5_roce.45
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: examining interface mlx5_roce.43
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: using ipv6 interface mlx5_roce.43
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: examining interface lab-bridge0
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:tcp: using ipv6 interface lab-bridge0
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] select: init of component tcp returned success
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] select: initializing btl component usnic
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] btl:usnic: disqualifiying myself due to fi_getinfo(3) failure: No data available (-61)
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] select: init of component usnic returned failure
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] mca: base: close: component usnic closed
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] mca: base: close: unloading component usnic
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] select: initializing btl component vader
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] select: init of component vader returned failure
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] mca: base: close: component vader closed
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] mca: base: close: unloading component vader
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] common_ucx.c:174 using OPAL memory hooks as external events
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] pml_ucx.c:197 mca_pml_ucx_open: UCX version 1.14.1
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] common_ucx.c:174 using OPAL memory hooks as external events
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] pml_ucx.c:197 mca_pml_ucx_open: UCX version 1.14.1
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] common_ucx.c:332 self/memory: did not match transport list
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] common_ucx.c:332 tcp/mlx5_roce.45: did not match transport list
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] common_ucx.c:332 sysv/memory: did not match transport list
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] common_ucx.c:332 posix/memory: did not match transport list
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] common_ucx.c:337 support level is none
--------------------------------------------------------------------------
No components were able to be opened in the pml framework.

This typically means that either no components of this type were
installed, or none of the installed components can be loaded.
Sometimes this means that shared libraries required by these
components are unable to be found/loaded.

  Host:      rdma-dev-22
  Framework: pml
--------------------------------------------------------------------------
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65022] PML ucx cannot be selected
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] common_ucx.c:332 self/memory: did not match transport list
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] common_ucx.c:332 tcp/mlx5_roce.45: did not match transport list
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] common_ucx.c:332 sysv/memory: did not match transport list
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] common_ucx.c:332 posix/memory: did not match transport list
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] common_ucx.c:337 support level is none
[rdma-dev-21.rdma.lab.eng.rdu2.redhat.com:64965] PML ucx cannot be selected
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65017] 1 more process has sent help message help-mca-base.txt / find-available:none found
[rdma-dev-22.rdma.lab.eng.rdu2.redhat.com:65017] Set MCA parameter "orte_base_help_aggregate" to 0 to see all help / error messages
+ [23-06-19 16:00:53] mpi_return=1
+ [23-06-19 16:00:53] RQA_check_result -r 1 -t 'openmpi mpitests-IMB-MPI1 PingPong'
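The common_ucx.c:332 lines above are the heart of the failure: UCX offered only the self, tcp, sysv, and posix transports, none of which matched the transport list Open MPI requires for the ucx PML, so the support level is reported as "none" and PML ucx cannot be selected. A hedged way to inspect this outside of Open MPI, assuming the ucx_info utility shipped with UCX 1.14:

  # On a healthy mlx5 host this should list rc_verbs/rc_mlx5 (or dc_mlx5)
  # entries in addition to tcp and shared memory; their absence means UCX
  # itself could not open the RDMA device.
  ucx_info -d

  # Print the configuration UCX actually parsed from the environment,
  # including UCX_NET_DEVICES and UCX_TLS.
  ucx_info -c | grep -e NET_DEVICES -e TLS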



Expected results:

+ [23-06-19 16:46:22] timeout 3m /usr/lib64/openmpi/bin/mpirun --allow-run-as-root --map-by node -mca btl_openib_warn_nonexistent_if 0 -mca btl_openib_if_include mlx5_2:1 -mca mtl '^psm2,psm,ofi' -mca btl '^openib' -mca btl_openib_allow_ib 1 --mca mtl_openib_verbose 100 --mca btl_base_verbose 100 -mca pml ucx -mca osc ucx -x UCX_NET_DEVICES=mlx5_ib0 --mca osc_ucx_verbose 100 --mca pml_ucx_verbose 100 -hostfile /root/hfile_one_core -np 2 /usr/lib64/openmpi/bin/mpitests-IMB-MPI1 PingPong
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] mca: base: components_register: registering framework btl components
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] mca: base: components_register: found loaded component ofi
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] mca: base: components_register: component ofi register function successful
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] mca: base: components_register: found loaded component self
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] mca: base: components_register: component self register function successful
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] mca: base: components_register: found loaded component sm
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] mca: base: components_register: found loaded component tcp
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] mca: base: components_register: component tcp register function successful
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] mca: base: components_register: found loaded component usnic
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] mca: base: components_register: component usnic register function successful
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] mca: base: components_register: found loaded component vader
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] mca: base: components_register: component vader register function successful
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] mca: base: components_open: opening btl components
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] mca: base: components_open: found loaded component ofi
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] mca: base: components_open: component ofi open function successful
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] mca: base: components_open: found loaded component self
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] mca: base: components_open: component self open function successful
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] mca: base: components_open: found loaded component tcp
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] mca: base: components_open: component tcp open function successful
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] mca: base: components_open: found loaded component usnic
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] mca: base: components_open: component usnic open function successful
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] mca: base: components_open: found loaded component vader
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] mca: base: components_open: component vader open function successful
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] select: initializing btl component ofi
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] mca: base: components_register: registering framework btl components
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] mca: base: components_register: found loaded component ofi
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] mca: base: components_register: component ofi register function successful
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] mca: base: components_register: found loaded component self
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] mca: base: components_register: component self register function successful
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] mca: base: components_register: found loaded component sm
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] mca: base: components_register: found loaded component tcp
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] mca: base: components_register: component tcp register function successful
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] mca: base: components_register: found loaded component usnic
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] mca: base: components_register: component usnic register function successful
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] mca: base: components_register: found loaded component vader
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] mca: base: components_register: component vader register function successful
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] mca: base: components_open: opening btl components
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] mca: base: components_open: found loaded component ofi
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] mca: base: components_open: component ofi open function successful
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] mca: base: components_open: found loaded component self
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] mca: base: components_open: component self open function successful
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] mca: base: components_open: found loaded component tcp
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] mca: base: components_open: component tcp open function successful
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] mca: base: components_open: found loaded component usnic
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] mca: base: components_open: component usnic open function successful
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] mca: base: components_open: found loaded component vader
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] mca: base: components_open: component vader open function successful
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] select: initializing btl component ofi
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] select: init of component ofi returned success
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] select: initializing btl component self
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] select: init of component self returned success
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] select: initializing btl component tcp
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl: tcp: Searching for exclude address+prefix: 127.0.0.1 / 8
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl: tcp: Found match: 127.0.0.1 (lo)
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] select: init of component ofi returned success
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] select: initializing btl component self
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] select: init of component self returned success
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] select: initializing btl component tcp
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: Attempting to bind to AF_INET port 1024
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl: tcp: Searching for exclude address+prefix: 127.0.0.1 / 8
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl: tcp: Found match: 127.0.0.1 (lo)
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: Successfully bound to AF_INET port 1024
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: my listening v4 socket is 0.0.0.0:1024
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: examining interface mlx5_ib0
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: using ipv6 interface mlx5_ib0
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: examining interface mlx5_ib1
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: using ipv6 interface mlx5_ib1
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: examining interface mlx5_ib0.8002
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: using ipv6 interface mlx5_ib0.8002
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: examining interface mlx5_ib0.8012
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: using ipv6 interface mlx5_ib0.8012
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: examining interface mlx5_ib0.8010
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: using ipv6 interface mlx5_ib0.8010
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: examining interface mlx5_ib0.8004
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: using ipv6 interface mlx5_ib0.8004
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: examining interface mlx5_ib1.8009
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: using ipv6 interface mlx5_ib1.8009
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: examining interface mlx5_ib1.8013
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: using ipv6 interface mlx5_ib1.8013
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: examining interface mlx5_ib1.8011
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: using ipv6 interface mlx5_ib1.8011
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: examining interface mlx5_ib1.8005
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: using ipv6 interface mlx5_ib1.8005
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: examining interface mlx5_ib1.8003
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: using ipv6 interface mlx5_ib1.8003
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: examining interface mlx5_ib1.8007
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: using ipv6 interface mlx5_ib1.8007
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: examining interface mlx5_team_roce
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: using ipv6 interface mlx5_team_roce
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: examining interface mlx5_team_ro.43
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: using ipv6 interface mlx5_team_ro.43
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: examining interface mlx5_team_ro.45
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: using ipv6 interface mlx5_team_ro.45
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: examining interface lab-bridge0
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:tcp: using ipv6 interface lab-bridge0
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] select: init of component tcp returned success
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] select: initializing btl component usnic
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] btl:usnic: disqualifiying myself due to fi_getinfo(3) failure: No data available (-61)
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] select: init of component usnic returned failure
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] mca: base: close: component usnic closed
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] mca: base: close: unloading component usnic
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] select: initializing btl component vader
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] select: init of component vader returned failure
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] mca: base: close: component vader closed
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] mca: base: close: unloading component vader
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: Attempting to bind to AF_INET port 1024
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: Successfully bound to AF_INET port 1024
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: my listening v4 socket is 0.0.0.0:1024
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: examining interface mlx5_ib0
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: using ipv6 interface mlx5_ib0
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: examining interface mlx5_ib1
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: using ipv6 interface mlx5_ib1
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: examining interface mlx5_ib0.8002
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: using ipv6 interface mlx5_ib0.8002
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: examining interface mlx5_ib0.8012
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: using ipv6 interface mlx5_ib0.8012
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: examining interface mlx5_ib0.8010
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: using ipv6 interface mlx5_ib0.8010
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: examining interface mlx5_ib0.8004
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: using ipv6 interface mlx5_ib0.8004
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: examining interface mlx5_ib1.8009
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: using ipv6 interface mlx5_ib1.8009
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: examining interface mlx5_ib1.8013
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: using ipv6 interface mlx5_ib1.8013
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: examining interface mlx5_ib1.8011
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: using ipv6 interface mlx5_ib1.8011
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: examining interface mlx5_ib1.8005
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: using ipv6 interface mlx5_ib1.8005
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: examining interface mlx5_ib1.8003
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: using ipv6 interface mlx5_ib1.8003
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: examining interface mlx5_ib1.8007
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: using ipv6 interface mlx5_ib1.8007
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: examining interface lab-bridge0
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: using ipv6 interface lab-bridge0
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: examining interface mlx5_bond_roce
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: using ipv6 interface mlx5_bond_roce
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: examining interface mlx5_bond_ro.45
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: using ipv6 interface mlx5_bond_ro.45
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: examining interface mlx5_bond_ro.43
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:tcp: using ipv6 interface mlx5_bond_ro.43
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] select: init of component tcp returned success
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] select: initializing btl component usnic
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] btl:usnic: disqualifiying myself due to fi_getinfo(3) failure: No data available (-61)
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] select: init of component usnic returned failure
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] mca: base: close: component usnic closed
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] mca: base: close: unloading component usnic
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] select: initializing btl component vader
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] select: init of component vader returned failure
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] mca: base: close: component vader closed
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] mca: base: close: unloading component vader
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] pml_ucx.c:197 mca_pml_ucx_open: UCX version 1.14.1
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] pml_ucx.c:197 mca_pml_ucx_open: UCX version 1.14.1
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] pml_ucx.c:289 mca_pml_ucx_init
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] pml_ucx.c:289 mca_pml_ucx_init
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] pml_ucx.c:114 Pack remote worker address, size 38
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] pml_ucx.c:114 Pack local worker address, size 141
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] pml_ucx.c:351 created ucp context 0x55d14dba03a0, worker 0x55d14dae4000
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] pml_ucx.c:114 Pack remote worker address, size 38
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] pml_ucx.c:114 Pack local worker address, size 141
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] pml_ucx.c:351 created ucp context 0x561ff184e080, worker 0x561ff19e3150
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] pml_ucx.c:182 Got proc 0 address, size 141
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] pml_ucx.c:411 connecting to proc. 0
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] pml_ucx.c:182 Got proc 1 address, size 141
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] pml_ucx.c:411 connecting to proc. 1
#----------------------------------------------------------------
#    Intel(R) MPI Benchmarks 2021.3, MPI-1 part
#----------------------------------------------------------------
# Date                  : Mon Jun 19 16:46:23 2023
# Machine               : x86_64
# System                : Linux
# Release               : 5.14.0-327.el9.x86_64
# Version               : #1 SMP PREEMPT_DYNAMIC Tue Jun 13 19:49:22 EDT 2023
# MPI Version           : 3.1
# MPI Thread Environment: 


# Calling sequence was: 

# /usr/lib64/openmpi/bin/mpitests-IMB-MPI1 PingPong 

# Minimum message length in bytes:   0
# Maximum message length in bytes:   4194304
#
# MPI_Datatype                   :   MPI_BYTE 
# MPI_Datatype for reductions    :   MPI_FLOAT 
# MPI_Op                         :   MPI_SUM  
# 
# 

# List of Benchmarks to run:

# PingPong
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] pml_ucx.c:182 Got proc 0 address, size 38
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] pml_ucx.c:411 connecting to proc. 0
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] pml_ucx.c:182 Got proc 1 address, size 38
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] pml_ucx.c:411 connecting to proc. 1

#---------------------------------------------------
# Benchmarking PingPong 
# #processes = 2 
#---------------------------------------------------
       #bytes #repetitions      t[usec]   Mbytes/sec
            0         1000        10.87         0.00
            1         1000        10.81         0.09
            2         1000        10.78         0.19
            4         1000        10.76         0.37
            8         1000        10.76         0.74
           16         1000        10.74         1.49
           32         1000        10.79         2.97
           64         1000        10.84         5.90
          128         1000        10.93        11.71
          256         1000        11.28        22.69
          512         1000        11.65        43.96
         1024         1000        12.17        84.11
         2048         1000        23.09        88.71
         4096         1000        26.56       154.22
         8192         1000        34.64       236.49
        16384         1000        42.34       387.00
        32768         1000        81.86       400.27
        65536          640        69.22       946.73
       131072          320       165.42       792.35
       262144          160       205.18      1277.61
       524288           80       315.45      1662.05
      1048576           40       435.90      2405.54
      2097152           20       692.27      3029.38
      4194304           10      1279.65      3277.68


# All processes entering MPI_Finalize

[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] common_ucx.c:240 disconnecting from rank 0
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] common_ucx.c:204 waiting for 1 disconnect requests
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] common_ucx.c:240 disconnecting from rank 0
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] common_ucx.c:240 disconnecting from rank 1
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] common_ucx.c:204 waiting for 1 disconnect requests
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] common_ucx.c:204 waiting for 0 disconnect requests
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] common_ucx.c:240 disconnecting from rank 1
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] common_ucx.c:204 waiting for 0 disconnect requests
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] pml_ucx.c:367 mca_pml_ucx_cleanup
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] pml_ucx.c:367 mca_pml_ucx_cleanup
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] pml_ucx.c:268 mca_pml_ucx_close
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] pml_ucx.c:268 mca_pml_ucx_close
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] mca: base: close: component ofi closed
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] mca: base: close: unloading component ofi
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] mca: base: close: component ofi closed
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] mca: base: close: unloading component ofi
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] mca: base: close: component self closed
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] mca: base: close: unloading component self
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] mca: base: close: component tcp closed
[rdma-dev-19.rdma.lab.eng.rdu2.redhat.com:67038] mca: base: close: unloading component tcp
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] mca: base: close: component self closed
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] mca: base: close: unloading component self
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] mca: base: close: component tcp closed
[rdma-dev-20.rdma.lab.eng.rdu2.redhat.com:66982] mca: base: close: unloading component tcp
+ [23-06-19 16:46:27] mpi_return=0
+ [23-06-19 16:46:27] RQA_check_result -r 0 -t 'openmpi mpitests-IMB-MPI1 PingPong'


Additional info:

The last known-good openmpi package version was:

Installed:
  mpitests-openmpi-5.8-1.el9.x86_64         openmpi-1:4.1.1-5.el9.x86_64       
  openmpi-devel-1:4.1.1-5.el9.x86_64       

The last build that shipped the above openmpi version was RHEL-9.3.0-20230615.41.
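To confirm the regression follows the openmpi package, one hedged bisection step is to downgrade it on the failing compose and rerun step 2 of the reproducer, assuming the older NVR is still reachable from the configured repositories:

  # Drop back to the last known-good build and retest.
  dnf downgrade -y openmpi-4.1.1-5.el9 openmpi-devel-4.1.1-5.el9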

Refer to the following test logs:

RHEL-9.3.0-20230615.41 (bz 2212516 ON_QA ver) - MLX5 IB sanity (rdma-dev-19/rdma-dev-20)
https://beaker.engineering.redhat.com/jobs/7990681

RHEL-9.3.0-20230615.41 (bz 2212516 ON_QA ver) - MLX5 ROCE sanity (rdma-dev-21/rdma-dev-22)
https://beaker.engineering.redhat.com/jobs/7990683



RHEL-9.3.0-20230615.41 (bz 2212516 ON_QA ver) - MLX5 IB sanity (rdma-virt-02/rdma-virt-03)
https://beaker.engineering.redhat.com/jobs/7990518

Comment 1 Brian Chae 2023-06-29 12:08:46 UTC
This regression affects all RDMA HCAs.

For BNXT ROCE


RHEL-9.3.0-20230619.35 - BNXT ROCE sanity (rdma-qe-24/rdma-qe-25)

sanity test results on rdma-qe-24/rdma-qe-25 & Beaker job J:8020541:
5.14.0-327.el9.x86_64, rdma-core-46.0-1.el9, bnxt_en, roce.45, BCM57414 & bnxt_re3
    Result | Status | Test
  ---------+--------+------------------------------------
      FAIL |      1 | ibstatus reported expected HCA rate
      PASS |      0 | /usr/sbin/ibstat
      PASS |      0 | /usr/sbin/ibstatus
      PASS |      0 | systemctl start srp_daemon.service
      SKIP |    777 | ibsrpdm
      PASS |      0 | systemctl stop srp_daemon
      PASS |      0 | ping self - 172.31.45.25
      PASS |      0 | ping6 self - fe80::20a:f7ff:fec5:a3a1%bnxt_roce.45
      FAIL |    127 | /usr/share/pmix/test/pmix_test
      PASS |      0 | ping server - 172.31.45.24
      PASS |      0 | ping6 server - fe80::20a:f7ff:fec5:b501%bnxt_roce.45
      FAIL |      1 | openmpi mpitests-IMB-MPI1 PingPong
      FAIL |      1 | openmpi mpitests-IMB-IO S_Read_indv
      FAIL |      1 | openmpi mpitests-IMB-EXT Window
      FAIL |      1 | openmpi mpitests-osu_get_bw
      PASS |      0 | ip multicast addr
      PASS |      0 | rping
      PASS |      0 | rcopy
      PASS |      0 | ib_read_bw
      PASS |      0 | ib_send_bw
      PASS |      0 | ib_write_bw
      PASS |      0 | iser login
      PASS |      0 | mount /dev/sdb /iser
      PASS |      0 | iser write 1K
      PASS |      0 | iser write 1M
      PASS |      0 | iser write 1G
      PASS |      0 | nfsordma mount - XFS_EXT
      PASS |      0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
      PASS |      0 | nfsordma umount - XFS_EXT
      PASS |      0 | nfsordma mount - RAMDISK
      PASS |      0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
      PASS |      0 | nfsordma umount - RAMDISK
Checking for failures and known issues:

Comment 2 Brian Chae 2023-07-24 21:08:12 UTC
Initial tests on MLX5 IB0 and RoCE devices looked good.

sanity test results on rdma-perf-00/rdma-perf-01 & Beaker job J:8100123:
5.14.0-341.el9.x86_64, rdma-core-46.0-1.el9, mlx4, ib0, ConnectX-3 & mlx4_0
    Result | Status | Test
  ---------+--------+------------------------------------
      PASS |      0 | load module mlx4_ib
      PASS |      0 | load module mlx4_en
      PASS |      0 | load module mlx4_core
      PASS |      0 | enable opensm
      PASS |      0 | restart opensm
      PASS |      0 | osmtest -f c -g 0xf4521403007be1b1
      PASS |      0 | stop opensm
      PASS |      0 | disable opensm
      FAIL |      1 | ibstatus reported expected HCA rate
      PASS |      0 | pkey mlx4_ib0.8080 create/delete
      PASS |      0 | /usr/sbin/ibstat
      PASS |      0 | /usr/sbin/ibstatus
      PASS |      0 | systemctl start srp_daemon.service
      PASS |      0 | /usr/sbin/ibsrpdm -vc
      PASS |      0 | systemctl stop srp_daemon
      PASS |      0 | ping self - 172.31.0.181
      PASS |      0 | ping6 self - fe80::f652:1403:7b:e1b1%mlx4_ib0
      FAIL |    127 | /usr/share/pmix/test/pmix_test
      PASS |      0 | ping server - 172.31.0.180
      PASS |      0 | ping6 server - fe80::202:c903:31:7791%mlx4_ib0
      PASS |      0 | openmpi mpitests-IMB-MPI1 PingPong
      PASS |      0 | openmpi mpitests-IMB-IO S_Read_indv
      PASS |      0 | openmpi mpitests-IMB-EXT Window
      PASS |      0 | openmpi mpitests-osu_get_bw
      PASS |      0 | ip multicast addr
      PASS |      0 | rping
      PASS |      0 | rcopy
      PASS |      0 | ib_read_bw
      PASS |      0 | ib_send_bw
      PASS |      0 | ib_write_bw
      PASS |      0 | iser login
      PASS |      0 | mount /dev/sdb /iser
      PASS |      0 | iser write 1K
      PASS |      0 | iser write 1M
      PASS |      0 | iser write 1G
      PASS |      0 | nfsordma mount - XFS_EXT
      PASS |      0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
      PASS |      0 | nfsordma umount - XFS_EXT
      PASS |      0 | nfsordma mount - RAMDISK
      PASS |      0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
      PASS |      0 | nfsordma umount - RAMDISK
Checking for failures and known issues:
  ibstatus reported expected HCA rate is a known issue on RHEL-9.3 over ConnectX-3 mlx4/ib0 - see Known config issue
  /usr/share/pmix/test/pmix_test is a known issue on RHEL-9.3 over ConnectX-3 mlx4/ib0 - see bz2176561
  no new test failures

sanity test results on rdma-perf-00/rdma-perf-01 & Beaker job J:8100123:
5.14.0-341.el9.x86_64, rdma-core-46.0-1.el9, mlx4, roce.45, ConnectX-3 & mlx4_0
    Result | Status | Test
  ---------+--------+------------------------------------
      PASS |      0 | load module mlx4_ib
      PASS |      0 | load module mlx4_en
      PASS |      0 | load module mlx4_core
      FAIL |      1 | ibstatus reported expected HCA rate
      PASS |      0 | /usr/sbin/ibstat
      PASS |      0 | /usr/sbin/ibstatus
      PASS |      0 | systemctl start srp_daemon.service
      SKIP |    777 | ibsrpdm
      PASS |      0 | systemctl stop srp_daemon
      PASS |      0 | ping self - 172.31.45.181
      PASS |      0 | ping6 self - fe80::f652:14ff:fe7b:e1b2%mlx4_roce.45
      FAIL |    127 | /usr/share/pmix/test/pmix_test
      PASS |      0 | ping server - 172.31.45.180
      PASS |      0 | ping6 server - fe80::202:c9ff:fe31:7791%mlx4_roce.45
      PASS |      0 | openmpi mpitests-IMB-MPI1 PingPong
      PASS |      0 | openmpi mpitests-IMB-IO S_Read_indv
      PASS |      0 | openmpi mpitests-IMB-EXT Window
      PASS |      0 | openmpi mpitests-osu_get_bw
      PASS |      0 | ip multicast addr
      PASS |      0 | rping
      PASS |      0 | rcopy
      PASS |      0 | ib_read_bw
      PASS |      0 | ib_send_bw
      PASS |      0 | ib_write_bw
      PASS |      0 | iser login
      PASS |      0 | mount /dev/sdb /iser
      PASS |      0 | iser write 1K
      PASS |      0 | iser write 1M
      PASS |      0 | iser write 1G
      PASS |      0 | nfsordma mount - XFS_EXT
      PASS |      0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
      PASS |      0 | nfsordma umount - XFS_EXT
      PASS |      0 | nfsordma mount - RAMDISK
      PASS |      0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
      PASS |      0 | nfsordma umount - RAMDISK
Checking for failures and known issues:
  ibstatus reported expected HCA rate is a known issue on RHEL-9.3 over ConnectX-3 mlx4/roce.45 - see Known config issue
  /usr/share/pmix/test/pmix_test is a known issue on RHEL-9.3 over ConnectX-3 mlx4/roce.45 - see bz2176561
  no new test failures

mpi-openmpi test results on rdma-perf-00/rdma-perf-01 & Beaker job J:8100123:
5.14.0-341.el9.x86_64, rdma-core-46.0-1.el9, mlx4, ib0, ConnectX-3 & mlx4_0
    Result | Status | Test
  ---------+--------+------------------------------------
      PASS |      0 | openmpi IMB-MPI1 PingPong mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 PingPing mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Sendrecv mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Exchange mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Bcast mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Allgather mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Allgatherv mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Gather mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Gatherv mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Scatter mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Scatterv mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Alltoall mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Alltoallv mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Reduce mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Reduce_scatter mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Allreduce mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Barrier mpirun one_core
      PASS |      0 | openmpi IMB-IO S_Write_indv mpirun one_core
      PASS |      0 | openmpi IMB-IO S_Read_indv mpirun one_core
      PASS |      0 | openmpi IMB-IO S_Write_expl mpirun one_core
      PASS |      0 | openmpi IMB-IO S_Read_expl mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Write_indv mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Read_indv mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Write_expl mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Read_expl mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Write_shared mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Read_shared mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Write_priv mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Read_priv mpirun one_core
      PASS |      0 | openmpi IMB-IO C_Write_indv mpirun one_core
      PASS |      0 | openmpi IMB-IO C_Read_indv mpirun one_core
      PASS |      0 | openmpi IMB-IO C_Write_expl mpirun one_core
      PASS |      0 | openmpi IMB-IO C_Read_expl mpirun one_core
      PASS |      0 | openmpi IMB-IO C_Write_shared mpirun one_core
      PASS |      0 | openmpi IMB-IO C_Read_shared mpirun one_core
      PASS |      0 | openmpi IMB-EXT Window mpirun one_core
      PASS |      0 | openmpi IMB-EXT Unidir_Put mpirun one_core
      PASS |      0 | openmpi IMB-EXT Unidir_Get mpirun one_core
      PASS |      0 | openmpi IMB-EXT Bidir_Get mpirun one_core
      PASS |      0 | openmpi IMB-EXT Bidir_Put mpirun one_core
      PASS |      0 | openmpi IMB-EXT Accumulate mpirun one_core
      PASS |      0 | openmpi IMB-NBC Ibcast mpirun one_core
      PASS |      0 | openmpi IMB-NBC Iallgather mpirun one_core
      PASS |      0 | openmpi IMB-NBC Iallgatherv mpirun one_core
      PASS |      0 | openmpi IMB-NBC Igather mpirun one_core
      PASS |      0 | openmpi IMB-NBC Igatherv mpirun one_core
      PASS |      0 | openmpi IMB-NBC Iscatter mpirun one_core
      PASS |      0 | openmpi IMB-NBC Iscatterv mpirun one_core
      PASS |      0 | openmpi IMB-NBC Ialltoall mpirun one_core
      PASS |      0 | openmpi IMB-NBC Ialltoallv mpirun one_core
      PASS |      0 | openmpi IMB-NBC Ireduce mpirun one_core
      PASS |      0 | openmpi IMB-NBC Ireduce_scatter mpirun one_core
      PASS |      0 | openmpi IMB-NBC Iallreduce mpirun one_core
      PASS |      0 | openmpi IMB-NBC Ibarrier mpirun one_core
      PASS |      0 | openmpi IMB-RMA Unidir_put mpirun one_core
      PASS |      0 | openmpi IMB-RMA Unidir_get mpirun one_core
      PASS |      0 | openmpi IMB-RMA Bidir_put mpirun one_core
      PASS |      0 | openmpi IMB-RMA Bidir_get mpirun one_core
      PASS |      0 | openmpi IMB-RMA One_put_all mpirun one_core
      PASS |      0 | openmpi IMB-RMA One_get_all mpirun one_core
      PASS |      0 | openmpi IMB-RMA All_put_all mpirun one_core
      PASS |      0 | openmpi IMB-RMA All_get_all mpirun one_core
      PASS |      0 | openmpi IMB-RMA Put_local mpirun one_core
      PASS |      0 | openmpi IMB-RMA Put_all_local mpirun one_core
      PASS |      0 | openmpi IMB-RMA Exchange_put mpirun one_core
      PASS |      0 | openmpi IMB-RMA Exchange_get mpirun one_core
      PASS |      0 | openmpi IMB-RMA Accumulate mpirun one_core
      PASS |      0 | openmpi IMB-RMA Get_accumulate mpirun one_core
      PASS |      0 | openmpi IMB-RMA Fetch_and_op mpirun one_core
      PASS |      0 | openmpi IMB-RMA Compare_and_swap mpirun one_core
      PASS |      0 | openmpi IMB-RMA Get_local mpirun one_core
      PASS |      0 | openmpi IMB-RMA Get_all_local mpirun one_core
      PASS |      0 | openmpi OSU acc_latency mpirun one_core
      PASS |      0 | openmpi OSU allgather mpirun one_core
      PASS |      0 | openmpi OSU allgatherv mpirun one_core
      PASS |      0 | openmpi OSU allreduce mpirun one_core
      PASS |      0 | openmpi OSU alltoall mpirun one_core
      PASS |      0 | openmpi OSU alltoallv mpirun one_core
      PASS |      0 | openmpi OSU barrier mpirun one_core
      PASS |      0 | openmpi OSU bcast mpirun one_core
      PASS |      0 | openmpi OSU bibw mpirun one_core
      PASS |      0 | openmpi OSU bw mpirun one_core
      PASS |      0 | openmpi OSU cas_latency mpirun one_core
      PASS |      0 | openmpi OSU fop_latency mpirun one_core
      PASS |      0 | openmpi OSU gather mpirun one_core
      PASS |      0 | openmpi OSU gatherv mpirun one_core
      PASS |      0 | openmpi OSU get_acc_latency mpirun one_core
      PASS |      0 | openmpi OSU get_bw mpirun one_core
      PASS |      0 | openmpi OSU get_latency mpirun one_core
      PASS |      0 | openmpi OSU hello mpirun one_core
      PASS |      0 | openmpi OSU iallgather mpirun one_core
      PASS |      0 | openmpi OSU iallgatherv mpirun one_core
      PASS |      0 | openmpi OSU iallreduce mpirun one_core
      PASS |      0 | openmpi OSU ialltoall mpirun one_core
      PASS |      0 | openmpi OSU ialltoallv mpirun one_core
      PASS |      0 | openmpi OSU ialltoallw mpirun one_core
      PASS |      0 | openmpi OSU ibarrier mpirun one_core
      PASS |      0 | openmpi OSU ibcast mpirun one_core
      PASS |      0 | openmpi OSU igather mpirun one_core
      PASS |      0 | openmpi OSU igatherv mpirun one_core
      PASS |      0 | openmpi OSU init mpirun one_core
      PASS |      0 | openmpi OSU ireduce mpirun one_core
      PASS |      0 | openmpi OSU iscatter mpirun one_core
      PASS |      0 | openmpi OSU iscatterv mpirun one_core
      PASS |      0 | openmpi OSU latency mpirun one_core
      PASS |      0 | openmpi OSU latency_mp mpirun one_core
      PASS |      0 | openmpi OSU mbw_mr mpirun one_core
      PASS |      0 | openmpi OSU multi_lat mpirun one_core
      PASS |      0 | openmpi OSU put_bibw mpirun one_core
      PASS |      0 | openmpi OSU put_bw mpirun one_core
      PASS |      0 | openmpi OSU put_latency mpirun one_core
      PASS |      0 | openmpi OSU reduce mpirun one_core
      PASS |      0 | openmpi OSU reduce_scatter mpirun one_core
      PASS |      0 | openmpi OSU scatter mpirun one_core
      PASS |      0 | openmpi OSU scatterv mpirun one_core
      PASS |      0 | NON-ROOT IMB-MPI1 PingPong
Checking for failures and known issues:
  no test failures

mpi-openmpi test results on rdma-perf-00/rdma-perf-01 & Beaker job J:8100123:
5.14.0-341.el9.x86_64, rdma-core-46.0-1.el9, mlx4, roce.45, ConnectX-3 & mlx4_0
    Result | Status | Test
  ---------+--------+------------------------------------
      PASS |      0 | openmpi IMB-MPI1 PingPong mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 PingPing mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Sendrecv mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Exchange mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Bcast mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Allgather mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Allgatherv mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Gather mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Gatherv mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Scatter mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Scatterv mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Alltoall mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Alltoallv mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Reduce mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Reduce_scatter mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Allreduce mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Barrier mpirun one_core
      PASS |      0 | openmpi IMB-IO S_Write_indv mpirun one_core
      PASS |      0 | openmpi IMB-IO S_Read_indv mpirun one_core
      PASS |      0 | openmpi IMB-IO S_Write_expl mpirun one_core
      PASS |      0 | openmpi IMB-IO S_Read_expl mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Write_indv mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Read_indv mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Write_expl mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Read_expl mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Write_shared mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Read_shared mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Write_priv mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Read_priv mpirun one_core
      PASS |      0 | openmpi IMB-IO C_Write_indv mpirun one_core
      PASS |      0 | openmpi IMB-IO C_Read_indv mpirun one_core
      PASS |      0 | openmpi IMB-IO C_Write_expl mpirun one_core
      PASS |      0 | openmpi IMB-IO C_Read_expl mpirun one_core
      PASS |      0 | openmpi IMB-IO C_Write_shared mpirun one_core
      PASS |      0 | openmpi IMB-IO C_Read_shared mpirun one_core
      PASS |      0 | openmpi IMB-EXT Window mpirun one_core
      PASS |      0 | openmpi IMB-EXT Unidir_Put mpirun one_core
      PASS |      0 | openmpi IMB-EXT Unidir_Get mpirun one_core
      PASS |      0 | openmpi IMB-EXT Bidir_Get mpirun one_core
      PASS |      0 | openmpi IMB-EXT Bidir_Put mpirun one_core
      PASS |      0 | openmpi IMB-EXT Accumulate mpirun one_core
      PASS |      0 | openmpi IMB-NBC Ibcast mpirun one_core
      PASS |      0 | openmpi IMB-NBC Iallgather mpirun one_core
      PASS |      0 | openmpi IMB-NBC Iallgatherv mpirun one_core
      PASS |      0 | openmpi IMB-NBC Igather mpirun one_core
      PASS |      0 | openmpi IMB-NBC Igatherv mpirun one_core
      PASS |      0 | openmpi IMB-NBC Iscatter mpirun one_core
      PASS |      0 | openmpi IMB-NBC Iscatterv mpirun one_core
      PASS |      0 | openmpi IMB-NBC Ialltoall mpirun one_core
      PASS |      0 | openmpi IMB-NBC Ialltoallv mpirun one_core
      PASS |      0 | openmpi IMB-NBC Ireduce mpirun one_core
      PASS |      0 | openmpi IMB-NBC Ireduce_scatter mpirun one_core
      PASS |      0 | openmpi IMB-NBC Iallreduce mpirun one_core
      PASS |      0 | openmpi IMB-NBC Ibarrier mpirun one_core
      PASS |      0 | openmpi IMB-RMA Unidir_put mpirun one_core
      PASS |      0 | openmpi IMB-RMA Unidir_get mpirun one_core
      PASS |      0 | openmpi IMB-RMA Bidir_put mpirun one_core
      PASS |      0 | openmpi IMB-RMA Bidir_get mpirun one_core
      PASS |      0 | openmpi IMB-RMA One_put_all mpirun one_core
      PASS |      0 | openmpi IMB-RMA One_get_all mpirun one_core
      PASS |      0 | openmpi IMB-RMA All_put_all mpirun one_core
      PASS |      0 | openmpi IMB-RMA All_get_all mpirun one_core
      PASS |      0 | openmpi IMB-RMA Put_local mpirun one_core
      PASS |      0 | openmpi IMB-RMA Put_all_local mpirun one_core
      PASS |      0 | openmpi IMB-RMA Exchange_put mpirun one_core
      PASS |      0 | openmpi IMB-RMA Exchange_get mpirun one_core
      PASS |      0 | openmpi IMB-RMA Accumulate mpirun one_core
      PASS |      0 | openmpi IMB-RMA Get_accumulate mpirun one_core
      PASS |      0 | openmpi IMB-RMA Fetch_and_op mpirun one_core
      PASS |      0 | openmpi IMB-RMA Compare_and_swap mpirun one_core
      PASS |      0 | openmpi IMB-RMA Get_local mpirun one_core
      PASS |      0 | openmpi IMB-RMA Get_all_local mpirun one_core
      PASS |      0 | openmpi OSU acc_latency mpirun one_core
      PASS |      0 | openmpi OSU allgather mpirun one_core
      PASS |      0 | openmpi OSU allgatherv mpirun one_core
      PASS |      0 | openmpi OSU allreduce mpirun one_core
      PASS |      0 | openmpi OSU alltoall mpirun one_core
      PASS |      0 | openmpi OSU alltoallv mpirun one_core
      PASS |      0 | openmpi OSU barrier mpirun one_core
      PASS |      0 | openmpi OSU bcast mpirun one_core
      PASS |      0 | openmpi OSU bibw mpirun one_core
      PASS |      0 | openmpi OSU bw mpirun one_core
      PASS |      0 | openmpi OSU cas_latency mpirun one_core
      PASS |      0 | openmpi OSU fop_latency mpirun one_core
      PASS |      0 | openmpi OSU gather mpirun one_core
      PASS |      0 | openmpi OSU gatherv mpirun one_core
      PASS |      0 | openmpi OSU get_acc_latency mpirun one_core
      PASS |      0 | openmpi OSU get_bw mpirun one_core
      PASS |      0 | openmpi OSU get_latency mpirun one_core
      PASS |      0 | openmpi OSU hello mpirun one_core
      PASS |      0 | openmpi OSU iallgather mpirun one_core
      PASS |      0 | openmpi OSU iallgatherv mpirun one_core
      PASS |      0 | openmpi OSU iallreduce mpirun one_core
      PASS |      0 | openmpi OSU ialltoall mpirun one_core
      PASS |      0 | openmpi OSU ialltoallv mpirun one_core
      PASS |      0 | openmpi OSU ialltoallw mpirun one_core
      PASS |      0 | openmpi OSU ibarrier mpirun one_core
      PASS |      0 | openmpi OSU ibcast mpirun one_core
      PASS |      0 | openmpi OSU igather mpirun one_core
      PASS |      0 | openmpi OSU igatherv mpirun one_core
      PASS |      0 | openmpi OSU init mpirun one_core
      PASS |      0 | openmpi OSU ireduce mpirun one_core
      PASS |      0 | openmpi OSU iscatter mpirun one_core
      PASS |      0 | openmpi OSU iscatterv mpirun one_core
      PASS |      0 | openmpi OSU latency mpirun one_core
      PASS |      0 | openmpi OSU latency_mp mpirun one_core
      PASS |      0 | openmpi OSU mbw_mr mpirun one_core
      PASS |      0 | openmpi OSU multi_lat mpirun one_core
      PASS |      0 | openmpi OSU put_bibw mpirun one_core
      PASS |      0 | openmpi OSU put_bw mpirun one_core
      PASS |      0 | openmpi OSU put_latency mpirun one_core
      PASS |      0 | openmpi OSU reduce mpirun one_core
      PASS |      0 | openmpi OSU reduce_scatter mpirun one_core
      PASS |      0 | openmpi OSU scatter mpirun one_core
      PASS |      0 | openmpi OSU scatterv mpirun one_core
      PASS |      0 | NON-ROOT IMB-MPI1 PingPong
Checking for failures and known issues:
  no test failures

Tests on other HCAs will be updated soon...
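
For reference, the "mpirun one_core" variants above run each benchmark with two ranks, one per host, each pinned to a single core. A rough sketch of how one such entry (openmpi IMB-MPI1 PingPong) could be reproduced by hand; the hostnames, module name, and binding flags below are illustrative assumptions, not the harness's exact command line:

  # load the Open MPI environment shipped with the openmpi package
  module load mpi/openmpi-x86_64
  # two ranks, one per node, each bound to a single core
  mpirun -np 2 --host rdma-perf-00,rdma-perf-01 --map-by node --bind-to core \
      mpitests-IMB-MPI1 PingPong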

Comment 3 Brian Chae 2023-07-24 21:09:26 UTC
(In reply to Brian Chae from comment #2)
> Initial tests on MLX5 IB0, RoCE devices looked good.
> 
> sanity test results on rdma-perf-00/rdma-perf-01 & Beaker job J:8100123:
> 5.14.0-341.el9.x86_64, rdma-core-46.0-1.el9, mlx4, ib0, ConnectX-3 & mlx4_0
>     Result | Status | Test
>   ---------+--------+------------------------------------
>       PASS |      0 | load module mlx4_ib
>       PASS |      0 | load module mlx4_en
>       PASS |      0 | load module mlx4_core
>       PASS |      0 | enable opensm
>       PASS |      0 | restart opensm
>       PASS |      0 | osmtest -f c -g 0xf4521403007be1b1
>       PASS |      0 | stop opensm
>       PASS |      0 | disable opensm
>       FAIL |      1 | ibstatus reported expected HCA rate
>       PASS |      0 | pkey mlx4_ib0.8080 create/delete
>       PASS |      0 | /usr/sbin/ibstat
>       PASS |      0 | /usr/sbin/ibstatus
>       PASS |      0 | systemctl start srp_daemon.service
>       PASS |      0 | /usr/sbin/ibsrpdm -vc
>       PASS |      0 | systemctl stop srp_daemon
>       PASS |      0 | ping self - 172.31.0.181
>       PASS |      0 | ping6 self - fe80::f652:1403:7b:e1b1%mlx4_ib0
>       FAIL |    127 | /usr/share/pmix/test/pmix_test
>       PASS |      0 | ping server - 172.31.0.180
>       PASS |      0 | ping6 server - fe80::202:c903:31:7791%mlx4_ib0
>       PASS |      0 | openmpi mpitests-IMB-MPI1 PingPong
>       PASS |      0 | openmpi mpitests-IMB-IO S_Read_indv
>       PASS |      0 | openmpi mpitests-IMB-EXT Window
>       PASS |      0 | openmpi mpitests-osu_get_bw
>       PASS |      0 | ip multicast addr
>       PASS |      0 | rping
>       PASS |      0 | rcopy
>       PASS |      0 | ib_read_bw
>       PASS |      0 | ib_send_bw
>       PASS |      0 | ib_write_bw
>       PASS |      0 | iser login
>       PASS |      0 | mount /dev/sdb /iser
>       PASS |      0 | iser write 1K
>       PASS |      0 | iser write 1M
>       PASS |      0 | iser write 1G
>       PASS |      0 | nfsordma mount - XFS_EXT
>       PASS |      0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
>       PASS |      0 | nfsordma umount - XFS_EXT
>       PASS |      0 | nfsordma mount - RAMDISK
>       PASS |      0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
>       PASS |      0 | nfsordma umount - RAMDISK
> Checking for failures and known issues:
>   ibstatus reported expected HCA rate is a known issue on RHEL-9.3 over
> ConnectX-3 mlx4/ib0 - see Known config issue
>   /usr/share/pmix/test/pmix_test is a known issue on RHEL-9.3 over
> ConnectX-3 mlx4/ib0 - see bz2176561
>   no new test failures
> 
> sanity test results on rdma-perf-00/rdma-perf-01 & Beaker job J:8100123:
> 5.14.0-341.el9.x86_64, rdma-core-46.0-1.el9, mlx4, roce.45, ConnectX-3 &
> mlx4_0
>     Result | Status | Test
>   ---------+--------+------------------------------------
>       PASS |      0 | load module mlx4_ib
>       PASS |      0 | load module mlx4_en
>       PASS |      0 | load module mlx4_core
>       FAIL |      1 | ibstatus reported expected HCA rate
>       PASS |      0 | /usr/sbin/ibstat
>       PASS |      0 | /usr/sbin/ibstatus
>       PASS |      0 | systemctl start srp_daemon.service
>       SKIP |    777 | ibsrpdm
>       PASS |      0 | systemctl stop srp_daemon
>       PASS |      0 | ping self - 172.31.45.181
>       PASS |      0 | ping6 self - fe80::f652:14ff:fe7b:e1b2%mlx4_roce.45
>       FAIL |    127 | /usr/share/pmix/test/pmix_test
>       PASS |      0 | ping server - 172.31.45.180
>       PASS |      0 | ping6 server - fe80::202:c9ff:fe31:7791%mlx4_roce.45
>       PASS |      0 | openmpi mpitests-IMB-MPI1 PingPong
>       PASS |      0 | openmpi mpitests-IMB-IO S_Read_indv
>       PASS |      0 | openmpi mpitests-IMB-EXT Window
>       PASS |      0 | openmpi mpitests-osu_get_bw
>       PASS |      0 | ip multicast addr
>       PASS |      0 | rping
>       PASS |      0 | rcopy
>       PASS |      0 | ib_read_bw
>       PASS |      0 | ib_send_bw
>       PASS |      0 | ib_write_bw
>       PASS |      0 | iser login
>       PASS |      0 | mount /dev/sdb /iser
>       PASS |      0 | iser write 1K
>       PASS |      0 | iser write 1M
>       PASS |      0 | iser write 1G
>       PASS |      0 | nfsordma mount - XFS_EXT
>       PASS |      0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
>       PASS |      0 | nfsordma umount - XFS_EXT
>       PASS |      0 | nfsordma mount - RAMDISK
>       PASS |      0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
>       PASS |      0 | nfsordma umount - RAMDISK
> Checking for failures and known issues:
>   ibstatus reported expected HCA rate is a known issue on RHEL-9.3 over
> ConnectX-3 mlx4/roce.45 - see Known config issue
>   /usr/share/pmix/test/pmix_test is a known issue on RHEL-9.3 over
> ConnectX-3 mlx4/roce.45 - see bz2176561
>   no new test failures
> 
> mpi-openmpi test results on rdma-perf-00/rdma-perf-01 & Beaker job J:8100123:
> 5.14.0-341.el9.x86_64, rdma-core-46.0-1.el9, mlx4, ib0, ConnectX-3 & mlx4_0
>     Result | Status | Test
>   ---------+--------+------------------------------------
>       PASS |      0 | openmpi IMB-MPI1 PingPong mpirun one_core
>       PASS |      0 | openmpi IMB-MPI1 PingPing mpirun one_core
>       PASS |      0 | openmpi IMB-MPI1 Sendrecv mpirun one_core
>       PASS |      0 | openmpi IMB-MPI1 Exchange mpirun one_core
>       PASS |      0 | openmpi IMB-MPI1 Bcast mpirun one_core
>       PASS |      0 | openmpi IMB-MPI1 Allgather mpirun one_core
>       PASS |      0 | openmpi IMB-MPI1 Allgatherv mpirun one_core
>       PASS |      0 | openmpi IMB-MPI1 Gather mpirun one_core
>       PASS |      0 | openmpi IMB-MPI1 Gatherv mpirun one_core
>       PASS |      0 | openmpi IMB-MPI1 Scatter mpirun one_core
>       PASS |      0 | openmpi IMB-MPI1 Scatterv mpirun one_core
>       PASS |      0 | openmpi IMB-MPI1 Alltoall mpirun one_core
>       PASS |      0 | openmpi IMB-MPI1 Alltoallv mpirun one_core
>       PASS |      0 | openmpi IMB-MPI1 Reduce mpirun one_core
>       PASS |      0 | openmpi IMB-MPI1 Reduce_scatter mpirun one_core
>       PASS |      0 | openmpi IMB-MPI1 Allreduce mpirun one_core
>       PASS |      0 | openmpi IMB-MPI1 Barrier mpirun one_core
>       PASS |      0 | openmpi IMB-IO S_Write_indv mpirun one_core
>       PASS |      0 | openmpi IMB-IO S_Read_indv mpirun one_core
>       PASS |      0 | openmpi IMB-IO S_Write_expl mpirun one_core
>       PASS |      0 | openmpi IMB-IO S_Read_expl mpirun one_core
>       PASS |      0 | openmpi IMB-IO P_Write_indv mpirun one_core
>       PASS |      0 | openmpi IMB-IO P_Read_indv mpirun one_core
>       PASS |      0 | openmpi IMB-IO P_Write_expl mpirun one_core
>       PASS |      0 | openmpi IMB-IO P_Read_expl mpirun one_core
>       PASS |      0 | openmpi IMB-IO P_Write_shared mpirun one_core
>       PASS |      0 | openmpi IMB-IO P_Read_shared mpirun one_core
>       PASS |      0 | openmpi IMB-IO P_Write_priv mpirun one_core
>       PASS |      0 | openmpi IMB-IO P_Read_priv mpirun one_core
>       PASS |      0 | openmpi IMB-IO C_Write_indv mpirun one_core
>       PASS |      0 | openmpi IMB-IO C_Read_indv mpirun one_core
>       PASS |      0 | openmpi IMB-IO C_Write_expl mpirun one_core
>       PASS |      0 | openmpi IMB-IO C_Read_expl mpirun one_core
>       PASS |      0 | openmpi IMB-IO C_Write_shared mpirun one_core
>       PASS |      0 | openmpi IMB-IO C_Read_shared mpirun one_core
>       PASS |      0 | openmpi IMB-EXT Window mpirun one_core
>       PASS |      0 | openmpi IMB-EXT Unidir_Put mpirun one_core
>       PASS |      0 | openmpi IMB-EXT Unidir_Get mpirun one_core
>       PASS |      0 | openmpi IMB-EXT Bidir_Get mpirun one_core
>       PASS |      0 | openmpi IMB-EXT Bidir_Put mpirun one_core
>       PASS |      0 | openmpi IMB-EXT Accumulate mpirun one_core
>       PASS |      0 | openmpi IMB-NBC Ibcast mpirun one_core
>       PASS |      0 | openmpi IMB-NBC Iallgather mpirun one_core
>       PASS |      0 | openmpi IMB-NBC Iallgatherv mpirun one_core
>       PASS |      0 | openmpi IMB-NBC Igather mpirun one_core
>       PASS |      0 | openmpi IMB-NBC Igatherv mpirun one_core
>       PASS |      0 | openmpi IMB-NBC Iscatter mpirun one_core
>       PASS |      0 | openmpi IMB-NBC Iscatterv mpirun one_core
>       PASS |      0 | openmpi IMB-NBC Ialltoall mpirun one_core
>       PASS |      0 | openmpi IMB-NBC Ialltoallv mpirun one_core
>       PASS |      0 | openmpi IMB-NBC Ireduce mpirun one_core
>       PASS |      0 | openmpi IMB-NBC Ireduce_scatter mpirun one_core
>       PASS |      0 | openmpi IMB-NBC Iallreduce mpirun one_core
>       PASS |      0 | openmpi IMB-NBC Ibarrier mpirun one_core
>       PASS |      0 | openmpi IMB-RMA Unidir_put mpirun one_core
>       PASS |      0 | openmpi IMB-RMA Unidir_get mpirun one_core
>       PASS |      0 | openmpi IMB-RMA Bidir_put mpirun one_core
>       PASS |      0 | openmpi IMB-RMA Bidir_get mpirun one_core
>       PASS |      0 | openmpi IMB-RMA One_put_all mpirun one_core
>       PASS |      0 | openmpi IMB-RMA One_get_all mpirun one_core
>       PASS |      0 | openmpi IMB-RMA All_put_all mpirun one_core
>       PASS |      0 | openmpi IMB-RMA All_get_all mpirun one_core
>       PASS |      0 | openmpi IMB-RMA Put_local mpirun one_core
>       PASS |      0 | openmpi IMB-RMA Put_all_local mpirun one_core
>       PASS |      0 | openmpi IMB-RMA Exchange_put mpirun one_core
>       PASS |      0 | openmpi IMB-RMA Exchange_get mpirun one_core
>       PASS |      0 | openmpi IMB-RMA Accumulate mpirun one_core
>       PASS |      0 | openmpi IMB-RMA Get_accumulate mpirun one_core
>       PASS |      0 | openmpi IMB-RMA Fetch_and_op mpirun one_core
>       PASS |      0 | openmpi IMB-RMA Compare_and_swap mpirun one_core
>       PASS |      0 | openmpi IMB-RMA Get_local mpirun one_core
>       PASS |      0 | openmpi IMB-RMA Get_all_local mpirun one_core
>       PASS |      0 | openmpi OSU acc_latency mpirun one_core
>       PASS |      0 | openmpi OSU allgather mpirun one_core
>       PASS |      0 | openmpi OSU allgatherv mpirun one_core
>       PASS |      0 | openmpi OSU allreduce mpirun one_core
>       PASS |      0 | openmpi OSU alltoall mpirun one_core
>       PASS |      0 | openmpi OSU alltoallv mpirun one_core
>       PASS |      0 | openmpi OSU barrier mpirun one_core
>       PASS |      0 | openmpi OSU bcast mpirun one_core
>       PASS |      0 | openmpi OSU bibw mpirun one_core
>       PASS |      0 | openmpi OSU bw mpirun one_core
>       PASS |      0 | openmpi OSU cas_latency mpirun one_core
>       PASS |      0 | openmpi OSU fop_latency mpirun one_core
>       PASS |      0 | openmpi OSU gather mpirun one_core
>       PASS |      0 | openmpi OSU gatherv mpirun one_core
>       PASS |      0 | openmpi OSU get_acc_latency mpirun one_core
>       PASS |      0 | openmpi OSU get_bw mpirun one_core
>       PASS |      0 | openmpi OSU get_latency mpirun one_core
>       PASS |      0 | openmpi OSU hello mpirun one_core
>       PASS |      0 | openmpi OSU iallgather mpirun one_core
>       PASS |      0 | openmpi OSU iallgatherv mpirun one_core
>       PASS |      0 | openmpi OSU iallreduce mpirun one_core
>       PASS |      0 | openmpi OSU ialltoall mpirun one_core
>       PASS |      0 | openmpi OSU ialltoallv mpirun one_core
>       PASS |      0 | openmpi OSU ialltoallw mpirun one_core
>       PASS |      0 | openmpi OSU ibarrier mpirun one_core
>       PASS |      0 | openmpi OSU ibcast mpirun one_core
>       PASS |      0 | openmpi OSU igather mpirun one_core
>       PASS |      0 | openmpi OSU igatherv mpirun one_core
>       PASS |      0 | openmpi OSU init mpirun one_core
>       PASS |      0 | openmpi OSU ireduce mpirun one_core
>       PASS |      0 | openmpi OSU iscatter mpirun one_core
>       PASS |      0 | openmpi OSU iscatterv mpirun one_core
>       PASS |      0 | openmpi OSU latency mpirun one_core
>       PASS |      0 | openmpi OSU latency_mp mpirun one_core
>       PASS |      0 | openmpi OSU mbw_mr mpirun one_core
>       PASS |      0 | openmpi OSU multi_lat mpirun one_core
>       PASS |      0 | openmpi OSU put_bibw mpirun one_core
>       PASS |      0 | openmpi OSU put_bw mpirun one_core
>       PASS |      0 | openmpi OSU put_latency mpirun one_core
>       PASS |      0 | openmpi OSU reduce mpirun one_core
>       PASS |      0 | openmpi OSU reduce_scatter mpirun one_core
>       PASS |      0 | openmpi OSU scatter mpirun one_core
>       PASS |      0 | openmpi OSU scatterv mpirun one_core
>       PASS |      0 | NON-ROOT IMB-MPI1 PingPong
> Checking for failures and known issues:
>   no test failures
> 
> mpi-openmpi test results on rdma-perf-00/rdma-perf-01 & Beaker job J:8100123:
> 5.14.0-341.el9.x86_64, rdma-core-46.0-1.el9, mlx4, roce.45, ConnectX-3 &
> mlx4_0
>     Result | Status | Test
>   ---------+--------+------------------------------------
>       PASS |      0 | openmpi IMB-MPI1 PingPong mpirun one_core
>       PASS |      0 | openmpi IMB-MPI1 PingPing mpirun one_core
>       PASS |      0 | openmpi IMB-MPI1 Sendrecv mpirun one_core
>       PASS |      0 | openmpi IMB-MPI1 Exchange mpirun one_core
>       PASS |      0 | openmpi IMB-MPI1 Bcast mpirun one_core
>       PASS |      0 | openmpi IMB-MPI1 Allgather mpirun one_core
>       PASS |      0 | openmpi IMB-MPI1 Allgatherv mpirun one_core
>       PASS |      0 | openmpi IMB-MPI1 Gather mpirun one_core
>       PASS |      0 | openmpi IMB-MPI1 Gatherv mpirun one_core
>       PASS |      0 | openmpi IMB-MPI1 Scatter mpirun one_core
>       PASS |      0 | openmpi IMB-MPI1 Scatterv mpirun one_core
>       PASS |      0 | openmpi IMB-MPI1 Alltoall mpirun one_core
>       PASS |      0 | openmpi IMB-MPI1 Alltoallv mpirun one_core
>       PASS |      0 | openmpi IMB-MPI1 Reduce mpirun one_core
>       PASS |      0 | openmpi IMB-MPI1 Reduce_scatter mpirun one_core
>       PASS |      0 | openmpi IMB-MPI1 Allreduce mpirun one_core
>       PASS |      0 | openmpi IMB-MPI1 Barrier mpirun one_core
>       PASS |      0 | openmpi IMB-IO S_Write_indv mpirun one_core
>       PASS |      0 | openmpi IMB-IO S_Read_indv mpirun one_core
>       PASS |      0 | openmpi IMB-IO S_Write_expl mpirun one_core
>       PASS |      0 | openmpi IMB-IO S_Read_expl mpirun one_core
>       PASS |      0 | openmpi IMB-IO P_Write_indv mpirun one_core
>       PASS |      0 | openmpi IMB-IO P_Read_indv mpirun one_core
>       PASS |      0 | openmpi IMB-IO P_Write_expl mpirun one_core
>       PASS |      0 | openmpi IMB-IO P_Read_expl mpirun one_core
>       PASS |      0 | openmpi IMB-IO P_Write_shared mpirun one_core
>       PASS |      0 | openmpi IMB-IO P_Read_shared mpirun one_core
>       PASS |      0 | openmpi IMB-IO P_Write_priv mpirun one_core
>       PASS |      0 | openmpi IMB-IO P_Read_priv mpirun one_core
>       PASS |      0 | openmpi IMB-IO C_Write_indv mpirun one_core
>       PASS |      0 | openmpi IMB-IO C_Read_indv mpirun one_core
>       PASS |      0 | openmpi IMB-IO C_Write_expl mpirun one_core
>       PASS |      0 | openmpi IMB-IO C_Read_expl mpirun one_core
>       PASS |      0 | openmpi IMB-IO C_Write_shared mpirun one_core
>       PASS |      0 | openmpi IMB-IO C_Read_shared mpirun one_core
>       PASS |      0 | openmpi IMB-EXT Window mpirun one_core
>       PASS |      0 | openmpi IMB-EXT Unidir_Put mpirun one_core
>       PASS |      0 | openmpi IMB-EXT Unidir_Get mpirun one_core
>       PASS |      0 | openmpi IMB-EXT Bidir_Get mpirun one_core
>       PASS |      0 | openmpi IMB-EXT Bidir_Put mpirun one_core
>       PASS |      0 | openmpi IMB-EXT Accumulate mpirun one_core
>       PASS |      0 | openmpi IMB-NBC Ibcast mpirun one_core
>       PASS |      0 | openmpi IMB-NBC Iallgather mpirun one_core
>       PASS |      0 | openmpi IMB-NBC Iallgatherv mpirun one_core
>       PASS |      0 | openmpi IMB-NBC Igather mpirun one_core
>       PASS |      0 | openmpi IMB-NBC Igatherv mpirun one_core
>       PASS |      0 | openmpi IMB-NBC Iscatter mpirun one_core
>       PASS |      0 | openmpi IMB-NBC Iscatterv mpirun one_core
>       PASS |      0 | openmpi IMB-NBC Ialltoall mpirun one_core
>       PASS |      0 | openmpi IMB-NBC Ialltoallv mpirun one_core
>       PASS |      0 | openmpi IMB-NBC Ireduce mpirun one_core
>       PASS |      0 | openmpi IMB-NBC Ireduce_scatter mpirun one_core
>       PASS |      0 | openmpi IMB-NBC Iallreduce mpirun one_core
>       PASS |      0 | openmpi IMB-NBC Ibarrier mpirun one_core
>       PASS |      0 | openmpi IMB-RMA Unidir_put mpirun one_core
>       PASS |      0 | openmpi IMB-RMA Unidir_get mpirun one_core
>       PASS |      0 | openmpi IMB-RMA Bidir_put mpirun one_core
>       PASS |      0 | openmpi IMB-RMA Bidir_get mpirun one_core
>       PASS |      0 | openmpi IMB-RMA One_put_all mpirun one_core
>       PASS |      0 | openmpi IMB-RMA One_get_all mpirun one_core
>       PASS |      0 | openmpi IMB-RMA All_put_all mpirun one_core
>       PASS |      0 | openmpi IMB-RMA All_get_all mpirun one_core
>       PASS |      0 | openmpi IMB-RMA Put_local mpirun one_core
>       PASS |      0 | openmpi IMB-RMA Put_all_local mpirun one_core
>       PASS |      0 | openmpi IMB-RMA Exchange_put mpirun one_core
>       PASS |      0 | openmpi IMB-RMA Exchange_get mpirun one_core
>       PASS |      0 | openmpi IMB-RMA Accumulate mpirun one_core
>       PASS |      0 | openmpi IMB-RMA Get_accumulate mpirun one_core
>       PASS |      0 | openmpi IMB-RMA Fetch_and_op mpirun one_core
>       PASS |      0 | openmpi IMB-RMA Compare_and_swap mpirun one_core
>       PASS |      0 | openmpi IMB-RMA Get_local mpirun one_core
>       PASS |      0 | openmpi IMB-RMA Get_all_local mpirun one_core
>       PASS |      0 | openmpi OSU acc_latency mpirun one_core
>       PASS |      0 | openmpi OSU allgather mpirun one_core
>       PASS |      0 | openmpi OSU allgatherv mpirun one_core
>       PASS |      0 | openmpi OSU allreduce mpirun one_core
>       PASS |      0 | openmpi OSU alltoall mpirun one_core
>       PASS |      0 | openmpi OSU alltoallv mpirun one_core
>       PASS |      0 | openmpi OSU barrier mpirun one_core
>       PASS |      0 | openmpi OSU bcast mpirun one_core
>       PASS |      0 | openmpi OSU bibw mpirun one_core
>       PASS |      0 | openmpi OSU bw mpirun one_core
>       PASS |      0 | openmpi OSU cas_latency mpirun one_core
>       PASS |      0 | openmpi OSU fop_latency mpirun one_core
>       PASS |      0 | openmpi OSU gather mpirun one_core
>       PASS |      0 | openmpi OSU gatherv mpirun one_core
>       PASS |      0 | openmpi OSU get_acc_latency mpirun one_core
>       PASS |      0 | openmpi OSU get_bw mpirun one_core
>       PASS |      0 | openmpi OSU get_latency mpirun one_core
>       PASS |      0 | openmpi OSU hello mpirun one_core
>       PASS |      0 | openmpi OSU iallgather mpirun one_core
>       PASS |      0 | openmpi OSU iallgatherv mpirun one_core
>       PASS |      0 | openmpi OSU iallreduce mpirun one_core
>       PASS |      0 | openmpi OSU ialltoall mpirun one_core
>       PASS |      0 | openmpi OSU ialltoallv mpirun one_core
>       PASS |      0 | openmpi OSU ialltoallw mpirun one_core
>       PASS |      0 | openmpi OSU ibarrier mpirun one_core
>       PASS |      0 | openmpi OSU ibcast mpirun one_core
>       PASS |      0 | openmpi OSU igather mpirun one_core
>       PASS |      0 | openmpi OSU igatherv mpirun one_core
>       PASS |      0 | openmpi OSU init mpirun one_core
>       PASS |      0 | openmpi OSU ireduce mpirun one_core
>       PASS |      0 | openmpi OSU iscatter mpirun one_core
>       PASS |      0 | openmpi OSU iscatterv mpirun one_core
>       PASS |      0 | openmpi OSU latency mpirun one_core
>       PASS |      0 | openmpi OSU latency_mp mpirun one_core
>       PASS |      0 | openmpi OSU mbw_mr mpirun one_core
>       PASS |      0 | openmpi OSU multi_lat mpirun one_core
>       PASS |      0 | openmpi OSU put_bibw mpirun one_core
>       PASS |      0 | openmpi OSU put_bw mpirun one_core
>       PASS |      0 | openmpi OSU put_latency mpirun one_core
>       PASS |      0 | openmpi OSU reduce mpirun one_core
>       PASS |      0 | openmpi OSU reduce_scatter mpirun one_core
>       PASS |      0 | openmpi OSU scatter mpirun one_core
>       PASS |      0 | openmpi OSU scatterv mpirun one_core
>       PASS |      0 | NON-ROOT IMB-MPI1 PingPong
> Checking for failures and known issues:
>   no test failures
> 
> Tests on other HCAs will be updated soon...

The above were tested with the following packages:

Installed:
  mpitests-openmpi-5.8-1.el9.x86_64         openmpi-2:4.1.1-7.el9.x86_64       
  openmpi-devel-2:4.1.1-7.el9.x86_64        openmpi-java-2:4.1.1-7.el9.x86_64  
  openmpi-java-devel-2:4.1.1-7.el9.x86_64
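
The installed versions can be confirmed on both hosts with rpm; a minimal check (package names taken from the list above):

  rpm -q openmpi openmpi-devel mpitests-openmpi
  # expect openmpi-4.1.1-7.el9.x86_64, matching the Fixed In Version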

Comment 4 Afom T. Michael 2023-07-24 22:25:09 UTC
Marking this as "Verified: Tested" since the tests with "openmpi-2:4.1.1-7.el9" pass, as shown in the comments above.

Comment 9 Brian Chae 2023-08-10 11:02:33 UTC
The ON_QA verification was conducted successfully, as follows:

1. nightly build tested : RHEL-9.3.0-20230809.27 and RHEL-9.3.0-20230809.d.37
2. openmpi packages tested :

Installed:
  mpitests-openmpi-7.1-2.el9.x86_64         openmpi-2:4.1.1-7.el9.x86_64
  openmpi-devel-2:4.1.1-7.el9.x86_64

3. RDMA HW tested : MLX5 IB0, MLX5 IB1, MLX5 RoCE, BNXT RoCE, QEDR RoCE, QEDR iWARP, CXGB4 iWARP, HFI OPA0
4. RDMA test suites tested : sanity, openmpi
5. Results : only the MLX5 IB0/IB1/RoCE results are shown in this comment, due to space limits; a sketch of a typical benchmark invocation follows below
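
As a reproduction aid, a suite entry such as "openmpi OSU get_bw mpirun one_core" corresponds to a two-rank mpirun of the matching mpitests binary. A minimal sketch, assuming the stock RHEL module name and using the hosts above for illustration (not the harness's exact invocation):

  module load mpi/openmpi-x86_64
  # two ranks, one per host, each bound to a single core
  mpirun -np 2 --host rdma-dev-19,rdma-dev-20 --map-by node --bind-to core \
      mpitests-osu_get_bw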

sanity test results on rdma-dev-19/rdma-dev-20 & Beaker job J:8170735:
5.14.0-351.el9.x86_64, rdma-core-46.0-1.el9, mlx5, ib0, ConnectX-4 & mlx5_2
    Result | Status | Test
  ---------+--------+------------------------------------
      PASS |      0 | load module mlx5_ib
      PASS |      0 | load module mlx5_core
      PASS |      0 | enable opensm
      PASS |      0 | restart opensm
      PASS |      0 | osmtest -f c -g 0x248a07030049d468
      PASS |      0 | stop opensm
      PASS |      0 | disable opensm
      FAIL |      1 | ibstatus reported expected HCA rate
      PASS |      0 | pkey mlx5_ib0.8080 create/delete
      PASS |      0 | /usr/sbin/ibstat
      PASS |      0 | /usr/sbin/ibstatus
      PASS |      0 | systemctl start srp_daemon.service
      PASS |      0 | /usr/sbin/ibsrpdm -vc
      PASS |      0 | systemctl stop srp_daemon
      PASS |      0 | ping self - 172.31.0.120
      PASS |      0 | ping6 self - fe80::268a:703:49:d468%mlx5_ib0
      FAIL |    127 | /usr/share/pmix/test/pmix_test
      PASS |      0 | ping server - 172.31.0.119
      PASS |      0 | ping6 server - fe80::268a:703:49:d338%mlx5_ib0
      PASS |      0 | openmpi mpitests-IMB-MPI1 PingPong
      PASS |      0 | openmpi mpitests-IMB-IO S_Read_indv
      PASS |      0 | openmpi mpitests-IMB-EXT Window
      PASS |      0 | openmpi mpitests-osu_get_bw
      PASS |      0 | ip multicast addr
      PASS |      0 | rping
      PASS |      0 | rcopy
      PASS |      0 | ib_read_bw
      PASS |      0 | ib_send_bw
      PASS |      0 | ib_write_bw
      PASS |      0 | iser login
      PASS |      0 | mount /dev/sdb /iser
      PASS |      0 | iser write 1K
      PASS |      0 | iser write 1M
      PASS |      0 | iser write 1G
      PASS |      0 | nfsordma mount - XFS_EXT
      PASS |      0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
      PASS |      0 | nfsordma umount - XFS_EXT
      PASS |      0 | nfsordma mount - RAMDISK
      PASS |      0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
      PASS |      0 | nfsordma umount - RAMDISK
Checking for failures and known issues:
  ibstatus reported expected HCA rate is a known issue on RHEL-9.3 over ConnectX-4 mlx5/ib0 - see Known config issue
  /usr/share/pmix/test/pmix_test is a known issue on RHEL-9.3 over ConnectX-4 mlx5/ib0 - see bz2176561
  no new test failures

sanity test results on rdma-dev-19/rdma-dev-20 & Beaker job J:8170735:
5.14.0-351.el9.x86_64, rdma-core-46.0-1.el9, mlx5, ib1, ConnectX-4 & mlx5_3
    Result | Status | Test
  ---------+--------+------------------------------------
      PASS |      0 | load module mlx5_ib
      PASS |      0 | load module mlx5_core
      PASS |      0 | enable opensm
      PASS |      0 | restart opensm
      PASS |      0 | osmtest -f c -g 0x248a07030049d469
      PASS |      0 | stop opensm
      PASS |      0 | disable opensm
      PASS |      0 | ibstatus reported expected HCA rate
      PASS |      0 | pkey mlx5_ib1.8080 create/delete
      PASS |      0 | /usr/sbin/ibstat
      PASS |      0 | /usr/sbin/ibstatus
      PASS |      0 | systemctl start srp_daemon.service
      PASS |      0 | /usr/sbin/ibsrpdm -vc
      PASS |      0 | systemctl stop srp_daemon
      PASS |      0 | ping self - 172.31.1.120
      PASS |      0 | ping6 self - fe80::268a:703:49:d469%mlx5_ib1
      FAIL |    127 | /usr/share/pmix/test/pmix_test
      PASS |      0 | ping server - 172.31.1.119
      PASS |      0 | ping6 server - fe80::268a:703:49:d339%mlx5_ib1
      PASS |      0 | openmpi mpitests-IMB-MPI1 PingPong
      PASS |      0 | openmpi mpitests-IMB-IO S_Read_indv
      PASS |      0 | openmpi mpitests-IMB-EXT Window
      PASS |      0 | openmpi mpitests-osu_get_bw
      PASS |      0 | ip multicast addr
      PASS |      0 | rping
      PASS |      0 | rcopy
      PASS |      0 | ib_read_bw
      PASS |      0 | ib_send_bw
      PASS |      0 | ib_write_bw
      PASS |      0 | iser login
      PASS |      0 | mount /dev/sdb /iser
      PASS |      0 | iser write 1K
      PASS |      0 | iser write 1M
      PASS |      0 | iser write 1G
      PASS |      0 | nfsordma mount - XFS_EXT
      PASS |      0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
      PASS |      0 | nfsordma umount - XFS_EXT
      PASS |      0 | nfsordma mount - RAMDISK
      PASS |      0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
      PASS |      0 | nfsordma umount - RAMDISK
Checking for failures and known issues:
  /usr/share/pmix/test/pmix_test is a known issue on RHEL-9.3 over ConnectX-4 mlx5/ib1 - see bz2176561
  no new test failures

sanity test results on rdma-dev-19/rdma-dev-20 & Beaker job J:8170735:
5.14.0-351.el9.x86_64, rdma-core-46.0-1.el9, mlx5, roce.45, ConnectX-4 Lx & mlx5_bond_0
    Result | Status | Test
  ---------+--------+------------------------------------
      PASS |      0 | load module mlx5_ib
      PASS |      0 | load module mlx5_core
      FAIL |      1 | ibstatus reported expected HCA rate
      PASS |      0 | /usr/sbin/ibstat
      PASS |      0 | /usr/sbin/ibstatus
      PASS |      0 | systemctl start srp_daemon.service
      SKIP |    777 | ibsrpdm
      PASS |      0 | systemctl stop srp_daemon
      PASS |      0 | ping self - 172.31.45.120
      PASS |      0 | ping6 self - fe80::7efe:90ff:fecb:762a%mlx5_team_ro.45
      FAIL |    127 | /usr/share/pmix/test/pmix_test
      PASS |      0 | ping server - 172.31.45.119
      PASS |      0 | ping6 server - fe80::7efe:90ff:fecb:743a%mlx5_team_ro.45
      PASS |      0 | openmpi mpitests-IMB-MPI1 PingPong
      PASS |      0 | openmpi mpitests-IMB-IO S_Read_indv
      PASS |      0 | openmpi mpitests-IMB-EXT Window
      PASS |      0 | openmpi mpitests-osu_get_bw
      PASS |      0 | ip multicast addr
      PASS |      0 | rping
      PASS |      0 | rcopy
      PASS |      0 | ib_read_bw
      PASS |      0 | ib_send_bw
      PASS |      0 | ib_write_bw
      PASS |      0 | iser login
      PASS |      0 | mount /dev/sdb /iser
      PASS |      0 | iser write 1K
      PASS |      0 | iser write 1M
      PASS |      0 | iser write 1G
      PASS |      0 | nfsordma mount - XFS_EXT
      PASS |      0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
      PASS |      0 | nfsordma umount - XFS_EXT
      PASS |      0 | nfsordma mount - RAMDISK
      PASS |      0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
      PASS |      0 | nfsordma umount - RAMDISK
Checking for failures and known issues:
  ibstatus reported expected HCA rate is a known issue on RHEL-9.3 over ConnectX-4 Lx mlx5/roce.45 - see Known config issue
  /usr/share/pmix/test/pmix_test is a known issue on RHEL-9.3 over ConnectX-4 Lx mlx5/roce.45 - see bz2176561
  no new test failures
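
A note on the recurring pmix_test FAIL above: exit status 127 is the shell's conventional "command not found" code, so the failure is consistent with the test binary being absent rather than crashing; see bz2176561. A quick check on a test host (path taken from the table above):

  # 127 from the harness usually means the file is missing or not executable
  test -x /usr/share/pmix/test/pmix_test && echo present || echo missing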



b. openmpi


mpi-openmpi test results on rdma-dev-19/rdma-dev-20 & Beaker job J:8170735:
5.14.0-351.el9.x86_64, rdma-core-46.0-1.el9, mlx5, ib0, ConnectX-4 & mlx5_2
    Result | Status | Test
  ---------+--------+------------------------------------
      PASS |      0 | openmpi IMB-MPI1 PingPong mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 PingPing mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Sendrecv mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Exchange mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Bcast mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Allgather mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Allgatherv mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Gather mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Gatherv mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Scatter mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Scatterv mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Alltoall mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Alltoallv mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Reduce mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Reduce_scatter mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Allreduce mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Barrier mpirun one_core
      PASS |      0 | openmpi IMB-IO S_Write_indv mpirun one_core
      PASS |      0 | openmpi IMB-IO S_Read_indv mpirun one_core
      PASS |      0 | openmpi IMB-IO S_Write_expl mpirun one_core
      PASS |      0 | openmpi IMB-IO S_Read_expl mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Write_indv mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Read_indv mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Write_expl mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Read_expl mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Write_shared mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Read_shared mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Write_priv mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Read_priv mpirun one_core
      PASS |      0 | openmpi IMB-IO C_Write_indv mpirun one_core
      PASS |      0 | openmpi IMB-IO C_Read_indv mpirun one_core
      PASS |      0 | openmpi IMB-IO C_Write_expl mpirun one_core
      PASS |      0 | openmpi IMB-IO C_Read_expl mpirun one_core
      PASS |      0 | openmpi IMB-IO C_Write_shared mpirun one_core
      PASS |      0 | openmpi IMB-IO C_Read_shared mpirun one_core
      PASS |      0 | openmpi IMB-EXT Window mpirun one_core
      PASS |      0 | openmpi IMB-EXT Unidir_Put mpirun one_core
      PASS |      0 | openmpi IMB-EXT Unidir_Get mpirun one_core
      PASS |      0 | openmpi IMB-EXT Bidir_Get mpirun one_core
      PASS |      0 | openmpi IMB-EXT Bidir_Put mpirun one_core
      PASS |      0 | openmpi IMB-EXT Accumulate mpirun one_core
      PASS |      0 | openmpi IMB-NBC Ibcast mpirun one_core
      PASS |      0 | openmpi IMB-NBC Iallgather mpirun one_core
      PASS |      0 | openmpi IMB-NBC Iallgatherv mpirun one_core
      PASS |      0 | openmpi IMB-NBC Igather mpirun one_core
      PASS |      0 | openmpi IMB-NBC Igatherv mpirun one_core
      PASS |      0 | openmpi IMB-NBC Iscatter mpirun one_core
      PASS |      0 | openmpi IMB-NBC Iscatterv mpirun one_core
      PASS |      0 | openmpi IMB-NBC Ialltoall mpirun one_core
      PASS |      0 | openmpi IMB-NBC Ialltoallv mpirun one_core
      PASS |      0 | openmpi IMB-NBC Ireduce mpirun one_core
      PASS |      0 | openmpi IMB-NBC Ireduce_scatter mpirun one_core
      PASS |      0 | openmpi IMB-NBC Iallreduce mpirun one_core
      PASS |      0 | openmpi IMB-NBC Ibarrier mpirun one_core
      PASS |      0 | openmpi IMB-RMA Unidir_put mpirun one_core
      PASS |      0 | openmpi IMB-RMA Unidir_get mpirun one_core
      PASS |      0 | openmpi IMB-RMA Bidir_put mpirun one_core
      PASS |      0 | openmpi IMB-RMA Bidir_get mpirun one_core
      PASS |      0 | openmpi IMB-RMA One_put_all mpirun one_core
      PASS |      0 | openmpi IMB-RMA One_get_all mpirun one_core
      PASS |      0 | openmpi IMB-RMA All_put_all mpirun one_core
      PASS |      0 | openmpi IMB-RMA All_get_all mpirun one_core
      PASS |      0 | openmpi IMB-RMA Put_local mpirun one_core
      PASS |      0 | openmpi IMB-RMA Put_all_local mpirun one_core
      PASS |      0 | openmpi IMB-RMA Exchange_put mpirun one_core
      PASS |      0 | openmpi IMB-RMA Exchange_get mpirun one_core
      PASS |      0 | openmpi IMB-RMA Accumulate mpirun one_core
      PASS |      0 | openmpi IMB-RMA Get_accumulate mpirun one_core
      PASS |      0 | openmpi IMB-RMA Fetch_and_op mpirun one_core
      PASS |      0 | openmpi IMB-RMA Compare_and_swap mpirun one_core
      PASS |      0 | openmpi IMB-RMA Get_local mpirun one_core
      PASS |      0 | openmpi IMB-RMA Get_all_local mpirun one_core
      PASS |      0 | openmpi OSU acc_latency mpirun one_core
      PASS |      0 | openmpi OSU allgather mpirun one_core
      PASS |      0 | openmpi OSU allgatherv mpirun one_core
      PASS |      0 | openmpi OSU allreduce mpirun one_core
      PASS |      0 | openmpi OSU alltoall mpirun one_core
      PASS |      0 | openmpi OSU alltoallv mpirun one_core
      PASS |      0 | openmpi OSU alltoallw mpirun one_core
      PASS |      0 | openmpi OSU barrier mpirun one_core
      PASS |      0 | openmpi OSU bcast mpirun one_core
      PASS |      0 | openmpi OSU bibw mpirun one_core
      PASS |      0 | openmpi OSU bibw_persistent mpirun one_core
      PASS |      0 | openmpi OSU bw mpirun one_core
      PASS |      0 | openmpi OSU bw_persistent mpirun one_core
      PASS |      0 | openmpi OSU cas_latency mpirun one_core
      PASS |      0 | openmpi OSU fop_latency mpirun one_core
      PASS |      0 | openmpi OSU gather mpirun one_core
      PASS |      0 | openmpi OSU gatherv mpirun one_core
      PASS |      0 | openmpi OSU get_acc_latency mpirun one_core
      PASS |      0 | openmpi OSU get_bw mpirun one_core
      PASS |      0 | openmpi OSU get_latency mpirun one_core
      PASS |      0 | openmpi OSU hello mpirun one_core
      PASS |      0 | openmpi OSU iallgather mpirun one_core
      PASS |      0 | openmpi OSU iallgatherv mpirun one_core
      PASS |      0 | openmpi OSU iallreduce mpirun one_core
      PASS |      0 | openmpi OSU ialltoall mpirun one_core
      PASS |      0 | openmpi OSU ialltoallv mpirun one_core
      PASS |      0 | openmpi OSU ialltoallw mpirun one_core
      PASS |      0 | openmpi OSU ibarrier mpirun one_core
      PASS |      0 | openmpi OSU ibcast mpirun one_core
      PASS |      0 | openmpi OSU igather mpirun one_core
      PASS |      0 | openmpi OSU igatherv mpirun one_core
      PASS |      0 | openmpi OSU ineighbor_allgather mpirun one_core
      PASS |      0 | openmpi OSU ineighbor_allgatherv mpirun one_core
      PASS |      0 | openmpi OSU ineighbor_alltoall mpirun one_core
      PASS |      0 | openmpi OSU ineighbor_alltoallv mpirun one_core
      PASS |      0 | openmpi OSU ineighbor_alltoallw mpirun one_core
      PASS |      0 | openmpi OSU init mpirun one_core
      PASS |      0 | openmpi OSU ireduce mpirun one_core
      PASS |      0 | openmpi OSU ireduce_scatter mpirun one_core
      PASS |      0 | openmpi OSU iscatter mpirun one_core
      PASS |      0 | openmpi OSU iscatterv mpirun one_core
      PASS |      0 | openmpi OSU latency mpirun one_core
      PASS |      0 | openmpi OSU latency_mp mpirun one_core
      PASS |      0 | openmpi OSU latency_persistent mpirun one_core
      PASS |      0 | openmpi OSU mbw_mr mpirun one_core
      PASS |      0 | openmpi OSU multi_lat mpirun one_core
      PASS |      0 | openmpi OSU neighbor_allgather mpirun one_core
      PASS |      0 | openmpi OSU neighbor_allgatherv mpirun one_core
      PASS |      0 | openmpi OSU neighbor_alltoall mpirun one_core
      PASS |      0 | openmpi OSU neighbor_alltoallv mpirun one_core
      PASS |      0 | openmpi OSU neighbor_alltoallw mpirun one_core
      PASS |      0 | openmpi OSU put_bibw mpirun one_core
      PASS |      0 | openmpi OSU put_bw mpirun one_core
      PASS |      0 | openmpi OSU put_latency mpirun one_core
      PASS |      0 | openmpi OSU reduce mpirun one_core
      PASS |      0 | openmpi OSU reduce_scatter mpirun one_core
      PASS |      0 | openmpi OSU scatter mpirun one_core
      PASS |      0 | openmpi OSU scatterv mpirun one_core
      PASS |      0 | NON-ROOT IMB-MPI1 PingPong
Checking for failures and known issues:
  no test failures



mpi-openmpi test results on rdma-dev-19/rdma-dev-20 & Beaker job J:8170735:
5.14.0-351.el9.x86_64, rdma-core-46.0-1.el9, mlx5, ib1, ConnectX-4 & mlx5_3
    Result | Status | Test
  ---------+--------+------------------------------------
      PASS |      0 | openmpi IMB-MPI1 PingPong mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 PingPing mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Sendrecv mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Exchange mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Bcast mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Allgather mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Allgatherv mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Gather mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Gatherv mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Scatter mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Scatterv mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Alltoall mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Alltoallv mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Reduce mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Reduce_scatter mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Allreduce mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Barrier mpirun one_core
      PASS |      0 | openmpi IMB-IO S_Write_indv mpirun one_core
      PASS |      0 | openmpi IMB-IO S_Read_indv mpirun one_core
      PASS |      0 | openmpi IMB-IO S_Write_expl mpirun one_core
      PASS |      0 | openmpi IMB-IO S_Read_expl mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Write_indv mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Read_indv mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Write_expl mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Read_expl mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Write_shared mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Read_shared mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Write_priv mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Read_priv mpirun one_core
      PASS |      0 | openmpi IMB-IO C_Write_indv mpirun one_core
      PASS |      0 | openmpi IMB-IO C_Read_indv mpirun one_core
      PASS |      0 | openmpi IMB-IO C_Write_expl mpirun one_core
      PASS |      0 | openmpi IMB-IO C_Read_expl mpirun one_core
      PASS |      0 | openmpi IMB-IO C_Write_shared mpirun one_core
      PASS |      0 | openmpi IMB-IO C_Read_shared mpirun one_core
      PASS |      0 | openmpi IMB-EXT Window mpirun one_core
      PASS |      0 | openmpi IMB-EXT Unidir_Put mpirun one_core
      PASS |      0 | openmpi IMB-EXT Unidir_Get mpirun one_core
      PASS |      0 | openmpi IMB-EXT Bidir_Get mpirun one_core
      PASS |      0 | openmpi IMB-EXT Bidir_Put mpirun one_core
      PASS |      0 | openmpi IMB-EXT Accumulate mpirun one_core
      PASS |      0 | openmpi IMB-NBC Ibcast mpirun one_core
      PASS |      0 | openmpi IMB-NBC Iallgather mpirun one_core
      PASS |      0 | openmpi IMB-NBC Iallgatherv mpirun one_core
      PASS |      0 | openmpi IMB-NBC Igather mpirun one_core
      PASS |      0 | openmpi IMB-NBC Igatherv mpirun one_core
      PASS |      0 | openmpi IMB-NBC Iscatter mpirun one_core
      PASS |      0 | openmpi IMB-NBC Iscatterv mpirun one_core
      PASS |      0 | openmpi IMB-NBC Ialltoall mpirun one_core
      PASS |      0 | openmpi IMB-NBC Ialltoallv mpirun one_core
      PASS |      0 | openmpi IMB-NBC Ireduce mpirun one_core
      PASS |      0 | openmpi IMB-NBC Ireduce_scatter mpirun one_core
      PASS |      0 | openmpi IMB-NBC Iallreduce mpirun one_core
      PASS |      0 | openmpi IMB-NBC Ibarrier mpirun one_core
      PASS |      0 | openmpi IMB-RMA Unidir_put mpirun one_core
      PASS |      0 | openmpi IMB-RMA Unidir_get mpirun one_core
      PASS |      0 | openmpi IMB-RMA Bidir_put mpirun one_core
      PASS |      0 | openmpi IMB-RMA Bidir_get mpirun one_core
      PASS |      0 | openmpi IMB-RMA One_put_all mpirun one_core
      PASS |      0 | openmpi IMB-RMA One_get_all mpirun one_core
      PASS |      0 | openmpi IMB-RMA All_put_all mpirun one_core
      PASS |      0 | openmpi IMB-RMA All_get_all mpirun one_core
      PASS |      0 | openmpi IMB-RMA Put_local mpirun one_core
      PASS |      0 | openmpi IMB-RMA Put_all_local mpirun one_core
      PASS |      0 | openmpi IMB-RMA Exchange_put mpirun one_core
      PASS |      0 | openmpi IMB-RMA Exchange_get mpirun one_core
      PASS |      0 | openmpi IMB-RMA Accumulate mpirun one_core
      PASS |      0 | openmpi IMB-RMA Get_accumulate mpirun one_core
      PASS |      0 | openmpi IMB-RMA Fetch_and_op mpirun one_core
      PASS |      0 | openmpi IMB-RMA Compare_and_swap mpirun one_core
      PASS |      0 | openmpi IMB-RMA Get_local mpirun one_core
      PASS |      0 | openmpi IMB-RMA Get_all_local mpirun one_core
      PASS |      0 | openmpi OSU acc_latency mpirun one_core
      PASS |      0 | openmpi OSU allgather mpirun one_core
      PASS |      0 | openmpi OSU allgatherv mpirun one_core
      PASS |      0 | openmpi OSU allreduce mpirun one_core
      PASS |      0 | openmpi OSU alltoall mpirun one_core
      PASS |      0 | openmpi OSU alltoallv mpirun one_core
      PASS |      0 | openmpi OSU alltoallw mpirun one_core
      PASS |      0 | openmpi OSU barrier mpirun one_core
      PASS |      0 | openmpi OSU bcast mpirun one_core
      PASS |      0 | openmpi OSU bibw mpirun one_core
      PASS |      0 | openmpi OSU bibw_persistent mpirun one_core
      PASS |      0 | openmpi OSU bw mpirun one_core
      PASS |      0 | openmpi OSU bw_persistent mpirun one_core
      PASS |      0 | openmpi OSU cas_latency mpirun one_core
      PASS |      0 | openmpi OSU fop_latency mpirun one_core
      PASS |      0 | openmpi OSU gather mpirun one_core
      PASS |      0 | openmpi OSU gatherv mpirun one_core
      PASS |      0 | openmpi OSU get_acc_latency mpirun one_core
      PASS |      0 | openmpi OSU get_bw mpirun one_core
      PASS |      0 | openmpi OSU get_latency mpirun one_core
      PASS |      0 | openmpi OSU hello mpirun one_core
      PASS |      0 | openmpi OSU iallgather mpirun one_core
      PASS |      0 | openmpi OSU iallgatherv mpirun one_core
      PASS |      0 | openmpi OSU iallreduce mpirun one_core
      PASS |      0 | openmpi OSU ialltoall mpirun one_core
      PASS |      0 | openmpi OSU ialltoallv mpirun one_core
      PASS |      0 | openmpi OSU ialltoallw mpirun one_core
      PASS |      0 | openmpi OSU ibarrier mpirun one_core
      PASS |      0 | openmpi OSU ibcast mpirun one_core
      PASS |      0 | openmpi OSU igather mpirun one_core
      PASS |      0 | openmpi OSU igatherv mpirun one_core
      PASS |      0 | openmpi OSU ineighbor_allgather mpirun one_core
      PASS |      0 | openmpi OSU ineighbor_allgatherv mpirun one_core
      PASS |      0 | openmpi OSU ineighbor_alltoall mpirun one_core
      PASS |      0 | openmpi OSU ineighbor_alltoallv mpirun one_core
      PASS |      0 | openmpi OSU ineighbor_alltoallw mpirun one_core
      PASS |      0 | openmpi OSU init mpirun one_core
      PASS |      0 | openmpi OSU ireduce mpirun one_core
      PASS |      0 | openmpi OSU ireduce_scatter mpirun one_core
      PASS |      0 | openmpi OSU iscatter mpirun one_core
      PASS |      0 | openmpi OSU iscatterv mpirun one_core
      PASS |      0 | openmpi OSU latency mpirun one_core
      PASS |      0 | openmpi OSU latency_mp mpirun one_core
      PASS |      0 | openmpi OSU latency_persistent mpirun one_core
      PASS |      0 | openmpi OSU mbw_mr mpirun one_core
      PASS |      0 | openmpi OSU multi_lat mpirun one_core
      PASS |      0 | openmpi OSU neighbor_allgather mpirun one_core
      PASS |      0 | openmpi OSU neighbor_allgatherv mpirun one_core
      PASS |      0 | openmpi OSU neighbor_alltoall mpirun one_core
      PASS |      0 | openmpi OSU neighbor_alltoallv mpirun one_core
      PASS |      0 | openmpi OSU neighbor_alltoallw mpirun one_core
      PASS |      0 | openmpi OSU put_bibw mpirun one_core
      PASS |      0 | openmpi OSU put_bw mpirun one_core
      PASS |      0 | openmpi OSU put_latency mpirun one_core
      PASS |      0 | openmpi OSU reduce mpirun one_core
      PASS |      0 | openmpi OSU reduce_scatter mpirun one_core
      PASS |      0 | openmpi OSU scatter mpirun one_core
      PASS |      0 | openmpi OSU scatterv mpirun one_core
      PASS |      0 | NON-ROOT IMB-MPI1 PingPong
Checking for failures and known issues:
  no test failures



mpi-openmpi test results on rdma-dev-19/rdma-dev-20 & Beaker job J:8170735:
5.14.0-351.el9.x86_64, rdma-core-46.0-1.el9, mlx5, roce.45, ConnectX-4 Lx & mlx5_bond_0
    Result | Status | Test
  ---------+--------+------------------------------------
      PASS |      0 | openmpi IMB-MPI1 PingPong mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 PingPing mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Sendrecv mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Exchange mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Bcast mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Allgather mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Allgatherv mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Gather mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Gatherv mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Scatter mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Scatterv mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Alltoall mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Alltoallv mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Reduce mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Reduce_scatter mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Allreduce mpirun one_core
      PASS |      0 | openmpi IMB-MPI1 Barrier mpirun one_core
      PASS |      0 | openmpi IMB-IO S_Write_indv mpirun one_core
      PASS |      0 | openmpi IMB-IO S_Read_indv mpirun one_core
      PASS |      0 | openmpi IMB-IO S_Write_expl mpirun one_core
      PASS |      0 | openmpi IMB-IO S_Read_expl mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Write_indv mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Read_indv mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Write_expl mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Read_expl mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Write_shared mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Read_shared mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Write_priv mpirun one_core
      PASS |      0 | openmpi IMB-IO P_Read_priv mpirun one_core
      PASS |      0 | openmpi IMB-IO C_Write_indv mpirun one_core
      PASS |      0 | openmpi IMB-IO C_Read_indv mpirun one_core
      PASS |      0 | openmpi IMB-IO C_Write_expl mpirun one_core
      PASS |      0 | openmpi IMB-IO C_Read_expl mpirun one_core
      PASS |      0 | openmpi IMB-IO C_Write_shared mpirun one_core
      PASS |      0 | openmpi IMB-IO C_Read_shared mpirun one_core
      PASS |      0 | openmpi IMB-EXT Window mpirun one_core
      PASS |      0 | openmpi IMB-EXT Unidir_Put mpirun one_core
      PASS |      0 | openmpi IMB-EXT Unidir_Get mpirun one_core
      PASS |      0 | openmpi IMB-EXT Bidir_Get mpirun one_core
      PASS |      0 | openmpi IMB-EXT Bidir_Put mpirun one_core
      PASS |      0 | openmpi IMB-EXT Accumulate mpirun one_core
      PASS |      0 | openmpi IMB-NBC Ibcast mpirun one_core
      PASS |      0 | openmpi IMB-NBC Iallgather mpirun one_core
      PASS |      0 | openmpi IMB-NBC Iallgatherv mpirun one_core
      PASS |      0 | openmpi IMB-NBC Igather mpirun one_core
      PASS |      0 | openmpi IMB-NBC Igatherv mpirun one_core
      PASS |      0 | openmpi IMB-NBC Iscatter mpirun one_core
      PASS |      0 | openmpi IMB-NBC Iscatterv mpirun one_core
      PASS |      0 | openmpi IMB-NBC Ialltoall mpirun one_core
      PASS |      0 | openmpi IMB-NBC Ialltoallv mpirun one_core
      PASS |      0 | openmpi IMB-NBC Ireduce mpirun one_core
      PASS |      0 | openmpi IMB-NBC Ireduce_scatter mpirun one_core
      PASS |      0 | openmpi IMB-NBC Iallreduce mpirun one_core
      PASS |      0 | openmpi IMB-NBC Ibarrier mpirun one_core
      PASS |      0 | openmpi IMB-RMA Unidir_put mpirun one_core
      PASS |      0 | openmpi IMB-RMA Unidir_get mpirun one_core
      PASS |      0 | openmpi IMB-RMA Bidir_put mpirun one_core
      PASS |      0 | openmpi IMB-RMA Bidir_get mpirun one_core
      PASS |      0 | openmpi IMB-RMA One_put_all mpirun one_core
      PASS |      0 | openmpi IMB-RMA One_get_all mpirun one_core
      PASS |      0 | openmpi IMB-RMA All_put_all mpirun one_core
      PASS |      0 | openmpi IMB-RMA All_get_all mpirun one_core
      PASS |      0 | openmpi IMB-RMA Put_local mpirun one_core
      PASS |      0 | openmpi IMB-RMA Put_all_local mpirun one_core
      PASS |      0 | openmpi IMB-RMA Exchange_put mpirun one_core
      PASS |      0 | openmpi IMB-RMA Exchange_get mpirun one_core
      PASS |      0 | openmpi IMB-RMA Accumulate mpirun one_core
      PASS |      0 | openmpi IMB-RMA Get_accumulate mpirun one_core
      PASS |      0 | openmpi IMB-RMA Fetch_and_op mpirun one_core
      PASS |      0 | openmpi IMB-RMA Compare_and_swap mpirun one_core
      PASS |      0 | openmpi IMB-RMA Get_local mpirun one_core
      PASS |      0 | openmpi IMB-RMA Get_all_local mpirun one_core
      PASS |      0 | openmpi OSU acc_latency mpirun one_core
      PASS |      0 | openmpi OSU allgather mpirun one_core
      PASS |      0 | openmpi OSU allgatherv mpirun one_core
      PASS |      0 | openmpi OSU allreduce mpirun one_core
      PASS |      0 | openmpi OSU alltoall mpirun one_core
      PASS |      0 | openmpi OSU alltoallv mpirun one_core
      PASS |      0 | openmpi OSU alltoallw mpirun one_core
      PASS |      0 | openmpi OSU barrier mpirun one_core
      PASS |      0 | openmpi OSU bcast mpirun one_core
      PASS |      0 | openmpi OSU bibw mpirun one_core
      PASS |      0 | openmpi OSU bibw_persistent mpirun one_core
      PASS |      0 | openmpi OSU bw mpirun one_core
      PASS |      0 | openmpi OSU bw_persistent mpirun one_core
      PASS |      0 | openmpi OSU cas_latency mpirun one_core
      PASS |      0 | openmpi OSU fop_latency mpirun one_core
      PASS |      0 | openmpi OSU gather mpirun one_core
      PASS |      0 | openmpi OSU gatherv mpirun one_core
      PASS |      0 | openmpi OSU get_acc_latency mpirun one_core
      PASS |      0 | openmpi OSU get_bw mpirun one_core
      PASS |      0 | openmpi OSU get_latency mpirun one_core
      PASS |      0 | openmpi OSU hello mpirun one_core
      PASS |      0 | openmpi OSU iallgather mpirun one_core
      PASS |      0 | openmpi OSU iallgatherv mpirun one_core
      PASS |      0 | openmpi OSU iallreduce mpirun one_core
      PASS |      0 | openmpi OSU ialltoall mpirun one_core
      PASS |      0 | openmpi OSU ialltoallv mpirun one_core
      PASS |      0 | openmpi OSU ialltoallw mpirun one_core
      PASS |      0 | openmpi OSU ibarrier mpirun one_core
      PASS |      0 | openmpi OSU ibcast mpirun one_core
      PASS |      0 | openmpi OSU igather mpirun one_core
      PASS |      0 | openmpi OSU igatherv mpirun one_core
      PASS |      0 | openmpi OSU ineighbor_allgather mpirun one_core
      PASS |      0 | openmpi OSU ineighbor_allgatherv mpirun one_core
      PASS |      0 | openmpi OSU ineighbor_alltoall mpirun one_core
      PASS |      0 | openmpi OSU ineighbor_alltoallv mpirun one_core
      PASS |      0 | openmpi OSU ineighbor_alltoallw mpirun one_core
      PASS |      0 | openmpi OSU init mpirun one_core
      PASS |      0 | openmpi OSU ireduce mpirun one_core
      PASS |      0 | openmpi OSU ireduce_scatter mpirun one_core
      PASS |      0 | openmpi OSU iscatter mpirun one_core
      PASS |      0 | openmpi OSU iscatterv mpirun one_core
      PASS |      0 | openmpi OSU latency mpirun one_core
      PASS |      0 | openmpi OSU latency_mp mpirun one_core
      PASS |      0 | openmpi OSU latency_persistent mpirun one_core
      PASS |      0 | openmpi OSU mbw_mr mpirun one_core
      PASS |      0 | openmpi OSU multi_lat mpirun one_core
      PASS |      0 | openmpi OSU neighbor_allgather mpirun one_core
      PASS |      0 | openmpi OSU neighbor_allgatherv mpirun one_core
      PASS |      0 | openmpi OSU neighbor_alltoall mpirun one_core
      PASS |      0 | openmpi OSU neighbor_alltoallv mpirun one_core
      PASS |      0 | openmpi OSU neighbor_alltoallw mpirun one_core
      PASS |      0 | openmpi OSU put_bibw mpirun one_core
      PASS |      0 | openmpi OSU put_bw mpirun one_core
      PASS |      0 | openmpi OSU put_latency mpirun one_core
      PASS |      0 | openmpi OSU reduce mpirun one_core
      PASS |      0 | openmpi OSU reduce_scatter mpirun one_core
      PASS |      0 | openmpi OSU scatter mpirun one_core
      PASS |      0 | openmpi OSU scatterv mpirun one_core
      PASS |      0 | NON-ROOT IMB-MPI1 PingPong
Checking for failures and known issues:
  no test failures
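
The OSU rows map the same way onto the mpitests-osu_* binaries; a hedged
example for one of them follows (binary path and host list are again
assumed for illustration; the NON-ROOT PingPong entry presumably runs the
equivalent command as an unprivileged user):

    # Hypothetical invocation for "openmpi OSU latency mpirun one_core";
    # the binary path and host list are assumptions.
    mpirun -np 2 --host rdma-dev-19,rdma-dev-20 --map-by node \
           /usr/lib64/openmpi/bin/mpitests-osu_latency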


o Results for the other RDMA devices (BNXT RoCE, QEDR RoCE/iWARP, CXGB4 iWARP, and HFI OPA0) showed no new issues in the sanity and openmpi tests.

o Setting this bug to VERIFIED.
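
For completeness, the device and link-layer combinations exercised in the
runs above can be confirmed on each host with the standard rdma-core
utilities (the device name below is taken from the mlx5_bond_0 run header
above; the flags are standard ibverbs tooling):

    # Enumerate the RDMA devices present on the host.
    ibv_devices
    # Show port state and link layer (InfiniBand vs. Ethernet/RoCE)
    # for a specific device.
    ibv_devinfo -d mlx5_bond_0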