Bug 2224042
| Summary: | Perftest versions prior to 23.04 missing Intel Parameters | | |
|---|---|---|---|
| Product: | Red Hat Enterprise Linux 9 | Reporter: | Kamal Heib <kheib> |
| Component: | perftest | Assignee: | Kamal Heib <kheib> |
| Status: | VERIFIED --- | QA Contact: | Brian Chae <bchae> |
| Severity: | high | Docs Contact: | |
| Priority: | unspecified | | |
| Version: | 9.3 | CC: | bchae, dledford, hwkernel-mgr, ivan.d.barrera, kheib, rdma-dev-team, shiraz.saleem, tmichael |
| Target Milestone: | rc | Keywords: | Triaged |
| Target Release: | 9.3 | | |
| Hardware: | All | | |
| OS: | Linux | | |
| Whiteboard: | | | |
| Fixed In Version: | perftest-23.04.0.0.23-2.el9 | Doc Type: | If docs needed, set a value |
| Doc Text: | | Story Points: | --- |
| Clone Of: | 2211464 | Environment: | |
| Last Closed: | | Type: | Bug |
| Regression: | --- | Mount Type: | --- |
| Documentation: | --- | CRM: | |
| Verified Versions: | | Category: | --- |
| oVirt Team: | --- | RHEL 7.3 requirements from Atomic Host: | |
| Cloudforms Team: | --- | Target Upstream Version: | |
| Embargoed: | | | |
Description
Kamal Heib
2023-07-19 16:27:10 UTC
When tested with perftest-23.04.0.0.23-2.el9.x86_64 on an Intel HCA (E810 iRDMA iWARP device), "ib_send_lat" with just the defaults failed as follows:
Server host : rdma-dev-30
Client host : rdma-dev-31
[root@rdma-dev-30 ~]$ rpm -q perftest
perftest-23.04.0.0.23-2.el9.x86_64
[root@rdma-dev-31 ~]$ rpm -q perftest
perftest-23.04.0.0.23-2.el9.x86_64
[root@rdma-dev-30 ~]$ ib_send_lat
Port number 1 state is Down
Couldn't set the link layer
Couldn't get context for the device
[root@rdma-dev-30 ~]$
[root@rdma-dev-31 ~]$ ib_send_lat 172.31.50.130
Couldn't connect to 172.31.50.130:18515
Unable to open file descriptor for socket connection Unable to init the socket connection
[root@rdma-dev-31 ~]$
---------------------
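The failure above happens because, without -d, ib_send_lat picks the first enumerated RDMA device, and on these hosts that device's port 1 is Down. One way to spot a usable device is to filter ibv_devinfo (rdma-core) output for active ports; a minimal sketch, where ibv_devinfo_sample is a hypothetical stand-in echoing canned output for illustration (on a real host you would pipe ibv_devinfo itself):

```shell
# Hypothetical stand-in for `ibv_devinfo`, emitting abbreviated sample output.
ibv_devinfo_sample() {
cat <<'EOF'
hca_id: irdma0
                state:                  PORT_DOWN (1)
hca_id: irdma1
                state:                  PORT_ACTIVE (4)
EOF
}

# Print only devices whose port state is ACTIVE, i.e. candidates for -d.
ibv_devinfo_sample | awk '
  /^hca_id:/            { dev = $2 }
  /state:.*PORT_ACTIVE/ { print dev }
'
```

Against this sample it prints only irdma1, which is the device the working run below passes via `-d irdma1`.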
It required the following flags, at a minimum:
server
[root@rdma-dev-30 ~]$ ib_send_lat -d irdma1 -R <<<==================================
************************************
* Waiting for client to connect... *
************************************
---------------------------------------------------------------------------------------
Send Latency Test
Dual-port : OFF Device : irdma1
Number of qps : 1 Transport type : IW
Connection type : RC Using SRQ : OFF
PCIe relax order: ON
ibv_wr* API : OFF
RX depth : 512
Mtu : 4096[B]
Link type : Ethernet
GID index : 0
Max inline data : 101[B]
rdma_cm QPs : ON
Data ex. method : rdma_cm
---------------------------------------------------------------------------------------
Waiting for client rdma_cm QP to connect
Please run the same command with the IB/RoCE interface IP
---------------------------------------------------------------------------------------
local address: LID 0x01 QPN 0x002b PSN 0x381a7c
GID: 180:150:145:173:133:137:00:00:00:00:00:00:00:00:00:00
remote address: LID 0x01 QPN 0x002c PSN 0x8cd6b8
GID: 180:150:145:173:138:169:00:00:00:00:00:00:00:00:00:00
---------------------------------------------------------------------------------------
#bytes #iterations t_min[usec] t_max[usec] t_typical[usec] t_avg[usec] t_stdev[usec] 99% percentile[usec] 99.9% percentile[usec]
Conflicting CPU frequency values detected: 2000.000000 != 2794.665000. CPU Frequency is not max.
Conflicting CPU frequency values detected: 2000.000000 != 2773.170000. CPU Frequency is not max.
2 1000 5.51 7.34 5.57 5.67 0.04 6.04 7.34
---------------------------------------------------------------------------------------
[root@rdma-dev-30 ~]$
client
[root@rdma-dev-31 ~]$ ib_send_lat -R 172.31.50.130 <<<==================================
---------------------------------------------------------------------------------------
Send Latency Test
Dual-port : OFF Device : irdma0
Number of qps : 1 Transport type : IB
Connection type : RC Using SRQ : OFF
PCIe relax order: ON
ibv_wr* API : OFF
TX depth : 1
Mtu : 4096[B]
Link type : Ethernet
GID index : 0
Max inline data : 101[B]
rdma_cm QPs : ON
Data ex. method : rdma_cm
---------------------------------------------------------------------------------------
local address: LID 0x01 QPN 0x002c PSN 0x8cd6b8
GID: 180:150:145:173:138:169:00:00:00:00:00:00:00:00:00:00
remote address: LID 0x01 QPN 0x002b PSN 0x381a7c
GID: 180:150:145:173:133:137:00:00:00:00:00:00:00:00:00:00
---------------------------------------------------------------------------------------
#bytes #iterations t_min[usec] t_max[usec] t_typical[usec] t_avg[usec] t_stdev[usec] 99% percentile[usec] 99.9% percentile[usec]
Conflicting CPU frequency values detected: 2000.000000 != 2792.653000. CPU Frequency is not max.
Conflicting CPU frequency values detected: 2000.000000 != 2787.568000. CPU Frequency is not max.
2 1000 5.51 7.54 5.57 5.67 0.05 6.46 7.54
---------------------------------------------------------------------------------------
[root@rdma-dev-31 ~]$
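perftest prints GIDs as sixteen colon-separated decimal octets, which is awkward to compare against interface addresses. A small sketch that converts them to the usual hex groups (gid_to_hex is a hypothetical helper, not part of perftest):

```python
def gid_to_hex(gid: str) -> str:
    """Convert perftest's decimal GID (16 colon-separated octets)
    into IPv6-style hex groups."""
    octets = [int(o) for o in gid.split(":")]
    assert len(octets) == 16, "a GID is exactly 16 octets"
    # Pair consecutive octets into 16-bit hex groups.
    return ":".join(f"{hi:02x}{lo:02x}"
                    for hi, lo in zip(octets[::2], octets[1::2]))

# Server GID from the run above.
print(gid_to_hex("180:150:145:173:133:137:00:00:00:00:00:00:00:00:00:00"))
# b496:91ad:8589:0000:0000:0000:0000:0000
```

The leading octets (b4:96:91:ad:85:89) are MAC-derived, consistent with the fe80::b696:91ff:fead:8589 link-local address seen for rdma-dev-30 in the sanity results below.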
==================================
However, on an MLX5 IB device, "ib_send_lat" with defaults worked fine, as shown below:
[root@rdma-dev-19 ~]$ rpm -q perftest
perftest-23.04.0.0.23-2.el9.x86_64
[root@rdma-dev-20 ~]$ rpm -q perftest
perftest-23.04.0.0.23-2.el9.x86_64
[root@rdma-dev-19 ~]$ ib_send_lat
************************************
* Waiting for client to connect... *
************************************
---------------------------------------------------------------------------------------
Send Latency Test
Dual-port : OFF Device : mlx5_2
Number of qps : 1 Transport type : IB
Connection type : RC Using SRQ : OFF
PCIe relax order: ON
WARNING: CPU is not PCIe relaxed ordering compliant.
WARNING: You should disable PCIe RO with `--disable_pcie_relaxed` for both server and client.
ibv_wr* API : ON
RX depth : 512
Mtu : 4096[B]
Link type : IB
Max inline data : 236[B]
rdma_cm QPs : OFF
Data ex. method : Ethernet
---------------------------------------------------------------------------------------
local address: LID 0x07 QPN 0x0363 PSN 0x6fe64
remote address: LID 0x08 QPN 0x0363 PSN 0x3610a3
---------------------------------------------------------------------------------------
#bytes #iterations t_min[usec] t_max[usec] t_typical[usec] t_avg[usec] t_stdev[usec] 99% percentile[usec] 99.9% percentile[usec]
Conflicting CPU frequency values detected: 3501.299000 != 3600.000000. CPU Frequency is not max.
Conflicting CPU frequency values detected: 3500.088000 != 3600.000000. CPU Frequency is not max.
2 1000 1.20 2.77 1.24 1.24 0.00 1.37 2.77
---------------------------------------------------------------------------------------
[root@rdma-dev-19 ~]$
[root@rdma-dev-20 ~]$ ib_send_lat 172.31.0.119
---------------------------------------------------------------------------------------
Send Latency Test
Dual-port : OFF Device : mlx5_2
Number of qps : 1 Transport type : IB
Connection type : RC Using SRQ : OFF
PCIe relax order: ON
WARNING: CPU is not PCIe relaxed ordering compliant.
WARNING: You should disable PCIe RO with `--disable_pcie_relaxed` for both server and client.
ibv_wr* API : ON
TX depth : 1
Mtu : 4096[B]
Link type : IB
Max inline data : 236[B]
rdma_cm QPs : OFF
Data ex. method : Ethernet
---------------------------------------------------------------------------------------
local address: LID 0x08 QPN 0x0363 PSN 0x3610a3
remote address: LID 0x07 QPN 0x0363 PSN 0x6fe64
---------------------------------------------------------------------------------------
#bytes #iterations t_min[usec] t_max[usec] t_typical[usec] t_avg[usec] t_stdev[usec] 99% percentile[usec] 99.9% percentile[usec]
Conflicting CPU frequency values detected: 3600.000000 != 3499.673000. CPU Frequency is not max.
Conflicting CPU frequency values detected: 3600.000000 != 3499.672000. CPU Frequency is not max.
2 1000 1.20 3.28 1.23 1.24 0.00 1.44 3.28
---------------------------------------------------------------------------------------
====================================
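The "Conflicting CPU frequency values detected" lines in the runs above come from perftest's cycle-based timer: it warns when its measured CPU frequency disagrees with /proc/cpuinfo readings, typically because of frequency scaling, since a wrong frequency would mis-scale the reported latencies. A simplified sketch of such a consistency check (not perftest's actual implementation; conflicting_freqs is a hypothetical helper, assuming /proc/cpuinfo-style input):

```python
def conflicting_freqs(cpuinfo: str, tolerance_mhz: float = 1.0) -> bool:
    """Return True if 'cpu MHz' readings differ by more than tolerance_mhz."""
    freqs = [float(line.split(":")[1])
             for line in cpuinfo.splitlines()
             if line.startswith("cpu MHz")]
    return bool(freqs) and max(freqs) - min(freqs) > tolerance_mhz

# The two values reported for rdma-dev-30 above.
sample = "cpu MHz\t\t: 2000.000\ncpu MHz\t\t: 2794.665\n"
print(conflicting_freqs(sample))  # True
```

The warning is benign for functional verification; for stable latency numbers the frequency governor would need to be pinned so all readings agree.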
Setting this bug back to the Assigned state, because the test failed with default flags on the Intel device (E810 iRDMA iWARP).
After discussion with Kamal (DEV): the first device, irdma0, was in the DOWN state, so "ib_send_lat" with default flags selects it and produces the errors above. Hence, with the rdma-dev-30/31 pair, we cannot test the default flags; we will go with the RDMA sanity and perftest test results instead.
1. perftest package : perftest-23.04.0.0.23-2.el9.x86_64
2. tested HW :
MLX5 IB0, IB1, RoCE
E810 iRDMA
3. RDMA test suites tested : sanity, perftest
4. Results
1. E810 iRDMA iWARP
+ [23-07-30 13:00:20] rpm -q perftest
perftest-23.04.0.0.23-2.el9.x86_64
sanity test results on rdma-dev-30/rdma-dev-31 & Beaker job J:8131466:
5.14.0-345.el9.x86_64, rdma-core-46.0-1.el9, i40e, iw, E810-C & irdma1
Result | Status | Test
---------+--------+------------------------------------
PASS | 0 | ibstatus reported expected HCA rate
PASS | 0 | vlan i810_iw.81 create/delete
PASS | 0 | /usr/sbin/ibstatus
PASS | 0 | systemctl start srp_daemon.service
SKIP | 777 | ibsrpdm
PASS | 0 | systemctl stop srp_daemon
PASS | 0 | ping self - 172.31.50.131
PASS | 0 | ping6 self - fe80::b696:91ff:fead:8aa9%i810_iw
FAIL | 127 | /usr/share/pmix/test/pmix_test
PASS | 0 | ping server - 172.31.50.130
PASS | 0 | ping6 server - fe80::b696:91ff:fead:8589%i810_iw
FAIL | 1 | openmpi mpitests-IMB-MPI1 PingPong [ known issue - bz 2216042 ]
FAIL | 1 | openmpi mpitests-IMB-IO S_Read_indv [ known issue - bz 2216042 ]
FAIL | 1 | openmpi mpitests-IMB-EXT Window [ known issue - bz 2216042 ]
FAIL | 1 | openmpi mpitests-osu_get_bw [ known issue - bz 2216042 ]
PASS | 0 | ip multicast addr
PASS | 0 | rping
FAIL | 255 | rcopy [ known issue - bz 2176561 ]
PASS | 0 | ib_read_bw
PASS | 0 | ib_send_bw
PASS | 0 | ib_write_bw
PASS | 0 | iser login
PASS | 0 | mount /dev/sdb /iser
PASS | 0 | iser write 1K
PASS | 0 | iser write 1M
FAIL | 124 | iser write 1G [ intermittent issue ]
PASS | 0 | nfsordma mount - XFS_EXT
PASS | 0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
PASS | 0 | nfsordma umount - XFS_EXT
PASS | 0 | nfsordma mount - RAMDISK
PASS | 0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
PASS | 0 | nfsordma umount - RAMDISK
perftest test results on rdma-dev-30/rdma-dev-31 & Beaker job J:8131466:
5.14.0-345.el9.x86_64, rdma-core-46.0-1.el9, i40e, iw, E810-C & irdma1
Result | Status | Test
---------+--------+------------------------------------
PASS | 0 | ib_read_bw RC
PASS | 0 | ib_read_lat RC
PASS | 0 | ib_send_bw RC
PASS | 0 | ib_send_lat RC
PASS | 0 | ib_write_bw RC
PASS | 0 | ib_write_lat RC
Checking for failures and known issues:
no test failures
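The "Checking for failures and known issues" step amounts to scanning the result table for FAIL rows that are not annotated as known issues. A minimal sketch, assuming the "Result | Status | Test" layout used in the logs above (new_failures is a hypothetical helper, not part of the test suite):

```python
def new_failures(report: str) -> list[str]:
    """Return test names from FAIL rows not flagged as known issues."""
    failures = []
    for line in report.splitlines():
        parts = [p.strip() for p in line.split("|")]
        # Expect exactly: Result | Status | Test
        if len(parts) == 3 and parts[0] == "FAIL" \
                and "known issue" not in parts[2]:
            failures.append(parts[2])
    return failures

sample = """\
PASS     | 0      | rping
FAIL     | 255    | rcopy [ known issue - bz 2176561 ]
FAIL     | 127    | /usr/share/pmix/test/pmix_test
"""
print(new_failures(sample))  # ['/usr/share/pmix/test/pmix_test']
```

Only unannotated FAIL rows count against verification; annotated ones are tracked under their own bugs.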
2. MLX5 IB0, IB1, RoCE
+ [23-07-30 13:00:20] rpm -q perftest
perftest-23.04.0.0.23-2.el9.x86_64
sanity test results on rdma-dev-19/rdma-dev-20 & Beaker job J:8131470:
5.14.0-345.el9.x86_64, rdma-core-46.0-1.el9, mlx5, ib0, ConnectX-4 & mlx5_2
Result | Status | Test
---------+--------+------------------------------------
PASS | 0 | load module mlx5_ib
PASS | 0 | load module mlx5_core
PASS | 0 | enable opensm
PASS | 0 | restart opensm
PASS | 0 | osmtest -f c -g 0x248a07030049d468
PASS | 0 | stop opensm
PASS | 0 | disable opensm
FAIL | 1 | ibstatus reported expected HCA rate
PASS | 0 | pkey mlx5_ib0.8080 create/delete
PASS | 0 | /usr/sbin/ibstat
PASS | 0 | /usr/sbin/ibstatus
PASS | 0 | systemctl start srp_daemon.service
PASS | 0 | /usr/sbin/ibsrpdm -vc
PASS | 0 | systemctl stop srp_daemon
PASS | 0 | ping self - 172.31.0.120
PASS | 0 | ping6 self - fe80::268a:703:49:d468%mlx5_ib0
FAIL | 127 | /usr/share/pmix/test/pmix_test [ known issue - see above ]
PASS | 0 | ping server - 172.31.0.119
PASS | 0 | ping6 server - fe80::268a:703:49:d338%mlx5_ib0
FAIL | 1 | openmpi mpitests-IMB-MPI1 PingPong [ known issue - see above ]
FAIL | 1 | openmpi mpitests-IMB-IO S_Read_indv [ known issue - see above ]
FAIL | 1 | openmpi mpitests-IMB-EXT Window [ known issue - see above ]
FAIL | 1 | openmpi mpitests-osu_get_bw [ known issue - see above ]
PASS | 0 | ip multicast addr
PASS | 0 | rping
PASS | 0 | rcopy
PASS | 0 | ib_read_bw
PASS | 0 | ib_send_bw
PASS | 0 | ib_write_bw
PASS | 0 | iser login
PASS | 0 | mount /dev/sdb /iser
PASS | 0 | iser write 1K
PASS | 0 | iser write 1M
PASS | 0 | iser write 1G
PASS | 0 | nfsordma mount - XFS_EXT
PASS | 0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
PASS | 0 | nfsordma umount - XFS_EXT
PASS | 0 | nfsordma mount - RAMDISK
PASS | 0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
PASS | 0 | nfsordma umount - RAMDISK
sanity test results on rdma-dev-19/rdma-dev-20 & Beaker job J:8131470:
5.14.0-345.el9.x86_64, rdma-core-46.0-1.el9, mlx5, ib1, ConnectX-4 & mlx5_3
Result | Status | Test
---------+--------+------------------------------------
PASS | 0 | load module mlx5_ib
PASS | 0 | load module mlx5_core
PASS | 0 | enable opensm
PASS | 0 | restart opensm
PASS | 0 | osmtest -f c -g 0x248a07030049d469
PASS | 0 | stop opensm
PASS | 0 | disable opensm
PASS | 0 | ibstatus reported expected HCA rate
PASS | 0 | pkey mlx5_ib1.8080 create/delete
PASS | 0 | /usr/sbin/ibstat
PASS | 0 | /usr/sbin/ibstatus
PASS | 0 | systemctl start srp_daemon.service
PASS | 0 | /usr/sbin/ibsrpdm -vc
PASS | 0 | systemctl stop srp_daemon
PASS | 0 | ping self - 172.31.1.120
PASS | 0 | ping6 self - fe80::268a:703:49:d469%mlx5_ib1
FAIL | 127 | /usr/share/pmix/test/pmix_test
PASS | 0 | ping server - 172.31.1.119
PASS | 0 | ping6 server - fe80::268a:703:49:d339%mlx5_ib1
FAIL | 1 | openmpi mpitests-IMB-MPI1 PingPong
FAIL | 1 | openmpi mpitests-IMB-IO S_Read_indv
FAIL | 1 | openmpi mpitests-IMB-EXT Window
FAIL | 1 | openmpi mpitests-osu_get_bw
PASS | 0 | ip multicast addr
PASS | 0 | rping
PASS | 0 | rcopy
PASS | 0 | ib_read_bw
PASS | 0 | ib_send_bw
PASS | 0 | ib_write_bw
PASS | 0 | iser login
PASS | 0 | mount /dev/sdb /iser
PASS | 0 | iser write 1K
PASS | 0 | iser write 1M
PASS | 0 | iser write 1G
PASS | 0 | nfsordma mount - XFS_EXT
PASS | 0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
PASS | 0 | nfsordma umount - XFS_EXT
PASS | 0 | nfsordma mount - RAMDISK
PASS | 0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
PASS | 0 | nfsordma umount - RAMDISK
sanity test results on rdma-dev-19/rdma-dev-20 & Beaker job J:8131470:
5.14.0-345.el9.x86_64, rdma-core-46.0-1.el9, mlx5, roce.45, ConnectX-4 Lx & mlx5_bond_0
Result | Status | Test
---------+--------+------------------------------------
PASS | 0 | load module mlx5_ib
PASS | 0 | load module mlx5_core
FAIL | 1 | ibstatus reported expected HCA rate
PASS | 0 | /usr/sbin/ibstat
PASS | 0 | /usr/sbin/ibstatus
PASS | 0 | systemctl start srp_daemon.service
SKIP | 777 | ibsrpdm
PASS | 0 | systemctl stop srp_daemon
PASS | 0 | ping self - 172.31.45.120
PASS | 0 | ping6 self - fe80::7efe:90ff:fecb:762a%mlx5_team_ro.45
FAIL | 127 | /usr/share/pmix/test/pmix_test
PASS | 0 | ping server - 172.31.45.119
PASS | 0 | ping6 server - fe80::7efe:90ff:fecb:743a%mlx5_team_ro.45
FAIL | 1 | openmpi mpitests-IMB-MPI1 PingPong
FAIL | 1 | openmpi mpitests-IMB-IO S_Read_indv
FAIL | 1 | openmpi mpitests-IMB-EXT Window
FAIL | 1 | openmpi mpitests-osu_get_bw
PASS | 0 | ip multicast addr
PASS | 0 | rping
PASS | 0 | rcopy
PASS | 0 | ib_read_bw
PASS | 0 | ib_send_bw
PASS | 0 | ib_write_bw
PASS | 0 | iser login
PASS | 0 | mount /dev/sdb /iser
PASS | 0 | iser write 1K
PASS | 0 | iser write 1M
PASS | 0 | iser write 1G
PASS | 0 | nfsordma mount - XFS_EXT
PASS | 0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
PASS | 0 | nfsordma umount - XFS_EXT
PASS | 0 | nfsordma mount - RAMDISK
PASS | 0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
PASS | 0 | nfsordma umount - RAMDISK
perftest test results on rdma-dev-19/rdma-dev-20 & Beaker job J:8131470:
5.14.0-345.el9.x86_64, rdma-core-46.0-1.el9, mlx5, ib0, ConnectX-4 & mlx5_2
Result | Status | Test
---------+--------+------------------------------------
PASS | 0 | ib_atomic_bw RC
PASS | 0 | ib_atomic_bw XRC
PASS | 0 | ib_atomic_lat RC
PASS | 0 | ib_atomic_lat XRC
PASS | 0 | ib_read_bw RC
PASS | 0 | ib_read_bw XRC
PASS | 0 | ib_read_lat RC
PASS | 0 | ib_read_lat XRC
PASS | 0 | ib_send_bw RC
PASS | 0 | ib_send_bw XRC
PASS | 0 | ib_send_lat RC
PASS | 0 | ib_send_lat XRC
PASS | 0 | ib_write_bw RC
PASS | 0 | ib_write_bw XRC
PASS | 0 | ib_write_lat RC
PASS | 0 | ib_write_lat XRC
Checking for failures and known issues:
no test failures
perftest test results on rdma-dev-19/rdma-dev-20 & Beaker job J:8131470:
5.14.0-345.el9.x86_64, rdma-core-46.0-1.el9, mlx5, ib1, ConnectX-4 & mlx5_3
Result | Status | Test
---------+--------+------------------------------------
PASS | 0 | ib_atomic_bw RC
PASS | 0 | ib_atomic_bw XRC
PASS | 0 | ib_atomic_lat RC
PASS | 0 | ib_atomic_lat XRC
PASS | 0 | ib_read_bw RC
PASS | 0 | ib_read_bw XRC
PASS | 0 | ib_read_lat RC
PASS | 0 | ib_read_lat XRC
PASS | 0 | ib_send_bw RC
PASS | 0 | ib_send_bw XRC
PASS | 0 | ib_send_lat RC
PASS | 0 | ib_send_lat XRC
PASS | 0 | ib_write_bw RC
PASS | 0 | ib_write_bw XRC
PASS | 0 | ib_write_lat RC
PASS | 0 | ib_write_lat XRC
Checking for failures and known issues:
no test failures
perftest test results on rdma-dev-19/rdma-dev-20 & Beaker job J:8131470:
5.14.0-345.el9.x86_64, rdma-core-46.0-1.el9, mlx5, roce.45, ConnectX-4 Lx & mlx5_bond_0
Result | Status | Test
---------+--------+------------------------------------
PASS | 0 | ib_atomic_bw RC
PASS | 0 | ib_atomic_lat RC
PASS | 0 | ib_read_bw RC
PASS | 0 | ib_read_lat RC
PASS | 0 | ib_send_bw RC
PASS | 0 | ib_send_lat RC
PASS | 0 | ib_write_bw RC
PASS | 0 | ib_write_lat RC
PASS | 0 | raw_ethernet_bw RC
PASS | 0 | raw_ethernet_lat RC
Checking for failures and known issues:
no test failures
o No new issues observed in the sanity and perftest runs on the above RDMA HCAs
o Setting this bug back to MOD state
o Setting this bug with "verified:tested" flag
The verification for ON_QA was done as follows:
1. build tested : RHEL-9.3.0-20230804.9
2. perftest package tested: perftest-23.04.0.0.23-2.el9.x86_64.rpm
+ [23-08-04 16:49:13] rpm -q perftest
perftest-23.04.0.0.23-2.el9.x86_64
3. HW tested : MLX5 IB0, MLX5 IB1, MLX5 RoCE, QEDR IW, QEDR RoCE, BNXT RoCE
4. RDMA test suites tested : sanity, perftest
5. Results: Only the MLX5 IB0, IB1, and RoCE test results are attached. Test results for the other hardware are not shown in this comment but were verified as successful.
a. sanity
sanity test results on rdma-dev-21/rdma-dev-22 & Beaker job J:8154355:
5.14.0-349.el9.x86_64, rdma-core-46.0-1.el9, mlx5, ib0, ConnectX-4 & mlx5_1
Result | Status | Test
---------+--------+------------------------------------
PASS | 0 | load module mlx5_ib
PASS | 0 | load module mlx5_core
PASS | 0 | enable opensm
PASS | 0 | restart opensm
PASS | 0 | osmtest -f c -g 0x248a07030049d75c
PASS | 0 | stop opensm
PASS | 0 | disable opensm
FAIL | 1 | ibstatus reported expected HCA rate
PASS | 0 | pkey mlx5_ib0.8080 create/delete
PASS | 0 | /usr/sbin/ibstat
PASS | 0 | /usr/sbin/ibstatus
PASS | 0 | systemctl start srp_daemon.service
PASS | 0 | /usr/sbin/ibsrpdm -vc
PASS | 0 | systemctl stop srp_daemon
PASS | 0 | ping self - 172.31.0.122
PASS | 0 | ping6 self - fe80::268a:703:49:d75c%mlx5_ib0
FAIL | 127 | /usr/share/pmix/test/pmix_test [ known issue - bz 2176561 ]
PASS | 0 | ping server - 172.31.0.121
PASS | 0 | ping6 server - fe80::268a:703:49:d4f0%mlx5_ib0
FAIL | 1 | openmpi mpitests-IMB-MPI1 PingPong [ known issue - bz 2216042 ]
FAIL | 1 | openmpi mpitests-IMB-IO S_Read_indv [ known issue - bz 2216042 ]
FAIL | 1 | openmpi mpitests-IMB-EXT Window [ known issue - bz 2216042 ]
FAIL | 1 | openmpi mpitests-osu_get_bw [ known issue - bz 2216042 ]
PASS | 0 | ip multicast addr
PASS | 0 | rping
PASS | 0 | rcopy
PASS | 0 | ib_read_bw
PASS | 0 | ib_send_bw
PASS | 0 | ib_write_bw
PASS | 0 | iser login
PASS | 0 | mount /dev/sdb /iser
PASS | 0 | iser write 1K
PASS | 0 | iser write 1M
PASS | 0 | iser write 1G
PASS | 0 | nfsordma mount - XFS_EXT
PASS | 0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
PASS | 0 | nfsordma umount - XFS_EXT
PASS | 0 | nfsordma mount - RAMDISK
PASS | 0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
PASS | 0 | nfsordma umount - RAMDISK
sanity test results on rdma-dev-21/rdma-dev-22 & Beaker job J:8154355:
5.14.0-349.el9.x86_64, rdma-core-46.0-1.el9, mlx5, ib1, ConnectX-4 & mlx5_2
Result | Status | Test
---------+--------+------------------------------------
PASS | 0 | load module mlx5_ib
PASS | 0 | load module mlx5_core
PASS | 0 | enable opensm
PASS | 0 | restart opensm
PASS | 0 | osmtest -f c -g 0x248a07030049d75d
PASS | 0 | stop opensm
PASS | 0 | disable opensm
PASS | 0 | ibstatus reported expected HCA rate
PASS | 0 | pkey mlx5_ib1.8080 create/delete
PASS | 0 | /usr/sbin/ibstat
PASS | 0 | /usr/sbin/ibstatus
PASS | 0 | systemctl start srp_daemon.service
PASS | 0 | /usr/sbin/ibsrpdm -vc
PASS | 0 | systemctl stop srp_daemon
PASS | 0 | ping self - 172.31.1.122
PASS | 0 | ping6 self - fe80::268a:703:49:d75d%mlx5_ib1
FAIL | 127 | /usr/share/pmix/test/pmix_test [ known issue - bz 2176561 ]
PASS | 0 | ping server - 172.31.1.121
PASS | 0 | ping6 server - fe80::268a:703:49:d4f1%mlx5_ib1
FAIL | 1 | openmpi mpitests-IMB-MPI1 PingPong [ known issue - bz 2216042 ]
FAIL | 1 | openmpi mpitests-IMB-IO S_Read_indv [ known issue - bz 2216042 ]
FAIL | 1 | openmpi mpitests-IMB-EXT Window [ known issue - bz 2216042 ]
FAIL | 1 | openmpi mpitests-osu_get_bw [ known issue - bz 2216042 ]
PASS | 0 | ip multicast addr
PASS | 0 | rping
PASS | 0 | rcopy
PASS | 0 | ib_read_bw
PASS | 0 | ib_send_bw
PASS | 0 | ib_write_bw
PASS | 0 | iser login
PASS | 0 | mount /dev/sdb /iser
PASS | 0 | iser write 1K
PASS | 0 | iser write 1M
PASS | 0 | iser write 1G
PASS | 0 | nfsordma mount - XFS_EXT
PASS | 0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
PASS | 0 | nfsordma umount - XFS_EXT
PASS | 0 | nfsordma mount - RAMDISK
PASS | 0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
PASS | 0 | nfsordma umount - RAMDISK
sanity test results on rdma-dev-21/rdma-dev-22 & Beaker job J:8154355:
5.14.0-349.el9.x86_64, rdma-core-46.0-1.el9, mlx5, roce.45, ConnectX-4 & mlx5_0
Result | Status | Test
---------+--------+------------------------------------
PASS | 0 | load module mlx5_ib
PASS | 0 | load module mlx5_core
PASS | 0 | ibstatus reported expected HCA rate
PASS | 0 | /usr/sbin/ibstat
PASS | 0 | /usr/sbin/ibstatus
PASS | 0 | systemctl start srp_daemon.service
SKIP | 777 | ibsrpdm
PASS | 0 | systemctl stop srp_daemon
PASS | 0 | ping self - 172.31.45.122
PASS | 0 | ping6 self - fe80::268a:7ff:fe56:b834%mlx5_roce.45
FAIL | 127 | /usr/share/pmix/test/pmix_test [ known issue - bz 2176561 ]
PASS | 0 | ping server - 172.31.45.121
PASS | 0 | ping6 server - fe80::268a:7ff:fe4b:f094%mlx5_roce.45
FAIL | 1 | openmpi mpitests-IMB-MPI1 PingPong [ known issue - bz 2216042 ]
FAIL | 1 | openmpi mpitests-IMB-IO S_Read_indv [ known issue - bz 2216042 ]
FAIL | 1 | openmpi mpitests-IMB-EXT Window [ known issue - bz 2216042 ]
FAIL | 1 | openmpi mpitests-osu_get_bw [ known issue - bz 2216042 ]
PASS | 0 | ip multicast addr
PASS | 0 | rping
PASS | 0 | rcopy
PASS | 0 | ib_read_bw
PASS | 0 | ib_send_bw
PASS | 0 | ib_write_bw
PASS | 0 | iser login
PASS | 0 | mount /dev/sdb /iser
PASS | 0 | iser write 1K
PASS | 0 | iser write 1M
PASS | 0 | iser write 1G
PASS | 0 | nfsordma mount - XFS_EXT
PASS | 0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
PASS | 0 | nfsordma umount - XFS_EXT
PASS | 0 | nfsordma mount - RAMDISK
PASS | 0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
PASS | 0 | nfsordma umount - RAMDISK
b. perftest
perftest test results on rdma-dev-21/rdma-dev-22 & Beaker job J:8154355:
5.14.0-349.el9.x86_64, rdma-core-46.0-1.el9, mlx5, ib0, ConnectX-4 & mlx5_1
Result | Status | Test
---------+--------+------------------------------------
PASS | 0 | ib_atomic_bw RC
PASS | 0 | ib_atomic_bw XRC
PASS | 0 | ib_atomic_lat RC
PASS | 0 | ib_atomic_lat XRC
PASS | 0 | ib_read_bw RC
PASS | 0 | ib_read_bw XRC
PASS | 0 | ib_read_lat RC
PASS | 0 | ib_read_lat XRC
PASS | 0 | ib_send_bw RC
PASS | 0 | ib_send_bw XRC
PASS | 0 | ib_send_lat RC
PASS | 0 | ib_send_lat XRC
PASS | 0 | ib_write_bw RC
PASS | 0 | ib_write_bw XRC
PASS | 0 | ib_write_lat RC
PASS | 0 | ib_write_lat XRC
Checking for failures and known issues:
no test failures
perftest test results on rdma-dev-21/rdma-dev-22 & Beaker job J:8154355:
5.14.0-349.el9.x86_64, rdma-core-46.0-1.el9, mlx5, ib1, ConnectX-4 & mlx5_2
Result | Status | Test
---------+--------+------------------------------------
PASS | 0 | ib_atomic_bw RC
PASS | 0 | ib_atomic_bw XRC
PASS | 0 | ib_atomic_lat RC
PASS | 0 | ib_atomic_lat XRC
PASS | 0 | ib_read_bw RC
PASS | 0 | ib_read_bw XRC
PASS | 0 | ib_read_lat RC
PASS | 0 | ib_read_lat XRC
PASS | 0 | ib_send_bw RC
PASS | 0 | ib_send_bw XRC
PASS | 0 | ib_send_lat RC
PASS | 0 | ib_send_lat XRC
PASS | 0 | ib_write_bw RC
PASS | 0 | ib_write_bw XRC
PASS | 0 | ib_write_lat RC
PASS | 0 | ib_write_lat XRC
Checking for failures and known issues:
no test failures
perftest test results on rdma-dev-21/rdma-dev-22 & Beaker job J:8154355:
5.14.0-349.el9.x86_64, rdma-core-46.0-1.el9, mlx5, roce.45, ConnectX-4 & mlx5_0
Result | Status | Test
---------+--------+------------------------------------
PASS | 0 | ib_atomic_bw RC
PASS | 0 | ib_atomic_lat RC
PASS | 0 | ib_read_bw RC
PASS | 0 | ib_read_lat RC
PASS | 0 | ib_send_bw RC
PASS | 0 | ib_send_lat RC
PASS | 0 | ib_write_bw RC
PASS | 0 | ib_write_lat RC
PASS | 0 | raw_ethernet_bw RC
PASS | 0 | raw_ethernet_lat RC
Checking for failures and known issues:
no test failures
o No new issues were observed
o All sanity and perftest results were verified as successful
Setting this bug as Verified