Bug 2224042 - Perftest versions prior to 23.04 missing Intel Parameters
Summary: Perftest versions prior to 23.04 missing Intel Parameters
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Enterprise Linux 9
Classification: Red Hat
Component: perftest
Version: 9.3
Hardware: All
OS: Linux
Priority: unspecified
Severity: high
Target Milestone: rc
Target Release: 9.3
Assignee: Kamal Heib
QA Contact: Brian Chae
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2023-07-19 16:27 UTC by Kamal Heib
Modified: 2023-11-07 11:37 UTC
CC List: 8 users

Fixed In Version: perftest-23.04.0.0.23-2.el9
Doc Type: If docs needed, set a value
Doc Text:
Clone Of: 2211464
Environment:
Last Closed: 2023-11-07 08:55:30 UTC
Type: Bug
Target Upstream Version:
Embargoed:


Attachments


Links
System ID Private Priority Status Summary Last Updated
Red Hat Issue Tracker RHELPLAN-162772 0 None None None 2023-07-19 16:29:47 UTC
Red Hat Product Errata RHBA-2023:6669 0 None None None 2023-11-07 08:55:33 UTC

Description Kamal Heib 2023-07-19 16:27:10 UTC
+++ This bug was initially created as a clone of Bug #2211464 +++

Description of problem:
Perftest versions prior to the 23.04 (April 2023) release are missing the Intel-specific parameters needed to run properly with default flags.

Version-Release number of selected component (if applicable):
https://github.com/linux-rdma/perftest/releases
Perftest versions 4.5-0.20 and older are missing these parameters.

How reproducible:
Run perftest on Intel devices with default flags.

Steps to Reproduce:
on a "server" run: ib_send_lat
on a "client" run ib_send_lat IP_address_of_client

Actual results:
Both sides report:
Unable to create QP.
Failed to create QP.
 Couldn't create IB resources

Expected results:
The test should run and report latency.

Additional info:
Default values are based on capabilities from a different device.
This has been corrected in the latest perftest release: perftest-23.04.0-0.23
https://github.com/linux-rdma/perftest/releases/tag/23.04.0-0.23

The request is to update the distro perftest version to 23.04.0-0.23 or newer going forward.
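
For reference, a minimal verification sketch once a rebased build is available (package name and version are taken from this report; host and device selection are up to the tester):

# confirm the installed build is the rebased one
rpm -q perftest                      # expect a build based on 23.04.0-0.23 or newer

# re-run the reproducer with default flags
# on the server:
ib_send_lat
# on the client (use the server's IP):
ib_send_lat <IP_address_of_server>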

Thank you.

Comment 1 Brian Chae 2023-07-30 17:45:18 UTC
When tested with perftest-23.04.0.0.23-2.el9.x86_64 on an Intel HCA (E810 iRDMA iWARP device), "ib_send_lat" with just the default flags failed as follows:

Server host : rdma-dev-30
Client host : rdma-dev-31

[root@rdma-dev-30 ~]$ rpm -q perftest
perftest-23.04.0.0.23-2.el9.x86_64

[root@rdma-dev-31 ~]$ rpm -q perftest
perftest-23.04.0.0.23-2.el9.x86_64


[root@rdma-dev-30 ~]$ ib_send_lat
 Port number 1 state is Down
 Couldn't set the link layer
 Couldn't get context for the device
[root@rdma-dev-30 ~]$

[root@rdma-dev-31 ~]$ ib_send_lat 172.31.50.130
Couldn't connect to 172.31.50.130:18515
Unable to open file descriptor for socket connection Unable to init the socket connection
[root@rdma-dev-31 ~]$


---------------------


It required the following flags, at a minimum:


server


[root@rdma-dev-30 ~]$ ib_send_lat -d irdma1 -R       <<<==================================

************************************
* Waiting for client to connect... *
************************************
---------------------------------------------------------------------------------------
                    Send Latency Test
 Dual-port       : OFF          Device         : irdma1
 Number of qps   : 1            Transport type : IW
 Connection type : RC           Using SRQ      : OFF
 PCIe relax order: ON
 ibv_wr* API     : OFF
 RX depth        : 512
 Mtu             : 4096[B]
 Link type       : Ethernet
 GID index       : 0
 Max inline data : 101[B]
 rdma_cm QPs     : ON
 Data ex. method : rdma_cm
---------------------------------------------------------------------------------------
 Waiting for client rdma_cm QP to connect
 Please run the same command with the IB/RoCE interface IP
---------------------------------------------------------------------------------------
 local address: LID 0x01 QPN 0x002b PSN 0x381a7c
 GID: 180:150:145:173:133:137:00:00:00:00:00:00:00:00:00:00
 remote address: LID 0x01 QPN 0x002c PSN 0x8cd6b8
 GID: 180:150:145:173:138:169:00:00:00:00:00:00:00:00:00:00
---------------------------------------------------------------------------------------
 #bytes #iterations    t_min[usec]    t_max[usec]  t_typical[usec]    t_avg[usec]    t_stdev[usec]   99% percentile[usec]   99.9% percentile[usec]
Conflicting CPU frequency values detected: 2000.000000 != 2794.665000. CPU Frequency is not max.
Conflicting CPU frequency values detected: 2000.000000 != 2773.170000. CPU Frequency is not max.
 2       1000          5.51           7.34         5.57                5.67             0.04            6.04                    7.34
---------------------------------------------------------------------------------------
[root@rdma-dev-30 ~]$



client

[root@rdma-dev-31 ~]$ ib_send_lat -R 172.31.50.130      <<<==================================
---------------------------------------------------------------------------------------
                    Send Latency Test
 Dual-port       : OFF          Device         : irdma0
 Number of qps   : 1            Transport type : IB
 Connection type : RC           Using SRQ      : OFF
 PCIe relax order: ON
 ibv_wr* API     : OFF
 TX depth        : 1
 Mtu             : 4096[B]
 Link type       : Ethernet
 GID index       : 0
 Max inline data : 101[B]
 rdma_cm QPs     : ON
 Data ex. method : rdma_cm
---------------------------------------------------------------------------------------
 local address: LID 0x01 QPN 0x002c PSN 0x8cd6b8
 GID: 180:150:145:173:138:169:00:00:00:00:00:00:00:00:00:00
 remote address: LID 0x01 QPN 0x002b PSN 0x381a7c
 GID: 180:150:145:173:133:137:00:00:00:00:00:00:00:00:00:00
---------------------------------------------------------------------------------------
 #bytes #iterations    t_min[usec]    t_max[usec]  t_typical[usec]    t_avg[usec]    t_stdev[usec]   99% percentile[usec]   99.9% percentile[usec]
Conflicting CPU frequency values detected: 2000.000000 != 2792.653000. CPU Frequency is not max.
Conflicting CPU frequency values detected: 2000.000000 != 2787.568000. CPU Frequency is not max.
 2       1000          5.51           7.54         5.57                5.67             0.05            6.46                    7.54
---------------------------------------------------------------------------------------
[root@rdma-dev-31 ~]$


==================================


However, on a MLX5 IB device, "ib_send_lat" with defaults worked fine, as the following


[root@rdma-dev-19 ~]$ rpm -q perftest
perftest-23.04.0.0.23-2.el9.x86_64

[root@rdma-dev-20 ~]$ rpm -q perftest
perftest-23.04.0.0.23-2.el9.x86_64

[root@rdma-dev-19 ~]$ ib_send_lat

************************************
* Waiting for client to connect... *
************************************
---------------------------------------------------------------------------------------
                    Send Latency Test
 Dual-port       : OFF          Device         : mlx5_2
 Number of qps   : 1            Transport type : IB
 Connection type : RC           Using SRQ      : OFF
 PCIe relax order: ON
 WARNING: CPU is not PCIe relaxed ordering compliant.
 WARNING: You should disable PCIe RO with `--disable_pcie_relaxed` for both server and client.
 ibv_wr* API     : ON
 RX depth        : 512
 Mtu             : 4096[B]
 Link type       : IB
 Max inline data : 236[B]
 rdma_cm QPs     : OFF
 Data ex. method : Ethernet
---------------------------------------------------------------------------------------
 local address: LID 0x07 QPN 0x0363 PSN 0x6fe64
 remote address: LID 0x08 QPN 0x0363 PSN 0x3610a3
---------------------------------------------------------------------------------------
 #bytes #iterations    t_min[usec]    t_max[usec]  t_typical[usec]    t_avg[usec]    t_stdev[usec]   99% percentile[usec]   99.9% percentile[usec]
Conflicting CPU frequency values detected: 3501.299000 != 3600.000000. CPU Frequency is not max.
Conflicting CPU frequency values detected: 3500.088000 != 3600.000000. CPU Frequency is not max.
 2       1000          1.20           2.77         1.24                1.24             0.00            1.37                    2.77
---------------------------------------------------------------------------------------
[root@rdma-dev-19 ~]$



[root@rdma-dev-20 ~]$ ib_send_lat 172.31.0.119
---------------------------------------------------------------------------------------
                    Send Latency Test
 Dual-port       : OFF          Device         : mlx5_2
 Number of qps   : 1            Transport type : IB
 Connection type : RC           Using SRQ      : OFF
 PCIe relax order: ON
 WARNING: CPU is not PCIe relaxed ordering compliant.     
 WARNING: You should disable PCIe RO with `--disable_pcie_relaxed` for both server and client.
 ibv_wr* API     : ON
 TX depth        : 1
 Mtu             : 4096[B]
 Link type       : IB
 Max inline data : 236[B]
 rdma_cm QPs     : OFF
 Data ex. method : Ethernet
---------------------------------------------------------------------------------------
 local address: LID 0x08 QPN 0x0363 PSN 0x3610a3
 remote address: LID 0x07 QPN 0x0363 PSN 0x6fe64    
---------------------------------------------------------------------------------------
 #bytes #iterations    t_min[usec]    t_max[usec]  t_typical[usec]    t_avg[usec]    t_stdev[usec]   99% percentile[usec]   99.9% percentile[usec]
Conflicting CPU frequency values detected: 3600.000000 != 3499.673000. CPU Frequency is not max.
Conflicting CPU frequency values detected: 3600.000000 != 3499.672000. CPU Frequency is not max.
 2       1000          1.20           3.28         1.23                1.24             0.00            1.44                    3.28
---------------------------------------------------------------------------------------


====================================


Setting this bug back to the ASSIGNED state, because the test failed with default flags on the Intel device, E810 iRDMA iWARP.

Comment 2 Brian Chae 2023-07-31 18:29:34 UTC
After discussion with Kamal in DEV: the first device, irdma0, was in the DOWN state, and "ib_send_lat" with default flags selects it, which produced the errors above. Hence, with the rdma-dev-30/31 pair we cannot test the default flags. We will just go with the RDMA sanity and perftest test results.
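
A minimal sketch of that check (device names are taken from the hosts above; the grep output is abbreviated and hypothetical): verify the port state of each RDMA device before relying on the default device selection, and pass "-d" explicitly when the first device is down:

# show each device and its port state (on rdma-dev-30, irdma0 reports a Down port)
ibv_devinfo | grep -E 'hca_id|state'

# select the active device explicitly
# server:
ib_send_lat -d irdma1 -R
# client (server's IP; pick that host's active device as needed):
ib_send_lat -d <active_device> -R 172.31.50.130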


1. perftest package : perftest-23.04.0.0.23-2.el9.x86_64
2. tested HW :
   MLX5 IB0, IB1, RoCE
   E810 iRDMA
3. RDMA test suites tested : sanity, perftest
4. Results

1. E810 iRDMA iWARP

+ [23-07-30 13:00:20] rpm -q perftest
perftest-23.04.0.0.23-2.el9.x86_64

sanity test results on rdma-dev-30/rdma-dev-31 & Beaker job J:8131466:
5.14.0-345.el9.x86_64, rdma-core-46.0-1.el9, i40e, iw, E810-C & irdma1
    Result | Status | Test
  ---------+--------+------------------------------------
      PASS |      0 | ibstatus reported expected HCA rate
      PASS |      0 | vlan i810_iw.81 create/delete
      PASS |      0 | /usr/sbin/ibstatus
      PASS |      0 | systemctl start srp_daemon.service
      SKIP |    777 | ibsrpdm
      PASS |      0 | systemctl stop srp_daemon
      PASS |      0 | ping self - 172.31.50.131
      PASS |      0 | ping6 self - fe80::b696:91ff:fead:8aa9%i810_iw
      FAIL |    127 | /usr/share/pmix/test/pmix_test
      PASS |      0 | ping server - 172.31.50.130
      PASS |      0 | ping6 server - fe80::b696:91ff:fead:8589%i810_iw
      FAIL |      1 | openmpi mpitests-IMB-MPI1 PingPong              [ known issue - bz 2216042 ]
      FAIL |      1 | openmpi mpitests-IMB-IO S_Read_indv             [ known issue - bz 2216042 ]
      FAIL |      1 | openmpi mpitests-IMB-EXT Window                 [ known issue - bz 2216042 ]
      FAIL |      1 | openmpi mpitests-osu_get_bw                     [ known issue - bz 2216042 ]
      PASS |      0 | ip multicast addr
      PASS |      0 | rping
      FAIL |    255 | rcopy                                           [ known issue - bz 2176561 ] 
      PASS |      0 | ib_read_bw
      PASS |      0 | ib_send_bw
      PASS |      0 | ib_write_bw
      PASS |      0 | iser login
      PASS |      0 | mount /dev/sdb /iser
      PASS |      0 | iser write 1K
      PASS |      0 | iser write 1M
      FAIL |    124 | iser write 1G                                   [ on-off issue ]
      PASS |      0 | nfsordma mount - XFS_EXT
      PASS |      0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
      PASS |      0 | nfsordma umount - XFS_EXT
      PASS |      0 | nfsordma mount - RAMDISK
      PASS |      0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
      PASS |      0 | nfsordma umount - RAMDISK
      
perftest test results on rdma-dev-30/rdma-dev-31 & Beaker job J:8131466:
5.14.0-345.el9.x86_64, rdma-core-46.0-1.el9, i40e, iw, E810-C & irdma1 
    Result | Status | Test
  ---------+--------+------------------------------------
      PASS |      0 | ib_read_bw RC
      PASS |      0 | ib_read_lat RC
      PASS |      0 | ib_send_bw RC
      PASS |      0 | ib_send_lat RC
      PASS |      0 | ib_write_bw RC
      PASS |      0 | ib_write_lat RC
Checking for failures and known issues:
  no test failures



2. MLX5 IB0, IB1, RoCE


+ [23-07-30 13:00:20] rpm -q perftest
perftest-23.04.0.0.23-2.el9.x86_64

sanity test results on rdma-dev-19/rdma-dev-20 & Beaker job J:8131470:
5.14.0-345.el9.x86_64, rdma-core-46.0-1.el9, mlx5, ib0, ConnectX-4 & mlx5_2
    Result | Status | Test
  ---------+--------+------------------------------------
      PASS |      0 | load module mlx5_ib
      PASS |      0 | load module mlx5_core
      PASS |      0 | enable opensm
      PASS |      0 | restart opensm
      PASS |      0 | osmtest -f c -g 0x248a07030049d468
      PASS |      0 | stop opensm
      PASS |      0 | disable opensm
      FAIL |      1 | ibstatus reported expected HCA rate
      PASS |      0 | pkey mlx5_ib0.8080 create/delete
      PASS |      0 | /usr/sbin/ibstat
      PASS |      0 | /usr/sbin/ibstatus
      PASS |      0 | systemctl start srp_daemon.service
      PASS |      0 | /usr/sbin/ibsrpdm -vc
      PASS |      0 | systemctl stop srp_daemon
      PASS |      0 | ping self - 172.31.0.120
      PASS |      0 | ping6 self - fe80::268a:703:49:d468%mlx5_ib0
      FAIL |    127 | /usr/share/pmix/test/pmix_test                           [ known issue - see above ]
      PASS |      0 | ping server - 172.31.0.119
      PASS |      0 | ping6 server - fe80::268a:703:49:d338%mlx5_ib0
      FAIL |      1 | openmpi mpitests-IMB-MPI1 PingPong                       [ known issue - see above ]
      FAIL |      1 | openmpi mpitests-IMB-IO S_Read_indv                      [ known issue - see above ]
      FAIL |      1 | openmpi mpitests-IMB-EXT Window                          [ known issue - see above ]
      FAIL |      1 | openmpi mpitests-osu_get_bw                              [ known issue - see above ]
      PASS |      0 | ip multicast addr
      PASS |      0 | rping
      PASS |      0 | rcopy
      PASS |      0 | ib_read_bw
      PASS |      0 | ib_send_bw
      PASS |      0 | ib_write_bw
      PASS |      0 | iser login
      PASS |      0 | mount /dev/sdb /iser
      PASS |      0 | iser write 1K
      PASS |      0 | iser write 1M
      PASS |      0 | iser write 1G
      PASS |      0 | nfsordma mount - XFS_EXT
      PASS |      0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
      PASS |      0 | nfsordma umount - XFS_EXT
      PASS |      0 | nfsordma mount - RAMDISK
      PASS |      0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
      PASS |      0 | nfsordma umount - RAMDISK

sanity test results on rdma-dev-19/rdma-dev-20 & Beaker job J:8131470:
5.14.0-345.el9.x86_64, rdma-core-46.0-1.el9, mlx5, ib1, ConnectX-4 & mlx5_3
    Result | Status | Test
  ---------+--------+------------------------------------
      PASS |      0 | load module mlx5_ib
      PASS |      0 | load module mlx5_core
      PASS |      0 | enable opensm
      PASS |      0 | restart opensm
      PASS |      0 | osmtest -f c -g 0x248a07030049d469
      PASS |      0 | stop opensm
      PASS |      0 | disable opensm
      PASS |      0 | ibstatus reported expected HCA rate
      PASS |      0 | pkey mlx5_ib1.8080 create/delete
      PASS |      0 | /usr/sbin/ibstat
      PASS |      0 | /usr/sbin/ibstatus
      PASS |      0 | systemctl start srp_daemon.service
      PASS |      0 | /usr/sbin/ibsrpdm -vc
      PASS |      0 | systemctl stop srp_daemon
      PASS |      0 | ping self - 172.31.1.120
      PASS |      0 | ping6 self - fe80::268a:703:49:d469%mlx5_ib1
      FAIL |    127 | /usr/share/pmix/test/pmix_test
      PASS |      0 | ping server - 172.31.1.119
      PASS |      0 | ping6 server - fe80::268a:703:49:d339%mlx5_ib1
      FAIL |      1 | openmpi mpitests-IMB-MPI1 PingPong
      FAIL |      1 | openmpi mpitests-IMB-IO S_Read_indv
      FAIL |      1 | openmpi mpitests-IMB-EXT Window
      FAIL |      1 | openmpi mpitests-osu_get_bw
      PASS |      0 | ip multicast addr
      PASS |      0 | rping
      PASS |      0 | rcopy
      PASS |      0 | ib_read_bw
      PASS |      0 | ib_send_bw
      PASS |      0 | ib_write_bw
      PASS |      0 | iser login
      PASS |      0 | mount /dev/sdb /iser
      PASS |      0 | iser write 1K
      PASS |      0 | iser write 1M
      PASS |      0 | iser write 1G
      PASS |      0 | nfsordma mount - XFS_EXT
      PASS |      0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
      PASS |      0 | nfsordma umount - XFS_EXT
      PASS |      0 | nfsordma mount - RAMDISK
      PASS |      0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
      PASS |      0 | nfsordma umount - RAMDISK

sanity test results on rdma-dev-19/rdma-dev-20 & Beaker job J:8131470:
5.14.0-345.el9.x86_64, rdma-core-46.0-1.el9, mlx5, roce.45, ConnectX-4 Lx & mlx5_bond_0
    Result | Status | Test
  ---------+--------+------------------------------------
      PASS |      0 | load module mlx5_ib
      PASS |      0 | load module mlx5_core
      FAIL |      1 | ibstatus reported expected HCA rate
      PASS |      0 | /usr/sbin/ibstat
      PASS |      0 | /usr/sbin/ibstatus
      PASS |      0 | systemctl start srp_daemon.service
      SKIP |    777 | ibsrpdm
      PASS |      0 | systemctl stop srp_daemon
      PASS |      0 | ping self - 172.31.45.120
      PASS |      0 | ping6 self - fe80::7efe:90ff:fecb:762a%mlx5_team_ro.45
      FAIL |    127 | /usr/share/pmix/test/pmix_test
      PASS |      0 | ping server - 172.31.45.119
      PASS |      0 | ping6 server - fe80::7efe:90ff:fecb:743a%mlx5_team_ro.45
      FAIL |      1 | openmpi mpitests-IMB-MPI1 PingPong
      FAIL |      1 | openmpi mpitests-IMB-IO S_Read_indv
      FAIL |      1 | openmpi mpitests-IMB-EXT Window
      FAIL |      1 | openmpi mpitests-osu_get_bw
      PASS |      0 | ip multicast addr
      PASS |      0 | rping
      PASS |      0 | rcopy
      PASS |      0 | ib_read_bw
      PASS |      0 | ib_send_bw
      PASS |      0 | ib_write_bw
      PASS |      0 | iser login
      PASS |      0 | mount /dev/sdb /iser
      PASS |      0 | iser write 1K
      PASS |      0 | iser write 1M
      PASS |      0 | iser write 1G
      PASS |      0 | nfsordma mount - XFS_EXT
      PASS |      0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
      PASS |      0 | nfsordma umount - XFS_EXT
      PASS |      0 | nfsordma mount - RAMDISK
      PASS |      0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
      PASS |      0 | nfsordma umount - RAMDISK

perftest test results on rdma-dev-19/rdma-dev-20 & Beaker job J:8131470:
5.14.0-345.el9.x86_64, rdma-core-46.0-1.el9, mlx5, ib0, ConnectX-4 & mlx5_2
    Result | Status | Test
  ---------+--------+------------------------------------
      PASS |      0 | ib_atomic_bw RC
      PASS |      0 | ib_atomic_bw XRC
      PASS |      0 | ib_atomic_lat RC
      PASS |      0 | ib_atomic_lat XRC
      PASS |      0 | ib_read_bw RC
      PASS |      0 | ib_read_bw XRC
      PASS |      0 | ib_read_lat RC
      PASS |      0 | ib_read_lat XRC
      PASS |      0 | ib_send_bw RC
      PASS |      0 | ib_send_bw XRC
      PASS |      0 | ib_send_lat RC
      PASS |      0 | ib_send_lat XRC
      PASS |      0 | ib_write_bw RC
      PASS |      0 | ib_write_bw XRC
      PASS |      0 | ib_write_lat RC
      PASS |      0 | ib_write_lat XRC
Checking for failures and known issues:
  no test failures

perftest test results on rdma-dev-19/rdma-dev-20 & Beaker job J:8131470:
5.14.0-345.el9.x86_64, rdma-core-46.0-1.el9, mlx5, ib1, ConnectX-4 & mlx5_3
    Result | Status | Test
  ---------+--------+------------------------------------
      PASS |      0 | ib_atomic_bw RC
      PASS |      0 | ib_atomic_bw XRC
      PASS |      0 | ib_atomic_lat RC
      PASS |      0 | ib_atomic_lat XRC
      PASS |      0 | ib_read_bw RC
      PASS |      0 | ib_read_bw XRC
      PASS |      0 | ib_read_lat RC
      PASS |      0 | ib_read_lat XRC
      PASS |      0 | ib_send_bw RC
      PASS |      0 | ib_send_bw XRC
      PASS |      0 | ib_send_lat RC
      PASS |      0 | ib_send_lat XRC
      PASS |      0 | ib_write_bw RC
      PASS |      0 | ib_write_bw XRC
      PASS |      0 | ib_write_lat RC
      PASS |      0 | ib_write_lat XRC
Checking for failures and known issues:
  no test failures

perftest test results on rdma-dev-19/rdma-dev-20 & Beaker job J:8131470:
5.14.0-345.el9.x86_64, rdma-core-46.0-1.el9, mlx5, roce.45, ConnectX-4 Lx & mlx5_bond_0
    Result | Status | Test
  ---------+--------+------------------------------------
      PASS |      0 | ib_atomic_bw RC
      PASS |      0 | ib_atomic_lat RC
      PASS |      0 | ib_read_bw RC
      PASS |      0 | ib_read_lat RC
      PASS |      0 | ib_send_bw RC
      PASS |      0 | ib_send_lat RC
      PASS |      0 | ib_write_bw RC
      PASS |      0 | ib_write_lat RC
      PASS |      0 | raw_ethernet_bw RC
      PASS |      0 | raw_ethernet_lat RC
Checking for failures and known issues:
  no test failures


o No new issues observed in the sanity and perftest runs on the above RDMA HCAs
o Setting this bug back to the MODIFIED state
o Setting this bug with the "verified:tested" flag

Comment 5 Brian Chae 2023-08-04 21:14:43 UTC
The verification for ON_QA was done as follows:

1. build tested : RHEL-9.3.0-20230804.9
2. perftest package tested: perftest-23.04.0.0.23-2.el9.x86_64.rpm 

+ [23-08-04 16:49:13] rpm -q perftest
perftest-23.04.0.0.23-2.el9.x86_64


3. HW tested : MLX5 IB0, MLX5 IB1, MLX5 RoCE, QEDR IW, QEDR RoCE, BNXT RoCE
4. RDMA test suites tested : sanity, perftest
5. Results: only the MLX5 IB0, IB1, and RoCE test results are shown below; results for the other HW are not included in this comment but were verified as successful.

   a. sanity

sanity test results on rdma-dev-21/rdma-dev-22 & Beaker job J:8154355:
5.14.0-349.el9.x86_64, rdma-core-46.0-1.el9, mlx5, ib0, ConnectX-4 & mlx5_1
    Result | Status | Test 
  ---------+--------+------------------------------------
      PASS |      0 | load module mlx5_ib
      PASS |      0 | load module mlx5_core
      PASS |      0 | enable opensm
      PASS |      0 | restart opensm 
      PASS |      0 | osmtest -f c -g 0x248a07030049d75c 
      PASS |      0 | stop opensm
      PASS |      0 | disable opensm 
      FAIL |      1 | ibstatus reported expected HCA rate
      PASS |      0 | pkey mlx5_ib0.8080 create/delete
      PASS |      0 | /usr/sbin/ibstat
      PASS |      0 | /usr/sbin/ibstatus
      PASS |      0 | systemctl start srp_daemon.service
      PASS |      0 | /usr/sbin/ibsrpdm -vc
      PASS |      0 | systemctl stop srp_daemon
      PASS |      0 | ping self - 172.31.0.122
      PASS |      0 | ping6 self - fe80::268a:703:49:d75c%mlx5_ib0
      FAIL |    127 | /usr/share/pmix/test/pmix_test                      [ known issue - bz 2176561 ]
      PASS |      0 | ping server - 172.31.0.121
      PASS |      0 | ping6 server - fe80::268a:703:49:d4f0%mlx5_ib0
      FAIL |      1 | openmpi mpitests-IMB-MPI1 PingPong                  [ known issue - bz 2216042 ]
      FAIL |      1 | openmpi mpitests-IMB-IO S_Read_indv                 [ known issue - bz 2216042 ]
      FAIL |      1 | openmpi mpitests-IMB-EXT Window                     [ known issue - bz 2216042 ]
      FAIL |      1 | openmpi mpitests-osu_get_bw                         [ known issue - bz 2216042 ]
      PASS |      0 | ip multicast addr
      PASS |      0 | rping
      PASS |      0 | rcopy
      PASS |      0 | ib_read_bw
      PASS |      0 | ib_send_bw 
      PASS |      0 | ib_write_bw 
      PASS |      0 | iser login 
      PASS |      0 | mount /dev/sdb /iser
      PASS |      0 | iser write 1K
      PASS |      0 | iser write 1M
      PASS |      0 | iser write 1G
      PASS |      0 | nfsordma mount - XFS_EXT
      PASS |      0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
      PASS |      0 | nfsordma umount - XFS_EXT
      PASS |      0 | nfsordma mount - RAMDISK
      PASS |      0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
      PASS |      0 | nfsordma umount - RAMDISK

sanity test results on rdma-dev-21/rdma-dev-22 & Beaker job J:8154355:
5.14.0-349.el9.x86_64, rdma-core-46.0-1.el9, mlx5, ib1, ConnectX-4 & mlx5_2
    Result | Status | Test
  ---------+--------+------------------------------------
      PASS |      0 | load module mlx5_ib
      PASS |      0 | load module mlx5_core
      PASS |      0 | enable opensm
      PASS |      0 | restart opensm
      PASS |      0 | osmtest -f c -g 0x248a07030049d75d                  
      PASS |      0 | stop opensm
      PASS |      0 | disable opensm 
      PASS |      0 | ibstatus reported expected HCA rate
      PASS |      0 | pkey mlx5_ib1.8080 create/delete
      PASS |      0 | /usr/sbin/ibstat
      PASS |      0 | /usr/sbin/ibstatus
      PASS |      0 | systemctl start srp_daemon.service
      PASS |      0 | /usr/sbin/ibsrpdm -vc
      PASS |      0 | systemctl stop srp_daemon
      PASS |      0 | ping self - 172.31.1.122 
      PASS |      0 | ping6 self - fe80::268a:703:49:d75d%mlx5_ib1
      FAIL |    127 | /usr/share/pmix/test/pmix_test                      [ known issue - bz 2176561 ]
      PASS |      0 | ping server - 172.31.1.121
      PASS |      0 | ping6 server - fe80::268a:703:49:d4f1%mlx5_ib1
      FAIL |      1 | openmpi mpitests-IMB-MPI1 PingPong                  [ known issue - bz 2216042 ]
      FAIL |      1 | openmpi mpitests-IMB-IO S_Read_indv                 [ known issue - bz 2216042 ]
      FAIL |      1 | openmpi mpitests-IMB-EXT Window                     [ known issue - bz 2216042 ]
      FAIL |      1 | openmpi mpitests-osu_get_bw                         [ known issue - bz 2216042 ]
      PASS |      0 | ip multicast addr
      PASS |      0 | rping
      PASS |      0 | rcopy
      PASS |      0 | ib_read_bw 
      PASS |      0 | ib_send_bw
      PASS |      0 | ib_write_bw
      PASS |      0 | iser login
      PASS |      0 | mount /dev/sdb /iser
      PASS |      0 | iser write 1K
      PASS |      0 | iser write 1M
      PASS |      0 | iser write 1G
      PASS |      0 | nfsordma mount - XFS_EXT
      PASS |      0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
      PASS |      0 | nfsordma umount - XFS_EXT
      PASS |      0 | nfsordma mount - RAMDISK 
      PASS |      0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
      PASS |      0 | nfsordma umount - RAMDISK

sanity test results on rdma-dev-21/rdma-dev-22 & Beaker job J:8154355:
5.14.0-349.el9.x86_64, rdma-core-46.0-1.el9, mlx5, roce.45, ConnectX-4 & mlx5_0
    Result | Status | Test
  ---------+--------+------------------------------------
      PASS |      0 | load module mlx5_ib
      PASS |      0 | load module mlx5_core
      PASS |      0 | ibstatus reported expected HCA rate
      PASS |      0 | /usr/sbin/ibstat
      PASS |      0 | /usr/sbin/ibstatus
      PASS |      0 | systemctl start srp_daemon.service
      SKIP |    777 | ibsrpdm
      PASS |      0 | systemctl stop srp_daemon
      PASS |      0 | ping self - 172.31.45.122
      PASS |      0 | ping6 self - fe80::268a:7ff:fe56:b834%mlx5_roce.45
      FAIL |    127 | /usr/share/pmix/test/pmix_test                      [ known issue - bz 2176561 ]
      PASS |      0 | ping server - 172.31.45.121
      PASS |      0 | ping6 server - fe80::268a:7ff:fe4b:f094%mlx5_roce.45
      FAIL |      1 | openmpi mpitests-IMB-MPI1 PingPong                  [ known issue - bz 2216042 ]
      FAIL |      1 | openmpi mpitests-IMB-IO S_Read_indv                 [ known issue - bz 2216042 ]
      FAIL |      1 | openmpi mpitests-IMB-EXT Window                     [ known issue - bz 2216042 ]
      FAIL |      1 | openmpi mpitests-osu_get_bw                         [ known issue - bz 2216042 ]
      PASS |      0 | ip multicast addr
      PASS |      0 | rping
      PASS |      0 | rcopy
      PASS |      0 | ib_read_bw
      PASS |      0 | ib_send_bw
      PASS |      0 | ib_write_bw
      PASS |      0 | iser login
      PASS |      0 | mount /dev/sdb /iser
      PASS |      0 | iser write 1K
      PASS |      0 | iser write 1M
      PASS |      0 | iser write 1G
      PASS |      0 | nfsordma mount - XFS_EXT
      PASS |      0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
      PASS |      0 | nfsordma umount - XFS_EXT
      PASS |      0 | nfsordma mount - RAMDISK
      PASS |      0 | nfsordma - wrote [5KB, 5MB, 5GB in 1KB, 1MB, 1GB bs]
      PASS |      0 | nfsordma umount - RAMDISK


    b. perftest


perftest test results on rdma-dev-21/rdma-dev-22 & Beaker job J:8154355:
5.14.0-349.el9.x86_64, rdma-core-46.0-1.el9, mlx5, ib0, ConnectX-4 & mlx5_1
    Result | Status | Test
  ---------+--------+------------------------------------
      PASS |      0 | ib_atomic_bw RC
      PASS |      0 | ib_atomic_bw XRC
      PASS |      0 | ib_atomic_lat RC
      PASS |      0 | ib_atomic_lat XRC
      PASS |      0 | ib_read_bw RC
      PASS |      0 | ib_read_bw XRC
      PASS |      0 | ib_read_lat RC
      PASS |      0 | ib_read_lat XRC
      PASS |      0 | ib_send_bw RC
      PASS |      0 | ib_send_bw XRC
      PASS |      0 | ib_send_lat RC
      PASS |      0 | ib_send_lat XRC
      PASS |      0 | ib_write_bw RC
      PASS |      0 | ib_write_bw XRC
      PASS |      0 | ib_write_lat RC
      PASS |      0 | ib_write_lat XRC
Checking for failures and known issues:
  no test failures

perftest test results on rdma-dev-21/rdma-dev-22 & Beaker job J:8154355:
5.14.0-349.el9.x86_64, rdma-core-46.0-1.el9, mlx5, ib1, ConnectX-4 & mlx5_2
    Result | Status | Test
  ---------+--------+------------------------------------
      PASS |      0 | ib_atomic_bw RC
      PASS |      0 | ib_atomic_bw XRC
      PASS |      0 | ib_atomic_lat RC
      PASS |      0 | ib_atomic_lat XRC
      PASS |      0 | ib_read_bw RC
      PASS |      0 | ib_read_bw XRC
      PASS |      0 | ib_read_lat RC
      PASS |      0 | ib_read_lat XRC
      PASS |      0 | ib_send_bw RC
      PASS |      0 | ib_send_bw XRC
      PASS |      0 | ib_send_lat RC
      PASS |      0 | ib_send_lat XRC
      PASS |      0 | ib_write_bw RC
      PASS |      0 | ib_write_bw XRC
      PASS |      0 | ib_write_lat RC
      PASS |      0 | ib_write_lat XRC
Checking for failures and known issues:
  no test failures

perftest test results on rdma-dev-21/rdma-dev-22 & Beaker job J:8154355:
5.14.0-349.el9.x86_64, rdma-core-46.0-1.el9, mlx5, roce.45, ConnectX-4 & mlx5_0
    Result | Status | Test
  ---------+--------+------------------------------------
      PASS |      0 | ib_atomic_bw RC
      PASS |      0 | ib_atomic_lat RC
      PASS |      0 | ib_read_bw RC
      PASS |      0 | ib_read_lat RC
      PASS |      0 | ib_send_bw RC
      PASS |      0 | ib_send_lat RC
      PASS |      0 | ib_write_bw RC
      PASS |      0 | ib_write_lat RC
      PASS |      0 | raw_ethernet_bw RC
      PASS |      0 | raw_ethernet_lat RC
Checking for failures and known issues:
  no test failures


o No new issues were observed
o All sanity and perftest results were verified as successful

Setting this bug as Verified

Comment 7 errata-xmlrpc 2023-11-07 08:55:30 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (perftest bug fix and enhancement update), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2023:6669
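
A minimal sketch for picking up the fix (assuming the advisory's repositories are enabled on the host):

dnf update perftest          # pulls the rebased package from the advisory
rpm -q perftest              # expect perftest-23.04.0.0.23-2.el9 or newer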

