Bug 1918115 - glibc: FMA4 math routines are not selected properly after bug 1817513
Summary: glibc: FMA4 math routines are not selected properly after bug 1817513
Keywords:
Status: CLOSED ERRATA
Alias: None
Deadline: 2021-01-19
Product: Red Hat Enterprise Linux 8
Classification: Red Hat
Component: glibc
Version: CentOS Stream
Hardware: x86_64
OS: Linux
Priority: high
Severity: medium
Target Milestone: alpha
Target Release: 8.4
Assignee: Florian Weimer
QA Contact: Sergey Kolosov
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2021-01-20 05:36 UTC by P. Shesh Murthy
Modified: 2023-07-18 14:30 UTC
CC List: 13 users

Fixed In Version: glibc-2.28-147.el8
Doc Type: No Doc Update
Doc Text:
Clone Of:
Environment:
Last Closed: 2021-05-18 14:36:50 UTC
Type: Bug
Target Upstream Version:
Embargoed:


Attachments


Links
System ID Private Priority Status Summary Last Updated
Sourceware 26534 0 P2 RESOLVED libm.so 2.32 SIGILL in pow() due to FMA4 instruction on non-FMA4 system 2021-02-15 10:56:56 UTC

Description P. Shesh Murthy 2021-01-20 05:36:18 UTC
Description of problem:
Yum crashes after glibc update to version 2.28-145.
[root@localhost ~]# yum search python
Illegal instruction (core dumped)
[root@localhost ~]# dmesg | tail -5
[  726.087955] traps: yum[1696] trap invalid opcode ip:7f390bbfa59f sp:7ffee22d19a0 error:0 in libm-2.28.so[7f390bb72000+181000]
[  773.118330] traps: yum[1730] trap invalid opcode ip:7f86d6a8c59f sp:7ffc07700320 error:0 in libm-2.28.so[7f86d6a04000+181000]
[  839.974365] traps: yum[1744] trap invalid opcode ip:7f71ee5fc59f sp:7ffd948b36a0 error:0 in libm-2.28.so[7f71ee574000+181000]
[  870.615860] traps: yum[1755] trap invalid opcode ip:7fe4ea5d559f sp:7ffe5ac21760 error:0 in libm-2.28.so[7fe4ea54d000+181000]
[  899.107078] traps: yum[1767] trap invalid opcode ip:7f0465e5859f sp:7ffdfbacafe0 error:0 in libm-2.28.so[7f0465dd0000+181000]
[root@localhost ~]#


Version-Release number of selected component (if applicable):
glibc-2.28-145.el8.x86_64
glibc-2.28-145.el8.i686

How reproducible:

The problem is seen on some processors. The following is the processor info from the problem machine:
/proc/cpuinfo

processor       : 0
vendor_id       : GenuineIntel
cpu family      : 6
model           : 142
model name      : Intel(R) Core(TM) i7-8650U CPU @ 1.90GHz
stepping        : 10
microcode       : 0xb4
cpu MHz         : 2112.002
cache size      : 8192 KB
physical id     : 0
siblings        : 1
core id         : 0
cpu cores       : 1
apicid          : 0
initial apicid  : 0
fpu             : yes
fpu_exception   : yes
cpuid level     : 22
wp              : yes
flags           : fpu vme de pse tsc msr pae mce cx8 apic sep mtrr pge mca cmov pat pse36 clflush mmx fxsr sse sse2 ss syscall nx rdtscp lm constant_tsc arch_perfmon nopl tsc_reliable nonstop_tsc cpuid pni pclmulqdq ssse3 fma cx16 sse4_1 sse4_2 movbe popcnt aes xsave avx hypervisor lahf_lm 3dnowprefetch pti arat
bugs            : cpu_meltdown spectre_v1 spectre_v2 spec_store_bypass l1tf mds swapgs itlb_multihit
bogomips        : 4224.00
clflush size    : 64
cache_alignment : 64
address sizes   : 40 bits physical, 48 bits virtual
power management:


Steps to Reproduce:
1. Update to glibc 2.28-145
2. Run any yum command
Actual results:
Any yum command crashes immediately with "Illegal instruction (core dumped)".

Expected results:
yum commands complete normally.

Additional info:

Comment 1 Florian Weimer 2021-01-20 07:20:15 UTC
The CPU flags do not match the CPU model, so I assume that virtualization is involved. What is your hypervisor, and how have you configured it?

Would you please check whether systemd-coredump has obtained a core dump, and try to get a backtrace and disassembly using “coredumpctl gdb”?

Thanks.

Comment 2 P. Shesh Murthy 2021-01-20 08:17:44 UTC
1. Yes, this is a virtual machine running on a Windows host using VMware Workstation (15.x). Are there any specific settings that you are interested in?

2. Following is the trace obtained with "coredumpctl gdb"


[root@localhost sysctl.d]# coredumpctl gdb
           PID: 10417 (yum)
           UID: 0 (root)
           GID: 0 (root)
        Signal: 4 (ILL)
     Timestamp: Wed 2021-01-20 19:09:55 IST (1min 45s ago)
  Command Line: /usr/libexec/platform-python /usr/bin/yum search python
    Executable: /usr/libexec/platform-python3.6
 Control Group: /user.slice/user-0.slice/session-1.scope
          Unit: session-1.scope
         Slice: user-0.slice
       Session: 1
     Owner UID: 0 (root)
       Boot ID: 4c95db69a0a34fe591a4ffb589bac207
    Machine ID: cb9c7789d294492691e347bdbd1c181e
      Hostname: localhost.localdomain
       Storage: /var/lib/systemd/coredump/core.yum.0.4c95db69a0a34fe591a4ffb589bac207.10417.1611149995000000.lz4
       Message: Process 10417 (yum) of user 0 dumped core.

                Stack trace of thread 10417:
                #0  0x00007fe6c05b359f __exp1_fma4 (libm.so.6)
                #1  0x00007fe6c05b4384 __ieee754_pow_fma4.localalias.0 (libm.so.6)
                #2  0x00007fe6c053a408 powf32x (libm.so.6)
                #3  0x00007fe6c104e3f8 float_pow (libpython3.6m.so.1.0)
                #4  0x00007fe6c102fb3c long_pow (libpython3.6m.so.1.0)
                #5  0x00007fe6c100b2f3 ternary_op.isra.39 (libpython3.6m.so.1.0)
                #6  0x00007fe6c10749a7 _PyEval_EvalFrameDefault (libpython3.6m.so.1.0)
                #7  0x00007fe6c0fcef14 _PyEval_EvalCodeWithName (libpython3.6m.so.1.0)
                #8  0x00007fe6c0fd02b3 PyEval_EvalCode (libpython3.6m.so.1.0)
                #9  0x00007fe6c10e0f20 builtin_exec (libpython3.6m.so.1.0)
                #10 0x00007fe6c106b502 PyCFunction_Call (libpython3.6m.so.1.0)
                #11 0x00007fe6c1078bfe _PyEval_EvalFrameDefault (libpython3.6m.so.1.0)
                #12 0x00007fe6c0fcef14 _PyEval_EvalCodeWithName (libpython3.6m.so.1.0)
                #13 0x00007fe6c104f4f0 fast_function (libpython3.6m.so.1.0)
                #14 0x00007fe6c1072847 call_function (libpython3.6m.so.1.0)
                #15 0x00007fe6c1073488 _PyEval_EvalFrameDefault (libpython3.6m.so.1.0)
                #16 0x00007fe6c104f308 fast_function (libpython3.6m.so.1.0)
                #17 0x00007fe6c1072847 call_function (libpython3.6m.so.1.0)
                #18 0x00007fe6c1073488 _PyEval_EvalFrameDefault (libpython3.6m.so.1.0)
                #19 0x00007fe6c104f308 fast_function (libpython3.6m.so.1.0)
                #20 0x00007fe6c1072847 call_function (libpython3.6m.so.1.0)
                #21 0x00007fe6c1073488 _PyEval_EvalFrameDefault (libpython3.6m.so.1.0)
                #22 0x00007fe6c104f308 fast_function (libpython3.6m.so.1.0)
                #23 0x00007fe6c1072847 call_function (libpython3.6m.so.1.0)
                #24 0x00007fe6c1073488 _PyEval_EvalFrameDefault (libpython3.6m.so.1.0)
                #25 0x00007fe6c0fd0422 _PyFunction_FastCallDict (libpython3.6m.so.1.0)
                #26 0x00007fe6c0fd11fe _PyObject_FastCallDict (libpython3.6m.so.1.0)
                #27 0x00007fe6c10b5a7e _PyObject_CallMethodIdObjArgs (libpython3.6m.so.1.0)
                #28 0x00007fe6c0fd18a4 PyImport_ImportModuleLevelObject (libpython3.6m.so.1.0)
                #29 0x00007fe6c1077219 _PyEval_EvalFrameDefault (libpython3.6m.so.1.0)
                #30 0x00007fe6c0fcef14 _PyEval_EvalCodeWithName (libpython3.6m.so.1.0)
                #31 0x00007fe6c0fd02b3 PyEval_EvalCode (libpython3.6m.so.1.0)
                #32 0x00007fe6c10e0f20 builtin_exec (libpython3.6m.so.1.0)
                #33 0x00007fe6c106b502 PyCFunction_Call (libpython3.6m.so.1.0)
                #34 0x00007fe6c1078bfe _PyEval_EvalFrameDefault (libpython3.6m.so.1.0)
                #35 0x00007fe6c0fcef14 _PyEval_EvalCodeWithName (libpython3.6m.so.1.0)
                #36 0x00007fe6c104f4f0 fast_function (libpython3.6m.so.1.0)
                #37 0x00007fe6c1072847 call_function (libpython3.6m.so.1.0)
                #38 0x00007fe6c1073488 _PyEval_EvalFrameDefault (libpython3.6m.so.1.0)
                #39 0x00007fe6c104f308 fast_function (libpython3.6m.so.1.0)
                #40 0x00007fe6c1072847 call_function (libpython3.6m.so.1.0)
                #41 0x00007fe6c1073488 _PyEval_EvalFrameDefault (libpython3.6m.so.1.0)
                #42 0x00007fe6c104f308 fast_function (libpython3.6m.so.1.0)
                #43 0x00007fe6c1072847 call_function (libpython3.6m.so.1.0)
                #44 0x00007fe6c1073488 _PyEval_EvalFrameDefault (libpython3.6m.so.1.0)
                #45 0x00007fe6c104f308 fast_function (libpython3.6m.so.1.0)
                #46 0x00007fe6c1072847 call_function (libpython3.6m.so.1.0)
                #47 0x00007fe6c1073488 _PyEval_EvalFrameDefault (libpython3.6m.so.1.0)
                #48 0x00007fe6c0fd0422 _PyFunction_FastCallDict (libpython3.6m.so.1.0)
                #49 0x00007fe6c0fd11fe _PyObject_FastCallDict (libpython3.6m.so.1.0)
                #50 0x00007fe6c10b5a7e _PyObject_CallMethodIdObjArgs (libpython3.6m.so.1.0)
                #51 0x00007fe6c0fd18a4 PyImport_ImportModuleLevelObject (libpython3.6m.so.1.0)
                #52 0x00007fe6c1077219 _PyEval_EvalFrameDefault (libpython3.6m.so.1.0)
                #53 0x00007fe6c0fcef14 _PyEval_EvalCodeWithName (libpython3.6m.so.1.0)
                #54 0x00007fe6c0fd02b3 PyEval_EvalCode (libpython3.6m.so.1.0)
                #55 0x00007fe6c10e0f20 builtin_exec (libpython3.6m.so.1.0)
                #56 0x00007fe6c106b502 PyCFunction_Call (libpython3.6m.so.1.0)
                #57 0x00007fe6c1078bfe _PyEval_EvalFrameDefault (libpython3.6m.so.1.0)
                #58 0x00007fe6c0fcef14 _PyEval_EvalCodeWithName (libpython3.6m.so.1.0)
                #59 0x00007fe6c104f4f0 fast_function (libpython3.6m.so.1.0)
                #60 0x00007fe6c1072847 call_function (libpython3.6m.so.1.0)
                #61 0x00007fe6c1073488 _PyEval_EvalFrameDefault (libpython3.6m.so.1.0)
                #62 0x00007fe6c104f308 fast_function (libpython3.6m.so.1.0)
                #63 0x00007fe6c1072847 call_function (libpython3.6m.so.1.0)

GNU gdb (GDB) Red Hat Enterprise Linux 8.2-15.el8
Copyright (C) 2018 Free Software Foundation, Inc.
License GPLv3+: GNU GPL version 3 or later <http://gnu.org/licenses/gpl.html>
This is free software: you are free to change and redistribute it.
There is NO WARRANTY, to the extent permitted by law.
Type "show copying" and "show warranty" for details.
This GDB was configured as "x86_64-redhat-linux-gnu".
Type "show configuration" for configuration details.
For bug reporting instructions, please see:
<http://www.gnu.org/software/gdb/bugs/>.
Find the GDB manual and other documentation resources online at:
    <http://www.gnu.org/software/gdb/documentation/>.

For help, type "help".
Type "apropos word" to search for commands related to "word"...
Reading symbols from /usr/libexec/platform-python3.6...Reading symbols from .gnu_debugdata for /usr/libexec/platform-python3.6...(no debugging symbols found)...done.
(no debugging symbols found)...done.

warning: core file may not match specified executable file.
[New LWP 10417]
[Thread debugging using libthread_db enabled]
Using host libthread_db library "/lib64/libthread_db.so.1".
Core was generated by `/usr/libexec/platform-python /usr/bin/yum search python'.
Program terminated with signal SIGILL, Illegal instruction.
#0  0x00007fe6c05b359f in __exp1_fma4 () from /lib64/libm.so.6
Missing separate debuginfos, use: yum debuginfo-install platform-python-3.6.8-33.el8.x86_64
(gdb)

Comment 3 Florian Weimer 2021-01-20 08:33:23 UTC
Thanks, the backtrace was illuminating. We need to backport this upstream commit:

commit 23af890b3f04e80da783ba64e6b6d94822e01d54
Author: Ondřej Hošek <ondra.hosek>
Date:   Wed Aug 26 04:26:50 2020 +0200

    x86-64: Fix FMA4 detection in ifunc [BZ #26534]
    
    A typo in commit 107e6a3c2212ba7a3a4ec7cae8d82d73f7c95d0b causes the
    FMA4 code path to be taken on systems that support FMA, even if they do
    not support FMA4. Fix this to detect FMA4.
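
To illustrate the mechanism for readers following along: libm binds its optimized math routines at load time through ifunc resolvers that test CPU feature bits, and the typo made the resolver test the FMA bit where the FMA4 bit should be tested. The program below is only an editorial sketch, not the glibc sources; it ignores glibc's additional "usable" checks, and the "__ieee754_pow_fma4" / "generic pow" strings are just labels borrowed from the backtrace above. It shows how testing the wrong CPUID bit selects the FMA4 path on a CPU that has FMA but no FMA4, like the one in this report.

/* Illustrative sketch only -- not the glibc ifunc code.
   FMA is CPUID leaf 1, ECX bit 12; FMA4 is leaf 0x80000001, ECX bit 16.
   Build with: gcc -O2 fma4-select.c -o fma4-select */
#include <cpuid.h>   /* __get_cpuid, bit_FMA, bit_FMA4 */
#include <stdio.h>

static int
has_fma (void)
{
  unsigned int eax, ebx, ecx, edx;
  if (!__get_cpuid (1, &eax, &ebx, &ecx, &edx))
    return 0;
  return (ecx & bit_FMA) != 0;
}

static int
has_fma4 (void)
{
  unsigned int eax, ebx, ecx, edx;
  if (!__get_cpuid (0x80000001, &eax, &ebx, &ecx, &edx))
    return 0;
  return (ecx & bit_FMA4) != 0;
}

int
main (void)
{
  /* Buggy selection: tests FMA, so an Intel CPU with FMA but no FMA4
     is sent down the FMA4 code path and later hits SIGILL.  */
  const char *buggy = has_fma ()  ? "__ieee754_pow_fma4" : "generic pow";
  /* Fixed selection (upstream BZ #26534): tests FMA4 itself.  */
  const char *fixed = has_fma4 () ? "__ieee754_pow_fma4" : "generic pow";

  printf ("FMA=%d FMA4=%d\n", has_fma (), has_fma4 ());
  printf ("buggy selector picks: %s\n", buggy);
  printf ("fixed selector picks: %s\n", fixed);
  return 0;
}

On a CPU like the reporter's (FMA advertised, FMA4 absent), the buggy check sends pow() down the FMA4 path while the fixed check does not.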

Comment 6 P. Shesh Murthy 2021-01-22 04:23:41 UTC
Hi

Is there a tentative date for the new package to be published in the CentOS Stream repository?

Thanks
#psm

Comment 7 Florian Weimer 2021-01-23 14:11:53 UTC
(In reply to P. Shesh Murthy from comment #6)
> Is there a tentative date for the new package to get published on Centos
> Stream repository?

I expect an update of CentOS Stream in mid-February, as the new build has to pass through QE first. I have uploaded a repository with a hotfix build:

http://people.redhat.com/~fweimer/HorTWjurJ7Qk/glibc-2.28-146.el8.0.bz1918115.0/

As a workaround, you can select a different machine model in the hypervisor. Anything that advertises AVX2 to the guest should work. As a side effect, you should also get better performance from your VMs. A machine model that does not indicate support for FMA will work, too.

The trigger for this bug is that the CPUID information in the guest has the FMA bit set, but the virtualized CPU lacks support for XSAVE, so that the kernel cannot enable it. This case should still be handled correctly in the sense that there are no crashes, of course, but it also should not happen in practice (neither on bare metal, nor with properly configured hypervisor machine models).
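
To make that inconsistency concrete, here is a small standalone check. It is an editorial illustration, not glibc's internal logic: FMA/AVX instructions are only safe to execute when the CPU reports OSXSAVE and the OS has enabled the XMM/YMM state in XCR0, so a guest that advertises FMA without working XSAVE support is the misconfiguration described above.

/* Sketch (not glibc code) of the "advertised vs. usable" distinction.
   Build with: gcc -O2 avx-usable.c -o avx-usable */
#include <cpuid.h>   /* __get_cpuid, bit_FMA, bit_XSAVE, bit_OSXSAVE, bit_AVX */
#include <stdint.h>
#include <stdio.h>

static uint64_t
read_xcr0 (void)
{
  uint32_t lo, hi;
  /* xgetbv with ECX=0 returns XCR0; only valid when OSXSAVE is set.  */
  __asm__ ("xgetbv" : "=a" (lo), "=d" (hi) : "c" (0));
  return ((uint64_t) hi << 32) | lo;
}

int
main (void)
{
  unsigned int eax, ebx, ecx, edx;
  if (!__get_cpuid (1, &eax, &ebx, &ecx, &edx))
    return 1;

  int fma     = (ecx & bit_FMA) != 0;
  int xsave   = (ecx & bit_XSAVE) != 0;
  int osxsave = (ecx & bit_OSXSAVE) != 0;
  int avx     = (ecx & bit_AVX) != 0;

  /* The XMM (bit 1) and YMM (bit 2) state components in XCR0 must be
     enabled by the OS before AVX/FMA instructions can run without #UD.  */
  int ymm_enabled = osxsave && ((read_xcr0 () & 0x6) == 0x6);

  printf ("FMA=%d XSAVE=%d OSXSAVE=%d AVX=%d YMM-state-enabled=%d\n",
          fma, xsave, osxsave, avx, ymm_enabled);

  if (fma && !ymm_enabled)
    puts ("inconsistent guest CPUID: FMA advertised but AVX state not enabled");
  return 0;
}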

Comment 8 P. Shesh Murthy 2021-01-23 15:01:33 UTC
Thanks for the information.

My VMware machine was running in ESX 5.0 compatibility mode. Once I updated the machine to ESX 6.0 compatibility, the avx2 flag appears in /proc/cpuinfo and the problem is no longer observed.

Comment 14 errata-xmlrpc 2021-05-18 14:36:50 UTC
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA.

For information on the advisory (Moderate: glibc security, bug fix, and enhancement update), and where to find the updated files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHSA-2021:1585

