Bug 1878430

Summary: [machines] Uncaught TypeError: Cannot read property 'getAttribute' of undefined
Product: Red Hat Enterprise Linux 8
Component: cockpit-appstream
Version: 8.2
Hardware: All
OS: Linux
Status: CLOSED WORKSFORME
Severity: high
Priority: high
Reporter: Greg Scott <gscott>
Assignee: Simon Kobyda <skobyda>
QA Contact: Release Test Team <release-test-team-automation>
CC: kkoukiou, skobyda, wshi, xchen, ymao, yunyang
Target Milestone: rc
Type: Bug
Doc Type: If docs needed, set a value
Last Closed: 2021-08-30 12:04:19 UTC

Attachments:
These are the JavaScript errors I captured during a Cockpit session trying to look at virtual machines. (Flags: none)

Description Greg Scott 2020-09-12 20:33:51 UTC
Created attachment 1714653 [details]
These are the JavaScript errors I captured during a Cockpit session trying to look at virtual machines.

Description of problem:
After using leapp to upgrade from RHEL 7.8 to RHEL 8.2, and following the instructions to add virtual machine management in 

https://access.redhat.com/documentation/en-us/red_hat_enterprise_linux/8/html/configuring_and_managing_virtualization/managing-virtual-machines-in-the-web-console_configuring-and-managing-virtualization#setting-up-the-rhel-8-web-console-to-manage-vms_managing-virtual-machines-in-the-web-console

Cockpit sometimes does not show any VMs and always generates a JavaScript "Ooops" error.

Second problem: on the same leapp-upgraded system, Cockpit still shows a service warning even after the failed services that caused the warning have been cleaned up.

Version-Release number of selected component (if applicable):
RHEL 8.2

How reproducible:
At will

Steps to Reproduce:
1. Add a few libvirt KVM VMs to a RHEL 7 system.
2. Use leapp to upgrade the RHEL 7 system to RHEL 8.2.
3. Follow the directions in chapter 5 of the Configuring and Managing Virtualization Guide.
4. Launch Cockpit on the newly upgraded machine.

Actual results:

JavaScript "Ooops" errors appear, and existing VMs are treated inconsistently when the display is refreshed.

After cleaning up any failed services, the service warning does not go away.


Expected results:

The Virtual Machines section should show existing virtual machines and should not generate JavaScript "Ooops" errors.

Additional info:

At first, I thought the Cockpit problem might be caused by the VMs that were on this system prior to the upgrade, so I built a new VM after the upgrade. Cockpit treats the new VM the same as the pre-upgrade VMs.

See the attachment for the JavaScript errors I captured.

I also demonstrated the problems with a video. See
https://www.dgregscott.com/cockpitbz/cockpitbz2020-09-12.mp4

I tried restarting several slices, sockets, and services. None made a difference. That system is also an NFS server, so I will reboot it as soon as I can move some key components off it. I'll post in a comment whether a reboot clears the problem.

Comment 1 Greg Scott 2020-09-12 22:50:56 UTC
A reboot cured the service warning.

The reboot did not cure the Cockpit JavaScript "Ooops" problem with virtual machines.

But the reboot helped gather more data. After the leapp upgrade, networking was a mess, with some old configuration and some new NetworkManager configuration. Bridge br0 was left over from RHEL 7 and stopped working after the reboot. I got rid of it and used Cockpit to set up a NetworkManager-style bridge named bridge0. That brought networking back to life.

When I ran Cockpit on the bare-metal console and connected to https://127.0.0.1:9090, the "Ooops" errors were still there, but I could see all VMs.

But when I ran Cockpit from another system and connected over the LAN to https://10.10.10.3:9090, the bad behavior came back: sometimes it shows the VMs, sometimes not.
I started one VM. It's a DHCP client; it acquired an IP address, and all networking from inside the VM seems fine.

Here are more Cockpit errors.

****************************************************

index.js:15 Refused to apply inline style because it violates the following Content Security Policy directive: "default-src 'self' https://10.10.10.3:9090". Either the 'unsafe-inline' keyword, a hash ('sha256-47DEQpj8HBSa+/TImW+5JCeuQeRkm5NMpJWZG3hSuFU='), or a nonce ('nonce-...') is required to enable inline execution. Note also that 'style-src' was not explicitly set, so 'default-src' is used as a fallback.

h @ index.js:15
machines.js:92 Refused to apply inline style because it violates the following Content Security Policy directive: "default-src 'self' https://10.10.10.3:9090". Either the 'unsafe-inline' keyword, a hash ('sha256-47DEQpj8HBSa+/TImW+5JCeuQeRkm5NMpJWZG3hSuFU='), or a nonce ('nonce-...') is required to enable inline execution. Note also that 'style-src' was not explicitly set, so 'default-src' is used as a fallback.

p @ machines.js:92
overview.js:15 Refused to apply inline style because it violates the following Content Security Policy directive: "default-src 'self' https://10.10.10.3:9090". Either the 'unsafe-inline' keyword, a hash ('sha256-47DEQpj8HBSa+/TImW+5JCeuQeRkm5NMpJWZG3hSuFU='), or a nonce ('nonce-...') is required to enable inline execution. Note also that 'style-src' was not explicitly set, so 'default-src' is used as a fallback.

h @ overview.js:15
updates.js:11 Refused to apply inline style because it violates the following Content Security Policy directive: "default-src 'self' https://10.10.10.3:9090". Either the 'unsafe-inline' keyword, a hash ('sha256-47DEQpj8HBSa+/TImW+5JCeuQeRkm5NMpJWZG3hSuFU='), or a nonce ('nonce-...') is required to enable inline execution. Note also that 'style-src' was not explicitly set, so 'default-src' is used as a fallback.

m @ updates.js:11
services.js:19 Refused to apply inline style because it violates the following Content Security Policy directive: "default-src 'self' https://10.10.10.3:9090". Either the 'unsafe-inline' keyword, a hash ('sha256-47DEQpj8HBSa+/TImW+5JCeuQeRkm5NMpJWZG3hSuFU='), or a nonce ('nonce-...') is required to enable inline execution. Note also that 'style-src' was not explicitly set, so 'default-src' is used as a fallback.

m @ services.js:19
cockpit.js:606 grep: /sys/class/dmi/id/power/autosuspend_delay_ms: Input/output error

p @ cockpit.js:606
machines.js:129 index.js: Setting LibvirtDBus as virt provider.
machines.js:124 initResource action failed: {"problem":null,"name":"org.freedesktop.DBus.Error.UnknownMethod","message":"Introspection data for method org.libvirt.Connect ListInterfaces not available"}
(anonymous) @ machines.js:124
machines.js:124 getAllInterfaces action failed: {"problem":null,"name":"org.freedesktop.DBus.Error.UnknownMethod","message":"Introspection data for method org.libvirt.Connect ListInterfaces not available"}
(anonymous) @ machines.js:124
machines.js:124 Uncaught TypeError: Cannot read property 'getAttribute' of undefined
    at machines.js:124
    at Sa (machines.js:124)
    at Function.<anonymous> (machines.js:124)
    at s (cockpit.js:979)
    at cockpit.js:991
    at n (cockpit.js:897)
cockpit.js:606 grep: /etc/dnf/automatic.conf: No such file or directory

p @ cockpit.js:606
2cockpit.js:606 usage: virt-install --name NAME --memory MB STORAGE INSTALL [options]
virt-install: error: argument --install: expected one argument

p @ cockpit.js:606
DevTools failed to load SourceMap: Could not load content for https://10.10.10.3:9090/cockpit/$c2c1b9269f8ceb0d31973053ea81df57c9384dadf1e6cb8273fd2802aa184f99/base1/cockpit.min.js.map: HTTP error: status code 404, net::ERR_HTTP_RESPONSE_CODE_FAILURE
DevTools failed to load SourceMap: Could not load content for https://10.10.10.3:9090/cockpit/$c2c1b9269f8ceb0d31973053ea81df57c9384dadf1e6cb8273fd2802aa184f99/base1/jquery.min.js.map: HTTP error: status code 404, net::ERR_HTTP_RESPONSE_CODE_FAILURE
DevTools failed to load SourceMap: Could not load content for https://10.10.10.3:9090/cockpit/$c2c1b9269f8ceb0d31973053ea81df57c9384dadf1e6cb8273fd2802aa184f99/shell/index.min.js.map: HTTP error: status code 404, net::ERR_HTTP_RESPONSE_CODE_FAILURE
DevTools failed to load SourceMap: Could not load content for https://10.10.10.3:9090/cockpit/$c2c1b9269f8ceb0d31973053ea81df57c9384dadf1e6cb8273fd2802aa184f99/machines/machines.min.js.map: HTTP error: status code 404, net::ERR_HTTP_RESPONSE_CODE_FAILURE
DevTools failed to load SourceMap: Could not load content for https://10.10.10.3:9090/cockpit/$c2c1b9269f8ceb0d31973053ea81df57c9384dadf1e6cb8273fd2802aa184f99/system/overview.min.js.map: HTTP error: status code 404, net::ERR_HTTP_RESPONSE_CODE_FAILURE
DevTools failed to load SourceMap: Could not load content for https://10.10.10.3:9090/cockpit/$c2c1b9269f8ceb0d31973053ea81df57c9384dadf1e6cb8273fd2802aa184f99/updates/updates.min.js.map: Certificate error: net::ERR_CERT_AUTHORITY_INVALID
DevTools failed to load SourceMap: Could not load content for https://10.10.10.3:9090/cockpit/$c2c1b9269f8ceb0d31973053ea81df57c9384dadf1e6cb8273fd2802aa184f99/performance/performance.min.js.map: Certificate error: net::ERR_CERT_AUTHORITY_INVALID
DevTools failed to load SourceMap: Could not load content for https://10.10.10.3:9090/cockpit/$c2c1b9269f8ceb0d31973053ea81df57c9384dadf1e6cb8273fd2802aa184f99/system/services.min.js.map: Certificate error: net::ERR_CERT_AUTHORITY_INVALID
DevTools failed to load SourceMap: Could not load content for https://10.10.10.3:9090/cockpit/$c2c1b9269f8ceb0d31973053ea81df57c9384dadf1e6cb8273fd2802aa184f99/base1/patternfly.min.css.map: Certificate error: net::ERR_CERT_AUTHORITY_INVALID
DevTools failed to load SourceMap: Could not load content for https://10.10.10.3:9090/cockpit/$c2c1b9269f8ceb0d31973053ea81df57c9384dadf1e6cb8273fd2802aa184f99/shell/index.css.map: Certificate error: net::ERR_CERT_AUTHORITY_INVALID
DevTools failed to load SourceMap: Could not load content for https://10.10.10.3:9090/cockpit/$c2c1b9269f8ceb0d31973053ea81df57c9384dadf1e6cb8273fd2802aa184f99/system/overview.css.map: Certificate error: net::ERR_CERT_AUTHORITY_INVALID
DevTools failed to load SourceMap: Could not load content for https://10.10.10.3:9090/cockpit/$c2c1b9269f8ceb0d31973053ea81df57c9384dadf1e6cb8273fd2802aa184f99/system/services.css.map: Certificate error: net::ERR_CERT_AUTHORITY_INVALID
DevTools failed to load SourceMap: Could not load content for https://10.10.10.3:9090/cockpit/$c2c1b9269f8ceb0d31973053ea81df57c9384dadf1e6cb8273fd2802aa184f99/updates/updates.css.map: Certificate error: net::ERR_CERT_AUTHORITY_INVALID
DevTools failed to load SourceMap: Could not load content for https://10.10.10.3:9090/cockpit/$c2c1b9269f8ceb0d31973053ea81df57c9384dadf1e6cb8273fd2802aa184f99/machines/machines.css.map: Certificate error: net::ERR_CERT_AUTHORITY_INVALID

Comment 2 Martin Pitt 2020-09-14 06:49:14 UTC
Hello Greg, thanks for the report! I'm devoting this Bugzilla ticket to the Virtual Machines crash. For the Services page issue, if it happens again, can you please check whether the failed service occurs further down in the list? I tried clearing a failed service with the current Cockpit version in RHEL 8.3, and that worked fine. (I don't currently have an 8.2 system at hand.)

The VM crash looks like libvirt-dbus is not fully available. As you were logging in as root, it's likely not a permission problem. Could you please log in with ssh or use Cockpit's Terminal, and see whether this command works?

    busctl call org.libvirt /org/libvirt/QEMU org.libvirt.Connect ListDomains u 0

It should show you a list of domains; it doesn't otherwise modify your system.

@Simon: There seems to be missing error handling after "initResource action failed"; it looks like the code then tries to parse undefined XML.

Comment 3 Greg Scott 2020-09-14 14:12:42 UTC
From an ssh session logged in as root:

[root@storage2015 ~]#
[root@storage2015 ~]# busctl call org.libvirt /org/libvirt/QEMU org.libvirt.Connect ListDomains u 0
ao 4 "/org/libvirt/QEMU/domain/_76b7598f_f0a5_4beb_928c_18e620130b75" "/org/libvirt/QEMU/domain/_72c97620_88c0_4f93_83e2_5ca2441da6c7" "/org/libvirt/QEMU/domain/_7e4015ef_d28b_4e7b_9a7e_f25284029c8d" "/org/libvirt/QEMU/domain/_3757175d_8f92_409e_a598_18b88c1570b7"
[root@storage2015 ~]#
[root@storage2015 ~]#
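For what it's worth, "ao 4" in that reply is busctl's type signature plus element count: an array of four object paths, one per libvirt domain, so libvirt-dbus itself can enumerate the domains fine. As a quick sanity check, the domain count can be pulled straight out of the captured reply text (this is just shell over the pasted output, not another system command):

```shell
# The busctl reply captured above, verbatim. "ao 4" means an array of
# object paths with four elements (one per libvirt domain).
reply='ao 4 "/org/libvirt/QEMU/domain/_76b7598f_f0a5_4beb_928c_18e620130b75" "/org/libvirt/QEMU/domain/_72c97620_88c0_4f93_83e2_5ca2441da6c7" "/org/libvirt/QEMU/domain/_7e4015ef_d28b_4e7b_9a7e_f25284029c8d" "/org/libvirt/QEMU/domain/_3757175d_8f92_409e_a598_18b88c1570b7"'

# Count the quoted domain object paths in the reply; prints 4,
# matching the element count busctl reported.
echo "$reply" | grep -o '"/org/libvirt/QEMU/domain/[^"]*"' | wc -l
```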

Comment 4 Greg Scott 2020-09-14 14:21:17 UTC
One more note: I have another RHEL 8.2 system built from scratch, and its Cockpit Virtual Machines section is error-free. The only difference between the two is that I built the error-free system from scratch and upgraded the error-prone one from RHEL 7.8 with leapp.

- Greg

Comment 5 Greg Scott 2020-09-16 15:56:29 UTC
In the meantime, how do I get rid of the problem? Would removing and reinstalling cockpit-machines make a difference? Was the busctl output I posted above useful?

- Greg

Comment 6 Greg Scott 2020-09-18 19:05:07 UTC
Comparing the good system (built from bare metal) with the bad system (upgraded with leapp), yum list installed | grep cockpit shows cockpit-podman installed on the good system but not on the bad one. The bad system has cockpit-leapp, which is not on the good system.

Good system:
[root@overflow images]# yum list installed | grep cockpit
cockpit.x86_64                                     211.3-1.el8                                    @anaconda
cockpit-bridge.x86_64                              211.3-1.el8                                    @anaconda
cockpit-machines.noarch                            211.3-1.el8                                    @rhel-8-for-x86_64-appstream-rpms
cockpit-packagekit.noarch                          211.3-1.el8                                    @AppStream
cockpit-podman.noarch                              17-1.module+el8.2.1+6636+bf4db4ab              @rhel-8-for-x86_64-appstream-rpms
cockpit-storaged.noarch                            211.3-1.el8                                    @AppStream
cockpit-system.noarch                              211.3-1.el8                                    @anaconda
cockpit-ws.x86_64                                  211.3-1.el8                                    @anaconda
subscription-manager-cockpit.noarch                1.26.17-1.el8_2                                @rhel-8-for-x86_64-baseos-rpms
[root@overflow images]#

Bad system:
[root@storage2015 images]# yum list installed | grep cockpit
cockpit.x86_64                                     211.3-1.el8                                    @System
cockpit-bridge.x86_64                              211.3-1.el8                                    @System
cockpit-leapp.noarch                               0.1.1-1.el7                                    @System
cockpit-machines.noarch                            211.3-1.el8                                    @rhel-8-for-x86_64-appstream-rpms
cockpit-packagekit.noarch                          211.3-1.el8                                    @System
cockpit-storaged.noarch                            211.3-1.el8                                    @System
cockpit-system.noarch                              211.3-1.el8                                    @System
cockpit-ws.x86_64                                  211.3-1.el8                                    @System
subscription-manager-cockpit.noarch                1.26.20-1.el8_2                                @System
[root@storage2015 images]#
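The package difference is easier to see by diffing just the package names from the two captures above (versions dropped). This is a self-contained sketch using the names copied into temporary files, not a command run on the live systems:

```shell
# Package names from the good (bare-metal) system's capture above.
cat > good.txt <<'EOF'
cockpit.x86_64
cockpit-bridge.x86_64
cockpit-machines.noarch
cockpit-packagekit.noarch
cockpit-podman.noarch
cockpit-storaged.noarch
cockpit-system.noarch
cockpit-ws.x86_64
subscription-manager-cockpit.noarch
EOF

# Package names from the bad (leapp-upgraded) system's capture above.
cat > bad.txt <<'EOF'
cockpit.x86_64
cockpit-bridge.x86_64
cockpit-leapp.noarch
cockpit-machines.noarch
cockpit-packagekit.noarch
cockpit-storaged.noarch
cockpit-system.noarch
cockpit-ws.x86_64
subscription-manager-cockpit.noarch
EOF

# diff exits 1 when the files differ; its output shows cockpit-leapp
# present only on the bad system and cockpit-podman only on the good one.
diff good.txt bad.txt || true
```

On the live systems the same comparison could be done directly with something like diff against rpm -qa 'cockpit*' | sort output from each host; the heredocs above just keep the example self-contained.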

Comment 7 Greg Scott 2020-09-18 19:31:09 UTC
Looking at cockpit-podman, I see a bunch of dependencies. None of them looks like it would make a difference to whatever is broken. I might install it anyway.

[root@storage2015 images]# yum install cockpit-podman
Updating Subscription Management repositories.
Last metadata expiration check: 1:30:09 ago on Fri 18 Sep 2020 12:36:36 PM CDT.
Dependencies resolved.
================================================================================
 Package            Arch   Version       Repository                        Size
================================================================================
Installing:
 cockpit-podman     noarch 17-1.module+el8.2.1+6636+bf4db4ab
                                         rhel-8-for-x86_64-appstream-rpms 1.1 M
Installing dependencies:
 conmon             x86_64 2:2.0.17-1.module+el8.2.1+6771+3533eb4c
                                         rhel-8-for-x86_64-appstream-rpms  39 k
 containernetworking-plugins
                    x86_64 0.8.6-1.module+el8.2.1+6626+598993b4
                                         rhel-8-for-x86_64-appstream-rpms  20 M
 containers-common  x86_64 1:1.0.0-1.module+el8.2.1+6676+604e1b26
                                         rhel-8-for-x86_64-appstream-rpms  52 k
 criu               x86_64 3.14-2.module+el8.2.1+6750+e53a300c
                                         rhel-8-for-x86_64-appstream-rpms 500 k
 fuse-overlayfs     x86_64 1.0.0-2.module+el8.2.1+6465+1a51e8b6
                                         rhel-8-for-x86_64-appstream-rpms  60 k
 fuse3-libs         x86_64 3.2.1-12.el8  rhel-8-for-x86_64-baseos-rpms     94 k
 libnet             x86_64 1.1.6-15.el8  rhel-8-for-x86_64-appstream-rpms  67 k
 libslirp           x86_64 4.3.0-3.module+el8.2.1+6816+bedf4f91
                                         rhel-8-for-x86_64-appstream-rpms  68 k
 libvarlink         x86_64 18-3.el8      rhel-8-for-x86_64-baseos-rpms     44 k
 podman             x86_64 1.9.3-2.module+el8.2.1+6867+366c07d6
                                         rhel-8-for-x86_64-appstream-rpms  13 M
 protobuf-c         x86_64 1.3.0-4.el8   rhel-8-for-x86_64-appstream-rpms  37 k
 runc               x86_64 1.0.0-66.rc10.module+el8.2.1+6465+1a51e8b6
                                         rhel-8-for-x86_64-appstream-rpms 2.7 M
 slirp4netns        x86_64 1.0.1-1.module+el8.2.1+6595+03641d72
                                         rhel-8-for-x86_64-appstream-rpms  45 k
Installing weak dependencies:
 container-selinux  noarch 2:2.135.0-1.module+el8.2.1+6849+893e4f4a
                                         rhel-8-for-x86_64-appstream-rpms  49 k
Enabling module streams:
 container-tools           rhel8

Transaction Summary
================================================================================
Install  15 Packages

Total download size: 38 M
Installed size: 147 M
Is this ok [y/N]: n
Operation aborted.
[root@storage2015 images]#

Comment 8 Greg Scott 2020-09-18 22:02:10 UTC
When I look at /etc/dbus-1/system.d/ I see differences between the good and bad system.

Good system built from bare metal:
[root@overflow dbus-1]# ls system.d -al
total 164
drwxr-xr-x. 2 root root 4096 Sep  6 14:37 .
drwxr-xr-x. 4 root root   78 Jun 30 04:33 ..
-rw-r--r--. 1 root root 1142 Oct 25  2018 avahi-dbus.conf
-rw-r--r--. 1 root root  400 Aug 10  2018 blivet.conf
-rw-r--r--. 1 root root 1315 Jan 13  2020 bluetooth.conf
-rw-r--r--. 1 root root 1007 Nov  9  2018 com.redhat.NewPrinterNotification.conf
-rw-r--r--. 1 root root 1016 Nov  9  2018 com.redhat.PrinterDriversInstaller.conf
-rw-r--r--. 1 root root 2534 Apr 28 15:17 com.redhat.RHSM1.conf
-rw-r--r--. 1 root root 1430 Apr 28 15:17 com.redhat.RHSM1.Facts.conf
-rw-r--r--. 1 root root 1390 Apr 28 15:17 com.redhat.SubscriptionManager.conf
-rw-r--r--. 1 root root  587 Dec 11  2019 com.redhat.tuned.conf
-r--r--r--. 1 root root  460 Feb 14  2020 cups.conf
-rw-r--r--. 1 root root  475 May 18 05:19 dnsmasq.conf
-rw-r--r--. 1 root root 3883 Feb  5  2020 gdm.conf
-rw-r--r--. 1 root root 2073 Aug 12  2018 net.hadess.SensorProxy.conf
-rw-r--r--. 1 root root  929 Oct 19  2016 net.hadess.SwitcherooControl.conf
-rw-r--r--. 1 root root  465 Jun  3 02:44 nm-dispatcher.conf
-rw-r--r--. 1 root root  387 Jun  3 02:44 nm-ifcfg-rh.conf
-rw-r--r--. 1 root root 2429 Dec  5  2018 oddjob.conf
-rw-r--r--. 1 root root 1863 Feb 26  2017 oddjob-mkhomedir.conf
-rw-r--r--. 1 root root 2079 Jan 11  2020 org.fedoraproject.Setroubleshootd.conf
-rw-r--r--. 1 root root  771 Jan 11  2020 org.fedoraproject.SetroubleshootFixit.conf
-rw-r--r--. 1 root root  917 Dec 15  2019 org.freedesktop.Accounts.conf
-rw-r--r--. 1 root root  971 Apr  3  2018 org.freedesktop.Flatpak.SystemHelper.conf
-rw-r--r--. 1 root root 1043 Nov  7  2018 org.freedesktop.fwupd.conf
-rw-r--r--. 1 root root  711 Oct  9  2019 org.freedesktop.GeoClue2.Agent.conf
-rw-r--r--. 1 root root 1411 Oct  9  2019 org.freedesktop.GeoClue2.conf
-rw-r--r--. 1 root root  422 Feb 14  2020 org.freedesktop.ModemManager1.conf
-rw-r--r--. 1 root root 9740 Jun  3 02:44 org.freedesktop.NetworkManager.conf
-rw-r--r--. 1 root root 1331 Nov 25  2019 org.freedesktop.PackageKit.conf
-rw-r--r--. 1 root root  638 Nov  4  2019 org.freedesktop.PolicyKit1.conf
-rw-r--r--. 1 root root  404 Feb 21  2020 org.freedesktop.realmd.conf
-rw-r--r--. 1 root root 1075 Jun 16  2009 org.freedesktop.RealtimeKit1.conf
-rw-r--r--. 1 root root 1555 Aug 12  2018 org.freedesktop.UPower.conf
-rw-r--r--. 1 root root  570 Aug 12  2018 org.gnome.GConf.Defaults.conf
-rw-r--r--. 1 root root  535 Aug 12  2018 org.opensuse.CupsPkHelper.Mechanism.conf
-rw-r--r--. 1 root root  535 Jan 17  2020 org.selinux.conf
-rw-r--r--. 1 root root 1084 Aug 23  2016 pulseaudio-system.conf
-rw-r--r--. 1 root root  409 Dec  9  2018 teamd.conf
-rw-r--r--. 1 root root  743 Nov  4  2019 wpa_supplicant.conf
[root@overflow dbus-1]#


Bad system, upgraded with leapp:
[root@storage2015 dbus-1]# ls system.d -al
total 172
drwxr-xr-x. 2 root root 4096 Sep 11 18:47 .
drwxr-xr-x. 4 root root   74 Jun 30 04:33 ..
-rw-r--r--. 1 root root 1142 Oct 25  2018 avahi-dbus.conf
-rw-r--r--. 1 root root  400 Aug 10  2018 blivet.conf
-rw-r--r--. 1 root root 1315 Jan 13  2020 bluetooth.conf
-rw-r--r--. 1 root root 1007 Nov  9  2018 com.redhat.NewPrinterNotification.conf
-rw-r--r--. 1 root root 1016 Nov  9  2018 com.redhat.PrinterDriversInstaller.conf
-rw-r--r--. 1 root root 2534 Aug 17 15:21 com.redhat.RHSM1.conf
-rw-r--r--. 1 root root 1430 Aug 17 15:21 com.redhat.RHSM1.Facts.conf
-rw-r--r--. 1 root root 1390 Aug 17 15:21 com.redhat.SubscriptionManager.conf
-rw-r--r--. 1 root root  587 Dec 11  2019 com.redhat.tuned.conf
-r--r--r--. 1 root root  460 Feb 14  2020 cups.conf
-rw-r--r--. 1 root root  957 Jul 16  2019 dbus-abrt.conf
-rw-r--r--. 1 root root  475 May 18 05:19 dnsmasq.conf
-rw-r--r--. 1 root root 3883 Feb  5  2020 gdm.conf
-rw-r--r--. 1 root root 2073 Aug 12  2018 net.hadess.SensorProxy.conf
-rw-r--r--. 1 root root  929 Oct 19  2016 net.hadess.SwitcherooControl.conf
-rw-r--r--. 1 root root  465 Jun  3 02:44 nm-dispatcher.conf
-rw-r--r--. 1 root root  387 Jun  3 02:44 nm-ifcfg-rh.conf
-rw-r--r--. 1 root root  590 Jul  9  2019 nm-libreswan-service.conf
-rw-r--r--. 1 root root 2429 Dec  5  2018 oddjob.conf
-rw-r--r--. 1 root root 1863 Feb 26  2017 oddjob-mkhomedir.conf
-rw-r--r--. 1 root root 2079 Jan 11  2020 org.fedoraproject.Setroubleshootd.conf
-rw-r--r--. 1 root root  771 Jan 11  2020 org.fedoraproject.SetroubleshootFixit.conf
-rw-r--r--. 1 root root  917 Dec 15  2019 org.freedesktop.Accounts.conf
-rw-r--r--. 1 root root  971 Apr  3  2018 org.freedesktop.Flatpak.SystemHelper.conf
-rw-r--r--. 1 root root  711 Oct  9  2019 org.freedesktop.GeoClue2.Agent.conf
-rw-r--r--. 1 root root 1411 Oct  9  2019 org.freedesktop.GeoClue2.conf
-rw-r--r--. 1 root root  422 Feb 14  2020 org.freedesktop.ModemManager1.conf
-rw-r--r--. 1 root root 9740 Jun  3 02:44 org.freedesktop.NetworkManager.conf
-rw-r--r--. 1 root root 1331 Nov 25  2019 org.freedesktop.PackageKit.conf
-rw-r--r--. 1 root root  638 Nov  4  2019 org.freedesktop.PolicyKit1.conf
-rw-r--r--. 1 root root  447 Jul 16  2019 org.freedesktop.problems.daemon.conf
-rw-r--r--. 1 root root  404 Feb 21  2020 org.freedesktop.realmd.conf
-rw-r--r--. 1 root root 1075 Jun 16  2009 org.freedesktop.RealtimeKit1.conf
-rw-r--r--. 1 root root 1555 Aug 12  2018 org.freedesktop.UPower.conf
-rw-r--r--. 1 root root  570 Aug 12  2018 org.gnome.GConf.Defaults.conf
-rw-r--r--. 1 root root  535 Aug 12  2018 org.opensuse.CupsPkHelper.Mechanism.conf
-rw-r--r--. 1 root root  535 Jan 17  2020 org.selinux.conf
-rw-r--r--. 1 root root 1084 Aug 23  2016 pulseaudio-system.conf
-rw-r--r--. 1 root root  409 Dec  9  2018 teamd.conf
-rw-r--r--. 1 root root  743 Nov  4  2019 wpa_supplicant.conf
[root@storage2015 dbus-1]#

Comment 9 Greg Scott 2020-09-20 20:02:05 UTC
Go figure: the errors are gone. Here is everything I can think of that may have changed between now and a few days ago:

- A week of time passed.
- I moved a few virtual machines around.
- The PCs running the web browsers all rebooted.

Nothing new installed on that server, no config changes, no patches - the Ooops errors just disappeared sometime between Sept. 14 and Sept. 19. Welcome to the Twilight Zone.

Or could the whole thing have been some kind of bad browser interaction? But it would have been a bad interaction with both Chrome and Firefox on two different systems. And with Firefox on the real console of the problem system.

I wonder if that busctl command you wanted me to try does anything different now.

Still looks pretty much the same to me.

Last login: Sat Sep 19 10:22:25 2020 from 10.10.10.101
[root@storage2015 ~]# uptime
 14:51:36 up 7 days, 21:30,  2 users,  load average: 0.24, 0.19, 0.17
[root@storage2015 ~]#
[root@storage2015 ~]# busctl call org.libvirt /org/libvirt/QEMU org.libvirt.Connect ListDomains u 0
ao 5 "/org/libvirt/QEMU/domain/_c5fb3a91_6eb6_49f4_a4c7_205d8fc85078" "/org/libvirt/QEMU/domain/_2bdb7815_17d7_4d4f_830c_656fec30a14e" "/org/libvirt/QEMU/domain/_72c97620_88c0_4f93_83e2_5ca2441da6c7" "/org/libvirt/QEMU/domain/_7e4015ef_d28b_4e7b_9a7e_f25284029c8d" "/org/libvirt/QEMU/domain/_3757175d_8f92_409e_a598_18b88c1570b7"
[root@storage2015 ~]#
[root@storage2015 ~]# date
Sun Sep 20 14:53:05 CDT 2020
[root@storage2015 ~]#

Comment 13 Red Hat Bugzilla 2023-09-15 00:48:04 UTC
The needinfo request[s] on this closed bug have been removed as they have been unresolved for 500 days