Description of problem:

Today we noticed that processes we start in a remote shell via ssh run with a niceness of 19, even though we never applied any nice to them. We found that /usr/sbin/sshd itself is running with nice 19, and subprocesses such as ssh login sessions inherit the niceness of the daemon; this is why all remote ssh logins also run at nice 19.

But why is /usr/sbin/sshd running with nice 19? We checked our hosts and found that /usr/sbin/sshd was restarted today on all our Fedora 14 hosts. The restart time (the start time shown by "ps -efl | grep /usr/sbin/sshd") is different on each host, but on every host it corresponds to a yum-updatesd update step recorded in /var/log/yum.log (a set of package updates applied at nearly the same time). Because of the timestamps in /var/log/yum.log and the start time of sshd, I speculate that it may have to do with one of these packages:

glibc-2.13-1.x86_64
glibc-common-2.13-1.x86_64
glibc-headers-2.13-1.x86_64
glibc-devel-2.13-1.x86_64
selinux-policy-3.9.7-29.fc14.noarch
setroubleshoot-server-3.0.25-1.fc14.x86_64

Or it may have something to do with yum, yum-updatesd, or sshd itself. I checked the scriptlets of the rpm packages for references to sshd (rpm -q --scripts ...package list from yum.log... | grep sshd), but nothing was found. I don't know what program restarted /usr/sbin/sshd, or why.

A temporary workaround that does not require rebooting the host: log in as root (or use su or sudo) and run "renice 0 PID", where PID is the pid of /usr/sbin/sshd, then log out and log in again; see the command sketch below.
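Below is a minimal command sketch of the checks and the workaround described above. The PID (15749) is the example from this report and will differ on each host; the packages in the loop are the suspects named above.

  # Show the niceness (NI column) of the running sshd daemon:
  ps -o pid,ni,args -C sshd

  # Re-run the scriptlet check from the description against the
  # suspect packages taken from /var/log/yum.log:
  for p in glibc glibc-common selinux-policy setroubleshoot-server; do
      echo "== $p =="; rpm -q --scripts "$p" | grep sshd
  done

  # Temporary workaround (run as root): reset the daemon to nice 0,
  # then log out and back in so new sessions inherit nice 0.
  renice 0 15749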
Version-Release number of selected component (if applicable):

yum-updatesd-0.9-3.fc12.noarch
yum-3.2.28-5.fc14.noarch
yum-langpacks-0.1.5-3.fc14.noarch
yum-metadata-parser-1.1.4-2.fc14.x86_64
yum-plugin-changelog-1.1.28-1.fc14.noarch
yum-plugin-downloadonly-1.1.28-1.fc14.noarch
yum-plugin-fastestmirror-1.1.28-1.fc14.noarch
yum-plugin-filter-data-1.1.28-1.fc14.noarch
yum-plugin-keys-1.1.28-1.fc14.noarch
yum-plugin-list-data-1.1.28-1.fc14.noarch
yum-plugin-merge-conf-1.1.28-1.fc14.noarch
yum-plugin-post-transaction-actions-1.1.28-1.fc14.noarch
yum-plugin-priorities-1.1.28-1.fc14.noarch
yum-plugin-protectbase-1.1.28-1.fc14.noarch
yum-plugin-refresh-updatesd-1.1.28-1.fc14.noarch
yum-plugin-security-1.1.28-1.fc14.noarch
yum-plugin-tsflags-1.1.28-1.fc14.noarch
yum-plugin-verify-1.1.28-1.fc14.noarch
yum-plugin-versionlock-1.1.28-1.fc14.noarch
yum-presto-0.6.2-2.fc14.noarch
yum-utils-1.1.28-1.fc14.noarch
glibc-2.13-1.i686
glibc-2.13-1.x86_64
glibc-common-2.13-1.x86_64
glibc-devel-2.13-1.i686
glibc-devel-2.13-1.x86_64
glibc-headers-2.13-1.x86_64
glibc-static-2.13-1.x86_64
gnome-applet-sshmenu-3.18-1.fc13.noarch
kernel-2.6.35.10-74.fc14.x86_64
kernel-2.6.35.6-45.fc14.x86_64
kernel-devel-2.6.35.10-74.fc14.x86_64
kernel-devel-2.6.35.6-45.fc14.x86_64
kernel-doc-2.6.35.10-74.fc14.noarch
kernel-headers-2.6.35.10-74.fc14.x86_64
ksshaskpass-0.5.3-1.fc14.x86_64
libssh-0.4.8-1.fc14.x86_64
libssh2-1.2.4-1.fc14.i686
libssh2-1.2.4-1.fc14.x86_64
mussh-0.7-4.fc12.noarch
openssh-5.5p1-24.fc14.2.x86_64
openssh-askpass-5.5p1-24.fc14.2.x86_64
openssh-clients-5.5p1-24.fc14.2.x86_64
openssh-server-5.5p1-24.fc14.2.x86_64
pam_ssh-1.97-4.fc14.x86_64

How reproducible:

It appeared today (2011-02-09) after an automatic package update by yum-updatesd. It seems that only Fedora 14 is affected.

Steps to Reproduce:
1. Install Fedora 14 including all updates released before 2011-02-09.
2. Create yum repositories with the package set available on 2011-02-09 and configure yum to use these repos.
3. Start sshd.
4. Check that sshd runs with nice 0.
5. Start yum-updatesd.
6. Wait until yum-updatesd has installed the updated packages of 2011-02-09.
7. Check whether sshd was restarted and what its niceness is.

Actual results:
sshd runs with nice 19.

Expected results:
sshd should run with nice 0.

Additional info:
The output of "ps -efl | grep /usr/sbin/sshd" on the example host that corresponds to the yum.log excerpt in the attachment:

5 S root 15749 1 0 99 19 - 18767 poll_s 12:42 ? 00:00:00 /usr/sbin/sshd
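As a minimal illustration of the inheritance behavior described above (a child forked by a nice-19 process keeps that niceness), independent of sshd:

  # Start a shell at nice 19, fork a child from it, and print the
  # child's NI value; the output shows 19, mirroring how ssh login
  # sessions inherit the daemon's niceness.
  nice -n 19 bash -c 'sleep 5 & ps -o pid,ni,comm -p $!'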
It seems that I have accidentally submitted this bug report twice. Sorry.

*** This bug has been marked as a duplicate of bug 676427 ***