Bug 232902
| Summary: | Auto Apply Errata not working correctly. | | |
|---|---|---|---|
| Product: | [Retired] Red Hat Network | Reporter: | Clifford Perry <cperry> |
| Component: | RHN/Web Site | Assignee: | Kevin A. Smith <ksmith> |
| Status: | CLOSED CURRENTRELEASE | QA Contact: | Preethi Thomas <pthomas> |
| Severity: | medium | Docs Contact: | |
| Priority: | medium | | |
| Version: | rhn500 | CC: | akrherz, duffy, newbery, rhn-bugs |
| Target Milestone: | --- | | |
| Target Release: | --- | | |
| Hardware: | All | | |
| OS: | Linux | | |
| URL: | https://www.redhat.com/archives/rhn-users/2007-March/msg00059.html | | |
| Whiteboard: | | | |
| Fixed In Version: | 5.0.0 | Doc Type: | Bug Fix |
| Doc Text: | | Story Points: | --- |
| Clone Of: | | Environment: | |
| Last Closed: | 2007-03-27 13:16:27 UTC | Type: | --- |
| Regression: | --- | Mount Type: | --- |
| Documentation: | --- | CRM: | |
| Verified Versions: | | Category: | --- |
| oVirt Team: | --- | RHEL 7.3 requirements from Atomic Host: | |
| Cloudforms Team: | --- | Target Upstream Version: | |
| Embargoed: | | | |
| Bug Depends On: | | | |
| Bug Blocks: | 232426 | | |
Description
Clifford Perry
2007-03-19 13:40:37 UTC
Having reviewed, it seems that there is a bug in the background daemon that schedules the application of 'Apply Auto Errata'. I have placed this bug onto the must-fix list for the next scheduled minor maintenance release of the RHN Hosted web site. Cliff.

* Fixed overly "generous" logic in the ErrataQueueWorker which was causing auto-errata updates to be applied to too many machines.

Suggested Test Plan:
1) Find an account which has errata auto-updates disabled.
2) Create and publish a new errata.
3) Wait for the errata to get applied. You can monitor the progress of errata processing by:
   3a) Editing /usr/share/rhn/classes/log4j.properties on scripts.back-webqa.redhat.com
   3b) Adding the following line: log4j.logger.com.redhat.rhn.task.ErrataCacheTask=DEBUG
   3c) Bouncing taskomatic: /sbin/service taskomatic restart
   3d) Monitoring the /var/log/rhn/rhn_taskomatic_daemon.log file
4) Once errata processing is complete, verify that the server used in step #1 does not have the errata scheduled.

Fails QA. Looks like it's not fixed. ksmith is already looking into it.

OK, let's try this again with a different test path. We'll need to simulate the effects of prod-ops' errata scripts, so some manual database munging is involved.
1) Use the /svn/trunk/eng/backend/server/test/test_errata_import_webqa.py script to create a new errata. Change the advisory name before running the script. I've used the house number part of my address to try to ensure uniqueness.
2) After the script has run, execute the following SQL query against the webqa database:
   select id from rhnErrata where created > sysdate - 1 and created < sysdate and advisory like '%ADVISORY_NAME%'
   Note: Replace the term ADVISORY_NAME with the unique value you selected in step 1. Make sure to leave the percent signs in place; otherwise the query will not work correctly.
3) Jot down the id returned from the query.
4) Create a work entry for taskomatic so it can process the new errata. This is normally done by prod-ops' scripts, but we'll have to simulate it here:
   insert into rhnErrataQueue values (ERRATA_ID, sysdate, sysdate, sysdate);
   commit;
   Note: Replace the term ERRATA_ID with the id from step 3.
5) Wait for taskomatic to process this record. I'd suggest waiting 15-20 minutes.
6) Log in to an account which has servers with auto errata updates disabled.
7) Verify that these servers _do not_ have a pending errata action for the newly created errata.

Verified. Thanks, Kevin, for the detailed test plan. I've checked this in stage using my own account...
System 1007229597 has a bunch of errata scheduled (correct)
System 1007229598 does *not* have a bunch of errata scheduled (correct)
Moving to RELEASE_PENDING

Why no announcement to the community about this? Daryl, can you elaborate?

Hi Máirín, I did on the email list. Thanks, daryl

Greetings, this problem appears to be back, as most of my auto-apply systems have yet to apply RHEL 5.1. Some SSIDs for you: 1007413619, 1008254853 (I have lots more if necessary). daryl

See bug 373131 re: comment #16.
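The fix note above says the ErrataQueueWorker's "overly generous" logic scheduled auto-errata updates for too many machines, including servers with auto-updates disabled. A minimal sketch of the intended filtering, in Python (the real worker is Java code inside taskomatic; `Server`, `auto_update`, and `servers_to_schedule` are hypothetical stand-ins, not the actual RHN names):

```python
from dataclasses import dataclass

@dataclass
class Server:
    sid: int
    auto_update: bool  # hypothetical flag: has this system opted in to auto errata updates?

def servers_to_schedule(affected_servers):
    """Return only servers that opted in to auto errata updates.

    The reported bug was that this filter was effectively skipped,
    so the errata action was scheduled for every affected server.
    """
    return [s for s in affected_servers if s.auto_update]

# Mirrors the verification in the thread: 1007229597 (auto-updates on)
# should be scheduled, 1007229598 (auto-updates off) should not.
fleet = [Server(1007229597, True), Server(1007229598, False)]
print([s.sid for s in servers_to_schedule(fleet)])  # → [1007229597]
```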
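The second test plan hinges on seeding rhnErrataQueue by hand and letting taskomatic drain it. A toy in-memory model of that queue/worker handoff, as I understand it from the steps above (table and daemon names come from the source; the processing logic itself is a hypothetical simplification, not taskomatic's real implementation):

```python
import time

# Simulated rhnErrataQueue rows; the real insert is:
#   insert into rhnErrataQueue values (ERRATA_ID, sysdate, sysdate, sysdate);
errata_queue = []

def enqueue(errata_id):
    """Step 4 of the test plan: create a work entry for taskomatic."""
    errata_queue.append({"errata_id": errata_id, "created": time.time()})

def process_queue(servers):
    """Toy taskomatic pass: drain the queue, scheduling each queued
    errata only for servers with auto errata updates enabled (the
    behavior step 7 verifies)."""
    scheduled = []
    while errata_queue:
        row = errata_queue.pop(0)
        for s in servers:
            if s["auto_update"]:
                scheduled.append((row["errata_id"], s["sid"]))
    return scheduled
```

After a run, the queue is empty and only opted-in systems carry a pending errata action, which is what steps 6-7 check in the web UI.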