Bug 1017921 - cli: node sync: reports "Sync complete" when sync hasn't finished on the node
Summary: cli: node sync: reports "Sync complete" when sync hasn't finished on the node
Keywords:
Status: CLOSED DUPLICATE of bug 1018236
Alias: None
Product: Red Hat Satellite
Classification: Red Hat
Component: Hammer
Version: 6.0.2
Hardware: Unspecified
OS: Unspecified
Priority: unspecified
Severity: high
Target Milestone: Unspecified
Assignee: Brad Buckingham
QA Contact: Katello QA List
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2013-10-10 18:46 UTC by Brad Buckingham
Modified: 2014-01-27 14:05 UTC
CC List: 1 user

Fixed In Version:
Doc Type: Bug Fix
Doc Text:
Clone Of:
Environment:
Last Closed: 2013-10-16 12:26:29 UTC
Target Upstream Version:
Embargoed:



Description Brad Buckingham 2013-10-10 18:46:01 UTC
Description of problem:

The Satellite CLI reports "Sync Complete" for "node sync"; however, the node appears to be continuing to sync content.  This can be confusing to the user.

Version-Release number of selected component (if applicable):
katello-1.4.6-21.el6sat.noarch
katello-cli-1.4.3-18.el6sat.noarch
pulp-server-2.3.0-0.17.beta.el6sat.noarch

How reproducible:
So far, I've seen this occur on 1 out of 3 nodes; therefore, reproducing it might be tricky.

Steps to Reproduce:

1. Install/configure Satellite 6 (katello-configure)
2. Import a manifest, enable & sync RHEL 6Server
3. Create a content view definition, adding the repo synced in step 2
4. Publish a content view from the definition

5. Install/configure Satellite 6 node (node-installer)
6. Using the CLI on the Satellite 6 server, add Library to the node, e.g.:
   node add_environment --org Katello_Infrastructure --environment Library --id 1
7. Sync the Library to the node
   node sync --org Katello_Infrastructure --environment Library --id 1

Actual results:

After some time, CLI reports: 
Sync Complete                                         

However, the node appears to be continuing to sync content, e.g.:
 
pulp-admin -u [user] -p [password] tasks details --task-id 1fe27def-5c4c-44e6-aba7-4e9b2634c90c
+----------------------------------------------------------------------+
                              Task Details
+----------------------------------------------------------------------+

Operations:   sync
Resources:    Katello_Infrastructure-Red_Hat_Enterprise_Linux_Server-Red_Hat_Ent
              erprise_Linux_6_Server_RPMs_x86_64_6Server (repository)
State:        Running
Start Time:   2013-10-10T12:47:15Z
Finish Time:  Incomplete
Result:       Incomplete
Task Id:      1fe27def-5c4c-44e6-aba7-4e9b2634c90c
Progress:     
  Nodes Http Importer: 
    Repo Id:  Katello_Infrastructure-Red_Hat_Enterprise_Linux_Server-Red_Hat_Ent
              erprise_Linux_6_Server_RPMs_x86_64_6Server
    State:    adding_units
    Unit Add: 
      Completed: 3316
      Details:   /var/lib/pulp/content/rpm/selinux-policy/3.7.19/93.el6/noarch/e
                 ad049fa1f26b929a935f8d25e1e3f0a486827e7/Packages/selinux-policy
                 -3.7.19-93.el6.noarch.rpm
      Total:     13169


[root@sat-perf-05 rhel]# pulp-admin -u [user] -p [password] tasks details --task-id 1fe27def-5c4c-44e6-aba7-4e9b2634c90c
+----------------------------------------------------------------------+
                              Task Details
+----------------------------------------------------------------------+

Operations:   sync
Resources:    Katello_Infrastructure-Red_Hat_Enterprise_Linux_Server-Red_Hat_Ent
              erprise_Linux_6_Server_RPMs_x86_64_6Server (repository)
State:        Running
Start Time:   2013-10-10T12:47:15Z
Finish Time:  Incomplete
Result:       Incomplete
Task Id:      1fe27def-5c4c-44e6-aba7-4e9b2634c90c
Progress:     
  Nodes Http Importer: 
    Repo Id:  Katello_Infrastructure-Red_Hat_Enterprise_Linux_Server-Red_Hat_Ent
              erprise_Linux_6_Server_RPMs_x86_64_6Server
    State:    adding_units
    Unit Add: 
      Completed: 3770
      Details:   /var/lib/pulp/content/rpm/initscripts/9.03.23/1.el6/x86_64/2ba3
                 1c42345159643b07df6b95081db2870d3eb7/Packages/initscripts-9.03.
                 23-1.el6.x86_64.rpm
      Total:     13169
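One way to confirm whether the node is actually done is to poll the task with the same `pulp-admin` invocation shown above until its state leaves "Running". The helper below is a hypothetical sketch (the function name, the `PULP_USER`/`PULP_PASS` placeholders, and the 30-second poll interval are all my own additions, not part of the report):

```shell
# Hypothetical helper: poll a Pulp task until it is no longer "Running".
# Uses the pulp-admin command exactly as quoted in this report; credentials
# default to placeholder values and should be overridden via the environment.
wait_for_task() {
  task_id="$1"
  while true; do
    # Grab the first "State:" line of the task details (the task-level state,
    # not the importer progress state further down in the output).
    state=$(pulp-admin -u "${PULP_USER:-admin}" -p "${PULP_PASS:-admin}" \
              tasks details --task-id "$task_id" \
            | awk '/^State:/ {print $2; exit}')
    [ "$state" != "Running" ] && break
    sleep 30
  done
  echo "Task $task_id finished with state: $state"
}
```

For the task in this report, that would be `wait_for_task 1fe27def-5c4c-44e6-aba7-4e9b2634c90c`.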

Expected results:

The CLI shouldn't report the sync as complete until all associated tasks have completed.

Additional info:

This was observed on a production install.  It is possible that this issue was triggered by the low timeout values in /etc/pulp/server.conf (e.g. update_timeout: 10:600).  In other words, the Satellite 6 server may have timed out at 600s while the agent on the node continued syncing.
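For reference, the quoted setting would appear in Pulp's INI-style /etc/pulp/server.conf roughly as below. The surrounding section name is omitted because it is not given in the report, and the reading of the two numbers is an assumption based on the report's hypothesis:

```
# /etc/pulp/server.conf (excerpt; enclosing section omitted)
# The report's hypothesis suggests the server polls sync status and gives up
# after the second value (600s) even if the node is still adding units.
update_timeout: 10:600
```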

Comment 2 Corey Welton 2013-10-14 18:16:32 UTC
Possibly related (or not) https://bugzilla.redhat.com/show_bug.cgi?id=1018236

Comment 3 Brad Buckingham 2013-10-16 12:26:29 UTC
The root cause of this bug and bug 1018236 is the same.  Closing this one, since the other was opened by QA. :)

*** This bug has been marked as a duplicate of bug 1018236 ***

