Bug 2173870 - When using the customer data (json) with 13 diff conf files, we can see some weird behavior when updating the hypervisors
Summary: When using the customer data (json) with 13 diff conf files, we can see some ...
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Satellite
Classification: Red Hat
Component: Virt-who Configure Plugin
Version: 6.12.1
Hardware: Unspecified
OS: Unspecified
Priority: medium
Severity: medium
Target Milestone: 6.15.0
Assignee: Lucy Fu
QA Contact: yanpliu
URL:
Whiteboard:
Duplicates: 2169993 (view as bug list)
Depends On:
Blocks:
 
Reported: 2023-02-28 08:25 UTC by Waldirio M Pinheiro
Modified: 2024-07-30 03:07 UTC
CC: 10 users

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Cloned As: 2266142 (view as bug list)
Environment:
Last Closed: 2024-04-23 17:13:52 UTC
Target Upstream Version:
Embargoed:


Attachments


Links
System ID | Private | Priority | Status | Summary | Last Updated
Foreman Issue Tracker 36855 | 0 | Normal | Ready For Testing | When using the customer data (json) with 13 diff conf files, we can see some weird behavior when updating the hypervisor... | 2023-10-31 21:16:56 UTC
Github theforeman foreman_virt_who_configure pull 174 | 0 | None | Merged | Fixes #36855 - Use one ServiceUser for all configs in an organization | 2023-11-21 15:21:31 UTC
Red Hat Bugzilla 2173870 | 0 | medium | CLOSED | When using the customer data (json) with 13 diff conf files, we can see some weird behavior when updating the hypervisor... | 2024-07-30 03:10:19 UTC
Red Hat Bugzilla 2260621 | 0 | unspecified | CLOSED | virt-who host and virtual host guest did not display the correct mapping info on old Legacy Content Host UI->Details | 2024-06-06 17:07:10 UTC
Red Hat Issue Tracker SAT-17022 | 0 | None | None | None | 2023-04-10 13:10:51 UTC
Red Hat Product Errata RHSA-2024:2010 | 0 | None | None | None | 2024-04-23 17:13:54 UTC

Internal Links: 2173870 2266142

Description Waldirio M Pinheiro 2023-02-28 08:25:49 UTC
Description of problem:
When running each configuration file individually, all of them pass. When running all of them at the same time, we see the errors below, and some of the tasks fail with a warning.

---
-------------------------------------|--------------|---------|---------|---------------------|---------------------|-----------------|-------|-----------------------------------------------------------------------------
ID                                   | ACTION       | STATE   | RESULT  | STARTED AT          | ENDED AT            | DURATION        | OWNER | TASK ERRORS
-------------------------------------|--------------|---------|---------|---------------------|---------------------|-----------------|-------|-----------------------------------------------------------------------------
1a90b366-04f4-49c5-b47f-7c98fc087b42 | Hypervisors  | running | pending | 2023/02/28 08:19:54 |                     | 00:05:01.016713 | admin |
ae9f0fcd-15af-45e6-9126-e69642656569 | Hypervisors  | stopped | success | 2023/02/28 08:18:25 | 2023/02/28 08:19:58 | 00:01:32.98492  | admin |
353edc94-e7a2-44fa-a196-4f623404b621 | Hypervisors  | stopped | success | 2023/02/28 08:18:21 | 2023/02/28 08:18:32 | 00:00:10.577236 | admin |
7c7dc021-af07-4903-96ce-6fb1c97472e3 | Hypervisors  | stopped | success | 2023/02/28 08:18:18 | 2023/02/28 08:18:32 | 00:00:14.722839 | admin |
6d0cfac1-5a29-4525-84f8-0ee9a89cef7f | Hypervisors  | stopped | warning | 2023/02/28 08:18:13 | 2023/02/28 08:18:14 | 00:00:00.548461 | admin | Job blocked by the following existing jobs: 8a889d5886941aa201869719a6f30bcc
0db6455a-060b-453c-bec1-6544b9852ddb | Hypervisors  | stopped | warning | 2023/02/28 08:18:10 | 2023/02/28 08:18:11 | 00:00:00.615463 | admin | Job blocked by the following existing jobs: 8a889d5886941aa201869719a6f30bcc
b8eecda6-be2d-42bc-abaa-4f89f4b95d92 | Hypervisors  | stopped | success | 2023/02/28 08:18:06 | 2023/02/28 08:19:59 | 00:01:53.554576 | admin |
937836c6-d897-45bf-851b-a447d74a72cb | Hypervisors  | stopped | success | 2023/02/28 08:18:02 | 2023/02/28 08:18:05 | 00:00:03.17935  | admin |
c3214360-a41c-4e07-8fad-1968bfec0480 | Hypervisors  | stopped | success | 2023/02/28 08:17:58 | 2023/02/28 08:18:14 | 00:00:15.604428 | admin |
f38ed113-a7af-444d-985b-029f7c4d3e5b | Hypervisors  | stopped | warning | 2023/02/28 08:17:53 | 2023/02/28 08:17:55 | 00:00:01.340925 | admin | Job blocked by the following existing jobs: 8a889d5886941aa20186971933280123
f992c44f-6c77-4c9b-bd16-144287e2b907 | Hypervisors  | stopped | warning | 2023/02/28 08:17:47 | 2023/02/28 08:17:48 | 00:00:01.086172 | admin | Job blocked by the following existing jobs: 8a889d5886941aa20186971933280123
266aff7f-e731-49b9-8a75-aee7cb6b399d | Hypervisors  | stopped | warning | 2023/02/28 08:17:42 | 2023/02/28 08:17:43 | 00:00:00.899523 | admin | Job blocked by the following existing jobs: 8a889d5886941aa20186971933280123
5dcb6a3c-830a-4066-8caa-cf0ab61e5e5c | Hypervisors  | stopped | success | 2023/02/28 08:17:37 | 2023/02/28 08:20:06 | 00:02:29.594717 | admin |
cd3872d2-e11b-452b-b66a-0308aea43903 | Hypervisors  | stopped | success | 2023/02/28 08:16:45 | 2023/02/28 08:17:11 | 00:00:25.756576 | admin |
-------------------------------------|--------------|---------|---------|---------------------|---------------------|-----------------|-------|-----------------------------------------------------------------------------

---

Version-Release number of selected component (if applicable):


How reproducible:


Steps to Reproduce:
1. Use the customer data (the JSON hypervisor-to-guest mapping).
2. Configure the 13 fake virt-who configuration files (a sketch of one such file is given under "Additional info" below).
3. Push all of the reports at the same time.

Actual results:
Some of the tasks fail with a warning ("Job blocked by the following existing jobs: ..."), as shown in the task list above.

Expected results:
All of the tasks complete successfully.

Additional info:
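For reference, here is a minimal sketch of one of the fake-mode virt-who configuration files implied by the steps above. The file name, JSON path, organization label, Satellite hostname, and reporter account are hypothetical placeholders; the actual reproducer uses the customer's own data split across 13 such files.

---
# /etc/virt-who.d/fake-01.conf  (hypothetical name; 12 similar files exist)
[fake-hypervisors-01]
type=fake
# hypervisor-to-guest mapping captured earlier with "virt-who --print" (customer data)
file=/root/customer-data/hypervisors-01.json
is_hypervisor=True
owner=Example_Org
hypervisor_id=hostname
rhsm_hostname=satellite.example.com
rhsm_username=virt_who_reporter_1
rhsm_encrypted_password=<encrypted>
rhsm_prefix=/rhsm
---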

Comment 4 William Poteat 2023-02-28 19:11:58 UTC
If the configurations point to the same server and organization, they should use the same username/password; that way the data gets consolidated into a single report.
Otherwise, the job queuing process will drop repeated requests per organization so that the server is not overloaded by aggressive reporting.

If the config generation is designed to have a unique username/password for each configuration pointing to the same server and org, then it must be redesigned.
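To illustrate the point about shared credentials (this is also the direction of the upstream fix, "Use one ServiceUser for all configs in an organization"), the sketch below shows two of the configurations reporting to the same Satellite server and organization through a single service user, so their hypervisor data is consolidated into one report instead of being queued as separate jobs that block each other. The file names, organization label, and account name are hypothetical.

---
# /etc/virt-who.d/fake-01.conf (hypothetical)
[fake-hypervisors-01]
type=fake
file=/root/customer-data/hypervisors-01.json
is_hypervisor=True
owner=Example_Org
rhsm_hostname=satellite.example.com
# one reporter account shared by every config for this organization
rhsm_username=virt_who_reporter_example_org
rhsm_encrypted_password=<encrypted>

# /etc/virt-who.d/fake-02.conf (hypothetical)
[fake-hypervisors-02]
type=fake
file=/root/customer-data/hypervisors-02.json
is_hypervisor=True
owner=Example_Org
rhsm_hostname=satellite.example.com
# same user and organization as above, so the data is consolidated
rhsm_username=virt_who_reporter_example_org
rhsm_encrypted_password=<encrypted>
---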

Comment 6 Rehana 2023-03-07 14:44:14 UTC
*** Bug 2169993 has been marked as a duplicate of this bug. ***

Comment 11 Brad Buckingham 2023-10-30 11:29:29 UTC
Bulk setting Target Milestone = 6.15.0 where sat-6.15.0+ is set.

Comment 13 William Poteat 2023-11-21 13:31:45 UTC
The Satellite server

Comment 34 errata-xmlrpc 2024-04-23 17:13:52 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (Important: Satellite 6.15.0 release), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHSA-2024:2010

