Bug 2038995 - When executing the content migration (pre-upgrade process), there is a PG query created by pulp that will be sitting forever
Summary: When executing the content migration (pre-upgrade process), there is a PG que...
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Satellite
Classification: Red Hat
Component: Pulp
Version: 6.9.9
Hardware: All
OS: All
Priority: unspecified
Severity: urgent
Target Milestone: 6.9.10
Assignee: satellite6-bugs
QA Contact: Pablo Mendez Hernandez
URL:
Whiteboard:
Duplicates: 2097204 2138974
Depends On:
Blocks:
 
Reported: 2022-01-10 17:34 UTC by Waldirio M Pinheiro
Modified: 2023-07-17 17:59 UTC (History)
14 users

Fixed In Version: pulp-2to3-migration-0.11.12-1
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2022-11-17 17:17:17 UTC
Target Upstream Version:
Embargoed:


Attachments (Terms of Use)
Hotfix RPM for Satellite 6.9.9 (225.80 KB, application/x-rpm)
2022-08-29 20:06 UTC, wclark


Links
System ID Private Priority Status Summary Last Updated
Github pulp pulp-2to3-migration issues 568 0 None closed Migration stucks in calculating total of rpms to be migrated and migrating rpms for days 2022-08-18 16:27:03 UTC
Github pulp pulp-2to3-migration pull 580 0 None Merged Rpm migration stucks for days 2022-08-18 16:27:31 UTC
Red Hat Knowledge Base (Solution) 3397771 0 None None None 2023-07-17 17:59:03 UTC
Red Hat Knowledge Base (Solution) 6964914 0 None None None 2022-08-18 16:25:19 UTC
Red Hat Product Errata RHSA-2022:8532 0 None None None 2022-11-17 17:17:30 UTC

Description Waldirio M Pinheiro 2022-01-10 17:34:42 UTC
Description of problem:
When proceeding with the pulp2-to-pulp3 migration, the process executes some PG queries, and depending on the number of entries in the customer's database, a single query can take a very long time.

For example, here we can see a single query that took ~12h:

---
pulpcore=# SELECT COUNT(*) AS "__count" FROM "pulp_2to3_migration_pulp2rpm" INNER JOIN "pulp_2to3_migration_pulp2content" ON ( "pulp_2to3_migration_pulp2rpm"."pulp2content_id" = "pulp_2to3_migration_pulp2content"."pulp_id" ) where not (not ("pulp_2to3_migration_pulp2content"."pulp3_content_id" IS NULL) and not ( "pulp_2to3_migration_pulp2content"."pulp2_id" IN (SELECT DISTINCT U0."pulp2_unit_id" FROM "pulp_2to3_migration_pulp2lazycatalog" U0 WHERE U0."is_migrated" = false )));

 __count
---------
   93490
(1 row)

Time: 42712560.277 ms (11:51:52.560)
pulpcore=# 
---
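The double-negated WHERE clause above is characteristic of a Django ORM `exclude()` call (pulp is a Django application), which is likely why the SQL is hard to read. For illustration only, here is a logically equivalent form of the same query obtained by applying De Morgan's laws; this is a readability sketch of what the query selects, not the actual fix that shipped in pulp-2to3-migration-0.11.12-1:

```sql
-- NOT (NOT A AND NOT B)  is equivalent to  A OR B  (De Morgan).
-- In other words: count pulp2 RPMs whose content either has not been
-- migrated yet (pulp3_content_id IS NULL) or is still referenced by an
-- unmigrated lazy-catalog entry.
SELECT COUNT(*) AS "__count"
FROM "pulp_2to3_migration_pulp2rpm" r
INNER JOIN "pulp_2to3_migration_pulp2content" c
  ON r."pulp2content_id" = c."pulp_id"
WHERE c."pulp3_content_id" IS NULL
   OR c."pulp2_id" IN (
        SELECT DISTINCT U0."pulp2_unit_id"
        FROM "pulp_2to3_migration_pulp2lazycatalog" U0
        WHERE U0."is_migrated" = false
      );
```

Running `EXPLAIN (ANALYZE, BUFFERS)` on either form against a large dataset would show where the time goes; the `IN (SELECT DISTINCT ...)` subquery over the lazy-catalog table is a plausible suspect if `pulp2_unit_id` is unindexed (an assumption here, not verified against the schema).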

Version-Release number of selected component (if applicable):
6.10

How reproducible:
100%

Steps to Reproduce:
1. Enable and sync a lot of repositories, all of them as immediate
2. Proceed with the migration

Actual results:
The process will spend a long time. Checking the pulp API, we can see the same Pulp task sitting at the same step for hours, and PG will be running the query for hours or days.

Expected results:
The content migration should be faster, and PG should not be stuck on any query.

Additional info:

Comment 7 Waldirio M Pinheiro 2022-08-18 16:25:19 UTC
*** Bug 2097204 has been marked as a duplicate of this bug. ***

Comment 8 wclark 2022-08-29 20:06:17 UTC
Created attachment 1908385 [details]
Hotfix RPM for Satellite 6.9.9

INSTALL INSTRUCTIONS:

1. Take a complete backup or snapshot of Satellite 6.9.9 server

2. Download the hotfix RPM attached to this BZ and copy it to Satellite server

3. # yum install ./python3-pulp-2to3-migration-0.11.10-2.HOTFIXRHBZ2074099RHBZ2038995.el7pc.noarch.rpm --disableplugin=foreman-protector

Comment 12 Daniel Alley 2022-10-20 13:21:05 UTC
Waldirio, shouldn't this be marked as 6.9, because it's a bug in the migration process between Pulp 2 and Pulp 3?

Comment 24 errata-xmlrpc 2022-11-17 17:17:17 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (Important: Satellite 6.9.10 Async Security Update), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHSA-2022:8532

Comment 25 Daniel Alley 2023-07-17 17:59:03 UTC
*** Bug 2138974 has been marked as a duplicate of this bug. ***

