Bug 2025948 - content prepare for 2to3 migration fails with "The requested URL's length exceeds the capacity limit for this server"
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Satellite
Classification: Red Hat
Component: Repositories
Version: 6.9.7
Hardware: Unspecified
OS: Unspecified
Priority: high
Severity: medium
Target Milestone: 6.9.8
Assignee: Justin Sherrill
QA Contact: Shweta Singh
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2021-11-23 13:13 UTC by Justin Sherrill
Modified: 2022-03-02 14:54 UTC
CC List: 12 users

Fixed In Version: tfm-rubygem-katello-3.18.1.49-1
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2022-01-27 17:33:09 UTC
Target Upstream Version:
Embargoed:


Attachments


Links
System ID | Private | Priority | Status | Summary | Last Updated
Foreman Issue Tracker 34103 | 0 | High | Ready For Testing | content prepare for 2to3 migration fails with "The requested URL's length exceeds the capacity limit for this server" | 2021-12-08 19:08:41 UTC
Red Hat Knowledge Base (Solution) 6601221 | 0 | None | None | None | 2021-12-19 16:06:14 UTC
Red Hat Product Errata RHBA-2022:0320 | 0 | None | None | None | 2022-01-27 17:33:17 UTC

Description Justin Sherrill 2021-11-23 13:13:48 UTC
Description of problem:

Migrating content with a large number of docker manifest lists (~90) fails with "The requested URL's length exceeds the capacity limit for this server".

The user's Satellite had osp13/osp16 container images synced.
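
Background, inferred from the traceback below and the GET_QUERY_ID_LENGTH workaround in comment 1: the migration client passes each batch of pulp2 content IDs as GET query parameters, so a large enough batch pushes the request line past Apache's default LimitRequestLine of 8190 bytes and Apache answers 414 before Pulp ever sees the request. A standalone shell sketch of that behaviour; the hostname and URL path are illustrative placeholders, not the real migration endpoint:

    # build an oversized query string (well past Apache's 8190-byte default) and
    # watch the server reject the request line with 414
    long_query=$(head -c 10000 /dev/zero | tr '\0' 'a')
    curl -sk -o /dev/null -w '%{http_code}\n' \
      "https://satellite.example.com/pulp/api/v3/status/?pad=${long_query}"
    # prints: 414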


How reproducible:
always

Steps to Reproduce:
1. Sync docker repos that contain docker manifest lists (at least 90 of them).
2. Run 'foreman-maintain content prepare' (see the sketch below).
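
A hedged shell sketch of those two steps; the organization, product, and repository names are made-up placeholders, and ubi8/ubi is only an example of an upstream image that publishes manifest lists (on an unfixed 6.9.7, the second step is what hits the 414):

    # 1. create and fully sync a docker repository that carries manifest lists
    hammer repository create \
      --organization "Default Organization" \
      --product "Containers" \
      --name "ubi8" \
      --content-type "docker" \
      --url "https://registry.access.redhat.com" \
      --docker-upstream-name "ubi8/ubi"
    hammer repository synchronize \
      --organization "Default Organization" \
      --product "Containers" \
      --name "ubi8"

    # 2. run the pulp2-to-pulp3 preparation
    foreman-maintain content prepare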


Additional info:


2021-11-23T12:51:24 [E|bac|] Error message: the server returns an error
 | HTTP status code: 414
 | Response headers: {"Date"=>"Tue, 23 Nov 2021 11:51:24 GMT", "Server"=>"Apache", "Content-Length"=>"248", "Connection"=>"close", "Content-Type"=>"text/html; charset=iso-8859-1"}
 | Response body: <!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
 | <html><head>
 | <title>414 Request-URI Too Long</title>
 | </head><body>
 | <h1>Request-URI Too Long</h1>
 | <p>The requested URL's length exceeds the capacity
 | limit for this server.<br />
 | </p>
 | </body></html>
 |  (Pulp2to3MigrationClient::ApiError)
 | /opt/theforeman/tfm/root/usr/share/gems/gems/pulp_2to3_migration_client-0.10.0/lib/pulp_2to3_migration_client/api_client.rb:81:in `call_api'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/pulp_2to3_migration_client-0.10.0/lib/pulp_2to3_migration_client/api/pulp2_content_api.rb:139:in `list_with_http_info'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/pulp_2to3_migration_client-0.10.0/lib/pulp_2to3_migration_client/api/pulp2_content_api.rb:43:in `list'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.18.1.46/app/services/katello/pulp3/migration.rb:354:in `block in import_content_type'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-6.0.3.4/lib/active_record/relation/batches.rb:136:in `block in find_in_batches'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-6.0.3.4/lib/active_record/relation/batches.rb:238:in `block in in_batches'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-6.0.3.4/lib/active_record/relation/batches.rb:222:in `loop'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-6.0.3.4/lib/active_record/relation/batches.rb:222:in `in_batches'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/activerecord-6.0.3.4/lib/active_record/relation/batches.rb:135:in `find_in_batches'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.18.1.46/app/services/katello/pulp3/migration.rb:351:in `import_content_type'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.18.1.46/app/services/katello/pulp3/migration.rb:114:in `block (4 levels) in import_pulp3_content'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.18.1.46/app/lib/katello/logging.rb:6:in `time'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.18.1.46/app/services/katello/pulp3/migration.rb:113:in `block (3 levels) in import_pulp3_content'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.18.1.46/app/services/katello/pulp3/migration.rb:112:in `each'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.18.1.46/app/services/katello/pulp3/migration.rb:112:in `block (2 levels) in import_pulp3_content'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.18.1.46/app/services/katello/pulp3/migration.rb:107:in `each'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.18.1.46/app/services/katello/pulp3/migration.rb:107:in `block in import_pulp3_content'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.18.1.46/app/lib/katello/logging.rb:6:in `time'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.18.1.46/app/services/katello/pulp3/migration.rb:106:in `import_pulp3_content'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.18.1.46/app/lib/actions/pulp3/import_migration.rb:11:in `run'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.8/lib/dynflow/action.rb:571:in `block (3 levels) in execute_run'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.8/lib/dynflow/middleware/stack.rb:27:in `pass'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.8/lib/dynflow/middleware.rb:19:in `pass'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.8/lib/dynflow/middleware.rb:32:in `run'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.8/lib/dynflow/middleware/stack.rb:23:in `call'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.8/lib/dynflow/middleware/stack.rb:27:in `pass'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/dynflow-1.4.8/lib/dynflow/middleware.rb:19:in `pass'
 | /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.18.1.46/app/lib/actions/middleware/remote_action.rb:16:in `block in run'

Comment 1 Justin Sherrill 2021-11-23 20:41:43 UTC
Workaround:


Edit /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.18.*/app/services/katello/pulp3/migration.rb

and change:

      GET_QUERY_ID_LENGTH = 90
to
      GET_QUERY_ID_LENGTH = 35


Then run:

satellite-maintain service restart
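
The same change can be scripted; a hedged equivalent of the manual edit above (keep the .bak copy and confirm the substitution landed before restarting services):

    # apply the workaround from this comment non-interactively
    sed -i.bak 's/GET_QUERY_ID_LENGTH = 90/GET_QUERY_ID_LENGTH = 35/' \
      /opt/theforeman/tfm/root/usr/share/gems/gems/katello-3.18.*/app/services/katello/pulp3/migration.rb
    satellite-maintain service restart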

Comment 2 Francisco Garcia 2021-11-24 07:13:10 UTC
Hi again,

after re-running the upgrade process, it launched several hundred Refresh Distribution tasks, and none of them had made any progress for 8+ hours.

root@satellite ~ # hammer task list --search "state != stopped"                                          
-------------------------------------|-------------------------------------------------------|-----------|---------|---------------------|----------|---------------|--------------
ID                                   | ACTION                                                | STATE     | RESULT  | STARTED AT          | ENDED AT | OWNER         | TASK ERRORS
-------------------------------------|-------------------------------------------------------|-----------|---------|---------------------|----------|---------------|--------------
16e3f018-cfbd-44f7-b045-e3b2e2d38818 | InsightsCloud::Async::InsightsScheduledSync           | scheduled | pending |                     |          |               |
2e501080-7424-473a-aef3-d4930fcbab22 | Create RSS notifications                              | scheduled | pending |                     |          |               |
2e59ca5f-002d-40c8-8375-a1454f28d037 | ForemanInventoryUpload::Async::GenerateAllReportsJob  | scheduled | pending |                     |          |               |
58562523-6fc7-4501-99af-c1d57d85526e | Subscription expiration notification                  | scheduled | pending |                     |          |               | 
6921ee03-72bd-4b51-8363-873b52f5efa9 | Pulp disk space notification                          | scheduled | pending |                     |          |               | 
8d3f8f06-926d-4851-8dd7-36a7925efe8d | Inventory scheduled sync                              | scheduled | pending |                     |          | foreman_admin | 
a807834c-6579-4151-838b-128e28879dcb | Subscription Manifest validity check                  | scheduled | pending |                     |          |               | 
c80f4f03-826e-4aab-bdee-642e3d4a7ba4 | Subscription expiration notification                  | scheduled | pending |                     |          |               | 
de9b0d27-4b35-46d8-8f93-646666c7d3bf | Clean up StoredValues                                 | scheduled | pending |                     |          |               | 
edd7bbe5-373b-4ea7-8149-79d5ac708f46 | Pulp disk space notification                          | scheduled | pending |                     |          |               | 
f423c7a1-0d06-4c1e-b972-320802d75923 | Insights client status aging                          | scheduled | pending |                     |          | foreman_admin | 
b19eb5a4-f1bf-4b9a-a90c-4eb8cfa4dc7b | Refresh distribution                                  | running   | pending | 2021/11/23 14:11:08 |          | foreman_admin | 
4dc43a06-200f-4208-994f-25c68debc27f | Refresh distribution                                  | running   | pending | 2021/11/23 14:11:07 |          | foreman_admin | 
be376298-5c9d-49bd-8245-1cb289d54f76 | Refresh distribution                                  | running   | pending | 2021/11/23 14:10:59 |          | foreman_admin |
b28d938a-5799-4341-8f03-b0a2e795e8db | Refresh distribution                                  | running   | pending | 2021/11/23 14:10:58 |          | foreman_admin |
3a50255c-221b-43b3-9fd2-f5a3a17b4012 | Refresh distribution                                  | running   | pending | 2021/11/23 14:10:56 |          | foreman_admin |
61c14a7b-87e9-40ce-a13d-caa8848af41f | Refresh distribution                                  | running   | pending | 2021/11/23 14:10:55 |          | foreman_admin |
11647722-388c-4b8b-9beb-ee55ac812464 | Refresh distribution                                  | running   | pending | 2021/11/23 14:10:54 |          | foreman_admin |
caa1cb88-622b-4d78-b482-b79399ef362d | Refresh distribution                                  | running   | pending | 2021/11/23 14:10:53 |          | foreman_admin |
6a47416e-085a-46c6-8a7c-3825e5be5ca0 | Refresh distribution                                  | running   | pending | 2021/11/23 14:10:52 |          | foreman_admin |          
daa44fa1-8475-474a-b975-558faf69f28e | Refresh distribution                                  | running   | pending | 2021/11/23 14:08:54 |          | foreman_admin |              
e9a51b4c-4d89-4d73-aee6-bdcf6c806720 | Refresh distribution                                  | running   | pending | 2021/11/23 14:08:53 |          | foreman_admin |              
5521daca-f1a9-4fc4-9d07-46ad8790dedc | Refresh distribution                                  | running   | pending | 2021/11/23 14:08:52 |          | foreman_admin |              
7f7dd069-6564-4f64-bc41-8ce7ebfd6383 | Refresh distribution                                  | running   | pending | 2021/11/23 14:08:51 |          | foreman_admin |              
5270b317-1fbf-452d-acac-82159088b0c3 | Refresh distribution                                  | running   | pending | 2021/11/23 14:08:50 |          | foreman_admin |              

After it became obvious they weren't progressing, I tried cleaning them up, as I thought they were leftovers from an earlier upgrade process.

root@satellite ~ # foreman-rake foreman_tasks:cleanup TASK_SEARCH="label = Actions::Pulp3::Repository::RefreshDistribution" STATES="running,paused,stopped,suspended"
API controllers newer than Apipie cache! Run apipie:cache rake task to regenerate cache.
About to remove 693 tasks matching filter
Deleted 693 tasks matching filter
No orphaned task locks found, skipping.
No orphaned execution plans found, skipping.
No orphaned job invocations found, skipping.

Now I've relaunched the upgrade and it is still stuck on a task that is not progressing, Actions::Pulp3::ContentMigration:

# foreman-maintain service restart ; time foreman-maintain upgrade run --target-version=6.10 -y 
[...]
| All services started                                                [OK]      
--------------------------------------------------------------------------------
Switch support for certain content from Pulp 2 to Pulp 3: 
Performing final content migration before switching content 

# hammer task list --search 'state = running'
                                                                                                         
-------------------------------------|--------------------|---------|---------|---------------------|----------|-------------------|------------
ID                                   | ACTION             | STATE   | RESULT  | STARTED AT          | ENDED AT | OWNER             | TASK ERRORS
-------------------------------------|--------------------|---------|---------|---------------------|----------|-------------------|------------
bc656576-4717-4d6c-9f82-a76a7996385d | Content Migration  | running | pending | 2021/11/23 23:05:07 |          | foreman_api_admin |            
-------------------------------------|--------------------|---------|---------|---------------------|----------|-------------------|------------

How can I progress from here? (I attached a new sosreport to the case itself.)
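
Not an answer to the question above, just a generic way to watch whether the running task is actually advancing; the UUID comes from the listing above, and 'hammer task progress' is provided by the foreman_tasks hammer plugin:

    # poll the progress of the running Content Migration task
    hammer task progress --id bc656576-4717-4d6c-9f82-a76a7996385d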

Comment 5 Justin Sherrill 2021-12-07 18:31:11 UTC
Created redmine issue https://projects.theforeman.org/issues/34103 from this bug

Comment 7 Bryan Kearney 2021-12-13 04:05:42 UTC
Upstream bug assigned to jsherril

Comment 9 Bryan Kearney 2022-01-06 00:04:59 UTC
Moving this bug to POST for triage into Satellite since the upstream issue https://projects.theforeman.org/issues/34103 has been resolved.

Comment 17 Shweta Singh 2022-01-25 16:42:04 UTC
Verified in 6.9.8 snap 2 

Synced a repository containing a large number of docker manifest lists (~90).
Complete repo sync and content migration to Pulp 3 succeed on Satellite 6.9.8.

Steps to reproduce:
1. Create a repo containing a large number of docker manifest lists.
2. Sync the repo with a "complete" sync.
3. Run "foreman-maintain content prepare" to migrate content to Pulp 3.

Expected Results: 

Migration should be successful without errors.

Actual Results:

Content Migration completed successfully.
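
Before re-running the migration on an affected system, a quick check that the build from the "Fixed In Version" field above is actually installed:

    # the fix shipped in tfm-rubygem-katello-3.18.1.49-1
    rpm -q tfm-rubygem-katello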

Comment 21 errata-xmlrpc 2022-01-27 17:33:09 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (Satellite 6.9.8 Async Bug Fix Update), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2022:0320
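
For completeness, a hedged sketch of picking up a 6.9 z-stream advisory such as this one on a connected Satellite; the official Satellite 6.9 upgrade documentation takes precedence over this one-liner:

    # update to the latest 6.9.z async release, which carries RHBA-2022:0320
    satellite-maintain upgrade run --target-version 6.9.z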

