Bug 2026277 - null value in column "manifest_id" violates not-null constraint error while syncing RHOSP container images
Summary: null value in column "manifest_id" violates not-null constraint error while syncing RHOSP container images
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Satellite
Classification: Red Hat
Component: Pulp
Version: 6.10.1
Hardware: Unspecified
OS: Unspecified
Priority: high
Severity: high
Target Milestone: 6.11.0
Assignee: satellite6-bugs
QA Contact: Lai
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2021-11-24 08:58 UTC by Rafal Szmigiel
Modified: 2022-08-08 08:36 UTC
CC List: 18 users

Fixed In Version: tfm-pulpcore-python-pulp-container-2.9.2
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Clones: 2043697
Environment:
Last Closed: 2022-07-05 14:30:29 UTC
Target Upstream Version:
Embargoed:


Attachments
python3-pulp-container-2.8.1-0.3.HOTFIXRHBZ2026277.el7pc.noarch.rpm (212.90 KB, application/x-rpm)
2022-01-17 20:08 UTC, wclark


Links
System ID / Status / Summary / Last Updated
Github pulp pulp_container issues 537 / open / null value in column "manifest_id" violates not-null constraint error during sync / 2022-01-14 16:50:55 UTC
Red Hat Knowledge Base (Solution) 6962604 / 2022-06-09 12:21:46 UTC
Red Hat Product Errata RHSA-2022:5498 / 2022-07-05 14:30:39 UTC

Description Rafal Szmigiel 2021-11-24 08:58:00 UTC
Description of problem:

Hi,

I'm not sure whether this is a Satellite issue, a container image registry issue, or an RHOSP documentation issue, but since it occurs in Satellite I hope you'll at least be able to point me in the right direction.

I'm running Red Hat Satellite (build: 6.10.1) and following the procedure to mirror RHOSP container images into Satellite, as described here: https://access.redhat.com/documentation/en-us/red_hat_openstack_platform/16.1/html/director_installation_and_usage/preparing-for-director-installation#preparing-a-satellite-server-for-container-images

At step "6 Synchronize the container images" where I run:


$ hammer product synchronize --organization "Default Organization" --name "OSP Containers"

the synchronisation process fails for some of the images, including but not limited to nova-api, cinder-api, and nova-compute,

with the following error:

nova-api:


# hammer repository sync --id 89
[................................................................................................................................................................................................................................................................................................................................................................................................................................] [100%]
No content added.
Total steps: 11502/11502
--------------------------------
Associating Content: 11368/11368
Downloading Artifacts: 67/67
Downloading tag list: 1/1
Processing Tags: 66/66
Error: null value in column "manifest_id" violates not-null constraint
DETAIL:  Failing row contains (2524398, null, f998f2de-a87e-42e8-8c41-98e70e0165f4).

# hammer task show --id e45dc3cc-0110-4f2b-a17b-7259fe46c936
ID:          e45dc3cc-0110-4f2b-a17b-7259fe46c936
Action:      Synchronize repository {"text"=>"repository 'nova-api'", "link"=>nil} product {"text"=>"product 'OSP Containers'", "link"=>"/products/292/"} organization {"text"=>"organization 'Default Organization'", "link"=>"/organizations/1/edit"}
State:       stopped
Result:      warning
Started at:  2021/11/24 08:29:35
Ended at:    2021/11/24 08:30:24
Owner:       admin
Task errors: null value in column "manifest_id" violates not-null constraint
DETAIL:  Failing row contains (2524398, null, f998f2de-a87e-42e8-8c41-98e70e0165f4).


then after another run it works fine:
# hammer task show --id 40137c6d-73d6-4c7b-a900-bac9c92353e2
ID:          40137c6d-73d6-4c7b-a900-bac9c92353e2
Action:      Synchronize repository {"text"=>"repository 'nova-api'", "link"=>nil} product {"text"=>"product 'OSP Containers'", "link"=>"/products/292/"} organization {"text"=>"organization 'Default Organization'", "link"=>"/organizations/1/edit"}
State:       stopped
Result:      success
Started at:  2021/11/24 08:41:05
Ended at:    2021/11/24 08:43:03
Owner:       admin
Task errors:

The same applies to some other repositories, such as nova-compute:
# hammer repository sync --name nova-compute --product 'OSP Containers' --organization 'Default Organization'
[................................................................................................................................................................................................................................................................................................................................................................................................................................] [100%]
No content added.
Total steps: 18336/18336
--------------------------------
Associating Content: 18175/18175
Downloading Artifacts: 82/82
Downloading tag list: 1/1
Processing Tags: 78/78
Error: null value in column "manifest_id" violates not-null constraint
DETAIL:  Failing row contains (2543177, null, dbc5093b-5b54-40ac-beec-db0e0a6b7f69).



As a result, no content is added to the Satellite registry for the affected images:

Content Counts:
Container Image Manifests: 0
Container Image Manifest Lists: 0
Container Image Tags: 0
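For reference, the same counts can also be read from the CLI; a minimal sketch, reusing repository id 89 from the sync command above:

```
# Print repository details, including the Content Counts section.
hammer repository info --id 89
```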


Version-Release number of selected component (if applicable):
6.10.1

How reproducible:
Happens randomly for random repositories. Sometimes running the same repo sync multiple times in a row doesn't change anything; sometimes it just starts working.
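As an aside, this "re-run until it works" behaviour can be scripted as an ad-hoc workaround. A minimal sketch, assuming hammer returns a non-zero exit code when the sync task fails, and reusing repository id 89 from the example above:

```
# Keep retrying the sync until hammer reports success.
until hammer repository sync --id 89; do
  echo "sync failed, retrying in 60 seconds..."
  sleep 60
done
```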

Actual results:
Some OSP container images are not mirrored into Satellite's registry.

Expected results:
All of the required OSP container images are mirrored to Satellite's registry.

Additional info:

Comment 1 Rafal Szmigiel 2021-11-25 10:40:14 UTC
I've configured the Sync Policy to re-run the sync every hour. After about 12 hours it finally synchronised all images.
Still, this is pretty disappointing and I'd like to know where the problem comes from.
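For reference, if "Sync Policy" here means a Satellite sync plan, an hourly re-sync can be set up from hammer roughly as below. This is only a sketch; the plan name and start date are placeholders:

```
# Create an hourly sync plan (name and start date are placeholders).
hammer sync-plan create \
  --organization "Default Organization" \
  --name "OSP Containers hourly" \
  --interval hourly \
  --enabled true \
  --sync-date "2021-11-25 00:00:00"

# Attach the plan to the product so all of its repositories re-sync every hour
# (--sync-plan-id can be used instead if name resolution is not available).
hammer product set-sync-plan \
  --organization "Default Organization" \
  --name "OSP Containers" \
  --sync-plan "OSP Containers hourly"
```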

Comment 2 Ina Panova 2021-12-02 16:07:52 UTC
@Rafal, can you share some logs with tracebacks? Thanks.

Comment 5 Rafal Szmigiel 2021-12-02 17:16:35 UTC
Hey,

Attaching tracebacks as requested:

Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: pulp [a2b82b5f-0427-4420-919e-b149d00c759e]: pulpcore.tasking.pulpcore_worker:INFO: Task fb0f3c7c-e537-4d15-9363-ea357f73d910 failed (null value in column "manifest_id" violates not-null constraint
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: DETAIL:  Failing row contains (34796245, null, 30fd903d-0869-4787-9743-67aa57e068ca).
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: )
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: pulp [a2b82b5f-0427-4420-919e-b149d00c759e]: pulpcore.tasking.pulpcore_worker:INFO:   File "/usr/lib/python3.6/site-packages/pulpcore/tasking/pulpcore_worker.py", line 317, in _perform_task
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: result = func(*args, **kwargs)
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: File "/usr/lib/python3.6/site-packages/pulp_container/app/tasks/synchronize.py", line 44, in synchronize
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: return dv.create()
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: File "/usr/lib/python3.6/site-packages/pulpcore/plugin/stages/declarative_version.py", line 151, in create
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: loop.run_until_complete(pipeline)
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: File "/usr/lib64/python3.6/asyncio/base_events.py", line 484, in run_until_complete
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: return future.result()
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: File "/usr/lib/python3.6/site-packages/pulpcore/plugin/stages/api.py", line 225, in create_pipeline
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: await asyncio.gather(*futures)
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: File "/usr/lib/python3.6/site-packages/pulpcore/plugin/stages/api.py", line 43, in __call__
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: await self.run()
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: File "/usr/lib/python3.6/site-packages/pulp_container/app/tasks/sync_stages.py", line 461, in run
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: BlobManifest.objects.bulk_create(objs=blob_list, ignore_conflicts=True, batch_size=1000)
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: File "/usr/lib/python3.6/site-packages/django/db/models/manager.py", line 82, in manager_method
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: return getattr(self.get_queryset(), name)(*args, **kwargs)
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: File "/usr/lib/python3.6/site-packages/django/db/models/query.py", line 474, in bulk_create
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: ids = self._batched_insert(objs_without_pk, fields, batch_size, ignore_conflicts=ignore_conflicts)
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: File "/usr/lib/python3.6/site-packages/django/db/models/query.py", line 1211, in _batched_insert
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: self._insert(item, fields=fields, using=self.db, ignore_conflicts=ignore_conflicts)
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: File "/usr/lib/python3.6/site-packages/django/db/models/query.py", line 1186, in _insert
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: return query.get_compiler(using=using).execute_sql(return_id)
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: File "/usr/lib/python3.6/site-packages/django/db/models/sql/compiler.py", line 1377, in execute_sql
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: cursor.execute(sql, params)
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: File "/usr/lib/python3.6/site-packages/django/db/backends/utils.py", line 67, in execute
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: return self._execute_with_wrappers(sql, params, many=False, executor=self._execute)
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: File "/usr/lib/python3.6/site-packages/django/db/backends/utils.py", line 76, in _execute_with_wrappers
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: return executor(sql, params, many, context)
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: File "/usr/lib/python3.6/site-packages/django/db/backends/utils.py", line 84, in _execute
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: return self.cursor.execute(sql, params)
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: File "/usr/lib/python3.6/site-packages/django/db/utils.py", line 89, in __exit__
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: raise dj_exc_value.with_traceback(traceback) from exc_value
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: File "/usr/lib/python3.6/site-packages/django/db/backends/utils.py", line 84, in _execute
Nov 30 03:44:59 satellite.local pulpcore-worker-3[1231]: return self.cursor.execute(sql, params)

Nov 23 09:42:23 satellite.local pulpcore-worker-5: pulp [65351aa8-eb7b-4167-ab35-57e104b5a339]: pulpcore.tasking.pulpcore_worker:INFO: Task 7d38dc27-fb2e-489a-8895-bc250e59fe4b failed (null value in column "manifest_id" violates not-null constraint
Nov 23 09:42:23 satellite.local pulpcore-worker-5: DETAIL:  Failing row contains (41938, null, 3cdba870-bc13-4637-9e30-ae4a8d93e135).
Nov 23 09:42:23 satellite.local pulpcore-worker-5: )
Nov 23 09:42:23 satellite.local pulpcore-worker-5: pulp [65351aa8-eb7b-4167-ab35-57e104b5a339]: pulpcore.tasking.pulpcore_worker:INFO:   File "/usr/lib/python3.6/site-packages/pulpcore/tasking/pulpcore_worker.py", line 317, in _perform_task
Nov 23 09:42:23 satellite.local pulpcore-worker-5: result = func(*args, **kwargs)
Nov 23 09:42:23 satellite.local pulpcore-worker-5: File "/usr/lib/python3.6/site-packages/pulp_container/app/tasks/synchronize.py", line 44, in synchronize
Nov 23 09:42:23 satellite.local pulpcore-worker-5: return dv.create()
Nov 23 09:42:23 satellite.local pulpcore-worker-5: File "/usr/lib/python3.6/site-packages/pulpcore/plugin/stages/declarative_version.py", line 151, in create
Nov 23 09:42:23 satellite.local pulpcore-worker-5: loop.run_until_complete(pipeline)
Nov 23 09:42:23 satellite.local pulpcore-worker-5: File "/usr/lib64/python3.6/asyncio/base_events.py", line 484, in run_until_complete
Nov 23 09:42:23 satellite.local pulpcore-worker-5: return future.result()
Nov 23 09:42:23 satellite.local pulpcore-worker-5: File "/usr/lib/python3.6/site-packages/pulpcore/plugin/stages/api.py", line 225, in create_pipeline
Nov 23 09:42:23 satellite.local pulpcore-worker-5: await asyncio.gather(*futures)
Nov 23 09:42:23 satellite.local pulpcore-worker-5: File "/usr/lib/python3.6/site-packages/pulpcore/plugin/stages/api.py", line 43, in __call__
Nov 23 09:42:23 satellite.local pulpcore-worker-5: await self.run()
Nov 23 09:42:23 satellite.local pulpcore-worker-5: File "/usr/lib/python3.6/site-packages/pulp_container/app/tasks/sync_stages.py", line 461, in run
Nov 23 09:42:23 satellite.local pulpcore-worker-5: BlobManifest.objects.bulk_create(objs=blob_list, ignore_conflicts=True, batch_size=1000)
Nov 23 09:42:23 satellite.local pulpcore-worker-5: File "/usr/lib/python3.6/site-packages/django/db/models/manager.py", line 82, in manager_method
Nov 23 09:42:23 satellite.local pulpcore-worker-5: return getattr(self.get_queryset(), name)(*args, **kwargs)
Nov 23 09:42:23 satellite.local pulpcore-worker-5: File "/usr/lib/python3.6/site-packages/django/db/models/query.py", line 474, in bulk_create
Nov 23 09:42:23 satellite.local pulpcore-worker-5: ids = self._batched_insert(objs_without_pk, fields, batch_size, ignore_conflicts=ignore_conflicts)
Nov 23 09:42:23 satellite.local pulpcore-worker-5: File "/usr/lib/python3.6/site-packages/django/db/models/query.py", line 1211, in _batched_insert
Nov 23 09:42:23 satellite.local pulpcore-worker-5: self._insert(item, fields=fields, using=self.db, ignore_conflicts=ignore_conflicts)
Nov 23 09:42:23 satellite.local pulpcore-worker-5: File "/usr/lib/python3.6/site-packages/django/db/models/query.py", line 1186, in _insert
Nov 23 09:42:23 satellite.local pulpcore-worker-5: return query.get_compiler(using=using).execute_sql(return_id)
Nov 23 09:42:23 satellite.local pulpcore-worker-5: File "/usr/lib/python3.6/site-packages/django/db/models/sql/compiler.py", line 1377, in execute_sql
Nov 23 09:42:23 satellite.local pulpcore-worker-5: cursor.execute(sql, params)
Nov 23 09:42:23 satellite.local pulpcore-worker-5: File "/usr/lib/python3.6/site-packages/django/db/backends/utils.py", line 67, in execute
Nov 23 09:42:23 satellite.local pulpcore-worker-5: return self._execute_with_wrappers(sql, params, many=False, executor=self._execute)
Nov 23 09:42:23 satellite.local pulpcore-worker-5: File "/usr/lib/python3.6/site-packages/django/db/backends/utils.py", line 76, in _execute_with_wrappers
Nov 23 09:42:23 satellite.local pulpcore-worker-5: return executor(sql, params, many, context)
Nov 23 09:42:23 satellite.local pulpcore-worker-5: File "/usr/lib/python3.6/site-packages/django/db/backends/utils.py", line 84, in _execute
Nov 23 09:42:23 satellite.local pulpcore-worker-5: return self.cursor.execute(sql, params)
Nov 23 09:42:23 satellite.local pulpcore-worker-5: File "/usr/lib/python3.6/site-packages/django/db/utils.py", line 89, in __exit__
Nov 23 09:42:23 satellite.local pulpcore-worker-5: raise dj_exc_value.with_traceback(traceback) from exc_value
Nov 23 09:42:23 satellite.local pulpcore-worker-5: File "/usr/lib/python3.6/site-packages/django/db/backends/utils.py", line 84, in _execute
Nov 23 09:42:23 satellite.local pulpcore-worker-5: return self.cursor.execute(sql, params)

Thank you for your help!

Best Regards,

Rafal

Comment 6 Ina Panova 2021-12-06 14:38:37 UTC
@Rafal,
if you inspect the repo (for example nova-api) through the registry API, since not all the information is displayed in the RH registry UI, you will see that it has 66 tags, compared to the 33 tags listed at https://catalog.redhat.com/software/containers/rhosp-rhel8/openstack-nova-api/5de6bdfe5a13461646f8fe2a?tag=all&architecture=amd64

```
$ less | curl -H "Authorization: $ACCESS_TOKEN" 'https://registry.redhat.io/v2/rhosp-rhel8/openstack-nova-api/tags/list' -L| python -m json.tool | grep 16| wc -l
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   665  100   665    0     0    998      0 --:--:-- --:--:-- --:--:--   997
100  1136  100  1136    0     0   1629      0 --:--:-- --:--:-- --:--:--  1629
66
```

The rest of the tags you see are tags of the source containers, which are fairly big in size. As far as I know, OSP workflows do not care about these source files. I took one tag that references an image containing source files and inspected it. It has 506 layers; when I summed up the size of each layer it came to 1.2 GB, and that is just one image. In total this repo has 24 images that contain source files.
```
$ curl -H "Accept:application/vnd.docker.distribution.manifest.v2+json" -H "Authorization: $ACCESS_TOKEN" 'https://registry.redhat.io/v2/rhosp-rhel8/openstack-nova-api/manifests/16.1.5-2-source' -L| python -m json.tool | grep size| wc -l
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   701  100   701    0     0   1611      0 --:--:-- --:--:-- --:--:--  1611
100 82531  100 82531    0     0   149k      0 --:--:-- --:--:-- --:--:-- 1611k
506


$ less | curl -H "Authorization: $ACCESS_TOKEN" 'https://registry.redhat.io/v2/rhosp-rhel8/openstack-nova-api/tags/list' -L| python -m json.tool | grep source| wc -l
  % Total    % Received % Xferd  Average Speed   Time    Time     Time  Current
                                 Dload  Upload   Total   Spent    Left  Speed
100   665  100   665    0     0   2166      0 --:--:-- --:--:-- --:--:--  2159
100  1136  100  1136    0     0    964      0  0:00:01  0:00:01 --:--:--     0
24
```
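For reference, the 1.2 GB figure can be reproduced by summing the layer sizes reported in that manifest. A minimal sketch, assuming the same $ACCESS_TOKEN as in the commands above:

```
# Fetch the v2 manifest for the source tag and add up the "size" of every layer.
# The result is printed in bytes (roughly 1.2 GB for this tag).
curl -s -L \
  -H "Accept: application/vnd.docker.distribution.manifest.v2+json" \
  -H "Authorization: $ACCESS_TOKEN" \
  'https://registry.redhat.io/v2/rhosp-rhel8/openstack-nova-api/manifests/16.1.5-2-source' \
  | python -c 'import json,sys; m=json.load(sys.stdin); print(sum(l["size"] for l in m["layers"]))'
```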

If you exclude all images that are tagged as source images and sync only regular images, the repo size will be 18 GB. It took me 16 minutes to sync these tags.
```
18G	/var/lib/pulp/media/artifact
18G	/var/lib/pulp/media/
```

Suggestions:
1. When creating the repo, tell it to exclude tags that have 'source' in their name.
2. Do you need all of the repo's tags? If not, when creating the repo, tell it to sync only a specific list of tags.

If I am not mistaken, this option is exposed in Satellite as Limit Sync Tags: https://access.redhat.com/documentation/en-us/red_hat_satellite/6.10/html/content_management_guide/managing_container_images#Importing_Container_Images

3. Satellite 6.10 ships Pulp 3, which offers the option to sync container images with the on_demand policy. If you don't need all the bits right away, they are downloaded only when requested by a client, i.e. during podman pull; this is also configurable on the repo. The sync then takes only a few minutes to complete, since only manifests are downloaded.

I am still working on reproducing this BZ to confirm a couple of theories about where we can improve code-wise, regardless of the suggestions I've listed above, but so far I have run out of disk space and I am being throttled by the registry server.

Comment 7 Ian Ballou 2021-12-06 15:41:07 UTC
A couple of notes on what Ina said:

- Limit Sync Tags can indeed be used to sync only tags with specific names (a hammer sketch follows these notes).

- While the installed version of Pulp 3 supports it, Satellite doesn't yet allow on_demand syncing of container content in 6.10.
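A hedged sketch of applying Limit Sync Tags to an already-created repository from hammer; on 6.10 the option appears as --docker-tags-whitelist (the same option used with repository create later in this report), and repository id 89 and the tag list are placeholders:

```
# Restrict an existing container repository to specific tags, then re-sync
# so only the whitelisted tags are fetched.
hammer repository update --id 89 --docker-tags-whitelist "16.2,16.2.0"
hammer repository sync --id 89
```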

Comment 8 Ina Panova 2021-12-07 18:39:00 UTC
I don't seem to be able to reproduce this bug. I tried to sync nova-api, but based on the traceback provided, this upstream issue looks very plausible: https://pulp.plan.io/issues/9424

Comment 11 Ina Panova 2022-01-07 23:10:49 UTC
Since we're getting more customer cases, can I ask for a reproducer? I am having trouble getting this reproduced in my dev environment.
Maybe @rafal or @lai would be able to help with that?

Comment 13 Rafal Szmigiel 2022-01-10 10:06:18 UTC
Hey,

Just a small note: if I specify tag limits as suggested above, for instance:

while read IMAGE; do \
  IMAGE_NAME=$(echo $IMAGE | cut -d"/" -f3 | sed "s/openstack-//g") ; \
  IMAGE_NOURL=$(echo $IMAGE | sed "s/registry.redhat.io\///g") ; \
  hammer repository create \
  --organization "Default Organization" \
  --product "OSP Containers New" \
  --content-type docker \
  --url https://registry.redhat.io \
  --docker-upstream-name $IMAGE_NOURL \
  --upstream-username 'blebleble' \
  --upstream-password 'blablabla' \
  --docker-tags-whitelist 16.2,16.2.0 \
  --name $IMAGE_NAME ; done < satellite_images

the problem does not occur.
I believe the official doc should suggest using tag limits, not only because of this issue but also for the sake of disk space usage.
There is no need to mirror every single image, including source images.

Best Regards,

Rafal

Comment 28 Ina Panova 2022-01-14 15:53:40 UTC
Here is the Pulp upstream patch that should fix the original manifest_id null database error: https://github.com/pulp/pulp_container/pull/535

Comment 29 Mike McCune 2022-01-14 16:51:46 UTC
Moving to POST as the upstream patch is merged.

Comment 31 wclark 2022-01-17 19:41:48 UTC
Due to an issue with build tooling, the previously supplied RPM did not contain the fix. A new hotfix RPM will be attached shortly.

Comment 32 wclark 2022-01-17 20:08:15 UTC
Created attachment 1851425 [details]
python3-pulp-container-2.8.1-0.3.HOTFIXRHBZ2026277.el7pc.noarch.rpm

HOTFIX RPM is available for Satellite 6.10.1

To install the hotfix:

1. Take a complete backup or snapshot before installing the hotfix

2. Download the attached RPM and copy it to the affected Satellite/Capsule servers

3. # yum install ./python3-pulp-container-2.8.1-0.3.HOTFIXRHBZ2026277.el7pc.noarch.rpm --disableplugin=foreman-protector

4. # systemctl restart pulpcore-worker@*.service pulpcore-content.service
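A small follow-up sketch to confirm the hotfix took effect (the package name comes from the attached RPM; the service check is only a sanity check):

```
# Confirm the hotfix build of pulp_container is installed.
rpm -q python3-pulp-container
# Expect a 2.8.1-0.3.HOTFIXRHBZ2026277 build in the output.

# Confirm the Pulp workers and content app are running again after the restart.
systemctl list-units 'pulpcore-worker@*.service'
systemctl status pulpcore-content.service --no-pager
```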

Comment 35 Lai 2022-02-14 22:47:57 UTC
Steps to reproduce

1. Create a custom docker repo with openstack nova-api, cinder-api, and nova-compute (ensure that the system has adequate space) (I got mine from quay.io)
2. Sync the repos

Expected:
Repos should all sync successfully

Actual:
Repos do sync successfully.

Please note that the attached image doesn't include nova-compute; that's because of a limitation of my VM, which ran out of space after nova-api and cinder-api synced successfully. This limitation extends even to a fresh VM syncing just nova-compute by itself.

Verified in 7.0 snap 9 with python38-pulp-container-2.9.2-1.el8pc.noarch on both rhel8.5 and rhel7.9

Comment 40 errata-xmlrpc 2022-07-05 14:30:29 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (Moderate: Satellite 6.11 Release), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHSA-2022:5498

