Bug 1399294
| Summary: | Synchronizing a repository with a large number of RPMs causes high memory usage while indexing | | |
|---|---|---|---|
| Product: | Red Hat Satellite | Reporter: | Ivan Necas <inecas> |
| Component: | Repositories | Assignee: | Justin Sherrill <jsherril> |
| Status: | CLOSED ERRATA | QA Contact: | jcallaha |
| Severity: | high | Docs Contact: | |
| Priority: | high | | |
| Version: | 6.2.0 | CC: | abalakht, andrew.schofield, aperotti, bbuckingham, bkearney, brubisch, chrobert, daniele, erinn.looneytriggs, jcallaha, jsherril, kdixon, mmccune, pmoravec, sthirugn, wpinheir |
| Target Milestone: | Unspecified | Keywords: | Performance, PrioBumpField, Triaged |
| Target Release: | Unused | | |
| Hardware: | Unspecified | | |
| OS: | Unspecified | | |
| Whiteboard: | | | |
| Fixed In Version: | rubygem-katello-3.0.0.90-1 | Doc Type: | If docs needed, set a value |
| Doc Text: | | Story Points: | --- |
| Clone Of: | | Environment: | |
| Last Closed: | 2016-12-19 08:18:09 UTC | Type: | Bug |
| Regression: | --- | Mount Type: | --- |
| Documentation: | --- | CRM: | |
| Verified Versions: | | Category: | --- |
| oVirt Team: | --- | RHEL 7.3 requirements from Atomic Host: | |
| Cloudforms Team: | --- | Target Upstream Version: | |
| Embargoed: | | | |
| Bug Depends On: | | | |
| Bug Blocks: | 1353215, 1394313 | | |
Description
Ivan Necas
2016-11-28 17:13:05 UTC
The problem seems to be that we accumulate all of the metadata when indexing the RPMs and only then perform the indexing. We should instead do the indexing in slices, so that garbage collection can do its part in the meantime. See:

- https://github.com/Katello/katello/blob/ddf031d62943c122f0a877c4640686c1bae0c877/app/models/katello/glue/pulp/repo.rb#L403
- https://github.com/Katello/katello/blob/ddf031d62943c122f0a877c4640686c1bae0c877/app/models/katello/glue/pulp/repo.rb#L483
- https://github.com/Katello/katello/blob/ddf031d62943c122f0a877c4640686c1bae0c877/app/models/katello/glue/pulp/repo.rb#L458

I have not performed a deeper analysis of the code; these are just obvious examples.

Created Redmine issue http://projects.theforeman.org/issues/17512 from this bug.

Upstream bug component is Repositories.

Moving this bug to POST for triage into Satellite 6, since the upstream issue http://projects.theforeman.org/issues/17512 has been resolved.

*** Hotfix Available ***

The hotfix below includes fixes for 4 bugs (including this one) around content synchronization, memory consumption, and performance. This hotfix resolves: BZ 1288656, BZ 1391704, BZ 1398438, BZ 1399294.

Instructions for application:

1) Download to your Satellite: http://people.redhat.com/~mmccune/hotfix/HOTFIX-BZ-1288656-1391704-1398438-1399294.tar.gz
2) Verify the md5sum:
   $ md5sum HOTFIX-BZ-1288656-1391704-1398438-1399294.tar.gz
   6a47aa576cb18beb7bda19dec9149da8  HOTFIX-BZ-1288656-1391704-1398438-1399294.tar.gz
3) Extract, then upgrade from the subdirectory matching your platform (EL6 or EL7):
   rpm -Uvh tfm-rubygem-katello*.rpm
4) katello-service restart
5) Resume operations.

The bugs mentioned in this comment are undergoing formalized testing to profile memory use, but we wanted to make this hotfix available early for anyone interested in trying the fix before it is released as part of a 6.2.z release.
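The slice-based indexing described above can be sketched as follows. This is a minimal illustration, not Katello's actual code: `fetch_metadata`, `index_units`, and the slice size are hypothetical stand-ins for the Pulp metadata fetch and database indexing steps.

```ruby
SLICE_SIZE = 500

# Stub helpers standing in for the real work; in Katello these would
# fetch unit metadata from Pulp and write index rows to the database.
def fetch_metadata(ids)
  ids.map { |id| { id: id, name: "pkg-#{id}" } }
end

def index_units(metadata)
  metadata.size
end

# Index in slices instead of accumulating all metadata up front, so
# each batch becomes garbage-collectable before the next is fetched.
def index_in_slices(unit_ids, slice_size = SLICE_SIZE)
  indexed = 0
  unit_ids.each_slice(slice_size) do |slice|
    metadata = fetch_metadata(slice) # only this slice is held in memory
    index_units(metadata)
    indexed += slice.size
  end
  indexed
end
```

The point of the design is that peak memory is bounded by one slice's metadata rather than the whole repository's, which matters for repositories with tens of thousands of RPMs.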
WARNING: We just released Satellite 6.2.5: https://access.redhat.com/errata/RHBA-2016:2940. The hotfix in the previous comment is built against 6.2.4; the fix for this bug and the others mentioned is to be included in 6.2.6. A second hotfix will be attached for those using 6.2.5.

*** Hotfix for 6.2.5 ***

If you wish to apply the hotfix for this bug against 6.2.5, we have made it available here: http://people.redhat.com/~mmccune/hotfix/HOTFIX-6.2.5-BZ-1288656-1391704-1398438-1399294.tar.gz

Same instructions as Comment 12, but use the above tar.gz:

$ md5sum HOTFIX-6.2.5-BZ-1288656-1391704-1398438-1399294.tar.gz
bb4c1135306fd3306a97b5cf30405947  HOTFIX-6.2.5-BZ-1288656-1391704-1398438-1399294.tar.gz

Verified in Satellite 6.2.6.

Much smaller RSS growth seen during a sync of these 4 repositories:

- Red Hat Enterprise Linux 5 Desktop RPMs x86_64 5Client
- Red Hat Enterprise Linux 5 Desktop RPMs x86_64 5.11
- Red Hat Enterprise Linux 5 Desktop RPMs i386 5Client
- Red Hat Enterprise Linux 5 Desktop RPMs i386 5.11

```
foreman   2266 11.0 0.4 2024728 312760 ? Sl 10:55 0:20 dynflow_executor
---- Start Syncs ----
foreman   2266 11.0 0.4 2024728 313008 ? Sl 10:55 0:21 dynflow_executor
foreman   2266 10.7 0.4 2091292 321284 ? Sl 10:55 0:22 dynflow_executor
foreman   2266  9.9 0.4 2091292 323772 ? Sl 10:55 0:23 dynflow_executor
foreman   2266  9.1 0.5 2224420 330184 ? Sl 10:55 0:24 dynflow_executor
foreman   2266  8.3 0.5 2224420 333828 ? Sl 10:55 0:25 dynflow_executor
foreman   2266  7.5 0.5 2224420 340096 ? Sl 10:55 0:26 dynflow_executor
foreman   2266  7.2 0.5 2224420 341172 ? Sl 10:55 0:27 dynflow_executor
foreman   2266  6.7 0.5 2224420 342028 ? Sl 10:55 0:28 dynflow_executor
foreman   2266  6.3 0.5 2224420 342452 ? Sl 10:55 0:29 dynflow_executor
foreman   2266  6.1 0.5 2224420 342704 ? Sl 10:55 0:30 dynflow_executor
foreman   2266  5.9 0.5 2224420 343144 ? Sl 10:55 0:31 dynflow_executor
foreman   2266  5.7 0.5 2224420 343572 ? Sl 10:55 0:32 dynflow_executor
foreman   2266  5.5 0.5 2224420 344016 ? Sl 10:55 0:33 dynflow_executor
foreman   2266  5.3 0.5 2224420 347244 ? Sl 10:55 0:34 dynflow_executor
foreman   2266  5.2 0.5 2224420 347252 ? Sl 10:55 0:35 dynflow_executor
foreman   2266  5.1 0.5 2224420 347308 ? Sl 10:55 0:36 dynflow_executor
foreman   2266  4.9 0.5 2224420 347328 ? Sl 10:55 0:37 dynflow_executor
foreman   2266  4.8 0.5 2224420 350096 ? Sl 10:55 0:38 dynflow_executor
foreman   2266  4.7 0.5 2224420 351288 ? Sl 10:55 0:39 dynflow_executor
foreman   2266  4.6 0.5 2224420 351936 ? Sl 10:55 0:40 dynflow_executor
foreman   2266  4.6 0.5 2224420 353436 ? Sl 10:55 0:41 dynflow_executor
foreman   2266  4.5 0.5 2224420 356160 ? Sl 10:55 0:42 dynflow_executor
foreman   2266  4.4 0.5 2224420 359080 ? Sl 10:55 0:43 dynflow_executor
foreman   2266  4.3 0.5 2224420 360604 ? Sl 10:55 0:44 dynflow_executor
foreman   2266  4.3 0.5 2224420 361448 ? Sl 10:55 0:45 dynflow_executor
foreman   2266  4.2 0.5 2224420 362072 ? Sl 10:55 0:46 dynflow_executor
foreman   2266  4.2 0.5 2224420 363532 ? Sl 10:55 0:47 dynflow_executor
foreman   2266  4.1 0.5 2224420 363684 ? Sl 10:55 0:48 dynflow_executor
foreman   2266  4.1 0.5 2224420 364068 ? Sl 10:55 0:49 dynflow_executor
foreman   2266  4.0 0.5 2224420 364720 ? Sl 10:55 0:50 dynflow_executor
foreman   2266  4.0 0.5 2224420 367256 ? Sl 10:55 0:51 dynflow_executor
foreman   2266  4.0 0.5 2224420 369168 ? Sl 10:55 0:52 dynflow_executor
foreman   2266  4.1 0.6 2226592 418600 ? Sl 10:55 0:54 dynflow_executor
foreman   2266  5.0 0.7 2292128 475572 ? Sl 10:55 1:06 dynflow_executor
foreman   2266  5.6 0.7 2289956 475992 ? Sl 10:55 1:16 dynflow_executor
foreman   2266  6.1 0.7 2289956 476084 ? Sl 10:55 1:24 dynflow_executor
foreman   2266  6.8 0.7 2289956 482516 ? Sl 10:55 1:34 dynflow_executor
foreman   2266  7.4 0.7 2289956 482896 ? Sl 10:55 1:42 dynflow_executor
foreman   2266  8.5 0.8 2358020 541964 ? Sl 10:55 2:00 dynflow_executor
foreman   2266  9.6 0.8 2358020 548760 ? Sl 10:55 2:17 dynflow_executor
foreman   2266 10.2 0.8 2358020 550188 ? Sl 10:55 2:27 dynflow_executor
foreman   2266 10.9 0.8 2358020 551960 ? Sl 10:55 2:39 dynflow_executor
foreman   2266 11.6 0.9 2358020 597992 ? Sl 10:55 2:50 dynflow_executor
foreman   2266 12.6 0.9 2358020 598136 ? Sl 10:55 3:07 dynflow_executor
foreman   2266 13.4 0.9 2425388 628604 ? Sl 10:55 3:21 dynflow_executor
foreman   2266 14.2 0.9 2429988 642172 ? Sl 10:55 3:35 dynflow_executor
foreman   2266 14.8 1.0 2495524 665116 ? Sl 10:55 3:45 dynflow_executor
foreman   2266 15.4 1.0 2495524 695004 ? Sl 10:55 3:57 dynflow_executor
foreman   2266 16.1 1.0 2490468 704028 ? Sl 10:55 4:09 dynflow_executor
foreman   2266 16.1 1.0 2490468 705884 ? Sl 10:55 4:11 dynflow_executor
---- Stop Syncs ----
foreman   2266 16.0 1.0 2490468 709196 ? Sl 10:55 4:12 dynflow_executor
foreman   2266 15.8 1.0 2490468 715508 ? Sl 10:55 4:13 dynflow_executor
foreman   2266 15.8 1.0 2490468 716196 ? Sl 10:55 4:14 dynflow_executor
foreman   2266 15.5 1.0 2490468 721588 ? Sl 10:55 4:15 dynflow_executor
foreman   2266 15.2 1.1 2490468 728080 ? Sl 10:55 4:16 dynflow_executor
```

Since the problem described in this bug report should be resolved by a recent advisory, it has been closed with a resolution of ERRATA. For information on the advisory, and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2016:2958
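For reference, the RSS growth over a run like the one above can be computed from the sixth whitespace-separated field of each `ps` sample. A minimal Ruby sketch, using the first and last readings from the verification listing (the sample strings are copied from that output):

```ruby
# RSS (KiB) is the sixth field of a `ps aux`-style line.
samples = [
  "foreman 2266 11.0 0.4 2024728 312760 ? Sl 10:55 0:20 dynflow_executor",
  "foreman 2266 15.2 1.1 2490468 728080 ? Sl 10:55 4:16 dynflow_executor",
]
rss = samples.map { |line| line.split[5].to_i }
growth_kib = rss.last - rss.first
puts "RSS growth over the run: #{growth_kib} KiB"
# prints "RSS growth over the run: 415320 KiB"
```

That is roughly 406 MiB of resident growth across four full repository syncs, which is the "much smaller RSS growth" the verification refers to.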