Bug 1640628 - Prevent multiple instances of /usr/bin/smart-proxy-openscap-send
Summary: Prevent multiple instances of /usr/bin/smart-proxy-openscap-send
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Satellite
Classification: Red Hat
Component: SCAP Plugin
Version: 6.4
Hardware: x86_64
OS: Linux
Priority: urgent
Severity: urgent
Target Milestone: 6.5.0
Assignee: Ondřej Pražák
QA Contact: Jameer Pathan
URL: https://projects.theforeman.org/issue...
Whiteboard:
Depends On:
Blocks:
 
Reported: 2018-10-18 12:49 UTC by Taft Sanders
Modified: 2019-11-05 22:34 UTC
CC: 13 users

Fixed In Version: rubygem-smart_proxy_openscap-0.7.1
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2019-05-14 12:38:16 UTC
Target Upstream Version:
Embargoed:


Attachments
rubygem-smart_proxy_openscap-0.7.1-1.el7sat.noarch.rpm (33.11 KB, application/x-rpm)
2019-08-15 16:56 UTC, Mike McCune


Links
System ID Private Priority Status Summary Last Updated
Foreman Issue Tracker 25242 0 Normal Closed Prevent multiple instances of /usr/bin/smart-proxy-openscap-send 2020-10-30 13:50:01 UTC
Red Hat Knowledge Base (Solution) 3665251 0 None None None 2018-10-24 20:12:10 UTC
Red Hat Product Errata RHSA-2019:1222 0 None None None 2019-05-14 12:38:25 UTC

Description Taft Sanders 2018-10-18 12:49:03 UTC
Description of problem:
Shortly after upgrading, a CPU spike occurred across dozens of Ruby processes.

Version-Release number of selected component (if applicable):
6.4

How reproducible:
n/a

Steps to Reproduce:
1.
2.
3.

Actual results:
8 cores at 100% usage

Expected results:
normal CPU usage

Additional info:
still looking to narrow down the scope of the cause, debugs to come

Comment 2 Taft Sanders 2018-10-18 13:08:28 UTC
top - 09:07:52 up 1 day, 22:21,  5 users,  load average: 71.08, 70.38, 69.25
Tasks: 596 total,  71 running, 525 sleeping,   0 stopped,   0 zombie
%Cpu0  :  77.9/22.1  100[|||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||]
%Cpu1  :  78.9/21.1  100[|||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||]
%Cpu2  :  79.8/20.2  100[|||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||]
%Cpu3  :  76.4/23.3  100[|||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||]
%Cpu4  :  78.9/21.1  100[|||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||]
%Cpu5  :  78.3/21.4  100[|||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||]
%Cpu6  :  79.2/20.8  100[|||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||]
%Cpu7  :  76.7/23.3  100[|||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||||]
KiB Mem : 64944124 total,  5754328 free, 16206064 used, 42983732 buff/cache
KiB Swap:  8257532 total,  8257532 free,        0 used. 46378396 avail Mem 

  PID USER      PR  NI    VIRT    RES    SHR S  %CPU %MEM     TIME+ COMMAND                             
 1884 foreman   20   0 1588000 513532   5120 S  15.0  0.8   6:13.44 ruby                                
 1127 foreman   20   0 1653664 509972   6084 S  14.6  0.8   6:30.50 ruby                                
 1066 foreman   20   0 1653464 509856   6172 S  14.0  0.8   6:09.10 ruby                                
 1152 foreman   20   0 1653764 510384   6148 S  14.0  0.8   6:32.77 ruby                                
 1327 foreman   20   0 1654164 512420   6144 S  13.7  0.8   6:15.43 ruby                                
29842 foreman+  20   0  332720  86436   5672 R  13.1  0.1   0:00.54 ruby                                
 1779 foreman   20   0 1653440 514500   6132 S  11.8  0.8   6:10.75 ruby                                
29883 foreman+  20   0  282220  35564   5360 R  10.8  0.1   0:00.34 ruby                                
29855 foreman+  20   0  288556  41892   5360 R  10.5  0.1   0:00.37 ruby                                
 1178 foreman   20   0 1653864 512804   6096 S  10.2  0.8   6:24.99 ruby                                
29749 foreman+  20   0       0      0      0 R  10.2  0.0   0:00.57 ruby                                
29878 foreman+  20   0  282616  36052   5360 R  10.2  0.1   0:00.32 ruby                                
 1103 foreman   20   0 1653564 509124   6068 S   9.9  0.8   6:27.08 ruby                                
 1610 foreman   20   0 1654364 514524   6120 S   9.6  0.8   6:24.62 ruby                                
29852 foreman+  20   0  283144  36620   5360 R   9.6  0.1   0:00.32 ruby                                
29867 foreman+  20   0  279712  32984   5360 R   9.6  0.1   0:00.30 ruby                                
29759 foreman+  20   0  337308  91196   5688 R   8.9  0.1   0:00.56 ruby                                
29873 foreman+  20   0  278260  31588   5360 R   8.9  0.0   0:00.28 ruby                                
29952 foreman+  20   0  276788  29996   5360 R   8.9  0.0   0:00.28 ruby                                
29739 foreman+  20   0  330180  83792   5672 R   8.6  0.1   0:00.51 ruby                                
29835 foreman+  20   0  287940  41564   5460 R   8.6  0.1   0:00.38 ruby                                
29872 foreman+  20   0  274960  28492   5360 R   8.6  0.0   0:00.27 ruby                                
29920 foreman+  20   0  278788  32120   5360 R   8.6  0.0   0:00.27 ruby                                
29839 foreman+  20   0  290140  43488   5360 R   8.3  0.1   0:00.37 ruby                                
29869 foreman+  20   0  274960  28432   5360 R   8.3  0.0   0:00.26 ruby                                
29892 foreman+  20   0  274460  27708   5360 R   8.3  0.0   0:00.26 ruby                                
 1203 foreman   20   0 1653964 509940   6120 S   8.0  0.8   6:18.87 ruby                                
32576 root      20   0 1424412   7212   2400 S   8.0  0.0   3:44.43 PassengerHelper                     
29821 foreman+  20   0  287792  41144   5360 R   7.6  0.1   0:00.36 ruby                                
 1243 foreman   20   0 1654064 511580   6148 S   7.3  0.8   6:15.00 ruby                                
29965 foreman+  20   0  273780  27084   5360 R   7.3  0.0   0:00.23 ruby                                
29967 foreman+  20   0  273112  26392   5360 R   7.3  0.0   0:00.23 ruby                                
 1428 foreman   20   0 1651668 514188   4724 S   6.1  0.8   6:14.12 ruby                                
29947 foreman+  20   0  270188  23664   5360 R   6.1  0.0   0:00.19 ruby                                
29968 foreman+  20   0  268652  22100   5356 R   5.1  0.0   0:00.16 ruby                                
29969 foreman+  20   0  267436  20768   5352 R   5.1  0.0   0:00.16 ruby                                
29982 foreman+  20   0  266200  19848   5668 R   4.5  0.0   0:00.14 ruby                                
29985 foreman+  20   0  264408  18020   5604 R   4.5  0.0   0:00.14 ruby

Comment 3 Taft Sanders 2018-10-18 13:40:37 UTC
Stopping the Satellite services changed nothing in the system load; however, rebooting the system seemed to level out the spike for now:

top - 09:39:45 up 3 min,  1 user,  load average: 2.40, 3.67, 1.72
Tasks: 270 total,   1 running, 269 sleeping,   0 stopped,   0 zombie
%Cpu0  :   1.0/0.3     1[|                                                                            ]
%Cpu1  :   1.0/0.3     1[|                                                                            ]
%Cpu2  :   1.3/0.3     2[|                                                                            ]
%Cpu3  :   0.3/0.0     0[                                                                             ]
%Cpu4  :   1.7/0.3     2[|                                                                            ]
%Cpu5  :   1.3/0.3     2[|                                                                            ]
%Cpu6  :   1.0/0.3     1[|                                                                            ]
%Cpu7  :   0.3/0.0     0[                                                                             ]
KiB Mem : 64944124 total, 56527144 free,  7461952 used,   955028 buff/cache
KiB Swap:  8257532 total,  8257532 free,        0 used. 56824956 avail Mem 

  PID USER      PR  NI    VIRT    RES    SHR S  %CPU %MEM     TIME+ COMMAND                             
 1116 tomcat    20   0    9.8g   1.4g  19124 S   2.3  2.3   1:04.56 java                                
   59 root      20   0       0      0      0 S   1.0  0.0   0:07.74 kworker/0:1                         
 1149 apache    20   0 1056416  75644  10692 S   1.0  0.1   0:02.46 celery                              
 1327 apache    20   0 1056384  75580  10692 S   0.7  0.1   0:02.33 celery                              
 1352 apache    20   0 1056404  73572  10692 S   0.7  0.1   0:02.38 celery                              
 1443 mongodb   20   0   25.0g  43704   7500 S   0.7  0.1   0:01.56 mongod                              
 1146 apache    20   0  541192  52172   5552 S   0.3  0.1   0:01.38 celery                              
 1330 apache    20   0 1056440  75620  10692 S   0.3  0.1   0:02.33 celery                              
 1336 apache    20   0 1056516  75652  10692 S   0.3  0.1   0:02.35 celery                              
 1340 apache    20   0 1056408  73584  10692 S   0.3  0.1   0:02.41 celery

Comment 4 Craig Donnelly 2018-10-18 19:39:20 UTC
Looks like it is related to this:

root       803  0.0  0.0 126284  1712 ?        Ss   09:36   0:01 /usr/sbin/crond -n
root      6585  0.0  0.0 182328  2532 ?        S    10:00   0:00  \_ /usr/sbin/CROND -n
foreman+  6597  0.0  0.0 113172  1212 ?        Ss   10:00   0:00  |   \_ /bin/sh -c smart-proxy-openscap-send >> /var/log/foreman-proxy/cron.log
foreman+  6601  3.6  0.0 251048 49932 ?        Sl   10:00  12:13  |       \_ ruby /usr/bin/smart-proxy-openscap-send
foreman+ 23308 44.0  0.1 318448 72036 ?        Rl   15:38   0:00  |           \_ ruby /usr/bin/smart-proxy-arf-json /var/tmp/cee46ee7-6971-4880-98af-6dc430685918-ad1f8e33-e340-4b2c-bf74-b542e6fc558b-1-1523265251-20181018-6601-1rfn0nx /var
root     22600  0.0  0.0 182328  2532 ?        S    10:30   0:00  \_ /usr/sbin/CROND -n
foreman+ 22612  0.0  0.0 113172  1212 ?        Ss   10:30   0:00  |   \_ /bin/sh -c smart-proxy-openscap-send >> /var/log/foreman-proxy/cron.log
foreman+ 22616  3.7  0.0 251848 50732 ?        Sl   10:30  11:31  |       \_ ruby /usr/bin/smart-proxy-openscap-send
foreman+ 23344  0.0  0.0 243064 10204 ?        Rl   15:38   0:00  |           \_ ruby /usr/bin/smart-proxy-arf-json /var/tmp/8167025e-6f0c-44dc-a262-bb9c0961dc03-ad1f8e33-e340-4b2c-bf74-b542e6fc558b-1-1522689558-20181018-22616-98f3ne /var
root     20805  0.0  0.0 182328  2532 ?        S    11:00   0:00  \_ /usr/sbin/CROND -n
foreman+ 20822  0.0  0.0 113172  1212 ?        Ss   11:00   0:00  |   \_ /bin/sh -c smart-proxy-openscap-send >> /var/log/foreman-proxy/cron.log
foreman+ 20833  3.9  0.0 252896 51784 ?        Sl   11:00  11:02  |       \_ ruby /usr/bin/smart-proxy-openscap-send
foreman+ 23300 42.0  0.1 316732 70452 ?        Rl   15:38   0:00  |           \_ ruby /usr/bin/smart-proxy-arf-json /var/tmp/97c333f0-5a81-4603-8cb6-4e016305676c-ad1f8e33-e340-4b2c-bf74-b542e6fc558b-1-1521811767-20181018-20833-1qwnngi /va
root     31643  0.0  0.0 182328  2532 ?        S    11:30   0:00  \_ /usr/sbin/CROND -n
foreman+ 31653  0.0  0.0 113172  1212 ?        Ss   11:30   0:00  |   \_ /bin/sh -c smart-proxy-openscap-send >> /var/log/foreman-proxy/cron.log
foreman+ 31664  3.8  0.0 248360 47232 ?        Rl   11:30   9:41  |       \_ ruby /usr/bin/smart-proxy-openscap-send
root     20760  0.0  0.0 182328  2532 ?        S    12:00   0:00  \_ /usr/sbin/CROND -n
foreman+ 20780  0.0  0.0 113172  1212 ?        Ss   12:00   0:00  |   \_ /bin/sh -c smart-proxy-openscap-send >> /var/log/foreman-proxy/cron.log
foreman+ 20796  3.9  0.0 247392 46404 ?        Sl   12:00   8:33  |       \_ ruby /usr/bin/smart-proxy-openscap-send
root     23047  0.0  0.0 182328  2532 ?        S    12:30   0:00  \_ /usr/sbin/CROND -n
foreman+ 23054  0.0  0.0 113172  1212 ?        Ss   12:30   0:00  |   \_ /bin/sh -c smart-proxy-openscap-send >> /var/log/foreman-proxy/cron.log
foreman+ 23061  3.6  0.0 246664 45672 ?        Sl   12:30   6:59  |       \_ ruby /usr/bin/smart-proxy-openscap-send
foreman+ 23341  0.0  0.0 266600 20268 ?        Rl   15:38   0:00  |           \_ ruby /usr/bin/smart-proxy-arf-json /var/tmp/d6504948-30f5-4f93-8caa-628a2ffee6a9-ad1f8e33-e340-4b2c-bf74-b542e6fc558b-2-1532445593-20181018-23061-15ma86u /va
root      3856  0.0  0.0 182328  2532 ?        S    13:00   0:00  \_ /usr/sbin/CROND -n
foreman+  3875  0.0  0.0 113172  1208 ?        Ss   13:00   0:00  |   \_ /bin/sh -c smart-proxy-openscap-send >> /var/log/foreman-proxy/cron.log
foreman+  3880  3.6  0.0 245544 44528 ?        Sl   13:00   5:51  |       \_ ruby /usr/bin/smart-proxy-openscap-send
foreman+ 23335  0.0  0.0 265500 17648 ?        Rl   15:38   0:00  |           \_ ruby /usr/bin/smart-proxy-arf-json /var/tmp/a982e2a4-9f0a-4e50-9768-5b90ff943169-ad1f8e33-e340-4b2c-bf74-b542e6fc558b-2-1530444892-20181018-3880-1k107jr /var
root     26760  0.0  0.0 182328  2532 ?        S    13:30   0:00  \_ /usr/sbin/CROND -n
foreman+ 26776  0.0  0.0 113172  1208 ?        Ss   13:30   0:00  |   \_ /bin/sh -c smart-proxy-openscap-send >> /var/log/foreman-proxy/cron.log
foreman+ 26784  3.6  0.0 244676 43668 ?        Sl   13:30   4:39  |       \_ ruby /usr/bin/smart-proxy-openscap-send
root     25265  0.0  0.0 182328  2532 ?        S    14:00   0:00  \_ /usr/sbin/CROND -n
foreman+ 25277  0.0  0.0 113172  1212 ?        Ss   14:00   0:00  |   \_ /bin/sh -c smart-proxy-openscap-send >> /var/log/foreman-proxy/cron.log
foreman+ 25285  3.6  0.0 243320 42340 ?        Sl   14:00   3:37  |       \_ ruby /usr/bin/smart-proxy-openscap-send
root      3316  0.0  0.0 182328  2532 ?        S    14:30   0:00  \_ /usr/sbin/CROND -n
foreman+  3332  0.0  0.0 113172  1212 ?        Ss   14:30   0:00  |   \_ /bin/sh -c smart-proxy-openscap-send >> /var/log/foreman-proxy/cron.log
foreman+  3338  3.5  0.0 242112 40976 ?        Sl   14:30   2:25  |       \_ ruby /usr/bin/smart-proxy-openscap-send
root     12788  0.0  0.0 182328  2532 ?        S    15:00   0:00  \_ /usr/sbin/CROND -n
foreman+ 12814  0.0  0.0 113172  1212 ?        Ss   15:00   0:00  |   \_ /bin/sh -c smart-proxy-openscap-send >> /var/log/foreman-proxy/cron.log
foreman+ 12818  3.5  0.0 240660 39672 ?        Sl   15:00   1:22  |       \_ ruby /usr/bin/smart-proxy-openscap-send
root     23334  0.0  0.0 182328  2532 ?        S    15:30   0:00  \_ /usr/sbin/CROND -n
foreman+ 23349  0.0  0.0 113172  1212 ?        Ss   15:30   0:00      \_ /bin/sh -c smart-proxy-openscap-send >> /var/log/foreman-proxy/cron.log
foreman+ 23352  3.2  0.0 238800 37808 ?        Sl   15:30   0:17          \_ ruby /usr/bin/smart-proxy-openscap-send

Comment 5 Craig Donnelly 2018-10-18 19:45:09 UTC
There appears to be a loop involving the files touched by smart-proxy-arf-json in /var/tmp; they are changing continuously:

# date && find . -type f | sort
Thu Oct 18 15:44:46 EDT 2018
./275ce370-91c0-49dc-a97f-5ebd1efc99f5-ad1f8e33-e340-4b2c-bf74-b542e6fc558b-1-1534297524-20181018-30268-1xrydy
./275ce370-91c0-49dc-a97f-5ebd1efc99f5-ad1f8e33-e340-4b2c-bf74-b542e6fc558b-1-1534297524-json-20181018-30268-w65t37
./2d4a689d-6f4c-40c2-9dc0-7205b6838c79-ad1f8e33-e340-4b2c-bf74-b542e6fc558b-1-1522811356-20181018-22616-oxh2pa
./2d4a689d-6f4c-40c2-9dc0-7205b6838c79-ad1f8e33-e340-4b2c-bf74-b542e6fc558b-1-1522811356-json-20181018-22616-1sj7z1g
./2e82e1b9-05ab-49c0-bc61-2418ed95e052-ad1f8e33-e340-4b2c-bf74-b542e6fc558b-2-1524671918-json-20181018-30523-10zgtjl
./3242701e-56ec-4f19-80df-5bc1033e8302-ad1f8e33-e340-4b2c-bf74-b542e6fc558b-2-1532728491-20181018-23061-1l1bkke
./3242701e-56ec-4f19-80df-5bc1033e8302-ad1f8e33-e340-4b2c-bf74-b542e6fc558b-2-1532728491-json-20181018-23061-xkqt3v
./50ac4282-7a38-4e1d-a358-927c448d25f1-ad1f8e33-e340-4b2c-bf74-b542e6fc558b-2-1535053195-20181018-20796-mzr79x
./50ac4282-7a38-4e1d-a358-927c448d25f1-ad1f8e33-e340-4b2c-bf74-b542e6fc558b-2-1535053195-json-20181018-20796-19otvta
./6783e61f-502b-487e-aa9e-5c726fa46a50-ad1f8e33-e340-4b2c-bf74-b542e6fc558b-1-1523385550-20181018-6601-1nt3bga
./6783e61f-502b-487e-aa9e-5c726fa46a50-ad1f8e33-e340-4b2c-bf74-b542e6fc558b-1-1523385550-json-20181018-6601-8rn1h2
./80c89636-c15f-47ce-b7eb-bf2e83c8787d-ad1f8e33-e340-4b2c-bf74-b542e6fc558b-2-1538383501-20181018-31664-12astvy
./80c89636-c15f-47ce-b7eb-bf2e83c8787d-ad1f8e33-e340-4b2c-bf74-b542e6fc558b-2-1538383501-json-20181018-31664-13szfrd
./a81cfd04-2f31-409b-aede-36dca023ca55-ad1f8e33-e340-4b2c-bf74-b542e6fc558b-1-1521933566-20181018-20833-1vn6uuz
./a81cfd04-2f31-409b-aede-36dca023ca55-ad1f8e33-e340-4b2c-bf74-b542e6fc558b-1-1521933566-json-20181018-20833-1ai4nzv
./afa4c636-61d3-4d9c-aa63-5d3e18237901-ad1f8e33-e340-4b2c-bf74-b542e6fc558b-2-1524056025-json-20181018-7103-1l7yn5t
./CpIdeB/insights-batman.usersys.redhat.com-20180924034924.tar.gz
./fcc4cd44-b465-439c-92dd-bfdf5e05081f-ad1f8e33-e340-4b2c-bf74-b542e6fc558b-2-1527270792-20181018-25285-1hub1lr
./fcc4cd44-b465-439c-92dd-bfdf5e05081f-ad1f8e33-e340-4b2c-bf74-b542e6fc558b-2-1527270792-json-20181018-25285-1lhjmlz
./FjKckV/insights-batman.usersys.redhat.com-20180922034433.tar.gz
./foreman-debug-SLdZp.tar.xz
./ImK7E0/insights-batman.usersys.redhat.com-20180926034527.tar.gz
./l7He7U/insights-batman.usersys.redhat.com-20180923034946.tar.gz
./scl0nVRJg
./sclFakK8h
./systemd-private-b8e309fd956240f48f068ade6f7035fd-httpd.service-b0Cxrw/tmp/sclVwVDUM

# date && find . -type f | sort
Thu Oct 18 15:44:47 EDT 2018
./275ce370-91c0-49dc-a97f-5ebd1efc99f5-ad1f8e33-e340-4b2c-bf74-b542e6fc558b-1-1534297524-20181018-30268-1xrydy
./275ce370-91c0-49dc-a97f-5ebd1efc99f5-ad1f8e33-e340-4b2c-bf74-b542e6fc558b-1-1534297524-json-20181018-30268-w65t37
./2e82e1b9-05ab-49c0-bc61-2418ed95e052-ad1f8e33-e340-4b2c-bf74-b542e6fc558b-2-1524671918-json-20181018-30523-10zgtjl
./afa4c636-61d3-4d9c-aa63-5d3e18237901-ad1f8e33-e340-4b2c-bf74-b542e6fc558b-2-1524056025-json-20181018-7103-1l7yn5t
./CpIdeB/insights-batman.usersys.redhat.com-20180924034924.tar.gz
./FjKckV/insights-batman.usersys.redhat.com-20180922034433.tar.gz
./foreman-debug-SLdZp.tar.xz
./ImK7E0/insights-batman.usersys.redhat.com-20180926034527.tar.gz
./l7He7U/insights-batman.usersys.redhat.com-20180923034946.tar.gz
./scl0nVRJg
./sclFakK8h
./systemd-private-b8e309fd956240f48f068ade6f7035fd-httpd.service-b0Cxrw/tmp/sclVwVDUM

Comment 6 Craig Donnelly 2018-10-18 19:49:24 UTC
Here is a lovely bit of spam on the trail:

10.12.213.140 - - [18/Oct/2018:15:48:59 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/1/1523467150 HTTP/1.1" 422 171 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:48:59 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1529138987 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:48:59 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1532921688 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:48:59 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1527460690 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:00 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1530916487 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:01 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/1/1522016065 HTTP/1.1" 422 171 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:01 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/1/1522893855 HTTP/1.1" 422 171 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:01 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1535843840 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:01 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1530916787 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:01 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1524153824 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:01 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1532921989 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:01 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1522624740 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:01 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1527460990 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:01 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1525760908 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:01 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1529139287 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:01 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1538574899 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:01 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/1/1523467450 HTTP/1.1" 422 171 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:02 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1527461290 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:02 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1530917087 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:02 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1524154124 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:02 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1535844165 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:02 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1532922288 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:02 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1522625040 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:02 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1525761208 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:02 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1538575199 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:02 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1529139587 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:02 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1535844490 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:02 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1527461590 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:02 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/1/1522016365 HTTP/1.1" 422 171 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:02 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/1/1522894156 HTTP/1.1" 422 171 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:02 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1524154424 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:02 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1532922588 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:02 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1530917387 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:02 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1522625340 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:02 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1529139887 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:02 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1538575499 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:02 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1525761508 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:02 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/1/1523467750 HTTP/1.1" 422 171 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:03 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1538575799 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:03 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1525761808 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:03 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1532922888 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:03 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1535844815 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:03 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1524154724 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:03 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1527461890 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:03 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1529140187 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:03 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1530917687 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:03 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1522625640 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:03 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1525762108 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:04 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/1/1522894456 HTTP/1.1" 422 171 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:04 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/1/1522016665 HTTP/1.1" 422 171 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:04 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1535845139 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:04 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1524155024 HTTP/1.1" 404 86 "-" "Ruby"
10.12.213.140 - - [18/Oct/2018:15:49:04 -0400] "POST /api/v2/compliance/arf_reports/ad1f8e33-e340-4b2c-bf74-b542e6fc558b/2/1532923189 HTTP/1.1" 404 86 "-" "Ruby"

Comment 7 Lukas Zapletal 2018-10-19 06:16:41 UTC
Hello,

the "pstree" output suggests this is started by a cron job. Ondrej, can you look into this, please?

In the meantime, turn off the cron job which sends reports every minute:

/etc/cron.d/rubygem-smart_proxy_openscap

Then run the script manually:

su foreman-proxy -s /usr/bin/smart-proxy-openscap-send

and figure out how slow it is and then tune the cron job to avoid starting multiple jobs.

We need to work out why this is slow, and also prevent this script from executing in parallel: https://projects.theforeman.org/issues/25242
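The upstream fix itself is not shown in this bug, but the "prevent parallel execution" part can be sketched with a non-blocking flock on a lock file. This is a minimal illustration, not the shipped code; the lock path and method name here are invented for the example:

```ruby
require 'tmpdir'

# Illustrative lock path; the real plugin would use its own spool directory.
LOCK_PATH = File.join(Dir.tmpdir, 'openscap-send-example.lock')

# Run the block only if no other instance holds the lock; otherwise bail out.
def with_instance_lock(path = LOCK_PATH)
  File.open(path, File::CREAT | File::RDWR) do |f|
    # LOCK_NB makes flock return false immediately instead of blocking.
    return :already_running unless f.flock(File::LOCK_EX | File::LOCK_NB)
    begin
      yield
    ensure
      f.flock(File::LOCK_UN)
    end
  end
end
```

With a guard like this, a second cron invocation that starts while the first is still running hits the :already_running branch and exits cleanly instead of piling up, and the kernel releases the flock automatically even if the process is killed.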

Comment 8 Marek Hulan 2018-10-19 12:07:47 UTC
Did the customer change the cron job? I defer to Ondrej, but I don't think we set the interval to 1 minute by default.

Comment 9 Taft Sanders 2018-10-19 12:25:40 UTC
(In reply to Marek Hulan from comment #8)
> Did the customer change the cron job? I defer to Ondrej, but I don't think
> we set the interval to 1 minute by default.

Hey Marek,
This isn't a customer environment; it's a test environment I have set up here. Nothing custom about the OpenSCAP setup, just a fresh upgrade from 6.3 to 6.4.

Comment 10 Lukas Zapletal 2018-10-22 06:09:04 UTC
Sorry, did I say "every minute"? I think it's every 30 minutes or something like that; that was a typo.

Comment 11 Ondřej Pražák 2018-10-25 05:53:22 UTC
Connecting redmine issue http://projects.theforeman.org/issues/25242 from this bug

Comment 12 Ondřej Pražák 2018-10-25 05:55:37 UTC
I'll look into it.

Comment 14 Lukas Zapletal 2018-10-31 15:41:11 UTC
Can you confirm that changing the cron job from every 30 minutes to, say, every 2 hours helped let the processes finish?

Comment 15 Taft Sanders 2018-10-31 16:30:02 UTC
Hey Luke,

I moved the cron job out to /root so that it wouldn't run, rebooted the Satellite server, and observed no more issues. About 3 days ago I moved it back to the cron directory to create a KCS for this issue, and the issue now seems non-reproducible. So maybe the issue was local to the test machine? I'm not sure at this point, as my testing hasn't reached the OpenSCAP functionality yet. If you weren't able to reproduce, you may want to lower the priority on this until I can complete OpenSCAP testing on my side and see whether the issue is reproducible.

Comment 16 Lukas Zapletal 2018-11-01 07:31:35 UTC
Thanks Taft.

My wild guess is that there were so many compliance reports enqueued that our cron job started processing them all, and since we don't have a locking mechanism, it was starting more and more jobs every 30 minutes.

An upstream ticket was filed to prevent starting a new cron job while another is still running. Let me rename this one.
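The accumulation hypothesis is easy to sanity-check with a toy model: if each run takes longer than the cron interval and nothing prevents overlap, the number of live instances keeps growing. The numbers below are illustrative, not measured:

```ruby
# Toy model: cron fires every `interval` minutes; each job runs for
# `duration` minutes. Count how many instances are still alive at time `at`.
def live_instances(interval:, duration:, at:)
  (0..at).step(interval).count { |start| start <= at && at < start + duration }
end
```

With a 30-minute interval and jobs that take hours to drain a large spool, every firing adds another concurrent smart-proxy-openscap-send process, which matches the ps output in comment 4 (one surviving child per half-hour cron slot).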

Comment 17 Satellite Program 2018-11-13 13:06:10 UTC
Upstream bug assigned to oprazak

Comment 21 Lukas Zapletal 2018-11-16 13:10:31 UTC
Thanks, this explains it. The OpenSCAP endpoint is logging an excessive amount of JSON data at the INFO log level. Created a separate BZ for this: https://bugzilla.redhat.com/show_bug.cgi?id=1650543

Comment 22 Lukas Zapletal 2018-11-20 11:54:25 UTC
I think the cron bug is doable for 6.5 if it gets ACKed.

Comment 29 Jameer Pathan 2019-01-09 13:08:58 UTC
needinfo:

With snap 10, I had reports in the spool directory and tried to start 2 instances of smart-proxy-openscap-send simultaneously in separate terminals. One ran successfully, while the other just returned to the shell with return code 0 and no warning.
Is this expected, or should the second instance print some warning message?

Comment 30 Ondřej Pražák 2019-01-11 09:10:37 UTC
smart-proxy-openscap-send is usually not executed manually from a terminal, so anything printed to stdout would get lost. We do log a message about not sending the spool [1], so I expect it would be in the logs.

[1] https://github.com/theforeman/smart_proxy_openscap/blob/master/bin/smart-proxy-openscap-send#L47-L50
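The guard behavior that the verification below observes can be sketched as a simple lock-file check: if spool.lock exists, log and exit without sending; otherwise create it, send, and remove it. This is an illustration of the pattern, not the shipped plugin code; the method name is invented here, though the lock path layout follows comment 31:

```ruby
require 'fileutils'
require 'tmpdir'

# Sketch of a spool.lock guard: refuse to send while another run holds the lock.
def send_spool(spool_dir)
  lock = File.join(spool_dir, 'spool.lock')
  if File.exist?(lock)
    # Mirrors the message seen in proxy.log during verification.
    return "Lock file #{lock} for openscap spool exists, not sending spool to server"
  end
  FileUtils.touch(lock)
  begin
    'spool sent'        # the real script would forward ARF reports here
  ensure
    File.delete(lock)   # release the lock even if sending raises
  end
end
```

One trade-off of a plain lock file versus a kernel flock: if the process dies without running the ensure block (e.g. SIGKILL), a stale spool.lock remains and must be removed by hand before the next run will send.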

Comment 31 Jameer Pathan 2019-01-28 13:58:22 UTC
Verified

@Satellite 6.5.0 snap 12
@rubygem-openscap-0.4.7-3.el7sat.noarch.rpm
@rubygem-smart_proxy_openscap-0.7.1-1.el7sat.noarch.rpm

Steps:
1. Had a few reports in the spool directory.
2. Started two instances of smart-proxy-openscap-send simultaneously.

Observations:
1. The additional instance detected that the script was already running and exited without sending anything.

2. "Lock file /var/spool/foreman-proxy/openscap/spool.lock for openscap spool exists, not sending spool to server" was logged in /var/log/foreman-proxy/proxy.log

Comment 33 errata-xmlrpc 2019-05-14 12:38:16 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHSA-2019:1222

Comment 34 Mike McCune 2019-08-15 16:56:25 UTC
*** Satellite 6.4.4 Hotfix Available ***

1) Download rubygem-smart_proxy_openscap-0.7.1-1.el7sat.noarch.rpm from this bugzilla to your Satellite

2) Install:
         
rpm -Uvh rubygem-smart_proxy_openscap-0.7.1-1.el7sat.noarch.rpm

3) restart:

satellite-maintain service restart

4) resume operations

Comment 35 Mike McCune 2019-08-15 16:56:54 UTC
Created attachment 1604117 [details]
rubygem-smart_proxy_openscap-0.7.1-1.el7sat.noarch.rpm

