Bug 1529465 - [HA] after 1 day the agent.log increased to 11GB
Summary: [HA] after 1 day the agent.log increased to 11GB
Keywords:
Status: CLOSED DUPLICATE of bug 1525453
Alias: None
Product: ovirt-hosted-engine-ha
Classification: oVirt
Component: Agent
Version: 2.2.1
Hardware: Unspecified
OS: Unspecified
Priority: unspecified
Severity: urgent
Target Milestone: ---
Target Release: ---
Assignee: bugs@ovirt.org
QA Contact: meital avital
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2017-12-28 08:56 UTC by Kobi Hakimi
Modified: 2018-01-01 08:42 UTC
CC List: 3 users

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2018-01-01 08:42:54 UTC
oVirt Team: SLA
Embargoed:


Attachments
agent.log file size and provision 1 day ago (34.70 KB, image/png)
2017-12-28 08:56 UTC, Kobi Hakimi
zip file of agent.log; after a few minutes it grew to 215 MB (4.97 MB, application/x-gzip)
2017-12-31 12:30 UTC, Kobi Hakimi

Description Kobi Hakimi 2017-12-28 08:56:56 UTC
Created attachment 1373108 [details]
agent.log file size and provision 1 day ago

Description of problem:
[HA] after 1 day the agent.log increased to 11GB

Version-Release number of selected component (if applicable):
ovirt-hosted-engine-ha-2.2.2-1.el7ev.noarch
Software Version: 4.2.0-0.6.el7

How reproducible:
100%

Steps to Reproduce:
1. Hit a problem such as the one described in bug https://bugzilla.redhat.com/show_bug.cgi?id=1529458

Actual results:
The agent log grows continuously;
in my case it grew to 11 GB.

Expected results:
Limit the log file size;
when it reaches the limit, archive it and start a new one.
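
For illustration only, what is being asked for maps onto size-based rotation. A minimal Python sketch using the standard library's RotatingFileHandler (this is not the agent's actual logging configuration; the path, size limit, and format string below are assumptions based on the log lines in this report):

import logging
from logging.handlers import RotatingFileHandler

# Hypothetical values; the real agent ships its own logging configuration.
LOG_PATH = "/var/log/ovirt-hosted-engine-ha/agent.log"  # assumed path
MAX_BYTES = 10 * 1024 * 1024   # cap each file at ~10 MB
BACKUP_COUNT = 7               # keep up to 7 archived files

handler = RotatingFileHandler(LOG_PATH, maxBytes=MAX_BYTES, backupCount=BACKUP_COUNT)
handler.setFormatter(logging.Formatter(
    "%(threadName)s::%(levelname)s::%(asctime)s::%(module)s::"
    "%(lineno)d::%(name)s::(%(funcName)s) %(message)s"
))

logger = logging.getLogger("ovirt_hosted_engine_ha.agent")
logger.addHandler(handler)
logger.setLevel(logging.INFO)

# When agent.log reaches MAX_BYTES it is renamed to agent.log.1 (and so on up
# to BACKUP_COUNT) and a fresh agent.log is started, so a message flood can
# never grow a single file to 11 GB.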

Additional info:
See the attached screenshot of the agent.log file size.

Comment 1 Yaniv Kaul 2017-12-29 14:51:14 UTC
- Severity is missing.
- A snippet of the log would have been useful, to see whether there's a repeating log entry there.
- Does it have a log rotation policy?

Comment 2 Kobi Hakimi 2017-12-31 09:55:04 UTC
Sorry, but this machine no longer exists.
If I see this again, I'll add a snippet of the log file.

About the log rotation question: I think not;
otherwise, we wouldn't have ended up with a file of this enormous size.
But a developer can answer that better than I can.

Comment 3 Yaniv Kaul 2017-12-31 10:05:09 UTC
(In reply to Kobi Hakimi from comment #2)
> Sorry, but this machine no longer exists.
> If I see this again, I'll add a snippet of the log file.

No worries, re-setting the NEEDINFO to get it.
> 
> About the log rotation question: I think not;
> otherwise, we wouldn't have ended up with a file of this enormous size.
> But a developer can answer that better than I can.

You should see the config as part of the package.

Comment 4 Doron Fediuck 2017-12-31 11:39:13 UTC
The HA daemon rotates once a day, keeping the last 7 days for history.
What we need is to find the reason for the flood, which means we need to see a snippet of the logs you had.
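
For reference, the policy described above (rotate once a day, keep the last 7 days) maps onto Python's TimedRotatingFileHandler; the sketch below is illustrative only and is not the agent's actual configuration (the path is assumed):

import logging
from logging.handlers import TimedRotatingFileHandler

# Illustrative only: rotate at midnight, keep the last 7 files.
handler = TimedRotatingFileHandler(
    "/var/log/ovirt-hosted-engine-ha/agent.log",  # assumed path
    when="midnight",
    backupCount=7,
)
logging.getLogger("ovirt_hosted_engine_ha.agent").addHandler(handler)

# The gap this bug highlights: a purely time-based policy bounds the number
# of files but not their size, so a flood of messages can still grow
# agent.log to 11 GB within a single day before the next rotation.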

Comment 5 Kobi Hakimi 2017-12-31 12:21:28 UTC
From what I reproduced when killing vdsmd, we can see the following line in agent.log repeated twice within a single millisecond:
Client localhost:54321::WARNING::2017-12-31 14:12:17,772::betterAsyncore::177::vds.dispatcher::(log_info) unhandled close event

as you can see in the following log snippet:
============================================
Client localhost:54321::WARNING::2017-12-31 14:12:17,772::betterAsyncore::177::vds.dispatcher::(log_info) unhandled close event
Client localhost:54321::WARNING::2017-12-31 14:12:17,773::betterAsyncore::177::vds.dispatcher::(log_info) unhandled write event
Client localhost:54321::WARNING::2017-12-31 14:12:17,774::betterAsyncore::177::vds.dispatcher::(log_info) unhandled close event
Client localhost:54321::WARNING::2017-12-31 14:12:17,775::betterAsyncore::177::vds.dispatcher::(log_info) unhandled write event
Client localhost:54321::WARNING::2017-12-31 14:12:17,775::betterAsyncore::177::vds.dispatcher::(log_info) unhandled close event
Client localhost:54321::WARNING::2017-12-31 14:12:17,776::betterAsyncore::177::vds.dispatcher::(log_info) unhandled write event
Client localhost:54321::WARNING::2017-12-31 14:12:17,776::betterAsyncore::177::vds.dispatcher::(log_info) unhandled close event
Client localhost:54321::WARNING::2017-12-31 14:12:17,778::betterAsyncore::177::vds.dispatcher::(log_info) unhandled write event
Client localhost:54321::WARNING::2017-12-31 14:12:17,778::betterAsyncore::177::vds.dispatcher::(log_info) unhandled close event
Client localhost:54321::WARNING::2017-12-31 14:12:17,779::betterAsyncore::177::vds.dispatcher::(log_info) unhandled write event
Client localhost:54321::WARNING::2017-12-31 14:12:17,779::betterAsyncore::177::vds.dispatcher::(log_info) unhandled close event
Client localhost:54321::WARNING::2017-12-31 14:12:17,780::betterAsyncore::177::vds.dispatcher::(log_info) unhandled write event
Client localhost:54321::WARNING::2017-12-31 14:12:17,780::betterAsyncore::177::vds.dispatcher::(log_info) unhandled close event
Client localhost:54321::WARNING::2017-12-31 14:12:17,782::betterAsyncore::177::vds.dispatcher::(log_info) unhandled write event
Client localhost:54321::WARNING::2017-12-31 14:12:17,782::betterAsyncore::177::vds.dispatcher::(log_info) unhandled close event
Client localhost:54321::WARNING::2017-12-31 14:12:17,783::betterAsyncore::177::vds.dispatcher::(log_info) unhandled write event
Client localhost:54321::WARNING::2017-12-31 14:12:17,783::betterAsyncore::177::vds.dispatcher::(log_info) unhandled close event
Client localhost:54321::WARNING::2017-12-31 14:12:17,784::betterAsyncore::177::vds.dispatcher::(log_info) unhandled write event
Client localhost:54321::WARNING::2017-12-31 14:12:17,785::betterAsyncore::177::vds.dispatcher::(log_info) unhandled close event
Client localhost:54321::WARNING::2017-12-31 14:12:17,786::betterAsyncore::177::vds.dispatcher::(log_info) unhandled write event
Client localhost:54321::WARNING::2017-12-31 14:12:17,786::betterAsyncore::177::vds.dispatcher::(log_info) unhandled close event
Client localhost:54321::WARNING::2017-12-31 14:12:17,787::betterAsyncore::177::vds.dispatcher::(log_info) unhandled write event
Client localhost:54321::WARNING::2017-12-31 14:12:17,787::betterAsyncore::177::vds.dispatcher::(log_info) unhandled close event
Client localhost:54321::WARNING::2017-12-31 14:12:17,789::betterAsyncore::177::vds.dispatcher::(log_info) unhandled write event
Client localhost:54321::WARNING::2017-12-31 14:12:17,789::betterAsyncore::177::vds.dispatcher::(log_info) unhandled close event
Client localhost:54321::WARNING::2017-12-31 14:12:17,790::betterAsyncore::177::vds.dispatcher::(log_info) unhandled write event
Client localhost:54321::WARNING::2017-12-31 14:12:17,790::betterAsyncore::177::vds.dispatcher::(log_info) unhandled close event
Client localhost:54321::WARNING::2017-12-31 14:12:17,791::betterAsyncore::177::vds.dispatcher::(log_info) unhandled write event
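
To quantify a flood like the one above, the repeated messages can be grouped and counted. A minimal Python sketch (the log path is assumed, and the field layout is inferred from the "::"-separated lines in the snippet):

from collections import Counter

# Lines look like:
#   <thread>::<LEVEL>::<timestamp>::<module>::<lineno>::<logger>::(<func>) <message>
counts = Counter()
with open("/var/log/ovirt-hosted-engine-ha/agent.log") as f:  # assumed path
    for line in f:
        fields = line.rstrip("\n").split("::", 6)
        if len(fields) == 7:
            counts[fields[6]] += 1  # the "(func) message" part

# Print the five most frequent messages with their counts.
for message, n in counts.most_common(5):
    print(f"{n:>10}  {message}")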

Comment 6 Kobi Hakimi 2017-12-31 12:30:04 UTC
Created attachment 1374896 [details]
zip file of agent.log; after a few minutes it grew to 215 MB

Comment 7 Doron Fediuck 2018-01-01 08:42:54 UTC

*** This bug has been marked as a duplicate of bug 1525453 ***

