Bug 1529465
| Summary: | [HA] after 1 day the agent.log increased to 11GB | | |
|---|---|---|---|
| Product: | [oVirt] ovirt-hosted-engine-ha | Reporter: | Kobi Hakimi <khakimi> |
| Component: | Agent | Assignee: | bugs <bugs> |
| Status: | CLOSED DUPLICATE | QA Contact: | meital avital <mavital> |
| Severity: | urgent | Docs Contact: | |
| Priority: | unspecified | | |
| Version: | 2.2.1 | CC: | bugs, dfediuck, khakimi |
| Target Milestone: | --- | | |
| Target Release: | --- | | |
| Hardware: | Unspecified | | |
| OS: | Unspecified | | |
| Whiteboard: | | | |
| Fixed In Version: | | Doc Type: | If docs needed, set a value |
| Doc Text: | | Story Points: | --- |
| Clone Of: | | Environment: | |
| Last Closed: | 2018-01-01 08:42:54 UTC | Type: | Bug |
| Regression: | --- | Mount Type: | --- |
| Documentation: | --- | CRM: | |
| Verified Versions: | | Category: | --- |
| oVirt Team: | SLA | RHEL 7.3 requirements from Atomic Host: | |
| Cloudforms Team: | --- | Target Upstream Version: | |
| Embargoed: | | | |
| Attachments: | | | |
Description
Kobi Hakimi
2017-12-28 08:56:56 UTC
- Severity is missing.
- A snippet of the log would have been useful, to see if there's a repeating log entry there.
- Does it have a log rotate policy?

Sorry, but this machine no longer exists. If I see this again, I'll add a snippet of the log file. About the log rotate question: I think not; otherwise we wouldn't have gotten a file of this enormous size. But a developer can answer it better than me.

(In reply to Kobi Hakimi from comment #2)
> Sorry, but this machine no longer exists.
> If I see this again, I'll add a snippet of the log file.

No worries, re-setting the NEEDINFO to get it.

> About the log rotate question: I think not; otherwise we wouldn't have
> gotten a file of this enormous size. But a developer can answer it
> better than me.

You should see the config as part of the package. The HA daemon rotates the log once a day, keeping the last 7 days for history. What we need is to find the reason for the flood, which means we need to see a snippet of the logs you had.

From what I reproduced by killing vdsmd, we can see the following line in agent.log repeat twice within one millisecond:

Client localhost:54321::WARNING::2017-12-31 14:12:17,772::betterAsyncore::177::vds.dispatcher::(log_info) unhandled close event

as you can see in the following log snippet:

============================================
Client localhost:54321::WARNING::2017-12-31 14:12:17,772::betterAsyncore::177::vds.dispatcher::(log_info) unhandled close event
Client localhost:54321::WARNING::2017-12-31 14:12:17,773::betterAsyncore::177::vds.dispatcher::(log_info) unhandled write event
Client localhost:54321::WARNING::2017-12-31 14:12:17,774::betterAsyncore::177::vds.dispatcher::(log_info) unhandled close event
Client localhost:54321::WARNING::2017-12-31 14:12:17,775::betterAsyncore::177::vds.dispatcher::(log_info) unhandled write event
Client localhost:54321::WARNING::2017-12-31 14:12:17,775::betterAsyncore::177::vds.dispatcher::(log_info) unhandled close event
Client localhost:54321::WARNING::2017-12-31 14:12:17,776::betterAsyncore::177::vds.dispatcher::(log_info) unhandled write event
Client localhost:54321::WARNING::2017-12-31 14:12:17,776::betterAsyncore::177::vds.dispatcher::(log_info) unhandled close event
Client localhost:54321::WARNING::2017-12-31 14:12:17,778::betterAsyncore::177::vds.dispatcher::(log_info) unhandled write event
Client localhost:54321::WARNING::2017-12-31 14:12:17,778::betterAsyncore::177::vds.dispatcher::(log_info) unhandled close event
Client localhost:54321::WARNING::2017-12-31 14:12:17,779::betterAsyncore::177::vds.dispatcher::(log_info) unhandled write event
Client localhost:54321::WARNING::2017-12-31 14:12:17,779::betterAsyncore::177::vds.dispatcher::(log_info) unhandled close event
Client localhost:54321::WARNING::2017-12-31 14:12:17,780::betterAsyncore::177::vds.dispatcher::(log_info) unhandled write event
Client localhost:54321::WARNING::2017-12-31 14:12:17,780::betterAsyncore::177::vds.dispatcher::(log_info) unhandled close event
Client localhost:54321::WARNING::2017-12-31 14:12:17,782::betterAsyncore::177::vds.dispatcher::(log_info) unhandled write event
Client localhost:54321::WARNING::2017-12-31 14:12:17,782::betterAsyncore::177::vds.dispatcher::(log_info) unhandled close event
Client localhost:54321::WARNING::2017-12-31 14:12:17,783::betterAsyncore::177::vds.dispatcher::(log_info) unhandled write event
Client localhost:54321::WARNING::2017-12-31 14:12:17,783::betterAsyncore::177::vds.dispatcher::(log_info) unhandled close event
Client localhost:54321::WARNING::2017-12-31 14:12:17,784::betterAsyncore::177::vds.dispatcher::(log_info) unhandled write event
Client localhost:54321::WARNING::2017-12-31 14:12:17,785::betterAsyncore::177::vds.dispatcher::(log_info) unhandled close event
Client localhost:54321::WARNING::2017-12-31 14:12:17,786::betterAsyncore::177::vds.dispatcher::(log_info) unhandled write event
Client localhost:54321::WARNING::2017-12-31 14:12:17,786::betterAsyncore::177::vds.dispatcher::(log_info) unhandled close event
Client localhost:54321::WARNING::2017-12-31 14:12:17,787::betterAsyncore::177::vds.dispatcher::(log_info) unhandled write event
Client localhost:54321::WARNING::2017-12-31 14:12:17,787::betterAsyncore::177::vds.dispatcher::(log_info) unhandled close event
Client localhost:54321::WARNING::2017-12-31 14:12:17,789::betterAsyncore::177::vds.dispatcher::(log_info) unhandled write event
Client localhost:54321::WARNING::2017-12-31 14:12:17,789::betterAsyncore::177::vds.dispatcher::(log_info) unhandled close event
Client localhost:54321::WARNING::2017-12-31 14:12:17,790::betterAsyncore::177::vds.dispatcher::(log_info) unhandled write event
Client localhost:54321::WARNING::2017-12-31 14:12:17,790::betterAsyncore::177::vds.dispatcher::(log_info) unhandled close event
Client localhost:54321::WARNING::2017-12-31 14:12:17,791::betterAsyncore::177::vds.dispatcher::(log_info) unhandled write event
============================================

Created attachment 1374896 [details]
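The alternating `unhandled close event` / `unhandled write event` pair is what the stdlib `asyncore` default handlers (which vdsm's `betterAsyncore` dispatcher builds on) emit when a channel receives socket events nothing handles; if a dead channel is never removed from the socket map, every poll cycle logs the same pair again. A minimal, illustrative sketch of that flood mechanism (class and function names here are hypothetical, not the actual vdsm code):

```python
import logging

log = logging.getLogger("vds.dispatcher")


class DeadChannel:
    """Sketch of asyncore's default event handlers: they only log a
    warning, so a half-dead socket that stays in the poll map produces
    the same two lines on every loop iteration."""

    def log_info(self, message, log_type="info"):
        # asyncore routes unhandled-event messages through log_info()
        getattr(log, log_type, log.info)(message)

    def handle_write(self):
        self.log_info("unhandled write event", "warning")

    def handle_close(self):
        # note: nothing here removes the channel from the socket map,
        # so the next poll cycle reports the same events again
        self.log_info("unhandled close event", "warning")


def poll_loop(channel, cycles):
    """Simulate the event loop after vdsmd dies: on each cycle the dead
    socket is reported as both closed and writable."""
    for _ in range(cycles):
        channel.handle_close()
        channel.handle_write()
```

At the rate visible in the snippet, roughly two ~130-byte lines per millisecond, such a flood writes on the order of 250 KB per second, which is consistent with a log file reaching gigabytes within hours.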
zip file of agent.log; after a few minutes it grew to 215 MB
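The rotation policy described in the reply above (rotate once a day, keep the last 7 days) maps onto Python's stock timed rotating handler. This is an illustrative sketch only: the real agent configures logging through the file shipped in the package, and the handler class, path, and format string below are assumptions (the format is inferred from the log lines in the snippet):

```python
import logging
import logging.handlers

# Hypothetical setup mirroring the stated policy: rotate at midnight,
# keep the last 7 rotated files. "agent.log" is an illustrative
# relative path, not the package's real log location.
handler = logging.handlers.TimedRotatingFileHandler(
    "agent.log", when="midnight", backupCount=7, delay=True)

# Format inferred from the snippet above, e.g.
# "Client localhost:54321::WARNING::...::betterAsyncore::177::
#  vds.dispatcher::(log_info) unhandled close event"
handler.setFormatter(logging.Formatter(
    "%(threadName)s::%(levelname)s::%(asctime)s::%(module)s::"
    "%(lineno)d::%(name)s::(%(funcName)s) %(message)s"))

agent_log = logging.getLogger("ovirt_hosted_engine_ha.agent")
agent_log.addHandler(handler)
```

Note that even with this policy in place, rotation alone cannot contain a flood that writes gigabytes within a single day, which is why the root cause of the repeating messages matters more than the rotation settings.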
*** This bug has been marked as a duplicate of bug 1525453 ***