| Summary: | kworker Task over 50% after some time | ||
|---|---|---|---|
| Product: | [Fedora] Fedora | Reporter: | nostritius |
| Component: | kernel | Assignee: | Kernel Maintainer List <kernel-maint> |
| Status: | CLOSED INSUFFICIENT_DATA | QA Contact: | Fedora Extras Quality Assurance <extras-qa> |
| Severity: | medium | Docs Contact: | |
| Priority: | unspecified | ||
| Version: | 19 | CC: | dennis, gansalmon, itamar, jonathan, kernel-maint, madhu.chinakonda, michele, nostritius |
| Target Milestone: | --- | ||
| Target Release: | --- | ||
| Hardware: | x86_64 | ||
| OS: | Linux | ||
| Whiteboard: | |||
| Fixed In Version: | | Doc Type: | Bug Fix |
| Doc Text: | | Story Points: | --- |
| Clone Of: | | Environment: | |
| Last Closed: | 2013-12-27 09:08:07 UTC | Type: | Bug |
| Regression: | --- | Mount Type: | --- |
| Documentation: | --- | CRM: | |
| Verified Versions: | | Category: | --- |
| oVirt Team: | --- | RHEL 7.3 requirements from Atomic Host: | |
| Cloudforms Team: | --- | Target Upstream Version: | |
|
Description
nostritius
2013-11-16 10:10:42 UTC

I have to correct myself: the kworker numbers are different every time, in both the first and the second number. Hard to say whether that is expected or not.

You could do the following (either before the kworker threads go high in CPU usage [better] or during the event itself [less ideal]):

sudo perf record -a

Let it run for a minute or so (while the kworker threads are consuming CPU) and then stop it. Then run:

sudo perf report --stdio > perf.txt

Upload 'perf.txt' to this case. Setting needinfo appropriately.

see c#2

Hi, can you provide the info asked for in comment 2, please? Thanks, Michele

Please excuse me for not answering. I decided to move to Debian, because it is more battery-friendly than Fedora, even without the kworker task. If no one else is interested in this problem, you can close this.

Ok, no worries. I'll close this for now. If someone else sees this issue, they can open another BZ.