Bug 861149 - rhevm-reports producing huge log under /usr/share/jasperreports-server-pro
Summary: rhevm-reports producing huge log under /usr/share/jasperreports-server-pro
Keywords:
Status: CLOSED CURRENTRELEASE
Alias: None
Product: Red Hat Enterprise Virtualization Manager
Classification: Red Hat
Component: ovirt-engine-reports
Version: 3.1.0
Hardware: Unspecified
OS: Unspecified
Priority: high
Severity: urgent
Target Milestone: ---
Assignee: Alex Lourie
QA Contact: David Botzer
URL:
Whiteboard: infra integration
Depends On:
Blocks: 796214
 
Reported: 2012-09-27 16:08 UTC by Stephen Gordon
Modified: 2014-01-14 00:04 UTC (History)
8 users

Fixed In Version:
Doc Type: Bug Fix
Doc Text:
Clone Of:
Environment:
Last Closed: 2012-12-04 19:59:00 UTC
oVirt Team: ---
Target Upstream Version:
Embargoed:


Attachments (Terms of Use)
Logs from latest run of rhevm-reports-setup. (5.17 KB, text/x-log)
2012-10-15 19:58 UTC, Stephen Gordon

Description Stephen Gordon 2012-09-27 16:08:37 UTC
Description of problem:

I updated to SI19.1 to QE something else, after adding the repo:

# yum update rhevm-setup
# rhevm-upgrade
# yum update

At the third command the reports updates came down as expected, then I ran rhevm-dwh-setup and rhevm-reports-setup. These also completed successfully.

Shortly thereafter, though, while trying to do something else, the machine reported that root was 100% full. On investigation I found that 14G of the 20G of disk space on root had been taken up by this file:

/usr/share/jasperreports-server-pro/js-install-pro_2012-09-27_14-26-22894.log

-rw-r--r--. 1 root root 14G Sep 27 15:17 js-install-pro_2012-09-27_14-26-22894.log
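A runaway file like this can be located programmatically. Below is a small sketch (not part of the product; the function name and 1 GiB threshold are illustrative) of a scan for oversized files:

```python
import os

def oversized_logs(root, limit_bytes=1 << 30):
    """Walk `root` and return paths of files larger than `limit_bytes`
    (1 GiB by default) -- handy for spotting a runaway installer log."""
    hits = []
    for dirpath, _dirs, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            try:
                if os.path.getsize(path) > limit_bytes:
                    hits.append(path)
            except OSError:
                pass  # file vanished or is unreadable; skip it
    return hits
```

Pointed at /usr/share/jasperreports-server-pro, a scan like this would flag the 14G js-install-pro log immediately.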

I know we are trying to improve the amount and format of the information we log, but this is ridiculous. Looking at it, it appears to be flooded with repetitions of this message:

   [input] Database [rhevmreports] already exists. Drop it and create new? WARNING: All existing data will be lost! This operation may not be rolled back. Enter 'y' to recreate or 'n' to skip this step. Default is 'n' (y, [n])

I'm assuming that whatever process is running these prompts isn't listening on standard input to receive an answer (rhevm-reports-setup finished successfully without ever asking me this question).
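The flood pattern is consistent with a confirmation loop that never sees an answer. A minimal sketch of that failure mode (this is NOT the jasper installer's actual code; the function and the retry cap are illustrative):

```python
import io

def ask_yes_no(stream, retries):
    """Sketch of the suspected failure mode: a confirmation prompt that
    re-asks on invalid input. On a non-interactive stdin, readline()
    returns '' (EOF) on every call, so a loop with no retry cap re-prints
    the prompt forever -- each pass appending another copy of the warning
    to the install log."""
    for _ in range(retries):  # cap added here for the sketch; the bug had none
        answer = stream.readline().strip().lower()
        if answer in ("y", "n"):
            return answer
        # '' means EOF: no user is attached, yet the loop asks again.
    return None

# stdin of a non-interactive run is immediately at EOF:
assert ask_yes_no(io.StringIO(""), retries=3) is None
# an interactive run answering 'n' terminates normally:
assert ask_yes_no(io.StringIO("n\n"), retries=3) == "n"
```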

Comment 1 Stephen Gordon 2012-09-27 16:12:39 UTC
For comparison, the log from a previous version/run (probably SI18 on this machine) is 36K.

Versions:

rhevm-dwh-3.1.0-13.el6ev.noarch
rhevm-reports-3.1.0-14.el6ev.noarch

Comment 2 David Botzer 2012-10-09 10:15:19 UTC
3.1/si20
To work around this, do the following:
1. dropdb rhevmreports -- but then you lose all previous ad hoc reports and any
   other configuration done (such as tasks)
2. rhevm-reports-setup

Comment 3 Yaniv Lavi 2012-10-09 10:17:07 UTC
This happens because the db is not dropped and jasper is waiting for user input.
Alex, please check that we are testing for the db correctly (even in a remote db setup).
I have also opened a case with jasper, case number 28910.


Yaniv

Comment 4 Alex Lourie 2012-10-15 16:07:51 UTC
Steve or David

Please reproduce this, write detailed reproduction steps and attach the log file of reports setup execution.

Thank you.

Comment 5 Stephen Gordon 2012-10-15 19:57:09 UTC
(In reply to comment #4)
> Steve or David
> 
> Please reproduce this, write detailed reproduction steps and attach the log
> file of reports setup execution.
> 
> Thank you.

As per the bug description, the way to reproduce was:

1) Upgrade from si19.1->si20:

# yum update rhevm-setup
# rhevm-upgrade
# yum update
# rhevm-dwh-setup
# rhevm-reports-setup

2) Wait for / to be flooded as a result of a quickly growing js-install-pro*.log. 
3) Profit? 

I no longer have logs from the original run, nor do I have an si19.1 setup to try again on. However I ran rhevm-setup on my si20 install to refresh the installation and it resulted in the same behavior.

While js-install-pro*.log has moved to /usr/share/jasperreports-server-pro/buildomatic/logs/, this file again grew to 14G and resulted in / being 100% full.

Comment 6 Stephen Gordon 2012-10-15 19:58:22 UTC
Created attachment 627654 [details]
Logs from latest run of rhevm-reports-setup.

Comment 7 Stephen Gordon 2012-10-15 20:04:17 UTC
(In reply to comment #5)
> I no longer have logs from the original run, nor do I have an si19.1 setup
> to try again on. However I ran rhevm-setup on my si20 install to refresh the
> installation and it resulted in the same behavior.

Obviously here I meant to say rhevm-reports-setup, not rhevm-setup. Log attached.

Comment 8 Stephen Gordon 2012-10-15 20:12:55 UTC
Note that the reports common_utils appear to perform the db existence check slightly differently from the dwh common_utils.

From /usr/share/ovirt-engine-reports/common_utils.py:

        "--set ON_ERROR_STOP=1",

From /usr/share/ovirt-engine-dwh/common_utils.py:

        "--set",
        "ON_ERROR_STOP=1",
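The distinction matters because each list element becomes one argv entry for psql: fused into a single element, "--set ON_ERROR_STOP=1" is not recognised as the --set option at all, so the existence check fails silently. A minimal illustration using Python's argparse as a stand-in for psql's option parser (this is not the rhevm code):

```python
import argparse

# A parser with a --set option that takes a value, analogous to psql's --set.
parser = argparse.ArgumentParser()
parser.add_argument("--set", dest="vars", action="append", default=[])

# Two separate argv elements (the dwh style): parsed as option + value.
ns, extra = parser.parse_known_args(["--set", "ON_ERROR_STOP=1"])
assert ns.vars == ["ON_ERROR_STOP=1"] and extra == []

# Fused into one element (the reports style): the parser does not match it
# to --set, so it falls through unconsumed and the option never takes effect.
ns, extra = parser.parse_known_args(["--set ON_ERROR_STOP=1"])
assert ns.vars == [] and extra == ["--set ON_ERROR_STOP=1"]
```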

Comment 9 Alex Lourie 2012-10-17 10:52:34 UTC
Hi Stephen

This actually seems to be the root of the problem. You're right about the code difference, but it should have thrown an exception during setup, not finished successfully.

Anyway, this problem has been fixed and you should get the fixed code in the next build. I would appreciate it if you would retest with build si21.

Thank you.

Comment 10 Yaniv Lavi 2012-10-29 14:23:38 UTC
Any news on this?
Can we close it?



Yaniv

Comment 11 Stephen Gordon 2012-10-30 18:23:27 UTC
(In reply to comment #10)
> Any news on this?

I installed the newer build and still have some hard drive space available. A significant improvement.

> Can we close it?

What about QE?

Comment 12 Alex Lourie 2012-10-31 10:50:24 UTC
(In reply to comment #11)
> (In reply to comment #10)
> > Any news on this?
> 
> I installed the newer build and still have some hard drive space available.
> A significant improvement.
> 

Great.

> > Can we close it?
> 
> What about QE?

Moving bug to QA for proper testing.

Note for QA: Please verify the fix with the last build.

Comment 13 David Botzer 2012-11-11 12:02:09 UTC
Fixed, 3.1/si24 (upgrade si23-si24)
Reports-setup works correctly

