Bug 1118350 - there must be at most one instance of dwh per engine
Summary: there must be at most one instance of dwh per engine
Keywords:
Status: CLOSED CURRENTRELEASE
Alias: None
Product: oVirt
Classification: Retired
Component: ovirt-engine-dwh
Version: 3.5
Hardware: Unspecified
OS: Unspecified
Priority: high
Severity: urgent
Target Milestone: ---
Target Release: 3.5.0
Assignee: Yedidyah Bar David
QA Contact: movciari
URL:
Whiteboard: integration
Depends On:
Blocks: 1122021 1140986
 
Reported: 2014-07-10 13:28 UTC by Yedidyah Bar David
Modified: 2014-10-17 12:21 UTC
CC List: 10 users

Fixed In Version: oVirt-3.5 GA ovirt-engine-dwh-3.5.0-1.fc19.noarch.rpm
Doc Type: Bug Fix
Doc Text:
Clone Of:
Cloned To: 1122021
Environment:
Last Closed: 2014-10-17 12:21:42 UTC
oVirt Team: ---


Attachments


Links
System        ID     Branch                Status  Summary                                                     Last Updated
oVirt gerrit  31321  master                MERGED  packaging: dbscripts: Add DWH hostname and uuid             Never
oVirt gerrit  31322  ovirt-engine-3.5      MERGED  packaging: dbscripts: Add DWH hostname and uuid             Never
oVirt gerrit  31325  master                MERGED  packaging: setup: Prevent more than one dwh per engine      Never
oVirt gerrit  31871  master                MERGED  history: updated etl to check valid installation            Never
oVirt gerrit  32820  master                MERGED  packaging: setup: use env statement at misc in single_etl   Never
oVirt gerrit  32976  ovirt-engine-dwh-3.5  MERGED  packaging: setup: use env statement at misc in single_etl   Never

Description Yedidyah Bar David 2014-07-10 13:28:46 UTC
Description of problem:

Since we now allow running dwh and engine on separate hosts, it's possible to set up two (or more) dwh instances against a single engine.

This will seem to work well - no conflicts, failures, etc. are expected - but in practice each update on the engine will be collected by only one of the dwh servers, so the history will be scattered across them and none of them will have a single correct view of it.

For now, we should prevent that. During setup, we should add a row somewhere in the engine db (Yaniv told me which table, but I don't remember at the moment) if it does not already exist, and do something if it does (abort, alert the user and ask for confirmation, etc.).

In the future we might decide that there is a use case for more than one dwh and add support for that.
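
The linked gerrit patches add the DWH hostname and uuid to the engine db and make setup refuse a second instance. As a rough illustration of the guard described above (not the shipped code: the table name dwh_history_timekeeping, its var_name/var_value columns, and the helper name below are assumptions), the setup-time check could look roughly like this in Python:

import uuid

import psycopg2


def register_single_dwh(engine_db_dsn, dwh_hostname):
    """Abort if another DWH is already registered, else record this one."""
    conn = psycopg2.connect(engine_db_dsn)
    try:
        with conn, conn.cursor() as cur:
            # Is a DWH already registered against this engine?
            cur.execute(
                "SELECT var_value FROM dwh_history_timekeeping "
                "WHERE var_name = %s",
                ('dwhHostname',),
            )
            row = cur.fetchone()
            if row and row[0] and row[0] != dwh_hostname:
                raise RuntimeError(
                    'A DWH instance is already set up on host %s; '
                    'at most one DWH per engine is allowed' % row[0]
                )
            # Record this DWH's hostname and a freshly generated uuid.
            dwh_uuid = str(uuid.uuid4())
            for name, value in (('dwhHostname', dwh_hostname),
                                ('dwhUuid', dwh_uuid)):
                cur.execute(
                    "UPDATE dwh_history_timekeeping SET var_value = %s "
                    "WHERE var_name = %s",
                    (value, name),
                )
                if cur.rowcount == 0:
                    cur.execute(
                        "INSERT INTO dwh_history_timekeeping "
                        "(var_name, var_value) VALUES (%s, %s)",
                        (name, value),
                    )
            return dwh_uuid
    finally:
        conn.close()

Whether setup aborts outright (as above) or instead warns and asks for confirmation is exactly the open question raised in the description.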

Comment 1 Shirly Radco 2014-07-24 07:56:49 UTC
I think we should have a field in the engine db that holds a key generated by the dwh, which can be used to verify that this is the correct dwh process.
Didi suggested checking the value on the engine to see if there is a process running.
I suggest a key so that if we kill -9 the dwh and the engine db is not updated about the dwh stopping, we will still be able to run it again.
Can we generate such a key and save it for the next run if the process stops?

Comment 2 Yedidyah Bar David 2014-07-24 08:07:57 UTC
(In reply to Shirly Radco from comment #1)
> I think we should have a field in the engine db that holds a key generated
> by the dwh, which can be used to verify that this is the correct dwh process.

Sounds reasonable. Where do you want to save it? Probably the simplest is to add a new file /etc/ovirt-engine-dwh/ovirt-engine-dwhd.conf.d/10-setup-dwhd.conf with a single key DWH_ID, or something like that.

> Didi suggested checking the value on the engine to see if there is a process running.

Don't remember that. Did I say who will check and how?

> I suggest a key so that if we kill -9 the dwh and the engine db is not
> updated about the dwh stopping, we will still be able to run it again.
> Can we generate such a key and save it for the next run if the process
> stops?
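
A minimal sketch of the key idea from comments 1 and 2, assuming the file path given in comment 2; the key name DWH_ID comes from the comment, while the helper name and exact file format are illustrative assumptions, not the shipped implementation:

import os
import uuid

# Path mentioned in comment 2; the file format below is an assumption.
DWH_ID_FILE = '/etc/ovirt-engine-dwh/ovirt-engine-dwhd.conf.d/10-setup-dwhd.conf'


def load_or_create_dwh_id(path=DWH_ID_FILE):
    """Return the locally stored DWH_ID, generating and saving one if missing."""
    if os.path.exists(path):
        with open(path) as f:
            for line in f:
                line = line.strip()
                if line.startswith('DWH_ID='):
                    # Survives unclean stops (kill -9): the id is read back
                    # from disk on the next run instead of being regenerated.
                    return line.split('=', 1)[1].strip('"')
    dwh_id = str(uuid.uuid4())
    with open(path, 'w') as f:
        f.write('DWH_ID="%s"\n' % dwh_id)
    return dwh_id

On startup, the ETL would then compare this local id with the value registered in the engine db and refuse to run against an engine owned by a different DWH, which is the "check valid installation" behaviour referenced in the gerrit links above.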

Comment 3 Yedidyah Bar David 2014-09-11 11:21:23 UTC
Fails on a clean setup on the same host as the engine.

Comment 4 Yedidyah Bar David 2014-09-14 06:35:14 UTC
*** Bug 1140986 has been marked as a duplicate of this bug. ***

Comment 5 Shirly Radco 2014-09-14 08:26:40 UTC
*** Bug 1141205 has been marked as a duplicate of this bug. ***

Comment 6 Sandro Bonazzola 2014-10-17 12:21:42 UTC
oVirt 3.5 has been released and should include the fix for this issue.

