Bug 1118350

Summary: there must be at most one instance of dwh per engine
Product: [Retired] oVirt
Reporter: Yedidyah Bar David <didi>
Component: ovirt-engine-dwh
Assignee: Yedidyah Bar David <didi>
Status: CLOSED CURRENTRELEASE
QA Contact: movciari
Severity: urgent
Docs Contact:
Priority: high
Version: 3.5
CC: bugs, didi, gklein, iheim, info, movciari, rbalakri, sradco, yeylon, ylavi
Target Milestone: ---
Target Release: 3.5.0
Hardware: Unspecified
OS: Unspecified
Whiteboard: integration
Fixed In Version: oVirt-3.5 GA ovirt-engine-dwh-3.5.0-1.fc19.noarch.rpm
Doc Type: Bug Fix
Doc Text:
Story Points: ---
Clone Of:
: 1122021 (view as bug list)
Environment:
Last Closed: 2014-10-17 12:21:42 UTC
Type: Bug
Regression: ---
Mount Type: ---
Documentation: ---
CRM:
Verified Versions:
Category: ---
oVirt Team: ---
RHEL 7.3 requirements from Atomic Host:
Cloudforms Team: ---
Target Upstream Version:
Embargoed:
Bug Depends On:
Bug Blocks: 1122021, 1140986

Description Yedidyah Bar David 2014-07-10 13:28:46 UTC
Description of problem:

Since we now allow running dwh and engine on separate hosts, it's possible to set up two (or more) dwh instances against a single engine.

This will seem to work well - no conflicts, failures, etc. are expected - but in practice each update on the engine will reach only one of the dwh servers, so the history will be scattered across them and none of them will have a single correct view of it.

For now, we should prevent that. During setup, we should add a row somewhere in the engine db (Yaniv told me which table, but I don't remember at the moment) if it does not exist already, and do something if it does (abort, alert the user and ask for confirmation, etc.).
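
To illustrate the intent, and no more than that - the table, the column names and the function below are made up for the example, psycopg2 is just assumed as the db driver, and none of this is actual ovirt-engine-dwh code - setup could do something like:

import psycopg2

def ensure_single_dwh(engine_db_dsn, dwh_hostname):
    # Abort setup if some dwh is already registered against this engine,
    # otherwise register this one.
    conn = psycopg2.connect(engine_db_dsn)
    try:
        with conn, conn.cursor() as cur:
            cur.execute(
                "SELECT var_value FROM dwh_registration"
                " WHERE var_name = %s",
                ('dwh_hostname',),
            )
            row = cur.fetchone()
            if row is not None:
                # Another dwh already claimed this engine - abort (or
                # alert the user and ask for confirmation, etc.).
                raise RuntimeError(
                    'dwh is already set up on host %s' % row[0]
                )
            cur.execute(
                "INSERT INTO dwh_registration (var_name, var_value)"
                " VALUES (%s, %s)",
                ('dwh_hostname', dwh_hostname),
            )
    finally:
        conn.close()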

In the future we might decide that there is use for more than one dwh and add support for that.

Comment 1 Shirly Radco 2014-07-24 07:56:49 UTC
I think we should have a field in the engine that holds a key generated by the dwh, which can verify this is the correct dwh process.
Didi suggested checking the value on the engine to see if there is a process running.
I suggest a key, so that if we use kill -9 and do not update the engine db about the dwh stopping, we will still be able to run it again.
Can we generate such a key and save it for the next run if the process stops?
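
Roughly, the startup check this implies could look like the sketch below (made-up names only, not actual dwhd code):

def may_start(my_key, key_in_engine_db):
    # my_key: the key this dwh generated at setup and saved locally.
    # key_in_engine_db: the key currently registered in the engine db.
    if key_in_engine_db is None:
        # Nothing registered yet - we can claim this engine.
        return True
    # Same key: we are the same dwh coming back up, e.g. after a kill -9
    # that never updated the engine db about dwh stopping.
    return key_in_engine_db == my_key

Since the key is saved and not kept only in memory, an unclean stop does not lock us out on the next run.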

Comment 2 Yedidyah Bar David 2014-07-24 08:07:57 UTC
(In reply to Shirly Radco from comment #1)
> I think we should have a field in the engine that holds a key generated by
> the dwh, which can verify this is the correct dwh process.

Sounds reasonable. Where do you want to save it? Probably simplest is to add a new file /etc/ovirt-engine-dwh/ovirt-engine-dwhd.conf.d/10-setup-dwhd.conf with a single key, DWH_ID or something like that.
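
For illustration, setup could generate and store it roughly like this (only the file path comes from the suggestion above; the KEY="value" quoting and the helper name are assumptions, not actual setup code):

import uuid

DWHD_SETUP_CONF = (
    '/etc/ovirt-engine-dwh/ovirt-engine-dwhd.conf.d/10-setup-dwhd.conf'
)

def write_dwh_id():
    # Written once by setup; dwhd would pick it up like any other
    # conf.d snippet.
    dwh_id = str(uuid.uuid4())
    with open(DWHD_SETUP_CONF, 'w') as f:
        f.write('DWH_ID="%s"\n' % dwh_id)
    return dwh_id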

> Didi suggested checking the value on the engine to see if there is a
> process running.

Don't remember that. Did I say who will check and how?

> I suggest a key, so that if we use kill -9 and do not update the engine db
> about the dwh stopping, we will still be able to run it again.
> Can we generate such a key and save it for the next run if the process
> stops?

Comment 3 Yedidyah Bar David 2014-09-11 11:21:23 UTC
Fails on a clean setup on the same host as the engine.

Comment 4 Yedidyah Bar David 2014-09-14 06:35:14 UTC
*** Bug 1140986 has been marked as a duplicate of this bug. ***

Comment 5 Shirly Radco 2014-09-14 08:26:40 UTC
*** Bug 1141205 has been marked as a duplicate of this bug. ***

Comment 6 Sandro Bonazzola 2014-10-17 12:21:42 UTC
oVirt 3.5 has been released and should include the fix for this issue.