Bug 1055153
Summary: vdsmd not starting on first run since vdsm logs are not included in rpm
Product: [Retired] oVirt
Component: vdsm
Version: 3.4
Status: CLOSED CURRENTRELEASE
Severity: urgent
Priority: urgent
Reporter: Andrew Lau <andrew>
Assignee: Douglas Schilling Landgraf <dougsland>
QA Contact: Jiri Belka <jbelka>
CC: acathrow, andrew, bazulay, brad, danken, didi, dougsland, eedri, fdeutsch, fw, iheim, info, jbelka, jboggs, mgoldboi, obasan, pstehlik, sbonazzo, scott, ybronhei, yeylon
Target Milestone: ---
Target Release: 3.4.0
Hardware: Unspecified
OS: Unspecified
Whiteboard: infra
Fixed In Version: ovirt-3.4.0-beta3
Doc Type: Bug Fix
Cloned as: 1061561 (view as bug list)
Last Closed: 2014-03-31 12:33:23 UTC
Type: Bug
Bug Blocks: 1024889, 1061561, 1065924
Description
Andrew Lau
2014-01-19 08:49:17 UTC
Andrew, can you reproduce and attach logs from ovirt-hosted-engine-setup and from vdsm?

Sorry, I don't have any test equipment on hand right now, but I have my list of steps on how to reproduce if that helps. I'll try to get my hands on a dev server and report back.

Created attachment 854408 [details]: failure log from 3.4 beta

Created attachment 854416 [details]: ovirt hosted engine setup failure log 3.4 beta

Created attachment 854417 [details]: supervdsm log from failure
I'm contributing logs at the request of Andrew, as my failure is identical. See the thread "oVirt 3.4.0 beta - Hosted Engine Setup -- issues" on the oVirt users mailing list.

The vdsm log is empty:

# ls -l /var/log/vdsm/vdsm.log
-rw-r--r-- 1 root root 0 Jan 20 10:13 /var/log/vdsm/vdsm.log

# ps -ef | grep -i vdsm
root      1628     1  0 09:16 ?        00:00:00 /usr/bin/python /usr/share/vdsm/supervdsmServer --sockfile /var/run/vdsm/svdsm.sock

# service vdsmd status
Redirecting to /bin/systemctl status vdsmd.service
vdsmd.service - Virtual Desktop Server Manager
   Loaded: loaded (/usr/lib/systemd/system/vdsmd.service; enabled)
   Active: failed (Result: start-limit) since Thu 2014-01-23 09:16:38 EST; 1min 26s ago
  Process: 2387 ExecStopPost=/usr/libexec/vdsm/vdsmd_init_common.sh --post-stop (code=exited, status=0/SUCCESS)
  Process: 2380 ExecStart=/usr/share/vdsm/daemonAdapter -0 /dev/null -1 /dev/null -2 /dev/null /usr/share/vdsm/vdsm (code=exited, status=1/FAILURE)
  Process: 2328 ExecStartPre=/usr/libexec/vdsm/vdsmd_init_common.sh --pre-start (code=exited, status=0/SUCCESS)

Jan 23 09:16:38 ovirttest.internal.monetra.com systemd[1]: Unit vdsmd.service entered failed state.
Jan 23 09:16:38 ovirttest.internal.monetra.com systemd[1]: vdsmd.service holdoff time over, scheduling restart.
Jan 23 09:16:38 ovirttest.internal.monetra.com systemd[1]: Stopping Virtual Desktop Server Manager...
Jan 23 09:16:38 ovirttest.internal.monetra.com systemd[1]: Starting Virtual Desktop Server Manager...
Jan 23 09:16:38 ovirttest.internal.monetra.com systemd[1]: vdsmd.service start request repeated too quickly, refusing to start.
Jan 23 09:16:38 ovirttest.internal.monetra.com systemd[1]: Failed to start Virtual Desktop Server Manager.
Jan 23 09:16:38 ovirttest.internal.monetra.com systemd[1]: Unit vdsmd.service entered failed state.

I've attached the supervdsm.log and ovirt-hosted-engine-setup-20140123091611.log in case that's helpful.
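The failure pattern in this thread boils down to one ownership check. A minimal sketch of that check (the vdsm path is the real one; the mktemp fallback is only so the snippet runs on a host without vdsm installed, and the actual fix commands are left as comments because they need root and the vdsm user):

```shell
# Ownership check behind the workaround in this thread (sketch).
LOG=/var/log/vdsm/vdsm.log
[ -r "$LOG" ] || LOG=$(mktemp)      # demo fallback for hosts without vdsm
owner=$(stat -c '%U:%G' "$LOG")
echo "owner of $LOG is $owner"
if [ "$owner" != "vdsm:kvm" ]; then
    # The actual remedy (root required, host must have the vdsm user):
    #   chown vdsm:kvm /var/log/vdsm/vdsm.log
    #   systemctl reset-failed vdsmd.service && systemctl start vdsmd.service
    echo "ownership is wrong; vdsmd will die on startup and hit the start limit"
fi
```

`reset-failed` matters because, as the status output above shows, systemd refuses further starts once the start limit trips, even after the underlying cause is fixed.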
It seems that vdsm is not working correctly on first run:

respawn: slave '/usr/share/vdsm/vdsm --pidfile /var/run/vdsm/vdsmd.pid' died too quickly for more than 30 seconds, master sleeping for 900 seconds

Just noticed that /var/log/vdsm/vdsm.log has the wrong ownership of root:root, while it should be vdsm:kvm. Manually changing the ownership to vdsm:kvm fixes the vdsm service issue for me.

Please see https://www.mail-archive.com/users@ovirt.org/msg13907.html for a full description of my environment, settings, etc. Also, Frank (fraenki) was on IRC and determined a workaround: it is a permission issue on vdsm.log, simply

    chown vdsm:kvm /var/log/vdsm/vdsm.log

and it works. Here is a cut-and-paste of my steps and environment to reproduce from that mailing list:

- Fedora 19 (64bit), minimal install, selected 'standard' add-on utilities. This was a fresh install just for this test.
- 512MB /boot ext4
- 80GB / ext4 in LVM, 220GB free in VG
- "yum -y update" performed to get all latest updates
- SELinux in permissive mode
- Hardware:
  - Supermicro 1026T-URF barebones
  - single CPU populated (Xeon E5630 4x2.53GHz)
  - 12GB ECC DDR3 RAM
  - H/W RAID with SSDs
- Networking:
  - NetworkManager DISABLED
  - 4 GbE ports (p2p1, p2p2, em1, em2)
  - all 4 ports configured in a bond (bond0) using balance-alb
  - ovirtmgmt bridge pre-created with 'bond0' as the only member, assigned a static IP address
- firewall DISABLED
- /etc/yum.repos.d/ovirt.repo has ONLY the 3.4.0 beta repo enabled:

    [ovirt-3.4.0-beta]
    name=3.4.0 beta testing repo for the oVirt project
    baseurl=http://ovirt.org/releases/3.4.0-beta/rpm/Fedora/$releasever/
    enabled=1
    skip_if_unavailable=1
    gpgcheck=0

- only other packages installed were "ntp" and "screen"
- performed "yum install ovirt-hosted-engine-setup", then "screen", then "hosted-engine --deploy"

Sandro, I have no idea how this could be related to multiple NICs, but could it be that ovirt-hosted-engine-setup has created /var/log/vdsm/vdsm.log as root instead of with the correct ownership? If so, please take the bug back.

Oh, I see that you are already investigating this.
On Thu, Jan 23, 2014 at 03:42:09PM +0100, Sandro Bonazzola wrote:
> It turned out that changing the owner to vdsm:kvm on /var/log/vdsm/vdsm.log solves the issue.
> Investigating how the owner has been set to root:root on that file.
Please fix vdsm packaging following https://fedoraproject.org/wiki/PackagingDrafts/Logfiles

# rpm -qf /var/log/vdsm/vdsm.log
file /var/log/vdsm/vdsm.log is not owned by any package

It must be created by rpm with proper permissions and owned by it.

Thanks, Sandro. I do not mind adding /var/log/vdsm/vdsm.log to the rpm. But do you know what has created it with the wrong ownership? Another bug may be lurking there.

I don't really know. On a clean system:
- yum install ovirt-hosted-engine-setup just installs vdsm as a dependency.
- hosted-engine --deploy then calls:
  - vdsm-tool configure --force
  - on systemd: just start the vdsmd service
  - on sysvinit: first start cgconfig, messagebus, libvirtd and then vdsmd.

And nothing writes into vdsm.log from hosted-engine code directly.

(In reply to Sandro Bonazzola from comment #14)
> I don't really know. On a clean system:
> [...]
> And nothing writes into vdsm.log from hosted-engine code directly.

Perhaps logrotate?

AFAIK logrotate does not create the file, and on rotation it creates it with the original permissions, which means the first user who writes to vdsm.log will own the file. It should be vdsm, but maybe we have a glitch. Better to understand what was changed here; I suspect the usages of vdsm-tool, which also loads vdsm.logger.conf in lib/vdsm/tool/upgrade.py, for example.

BTW, http://gerrit.ovirt.org/23696 fixes the issue for rpm-based distributions only. Douglas, please move the code to Makefile's install target, so it works for non-rpm distributions, too.

*** Bug 1055129 has been marked as a duplicate of this bug. ***

Douglas, please give /var/log/vdsm/metadata.log (and all other logs normally seen on a vdsm installation) the same treatment.
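The Fedora Logfiles draft referenced above amounts to shipping the (empty) log file in the package with the right attributes, so rpm both owns it and sets its ownership at install time. A sketch of what that looks like in a spec file; this is an illustration of the technique, not the literal gerrit patch:

```
# vdsm.spec (sketch, not the actual patch): install an empty log so rpm
# owns the file and stamps vdsm:kvm ownership from the very first boot
%install
install -d -m 0755 %{buildroot}%{_localstatedir}/log/vdsm
touch %{buildroot}%{_localstatedir}/log/vdsm/vdsm.log

%files
%dir %attr(0755, vdsm, kvm) %{_localstatedir}/log/vdsm
%attr(0644, vdsm, kvm) %{_localstatedir}/log/vdsm/vdsm.log
```

With the file packaged this way, `rpm -qf /var/log/vdsm/vdsm.log` reports the vdsm package instead of "not owned by any package", and nothing else gets a chance to create the file as root first.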
*** Bug 1057119 has been marked as a duplicate of this bug. ***

Keeping this as NEW, as there is a discussion ongoing in gerrit.

Hi,

(In reply to Sandro Bonazzola from comment #14)
> I don't really know. On a clean system:
> [...]
> And nothing writes into vdsm.log from hosted-engine code directly.

I have executed hosted-engine --deploy manually (I have one NIC on this host) and checked the logs; the permissions are fine.

# ls -la /var/log/vdsm/
total 156
drwxr-xr-x.  3 vdsm kvm    4096 Jan 27 13:47 .
drwxr-xr-x. 20 root root   4096 Jan 27 13:40 ..
drwxr-xr-x.  2 vdsm kvm    4096 Jan 21 11:20 backup
-rw-r--r--.  1 vdsm kvm       0 Jan 27 13:42 metadata.log
-rw-r--r--.  1 vdsm kvm    1983 Jan 27 13:50 mom.log
-rw-r--r--.  1 root root   8223 Jan 27 13:50 supervdsm.log
-rw-r--r--.  1 vdsm kvm  125211 Jan 27 13:50 vdsm.log

# rpm -qa | grep -i hosted
ovirt-hosted-engine-setup-1.1.0-0.1.beta1.fc19.noarch
ovirt-hosted-engine-ha-1.1.0-0.1.beta1.fc19.noarch

I will try a new fresh install.

(In reply to Sandro Bonazzola from comment #12)
> Please, fix vdsm packaging following
> https://fedoraproject.org/wiki/PackagingDrafts/Logfiles
> [...]
> it must be created by rpm with proper permission and owned by it.

We have fixed that with the attached patches: as soon as the logs get created by vdsm with the right permissions, they will belong to the vdsm package. But that might not resolve the main problem if somehow the log changes ownership to root:root. Andrew, did it happen more than once? Can you reproduce it with the above scenario?

I did a fresh install and was able to reproduce the problem this way:

1. Installed RHEL 6.5 and all updates.
2. Added and enabled the oVirt 3.4 *nightly* repository.
3. Installed ovirt-hosted-engine-setup-1.2.0-0.0.master.20140117.gitfaf77a5.el6.noarch and all dependencies.
4. Running `hosted-engine --deploy` the first time I got this error:

       Command '/sbin/service' failed to execute

   The setup log had a more detailed error message, see attached file 34-hosted-engine-fail1.txt. Still, no vdsm.log was created, only supervdsm.log.
5. Running `hosted-engine --deploy` again revealed a different error:

       Failed to execute stage 'Environment customization': [Errno 111] Connection refused

   And now vdsm.log is present and is owned by root:root, which apparently is wrong. The full error from the setup log is attached as 34-hosted-engine-fail2.txt.
6. Fixed the ownership of vdsm.log and metadata.log to vdsm:kvm and manually restarted both supervdsmd and vdsmd (multiple times). After seeing log messages in vdsm.log I continued to the next step.
7. The third run of `hosted-engine --deploy` was successful.

Created attachment 856259 [details]: see comment 24, first run of hosted-engine --deploy

Created attachment 856260 [details]: see comment 24, second run of hosted-engine --deploy

*** Bug 1058536 has been marked as a duplicate of this bug. ***

Hello Frank/Andrew/Brad,

Thanks for your logs and help. I did a scratch build (not official) based on the vdsm available in the beta channel, including the 3 patches below. Can you please help testing it?

- vdsm.spec: vdsm should own vdsm.log: http://gerrit.ovirt.org/#/c/23718/
- vdsm.spec: own metadata supervdsm mom logs: http://gerrit.ovirt.org/#/c/23786/
- configurator: use sanlock group for sanlock check: http://gerrit.ovirt.org/#/c/23788/

To download:
EL6: http://koji.fedoraproject.org/koji/taskinfo?taskID=6462288
F19: http://koji.fedoraproject.org/koji/taskinfo?taskID=6462299

Thanks!

Hi Douglas,

I'm going to add an extra host to the cluster to try this. Do I need to grab all those RPMs individually, or can I just take http://kojipkgs.fedoraproject.org//work/tasks/2289/6462289/vdsm-4.14.1-3.el6.x86_64.rpm ?

Cheers.
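The listings in this thread settle on one consistent expectation: supervdsmd runs as root, so supervdsm.log is root:root, while every other file under /var/log/vdsm should be vdsm:kvm. A small sketch of that rule as a checkable table (file names taken from the listings above; this is an illustrative helper, not vdsm code):

```shell
# Expected ownership per log file, per the listings in this bug (sketch).
expected_owner() {
    case "$1" in
        supervdsm.log) echo "root:root" ;;  # supervdsmd runs as root
        *)             echo "vdsm:kvm"  ;;  # vdsm.log, metadata.log, mom.log
    esac
}

for f in vdsm.log metadata.log mom.log supervdsm.log; do
    printf '%-15s -> %s\n' "$f" "$(expected_owner "$f")"
done
```

Comparing `stat -c '%U:%G'` output against this table is exactly the check the reporters did by hand in steps 4-6 above.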
Hi Andrew,

(In reply to Andrew Lau from comment #29)
> I'm going to add an extra host to the cluster to try this, do I need to grab all
> those RPMs individually or can I just take
> http://kojipkgs.fedoraproject.org//work/tasks/2289/6462289/vdsm-4.14.1-3.el6.x86_64.rpm ?

On a fresh Fedora or CentOS install:

As soon as you have ovirt-hosted-engine-setup installed:

# rpm -qa | grep -i vdsm
# yum remove <all those vdsm packages>

(Please note that ovirt-hosted-engine-setup will probably be removed as a dependency.)

Now you will need to download the same packages you removed, probably:

http://kojipkgs.fedoraproject.org/work/tasks/2289/6462289/vdsm-4.14.1-3.el6.x86_64.rpm
http://kojipkgs.fedoraproject.org/work/tasks/2289/6462289/vdsm-python-4.14.1-3.el6.x86_64.rpm
http://kojipkgs.fedoraproject.org/work/tasks/2289/6462289/vdsm-python-zombiereaper-4.14.1-3.el6.noarch.rpm
http://kojipkgs.fedoraproject.org/work/tasks/2289/6462289/vdsm-xmlrpc-4.14.1-3.el6.noarch.rpm

Then re-install ovirt-hosted-engine-setup and give it a try.

Hi,

For those following, you also want to grab http://kojipkgs.fedoraproject.org/work/tasks/2289/6462289/vdsm-cli-4.14.1-3.el6.noarch.rpm

Unfortunately I hit a strange new error:

[ ERROR ] Failed to execute stage 'Environment customization': [Errno 1] _ssl.c:492: error:14090086:SSL routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed

vdsm.log: http://www.fpaste.org/72370/90917878/
ovirt-hosted-engine-setup: http://www.fpaste.org/72371/91789613/

Also, supervdsm.log is owned by root:root:

drwxr-xr-x. 2 vdsm kvm   4096 Jan 28 16:48 backup
-rw-r--r--. 1 vdsm kvm      0 Jan 29  2014 metadata.log
-rw-r--r--. 1 vdsm kvm    822 Jan 29 00:58 mom.log
-rw-r--r--. 1 root root  1403 Jan 29 00:58 supervdsm.log
-rw-r--r--. 1 vdsm kvm  30962 Jan 29 01:02 vdsm.log

Additional notes: I forgot to resolve the hostname the first time I ran the hosted-engine --deploy command.
I'm not sure if that caused this issue, but in the past those tiny things with a failed first run seemed to have been the cause and source of my previous workarounds. I will retry when time permits.

Hi Andrew,

First, thanks for your quick reply.

(In reply to Andrew Lau from comment #31)
> For those following you also want to grab
> http://kojipkgs.fedoraproject.org/work/tasks/2289/6462289/vdsm-cli-4.14.1-3.el6.noarch.rpm

Perfect.

> Unfortunately hit a strange new error:
> [ ERROR ] Failed to execute stage 'Environment customization': [Errno 1]
> _ssl.c:492: error:14090086:SSL
> routines:SSL3_GET_SERVER_CERTIFICATE:certificate verify failed
>
> vdsm.log
> http://www.fpaste.org/72370/90917878/

From the logs I can see: "SSLError: sslv3 alert bad certificate, client 127.0.0.1". Could VDSM have generated the certificate for the wrong hostname (as you mentioned below in "Additional notes")? The commands below would help to check:

# openssl x509 -in /etc/pki/vdsm/certs/cacert.pem -noout -text
# openssl x509 -in /etc/pki/vdsm/certs/vdsmcert.pem -noout -text
# openssl verify -CAfile /etc/pki/vdsm/certs/cacert.pem /etc/pki/vdsm/certs/vdsmcert.pem

> ovirt-hosted-engine-setup
> http://www.fpaste.org/72371/91789613/

The previous error "Command '/sbin/service' failed to execute" is gone. Patch http://gerrit.ovirt.org/#/c/23788/ worked :)

> Also, supervdsm.log is owned by root:root
> [...]

That's OK (supervdsm runs as root). The patches below worked too:

- vdsm.spec: vdsm should own vdsm.log: http://gerrit.ovirt.org/#/c/23718/
- vdsm.spec: own metadata supervdsm mom logs: http://gerrit.ovirt.org/#/c/23786/

> Additional notes:
>
> I forgot to resolve the hostname the first time I ran the hosted-engine
> --deploy command. I'm not sure if that caused this issue, but in the past
> those tiny things with a failed first run seemed to have been the cause and
> source of my previous workarounds. I will retry when time permits.

Thanks!

No luck with that SSL check; it just hung there for hours, so I went with a fresh install. Success!!

yum install ovirt-hosted-engine-setup
yum remove vdsm*
rm -rf /var/log/vdsm/
yum install \
  http://kojipkgs.fedoraproject.org/work/tasks/2289/6462289/vdsm-4.14.1-3.el6.x86_64.rpm \
  http://kojipkgs.fedoraproject.org/work/tasks/2289/6462289/vdsm-python-4.14.1-3.el6.x86_64.rpm \
  http://kojipkgs.fedoraproject.org/work/tasks/2289/6462289/vdsm-python-zombiereaper-4.14.1-3.el6.noarch.rpm \
  http://kojipkgs.fedoraproject.org/work/tasks/2289/6462289/vdsm-xmlrpc-4.14.1-3.el6.noarch.rpm \
  http://kojipkgs.fedoraproject.org/work/tasks/2289/6462289/vdsm-cli-4.14.1-3.el6.noarch.rpm
yum install ovirt-hosted-engine-setup

ovirt-hosted-engine-setup did not error.

Cheers,
Andrew.

Thanks Andrew!!!

No problems. How soon till these will appear in the nightly?

Hi Andrew,

(In reply to Andrew Lau from comment #35)
> No problems. How soon till these will appear in the nightly?

The patch is under review. To speed up the process you could go to http://gerrit.ovirt.org/#/c/23788/ and click on Review -> Verified, which means that you have tested it, though it still needs another developer's review before it gets merged. Thanks!

Let's separate out [1], which relates to bug 1057225, from the log issue. The log issue was already merged.

[1] http://gerrit.ovirt.org/#/c/23788/

*** Bug 1066509 has been marked as a duplicate of this bug. ***

Using vdsm-4.14.4-2.git74b4a27 I see this problem again on ovirt-live.

(In reply to Ohad Basan from comment #39)
> using vdsm-4.14.4-2.git74b4a27 I see this problem again on ovirt-live

Can you please provide details about the error you see? Thanks.

Sure. I see that vdsm doesn't come up, vdsm.log's owner is "root", and rpm -qf shows the vdsm log has no owning package.

(In reply to Ohad Basan from comment #41)
> I see that vdsm doesn't come up
> vdsm.log owner is "root"
> and rpm -qf shows the vdsm log has no owner package

Hello Ohad,

I see: http://resources.ovirt.org/releases/3.4.0-beta3/rpm/Fedora/19/x86_64/vdsm-4.14.3-0.fc19.x86_64.rpm

Shouldn't ovirt-live use it instead of vdsm-4.14.4-2.git74b4a27, as the fixed-in version says?

We are using the el6 variant: http://resources.ovirt.org/pub/ovirt-3.4-snapshot/rpm/el6/x86_64/

Hello Ohad,

http://jenkins.ovirt.org/job/vdsm_create_rpms_3.4 was generating the package in a wrong way. It used:

./autogen.sh --system --enable-hooks

which generates localstatedir=/var. Then it got replaced with:

./configure --enable-hooks

which sets localstatedir=$prefix/var, where prefix is /usr/local. David Caro fixed it and we should be fine; moving again to ON_QA. Feel free to move it back to ASSIGNED in case you see the problem again. Thanks!

OK, vdsm runs fine after first run:

# /etc/init.d/vdsmd status
VDS daemon server is running

# ls -lZ /var/log/vdsm/
drwxr-xr-x. vdsm kvm  system_u:object_r:var_log_t:s0      backup
-rw-r--r--. vdsm kvm  unconfined_u:object_r:var_log_t:s0  metadata.log
-rw-r--r--. vdsm kvm  unconfined_u:object_r:var_log_t:s0  mom.log
-rw-r--r--. root root unconfined_u:object_r:var_log_t:s0  supervdsm.log
-rw-r--r--. vdsm kvm  unconfined_u:object_r:var_log_t:s0  vdsm.log

# rpm -q vdsm
vdsm-4.14.3-0.el6.x86_64

*** Bug 1064047 has been marked as a duplicate of this bug. ***

Still having this bug with the latest node image: ovirt-node-iso-3.0.4-1.0.201401291204.vdsm34rc2.el6.iso

vdsm-python-zombiereaper-4.14.5-0.el6.noarch
vdsm-cli-4.14.5-0.el6.noarch
vdsm-reg-4.14.5-0.el6.noarch
vdsm-xmlrpc-4.14.5-0.el6.noarch
vdsm-4.14.5-0.el6.x86_64
vdsm-gluster-4.14.5-0.el6.noarch
ovirt-node-plugin-vdsm-0.1.1-10.el6.noarch
vdsm-python-4.14.5-0.el6.x86_64

ls -lart
total 24
drwxr-xr-x.  2 vdsm kvm  4096 2014-03-17 10:13 backup
drwxr-xr-x. 13 root root 4096 2014-03-17 10:28 ..
-rw-r--r--.  1 root root 5464 2014-03-17 10:28 supervdsm.log
-rw-r--r--.  1 root root    0 2014-03-17 10:28 vdsm.log
-rw-r--r--.  1 root root    0 2014-03-17 10:28 metadata.log
drwxr-xr-x.  3 vdsm kvm  4096 2014-03-17 10:28 .
-rw-r--r--.  1 vdsm kvm   352 2014-03-17 10:28 upgrade.log

Reopening based on comment #47. This is a blocker for the 3.4.0 GA release, please fix ASAP.

(In reply to Sandro Bonazzola from comment #48)
> Reopening based on comment #47. This is blocker for 3.4.0 GA release, please
> fix ASAP.

Howdy Fabian,

It looks like the oVirt Node installer needs to be updated to include a persist call for the files below, which are already generated during the vdsm rpm install. The files you see in comment #47 are what you get when the rpm-installed ones don't survive.

From a clean install of ovirt-node-iso-3.0.4-1.0.201401291204.vdsm34rc2.el6.iso:

# ls -la /var/log/vdsm
drwxr-xr-x.  3 36   kvm  4096 Mar 14 06:25 .
drwxr-xr-x. 11 root root 4096 Mar 14 06:29 ..
drwxr-xr-x.  2 36   kvm  4096 Mar 10 09:22 backup
-rw-r--r--.  1 root root    0 Mar 14 06:25 supervdsm.log

On the other hand, mounting the image, we have the files:

# wget http://resources.ovirt.org/releases/3.4.0_pre/iso/ovirt-node-iso-3.0.4-1.0.201401291204.vdsm34rc2.el6.iso
# mkdir /tmp/{1,2,3}
# mount -o loop ovirt-node-iso-3.0.4-1.0.201401291204.vdsm34rc2.el6.iso /tmp/1
# mount /tmp/1/LiveOS/squashfs.img /tmp/2
# mount /tmp/2/LiveOS/ext3fs.img /tmp/3
# ls -la /tmp/3/var/log/vdsm
total 40
drwxr-xr-x.  3 36   kvm  4096 Mar 14 06:25 .
drwxr-xr-x. 11 root root 4096 Mar 14 06:29 ..
drwxr-xr-x.  2 36   kvm  4096 Mar 10 09:22 backup
-rw-r--r--.  1 36   kvm     0 Mar 14 06:25 metadata.log
-rw-r--r--.  1 36   kvm     0 Mar 14 06:25 mom.log
-rw-r--r--.  1 root root    0 Mar 14 06:25 supervdsm.log
-rw-r--r--.  1 36   kvm     0 Mar 14 06:25 vdsm.log

Thanks for this analysis, Douglas.

The problem is actually within the default CentOS+Fedora /etc/rwtab, which contains:

$ grep var/log /etc/rwtab
dirs /var/log

That means that all dirs below /var/log will be copied, but no files; effectively this removes all the files. Is there the possibility that you create the files when the daemon is started?

(In reply to Fabian Deutsch from comment #50)
> [...]
> Is there the possibility that you create the files when the daemon is
> started?

Moving this bug back to VERIFIED, since this is a different bug. Fabian, Douglas, please open a new BZ for the rwtab issue.

This is an automated message: moving to CLOSED CURRENTRELEASE since oVirt 3.4.0 has been released.

Just for the record (and to remove the needinfo):

(In reply to Fabian Deutsch from comment #50)
> [...]
> Is there the possibility that you create the files when the daemon is
> started?

We are doing it at this moment.
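Fabian's rwtab observation also suggests a Node-side alternative to creating the files at daemon start. This is a hypothetical override (assumed file name, untested), based on the readonly-root rwtab keywords: `dirs` recreates only the directory skeleton on the writable copy, while `files` copies the contents as well:

```
# /etc/rwtab.d/vdsm (hypothetical sketch): persist the vdsm log files
# themselves, not just the empty directory tree that the stock
# "dirs /var/log" entry in /etc/rwtab provides
files /var/log/vdsm
```

Either approach restores the rpm-installed vdsm:kvm log files on a live/readonly-root image; per the closing comment, vdsm went with creating the files at daemon startup instead.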