Bug 761035 - Build hangs when multiple providers specified
Summary: Build hangs when multiple providers specified
Keywords:
Status: CLOSED WORKSFORME
Alias: None
Product: CloudForms Cloud Engine
Classification: Retired
Component: imagefactory
Version: 1.0.0
Hardware: Unspecified
OS: Unspecified
Priority: unspecified
Severity: unspecified
Target Milestone: rc
Assignee: Ian McLeod
QA Contact: Martin Kočí
URL:
Whiteboard: HUDSONdone=bug761035.py
Depends On:
Blocks: ce-sprint-next
 
Reported: 2011-12-07 15:44 UTC by Steve Reichard
Modified: 2011-12-19 07:45 UTC
CC List: 8 users

Fixed In Version:
Doc Type: Bug Fix
Doc Text:
Clone Of:
Environment:
Last Closed: 2011-12-07 23:12:23 UTC
Embargoed:


Attachments
logs tarball (244.03 KB, application/x-compressed-tar), 2011-12-07 15:44 UTC, Steve Reichard
imagefactory.log (57.22 KB, text/plain), 2011-12-07 23:12 UTC, wes hayutin

Description Steve Reichard 2011-12-07 15:44:38 UTC
Created attachment 542012 [details]
logs tarball

Description of problem:

After having successfully built & pushed a simple r61 system for each of rhevm, vsphere, and ec2, I attempted to build all 3 at the same time.

The build has just hung.

I've attached a tarball with several logs.


Version-Release number of selected component (if applicable):

[root@cf-cloudforms9 ~]# /pub/scripts/post_install_configuration_scripts/cf-versions 
Red Hat Enterprise Linux Server release 6.1 (Santiago)
Linux cf-cloudforms9.cloud.lab.eng.bos.redhat.com 2.6.32-131.17.1.el6.x86_64 #1 SMP Thu Sep 29 10:24:25 EDT 2011 x86_64 x86_64 x86_64 GNU/Linux
postgresql-8.4.9-1.el6_1.1.x86_64
mongodb-1.8.0-6.el6.x86_64
euca2ools-1.3.1-4.el6_0.noarch
ruby-1.8.7.299-7.el6_1.1.x86_64
rubygems-1.8.10-1.el6.noarch
deltacloud-core-0.4.1-8.el6.noarch
rubygem-deltacloud-client-0.4.0-3.el6.noarch
package libdeltacloud is not installed
hail-0.8-0.2.gf9c5b967.el6_0.x86_64
puppet-2.6.6-1.el6_0.noarch
aeolus-configure-2.3.0-1.el6.noarch
iwhd-1.0-1.el6.x86_64
imagefactory-0.8.9-1.el6.noarch
aeolus-conductor-daemons-0.6.0-3.el6.noarch
aeolus-conductor-0.6.0-3.el6.noarch
You have new mail in /var/spool/mail/root
[root@cf-cloudforms9 ~]# 

How reproducible:


Saw this a couple of sprints ago and brought it to imcleod's attention. The CLI made it impossible to attempt last sprint.

At this time I'm unsure of repeatability for this sprint; I'll need to get my system unstuck first.

Steps to Reproduce:
1.
2.
3.
  
Actual results:


Expected results:


Additional info:

Comment 1 Steve Reichard 2011-12-07 19:49:17 UTC
Command:

aeolus-cli build --target rhevm,vsphere,ec2 --template /pub/projects/cloudforms/files/templates/r61-brew.xml


template:

[root@ra-users templates]# cat r61-brew.xml 
<?xml version="1.0" encoding="UTF-8"?>
<template>
  <name>first_brew</name>
  <description>brew_test1</description>
  <os>
    <name>RHEL-6</name>
    <version>1</version>
    <arch>x86_64</arch>
    <install type="url">
      <url>http://download.devel.redhat.com/nightly/latest-RHEL6.1/6/Server/x86_64/os/</url>
    </install>
    <rootpw>redhat</rootpw>
  </os>
</template>
[root@ra-users templates]#
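Before submitting a multi-target build, it can help to sanity-check the template. A minimal sketch (not part of aeolus-cli or imagefactory; the required-field list is an assumption based on the r61-brew.xml example above):

```python
# Hypothetical helper (assumption, not an aeolus/imagefactory API): check
# that a TDL template carries the fields used in the r61-brew.xml example.
import xml.etree.ElementTree as ET

REQUIRED = ["name", "os/name", "os/version", "os/arch",
            "os/install/url", "os/rootpw"]

def missing_fields(template_xml):
    """Return the template paths that are absent or empty."""
    root = ET.fromstring(template_xml)
    missing = []
    for path in REQUIRED:
        node = root.find(path)
        if node is None or not (node.text or "").strip():
            missing.append(path)
    return missing
```

Running it against the template above returns an empty list; dropping, say, `<rootpw>` would report `os/rootpw`.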

Comment 2 wes hayutin 2011-12-07 19:59:13 UTC

*** This bug has been marked as a duplicate of bug 721097 ***

Comment 3 wes hayutin 2011-12-07 23:12:23 UTC
[root@qeblade30 ~]# aeolus-cli build --target ec2,vsphere,rhevm --template RHEL61.tpl
Image: f92495e0-2a49-46f7-ab4a-f60c6e4627b7
Build: be2ccca8-4bfb-49ae-b2c3-e8478fcd7ddc
Target Image: 2afa45ef-638f-46e2-b49a-d3105f9dd609	 :Status New
Target Image: 6a07da02-f737-4ea3-9757-202fb072a07f	 :Status BUILDING
Target Image: c8bd0034-4e78-4e66-905a-4dab0be277ea	 :Status New
[root@qeblade30 ~]# 



[root@qeblade30 ~]# aeolus-cli status --targetimage 2afa45ef-638f-46e2-b49a-d3105f9dd609
Build Status: COMPLETE
[root@qeblade30 ~]# aeolus-cli status --targetimage 6a07da02-f737-4ea3-9757-202fb072a07f
Build Status: BUILDING
[root@qeblade30 ~]# aeolus-cli status --targetimage c8bd0034-4e78-4e66-905a-4dab0be277ea
Build Status: New
[root@qeblade30 ~]# aeolus-cli status --targetimage 6a07da02-f737-4ea3-9757-202fb072a07f
Build Status: BUILDING
[root@qeblade30 ~]# aeolus-cli status --targetimage c8bd0034-4e78-4e66-905a-4dab0be277ea
Build Status: New
[root@qeblade30 ~]# aeolus-cli status --targetimage 6a07da02-f737-4ea3-9757-202fb072a07f
Build Status: COMPLETE
[root@qeblade30 ~]# aeolus-cli status --targetimage c8bd0034-4e78-4e66-905a-4dab0be277ea
Build Status: BUILDING
[root@qeblade30 ~]# aeolus-cli status --targetimage c8bd0034-4e78-4e66-905a-4dab0be277ea
Build Status: BUILDING
[root@qeblade30 ~]# aeolus-cli status --targetimage c8bd0034-4e78-4e66-905a-4dab0be277ea
Build Status: BUILDING
[root@qeblade30 ~]# aeolus-cli status --targetimage c8bd0034-4e78-4e66-905a-4dab0be277ea
Build Status: COMPLETE
[root@qeblade30 ~]# 
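The manual checks above can be sketched as a small polling loop. This is an illustration only: `query_status` is a hypothetical callable standing in for `aeolus-cli status --targetimage <id>` (for real use you would swap in a subprocess call), and the terminal-state set is an assumption based on the statuses seen above.

```python
# Sketch of the manual polling above (assumption: not part of aeolus-cli).
# query_status(image_id) -> status string, e.g. "New", "BUILDING", "COMPLETE".
import time

TERMINAL = {"COMPLETE", "FAILED"}  # assumed terminal states

def wait_for_images(image_ids, query_status, interval=30, timeout=3600):
    """Poll each target image until every build reaches a terminal state."""
    pending = set(image_ids)
    deadline = time.time() + timeout
    statuses = {}
    while pending and time.time() < deadline:
        for image in sorted(pending):
            statuses[image] = query_status(image)
        pending = {i for i in pending if statuses[i] not in TERMINAL}
        if pending:
            time.sleep(interval)
    return statuses
```

A loop like this also makes a genuine hang easy to spot: a target image that never leaves "New" or "BUILDING" before the timeout.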

[root@qeblade30 ~]# rpm -qa | grep aeolus
rubygem-aeolus-image-0.2.0-1.el6.noarch
aeolus-conductor-doc-0.7.0-4.el6.noarch
aeolus-configure-2.4.0-3.el6.noarch
aeolus-conductor-daemons-0.7.0-4.el6.noarch
rubygem-aeolus-cli-0.2.0-3.el6.noarch
aeolus-all-0.7.0-4.el6.noarch
aeolus-conductor-0.7.0-4.el6.noarch
[root@qeblade30 ~]# rpm -qa | grep imagefactory
imagefactory-jeosconf-ec2-rhel-1.0.0rc1-1.el6.noarch
rubygem-imagefactory-console-0.4.0-1.el6.noarch
imagefactory-jeosconf-ec2-fedora-1.0.0rc1-1.el6.noarch
imagefactory-1.0.0rc1-1.el6.noarch
[root@qeblade30 ~]# cat /etc/redhat-release 
Red Hat Enterprise Linux Server release 6.1 (Santiago)

Comment 4 wes hayutin 2011-12-07 23:12:58 UTC
Created attachment 542251 [details]
imagefactory.log

Comment 5 wes hayutin 2011-12-07 23:14:46 UTC
[root@qeblade30 ~]# cat /etc/imagefactory/imagefactory.conf 
{
  "warehouse": "http://localhost:9090/",
  "warehouse_key": "fHdCl5EaZ77m/d/NbaYOM5uLsxRVaetw",
  "warehouse_secret": "o4cUFqg1W/FnJHWArfLZthRO1yneBgyx",
  "image_bucket": "images",
  "build_bucket": "builds",
  "target_bucket": "target_images",
  "template_bucket": "templates",
  "icicle_bucket": "icicles",
  "provider_bucket": "provider_images",
  "imgdir": "/home/var/lib/imagefactory/images",
  "ec2_build_style": "snapshot",
  "ec2_ami_type": "s3",
  "rhevm_image_format": "qcow2",
  "clients": {
    "hyZuS/krsNmIrNXjs+zfQh/rcJwQfDPY": "zocKHXIBXZ+jMhcGkwltBG56c6nwlKob"
    },
  "proxy_ami_id": "ami-id",
  "max_concurrent_local_sessions": 1,
  "max_concurrent_ec2_sessions": 1
}
[root@qeblade30 ~]# cat /etc/oz/oz.cfg 
[paths]
output_dir = /home/var/lib/libvirt/images
data_dir = /home/var/lib/oz
screenshot_dir = .

[libvirt]
uri = qemu:///system
# type = kvm
# bridge_name = virbr0

[cache]
original_media = yes
modified_media = no
jeos = no
[root@qeblade30 ~]# cat /etc/iwhd/
conf.js   users.js  
[root@qeblade30 ~]# cat /etc/iwhd/conf.js 
[
  {
    "name": "primary",
    "type": "fs",
    "path": "/var/lib/iwhd"
  }
]
[root@qeblade30 ~]#
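One detail worth pulling out of the config above: the `max_concurrent_*` caps are both 1, which (assumption) would serialize builds and could make several simultaneous target builds look stalled while they queue. A minimal sketch for reading those caps from the JSON config:

```python
# Sketch (not an imagefactory API): extract the concurrency caps from an
# imagefactory.conf-style JSON document. The inline CONF is a trimmed copy
# of the relevant keys from the config shown above.
import json

CONF = """{
  "ec2_build_style": "snapshot",
  "max_concurrent_local_sessions": 1,
  "max_concurrent_ec2_sessions": 1
}"""

conf = json.loads(CONF)
limits = {k: v for k, v in conf.items() if k.startswith("max_concurrent")}
```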

Comment 6 wes hayutin 2011-12-07 23:16:27 UTC
If the two recreates have different settings, or Steve, if your RPMs are newer, please reopen the bug.

If not, please try upgrading your RPMs.
Thanks

