Red Hat Bugzilla – Attachment 746471 Details for Bug 962031
IOError: [Errno 30] Read-only file system: '/mnt/sysimage/anaconda-yum.yumtx'
File: anaconda-tb

Description: File: anaconda-tb
Filename: anaconda-tb
MIME Type: text/plain
Creator: Reartes Guillermo
Created: 2013-05-11 03:55:43 UTC
Size: 1.97 MB
anaconda 19.25-1 exception report
Traceback (most recent call first):
  File "/usr/lib/python2.7/site-packages/yum/__init__.py", line 6501, in save_ts
    f = open(filename, 'w')
  File "/usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py", line 1532, in install
    self._yum.save_ts(ts_file)
  File "/usr/lib64/python2.7/site-packages/pyanaconda/install.py", line 137, in doInstall
    payload.install()
  File "/usr/lib64/python2.7/threading.py", line 766, in run
    self.__target(*self.__args, **self.__kwargs)
  File "/usr/lib64/python2.7/site-packages/pyanaconda/threads.py", line 168, in run
    threading.Thread.run(self, *args, **kwargs)
IOError: [Errno 30] Read-only file system: '/mnt/sysimage/anaconda-yum.yumtx'

Local variables in innermost frame:
auto: False
self: <yum.YumBase object at 0x7fae0c03ca10>
filename: /mnt/sysimage/anaconda-yum.yumtx


Anaconda instance, containing members:
_instClass: DefaultInstall instance, containing members:
_intf: GraphicalUserInterface instance, containing members:
 _intf.instclass: Already dumped (DefaultInstall instance)
 _intf._mehInterface: GraphicalExceptionHandlingIface instance, containing members:
 _intf._mehInterface._lightbox_func: <bound method GraphicalUserInterface.lightbox_over_current_action of <pyanaconda.ui.gui.GraphicalUserInterface object at 0x7fae0fbdff50>>
 _intf._quitDialog: <class 'pyanaconda.ui.gui.QuitDialog'>
 _intf.data: #version=DEVEL
# System authorization information
auth --enableshadow --passalgo=sha512
# Use CDROM installation media
cdrom

# Run the Setup Agent on first boot
firstboot --enable
ignoredisk --only-use=sda,sdd,sdc,sdb
# Keyboard layouts
keyboard --xlayouts='es'
# System language
lang en_US.UTF-8

# Network information
network --bootproto=dhcp --device=eth0 --onboot=off --ipv6=auto --activate
network --hostname=localhost.localdomain
# Root password
rootpw --iscrypted $6$e6oq3OibmAR0vwcR$0WWG.Wo5jVx3RYbRgaIIBfclAOyoTPes/hHiC7FDGH.38klJDMaJFbSrFAKWggCWzChD9.V/v7bYRx5xr5Njo/
# System timezone
timezone America/Noronha --isUtc
# System bootloader configuration
bootloader --location=mbr --boot-drive=sda
# Partition clearing information
clearpart --none --initlabel --drives=sda,sdd,sdc,sdb

%packages
@core

%end


 _intf.storage: Blivet instance, containing members:
 _intf.storage.clearPartChoice: None
 _intf.storage.eddDict: {'sda': 128}
 _intf.storage.dasd: DASD instance, containing members:
 _intf.storage.dasd.dasdfmt: /sbin/dasdfmt
 _intf.storage.dasd.commonArgv: [-y, -d, cdl, -b, 4096]
 _intf.storage.dasd.started: True
 _intf.storage.dasd.totalCylinders: 0
 _intf.storage.dasd._maxFormatJobs: 0
 _intf.storage.dasd._devices: []
 _intf.storage.dasd._completedCylinders: 0.0
 _intf.storage.dasd._dasdlist: []
 _intf.storage.roots: [Root instance, containing members:
 mounts: {'/boot': BTRFSSubVolumeDevice instance, containing members:
 mounts.major: 0
 mounts._partedDevice: None
 mounts.exists: False
 mounts.req_size: None
 mounts._size: 0
 mounts.id: 6
 mounts.controllable: True
 mounts.uuid: None
 mounts._format: existing None
 mounts.parents: [BTRFSVolumeDevice instance, containing members:
 major: 0
 _partedDevice: None
 exists: False
 req_size: None
 _size: 0
 id: 5
 controllable: True
 dataLevel: None
 uuid: 852bfcd3-84c3-4cb0-92cc-787d2f56d51c
 _format: existing None
 parents: [non-existent 9958MB partition sda2 (4)
, non-existent 9958MB partition sdd2 (10)
, non-existent 9958MB partition sdc2 (13)
, non-existent 9958MB partition sdb2 (16)
]
 subvolumes: []
 minor: 0
 size_policy: 9958.0
 fstabComment: Skipped
 bus: Skipped
 deviceLinks: []
 _targetSize: 0
 sysfsPath: Skipped
 _model: Skipped
 metaDataLevel: None
 kids: 0
 _vendor: Skipped
 _name: btrfs.5
 protected: False
 originalFormat: BTRFS instance, containing members:
 originalFormat.uuid: None
originalFormat.exists: True > originalFormat._mountpoint: None > originalFormat._size: 0.0 > originalFormat._majorminor: None > originalFormat._mountType: btrfs > originalFormat.fsprofile: None > originalFormat.label: fedora_dhcppc0 > originalFormat._targetSize: 0.0 > originalFormat.volUUID: 852bfcd3-84c3-4cb0-92cc-787d2f56d51c > originalFormat._minInstanceSize: None > originalFormat.mountopts: subvolid=0 > originalFormat.mountpoint: None > originalFormat._device: /dev/sda2 > _serial: None >] > mounts.sysfsPath: Skipped > mounts.minor: 0 > mounts.fstabComment: Skipped > mounts.bus: Skipped > mounts.deviceLinks: [] > mounts._targetSize: 0 > mounts.vol_id: 256 > mounts._model: Skipped > mounts.kids: 0 > mounts._vendor: Skipped > mounts._name: boot > mounts.protected: False > mounts.originalFormat: BTRFS instance, containing members: > mounts.originalFormat.uuid: None > mounts.originalFormat.exists: True > mounts.originalFormat._mountpoint: None > mounts.originalFormat._size: 0.0 > mounts.originalFormat._majorminor: None > mounts.originalFormat._mountType: btrfs > mounts.originalFormat.fsprofile: None > mounts.originalFormat.label: None > mounts.originalFormat._targetSize: 0.0 > mounts.originalFormat.volUUID: None > mounts.originalFormat._minInstanceSize: None > mounts.originalFormat.mountopts: subvol=boot > mounts.originalFormat.mountpoint: None > mounts.originalFormat._device: /dev/sda2 > mounts._serial: None >, '/': BTRFSSubVolumeDevice instance, containing members: > mounts.major: 0 > mounts._partedDevice: None > mounts.exists: False > mounts.req_size: None > mounts._size: 0 > mounts.id: 7 > mounts.controllable: True > mounts.uuid: None > mounts._format: existing None > mounts.parents: [Already dumped (BTRFSVolumeDevice instance) >] > mounts.sysfsPath: Skipped > mounts.minor: 0 > mounts.fstabComment: Skipped > mounts.bus: Skipped > mounts.deviceLinks: [] > mounts._targetSize: 0 > mounts.vol_id: 259 > mounts._model: Skipped > mounts.kids: 0 > mounts._vendor: 
Skipped > mounts._name: root > mounts.protected: False > mounts.originalFormat: BTRFS instance, containing members: > mounts.originalFormat.uuid: None > mounts.originalFormat.exists: True > mounts.originalFormat._mountpoint: None > mounts.originalFormat._size: 0.0 > mounts.originalFormat._majorminor: None > mounts.originalFormat._mountType: btrfs > mounts.originalFormat.fsprofile: None > mounts.originalFormat.label: None > mounts.originalFormat._targetSize: 0.0 > mounts.originalFormat.volUUID: None > mounts.originalFormat._minInstanceSize: None > mounts.originalFormat.mountopts: subvol=root > mounts.originalFormat.mountpoint: None > mounts.originalFormat._device: /dev/sda2 > mounts._serial: None >} > swaps: [non-existent 2040MB mdarray dhcppc0:swap (3) >] > name: Fedora Linux 19 for x86_64 >] > _intf.storage.zfcp: ZFCP instance, containing members: > _intf.storage.zfcp.down: False > _intf.storage.zfcp.hasReadConfig: True > _intf.storage.zfcp.intf: None > _intf.storage.zfcp.fcpdevs: set([]) > _intf.storage._defaultFSType: ext4 > _intf.storage.autoPartEscrowCert: None > _intf.storage.iscsi: iscsi instance, containing members: > _intf.storage.iscsi.initiatorSet: False > _intf.storage.iscsi.ifaces: {} > _intf.storage.iscsi.started: False > _intf.storage.iscsi._initiator: Skipped > _intf.storage.iscsi.discovered_targets: {} > _intf.storage.iscsi.ibftNodes: [] > _intf.storage.escrowCertificates: {} > _intf.storage.fsset: FSSet instance, containing members: > _intf.storage.fsset.origFStab: None > _intf.storage.fsset._usb: NoDevice instance, containing members: > _intf.storage.fsset._usb.major: 0 > _intf.storage.fsset._usb._partedDevice: None > _intf.storage.fsset._usb.exists: True > _intf.storage.fsset._usb._size: 0 > _intf.storage.fsset._usb.id: 48 > _intf.storage.fsset._usb.controllable: True > _intf.storage.fsset._usb.uuid: None > _intf.storage.fsset._usb._format: USBFS instance, containing members: > _intf.storage.fsset._usb._format.uuid: None > 
_intf.storage.fsset._usb._format.exists: True > _intf.storage.fsset._usb._format._mountpoint: None > _intf.storage.fsset._usb._format._majorminor: None > _intf.storage.fsset._usb._format._minInstanceSize: None > _intf.storage.fsset._usb._format.fsprofile: None > _intf.storage.fsset._usb._format.label: None > _intf.storage.fsset._usb._format._targetSize: 0 > _intf.storage.fsset._usb._format._size: 0 > _intf.storage.fsset._usb._format.mountopts: None > _intf.storage.fsset._usb._format.mountpoint: /proc/bus/usb > _intf.storage.fsset._usb._format._device: usbfs > _intf.storage.fsset._usb.parents: [] > _intf.storage.fsset._usb.deviceLinks: [] > _intf.storage.fsset._usb.minor: 0 > _intf.storage.fsset._usb.fstabComment: Skipped > _intf.storage.fsset._usb.bus: Skipped > _intf.storage.fsset._usb.sysfsPath: Skipped > _intf.storage.fsset._usb._targetSize: 0 > _intf.storage.fsset._usb._model: Skipped > _intf.storage.fsset._usb.kids: 0 > _intf.storage.fsset._usb._vendor: Skipped > _intf.storage.fsset._usb._name: usbfs > _intf.storage.fsset._usb.protected: False > _intf.storage.fsset._usb.originalFormat: USBFS instance, containing members: > _intf.storage.fsset._usb.originalFormat.uuid: None > _intf.storage.fsset._usb.originalFormat.exists: True > _intf.storage.fsset._usb.originalFormat._mountpoint: None > _intf.storage.fsset._usb.originalFormat.mountpoint: /proc/bus/usb > _intf.storage.fsset._usb.originalFormat._majorminor: None > _intf.storage.fsset._usb.originalFormat.fsprofile: None > _intf.storage.fsset._usb.originalFormat.label: None > _intf.storage.fsset._usb.originalFormat._targetSize: 0 > _intf.storage.fsset._usb.originalFormat._minInstanceSize: None > _intf.storage.fsset._usb.originalFormat.mountopts: None > _intf.storage.fsset._usb.originalFormat._size: 0 > _intf.storage.fsset._usb.originalFormat._device: usbfs > _intf.storage.fsset._usb._serial: None > _intf.storage.fsset.devicetree: DeviceTree instance, containing members: > _intf.storage.fsset.devicetree.dasd: 
Already dumped (DASD instance) > _intf.storage.fsset.devicetree.populated: True > _intf.storage.fsset.devicetree.exclusiveDisks: [] > _intf.storage.fsset.devicetree._actions: [] > _intf.storage.fsset.devicetree.iscsi: Already dumped (iscsi instance) > _intf.storage.fsset.devicetree._cleanup: False > _intf.storage.fsset.devicetree._devices: [OpticalDevice instance, containing members: > major: 11 > _partedDevice: parted.Device instance -- > model: QEMU QEMU DVD-ROM path: /dev/sr0 type: 1 > sectorSize: 2048 physicalSectorSize: 2048 > length: 2347520 openCount: 0 readOnly: True > externalMode: False dirty: False bootDirty: False > host: 2 did: 0 busy: True > hardwareGeometry: (146, 255, 63) biosGeometry: (146, 255, 63) > PedDevice: <_ped.Device object at 0x7fae127ea9e0> > exists: True > _size: 4585.0 > id: 0 > controllable: True > uuid: None > _format: Iso9660FS instance, containing members: > _format.uuid: 2013-05-10-11-54-01-00 > _format.exists: True > _format._mountpoint: None > _format._majorminor: 011000 > _format._mountType: iso9660 > _format.fsprofile: None > _format.label: Fedora_19-Beta-TC4_x86_64 > _format._targetSize: 0.0 > _format._device: /dev/sr0 > _format._minInstanceSize: None > _format.mountopts: None > _format.mountpoint: None > _format._size: 0.0 > parents: [] > sysfsPath: /devices/pci0000:00/0000:00:01.1/ata2/host1/target1:0:0/1:0:0:0/block/sr0 > minor: 0 > fstabComment: Skipped > bus: Skipped > deviceLinks: [/dev/cdrom, /dev/disk/by-id/ata-QEMU_DVD-ROM_QM00003, /dev/disk/by-label/Fedora\x2019-Beta-TC4\x20x86_64, /dev/disk/by-uuid/2013-05-10-11-54-01-00] > _targetSize: 0 > _model: QEMU_DVD-ROM > kids: 0 > _vendor: None > _name: sr0 > protected: False > originalFormat: Iso9660FS instance, containing members: > originalFormat.uuid: 2013-05-10-11-54-01-00 > originalFormat.exists: True > originalFormat._mountpoint: None > originalFormat.mountpoint: None > originalFormat._majorminor: None > originalFormat.fsprofile: None > originalFormat.label: 
Fedora_19-Beta-TC4_x86_64 > originalFormat._targetSize: 0.0 > originalFormat._minInstanceSize: None > originalFormat.mountopts: None > originalFormat._size: 0.0 > originalFormat._device: /dev/sr0 > _serial: None >, existing 12000MB disk sda (1) with existing msdos disklabel >, existing 12000MB disk sdd (8) with existing msdos disklabel >, existing 12000MB disk sdc (11) with existing msdos disklabel >, existing 12000MB disk sdb (14) with existing msdos disklabel >, FileDevice instance, containing members: > major: 0 > _partedDevice: None > exists: True > _size: 0 > id: 17 > controllable: False > uuid: None > _format: existing None > parents: [] > sysfsPath: Skipped > minor: 0 > fstabComment: Skipped > bus: Skipped > deviceLinks: [] > _targetSize: 0 > _model: Skipped > kids: 1 > _vendor: Skipped > _name: /run/install/repo/LiveOS/squashfs.img > protected: False > originalFormat: existing None > _serial: None >, LoopDevice instance, containing members: > major: 0 > _partedDevice: None > exists: True > _size: 0 > id: 18 > controllable: False > uuid: None > _format: existing squashfs > parents: [Already dumped (FileDevice instance) >] > sysfsPath: Skipped > minor: 0 > fstabComment: Skipped > bus: Skipped > deviceLinks: [] > _targetSize: 0 > _model: Skipped > kids: 0 > _vendor: Skipped > _name: loop0 > protected: False > originalFormat: existing squashfs > _serial: None >, FileDevice instance, containing members: > major: 0 > _partedDevice: None > exists: True > _size: 0 > id: 19 > controllable: False > uuid: None > _format: existing None > parents: [] > sysfsPath: Skipped > minor: 0 > fstabComment: Skipped > bus: Skipped > deviceLinks: [] > _targetSize: 0 > _model: Skipped > kids: 1 > _vendor: Skipped > _name: /LiveOS/rootfs.img > protected: False > originalFormat: existing None > _serial: None >, LoopDevice instance, containing members: > major: 0 > _partedDevice: None > exists: True > _size: 0 > id: 20 > controllable: False > uuid: None > _format: Ext4FS instance, 
containing members: > _format.errors: False > _format.uuid: 932a9ea8-7790-43fd-a10c-20d783f65a9d > _format.exists: True > _format._mountpoint: None > _format._majorminor: None > _format._mountType: ext4 > _format.fsprofile: None > _format.label: Anaconda > _format._targetSize: 1024.0 > _format.dirty: False > _format._minInstanceSize: 0.0 > _format.mountopts: None > _format.mountpoint: None > _format._device: /dev/loop1 > _format._size: 1024.0 > parents: [Already dumped (FileDevice instance) >] > sysfsPath: Skipped > minor: 0 > fstabComment: Skipped > bus: Skipped > deviceLinks: [/dev/disk/by-label/Anaconda, /dev/disk/by-uuid/932a9ea8-7790-43fd-a10c-20d783f65a9d] > _targetSize: 0 > _model: Skipped > kids: 0 > _vendor: Skipped > _name: loop1 > protected: False > originalFormat: Ext4FS instance, containing members: > originalFormat.errors: False > originalFormat.uuid: 932a9ea8-7790-43fd-a10c-20d783f65a9d > originalFormat.exists: True > originalFormat._mountpoint: None > originalFormat._size: 1024.0 > originalFormat._majorminor: None > originalFormat._mountType: ext4 > originalFormat.fsprofile: None > originalFormat.label: Anaconda > originalFormat._targetSize: 1024.0 > originalFormat.dirty: False > originalFormat._minInstanceSize: 0.0 > originalFormat.mountopts: None > originalFormat.mountpoint: None > originalFormat._device: /dev/loop1 > _serial: None >, FileDevice instance, containing members: > major: 0 > _partedDevice: None > exists: True > _size: 0 > id: 21 > controllable: False > uuid: None > _format: existing None > parents: [] > sysfsPath: Skipped > minor: 0 > fstabComment: Skipped > bus: Skipped > deviceLinks: [] > _targetSize: 0 > _model: Skipped > kids: 1 > _vendor: Skipped > _name: /overlay (deleted) > protected: False > originalFormat: existing None > _serial: None >, LoopDevice instance, containing members: > major: 0 > _partedDevice: None > exists: True > _size: 0 > id: 22 > controllable: False > uuid: None > _format: existing DM_snapshot_cow > parents: 
[Already dumped (FileDevice instance) >] > sysfsPath: Skipped > minor: 0 > fstabComment: Skipped > bus: Skipped > deviceLinks: [] > _targetSize: 0 > _model: Skipped > kids: 1 > _vendor: Skipped > _name: loop2 > protected: False > originalFormat: existing DM_snapshot_cow > _serial: None >, existing 1024MB dm live-rw (23) with existing ext4 filesystem >, existing 384MB partition sda3 (25) with non-existent mdmember >, existing 384MB partition sdb3 (26) with non-existent mdmember >, existing 384MB partition sdc3 (27) with non-existent mdmember >, existing 384MB partition sdd3 (28) with non-existent mdmember >, existing 767MB mdarray swap (29) with existing swap >, existing 512MB partition sda2 (31) with non-existent mdmember >, existing 512MB partition sdb2 (32) with non-existent mdmember >, existing 512MB partition sdc2 (33) with non-existent mdmember >, existing 512MB partition sdd2 (34) with non-existent mdmember >, existing 511MB mdarray boot (35) with existing ext4 filesystem mounted at /boot >, existing 3004MB partition sda1 (37) with non-existent mdmember >, existing 3004MB partition sdb1 (38) with non-existent mdmember >, existing 3004MB partition sdc1 (39) with non-existent mdmember >, existing 3004MB partition sdd1 (40) with non-existent mdmember >, existing 6003MB mdarray root (41) with existing ext4 filesystem mounted at / >] > _intf.storage.fsset.devicetree.ignoredDisks: [loop3, loop4, loop5, loop6, loop7] > _intf.storage.fsset.devicetree._completed_actions: [ActionDestroyFormat instance, containing members: > device: Already dumped (BTRFSSubVolumeDevice instance) > origFormat: BTRFS instance, containing members: > origFormat.uuid: None > origFormat.exists: True > origFormat._mountpoint: None > origFormat._majorminor: 008002 > origFormat._mountType: btrfs > origFormat.fsprofile: None > origFormat.label: None > origFormat._targetSize: 0.0 > origFormat.volUUID: None > origFormat._minInstanceSize: None > origFormat.mountopts: subvol=boot > 
origFormat.mountpoint: None > origFormat._device: /dev/sda2 > origFormat._size: 0.0 > id: 0 >, ActionDestroyDevice instance, containing members: > device: Already dumped (BTRFSSubVolumeDevice instance) > id: 1 >, ActionDestroyFormat instance, containing members: > device: Already dumped (BTRFSSubVolumeDevice instance) > origFormat: BTRFS instance, containing members: > origFormat.uuid: None > origFormat.exists: True > origFormat._mountpoint: None > origFormat._majorminor: 008002 > origFormat._mountType: btrfs > origFormat.fsprofile: None > origFormat.label: None > origFormat._targetSize: 0.0 > origFormat.volUUID: None > origFormat._minInstanceSize: None > origFormat.mountopts: subvol=root > origFormat.mountpoint: None > origFormat._device: /dev/sda2 > origFormat._size: 0.0 > id: 2 >, ActionDestroyDevice instance, containing members: > device: Already dumped (BTRFSSubVolumeDevice instance) > id: 3 >, ActionDestroyFormat instance, containing members: > device: Already dumped (BTRFSVolumeDevice instance) > origFormat: BTRFS instance, containing members: > origFormat.uuid: None > origFormat.exists: True > origFormat._mountpoint: None > origFormat._majorminor: 008002 > origFormat._mountType: btrfs > origFormat.fsprofile: None > origFormat.label: fedora_dhcppc0 > origFormat._targetSize: 0.0 > origFormat.volUUID: 852bfcd3-84c3-4cb0-92cc-787d2f56d51c > origFormat._minInstanceSize: None > origFormat.mountopts: subvolid=0 > origFormat.mountpoint: None > origFormat._device: /dev/sda2 > origFormat._size: 0.0 > id: 4 >, ActionDestroyDevice instance, containing members: > device: Already dumped (BTRFSVolumeDevice instance) > id: 5 >, ActionDestroyFormat instance, containing members: > device: Already dumped (PartitionDevice instance) > origFormat: BTRFS instance, containing members: > origFormat.uuid: c2bf2d26-7177-4e7b-b298-6e134a95e913 > origFormat.exists: True > origFormat._mountpoint: None > origFormat._majorminor: 008002 > origFormat._mountType: btrfs > 
origFormat.fsprofile: None > origFormat.label: fedora_dhcppc0 > origFormat._targetSize: 0.0 > origFormat.volUUID: 852bfcd3-84c3-4cb0-92cc-787d2f56d51c > origFormat._minInstanceSize: None > origFormat.mountopts: None > origFormat.mountpoint: None > origFormat._device: /dev/sda2 > origFormat._size: 0.0 > id: 6 >, ActionDestroyDevice instance, containing members: > device: Already dumped (PartitionDevice instance) > id: 7 >, ActionDestroyFormat instance, containing members: > device: Already dumped (PartitionDevice instance) > origFormat: BTRFS instance, containing members: > origFormat.uuid: 4f0bfd9e-856d-44e8-81d1-e1ee467c09c7 > origFormat.exists: True > origFormat._mountpoint: None > origFormat._majorminor: 008050 > origFormat._mountType: btrfs > origFormat.fsprofile: None > origFormat.label: fedora_dhcppc0 > origFormat._targetSize: 0.0 > origFormat.volUUID: 852bfcd3-84c3-4cb0-92cc-787d2f56d51c > origFormat._minInstanceSize: None > origFormat.mountopts: None > origFormat.mountpoint: None > origFormat._device: /dev/sdd2 > origFormat._size: 0.0 > id: 8 >, ActionDestroyDevice instance, containing members: > device: Already dumped (PartitionDevice instance) > id: 9 >, ActionDestroyFormat instance, containing members: > device: Already dumped (PartitionDevice instance) > origFormat: BTRFS instance, containing members: > origFormat.uuid: f54da582-0e57-46d0-a99b-420d668f19e5 > origFormat.exists: True > origFormat._mountpoint: None > origFormat._majorminor: 008034 > origFormat._mountType: btrfs > origFormat.fsprofile: None > origFormat.label: fedora_dhcppc0 > origFormat._targetSize: 0.0 > origFormat.volUUID: 852bfcd3-84c3-4cb0-92cc-787d2f56d51c > origFormat._minInstanceSize: None > origFormat.mountopts: None > origFormat.mountpoint: None > origFormat._device: /dev/sdc2 > origFormat._size: 0.0 > id: 10 >, ActionDestroyDevice instance, containing members: > device: Already dumped (PartitionDevice instance) > id: 11 >, ActionDestroyFormat instance, containing members: > 
device: Already dumped (PartitionDevice instance) > origFormat: BTRFS instance, containing members: > origFormat.uuid: ecfc79bc-a494-44e4-b0e3-463ef6a52a1a > origFormat.exists: True > origFormat._mountpoint: None > origFormat._majorminor: 008018 > origFormat._mountType: btrfs > origFormat.fsprofile: None > origFormat.label: fedora_dhcppc0 > origFormat._targetSize: 0.0 > origFormat.volUUID: 852bfcd3-84c3-4cb0-92cc-787d2f56d51c > origFormat._minInstanceSize: None > origFormat.mountopts: None > origFormat.mountpoint: None > origFormat._device: /dev/sdb2 > origFormat._size: 0.0 > id: 12 >, ActionDestroyDevice instance, containing members: > device: Already dumped (PartitionDevice instance) > id: 13 >, ActionDestroyFormat instance, containing members: > device: Already dumped (MDRaidArrayDevice instance) > origFormat: non-existent swap > id: 14 >, ActionDestroyDevice instance, containing members: > device: Already dumped (MDRaidArrayDevice instance) > id: 15 >, ActionDestroyFormat instance, containing members: > device: non-existent 2041MB partition sda1 (2) > origFormat: non-existent mdmember > id: 16 >, ActionDestroyDevice instance, containing members: > device: Already dumped (PartitionDevice instance) > id: 17 >, ActionDestroyFormat instance, containing members: > device: Already dumped (DiskDevice instance) > origFormat: non-existent msdos disklabel > id: 18 >, ActionDestroyFormat instance, containing members: > device: non-existent 2041MB partition sdd1 (9) > origFormat: non-existent mdmember > id: 20 >, ActionDestroyDevice instance, containing members: > device: Already dumped (PartitionDevice instance) > id: 21 >, ActionDestroyFormat instance, containing members: > device: Already dumped (DiskDevice instance) > origFormat: non-existent msdos disklabel > id: 22 >, ActionDestroyFormat instance, containing members: > device: non-existent 2041MB partition sdc1 (12) > origFormat: non-existent mdmember > id: 24 >, ActionDestroyDevice instance, containing members: > 
device: Already dumped (PartitionDevice instance) > id: 25 >, ActionDestroyFormat instance, containing members: > device: Already dumped (DiskDevice instance) > origFormat: non-existent msdos disklabel > id: 26 >, ActionDestroyFormat instance, containing members: > device: non-existent 2041MB partition sdb1 (15) > origFormat: non-existent mdmember > id: 28 >, ActionDestroyDevice instance, containing members: > device: Already dumped (PartitionDevice instance) > id: 29 >, ActionDestroyFormat instance, containing members: > device: Already dumped (DiskDevice instance) > origFormat: non-existent msdos disklabel > id: 30 >, ActionCreateFormat instance, containing members: > device: Already dumped (DiskDevice instance) > origFormat: existing None > id: 19 >, ActionCreateDevice instance, containing members: > device: Already dumped (PartitionDevice instance) > id: 61 >, ActionCreateDevice instance, containing members: > device: Already dumped (PartitionDevice instance) > id: 48 >, ActionCreateDevice instance, containing members: > device: Already dumped (PartitionDevice instance) > id: 35 >, ActionCreateFormat instance, containing members: > device: Already dumped (PartitionDevice instance) > origFormat: non-existent None > id: 36 >, ActionCreateFormat instance, containing members: > device: Already dumped (PartitionDevice instance) > origFormat: non-existent None > id: 49 >, ActionCreateFormat instance, containing members: > device: Already dumped (PartitionDevice instance) > origFormat: non-existent None > id: 62 >, ActionCreateFormat instance, containing members: > device: Already dumped (DiskDevice instance) > origFormat: existing None > id: 23 >, ActionCreateDevice instance, containing members: > device: Already dumped (PartitionDevice instance) > id: 67 >, ActionCreateDevice instance, containing members: > device: Already dumped (PartitionDevice instance) > id: 54 >, ActionCreateDevice instance, containing members: > device: Already dumped (PartitionDevice 
instance) > id: 41 >, ActionCreateFormat instance, containing members: > device: Already dumped (PartitionDevice instance) > origFormat: non-existent None > id: 42 >, ActionCreateFormat instance, containing members: > device: Already dumped (PartitionDevice instance) > origFormat: non-existent None > id: 55 >, ActionCreateFormat instance, containing members: > device: Already dumped (PartitionDevice instance) > origFormat: non-existent None > id: 68 >, ActionCreateFormat instance, containing members: > device: Already dumped (DiskDevice instance) > origFormat: existing None > id: 27 >, ActionCreateDevice instance, containing members: > device: Already dumped (PartitionDevice instance) > id: 65 >, ActionCreateDevice instance, containing members: > device: Already dumped (PartitionDevice instance) > id: 52 >, ActionCreateDevice instance, containing members: > device: Already dumped (PartitionDevice instance) > id: 39 >, ActionCreateFormat instance, containing members: > device: Already dumped (PartitionDevice instance) > origFormat: non-existent None > id: 40 >, ActionCreateFormat instance, containing members: > device: Already dumped (PartitionDevice instance) > origFormat: non-existent None > id: 53 >, ActionCreateFormat instance, containing members: > device: Already dumped (PartitionDevice instance) > origFormat: non-existent None > id: 66 >, ActionCreateFormat instance, containing members: > device: Already dumped (DiskDevice instance) > origFormat: existing None > id: 31 >, ActionCreateDevice instance, containing members: > device: Already dumped (PartitionDevice instance) > id: 63 >, ActionCreateDevice instance, containing members: > device: Already dumped (PartitionDevice instance) > id: 50 >, ActionCreateDevice instance, containing members: > device: Already dumped (PartitionDevice instance) > id: 37 >, ActionCreateFormat instance, containing members: > device: Already dumped (PartitionDevice instance) > origFormat: non-existent None > id: 38 >, 
ActionCreateDevice instance, containing members: > device: Already dumped (MDRaidArrayDevice instance) > id: 43 >, ActionCreateFormat instance, containing members: > device: Already dumped (MDRaidArrayDevice instance) > origFormat: non-existent None > id: 44 >, ActionCreateFormat instance, containing members: > device: Already dumped (PartitionDevice instance) > origFormat: non-existent None > id: 51 >, ActionCreateDevice instance, containing members: > device: Already dumped (MDRaidArrayDevice instance) > id: 56 >, ActionCreateFormat instance, containing members: > device: Already dumped (MDRaidArrayDevice instance) > origFormat: non-existent None > id: 57 >, ActionCreateFormat instance, containing members: > device: Already dumped (PartitionDevice instance) > origFormat: non-existent None > id: 64 >, ActionCreateDevice instance, containing members: > device: Already dumped (MDRaidArrayDevice instance) > id: 69 >, ActionCreateFormat instance, containing members: > device: Already dumped (MDRaidArrayDevice instance) > origFormat: non-existent None > id: 70 >] > _intf.storage.fsset.devicetree.names: [sr0, sda, sdd, sdc, sdb, loop0, /run/install/repo/LiveOS/squashfs.img, loop1, /LiveOS/rootfs.img, loop2, /overlay (deleted), loop3, loop4, loop5, loop6, loop7, live-rw, md127, swap, boot, root] > _intf.storage.fsset.devicetree.liveBackingDevice: None > _intf.storage.fsset.devicetree.protectedDevNames: [] > _intf.storage.fsset.devicetree.unusedRaidMembers: [] > _intf.storage.fsset.devicetree.diskImages: {} > _intf.storage.fsset.devicetree._hidden: [] > _intf.storage.fsset.devicetree.protectedDevSpecs: [LABEL=Fedorax2019-Beta-TC4x20x86_64] > _intf.storage.fsset.preserveLines: [] > _intf.storage.fsset._run: DirectoryDevice instance, containing members: > _intf.storage.fsset._run.major: 0 > _intf.storage.fsset._run._partedDevice: None > _intf.storage.fsset._run.exists: True > _intf.storage.fsset._run._size: 0 > _intf.storage.fsset._run.id: 49 > 
_intf.storage.fsset._run.controllable: True > _intf.storage.fsset._run.uuid: None > _intf.storage.fsset._run._format: BindFS instance, containing members: > _intf.storage.fsset._run._format.uuid: None > _intf.storage.fsset._run._format.exists: True > _intf.storage.fsset._run._format._mountpoint: /mnt/sysimage/run > _intf.storage.fsset._run._format._majorminor: None > _intf.storage.fsset._run._format._minInstanceSize: None > _intf.storage.fsset._run._format._mountType: bind > _intf.storage.fsset._run._format.fsprofile: None > _intf.storage.fsset._run._format.label: None > _intf.storage.fsset._run._format._targetSize: None > _intf.storage.fsset._run._format._size: None > _intf.storage.fsset._run._format.mountopts: None > _intf.storage.fsset._run._format.mountpoint: /run > _intf.storage.fsset._run._format._device: /run > _intf.storage.fsset._run.parents: [] > _intf.storage.fsset._run.deviceLinks: [] > _intf.storage.fsset._run.minor: 0 > _intf.storage.fsset._run.fstabComment: Skipped > _intf.storage.fsset._run.bus: Skipped > _intf.storage.fsset._run.sysfsPath: Skipped > _intf.storage.fsset._run._targetSize: 0 > _intf.storage.fsset._run._model: Skipped > _intf.storage.fsset._run.kids: 0 > _intf.storage.fsset._run._vendor: Skipped > _intf.storage.fsset._run._name: /run > _intf.storage.fsset._run.protected: False > _intf.storage.fsset._run.originalFormat: BindFS instance, containing members: > _intf.storage.fsset._run.originalFormat.uuid: None > _intf.storage.fsset._run.originalFormat.exists: True > _intf.storage.fsset._run.originalFormat._mountpoint: None > _intf.storage.fsset._run.originalFormat.mountpoint: /run > _intf.storage.fsset._run.originalFormat._majorminor: None > _intf.storage.fsset._run.originalFormat.fsprofile: None > _intf.storage.fsset._run.originalFormat.label: None > _intf.storage.fsset._run.originalFormat._targetSize: None > _intf.storage.fsset._run.originalFormat._minInstanceSize: None > _intf.storage.fsset._run.originalFormat.mountopts: None > 
_intf.storage.fsset._run.originalFormat._size: None > _intf.storage.fsset._run.originalFormat._device: /run > _intf.storage.fsset._run._serial: None > _intf.storage.fsset._devshm: NoDevice instance, containing members: > _intf.storage.fsset._devshm.major: 0 > _intf.storage.fsset._devshm._partedDevice: None > _intf.storage.fsset._devshm.exists: True > _intf.storage.fsset._devshm._size: 0 > _intf.storage.fsset._devshm.id: 43 > _intf.storage.fsset._devshm.controllable: True > _intf.storage.fsset._devshm.uuid: None > _intf.storage.fsset._devshm._format: TmpFS instance, containing members: > _intf.storage.fsset._devshm._format.uuid: None > _intf.storage.fsset._devshm._format.exists: True > _intf.storage.fsset._devshm._format._mountpoint: /mnt/sysimage/dev/shm > _intf.storage.fsset._devshm._format._majorminor: None > _intf.storage.fsset._devshm._format._minInstanceSize: None > _intf.storage.fsset._devshm._format.fsprofile: None > _intf.storage.fsset._devshm._format.label: None > _intf.storage.fsset._devshm._format._targetSize: 0 > _intf.storage.fsset._devshm._format._size: 0 > _intf.storage.fsset._devshm._format.mountopts: None > _intf.storage.fsset._devshm._format.mountpoint: /dev/shm > _intf.storage.fsset._devshm._format._device: tmpfs > _intf.storage.fsset._devshm.parents: [] > _intf.storage.fsset._devshm.deviceLinks: [] > _intf.storage.fsset._devshm.minor: 0 > _intf.storage.fsset._devshm.fstabComment: Skipped > _intf.storage.fsset._devshm.bus: Skipped > _intf.storage.fsset._devshm.sysfsPath: Skipped > _intf.storage.fsset._devshm._targetSize: 0 > _intf.storage.fsset._devshm._model: Skipped > _intf.storage.fsset._devshm.kids: 0 > _intf.storage.fsset._devshm._vendor: Skipped > _intf.storage.fsset._devshm._name: tmpfs > _intf.storage.fsset._devshm.protected: False > _intf.storage.fsset._devshm.originalFormat: TmpFS instance, containing members: > _intf.storage.fsset._devshm.originalFormat.uuid: None > _intf.storage.fsset._devshm.originalFormat.exists: True > 
_intf.storage.fsset._devshm.originalFormat._mountpoint: None > _intf.storage.fsset._devshm.originalFormat.mountpoint: /dev/shm > _intf.storage.fsset._devshm.originalFormat._majorminor: None > _intf.storage.fsset._devshm.originalFormat.fsprofile: None > _intf.storage.fsset._devshm.originalFormat.label: None > _intf.storage.fsset._devshm.originalFormat._targetSize: 0 > _intf.storage.fsset._devshm.originalFormat._minInstanceSize: None > _intf.storage.fsset._devshm.originalFormat.mountopts: None > _intf.storage.fsset._devshm.originalFormat._size: 0 > _intf.storage.fsset._devshm.originalFormat._device: tmpfs > _intf.storage.fsset._devshm._serial: None > _intf.storage.fsset._dev: DirectoryDevice instance, containing members: > _intf.storage.fsset._dev.major: 0 > _intf.storage.fsset._dev._partedDevice: None > _intf.storage.fsset._dev.exists: True > _intf.storage.fsset._dev._size: 0 > _intf.storage.fsset._dev.id: 42 > _intf.storage.fsset._dev.controllable: True > _intf.storage.fsset._dev.uuid: None > _intf.storage.fsset._dev._format: BindFS instance, containing members: > _intf.storage.fsset._dev._format.uuid: None > _intf.storage.fsset._dev._format.exists: True > _intf.storage.fsset._dev._format._mountpoint: /mnt/sysimage/dev > _intf.storage.fsset._dev._format._majorminor: None > _intf.storage.fsset._dev._format._minInstanceSize: None > _intf.storage.fsset._dev._format._mountType: bind > _intf.storage.fsset._dev._format.fsprofile: None > _intf.storage.fsset._dev._format.label: None > _intf.storage.fsset._dev._format._targetSize: None > _intf.storage.fsset._dev._format._size: None > _intf.storage.fsset._dev._format.mountopts: None > _intf.storage.fsset._dev._format.mountpoint: /dev > _intf.storage.fsset._dev._format._device: /dev > _intf.storage.fsset._dev.parents: [] > _intf.storage.fsset._dev.deviceLinks: [] > _intf.storage.fsset._dev.minor: 0 > _intf.storage.fsset._dev.fstabComment: Skipped > _intf.storage.fsset._dev.bus: Skipped > _intf.storage.fsset._dev.sysfsPath: 
Skipped > _intf.storage.fsset._dev._targetSize: 0 > _intf.storage.fsset._dev._model: Skipped > _intf.storage.fsset._dev.kids: 0 > _intf.storage.fsset._dev._vendor: Skipped > _intf.storage.fsset._dev._name: /dev > _intf.storage.fsset._dev.protected: False > _intf.storage.fsset._dev.originalFormat: BindFS instance, containing members: > _intf.storage.fsset._dev.originalFormat.uuid: None > _intf.storage.fsset._dev.originalFormat.exists: True > _intf.storage.fsset._dev.originalFormat._mountpoint: None > _intf.storage.fsset._dev.originalFormat.mountpoint: /dev > _intf.storage.fsset._dev.originalFormat._majorminor: None > _intf.storage.fsset._dev.originalFormat.fsprofile: None > _intf.storage.fsset._dev.originalFormat.label: None > _intf.storage.fsset._dev.originalFormat._targetSize: None > _intf.storage.fsset._dev.originalFormat._minInstanceSize: None > _intf.storage.fsset._dev.originalFormat.mountopts: None > _intf.storage.fsset._dev.originalFormat._size: None > _intf.storage.fsset._dev.originalFormat._device: /dev > _intf.storage.fsset._dev._serial: None > _intf.storage.fsset.blkidTab: None > _intf.storage.fsset._proc: NoDevice instance, containing members: > _intf.storage.fsset._proc.major: 0 > _intf.storage.fsset._proc._partedDevice: None > _intf.storage.fsset._proc.exists: True > _intf.storage.fsset._proc._size: 0 > _intf.storage.fsset._proc.id: 46 > _intf.storage.fsset._proc.controllable: True > _intf.storage.fsset._proc.uuid: None > _intf.storage.fsset._proc._format: ProcFS instance, containing members: > _intf.storage.fsset._proc._format.uuid: None > _intf.storage.fsset._proc._format.exists: True > _intf.storage.fsset._proc._format._mountpoint: /mnt/sysimage/proc > _intf.storage.fsset._proc._format._majorminor: None > _intf.storage.fsset._proc._format._minInstanceSize: None > _intf.storage.fsset._proc._format.fsprofile: None > _intf.storage.fsset._proc._format.label: None > _intf.storage.fsset._proc._format._targetSize: 0 > 
_intf.storage.fsset._proc._format._size: 0 > _intf.storage.fsset._proc._format.mountopts: None > _intf.storage.fsset._proc._format.mountpoint: /proc > _intf.storage.fsset._proc._format._device: proc > _intf.storage.fsset._proc.parents: [] > _intf.storage.fsset._proc.deviceLinks: [] > _intf.storage.fsset._proc.minor: 0 > _intf.storage.fsset._proc.fstabComment: Skipped > _intf.storage.fsset._proc.bus: Skipped > _intf.storage.fsset._proc.sysfsPath: Skipped > _intf.storage.fsset._proc._targetSize: 0 > _intf.storage.fsset._proc._model: Skipped > _intf.storage.fsset._proc.kids: 0 > _intf.storage.fsset._proc._vendor: Skipped > _intf.storage.fsset._proc._name: proc > _intf.storage.fsset._proc.protected: False > _intf.storage.fsset._proc.originalFormat: ProcFS instance, containing members: > _intf.storage.fsset._proc.originalFormat.uuid: None > _intf.storage.fsset._proc.originalFormat.exists: True > _intf.storage.fsset._proc.originalFormat._mountpoint: None > _intf.storage.fsset._proc.originalFormat.mountpoint: /proc > _intf.storage.fsset._proc.originalFormat._majorminor: None > _intf.storage.fsset._proc.originalFormat.fsprofile: None > _intf.storage.fsset._proc.originalFormat.label: None > _intf.storage.fsset._proc.originalFormat._targetSize: 0 > _intf.storage.fsset._proc.originalFormat._minInstanceSize: None > _intf.storage.fsset._proc.originalFormat.mountopts: None > _intf.storage.fsset._proc.originalFormat._size: 0 > _intf.storage.fsset._proc.originalFormat._device: proc > _intf.storage.fsset._proc._serial: None > _intf.storage.fsset.active: True > _intf.storage.fsset.cryptTab: CryptTab instance, containing members: > _intf.storage.fsset.cryptTab.devicetree: Already dumped (DeviceTree instance) > _intf.storage.fsset.cryptTab.blkidTab: None > _intf.storage.fsset.cryptTab.chroot: Skipped > _intf.storage.fsset.cryptTab.mappings: {} > _intf.storage.fsset._devpts: NoDevice instance, containing members: > _intf.storage.fsset._devpts.major: 0 > 
_intf.storage.fsset._devpts._partedDevice: None > _intf.storage.fsset._devpts.exists: True > _intf.storage.fsset._devpts._size: 0 > _intf.storage.fsset._devpts.id: 44 > _intf.storage.fsset._devpts.controllable: True > _intf.storage.fsset._devpts.uuid: None > _intf.storage.fsset._devpts._format: DevPtsFS instance, containing members: > _intf.storage.fsset._devpts._format.uuid: None > _intf.storage.fsset._devpts._format.exists: True > _intf.storage.fsset._devpts._format._mountpoint: /mnt/sysimage/dev/pts > _intf.storage.fsset._devpts._format._majorminor: None > _intf.storage.fsset._devpts._format._minInstanceSize: None > _intf.storage.fsset._devpts._format.fsprofile: None > _intf.storage.fsset._devpts._format.label: None > _intf.storage.fsset._devpts._format._targetSize: 0 > _intf.storage.fsset._devpts._format._size: 0 > _intf.storage.fsset._devpts._format.mountopts: None > _intf.storage.fsset._devpts._format.mountpoint: /dev/pts > _intf.storage.fsset._devpts._format._device: devpts > _intf.storage.fsset._devpts.parents: [] > _intf.storage.fsset._devpts.deviceLinks: [] > _intf.storage.fsset._devpts.minor: 0 > _intf.storage.fsset._devpts.fstabComment: Skipped > _intf.storage.fsset._devpts.bus: Skipped > _intf.storage.fsset._devpts.sysfsPath: Skipped > _intf.storage.fsset._devpts._targetSize: 0 > _intf.storage.fsset._devpts._model: Skipped > _intf.storage.fsset._devpts.kids: 0 > _intf.storage.fsset._devpts._vendor: Skipped > _intf.storage.fsset._devpts._name: devpts > _intf.storage.fsset._devpts.protected: False > _intf.storage.fsset._devpts.originalFormat: DevPtsFS instance, containing members: > _intf.storage.fsset._devpts.originalFormat.uuid: None > _intf.storage.fsset._devpts.originalFormat.exists: True > _intf.storage.fsset._devpts.originalFormat._mountpoint: None > _intf.storage.fsset._devpts.originalFormat.mountpoint: /dev/pts > _intf.storage.fsset._devpts.originalFormat._majorminor: None > _intf.storage.fsset._devpts.originalFormat.fsprofile: None > 
_intf.storage.fsset._devpts.originalFormat.label: None > _intf.storage.fsset._devpts.originalFormat._targetSize: 0 > _intf.storage.fsset._devpts.originalFormat._minInstanceSize: None > _intf.storage.fsset._devpts.originalFormat.mountopts: None > _intf.storage.fsset._devpts.originalFormat._size: 0 > _intf.storage.fsset._devpts.originalFormat._device: devpts > _intf.storage.fsset._devpts._serial: None > _intf.storage.fsset._sysfs: NoDevice instance, containing members: > _intf.storage.fsset._sysfs.major: 0 > _intf.storage.fsset._sysfs._partedDevice: None > _intf.storage.fsset._sysfs.exists: True > _intf.storage.fsset._sysfs._size: 0 > _intf.storage.fsset._sysfs.id: 45 > _intf.storage.fsset._sysfs.controllable: True > _intf.storage.fsset._sysfs.uuid: None > _intf.storage.fsset._sysfs._format: SysFS instance, containing members: > _intf.storage.fsset._sysfs._format.uuid: None > _intf.storage.fsset._sysfs._format.exists: True > _intf.storage.fsset._sysfs._format._mountpoint: /mnt/sysimage/sys > _intf.storage.fsset._sysfs._format._majorminor: None > _intf.storage.fsset._sysfs._format._minInstanceSize: None > _intf.storage.fsset._sysfs._format.fsprofile: None > _intf.storage.fsset._sysfs._format.label: None > _intf.storage.fsset._sysfs._format._targetSize: 0 > _intf.storage.fsset._sysfs._format._size: 0 > _intf.storage.fsset._sysfs._format.mountopts: None > _intf.storage.fsset._sysfs._format.mountpoint: /sys > _intf.storage.fsset._sysfs._format._device: sysfs > _intf.storage.fsset._sysfs.parents: [] > _intf.storage.fsset._sysfs.deviceLinks: [] > _intf.storage.fsset._sysfs.minor: 0 > _intf.storage.fsset._sysfs.fstabComment: Skipped > _intf.storage.fsset._sysfs.bus: Skipped > _intf.storage.fsset._sysfs.sysfsPath: Skipped > _intf.storage.fsset._sysfs._targetSize: 0 > _intf.storage.fsset._sysfs._model: Skipped > _intf.storage.fsset._sysfs.kids: 0 > _intf.storage.fsset._sysfs._vendor: Skipped > _intf.storage.fsset._sysfs._name: sysfs > _intf.storage.fsset._sysfs.protected: 
False > _intf.storage.fsset._sysfs.originalFormat: SysFS instance, containing members: > _intf.storage.fsset._sysfs.originalFormat.uuid: None > _intf.storage.fsset._sysfs.originalFormat.exists: True > _intf.storage.fsset._sysfs.originalFormat._mountpoint: None > _intf.storage.fsset._sysfs.originalFormat.mountpoint: /sys > _intf.storage.fsset._sysfs.originalFormat._majorminor: None > _intf.storage.fsset._sysfs.originalFormat.fsprofile: None > _intf.storage.fsset._sysfs.originalFormat.label: None > _intf.storage.fsset._sysfs.originalFormat._targetSize: 0 > _intf.storage.fsset._sysfs.originalFormat._minInstanceSize: None > _intf.storage.fsset._sysfs.originalFormat.mountopts: None > _intf.storage.fsset._sysfs.originalFormat._size: 0 > _intf.storage.fsset._sysfs.originalFormat._device: sysfs > _intf.storage.fsset._sysfs._serial: None > _intf.storage.fsset._selinux: NoDevice instance, containing members: > _intf.storage.fsset._selinux.major: 0 > _intf.storage.fsset._selinux._partedDevice: None > _intf.storage.fsset._selinux.exists: True > _intf.storage.fsset._selinux._size: 0 > _intf.storage.fsset._selinux.id: 47 > _intf.storage.fsset._selinux.controllable: True > _intf.storage.fsset._selinux.uuid: None > _intf.storage.fsset._selinux._format: SELinuxFS instance, containing members: > _intf.storage.fsset._selinux._format.uuid: None > _intf.storage.fsset._selinux._format.exists: True > _intf.storage.fsset._selinux._format._mountpoint: /mnt/sysimage/sys/fs/selinux > _intf.storage.fsset._selinux._format._majorminor: None > _intf.storage.fsset._selinux._format._minInstanceSize: None > _intf.storage.fsset._selinux._format.fsprofile: None > _intf.storage.fsset._selinux._format.label: None > _intf.storage.fsset._selinux._format._targetSize: 0 > _intf.storage.fsset._selinux._format._size: 0 > _intf.storage.fsset._selinux._format.mountopts: None > _intf.storage.fsset._selinux._format.mountpoint: /sys/fs/selinux > _intf.storage.fsset._selinux._format._device: selinuxfs > 
_intf.storage.fsset._selinux.parents: [] > _intf.storage.fsset._selinux.deviceLinks: [] > _intf.storage.fsset._selinux.minor: 0 > _intf.storage.fsset._selinux.fstabComment: Skipped > _intf.storage.fsset._selinux.bus: Skipped > _intf.storage.fsset._selinux.sysfsPath: Skipped > _intf.storage.fsset._selinux._targetSize: 0 > _intf.storage.fsset._selinux._model: Skipped > _intf.storage.fsset._selinux.kids: 0 > _intf.storage.fsset._selinux._vendor: Skipped > _intf.storage.fsset._selinux._name: selinuxfs > _intf.storage.fsset._selinux.protected: False > _intf.storage.fsset._selinux.originalFormat: SELinuxFS instance, containing members: > _intf.storage.fsset._selinux.originalFormat.uuid: None > _intf.storage.fsset._selinux.originalFormat.exists: True > _intf.storage.fsset._selinux.originalFormat._mountpoint: None > _intf.storage.fsset._selinux.originalFormat.mountpoint: /sys/fs/selinux > _intf.storage.fsset._selinux.originalFormat._majorminor: None > _intf.storage.fsset._selinux.originalFormat.fsprofile: None > _intf.storage.fsset._selinux.originalFormat.label: None > _intf.storage.fsset._selinux.originalFormat._targetSize: 0 > _intf.storage.fsset._selinux.originalFormat._minInstanceSize: None > _intf.storage.fsset._selinux.originalFormat.mountopts: None > _intf.storage.fsset._selinux.originalFormat._size: 0 > _intf.storage.fsset._selinux.originalFormat._device: selinuxfs > _intf.storage.fsset._selinux._serial: None > _intf.storage.config: StorageDiscoveryConfig instance, containing members: > _intf.storage.config.clearPartType: 2 > _intf.storage.config.clearNonExistent: False > _intf.storage.config.ignoredDisks: [] > _intf.storage.config.protectedDevSpecs: [LABEL=Fedorax2019-Beta-TC4x20x86_64] > _intf.storage.config.ignoreDiskInteractive: False > _intf.storage.config.exclusiveDisks: [sda, sdd, sdc, sdb] > _intf.storage.config.clearPartDevices: [] > _intf.storage.config.zeroMbr: False > _intf.storage.config.diskImages: {} > _intf.storage.config.clearPartDisks: [sda, sdd, 
sdc, sdb] > _intf.storage.config.mpathFriendlyNames: True > _intf.storage.config.initializeDisks: True > _intf.storage.size_sets: [] > _intf.storage.autoPartType: 0 > _intf.storage._bootloader: GRUB2 instance, containing members: > _intf.storage._bootloader._disk_order: [] > _intf.storage._bootloader.console_options: Skipped > _intf.storage._bootloader.console: Skipped > _intf.storage._bootloader.skip_bootloader: False > _intf.storage._bootloader.warnings: [] > _intf.storage._bootloader.chain_images: [] > _intf.storage._bootloader.stage2_is_preferred_stage1: False > _intf.storage._bootloader.disks: [existing 12000MB disk sda (1) with existing msdos disklabel >, existing 12000MB disk sdb (14) with existing msdos disklabel >, existing 12000MB disk sdc (11) with existing msdos disklabel >, existing 12000MB disk sdd (8) with existing msdos disklabel >] > _intf.storage._bootloader._update_only: False > _intf.storage._bootloader._default_image: None > _intf.storage._bootloader.stage2_device: Already dumped (MDRaidArrayDevice instance) > _intf.storage._bootloader.encrypted_password: Skipped > _intf.storage._bootloader.errors: [] > _intf.storage._bootloader.stage1_device: Already dumped (DiskDevice instance) > _intf.storage._bootloader._timeout: None > _intf.storage._bootloader.stage1_disk: Already dumped (DiskDevice instance) > _intf.storage._bootloader.password: None > _intf.storage._bootloader.dracut_args: > _intf.storage._bootloader.boot_args: $([ -x /usr/sbin/rhcrashkernel-param ] && /usr/sbin/rhcrashkernel-param || :) > _intf.storage._bootloader.linux_images: [] > _intf.storage.devicetree: Already dumped (DeviceTree instance) > _intf.storage._dumpFile: /tmp/storage.state > _intf.storage.ksdata: Already dumped (AnacondaKSHandler instance) > _intf.storage.services: set([]) > _intf.storage.encryptionPassphrase: Skipped > _intf.storage.encryptionCipher: None > _intf.storage.doAutoPart: False > _intf.storage.encryptionRetrofit: False > _intf.storage._nextID: 0 > 
_intf.storage.fcoe: fcoe instance, containing members: > _intf.storage.fcoe.started: True > _intf.storage.fcoe.nics: [] > _intf.storage.fcoe.lldpadStarted: False > _intf.storage.autoPartitionRequests: [PartSpec instance (0x7fae0fbea090) -- > mountpoint = / lv = True singlePV = False btrfs = True > weight = 0 fstype = ext4 encrypted = True > size = 1024 maxSize = 51200 grow = True > >, PartSpec instance (0x7fae0fbea050) -- > mountpoint = /home lv = True singlePV = False btrfs = True > weight = 0 fstype = ext4 encrypted = True > size = 500 maxSize = None grow = True > >, PartSpec instance (0x7fae0fbea0d0) -- > mountpoint = /boot lv = False singlePV = False btrfs = False > weight = 2000 fstype = ext4 encrypted = False > size = 500 maxSize = None grow = False > >, PartSpec instance (0x7fae0fbea110) -- > mountpoint = None lv = False singlePV = False btrfs = False > weight = 5000 fstype = biosboot encrypted = False > size = 1 maxSize = None grow = False > >, PartSpec instance (0x7fae0fbea150) -- > mountpoint = None lv = True singlePV = False btrfs = False > weight = 0 fstype = swap encrypted = True > size = 4032 maxSize = None grow = False > >] > _intf.storage.autoPartAddBackupPassphrase: False > _intf.storage.encryptedAutoPart: False > _intf._isFinal: False > _intf._distributionText: <function distributionText at 0x7fae34d97b18> > _intf._ui: None > _intf._actions: Skipped > _intf.payload: YumPayload instance, containing members: > _intf.payload._groups: Skipped > _intf.payload._yum: Skipped > _intf.payload.storage: Already dumped (Blivet instance) > _intf.payload.install_device: OpticalDevice instance, containing members: > _intf.payload.install_device.major: 11 > _intf.payload.install_device._partedDevice: parted.Device instance -- > model: QEMU QEMU DVD-ROM path: /dev/sr0 type: 1 > sectorSize: 2048 physicalSectorSize: 2048 > length: 2347520 openCount: 0 readOnly: True > externalMode: False dirty: False bootDirty: False > host: 2 did: 0 busy: True > hardwareGeometry: 
(146, 255, 63) biosGeometry: (146, 255, 63) > PedDevice: <_ped.Device object at 0x7fae127ea9e0> > _intf.payload.install_device.exists: True > _intf.payload.install_device._size: 4585.0 > _intf.payload.install_device.id: 0 > _intf.payload.install_device.controllable: True > _intf.payload.install_device.uuid: None > _intf.payload.install_device._format: Iso9660FS instance, containing members: > _intf.payload.install_device._format.uuid: 2013-05-10-11-54-01-00 > _intf.payload.install_device._format.exists: True > _intf.payload.install_device._format._mountpoint: None > _intf.payload.install_device._format._majorminor: 011000 > _intf.payload.install_device._format._minInstanceSize: None > _intf.payload.install_device._format._mountType: iso9660 > _intf.payload.install_device._format.fsprofile: None > _intf.payload.install_device._format.label: Fedora_19-Beta-TC4_x86_64 > _intf.payload.install_device._format._targetSize: 0.0 > _intf.payload.install_device._format._size: 0.0 > _intf.payload.install_device._format.mountopts: None > _intf.payload.install_device._format.mountpoint: None > _intf.payload.install_device._format._device: /dev/sr0 > _intf.payload.install_device.parents: [] > _intf.payload.install_device.deviceLinks: [/dev/cdrom, /dev/disk/by-id/ata-QEMU_DVD-ROM_QM00003, /dev/disk/by-label/Fedora\x2019-Beta-TC4\x20x86_64, /dev/disk/by-uuid/2013-05-10-11-54-01-00] > _intf.payload.install_device.minor: 0 > _intf.payload.install_device.fstabComment: Skipped > _intf.payload.install_device.bus: Skipped > _intf.payload.install_device.sysfsPath: /devices/pci0000:00/0000:00:01.1/ata2/host1/target1:0:0/1:0:0:0/block/sr0 > _intf.payload.install_device._targetSize: 0 > _intf.payload.install_device._model: QEMU_DVD-ROM > _intf.payload.install_device.kids: 0 > _intf.payload.install_device._vendor: None > _intf.payload.install_device._name: sr0 > _intf.payload.install_device.protected: False > _intf.payload.install_device.originalFormat: Iso9660FS instance, containing members: 
> _intf.payload.install_device.originalFormat.uuid: 2013-05-10-11-54-01-00 > _intf.payload.install_device.originalFormat.exists: True > _intf.payload.install_device.originalFormat._mountpoint: None > _intf.payload.install_device.originalFormat.mountpoint: None > _intf.payload.install_device.originalFormat._majorminor: None > _intf.payload.install_device.originalFormat.fsprofile: None > _intf.payload.install_device.originalFormat.label: Fedora_19-Beta-TC4_x86_64 > _intf.payload.install_device.originalFormat._targetSize: 0.0 > _intf.payload.install_device.originalFormat._minInstanceSize: None > _intf.payload.install_device.originalFormat.mountopts: None > _intf.payload.install_device.originalFormat._size: 0.0 > _intf.payload.install_device.originalFormat._device: /dev/sr0 > _intf.payload.install_device._serial: None > _intf.payload._root_dir: /tmp/yum.root > _intf.payload._repos_dir: /tmp/yum.repos.d > _intf.payload._packages: [] > _intf.payload._requiredPackages: [grub2, mdadm, e2fsprogs, authconfig, firewalld] > _intf.payload._requiredGroups: [] > _intf.payload._setup: True > _intf.payload.txID: 1368244314.89 > _intf.payload._createdInitrds: False > _intf.payload._space_required: 806.95 MB > _intf.payload.data: Already dumped (AnacondaKSHandler instance) > _intf.payload._kernelVersionList: [] > _intf._currentAction: ProgressHub instance, containing members: > _intf._currentAction._progress_id: 14297 > _intf._currentAction._progressNotebook: Notebook instance, containing members: > _intf._currentAction.payload: Already dumped (YumPayload instance) > _intf._currentAction.paths: {'spokes': [('pyanaconda.ui.gui.spokes.%s', '/tmp/updates/pyanaconda/ui/gui/spokes'), ('pyanaconda.ui.gui.spokes.%s', '/usr/lib/python2.7/site-packages/pyanaconda/ui/gui/spokes'), ('pyanaconda.ui.gui.spokes.%s', '/usr/lib/site-python/pyanaconda/ui/gui/spokes'), ('pyanaconda.ui.gui.spokes.%s', '/usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes')], 'categories': 
[('pyanaconda.ui.gui.categories.%s', '/tmp/updates/pyanaconda/ui/gui/categories'), ('pyanaconda.ui.gui.categories.%s', '/usr/lib/python2.7/site-packages/pyanaconda/ui/gui/categories'), ('pyanaconda.ui.gui.categories.%s', '/usr/lib/site-python/pyanaconda/ui/gui/categories'), ('pyanaconda.ui.gui.categories.%s', '/usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/categories')]} > _intf._currentAction._checker: None > _intf._currentAction.applyOnSkip: False > _intf._currentAction.storage: Already dumped (Blivet instance) > _intf._currentAction.skipTo: None > _intf._currentAction._rnotes_id: 14296 > _intf._currentAction._notReadySpokes: [] > _intf._currentAction._inSpoke: False > _intf._currentAction._update_spoke_id: 14294 > _intf._currentAction._incompleteSpokes: [] > _intf._currentAction.instclass: Already dumped (DefaultInstall instance) > _intf._currentAction._currentStep: 20 > _intf._currentAction._progressLabel: Label instance, containing members: > _intf._currentAction._rnotesPages: <itertools.cycle object at 0x7fae0dea4638> > _intf._currentAction._data: Already dumped (AnacondaKSHandler instance) > _intf._currentAction._progressBar: ProgressBar instance, containing members: > _intf._currentAction._spokes: {'PasswordSpoke': PasswordSpoke instance, containing members: > _intf._currentAction._spokes._kickstarted: False > _intf._currentAction._spokes._window: SpokeWindow instance, containing members: > _intf._currentAction._spokes.applyOnSkip: False > _intf._currentAction._spokes.instclass: Already dumped (DefaultInstall instance) > _intf._currentAction._spokes.builder: Builder instance, containing members: > _intf._currentAction._spokes.confirm: Entry instance, containing members: > _intf._currentAction._spokes.storage: Already dumped (Blivet instance) > _intf._currentAction._spokes.selector: SpokeSelector instance, containing members: > _intf._currentAction._spokes.skipTo: None > _intf._currentAction._spokes._oldweak: Skipped > _intf._currentAction._spokes.pw: 
Entry instance, containing members: > _intf._currentAction._spokes._password: Skipped > _intf._currentAction._spokes._data: Already dumped (AnacondaKSHandler instance) > _intf._currentAction._spokes.payload: Already dumped (YumPayload instance) > _intf._currentAction._spokes._error: False >, 'UserSpoke': UserSpoke instance, containing members: > _intf._currentAction._spokes.guesser: {<Entry object at 0x7fae05a374b0 (GtkEntry at 0x5e65540)>: True} > _intf._currentAction._spokes._advanced: AdvancedUserDialog instance, containing members: > _intf._currentAction._spokes._advanced._window: Dialog instance, containing members: > _intf._currentAction._spokes._advanced.applyOnSkip: False > _intf._currentAction._spokes._advanced._groupDict: {'wheel': group --name=wheel > >} > _intf._currentAction._spokes._advanced.builder: Builder instance, containing members: > _intf._currentAction._spokes._advanced.skipTo: None > _intf._currentAction._spokes._advanced._data: Already dumped (AnacondaKSHandler instance) > _intf._currentAction._spokes._advanced._user: F19_UserData instance, containing members: > _intf._currentAction._spokes._advanced._user.shell: Skipped > _intf._currentAction._spokes._advanced._user.name: Skipped > _intf._currentAction._spokes._advanced._user.isCrypted: False > _intf._currentAction._spokes._advanced._user.lock: False > _intf._currentAction._spokes._advanced._user.password_kickstarted: False > _intf._currentAction._spokes._advanced._user.gid: None > _intf._currentAction._spokes._advanced._user.gecos: Skipped > _intf._currentAction._spokes._advanced._user.lineno: 0 > _intf._currentAction._spokes._advanced._user.groups: [] > _intf._currentAction._spokes._advanced._user.uid: None > _intf._currentAction._spokes._advanced._user.password: Skipped > _intf._currentAction._spokes._advanced._user.homedir: Skipped > _intf._currentAction._spokes._oldweak: None > _intf._currentAction._spokes._wheel: Already dumped (F12_GroupData instance) > 
_intf._currentAction._spokes.payload: Already dumped (YumPayload instance) > _intf._currentAction._spokes.applyOnSkip: False > _intf._currentAction._spokes.confirm: Entry instance, containing members: > _intf._currentAction._spokes.storage: Already dumped (Blivet instance) > _intf._currentAction._spokes.skipTo: None > _intf._currentAction._spokes.b_advanced: Button instance, containing members: > _intf._currentAction._spokes.username: Entry instance, containing members: > _intf._currentAction._spokes.instclass: Already dumped (DefaultInstall instance) > _intf._currentAction._spokes.selector: SpokeSelector instance, containing members: > _intf._currentAction._spokes.pw_label: Label instance, containing members: > _intf._currentAction._spokes._pwq: <pwquality.PWQSettings object at 0x7fae09dd2de0> > _intf._currentAction._spokes._data: Already dumped (AnacondaKSHandler instance) > _intf._currentAction._spokes.pw: Entry instance, containing members: > _intf._currentAction._spokes._error: False > _intf._currentAction._spokes._groupDict: {'wheel': Already dumped (F12_GroupData instance) >} > _intf._currentAction._spokes.admin: CheckButton instance, containing members: > _intf._currentAction._spokes.builder: Builder instance, containing members: > _intf._currentAction._spokes.pw_bar: LevelBar instance, containing members: > _intf._currentAction._spokes.usepassword: CheckButton instance, containing members: > _intf._currentAction._spokes._window: SpokeWindow instance, containing members: > _intf._currentAction._spokes.fullname: Entry instance, containing members: > _intf._currentAction._spokes._user: Already dumped (F19_UserData instance) >} > _intf._currentAction._totalSteps: 24 > _intf._currentAction.builder: Builder instance, containing members: > _intf._currentAction._window: HubWindow instance, containing members: > _intf._currentAction._autoContinue: False > _intf._currentAction._configurationDone: False >updateSrc: None >rootParts: None >id: None >rescue: False 
>mediaDevice: None
>_network: None
>methodstr: None
>proxyPassword: None
>desktop: Desktop instance, containing members:
> desktop.info: {}
> desktop.always_quote: False
> desktop.write_quote: True
> desktop.filename: None
> desktop.read_unquote: True
> desktop.runlevel: 3
> desktop._lines: []
>canReIPL: False
>xdriver: None
>stage2: hd:LABEL=Fedorax2019-Beta-TC4x20x86_64
>rescue_mount: True
>_bootloader: GRUB2 instance, containing members:
> _bootloader._disk_order: []
> _bootloader.console_options: Skipped
> _bootloader.console: Skipped
> _bootloader.skip_bootloader: False
> _bootloader.warnings: []
> _bootloader.chain_images: []
> _bootloader.stage2_is_preferred_stage1: False
> _bootloader.disks: []
> _bootloader._update_only: False
> _bootloader._default_image: None
> _bootloader.encrypted_password: Skipped
> _bootloader.errors: []
> _bootloader.stage1_device: None
> _bootloader._timeout: None
> _bootloader.stage1_disk: None
> _bootloader.password: None
> _bootloader.dracut_args:
> _bootloader.boot_args: $([ -x /usr/sbin/rhcrashkernel-param ] && /usr/sbin/rhcrashkernel-param || :)
> _bootloader.linux_images: []
>_payload: Already dumped (YumPayload instance)
>displayMode: g
>ksdata: Already dumped (AnacondaKSHandler instance)
>proxyUsername: None
>extraModules: []
>mehConfig: Config instance, containing members:
> mehConfig.programArch: x86_64
> mehConfig.callbackDict: {'lsblk_output': (<function lsblk_callback at 0x7fae0fc006e0>, True), 'type': (<function <lambda> at 0x7fae0fc00848>, True), 'nmcli_dev_list': (<function nmcli_dev_list_callback at 0x7fae0fc00758>, True)}
> mehConfig.attrSkipList: [_intf._actions, _intf._currentAction._xklwrapper, _intf._currentAction.language.translations, _intf._currentAction.language.locales, _intf._currentAction._spokes["PasswordSpoke"]._oldweak, _intf._currentAction._spokes["PasswordSpoke"]._password, _intf._currentAction._spokes["UserSpoke"]._password, _intf._currentAction._spokes["UserSpoke"]._oldweak, _intf.storage.bootloader.password, _intf.storage.data, _intf.storage.encryptionPassphrase, _bootloader.encrypted_password, _bootloader.password, payload._groups, payload._yum]
> mehConfig.programVersion: 19.25-1
> mehConfig.localSkipList: [passphrase, password, _oldweak, _password]
> mehConfig.programName: anaconda
> mehConfig.fileList: [/tmp/anaconda.log, /tmp/packaging.log, /tmp/program.log, /tmp/storage.log, /tmp/ifcfg.log, /tmp/yum.log, /mnt/sysimage/root/install.log, /proc/cmdline, /tmp/syslog]
>isHeadless: False
>reIPLMessage: None
>_storage: Already dumped (Blivet instance)
>dir: None
>opts: {'noipv6': False, 'noipv4': False, 'updateSrc': None, 'selinux': True, 'module': [], 'syslog': None, 'leavebootorder': False, 'images': [], 'dmraid': True, 'armPlatform': None, 'memcheck': True, 'iscsi': False, 'ksfile': None, 'dirinstall': False, 'kbdtype': None, 'runres': None, 'xdriver': None, 'display_mode': 'g', 'stage2': 'hd:LABEL=Fedorax2019-Beta-TC4x20x86_64', 'method': None, 'vncpassword': '', 'vnc': False, 'rescue': False, 'noverifyssl': False, 'autostep': False, 'geoloc': None, 'proxy': None, 'dlabel': False, 'vncconnect': None, 'lang': None, 'askmethod': False, 'liveinst': False, 'loglevel': None, 'isHeadless': False, 'eject': True, 'rescue_nomount': False, 'keymap': None, 'mpath': True, 'ibft': True, 'debug': False, 'extlinux': False, 'multiLib': False, 'nofb': None, 'targetArch': None}
>proxy: None
>Registered callbacks:
>
>
>/tmp/anaconda.log:
>03:47:06,649 INFO anaconda: /sbin/anaconda 19.25-1
>03:47:07,190 INFO anaconda: 2097152 kB (2048 MB) are available
>03:47:07,190 INFO anaconda: check_memory(): total:2048, needed:512, graphical:512
>03:47:07,233 INFO anaconda: anaconda called with cmdline = ['/sbin/anaconda']
>03:47:07,234 INFO anaconda: Default encoding = utf-8
>03:47:07,778 INFO anaconda: Display mode = g
>03:47:07,780 INFO anaconda: 2097152 kB (2048 MB) are available
>03:47:07,780 INFO anaconda: check_memory(): total:2048, needed:512, graphical:512
>03:47:09,742 DEBUG anaconda: X server has signalled a successful start.
>03:47:09,757 INFO anaconda: Starting window manager, pid 766.
>03:47:11,550 INFO anaconda: using only installclass _Fedora
>03:47:12,393 INFO anaconda: bootloader GRUB2 on X86 platform
>03:47:12,395 INFO anaconda: bootloader GRUB2 on X86 platform
>03:47:12,459 DEBUG anaconda: network: devices found ['eth0']
>03:47:12,613 DEBUG anaconda: network: dumping ifcfg file for default autoconnection on eth0
>03:47:12,778 DEBUG anaconda: network: setting autoconnect of eth0 to False
>03:47:12,800 DEBUG anaconda: updating hostname localhost.localdomain
>03:47:12,835 INFO anaconda: Running Thread: AnaStorageThread (140385544660736)
>03:47:12,862 INFO anaconda: Running Thread: AnaWaitForConnectingNMThread (140385536268032)
>03:47:12,870 INFO anaconda: Running Thread: AnaPayloadThread (140385527875328)
>03:47:13,825 DEBUG anaconda: waiting for connecting NM (dhcp?)
>03:47:14,836 DEBUG anaconda: connected, waited 1 seconds
>03:47:14,881 DEBUG anaconda: updating hostname localhost.localdomain
>03:47:14,882 INFO anaconda: Thread Done: AnaWaitForConnectingNMThread (140385536268032)
>03:47:14,887 INFO anaconda: Running Thread: AnaGeolocationRefreshThread (140385515247360)
>03:47:14,888 INFO anaconda: Starting geolocation lookup
>03:47:14,889 INFO anaconda: Geolocation provider: Fedora MirrorManager
>03:47:16,263 INFO anaconda: Geolocation lookup finished in 1.4 seconds
>03:47:16,264 INFO anaconda: territory: BR
>03:47:16,300 INFO anaconda: Thread Done: AnaGeolocationRefreshThread (140385515247360)
>03:47:21,468 INFO anaconda: Thread Done: AnaStorageThread (140385544660736)
>03:47:24,945 INFO anaconda: Thread Done: AnaPayloadThread (140385527875328)
>03:47:34,872 DEBUG anaconda: network standalone spoke (init): completed: True
>03:47:35,009 INFO anaconda: Running Thread: AnaDateTimeThread (140385527875328)
>03:47:35,016 INFO anaconda: fs space: 0 B needed: 3 GB
>03:47:35,062 INFO anaconda: Thread Done: AnaDateTimeThread (140385527875328)
>03:47:35,091 WARN anaconda: /usr/lib64/python2.7/site-packages/gi/types.py:113: Warning: g_object_disconnect: invalid signal spec "button-release-event"
> return info.invoke(*args, **kwargs)
>
>03:47:35,096 INFO anaconda: fs space: 0 B needed: 3 GB
>03:47:35,232 INFO anaconda: fs space: 0 B needed: 3 GB
>03:47:35,329 INFO anaconda: fs space: 0 B needed: 3 GB
>03:47:35,329 INFO anaconda: Running Thread: AnaSourceWatcher (140385536268032)
>03:47:35,432 DEBUG anaconda: network: selected device eth0
>03:47:35,470 DEBUG anaconda: updating hostname localhost.localdomain
>03:47:35,499 INFO anaconda: fs space: 0 B needed: 3 GB
>03:47:35,529 INFO anaconda: Running Thread: AnaSoftwareWatcher (140385443092224)
>03:47:35,566 INFO anaconda: Running Thread: AnaStorageWatcher (140385409521408)
>03:47:35,638 INFO anaconda: Running Thread: AnaCustomStorageInit (140385426306816)
>03:47:35,712 INFO anaconda: Thread Done: AnaStorageWatcher (140385409521408)
>03:47:35,961 INFO anaconda: Running Thread: AnaNTPserver0 (140385434699520)
>03:47:35,974 INFO anaconda: Running Thread: AnaNTPserver1 (140385527875328)
>03:47:36,009 INFO anaconda: Running Thread: AnaNTPserver2 (140385409521408)
>03:47:36,066 INFO anaconda: Running Thread: AnaNTPserver3 (140385417914112)
>03:47:36,069 INFO anaconda: Thread Done: AnaSourceWatcher (140385536268032)
>03:47:36,177 DEBUG anaconda: updating hostname localhost.localdomain
>03:47:36,245 DEBUG anaconda: updating hostname localhost.localdomain
>03:47:36,481 INFO anaconda: Thread Done: AnaCustomStorageInit (140385426306816)
>03:47:37,004 INFO anaconda: spoke is ready: <pyanaconda.ui.gui.spokes.datetime_spoke.DatetimeSpoke object at 0x7fae09b348d0>
>03:47:37,006 INFO anaconda: setting <pyanaconda.ui.gui.spokes.source.SourceSpoke object at 0x7fae087c8f10> status to: Probing storage...
>03:47:37,007 INFO anaconda: setting <pyanaconda.ui.gui.spokes.source.SourceSpoke object at 0x7fae087c8f10> status to: Downloading package metadata...
>03:47:37,008 INFO anaconda: setting <pyanaconda.ui.gui.spokes.software.SoftwareSelectionSpoke object at 0x7fae08794350> status to: Downloading package metadata...
>03:47:37,009 INFO anaconda: setting <pyanaconda.ui.gui.spokes.software.SoftwareSelectionSpoke object at 0x7fae08794350> status to: Downloading group metadata...
>03:47:37,010 INFO anaconda: setting <pyanaconda.ui.gui.spokes.storage.StorageSpoke object at 0x7fae087a0110> status to: Probing storage...
>03:47:37,019 INFO anaconda: spoke is ready: <pyanaconda.ui.gui.spokes.storage.StorageSpoke object at 0x7fae087a0110>
>03:47:37,025 INFO anaconda: spoke is ready: <pyanaconda.ui.gui.spokes.source.SourceSpoke object at 0x7fae087c8f10>
>03:47:37,031 INFO anaconda: setting <pyanaconda.ui.gui.spokes.network.NetworkSpoke object at 0x7fae087c8b50> status to: Wired (eth0) connected
>03:47:38,502 INFO anaconda: spoke is ready: <pyanaconda.ui.gui.spokes.software.SoftwareSelectionSpoke object at 0x7fae08794350>
>03:47:38,502 INFO anaconda: Thread Done: AnaSoftwareWatcher (140385443092224)
>03:47:38,502 INFO anaconda: Running Thread: AnaCheckSoftwareThread (140385426306816)
>03:47:38,510 INFO anaconda: spoke is not ready: <pyanaconda.ui.gui.spokes.software.SoftwareSelectionSpoke object at 0x7fae08794350>
>03:47:38,524 INFO anaconda: spoke is not ready: <pyanaconda.ui.gui.spokes.source.SourceSpoke object at 0x7fae087c8f10>
>03:47:38,524 INFO anaconda: setting <pyanaconda.ui.gui.spokes.software.SoftwareSelectionSpoke object at 0x7fae08794350> status to: Checking software dependencies...
>03:47:38,540 WARN anaconda: /usr/lib64/python2.7/site-packages/gi/types.py:113: Warning: g_object_set_property: assertion `G_IS_VALUE (value)' failed
> return info.invoke(*args, **kwargs)
>
>03:47:38,541 WARN anaconda: /usr/lib64/python2.7/site-packages/gi/types.py:113: Warning: g_value_unset: assertion `G_IS_VALUE (value)' failed
> return info.invoke(*args, **kwargs)
>
>03:47:42,373 INFO anaconda: Thread Done: AnaNTPserver2 (140385409521408)
>03:47:42,564 INFO anaconda: Thread Done: AnaNTPserver1 (140385527875328)
>03:47:42,649 INFO anaconda: Thread Done: AnaNTPserver3 (140385417914112)
>03:47:42,660 INFO anaconda: Thread Done: AnaNTPserver0 (140385434699520)
>03:47:48,746 INFO anaconda: Thread Done: AnaCheckSoftwareThread (140385426306816)
>03:47:49,036 INFO anaconda: spoke is ready: <pyanaconda.ui.gui.spokes.software.SoftwareSelectionSpoke object at 0x7fae08794350>
>03:47:49,067 INFO anaconda: spoke is ready: <pyanaconda.ui.gui.spokes.source.SourceSpoke object at 0x7fae087c8f10>
>03:47:58,106 INFO anaconda: Running Thread: AnaCheckSoftwareThread (140385544660736)
>03:47:59,004 INFO anaconda: spoke is not ready: <pyanaconda.ui.gui.spokes.software.SoftwareSelectionSpoke object at 0x7fae08794350>
>03:47:59,012 INFO anaconda: spoke is not ready: <pyanaconda.ui.gui.spokes.source.SourceSpoke object at 0x7fae087c8f10>
>03:47:59,016 INFO anaconda: setting <pyanaconda.ui.gui.spokes.software.SoftwareSelectionSpoke object at 0x7fae08794350> status to: Checking software dependencies...
>03:47:59,670 INFO anaconda: Thread Done: AnaCheckSoftwareThread (140385544660736)
>03:48:00,034 INFO anaconda: spoke is ready: <pyanaconda.ui.gui.spokes.software.SoftwareSelectionSpoke object at 0x7fae08794350>
>03:48:00,066 INFO anaconda: spoke is ready: <pyanaconda.ui.gui.spokes.source.SourceSpoke object at 0x7fae087c8f10>
>03:48:05,666 DEBUG anaconda: disk free: 3.87 MB fs free: 0 B sw needs: 622.82 MB auto swap: 4.03 GB
>03:48:08,781 INFO anaconda: Running Thread: AnaExecuteStorageThread (140385544660736)
>03:48:08,801 DEBUG anaconda: new disk order: []
>03:48:08,922 DEBUG anaconda: stage1 device cannot be of type btrfs subvolume
>03:48:08,925 DEBUG anaconda: stage1 device cannot be of type mdarray
>03:48:08,926 DEBUG anaconda: stage1 device cannot be of type btrfs volume
>03:48:08,927 DEBUG anaconda: stage1 device cannot be of type btrfs subvolume
>03:48:08,928 DEBUG anaconda: _is_valid_disklabel(sda) returning True
>03:48:08,929 DEBUG anaconda: _is_valid_size(sda) returning True
>03:48:08,929 DEBUG anaconda: _is_valid_location(sda) returning True
>03:48:08,930 DEBUG anaconda: _is_valid_format(sda) returning True
>03:48:08,931 DEBUG anaconda: is_valid_stage1_device(sda) returning True
>03:48:08,933 INFO anaconda: Thread Done: AnaExecuteStorageThread (140385544660736)
>03:48:09,016 DEBUG anaconda: ui: devices=['/LiveOS/rootfs.img', '/overlay (deleted)', '/run/install/repo/LiveOS/squashfs.img', 'boot', 'dhcppc0:swap', 'fedora_dhcppc0', 'live-rw', 'loop0', 'loop1', 'loop2', 'root', 'sda', 'sda1', 'sda2', 'sdb', 'sdb1', 'sdb2', 'sdc', 'sdc1', 'sdc2', 'sdd', 'sdd1', 'sdd2', 'sr0']
>03:48:09,017 DEBUG anaconda: ui: unused=[]
>03:48:09,018 DEBUG anaconda: ui: new_devices=[]
>03:48:09,060 DEBUG anaconda: page clicked: New Fedora 19-Beta-TC4 Installation
>03:48:09,061 DEBUG anaconda: show first mountpoint: New Fedora 19-Beta-TC4 Installation
>03:48:09,061 DEBUG anaconda: page clicked: New Fedora 19-Beta-TC4 Installation
>03:48:09,062 DEBUG anaconda: show first mountpoint: New Fedora 19-Beta-TC4 Installation
>03:48:09,142 INFO anaconda: spoke is not ready: <pyanaconda.ui.gui.spokes.storage.StorageSpoke object at 0x7fae087a0110>
>03:48:09,143 INFO anaconda: setting <pyanaconda.ui.gui.spokes.storage.StorageSpoke object at 0x7fae087a0110> status to: Saving storage configuration...
>03:48:09,148 INFO anaconda: spoke is ready: <pyanaconda.ui.gui.spokes.storage.StorageSpoke object at 0x7fae087a0110>
>03:48:10,578 DEBUG anaconda: page clicked: Fedora Linux 19 for x86_64
>03:48:10,579 DEBUG anaconda: show first mountpoint: Fedora Linux 19 for x86_64
>03:48:10,590 DEBUG anaconda: populate_right_side: existing 39832MB btrfs subvolume boot (6) with existing btrfs filesystem
>03:48:10,590 DEBUG anaconda: updated device_disks to ['sda', 'sdd', 'sdc', 'sdb']
>03:48:10,591 DEBUG anaconda: updated device_container_name to fedora_dhcppc0
>03:48:10,592 DEBUG anaconda: updated device_container_raid_level to single
>03:48:10,592 DEBUG anaconda: updated device_container_encrypted to False
>03:48:10,593 DEBUG anaconda: updated device_container_size to 9958.0
>03:48:10,598 DEBUG anaconda: fs type changed: btrfs
>03:48:10,605 INFO anaconda: getting device type for BTRFS
>03:48:10,606 DEBUG anaconda: device_type_changed: 3 BTRFS
>03:48:10,613 INFO anaconda: getting device type for BTRFS
>03:48:10,614 INFO anaconda: getting device type for BTRFS
>03:48:10,615 INFO anaconda: getting device type for BTRFS
>03:48:10,616 INFO anaconda: getting device type for BTRFS
>03:48:10,618 INFO anaconda: getting device type for BTRFS
>03:48:10,620 INFO anaconda: getting device type for BTRFS
>03:48:10,621 INFO anaconda: getting device type for BTRFS
>03:48:10,623 INFO anaconda: getting device type for BTRFS
>03:48:10,624 DEBUG anaconda: populate_raid: 3, single
>03:48:10,625 INFO anaconda: getting device type for BTRFS
>03:48:10,627 INFO anaconda: getting device type for BTRFS
>03:48:10,628 DEBUG anaconda: populate_raid: 3, single
>03:48:10,629 INFO anaconda: getting device type for BTRFS
>03:48:10,631 DEBUG anaconda: new container selection: fedora_dhcppc0
>03:48:10,631 DEBUG anaconda: default container is fedora_dhcppc0
>03:48:11,359 DEBUG anaconda: current selector: existing 39832MB btrfs subvolume boot (6) with existing btrfs filesystem
>03:48:11,361 DEBUG anaconda: notebook page = 1
>03:48:11,363 INFO anaconda: ui: saving changes to device boot
>03:48:11,365 DEBUG anaconda: old name: boot
>03:48:11,366 DEBUG anaconda: new name: None
>03:48:11,375 DEBUG anaconda: old size: 39832.0
>03:48:11,375 DEBUG anaconda: new size: 39832
>03:48:11,376 INFO anaconda: getting device type for BTRFS
>03:48:11,377 DEBUG anaconda: old device type: 3
>03:48:11,378 DEBUG anaconda: new device type: 3
>03:48:11,378 DEBUG anaconda: reformat: False
>03:48:11,382 DEBUG anaconda: old fs type: btrfs
>03:48:11,383 DEBUG anaconda: new fs type: btrfs
>03:48:11,384 DEBUG anaconda: old encryption setting: False
>03:48:11,384 DEBUG anaconda: new encryption setting: False
>03:48:11,385 DEBUG anaconda: old label:
>03:48:11,385 DEBUG anaconda: new_label:
>03:48:11,386 DEBUG anaconda: old mountpoint:
>03:48:11,386 DEBUG anaconda: new mountpoint:
>03:48:11,387 DEBUG anaconda: old raid level: single
>03:48:11,387 DEBUG anaconda: new raid level: None
>03:48:11,389 DEBUG anaconda: old container: fedora_dhcppc0
>03:48:11,390 DEBUG anaconda: new container: fedora_dhcppc0
>03:48:11,390 DEBUG anaconda: old container encrypted: False
>03:48:11,391 DEBUG anaconda: new container encrypted: False
>03:48:11,391 DEBUG anaconda: old container raid level: single
>03:48:11,392 DEBUG anaconda: new container raid level: single
>03:48:11,392 DEBUG anaconda: old container size request: 9958.0
>03:48:11,393 DEBUG anaconda: new container size request: 9958.0
>03:48:11,393 DEBUG anaconda: old disks: ['sda', 'sdd', 'sdc', 'sdb']
>03:48:11,394 DEBUG anaconda: new disks: ['sda', 'sdd', 'sdc', 'sdb']
>03:48:11,395 DEBUG anaconda: populate_right_side: existing 39832MB btrfs subvolume boot (6) with existing btrfs filesystem
>03:48:11,395 DEBUG anaconda: updated device_disks to ['sda', 'sdd', 'sdc', 'sdb']
>03:48:11,396 DEBUG anaconda: updated device_container_name to fedora_dhcppc0
>03:48:11,397 DEBUG anaconda: updated device_container_raid_level to single
>03:48:11,397 DEBUG anaconda: updated device_container_encrypted to False
>03:48:11,398 DEBUG anaconda: updated device_container_size to 9958.0
>03:48:11,401 DEBUG anaconda: fs type changed: btrfs
>03:48:11,408 INFO anaconda: getting device type for BTRFS
>03:48:11,410 DEBUG anaconda: populate_raid: 3, single
>03:48:11,410 INFO anaconda: getting device type for BTRFS
>03:48:11,412 DEBUG anaconda: new container selection: None
>03:48:11,413 DEBUG anaconda: new container selection: None
>03:48:11,414 DEBUG anaconda: new container selection: fedora_dhcppc0
>03:48:11,415 DEBUG anaconda: default container is fedora_dhcppc0
>03:48:11,419 DEBUG anaconda: new selector: existing 2039MB mdarray dhcppc0:swap (3) with existing swap
>03:48:11,422 DEBUG anaconda: populate_right_side: existing 2039MB mdarray dhcppc0:swap (3) with existing swap
>03:48:11,423 DEBUG anaconda: updated device_disks to ['sda', 'sdd', 'sdc', 'sdb']
>03:48:11,423 DEBUG anaconda: updated device_container_name to None
>03:48:11,424 DEBUG anaconda: updated device_container_raid_level to None
>03:48:11,425 DEBUG anaconda: updated device_container_encrypted to False
>03:48:11,425 DEBUG anaconda: updated device_container_size to 0
>03:48:11,431 DEBUG anaconda: fs type changed: swap
>03:48:11,437 INFO anaconda: getting device type for None
>03:48:11,438 ERR anaconda: unknown device type: 'None'
>03:48:11,438 DEBUG anaconda: device_type_changed: None None
>03:48:11,441 INFO anaconda: getting device type for RAID
>03:48:11,442 DEBUG anaconda: device_type_changed: 1 RAID
>03:48:11,443 INFO anaconda: getting device type for RAID
>03:48:11,444 INFO anaconda: getting device type for RAID
>03:48:11,445 INFO anaconda: getting device type for RAID
>03:48:11,446 INFO anaconda: getting device type for RAID
>03:48:11,447 INFO anaconda: getting device type for RAID
>03:48:11,449 INFO anaconda: getting device type for RAID
>03:48:11,450 INFO anaconda: getting device type for RAID
>03:48:11,451 INFO anaconda: getting device type for RAID
>03:48:11,452 INFO anaconda: getting device type for RAID
>03:48:11,453 INFO anaconda: getting device type for RAID
>03:48:11,454 INFO anaconda: getting device type for RAID
>03:48:11,455 INFO anaconda: getting device type for RAID
>03:48:11,456 DEBUG anaconda: populate_raid: 1, raid1
>03:48:11,457 INFO anaconda: getting device type for RAID
>03:48:11,459 INFO anaconda: getting device type for RAID
>03:48:11,460 DEBUG anaconda: populate_raid: 1, raid1
>03:48:11,461 INFO anaconda: getting device type for RAID
>03:48:12,660 DEBUG anaconda: removing device 'existing 2039MB mdarray dhcppc0:swap (3) with existing swap' from page Fedora Linux 19 for x86_64
>03:48:15,054 INFO anaconda: ui: removed device dhcppc0:swap
>03:48:15,075 DEBUG anaconda: ui: devices=['/LiveOS/rootfs.img', '/overlay (deleted)', '/run/install/repo/LiveOS/squashfs.img', 'live-rw', 'loop0', 'loop1', 'loop2', 'sda', 'sdb', 'sdc', 'sdd', 'sr0']
>03:48:15,076 DEBUG anaconda: ui: unused=[]
>03:48:15,076 DEBUG anaconda: ui: new_devices=[]
>03:48:15,082 DEBUG anaconda: page clicked: New Fedora 19-Beta-TC4 Installation
>03:48:15,083 DEBUG anaconda: show first mountpoint: New Fedora 19-Beta-TC4 Installation
>03:48:15,084 DEBUG anaconda: page clicked: New Fedora 19-Beta-TC4 Installation
>03:48:15,085 DEBUG anaconda: show first mountpoint: New Fedora 19-Beta-TC4 Installation
>03:48:28,829 DEBUG anaconda: requested size = 768 MB ; available space = 47.99 GB
>03:48:29,166 DEBUG anaconda: ui: devices=['/LiveOS/rootfs.img', '/overlay (deleted)', '/run/install/repo/LiveOS/squashfs.img', 'live-rw', 'loop0', 'loop1', 'loop2', 'sda', 'sda1', 'sdb', 'sdc', 'sdd', 'sr0']
>03:48:29,167 DEBUG anaconda: ui: unused=[]
>03:48:29,167 DEBUG anaconda: ui: new_devices=['sda1']
>03:48:29,173 DEBUG anaconda: page clicked: New Fedora 19-Beta-TC4 Installation
>03:48:29,173 DEBUG anaconda: show first mountpoint: New Fedora 19-Beta-TC4 Installation
>03:48:29,175 DEBUG anaconda: populate_right_side: non-existent 768MB partition sda1 (24) with non-existent swap
>03:48:29,176 DEBUG anaconda: updated device_disks to ['sda', 'sdb', 'sdc', 'sdd']
>03:48:29,176 DEBUG anaconda: updated device_container_name to None
>03:48:29,177 DEBUG anaconda: updated device_container_raid_level to None
>03:48:29,178 DEBUG anaconda: updated device_container_encrypted to False
>03:48:29,178 DEBUG anaconda: updated device_container_size to 0
>03:48:29,181 INFO anaconda: getting device type for RAID
>03:48:29,190 INFO anaconda: getting device type for Standard Partition
>03:48:29,191 DEBUG anaconda: device_type_changed: 2 Standard Partition
>03:48:29,192 INFO anaconda: getting device type for Standard Partition
>03:48:29,193 INFO anaconda: getting device type for Standard Partition
>03:48:29,194 INFO anaconda: getting device type for Standard Partition
>03:48:29,197 INFO anaconda: getting device type for Standard Partition
>03:48:29,198 INFO anaconda: getting device type for Standard Partition
>03:48:29,200 INFO anaconda: getting device type for Standard Partition
>03:48:29,201 INFO anaconda: getting device type for Standard Partition
>03:48:29,203 INFO anaconda: getting device type for Standard Partition
>03:48:29,204 DEBUG anaconda: populate_raid: 2, None
>03:48:29,204 INFO anaconda: getting device type for Standard Partition
>03:48:29,206 INFO anaconda: getting device type for Standard Partition
>03:48:29,207 DEBUG anaconda: populate_raid: 2, None
>03:48:29,208 INFO anaconda: getting device type for Standard Partition
>03:48:29,210 DEBUG anaconda: page clicked: New Fedora 19-Beta-TC4 Installation
>03:48:29,211 DEBUG anaconda: current selector: non-existent 768MB partition sda1 (24) with non-existent swap
>03:48:29,212 DEBUG anaconda: notebook page = 1
>03:48:29,212 INFO anaconda: ui: saving changes to device sda1
>03:48:29,213 DEBUG anaconda: old name: sda1
>03:48:29,214 DEBUG anaconda: new name: None
>03:48:29,217 DEBUG anaconda: old size: 768.0
>03:48:29,218 DEBUG anaconda: new size: 768
>03:48:29,218 INFO anaconda: getting device type for Standard Partition
>03:48:29,219 DEBUG anaconda: old device type: 2
>03:48:29,220 DEBUG anaconda: new device type: 2
>03:48:29,220 DEBUG anaconda: reformat: True
>03:48:29,224 DEBUG anaconda: old fs type: swap
>03:48:29,224 DEBUG anaconda: new fs type: swap
>03:48:29,225 DEBUG anaconda: old encryption setting: False
>03:48:29,225 DEBUG anaconda: new encryption setting: False
>03:48:29,226 DEBUG anaconda: old label:
>03:48:29,226 DEBUG anaconda: new_label:
>03:48:29,227 DEBUG anaconda: old mountpoint:
>03:48:29,228 DEBUG anaconda: new mountpoint: None
>03:48:29,228 DEBUG anaconda: old raid level: None
>03:48:29,229 DEBUG anaconda: new raid level: None
>03:48:29,230 DEBUG anaconda: old container: None
>03:48:29,231 DEBUG anaconda: new container: None
>03:48:29,231 DEBUG anaconda: old container encrypted: False
>03:48:29,232 DEBUG anaconda: new container encrypted: False
>03:48:29,232 DEBUG anaconda: old container raid level: None
>03:48:29,233 DEBUG anaconda: new container raid level: None
>03:48:29,233 DEBUG anaconda: old container size request: 0
>03:48:29,234 DEBUG anaconda: new container size request: 0
>03:48:29,235 DEBUG anaconda: old disks: ['sda', 'sdb', 'sdc', 'sdd']
>03:48:29,235 DEBUG anaconda: new disks: ['sda', 'sdb', 'sdc', 'sdd']
>03:48:29,402 DEBUG anaconda: populate_right_side: non-existent 768MB partition sda1 (24) with non-existent swap
>03:48:29,403 DEBUG anaconda: updated device_disks to ['sda', 'sdb', 'sdc', 'sdd']
>03:48:29,403 DEBUG anaconda: updated device_container_name to None
>03:48:29,403 DEBUG anaconda: updated device_container_raid_level to None
>03:48:29,404 DEBUG anaconda: updated device_container_encrypted to False
>03:48:29,404 DEBUG anaconda: updated device_container_size to 0
>03:48:29,409 INFO anaconda: getting device type for Standard Partition
>03:48:29,410 DEBUG anaconda: populate_raid: 2, None
>03:48:29,410 INFO anaconda: getting device type for Standard Partition
>03:48:29,412 DEBUG anaconda: leaving save_right_side
>03:48:29,412 DEBUG anaconda: show first mountpoint: New Fedora 19-Beta-TC4 Installation
>03:48:29,412 DEBUG anaconda: populate_right_side: non-existent 768MB partition sda1 (24) with non-existent swap
>03:48:29,413 DEBUG anaconda: updated device_disks to ['sda', 'sdb', 'sdc', 'sdd']
>03:48:29,413 DEBUG anaconda: updated device_container_name to None
>03:48:29,414 DEBUG anaconda: updated device_container_raid_level to None
>03:48:29,414 DEBUG anaconda: updated device_container_encrypted to False
>03:48:29,414 DEBUG anaconda: updated device_container_size to 0
>03:48:29,419 INFO anaconda: getting device type for Standard Partition
>03:48:29,420 DEBUG anaconda: populate_raid: 2, None
>03:48:29,421 INFO anaconda: getting device type for Standard Partition
>03:48:31,638 INFO anaconda: getting device type for RAID
>03:48:31,641 DEBUG anaconda: device_type_changed: 1 RAID
>03:48:31,642 INFO anaconda: getting device type for RAID
>03:48:31,644 INFO anaconda: getting device type for RAID
>03:48:31,645 INFO anaconda: getting device type for RAID
>03:48:31,646 INFO anaconda: getting device type for RAID
>03:48:31,647 INFO anaconda: getting device type for RAID
>03:48:31,648 INFO anaconda: getting device type for RAID
>03:48:31,649 INFO anaconda: getting device type for RAID
>03:48:31,650 INFO anaconda: getting device type for RAID
>03:48:31,651 INFO anaconda: getting device type for RAID
>03:48:31,652 INFO anaconda: getting device type for RAID
>03:48:31,653 INFO anaconda: getting device type for RAID
>03:48:31,654 INFO anaconda: getting device type for RAID
>03:48:31,655 INFO anaconda: getting device type for RAID
>03:48:31,656 DEBUG anaconda: populate_raid: 1, raid1
>03:48:31,657 INFO anaconda: getting device type for RAID
>03:48:37,690 INFO anaconda: ui: saving changes to device sda1
>03:48:37,693 DEBUG anaconda: old name: sda1
>03:48:37,693 DEBUG anaconda: new name: swap
>03:48:37,702 DEBUG anaconda: old size: 768.0
>03:48:37,704 DEBUG anaconda: new size: 768
>03:48:37,705 INFO anaconda: getting device type for RAID
>03:48:37,708 DEBUG anaconda: old device type: 2
>03:48:37,709 DEBUG anaconda: new device type: 1
>03:48:37,710 DEBUG anaconda: reformat: True
>03:48:37,712 DEBUG anaconda: old fs type: swap
>03:48:37,712 DEBUG anaconda: new fs type: swap
>03:48:37,713 DEBUG anaconda: old encryption setting: False
>03:48:37,713 DEBUG anaconda: new encryption setting: False
>03:48:37,713 DEBUG anaconda: old label:
>03:48:37,714 DEBUG anaconda: new_label:
>03:48:37,714 DEBUG anaconda: old mountpoint:
>03:48:37,714 DEBUG anaconda: new mountpoint: None
>03:48:37,715 DEBUG anaconda: old raid level: None
>03:48:37,715 DEBUG anaconda: new raid level: raid10
>03:48:37,716 DEBUG anaconda: old container: None
>03:48:37,716 DEBUG anaconda: new container: None
>03:48:37,717 DEBUG anaconda: old container encrypted: False
>03:48:37,717 DEBUG anaconda: new container encrypted: False
>03:48:37,717 DEBUG anaconda: old container raid level: None
>03:48:37,717 DEBUG anaconda: new container raid level: None
>03:48:37,718 DEBUG anaconda: old container size request: 0
>03:48:37,718 DEBUG anaconda: new container size request: 0
>03:48:37,718 DEBUG anaconda: old disks: ['sda', 'sdb', 'sdc', 'sdd']
>03:48:37,719 DEBUG anaconda: new disks: ['sda', 'sdb', 'sdc', 'sdd']
>03:48:38,197 DEBUG anaconda: populate_right_side: non-existent 768MB mdarray swap (29) with non-existent swap
>03:48:38,198 DEBUG anaconda: updated device_disks to ['sda', 'sdb', 'sdc', 'sdd']
>03:48:38,198 DEBUG anaconda: updated device_container_name to None
>03:48:38,199 DEBUG anaconda: updated device_container_raid_level to None
>03:48:38,199 DEBUG anaconda: updated device_container_encrypted to False
>03:48:38,199 DEBUG anaconda: updated device_container_size to 0
>03:48:38,207 INFO anaconda: getting device type for RAID
>03:48:38,208 DEBUG anaconda: populate_raid: 1, raid10
>03:48:38,209 INFO anaconda: getting device type for RAID
>03:48:38,210 DEBUG anaconda: leaving save_right_side
>03:48:39,753 INFO anaconda: ui: saving changes to device swap
>03:48:39,755 DEBUG anaconda: old name: swap
>03:48:39,758 DEBUG anaconda: new name: swap
>03:48:39,771 DEBUG anaconda: old size: 768.0
>03:48:39,771 DEBUG anaconda: new size: 768
>03:48:39,772 INFO anaconda: getting device type for RAID
>03:48:39,773 DEBUG anaconda: old device type: 1
>03:48:39,773 DEBUG anaconda: new device type: 1
>03:48:39,774 DEBUG anaconda: reformat: True
>03:48:39,777 DEBUG anaconda: old fs type: swap
>03:48:39,778 DEBUG anaconda: new fs type: swap
>03:48:39,779 DEBUG anaconda: old encryption setting: False
>03:48:39,779 DEBUG anaconda: new encryption setting: False
>03:48:39,780 DEBUG anaconda: old label:
>03:48:39,780 DEBUG anaconda: new_label:
>03:48:39,781 DEBUG anaconda: old mountpoint:
>03:48:39,781 DEBUG anaconda: new mountpoint: None
>03:48:39,782 DEBUG anaconda: old raid level: raid10
>03:48:39,783 DEBUG anaconda: new raid level: raid10
>03:48:39,784 DEBUG anaconda: old container: None
>03:48:39,785 DEBUG anaconda: new container: None
>03:48:39,785 DEBUG anaconda: old container encrypted: False
>03:48:39,786 DEBUG anaconda: new container encrypted: False
>03:48:39,786 DEBUG anaconda: old container raid level: None
>03:48:39,787 DEBUG anaconda: new container raid level: None
>03:48:39,787 DEBUG anaconda: old container size request: 0
>03:48:39,788 DEBUG anaconda: new container size request: 0
>03:48:39,789 DEBUG anaconda: old disks: ['sda', 'sdb', 'sdc', 'sdd']
>03:48:39,789 DEBUG anaconda: new disks: ['sda', 'sdb', 'sdc', 'sdd']
>03:48:40,312 DEBUG anaconda: populate_right_side: non-existent 768MB mdarray swap (29) with non-existent swap
>03:48:40,313 DEBUG anaconda: updated device_disks to ['sda', 'sdb', 'sdc', 'sdd']
>03:48:40,313 DEBUG anaconda: updated device_container_name to None
>03:48:40,314 DEBUG anaconda: updated device_container_raid_level to None
>03:48:40,314 DEBUG anaconda: updated device_container_encrypted to False
>03:48:40,314 DEBUG anaconda: updated device_container_size to 0
>03:48:40,322 INFO anaconda: getting device type for RAID
>03:48:40,323 DEBUG anaconda: populate_raid: 1, raid10
>03:48:40,324 INFO anaconda: getting device type for RAID
>03:48:40,325 DEBUG anaconda: leaving save_right_side
>03:48:47,270 DEBUG anaconda: requested size = 512 MB ; available space = 46.46 GB
>03:48:47,738 DEBUG anaconda: ui: devices=['/LiveOS/rootfs.img', '/overlay (deleted)', '/run/install/repo/LiveOS/squashfs.img', 'live-rw', 'loop0', 'loop1', 'loop2', 'sda', 'sda1', 'sda2', 'sdb', 'sdb1', 'sdc', 'sdc1', 'sdd', 'sdd1', 'sr0', 'swap']
>03:48:47,739 DEBUG anaconda: ui: unused=[]
>03:48:47,739 DEBUG anaconda: ui: new_devices=['sda1', 'swap']
>03:48:47,749 DEBUG anaconda: page clicked: New Fedora 19-Beta-TC4 Installation
>03:48:47,750 DEBUG anaconda: show first mountpoint: New Fedora 19-Beta-TC4 Installation
>03:48:47,750 DEBUG anaconda: populate_right_side: non-existent 512MB partition sda1 (30) with non-existent ext4 filesystem mounted at /boot
>03:48:47,751 DEBUG anaconda: updated device_disks to ['sda', 'sdb', 'sdc', 'sdd']
>03:48:47,751 DEBUG anaconda: updated device_container_name to None
>03:48:47,751 DEBUG anaconda: updated device_container_raid_level to None
>03:48:47,752 DEBUG anaconda: updated device_container_encrypted to False
>03:48:47,752 DEBUG anaconda: updated device_container_size to 0
>03:48:47,755 DEBUG anaconda: fs type changed: ext4
>03:48:47,762 INFO anaconda: getting device type for Standard Partition
>03:48:47,763 DEBUG anaconda: device_type_changed: 2 Standard Partition
>03:48:47,764 INFO anaconda: getting device type for Standard Partition
>03:48:47,765 INFO anaconda: getting device type for Standard Partition
>03:48:47,767 INFO anaconda: getting device type for Standard Partition
>03:48:47,768 INFO anaconda: getting device type for Standard Partition
>03:48:47,770 INFO anaconda: getting device type for Standard Partition
>03:48:47,771 INFO anaconda: getting device type for Standard Partition
>03:48:47,773 INFO anaconda: getting device type for Standard Partition
>03:48:47,774 INFO anaconda: getting device type for Standard Partition
>03:48:47,776 DEBUG anaconda: populate_raid: 2, None
>03:48:47,776 INFO anaconda: getting device type for Standard Partition
>03:48:47,778 INFO anaconda: getting device type for Standard Partition
>03:48:47,779 DEBUG anaconda: populate_raid: 2, None
>03:48:47,779 INFO anaconda: getting device type for Standard Partition
>03:48:47,782 DEBUG anaconda: page clicked: New Fedora 19-Beta-TC4 Installation
>03:48:47,782 DEBUG anaconda: current selector: non-existent 512MB partition sda1 (30) with non-existent ext4 filesystem mounted at /boot
>03:48:47,783 DEBUG anaconda: notebook page = 1
>03:48:47,783 INFO anaconda: ui: saving changes to device sda1
>03:48:47,784 DEBUG anaconda: old name: sda1
>03:48:47,784 DEBUG anaconda: new name: None
>03:48:47,787 DEBUG anaconda: old size: 512.0
>03:48:47,788 DEBUG anaconda: new size: 512
>03:48:47,788 INFO anaconda: getting device type for Standard Partition
>03:48:47,789 DEBUG anaconda: old device type: 2
>03:48:47,789 DEBUG anaconda: new device type: 2
>03:48:47,790 DEBUG anaconda: reformat: True
>03:48:47,793 DEBUG anaconda: old fs type: ext4
>03:48:47,793 DEBUG anaconda: new fs type: ext4
>03:48:47,794 DEBUG anaconda: old encryption setting: False
>03:48:47,794 DEBUG anaconda: new encryption setting: False
>03:48:47,794 DEBUG anaconda: old label:
>03:48:47,795 DEBUG anaconda: new_label:
>03:48:47,795 DEBUG anaconda: old mountpoint: /boot
>03:48:47,795 DEBUG anaconda: new mountpoint: /boot
>03:48:47,796 DEBUG anaconda: old raid level: None
>03:48:47,797 DEBUG anaconda: new raid level: None
>03:48:47,798 DEBUG anaconda: old container: None
>03:48:47,798 DEBUG anaconda: new container: None
>03:48:47,798 DEBUG anaconda: old container encrypted: False
>03:48:47,799 DEBUG anaconda: new container encrypted: False
>03:48:47,799 DEBUG anaconda: old container raid level: None
>03:48:47,800 DEBUG anaconda: new container raid level: None
>03:48:47,800 DEBUG anaconda: old container size request: 0
>03:48:47,800 DEBUG anaconda: new container size request: 0
>03:48:47,801 DEBUG anaconda: old disks: ['sda', 'sdb', 'sdc', 'sdd']
>03:48:47,801 DEBUG anaconda: new disks: ['sda', 'sdb', 'sdc', 'sdd']
>03:48:47,801 DEBUG anaconda: nothing changed for new device
>03:48:47,802 DEBUG anaconda: show first mountpoint: New Fedora 19-Beta-TC4 Installation
>03:48:47,802 DEBUG anaconda: populate_right_side: non-existent 512MB partition sda1 (30) with non-existent ext4 filesystem mounted at /boot
>03:48:47,803 DEBUG anaconda: updated device_disks to ['sda', 'sdb', 'sdc', 'sdd']
>03:48:47,803 DEBUG anaconda: updated device_container_name to None
>03:48:47,804 DEBUG anaconda: updated device_container_raid_level to None
>03:48:47,804 DEBUG anaconda: updated device_container_encrypted to False
>03:48:47,804 DEBUG anaconda: updated device_container_size to 0
>03:48:47,811 INFO anaconda: getting device type for Standard Partition
>03:48:47,812 DEBUG anaconda: populate_raid: 2, None
>03:48:47,813 INFO anaconda: getting device type for Standard Partition
>03:48:51,186 INFO anaconda: getting device type for RAID
>03:48:51,187 DEBUG anaconda: device_type_changed: 1 RAID
>03:48:51,188 INFO anaconda: getting device type for RAID
>03:48:51,189 INFO anaconda: getting device type for RAID
>03:48:51,190 INFO anaconda: getting device type for RAID
>03:48:51,192 INFO anaconda: getting device type for RAID
>03:48:51,193 INFO anaconda: getting device type for RAID
>03:48:51,194 INFO anaconda: getting device type for RAID
>03:48:51,195 INFO anaconda: getting device type for RAID
>03:48:51,196 INFO anaconda: getting device type for RAID
>03:48:51,197 INFO anaconda: getting device type for RAID
>03:48:51,199 INFO anaconda: getting device type for RAID
>03:48:51,200 INFO anaconda: getting device type for RAID
>03:48:51,201 INFO anaconda: getting device type for RAID
>03:48:51,202 INFO anaconda: getting device type for RAID
>03:48:51,203 DEBUG anaconda: populate_raid: 1, raid1
>03:48:51,204 INFO anaconda: getting device type for RAID
>03:48:59,595 INFO anaconda: ui: saving changes to device sda1
>03:48:59,597 DEBUG anaconda: old name: sda1
>03:48:59,599 DEBUG anaconda: new name: boot
>03:48:59,605 DEBUG anaconda: old size: 512.0
>03:48:59,605 DEBUG anaconda: new size: 512
>03:48:59,606 INFO anaconda: getting device type for RAID
>03:48:59,607 DEBUG anaconda: old device type: 2
>03:48:59,607 DEBUG anaconda: new device type: 1
>03:48:59,608 DEBUG anaconda: reformat: True
>03:48:59,611 DEBUG anaconda: old fs type: ext4
>03:48:59,612 DEBUG anaconda: new fs type: ext4
>03:48:59,613 DEBUG anaconda: old encryption setting: False
>03:48:59,613 DEBUG anaconda: new encryption setting: False
>03:48:59,614 DEBUG anaconda: old label:
>03:48:59,614 DEBUG anaconda: new_label:
>03:48:59,615 DEBUG anaconda: old mountpoint: /boot
>03:48:59,615 DEBUG anaconda: new mountpoint: /boot
>03:48:59,617 DEBUG anaconda: old raid level: None
>03:48:59,617 DEBUG anaconda: new raid level: raid1
>03:48:59,618 DEBUG anaconda: old container: None
>03:48:59,619 DEBUG anaconda: new container: None
>03:48:59,619 DEBUG anaconda: old container encrypted: False
>03:48:59,620 DEBUG anaconda: new container encrypted: False
>03:48:59,620 DEBUG anaconda: old container raid level: None
>03:48:59,621 DEBUG anaconda: new container raid level: None
>03:48:59,621 DEBUG anaconda: old container size request: 0
>03:48:59,622 DEBUG anaconda: new container size request: 0
>03:48:59,623 DEBUG anaconda: old disks: ['sda', 'sdb', 'sdc', 'sdd']
>03:48:59,623 DEBUG anaconda: new disks: ['sda', 'sdb', 'sdc', 'sdd']
>03:49:00,423 DEBUG anaconda: populate_right_side: non-existent 512MB mdarray boot (35) with non-existent ext4 filesystem mounted at /boot
>03:49:00,423 DEBUG anaconda: updated device_disks to ['sda', 'sdb', 'sdc', 'sdd']
>03:49:00,424 DEBUG anaconda: updated device_container_name to None
>03:49:00,424 DEBUG anaconda: updated device_container_raid_level to None
>03:49:00,424 DEBUG anaconda: updated device_container_encrypted to False
>03:49:00,425 DEBUG anaconda: updated device_container_size to 0
>03:49:00,433 INFO anaconda: getting device type for RAID
>03:49:00,434 DEBUG anaconda: populate_raid: 1, raid1
>03:49:00,435 INFO anaconda: getting device type for RAID
>03:49:00,436 DEBUG anaconda: leaving save_right_side
>03:49:01,883 INFO anaconda: ui: saving changes to device boot
>03:49:01,886 DEBUG anaconda: old name: boot
>03:49:01,889 DEBUG anaconda: new name: boot
>03:49:01,895 DEBUG anaconda: old size: 512.0
>03:49:01,895 DEBUG anaconda: new size: 512
>03:49:01,896 INFO anaconda: getting device type for RAID
>03:49:01,897 DEBUG anaconda: old device type: 1
>03:49:01,897 DEBUG anaconda: new device type: 1
>03:49:01,898 DEBUG anaconda: reformat: True
>03:49:01,901 DEBUG anaconda: old fs type: ext4
>03:49:01,902 DEBUG anaconda: new fs type: ext4
>03:49:01,902 DEBUG anaconda: old encryption setting: False
>03:49:01,903 DEBUG anaconda: new encryption setting: False
>03:49:01,904 DEBUG anaconda: old label:
>03:49:01,904 DEBUG anaconda: new_label:
>03:49:01,905 DEBUG anaconda: old mountpoint: /boot
>03:49:01,905 DEBUG anaconda: new mountpoint: /boot
>03:49:01,907 DEBUG anaconda: old raid level: raid1
>03:49:01,907 DEBUG anaconda: new raid level: raid1
>03:49:01,909 DEBUG anaconda: old container: None
>03:49:01,909 DEBUG anaconda: new container: None
>03:49:01,910 DEBUG anaconda: old container encrypted: False
>03:49:01,910 DEBUG anaconda: new container encrypted: False
>03:49:01,911 DEBUG anaconda: old container
raid level: None >03:49:01,911 DEBUG anaconda: new container raid level: None >03:49:01,912 DEBUG anaconda: old container size request: 0 >03:49:01,912 DEBUG anaconda: new container size request: 0 >03:49:01,913 DEBUG anaconda: old disks: ['sda', 'sdb', 'sdc', 'sdd'] >03:49:01,913 DEBUG anaconda: new disks: ['sda', 'sdb', 'sdc', 'sdd'] >03:49:01,914 DEBUG anaconda: nothing changed for new device >03:49:12,542 DEBUG anaconda: requested size = 6 GB ; available space = 44.41 GB >03:49:13,242 DEBUG anaconda: ui: devices=['/LiveOS/rootfs.img', '/overlay (deleted)', '/run/install/repo/LiveOS/squashfs.img', 'boot', 'live-rw', 'loop0', 'loop1', 'loop2', 'sda', 'sda1', 'sda2', 'sda3', 'sdb', 'sdb1', 'sdb2', 'sdc', 'sdc1', 'sdc2', 'sdd', 'sdd1', 'sdd2', 'sr0', 'swap'] >03:49:13,243 DEBUG anaconda: ui: unused=[] >03:49:13,243 DEBUG anaconda: ui: new_devices=['swap', 'boot', 'sda3'] >03:49:13,258 DEBUG anaconda: page clicked: New Fedora 19-Beta-TC4 Installation >03:49:13,259 DEBUG anaconda: show first mountpoint: New Fedora 19-Beta-TC4 Installation >03:49:13,261 DEBUG anaconda: populate_right_side: non-existent 512MB mdarray boot (35) with non-existent ext4 filesystem mounted at /boot >03:49:13,262 DEBUG anaconda: updated device_disks to ['sda', 'sdb', 'sdc', 'sdd'] >03:49:13,262 DEBUG anaconda: updated device_container_name to None >03:49:13,262 DEBUG anaconda: updated device_container_raid_level to None >03:49:13,263 DEBUG anaconda: updated device_container_encrypted to False >03:49:13,263 DEBUG anaconda: updated device_container_size to 0 >03:49:13,271 INFO anaconda: getting device type for RAID >03:49:13,272 DEBUG anaconda: populate_raid: 1, raid1 >03:49:13,273 INFO anaconda: getting device type for RAID >03:49:13,276 DEBUG anaconda: page clicked: New Fedora 19-Beta-TC4 Installation >03:49:13,278 DEBUG anaconda: current selector: non-existent 512MB mdarray boot (35) with non-existent ext4 filesystem mounted at /boot >03:49:13,279 DEBUG anaconda: notebook page = 1 
>03:49:13,279 INFO anaconda: ui: saving changes to device boot >03:49:13,280 DEBUG anaconda: old name: boot >03:49:13,280 DEBUG anaconda: new name: boot >03:49:13,285 DEBUG anaconda: old size: 512.0 >03:49:13,286 DEBUG anaconda: new size: 512 >03:49:13,286 INFO anaconda: getting device type for RAID >03:49:13,287 DEBUG anaconda: old device type: 1 >03:49:13,287 DEBUG anaconda: new device type: 1 >03:49:13,288 DEBUG anaconda: reformat: True >03:49:13,291 DEBUG anaconda: old fs type: ext4 >03:49:13,291 DEBUG anaconda: new fs type: ext4 >03:49:13,291 DEBUG anaconda: old encryption setting: False >03:49:13,292 DEBUG anaconda: new encryption setting: False >03:49:13,292 DEBUG anaconda: old label: >03:49:13,292 DEBUG anaconda: new_label: >03:49:13,293 DEBUG anaconda: old mountpoint: /boot >03:49:13,293 DEBUG anaconda: new mountpoint: /boot >03:49:13,294 DEBUG anaconda: old raid level: raid1 >03:49:13,295 DEBUG anaconda: new raid level: raid1 >03:49:13,296 DEBUG anaconda: old container: None >03:49:13,296 DEBUG anaconda: new container: None >03:49:13,297 DEBUG anaconda: old container encrypted: False >03:49:13,297 DEBUG anaconda: new container encrypted: False >03:49:13,297 DEBUG anaconda: old container raid level: None >03:49:13,298 DEBUG anaconda: new container raid level: None >03:49:13,298 DEBUG anaconda: old container size request: 0 >03:49:13,298 DEBUG anaconda: new container size request: 0 >03:49:13,299 DEBUG anaconda: old disks: ['sda', 'sdb', 'sdc', 'sdd'] >03:49:13,299 DEBUG anaconda: new disks: ['sda', 'sdb', 'sdc', 'sdd'] >03:49:13,299 DEBUG anaconda: nothing changed for new device >03:49:13,300 DEBUG anaconda: show first mountpoint: New Fedora 19-Beta-TC4 Installation >03:49:13,301 DEBUG anaconda: populate_right_side: non-existent 6000MB partition sda3 (36) with non-existent ext4 filesystem mounted at / >03:49:13,301 DEBUG anaconda: updated device_disks to ['sda', 'sdb', 'sdc', 'sdd'] >03:49:13,302 DEBUG anaconda: updated device_container_name to None 
>03:49:13,302 DEBUG anaconda: updated device_container_raid_level to None >03:49:13,302 DEBUG anaconda: updated device_container_encrypted to False >03:49:13,303 DEBUG anaconda: updated device_container_size to 0 >03:49:13,310 INFO anaconda: getting device type for Standard Partition >03:49:13,311 DEBUG anaconda: device_type_changed: 2 Standard Partition >03:49:13,311 INFO anaconda: getting device type for Standard Partition >03:49:13,312 INFO anaconda: getting device type for Standard Partition >03:49:13,314 INFO anaconda: getting device type for Standard Partition >03:49:13,316 INFO anaconda: getting device type for Standard Partition >03:49:13,317 INFO anaconda: getting device type for Standard Partition >03:49:13,319 INFO anaconda: getting device type for Standard Partition >03:49:13,321 INFO anaconda: getting device type for Standard Partition >03:49:13,323 INFO anaconda: getting device type for Standard Partition >03:49:13,324 DEBUG anaconda: populate_raid: 2, None >03:49:13,325 INFO anaconda: getting device type for Standard Partition >03:49:13,327 INFO anaconda: getting device type for Standard Partition >03:49:13,328 DEBUG anaconda: populate_raid: 2, None >03:49:13,329 INFO anaconda: getting device type for Standard Partition >03:49:16,499 INFO anaconda: getting device type for RAID >03:49:16,502 DEBUG anaconda: device_type_changed: 1 RAID >03:49:16,503 INFO anaconda: getting device type for RAID >03:49:16,504 INFO anaconda: getting device type for RAID >03:49:16,505 INFO anaconda: getting device type for RAID >03:49:16,506 INFO anaconda: getting device type for RAID >03:49:16,507 INFO anaconda: getting device type for RAID >03:49:16,508 INFO anaconda: getting device type for RAID >03:49:16,509 INFO anaconda: getting device type for RAID >03:49:16,511 INFO anaconda: getting device type for RAID >03:49:16,512 INFO anaconda: getting device type for RAID >03:49:16,513 INFO anaconda: getting device type for RAID >03:49:16,514 INFO anaconda: getting device type 
for RAID >03:49:16,515 INFO anaconda: getting device type for RAID >03:49:16,517 INFO anaconda: getting device type for RAID >03:49:16,518 DEBUG anaconda: populate_raid: 1, raid1 >03:49:16,519 INFO anaconda: getting device type for RAID >03:49:19,744 INFO anaconda: ui: saving changes to device sda3 >03:49:19,746 DEBUG anaconda: old name: sda3 >03:49:19,747 DEBUG anaconda: new name: root >03:49:19,755 DEBUG anaconda: old size: 6000.0 >03:49:19,755 DEBUG anaconda: new size: 6000 >03:49:19,756 INFO anaconda: getting device type for RAID >03:49:19,757 DEBUG anaconda: old device type: 2 >03:49:19,757 DEBUG anaconda: new device type: 1 >03:49:19,757 DEBUG anaconda: reformat: True >03:49:19,760 DEBUG anaconda: old fs type: ext4 >03:49:19,760 DEBUG anaconda: new fs type: ext4 >03:49:19,761 DEBUG anaconda: old encryption setting: False >03:49:19,761 DEBUG anaconda: new encryption setting: False >03:49:19,762 DEBUG anaconda: old label: >03:49:19,762 DEBUG anaconda: new_label: >03:49:19,762 DEBUG anaconda: old mountpoint: / >03:49:19,763 DEBUG anaconda: new mountpoint: / >03:49:19,764 DEBUG anaconda: old raid level: None >03:49:19,764 DEBUG anaconda: new raid level: raid10 >03:49:19,765 DEBUG anaconda: old container: None >03:49:19,766 DEBUG anaconda: new container: None >03:49:19,766 DEBUG anaconda: old container encrypted: False >03:49:19,766 DEBUG anaconda: new container encrypted: False >03:49:19,767 DEBUG anaconda: old container raid level: None >03:49:19,767 DEBUG anaconda: new container raid level: None >03:49:19,767 DEBUG anaconda: old container size request: 0 >03:49:19,768 DEBUG anaconda: new container size request: 0 >03:49:19,768 DEBUG anaconda: old disks: ['sda', 'sdb', 'sdc', 'sdd'] >03:49:19,768 DEBUG anaconda: new disks: ['sda', 'sdb', 'sdc', 'sdd'] >03:49:20,750 DEBUG anaconda: populate_right_side: non-existent 6000MB mdarray root (41) with non-existent ext4 filesystem mounted at / >03:49:20,750 DEBUG anaconda: updated device_disks to ['sda', 'sdb', 'sdc', 
'sdd'] >03:49:20,751 DEBUG anaconda: updated device_container_name to None >03:49:20,751 DEBUG anaconda: updated device_container_raid_level to None >03:49:20,751 DEBUG anaconda: updated device_container_encrypted to False >03:49:20,752 DEBUG anaconda: updated device_container_size to 0 >03:49:20,760 INFO anaconda: getting device type for RAID >03:49:20,761 DEBUG anaconda: populate_raid: 1, raid10 >03:49:20,761 INFO anaconda: getting device type for RAID >03:49:20,763 DEBUG anaconda: leaving save_right_side >03:49:37,902 INFO anaconda: ui: saving changes to device root >03:49:37,903 DEBUG anaconda: old name: root >03:49:37,904 DEBUG anaconda: new name: root >03:49:37,910 DEBUG anaconda: old size: 6000.0 >03:49:37,911 DEBUG anaconda: new size: 6000 >03:49:37,911 INFO anaconda: getting device type for RAID >03:49:37,912 DEBUG anaconda: old device type: 1 >03:49:37,913 DEBUG anaconda: new device type: 1 >03:49:37,914 DEBUG anaconda: reformat: True >03:49:37,917 DEBUG anaconda: old fs type: ext4 >03:49:37,918 DEBUG anaconda: new fs type: ext4 >03:49:37,918 DEBUG anaconda: old encryption setting: False >03:49:37,919 DEBUG anaconda: new encryption setting: False >03:49:37,919 DEBUG anaconda: old label: >03:49:37,920 DEBUG anaconda: new_label: >03:49:37,920 DEBUG anaconda: old mountpoint: / >03:49:37,921 DEBUG anaconda: new mountpoint: / >03:49:37,922 DEBUG anaconda: old raid level: raid10 >03:49:37,923 DEBUG anaconda: new raid level: raid10 >03:49:37,924 DEBUG anaconda: old container: None >03:49:37,925 DEBUG anaconda: new container: None >03:49:37,925 DEBUG anaconda: old container encrypted: False >03:49:37,926 DEBUG anaconda: new container encrypted: False >03:49:37,927 DEBUG anaconda: old container raid level: None >03:49:37,927 DEBUG anaconda: new container raid level: None >03:49:37,928 DEBUG anaconda: old container size request: 0 >03:49:37,928 DEBUG anaconda: new container size request: 0 >03:49:37,929 DEBUG anaconda: old disks: ['sda', 'sdb', 'sdc', 'sdd'] 
>03:49:37,929 DEBUG anaconda: new disks: ['sda', 'sdb', 'sdc', 'sdd'] >03:49:37,930 DEBUG anaconda: nothing changed for new device >03:49:42,583 DEBUG anaconda: current selector: non-existent 6000MB mdarray root (41) with non-existent ext4 filesystem mounted at / >03:49:42,585 DEBUG anaconda: notebook page = 1 >03:49:42,587 INFO anaconda: ui: saving changes to device root >03:49:42,588 DEBUG anaconda: old name: root >03:49:42,589 DEBUG anaconda: new name: root >03:49:42,596 DEBUG anaconda: old size: 6000.0 >03:49:42,596 DEBUG anaconda: new size: 6000 >03:49:42,597 INFO anaconda: getting device type for RAID >03:49:42,598 DEBUG anaconda: old device type: 1 >03:49:42,599 DEBUG anaconda: new device type: 1 >03:49:42,599 DEBUG anaconda: reformat: True >03:49:42,603 DEBUG anaconda: old fs type: ext4 >03:49:42,603 DEBUG anaconda: new fs type: ext4 >03:49:42,604 DEBUG anaconda: old encryption setting: False >03:49:42,605 DEBUG anaconda: new encryption setting: False >03:49:42,605 DEBUG anaconda: old label: >03:49:42,606 DEBUG anaconda: new_label: >03:49:42,607 DEBUG anaconda: old mountpoint: / >03:49:42,607 DEBUG anaconda: new mountpoint: / >03:49:42,609 DEBUG anaconda: old raid level: raid10 >03:49:42,609 DEBUG anaconda: new raid level: raid10 >03:49:42,611 DEBUG anaconda: old container: None >03:49:42,611 DEBUG anaconda: new container: None >03:49:42,612 DEBUG anaconda: old container encrypted: False >03:49:42,612 DEBUG anaconda: new container encrypted: False >03:49:42,613 DEBUG anaconda: old container raid level: None >03:49:42,614 DEBUG anaconda: new container raid level: None >03:49:42,614 DEBUG anaconda: old container size request: 0 >03:49:42,615 DEBUG anaconda: new container size request: 0 >03:49:42,615 DEBUG anaconda: old disks: ['sda', 'sdb', 'sdc', 'sdd'] >03:49:42,616 DEBUG anaconda: new disks: ['sda', 'sdb', 'sdc', 'sdd'] >03:49:42,616 DEBUG anaconda: nothing changed for new device >03:49:42,620 DEBUG anaconda: new selector: non-existent 6000MB mdarray root 
(41) with non-existent ext4 filesystem mounted at / >03:49:42,623 DEBUG anaconda: populate_right_side: non-existent 6000MB mdarray root (41) with non-existent ext4 filesystem mounted at / >03:49:42,624 DEBUG anaconda: updated device_disks to ['sda', 'sdb', 'sdc', 'sdd'] >03:49:42,624 DEBUG anaconda: updated device_container_name to None >03:49:42,625 DEBUG anaconda: updated device_container_raid_level to None >03:49:42,626 DEBUG anaconda: updated device_container_encrypted to False >03:49:42,626 DEBUG anaconda: updated device_container_size to 0 >03:49:42,635 INFO anaconda: getting device type for RAID >03:49:42,636 DEBUG anaconda: populate_raid: 1, raid10 >03:49:42,637 INFO anaconda: getting device type for RAID >03:49:43,848 DEBUG anaconda: current selector: non-existent 6000MB mdarray root (41) with non-existent ext4 filesystem mounted at / >03:49:43,850 DEBUG anaconda: notebook page = 1 >03:49:43,852 INFO anaconda: ui: saving changes to device root >03:49:43,854 DEBUG anaconda: old name: root >03:49:43,855 DEBUG anaconda: new name: root >03:49:43,862 DEBUG anaconda: old size: 6000.0 >03:49:43,863 DEBUG anaconda: new size: 6000 >03:49:43,863 INFO anaconda: getting device type for RAID >03:49:43,864 DEBUG anaconda: old device type: 1 >03:49:43,865 DEBUG anaconda: new device type: 1 >03:49:43,865 DEBUG anaconda: reformat: True >03:49:43,869 DEBUG anaconda: old fs type: ext4 >03:49:43,869 DEBUG anaconda: new fs type: ext4 >03:49:43,870 DEBUG anaconda: old encryption setting: False >03:49:43,871 DEBUG anaconda: new encryption setting: False >03:49:43,871 DEBUG anaconda: old label: >03:49:43,872 DEBUG anaconda: new_label: >03:49:43,872 DEBUG anaconda: old mountpoint: / >03:49:43,873 DEBUG anaconda: new mountpoint: / >03:49:43,874 DEBUG anaconda: old raid level: raid10 >03:49:43,874 DEBUG anaconda: new raid level: raid10 >03:49:43,876 DEBUG anaconda: old container: None >03:49:43,876 DEBUG anaconda: new container: None >03:49:43,877 DEBUG anaconda: old container 
encrypted: False >03:49:43,877 DEBUG anaconda: new container encrypted: False >03:49:43,878 DEBUG anaconda: old container raid level: None >03:49:43,879 DEBUG anaconda: new container raid level: None >03:49:43,879 DEBUG anaconda: old container size request: 0 >03:49:43,880 DEBUG anaconda: new container size request: 0 >03:49:43,880 DEBUG anaconda: old disks: ['sda', 'sdb', 'sdc', 'sdd'] >03:49:43,881 DEBUG anaconda: new disks: ['sda', 'sdb', 'sdc', 'sdd'] >03:49:43,881 DEBUG anaconda: nothing changed for new device >03:49:43,884 DEBUG anaconda: new selector: non-existent 512MB mdarray boot (35) with non-existent ext4 filesystem mounted at /boot >03:49:43,888 DEBUG anaconda: populate_right_side: non-existent 512MB mdarray boot (35) with non-existent ext4 filesystem mounted at /boot >03:49:43,888 DEBUG anaconda: updated device_disks to ['sda', 'sdb', 'sdc', 'sdd'] >03:49:43,889 DEBUG anaconda: updated device_container_name to None >03:49:43,889 DEBUG anaconda: updated device_container_raid_level to None >03:49:43,890 DEBUG anaconda: updated device_container_encrypted to False >03:49:43,891 DEBUG anaconda: updated device_container_size to 0 >03:49:43,900 INFO anaconda: getting device type for RAID >03:49:43,901 DEBUG anaconda: populate_raid: 1, raid1 >03:49:43,901 INFO anaconda: getting device type for RAID >03:49:46,638 DEBUG anaconda: current selector: non-existent 512MB mdarray boot (35) with non-existent ext4 filesystem mounted at /boot >03:49:46,640 DEBUG anaconda: notebook page = 1 >03:49:46,641 INFO anaconda: ui: saving changes to device boot >03:49:46,641 DEBUG anaconda: old name: boot >03:49:46,642 DEBUG anaconda: new name: boot >03:49:46,647 DEBUG anaconda: old size: 512.0 >03:49:46,648 DEBUG anaconda: new size: 512 >03:49:46,648 INFO anaconda: getting device type for RAID >03:49:46,649 DEBUG anaconda: old device type: 1 >03:49:46,650 DEBUG anaconda: new device type: 1 >03:49:46,650 DEBUG anaconda: reformat: True >03:49:46,654 DEBUG anaconda: old fs type: 
ext4 >03:49:46,654 DEBUG anaconda: new fs type: ext4 >03:49:46,655 DEBUG anaconda: old encryption setting: False >03:49:46,655 DEBUG anaconda: new encryption setting: False >03:49:46,656 DEBUG anaconda: old label: >03:49:46,656 DEBUG anaconda: new_label: >03:49:46,657 DEBUG anaconda: old mountpoint: /boot >03:49:46,658 DEBUG anaconda: new mountpoint: /boot >03:49:46,659 DEBUG anaconda: old raid level: raid1 >03:49:46,659 DEBUG anaconda: new raid level: raid1 >03:49:46,661 DEBUG anaconda: old container: None >03:49:46,661 DEBUG anaconda: new container: None >03:49:46,662 DEBUG anaconda: old container encrypted: False >03:49:46,663 DEBUG anaconda: new container encrypted: False >03:49:46,663 DEBUG anaconda: old container raid level: None >03:49:46,664 DEBUG anaconda: new container raid level: None >03:49:46,664 DEBUG anaconda: old container size request: 0 >03:49:46,665 DEBUG anaconda: new container size request: 0 >03:49:46,665 DEBUG anaconda: old disks: ['sda', 'sdb', 'sdc', 'sdd'] >03:49:46,666 DEBUG anaconda: new disks: ['sda', 'sdb', 'sdc', 'sdd'] >03:49:46,666 DEBUG anaconda: nothing changed for new device >03:49:46,669 DEBUG anaconda: new selector: non-existent 768MB mdarray swap (29) with non-existent swap >03:49:46,672 DEBUG anaconda: populate_right_side: non-existent 768MB mdarray swap (29) with non-existent swap >03:49:46,673 DEBUG anaconda: updated device_disks to ['sda', 'sdb', 'sdc', 'sdd'] >03:49:46,674 DEBUG anaconda: updated device_container_name to None >03:49:46,674 DEBUG anaconda: updated device_container_raid_level to None >03:49:46,675 DEBUG anaconda: updated device_container_encrypted to False >03:49:46,676 DEBUG anaconda: updated device_container_size to 0 >03:49:46,681 DEBUG anaconda: fs type changed: swap >03:49:46,689 INFO anaconda: getting device type for RAID >03:49:46,690 DEBUG anaconda: populate_raid: 1, raid10 >03:49:46,691 INFO anaconda: getting device type for RAID >03:49:50,503 DEBUG anaconda: current selector: non-existent 768MB 
mdarray swap (29) with non-existent swap >03:49:50,505 DEBUG anaconda: notebook page = 1 >03:49:50,507 INFO anaconda: ui: saving changes to device swap >03:49:50,508 DEBUG anaconda: old name: swap >03:49:50,509 DEBUG anaconda: new name: swap >03:49:50,514 DEBUG anaconda: old size: 768.0 >03:49:50,515 DEBUG anaconda: new size: 768 >03:49:50,515 INFO anaconda: getting device type for RAID >03:49:50,516 DEBUG anaconda: old device type: 1 >03:49:50,517 DEBUG anaconda: new device type: 1 >03:49:50,518 DEBUG anaconda: reformat: True >03:49:50,521 DEBUG anaconda: old fs type: swap >03:49:50,521 DEBUG anaconda: new fs type: swap >03:49:50,522 DEBUG anaconda: old encryption setting: False >03:49:50,522 DEBUG anaconda: new encryption setting: False >03:49:50,523 DEBUG anaconda: old label: >03:49:50,524 DEBUG anaconda: new_label: >03:49:50,524 DEBUG anaconda: old mountpoint: >03:49:50,525 DEBUG anaconda: new mountpoint: None >03:49:50,525 DEBUG anaconda: old raid level: raid10 >03:49:50,526 DEBUG anaconda: new raid level: raid10 >03:49:50,527 DEBUG anaconda: old container: None >03:49:50,528 DEBUG anaconda: new container: None >03:49:50,529 DEBUG anaconda: old container encrypted: False >03:49:50,529 DEBUG anaconda: new container encrypted: False >03:49:50,530 DEBUG anaconda: old container raid level: None >03:49:50,530 DEBUG anaconda: new container raid level: None >03:49:50,531 DEBUG anaconda: old container size request: 0 >03:49:50,531 DEBUG anaconda: new container size request: 0 >03:49:50,532 DEBUG anaconda: old disks: ['sda', 'sdb', 'sdc', 'sdd'] >03:49:50,532 DEBUG anaconda: new disks: ['sda', 'sdb', 'sdc', 'sdd'] >03:49:51,583 DEBUG anaconda: populate_right_side: non-existent 768MB mdarray swap (29) with non-existent swap >03:49:51,584 DEBUG anaconda: updated device_disks to ['sda', 'sdb', 'sdc', 'sdd'] >03:49:51,584 DEBUG anaconda: updated device_container_name to None >03:49:51,584 DEBUG anaconda: updated device_container_raid_level to None >03:49:51,585 DEBUG 
anaconda: updated device_container_encrypted to False >03:49:51,585 DEBUG anaconda: updated device_container_size to 0 >03:49:51,593 INFO anaconda: getting device type for RAID >03:49:51,594 DEBUG anaconda: populate_raid: 1, raid10 >03:49:51,594 INFO anaconda: getting device type for RAID >03:49:51,595 DEBUG anaconda: leaving save_right_side >03:49:51,598 DEBUG anaconda: new selector: non-existent 6000MB mdarray root (41) with non-existent ext4 filesystem mounted at / >03:49:51,601 DEBUG anaconda: populate_right_side: non-existent 6000MB mdarray root (41) with non-existent ext4 filesystem mounted at / >03:49:51,601 DEBUG anaconda: updated device_disks to ['sda', 'sdb', 'sdc', 'sdd'] >03:49:51,602 DEBUG anaconda: updated device_container_name to None >03:49:51,602 DEBUG anaconda: updated device_container_raid_level to None >03:49:51,602 DEBUG anaconda: updated device_container_encrypted to False >03:49:51,603 DEBUG anaconda: updated device_container_size to 0 >03:49:51,608 DEBUG anaconda: fs type changed: ext4 >03:49:51,615 INFO anaconda: getting device type for RAID >03:49:51,616 DEBUG anaconda: populate_raid: 1, raid10 >03:49:51,616 INFO anaconda: getting device type for RAID >03:50:38,541 DEBUG anaconda: current selector: non-existent 6000MB mdarray root (41) with non-existent ext4 filesystem mounted at / >03:50:38,542 DEBUG anaconda: notebook page = 1 >03:50:38,543 INFO anaconda: ui: saving changes to device root >03:50:38,543 DEBUG anaconda: old name: root >03:50:38,544 DEBUG anaconda: new name: root >03:50:38,550 DEBUG anaconda: old size: 6000.0 >03:50:38,551 DEBUG anaconda: new size: 6000 >03:50:38,552 INFO anaconda: getting device type for RAID >03:50:38,553 DEBUG anaconda: old device type: 1 >03:50:38,553 DEBUG anaconda: new device type: 1 >03:50:38,554 DEBUG anaconda: reformat: True >03:50:38,557 DEBUG anaconda: old fs type: ext4 >03:50:38,558 DEBUG anaconda: new fs type: ext4 >03:50:38,558 DEBUG anaconda: old encryption setting: False >03:50:38,559 DEBUG 
anaconda: new encryption setting: False >03:50:38,559 DEBUG anaconda: old label: >03:50:38,560 DEBUG anaconda: new_label: >03:50:38,561 DEBUG anaconda: old mountpoint: / >03:50:38,561 DEBUG anaconda: new mountpoint: / >03:50:38,563 DEBUG anaconda: old raid level: raid10 >03:50:38,563 DEBUG anaconda: new raid level: raid10 >03:50:38,564 DEBUG anaconda: old container: None >03:50:38,565 DEBUG anaconda: new container: None >03:50:38,565 DEBUG anaconda: old container encrypted: False >03:50:38,566 DEBUG anaconda: new container encrypted: False >03:50:38,567 DEBUG anaconda: old container raid level: None >03:50:38,567 DEBUG anaconda: new container raid level: None >03:50:38,568 DEBUG anaconda: old container size request: 0 >03:50:38,568 DEBUG anaconda: new container size request: 0 >03:50:38,569 DEBUG anaconda: old disks: ['sda', 'sdb', 'sdc', 'sdd'] >03:50:38,570 DEBUG anaconda: new disks: ['sda', 'sdb', 'sdc', 'sdd'] >03:50:38,570 DEBUG anaconda: nothing changed for new device >03:50:38,573 DEBUG anaconda: new selector: non-existent 512MB mdarray boot (35) with non-existent ext4 filesystem mounted at /boot >03:50:38,576 DEBUG anaconda: populate_right_side: non-existent 512MB mdarray boot (35) with non-existent ext4 filesystem mounted at /boot >03:50:38,577 DEBUG anaconda: updated device_disks to ['sda', 'sdb', 'sdc', 'sdd'] >03:50:38,578 DEBUG anaconda: updated device_container_name to None >03:50:38,578 DEBUG anaconda: updated device_container_raid_level to None >03:50:38,579 DEBUG anaconda: updated device_container_encrypted to False >03:50:38,579 DEBUG anaconda: updated device_container_size to 0 >03:50:38,588 INFO anaconda: getting device type for RAID >03:50:38,589 DEBUG anaconda: populate_raid: 1, raid1 >03:50:38,590 INFO anaconda: getting device type for RAID >03:50:44,042 INFO anaconda: ui: saving changes to device boot >03:50:44,043 DEBUG anaconda: old name: boot >03:50:44,043 DEBUG anaconda: new name: boot >03:50:44,049 DEBUG anaconda: old size: 512.0 
>03:50:44,049 DEBUG anaconda: new size: 512 >03:50:44,050 INFO anaconda: getting device type for RAID >03:50:44,051 DEBUG anaconda: old device type: 1 >03:50:44,051 DEBUG anaconda: new device type: 1 >03:50:44,052 DEBUG anaconda: reformat: True >03:50:44,055 DEBUG anaconda: old fs type: ext4 >03:50:44,056 DEBUG anaconda: new fs type: ext4 >03:50:44,057 DEBUG anaconda: old encryption setting: False >03:50:44,057 DEBUG anaconda: new encryption setting: False >03:50:44,058 DEBUG anaconda: old label: >03:50:44,058 DEBUG anaconda: new_label: >03:50:44,059 DEBUG anaconda: old mountpoint: /boot >03:50:44,059 DEBUG anaconda: new mountpoint: /boot >03:50:44,061 DEBUG anaconda: old raid level: raid1 >03:50:44,061 DEBUG anaconda: new raid level: raid1 >03:50:44,063 DEBUG anaconda: old container: None >03:50:44,063 DEBUG anaconda: new container: None >03:50:44,064 DEBUG anaconda: old container encrypted: False >03:50:44,064 DEBUG anaconda: new container encrypted: False >03:50:44,065 DEBUG anaconda: old container raid level: None >03:50:44,065 DEBUG anaconda: new container raid level: None >03:50:44,066 DEBUG anaconda: old container size request: 0 >03:50:44,067 DEBUG anaconda: new container size request: 0 >03:50:44,067 DEBUG anaconda: old disks: ['sda', 'sdb', 'sdc', 'sdd'] >03:50:44,068 DEBUG anaconda: new disks: ['sda', 'sdb', 'sdc', 'sdd'] >03:50:44,068 DEBUG anaconda: nothing changed for new device >03:50:54,851 DEBUG anaconda: current selector: non-existent 512MB mdarray boot (35) with non-existent ext4 filesystem mounted at /boot >03:50:54,853 DEBUG anaconda: notebook page = 1 >03:50:54,855 INFO anaconda: ui: saving changes to device boot >03:50:54,856 DEBUG anaconda: old name: boot >03:50:54,857 DEBUG anaconda: new name: boot >03:50:54,862 DEBUG anaconda: old size: 512.0 >03:50:54,863 DEBUG anaconda: new size: 512 >03:50:54,864 INFO anaconda: getting device type for RAID >03:50:54,865 DEBUG anaconda: old device type: 1 >03:50:54,865 DEBUG anaconda: new device type: 1 
>03:50:54,866 DEBUG anaconda: reformat: True >03:50:54,869 DEBUG anaconda: old fs type: ext4 >03:50:54,870 DEBUG anaconda: new fs type: ext4 >03:50:54,870 DEBUG anaconda: old encryption setting: False >03:50:54,871 DEBUG anaconda: new encryption setting: False >03:50:54,872 DEBUG anaconda: old label: >03:50:54,872 DEBUG anaconda: new_label: >03:50:54,873 DEBUG anaconda: old mountpoint: /boot >03:50:54,873 DEBUG anaconda: new mountpoint: /boot >03:50:54,875 DEBUG anaconda: old raid level: raid1 >03:50:54,875 DEBUG anaconda: new raid level: raid1 >03:50:54,877 DEBUG anaconda: old container: None >03:50:54,877 DEBUG anaconda: new container: None >03:50:54,878 DEBUG anaconda: old container encrypted: False >03:50:54,878 DEBUG anaconda: new container encrypted: False >03:50:54,879 DEBUG anaconda: old container raid level: None >03:50:54,880 DEBUG anaconda: new container raid level: None >03:50:54,880 DEBUG anaconda: old container size request: 0 >03:50:54,881 DEBUG anaconda: new container size request: 0 >03:50:54,881 DEBUG anaconda: old disks: ['sda', 'sdb', 'sdc', 'sdd'] >03:50:54,882 DEBUG anaconda: new disks: ['sda', 'sdb', 'sdc', 'sdd'] >03:50:54,882 DEBUG anaconda: nothing changed for new device >03:50:54,886 DEBUG anaconda: new selector: non-existent 768MB mdarray swap (29) with non-existent swap >03:50:54,889 DEBUG anaconda: populate_right_side: non-existent 768MB mdarray swap (29) with non-existent swap >03:50:54,890 DEBUG anaconda: updated device_disks to ['sda', 'sdb', 'sdc', 'sdd'] >03:50:54,890 DEBUG anaconda: updated device_container_name to None >03:50:54,891 DEBUG anaconda: updated device_container_raid_level to None >03:50:54,891 DEBUG anaconda: updated device_container_encrypted to False >03:50:54,892 DEBUG anaconda: updated device_container_size to 0 >03:50:54,898 DEBUG anaconda: fs type changed: swap >03:50:54,905 INFO anaconda: getting device type for RAID >03:50:54,906 DEBUG anaconda: populate_raid: 1, raid10 >03:50:54,907 INFO anaconda: getting 
device type for RAID >03:50:56,870 DEBUG anaconda: current selector: non-existent 768MB mdarray swap (29) with non-existent swap >03:50:56,871 DEBUG anaconda: notebook page = 1 >03:50:56,872 INFO anaconda: ui: saving changes to device swap >03:50:56,872 DEBUG anaconda: old name: swap >03:50:56,873 DEBUG anaconda: new name: swap >03:50:56,878 DEBUG anaconda: old size: 768.0 >03:50:56,879 DEBUG anaconda: new size: 768 >03:50:56,880 INFO anaconda: getting device type for RAID >03:50:56,880 DEBUG anaconda: old device type: 1 >03:50:56,881 DEBUG anaconda: new device type: 1 >03:50:56,881 DEBUG anaconda: reformat: True >03:50:56,884 DEBUG anaconda: old fs type: swap >03:50:56,885 DEBUG anaconda: new fs type: swap >03:50:56,886 DEBUG anaconda: old encryption setting: False >03:50:56,886 DEBUG anaconda: new encryption setting: False >03:50:56,887 DEBUG anaconda: old label: >03:50:56,887 DEBUG anaconda: new_label: >03:50:56,888 DEBUG anaconda: old mountpoint: >03:50:56,888 DEBUG anaconda: new mountpoint: None >03:50:56,889 DEBUG anaconda: old raid level: raid10 >03:50:56,889 DEBUG anaconda: new raid level: raid10 >03:50:56,891 DEBUG anaconda: old container: None >03:50:56,891 DEBUG anaconda: new container: None >03:50:56,892 DEBUG anaconda: old container encrypted: False >03:50:56,893 DEBUG anaconda: new container encrypted: False >03:50:56,893 DEBUG anaconda: old container raid level: None >03:50:56,894 DEBUG anaconda: new container raid level: None >03:50:56,894 DEBUG anaconda: old container size request: 0 >03:50:56,895 DEBUG anaconda: new container size request: 0 >03:50:56,895 DEBUG anaconda: old disks: ['sda', 'sdb', 'sdc', 'sdd'] >03:50:56,896 DEBUG anaconda: new disks: ['sda', 'sdb', 'sdc', 'sdd'] >03:50:57,936 DEBUG anaconda: populate_right_side: non-existent 768MB mdarray swap (29) with non-existent swap >03:50:57,936 DEBUG anaconda: updated device_disks to ['sda', 'sdb', 'sdc', 'sdd'] >03:50:57,937 DEBUG anaconda: updated device_container_name to None 
>03:50:57,937 DEBUG anaconda: updated device_container_raid_level to None
>03:50:57,937 DEBUG anaconda: updated device_container_encrypted to False
>03:50:57,938 DEBUG anaconda: updated device_container_size to 0
>03:50:57,945 INFO anaconda: getting device type for RAID
>03:50:57,946 DEBUG anaconda: populate_raid: 1, raid10
>03:50:57,947 INFO anaconda: getting device type for RAID
>03:50:57,948 DEBUG anaconda: leaving save_right_side
>03:50:57,950 DEBUG anaconda: new selector: non-existent 512MB mdarray boot (35) with non-existent ext4 filesystem mounted at /boot
>03:50:57,953 DEBUG anaconda: populate_right_side: non-existent 512MB mdarray boot (35) with non-existent ext4 filesystem mounted at /boot
>03:50:57,953 DEBUG anaconda: updated device_disks to ['sda', 'sdb', 'sdc', 'sdd']
>03:50:57,954 DEBUG anaconda: updated device_container_name to None
>03:50:57,954 DEBUG anaconda: updated device_container_raid_level to None
>03:50:57,954 DEBUG anaconda: updated device_container_encrypted to False
>03:50:57,955 DEBUG anaconda: updated device_container_size to 0
>03:50:57,959 DEBUG anaconda: fs type changed: ext4
>03:50:57,966 INFO anaconda: getting device type for RAID
>03:50:57,967 DEBUG anaconda: populate_raid: 1, raid1
>03:50:57,967 INFO anaconda: getting device type for RAID
>03:50:57,972 DEBUG anaconda: current selector: non-existent 512MB mdarray boot (35) with non-existent ext4 filesystem mounted at /boot
>03:50:57,972 DEBUG anaconda: notebook page = 1
>03:50:57,973 INFO anaconda: ui: saving changes to device boot
>03:50:57,973 DEBUG anaconda: old name: boot
>03:50:57,974 DEBUG anaconda: new name: boot
>03:50:57,979 DEBUG anaconda: old size: 512.0
>03:50:57,979 DEBUG anaconda: new size: 512
>03:50:57,979 INFO anaconda: getting device type for RAID
>03:50:57,980 DEBUG anaconda: old device type: 1
>03:50:57,980 DEBUG anaconda: new device type: 1
>03:50:57,981 DEBUG anaconda: reformat: True
>03:50:57,984 DEBUG anaconda: old fs type: ext4
>03:50:57,984 DEBUG anaconda: new fs type: ext4
>03:50:57,984 DEBUG anaconda: old encryption setting: False
>03:50:57,985 DEBUG anaconda: new encryption setting: False
>03:50:57,985 DEBUG anaconda: old label: 
>03:50:57,985 DEBUG anaconda: new_label: 
>03:50:57,986 DEBUG anaconda: old mountpoint: /boot
>03:50:57,986 DEBUG anaconda: new mountpoint: /boot
>03:50:57,987 DEBUG anaconda: old raid level: raid1
>03:50:57,987 DEBUG anaconda: new raid level: raid1
>03:50:57,988 DEBUG anaconda: old container: None
>03:50:57,989 DEBUG anaconda: new container: None
>03:50:57,989 DEBUG anaconda: old container encrypted: False
>03:50:57,989 DEBUG anaconda: new container encrypted: False
>03:50:57,990 DEBUG anaconda: old container raid level: None
>03:50:57,990 DEBUG anaconda: new container raid level: None
>03:50:57,990 DEBUG anaconda: old container size request: 0
>03:50:57,990 DEBUG anaconda: new container size request: 0
>03:50:57,991 DEBUG anaconda: old disks: ['sda', 'sdb', 'sdc', 'sdd']
>03:50:57,991 DEBUG anaconda: new disks: ['sda', 'sdb', 'sdc', 'sdd']
>03:50:57,991 DEBUG anaconda: nothing changed for new device
>03:50:57,994 DEBUG anaconda: new selector: non-existent 512MB mdarray boot (35) with non-existent ext4 filesystem mounted at /boot
>03:50:57,996 DEBUG anaconda: populate_right_side: non-existent 512MB mdarray boot (35) with non-existent ext4 filesystem mounted at /boot
>03:50:57,997 DEBUG anaconda: updated device_disks to ['sda', 'sdb', 'sdc', 'sdd']
>03:50:57,997 DEBUG anaconda: updated device_container_name to None
>03:50:57,998 DEBUG anaconda: updated device_container_raid_level to None
>03:50:57,998 DEBUG anaconda: updated device_container_encrypted to False
>03:50:57,998 DEBUG anaconda: updated device_container_size to 0
>03:50:58,006 INFO anaconda: getting device type for RAID
>03:50:58,007 DEBUG anaconda: populate_raid: 1, raid1
>03:50:58,008 INFO anaconda: getting device type for RAID
>03:50:58,923 DEBUG anaconda: current selector: non-existent 512MB mdarray boot (35) with non-existent ext4 filesystem mounted at /boot
>03:50:58,924 DEBUG anaconda: notebook page = 1
>03:50:58,925 INFO anaconda: ui: saving changes to device boot
>03:50:58,927 DEBUG anaconda: old name: boot
>03:50:58,928 DEBUG anaconda: new name: boot
>03:50:58,934 DEBUG anaconda: old size: 512.0
>03:50:58,934 DEBUG anaconda: new size: 512
>03:50:58,935 INFO anaconda: getting device type for RAID
>03:50:58,936 DEBUG anaconda: old device type: 1
>03:50:58,936 DEBUG anaconda: new device type: 1
>03:50:58,936 DEBUG anaconda: reformat: True
>03:50:58,939 DEBUG anaconda: old fs type: ext4
>03:50:58,940 DEBUG anaconda: new fs type: ext4
>03:50:58,940 DEBUG anaconda: old encryption setting: False
>03:50:58,940 DEBUG anaconda: new encryption setting: False
>03:50:58,941 DEBUG anaconda: old label: 
>03:50:58,941 DEBUG anaconda: new_label: 
>03:50:58,941 DEBUG anaconda: old mountpoint: /boot
>03:50:58,942 DEBUG anaconda: new mountpoint: /boot
>03:50:58,943 DEBUG anaconda: old raid level: raid1
>03:50:58,943 DEBUG anaconda: new raid level: raid1
>03:50:58,944 DEBUG anaconda: old container: None
>03:50:58,945 DEBUG anaconda: new container: None
>03:50:58,945 DEBUG anaconda: old container encrypted: False
>03:50:58,945 DEBUG anaconda: new container encrypted: False
>03:50:58,946 DEBUG anaconda: old container raid level: None
>03:50:58,946 DEBUG anaconda: new container raid level: None
>03:50:58,946 DEBUG anaconda: old container size request: 0
>03:50:58,947 DEBUG anaconda: new container size request: 0
>03:50:58,947 DEBUG anaconda: old disks: ['sda', 'sdb', 'sdc', 'sdd']
>03:50:58,947 DEBUG anaconda: new disks: ['sda', 'sdb', 'sdc', 'sdd']
>03:50:58,948 DEBUG anaconda: nothing changed for new device
>03:50:58,950 DEBUG anaconda: new selector: non-existent 6000MB mdarray root (41) with non-existent ext4 filesystem mounted at /
>03:50:58,953 DEBUG anaconda: populate_right_side: non-existent 6000MB mdarray root (41) with non-existent ext4 filesystem mounted at /
>03:50:58,953 DEBUG anaconda: updated device_disks to ['sda', 'sdb', 'sdc', 'sdd']
>03:50:58,954 DEBUG anaconda: updated device_container_name to None
>03:50:58,954 DEBUG anaconda: updated device_container_raid_level to None
>03:50:58,954 DEBUG anaconda: updated device_container_encrypted to False
>03:50:58,955 DEBUG anaconda: updated device_container_size to 0
>03:50:58,963 INFO anaconda: getting device type for RAID
>03:50:58,964 DEBUG anaconda: populate_raid: 1, raid10
>03:50:58,965 INFO anaconda: getting device type for RAID
>03:51:00,875 INFO anaconda: ui: saving changes to device root
>03:51:00,878 DEBUG anaconda: old name: root
>03:51:00,879 DEBUG anaconda: new name: root
>03:51:00,884 DEBUG anaconda: old size: 6000.0
>03:51:00,885 DEBUG anaconda: new size: 6000
>03:51:00,885 INFO anaconda: getting device type for RAID
>03:51:00,886 DEBUG anaconda: old device type: 1
>03:51:00,886 DEBUG anaconda: new device type: 1
>03:51:00,887 DEBUG anaconda: reformat: True
>03:51:00,889 DEBUG anaconda: old fs type: ext4
>03:51:00,890 DEBUG anaconda: new fs type: ext4
>03:51:00,890 DEBUG anaconda: old encryption setting: False
>03:51:00,890 DEBUG anaconda: new encryption setting: False
>03:51:00,891 DEBUG anaconda: old label: 
>03:51:00,891 DEBUG anaconda: new_label: 
>03:51:00,891 DEBUG anaconda: old mountpoint: /
>03:51:00,892 DEBUG anaconda: new mountpoint: /
>03:51:00,893 DEBUG anaconda: old raid level: raid10
>03:51:00,893 DEBUG anaconda: new raid level: raid10
>03:51:00,894 DEBUG anaconda: old container: None
>03:51:00,894 DEBUG anaconda: new container: None
>03:51:00,895 DEBUG anaconda: old container encrypted: False
>03:51:00,895 DEBUG anaconda: new container encrypted: False
>03:51:00,895 DEBUG anaconda: old container raid level: None
>03:51:00,896 DEBUG anaconda: new container raid level: None
>03:51:00,896 DEBUG anaconda: old container size request: 0
>03:51:00,896 DEBUG anaconda: new container size request: 0
>03:51:00,897 DEBUG anaconda: old disks: ['sda', 'sdb', 'sdc', 'sdd']
>03:51:00,897 DEBUG anaconda: new disks: ['sda', 'sdb', 'sdc', 'sdd']
>03:51:00,897 DEBUG anaconda: nothing changed for new device
>03:51:02,487 DEBUG anaconda: stage1 device cannot be of type mdarray
>03:51:02,487 DEBUG anaconda: stage1 device cannot be of type mdarray
>03:51:02,488 DEBUG anaconda: _is_valid_disklabel(sda) returning True
>03:51:02,488 DEBUG anaconda: _is_valid_size(sda) returning True
>03:51:02,488 DEBUG anaconda: _is_valid_location(sda) returning True
>03:51:02,489 DEBUG anaconda: _is_valid_format(sda) returning True
>03:51:02,489 DEBUG anaconda: is_valid_stage1_device(sda) returning True
>03:51:02,496 INFO anaconda: Running Thread: AnaCheckStorageThread (140385544660736)
>03:51:02,520 DEBUG anaconda: _is_valid_disklabel(sda) returning True
>03:51:02,521 DEBUG anaconda: _is_valid_size(sda) returning True
>03:51:02,523 DEBUG anaconda: _is_valid_location(sda) returning True
>03:51:02,523 DEBUG anaconda: _is_valid_format(sda) returning True
>03:51:02,523 DEBUG anaconda: is_valid_stage1_device(sda) returning True
>03:51:02,524 DEBUG anaconda: _is_valid_disklabel(boot) returning True
>03:51:02,529 DEBUG anaconda: _is_valid_size(boot) returning True
>03:51:02,529 DEBUG anaconda: _is_valid_location(boot) returning True
>03:51:02,530 DEBUG anaconda: _is_valid_partition(boot) returning True
>03:51:02,530 DEBUG anaconda: _is_valid_md(boot) returning True
>03:51:02,531 DEBUG anaconda: _is_valid_format(boot) returning True
>03:51:02,532 DEBUG anaconda: is_valid_stage2_device(boot) returning True
>03:51:02,537 INFO anaconda: Thread Done: AnaCheckStorageThread (140385544660736)
>03:51:02,588 INFO anaconda: fs space: 6 GB needed: 622.82 MB
>03:51:03,016 INFO anaconda: fs space: 6 GB needed: 622.82 MB
>03:51:03,017 INFO anaconda: spoke is not ready: <pyanaconda.ui.gui.spokes.storage.StorageSpoke object at 0x7fae087a0110>
>03:51:03,024 INFO anaconda: fs space: 6 GB needed: 622.82 MB
>03:51:03,025 INFO anaconda: spoke is ready: <pyanaconda.ui.gui.spokes.storage.StorageSpoke object at 0x7fae087a0110>
>03:51:03,026 INFO anaconda: setting <pyanaconda.ui.gui.spokes.storage.StorageSpoke object at 0x7fae087a0110> status to: Checking storage configuration...
>03:51:03,033 INFO anaconda: fs space: 6 GB needed: 622.82 MB
>03:51:03,034 INFO anaconda: spoke is ready: <pyanaconda.ui.gui.spokes.storage.StorageSpoke object at 0x7fae087a0110>
>03:51:04,199 INFO anaconda: Running Thread: AnaInstallThread (140385544660736)
>03:51:04,503 INFO anaconda: Setting up the installation environment
>03:51:26,462 INFO anaconda: Creating disklabel on /dev/sda
>03:51:27,325 INFO anaconda: Creating mdmember on /dev/sda3
>03:51:27,698 INFO anaconda: Creating mdmember on /dev/sda2
>03:51:27,832 INFO anaconda: Creating mdmember on /dev/sda1
>03:51:27,979 INFO anaconda: Creating disklabel on /dev/sdd
>03:51:28,962 INFO anaconda: Creating mdmember on /dev/sdd3
>03:51:29,130 INFO anaconda: Creating mdmember on /dev/sdd2
>03:51:29,259 INFO anaconda: Creating mdmember on /dev/sdd1
>03:51:29,385 INFO anaconda: Creating disklabel on /dev/sdc
>03:51:30,325 INFO anaconda: Creating mdmember on /dev/sdc3
>03:51:30,492 INFO anaconda: Creating mdmember on /dev/sdc2
>03:51:30,616 INFO anaconda: Creating mdmember on /dev/sdc1
>03:51:30,828 INFO anaconda: Creating disklabel on /dev/sdb
>03:51:31,752 INFO anaconda: Creating mdmember on /dev/sdb3
>03:51:32,639 INFO anaconda: Creating swap on /dev/md/swap
>03:51:32,947 INFO anaconda: Creating mdmember on /dev/sdb2
>03:51:33,774 INFO anaconda: Creating ext4 on /dev/md/boot
>03:51:37,413 INFO anaconda: Creating mdmember on /dev/sdb1
>03:51:38,604 INFO anaconda: Creating ext4 on /dev/md/root
>03:51:59,697 DEBUG anaconda: running handleException
>03:51:59,698 DEBUG anaconda: Gtk running, queuing exception handler to the main loop
>03:51:59,705 INFO anaconda: Thread Done: AnaInstallThread (140385544660736)
>
>
>/tmp/packaging.log:
>03:47:12,391 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:185 (reset)
>03:47:12,392 INFO packaging: have _yum_lock for MainThread
>03:47:12,392 DEBUG packaging: getting release version from tree at None (19)
>03:47:12,393 DEBUG packaging: got a release version of 19
>03:47:12,393 INFO packaging: gave up _yum_lock for MainThread
>03:47:21,470 INFO packaging: updating base repo
>03:47:21,474 INFO packaging: about to acquire _yum_lock for AnaPayloadThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:185 (reset)
>03:47:21,479 INFO packaging: have _yum_lock for AnaPayloadThread
>03:47:21,574 INFO_2 yum.verbose.YumPlugins: Loaded plugins: blacklist, fastestmirror, langpacks, whiteout
>03:47:21,575 INFO_2 yum.verbose.YumPlugins: No plugin match for: fastestmirror
>03:47:21,575 INFO_2 yum.verbose.YumPlugins: No plugin match for: langpacks
>03:47:21,576 DEBUG yum.verbose.plugin: Adding en_US to language list
>03:47:21,584 DEBUG yum.verbose.YumBase: Config time: 0.105
>03:47:21,616 DEBUG packaging: getting release version from tree at None (19)
>03:47:21,621 DEBUG packaging: got a release version of 19
>03:47:21,621 INFO packaging: gave up _yum_lock for AnaPayloadThread
>03:47:21,621 INFO packaging: configuring base repo
>03:47:21,636 INFO packaging: about to acquire _yum_lock for AnaPayloadThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:470 (updateBaseRepo)
>03:47:21,637 INFO packaging: have _yum_lock for AnaPayloadThread
>03:47:21,637 DEBUG packaging: getting release version from tree at file:///run/install/repo (19)
>03:47:21,637 DEBUG packaging: retrieving treeinfo from file:///run/install/repo (proxy: ; sslverify: True)
>03:47:21,639 DEBUG packaging: got a release version of 19
>03:47:21,642 INFO packaging: gave up _yum_lock for AnaPayloadThread
>03:47:21,643 INFO packaging: about to acquire _yum_lock for AnaPayloadThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:759 (_configureBaseRepo)
>03:47:21,644 INFO packaging: have _yum_lock for AnaPayloadThread
>03:47:21,659 INFO_2 yum.verbose.YumPlugins: Loaded plugins: blacklist, fastestmirror, langpacks, whiteout
>03:47:21,664 INFO_2 yum.verbose.YumPlugins: No plugin match for: fastestmirror
>03:47:21,664 INFO_2 yum.verbose.YumPlugins: No plugin match for: langpacks
>03:47:21,665 DEBUG yum.verbose.plugin: Adding en_US to language list
>03:47:21,667 DEBUG yum.verbose.YumBase: Config time: 0.024
>03:47:21,668 INFO packaging: gave up _yum_lock for AnaPayloadThread
>03:47:21,671 INFO packaging: about to acquire _yum_lock for AnaPayloadThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:762 (_configureBaseRepo)
>03:47:21,672 INFO packaging: have _yum_lock for AnaPayloadThread
>03:47:21,707 INFO packaging: gave up _yum_lock for AnaPayloadThread
>03:47:21,708 DEBUG packaging: adding yum repo anaconda with baseurl file:///run/install/repo and mirrorlist None
>03:47:21,709 INFO packaging: about to acquire _yum_lock for AnaPayloadThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:762 (_configureBaseRepo)
>03:47:21,710 INFO packaging: have _yum_lock for AnaPayloadThread
>03:47:21,742 INFO packaging: gave up _yum_lock for AnaPayloadThread
>03:47:21,744 INFO packaging: about to acquire _yum_lock for AnaPayloadThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:484 (updateBaseRepo)
>03:47:21,746 INFO packaging: have _yum_lock for AnaPayloadThread
>03:47:21,746 INFO packaging: gave up _yum_lock for AnaPayloadThread
>03:47:21,747 INFO packaging: about to acquire _yum_lock for AnaPayloadThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:193 (setup)
>03:47:21,748 INFO packaging: have _yum_lock for AnaPayloadThread
>03:47:21,748 INFO packaging: gave up _yum_lock for AnaPayloadThread
>03:47:21,749 INFO packaging: about to acquire _yum_lock for AnaPayloadThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:193 (setup)
>03:47:21,750 INFO packaging: have _yum_lock for AnaPayloadThread
>03:47:21,752 INFO packaging: about to acquire _yum_lock for AnaPayloadThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:943 (_removeYumRepo)
>03:47:21,758 INFO packaging: have _yum_lock for AnaPayloadThread
>03:47:21,758 INFO packaging: gave up _yum_lock for AnaPayloadThread
>03:47:21,759 INFO packaging: about to acquire _yum_lock for AnaPayloadThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:527 (updateBaseRepo)
>03:47:21,762 INFO packaging: have _yum_lock for AnaPayloadThread
>03:47:21,762 INFO packaging: gave up _yum_lock for AnaPayloadThread
>03:47:21,763 DEBUG packaging: disabling repo fedora
>03:47:21,764 INFO packaging: about to acquire _yum_lock for AnaPayloadThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:982 (disableRepo)
>03:47:21,765 INFO packaging: have _yum_lock for AnaPayloadThread
>03:47:21,769 INFO packaging: gave up _yum_lock for AnaPayloadThread
>03:47:21,770 INFO packaging: about to acquire _yum_lock for AnaPayloadThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:541 (updateBaseRepo)
>03:47:21,770 INFO packaging: have _yum_lock for AnaPayloadThread
>03:47:21,771 INFO packaging: gave up _yum_lock for AnaPayloadThread
>03:47:21,772 INFO packaging: about to acquire _yum_lock for AnaPayloadThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:943 (_removeYumRepo)
>03:47:21,772 INFO packaging: have _yum_lock for AnaPayloadThread
>03:47:21,773 INFO packaging: gave up _yum_lock for AnaPayloadThread
>03:47:21,774 INFO packaging: about to acquire _yum_lock for AnaPayloadThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:527 (updateBaseRepo)
>03:47:21,774 INFO packaging: have _yum_lock for AnaPayloadThread
>03:47:21,779 INFO packaging: gave up _yum_lock for AnaPayloadThread
>03:47:21,780 INFO packaging: about to acquire _yum_lock for AnaPayloadThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:943 (_removeYumRepo)
>03:47:21,780 INFO packaging: have _yum_lock for AnaPayloadThread
>03:47:21,781 INFO packaging: gave up _yum_lock for AnaPayloadThread
>03:47:21,782 INFO packaging: about to acquire _yum_lock for AnaPayloadThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:527 (updateBaseRepo)
>03:47:21,782 INFO packaging: have _yum_lock for AnaPayloadThread
>03:47:21,783 INFO packaging: gave up _yum_lock for AnaPayloadThread
>03:47:21,783 DEBUG packaging: disabling repo updates-testing
>03:47:21,784 INFO packaging: about to acquire _yum_lock for AnaPayloadThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:982 (disableRepo)
>03:47:21,789 INFO packaging: have _yum_lock for AnaPayloadThread
>03:47:21,789 INFO packaging: gave up _yum_lock for AnaPayloadThread
>03:47:21,790 INFO packaging: about to acquire _yum_lock for AnaPayloadThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:541 (updateBaseRepo)
>03:47:21,791 INFO packaging: have _yum_lock for AnaPayloadThread
>03:47:21,791 INFO packaging: gave up _yum_lock for AnaPayloadThread
>03:47:21,792 INFO packaging: about to acquire _yum_lock for AnaPayloadThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:943 (_removeYumRepo)
>03:47:21,793 INFO packaging: have _yum_lock for AnaPayloadThread
>03:47:21,793 INFO packaging: gave up _yum_lock for AnaPayloadThread
>03:47:21,794 INFO packaging: about to acquire _yum_lock for AnaPayloadThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:527 (updateBaseRepo)
>03:47:21,799 INFO packaging: have _yum_lock for AnaPayloadThread
>03:47:21,799 INFO packaging: gave up _yum_lock for AnaPayloadThread
>03:47:21,801 INFO packaging: about to acquire _yum_lock for AnaPayloadThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:943 (_removeYumRepo)
>03:47:21,801 INFO packaging: have _yum_lock for AnaPayloadThread
>03:47:21,802 INFO packaging: gave up _yum_lock for AnaPayloadThread
>03:47:21,803 INFO packaging: about to acquire _yum_lock for AnaPayloadThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:527 (updateBaseRepo)
>03:47:21,804 INFO packaging: have _yum_lock for AnaPayloadThread
>03:47:21,804 INFO packaging: gave up _yum_lock for AnaPayloadThread
>03:47:21,805 INFO packaging: about to acquire _yum_lock for AnaPayloadThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:943 (_removeYumRepo)
>03:47:21,808 INFO packaging: have _yum_lock for AnaPayloadThread
>03:47:21,808 INFO packaging: gave up _yum_lock for AnaPayloadThread
>03:47:21,809 INFO packaging: about to acquire _yum_lock for AnaPayloadThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:527 (updateBaseRepo)
>03:47:21,810 INFO packaging: have _yum_lock for AnaPayloadThread
>03:47:21,816 INFO packaging: gave up _yum_lock for AnaPayloadThread
>03:47:21,816 INFO packaging: gave up _yum_lock for AnaPayloadThread
>03:47:21,817 INFO packaging: gathering repo metadata
>03:47:21,818 INFO packaging: about to acquire _yum_lock for AnaPayloadThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:546 (gatherRepoMetadata)
>03:47:21,822 INFO packaging: have _yum_lock for AnaPayloadThread
>03:47:21,822 INFO packaging: gave up _yum_lock for AnaPayloadThread
>03:47:21,823 INFO packaging: about to acquire _yum_lock for AnaPayloadThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:198 (setup)
>03:47:21,824 INFO packaging: have _yum_lock for AnaPayloadThread
>03:47:21,824 INFO packaging: gave up _yum_lock for AnaPayloadThread
>03:47:21,825 INFO packaging: about to acquire _yum_lock for AnaPayloadThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:198 (setup)
>03:47:21,825 INFO packaging: have _yum_lock for AnaPayloadThread
>03:47:21,825 DEBUG packaging: getting repo metadata for anaconda
>03:47:21,826 INFO packaging: about to acquire _yum_lock for AnaPayloadThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:551 (gatherRepoMetadata)
>03:47:21,826 INFO packaging: have _yum_lock for AnaPayloadThread
>03:47:21,892 DEBUG packaging: getting group info for anaconda
>03:47:21,938 INFO packaging: gave up _yum_lock for AnaPayloadThread
>03:47:21,938 INFO packaging: gave up _yum_lock for AnaPayloadThread
>03:47:21,939 INFO packaging: about to acquire _yum_lock for AnaPayloadThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:198 (setup)
>03:47:21,939 INFO packaging: have _yum_lock for AnaPayloadThread
>03:47:21,939 INFO packaging: gave up _yum_lock for AnaPayloadThread
>03:47:21,940 INFO packaging: about to acquire _yum_lock for AnaPayloadThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:198 (setup)
>03:47:21,940 INFO packaging: have _yum_lock for AnaPayloadThread
>03:47:21,942 DEBUG packaging: getting repo metadata for updates
>03:47:21,943 INFO packaging: about to acquire _yum_lock for AnaPayloadThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:551 (gatherRepoMetadata)
>03:47:21,943 INFO packaging: have _yum_lock for AnaPayloadThread
>03:47:24,943 DEBUG packaging: getting group info for updates
>03:47:24,944 ERR packaging: failed to get groups for repo updates
>03:47:24,944 INFO packaging: gave up _yum_lock for AnaPayloadThread
>03:47:24,944 INFO packaging: gave up _yum_lock for AnaPayloadThread
>03:47:24,945 INFO packaging: metadata retrieval complete
>03:47:35,352 INFO packaging: about to acquire _yum_lock for AnaSourceWatcher at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/source.py:565 (_initialize)
>03:47:35,353 INFO packaging: have _yum_lock for AnaSourceWatcher
>03:47:35,354 INFO packaging: gave up _yum_lock for AnaSourceWatcher
>03:47:35,540 INFO packaging: about to acquire _yum_lock for AnaSoftwareWatcher at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1000 (environments)
>03:47:35,540 INFO packaging: have _yum_lock for AnaSoftwareWatcher
>03:47:35,541 DEBUG yum.verbose.YumBase: Setting up Package Sacks
>03:47:35,542 INFO_2 yum.verbose.plugin: Determining fastest mirrors
>03:47:36,012 INFO_2 yum.verbose.plugin: * updates: mirror.globo.com
>03:47:37,547 DEBUG yum.verbose.YumBase: rpmdb time: 0.000
>03:47:37,608 DEBUG yum.verbose.YumBase: pkgsack time: 2.067
>03:47:37,699 DEBUG yum.verbose.YumBase: group time: 2.158
>03:47:37,701 INFO packaging: gave up _yum_lock for AnaSoftwareWatcher
>03:47:37,702 INFO packaging: about to acquire _yum_lock for AnaSoftwareWatcher at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:204 (_initialize)
>03:47:37,702 INFO packaging: have _yum_lock for AnaSoftwareWatcher
>03:47:37,703 INFO packaging: gave up _yum_lock for AnaSoftwareWatcher
>03:47:37,704 INFO packaging: about to acquire _yum_lock for AnaSoftwareWatcher at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1114 (groups)
>03:47:37,704 INFO packaging: have _yum_lock for AnaSoftwareWatcher
>03:47:37,705 INFO packaging: gave up _yum_lock for AnaSoftwareWatcher
>03:47:37,706 INFO packaging: about to acquire _yum_lock for AnaSoftwareWatcher at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:205 (_initialize)
>03:47:37,706 INFO packaging: have _yum_lock for AnaSoftwareWatcher
>03:47:37,707 INFO packaging: gave up _yum_lock for AnaSoftwareWatcher
>03:47:37,709 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1000 (environments)
>03:47:37,709 INFO packaging: have _yum_lock for MainThread
>03:47:37,710 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,711 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:245 (refresh)
>03:47:37,711 INFO packaging: have _yum_lock for MainThread
>03:47:37,712 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,713 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1038 (environmentDescription)
>03:47:37,714 INFO packaging: have _yum_lock for MainThread
>03:47:37,714 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,715 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:246 (refresh)
>03:47:37,716 INFO packaging: have _yum_lock for MainThread
>03:47:37,717 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,719 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1038 (environmentDescription)
>03:47:37,720 INFO packaging: have _yum_lock for MainThread
>03:47:37,720 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,721 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:246 (refresh)
>03:47:37,722 INFO packaging: have _yum_lock for MainThread
>03:47:37,722 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,724 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1038 (environmentDescription)
>03:47:37,724 INFO packaging: have _yum_lock for MainThread
>03:47:37,725 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,726 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:246 (refresh)
>03:47:37,726 INFO packaging: have _yum_lock for MainThread
>03:47:37,727 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,728 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1038 (environmentDescription)
>03:47:37,729 INFO packaging: have _yum_lock for MainThread
>03:47:37,729 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,730 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:246 (refresh)
>03:47:37,731 INFO packaging: have _yum_lock for MainThread
>03:47:37,731 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,733 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1038 (environmentDescription)
>03:47:37,733 INFO packaging: have _yum_lock for MainThread
>03:47:37,734 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,735 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:246 (refresh)
>03:47:37,736 INFO packaging: have _yum_lock for MainThread
>03:47:37,736 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,738 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1038 (environmentDescription)
>03:47:37,738 INFO packaging: have _yum_lock for MainThread
>03:47:37,739 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,740 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:246 (refresh)
>03:47:37,741 INFO packaging: have _yum_lock for MainThread
>03:47:37,741 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,743 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1038 (environmentDescription)
>03:47:37,743 INFO packaging: have _yum_lock for MainThread
>03:47:37,744 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,745 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:246 (refresh)
>03:47:37,746 INFO packaging: have _yum_lock for MainThread
>03:47:37,746 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,748 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1038 (environmentDescription)
>03:47:37,748 INFO packaging: have _yum_lock for MainThread
>03:47:37,749 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,750 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:246 (refresh)
>03:47:37,750 INFO packaging: have _yum_lock for MainThread
>03:47:37,751 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,753 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1038 (environmentDescription)
>03:47:37,753 INFO packaging: have _yum_lock for MainThread
>03:47:37,754 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,755 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:246 (refresh)
>03:47:37,755 INFO packaging: have _yum_lock for MainThread
>03:47:37,756 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,757 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1038 (environmentDescription)
>03:47:37,758 INFO packaging: have _yum_lock for MainThread
>03:47:37,758 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,760 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:246 (refresh)
>03:47:37,760 INFO packaging: have _yum_lock for MainThread
>03:47:37,761 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,763 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1114 (groups)
>03:47:37,763 INFO packaging: have _yum_lock for MainThread
>03:47:37,764 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,765 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:286 (refreshAddons)
>03:47:37,765 INFO packaging: have _yum_lock for MainThread
>03:47:37,766 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,767 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:37,768 INFO packaging: have _yum_lock for MainThread
>03:47:37,768 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,769 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:37,770 INFO packaging: have _yum_lock for MainThread
>03:47:37,771 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,772 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:37,772 INFO packaging: have _yum_lock for MainThread
>03:47:37,773 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,774 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:37,775 INFO packaging: have _yum_lock for MainThread
>03:47:37,775 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,776 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:37,777 INFO packaging: have _yum_lock for MainThread
>03:47:37,777 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,779 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:37,779 INFO packaging: have _yum_lock for MainThread
>03:47:37,780 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,781 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:37,782 INFO packaging: have _yum_lock for MainThread
>03:47:37,782 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,783 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:37,784 INFO packaging: have _yum_lock for MainThread
>03:47:37,784 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,786 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:37,786 INFO packaging: have _yum_lock for MainThread
>03:47:37,786 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,788 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:37,788 INFO packaging: have _yum_lock for MainThread
>03:47:37,789 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,790 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:37,791 INFO packaging: have 
_yum_lock for MainThread >03:47:37,791 INFO packaging: gave up _yum_lock for MainThread >03:47:37,792 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:37,793 INFO packaging: have _yum_lock for MainThread >03:47:37,793 INFO packaging: gave up _yum_lock for MainThread >03:47:37,794 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:37,795 INFO packaging: have _yum_lock for MainThread >03:47:37,795 INFO packaging: gave up _yum_lock for MainThread >03:47:37,797 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:37,797 INFO packaging: have _yum_lock for MainThread >03:47:37,798 INFO packaging: gave up _yum_lock for MainThread >03:47:37,799 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:37,800 INFO packaging: have _yum_lock for MainThread >03:47:37,800 INFO packaging: gave up _yum_lock for MainThread >03:47:37,801 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:37,802 INFO packaging: have _yum_lock for MainThread >03:47:37,802 INFO packaging: gave up _yum_lock for MainThread >03:47:37,804 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:37,804 INFO packaging: have _yum_lock for MainThread >03:47:37,805 INFO packaging: gave up _yum_lock for MainThread >03:47:37,806 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 
(refreshAddons) >03:47:37,807 INFO packaging: have _yum_lock for MainThread >03:47:37,807 INFO packaging: gave up _yum_lock for MainThread >03:47:37,808 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:37,809 INFO packaging: have _yum_lock for MainThread >03:47:37,809 INFO packaging: gave up _yum_lock for MainThread >03:47:37,811 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:37,811 INFO packaging: have _yum_lock for MainThread >03:47:37,812 INFO packaging: gave up _yum_lock for MainThread >03:47:37,813 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:37,814 INFO packaging: have _yum_lock for MainThread >03:47:37,814 INFO packaging: gave up _yum_lock for MainThread >03:47:37,815 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:37,816 INFO packaging: have _yum_lock for MainThread >03:47:37,816 INFO packaging: gave up _yum_lock for MainThread >03:47:37,818 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:37,818 INFO packaging: have _yum_lock for MainThread >03:47:37,819 INFO packaging: gave up _yum_lock for MainThread >03:47:37,820 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:37,820 INFO packaging: have _yum_lock for MainThread >03:47:37,821 INFO packaging: gave up _yum_lock for MainThread >03:47:37,822 INFO packaging: about to acquire _yum_lock for MainThread at 
/usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:37,823 INFO packaging: have _yum_lock for MainThread >03:47:37,823 INFO packaging: gave up _yum_lock for MainThread >03:47:37,824 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:37,825 INFO packaging: have _yum_lock for MainThread >03:47:37,825 INFO packaging: gave up _yum_lock for MainThread >03:47:37,827 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:37,827 INFO packaging: have _yum_lock for MainThread >03:47:37,828 INFO packaging: gave up _yum_lock for MainThread >03:47:37,829 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:37,829 INFO packaging: have _yum_lock for MainThread >03:47:37,830 INFO packaging: gave up _yum_lock for MainThread >03:47:37,831 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:37,832 INFO packaging: have _yum_lock for MainThread >03:47:37,832 INFO packaging: gave up _yum_lock for MainThread >03:47:37,834 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:37,834 INFO packaging: have _yum_lock for MainThread >03:47:37,835 INFO packaging: gave up _yum_lock for MainThread >03:47:37,836 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:37,836 INFO packaging: have _yum_lock for MainThread >03:47:37,837 INFO packaging: gave up _yum_lock for MainThread >03:47:37,838 INFO packaging: 
about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:37,839 INFO packaging: have _yum_lock for MainThread >03:47:37,839 INFO packaging: gave up _yum_lock for MainThread >03:47:37,841 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:37,842 INFO packaging: have _yum_lock for MainThread >03:47:37,842 INFO packaging: gave up _yum_lock for MainThread >03:47:37,843 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:37,844 INFO packaging: have _yum_lock for MainThread >03:47:37,844 INFO packaging: gave up _yum_lock for MainThread >03:47:37,845 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:37,846 INFO packaging: have _yum_lock for MainThread >03:47:37,846 INFO packaging: gave up _yum_lock for MainThread >03:47:37,848 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:37,848 INFO packaging: have _yum_lock for MainThread >03:47:37,849 INFO packaging: gave up _yum_lock for MainThread >03:47:37,850 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:37,851 INFO packaging: have _yum_lock for MainThread >03:47:37,851 INFO packaging: gave up _yum_lock for MainThread >03:47:37,852 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:37,853 INFO packaging: have _yum_lock for MainThread >03:47:37,853 INFO packaging: gave up _yum_lock 
for MainThread >03:47:37,854 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:37,855 INFO packaging: have _yum_lock for MainThread >03:47:37,855 INFO packaging: gave up _yum_lock for MainThread >03:47:37,857 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:37,857 INFO packaging: have _yum_lock for MainThread >03:47:37,858 INFO packaging: gave up _yum_lock for MainThread >03:47:37,859 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:37,859 INFO packaging: have _yum_lock for MainThread >03:47:37,860 INFO packaging: gave up _yum_lock for MainThread >03:47:37,861 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:37,862 INFO packaging: have _yum_lock for MainThread >03:47:37,862 INFO packaging: gave up _yum_lock for MainThread >03:47:37,863 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:37,864 INFO packaging: have _yum_lock for MainThread >03:47:37,864 INFO packaging: gave up _yum_lock for MainThread >03:47:37,866 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:37,866 INFO packaging: have _yum_lock for MainThread >03:47:37,867 INFO packaging: gave up _yum_lock for MainThread >03:47:37,868 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1166 (_groupHasInstallableMembers) >03:47:37,868 INFO packaging: have _yum_lock for 
MainThread >03:47:37,869 INFO packaging: gave up _yum_lock for MainThread >03:47:37,870 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:37,870 INFO packaging: have _yum_lock for MainThread >03:47:37,871 INFO packaging: gave up _yum_lock for MainThread >03:47:37,872 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:37,873 INFO packaging: have _yum_lock for MainThread >03:47:37,873 INFO packaging: gave up _yum_lock for MainThread >03:47:37,874 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:37,875 INFO packaging: have _yum_lock for MainThread >03:47:37,875 INFO packaging: gave up _yum_lock for MainThread >03:47:37,877 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:37,877 INFO packaging: have _yum_lock for MainThread >03:47:37,878 INFO packaging: gave up _yum_lock for MainThread >03:47:37,879 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:37,879 INFO packaging: have _yum_lock for MainThread >03:47:37,880 INFO packaging: gave up _yum_lock for MainThread >03:47:37,881 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:37,882 INFO packaging: have _yum_lock for MainThread >03:47:37,882 INFO packaging: gave up _yum_lock for MainThread >03:47:37,883 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) 
>03:47:37,884 INFO packaging: have _yum_lock for MainThread >03:47:37,884 INFO packaging: gave up _yum_lock for MainThread >03:47:37,886 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:37,886 INFO packaging: have _yum_lock for MainThread >03:47:37,886 INFO packaging: gave up _yum_lock for MainThread >03:47:37,888 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:37,888 INFO packaging: have _yum_lock for MainThread >03:47:37,889 INFO packaging: gave up _yum_lock for MainThread >03:47:37,890 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:37,890 INFO packaging: have _yum_lock for MainThread >03:47:37,891 INFO packaging: gave up _yum_lock for MainThread >03:47:37,892 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:37,893 INFO packaging: have _yum_lock for MainThread >03:47:37,893 INFO packaging: gave up _yum_lock for MainThread >03:47:37,894 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:37,895 INFO packaging: have _yum_lock for MainThread >03:47:37,895 INFO packaging: gave up _yum_lock for MainThread >03:47:37,896 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:37,897 INFO packaging: have _yum_lock for MainThread >03:47:37,897 INFO packaging: gave up _yum_lock for MainThread >03:47:37,899 INFO packaging: about to acquire _yum_lock for MainThread at 
/usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:37,899 INFO packaging: have _yum_lock for MainThread >03:47:37,900 INFO packaging: gave up _yum_lock for MainThread >03:47:37,901 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:37,901 INFO packaging: have _yum_lock for MainThread >03:47:37,902 INFO packaging: gave up _yum_lock for MainThread >03:47:37,904 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:37,904 INFO packaging: have _yum_lock for MainThread >03:47:37,905 INFO packaging: gave up _yum_lock for MainThread >03:47:37,906 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:37,906 INFO packaging: have _yum_lock for MainThread >03:47:37,907 INFO packaging: gave up _yum_lock for MainThread >03:47:37,908 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1166 (_groupHasInstallableMembers) >03:47:37,908 INFO packaging: have _yum_lock for MainThread >03:47:37,908 INFO packaging: gave up _yum_lock for MainThread >03:47:37,909 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:37,910 INFO packaging: have _yum_lock for MainThread >03:47:37,910 INFO packaging: gave up _yum_lock for MainThread >03:47:37,911 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:37,911 INFO packaging: have _yum_lock for MainThread >03:47:37,911 INFO packaging: gave up _yum_lock for MainThread >03:47:37,912 INFO 
packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:37,913 INFO packaging: have _yum_lock for MainThread >03:47:37,913 INFO packaging: gave up _yum_lock for MainThread >03:47:37,914 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:37,914 INFO packaging: have _yum_lock for MainThread >03:47:37,914 INFO packaging: gave up _yum_lock for MainThread >03:47:37,915 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:37,916 INFO packaging: have _yum_lock for MainThread >03:47:37,916 INFO packaging: gave up _yum_lock for MainThread >03:47:37,917 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:37,917 INFO packaging: have _yum_lock for MainThread >03:47:37,918 INFO packaging: gave up _yum_lock for MainThread >03:47:37,919 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:37,919 INFO packaging: have _yum_lock for MainThread >03:47:37,919 INFO packaging: gave up _yum_lock for MainThread >03:47:37,920 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:37,920 INFO packaging: have _yum_lock for MainThread >03:47:37,921 INFO packaging: gave up _yum_lock for MainThread >03:47:37,922 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:37,922 INFO packaging: have _yum_lock for MainThread >03:47:37,922 INFO packaging: gave up 
_yum_lock for MainThread >03:47:37,923 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1166 (_groupHasInstallableMembers) >03:47:37,924 INFO packaging: have _yum_lock for MainThread >03:47:37,924 INFO packaging: gave up _yum_lock for MainThread >03:47:37,925 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:37,925 INFO packaging: have _yum_lock for MainThread >03:47:37,926 INFO packaging: gave up _yum_lock for MainThread >03:47:37,927 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:37,927 INFO packaging: have _yum_lock for MainThread >03:47:37,927 INFO packaging: gave up _yum_lock for MainThread >03:47:37,929 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:37,929 INFO packaging: have _yum_lock for MainThread >03:47:37,929 INFO packaging: gave up _yum_lock for MainThread >03:47:37,930 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:37,931 INFO packaging: have _yum_lock for MainThread >03:47:37,931 INFO packaging: gave up _yum_lock for MainThread >03:47:37,932 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:37,932 INFO packaging: have _yum_lock for MainThread >03:47:37,933 INFO packaging: gave up _yum_lock for MainThread >03:47:37,934 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:37,934 INFO packaging: have 
_yum_lock for MainThread >03:47:37,935 INFO packaging: gave up _yum_lock for MainThread >03:47:37,936 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:37,936 INFO packaging: have _yum_lock for MainThread >03:47:37,936 INFO packaging: gave up _yum_lock for MainThread >03:47:37,937 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:37,938 INFO packaging: have _yum_lock for MainThread >03:47:37,938 INFO packaging: gave up _yum_lock for MainThread >03:47:37,939 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:37,939 INFO packaging: have _yum_lock for MainThread >03:47:37,940 INFO packaging: gave up _yum_lock for MainThread >03:47:37,941 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:37,941 INFO packaging: have _yum_lock for MainThread >03:47:37,941 INFO packaging: gave up _yum_lock for MainThread >03:47:37,943 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:37,943 INFO packaging: have _yum_lock for MainThread >03:47:37,943 INFO packaging: gave up _yum_lock for MainThread >03:47:37,944 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:37,944 INFO packaging: have _yum_lock for MainThread >03:47:37,945 INFO packaging: gave up _yum_lock for MainThread >03:47:37,946 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 
(refreshAddons) >03:47:37,946 INFO packaging: have _yum_lock for MainThread >03:47:37,946 INFO packaging: gave up _yum_lock for MainThread >03:47:37,948 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:37,948 INFO packaging: have _yum_lock for MainThread >03:47:37,948 INFO packaging: gave up _yum_lock for MainThread >03:47:37,949 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:37,949 INFO packaging: have _yum_lock for MainThread >03:47:37,950 INFO packaging: gave up _yum_lock for MainThread >03:47:37,951 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:37,951 INFO packaging: have _yum_lock for MainThread >03:47:37,951 INFO packaging: gave up _yum_lock for MainThread >03:47:37,952 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:37,953 INFO packaging: have _yum_lock for MainThread >03:47:37,953 INFO packaging: gave up _yum_lock for MainThread >03:47:37,954 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:37,954 INFO packaging: have _yum_lock for MainThread >03:47:37,955 INFO packaging: gave up _yum_lock for MainThread >03:47:37,956 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:37,956 INFO packaging: have _yum_lock for MainThread >03:47:37,956 INFO packaging: gave up _yum_lock for MainThread >03:47:37,957 INFO packaging: about to acquire _yum_lock for MainThread at 
/usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:37,957 INFO packaging: have _yum_lock for MainThread >03:47:37,958 INFO packaging: gave up _yum_lock for MainThread >03:47:37,959 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:37,959 INFO packaging: have _yum_lock for MainThread >03:47:37,959 INFO packaging: gave up _yum_lock for MainThread >03:47:37,960 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:37,961 INFO packaging: have _yum_lock for MainThread >03:47:37,961 INFO packaging: gave up _yum_lock for MainThread >03:47:37,962 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:37,962 INFO packaging: have _yum_lock for MainThread >03:47:37,963 INFO packaging: gave up _yum_lock for MainThread >03:47:37,964 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:37,964 INFO packaging: have _yum_lock for MainThread >03:47:37,964 INFO packaging: gave up _yum_lock for MainThread >03:47:37,965 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:37,966 INFO packaging: have _yum_lock for MainThread >03:47:37,966 INFO packaging: gave up _yum_lock for MainThread >03:47:37,967 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:37,967 INFO packaging: have _yum_lock for MainThread >03:47:37,968 INFO packaging: gave up _yum_lock for MainThread >03:47:37,969 INFO packaging: 
about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:37,969 INFO packaging: have _yum_lock for MainThread >03:47:37,969 INFO packaging: gave up _yum_lock for MainThread >03:47:37,970 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:37,971 INFO packaging: have _yum_lock for MainThread >03:47:37,971 INFO packaging: gave up _yum_lock for MainThread >03:47:37,972 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:37,972 INFO packaging: have _yum_lock for MainThread >03:47:37,972 INFO packaging: gave up _yum_lock for MainThread >03:47:37,974 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1166 (_groupHasInstallableMembers) >03:47:37,974 INFO packaging: have _yum_lock for MainThread >03:47:37,974 INFO packaging: gave up _yum_lock for MainThread >03:47:37,975 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:37,975 INFO packaging: have _yum_lock for MainThread >03:47:37,976 INFO packaging: gave up _yum_lock for MainThread >03:47:37,977 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:37,977 INFO packaging: have _yum_lock for MainThread >03:47:37,977 INFO packaging: gave up _yum_lock for MainThread >03:47:37,978 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:37,979 INFO packaging: have _yum_lock for MainThread >03:47:37,979 INFO packaging: gave up 
_yum_lock for MainThread
>03:47:37,980 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:37,980 INFO packaging: have _yum_lock for MainThread
>03:47:37,981 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,982 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:37,982 INFO packaging: have _yum_lock for MainThread
>03:47:37,982 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,983 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:37,984 INFO packaging: have _yum_lock for MainThread
>03:47:37,984 INFO packaging: gave up _yum_lock for MainThread
>03:47:37,985 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:37,985 INFO packaging: have _yum_lock for MainThread
>03:47:37,985 INFO packaging: gave up _yum_lock for MainThread
>[the same acquire/have/gave-up cycle over _yum_lock for MainThread — alternating yumpayload.py:1154 (_isGroupVisible), software.py:289 (refreshAddons), yumpayload.py:1023 (environmentHasOption), and software.py:287 (refreshAddons) — repeats continuously with timestamps 03:47:37,987 through 03:47:38,201]
>03:47:38,201 INFO packaging: have _yum_lock for
MainThread >03:47:38,201 INFO packaging: gave up _yum_lock for MainThread >03:47:38,202 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:38,203 INFO packaging: have _yum_lock for MainThread >03:47:38,203 INFO packaging: gave up _yum_lock for MainThread >03:47:38,204 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:38,204 INFO packaging: have _yum_lock for MainThread >03:47:38,204 INFO packaging: gave up _yum_lock for MainThread >03:47:38,205 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:38,206 INFO packaging: have _yum_lock for MainThread >03:47:38,206 INFO packaging: gave up _yum_lock for MainThread >03:47:38,207 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:38,207 INFO packaging: have _yum_lock for MainThread >03:47:38,208 INFO packaging: gave up _yum_lock for MainThread >03:47:38,209 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:38,209 INFO packaging: have _yum_lock for MainThread >03:47:38,209 INFO packaging: gave up _yum_lock for MainThread >03:47:38,210 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:38,210 INFO packaging: have _yum_lock for MainThread >03:47:38,211 INFO packaging: gave up _yum_lock for MainThread >03:47:38,212 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) 
>03:47:38,212 INFO packaging: have _yum_lock for MainThread >03:47:38,212 INFO packaging: gave up _yum_lock for MainThread >03:47:38,213 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:38,213 INFO packaging: have _yum_lock for MainThread >03:47:38,214 INFO packaging: gave up _yum_lock for MainThread >03:47:38,215 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:38,215 INFO packaging: have _yum_lock for MainThread >03:47:38,215 INFO packaging: gave up _yum_lock for MainThread >03:47:38,216 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:38,217 INFO packaging: have _yum_lock for MainThread >03:47:38,217 INFO packaging: gave up _yum_lock for MainThread >03:47:38,218 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:38,218 INFO packaging: have _yum_lock for MainThread >03:47:38,218 INFO packaging: gave up _yum_lock for MainThread >03:47:38,219 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:38,220 INFO packaging: have _yum_lock for MainThread >03:47:38,220 INFO packaging: gave up _yum_lock for MainThread >03:47:38,221 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:38,221 INFO packaging: have _yum_lock for MainThread >03:47:38,221 INFO packaging: gave up _yum_lock for MainThread >03:47:38,223 INFO packaging: about to acquire _yum_lock for MainThread at 
/usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:38,223 INFO packaging: have _yum_lock for MainThread >03:47:38,223 INFO packaging: gave up _yum_lock for MainThread >03:47:38,224 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:38,224 INFO packaging: have _yum_lock for MainThread >03:47:38,225 INFO packaging: gave up _yum_lock for MainThread >03:47:38,226 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:38,226 INFO packaging: have _yum_lock for MainThread >03:47:38,226 INFO packaging: gave up _yum_lock for MainThread >03:47:38,227 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:38,227 INFO packaging: have _yum_lock for MainThread >03:47:38,228 INFO packaging: gave up _yum_lock for MainThread >03:47:38,229 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:38,229 INFO packaging: have _yum_lock for MainThread >03:47:38,229 INFO packaging: gave up _yum_lock for MainThread >03:47:38,230 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:38,231 INFO packaging: have _yum_lock for MainThread >03:47:38,231 INFO packaging: gave up _yum_lock for MainThread >03:47:38,232 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:38,232 INFO packaging: have _yum_lock for MainThread >03:47:38,232 INFO packaging: gave up _yum_lock for MainThread >03:47:38,233 INFO packaging: 
about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:38,233 INFO packaging: have _yum_lock for MainThread >03:47:38,234 INFO packaging: gave up _yum_lock for MainThread >03:47:38,235 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:38,235 INFO packaging: have _yum_lock for MainThread >03:47:38,235 INFO packaging: gave up _yum_lock for MainThread >03:47:38,236 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:38,236 INFO packaging: have _yum_lock for MainThread >03:47:38,237 INFO packaging: gave up _yum_lock for MainThread >03:47:38,238 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:38,238 INFO packaging: have _yum_lock for MainThread >03:47:38,238 INFO packaging: gave up _yum_lock for MainThread >03:47:38,239 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:38,240 INFO packaging: have _yum_lock for MainThread >03:47:38,240 INFO packaging: gave up _yum_lock for MainThread >03:47:38,241 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:38,241 INFO packaging: have _yum_lock for MainThread >03:47:38,241 INFO packaging: gave up _yum_lock for MainThread >03:47:38,242 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:38,243 INFO packaging: have _yum_lock for MainThread >03:47:38,243 INFO packaging: gave up _yum_lock 
for MainThread >03:47:38,244 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:38,244 INFO packaging: have _yum_lock for MainThread >03:47:38,244 INFO packaging: gave up _yum_lock for MainThread >03:47:38,245 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:38,246 INFO packaging: have _yum_lock for MainThread >03:47:38,246 INFO packaging: gave up _yum_lock for MainThread >03:47:38,247 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:38,247 INFO packaging: have _yum_lock for MainThread >03:47:38,248 INFO packaging: gave up _yum_lock for MainThread >03:47:38,249 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:38,249 INFO packaging: have _yum_lock for MainThread >03:47:38,249 INFO packaging: gave up _yum_lock for MainThread >03:47:38,250 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:38,250 INFO packaging: have _yum_lock for MainThread >03:47:38,251 INFO packaging: gave up _yum_lock for MainThread >03:47:38,252 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:38,252 INFO packaging: have _yum_lock for MainThread >03:47:38,252 INFO packaging: gave up _yum_lock for MainThread >03:47:38,253 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:38,253 INFO packaging: have _yum_lock for MainThread 
>03:47:38,254 INFO packaging: gave up _yum_lock for MainThread >03:47:38,255 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:38,255 INFO packaging: have _yum_lock for MainThread >03:47:38,255 INFO packaging: gave up _yum_lock for MainThread >03:47:38,256 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:38,257 INFO packaging: have _yum_lock for MainThread >03:47:38,257 INFO packaging: gave up _yum_lock for MainThread >03:47:38,258 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:38,258 INFO packaging: have _yum_lock for MainThread >03:47:38,258 INFO packaging: gave up _yum_lock for MainThread >03:47:38,259 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1166 (_groupHasInstallableMembers) >03:47:38,260 INFO packaging: have _yum_lock for MainThread >03:47:38,260 INFO packaging: gave up _yum_lock for MainThread >03:47:38,261 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:38,261 INFO packaging: have _yum_lock for MainThread >03:47:38,261 INFO packaging: gave up _yum_lock for MainThread >03:47:38,263 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:38,263 INFO packaging: have _yum_lock for MainThread >03:47:38,263 INFO packaging: gave up _yum_lock for MainThread >03:47:38,264 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) 
>03:47:38,264 INFO packaging: have _yum_lock for MainThread >03:47:38,265 INFO packaging: gave up _yum_lock for MainThread >03:47:38,266 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:38,266 INFO packaging: have _yum_lock for MainThread >03:47:38,266 INFO packaging: gave up _yum_lock for MainThread >03:47:38,267 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:38,267 INFO packaging: have _yum_lock for MainThread >03:47:38,268 INFO packaging: gave up _yum_lock for MainThread >03:47:38,269 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:38,269 INFO packaging: have _yum_lock for MainThread >03:47:38,269 INFO packaging: gave up _yum_lock for MainThread >03:47:38,270 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:38,271 INFO packaging: have _yum_lock for MainThread >03:47:38,271 INFO packaging: gave up _yum_lock for MainThread >03:47:38,272 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:38,272 INFO packaging: have _yum_lock for MainThread >03:47:38,272 INFO packaging: gave up _yum_lock for MainThread >03:47:38,273 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:38,274 INFO packaging: have _yum_lock for MainThread >03:47:38,274 INFO packaging: gave up _yum_lock for MainThread >03:47:38,275 INFO packaging: about to acquire _yum_lock for MainThread at 
/usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:38,275 INFO packaging: have _yum_lock for MainThread >03:47:38,275 INFO packaging: gave up _yum_lock for MainThread >03:47:38,276 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:38,277 INFO packaging: have _yum_lock for MainThread >03:47:38,277 INFO packaging: gave up _yum_lock for MainThread >03:47:38,278 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:38,278 INFO packaging: have _yum_lock for MainThread >03:47:38,279 INFO packaging: gave up _yum_lock for MainThread >03:47:38,280 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:38,280 INFO packaging: have _yum_lock for MainThread >03:47:38,280 INFO packaging: gave up _yum_lock for MainThread >03:47:38,281 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:38,281 INFO packaging: have _yum_lock for MainThread >03:47:38,282 INFO packaging: gave up _yum_lock for MainThread >03:47:38,283 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:38,283 INFO packaging: have _yum_lock for MainThread >03:47:38,283 INFO packaging: gave up _yum_lock for MainThread >03:47:38,284 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:38,285 INFO packaging: have _yum_lock for MainThread >03:47:38,285 INFO packaging: gave up _yum_lock for MainThread >03:47:38,286 INFO packaging: 
about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:38,286 INFO packaging: have _yum_lock for MainThread >03:47:38,286 INFO packaging: gave up _yum_lock for MainThread >03:47:38,287 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:38,288 INFO packaging: have _yum_lock for MainThread >03:47:38,288 INFO packaging: gave up _yum_lock for MainThread >03:47:38,289 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:38,289 INFO packaging: have _yum_lock for MainThread >03:47:38,289 INFO packaging: gave up _yum_lock for MainThread >03:47:38,290 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:38,291 INFO packaging: have _yum_lock for MainThread >03:47:38,291 INFO packaging: gave up _yum_lock for MainThread >03:47:38,292 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:38,292 INFO packaging: have _yum_lock for MainThread >03:47:38,292 INFO packaging: gave up _yum_lock for MainThread >03:47:38,294 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:38,294 INFO packaging: have _yum_lock for MainThread >03:47:38,294 INFO packaging: gave up _yum_lock for MainThread >03:47:38,295 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:38,295 INFO packaging: have _yum_lock for MainThread >03:47:38,295 INFO packaging: gave up _yum_lock 
for MainThread >03:47:38,296 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:38,297 INFO packaging: have _yum_lock for MainThread >03:47:38,297 INFO packaging: gave up _yum_lock for MainThread >03:47:38,298 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:38,298 INFO packaging: have _yum_lock for MainThread >03:47:38,299 INFO packaging: gave up _yum_lock for MainThread >03:47:38,300 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:38,300 INFO packaging: have _yum_lock for MainThread >03:47:38,300 INFO packaging: gave up _yum_lock for MainThread >03:47:38,301 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:38,302 INFO packaging: have _yum_lock for MainThread >03:47:38,302 INFO packaging: gave up _yum_lock for MainThread >03:47:38,303 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:38,303 INFO packaging: have _yum_lock for MainThread >03:47:38,303 INFO packaging: gave up _yum_lock for MainThread >03:47:38,304 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:38,305 INFO packaging: have _yum_lock for MainThread >03:47:38,305 INFO packaging: gave up _yum_lock for MainThread >03:47:38,306 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:38,306 INFO packaging: have _yum_lock for MainThread 
>03:47:38,306 INFO packaging: gave up _yum_lock for MainThread >03:47:38,307 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:38,308 INFO packaging: have _yum_lock for MainThread >03:47:38,308 INFO packaging: gave up _yum_lock for MainThread >03:47:38,309 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:38,309 INFO packaging: have _yum_lock for MainThread >03:47:38,310 INFO packaging: gave up _yum_lock for MainThread >03:47:38,311 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:38,311 INFO packaging: have _yum_lock for MainThread >03:47:38,311 INFO packaging: gave up _yum_lock for MainThread >03:47:38,312 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:38,313 INFO packaging: have _yum_lock for MainThread >03:47:38,313 INFO packaging: gave up _yum_lock for MainThread >03:47:38,314 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:38,314 INFO packaging: have _yum_lock for MainThread >03:47:38,314 INFO packaging: gave up _yum_lock for MainThread >03:47:38,315 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:38,315 INFO packaging: have _yum_lock for MainThread >03:47:38,316 INFO packaging: gave up _yum_lock for MainThread >03:47:38,317 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:38,317 INFO 
packaging: have _yum_lock for MainThread >03:47:38,317 INFO packaging: gave up _yum_lock for MainThread >03:47:38,318 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1166 (_groupHasInstallableMembers) >03:47:38,319 INFO packaging: have _yum_lock for MainThread >03:47:38,319 INFO packaging: gave up _yum_lock for MainThread >03:47:38,320 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:38,320 INFO packaging: have _yum_lock for MainThread >03:47:38,320 INFO packaging: gave up _yum_lock for MainThread >03:47:38,321 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:38,322 INFO packaging: have _yum_lock for MainThread >03:47:38,322 INFO packaging: gave up _yum_lock for MainThread >03:47:38,323 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:38,323 INFO packaging: have _yum_lock for MainThread >03:47:38,323 INFO packaging: gave up _yum_lock for MainThread >03:47:38,325 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:38,325 INFO packaging: have _yum_lock for MainThread >03:47:38,325 INFO packaging: gave up _yum_lock for MainThread >03:47:38,326 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:38,326 INFO packaging: have _yum_lock for MainThread >03:47:38,327 INFO packaging: gave up _yum_lock for MainThread >03:47:38,328 INFO packaging: about to acquire _yum_lock for MainThread at 
/usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:38,328 INFO packaging: have _yum_lock for MainThread >03:47:38,328 INFO packaging: gave up _yum_lock for MainThread >03:47:38,329 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:38,329 INFO packaging: have _yum_lock for MainThread >03:47:38,330 INFO packaging: gave up _yum_lock for MainThread >03:47:38,331 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:38,331 INFO packaging: have _yum_lock for MainThread >03:47:38,331 INFO packaging: gave up _yum_lock for MainThread >03:47:38,332 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:38,332 INFO packaging: have _yum_lock for MainThread >03:47:38,333 INFO packaging: gave up _yum_lock for MainThread >03:47:38,334 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:38,334 INFO packaging: have _yum_lock for MainThread >03:47:38,334 INFO packaging: gave up _yum_lock for MainThread >03:47:38,335 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:38,336 INFO packaging: have _yum_lock for MainThread >03:47:38,336 INFO packaging: gave up _yum_lock for MainThread >03:47:38,337 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:38,337 INFO packaging: have _yum_lock for MainThread >03:47:38,337 INFO packaging: gave up _yum_lock for MainThread >03:47:38,338 INFO packaging: 
about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:38,339 INFO packaging: have _yum_lock for MainThread
>03:47:38,339 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,340 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:38,340 INFO packaging: have _yum_lock for MainThread
>03:47:38,341 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,342 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:38,342 INFO packaging: have _yum_lock for MainThread
>03:47:38,342 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,343 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:38,343 INFO packaging: have _yum_lock for MainThread
>03:47:38,344 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,345 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:38,345 INFO packaging: have _yum_lock for MainThread
>03:47:38,345 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,346 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:38,347 INFO packaging: have _yum_lock for MainThread
>03:47:38,347 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,348 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:38,348 INFO packaging: have _yum_lock for MainThread
>03:47:38,348 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,349 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:38,350 INFO packaging: have _yum_lock for MainThread
>03:47:38,350 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,351 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:38,351 INFO packaging: have _yum_lock for MainThread
>03:47:38,351 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,352 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:38,353 INFO packaging: have _yum_lock for MainThread
>03:47:38,353 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,354 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:38,354 INFO packaging: have _yum_lock for MainThread
>03:47:38,354 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,355 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:38,356 INFO packaging: have _yum_lock for MainThread
>03:47:38,356 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,357 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:38,357 INFO packaging: have _yum_lock for MainThread
>03:47:38,357 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,359 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:38,359 INFO packaging: have _yum_lock for MainThread
>03:47:38,359 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,360 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:38,360 INFO packaging: have _yum_lock for MainThread
>03:47:38,361 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,362 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:38,362 INFO packaging: have _yum_lock for MainThread
>03:47:38,362 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,363 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:38,363 INFO packaging: have _yum_lock for MainThread
>03:47:38,364 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,365 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:38,365 INFO packaging: have _yum_lock for MainThread
>03:47:38,365 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,366 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:38,367 INFO packaging: have _yum_lock for MainThread
>03:47:38,367 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,368 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:38,368 INFO packaging: have _yum_lock for MainThread
>03:47:38,369 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,370 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:38,370 INFO packaging: have _yum_lock for MainThread
>03:47:38,370 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,371 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:38,371 INFO packaging: have _yum_lock for MainThread
>03:47:38,372 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,372 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:38,373 INFO packaging: have _yum_lock for MainThread
>03:47:38,373 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,374 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:38,374 INFO packaging: have _yum_lock for MainThread
>03:47:38,375 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,376 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:38,376 INFO packaging: have _yum_lock for MainThread
>03:47:38,376 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,377 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:38,378 INFO packaging: have _yum_lock for MainThread
>03:47:38,378 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,379 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:38,379 INFO packaging: have _yum_lock for MainThread
>03:47:38,379 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,380 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:38,381 INFO packaging: have _yum_lock for MainThread
>03:47:38,381 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,382 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:38,382 INFO packaging: have _yum_lock for MainThread
>03:47:38,382 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,383 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:38,384 INFO packaging: have _yum_lock for MainThread
>03:47:38,384 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,385 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:38,385 INFO packaging: have _yum_lock for MainThread
>03:47:38,386 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,387 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:38,387 INFO packaging: have _yum_lock for MainThread
>03:47:38,387 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,388 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:38,388 INFO packaging: have _yum_lock for MainThread
>03:47:38,388 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,390 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:38,390 INFO packaging: have _yum_lock for MainThread
>03:47:38,390 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,391 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:38,391 INFO packaging: have _yum_lock for MainThread
>03:47:38,392 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,393 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:38,393 INFO packaging: have _yum_lock for MainThread
>03:47:38,393 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,394 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:38,395 INFO packaging: have _yum_lock for MainThread
>03:47:38,395 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,396 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:38,396 INFO packaging: have _yum_lock for MainThread
>03:47:38,396 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,397 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:38,398 INFO packaging: have _yum_lock for MainThread
>03:47:38,398 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,399 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:38,399 INFO packaging: have _yum_lock for MainThread
>03:47:38,399 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,400 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:38,401 INFO packaging: have _yum_lock for MainThread
>03:47:38,401 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,402 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:38,402 INFO packaging: have _yum_lock for MainThread
>03:47:38,402 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,403 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:38,404 INFO packaging: have _yum_lock for MainThread
>03:47:38,404 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,405 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:38,405 INFO packaging: have _yum_lock for MainThread
>03:47:38,405 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,406 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:38,407 INFO packaging: have _yum_lock for MainThread
>03:47:38,407 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,408 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:38,408 INFO packaging: have _yum_lock for MainThread
>03:47:38,409 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,410 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:38,410 INFO packaging: have _yum_lock for MainThread
>03:47:38,410 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,411 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:38,411 INFO packaging: have _yum_lock for MainThread
>03:47:38,412 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,413 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:38,413 INFO packaging: have _yum_lock for MainThread
>03:47:38,413 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,414 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:38,415 INFO packaging: have _yum_lock for MainThread
>03:47:38,415 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,416 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:38,416 INFO packaging: have _yum_lock for MainThread
>03:47:38,416 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,417 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:38,418 INFO packaging: have _yum_lock for MainThread
>03:47:38,418 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,419 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:38,419 INFO packaging: have _yum_lock for MainThread
>03:47:38,419 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,421 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:38,421 INFO packaging: have _yum_lock for MainThread
>03:47:38,421 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,422 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:38,422 INFO packaging: have _yum_lock for MainThread
>03:47:38,423 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,424 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:38,424 INFO packaging: have _yum_lock for MainThread
>03:47:38,424 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,425 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:38,425 INFO packaging: have _yum_lock for MainThread
>03:47:38,426 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,427 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:38,427 INFO packaging: have _yum_lock for MainThread
>03:47:38,427 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,428 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:38,428 INFO packaging: have _yum_lock for MainThread
>03:47:38,429 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,430 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:38,430 INFO packaging: have _yum_lock for MainThread
>03:47:38,430 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,431 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:38,431 INFO packaging: have _yum_lock for MainThread
>03:47:38,432 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,433 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:38,433 INFO packaging: have _yum_lock for MainThread
>03:47:38,433 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,434 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:38,435 INFO packaging: have _yum_lock for MainThread
>03:47:38,435 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,436 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:38,436 INFO packaging: have _yum_lock for MainThread
>03:47:38,436 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,437 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:38,438 INFO packaging: have _yum_lock for MainThread
>03:47:38,438 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,439 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:38,439 INFO packaging: have _yum_lock for MainThread
>03:47:38,439 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,440 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:38,440 INFO packaging: have _yum_lock for MainThread
>03:47:38,441 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,442 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:38,442 INFO packaging: have _yum_lock for MainThread
>03:47:38,442 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,443 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:38,443 INFO packaging: have _yum_lock for MainThread
>03:47:38,444 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,445 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:38,445 INFO packaging: have _yum_lock for MainThread
>03:47:38,445 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,446 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:38,446 INFO packaging: have _yum_lock for MainThread
>03:47:38,446 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,447 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:38,448 INFO packaging: have _yum_lock for MainThread
>03:47:38,448 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,449 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:38,449 INFO packaging: have _yum_lock for MainThread
>03:47:38,449 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,450 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:38,451 INFO packaging: have _yum_lock for MainThread
>03:47:38,451 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,452 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:38,452 INFO packaging: have _yum_lock for MainThread
>03:47:38,452 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,454 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:38,454 INFO packaging: have _yum_lock for MainThread
>03:47:38,454 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,455 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:38,455 INFO packaging: have _yum_lock for MainThread
>03:47:38,456 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,457 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:38,457 INFO packaging: have _yum_lock for MainThread
>03:47:38,457 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,458 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:38,458 INFO packaging: have _yum_lock for MainThread
>03:47:38,459 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,460 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:38,460 INFO packaging: have _yum_lock for MainThread
>03:47:38,460 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,461 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:38,461 INFO packaging: have _yum_lock for MainThread
>03:47:38,462 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,463 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1141 (groupDescription)
>03:47:38,463 INFO packaging: have _yum_lock for MainThread
>03:47:38,463 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,465 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:269 (_addAddon)
>03:47:38,465 INFO packaging: have _yum_lock for MainThread
>03:47:38,465 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,467 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1141 (groupDescription)
>03:47:38,467 INFO packaging: have _yum_lock for MainThread
>03:47:38,467 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,468 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:269 (_addAddon)
>03:47:38,469 INFO packaging: have _yum_lock for MainThread
>03:47:38,469 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,471 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1141 (groupDescription)
>03:47:38,471 INFO packaging: have _yum_lock for MainThread
>03:47:38,471 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,472 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:269 (_addAddon)
>03:47:38,472 INFO packaging: have _yum_lock for MainThread
>03:47:38,473 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,475 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1141 (groupDescription)
>03:47:38,475 INFO packaging: have _yum_lock for MainThread
>03:47:38,475 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,476 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:269 (_addAddon)
>03:47:38,476 INFO packaging: have _yum_lock for MainThread
>03:47:38,477 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,478 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1141 (groupDescription)
>03:47:38,479 INFO packaging: have _yum_lock for MainThread
>03:47:38,479 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,480 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:269 (_addAddon)
>03:47:38,480 INFO packaging: have _yum_lock for MainThread
>03:47:38,480 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,482 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1141 (groupDescription)
>03:47:38,482 INFO packaging: have _yum_lock for MainThread
>03:47:38,482 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,483 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:269 (_addAddon)
>03:47:38,483 INFO packaging: have _yum_lock for MainThread
>03:47:38,484 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,485 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1141 (groupDescription)
>03:47:38,485 INFO packaging: have _yum_lock for MainThread
>03:47:38,486 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,487 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:269 (_addAddon)
>03:47:38,487 INFO packaging: have _yum_lock for MainThread
>03:47:38,488 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,489 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1141 (groupDescription)
>03:47:38,489 INFO packaging: have _yum_lock for MainThread
>03:47:38,490 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,491 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:269 (_addAddon)
>03:47:38,491 INFO packaging: have _yum_lock for MainThread
>03:47:38,491 INFO packaging: gave up _yum_lock for MainThread
>03:47:38,493 INFO packaging: about to acquire _yum_lock for AnaSoftwareWatcher at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:218 (_initialize)
>03:47:38,493 INFO packaging: have _yum_lock for AnaSoftwareWatcher
>03:47:38,493 DEBUG packaging: deleting package sacks
>03:47:38,494 INFO packaging: gave up _yum_lock for AnaSoftwareWatcher
>03:47:38,496 INFO packaging: about to acquire _yum_lock for AnaSoftwareWatcher at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1051 (selectEnvironment)
>03:47:38,496 INFO packaging: have _yum_lock for AnaSoftwareWatcher
>03:47:38,496 INFO packaging: gave up _yum_lock for AnaSoftwareWatcher
>03:47:38,497 INFO packaging: about to acquire _yum_lock for AnaSoftwareWatcher at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:79 (_apply)
>03:47:38,499 INFO packaging: have _yum_lock for AnaSoftwareWatcher
>03:47:38,499 INFO packaging: gave up _yum_lock for AnaSoftwareWatcher
>03:47:38,504 INFO packaging: checking software selection
>03:47:38,505 INFO packaging: about to acquire _yum_lock for AnaCheckSoftwareThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1370 (checkSoftwareSelection)
>03:47:38,506 INFO packaging: have _yum_lock for AnaCheckSoftwareThread
>03:47:38,506 DEBUG packaging: deleting package sacks
>03:47:38,508 INFO packaging: gave up _yum_lock for AnaCheckSoftwareThread
>03:47:38,509 INFO packaging: about to acquire _yum_lock for AnaCheckSoftwareThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1371 (checkSoftwareSelection)
>03:47:38,510 INFO packaging: have _yum_lock for AnaCheckSoftwareThread
>03:47:38,511 DEBUG packaging: deleting yum transaction info
>03:47:38,511 INFO packaging: gave up _yum_lock for AnaCheckSoftwareThread
>03:47:38,512 DEBUG packaging: select group core
>03:47:38,513 INFO packaging: about to acquire _yum_lock for AnaCheckSoftwareThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1299 (_applyYumSelections)
>03:47:38,517 INFO packaging: have _yum_lock for AnaCheckSoftwareThread
>03:47:38,517 DEBUG yum.verbose.YumBase: Setting up Package Sacks
>03:47:38,839 DEBUG yum.verbose.YumBase: rpmdb time: 0.000
>03:47:38,911 DEBUG yum.verbose.YumBase: pkgsack time: 0.394
>03:47:39,072 DEBUG yum.verbose.YumBase: group time: 0.555
>03:47:39,146 DEBUG yum.verbose.YumBase: Obs Init time: 0.067
>03:47:39,184 DEBUG yum.verbose.YumBase: No package named ppc64-utils available to be installed
>03:47:39,185 INFO packaging: gave up _yum_lock for AnaCheckSoftwareThread
>03:47:39,185 DEBUG packaging: select group core
>03:47:39,187 INFO packaging: about to acquire _yum_lock for AnaCheckSoftwareThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1320 (_applyYumSelections)
>03:47:39,187 INFO packaging: have _yum_lock for AnaCheckSoftwareThread
>03:47:39,187 INFO packaging: gave up _yum_lock for AnaCheckSoftwareThread
>03:47:39,187 DEBUG packaging: select group gnome-desktop
>03:47:39,189 INFO packaging: about to acquire _yum_lock for AnaCheckSoftwareThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1320 (_applyYumSelections)
>03:47:39,189 INFO packaging: have _yum_lock for AnaCheckSoftwareThread
>03:47:39,260 INFO packaging: gave up _yum_lock for AnaCheckSoftwareThread
>03:47:39,261 DEBUG packaging: select group multimedia
>03:47:39,262 INFO packaging: about to acquire _yum_lock for AnaCheckSoftwareThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1320 (_applyYumSelections)
>03:47:39,262 INFO packaging: have _yum_lock for AnaCheckSoftwareThread
>03:47:39,284 INFO packaging: gave up _yum_lock for AnaCheckSoftwareThread
>03:47:39,284 DEBUG packaging: select group input-methods
>03:47:39,285 INFO packaging: about to acquire _yum_lock for AnaCheckSoftwareThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1320 (_applyYumSelections)
>03:47:39,286 INFO packaging: have _yum_lock for AnaCheckSoftwareThread
>03:47:39,298 INFO packaging: gave up _yum_lock for AnaCheckSoftwareThread
>03:47:39,298 DEBUG packaging: select group base-x
>03:47:39,300 INFO packaging: about to acquire _yum_lock for AnaCheckSoftwareThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1320 (_applyYumSelections)
>03:47:39,300 INFO packaging: have _yum_lock for AnaCheckSoftwareThread
>03:47:39,312 DEBUG yum.verbose.YumBase: No package named xorg-x11-drv-geode available to be installed
>03:47:39,315 DEBUG yum.verbose.YumBase: No package named xorg-x11-drv-omap available to be installed
>03:47:39,317 INFO packaging: gave up _yum_lock for AnaCheckSoftwareThread
>03:47:39,317 DEBUG packaging: select group fonts
>03:47:39,318 INFO packaging: about to acquire _yum_lock for AnaCheckSoftwareThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1320 (_applyYumSelections)
>03:47:39,319 INFO packaging: have _yum_lock for AnaCheckSoftwareThread
>03:47:39,408 INFO packaging: gave up _yum_lock for AnaCheckSoftwareThread
>03:47:39,408 DEBUG packaging: select group hardware-support
>03:47:39,410 INFO packaging: about to acquire _yum_lock for AnaCheckSoftwareThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1320 (_applyYumSelections)
>03:47:39,410 INFO packaging: have _yum_lock for AnaCheckSoftwareThread
>03:47:39,429 INFO packaging: gave up _yum_lock for AnaCheckSoftwareThread
>03:47:39,429 DEBUG packaging: select group dial-up
>03:47:39,430 INFO packaging: about to acquire _yum_lock for AnaCheckSoftwareThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1320 (_applyYumSelections)
>03:47:39,431 INFO packaging: have _yum_lock for AnaCheckSoftwareThread
>03:47:39,440 INFO packaging: gave up _yum_lock for AnaCheckSoftwareThread
>03:47:39,440 DEBUG packaging: select group printing
>03:47:39,441 INFO packaging: about to acquire _yum_lock for AnaCheckSoftwareThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1320 (_applyYumSelections)
>03:47:39,442 INFO packaging: have _yum_lock for AnaCheckSoftwareThread
>03:47:39,514 INFO packaging: gave up _yum_lock for AnaCheckSoftwareThread
>03:47:39,514 DEBUG packaging: select group firefox
>03:47:39,516 INFO packaging: about to acquire _yum_lock for AnaCheckSoftwareThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1320 (_applyYumSelections)
>03:47:39,516 INFO packaging: have _yum_lock for AnaCheckSoftwareThread
>03:47:39,518 INFO packaging: gave up _yum_lock for AnaCheckSoftwareThread
>03:47:39,518 DEBUG packaging: select group standard
>03:47:39,519 INFO packaging: about to acquire _yum_lock for AnaCheckSoftwareThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1320 (_applyYumSelections)
>03:47:39,520 INFO packaging: have _yum_lock for AnaCheckSoftwareThread
>03:47:39,586 INFO packaging: gave up _yum_lock for AnaCheckSoftwareThread
>03:47:39,586 DEBUG packaging: select package kernel-PAE
>03:47:39,587 INFO packaging: about to acquire _yum_lock for AnaCheckSoftwareThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1412 (selectKernelPackage)
>03:47:39,588 INFO packaging: have _yum_lock for AnaCheckSoftwareThread
>03:47:39,612 DEBUG yum.verbose.YumBase: Checking for virtual provide or file-provide for kernel-PAE
>03:47:39,614 INFO packaging: gave up _yum_lock for AnaCheckSoftwareThread
>03:47:39,614 INFO packaging: no kernel-PAE package
>03:47:39,615 DEBUG packaging: select package kernel
>03:47:39,616 INFO packaging: about to acquire _yum_lock for AnaCheckSoftwareThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1412 (selectKernelPackage)
>03:47:39,616 INFO packaging: have _yum_lock for AnaCheckSoftwareThread
>03:47:39,617 INFO packaging: gave up _yum_lock for AnaCheckSoftwareThread
>03:47:39,618 INFO packaging: selected kernel
>03:47:39,618 INFO packaging: about to acquire _yum_lock for AnaCheckSoftwareThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1336 (_applyYumSelections)
>03:47:39,619 INFO packaging: have _yum_lock for AnaCheckSoftwareThread
>03:47:39,619 INFO packaging: gave up _yum_lock for AnaCheckSoftwareThread
>03:47:39,620 INFO packaging: about to acquire _yum_lock for AnaCheckSoftwareThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:101 (checkSoftwareSelection)
>03:47:39,620 INFO packaging: have _yum_lock for AnaCheckSoftwareThread
>03:47:39,620 INFO packaging: checking dependencies
>03:47:39,872 DEBUG yum.verbose.YumBase: Building updates object
>03:47:39,988 DEBUG yum.verbose.YumBase: up:simple updates time: 0.045
>03:47:40,060 DEBUG yum.verbose.YumBase: up:obs time: 0.072
>03:47:40,061 DEBUG yum.verbose.YumBase: up:condense time: 0.000
>03:47:40,062 DEBUG yum.verbose.YumBase: updates time: 0.189
>03:47:40,064 DEBUG yum.verbose.YumBase: TSINFO: Marking glib2-2.36.1-2.fc19.x86_64 as install for ModemManager-0.6.0.0-3.fc19.x86_64
>03:47:40,067 DEBUG yum.verbose.YumBase: TSINFO: Marking dbus-glib-0.100-3.fc19.x86_64 as install for ModemManager-0.6.0.0-3.fc19.x86_64
>03:47:40,069 DEBUG yum.verbose.YumBase: TSINFO: Marking libgudev1-203-2.fc19.x86_64 as install for ModemManager-0.6.0.0-3.fc19.x86_64
>03:47:40,071 DEBUG yum.verbose.YumBase: Quick matched glib2-2.36.1-2.fc19.x86_64 to require for libgmodule-2.0.so.0()(64bit)
>03:47:40,071 DEBUG yum.verbose.YumBase: Quick matched glib2-2.36.1-2.fc19.x86_64 to require for libglib-2.0.so.0()(64bit)
>03:47:40,073 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:dbus-libs-1.6.8-5.fc19.x86_64 as install for ModemManager-0.6.0.0-3.fc19.x86_64
>03:47:40,085 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:NetworkManager-glib-0.9.8.1-1.git20130327.fc19.x86_64 as install for 1:NetworkManager-0.9.8.1-1.git20130327.fc19.x86_64
>03:47:40,088 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:wpa_supplicant-1.1-1.fc19.x86_64 as install for 1:NetworkManager-0.9.8.1-1.git20130327.fc19.x86_64
>03:47:40,091 DEBUG yum.verbose.YumBase: TSINFO: Marking libnl3-3.2.21-1.fc19.x86_64 as install for 1:NetworkManager-0.9.8.1-1.git20130327.fc19.x86_64
>03:47:40,094 DEBUG yum.verbose.YumBase: TSINFO: Marking systemd-sysv-203-2.fc19.x86_64 as install for 1:NetworkManager-0.9.8.1-1.git20130327.fc19.x86_64
>03:47:40,105 DEBUG yum.verbose.YumBase: TSINFO: Marking iptables-1.4.18-1.fc19.x86_64 as install for 1:NetworkManager-0.9.8.1-1.git20130327.fc19.x86_64
>03:47:40,108 DEBUG yum.verbose.YumBase: TSINFO: Marking dnsmasq-2.66-3.fc19.x86_64 as install for 1:NetworkManager-0.9.8.1-1.git20130327.fc19.x86_64
>03:47:40,110 DEBUG yum.verbose.YumBase: TSINFO: Marking chkconfig-1.3.60-1.fc19.x86_64 as install for 1:NetworkManager-0.9.8.1-1.git20130327.fc19.x86_64
>03:47:40,111 DEBUG yum.verbose.YumBase: Quick matched chkconfig-1.3.60-1.fc19.x86_64 to require for chkconfig
>03:47:40,113 DEBUG yum.verbose.YumBase: TSINFO: Marking avahi-autoipd-0.6.31-11.fc19.x86_64 as install for 1:NetworkManager-0.9.8.1-1.git20130327.fc19.x86_64
>03:47:40,115 DEBUG yum.verbose.YumBase: TSINFO: Marking libuuid-2.23-1.fc19.x86_64 as install for 1:NetworkManager-0.9.8.1-1.git20130327.fc19.x86_64
>03:47:40,118 DEBUG yum.verbose.YumBase: TSINFO: Marking systemd-libs-203-2.fc19.x86_64 as install for 1:NetworkManager-0.9.8.1-1.git20130327.fc19.x86_64
>03:47:40,122 DEBUG yum.verbose.YumBase: TSINFO: Marking nss-3.14.3-12.0.fc19.x86_64 as install for 1:NetworkManager-0.9.8.1-1.git20130327.fc19.x86_64
>03:47:40,124 DEBUG yum.verbose.YumBase: Quick matched nss-3.14.3-12.0.fc19.x86_64 to require for libsmime3.so()(64bit)
>03:47:40,125 DEBUG yum.verbose.YumBase: TSINFO: Marking polkit-0.110-3.fc19.x86_64 as install for 1:NetworkManager-0.9.8.1-1.git20130327.fc19.x86_64
>03:47:40,128 DEBUG yum.verbose.YumBase: TSINFO: Marking nspr-4.9.5-2.fc19.x86_64 as install for 1:NetworkManager-0.9.8.1-1.git20130327.fc19.x86_64
>03:47:40,129 DEBUG yum.verbose.YumBase: Quick matched nspr-4.9.5-2.fc19.x86_64 to require for libplc4.so()(64bit)
>03:47:40,131 DEBUG yum.verbose.YumBase: TSINFO: Marking nss-util-3.14.3-1.fc19.x86_64 as install for 1:NetworkManager-0.9.8.1-1.git20130327.fc19.x86_64
>03:47:40,133 DEBUG yum.verbose.YumBase: Quick matched 1:NetworkManager-glib-0.9.8.1-1.git20130327.fc19.x86_64 to require for libnm-glib.so.4()(64bit)
>03:47:40,134 DEBUG yum.verbose.YumBase: Quick matched libnl3-3.2.21-1.fc19.x86_64 to require for libnl-genl-3.so.200()(64bit)
>03:47:40,135 DEBUG yum.verbose.YumBase: Quick matched libnl3-3.2.21-1.fc19.x86_64 to require for libnl-3.so.200()(64bit)
>03:47:40,142 DEBUG yum.verbose.YumBase: TSINFO: Marking shared-mime-info-1.1-4.fc19.x86_64 as install for NetworkManager-l2tp-0.9.8-1.fc19.x86_64
>03:47:40,146 DEBUG yum.verbose.YumBase: TSINFO: Marking xl2tpd-1.3.1-13.fc19.x86_64 as install for NetworkManager-l2tp-0.9.8-1.fc19.x86_64
>03:47:40,148 DEBUG yum.verbose.YumBase: TSINFO: Marking pptp-1.7.2-20.fc19.x86_64 as install for NetworkManager-l2tp-0.9.8-1.fc19.x86_64
>03:47:40,150 DEBUG yum.verbose.YumBase: TSINFO: Marking openswan-2.6.38-11.fc19.x86_64 as install for NetworkManager-l2tp-0.9.8-1.fc19.x86_64
>03:47:40,153 DEBUG yum.verbose.YumBase: TSINFO: Marking gnome-keyring-3.8.2-1.fc19.x86_64 as install for NetworkManager-l2tp-0.9.8-1.fc19.x86_64
>03:47:40,156 DEBUG yum.verbose.YumBase: TSINFO: Marking 
pango-1.34.0-1.fc19.x86_64 as install for NetworkManager-l2tp-0.9.8-1.fc19.x86_64 >03:47:40,158 DEBUG yum.verbose.YumBase: Quick matched pango-1.34.0-1.fc19.x86_64 to require for libpango-1.0.so.0()(64bit) >03:47:40,159 DEBUG yum.verbose.YumBase: TSINFO: Marking gtk3-3.8.1-1.fc19.x86_64 as install for NetworkManager-l2tp-0.9.8-1.fc19.x86_64 >03:47:40,164 DEBUG yum.verbose.YumBase: TSINFO: Marking libgnome-keyring-3.8.0-1.fc19.x86_64 as install for NetworkManager-l2tp-0.9.8-1.fc19.x86_64 >03:47:40,167 DEBUG yum.verbose.YumBase: TSINFO: Marking gdk-pixbuf2-2.28.1-1.fc19.x86_64 as install for NetworkManager-l2tp-0.9.8-1.fc19.x86_64 >03:47:40,170 DEBUG yum.verbose.YumBase: TSINFO: Marking cairo-1.12.14-1.fc19.x86_64 as install for NetworkManager-l2tp-0.9.8-1.fc19.x86_64 >03:47:40,172 DEBUG yum.verbose.YumBase: TSINFO: Marking cairo-gobject-1.12.14-1.fc19.x86_64 as install for NetworkManager-l2tp-0.9.8-1.fc19.x86_64 >03:47:40,175 DEBUG yum.verbose.YumBase: TSINFO: Marking atk-2.8.0-1.fc19.x86_64 as install for NetworkManager-l2tp-0.9.8-1.fc19.x86_64 >03:47:40,183 DEBUG yum.verbose.YumBase: TSINFO: Marking openconnect-4.99-1.fc19.x86_64 as install for NetworkManager-openconnect-0.9.7.0-2.git20120918.fc19.x86_64 >03:47:40,186 DEBUG yum.verbose.YumBase: TSINFO: Marking libxml2-2.9.1-1.fc19.x86_64 as install for NetworkManager-openconnect-0.9.7.0-2.git20120918.fc19.x86_64 >03:47:40,187 DEBUG yum.verbose.YumBase: Quick matched libxml2-2.9.1-1.fc19.x86_64 to require for libxml2.so.2(LIBXML2_2.4.30)(64bit) >03:47:40,190 DEBUG yum.verbose.YumBase: TSINFO: Marking GConf2-3.2.6-6.fc19.x86_64 as install for NetworkManager-openconnect-0.9.7.0-2.git20120918.fc19.x86_64 >03:47:40,200 DEBUG yum.verbose.YumBase: TSINFO: Marking openvpn-2.3.1-2.fc19.x86_64 as install for 1:NetworkManager-openvpn-0.9.6.0-2.fc19.x86_64 >03:47:40,204 DEBUG yum.verbose.YumBase: TSINFO: Marking desktop-file-utils-0.21-2.fc19.x86_64 as install for 1:NetworkManager-openvpn-0.9.6.0-2.fc19.x86_64 >03:47:40,217 
DEBUG yum.verbose.YumBase: TSINFO: Marking vpnc-0.5.3-17.svn457.fc19.x86_64 as install for 1:NetworkManager-vpnc-0.9.3.997-4.fc19.x86_64 >03:47:40,221 DEBUG yum.verbose.YumBase: TSINFO: Marking PackageKit-glib-0.8.7-4.fc19.x86_64 as install for PackageKit-command-not-found-0.8.7-4.fc19.x86_64 >03:47:40,223 DEBUG yum.verbose.YumBase: TSINFO: Marking sqlite-3.7.16.2-1.fc19.x86_64 as install for PackageKit-command-not-found-0.8.7-4.fc19.x86_64 >03:47:40,225 DEBUG yum.verbose.YumBase: TSINFO: Marking libarchive-3.1.2-2.fc19.x86_64 as install for PackageKit-command-not-found-0.8.7-4.fc19.x86_64 >03:47:40,231 DEBUG yum.verbose.YumBase: TSINFO: Marking gstreamer-0.10.36-3.fc19.x86_64 as install for PackageKit-gstreamer-plugin-0.8.7-4.fc19.x86_64 >03:47:40,232 DEBUG yum.verbose.YumBase: Quick matched gstreamer-0.10.36-3.fc19.x86_64 to require for libgstreamer-0.10.so.0()(64bit) >03:47:40,237 DEBUG yum.verbose.YumBase: TSINFO: Marking gtk2-2.24.17-1.fc19.x86_64 as install for PackageKit-gtk3-module-0.8.7-4.fc19.x86_64 >03:47:40,241 DEBUG yum.verbose.YumBase: Quick matched gtk2-2.24.17-1.fc19.x86_64 to require for libgdk-x11-2.0.so.0()(64bit) >03:47:40,242 DEBUG yum.verbose.YumBase: TSINFO: Marking freetype-2.4.11-3.fc19.x86_64 as install for PackageKit-gtk3-module-0.8.7-4.fc19.x86_64 >03:47:40,244 DEBUG yum.verbose.YumBase: TSINFO: Marking fontconfig-2.10.92-3.fc19.x86_64 as install for PackageKit-gtk3-module-0.8.7-4.fc19.x86_64 >03:47:40,247 DEBUG yum.verbose.YumBase: TSINFO: Marking PackageKit-0.8.7-4.fc19.x86_64 as install for PackageKit-yum-plugin-0.8.7-4.fc19.x86_64 >03:47:40,250 DEBUG yum.verbose.YumBase: TSINFO: Marking dbus-python-1.1.1-5.fc19.x86_64 as install for PackageKit-yum-plugin-0.8.7-4.fc19.x86_64 >03:47:40,253 DEBUG yum.verbose.YumBase: TSINFO: Marking fontpackages-filesystem-1.44-7.fc19.noarch as install for abattis-cantarell-fonts-0.0.12-2.fc19.noarch >03:47:40,256 DEBUG yum.verbose.YumBase: TSINFO: Marking abrt-2.1.3-2.fc19.x86_64 as install for 
abrt-desktop-2.1.3-2.fc19.x86_64 >03:47:40,259 DEBUG yum.verbose.YumBase: TSINFO: Marking gdb-7.6-24.fc19.x86_64 as install for abrt-desktop-2.1.3-2.fc19.x86_64 >03:47:40,261 DEBUG yum.verbose.YumBase: TSINFO: Marking libreport-plugin-ureport-2.1.3-3.fc19.x86_64 as install for abrt-desktop-2.1.3-2.fc19.x86_64 >03:47:40,264 DEBUG yum.verbose.YumBase: TSINFO: Marking libreport-plugin-logger-2.1.3-3.fc19.x86_64 as install for abrt-desktop-2.1.3-2.fc19.x86_64 >03:47:40,266 DEBUG yum.verbose.YumBase: TSINFO: Marking libreport-plugin-bugzilla-2.1.3-3.fc19.x86_64 as install for abrt-desktop-2.1.3-2.fc19.x86_64 >03:47:40,268 DEBUG yum.verbose.YumBase: TSINFO: Marking libreport-fedora-2.1.3-3.fc19.x86_64 as install for abrt-desktop-2.1.3-2.fc19.x86_64 >03:47:40,270 DEBUG yum.verbose.YumBase: TSINFO: Marking gnome-abrt-0.2.12-3.fc19.x86_64 as install for abrt-desktop-2.1.3-2.fc19.x86_64 >03:47:40,272 DEBUG yum.verbose.YumBase: TSINFO: Marking elfutils-0.155-5.fc19.x86_64 as install for abrt-desktop-2.1.3-2.fc19.x86_64 >03:47:40,277 DEBUG yum.verbose.YumBase: TSINFO: Marking abrt-retrace-client-2.1.3-2.fc19.x86_64 as install for abrt-desktop-2.1.3-2.fc19.x86_64 >03:47:40,279 DEBUG yum.verbose.YumBase: TSINFO: Marking abrt-plugin-bodhi-2.1.3-2.fc19.x86_64 as install for abrt-desktop-2.1.3-2.fc19.x86_64 >03:47:40,281 DEBUG yum.verbose.YumBase: TSINFO: Marking abrt-gui-2.1.3-2.fc19.x86_64 as install for abrt-desktop-2.1.3-2.fc19.x86_64 >03:47:40,284 DEBUG yum.verbose.YumBase: TSINFO: Marking abrt-addon-xorg-2.1.3-2.fc19.x86_64 as install for abrt-desktop-2.1.3-2.fc19.x86_64 >03:47:40,286 DEBUG yum.verbose.YumBase: TSINFO: Marking abrt-addon-vmcore-2.1.3-2.fc19.x86_64 as install for abrt-desktop-2.1.3-2.fc19.x86_64 >03:47:40,288 DEBUG yum.verbose.YumBase: TSINFO: Marking abrt-addon-python-2.1.3-2.fc19.x86_64 as install for abrt-desktop-2.1.3-2.fc19.x86_64 >03:47:40,290 DEBUG yum.verbose.YumBase: TSINFO: Marking abrt-addon-kerneloops-2.1.3-2.fc19.x86_64 as install for 
abrt-desktop-2.1.3-2.fc19.x86_64 >03:47:40,292 DEBUG yum.verbose.YumBase: TSINFO: Marking abrt-addon-ccpp-2.1.3-2.fc19.x86_64 as install for abrt-desktop-2.1.3-2.fc19.x86_64 >03:47:40,297 DEBUG yum.verbose.YumBase: TSINFO: Marking libacl-2.2.51-9.fc19.x86_64 as install for acl-2.2.51-9.fc19.x86_64 >03:47:40,298 DEBUG yum.verbose.YumBase: Quick matched libacl-2.2.51-9.fc19.x86_64 to require for libacl.so.1(ACL_1.0)(64bit) >03:47:40,300 DEBUG yum.verbose.YumBase: TSINFO: Marking libattr-2.4.46-10.fc19.x86_64 as install for acl-2.2.51-9.fc19.x86_64 >03:47:40,307 DEBUG yum.verbose.YumBase: TSINFO: Marking 5:guile-2.0.9-1.fc19.x86_64 as install for 1:aisleriot-3.7.91-1.fc19.x86_64 >03:47:40,308 DEBUG yum.verbose.YumBase: Quick matched 5:guile-2.0.9-1.fc19.x86_64 to require for libguile-2.0.so.22()(64bit) >03:47:40,310 DEBUG yum.verbose.YumBase: TSINFO: Marking gc-7.2d-2.fc19.x86_64 as install for 1:aisleriot-3.7.91-1.fc19.x86_64 >03:47:40,312 DEBUG yum.verbose.YumBase: TSINFO: Marking libcanberra-0.30-3.fc19.x86_64 as install for 1:aisleriot-3.7.91-1.fc19.x86_64 >03:47:40,315 DEBUG yum.verbose.YumBase: TSINFO: Marking libX11-1.5.99.901-2.fc19.x86_64 as install for 1:aisleriot-3.7.91-1.fc19.x86_64 >03:47:40,317 DEBUG yum.verbose.YumBase: TSINFO: Marking alsa-tools-firmware-1.0.27-1.fc19.x86_64 as install for alsa-firmware-1.0.27-1.fc19.noarch >03:47:40,321 DEBUG yum.verbose.YumBase: TSINFO: Marking pulseaudio-libs-3.0-7.fc19.x86_64 as install for alsa-plugins-pulseaudio-1.0.27-1.fc19.x86_64 >03:47:40,324 DEBUG yum.verbose.YumBase: TSINFO: Marking alsa-lib-1.0.27-3.fc19.x86_64 as install for alsa-plugins-pulseaudio-1.0.27-1.fc19.x86_64 >03:47:40,325 DEBUG yum.verbose.YumBase: Quick matched alsa-lib-1.0.27-3.fc19.x86_64 to require for libasound.so.2(ALSA_0.9)(64bit) >03:47:40,334 DEBUG yum.verbose.YumBase: TSINFO: Marking libsamplerate-0.1.8-4.fc19.x86_64 as install for alsa-utils-1.0.27-2.fc19.x86_64 >03:47:40,335 DEBUG yum.verbose.YumBase: Quick matched 
libsamplerate-0.1.8-4.fc19.x86_64 to require for libsamplerate.so.0(libsamplerate.so.0.0)(64bit) >03:47:40,337 DEBUG yum.verbose.YumBase: TSINFO: Marking dialog-1.2-1.20121230.fc19.x86_64 as install for alsa-utils-1.0.27-2.fc19.x86_64 >03:47:40,339 DEBUG yum.verbose.YumBase: TSINFO: Marking ncurses-libs-5.9-10.20130413.fc19.x86_64 as install for alsa-utils-1.0.27-2.fc19.x86_64 >03:47:40,341 DEBUG yum.verbose.YumBase: Quick matched ncurses-libs-5.9-10.20130413.fc19.x86_64 to require for libncursesw.so.5()(64bit) >03:47:40,341 DEBUG yum.verbose.YumBase: Quick matched ncurses-libs-5.9-10.20130413.fc19.x86_64 to require for libmenuw.so.5()(64bit) >03:47:40,342 DEBUG yum.verbose.YumBase: Quick matched ncurses-libs-5.9-10.20130413.fc19.x86_64 to require for libformw.so.5()(64bit) >03:47:40,347 DEBUG yum.verbose.YumBase: TSINFO: Marking pam-1.1.6-10.fc19.x86_64 as install for at-3.1.13-12.fc19.x86_64 >03:47:40,350 DEBUG yum.verbose.YumBase: TSINFO: Marking libselinux-2.1.13-12.fc19.x86_64 as install for at-3.1.13-12.fc19.x86_64 >03:47:40,351 DEBUG yum.verbose.YumBase: Quick matched pam-1.1.6-10.fc19.x86_64 to require for libpam.so.0()(64bit) >03:47:40,359 DEBUG yum.verbose.YumBase: TSINFO: Marking libXtst-1.2.1-5.fc19.x86_64 as install for at-spi2-core-2.8.0-1.fc19.x86_64 >03:47:40,361 DEBUG yum.verbose.YumBase: TSINFO: Marking libXext-1.3.1-4.fc19.x86_64 as install for at-spi2-core-2.8.0-1.fc19.x86_64 >03:47:40,362 DEBUG yum.verbose.YumBase: TSINFO: Marking libXevie-1.0.3-6.fc19.x86_64 as install for at-spi2-core-2.8.0-1.fc19.x86_64 >03:47:40,374 DEBUG yum.verbose.YumBase: TSINFO: Marking audit-libs-2.3-2.fc19.x86_64 as install for audit-2.3-2.fc19.x86_64 >03:47:40,377 DEBUG yum.verbose.YumBase: TSINFO: Marking krb5-libs-1.11.2-2.fc19.x86_64 as install for audit-2.3-2.fc19.x86_64 >03:47:40,378 DEBUG yum.verbose.YumBase: Quick matched krb5-libs-1.11.2-2.fc19.x86_64 to require for libgssapi_krb5.so.2(gssapi_krb5_2_MIT)(64bit) >03:47:40,382 DEBUG yum.verbose.YumBase: 
TSINFO: Marking tcp_wrappers-libs-7.6-73.fc19.x86_64 as install for audit-2.3-2.fc19.x86_64 >03:47:40,386 DEBUG yum.verbose.YumBase: Quick matched krb5-libs-1.11.2-2.fc19.x86_64 to require for libgssapi_krb5.so.2()(64bit) >03:47:40,387 DEBUG yum.verbose.YumBase: Quick matched audit-libs-2.3-2.fc19.x86_64 to require for libaudit.so.1()(64bit) >03:47:40,397 DEBUG yum.verbose.YumBase: TSINFO: Marking python-2.7.4-4.fc19.x86_64 as install for authconfig-6.2.6-2.fc19.x86_64 >03:47:40,402 DEBUG yum.verbose.YumBase: TSINFO: Marking libpwquality-1.2.1-2.fc19.x86_64 as install for authconfig-6.2.6-2.fc19.x86_64 >03:47:40,409 DEBUG yum.verbose.YumBase: TSINFO: Marking newt-python-0.52.15-1.fc19.x86_64 as install for authconfig-6.2.6-2.fc19.x86_64 >03:47:40,411 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:openssl-1.0.1e-4.fc19.x86_64 as install for authconfig-6.2.6-2.fc19.x86_64 >03:47:40,420 DEBUG yum.verbose.YumBase: TSINFO: Marking avahi-libs-0.6.31-11.fc19.x86_64 as install for avahi-0.6.31-11.fc19.x86_64 >03:47:40,422 DEBUG yum.verbose.YumBase: TSINFO: Marking libdaemon-0.14-5.fc19.x86_64 as install for avahi-0.6.31-11.fc19.x86_64 >03:47:40,424 DEBUG yum.verbose.YumBase: TSINFO: Marking expat-2.1.0-5.fc19.x86_64 as install for avahi-0.6.31-11.fc19.x86_64 >03:47:40,426 DEBUG yum.verbose.YumBase: TSINFO: Marking glibc-common-2.17-4.fc19.x86_64 as install for avahi-0.6.31-11.fc19.x86_64 >03:47:40,428 DEBUG yum.verbose.YumBase: TSINFO: Marking libcap-2.22-5.fc19.x86_64 as install for avahi-0.6.31-11.fc19.x86_64 >03:47:40,432 DEBUG yum.verbose.YumBase: TSINFO: Marking kmod-13-2.fc19.x86_64 as install for b43-openfwwf-5.2-9.fc19.noarch >03:47:40,448 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:pkgconfig-0.27.1-1.fc19.x86_64 as install for 1:bash-completion-2.1-2.fc19.noarch >03:47:40,452 DEBUG yum.verbose.YumBase: TSINFO: Marking info-5.1-1.fc19.x86_64 as install for bc-1.06.95-9.fc19.x86_64 >03:47:40,454 DEBUG yum.verbose.YumBase: TSINFO: Marking readline-6.2-6.fc19.x86_64 as 
install for bc-1.06.95-9.fc19.x86_64 >03:47:40,461 DEBUG yum.verbose.YumBase: TSINFO: Marking libidn-1.26-2.fc19.x86_64 as install for 32:bind-utils-9.9.3-0.2.rc1.fc19.x86_64 >03:47:40,464 DEBUG yum.verbose.YumBase: TSINFO: Marking zlib-1.2.7-10.fc19.x86_64 as install for 32:bind-utils-9.9.3-0.2.rc1.fc19.x86_64 >03:47:40,466 DEBUG yum.verbose.YumBase: TSINFO: Marking 32:bind-libs-9.9.3-0.2.rc1.fc19.x86_64 as install for 32:bind-utils-9.9.3-0.2.rc1.fc19.x86_64 >03:47:40,468 DEBUG yum.verbose.YumBase: Quick matched 32:bind-libs-9.9.3-0.2.rc1.fc19.x86_64 to require for libisccfg.so.90()(64bit) >03:47:40,468 DEBUG yum.verbose.YumBase: Quick matched 32:bind-libs-9.9.3-0.2.rc1.fc19.x86_64 to require for libisccc.so.90()(64bit) >03:47:40,469 DEBUG yum.verbose.YumBase: Quick matched 32:bind-libs-9.9.3-0.2.rc1.fc19.x86_64 to require for libisc.so.95()(64bit) >03:47:40,471 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:openssl-libs-1.0.1e-4.fc19.x86_64 as install for 32:bind-utils-9.9.3-0.2.rc1.fc19.x86_64 >03:47:40,473 DEBUG yum.verbose.YumBase: TSINFO: Marking libcom_err-1.42.7-2.fc19.x86_64 as install for 32:bind-utils-9.9.3-0.2.rc1.fc19.x86_64 >03:47:40,477 DEBUG yum.verbose.YumBase: TSINFO: Marking pciutils-libs-3.1.10-3.fc19.x86_64 as install for biosdevname-0.4.1-4.fc19.x86_64 >03:47:40,478 DEBUG yum.verbose.YumBase: Quick matched pciutils-libs-3.1.10-3.fc19.x86_64 to require for libpci.so.3()(64bit) >03:47:40,481 DEBUG yum.verbose.YumBase: TSINFO: Marking bluez-libs-4.101-6.fc19.x86_64 as install for bluez-cups-4.101-6.fc19.x86_64 >03:47:40,491 DEBUG yum.verbose.YumBase: TSINFO: Marking genisoimage-1.1.11-17.fc19.x86_64 as install for brasero-3.8.0-1.fc19.x86_64 >03:47:40,493 DEBUG yum.verbose.YumBase: TSINFO: Marking libisofs-1.2.8-1.fc19.x86_64 as install for brasero-3.8.0-1.fc19.x86_64 >03:47:40,495 DEBUG yum.verbose.YumBase: TSINFO: Marking libburn-1.2.8-1.fc19.x86_64 as install for brasero-3.8.0-1.fc19.x86_64 >03:47:40,497 DEBUG yum.verbose.YumBase: TSINFO: 
Marking dvd+rw-tools-7.1-12.fc19.x86_64 as install for brasero-3.8.0-1.fc19.x86_64 >03:47:40,499 DEBUG yum.verbose.YumBase: TSINFO: Marking wodim-1.1.11-17.fc19.x86_64 as install for brasero-3.8.0-1.fc19.x86_64 >03:47:40,502 DEBUG yum.verbose.YumBase: TSINFO: Marking cdrdao-1.2.3-19.fc19.x86_64 as install for brasero-3.8.0-1.fc19.x86_64 >03:47:40,504 DEBUG yum.verbose.YumBase: TSINFO: Marking icedax-1.1.11-17.fc19.x86_64 as install for brasero-3.8.0-1.fc19.x86_64 >03:47:40,506 DEBUG yum.verbose.YumBase: TSINFO: Marking tracker-0.16.1-1.fc19.x86_64 as install for brasero-3.8.0-1.fc19.x86_64 >03:47:40,509 DEBUG yum.verbose.YumBase: TSINFO: Marking totem-pl-parser-3.4.4-1.fc19.x86_64 as install for brasero-3.8.0-1.fc19.x86_64 >03:47:40,511 DEBUG yum.verbose.YumBase: TSINFO: Marking libnotify-0.7.5-5.fc19.x86_64 as install for brasero-3.8.0-1.fc19.x86_64 >03:47:40,516 DEBUG yum.verbose.YumBase: TSINFO: Marking gstreamer1-plugins-base-1.0.7-1.fc19.x86_64 as install for brasero-3.8.0-1.fc19.x86_64 >03:47:40,518 DEBUG yum.verbose.YumBase: Quick matched gstreamer1-plugins-base-1.0.7-1.fc19.x86_64 to require for libgsttag-1.0.so.0()(64bit) >03:47:40,520 DEBUG yum.verbose.YumBase: TSINFO: Marking gstreamer1-1.0.7-1.fc19.x86_64 as install for brasero-3.8.0-1.fc19.x86_64 >03:47:40,538 DEBUG yum.verbose.YumBase: TSINFO: Marking brasero-libs-3.8.0-1.fc19.x86_64 as install for brasero-3.8.0-1.fc19.x86_64 >03:47:40,539 DEBUG yum.verbose.YumBase: Quick matched brasero-libs-3.8.0-1.fc19.x86_64 to require for libbrasero-media3.so.1()(64bit) >03:47:40,540 DEBUG yum.verbose.YumBase: Quick matched brasero-libs-3.8.0-1.fc19.x86_64 to require for libbrasero-burn3.so.1()(64bit) >03:47:40,541 DEBUG yum.verbose.YumBase: TSINFO: Marking libSM-1.2.1-5.fc19.x86_64 as install for brasero-3.8.0-1.fc19.x86_64 >03:47:40,543 DEBUG yum.verbose.YumBase: TSINFO: Marking libICE-1.0.8-5.fc19.x86_64 as install for brasero-3.8.0-1.fc19.x86_64 >03:47:40,551 DEBUG yum.verbose.YumBase: TSINFO: Marking 
nautilus-extensions-3.8.1-1.fc19.x86_64 as install for brasero-nautilus-3.8.0-1.fc19.x86_64 >03:47:40,570 DEBUG yum.verbose.YumBase: TSINFO: Marking libblkid-2.23-1.fc19.x86_64 as install for btrfs-progs-0.20.rc1.20130308git704a08c-1.fc19.x86_64 >03:47:40,576 DEBUG yum.verbose.YumBase: Quick matched libblkid-2.23-1.fc19.x86_64 to require for libblkid.so.1(BLKID_2.17)(64bit) >03:47:40,577 DEBUG yum.verbose.YumBase: Quick matched libblkid-2.23-1.fc19.x86_64 to require for libblkid.so.1(BLKID_2.15)(64bit) >03:47:40,577 DEBUG yum.verbose.YumBase: Quick matched libblkid-2.23-1.fc19.x86_64 to require for libblkid.so.1(BLKID_1.0)(64bit) >03:47:40,583 DEBUG yum.verbose.YumBase: TSINFO: Marking e2fsprogs-libs-1.42.7-2.fc19.x86_64 as install for btrfs-progs-0.20.rc1.20130308git704a08c-1.fc19.x86_64 >03:47:40,586 DEBUG yum.verbose.YumBase: TSINFO: Marking bzip2-libs-1.0.6-8.fc19.x86_64 as install for bzip2-1.0.6-8.fc19.x86_64 >03:47:40,599 DEBUG yum.verbose.YumBase: TSINFO: Marking python-caribou-0.4.10-1.fc19.noarch as install for caribou-0.4.10-1.fc19.x86_64 >03:47:40,606 DEBUG yum.verbose.YumBase: TSINFO: Marking gobject-introspection-1.36.0-1.fc19.x86_64 as install for caribou-0.4.10-1.fc19.x86_64 >03:47:40,608 DEBUG yum.verbose.YumBase: TSINFO: Marking libxklavier-5.3-2.fc19.x86_64 as install for caribou-0.4.10-1.fc19.x86_64 >03:47:40,610 DEBUG yum.verbose.YumBase: TSINFO: Marking json-glib-0.16.0-1.fc19.x86_64 as install for caribou-0.4.10-1.fc19.x86_64 >03:47:40,614 DEBUG yum.verbose.YumBase: TSINFO: Marking libgee-0.10.1-1.fc19.x86_64 as install for caribou-0.4.10-1.fc19.x86_64 >03:47:40,618 DEBUG yum.verbose.YumBase: TSINFO: Marking cogl-1.14.0-1.fc19.x86_64 as install for caribou-0.4.10-1.fc19.x86_64 >03:47:40,624 DEBUG yum.verbose.YumBase: Quick matched cogl-1.14.0-1.fc19.x86_64 to require for libcogl-pango.so.12()(64bit) >03:47:40,625 DEBUG yum.verbose.YumBase: TSINFO: Marking clutter-1.14.2-1.fc19.x86_64 as install for caribou-0.4.10-1.fc19.x86_64 >03:47:40,627 
DEBUG yum.verbose.YumBase: TSINFO: Marking libXrandr-1.4.0-3.fc19.x86_64 as install for caribou-0.4.10-1.fc19.x86_64 >03:47:40,630 DEBUG yum.verbose.YumBase: TSINFO: Marking libXi-1.7.1-1.fc19.x86_64 as install for caribou-0.4.10-1.fc19.x86_64 >03:47:40,636 DEBUG yum.verbose.YumBase: TSINFO: Marking libXfixes-5.0-5.fc19.x86_64 as install for caribou-0.4.10-1.fc19.x86_64 >03:47:40,638 DEBUG yum.verbose.YumBase: TSINFO: Marking libXdamage-1.1.4-3.fc19.x86_64 as install for caribou-0.4.10-1.fc19.x86_64 >03:47:40,640 DEBUG yum.verbose.YumBase: TSINFO: Marking libXcomposite-0.4.4-3.fc19.x86_64 as install for caribou-0.4.10-1.fc19.x86_64 >03:47:40,673 DEBUG yum.verbose.YumBase: TSINFO: Marking 2:cheese-libs-3.8.1-1.fc19.x86_64 as install for 2:cheese-3.8.1-1.fc19.x86_64 >03:47:40,676 DEBUG yum.verbose.YumBase: TSINFO: Marking gnome-video-effects-0.4.0-5.fc19.noarch as install for 2:cheese-3.8.1-1.fc19.x86_64 >03:47:40,679 DEBUG yum.verbose.YumBase: TSINFO: Marking gnome-desktop3-3.8.1-1.fc19.x86_64 as install for 2:cheese-3.8.1-1.fc19.x86_64 >03:47:40,681 DEBUG yum.verbose.YumBase: TSINFO: Marking clutter-gtk-1.4.4-1.fc19.x86_64 as install for 2:cheese-3.8.1-1.fc19.x86_64 >03:47:40,690 DEBUG yum.verbose.YumBase: TSINFO: Marking clutter-gst2-2.0.2-1.fc19.x86_64 as install for 2:cheese-3.8.1-1.fc19.x86_64 >03:47:40,692 DEBUG yum.verbose.YumBase: Quick matched 2:cheese-libs-3.8.1-1.fc19.x86_64 to require for libcheese-gtk.so.23()(64bit) >03:47:40,704 DEBUG yum.verbose.YumBase: TSINFO: Marking 2:libwbclient-4.0.5-1.fc19.x86_64 as install for cifs-utils-6.0-1.fc19.x86_64 >03:47:40,706 DEBUG yum.verbose.YumBase: TSINFO: Marking libtalloc-2.0.8-2.fc19.x86_64 as install for cifs-utils-6.0-1.fc19.x86_64 >03:47:40,713 DEBUG yum.verbose.YumBase: TSINFO: Marking keyutils-libs-1.5.5-4.fc19.x86_64 as install for cifs-utils-6.0-1.fc19.x86_64 >03:47:40,714 DEBUG yum.verbose.YumBase: Quick matched keyutils-libs-1.5.5-4.fc19.x86_64 to require for libkeyutils.so.1(KEYUTILS_0.3)(64bit) 
>03:47:40,716 DEBUG yum.verbose.YumBase: TSINFO: Marking keyutils-1.5.5-4.fc19.x86_64 as install for cifs-utils-6.0-1.fc19.x86_64 >03:47:40,719 DEBUG yum.verbose.YumBase: TSINFO: Marking libcap-ng-0.7.3-3.fc19.x86_64 as install for cifs-utils-6.0-1.fc19.x86_64 >03:47:40,739 DEBUG yum.verbose.YumBase: TSINFO: Marking color-filesystem-1-12.fc19.noarch as install for colord-0.1.34-1.fc19.x86_64 >03:47:40,740 DEBUG yum.verbose.YumBase: TSINFO: Marking libusbx-1.0.15-2.fc19.x86_64 as install for colord-0.1.34-1.fc19.x86_64 >03:47:40,743 DEBUG yum.verbose.YumBase: TSINFO: Marking lcms2-2.4-6.fc19.x86_64 as install for colord-0.1.34-1.fc19.x86_64 >03:47:40,750 DEBUG yum.verbose.YumBase: TSINFO: Marking libgusb-0.1.6-1.fc19.x86_64 as install for colord-0.1.34-1.fc19.x86_64 >03:47:40,752 DEBUG yum.verbose.YumBase: TSINFO: Marking colord-libs-0.1.34-1.fc19.x86_64 as install for colord-0.1.34-1.fc19.x86_64 >03:47:40,753 DEBUG yum.verbose.YumBase: Quick matched colord-libs-0.1.34-1.fc19.x86_64 to require for libcolordprivate.so.1()(64bit) >03:47:40,753 DEBUG yum.verbose.YumBase: Quick matched colord-libs-0.1.34-1.fc19.x86_64 to require for libcolord.so.1()(64bit) >03:47:40,776 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:control-center-filesystem-3.8.1.5-1.fc19.x86_64 as install for 1:control-center-3.8.1.5-1.fc19.x86_64 >03:47:40,779 DEBUG yum.verbose.YumBase: TSINFO: Marking redhat-menus-12.0.2-6.fc19.noarch as install for 1:control-center-3.8.1.5-1.fc19.x86_64 >03:47:40,785 DEBUG yum.verbose.YumBase: TSINFO: Marking gnome-menus-3.8.0-2.fc19.x86_64 as install for 1:control-center-3.8.1.5-1.fc19.x86_64 >03:47:40,787 DEBUG yum.verbose.YumBase: TSINFO: Marking 2:libsmbclient-4.0.5-1.fc19.x86_64 as install for 1:control-center-3.8.1.5-1.fc19.x86_64 >03:47:40,789 DEBUG yum.verbose.YumBase: TSINFO: Marking pulseaudio-libs-glib2-3.0-7.fc19.x86_64 as install for 1:control-center-3.8.1.5-1.fc19.x86_64 >03:47:40,796 DEBUG yum.verbose.YumBase: TSINFO: Marking 
iso-codes-3.41-2.fc19.noarch as install for 1:control-center-3.8.1.5-1.fc19.x86_64 >03:47:40,798 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:dbus-x11-1.6.8-5.fc19.x86_64 as install for 1:control-center-3.8.1.5-1.fc19.x86_64 >03:47:40,800 DEBUG yum.verbose.YumBase: TSINFO: Marking accountsservice-0.6.31-1.fc19.x86_64 as install for 1:control-center-3.8.1.5-1.fc19.x86_64 >03:47:40,803 DEBUG yum.verbose.YumBase: TSINFO: Marking libgnomekbd-3.6.0-2.fc19.x86_64 as install for 1:control-center-3.8.1.5-1.fc19.x86_64 >03:47:40,808 DEBUG yum.verbose.YumBase: TSINFO: Marking libwacom-0.7.1-2.fc19.x86_64 as install for 1:control-center-3.8.1.5-1.fc19.x86_64 >03:47:40,815 DEBUG yum.verbose.YumBase: TSINFO: Marking upower-0.9.20-1.fc19.x86_64 as install for 1:control-center-3.8.1.5-1.fc19.x86_64 >03:47:40,817 DEBUG yum.verbose.YumBase: TSINFO: Marking libnm-gtk-0.9.8.1-2.git20130327.fc19.x86_64 as install for 1:control-center-3.8.1.5-1.fc19.x86_64 >03:47:40,821 DEBUG yum.verbose.YumBase: TSINFO: Marking ibus-libs-1.5.1-3.fc19.x86_64 as install for 1:control-center-3.8.1.5-1.fc19.x86_64 >03:47:40,825 DEBUG yum.verbose.YumBase: TSINFO: Marking libgtop2-2.28.4-4.fc19.x86_64 as install for 1:control-center-3.8.1.5-1.fc19.x86_64 >03:47:40,831 DEBUG yum.verbose.YumBase: TSINFO: Marking gnome-online-accounts-3.8.1-1.fc19.x86_64 as install for 1:control-center-3.8.1.5-1.fc19.x86_64 >03:47:40,832 DEBUG yum.verbose.YumBase: Quick matched gnome-online-accounts-3.8.1-1.fc19.x86_64 to require for libgoa-1.0.so.0()(64bit) >03:47:40,834 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:gnome-bluetooth-libs-3.8.0-1.fc19.x86_64 as install for 1:control-center-3.8.1.5-1.fc19.x86_64 >03:47:40,836 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:cups-libs-1.6.2-4.fc19.x86_64 as install for 1:control-center-3.8.1.5-1.fc19.x86_64 >03:47:40,840 DEBUG yum.verbose.YumBase: TSINFO: Marking colord-gtk-0.1.25-1.fc19.x86_64 as install for 1:control-center-3.8.1.5-1.fc19.x86_64 >03:47:40,848 DEBUG 
yum.verbose.YumBase: TSINFO: Marking accountsservice-libs-0.6.31-1.fc19.x86_64 as install for 1:control-center-3.8.1.5-1.fc19.x86_64 >03:47:40,850 DEBUG yum.verbose.YumBase: TSINFO: Marking mesa-libGL-9.1.1-1.fc19.x86_64 as install for 1:control-center-3.8.1.5-1.fc19.x86_64 >03:47:40,865 DEBUG yum.verbose.YumBase: TSINFO: Marking pcsc-lite-libs-1.8.8-1.fc19.x86_64 as install for coolkey-1.1.0-22.fc19.x86_64 >03:47:40,868 DEBUG yum.verbose.YumBase: TSINFO: Marking pcsc-lite-1.8.8-1.fc19.x86_64 as install for coolkey-1.1.0-22.fc19.x86_64 >03:47:40,874 DEBUG yum.verbose.YumBase: TSINFO: Marking nss-tools-3.14.3-12.0.fc19.x86_64 as install for coolkey-1.1.0-22.fc19.x86_64 >03:47:40,877 DEBUG yum.verbose.YumBase: TSINFO: Marking libstdc++-4.8.0-2.fc19.x86_64 as install for coolkey-1.1.0-22.fc19.x86_64 >03:47:40,880 DEBUG yum.verbose.YumBase: Quick matched libstdc++-4.8.0-2.fc19.x86_64 to require for libstdc++.so.6(GLIBCXX_3.4)(64bit) >03:47:40,882 DEBUG yum.verbose.YumBase: Quick matched libstdc++-4.8.0-2.fc19.x86_64 to require for libstdc++.so.6(CXXABI_1.3)(64bit) >03:47:40,883 DEBUG yum.verbose.YumBase: TSINFO: Marking nss-softokn-3.14.3-1.fc19.x86_64 as install for coolkey-1.1.0-22.fc19.x86_64 >03:47:40,890 DEBUG yum.verbose.YumBase: TSINFO: Marking libgcc-4.8.0-2.fc19.x86_64 as install for coolkey-1.1.0-22.fc19.x86_64 >03:47:40,891 DEBUG yum.verbose.YumBase: TSINFO: Marking pcsc-lite-ccid-1.4.10-1.fc19.x86_64 as install for coolkey-1.1.0-22.fc19.x86_64 >03:47:40,912 DEBUG yum.verbose.YumBase: TSINFO: Marking grep-2.14-3.fc19.x86_64 as install for coreutils-8.21-8.fc19.x86_64 >03:47:40,914 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:gmp-5.1.1-2.fc19.x86_64 as install for coreutils-8.21-8.fc19.x86_64 >03:47:40,915 DEBUG yum.verbose.YumBase: Quick matched 1:gmp-5.1.1-2.fc19.x86_64 to require for libgmp.so.10()(64bit) >03:47:40,924 DEBUG yum.verbose.YumBase: TSINFO: Marking sed-4.2.2-2.fc19.x86_64 as install for cronie-1.4.10-4.fc19.x86_64 >03:47:40,937 DEBUG 
yum.verbose.YumBase: TSINFO: Marking cronie-anacron-1.4.10-4.fc19.x86_64 as install for cronie-1.4.10-4.fc19.x86_64
>03:47:40,946 DEBUG yum.verbose.YumBase: TSINFO: Marking cryptsetup-libs-1.6.1-1.fc19.x86_64 as install for cryptsetup-1.6.1-1.fc19.x86_64
>03:47:40,948 DEBUG yum.verbose.YumBase: TSINFO: Marking fipscheck-lib-1.3.1-3.fc19.x86_64 as install for cryptsetup-1.6.1-1.fc19.x86_64
>03:47:40,950 DEBUG yum.verbose.YumBase: TSINFO: Marking popt-1.13-14.fc19.x86_64 as install for cryptsetup-1.6.1-1.fc19.x86_64
>03:47:40,981 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:cups-filesystem-1.6.2-4.fc19.noarch as install for 1:cups-1.6.2-4.fc19.x86_64
>03:47:40,982 DEBUG yum.verbose.YumBase: TSINFO: Marking gnutls-3.1.10-1.fc19.x86_64 as install for 1:cups-1.6.2-4.fc19.x86_64
>03:47:40,984 DEBUG yum.verbose.YumBase: TSINFO: Marking cups-filters-1.0.31-2.fc19.x86_64 as install for 1:cups-1.6.2-4.fc19.x86_64
>03:47:40,990 DEBUG yum.verbose.YumBase: TSINFO: Marking libgpg-error-1.11-1.fc19.x86_64 as install for 1:cups-1.6.2-4.fc19.x86_64
>03:47:40,992 DEBUG yum.verbose.YumBase: TSINFO: Marking libgcrypt-1.5.2-1.fc19.x86_64 as install for 1:cups-1.6.2-4.fc19.x86_64
>03:47:41,007 DEBUG yum.verbose.YumBase: TSINFO: Marking libcurl-7.29.0-6.fc19.x86_64 as install for curl-7.29.0-6.fc19.x86_64
>03:47:41,012 DEBUG yum.verbose.YumBase: Quick matched libcurl-7.29.0-6.fc19.x86_64 to require for libcurl.so.4()(64bit)
>03:47:41,016 DEBUG yum.verbose.YumBase: TSINFO: Marking cyrus-sasl-lib-2.1.26-6.fc19.x86_64 as install for cyrus-sasl-plain-2.1.26-6.fc19.x86_64
>03:47:41,048 DEBUG yum.verbose.YumBase: TSINFO: Marking duplicity-0.6.21-1.fc19.x86_64 as install for deja-dup-26.0-1.fc19.x86_64
>03:47:41,054 DEBUG yum.verbose.YumBase: TSINFO: Marking python-cloudfiles-1.7.10-3.fc19.noarch as install for deja-dup-26.0-1.fc19.x86_64
>03:47:41,055 DEBUG yum.verbose.YumBase: TSINFO: Marking libsecret-0.15-1.fc19.x86_64 as install for deja-dup-26.0-1.fc19.x86_64
>03:47:41,057 DEBUG yum.verbose.YumBase: TSINFO: Marking libpeas-1.8.0-1.fc19.x86_64 as install for deja-dup-26.0-1.fc19.x86_64
>03:47:41,062 DEBUG yum.verbose.YumBase: TSINFO: Marking dejavu-fonts-common-2.33-5.fc19.noarch as install for dejavu-sans-fonts-2.33-5.fc19.noarch
>03:47:41,071 DEBUG yum.verbose.YumBase: TSINFO: Marking xz-libs-5.1.2-4alpha.fc19.x86_64 as install for deltarpm-3.6-0.12.20110223git.fc19.x86_64
>03:47:41,073 DEBUG yum.verbose.YumBase: TSINFO: Marking rpm-libs-4.11.0.1-1.fc19.x86_64 as install for deltarpm-3.6-0.12.20110223git.fc19.x86_64
>03:47:41,080 DEBUG yum.verbose.YumBase: Quick matched rpm-libs-4.11.0.1-1.fc19.x86_64 to require for librpm.so.3()(64bit)
>03:47:41,084 DEBUG yum.verbose.YumBase: TSINFO: Marking 12:dhcp-libs-4.2.5-10.fc19.x86_64 as install for 12:dhclient-4.2.5-10.fc19.x86_64
>03:47:41,087 DEBUG yum.verbose.YumBase: TSINFO: Marking 12:dhcp-common-4.2.5-10.fc19.x86_64 as install for 12:dhclient-4.2.5-10.fc19.x86_64
>03:47:41,090 DEBUG yum.verbose.YumBase: TSINFO: Marking openldap-2.4.35-1.fc19.x86_64 as install for 12:dhclient-4.2.5-10.fc19.x86_64
>03:47:41,097 DEBUG yum.verbose.YumBase: Quick matched openldap-2.4.35-1.fc19.x86_64 to require for liblber-2.4.so.2()(64bit)
>03:47:41,098 DEBUG yum.verbose.YumBase: TSINFO: Marking 32:bind-libs-lite-9.9.3-0.2.rc1.fc19.x86_64 as install for 12:dhclient-4.2.5-10.fc19.x86_64
>03:47:41,100 DEBUG yum.verbose.YumBase: Quick matched 32:bind-libs-lite-9.9.3-0.2.rc1.fc19.x86_64 to require for libdns-export.so.98()(64bit)
>03:47:41,106 DEBUG yum.verbose.YumBase: TSINFO: Marking device-mapper-1.02.77-8.fc19.x86_64 as install for dmraid-1.0.0.rc16-20.fc19.x86_64
>03:47:41,112 DEBUG yum.verbose.YumBase: TSINFO: Marking device-mapper-libs-1.02.77-8.fc19.x86_64 as install for dmraid-1.0.0.rc16-20.fc19.x86_64
>03:47:41,114 DEBUG yum.verbose.YumBase: TSINFO: Marking device-mapper-event-libs-1.02.77-8.fc19.x86_64 as install for dmraid-1.0.0.rc16-20.fc19.x86_64
>03:47:41,120 DEBUG yum.verbose.YumBase: TSINFO: Marking kpartx-0.4.9-47.fc19.x86_64 as install for dmraid-1.0.0.rc16-20.fc19.x86_64
>03:47:41,122 DEBUG yum.verbose.YumBase: TSINFO: Marking dmraid-events-1.0.0.rc16-20.fc19.x86_64 as install for dmraid-1.0.0.rc16-20.fc19.x86_64
>03:47:41,123 DEBUG yum.verbose.YumBase: TSINFO: Marking libsepol-2.1.9-1.fc19.x86_64 as install for dmraid-1.0.0.rc16-20.fc19.x86_64
>03:47:41,138 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:rmt-0.4-0.19.b44.fc19.x86_64 as install for 1:dump-0.4-0.19.b44.fc19.x86_64
>03:47:41,147 DEBUG yum.verbose.YumBase: TSINFO: Marking libss-1.42.7-2.fc19.x86_64 as install for e2fsprogs-1.42.7-2.fc19.x86_64
>03:47:41,148 DEBUG yum.verbose.YumBase: Quick matched libss-1.42.7-2.fc19.x86_64 to require for libss.so.2()(64bit)
>03:47:41,190 DEBUG yum.verbose.YumBase: TSINFO: Marking telepathy-salut-0.8.1-2.fc19.x86_64 as install for empathy-3.8.1-1.fc19.x86_64
>03:47:41,198 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:telepathy-mission-control-5.14.1-1.fc19.x86_64 as install for empathy-3.8.1-1.fc19.x86_64
>03:47:41,201 DEBUG yum.verbose.YumBase: TSINFO: Marking telepathy-haze-0.7.0-3.fc19.x86_64 as install for empathy-3.8.1-1.fc19.x86_64
>03:47:41,209 DEBUG yum.verbose.YumBase: TSINFO: Marking telepathy-gabble-0.17.3-2.fc19.x86_64 as install for empathy-3.8.1-1.fc19.x86_64
>03:47:41,212 DEBUG yum.verbose.YumBase: TSINFO: Marking telepathy-idle-0.1.16-1.fc19.x86_64 as install for empathy-3.8.1-1.fc19.x86_64
>03:47:41,220 DEBUG yum.verbose.YumBase: TSINFO: Marking telepathy-filesystem-0.0.2-5.fc19.noarch as install for empathy-3.8.1-1.fc19.x86_64
>03:47:41,223 DEBUG yum.verbose.YumBase: TSINFO: Marking telepathy-glib-0.20.2-1.fc19.x86_64 as install for empathy-3.8.1-1.fc19.x86_64
>03:47:41,229 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.9.0)(64bit)
>03:47:41,229 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.7.9)(64bit)
>03:47:41,229 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.7.6)(64bit)
>03:47:41,230 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.7.36)(64bit)
>03:47:41,230 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.7.35)(64bit)
>03:47:41,230 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.7.34)(64bit)
>03:47:41,231 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.7.32)(64bit)
>03:47:41,231 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.7.3)(64bit)
>03:47:41,231 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.7.29)(64bit)
>03:47:41,233 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.7.27)(64bit)
>03:47:41,233 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.7.26)(64bit)
>03:47:41,233 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.7.24)(64bit)
>03:47:41,234 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.7.23)(64bit)
>03:47:41,236 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.7.21)(64bit)
>03:47:41,236 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.7.2)(64bit)
>03:47:41,236 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.7.18)(64bit)
>03:47:41,237 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.7.17)(64bit)
>03:47:41,237 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.7.16)(64bit)
>03:47:41,237 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.7.15)(64bit)
>03:47:41,238 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.7.14)(64bit)
>03:47:41,238 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.7.12)(64bit)
>03:47:41,238 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.7.1)(64bit)
>03:47:41,239 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.7.0)(64bit)
>03:47:41,241 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.19.9)(64bit)
>03:47:41,241 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.19.4)(64bit)
>03:47:41,242 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.19.3)(64bit)
>03:47:41,242 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.19.1)(64bit)
>03:47:41,242 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.19.0)(64bit)
>03:47:41,243 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.17.6)(64bit)
>03:47:41,243 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.17.5)(64bit)
>03:47:41,243 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.17.1)(64bit)
>03:47:41,244 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.17.0)(64bit)
>03:47:41,244 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.15.8)(64bit)
>03:47:41,244 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.15.6)(64bit)
>03:47:41,245 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.15.5)(64bit)
>03:47:41,245 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.15.3)(64bit)
>03:47:41,246 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.15.2)(64bit)
>03:47:41,246 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.15.1)(64bit)
>03:47:41,246 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.15.0)(64bit)
>03:47:41,251 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.14.3)(64bit)
>03:47:41,251 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.14.2)(64bit)
>03:47:41,251 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.13.9)(64bit)
>03:47:41,252 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.13.8)(64bit)
>03:47:41,252 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.13.7)(64bit)
>03:47:41,252 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.13.3)(64bit)
>03:47:41,253 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.13.2)(64bit)
>03:47:41,253 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.13.16)(64bit)
>03:47:41,253 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.13.14)(64bit)
>03:47:41,254 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.13.12)(64bit)
>03:47:41,254 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.13.11)(64bit)
>03:47:41,254 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.13.10)(64bit)
>03:47:41,254 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.13.1)(64bit)
>03:47:41,255 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.13.0)(64bit)
>03:47:41,255 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.11.9)(64bit)
>03:47:41,255 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.11.7)(64bit)
>03:47:41,256 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.11.6)(64bit)
>03:47:41,256 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.11.5)(64bit)
>03:47:41,256 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.11.4)(64bit)
>03:47:41,257 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.11.3)(64bit)
>03:47:41,257 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.11.15)(64bit)
>03:47:41,258 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.11.14)(64bit)
>03:47:41,258 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.11.13)(64bit)
>03:47:41,258 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.11.12)(64bit)
>03:47:41,259 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.11.11)(64bit)
>03:47:41,260 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.11.1)(64bit)
>03:47:41,260 DEBUG yum.verbose.YumBase: Quick matched telepathy-glib-0.20.2-1.fc19.x86_64 to require for libtelepathy-glib.so.0(TELEPATHY_GLIB_0.11.0)(64bit)
>03:47:41,262 DEBUG yum.verbose.YumBase: TSINFO: Marking webkitgtk3-2.0.1-1.fc19.x86_64 as install for empathy-3.8.1-1.fc19.x86_64
>03:47:41,267 DEBUG yum.verbose.YumBase: TSINFO: Marking telepathy-logger-0.8.0-2.fc19.x86_64 as install for empathy-3.8.1-1.fc19.x86_64
>03:47:41,280 DEBUG yum.verbose.YumBase: TSINFO: Marking telepathy-farstream-0.6.0-2.fc19.x86_64 as install for empathy-3.8.1-1.fc19.x86_64
>03:47:41,284 DEBUG yum.verbose.YumBase: TSINFO: Marking libsoup-2.42.2-1.fc19.x86_64 as install for empathy-3.8.1-1.fc19.x86_64
>03:47:41,291 DEBUG yum.verbose.YumBase: TSINFO: Marking p11-kit-0.18.1-1.fc19.x86_64 as install for empathy-3.8.1-1.fc19.x86_64
>03:47:41,298 DEBUG yum.verbose.YumBase: TSINFO: Marking geoclue-0.12.99-4.fc19.x86_64 as install for empathy-3.8.1-1.fc19.x86_64
>03:47:41,300 DEBUG yum.verbose.YumBase: TSINFO: Marking gcr-3.8.1-1.fc19.x86_64 as install for empathy-3.8.1-1.fc19.x86_64
>03:47:41,307 DEBUG yum.verbose.YumBase: Quick matched gcr-3.8.1-1.fc19.x86_64 to require for libgcr-base-3.so.1()(64bit)
>03:47:41,307 DEBUG yum.verbose.YumBase: Quick matched gcr-3.8.1-1.fc19.x86_64 to require for libgck-1.so.0()(64bit)
>03:47:41,308 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:folks-0.9.1-1.fc19.x86_64 as install for empathy-3.8.1-1.fc19.x86_64
>03:47:41,310 DEBUG yum.verbose.YumBase: Quick matched 1:folks-0.9.1-1.fc19.x86_64 to require for libfolks-telepathy.so.25()(64bit)
>03:47:41,312 DEBUG yum.verbose.YumBase: TSINFO: Marking farstream02-0.2.3-1.fc19.x86_64 as install for empathy-3.8.1-1.fc19.x86_64
>03:47:41,315 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:enchant-1.6.0-6.fc19.x86_64 as install for empathy-3.8.1-1.fc19.x86_64
>03:47:41,318 DEBUG yum.verbose.YumBase: TSINFO: Marking libchamplain-gtk-0.12.3-5.fc19.x86_64 as install for empathy-3.8.1-1.fc19.x86_64
>03:47:41,321 DEBUG yum.verbose.YumBase: TSINFO: Marking libchamplain-0.12.3-5.fc19.x86_64 as install for empathy-3.8.1-1.fc19.x86_64
>03:47:41,335 DEBUG yum.verbose.YumBase: TSINFO: Marking libjpeg-turbo-1.2.90-2.fc19.x86_64 as install for eog-3.8.0-1.fc19.x86_64
>03:47:41,341 DEBUG yum.verbose.YumBase: TSINFO: Marking gsettings-desktop-schemas-3.8.0-1.fc19.x86_64 as install for eog-3.8.0-1.fc19.x86_64
>03:47:41,344 DEBUG yum.verbose.YumBase: TSINFO: Marking libexif-0.6.21-4.fc19.x86_64 as install for eog-3.8.0-1.fc19.x86_64
>03:47:41,347 DEBUG yum.verbose.YumBase: TSINFO: Marking exempi-2.2.0-6.fc19.x86_64 as install for eog-3.8.0-1.fc19.x86_64
>03:47:41,370 DEBUG yum.verbose.YumBase: TSINFO: Marking evince-libs-3.8.0-2.fc19.x86_64 as install for evince-3.8.0-2.fc19.x86_64
>03:47:41,372 DEBUG yum.verbose.YumBase: Quick matched evince-libs-3.8.0-2.fc19.x86_64 to require for libevview3.so.3()(64bit)
>03:47:41,372 DEBUG yum.verbose.YumBase: Quick matched evince-libs-3.8.0-2.fc19.x86_64 to require for libevdocument3.so.4()(64bit)
>03:47:41,411 DEBUG yum.verbose.YumBase: TSINFO: Marking highlight-3.13-1.fc19.x86_64 as install for evolution-3.8.1-1.fc19.x86_64
>03:47:41,421 DEBUG yum.verbose.YumBase: TSINFO: Marking gvfs-1.16.1-1.fc19.x86_64 as install for evolution-3.8.1-1.fc19.x86_64
>03:47:41,431 DEBUG yum.verbose.YumBase: TSINFO: Marking libytnef-1.5-10.fc19.x86_64 as install for evolution-3.8.1-1.fc19.x86_64
>03:47:41,434 DEBUG yum.verbose.YumBase: TSINFO: Marking libical-0.48-4.fc19.x86_64 as install for evolution-3.8.1-1.fc19.x86_64
>03:47:41,436 DEBUG yum.verbose.YumBase: Quick matched libical-0.48-4.fc19.x86_64 to require for libicalss.so.0()(64bit)
>03:47:41,437 DEBUG yum.verbose.YumBase: Quick matched libical-0.48-4.fc19.x86_64 to require for libical.so.0()(64bit)
>03:47:41,438 DEBUG yum.verbose.YumBase: TSINFO: Marking libgweather-3.8.1-1.fc19.x86_64 as install for evolution-3.8.1-1.fc19.x86_64
>03:47:41,445 DEBUG yum.verbose.YumBase: TSINFO: Marking gtkhtml3-4.6.4-1.fc19.x86_64 as install for evolution-3.8.1-1.fc19.x86_64
>03:47:41,446 DEBUG yum.verbose.YumBase: Quick matched gtkhtml3-4.6.4-1.fc19.x86_64 to require for libgtkhtml-4.0.so.0()(64bit)
>03:47:41,447 DEBUG yum.verbose.YumBase: TSINFO: Marking libgdata-0.13.3-1.fc19.x86_64 as install for evolution-3.8.1-1.fc19.x86_64
>03:47:41,450 DEBUG yum.verbose.YumBase: TSINFO: Marking evolution-data-server-3.8.1-2.fc19.x86_64 as install for evolution-3.8.1-1.fc19.x86_64
>03:47:41,455 DEBUG yum.verbose.YumBase: Quick matched evolution-data-server-3.8.1-2.fc19.x86_64 to require for libedata-book-1.2.so.17()(64bit)
>03:47:41,457 DEBUG yum.verbose.YumBase: Quick matched evolution-data-server-3.8.1-2.fc19.x86_64 to require for libecal-1.2.so.15()(64bit)
>03:47:41,457 DEBUG yum.verbose.YumBase: Quick matched evolution-data-server-3.8.1-2.fc19.x86_64 to require for libebook-contacts-1.2.so.0()(64bit)
>03:47:41,458 DEBUG yum.verbose.YumBase: Quick matched evolution-data-server-3.8.1-2.fc19.x86_64 to require for libebook-1.2.so.14()(64bit)
>03:47:41,458 DEBUG yum.verbose.YumBase: Quick matched evolution-data-server-3.8.1-2.fc19.x86_64 to require for libebackend-1.2.so.6()(64bit)
>03:47:41,458 DEBUG yum.verbose.YumBase: Quick matched evolution-data-server-3.8.1-2.fc19.x86_64 to require for libcamel-1.2.so.43()(64bit)
>03:47:41,481 DEBUG yum.verbose.YumBase: TSINFO: Marking file-libs-5.11-9.fc19.x86_64 as install for file-5.11-9.fc19.x86_64
>03:47:41,487 DEBUG yum.verbose.YumBase: Quick matched file-libs-5.11-9.fc19.x86_64 to require for libmagic.so.1()(64bit)
>03:47:41,498 DEBUG yum.verbose.YumBase: TSINFO: Marking unar-1.6-4.fc19.x86_64 as install for file-roller-3.8.1-2.fc19.x86_64
>03:47:41,526 DEBUG yum.verbose.YumBase: TSINFO: Marking xulrunner-20.0-4.fc19.x86_64 as install for firefox-20.0-1.fc19.x86_64
>03:47:41,535 DEBUG yum.verbose.YumBase: TSINFO: Marking fedora-bookmarks-15-0.5.noarch as install for firefox-20.0-1.fc19.x86_64
>03:47:41,537 DEBUG yum.verbose.YumBase: Quick matched xulrunner-20.0-4.fc19.x86_64 to require for libxpcom.so()(64bit)
>03:47:41,540 DEBUG yum.verbose.YumBase: TSINFO: Marking startup-notification-0.12-6.fc19.x86_64 as install for firefox-20.0-1.fc19.x86_64
>03:47:41,544 DEBUG yum.verbose.YumBase: TSINFO: Marking pygobject3-base-3.8.1-2.fc19.x86_64 as install for firewall-config-0.3.2-1.fc19.noarch
>03:47:41,551 DEBUG yum.verbose.YumBase: TSINFO: Marking hicolor-icon-theme-0.12-6.fc19.noarch as install for firewall-config-0.3.2-1.fc19.noarch
>03:47:41,554 DEBUG yum.verbose.YumBase: TSINFO: Marking python-slip-dbus-0.4.0-1.fc19.noarch as install for firewalld-0.3.2-1.fc19.noarch
>03:47:41,558 DEBUG yum.verbose.YumBase: TSINFO: Marking python-decorator-3.4.0-2.fc19.noarch as install for firewalld-0.3.2-1.fc19.noarch
>03:47:41,560 DEBUG yum.verbose.YumBase: TSINFO: Marking ebtables-2.0.10-8.fc19.x86_64 as install for firewalld-0.3.2-1.fc19.noarch
>03:47:41,614 DEBUG yum.verbose.YumBase: TSINFO: Marking 4:perl-5.16.3-262.fc19.x86_64 as install for foomatic-4.0.9-1.fc19.x86_64
>03:47:41,623 DEBUG yum.verbose.YumBase: Quick matched 4:perl-5.16.3-262.fc19.x86_64 to require for perl(vars)
>03:47:41,623 DEBUG yum.verbose.YumBase: Quick matched 4:perl-5.16.3-262.fc19.x86_64 to require for perl(strict)
>03:47:41,624 DEBUG yum.verbose.YumBase: Quick matched 4:perl-5.16.3-262.fc19.x86_64 to require for perl(POSIX)
>03:47:41,625 DEBUG yum.verbose.YumBase: Quick matched 4:perl-5.16.3-262.fc19.x86_64 to require for perl(Getopt::Std)
>03:47:41,626 DEBUG yum.verbose.YumBase: Quick matched 4:perl-5.16.3-262.fc19.x86_64 to require for perl(Getopt::Long)
>03:47:41,627 DEBUG yum.verbose.YumBase: Quick matched 4:perl-5.16.3-262.fc19.x86_64 to require for perl(FileHandle)
>03:47:41,629 DEBUG yum.verbose.YumBase: Quick matched 4:perl-5.16.3-262.fc19.x86_64 to require for perl(Exporter)
>03:47:41,635 DEBUG yum.verbose.YumBase: TSINFO: Marking perl-Encode-2.51-1.fc19.x86_64 as install for foomatic-4.0.9-1.fc19.x86_64
>03:47:41,637 DEBUG yum.verbose.YumBase: TSINFO: Marking perl-Data-Dumper-2.145-1.fc19.x86_64 as install for foomatic-4.0.9-1.fc19.x86_64
>03:47:41,641 DEBUG yum.verbose.YumBase: TSINFO: Marking perl-PathTools-3.40-1.fc19.x86_64 as install for foomatic-4.0.9-1.fc19.x86_64
>03:47:41,645 DEBUG yum.verbose.YumBase: TSINFO: Marking foomatic-db-4.0-36.20130312.fc19.noarch as install for foomatic-4.0.9-1.fc19.x86_64
>03:47:41,656 DEBUG yum.verbose.YumBase: TSINFO: Marking foomatic-db-filesystem-4.0-36.20130312.fc19.noarch as install for foomatic-db-ppds-4.0-36.20130312.fc19.noarch
>03:47:41,670 DEBUG yum.verbose.YumBase: TSINFO: Marking fprintd-0.5.0-1.fc19.x86_64 as install for fprintd-pam-0.5.0-1.fc19.x86_64
>03:47:41,708 DEBUG yum.verbose.YumBase: TSINFO: Marking libXau-1.0.6-7.fc19.x86_64 as install for 1:gdm-3.8.1.1-1.fc19.x86_64
>03:47:41,716 DEBUG yum.verbose.YumBase: TSINFO: Marking xorg-x11-server-utils-7.7-1.fc19.x86_64 as install for 1:gdm-3.8.1.1-1.fc19.x86_64
>03:47:41,718 DEBUG yum.verbose.YumBase: TSINFO: Marking fedora-logos-19.0.1-1.fc19.noarch as install for 1:gdm-3.8.1.1-1.fc19.x86_64
>03:47:41,725 DEBUG yum.verbose.YumBase: TSINFO: Marking xorg-x11-xkb-utils-7.7-5.fc19.x86_64 as install for 1:gdm-3.8.1.1-1.fc19.x86_64
>03:47:41,727 DEBUG yum.verbose.YumBase: TSINFO: Marking pulseaudio-gdm-hooks-3.0-7.fc19.x86_64 as install for 1:gdm-3.8.1.1-1.fc19.x86_64
>03:47:41,729 DEBUG yum.verbose.YumBase: TSINFO: Marking gnome-session-3.8.1-2.fc19.x86_64 as install for 1:gdm-3.8.1.1-1.fc19.x86_64
>03:47:41,733 DEBUG yum.verbose.YumBase: TSINFO: Marking gnome-keyring-pam-3.8.2-1.fc19.x86_64 as install for 1:gdm-3.8.1.1-1.fc19.x86_64
>03:47:41,736 DEBUG yum.verbose.YumBase: TSINFO: Marking libXdmcp-1.1.1-4.fc19.x86_64 as install for 1:gdm-3.8.1.1-1.fc19.x86_64
>03:47:41,749 DEBUG yum.verbose.YumBase: TSINFO: Marking python3-3.3.1-2.fc19.x86_64 as install for 2:gedit-3.8.1-1.fc19.x86_64
>03:47:41,756 DEBUG yum.verbose.YumBase: TSINFO: Marking python3-gobject-3.8.1-2.fc19.x86_64 as install for 2:gedit-3.8.1-1.fc19.x86_64
>03:47:41,758 DEBUG yum.verbose.YumBase: TSINFO: Marking zenity-3.8.0-2.fc19.x86_64 as install for 2:gedit-3.8.1-1.fc19.x86_64
>03:47:41,764 DEBUG yum.verbose.YumBase: TSINFO: Marking gtksourceview3-3.8.1-1.fc19.x86_64 as install for 2:gedit-3.8.1-1.fc19.x86_64
>03:47:41,788 DEBUG yum.verbose.YumBase: TSINFO: Marking urw-fonts-2.4-14.fc19.noarch as install for ghostscript-9.07-2.fc19.x86_64
>03:47:41,792 DEBUG yum.verbose.YumBase: TSINFO: Marking poppler-data-0.4.6-2.fc19.noarch as install for ghostscript-9.07-2.fc19.x86_64
>03:47:41,795 DEBUG yum.verbose.YumBase: TSINFO: Marking libtiff-4.0.3-6.fc19.x86_64 as install for ghostscript-9.07-2.fc19.x86_64
>03:47:41,798 DEBUG yum.verbose.YumBase: TSINFO: Marking 2:libpng-1.5.13-2.fc19.x86_64 as install for ghostscript-9.07-2.fc19.x86_64
>03:47:41,801 DEBUG yum.verbose.YumBase: TSINFO: Marking ghostscript-fonts-5.50-30.fc19.noarch as install for ghostscript-9.07-2.fc19.x86_64
>03:47:41,805 DEBUG yum.verbose.YumBase: TSINFO: Marking libXt-1.1.3-3.fc19.x86_64 as install for ghostscript-9.07-2.fc19.x86_64
>03:47:41,827 DEBUG yum.verbose.YumBase: TSINFO: Marking ca-certificates-2012.87-10.1.fc19.noarch as install for glib-networking-2.36.1-1.fc19.x86_64
>03:47:41,832 DEBUG yum.verbose.YumBase: TSINFO: Marking libproxy-0.4.11-3.fc19.x86_64 as install for glib-networking-2.36.1-1.fc19.x86_64
>03:47:41,841 DEBUG yum.verbose.YumBase: TSINFO: Marking nss-softokn-freebl-3.14.3-1.fc19.x86_64 as install for glibc-2.17-4.fc19.x86_64
>03:47:41,842 DEBUG yum.verbose.YumBase: Quick matched nss-softokn-freebl-3.14.3-1.fc19.x86_64 to require for libfreebl3.so()(64bit)
>03:47:41,856 DEBUG yum.verbose.YumBase: TSINFO: Marking bluez-4.101-6.fc19.x86_64 as install for 1:gnome-bluetooth-3.8.0-1.fc19.x86_64
>03:47:41,862 DEBUG yum.verbose.YumBase: TSINFO: Marking pulseaudio-module-bluetooth-3.0-7.fc19.x86_64 as install for 1:gnome-bluetooth-3.8.0-1.fc19.x86_64
>03:47:41,870 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:obexd-0.46-4.fc19.x86_64 as install for 1:gnome-bluetooth-3.8.0-1.fc19.x86_64
>03:47:41,871 DEBUG yum.verbose.YumBase: TSINFO: Marking gvfs-obexftp-1.16.1-1.fc19.x86_64 as install for 1:gnome-bluetooth-3.8.0-1.fc19.x86_64
>03:47:41,910 DEBUG yum.verbose.YumBase: TSINFO: Marking mtools-4.0.18-3.fc19.x86_64 as install for gnome-boxes-3.8.1.2-1.fc19.x86_64
>03:47:41,921 DEBUG yum.verbose.YumBase: TSINFO: Marking libvirt-gobject-0.1.6-1.fc19.x86_64 as install for gnome-boxes-3.8.1.2-1.fc19.x86_64
>03:47:41,922 DEBUG yum.verbose.YumBase: Quick matched libvirt-gobject-0.1.6-1.fc19.x86_64 to require for libvirt-gobject-1.0.so.0(LIBVIRT_GOBJECT_0.1.3)(64bit)
>03:47:41,925 DEBUG yum.verbose.YumBase: Quick matched libvirt-gobject-0.1.6-1.fc19.x86_64 to require for libvirt-gobject-1.0.so.0(LIBVIRT_GOBJECT_0.1.2)(64bit)
>03:47:41,925 DEBUG yum.verbose.YumBase: Quick matched libvirt-gobject-0.1.6-1.fc19.x86_64 to require for libvirt-gobject-1.0.so.0(LIBVIRT_GOBJECT_0.0.9)(64bit)
>03:47:41,926 DEBUG yum.verbose.YumBase: Quick matched libvirt-gobject-0.1.6-1.fc19.x86_64 to require for libvirt-gobject-1.0.so.0(LIBVIRT_GOBJECT_0.0.8)(64bit)
>03:47:41,930 DEBUG yum.verbose.YumBase: TSINFO: Marking libvirt-gconfig-0.1.6-1.fc19.x86_64 as install for gnome-boxes-3.8.1.2-1.fc19.x86_64
>03:47:41,931 DEBUG yum.verbose.YumBase: Quick matched libvirt-gconfig-0.1.6-1.fc19.x86_64 to require for libvirt-gconfig-1.0.so.0(LIBVIRT_GCONFIG_0.1.5)(64bit)
>03:47:41,933 DEBUG yum.verbose.YumBase: Quick matched libvirt-gconfig-0.1.6-1.fc19.x86_64 to require for libvirt-gconfig-1.0.so.0(LIBVIRT_GCONFIG_0.1.4)(64bit)
>03:47:41,933 DEBUG yum.verbose.YumBase: Quick matched libvirt-gconfig-0.1.6-1.fc19.x86_64 to require for libvirt-gconfig-1.0.so.0(LIBVIRT_GCONFIG_0.1.0)(64bit)
>03:47:41,934 DEBUG yum.verbose.YumBase: Quick matched libvirt-gconfig-0.1.6-1.fc19.x86_64 to require for libvirt-gconfig-1.0.so.0(LIBVIRT_GCONFIG_0.0.9)(64bit)
>03:47:41,934 DEBUG yum.verbose.YumBase: Quick matched libvirt-gconfig-0.1.6-1.fc19.x86_64 to require for libvirt-gconfig-1.0.so.0(LIBVIRT_GCONFIG_0.0.8)(64bit)
>03:47:41,936 DEBUG yum.verbose.YumBase: TSINFO: Marking libvirt-daemon-kvm-1.0.5-2.fc19.x86_64 as install for gnome-boxes-3.8.1.2-1.fc19.x86_64
>03:47:41,939 DEBUG yum.verbose.YumBase: TSINFO: Marking spice-gtk3-0.19-1.fc19.x86_64 as install for gnome-boxes-3.8.1.2-1.fc19.x86_64
>03:47:41,948 DEBUG yum.verbose.YumBase: TSINFO: Marking spice-glib-0.19-1.fc19.x86_64 as install for gnome-boxes-3.8.1.2-1.fc19.x86_64
>03:47:41,959 DEBUG yum.verbose.YumBase: TSINFO: Marking libosinfo-0.2.6-1.fc19.x86_64 as install for gnome-boxes-3.8.1.2-1.fc19.x86_64
>03:47:41,966 DEBUG yum.verbose.YumBase: Quick matched libosinfo-0.2.6-1.fc19.x86_64 to require for libosinfo-1.0.so.0(LIBOSINFO_0.2.3)(64bit)
>03:47:41,968 DEBUG yum.verbose.YumBase: Quick matched libosinfo-0.2.6-1.fc19.x86_64 to require for libosinfo-1.0.so.0(LIBOSINFO_0.2.2)(64bit)
>03:47:41,970 DEBUG yum.verbose.YumBase: Quick matched libosinfo-0.2.6-1.fc19.x86_64 to require for libosinfo-1.0.so.0(LIBOSINFO_0.2.1)(64bit)
>03:47:41,975 DEBUG yum.verbose.YumBase: Quick matched libosinfo-0.2.6-1.fc19.x86_64 to require for libosinfo-1.0.so.0(LIBOSINFO_0.2.0)(64bit)
>03:47:41,975 DEBUG yum.verbose.YumBase: Quick matched libosinfo-0.2.6-1.fc19.x86_64 to require for libosinfo-1.0.so.0(LIBOSINFO_0.0.5)(64bit)
>03:47:41,976 DEBUG yum.verbose.YumBase: Quick matched libosinfo-0.2.6-1.fc19.x86_64 to require for libosinfo-1.0.so.0(LIBOSINFO_0.0.1)(64bit)
>03:47:41,978 DEBUG yum.verbose.YumBase: TSINFO: Marking fuseiso-20070708-13.fc19.x86_64 as install for gnome-boxes-3.8.1.2-1.fc19.x86_64
>03:47:41,985 DEBUG yum.verbose.YumBase: TSINFO: Marking libvirt-client-1.0.5-2.fc19.x86_64 as install for gnome-boxes-3.8.1.2-1.fc19.x86_64
>03:47:41,997 DEBUG yum.verbose.YumBase: TSINFO: Marking gvnc-0.5.2-1.fc19.x86_64 as install for gnome-boxes-3.8.1.2-1.fc19.x86_64
>03:47:42,006 DEBUG yum.verbose.YumBase: TSINFO: Marking gtk-vnc2-0.5.2-1.fc19.x86_64 as install for gnome-boxes-3.8.1.2-1.fc19.x86_64
>03:47:42,078 DEBUG yum.verbose.YumBase: TSINFO: Marking argyllcms-1.4.0-9.fc19.x86_64 as install for gnome-color-manager-3.8.1-1.fc19.x86_64
>03:47:42,085 DEBUG yum.verbose.YumBase: TSINFO: Marking vte3-0.34.4-1.fc19.x86_64 as install for gnome-color-manager-3.8.1-1.fc19.x86_64
>03:47:42,092 DEBUG yum.verbose.YumBase: TSINFO: Marking libmash-0.2.0-8.fc19.x86_64 as install for gnome-color-manager-3.8.1-1.fc19.x86_64
>03:47:42,101 DEBUG yum.verbose.YumBase: TSINFO: Marking exiv2-libs-0.23-4.fc19.x86_64 as install for gnome-color-manager-3.8.1-1.fc19.x86_64
>03:47:42,179 DEBUG yum.verbose.YumBase: TSINFO: Marking udisks2-2.1.0-2.fc19.x86_64 as install for gnome-disk-utility-3.8.0-1.fc19.x86_64
>03:47:42,187 DEBUG yum.verbose.YumBase: TSINFO: Marking libudisks2-2.1.0-2.fc19.x86_64 as install for gnome-disk-utility-3.8.0-1.fc19.x86_64
>03:47:42,193 DEBUG yum.verbose.YumBase: TSINFO: Marking libdvdread-4.2.0-4.fc19.x86_64 as install for gnome-disk-utility-3.8.0-1.fc19.x86_64
>03:47:42,209 DEBUG yum.verbose.YumBase: TSINFO: Marking libzapojit-0.0.3-1.fc19.x86_64 as install for gnome-documents-3.8.1-1.fc19.x86_64
>03:47:42,214 DEBUG yum.verbose.YumBase: TSINFO: Marking rest-0.7.90-3.fc19.x86_64 as install for gnome-documents-3.8.1-1.fc19.x86_64
>03:47:42,222 DEBUG yum.verbose.YumBase: TSINFO: Marking gjs-1.36.1-1.fc19.x86_64 as install for gnome-documents-3.8.1-1.fc19.x86_64
>03:47:42,269 DEBUG yum.verbose.YumBase: TSINFO: Marking PackageKit-device-rebind-0.8.7-4.fc19.x86_64 as install for gnome-packagekit-3.8.1-1.fc19.x86_64
>03:47:42,277 DEBUG yum.verbose.YumBase: TSINFO: Marking gnome-settings-daemon-updates-3.8.1-2.fc19.x86_64 as install for gnome-packagekit-3.8.1-1.fc19.x86_64
>03:47:42,305 DEBUG yum.verbose.YumBase: TSINFO: Marking libxkbfile-1.0.8-3.fc19.x86_64 as install for gnome-settings-daemon-3.8.1-2.fc19.x86_64
>03:47:42,321 DEBUG yum.verbose.YumBase: TSINFO: Marking mutter-3.8.1-1.fc19.x86_64 as install for gnome-shell-3.8.1-4.fc19.x86_64
>03:47:42,324 DEBUG yum.verbose.YumBase: TSINFO: Marking libcroco-0.6.8-2.fc19.x86_64 as install for gnome-shell-3.8.1-4.fc19.x86_64
>03:47:42,325 DEBUG yum.verbose.YumBase: TSINFO: Marking mozilla-filesystem-1.9-9.fc19.x86_64 as install for gnome-shell-3.8.1-4.fc19.x86_64
>03:47:42,327 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:gdm-libs-3.8.1.1-1.fc19.x86_64 as install for gnome-shell-3.8.1-4.fc19.x86_64
>03:47:42,329 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:js-1.8.5-13.fc19.x86_64 as install for gnome-shell-3.8.1-4.fc19.x86_64
>03:47:42,343 DEBUG yum.verbose.YumBase: TSINFO: Marking libwnck3-3.4.5-1.fc19.x86_64 as install for gnome-system-monitor-3.8.0-1.fc19.x86_64
>03:47:42,345 DEBUG yum.verbose.YumBase: TSINFO: Marking libsigc++20-2.3.1-2.fc19.x86_64 as install for gnome-system-monitor-3.8.0-1.fc19.x86_64
>03:47:42,347 DEBUG yum.verbose.YumBase: TSINFO: Marking pangomm-2.34.0-1.fc19.x86_64 as install for gnome-system-monitor-3.8.0-1.fc19.x86_64
>03:47:42,349 DEBUG yum.verbose.YumBase: TSINFO: Marking gtkmm30-3.8.1-1.fc19.x86_64 as install for gnome-system-monitor-3.8.0-1.fc19.x86_64
>03:47:42,351 DEBUG yum.verbose.YumBase: TSINFO: Marking glibmm24-2.36.2-1.fc19.x86_64 as install for gnome-system-monitor-3.8.0-1.fc19.x86_64
>03:47:42,352 DEBUG yum.verbose.YumBase: Quick matched glibmm24-2.36.2-1.fc19.x86_64 to require for libgiomm-2.4.so.1()(64bit)
>03:47:42,354 DEBUG yum.verbose.YumBase: TSINFO: Marking cairomm-1.10.0-6.fc19.x86_64 as install for gnome-system-monitor-3.8.0-1.fc19.x86_64
>03:47:42,355 DEBUG yum.verbose.YumBase: TSINFO: Marking atkmm-2.22.7-1.fc19.x86_64 as install for gnome-system-monitor-3.8.0-1.fc19.x86_64
>03:47:42,363 DEBUG yum.verbose.YumBase: TSINFO: Marking adwaita-gtk3-theme-3.8.1-1.fc19.x86_64 as install for gnome-themes-standard-3.8.1-1.fc19.x86_64
>03:47:42,365 DEBUG yum.verbose.YumBase: TSINFO: Marking adwaita-gtk2-theme-3.8.1-1.fc19.x86_64 as install for gnome-themes-standard-3.8.1-1.fc19.x86_64
>03:47:42,367 DEBUG yum.verbose.YumBase: TSINFO: Marking adwaita-cursor-theme-3.8.1-1.fc19.noarch as install for gnome-themes-standard-3.8.1-1.fc19.x86_64
>03:47:42,369 DEBUG yum.verbose.YumBase: TSINFO: Marking gnu-free-fonts-common-20120503-5.fc19.noarch as install for gnu-free-mono-fonts-20120503-5.fc19.noarch
>03:47:42,379 DEBUG yum.verbose.YumBase: TSINFO: Marking pinentry-0.8.1-10.fc19.x86_64 as install for gnupg2-2.0.19-8.fc19.x86_64
>03:47:42,381 DEBUG yum.verbose.YumBase: TSINFO: Marking libassuan-2.0.3-5.fc19.x86_64 as install for gnupg2-2.0.19-8.fc19.x86_64
>03:47:42,382 DEBUG yum.verbose.YumBase: TSINFO: Marking pth-2.0.7-19.fc19.x86_64 as install for gnupg2-2.0.19-8.fc19.x86_64
>03:47:42,405 DEBUG yum.verbose.YumBase: TSINFO: Marking libsndfile-1.0.25-6.fc19.x86_64 as install for gstreamer-plugins-bad-free-0.10.23-17.fc19.x86_64
>03:47:42,407 DEBUG yum.verbose.YumBase: TSINFO: Marking libvpx-1.2.0-1.fc19.x86_64 as install for gstreamer-plugins-bad-free-0.10.23-17.fc19.x86_64
>03:47:42,409 DEBUG yum.verbose.YumBase: TSINFO: Marking libvdpau-0.6-1.fc19.x86_64 as install for gstreamer-plugins-bad-free-0.10.23-17.fc19.x86_64
>03:47:42,411 DEBUG yum.verbose.YumBase: TSINFO: Marking orc-0.4.17-2.fc19.x86_64 as install for gstreamer-plugins-bad-free-0.10.23-17.fc19.x86_64
>03:47:42,413 DEBUG yum.verbose.YumBase: TSINFO: Marking opus-1.0.2-3.fc19.x86_64 as install for gstreamer-plugins-bad-free-0.10.23-17.fc19.x86_64
>03:47:42,414 DEBUG yum.verbose.YumBase: TSINFO: Marking libofa-0.9.3-22.fc19.x86_64 as install for gstreamer-plugins-bad-free-0.10.23-17.fc19.x86_64
>03:47:42,416 DEBUG yum.verbose.YumBase: TSINFO: Marking libmpcdec-1.2.6-10.fc19.x86_64 as install for gstreamer-plugins-bad-free-0.10.23-17.fc19.x86_64
>03:47:42,418 DEBUG yum.verbose.YumBase: TSINFO: Marking jasper-libs-1.900.1-24.fc19.x86_64 as install for gstreamer-plugins-bad-free-0.10.23-17.fc19.x86_64
>03:47:42,422 DEBUG yum.verbose.YumBase: TSINFO: Marking gstreamer-plugins-base-0.10.36-4.fc19.x86_64 as install for gstreamer-plugins-bad-free-0.10.23-17.fc19.x86_64
>03:47:42,424 DEBUG yum.verbose.YumBase: Quick matched gstreamer-plugins-base-0.10.36-4.fc19.x86_64 to require for libgsttag-0.10.so.0()(64bit)
>03:47:42,424 DEBUG yum.verbose.YumBase: Quick matched gstreamer-plugins-base-0.10.36-4.fc19.x86_64 to require for libgstsdp-0.10.so.0()(64bit)
>03:47:42,424 DEBUG yum.verbose.YumBase: Quick matched gstreamer-plugins-base-0.10.36-4.fc19.x86_64 to require for libgstrtp-0.10.so.0()(64bit)
>03:47:42,425 DEBUG yum.verbose.YumBase: Quick matched gstreamer-plugins-base-0.10.36-4.fc19.x86_64 to require for libgstriff-0.10.so.0()(64bit)
>03:47:42,425 DEBUG yum.verbose.YumBase: Quick matched gstreamer-plugins-base-0.10.36-4.fc19.x86_64 to require for libgstpbutils-0.10.so.0()(64bit)
>03:47:42,425 DEBUG yum.verbose.YumBase: Quick matched gstreamer-plugins-base-0.10.36-4.fc19.x86_64 to require for libgstinterfaces-0.10.so.0()(64bit)
>03:47:42,426 DEBUG yum.verbose.YumBase: Quick matched gstreamer-plugins-base-0.10.36-4.fc19.x86_64 to require for libgstfft-0.10.so.0()(64bit)
>03:47:42,426 DEBUG yum.verbose.YumBase: Quick matched gstreamer-plugins-base-0.10.36-4.fc19.x86_64 to require for libgstaudio-0.10.so.0()(64bit)
>03:47:42,426 DEBUG yum.verbose.YumBase: Quick matched gstreamer-plugins-base-0.10.36-4.fc19.x86_64 to require for libgstapp-0.10.so.0()(64bit)
>03:47:42,427 DEBUG yum.verbose.YumBase: TSINFO: Marking gsm-1.0.13-9.fc19.x86_64 as install for gstreamer-plugins-bad-free-0.10.23-17.fc19.x86_64
>03:47:42,429 DEBUG yum.verbose.YumBase: TSINFO: Marking libdvdnav-4.2.0-4.fc19.x86_64 as install for gstreamer-plugins-bad-free-0.10.23-17.fc19.x86_64
>03:47:42,431 DEBUG yum.verbose.YumBase: TSINFO: Marking soundtouch-1.4.0-7.fc19.x86_64 as install for gstreamer-plugins-bad-free-0.10.23-17.fc19.x86_64
>03:47:42,435 DEBUG yum.verbose.YumBase: TSINFO: Marking espeak-1.47.08-1.fc19.x86_64 as install for gstreamer-plugins-espeak-0.4.0-2.fc19.x86_64
>03:47:42,457 DEBUG yum.verbose.YumBase: TSINFO: Marking wavpack-4.60.1-6.fc19.x86_64 as install for gstreamer-plugins-good-0.10.31-9.fc19.x86_64
>03:47:42,459 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:libvorbis-1.3.3-4.fc19.x86_64 as install for gstreamer-plugins-good-0.10.31-9.fc19.x86_64
>03:47:42,461 DEBUG yum.verbose.YumBase: TSINFO: Marking libv4l-0.8.8-6.fc19.x86_64 as install for gstreamer-plugins-good-0.10.31-9.fc19.x86_64
>03:47:42,463 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:libtheora-1.1.1-6.fc19.x86_64 as install for gstreamer-plugins-good-0.10.31-9.fc19.x86_64
>03:47:42,465 DEBUG yum.verbose.YumBase: TSINFO: Marking taglib-1.8-5.20130218git.fc19.x86_64 as install for gstreamer-plugins-good-0.10.31-9.fc19.x86_64
>03:47:42,466 DEBUG yum.verbose.YumBase: TSINFO: Marking speex-1.2-0.16.rc1.fc19.x86_64 as install for gstreamer-plugins-good-0.10.31-9.fc19.x86_64
>03:47:42,468 DEBUG yum.verbose.YumBase: TSINFO: Marking libshout-2.2.2-9.fc19.x86_64 as install for gstreamer-plugins-good-0.10.31-9.fc19.x86_64
>03:47:42,470 DEBUG yum.verbose.YumBase: TSINFO: Marking libavc1394-0.5.3-13.fc19.x86_64 as install for gstreamer-plugins-good-0.10.31-9.fc19.x86_64
>03:47:42,471 DEBUG yum.verbose.YumBase: TSINFO: Marking libraw1394-2.1.0-1.fc19.x86_64 as install for gstreamer-plugins-good-0.10.31-9.fc19.x86_64
>03:47:42,473 DEBUG yum.verbose.YumBase: TSINFO: Marking 2:libogg-1.3.0-5.fc19.x86_64 as install for gstreamer-plugins-good-0.10.31-9.fc19.x86_64
>03:47:42,474 DEBUG yum.verbose.YumBase: TSINFO: Marking libiec61883-1.2.0-9.fc19.x86_64 as install for gstreamer-plugins-good-0.10.31-9.fc19.x86_64
>03:47:42,476 DEBUG yum.verbose.YumBase: TSINFO: Marking libdv-1.0.0-15.fc19.x86_64 as install for 
gstreamer-plugins-good-0.10.31-9.fc19.x86_64 >03:47:42,478 DEBUG yum.verbose.YumBase: TSINFO: Marking flac-libs-1.3.0-0.1.pre1.fc19.x86_64 as install for gstreamer-plugins-good-0.10.31-9.fc19.x86_64 >03:47:42,498 DEBUG yum.verbose.YumBase: TSINFO: Marking usbmuxd-1.0.8-7.fc19.x86_64 as install for gvfs-afc-1.16.1-1.fc19.x86_64 >03:47:42,499 DEBUG yum.verbose.YumBase: Quick matched usbmuxd-1.0.8-7.fc19.x86_64 to require for libusbmuxd.so.2()(64bit) >03:47:42,500 DEBUG yum.verbose.YumBase: TSINFO: Marking libtasn1-3.3-1.fc19.x86_64 as install for gvfs-afc-1.16.1-1.fc19.x86_64 >03:47:42,502 DEBUG yum.verbose.YumBase: TSINFO: Marking libplist-1.10-1.fc19.x86_64 as install for gvfs-afc-1.16.1-1.fc19.x86_64 >03:47:42,503 DEBUG yum.verbose.YumBase: TSINFO: Marking libimobiledevice-1.1.5-1.fc19.x86_64 as install for gvfs-afc-1.16.1-1.fc19.x86_64 >03:47:42,505 DEBUG yum.verbose.YumBase: TSINFO: Marking libbluray-0.2.3-2.fc19.x86_64 as install for gvfs-afc-1.16.1-1.fc19.x86_64 >03:47:42,514 DEBUG yum.verbose.YumBase: TSINFO: Marking fuse-libs-2.9.2-2.fc19.x86_64 as install for gvfs-fuse-1.16.1-1.fc19.x86_64 >03:47:42,515 DEBUG yum.verbose.YumBase: Quick matched fuse-libs-2.9.2-2.fc19.x86_64 to require for libfuse.so.2(FUSE_2.6)(64bit) >03:47:42,516 DEBUG yum.verbose.YumBase: TSINFO: Marking fuse-2.9.2-2.fc19.x86_64 as install for gvfs-fuse-1.16.1-1.fc19.x86_64 >03:47:42,528 DEBUG yum.verbose.YumBase: TSINFO: Marking libgphoto2-2.5.1.1-1.fc19.x86_64 as install for gvfs-gphoto2-1.16.1-1.fc19.x86_64 >03:47:42,530 DEBUG yum.verbose.YumBase: Quick matched libgphoto2-2.5.1.1-1.fc19.x86_64 to require for libgphoto2_port.so.10()(64bit) >03:47:42,530 DEBUG yum.verbose.YumBase: Quick matched libgphoto2-2.5.1.1-1.fc19.x86_64 to require for libgphoto2.so.6()(64bit) >03:47:42,535 DEBUG yum.verbose.YumBase: TSINFO: Marking libmtp-1.1.6-0.fc19.x86_64 as install for gvfs-mtp-1.16.1-1.fc19.x86_64 >03:47:42,553 DEBUG yum.verbose.YumBase: TSINFO: Marking hplip-libs-3.13.4-1.fc19.x86_64 as 
install for 1:hpijs-3.13.4-1.fc19.x86_64 >03:47:42,554 DEBUG yum.verbose.YumBase: Quick matched hplip-libs-3.13.4-1.fc19.x86_64 to require for libhpmud.so.0()(64bit) >03:47:42,555 DEBUG yum.verbose.YumBase: Quick matched hplip-libs-3.13.4-1.fc19.x86_64 to require for libhpip.so.0()(64bit) >03:47:42,562 DEBUG yum.verbose.YumBase: TSINFO: Marking python-pillow-2.0.0-7.gitd1c6db8.fc19.x86_64 as install for hplip-3.13.4-1.fc19.x86_64 >03:47:42,566 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:net-snmp-libs-5.7.2-8.fc19.x86_64 as install for hplip-3.13.4-1.fc19.x86_64 >03:47:42,570 DEBUG yum.verbose.YumBase: TSINFO: Marking hunspell-en-US-0.20121024-3.fc19.noarch as install for hunspell-1.3.2-10.fc19.x86_64 >03:47:42,575 DEBUG yum.verbose.YumBase: TSINFO: Marking libchewing-0.3.4-3.fc19.x86_64 as install for ibus-chewing-1.4.3-3.fc19.x86_64 >03:47:42,576 DEBUG yum.verbose.YumBase: TSINFO: Marking ibus-1.5.1-3.fc19.x86_64 as install for ibus-chewing-1.4.3-3.fc19.x86_64 >03:47:42,582 DEBUG yum.verbose.YumBase: TSINFO: Marking libhangul-0.1.0-6.fc19.x86_64 as install for ibus-hangul-1.4.2-4.fc19.x86_64 >03:47:42,584 DEBUG yum.verbose.YumBase: TSINFO: Marking pygobject3-3.8.1-2.fc19.x86_64 as install for ibus-hangul-1.4.2-4.fc19.x86_64 >03:47:42,589 DEBUG yum.verbose.YumBase: TSINFO: Marking libkkc-0.2.1-1.fc19.x86_64 as install for ibus-kkc-1.5.11-1.fc19.x86_64 >03:47:42,595 DEBUG yum.verbose.YumBase: TSINFO: Marking libgee06-0.6.8-1.fc19.x86_64 as install for ibus-kkc-1.5.11-1.fc19.x86_64 >03:47:42,608 DEBUG yum.verbose.YumBase: TSINFO: Marking libpinyin-data-0.9.91-1.fc19.x86_64 as install for ibus-libpinyin-1.6.91-1.fc19.x86_64 >03:47:42,614 DEBUG yum.verbose.YumBase: TSINFO: Marking libpinyin-0.9.91-1.fc19.x86_64 as install for ibus-libpinyin-1.6.91-1.fc19.x86_64 >03:47:42,616 DEBUG yum.verbose.YumBase: Quick matched libpinyin-0.9.91-1.fc19.x86_64 to require for libpinyin.so.4(LIBPINYIN)(64bit) >03:47:42,617 DEBUG yum.verbose.YumBase: Quick matched 
libpinyin-0.9.91-1.fc19.x86_64 to require for libpinyin.so.4()(64bit) >03:47:42,618 DEBUG yum.verbose.YumBase: TSINFO: Marking opencc-0.4.0-1.fc19.x86_64 as install for ibus-libpinyin-1.6.91-1.fc19.x86_64 >03:47:42,619 DEBUG yum.verbose.YumBase: TSINFO: Marking lua-5.1.4-12.fc19.x86_64 as install for ibus-libpinyin-1.6.91-1.fc19.x86_64 >03:47:42,625 DEBUG yum.verbose.YumBase: TSINFO: Marking m17n-lib-1.6.4-8.fc19.x86_64 as install for ibus-m17n-1.3.4-9.fc19.x86_64 >03:47:42,626 DEBUG yum.verbose.YumBase: Quick matched m17n-lib-1.6.4-8.fc19.x86_64 to require for libm17n.so.0()(64bit) >03:47:42,626 DEBUG yum.verbose.YumBase: Quick matched m17n-lib-1.6.4-8.fc19.x86_64 to require for libm17n-core.so.0()(64bit) >03:47:42,630 DEBUG yum.verbose.YumBase: TSINFO: Marking libtranslit-m17n-0.0.2-4.fc19.x86_64 as install for ibus-typing-booster-0.0.26-1.fc19.noarch >03:47:42,637 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:java-1.7.0-openjdk-1.7.0.19-2.3.9.6.fc19.x86_64 as install for icedtea-web-1.4-0.fc19.x86_64 >03:47:42,642 DEBUG yum.verbose.YumBase: TSINFO: Marking im-chooser-common-1.6.2-3.fc19.x86_64 as install for im-chooser-1.6.2-3.fc19.x86_64 >03:47:42,644 DEBUG yum.verbose.YumBase: TSINFO: Marking imsettings-libs-1.6.1-2.fc19.x86_64 as install for im-chooser-1.6.2-3.fc19.x86_64 >03:47:42,658 DEBUG yum.verbose.YumBase: TSINFO: Marking sysvinit-tools-2.88-10.dsf.fc19.x86_64 as install for initscripts-9.46-1.fc19.x86_64 >03:47:42,659 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:findutils-4.5.11-1.fc19.x86_64 as install for initscripts-9.46-1.fc19.x86_64 >03:47:42,663 DEBUG yum.verbose.YumBase: TSINFO: Marking fedora-release-19-0.5.noarch as install for initscripts-9.46-1.fc19.x86_64 >03:47:42,665 DEBUG yum.verbose.YumBase: TSINFO: Marking gawk-4.0.2-2.fc19.x86_64 as install for initscripts-9.46-1.fc19.x86_64 >03:47:42,672 DEBUG yum.verbose.YumBase: TSINFO: Marking libdb-5.3.21-9.fc19.x86_64 as install for iproute-3.9.0-1.fc19.x86_64 >03:47:42,674 DEBUG 
yum.verbose.YumBase: TSINFO: Marking linux-atm-libs-2.5.1-7.fc19.x86_64 as install for iproute-3.9.0-1.fc19.x86_64 >03:47:42,677 DEBUG yum.verbose.YumBase: TSINFO: Marking libsysfs-2.1.0-13.fc19.x86_64 as install for iprutils-2.3.13-2.fc19.x86_64 >03:47:42,683 DEBUG yum.verbose.YumBase: TSINFO: Marking libnetfilter_conntrack-1.0.3-1.fc19.x86_64 as install for iptstate-2.2.5-2.fc19.x86_64 >03:47:42,693 DEBUG yum.verbose.YumBase: TSINFO: Marking numactl-libs-2.0.8-4.fc19.x86_64 as install for 2:irqbalance-1.0.5-2.fc19.x86_64 >03:47:42,694 DEBUG yum.verbose.YumBase: Quick matched numactl-libs-2.0.8-4.fc19.x86_64 to require for libnuma.so.1(libnuma_1.1)(64bit) >03:47:42,694 DEBUG yum.verbose.YumBase: Quick matched numactl-libs-2.0.8-4.fc19.x86_64 to require for libnuma.so.1()(64bit) >03:47:42,702 DEBUG yum.verbose.YumBase: TSINFO: Marking hwdata-0.247-1.fc19.noarch as install for isdn4k-utils-3.2-90.fc19.x86_64 >03:47:42,704 DEBUG yum.verbose.YumBase: TSINFO: Marking 14:libpcap-1.3.0-4.fc19.x86_64 as install for isdn4k-utils-3.2-90.fc19.x86_64 >03:47:42,710 DEBUG yum.verbose.YumBase: TSINFO: Marking kbd-misc-1.15.5-5.fc19.noarch as install for kbd-1.15.5-5.fc19.x86_64 >03:47:42,712 DEBUG yum.verbose.YumBase: TSINFO: Marking linux-firmware-20130201-0.5.git65a5163.fc19.noarch as install for kernel-3.9.0-301.fc19.x86_64 >03:47:42,715 DEBUG yum.verbose.YumBase: TSINFO: Marking grubby-8.24-1.fc19.x86_64 as install for kernel-3.9.0-301.fc19.x86_64 >03:47:42,717 DEBUG yum.verbose.YumBase: TSINFO: Marking dracut-027-45.git20130430.fc19.x86_64 as install for kernel-3.9.0-301.fc19.x86_64 >03:47:42,720 DEBUG yum.verbose.YumBase: TSINFO: Marking khmeros-fonts-common-5.0-16.fc19.noarch as install for khmeros-base-fonts-5.0-16.fc19.noarch >03:47:42,722 DEBUG yum.verbose.YumBase: TSINFO: Marking groff-base-1.22.2-2.fc19.x86_64 as install for less-458-2.fc19.x86_64 >03:47:42,735 DEBUG yum.verbose.YumBase: TSINFO: Marking libtdb-1.2.11-2.fc19.x86_64 as install for 
libcanberra-gtk2-0.30-3.fc19.x86_64 >03:47:42,737 DEBUG yum.verbose.YumBase: TSINFO: Marking libtool-ltdl-2.4.2-12.fc19.x86_64 as install for libcanberra-gtk2-0.30-3.fc19.x86_64 >03:47:42,743 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:liberation-fonts-common-1.07.2-13.fc19.noarch as install for 1:liberation-mono-fonts-1.07.2-13.fc19.noarch >03:47:42,759 DEBUG yum.verbose.YumBase: TSINFO: Marking sane-backends-1.0.23-7.fc19.x86_64 as install for libsane-hpaio-3.13.4-1.fc19.x86_64 >03:47:42,771 DEBUG yum.verbose.YumBase: TSINFO: Marking m17n-db-1.6.4-2.fc19.noarch as install for m17n-contrib-1.1.14-2.fc19.noarch >03:47:42,775 DEBUG yum.verbose.YumBase: TSINFO: Marking gzip-1.5-4.fc19.x86_64 as install for man-db-2.6.3-6.fc19.x86_64 >03:47:42,777 DEBUG yum.verbose.YumBase: TSINFO: Marking libpipeline-1.2.3-1.fc19.x86_64 as install for man-db-2.6.3-6.fc19.x86_64 >03:47:42,778 DEBUG yum.verbose.YumBase: TSINFO: Marking gdbm-1.10-6.fc19.x86_64 as install for man-db-2.6.3-6.fc19.x86_64 >03:47:42,785 DEBUG yum.verbose.YumBase: TSINFO: Marking libreport-filesystem-2.1.3-3.fc19.x86_64 as install for mdadm-3.2.6-15.fc19.x86_64 >03:47:42,793 DEBUG yum.verbose.YumBase: TSINFO: Marking mesa-dri-filesystem-9.1.1-1.fc19.x86_64 as install for mesa-dri-drivers-9.1.1-1.fc19.x86_64 >03:47:42,794 DEBUG yum.verbose.YumBase: TSINFO: Marking libffi-3.0.13-1.fc19.x86_64 as install for mesa-dri-drivers-9.1.1-1.fc19.x86_64 >03:47:42,796 DEBUG yum.verbose.YumBase: TSINFO: Marking libdrm-2.4.44-2.fc19.x86_64 as install for mesa-dri-drivers-9.1.1-1.fc19.x86_64 >03:47:42,797 DEBUG yum.verbose.YumBase: Quick matched libdrm-2.4.44-2.fc19.x86_64 to require for libdrm_nouveau.so.2()(64bit) >03:47:42,797 DEBUG yum.verbose.YumBase: Quick matched libdrm-2.4.44-2.fc19.x86_64 to require for libdrm_intel.so.1()(64bit) >03:47:42,798 DEBUG yum.verbose.YumBase: Quick matched libdrm-2.4.44-2.fc19.x86_64 to require for libdrm.so.2()(64bit) >03:47:42,799 DEBUG yum.verbose.YumBase: TSINFO: Marking 
llvm-libs-3.2-5.fc19.x86_64 as install for mesa-dri-drivers-9.1.1-1.fc19.x86_64 >03:47:42,804 DEBUG yum.verbose.YumBase: TSINFO: Marking lockdev-1.0.4-0.6.20111007git.fc19.x86_64 as install for minicom-2.6.2-1.fc19.x86_64 >03:47:42,805 DEBUG yum.verbose.YumBase: Quick matched lockdev-1.0.4-0.6.20111007git.fc19.x86_64 to require for liblockdev.so.1()(64bit) >03:47:42,812 DEBUG yum.verbose.YumBase: TSINFO: Marking libXcursor-1.1.13-4.fc19.x86_64 as install for mousetweaks-3.8.0-1.fc19.x86_64 >03:47:42,838 DEBUG yum.verbose.YumBase: TSINFO: Marking libtirpc-0.2.3-2.fc19.x86_64 as install for 1:nfs-utils-1.2.8-0.fc19.x86_64 >03:47:42,840 DEBUG yum.verbose.YumBase: TSINFO: Marking rpcbind-0.2.0-21.fc19.x86_64 as install for 1:nfs-utils-1.2.8-0.fc19.x86_64 >03:47:42,842 DEBUG yum.verbose.YumBase: TSINFO: Marking libnfsidmap-0.25-5.fc19.x86_64 as install for 1:nfs-utils-1.2.8-0.fc19.x86_64 >03:47:42,844 DEBUG yum.verbose.YumBase: TSINFO: Marking libmount-2.23-1.fc19.x86_64 as install for 1:nfs-utils-1.2.8-0.fc19.x86_64 >03:47:42,845 DEBUG yum.verbose.YumBase: Quick matched libmount-2.23-1.fc19.x86_64 to require for libmount >03:47:42,846 DEBUG yum.verbose.YumBase: TSINFO: Marking libevent-2.0.18-3.fc19.x86_64 as install for 1:nfs-utils-1.2.8-0.fc19.x86_64 >03:47:42,850 DEBUG yum.verbose.YumBase: TSINFO: Marking nhn-nanum-fonts-common-3.020-8.fc19.noarch as install for nhn-nanum-gothic-fonts-3.020-8.fc19.noarch >03:47:42,860 DEBUG yum.verbose.YumBase: TSINFO: Marking socat-1.7.2.1-3.fc19.x86_64 as install for 2:nmap-ncat-6.25-2.fc19.x86_64 >03:47:42,869 DEBUG yum.verbose.YumBase: TSINFO: Marking newt-0.52.15-1.fc19.x86_64 as install for ntsysv-1.3.60-1.fc19.x86_64 >03:47:42,870 DEBUG yum.verbose.YumBase: Quick matched newt-0.52.15-1.fc19.x86_64 to require for libnewt.so.0.52()(64bit) >03:47:42,880 DEBUG yum.verbose.YumBase: TSINFO: Marking openssh-6.2p1-4.fc19.x86_64 as install for openssh-clients-6.2p1-4.fc19.x86_64 >03:47:42,882 DEBUG yum.verbose.YumBase: TSINFO: Marking 
libedit-3.0-10.20121213cvs.fc19.x86_64 as install for openssh-clients-6.2p1-4.fc19.x86_64 >03:47:42,893 DEBUG yum.verbose.YumBase: TSINFO: Marking python3-speechd-0.7.1-11.fc19.x86_64 as install for orca-3.8.1-1.fc19.x86_64 >03:47:42,894 DEBUG yum.verbose.YumBase: TSINFO: Marking python3-pyatspi-2.8.0-1.fc19.noarch as install for orca-3.8.1-1.fc19.x86_64 >03:47:42,896 DEBUG yum.verbose.YumBase: TSINFO: Marking python3-cairo-1.10.0-5.fc19.x86_64 as install for orca-3.8.1-1.fc19.x86_64 >03:47:42,897 DEBUG yum.verbose.YumBase: TSINFO: Marking python3-brlapi-0.6.0-2.fc19.x86_64 as install for orca-3.8.1-1.fc19.x86_64 >03:47:42,899 DEBUG yum.verbose.YumBase: TSINFO: Marking liblouis-python3-2.5.2-3.fc19.noarch as install for orca-3.8.1-1.fc19.x86_64 >03:47:42,917 DEBUG yum.verbose.YumBase: TSINFO: Marking paps-libs-0.6.8-25.fc19.x86_64 as install for paps-0.6.8-25.fc19.x86_64 >03:47:42,918 DEBUG yum.verbose.YumBase: Quick matched paps-libs-0.6.8-25.fc19.x86_64 to require for libpaps.so.0()(64bit) >03:47:42,927 DEBUG yum.verbose.YumBase: TSINFO: Marking libuser-0.59-1.fc19.x86_64 as install for passwd-0.78.99-4.fc19.x86_64 >03:47:42,932 DEBUG yum.verbose.YumBase: TSINFO: Marking passwdqc-lib-1.2.2-5.fc19.x86_64 as install for passwdqc-1.2.2-5.fc19.x86_64 >03:47:42,938 DEBUG yum.verbose.YumBase: TSINFO: Marking xdg-utils-1.1.0-0.15.20120809git.fc19.noarch as install for pinfo-0.6.10-7.fc19.x86_64 >03:47:42,944 DEBUG yum.verbose.YumBase: TSINFO: Marking plymouth-scripts-0.8.9-0.2013.03.26.0.fc19.x86_64 as install for plymouth-0.8.9-0.2013.03.26.0.fc19.x86_64 >03:47:42,946 DEBUG yum.verbose.YumBase: TSINFO: Marking plymouth-core-libs-0.8.9-0.2013.03.26.0.fc19.x86_64 as install for plymouth-0.8.9-0.2013.03.26.0.fc19.x86_64 >03:47:42,947 DEBUG yum.verbose.YumBase: Quick matched plymouth-core-libs-0.8.9-0.2013.03.26.0.fc19.x86_64 to require for libply-splash-core.so.2()(64bit) >03:47:42,948 DEBUG yum.verbose.YumBase: TSINFO: Marking 
plymouth-theme-charge-0.8.9-0.2013.03.26.0.fc19.x86_64 as install for plymouth-system-theme-0.8.9-0.2013.03.26.0.fc19.x86_64 >03:47:42,955 DEBUG yum.verbose.YumBase: TSINFO: Marking libselinux-utils-2.1.13-12.fc19.x86_64 as install for policycoreutils-2.1.14-37.fc19.x86_64 >03:47:42,957 DEBUG yum.verbose.YumBase: TSINFO: Marking libsemanage-2.1.10-4.fc19.x86_64 as install for policycoreutils-2.1.14-37.fc19.x86_64 >03:47:42,959 DEBUG yum.verbose.YumBase: TSINFO: Marking diffutils-3.3-1.fc19.x86_64 as install for policycoreutils-2.1.14-37.fc19.x86_64 >03:47:42,973 DEBUG yum.verbose.YumBase: TSINFO: Marking elfutils-libelf-0.155-5.fc19.x86_64 as install for prelink-0.5.0-1.fc19.x86_64 >03:47:42,974 DEBUG yum.verbose.YumBase: Quick matched elfutils-libelf-0.155-5.fc19.x86_64 to require for libelf.so.1()(64bit) >03:47:43,007 DEBUG yum.verbose.YumBase: TSINFO: Marking rtkit-0.11-3.fc19.x86_64 as install for pulseaudio-3.0-7.fc19.x86_64 >03:47:43,011 DEBUG yum.verbose.YumBase: TSINFO: Marking libxcb-1.9-2.fc19.x86_64 as install for pulseaudio-3.0-7.fc19.x86_64 >03:47:43,014 DEBUG yum.verbose.YumBase: TSINFO: Marking webrtc-audio-processing-0.1-3.fc19.x86_64 as install for pulseaudio-3.0-7.fc19.x86_64 >03:47:43,017 DEBUG yum.verbose.YumBase: TSINFO: Marking json-c-0.10-3.fc19.x86_64 as install for pulseaudio-3.0-7.fc19.x86_64 >03:47:43,023 DEBUG yum.verbose.YumBase: TSINFO: Marking libasyncns-0.8-5.fc19.x86_64 as install for pulseaudio-3.0-7.fc19.x86_64 >03:47:43,049 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:quota-nls-4.01-5.fc19.noarch as install for 1:quota-4.01-5.fc19.x86_64 >03:47:43,093 DEBUG yum.verbose.YumBase: TSINFO: Marking libdmapsharing-2.9.16-1.fc19.x86_64 as install for rhythmbox-2.99-1.fc19.x86_64 >03:47:43,095 DEBUG yum.verbose.YumBase: TSINFO: Marking python-mako-0.7.3-1.fc19.noarch as install for rhythmbox-2.99-1.fc19.x86_64 >03:47:43,098 DEBUG yum.verbose.YumBase: TSINFO: Marking media-player-info-17-3.fc19.noarch as install for 
rhythmbox-2.99-1.fc19.x86_64 >03:47:43,103 DEBUG yum.verbose.YumBase: TSINFO: Marking gnome-icon-theme-legacy-3.8.0-1.fc19.noarch as install for rhythmbox-2.99-1.fc19.x86_64 >03:47:43,105 DEBUG yum.verbose.YumBase: TSINFO: Marking libmx-1.4.7-6.fc19.x86_64 as install for rhythmbox-2.99-1.fc19.x86_64 >03:47:43,109 DEBUG yum.verbose.YumBase: TSINFO: Marking grilo-0.2.5-1.fc19.x86_64 as install for rhythmbox-2.99-1.fc19.x86_64 >03:47:43,116 DEBUG yum.verbose.YumBase: TSINFO: Marking libgpod-0.8.2-9.fc19.x86_64 as install for rhythmbox-2.99-1.fc19.x86_64 >03:47:43,140 DEBUG yum.verbose.YumBase: TSINFO: Marking libdb-utils-5.3.21-9.fc19.x86_64 as install for rpm-4.11.0.1-1.fc19.x86_64 >03:47:43,164 DEBUG yum.verbose.YumBase: TSINFO: Marking liblognorm-0.3.5-1.fc19.x86_64 as install for rsyslog-7.2.6-1.fc19.x86_64 >03:47:43,170 DEBUG yum.verbose.YumBase: TSINFO: Marking libestr-0.1.5-1.fc19.x86_64 as install for rsyslog-7.2.6-1.fc19.x86_64 >03:47:43,171 DEBUG yum.verbose.YumBase: TSINFO: Marking libee-0.4.1-4.fc19.x86_64 as install for rsyslog-7.2.6-1.fc19.x86_64 >03:47:43,206 DEBUG yum.verbose.YumBase: TSINFO: Marking 2:samba-libs-4.0.5-1.fc19.x86_64 as install for 2:samba-client-4.0.5-1.fc19.x86_64 >03:47:43,215 DEBUG yum.verbose.YumBase: TSINFO: Marking 2:samba-common-4.0.5-1.fc19.x86_64 as install for 2:samba-client-4.0.5-1.fc19.x86_64 >03:47:43,220 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libutil_reg.so(SAMBA_4.0.5)(64bit) >03:47:43,228 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libutil_cmdline.so(SAMBA_4.0.5)(64bit) >03:47:43,228 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libtrusts_util.so(SAMBA_4.0.5)(64bit) >03:47:43,230 DEBUG yum.verbose.YumBase: TSINFO: Marking libtevent-0.9.18-1.fc19.x86_64 as install for 2:samba-client-4.0.5-1.fc19.x86_64 >03:47:43,231 DEBUG yum.verbose.YumBase: Quick matched 
libtevent-0.9.18-1.fc19.x86_64 to require for libtevent.so.0(TEVENT_0.9.12)(64bit) >03:47:43,232 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libsmbregistry.so(SAMBA_4.0.5)(64bit) >03:47:43,232 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libsmbconf.so.0(SMBCONF_0)(64bit) >03:47:43,233 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libsmbclient-raw.so.0(SMBCLIENT_RAW_0.0.1)(64bit) >03:47:43,235 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libsmb_transport.so(SAMBA_4.0.5)(64bit) >03:47:43,235 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libserver-role.so(SAMBA_4.0.5)(64bit) >03:47:43,235 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libsecrets3.so(SAMBA_4.0.5)(64bit) >03:47:43,236 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libsamdb-common.so(SAMBA_4.0.5)(64bit) >03:47:43,236 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libsamba3-util.so(SAMBA_4.0.5)(64bit) >03:47:43,236 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libsamba-util.so.0(SAMBA_UTIL_0.0.1)(64bit) >03:47:43,237 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libsamba-sockets.so(SAMBA_4.0.5)(64bit) >03:47:43,237 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libsamba-security.so(SAMBA_4.0.5)(64bit) >03:47:43,237 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libsamba-hostconfig.so.0(SAMBA_HOSTCONFIG_0.0.1)(64bit) >03:47:43,237 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libsamba-credentials.so.0(SAMBA_CREDENTIALS_0.0.1)(64bit) >03:47:43,242 DEBUG 
yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libreplace.so(SAMBA_4.0.5)(64bit) >03:47:43,242 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libregistry.so.0(REGISTRY_0.0.1)(64bit) >03:47:43,243 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libnetif.so(SAMBA_4.0.5)(64bit) >03:47:43,244 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libndr.so.0(NDR_0.0.1)(64bit) >03:47:43,244 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libndr-standard.so.0(NDR_STANDARD_0.0.1)(64bit) >03:47:43,244 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libndr-samba.so(SAMBA_4.0.5)(64bit) >03:47:43,245 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libndr-nbt.so.0(NDR_NBT_0.0.1)(64bit) >03:47:43,245 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libmsrpc3.so(SAMBA_4.0.5)(64bit) >03:47:43,245 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for liblibsmb.so(SAMBA_4.0.5)(64bit) >03:47:43,246 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for liblibcli_netlogon3.so(SAMBA_4.0.5)(64bit) >03:47:43,246 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for liblibcli_lsa3.so(SAMBA_4.0.5)(64bit) >03:47:43,246 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libldbsamba.so(SAMBA_4.0.5)(64bit) >03:47:43,249 DEBUG yum.verbose.YumBase: TSINFO: Marking libldb-1.1.15-2.fc19.x86_64 as install for 2:samba-client-4.0.5-1.fc19.x86_64 >03:47:43,255 DEBUG yum.verbose.YumBase: Quick matched libldb-1.1.15-2.fc19.x86_64 to require for libldb.so.1(LDB_0.9.18)(64bit) >03:47:43,255 DEBUG yum.verbose.YumBase: Quick matched 
libldb-1.1.15-2.fc19.x86_64 to require for libldb.so.1(LDB_0.9.15)(64bit) >03:47:43,256 DEBUG yum.verbose.YumBase: Quick matched libldb-1.1.15-2.fc19.x86_64 to require for libldb.so.1(LDB_0.9.10)(64bit) >03:47:43,256 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libgensec.so.0(GENSEC_0.0.1)(64bit) >03:47:43,256 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libevents.so(SAMBA_4.0.5)(64bit) >03:47:43,257 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for liberrors.so(SAMBA_4.0.5)(64bit) >03:47:43,257 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libdcerpc.so.0(DCERPC_0.0.1)(64bit) >03:47:43,257 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libdcerpc-samba.so(SAMBA_4.0.5)(64bit) >03:47:43,258 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libdcerpc-binding.so.0(DCERPC_BINDING_0.0.1)(64bit) >03:47:43,258 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libdbwrap.so(SAMBA_4.0.5)(64bit) >03:47:43,258 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libcmdline-credentials.so(SAMBA_4.0.5)(64bit) >03:47:43,260 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libcliauth.so(SAMBA_4.0.5)(64bit) >03:47:43,260 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libcli_spoolss.so(SAMBA_4.0.5)(64bit) >03:47:43,260 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libcli_smb_common.so(SAMBA_4.0.5)(64bit) >03:47:43,261 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libcli_cldap.so(SAMBA_4.0.5)(64bit) >03:47:43,261 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for 
libcli-nbt.so(SAMBA_4.0.5)(64bit) >03:47:43,261 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libcli-ldap.so(SAMBA_4.0.5)(64bit) >03:47:43,267 DEBUG yum.verbose.YumBase: TSINFO: Marking libbsd-0.4.2-3.fc19.x86_64 as install for 2:samba-client-4.0.5-1.fc19.x86_64 >03:47:43,268 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libCHARSET3.so(SAMBA_4.0.5)(64bit) >03:47:43,268 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libutil_tdb.so()(64bit) >03:47:43,269 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libutil_reg.so()(64bit) >03:47:43,269 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libutil_cmdline.so()(64bit) >03:47:43,269 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libtrusts_util.so()(64bit) >03:47:43,271 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libsmbregistry.so()(64bit) >03:47:43,272 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libsmbconf.so.0()(64bit) >03:47:43,272 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libsmbclient-raw.so.0()(64bit) >03:47:43,272 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libsmb_transport.so()(64bit) >03:47:43,273 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libserver-role.so()(64bit) >03:47:43,273 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libsecrets3.so()(64bit) >03:47:43,278 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libsamdb-common.so()(64bit) >03:47:43,278 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for 
libsamba3-util.so()(64bit) >03:47:43,278 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libsamba-util.so.0()(64bit) >03:47:43,279 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libsamba-sockets.so()(64bit) >03:47:43,279 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libsamba-security.so()(64bit) >03:47:43,279 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libsamba-hostconfig.so.0()(64bit) >03:47:43,279 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libsamba-credentials.so.0()(64bit) >03:47:43,280 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libreplace.so()(64bit) >03:47:43,280 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libregistry.so.0()(64bit) >03:47:43,281 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libnetif.so()(64bit) >03:47:43,282 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libndr.so.0()(64bit) >03:47:43,282 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libndr-standard.so.0()(64bit) >03:47:43,282 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libndr-samba.so()(64bit) >03:47:43,283 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libndr-nbt.so.0()(64bit) >03:47:43,283 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libmsrpc3.so()(64bit) >03:47:43,283 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for liblibsmb.so()(64bit) >03:47:43,284 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for liblibcli_netlogon3.so()(64bit) >03:47:43,284 DEBUG 
yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for liblibcli_lsa3.so()(64bit) >03:47:43,288 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libldbsamba.so()(64bit) >03:47:43,289 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libgensec.so.0()(64bit) >03:47:43,289 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libevents.so()(64bit) >03:47:43,290 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for liberrors.so()(64bit) >03:47:43,290 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libdcerpc.so.0()(64bit) >03:47:43,290 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libdcerpc-samba.so()(64bit) >03:47:43,291 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libdcerpc-binding.so.0()(64bit) >03:47:43,291 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libdbwrap.so()(64bit) >03:47:43,291 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libcmdline-credentials.so()(64bit) >03:47:43,291 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libcliauth.so()(64bit) >03:47:43,292 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libcli_spoolss.so()(64bit) >03:47:43,292 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libcli_smb_common.so()(64bit) >03:47:43,292 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libcli_cldap.so()(64bit) >03:47:43,293 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libcli-nbt.so()(64bit) >03:47:43,297 DEBUG yum.verbose.YumBase: Quick matched 
2:samba-libs-4.0.5-1.fc19.x86_64 to require for libcli-ldap.so()(64bit) >03:47:43,298 DEBUG yum.verbose.YumBase: Quick matched 2:samba-libs-4.0.5-1.fc19.x86_64 to require for libCHARSET3.so()(64bit) >03:47:43,313 DEBUG yum.verbose.YumBase: TSINFO: Marking sane-backends-libs-1.0.23-7.fc19.x86_64 as install for sane-backends-drivers-scanners-1.0.23-7.fc19.x86_64 >03:47:43,315 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:libusb-0.1.4-1.fc19.x86_64 as install for sane-backends-drivers-scanners-1.0.23-7.fc19.x86_64 >03:47:43,319 DEBUG yum.verbose.YumBase: TSINFO: Marking libieee1284-0.2.11-13.fc19.x86_64 as install for sane-backends-drivers-scanners-1.0.23-7.fc19.x86_64 >03:47:43,353 DEBUG yum.verbose.YumBase: TSINFO: Marking pinentry-gtk-0.8.1-10.fc19.x86_64 as install for seahorse-3.8.1-1.fc19.x86_64 >03:47:43,355 DEBUG yum.verbose.YumBase: TSINFO: Marking gpgme-1.3.2-3.fc19.x86_64 as install for seahorse-3.8.1-1.fc19.x86_64 >03:47:43,356 DEBUG yum.verbose.YumBase: Quick matched gpgme-1.3.2-3.fc19.x86_64 to require for libgpgme.so.11(GPGME_1.0)(64bit) >03:47:43,356 DEBUG yum.verbose.YumBase: Quick matched gpgme-1.3.2-3.fc19.x86_64 to require for libgpgme.so.11()(64bit) >03:47:43,361 DEBUG yum.verbose.YumBase: TSINFO: Marking avahi-glib-0.6.31-11.fc19.x86_64 as install for seahorse-3.8.1-1.fc19.x86_64 >03:47:43,364 DEBUG yum.verbose.YumBase: TSINFO: Marking selinux-policy-3.12.1-42.fc19.noarch as install for selinux-policy-targeted-3.12.1-42.fc19.noarch >03:47:43,365 DEBUG yum.verbose.YumBase: Quick matched selinux-policy-3.12.1-42.fc19.noarch to require for selinux-policy >03:47:43,381 DEBUG yum.verbose.YumBase: TSINFO: Marking procmail-3.22-32.fc19.x86_64 as install for sendmail-8.14.7-1.fc19.x86_64 >03:47:43,387 DEBUG yum.verbose.YumBase: TSINFO: Marking cyrus-sasl-2.1.26-6.fc19.x86_64 as install for sendmail-8.14.7-1.fc19.x86_64 >03:47:43,389 DEBUG yum.verbose.YumBase: TSINFO: Marking hesiod-3.2.1-1.fc19.x86_64 as install for sendmail-8.14.7-1.fc19.x86_64 
>03:47:43,399 DEBUG yum.verbose.YumBase: TSINFO: Marking setroubleshoot-server-3.2.8-1.fc19.x86_64 as install for setroubleshoot-3.2.8-1.fc19.x86_64 >03:47:43,407 DEBUG yum.verbose.YumBase: TSINFO: Marking pygtk2-libglade-2.24.0-7.fc19.x86_64 as install for setroubleshoot-3.2.8-1.fc19.x86_64 >03:47:43,409 DEBUG yum.verbose.YumBase: TSINFO: Marking pygobject2-2.28.6-9.fc19.x86_64 as install for setroubleshoot-3.2.8-1.fc19.x86_64 >03:47:43,412 DEBUG yum.verbose.YumBase: TSINFO: Marking notify-python-0.1.1-23.fc19.x86_64 as install for setroubleshoot-3.2.8-1.fc19.x86_64 >03:47:43,419 DEBUG yum.verbose.YumBase: TSINFO: Marking libreport-gtk-2.1.3-3.fc19.x86_64 as install for setroubleshoot-3.2.8-1.fc19.x86_64 >03:47:43,422 DEBUG yum.verbose.YumBase: TSINFO: Marking usermode-1.111-2.fc19.x86_64 as install for setuptool-1.19.11-6.fc19.x86_64 >03:47:43,452 DEBUG yum.verbose.YumBase: TSINFO: Marking LibRaw-0.14.7-4.fc19.x86_64 as install for shotwell-0.14.1-1.fc19.x86_64 >03:47:43,454 DEBUG yum.verbose.YumBase: TSINFO: Marking libgomp-4.8.0-2.fc19.x86_64 as install for shotwell-0.14.1-1.fc19.x86_64 >03:47:43,456 DEBUG yum.verbose.YumBase: TSINFO: Marking libgexiv2-0.5.0-6.fc19.x86_64 as install for shotwell-0.14.1-1.fc19.x86_64 >03:47:43,481 DEBUG yum.verbose.YumBase: TSINFO: Marking mailx-12.5-8.fc19.x86_64 as install for 1:smartmontools-6.0-2.fc19.x86_64 >03:47:43,489 DEBUG yum.verbose.YumBase: TSINFO: Marking smc-fonts-common-5.0.1-5.fc19.noarch as install for smc-meera-fonts-5.0.1-5.fc19.noarch >03:47:43,491 DEBUG yum.verbose.YumBase: TSINFO: Marking xz-5.1.2-4alpha.fc19.x86_64 as install for sos-2.2-31.fc19.noarch >03:47:43,498 DEBUG yum.verbose.YumBase: TSINFO: Marking rpm-python-4.11.0.1-1.fc19.x86_64 as install for sos-2.2-31.fc19.noarch >03:47:43,500 DEBUG yum.verbose.YumBase: TSINFO: Marking libxml2-python-2.9.1-1.fc19.x86_64 as install for sos-2.2-31.fc19.noarch >03:47:43,509 DEBUG yum.verbose.YumBase: TSINFO: Marking libpciaccess-0.13.1-3.fc19.x86_64 as install 
for spice-vdagent-0.14.0-1.fc19.x86_64 >03:47:43,515 DEBUG yum.verbose.YumBase: TSINFO: Marking libXinerama-1.1.2-4.fc19.x86_64 as install for spice-vdagent-0.14.0-1.fc19.x86_64 >03:47:43,542 DEBUG yum.verbose.YumBase: TSINFO: Marking sssd-client-1.10.0-1.fc19.alpha1.x86_64 as install for sssd-1.10.0-1.fc19.alpha1.x86_64 >03:47:43,549 DEBUG yum.verbose.YumBase: TSINFO: Marking libsss_idmap-1.10.0-1.fc19.alpha1.x86_64 as install for sssd-1.10.0-1.fc19.alpha1.x86_64 >03:47:43,551 DEBUG yum.verbose.YumBase: TSINFO: Marking libipa_hbac-1.10.0-1.fc19.alpha1.x86_64 as install for sssd-1.10.0-1.fc19.alpha1.x86_64 >03:47:43,557 DEBUG yum.verbose.YumBase: TSINFO: Marking cyrus-sasl-gssapi-2.1.26-6.fc19.x86_64 as install for sssd-1.10.0-1.fc19.alpha1.x86_64 >03:47:43,560 DEBUG yum.verbose.YumBase: TSINFO: Marking python-libs-2.7.4-4.fc19.x86_64 as install for sssd-1.10.0-1.fc19.alpha1.x86_64 >03:47:43,567 DEBUG yum.verbose.YumBase: TSINFO: Marking pcre-8.32-4.fc19.x86_64 as install for sssd-1.10.0-1.fc19.alpha1.x86_64 >03:47:43,573 DEBUG yum.verbose.YumBase: TSINFO: Marking libnl-1.1-17.fc19.x86_64 as install for sssd-1.10.0-1.fc19.alpha1.x86_64 >03:47:43,576 DEBUG yum.verbose.YumBase: TSINFO: Marking libini_config-1.0.0.1-16.fc19.x86_64 as install for sssd-1.10.0-1.fc19.alpha1.x86_64 >03:47:43,578 DEBUG yum.verbose.YumBase: TSINFO: Marking libdhash-0.4.3-16.fc19.x86_64 as install for sssd-1.10.0-1.fc19.alpha1.x86_64 >03:47:43,584 DEBUG yum.verbose.YumBase: TSINFO: Marking libcollection-0.6.2-16.fc19.x86_64 as install for sssd-1.10.0-1.fc19.alpha1.x86_64 >03:47:43,586 DEBUG yum.verbose.YumBase: TSINFO: Marking c-ares-1.9.1-4.fc19.x86_64 as install for sssd-1.10.0-1.fc19.alpha1.x86_64 >03:47:43,619 DEBUG yum.verbose.YumBase: TSINFO: Marking libmusicbrainz5-5.0.1-5.fc19.x86_64 as install for sushi-3.8.1-1.fc19.x86_64 >03:47:43,633 DEBUG yum.verbose.YumBase: TSINFO: Marking system-config-printer-libs-1.4.0-1.fc19.noarch as install for system-config-printer-1.4.0-1.fc19.x86_64 
>03:47:43,635 DEBUG yum.verbose.YumBase: TSINFO: Marking pygtk2-2.24.0-7.fc19.x86_64 as install for system-config-printer-1.4.0-1.fc19.x86_64 >03:47:43,637 DEBUG yum.verbose.YumBase: TSINFO: Marking python-slip-gtk-0.4.0-1.fc19.noarch as install for system-config-printer-1.4.0-1.fc19.x86_64 >03:47:43,643 DEBUG yum.verbose.YumBase: TSINFO: Marking gnome-python2-gnomekeyring-2.32.0-14.fc19.x86_64 as install for system-config-printer-1.4.0-1.fc19.x86_64 >03:47:43,670 DEBUG yum.verbose.YumBase: TSINFO: Marking kmod-libs-13-2.fc19.x86_64 as install for systemd-203-2.fc19.x86_64 >03:47:43,677 DEBUG yum.verbose.YumBase: TSINFO: Marking qrencode-libs-3.4.1-1.fc19.x86_64 as install for systemd-203-2.fc19.x86_64 >03:47:43,679 DEBUG yum.verbose.YumBase: TSINFO: Marking libmicrohttpd-0.9.24-2.fc19.x86_64 as install for systemd-203-2.fc19.x86_64 >03:47:43,700 DEBUG yum.verbose.YumBase: TSINFO: Marking thai-scalable-fonts-common-0.5.0-6.fc19.noarch as install for thai-scalable-waree-fonts-0.5.0-6.fc19.noarch >03:47:43,709 DEBUG yum.verbose.YumBase: TSINFO: Marking psmisc-22.20-2.fc19.x86_64 as install for tmpwatch-2.11-3.fc19.x86_64 >03:47:43,737 DEBUG yum.verbose.YumBase: TSINFO: Marking pyxdg-0.25-1.fc19.noarch as install for 1:totem-3.8.0-1.fc19.x86_64 >03:47:43,744 DEBUG yum.verbose.YumBase: TSINFO: Marking grilo-plugins-0.2.6-1.fc19.x86_64 as install for 1:totem-3.8.0-1.fc19.x86_64 >03:47:43,747 DEBUG yum.verbose.YumBase: TSINFO: Marking libzeitgeist-0.3.18-4.fc19.x86_64 as install for 1:totem-3.8.0-1.fc19.x86_64 >03:47:43,794 DEBUG yum.verbose.YumBase: TSINFO: Marking transmission-common-2.77-3.fc19.x86_64 as install for transmission-gtk-2.77-3.fc19.x86_64 >03:47:43,804 DEBUG yum.verbose.YumBase: TSINFO: Marking usb_modeswitch-data-20121109-2.fc19.noarch as install for usb_modeswitch-1.2.5-2.fc19.x86_64 >03:47:43,834 DEBUG yum.verbose.YumBase: TSINFO: Marking libutempter-1.1.6-2.fc19.x86_64 as install for util-linux-2.23-1.fc19.x86_64 >03:47:43,840 DEBUG 
yum.verbose.YumBase: Quick matched libutempter-1.1.6-2.fc19.x86_64 to require for libutempter.so.0()(64bit) >03:47:43,864 DEBUG yum.verbose.YumBase: TSINFO: Marking rdesktop-1.7.1-2.fc19.x86_64 as install for vinagre-3.8.1-2.fc19.x86_64 >03:47:43,870 DEBUG yum.verbose.YumBase: TSINFO: Marking avahi-ui-gtk3-0.6.31-11.fc19.x86_64 as install for vinagre-3.8.1-2.fc19.x86_64 >03:47:43,872 DEBUG yum.verbose.YumBase: TSINFO: Marking avahi-gobject-0.6.31-11.fc19.x86_64 as install for vinagre-3.8.1-2.fc19.x86_64 >03:47:43,888 DEBUG yum.verbose.YumBase: TSINFO: Marking crda-1.1.3_2013.02.13-2.fc19.x86_64 as install for 1:wireless-tools-29-9.1.fc19.x86_64 >03:47:43,903 DEBUG yum.verbose.YumBase: TSINFO: Marking libwvstreams-4.6.1-8.fc19.x86_64 as install for wvdial-1.61-7.fc19.x86_64 >03:47:43,904 DEBUG yum.verbose.YumBase: Quick matched libwvstreams-4.6.1-8.fc19.x86_64 to require for libwvstreams.so.4.6()(64bit) >03:47:43,904 DEBUG yum.verbose.YumBase: Quick matched libwvstreams-4.6.1-8.fc19.x86_64 to require for libwvbase.so.4.6()(64bit) >03:47:43,905 DEBUG yum.verbose.YumBase: Quick matched libwvstreams-4.6.1-8.fc19.x86_64 to require for libuniconf.so.4.6()(64bit) >03:47:43,917 DEBUG yum.verbose.YumBase: TSINFO: Marking xdg-user-dirs-0.14-4.fc19.x86_64 as install for xdg-user-dirs-gtk-0.10-1.fc19.x86_64 >03:47:43,922 DEBUG yum.verbose.YumBase: TSINFO: Marking xorg-x11-glamor-0.5.0-5.20130401git81aadb8.fc19.x86_64 as install for xorg-x11-drv-ati-7.1.0-5.20130408git6e74aacc5.fc19.x86_64 >03:47:43,936 DEBUG yum.verbose.YumBase: TSINFO: Marking xkeyboard-config-2.8-2.fc19.noarch as install for xorg-x11-drv-evdev-2.8.0-1.fc19.x86_64 >03:47:43,937 DEBUG yum.verbose.YumBase: TSINFO: Marking mtdev-1.1.3-3.fc19.x86_64 as install for xorg-x11-drv-evdev-2.8.0-1.fc19.x86_64 >03:47:43,938 DEBUG yum.verbose.YumBase: Quick matched mtdev-1.1.3-3.fc19.x86_64 to require for libmtdev.so.1()(64bit) >03:47:43,959 DEBUG yum.verbose.YumBase: TSINFO: Marking xcb-util-0.3.9-2.fc19.x86_64 as 
install for xorg-x11-drv-intel-2.21.6-1.fc19.x86_64 >03:47:43,962 DEBUG yum.verbose.YumBase: TSINFO: Marking libXvMC-1.0.7-4.fc19.x86_64 as install for xorg-x11-drv-intel-2.21.6-1.fc19.x86_64 >03:47:43,965 DEBUG yum.verbose.YumBase: TSINFO: Marking libXv-1.0.7-4.fc19.x86_64 as install for xorg-x11-drv-intel-2.21.6-1.fc19.x86_64 >03:47:44,012 DEBUG yum.verbose.YumBase: TSINFO: Marking mesa-libxatracker-9.1.1-1.fc19.x86_64 as install for xorg-x11-drv-vmware-13.0.1-1.fc19.x86_64 >03:47:44,016 DEBUG yum.verbose.YumBase: Quick matched mesa-libxatracker-9.1.1-1.fc19.x86_64 to require for libxatracker.so.1()(64bit) >03:47:44,047 DEBUG yum.verbose.YumBase: TSINFO: Marking xorg-x11-server-common-1.14.1-1.fc19.x86_64 as install for xorg-x11-server-Xorg-1.14.1-1.fc19.x86_64 >03:47:44,049 DEBUG yum.verbose.YumBase: TSINFO: Marking libunwind-1.1-1.fc19.x86_64 as install for xorg-x11-server-Xorg-1.14.1-1.fc19.x86_64 >03:47:44,050 DEBUG yum.verbose.YumBase: TSINFO: Marking pixman-0.28.0-3.fc19.x86_64 as install for xorg-x11-server-Xorg-1.14.1-1.fc19.x86_64 >03:47:44,052 DEBUG yum.verbose.YumBase: TSINFO: Marking libXfont-1.4.5-4.fc19.x86_64 as install for xorg-x11-server-Xorg-1.14.1-1.fc19.x86_64 >03:47:44,057 DEBUG yum.verbose.YumBase: TSINFO: Marking libdmx-1.1.2-3.fc19.x86_64 as install for xorg-x11-utils-7.5-9.fc19.x86_64 >03:47:44,058 DEBUG yum.verbose.YumBase: TSINFO: Marking libXxf86vm-1.1.2-4.fc19.x86_64 as install for xorg-x11-utils-7.5-9.fc19.x86_64 >03:47:44,060 DEBUG yum.verbose.YumBase: TSINFO: Marking libXxf86misc-1.0.3-6.fc19.x86_64 as install for xorg-x11-utils-7.5-9.fc19.x86_64 >03:47:44,062 DEBUG yum.verbose.YumBase: TSINFO: Marking libXxf86dga-1.1.3-4.fc19.x86_64 as install for xorg-x11-utils-7.5-9.fc19.x86_64 >03:47:44,063 DEBUG yum.verbose.YumBase: TSINFO: Marking libXrender-0.9.7-4.fc19.x86_64 as install for xorg-x11-utils-7.5-9.fc19.x86_64 >03:47:44,067 DEBUG yum.verbose.YumBase: TSINFO: Marking libXmu-1.1.1-4.fc19.x86_64 as install for 
1:xorg-x11-xauth-1.0.7-3.fc19.x86_64 >03:47:44,073 DEBUG yum.verbose.YumBase: TSINFO: Marking yelp-xsl-3.8.0-1.fc19.noarch as install for 1:yelp-3.8.0-1.fc19.x86_64 >03:47:44,075 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:yelp-libs-3.8.0-1.fc19.x86_64 as install for 1:yelp-3.8.0-1.fc19.x86_64 >03:47:44,078 DEBUG yum.verbose.YumBase: TSINFO: Marking yum-metadata-parser-1.1.4-8.fc19.x86_64 as install for yum-3.4.3-83.fc19.noarch >03:47:44,080 DEBUG yum.verbose.YumBase: TSINFO: Marking python-urlgrabber-3.9.1-26.fc19.noarch as install for yum-3.4.3-83.fc19.noarch >03:47:44,082 DEBUG yum.verbose.YumBase: TSINFO: Marking pyxattr-0.5.1-3.fc19.x86_64 as install for yum-3.4.3-83.fc19.noarch >03:47:44,083 DEBUG yum.verbose.YumBase: TSINFO: Marking python-iniparse-0.4-7.fc19.noarch as install for yum-3.4.3-83.fc19.noarch >03:47:44,085 DEBUG yum.verbose.YumBase: TSINFO: Marking pyliblzma-0.5.3-8.fc19.x86_64 as install for yum-3.4.3-83.fc19.noarch >03:47:44,087 DEBUG yum.verbose.YumBase: TSINFO: Marking pygpgme-0.3-6.fc19.x86_64 as install for yum-3.4.3-83.fc19.noarch >03:47:44,089 DEBUG yum.verbose.YumBase: TSINFO: Marking python-kitchen-1.1.1-3.fc19.noarch as install for yum-utils-1.1.31-14.fc19.noarch >03:47:44,129 DEBUG yum.verbose.YumBase: TSINFO: Marking comps-extras-21-3.fc19.noarch as install for PackageKit-0.8.7-4.fc19.x86_64 >03:47:44,130 DEBUG yum.verbose.YumBase: TSINFO: Marking PackageKit-yum-0.8.7-4.fc19.x86_64 as install for PackageKit-0.8.7-4.fc19.x86_64 >03:47:44,138 DEBUG yum.verbose.YumBase: TSINFO: Marking abrt-libs-2.1.3-2.fc19.x86_64 as install for abrt-2.1.3-2.fc19.x86_64 >03:47:44,140 DEBUG yum.verbose.YumBase: TSINFO: Marking libreport-2.1.3-3.fc19.x86_64 as install for abrt-2.1.3-2.fc19.x86_64 >03:47:44,141 DEBUG yum.verbose.YumBase: Quick matched libreport-2.1.3-3.fc19.x86_64 to require for libreport.so.0()(64bit) >03:47:44,142 DEBUG yum.verbose.YumBase: TSINFO: Marking btparser-0.25-2.fc19.x86_64 as install for abrt-2.1.3-2.fc19.x86_64 
>03:47:44,152 DEBUG yum.verbose.YumBase: TSINFO: Marking libreport-plugin-kerneloops-2.1.3-3.fc19.x86_64 as install for abrt-addon-kerneloops-2.1.3-2.fc19.x86_64 >03:47:44,156 DEBUG yum.verbose.YumBase: TSINFO: Marking crash-6.1.4-1.fc19.x86_64 as install for abrt-addon-vmcore-2.1.3-2.fc19.x86_64 >03:47:44,164 DEBUG yum.verbose.YumBase: TSINFO: Marking abrt-dbus-2.1.3-2.fc19.x86_64 as install for abrt-gui-2.1.3-2.fc19.x86_64 >03:47:44,168 DEBUG yum.verbose.YumBase: TSINFO: Marking xmlrpc-c-client-1.32.5-1901.svn2451.fc19.x86_64 as install for abrt-plugin-bodhi-2.1.3-2.fc19.x86_64 >03:47:44,170 DEBUG yum.verbose.YumBase: TSINFO: Marking xmlrpc-c-1.32.5-1901.svn2451.fc19.x86_64 as install for abrt-plugin-bodhi-2.1.3-2.fc19.x86_64 >03:47:44,171 DEBUG yum.verbose.YumBase: TSINFO: Marking libreport-web-2.1.3-3.fc19.x86_64 as install for abrt-plugin-bodhi-2.1.3-2.fc19.x86_64 >03:47:44,202 DEBUG yum.verbose.YumBase: TSINFO: Marking fxload-2002_04_11-13.fc19.x86_64 as install for alsa-tools-firmware-1.0.27-1.fc19.x86_64 >03:47:44,210 DEBUG yum.verbose.YumBase: TSINFO: Marking libXScrnSaver-1.2.2-5.fc19.x86_64 as install for argyllcms-1.4.0-9.fc19.x86_64 >03:47:44,242 DEBUG yum.verbose.YumBase: TSINFO: Marking 32:bind-license-9.9.3-0.2.rc1.fc19.noarch as install for 32:bind-libs-9.9.3-0.2.rc1.fc19.x86_64 >03:47:44,295 DEBUG yum.verbose.YumBase: TSINFO: Marking p11-kit-trust-0.18.1-1.fc19.x86_64 as install for ca-certificates-2012.87-10.1.fc19.noarch >03:47:44,305 DEBUG yum.verbose.YumBase: TSINFO: Marking mesa-libEGL-9.1.1-1.fc19.x86_64 as install for cairo-1.12.14-1.fc19.x86_64 >03:47:44,320 DEBUG yum.verbose.YumBase: TSINFO: Marking libao-1.1.0-6.fc19.x86_64 as install for cdrdao-1.2.3-19.fc19.x86_64 >03:47:44,321 DEBUG yum.verbose.YumBase: Quick matched libao-1.1.0-6.fc19.x86_64 to require for libao.so.4()(64bit) >03:47:44,359 DEBUG yum.verbose.YumBase: TSINFO: Marking iw-3.8-2.fc19.x86_64 as install for crda-1.1.3_2013.02.13-2.fc19.x86_64 >03:47:44,376 DEBUG 
yum.verbose.YumBase: TSINFO: Marking cups-filters-libs-1.0.31-2.fc19.x86_64 as install for cups-filters-1.0.31-2.fc19.x86_64 >03:47:44,378 DEBUG yum.verbose.YumBase: TSINFO: Marking poppler-utils-0.22.1-1.fc19.x86_64 as install for cups-filters-1.0.31-2.fc19.x86_64 >03:47:44,381 DEBUG yum.verbose.YumBase: TSINFO: Marking qpdf-libs-4.0.1-3.fc19.x86_64 as install for cups-filters-1.0.31-2.fc19.x86_64 >03:47:44,383 DEBUG yum.verbose.YumBase: Quick matched qpdf-libs-4.0.1-3.fc19.x86_64 to require for libqpdf.so.10()(64bit) >03:47:44,384 DEBUG yum.verbose.YumBase: TSINFO: Marking poppler-0.22.1-1.fc19.x86_64 as install for cups-filters-1.0.31-2.fc19.x86_64 >03:47:44,385 DEBUG yum.verbose.YumBase: Quick matched cups-filters-libs-1.0.31-2.fc19.x86_64 to require for libcupsfilters.so.1()(64bit) >03:47:44,419 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:emacs-filesystem-24.2-18.fc19.noarch as install for desktop-file-utils-0.21-2.fc19.x86_64 >03:47:44,436 DEBUG yum.verbose.YumBase: TSINFO: Marking sgpio-1.2.0.10-11.fc19.x86_64 as install for dmraid-events-1.0.0.rc16-20.fc19.x86_64 >03:47:44,437 DEBUG yum.verbose.YumBase: TSINFO: Marking device-mapper-event-1.02.77-8.fc19.x86_64 as install for dmraid-events-1.0.0.rc16-20.fc19.x86_64 >03:47:44,445 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:hardlink-1.0-17.fc19.x86_64 as install for dracut-027-45.git20130430.fc19.x86_64 >03:47:44,452 DEBUG yum.verbose.YumBase: TSINFO: Marking python-boto-2.6.0-3.fc19.noarch as install for duplicity-0.6.21-1.fc19.x86_64 >03:47:44,454 DEBUG yum.verbose.YumBase: TSINFO: Marking python-GnuPGInterface-0.3.2-11.fc19.noarch as install for duplicity-0.6.21-1.fc19.x86_64 >03:47:44,455 DEBUG yum.verbose.YumBase: TSINFO: Marking 2:ncftp-3.2.5-5.fc19.x86_64 as install for duplicity-0.6.21-1.fc19.x86_64 >03:47:44,458 DEBUG yum.verbose.YumBase: TSINFO: Marking gnupg-1.4.13-3.fc19.x86_64 as install for duplicity-0.6.21-1.fc19.x86_64 >03:47:44,460 DEBUG yum.verbose.YumBase: TSINFO: Marking 
librsync-0.9.7-20.fc19.x86_64 as install for duplicity-0.6.21-1.fc19.x86_64 >03:47:44,478 DEBUG yum.verbose.YumBase: TSINFO: Marking elfutils-libs-0.155-5.fc19.x86_64 as install for elfutils-0.155-5.fc19.x86_64 >03:47:44,480 DEBUG yum.verbose.YumBase: Quick matched elfutils-libs-0.155-5.fc19.x86_64 to require for libdw.so.1(ELFUTILS_0.149)(64bit) >03:47:44,481 DEBUG yum.verbose.YumBase: Quick matched elfutils-libs-0.155-5.fc19.x86_64 to require for libdw.so.1(ELFUTILS_0.148)(64bit) >03:47:44,481 DEBUG yum.verbose.YumBase: Quick matched elfutils-libs-0.155-5.fc19.x86_64 to require for libdw.so.1(ELFUTILS_0.138)(64bit) >03:47:44,481 DEBUG yum.verbose.YumBase: Quick matched elfutils-libs-0.155-5.fc19.x86_64 to require for libdw.so.1(ELFUTILS_0.127)(64bit) >03:47:44,482 DEBUG yum.verbose.YumBase: Quick matched elfutils-libs-0.155-5.fc19.x86_64 to require for libdw.so.1(ELFUTILS_0.126)(64bit) >03:47:44,483 DEBUG yum.verbose.YumBase: Quick matched elfutils-libs-0.155-5.fc19.x86_64 to require for libdw.so.1(ELFUTILS_0.122)(64bit) >03:47:44,483 DEBUG yum.verbose.YumBase: Quick matched elfutils-libs-0.155-5.fc19.x86_64 to require for libasm.so.1(ELFUTILS_1.0)(64bit) >03:47:44,484 DEBUG yum.verbose.YumBase: Quick matched elfutils-libs-0.155-5.fc19.x86_64 to require for libdw.so.1()(64bit) >03:47:44,484 DEBUG yum.verbose.YumBase: Quick matched elfutils-libs-0.155-5.fc19.x86_64 to require for libasm.so.1()(64bit) >03:47:44,503 DEBUG yum.verbose.YumBase: TSINFO: Marking libspectre-0.2.7-2.fc19.x86_64 as install for evince-libs-3.8.0-2.fc19.x86_64 >03:47:44,504 DEBUG yum.verbose.YumBase: TSINFO: Marking poppler-glib-0.22.1-1.fc19.x86_64 as install for evince-libs-3.8.0-2.fc19.x86_64 >03:47:44,506 DEBUG yum.verbose.YumBase: TSINFO: Marking libgxps-0.2.2-7.fc19.x86_64 as install for evince-libs-3.8.0-2.fc19.x86_64 >03:47:44,537 DEBUG yum.verbose.YumBase: TSINFO: Marking libnice-0.1.3-2.fc19.x86_64 as install for farstream02-0.2.3-1.fc19.x86_64 >03:47:44,539 DEBUG 
yum.verbose.YumBase: TSINFO: Marking gupnp-igd-0.2.2-1.fc19.x86_64 as install for farstream02-0.2.3-1.fc19.x86_64 >03:47:44,541 DEBUG yum.verbose.YumBase: TSINFO: Marking gupnp-0.20.2-1.fc19.x86_64 as install for farstream02-0.2.3-1.fc19.x86_64 >03:47:44,550 DEBUG yum.verbose.YumBase: TSINFO: Marking fipscheck-1.3.1-3.fc19.x86_64 as install for fipscheck-lib-1.3.1-3.fc19.x86_64 >03:47:44,571 DEBUG yum.verbose.YumBase: TSINFO: Marking libfprint-0.5.0-2.fc19.x86_64 as install for fprintd-0.5.0-1.fc19.x86_64 >03:47:44,613 DEBUG yum.verbose.YumBase: TSINFO: Marking libusal-1.1.11-17.fc19.x86_64 as install for genisoimage-1.1.11-17.fc19.x86_64 >03:47:44,614 DEBUG yum.verbose.YumBase: Quick matched libusal-1.1.11-17.fc19.x86_64 to require for libusal.so.0()(64bit) >03:47:44,614 DEBUG yum.verbose.YumBase: Quick matched libusal-1.1.11-17.fc19.x86_64 to require for librols.so.0()(64bit) >03:47:44,618 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:xorg-x11-font-utils-7.5-13.fc19.x86_64 as install for ghostscript-fonts-5.50-30.fc19.noarch >03:47:44,634 DEBUG yum.verbose.YumBase: TSINFO: Marking tzdata-2013b-2.fc19.noarch as install for glibc-common-2.17-4.fc19.x86_64 >03:47:44,642 DEBUG yum.verbose.YumBase: TSINFO: Marking python-inotify-0.9.4-2.fc19.noarch as install for gnome-abrt-0.2.12-3.fc19.x86_64 >03:47:44,653 DEBUG yum.verbose.YumBase: TSINFO: Marking desktop-backgrounds-gnome-19.0.0-1.fc19.noarch as install for gnome-desktop3-3.8.1-1.fc19.x86_64 >03:47:44,669 DEBUG yum.verbose.YumBase: TSINFO: Marking gnome-python2-desktop-2.32.0-14.fc19.x86_64 as install for gnome-python2-gnomekeyring-2.32.0-14.fc19.x86_64 >03:47:44,680 DEBUG yum.verbose.YumBase: TSINFO: Marking frei0r-plugins-1.3-9.fc19.x86_64 as install for gnome-video-effects-0.4.0-5.fc19.noarch >03:47:44,687 DEBUG yum.verbose.YumBase: TSINFO: Marking nettle-2.6-2.fc19.x86_64 as install for gnutls-3.1.10-1.fc19.x86_64 >03:47:44,688 DEBUG yum.verbose.YumBase: Quick matched nettle-2.6-2.fc19.x86_64 to require for 
libhogweed.so.2()(64bit) >03:47:44,708 DEBUG yum.verbose.YumBase: TSINFO: Marking gupnp-av-0.12.1-1.fc19.x86_64 as install for grilo-plugins-0.2.6-1.fc19.x86_64 >03:47:44,710 DEBUG yum.verbose.YumBase: TSINFO: Marking libquvi-0.4.1-3.fc19.x86_64 as install for grilo-plugins-0.2.6-1.fc19.x86_64 >03:47:44,712 DEBUG yum.verbose.YumBase: TSINFO: Marking gssdp-0.14.2-1.fc19.x86_64 as install for grilo-plugins-0.2.6-1.fc19.x86_64 >03:47:44,714 DEBUG yum.verbose.YumBase: TSINFO: Marking gmime-2.6.15-1.fc19.x86_64 as install for grilo-plugins-0.2.6-1.fc19.x86_64 >03:47:44,726 DEBUG yum.verbose.YumBase: TSINFO: Marking gstreamer-tools-0.10.36-3.fc19.x86_64 as install for gstreamer-0.10.36-3.fc19.x86_64 >03:47:44,736 DEBUG yum.verbose.YumBase: TSINFO: Marking libvisual-0.4.0-13.fc19.x86_64 as install for gstreamer-plugins-base-0.10.36-4.fc19.x86_64 >03:47:44,738 DEBUG yum.verbose.YumBase: TSINFO: Marking cdparanoia-libs-10.2-13.fc19.x86_64 as install for gstreamer-plugins-base-0.10.36-4.fc19.x86_64 >03:47:44,739 DEBUG yum.verbose.YumBase: Quick matched cdparanoia-libs-10.2-13.fc19.x86_64 to require for libcdda_interface.so.0()(64bit) >03:47:44,793 DEBUG yum.verbose.YumBase: TSINFO: Marking libxkbcommon-0.3.0-1.fc19.x86_64 as install for gtk3-3.8.1-1.fc19.x86_64 >03:47:44,795 DEBUG yum.verbose.YumBase: TSINFO: Marking libwayland-cursor-1.1.0-1.fc19.x86_64 as install for gtk3-3.8.1-1.fc19.x86_64 >03:47:44,796 DEBUG yum.verbose.YumBase: TSINFO: Marking libwayland-client-1.1.0-1.fc19.x86_64 as install for gtk3-3.8.1-1.fc19.x86_64 >03:47:44,819 DEBUG yum.verbose.YumBase: TSINFO: Marking libunistring-0.9.3-7.fc19.x86_64 as install for 5:guile-2.0.9-1.fc19.x86_64 >03:47:44,833 DEBUG yum.verbose.YumBase: TSINFO: Marking libcdio-paranoia-10.2+0.90-7.fc19.x86_64 as install for gvfs-1.16.1-1.fc19.x86_64 >03:47:44,835 DEBUG yum.verbose.YumBase: Quick matched libcdio-paranoia-10.2+0.90-7.fc19.x86_64 to require for libcdio_cdda.so.1(CDIO_CDDA_1)(64bit) >03:47:44,836 DEBUG 
yum.verbose.YumBase: TSINFO: Marking libcdio-0.90-2.fc19.x86_64 as install for gvfs-1.16.1-1.fc19.x86_64 >03:47:44,838 DEBUG yum.verbose.YumBase: Quick matched libcdio-paranoia-10.2+0.90-7.fc19.x86_64 to require for libcdio_cdda.so.1()(64bit) >03:47:44,843 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:obex-data-server-0.4.6-5.fc19.x86_64 as install for gvfs-obexftp-1.16.1-1.fc19.x86_64 >03:47:44,864 DEBUG yum.verbose.YumBase: TSINFO: Marking hplip-common-3.13.4-1.fc19.x86_64 as install for hplip-libs-3.13.4-1.fc19.x86_64 >03:47:44,882 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:vorbis-tools-1.4.0-8.fc19.x86_64 as install for icedax-1.1.11-17.fc19.x86_64 >03:47:44,885 DEBUG yum.verbose.YumBase: TSINFO: Marking cdparanoia-10.2-13.fc19.x86_64 as install for icedax-1.1.11-17.fc19.x86_64 >03:47:44,896 DEBUG yum.verbose.YumBase: TSINFO: Marking xml-common-0.6.3-39.fc19.noarch as install for iso-codes-3.41-2.fc19.noarch >03:47:44,911 DEBUG yum.verbose.YumBase: TSINFO: Marking jpackage-utils-1.7.5-26.fc19.noarch as install for 1:java-1.7.0-openjdk-1.7.0.19-2.3.9.6.fc19.x86_64 >03:47:44,913 DEBUG yum.verbose.YumBase: TSINFO: Marking xorg-x11-fonts-Type1-7.5-8.fc19.noarch as install for 1:java-1.7.0-openjdk-1.7.0.19-2.3.9.6.fc19.x86_64 >03:47:44,916 DEBUG yum.verbose.YumBase: TSINFO: Marking tzdata-java-2013b-2.fc19.noarch as install for 1:java-1.7.0-openjdk-1.7.0.19-2.3.9.6.fc19.x86_64 >03:47:44,917 DEBUG yum.verbose.YumBase: TSINFO: Marking rhino-1.7R4-2.fc19.noarch as install for 1:java-1.7.0-openjdk-1.7.0.19-2.3.9.6.fc19.x86_64 >03:47:44,919 DEBUG yum.verbose.YumBase: TSINFO: Marking giflib-4.1.6-7.fc19.x86_64 as install for 1:java-1.7.0-openjdk-1.7.0.19-2.3.9.6.fc19.x86_64 >03:47:44,945 DEBUG yum.verbose.YumBase: TSINFO: Marking libverto-0.2.5-2.fc19.x86_64 as install for krb5-libs-1.11.2-2.fc19.x86_64 >03:47:44,955 DEBUG yum.verbose.YumBase: TSINFO: Marking libX11-common-1.5.99.901-2.fc19.noarch as install for libX11-1.5.99.901-2.fc19.x86_64 >03:47:44,975 DEBUG 
yum.verbose.YumBase: TSINFO: Marking libfontenc-1.1.1-3.fc19.x86_64 as install for libXfont-1.4.5-4.fc19.x86_64 >03:47:45,038 DEBUG yum.verbose.YumBase: TSINFO: Marking lzo-2.06-4.fc19.x86_64 as install for libarchive-3.1.2-2.fc19.x86_64 >03:47:45,123 DEBUG yum.verbose.YumBase: TSINFO: Marking sound-theme-freedesktop-0.8-2.fc19.noarch as install for libcanberra-0.30-3.fc19.x86_64 >03:47:45,185 DEBUG yum.verbose.YumBase: TSINFO: Marking libssh2-1.4.3-4.fc19.x86_64 as install for libcurl-7.29.0-6.fc19.x86_64 >03:47:45,190 DEBUG yum.verbose.YumBase: Quick matched libssh2-1.4.3-4.fc19.x86_64 to require for libssh2.so.1()(64bit) >03:47:45,281 DEBUG yum.verbose.YumBase: TSINFO: Marking liboauth-0.9.7-2.fc19.x86_64 as install for libgdata-0.13.3-1.fc19.x86_64 >03:47:45,335 DEBUG yum.verbose.YumBase: TSINFO: Marking gd-2.0.35-24.fc19.x86_64 as install for libgphoto2-2.5.1.1-1.fc19.x86_64 >03:47:45,356 DEBUG yum.verbose.YumBase: TSINFO: Marking sg3_utils-libs-1.35-2.fc19.x86_64 as install for libgpod-0.8.2-9.fc19.x86_64 >03:47:45,411 DEBUG yum.verbose.YumBase: TSINFO: Marking libref_array-0.1.3-16.fc19.x86_64 as install for libini_config-1.0.0.1-16.fc19.x86_64 >03:47:45,414 DEBUG yum.verbose.YumBase: TSINFO: Marking libpath_utils-0.2.1-16.fc19.x86_64 as install for libini_config-1.0.0.1-16.fc19.x86_64 >03:47:45,415 DEBUG yum.verbose.YumBase: TSINFO: Marking libbasicobjects-0.1.0-16.fc19.x86_64 as install for libini_config-1.0.0.1-16.fc19.x86_64 >03:47:45,427 DEBUG yum.verbose.YumBase: TSINFO: Marking libkkc-data-0.2.1-1.fc19.x86_64 as install for libkkc-0.2.1-1.fc19.x86_64 >03:47:45,429 DEBUG yum.verbose.YumBase: TSINFO: Marking libkkc-common-0.2.1-1.fc19.noarch as install for libkkc-0.2.1-1.fc19.x86_64 >03:47:45,431 DEBUG yum.verbose.YumBase: TSINFO: Marking skkdic-20130104-5.T1435.fc19.noarch as install for libkkc-0.2.1-1.fc19.x86_64 >03:47:45,433 DEBUG yum.verbose.YumBase: TSINFO: Marking marisa-0.2.2-2.fc19.x86_64 as install for libkkc-0.2.1-1.fc19.x86_64 >03:47:45,442 
DEBUG yum.verbose.YumBase: TSINFO: Marking liblouis-2.5.2-3.fc19.x86_64 as install for liblouis-python3-2.5.2-3.fc19.noarch >03:47:45,461 DEBUG yum.verbose.YumBase: TSINFO: Marking neon-0.29.6-6.fc19.x86_64 as install for libmusicbrainz5-5.0.1-5.fc19.x86_64 >03:47:45,472 DEBUG yum.verbose.YumBase: TSINFO: Marking libmnl-1.0.3-5.fc19.x86_64 as install for libnetfilter_conntrack-1.0.3-1.fc19.x86_64 >03:47:45,473 DEBUG yum.verbose.YumBase: Quick matched libmnl-1.0.3-5.fc19.x86_64 to require for libmnl.so.0(LIBMNL_1.0)(64bit) >03:47:45,474 DEBUG yum.verbose.YumBase: TSINFO: Marking libnfnetlink-1.0.1-2.fc19.x86_64 as install for libnetfilter_conntrack-1.0.3-1.fc19.x86_64 >03:47:45,490 DEBUG yum.verbose.YumBase: TSINFO: Marking mobile-broadband-provider-info-1.20120614-3.fc19.noarch as install for libnm-gtk-0.9.8.1-2.git20130327.fc19.x86_64 >03:47:45,498 DEBUG yum.verbose.YumBase: TSINFO: Marking fftw-libs-double-3.3.3-5.fc19.x86_64 as install for libofa-0.9.3-22.fc19.x86_64 >03:47:45,507 DEBUG yum.verbose.YumBase: TSINFO: Marking libxslt-1.1.28-2.fc19.x86_64 as install for libosinfo-0.2.6-1.fc19.x86_64 >03:47:45,509 DEBUG yum.verbose.YumBase: Quick matched libxslt-1.1.28-2.fc19.x86_64 to require for libxslt.so.1(LIBXML2_1.0.11)(64bit) >03:47:45,509 DEBUG yum.verbose.YumBase: Quick matched libxslt-1.1.28-2.fc19.x86_64 to require for libxslt.so.1()(64bit) >03:47:45,521 DEBUG yum.verbose.YumBase: TSINFO: Marking seed-3.8.1-1.fc19.x86_64 as install for libpeas-1.8.0-1.fc19.x86_64 >03:47:45,524 DEBUG yum.verbose.YumBase: TSINFO: Marking python3-libs-3.3.1-2.fc19.x86_64 as install for libpeas-1.8.0-1.fc19.x86_64 >03:47:45,542 DEBUG yum.verbose.YumBase: TSINFO: Marking libmodman-2.0.1-6.fc19.x86_64 as install for libproxy-0.4.11-3.fc19.x86_64 >03:47:45,546 DEBUG yum.verbose.YumBase: TSINFO: Marking cracklib-dicts-2.8.22-3.fc19.x86_64 as install for libpwquality-1.2.1-2.fc19.x86_64 >03:47:45,548 DEBUG yum.verbose.YumBase: TSINFO: Marking cracklib-2.8.22-3.fc19.x86_64 as 
install for libpwquality-1.2.1-2.fc19.x86_64 >03:47:45,556 DEBUG yum.verbose.YumBase: TSINFO: Marking libreport-plugin-reportuploader-2.1.3-3.fc19.x86_64 as install for libreport-gtk-2.1.3-3.fc19.x86_64 >03:47:45,558 DEBUG yum.verbose.YumBase: TSINFO: Marking recordmydesktop-0.3.8.1-10.fc19.x86_64 as install for libreport-gtk-2.1.3-3.fc19.x86_64 >03:47:45,582 DEBUG yum.verbose.YumBase: TSINFO: Marking ustr-1.0.4-13.fc18.x86_64 as install for libsemanage-2.1.10-4.fc19.x86_64 >03:47:45,583 DEBUG yum.verbose.YumBase: Quick matched ustr-1.0.4-13.fc18.x86_64 to require for libustr-1.0.so.1(USTR_1.0)(64bit) >03:47:45,584 DEBUG yum.verbose.YumBase: Quick matched ustr-1.0.4-13.fc18.x86_64 to require for libustr-1.0.so.1()(64bit) >03:47:45,636 DEBUG yum.verbose.YumBase: TSINFO: Marking jbigkit-libs-2.0-8.fc19.x86_64 as install for libtiff-4.0.3-6.fc19.x86_64 >03:47:45,647 DEBUG yum.verbose.YumBase: TSINFO: Marking libtranslit-0.0.2-4.fc19.x86_64 as install for libtranslit-m17n-0.0.2-4.fc19.x86_64 >03:47:45,684 DEBUG yum.verbose.YumBase: TSINFO: Marking gnutls-utils-3.1.10-1.fc19.x86_64 as install for libvirt-client-1.0.5-2.fc19.x86_64 >03:47:45,686 DEBUG yum.verbose.YumBase: TSINFO: Marking gettext-0.18.2.1-1.fc19.x86_64 as install for libvirt-client-1.0.5-2.fc19.x86_64 >03:47:45,689 DEBUG yum.verbose.YumBase: TSINFO: Marking cyrus-sasl-md5-2.1.26-6.fc19.x86_64 as install for libvirt-client-1.0.5-2.fc19.x86_64 >03:47:45,691 DEBUG yum.verbose.YumBase: TSINFO: Marking yajl-2.0.4-2.fc19.x86_64 as install for libvirt-client-1.0.5-2.fc19.x86_64 >03:47:45,693 DEBUG yum.verbose.YumBase: TSINFO: Marking libwsman1-2.3.6-6.fc19.x86_64 as install for libvirt-client-1.0.5-2.fc19.x86_64 >03:47:45,694 DEBUG yum.verbose.YumBase: Quick matched libwsman1-2.3.6-6.fc19.x86_64 to require for libwsman_client.so.1()(64bit) >03:47:45,695 DEBUG yum.verbose.YumBase: Quick matched libwsman1-2.3.6-6.fc19.x86_64 to require for libwsman.so.1()(64bit) >03:47:45,703 DEBUG yum.verbose.YumBase: TSINFO: 
Marking libvirt-daemon-driver-storage-1.0.5-2.fc19.x86_64 as install for libvirt-daemon-kvm-1.0.5-2.fc19.x86_64 >03:47:45,705 DEBUG yum.verbose.YumBase: TSINFO: Marking libvirt-daemon-driver-secret-1.0.5-2.fc19.x86_64 as install for libvirt-daemon-kvm-1.0.5-2.fc19.x86_64 >03:47:45,707 DEBUG yum.verbose.YumBase: TSINFO: Marking libvirt-daemon-driver-qemu-1.0.5-2.fc19.x86_64 as install for libvirt-daemon-kvm-1.0.5-2.fc19.x86_64 >03:47:45,709 DEBUG yum.verbose.YumBase: TSINFO: Marking libvirt-daemon-driver-nwfilter-1.0.5-2.fc19.x86_64 as install for libvirt-daemon-kvm-1.0.5-2.fc19.x86_64 >03:47:45,711 DEBUG yum.verbose.YumBase: TSINFO: Marking libvirt-daemon-driver-nodedev-1.0.5-2.fc19.x86_64 as install for libvirt-daemon-kvm-1.0.5-2.fc19.x86_64 >03:47:45,717 DEBUG yum.verbose.YumBase: TSINFO: Marking libvirt-daemon-driver-network-1.0.5-2.fc19.x86_64 as install for libvirt-daemon-kvm-1.0.5-2.fc19.x86_64 >03:47:45,721 DEBUG yum.verbose.YumBase: TSINFO: Marking libvirt-daemon-driver-interface-1.0.5-2.fc19.x86_64 as install for libvirt-daemon-kvm-1.0.5-2.fc19.x86_64 >03:47:45,723 DEBUG yum.verbose.YumBase: TSINFO: Marking libvirt-daemon-1.0.5-2.fc19.x86_64 as install for libvirt-daemon-kvm-1.0.5-2.fc19.x86_64 >03:47:45,728 DEBUG yum.verbose.YumBase: TSINFO: Marking 2:qemu-kvm-1.4.1-1.fc19.x86_64 as install for libvirt-daemon-kvm-1.0.5-2.fc19.x86_64 >03:47:45,741 DEBUG yum.verbose.YumBase: TSINFO: Marking libvirt-glib-0.1.6-1.fc19.x86_64 as install for libvirt-gobject-0.1.6-1.fc19.x86_64 >03:47:45,742 DEBUG yum.verbose.YumBase: Quick matched libvirt-glib-0.1.6-1.fc19.x86_64 to require for libvirt-glib-1.0.so.0(LIBVIRT_GLIB_0.0.7)(64bit) >03:47:45,743 DEBUG yum.verbose.YumBase: Quick matched libvirt-glib-0.1.6-1.fc19.x86_64 to require for libvirt-glib-1.0.so.0()(64bit) >03:47:45,752 DEBUG yum.verbose.YumBase: TSINFO: Marking libwacom-data-0.7.1-2.fc19.noarch as install for libwacom-0.7.1-2.fc19.x86_64 >03:47:45,760 DEBUG yum.verbose.YumBase: TSINFO: Marking 
libXres-1.0.6-4.fc19.x86_64 as install for libwnck3-3.4.5-1.fc19.x86_64 >03:47:45,821 DEBUG yum.verbose.YumBase: TSINFO: Marking mesa-libglapi-9.1.1-1.fc19.x86_64 as install for mesa-libGL-9.1.1-1.fc19.x86_64 >03:47:45,875 DEBUG yum.verbose.YumBase: TSINFO: Marking ncurses-base-5.9-10.20130413.fc19.noarch as install for ncurses-libs-5.9-10.20130413.fc19.x86_64 >03:47:45,894 DEBUG yum.verbose.YumBase: TSINFO: Marking slang-2.2.4-8.fc19.x86_64 as install for newt-0.52.15-1.fc19.x86_64 >03:47:45,898 DEBUG yum.verbose.YumBase: Quick matched slang-2.2.4-8.fc19.x86_64 to require for libslang.so.2()(64bit) >03:47:45,955 DEBUG yum.verbose.YumBase: TSINFO: Marking nss-sysinit-3.14.3-12.0.fc19.x86_64 as install for nss-3.14.3-12.0.fc19.x86_64 >03:47:46,042 DEBUG yum.verbose.YumBase: TSINFO: Marking vpnc-script-0.5.3-17.svn457.fc19.noarch as install for openconnect-4.99-1.fc19.x86_64 >03:47:46,046 DEBUG yum.verbose.YumBase: TSINFO: Marking trousers-0.3.10-2.fc19.x86_64 as install for openconnect-4.99-1.fc19.x86_64 >03:47:46,098 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:make-3.82-15.fc19.x86_64 as install for 1:openssl-1.0.1e-4.fc19.x86_64 >03:47:46,219 DEBUG yum.verbose.YumBase: TSINFO: Marking libthai-0.1.14-7.fc19.x86_64 as install for pango-1.34.0-1.fc19.x86_64 >03:47:46,229 DEBUG yum.verbose.YumBase: Quick matched libthai-0.1.14-7.fc19.x86_64 to require for libthai.so.0(LIBTHAI_0.1)(64bit) >03:47:46,232 DEBUG yum.verbose.YumBase: Quick matched libthai-0.1.14-7.fc19.x86_64 to require for libthai.so.0()(64bit) >03:47:46,236 DEBUG yum.verbose.YumBase: TSINFO: Marking harfbuzz-0.9.16-1.fc19.x86_64 as install for pango-1.34.0-1.fc19.x86_64 >03:47:46,244 DEBUG yum.verbose.YumBase: TSINFO: Marking libXft-2.3.1-4.fc19.x86_64 as install for pango-1.34.0-1.fc19.x86_64 >03:47:46,332 DEBUG yum.verbose.YumBase: TSINFO: Marking 4:perl-libs-5.16.3-262.fc19.x86_64 as install for 4:perl-5.16.3-262.fc19.x86_64 >03:47:46,340 DEBUG yum.verbose.YumBase: TSINFO: Marking 
perl-Socket-2.009-2.fc19.x86_64 as install for 4:perl-5.16.3-262.fc19.x86_64 >03:47:46,346 DEBUG yum.verbose.YumBase: TSINFO: Marking perl-Scalar-List-Utils-1.27-246.fc19.x86_64 as install for 4:perl-5.16.3-262.fc19.x86_64 >03:47:46,348 DEBUG yum.verbose.YumBase: TSINFO: Marking 4:perl-macros-5.16.3-262.fc19.x86_64 as install for 4:perl-5.16.3-262.fc19.x86_64 >03:47:46,350 DEBUG yum.verbose.YumBase: TSINFO: Marking perl-threads-shared-1.43-2.fc19.x86_64 as install for 4:perl-5.16.3-262.fc19.x86_64 >03:47:46,353 DEBUG yum.verbose.YumBase: TSINFO: Marking perl-threads-1.86-243.fc19.x86_64 as install for 4:perl-5.16.3-262.fc19.x86_64 >03:47:46,358 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:perl-Pod-Simple-3.20-262.fc19.noarch as install for 4:perl-5.16.3-262.fc19.x86_64 >03:47:46,362 DEBUG yum.verbose.YumBase: Quick matched 1:perl-Pod-Simple-3.20-262.fc19.noarch to require for perl(Pod::Simple::Search) >03:47:46,366 DEBUG yum.verbose.YumBase: TSINFO: Marking perl-Filter-1.49-1.fc19.x86_64 as install for 4:perl-5.16.3-262.fc19.x86_64 >03:47:46,373 DEBUG yum.verbose.YumBase: TSINFO: Marking perl-Carp-1.26-243.fc19.noarch as install for 4:perl-5.16.3-262.fc19.x86_64 >03:47:46,418 DEBUG yum.verbose.YumBase: TSINFO: Marking plymouth-plugin-two-step-0.8.9-0.2013.03.26.0.fc19.x86_64 as install for plymouth-theme-charge-0.8.9-0.2013.03.26.0.fc19.x86_64 >03:47:46,469 DEBUG yum.verbose.YumBase: TSINFO: Marking sbc-1.0-3.fc19.x86_64 as install for pulseaudio-module-bluetooth-3.0-7.fc19.x86_64 >03:47:46,470 DEBUG yum.verbose.YumBase: Quick matched sbc-1.0-3.fc19.x86_64 to require for libsbc.so.1()(64bit) >03:47:46,478 DEBUG yum.verbose.YumBase: TSINFO: Marking pycairo-1.8.10-6.fc19.x86_64 as install for pygobject3-3.8.1-2.fc19.x86_64 >03:47:46,492 DEBUG yum.verbose.YumBase: TSINFO: Marking libglade2-2.6.4-9.fc19.x86_64 as install for pygtk2-libglade-2.24.0-7.fc19.x86_64 >03:47:46,502 DEBUG yum.verbose.YumBase: TSINFO: Marking pyatspi-2.8.0-1.fc19.noarch as install for 
python-caribou-0.4.10-1.fc19.noarch >03:47:46,504 DEBUG yum.verbose.YumBase: TSINFO: Marking python-chardet-2.0.1-6.fc19.noarch as install for python-kitchen-1.1.1-3.fc19.noarch >03:47:46,517 DEBUG yum.verbose.YumBase: TSINFO: Marking python-markupsafe-0.11-8.fc19.x86_64 as install for python-mako-0.7.3-1.fc19.noarch >03:47:46,519 DEBUG yum.verbose.YumBase: TSINFO: Marking python-beaker-1.5.4-7.fc19.noarch as install for python-mako-0.7.3-1.fc19.noarch >03:47:46,525 DEBUG yum.verbose.YumBase: TSINFO: Marking libwebp-0.2.1-3.fc19.x86_64 as install for python-pillow-2.0.0-7.gitd1c6db8.fc19.x86_64 >03:47:46,527 DEBUG yum.verbose.YumBase: TSINFO: Marking lcms-libs-1.19-9.fc19.x86_64 as install for python-pillow-2.0.0-7.gitd1c6db8.fc19.x86_64 >03:47:46,530 DEBUG yum.verbose.YumBase: TSINFO: Marking python-slip-0.4.0-1.fc19.noarch as install for python-slip-dbus-0.4.0-1.fc19.noarch >03:47:46,532 DEBUG yum.verbose.YumBase: TSINFO: Marking python-pycurl-7.19.0-15.1.fc19.x86_64 as install for python-urlgrabber-3.9.1-26.fc19.noarch >03:47:46,541 DEBUG yum.verbose.YumBase: TSINFO: Marking brlapi-0.6.0-2.fc19.x86_64 as install for python3-brlapi-0.6.0-2.fc19.x86_64 >03:47:46,542 DEBUG yum.verbose.YumBase: Quick matched brlapi-0.6.0-2.fc19.x86_64 to require for libbrlapi.so.0.6()(64bit) >03:47:46,550 DEBUG yum.verbose.YumBase: TSINFO: Marking speech-dispatcher-0.7.1-11.fc19.x86_64 as install for python3-speechd-0.7.1-11.fc19.x86_64 >03:47:46,553 DEBUG yum.verbose.YumBase: TSINFO: Marking python3-pyxdg-0.25-1.fc19.noarch as install for python3-speechd-0.7.1-11.fc19.x86_64 >03:47:46,585 DEBUG yum.verbose.YumBase: TSINFO: Marking rpm-build-libs-4.11.0.1-1.fc19.x86_64 as install for rpm-python-4.11.0.1-1.fc19.x86_64 >03:47:46,587 DEBUG yum.verbose.YumBase: Quick matched rpm-build-libs-4.11.0.1-1.fc19.x86_64 to require for librpmbuild.so.3()(64bit) >03:47:46,645 DEBUG yum.verbose.YumBase: TSINFO: Marking pytalloc-2.0.8-2.fc19.x86_64 as install for 2:samba-libs-4.0.5-1.fc19.x86_64 
>03:47:46,646 DEBUG yum.verbose.YumBase: Quick matched pytalloc-2.0.8-2.fc19.x86_64 to require for libpytalloc-util.so.2()(64bit) >03:47:46,664 DEBUG yum.verbose.YumBase: TSINFO: Marking setroubleshoot-plugins-3.0.50-1.fc19.noarch as install for setroubleshoot-server-3.2.8-1.fc19.x86_64 >03:47:46,666 DEBUG yum.verbose.YumBase: TSINFO: Marking libselinux-python-2.1.13-12.fc19.x86_64 as install for setroubleshoot-server-3.2.8-1.fc19.x86_64 >03:47:46,668 DEBUG yum.verbose.YumBase: TSINFO: Marking audit-libs-python-2.3-2.fc19.x86_64 as install for setroubleshoot-server-3.2.8-1.fc19.x86_64 >03:47:46,670 DEBUG yum.verbose.YumBase: TSINFO: Marking systemd-python-203-2.fc19.x86_64 as install for setroubleshoot-server-3.2.8-1.fc19.x86_64 >03:47:46,673 DEBUG yum.verbose.YumBase: TSINFO: Marking policycoreutils-devel-2.1.14-37.fc19.x86_64 as install for setroubleshoot-server-3.2.8-1.fc19.x86_64 >03:47:46,680 DEBUG yum.verbose.YumBase: TSINFO: Marking compat-readline5-5.2-21.fc19.x86_64 as install for socat-1.7.2.1-3.fc19.x86_64 >03:47:46,696 DEBUG yum.verbose.YumBase: TSINFO: Marking usbredir-0.6-2.fc19.x86_64 as install for spice-glib-0.19-1.fc19.x86_64 >03:47:46,697 DEBUG yum.verbose.YumBase: Quick matched usbredir-0.6-2.fc19.x86_64 to require for libusbredirhost.so.1()(64bit) >03:47:46,698 DEBUG yum.verbose.YumBase: TSINFO: Marking celt051-0.5.1.3-6.fc19.x86_64 as install for spice-glib-0.19-1.fc19.x86_64 >03:47:46,700 DEBUG yum.verbose.YumBase: TSINFO: Marking 2:libcacard-1.4.1-1.fc19.x86_64 as install for spice-glib-0.19-1.fc19.x86_64 >03:47:46,721 DEBUG yum.verbose.YumBase: TSINFO: Marking python-cups-1.9.63-1.fc19.x86_64 as install for system-config-printer-libs-1.4.0-1.fc19.noarch >03:47:46,780 DEBUG yum.verbose.YumBase: TSINFO: Marking libpurple-2.10.7-2.fc19.x86_64 as install for telepathy-haze-0.7.0-3.fc19.x86_64 >03:47:46,837 DEBUG yum.verbose.YumBase: TSINFO: Marking libiptcdata-1.0.4-9.fc19.x86_64 as install for tracker-0.16.1-1.fc19.x86_64 >03:47:46,839 DEBUG 
yum.verbose.YumBase: TSINFO: Marking libgsf-1.14.26-4.fc19.x86_64 as install for tracker-0.16.1-1.fc19.x86_64 >03:47:46,841 DEBUG yum.verbose.YumBase: TSINFO: Marking enca-1.14-1.fc19.x86_64 as install for tracker-0.16.1-1.fc19.x86_64 >03:47:46,843 DEBUG yum.verbose.YumBase: TSINFO: Marking libcue-1.3.0-6.fc19.x86_64 as install for tracker-0.16.1-1.fc19.x86_64 >03:47:46,855 DEBUG yum.verbose.YumBase: TSINFO: Marking libatasmart-0.19-4.fc19.x86_64 as install for udisks2-2.1.0-2.fc19.x86_64 >03:47:46,857 DEBUG yum.verbose.YumBase: TSINFO: Marking xfsprogs-3.1.10-2.fc19.x86_64 as install for udisks2-2.1.0-2.fc19.x86_64 >03:47:46,859 DEBUG yum.verbose.YumBase: TSINFO: Marking gdisk-0.8.6-1.fc19.x86_64 as install for udisks2-2.1.0-2.fc19.x86_64 >03:47:46,865 DEBUG yum.verbose.YumBase: TSINFO: Marking libobjc-4.8.0-2.fc19.x86_64 as install for unar-1.6-4.fc19.x86_64 >03:47:46,867 DEBUG yum.verbose.YumBase: TSINFO: Marking libicu-50.1.2-5.fc19.x86_64 as install for unar-1.6-4.fc19.x86_64 >03:47:46,869 DEBUG yum.verbose.YumBase: TSINFO: Marking gnustep-base-1.24.4-1.fc19.x86_64 as install for unar-1.6-4.fc19.x86_64 >03:47:46,923 DEBUG yum.verbose.YumBase: TSINFO: Marking kernel-modules-extra-3.9.0-301.fc19.x86_64 as install for xl2tpd-1.3.1-13.fc19.x86_64 >03:47:46,932 DEBUG yum.verbose.YumBase: TSINFO: Marking mcpp-2.7.2-9.fc19.x86_64 as install for xorg-x11-server-utils-7.7-1.fc19.x86_64 >03:47:46,990 DEBUG yum.verbose.YumBase: TSINFO: Marking brltty-4.5-2.fc19.x86_64 as install for brlapi-0.6.0-2.fc19.x86_64 >03:47:46,999 DEBUG yum.verbose.YumBase: TSINFO: Marking binutils-2.23.52.0.1-8.fc19.x86_64 as install for btparser-0.25-2.fc19.x86_64 >03:47:47,026 DEBUG yum.verbose.YumBase: TSINFO: Marking schroedinger-cat-backgrounds-gnome-18.90.0-1.fc19.noarch as install for desktop-backgrounds-gnome-19.0.0-1.fc19.noarch >03:47:47,055 DEBUG yum.verbose.YumBase: TSINFO: Marking opencv-2.4.4-2.fc19.x86_64 as install for frei0r-plugins-1.3-9.fc19.x86_64 >03:47:47,057 DEBUG 
yum.verbose.YumBase: Quick matched opencv-2.4.4-2.fc19.x86_64 to require for libopencv_video.so.2.4()(64bit) >03:47:47,058 DEBUG yum.verbose.YumBase: Quick matched opencv-2.4.4-2.fc19.x86_64 to require for libopencv_ts.so.2.4()(64bit) >03:47:47,058 DEBUG yum.verbose.YumBase: Quick matched opencv-2.4.4-2.fc19.x86_64 to require for libopencv_stitching.so.2.4()(64bit) >03:47:47,059 DEBUG yum.verbose.YumBase: Quick matched opencv-2.4.4-2.fc19.x86_64 to require for libopencv_photo.so.2.4()(64bit) >03:47:47,059 DEBUG yum.verbose.YumBase: Quick matched opencv-2.4.4-2.fc19.x86_64 to require for libopencv_objdetect.so.2.4()(64bit) >03:47:47,060 DEBUG yum.verbose.YumBase: Quick matched opencv-2.4.4-2.fc19.x86_64 to require for libopencv_ml.so.2.4()(64bit) >03:47:47,060 DEBUG yum.verbose.YumBase: Quick matched opencv-2.4.4-2.fc19.x86_64 to require for libopencv_legacy.so.2.4()(64bit) >03:47:47,061 DEBUG yum.verbose.YumBase: Quick matched opencv-2.4.4-2.fc19.x86_64 to require for libopencv_imgproc.so.2.4()(64bit) >03:47:47,061 DEBUG yum.verbose.YumBase: Quick matched opencv-2.4.4-2.fc19.x86_64 to require for libopencv_highgui.so.2.4()(64bit) >03:47:47,062 DEBUG yum.verbose.YumBase: Quick matched opencv-2.4.4-2.fc19.x86_64 to require for libopencv_flann.so.2.4()(64bit) >03:47:47,062 DEBUG yum.verbose.YumBase: Quick matched opencv-2.4.4-2.fc19.x86_64 to require for libopencv_features2d.so.2.4()(64bit) >03:47:47,063 DEBUG yum.verbose.YumBase: Quick matched opencv-2.4.4-2.fc19.x86_64 to require for libopencv_core.so.2.4()(64bit) >03:47:47,063 DEBUG yum.verbose.YumBase: Quick matched opencv-2.4.4-2.fc19.x86_64 to require for libopencv_contrib.so.2.4()(64bit) >03:47:47,064 DEBUG yum.verbose.YumBase: Quick matched opencv-2.4.4-2.fc19.x86_64 to require for libopencv_calib3d.so.2.4()(64bit) >03:47:47,065 DEBUG yum.verbose.YumBase: TSINFO: Marking gavl-1.4.0-2.fc19.x86_64 as install for frei0r-plugins-1.3-9.fc19.x86_64 >03:47:47,072 DEBUG yum.verbose.YumBase: TSINFO: Marking 
libXpm-3.5.10-4.fc19.x86_64 as install for gd-2.0.35-24.fc19.x86_64 >03:47:47,085 DEBUG yum.verbose.YumBase: TSINFO: Marking gettext-libs-0.18.2.1-1.fc19.x86_64 as install for gettext-0.18.2.1-1.fc19.x86_64 >03:47:47,086 DEBUG yum.verbose.YumBase: Quick matched gettext-libs-0.18.2.1-1.fc19.x86_64 to require for libgettextlib-0.18.2.so()(64bit) >03:47:47,092 DEBUG yum.verbose.YumBase: TSINFO: Marking gnome-python2-canvas-2.28.1-12.fc19.x86_64 as install for gnome-python2-desktop-2.32.0-14.fc19.x86_64 >03:47:47,109 DEBUG yum.verbose.YumBase: TSINFO: Marking gnustep-make-2.6.4-1.fc19.x86_64 as install for gnustep-base-1.24.4-1.fc19.x86_64 >03:47:47,117 DEBUG yum.verbose.YumBase: TSINFO: Marking gnutls-dane-3.1.10-1.fc19.x86_64 as install for gnutls-utils-3.1.10-1.fc19.x86_64 >03:47:47,118 DEBUG yum.verbose.YumBase: Quick matched gnutls-dane-3.1.10-1.fc19.x86_64 to require for libgnutls-dane.so.0(DANE_0_0)(64bit) >03:47:47,119 DEBUG yum.verbose.YumBase: Quick matched gnutls-dane-3.1.10-1.fc19.x86_64 to require for libgnutls-dane.so.0()(64bit) >03:47:47,137 DEBUG yum.verbose.YumBase: TSINFO: Marking graphite2-1.2.0-4.fc19.x86_64 as install for harfbuzz-0.9.16-1.fc19.x86_64 >03:47:47,142 DEBUG yum.verbose.YumBase: TSINFO: Marking javapackages-tools-0.14.0-1.fc19.noarch as install for jpackage-utils-1.7.5-26.fc19.noarch >03:47:47,188 DEBUG yum.verbose.YumBase: TSINFO: Marking ceph-libs-0.56.4-1.fc19.x86_64 as install for 2:libcacard-1.4.1-1.fc19.x86_64 >03:47:47,189 DEBUG yum.verbose.YumBase: Quick matched ceph-libs-0.56.4-1.fc19.x86_64 to require for librados.so.2()(64bit) >03:47:47,191 DEBUG yum.verbose.YumBase: TSINFO: Marking libiscsi-1.7.0-3.fc19.x86_64 as install for 2:libcacard-1.4.1-1.fc19.x86_64 >03:47:47,193 DEBUG yum.verbose.YumBase: TSINFO: Marking libaio-0.3.109-7.fc19.x86_64 as install for 2:libcacard-1.4.1-1.fc19.x86_64 >03:47:47,253 DEBUG yum.verbose.YumBase: TSINFO: Marking libsilc-1.1.10-8.fc19.x86_64 as install for libpurple-2.10.7-2.fc19.x86_64 
>03:47:47,255 DEBUG yum.verbose.YumBase: Quick matched libsilc-1.1.10-8.fc19.x86_64 to require for libsilc-1.1.so.2()(64bit) >03:47:47,256 DEBUG yum.verbose.YumBase: TSINFO: Marking meanwhile-1.1.0-10.fc19.x86_64 as install for libpurple-2.10.7-2.fc19.x86_64 >03:47:47,258 DEBUG yum.verbose.YumBase: TSINFO: Marking libgadu-1.11.2-1.fc19.1.x86_64 as install for libpurple-2.10.7-2.fc19.x86_64 >03:47:47,260 DEBUG yum.verbose.YumBase: TSINFO: Marking farstream-0.1.2-5.fc19.x86_64 as install for libpurple-2.10.7-2.fc19.x86_64 >03:47:47,265 DEBUG yum.verbose.YumBase: TSINFO: Marking libquvi-scripts-0.4.10-2.fc19.noarch as install for libquvi-0.4.1-3.fc19.x86_64 >03:47:47,270 DEBUG yum.verbose.YumBase: TSINFO: Marking libreport-python-2.1.3-3.fc19.x86_64 as install for libreport-2.1.3-3.fc19.x86_64 >03:47:47,276 DEBUG yum.verbose.YumBase: TSINFO: Marking libtar-1.2.11-25.fc19.x86_64 as install for libreport-plugin-reportuploader-2.1.3-3.fc19.x86_64 >03:47:47,331 DEBUG yum.verbose.YumBase: TSINFO: Marking netcf-libs-0.2.3-4.fc19.x86_64 as install for libvirt-daemon-1.0.5-2.fc19.x86_64 >03:47:47,339 DEBUG yum.verbose.YumBase: TSINFO: Marking glusterfs-fuse-3.4.0-0.3.alpha3.fc19.x86_64 as install for libvirt-daemon-1.0.5-2.fc19.x86_64 >03:47:47,341 DEBUG yum.verbose.YumBase: TSINFO: Marking sheepdog-0.3.0-4.fc19.x86_64 as install for libvirt-daemon-1.0.5-2.fc19.x86_64 >03:47:47,348 DEBUG yum.verbose.YumBase: TSINFO: Marking radvd-1.9.2-3.fc19.x86_64 as install for libvirt-daemon-1.0.5-2.fc19.x86_64 >03:47:47,350 DEBUG yum.verbose.YumBase: TSINFO: Marking numad-0.5-10.20121130git.fc19.x86_64 as install for libvirt-daemon-1.0.5-2.fc19.x86_64 >03:47:47,352 DEBUG yum.verbose.YumBase: TSINFO: Marking lzop-1.03-8.fc19.x86_64 as install for libvirt-daemon-1.0.5-2.fc19.x86_64 >03:47:47,357 DEBUG yum.verbose.YumBase: TSINFO: Marking lvm2-2.02.98-8.fc19.x86_64 as install for libvirt-daemon-1.0.5-2.fc19.x86_64 >03:47:47,360 DEBUG yum.verbose.YumBase: TSINFO: Marking 
libcgroup-0.38-4.fc19.x86_64 as install for libvirt-daemon-1.0.5-2.fc19.x86_64 >03:47:47,363 DEBUG yum.verbose.YumBase: TSINFO: Marking iscsi-initiator-utils-6.2.0.873-5.fc19.x86_64 as install for libvirt-daemon-1.0.5-2.fc19.x86_64 >03:47:47,367 DEBUG yum.verbose.YumBase: TSINFO: Marking iptables-services-1.4.18-1.fc19.x86_64 as install for libvirt-daemon-1.0.5-2.fc19.x86_64 >03:47:47,374 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:dmidecode-2.12-2.fc19.x86_64 as install for libvirt-daemon-1.0.5-2.fc19.x86_64 >03:47:47,377 DEBUG yum.verbose.YumBase: TSINFO: Marking 2:qemu-img-1.4.1-1.fc19.x86_64 as install for libvirt-daemon-1.0.5-2.fc19.x86_64 >03:47:47,471 DEBUG yum.verbose.YumBase: TSINFO: Marking libmcpp-2.7.2-9.fc19.x86_64 as install for mcpp-2.7.2-9.fc19.x86_64 >03:47:47,487 DEBUG yum.verbose.YumBase: TSINFO: Marking libwayland-server-1.1.0-1.fc19.x86_64 as install for mesa-libEGL-9.1.1-1.fc19.x86_64 >03:47:47,489 DEBUG yum.verbose.YumBase: TSINFO: Marking mesa-libgbm-9.1.1-1.fc19.x86_64 as install for mesa-libEGL-9.1.1-1.fc19.x86_64 >03:47:47,517 DEBUG yum.verbose.YumBase: TSINFO: Marking pakchois-0.4-8.fc19.x86_64 as install for neon-0.29.6-6.fc19.x86_64 >03:47:47,542 DEBUG yum.verbose.YumBase: TSINFO: Marking openobex-1.5-8.fc19.x86_64 as install for 1:obex-data-server-0.4.6-5.fc19.x86_64 >03:47:47,580 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:perl-Pod-Escapes-1.04-262.fc19.noarch as install for 1:perl-Pod-Simple-3.20-262.fc19.noarch >03:47:47,616 DEBUG yum.verbose.YumBase: TSINFO: Marking plymouth-graphics-libs-0.8.9-0.2013.03.26.0.fc19.x86_64 as install for plymouth-plugin-two-step-0.8.9-0.2013.03.26.0.fc19.x86_64 >03:47:47,618 DEBUG yum.verbose.YumBase: TSINFO: Marking plymouth-plugin-label-0.8.9-0.2013.03.26.0.fc19.x86_64 as install for plymouth-plugin-two-step-0.8.9-0.2013.03.26.0.fc19.x86_64 >03:47:47,623 DEBUG yum.verbose.YumBase: TSINFO: Marking policycoreutils-python-2.1.14-37.fc19.x86_64 as install for 
policycoreutils-devel-2.1.14-37.fc19.x86_64 >03:47:47,625 DEBUG yum.verbose.YumBase: TSINFO: Marking selinux-policy-doc-3.12.1-42.fc19.noarch as install for policycoreutils-devel-2.1.14-37.fc19.x86_64 >03:47:47,627 DEBUG yum.verbose.YumBase: TSINFO: Marking selinux-policy-devel-3.12.1-42.fc19.noarch as install for policycoreutils-devel-2.1.14-37.fc19.x86_64 >03:47:47,629 DEBUG yum.verbose.YumBase: TSINFO: Marking checkpolicy-2.1.12-3.fc19.x86_64 as install for policycoreutils-devel-2.1.14-37.fc19.x86_64 >03:47:47,634 DEBUG yum.verbose.YumBase: TSINFO: Marking openjpeg-libs-1.5.1-5.fc19.x86_64 as install for poppler-0.22.1-1.fc19.x86_64 >03:47:47,652 DEBUG yum.verbose.YumBase: TSINFO: Marking python-paste-1.7.5.1-8.20111221hg1498.fc19.noarch as install for python-beaker-1.5.4-7.fc19.noarch >03:47:47,654 DEBUG yum.verbose.YumBase: TSINFO: Marking pycryptopp-0.6.0.1206569328141510525648634803928199668821045408958-3.fc19.x86_64 as install for python-beaker-1.5.4-7.fc19.noarch >03:47:47,674 DEBUG yum.verbose.YumBase: TSINFO: Marking 2:qemu-system-x86-1.4.1-1.fc19.x86_64 as install for 2:qemu-kvm-1.4.1-1.fc19.x86_64 >03:47:47,687 DEBUG yum.verbose.YumBase: TSINFO: Marking jack-audio-connection-kit-example-clients-1.9.9.5-2.fc19.x86_64 as install for recordmydesktop-0.3.8.1-10.fc19.x86_64 >03:47:47,690 DEBUG yum.verbose.YumBase: TSINFO: Marking jack-audio-connection-kit-1.9.9.5-2.fc19.x86_64 as install for recordmydesktop-0.3.8.1-10.fc19.x86_64 >03:47:47,693 DEBUG yum.verbose.YumBase: TSINFO: Marking jline-1.0-4.fc19.noarch as install for rhino-1.7R4-2.fc19.noarch >03:47:47,709 DEBUG yum.verbose.YumBase: TSINFO: Marking gnome-js-common-0.1.2-7.fc19.noarch as install for seed-3.8.1-1.fc19.x86_64 >03:47:47,711 DEBUG yum.verbose.YumBase: TSINFO: Marking mpfr-3.1.1-2.fc19.x86_64 as install for seed-3.8.1-1.fc19.x86_64 >03:47:47,729 DEBUG yum.verbose.YumBase: TSINFO: Marking festival-freebsoft-utils-0.10-6.fc19.noarch as install for speech-dispatcher-0.7.1-11.fc19.x86_64 
>03:47:47,731 DEBUG yum.verbose.YumBase: TSINFO: Marking flite-1.3-19.fc19.x86_64 as install for speech-dispatcher-0.7.1-11.fc19.x86_64 >03:47:47,733 DEBUG yum.verbose.YumBase: Quick matched flite-1.3-19.fc19.x86_64 to require for libflite_cmulex.so.1()(64bit) >03:47:47,733 DEBUG yum.verbose.YumBase: Quick matched flite-1.3-19.fc19.x86_64 to require for libflite_cmu_us_kal16.so.1()(64bit) >03:47:47,734 DEBUG yum.verbose.YumBase: Quick matched flite-1.3-19.fc19.x86_64 to require for libflite.so.1()(64bit) >03:47:47,736 DEBUG yum.verbose.YumBase: TSINFO: Marking dotconf-1.3-5.fc19.x86_64 as install for speech-dispatcher-0.7.1-11.fc19.x86_64 >03:47:47,774 DEBUG yum.verbose.YumBase: TSINFO: Marking ttmkfdir-3.0.9-39.fc19.x86_64 as install for xorg-x11-fonts-Type1-7.5-8.fc19.noarch >03:47:47,776 DEBUG yum.verbose.YumBase: Quick matched ttmkfdir-3.0.9-39.fc19.x86_64 to require for ttmkfdir >03:47:47,797 DEBUG yum.verbose.YumBase: TSINFO: Marking leveldb-1.9.0-1.fc19.x86_64 as install for ceph-libs-0.56.4-1.fc19.x86_64 >03:47:47,799 DEBUG yum.verbose.YumBase: TSINFO: Marking cryptopp-5.6.2-2.fc19.x86_64 as install for ceph-libs-0.56.4-1.fc19.x86_64 >03:47:47,801 DEBUG yum.verbose.YumBase: TSINFO: Marking boost-thread-1.53.0-6.fc19.x86_64 as install for ceph-libs-0.56.4-1.fc19.x86_64 >03:47:47,803 DEBUG yum.verbose.YumBase: TSINFO: Marking boost-system-1.53.0-6.fc19.x86_64 as install for ceph-libs-0.56.4-1.fc19.x86_64 >03:47:47,817 DEBUG yum.verbose.YumBase: TSINFO: Marking sox-14.4.1-2.fc19.x86_64 as install for festival-freebsoft-utils-0.10-6.fc19.noarch >03:47:47,821 DEBUG yum.verbose.YumBase: TSINFO: Marking festival-1.96-25.fc19.x86_64 as install for festival-freebsoft-utils-0.10-6.fc19.noarch >03:47:47,828 DEBUG yum.verbose.YumBase: TSINFO: Marking libgdither-0.6-6.fc19.x86_64 as install for gavl-1.4.0-2.fc19.x86_64 >03:47:47,839 DEBUG yum.verbose.YumBase: TSINFO: Marking glusterfs-3.4.0-0.3.alpha3.fc19.x86_64 as install for 
glusterfs-fuse-3.4.0-0.3.alpha3.fc19.x86_64 >03:47:47,848 DEBUG yum.verbose.YumBase: TSINFO: Marking gnome-python2-2.28.1-12.fc19.x86_64 as install for gnome-python2-canvas-2.28.1-12.fc19.x86_64 >03:47:47,850 DEBUG yum.verbose.YumBase: TSINFO: Marking libgnomecanvas-2.30.3-6.fc19.x86_64 as install for gnome-python2-canvas-2.28.1-12.fc19.x86_64 >03:47:47,851 DEBUG yum.verbose.YumBase: Quick matched libgnomecanvas-2.30.3-6.fc19.x86_64 to require for libgnomecanvas-2.so.0()(64bit) >03:47:47,853 DEBUG yum.verbose.YumBase: TSINFO: Marking libart_lgpl-2.3.21-5.fc19.x86_64 as install for gnome-python2-canvas-2.28.1-12.fc19.x86_64 >03:47:47,855 DEBUG yum.verbose.YumBase: TSINFO: Marking gnustep-filesystem-2.6.4-1.fc19.noarch as install for gnustep-make-2.6.4-1.fc19.x86_64 >03:47:47,861 DEBUG yum.verbose.YumBase: TSINFO: Marking unbound-libs-1.4.19-5.fc19.x86_64 as install for gnutls-dane-3.1.10-1.fc19.x86_64 >03:47:47,879 DEBUG yum.verbose.YumBase: TSINFO: Marking libffado-2.1.0-2.fc19.x86_64 as install for jack-audio-connection-kit-1.9.9.5-2.fc19.x86_64 >03:47:47,881 DEBUG yum.verbose.YumBase: TSINFO: Marking celt-0.11.1-5.fc19.x86_64 as install for jack-audio-connection-kit-1.9.9.5-2.fc19.x86_64 >03:47:47,918 DEBUG yum.verbose.YumBase: TSINFO: Marking lvm2-libs-2.02.98-8.fc19.x86_64 as install for lvm2-2.02.98-8.fc19.x86_64 >03:47:47,920 DEBUG yum.verbose.YumBase: TSINFO: Marking device-mapper-persistent-data-0.1.4-3.fc19.x86_64 as install for lvm2-2.02.98-8.fc19.x86_64 >03:47:47,921 DEBUG yum.verbose.YumBase: Quick matched lvm2-libs-2.02.98-8.fc19.x86_64 to require for liblvm2app.so.2.2()(64bit) >03:47:47,938 DEBUG yum.verbose.YumBase: TSINFO: Marking augeas-libs-1.0.0-2.fc19.x86_64 as install for netcf-libs-0.2.3-4.fc19.x86_64 >03:47:47,939 DEBUG yum.verbose.YumBase: Quick matched augeas-libs-1.0.0-2.fc19.x86_64 to require for libaugeas.so.0(AUGEAS_0.1.0)(64bit) >03:47:47,940 DEBUG yum.verbose.YumBase: Quick matched augeas-libs-1.0.0-2.fc19.x86_64 to require for 
libaugeas.so.0()(64bit) >03:47:47,955 DEBUG yum.verbose.YumBase: TSINFO: Marking libdc1394-2.2.0-2.fc19.x86_64 as install for opencv-2.4.4-2.fc19.x86_64 >03:47:47,957 DEBUG yum.verbose.YumBase: TSINFO: Marking ilmbase-1.0.3-5.fc19.x86_64 as install for opencv-2.4.4-2.fc19.x86_64 >03:47:47,958 DEBUG yum.verbose.YumBase: Quick matched ilmbase-1.0.3-5.fc19.x86_64 to require for libIlmThread.so.6()(64bit) >03:47:47,959 DEBUG yum.verbose.YumBase: TSINFO: Marking OpenEXR-libs-1.7.1-5.fc19.x86_64 as install for opencv-2.4.4-2.fc19.x86_64 >03:47:47,961 DEBUG yum.verbose.YumBase: Quick matched ilmbase-1.0.3-5.fc19.x86_64 to require for libHalf.so.6()(64bit) >03:47:47,985 DEBUG yum.verbose.YumBase: TSINFO: Marking libsemanage-python-2.1.10-4.fc19.x86_64 as install for policycoreutils-python-2.1.14-37.fc19.x86_64 >03:47:47,986 DEBUG yum.verbose.YumBase: TSINFO: Marking python-IPy-0.75-5.fc19.noarch as install for policycoreutils-python-2.1.14-37.fc19.x86_64 >03:47:47,988 DEBUG yum.verbose.YumBase: TSINFO: Marking setools-libs-3.3.7-38.fc19.x86_64 as install for policycoreutils-python-2.1.14-37.fc19.x86_64 >03:47:47,990 DEBUG yum.verbose.YumBase: Quick matched setools-libs-3.3.7-38.fc19.x86_64 to require for libqpol.so.1(VERS_1.2)(64bit) >03:47:47,991 DEBUG yum.verbose.YumBase: Quick matched setools-libs-3.3.7-38.fc19.x86_64 to require for libapol.so.4(VERS_4.0)(64bit) >03:47:47,991 DEBUG yum.verbose.YumBase: Quick matched setools-libs-3.3.7-38.fc19.x86_64 to require for libqpol.so.1()(64bit) >03:47:47,992 DEBUG yum.verbose.YumBase: Quick matched setools-libs-3.3.7-38.fc19.x86_64 to require for libapol.so.4()(64bit) >03:47:47,997 DEBUG yum.verbose.YumBase: TSINFO: Marking python-tempita-0.5.1-5.fc19.noarch as install for python-paste-1.7.5.1-8.20111221hg1498.fc19.noarch >03:47:47,999 DEBUG yum.verbose.YumBase: TSINFO: Marking python-setuptools-0.6.36-1.fc19.noarch as install for python-paste-1.7.5.1-8.20111221hg1498.fc19.noarch >03:47:48,002 DEBUG yum.verbose.YumBase: TSINFO: 
Marking pyOpenSSL-0.13-5.fc19.x86_64 as install for python-paste-1.7.5.1-8.20111221hg1498.fc19.noarch >03:47:48,023 DEBUG yum.verbose.YumBase: TSINFO: Marking 2:qemu-common-1.4.1-1.fc19.x86_64 as install for 2:qemu-system-x86-1.4.1-1.fc19.x86_64 >03:47:48,025 DEBUG yum.verbose.YumBase: TSINFO: Marking seabios-bin-1.7.2-1.fc19.noarch as install for 2:qemu-system-x86-1.4.1-1.fc19.x86_64 >03:47:48,027 DEBUG yum.verbose.YumBase: TSINFO: Marking libseccomp-2.0.0-0.fc19.x86_64 as install for 2:qemu-system-x86-1.4.1-1.fc19.x86_64 >03:47:48,029 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:sgabios-bin-0.20110622svn-5.fc19.noarch as install for 2:qemu-system-x86-1.4.1-1.fc19.x86_64 >03:47:48,030 DEBUG yum.verbose.YumBase: TSINFO: Marking seavgabios-bin-1.7.2-1.fc19.noarch as install for 2:qemu-system-x86-1.4.1-1.fc19.x86_64 >03:47:48,032 DEBUG yum.verbose.YumBase: TSINFO: Marking spice-server-0.12.2-5.fc19.x86_64 as install for 2:qemu-system-x86-1.4.1-1.fc19.x86_64 >03:47:48,034 DEBUG yum.verbose.YumBase: Quick matched spice-server-0.12.2-5.fc19.x86_64 to require for libspice-server.so.1(SPICE_SERVER_0.8.2)(64bit) >03:47:48,034 DEBUG yum.verbose.YumBase: Quick matched spice-server-0.12.2-5.fc19.x86_64 to require for libspice-server.so.1(SPICE_SERVER_0.8.1)(64bit) >03:47:48,035 DEBUG yum.verbose.YumBase: Quick matched spice-server-0.12.2-5.fc19.x86_64 to require for libspice-server.so.1(SPICE_SERVER_0.6.0)(64bit) >03:47:48,035 DEBUG yum.verbose.YumBase: Quick matched spice-server-0.12.2-5.fc19.x86_64 to require for libspice-server.so.1(SPICE_SERVER_0.11.2)(64bit) >03:47:48,036 DEBUG yum.verbose.YumBase: Quick matched spice-server-0.12.2-5.fc19.x86_64 to require for libspice-server.so.1(SPICE_SERVER_0.10.4)(64bit) >03:47:48,036 DEBUG yum.verbose.YumBase: Quick matched spice-server-0.12.2-5.fc19.x86_64 to require for libspice-server.so.1(SPICE_SERVER_0.10.3)(64bit) >03:47:48,037 DEBUG yum.verbose.YumBase: Quick matched spice-server-0.12.2-5.fc19.x86_64 to require for 
libspice-server.so.1(SPICE_SERVER_0.10.2)(64bit) >03:47:48,037 DEBUG yum.verbose.YumBase: Quick matched spice-server-0.12.2-5.fc19.x86_64 to require for libspice-server.so.1(SPICE_SERVER_0.10.1)(64bit) >03:47:48,038 DEBUG yum.verbose.YumBase: Quick matched spice-server-0.12.2-5.fc19.x86_64 to require for libspice-server.so.1(SPICE_SERVER_0.10.0)(64bit) >03:47:48,039 DEBUG yum.verbose.YumBase: TSINFO: Marking ipxe-roms-qemu-20130103-1.git717279a.fc19.noarch as install for 2:qemu-system-x86-1.4.1-1.fc19.x86_64 >03:47:48,138 DEBUG yum.verbose.YumBase: TSINFO: Marking SDL-1.2.15-7.fc19.x86_64 as install for 2:qemu-system-x86-1.4.1-1.fc19.x86_64 >03:47:48,142 DEBUG yum.verbose.YumBase: TSINFO: Marking schroedinger-cat-backgrounds-animated-18.90.0-1.fc19.noarch as install for schroedinger-cat-backgrounds-gnome-18.90.0-1.fc19.noarch >03:47:48,144 DEBUG yum.verbose.YumBase: TSINFO: Marking m4-1.4.16-7.fc19.x86_64 as install for selinux-policy-devel-3.12.1-42.fc19.noarch >03:47:48,151 DEBUG yum.verbose.YumBase: TSINFO: Marking corosynclib-2.3.0-3.fc19.x86_64 as install for sheepdog-0.3.0-4.fc19.x86_64 >03:47:48,152 DEBUG yum.verbose.YumBase: Quick matched corosynclib-2.3.0-3.fc19.x86_64 to require for libcfg.so.6(COROSYNC_CFG_0.82)(64bit) >03:47:48,153 DEBUG yum.verbose.YumBase: TSINFO: Marking corosync-2.3.0-3.fc19.x86_64 as install for sheepdog-0.3.0-4.fc19.x86_64 >03:47:48,156 DEBUG yum.verbose.YumBase: Quick matched corosynclib-2.3.0-3.fc19.x86_64 to require for libcfg.so.6()(64bit) >03:47:48,187 DEBUG yum.verbose.YumBase: TSINFO: Marking libqb-0.14.4-2.fc19.x86_64 as install for corosync-2.3.0-3.fc19.x86_64 >03:47:48,196 DEBUG yum.verbose.YumBase: TSINFO: Marking librdmacm-1.0.17-1.fc19.x86_64 as install for corosynclib-2.3.0-3.fc19.x86_64 >03:47:48,198 DEBUG yum.verbose.YumBase: TSINFO: Marking libibverbs-1.1.6-6.fc19.x86_64 as install for corosynclib-2.3.0-3.fc19.x86_64 >03:47:48,199 DEBUG yum.verbose.YumBase: Quick matched libibverbs-1.1.6-6.fc19.x86_64 to require 
for libibverbs.so.1(IBVERBS_1.0)(64bit) >03:47:48,212 DEBUG yum.verbose.YumBase: TSINFO: Marking festival-speechtools-libs-1.2.96-25.fc19.x86_64 as install for festival-1.96-25.fc19.x86_64 >03:47:48,214 DEBUG yum.verbose.YumBase: TSINFO: Marking festival-lib-1.96-25.fc19.x86_64 as install for festival-1.96-25.fc19.x86_64 >03:47:48,216 DEBUG yum.verbose.YumBase: TSINFO: Marking festvox-slt-arctic-hts-0.20061229-25.fc19.noarch as install for festival-1.96-25.fc19.x86_64 >03:47:48,217 DEBUG yum.verbose.YumBase: Quick matched festival-speechtools-libs-1.2.96-25.fc19.x86_64 to require for libestools.so.1.2.96.1()(64bit) >03:47:48,218 DEBUG yum.verbose.YumBase: Quick matched festival-speechtools-libs-1.2.96-25.fc19.x86_64 to require for libestbase.so.1.2.96.1()(64bit) >03:47:48,232 DEBUG yum.verbose.YumBase: TSINFO: Marking snappy-1.1.0-1.fc19.x86_64 as install for leveldb-1.9.0-1.fc19.x86_64 >03:47:48,249 DEBUG yum.verbose.YumBase: TSINFO: Marking libxml++-2.36.0-2.fc19.x86_64 as install for libffado-2.1.0-2.fc19.x86_64 >03:47:48,251 DEBUG yum.verbose.YumBase: TSINFO: Marking libconfig-1.4.9-2.fc19.x86_64 as install for libffado-2.1.0-2.fc19.x86_64 >03:47:48,274 DEBUG yum.verbose.YumBase: TSINFO: Marking schroedinger-cat-backgrounds-base-18.90.0-1.fc19.noarch as install for schroedinger-cat-backgrounds-animated-18.90.0-1.fc19.noarch >03:47:48,293 DEBUG yum.verbose.YumBase: TSINFO: Marking ldns-1.6.16-2.fc19.x86_64 as install for unbound-libs-1.4.19-5.fc19.x86_64 >03:47:48,457 DEBUG yum.verbose.YumBase: TSINFO: Marking hunspell-en-GB-0.20121024-3.fc19.noarch as install for hunspell-en-0.20121024-3.fc19.noarch >03:47:48,696 DEBUG yum.verbose.YumBase: Depsolve time: 9.075 >03:47:48,698 INFO packaging: about to acquire _yum_lock for AnaCheckSoftwareThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1383 (checkSoftwareSelection) >03:47:48,699 INFO packaging: have _yum_lock for AnaCheckSoftwareThread >03:47:48,699 INFO packaging: gave up _yum_lock 
for AnaCheckSoftwareThread
>03:47:48,700 DEBUG packaging: success
>03:47:48,700 INFO packaging: gave up _yum_lock for AnaCheckSoftwareThread
>03:47:48,701 INFO packaging: about to acquire _yum_lock for AnaCheckSoftwareThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1398 (checkSoftwareSelection)
>03:47:48,702 INFO packaging: have _yum_lock for AnaCheckSoftwareThread
>03:47:48,742 INFO packaging: gave up _yum_lock for AnaCheckSoftwareThread
>03:47:48,743 INFO packaging: about to acquire _yum_lock for AnaCheckSoftwareThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:101 (checkSoftwareSelection)
>03:47:48,744 INFO packaging: have _yum_lock for AnaCheckSoftwareThread
>03:47:48,745 INFO packaging: 1208 packages selected totalling 3.73 GB
>03:47:48,746 INFO packaging: gave up _yum_lock for AnaCheckSoftwareThread
>03:47:49,007 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:152 (ready)
>03:47:49,009 INFO packaging: have _yum_lock for MainThread
>03:47:49,012 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo)
>03:47:49,013 INFO packaging: have _yum_lock for MainThread
>03:47:49,015 INFO packaging: gave up _yum_lock for MainThread
>03:47:49,015 INFO packaging: gave up _yum_lock for MainThread
>03:47:49,016 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:152 (ready)
>03:47:49,017 INFO packaging: have _yum_lock for MainThread
>03:47:49,018 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo)
>03:47:49,019 INFO packaging: have _yum_lock for MainThread
>03:47:49,019 INFO packaging: gave up _yum_lock for MainThread
>03:47:49,020 INFO packaging: gave up _yum_lock for MainThread
>03:47:49,021 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1038 (environmentDescription)
>03:47:49,022 INFO packaging: have _yum_lock for MainThread
>03:47:49,022 INFO packaging: gave up _yum_lock for MainThread
>03:47:49,023 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:179 (status)
>03:47:49,024 INFO packaging: have _yum_lock for MainThread
>03:47:49,024 INFO packaging: gave up _yum_lock for MainThread
>03:47:49,026 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:152 (ready)
>03:47:49,026 INFO packaging: have _yum_lock for MainThread
>03:47:49,027 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo)
>03:47:49,028 INFO packaging: have _yum_lock for MainThread
>03:47:49,028 INFO packaging: gave up _yum_lock for MainThread
>03:47:49,028 INFO packaging: gave up _yum_lock for MainThread
>03:47:49,030 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1038 (environmentDescription)
>03:47:49,030 INFO packaging: have _yum_lock for MainThread
>03:47:49,031 INFO packaging: gave up _yum_lock for MainThread
>03:47:49,032 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:179 (status)
>03:47:49,032 INFO packaging: have _yum_lock for MainThread
>03:47:49,033 INFO packaging: gave up _yum_lock for MainThread
>03:47:49,037 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/source.py:485 (status)
>03:47:49,038 INFO packaging: have _yum_lock for MainThread
>03:47:49,039 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo)
>03:47:49,040 INFO packaging: have _yum_lock for MainThread
>03:47:49,040 INFO packaging: gave up _yum_lock for MainThread
>03:47:49,040 INFO packaging: gave up _yum_lock for MainThread
>03:47:49,042 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/source.py:485 (status)
>03:47:49,042 INFO packaging: have _yum_lock for MainThread
>03:47:49,044 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo)
>03:47:49,045 INFO packaging: have _yum_lock for MainThread
>03:47:49,045 INFO packaging: gave up _yum_lock for MainThread
>03:47:49,045 INFO packaging: gave up _yum_lock for MainThread
>03:47:49,047 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/source.py:485 (status)
>03:47:49,047 INFO packaging: have _yum_lock for MainThread
>03:47:49,049 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo)
>03:47:49,049 INFO packaging: have _yum_lock for MainThread
>03:47:49,050 INFO packaging: gave up _yum_lock for MainThread
>03:47:49,050 INFO packaging: gave up _yum_lock for MainThread
>03:47:49,052 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/source.py:485 (status)
>03:47:49,052 INFO packaging: have _yum_lock for MainThread
>03:47:49,054 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo)
>03:47:49,054 INFO packaging: have _yum_lock for MainThread
>03:47:49,054 INFO packaging: gave up _yum_lock for MainThread
>03:47:49,055 INFO packaging: gave up _yum_lock for MainThread
>03:47:49,057 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/source.py:485 (status)
>03:47:49,057 INFO packaging: have _yum_lock for MainThread
>03:47:49,058 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo)
>03:47:49,059 INFO packaging: have _yum_lock for MainThread
>03:47:49,059 INFO packaging: gave up _yum_lock for MainThread
>03:47:49,060 INFO packaging: gave up _yum_lock for MainThread
>03:47:49,061 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/source.py:485 (status)
>03:47:49,062 INFO packaging: have _yum_lock for MainThread
>03:47:49,063 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo)
>03:47:49,064 INFO packaging: have _yum_lock for MainThread
>03:47:49,064 INFO packaging: gave up _yum_lock for MainThread
>03:47:49,064 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,567 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1000 (environments)
>03:47:51,569 INFO packaging: have _yum_lock for MainThread
>03:47:51,570 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,572 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:245 (refresh)
>03:47:51,572 INFO packaging: have _yum_lock for MainThread
>03:47:51,573 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,574 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1038 (environmentDescription)
>03:47:51,574 INFO packaging: have _yum_lock for MainThread
>03:47:51,575 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,576 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:246 (refresh)
>03:47:51,576 INFO packaging: have _yum_lock for MainThread
>03:47:51,577 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,579 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1038 (environmentDescription)
>03:47:51,579 INFO packaging: have _yum_lock for MainThread
>03:47:51,580 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,581 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:246 (refresh)
>03:47:51,581 INFO packaging: have _yum_lock for MainThread
>03:47:51,582 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,584 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1038 (environmentDescription)
>03:47:51,584 INFO packaging: have _yum_lock for MainThread
>03:47:51,585 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,586 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:246 (refresh)
>03:47:51,586 INFO packaging: have _yum_lock for MainThread
>03:47:51,587 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,588 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1038 (environmentDescription)
>03:47:51,589 INFO packaging: have _yum_lock for MainThread
>03:47:51,589 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,591 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:246 (refresh)
>03:47:51,591 INFO packaging: have _yum_lock for MainThread
>03:47:51,592 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,593 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1038 (environmentDescription)
>03:47:51,594 INFO packaging: have _yum_lock for MainThread
>03:47:51,594 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,595 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:246 (refresh)
>03:47:51,596 INFO packaging: have _yum_lock for MainThread
>03:47:51,597 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,598 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1038 (environmentDescription)
>03:47:51,599 INFO packaging: have _yum_lock for MainThread
>03:47:51,599 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,600 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:246 (refresh)
>03:47:51,601 INFO packaging: have _yum_lock for MainThread
>03:47:51,601 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,603 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1038 (environmentDescription)
>03:47:51,603 INFO packaging: have _yum_lock for MainThread
>03:47:51,604 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,605 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:246 (refresh)
>03:47:51,605 INFO packaging: have _yum_lock for MainThread
>03:47:51,606 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,607 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1038 (environmentDescription)
>03:47:51,608 INFO packaging: have _yum_lock for MainThread
>03:47:51,608 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,609 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:246 (refresh)
>03:47:51,610 INFO packaging: have _yum_lock for MainThread
>03:47:51,610 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,612 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1038 (environmentDescription)
>03:47:51,613 INFO packaging: have _yum_lock for MainThread
>03:47:51,613 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,614 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:246 (refresh)
>03:47:51,615 INFO packaging: have _yum_lock for MainThread
>03:47:51,615 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,617 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1038 (environmentDescription)
>03:47:51,617 INFO packaging: have _yum_lock for MainThread
>03:47:51,618 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,619 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:246 (refresh)
>03:47:51,619 INFO packaging: have _yum_lock for MainThread
>03:47:51,620 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,621 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1114 (groups)
>03:47:51,622 INFO packaging: have _yum_lock for MainThread
>03:47:51,622 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,623 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:286 (refreshAddons)
>03:47:51,624 INFO packaging: have _yum_lock for MainThread
>03:47:51,625 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,626 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:51,626 INFO packaging: have _yum_lock for MainThread
>03:47:51,627 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,628 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:51,628 INFO packaging: have _yum_lock for MainThread
>03:47:51,629 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,630 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:51,630 INFO packaging: have _yum_lock for MainThread
>03:47:51,631 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,632 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:51,632 INFO packaging: have _yum_lock for MainThread
>03:47:51,633 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,634 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:51,634 INFO packaging: have _yum_lock for MainThread
>03:47:51,635 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,636 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:51,637 INFO packaging: have _yum_lock for MainThread
>03:47:51,637 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,638 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:51,639 INFO packaging: have _yum_lock for MainThread
>03:47:51,639 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,641 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:51,641 INFO packaging: have _yum_lock for MainThread
>03:47:51,641 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,643 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:51,643 INFO packaging: have _yum_lock for MainThread
>03:47:51,643 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,644 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:51,645 INFO packaging: have _yum_lock for MainThread
>03:47:51,645 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,647 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:51,647 INFO packaging: have _yum_lock for MainThread
>03:47:51,647 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,649 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:51,649 INFO packaging: have _yum_lock for MainThread
>03:47:51,650 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,651 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:51,652 INFO packaging: have _yum_lock for MainThread
>03:47:51,652 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,653 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:51,654 INFO packaging: have _yum_lock for MainThread
>03:47:51,654 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,655 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:51,656 INFO packaging: have _yum_lock for MainThread
>03:47:51,656 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,657 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:51,658 INFO packaging: have _yum_lock for MainThread
>03:47:51,658 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,660 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:51,660 INFO packaging: have _yum_lock for MainThread
>03:47:51,660 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,662 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:51,662 INFO packaging: have _yum_lock for MainThread
>03:47:51,663 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,664 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:51,665 INFO packaging: have _yum_lock for MainThread
>03:47:51,665 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,666 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:51,667 INFO packaging: have _yum_lock for MainThread
>03:47:51,667 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,668 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:51,669 INFO packaging: have _yum_lock for MainThread
>03:47:51,669 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,671 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:51,671 INFO packaging: have _yum_lock for MainThread
>03:47:51,671 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,673 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:51,673 INFO packaging: have _yum_lock for MainThread
>03:47:51,674 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,675 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:51,675 INFO packaging: have _yum_lock for MainThread
>03:47:51,676 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,677 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:51,678 INFO packaging: have _yum_lock for MainThread
>03:47:51,678 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,679 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:51,680 INFO packaging: have _yum_lock for MainThread
>03:47:51,680 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,682 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:51,682 INFO packaging: have _yum_lock for MainThread
>03:47:51,683 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,684 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:51,684 INFO packaging: have _yum_lock for MainThread
>03:47:51,685 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,686 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:51,686 INFO packaging: have _yum_lock for MainThread
>03:47:51,687 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,688 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:51,689 INFO packaging: have _yum_lock for MainThread
>03:47:51,689 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,691 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:51,691 INFO packaging: have _yum_lock for MainThread
>03:47:51,691 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,693 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:51,693 INFO packaging: have _yum_lock for MainThread
>03:47:51,694 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,695 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:51,695 INFO packaging: have _yum_lock for MainThread
>03:47:51,696 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,697 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:51,698 INFO packaging: have _yum_lock for MainThread
>03:47:51,698 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,699 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:51,700 INFO packaging: have _yum_lock for MainThread
>03:47:51,700 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,701 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:51,702 INFO packaging: have _yum_lock for MainThread
>03:47:51,702 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,704 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:51,704 INFO packaging: have _yum_lock for MainThread
>03:47:51,705 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,706 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:51,706 INFO packaging: have _yum_lock for MainThread
>03:47:51,707 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,708 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:51,709 INFO packaging: have _yum_lock for MainThread
>03:47:51,709 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,711 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:51,711 INFO packaging: have _yum_lock for MainThread
>03:47:51,712 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,713 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:51,713 INFO packaging: have _yum_lock for MainThread
>03:47:51,714 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,715 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:51,716 INFO packaging: have _yum_lock for MainThread
>03:47:51,716 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,717 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:51,718 INFO packaging: have _yum_lock for MainThread
>03:47:51,718 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,719 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:51,720 INFO packaging: have _yum_lock for MainThread
>03:47:51,720 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,721 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1166 (_groupHasInstallableMembers)
>03:47:51,722 INFO packaging: have _yum_lock for MainThread
>03:47:51,722 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,724 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:51,724 INFO packaging: have _yum_lock for MainThread
>03:47:51,724 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,726 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:51,726 INFO packaging: have _yum_lock for MainThread
>03:47:51,727 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,728 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:51,728 INFO packaging: have _yum_lock for MainThread
>03:47:51,729 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,730 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:51,731 INFO packaging: have _yum_lock for MainThread
>03:47:51,731 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,732 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:51,733 INFO packaging: have _yum_lock for MainThread
>03:47:51,733 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,734 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:51,735 INFO packaging: have _yum_lock for MainThread
>03:47:51,735 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,736 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:51,736 INFO packaging: have _yum_lock for MainThread
>03:47:51,736 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,737 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:51,738 INFO packaging: have _yum_lock for MainThread
>03:47:51,738 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,739 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:51,739 INFO packaging: have _yum_lock for MainThread
>03:47:51,739 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,740 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:51,741 INFO packaging: have _yum_lock for MainThread
>03:47:51,741 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,742 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:51,742 INFO packaging: have _yum_lock for MainThread
>03:47:51,742 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,744 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:51,744 INFO packaging: have _yum_lock for MainThread
>03:47:51,744 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,745 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:51,745 INFO packaging: have _yum_lock for MainThread
>03:47:51,746 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,747 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:51,747 INFO packaging: have _yum_lock for MainThread
>03:47:51,747 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,748 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:51,748 INFO packaging: have _yum_lock for MainThread
>03:47:51,749 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,750 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:51,750 INFO packaging: have _yum_lock for MainThread
>03:47:51,750 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,751 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:51,752 INFO packaging: have _yum_lock for MainThread
>03:47:51,752 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,753 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1166 (_groupHasInstallableMembers)
>03:47:51,753 INFO packaging: have _yum_lock for MainThread
>03:47:51,754 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,755 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:51,755 INFO packaging: have _yum_lock for MainThread
>03:47:51,755 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,756 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:51,756 INFO packaging: have _yum_lock for MainThread
>03:47:51,757 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,758 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:51,758 INFO packaging: have _yum_lock for MainThread
>03:47:51,758 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,759 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:51,760 INFO packaging: have _yum_lock for MainThread
>03:47:51,760 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,761 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:51,761 INFO packaging: have _yum_lock for MainThread
>03:47:51,761 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,762 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:51,763 INFO packaging: have _yum_lock for MainThread
>03:47:51,763 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,764 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:51,764 INFO packaging: have _yum_lock for MainThread
>03:47:51,764 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,765 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:51,766 INFO packaging: have _yum_lock for MainThread
>03:47:51,766 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,767 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:51,767 INFO packaging: have _yum_lock for MainThread
>03:47:51,768 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,769 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1166 (_groupHasInstallableMembers)
>03:47:51,769 INFO packaging: have _yum_lock for MainThread
>03:47:51,769 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,770 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:51,770 INFO packaging: have _yum_lock for MainThread
>03:47:51,771 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,772 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:51,772 INFO packaging: have _yum_lock for MainThread
>03:47:51,772 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,773 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:51,774 INFO packaging: have _yum_lock for MainThread
>03:47:51,774 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,775 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:51,775 INFO packaging: have _yum_lock for MainThread
>03:47:51,776 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,776 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:51,777 INFO packaging: have _yum_lock for MainThread
>03:47:51,777 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,778 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:51,778 INFO packaging: have _yum_lock for MainThread
>03:47:51,779 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,780 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:51,780 INFO packaging: have _yum_lock for MainThread
>03:47:51,780 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,781 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:51,782 INFO packaging: have _yum_lock for MainThread
>03:47:51,782 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,783 INFO packaging:
about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:51,783 INFO packaging: have _yum_lock for MainThread >03:47:51,783 INFO packaging: gave up _yum_lock for MainThread >03:47:51,784 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:51,785 INFO packaging: have _yum_lock for MainThread >03:47:51,785 INFO packaging: gave up _yum_lock for MainThread >03:47:51,786 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:51,786 INFO packaging: have _yum_lock for MainThread >03:47:51,786 INFO packaging: gave up _yum_lock for MainThread >03:47:51,787 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:51,788 INFO packaging: have _yum_lock for MainThread >03:47:51,788 INFO packaging: gave up _yum_lock for MainThread >03:47:51,789 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:51,789 INFO packaging: have _yum_lock for MainThread >03:47:51,790 INFO packaging: gave up _yum_lock for MainThread >03:47:51,791 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:51,791 INFO packaging: have _yum_lock for MainThread >03:47:51,791 INFO packaging: gave up _yum_lock for MainThread >03:47:51,792 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:51,792 INFO packaging: have _yum_lock for MainThread >03:47:51,793 INFO packaging: gave up _yum_lock 
for MainThread >03:47:51,794 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:51,794 INFO packaging: have _yum_lock for MainThread >03:47:51,794 INFO packaging: gave up _yum_lock for MainThread >03:47:51,795 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:51,795 INFO packaging: have _yum_lock for MainThread >03:47:51,796 INFO packaging: gave up _yum_lock for MainThread >03:47:51,797 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:51,797 INFO packaging: have _yum_lock for MainThread >03:47:51,797 INFO packaging: gave up _yum_lock for MainThread >03:47:51,798 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:51,799 INFO packaging: have _yum_lock for MainThread >03:47:51,799 INFO packaging: gave up _yum_lock for MainThread >03:47:51,800 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:51,800 INFO packaging: have _yum_lock for MainThread >03:47:51,800 INFO packaging: gave up _yum_lock for MainThread >03:47:51,801 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:51,802 INFO packaging: have _yum_lock for MainThread >03:47:51,802 INFO packaging: gave up _yum_lock for MainThread >03:47:51,803 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:51,803 INFO packaging: have _yum_lock for 
MainThread >03:47:51,804 INFO packaging: gave up _yum_lock for MainThread >03:47:51,805 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:51,805 INFO packaging: have _yum_lock for MainThread >03:47:51,805 INFO packaging: gave up _yum_lock for MainThread >03:47:51,806 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:51,807 INFO packaging: have _yum_lock for MainThread >03:47:51,807 INFO packaging: gave up _yum_lock for MainThread >03:47:51,808 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:51,808 INFO packaging: have _yum_lock for MainThread >03:47:51,808 INFO packaging: gave up _yum_lock for MainThread >03:47:51,809 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:51,810 INFO packaging: have _yum_lock for MainThread >03:47:51,810 INFO packaging: gave up _yum_lock for MainThread >03:47:51,811 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:51,811 INFO packaging: have _yum_lock for MainThread >03:47:51,811 INFO packaging: gave up _yum_lock for MainThread >03:47:51,812 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:51,813 INFO packaging: have _yum_lock for MainThread >03:47:51,813 INFO packaging: gave up _yum_lock for MainThread >03:47:51,814 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) 
>03:47:51,814 INFO packaging: have _yum_lock for MainThread >03:47:51,815 INFO packaging: gave up _yum_lock for MainThread >03:47:51,816 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1166 (_groupHasInstallableMembers) >03:47:51,816 INFO packaging: have _yum_lock for MainThread >03:47:51,816 INFO packaging: gave up _yum_lock for MainThread >03:47:51,817 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:51,817 INFO packaging: have _yum_lock for MainThread >03:47:51,818 INFO packaging: gave up _yum_lock for MainThread >03:47:51,819 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:51,819 INFO packaging: have _yum_lock for MainThread >03:47:51,819 INFO packaging: gave up _yum_lock for MainThread >03:47:51,820 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:51,821 INFO packaging: have _yum_lock for MainThread >03:47:51,821 INFO packaging: gave up _yum_lock for MainThread >03:47:51,822 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:51,822 INFO packaging: have _yum_lock for MainThread >03:47:51,822 INFO packaging: gave up _yum_lock for MainThread >03:47:51,823 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:51,824 INFO packaging: have _yum_lock for MainThread >03:47:51,824 INFO packaging: gave up _yum_lock for MainThread >03:47:51,825 INFO packaging: about to acquire _yum_lock for MainThread at 
/usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:51,825 INFO packaging: have _yum_lock for MainThread >03:47:51,825 INFO packaging: gave up _yum_lock for MainThread >03:47:51,826 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:51,827 INFO packaging: have _yum_lock for MainThread >03:47:51,827 INFO packaging: gave up _yum_lock for MainThread >03:47:51,828 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:51,828 INFO packaging: have _yum_lock for MainThread >03:47:51,829 INFO packaging: gave up _yum_lock for MainThread >03:47:51,829 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:51,830 INFO packaging: have _yum_lock for MainThread >03:47:51,830 INFO packaging: gave up _yum_lock for MainThread >03:47:51,831 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:51,831 INFO packaging: have _yum_lock for MainThread >03:47:51,832 INFO packaging: gave up _yum_lock for MainThread >03:47:51,833 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:51,833 INFO packaging: have _yum_lock for MainThread >03:47:51,833 INFO packaging: gave up _yum_lock for MainThread >03:47:51,834 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:51,835 INFO packaging: have _yum_lock for MainThread >03:47:51,835 INFO packaging: gave up _yum_lock for MainThread >03:47:51,836 INFO packaging: 
about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:51,836 INFO packaging: have _yum_lock for MainThread >03:47:51,837 INFO packaging: gave up _yum_lock for MainThread >03:47:51,838 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:51,838 INFO packaging: have _yum_lock for MainThread >03:47:51,838 INFO packaging: gave up _yum_lock for MainThread >03:47:51,839 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:51,839 INFO packaging: have _yum_lock for MainThread >03:47:51,840 INFO packaging: gave up _yum_lock for MainThread >03:47:51,841 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:51,841 INFO packaging: have _yum_lock for MainThread >03:47:51,841 INFO packaging: gave up _yum_lock for MainThread >03:47:51,842 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:51,843 INFO packaging: have _yum_lock for MainThread >03:47:51,843 INFO packaging: gave up _yum_lock for MainThread >03:47:51,844 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:51,844 INFO packaging: have _yum_lock for MainThread >03:47:51,844 INFO packaging: gave up _yum_lock for MainThread >03:47:51,845 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:51,846 INFO packaging: have _yum_lock for MainThread >03:47:51,846 INFO packaging: gave up _yum_lock 
for MainThread >03:47:51,847 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:51,847 INFO packaging: have _yum_lock for MainThread >03:47:51,848 INFO packaging: gave up _yum_lock for MainThread >03:47:51,848 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:51,849 INFO packaging: have _yum_lock for MainThread >03:47:51,849 INFO packaging: gave up _yum_lock for MainThread >03:47:51,850 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:51,850 INFO packaging: have _yum_lock for MainThread >03:47:51,851 INFO packaging: gave up _yum_lock for MainThread >03:47:51,852 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:51,852 INFO packaging: have _yum_lock for MainThread >03:47:51,852 INFO packaging: gave up _yum_lock for MainThread >03:47:51,853 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:51,854 INFO packaging: have _yum_lock for MainThread >03:47:51,854 INFO packaging: gave up _yum_lock for MainThread >03:47:51,855 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:51,855 INFO packaging: have _yum_lock for MainThread >03:47:51,855 INFO packaging: gave up _yum_lock for MainThread >03:47:51,856 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:51,857 INFO packaging: have _yum_lock for MainThread 
>03:47:51,857 INFO packaging: gave up _yum_lock for MainThread >03:47:51,858 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:51,858 INFO packaging: have _yum_lock for MainThread >03:47:51,858 INFO packaging: gave up _yum_lock for MainThread >03:47:51,860 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:51,860 INFO packaging: have _yum_lock for MainThread >03:47:51,860 INFO packaging: gave up _yum_lock for MainThread >03:47:51,861 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:51,861 INFO packaging: have _yum_lock for MainThread >03:47:51,861 INFO packaging: gave up _yum_lock for MainThread >03:47:51,863 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:51,863 INFO packaging: have _yum_lock for MainThread >03:47:51,863 INFO packaging: gave up _yum_lock for MainThread >03:47:51,864 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:51,864 INFO packaging: have _yum_lock for MainThread >03:47:51,865 INFO packaging: gave up _yum_lock for MainThread >03:47:51,866 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:51,866 INFO packaging: have _yum_lock for MainThread >03:47:51,866 INFO packaging: gave up _yum_lock for MainThread >03:47:51,867 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:51,868 INFO 
packaging: have _yum_lock for MainThread >03:47:51,868 INFO packaging: gave up _yum_lock for MainThread >03:47:51,869 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:51,869 INFO packaging: have _yum_lock for MainThread >03:47:51,869 INFO packaging: gave up _yum_lock for MainThread >03:47:51,870 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:51,871 INFO packaging: have _yum_lock for MainThread >03:47:51,871 INFO packaging: gave up _yum_lock for MainThread >03:47:51,872 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:51,872 INFO packaging: have _yum_lock for MainThread >03:47:51,872 INFO packaging: gave up _yum_lock for MainThread >03:47:51,873 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:51,874 INFO packaging: have _yum_lock for MainThread >03:47:51,874 INFO packaging: gave up _yum_lock for MainThread >03:47:51,875 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:51,875 INFO packaging: have _yum_lock for MainThread >03:47:51,876 INFO packaging: gave up _yum_lock for MainThread >03:47:51,877 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:51,877 INFO packaging: have _yum_lock for MainThread >03:47:51,877 INFO packaging: gave up _yum_lock for MainThread >03:47:51,878 INFO packaging: about to acquire _yum_lock for MainThread at 
/usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:51,878 INFO packaging: have _yum_lock for MainThread >03:47:51,879 INFO packaging: gave up _yum_lock for MainThread >03:47:51,880 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:51,880 INFO packaging: have _yum_lock for MainThread >03:47:51,880 INFO packaging: gave up _yum_lock for MainThread >03:47:51,881 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:51,881 INFO packaging: have _yum_lock for MainThread >03:47:51,882 INFO packaging: gave up _yum_lock for MainThread >03:47:51,883 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:51,883 INFO packaging: have _yum_lock for MainThread >03:47:51,883 INFO packaging: gave up _yum_lock for MainThread >03:47:51,884 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:51,885 INFO packaging: have _yum_lock for MainThread >03:47:51,885 INFO packaging: gave up _yum_lock for MainThread >03:47:51,886 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:51,886 INFO packaging: have _yum_lock for MainThread >03:47:51,886 INFO packaging: gave up _yum_lock for MainThread >03:47:51,887 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:51,887 INFO packaging: have _yum_lock for MainThread >03:47:51,888 INFO packaging: gave up _yum_lock for MainThread >03:47:51,889 INFO packaging: 
about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:51,889 INFO packaging: have _yum_lock for MainThread >03:47:51,889 INFO packaging: gave up _yum_lock for MainThread >03:47:51,890 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:51,890 INFO packaging: have _yum_lock for MainThread >03:47:51,891 INFO packaging: gave up _yum_lock for MainThread >03:47:51,891 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:51,892 INFO packaging: have _yum_lock for MainThread >03:47:51,892 INFO packaging: gave up _yum_lock for MainThread >03:47:51,893 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:51,893 INFO packaging: have _yum_lock for MainThread >03:47:51,893 INFO packaging: gave up _yum_lock for MainThread >03:47:51,894 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:51,894 INFO packaging: have _yum_lock for MainThread >03:47:51,895 INFO packaging: gave up _yum_lock for MainThread >03:47:51,896 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:51,896 INFO packaging: have _yum_lock for MainThread >03:47:51,896 INFO packaging: gave up _yum_lock for MainThread >03:47:51,897 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:51,897 INFO packaging: have _yum_lock for MainThread >03:47:51,898 INFO packaging: gave up _yum_lock for 
MainThread >03:47:51,899 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:51,899 INFO packaging: have _yum_lock for MainThread >03:47:51,899 INFO packaging: gave up _yum_lock for MainThread >03:47:51,900 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:51,900 INFO packaging: have _yum_lock for MainThread >03:47:51,900 INFO packaging: gave up _yum_lock for MainThread >03:47:51,901 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:51,902 INFO packaging: have _yum_lock for MainThread >03:47:51,902 INFO packaging: gave up _yum_lock for MainThread >03:47:51,903 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:51,903 INFO packaging: have _yum_lock for MainThread >03:47:51,903 INFO packaging: gave up _yum_lock for MainThread >03:47:51,904 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:51,904 INFO packaging: have _yum_lock for MainThread >03:47:51,905 INFO packaging: gave up _yum_lock for MainThread >03:47:51,906 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:51,906 INFO packaging: have _yum_lock for MainThread >03:47:51,906 INFO packaging: gave up _yum_lock for MainThread >03:47:51,907 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:51,908 INFO packaging: have _yum_lock for MainThread 
>03:47:51,908 INFO packaging: gave up _yum_lock for MainThread >03:47:51,909 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:51,909 INFO packaging: have _yum_lock for MainThread >03:47:51,909 INFO packaging: gave up _yum_lock for MainThread >03:47:51,911 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:51,911 INFO packaging: have _yum_lock for MainThread >03:47:51,911 INFO packaging: gave up _yum_lock for MainThread >03:47:51,912 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:51,912 INFO packaging: have _yum_lock for MainThread >03:47:51,913 INFO packaging: gave up _yum_lock for MainThread >03:47:51,914 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:51,914 INFO packaging: have _yum_lock for MainThread >03:47:51,914 INFO packaging: gave up _yum_lock for MainThread >03:47:51,915 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:51,915 INFO packaging: have _yum_lock for MainThread >03:47:51,916 INFO packaging: gave up _yum_lock for MainThread >03:47:51,917 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:51,917 INFO packaging: have _yum_lock for MainThread >03:47:51,917 INFO packaging: gave up _yum_lock for MainThread >03:47:51,918 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:51,919 
INFO packaging: have _yum_lock for MainThread >03:47:51,919 INFO packaging: gave up _yum_lock for MainThread >03:47:51,920 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:51,920 INFO packaging: have _yum_lock for MainThread >03:47:51,920 INFO packaging: gave up _yum_lock for MainThread >03:47:51,921 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:51,921 INFO packaging: have _yum_lock for MainThread >03:47:51,922 INFO packaging: gave up _yum_lock for MainThread >03:47:51,923 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:51,923 INFO packaging: have _yum_lock for MainThread >03:47:51,923 INFO packaging: gave up _yum_lock for MainThread >03:47:51,924 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:51,925 INFO packaging: have _yum_lock for MainThread >03:47:51,925 INFO packaging: gave up _yum_lock for MainThread >03:47:51,926 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:51,926 INFO packaging: have _yum_lock for MainThread >03:47:51,926 INFO packaging: gave up _yum_lock for MainThread >03:47:51,927 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:51,927 INFO packaging: have _yum_lock for MainThread >03:47:51,928 INFO packaging: gave up _yum_lock for MainThread >03:47:51,929 INFO packaging: about to acquire _yum_lock for MainThread at 
/usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:51,929 INFO packaging: have _yum_lock for MainThread
>03:47:51,929 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,930 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:51,931 INFO packaging: have _yum_lock for MainThread
>03:47:51,931 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,932 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:51,932 INFO packaging: have _yum_lock for MainThread
>03:47:51,932 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,933 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:51,934 INFO packaging: have _yum_lock for MainThread
>03:47:51,934 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,935 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:51,935 INFO packaging: have _yum_lock for MainThread
>03:47:51,936 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,937 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:51,937 INFO packaging: have _yum_lock for MainThread
>03:47:51,937 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,938 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:51,938 INFO packaging: have _yum_lock for MainThread
>03:47:51,939 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,940 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:51,940 INFO packaging: have _yum_lock for MainThread
>03:47:51,940 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,941 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:51,942 INFO packaging: have _yum_lock for MainThread
>03:47:51,942 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,943 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:51,943 INFO packaging: have _yum_lock for MainThread
>03:47:51,943 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,944 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:51,945 INFO packaging: have _yum_lock for MainThread
>03:47:51,945 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,946 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:51,946 INFO packaging: have _yum_lock for MainThread
>03:47:51,947 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,948 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:51,948 INFO packaging: have _yum_lock for MainThread
>03:47:51,948 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,949 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:51,949 INFO packaging: have _yum_lock for MainThread
>03:47:51,950 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,951 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:51,951 INFO packaging: have _yum_lock for MainThread
>03:47:51,951 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,952 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:51,953 INFO packaging: have _yum_lock for MainThread
>03:47:51,953 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,956 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:51,957 INFO packaging: have _yum_lock for MainThread
>03:47:51,958 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,960 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:51,961 INFO packaging: have _yum_lock for MainThread
>03:47:51,962 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,965 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:51,966 INFO packaging: have _yum_lock for MainThread
>03:47:51,967 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,970 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:51,971 INFO packaging: have _yum_lock for MainThread
>03:47:51,971 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,973 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:51,973 INFO packaging: have _yum_lock for MainThread
>03:47:51,973 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,974 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:51,974 INFO packaging: have _yum_lock for MainThread
>03:47:51,975 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,976 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:51,976 INFO packaging: have _yum_lock for MainThread
>03:47:51,976 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,977 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:51,978 INFO packaging: have _yum_lock for MainThread
>03:47:51,978 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,979 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:51,979 INFO packaging: have _yum_lock for MainThread
>03:47:51,980 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,981 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:51,981 INFO packaging: have _yum_lock for MainThread
>03:47:51,981 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,982 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:51,983 INFO packaging: have _yum_lock for MainThread
>03:47:51,983 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,984 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:51,984 INFO packaging: have _yum_lock for MainThread
>03:47:51,985 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,986 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:51,986 INFO packaging: have _yum_lock for MainThread
>03:47:51,986 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,987 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:51,988 INFO packaging: have _yum_lock for MainThread
>03:47:51,988 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,989 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:51,989 INFO packaging: have _yum_lock for MainThread
>03:47:51,989 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,990 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:51,991 INFO packaging: have _yum_lock for MainThread
>03:47:51,991 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,992 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:51,992 INFO packaging: have _yum_lock for MainThread
>03:47:51,993 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,994 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:51,994 INFO packaging: have _yum_lock for MainThread
>03:47:51,994 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,995 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:51,995 INFO packaging: have _yum_lock for MainThread
>03:47:51,996 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,997 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:51,997 INFO packaging: have _yum_lock for MainThread
>03:47:51,997 INFO packaging: gave up _yum_lock for MainThread
>03:47:51,999 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:51,999 INFO packaging: have _yum_lock for MainThread
>03:47:51,999 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,000 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:52,000 INFO packaging: have _yum_lock for MainThread
>03:47:52,001 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,002 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:52,002 INFO packaging: have _yum_lock for MainThread
>03:47:52,002 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,003 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:52,004 INFO packaging: have _yum_lock for MainThread
>03:47:52,004 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,005 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:52,005 INFO packaging: have _yum_lock for MainThread
>03:47:52,006 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,007 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:52,007 INFO packaging: have _yum_lock for MainThread
>03:47:52,007 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,008 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:52,009 INFO packaging: have _yum_lock for MainThread
>03:47:52,009 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,010 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:52,010 INFO packaging: have _yum_lock for MainThread
>03:47:52,010 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,011 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:52,012 INFO packaging: have _yum_lock for MainThread
>03:47:52,012 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,013 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:52,013 INFO packaging: have _yum_lock for MainThread
>03:47:52,014 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,015 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:52,015 INFO packaging: have _yum_lock for MainThread
>03:47:52,015 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,017 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:52,017 INFO packaging: have _yum_lock for MainThread
>03:47:52,017 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,018 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:52,018 INFO packaging: have _yum_lock for MainThread
>03:47:52,019 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,020 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:52,020 INFO packaging: have _yum_lock for MainThread
>03:47:52,020 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,021 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:52,022 INFO packaging: have _yum_lock for MainThread
>03:47:52,022 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,023 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:52,023 INFO packaging: have _yum_lock for MainThread
>03:47:52,023 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,024 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:52,025 INFO packaging: have _yum_lock for MainThread
>03:47:52,025 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,026 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:52,026 INFO packaging: have _yum_lock for MainThread
>03:47:52,027 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,028 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:52,028 INFO packaging: have _yum_lock for MainThread
>03:47:52,028 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,029 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:52,030 INFO packaging: have _yum_lock for MainThread
>03:47:52,030 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,031 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:52,031 INFO packaging: have _yum_lock for MainThread
>03:47:52,032 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,033 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:52,033 INFO packaging: have _yum_lock for MainThread
>03:47:52,033 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,034 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:52,035 INFO packaging: have _yum_lock for MainThread
>03:47:52,035 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,036 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:52,036 INFO packaging: have _yum_lock for MainThread
>03:47:52,036 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,038 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:52,038 INFO packaging: have _yum_lock for MainThread
>03:47:52,038 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,039 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:52,040 INFO packaging: have _yum_lock for MainThread
>03:47:52,040 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,041 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:52,041 INFO packaging: have _yum_lock for MainThread
>03:47:52,042 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,043 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:52,043 INFO packaging: have _yum_lock for MainThread
>03:47:52,043 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,044 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:52,045 INFO packaging: have _yum_lock for MainThread
>03:47:52,045 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,046 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:52,046 INFO packaging: have _yum_lock for MainThread
>03:47:52,046 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,048 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:52,048 INFO packaging: have _yum_lock for MainThread
>03:47:52,048 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,049 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:52,050 INFO packaging: have _yum_lock for MainThread
>03:47:52,050 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,051 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:52,051 INFO packaging: have _yum_lock for MainThread
>03:47:52,052 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,053 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:52,053 INFO packaging: have _yum_lock for MainThread
>03:47:52,053 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,054 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:52,055 INFO packaging: have _yum_lock for MainThread
>03:47:52,055 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,056 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:52,056 INFO packaging: have _yum_lock for MainThread
>03:47:52,056 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,058 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:52,058 INFO packaging: have _yum_lock for MainThread
>03:47:52,058 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,059 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:52,059 INFO packaging: have _yum_lock for MainThread
>03:47:52,060 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,061 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:52,061 INFO packaging: have _yum_lock for MainThread
>03:47:52,061 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,062 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:52,063 INFO packaging: have _yum_lock for MainThread
>03:47:52,063 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,064 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:52,064 INFO packaging: have _yum_lock for MainThread
>03:47:52,065 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,066 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:52,066 INFO packaging: have _yum_lock for MainThread
>03:47:52,066 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,067 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:52,068 INFO packaging: have _yum_lock for MainThread
>03:47:52,068 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,069 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:52,069 INFO packaging: have _yum_lock for MainThread
>03:47:52,069 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,070 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:52,071 INFO packaging: have _yum_lock for MainThread
>03:47:52,071 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,072 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:52,072 INFO packaging: have _yum_lock for MainThread
>03:47:52,073 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,074 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:52,074 INFO packaging: have _yum_lock for MainThread
>03:47:52,074 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,075 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:52,076 INFO packaging: have _yum_lock for MainThread
>03:47:52,076 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,077 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:52,077 INFO packaging: have _yum_lock for MainThread
>03:47:52,077 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,079 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:52,079 INFO packaging: have _yum_lock for MainThread
>03:47:52,079 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,080 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:52,081 INFO packaging: have _yum_lock for MainThread
>03:47:52,081 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,082 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:52,082 INFO packaging: have _yum_lock for MainThread
>03:47:52,082 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,084 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:52,084 INFO packaging: have _yum_lock for MainThread
>03:47:52,084 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,085 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:52,086 INFO packaging: have _yum_lock for MainThread
>03:47:52,086 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,087 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:52,087 INFO packaging: have _yum_lock for MainThread
>03:47:52,087 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,088 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:52,089 INFO packaging: have _yum_lock for MainThread
>03:47:52,089 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,090 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:52,090 INFO packaging: have _yum_lock for MainThread
>03:47:52,091 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,092 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:52,092 INFO packaging: have _yum_lock for MainThread
>03:47:52,093 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,094 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:52,094 INFO packaging: have _yum_lock for MainThread
>03:47:52,094 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,095 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:52,095 INFO packaging: have _yum_lock for MainThread
>03:47:52,096 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,097 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:52,097 INFO packaging: have _yum_lock for MainThread
>03:47:52,097 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,099 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:52,099 INFO packaging: have _yum_lock for MainThread
>03:47:52,099 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,100 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:52,101 INFO packaging: have _yum_lock for MainThread
>03:47:52,101 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,102 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:52,102 INFO packaging: have _yum_lock for MainThread
>03:47:52,102 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,104 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:52,104 INFO packaging: have _yum_lock for MainThread
>03:47:52,104 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,105 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:52,106 INFO packaging: have _yum_lock for MainThread
>03:47:52,106 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,107 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:52,107 INFO packaging: have _yum_lock for MainThread
>03:47:52,108 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,109 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:52,109 INFO packaging: have _yum_lock for MainThread
>03:47:52,109 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,110 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:52,111 INFO packaging: have _yum_lock for MainThread
>03:47:52,111 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,112 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:52,112 INFO packaging: have _yum_lock for MainThread
>03:47:52,112 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,113 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:52,114 INFO packaging: have _yum_lock for MainThread
>03:47:52,114 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,115 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:52,115 INFO packaging: have _yum_lock for MainThread
>03:47:52,116 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,117 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:52,117 INFO packaging: have _yum_lock for MainThread
>03:47:52,117 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,118 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:52,119 INFO packaging: have _yum_lock for MainThread
>03:47:52,119 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,120 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1166 (_groupHasInstallableMembers)
>03:47:52,120 INFO packaging: have _yum_lock for MainThread
>03:47:52,120 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,121 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:52,122 INFO packaging: have _yum_lock for MainThread
>03:47:52,122 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,123 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:52,123 INFO packaging: have _yum_lock for MainThread
>03:47:52,124 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,125 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:52,125 INFO packaging: have _yum_lock for MainThread
>03:47:52,125 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,126 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:52,126 INFO packaging: have _yum_lock for MainThread
>03:47:52,127 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,128 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:52,128 INFO packaging: have _yum_lock for MainThread
>03:47:52,128 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,129 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:52,130 INFO packaging: have _yum_lock for MainThread
>03:47:52,130 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,131 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:52,131 INFO packaging: have _yum_lock for MainThread
>03:47:52,132 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,133 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:52,134 INFO packaging: have _yum_lock for MainThread
>03:47:52,134 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,135 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:52,136 INFO packaging: have _yum_lock for MainThread
>03:47:52,136 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,137 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:52,137 INFO packaging: have _yum_lock for MainThread
>03:47:52,137 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,138 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:52,139 INFO packaging: have _yum_lock for MainThread
>03:47:52,139 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,140 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:52,140 INFO packaging: have _yum_lock for MainThread
>03:47:52,141 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,142 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:52,142 INFO packaging: have _yum_lock for MainThread
>03:47:52,142 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,144 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:52,144 INFO packaging: have _yum_lock for MainThread
>03:47:52,144 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,146 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:52,146 INFO packaging: have _yum_lock for MainThread
>03:47:52,146 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,148 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:52,148 INFO packaging: have _yum_lock for MainThread
>03:47:52,148 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,149 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:52,150 INFO packaging: have _yum_lock for MainThread
>03:47:52,150 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,151 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:52,151 INFO packaging: have _yum_lock for MainThread
>03:47:52,151 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,153 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:52,153 INFO packaging: have _yum_lock for MainThread
>03:47:52,153 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,154 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:52,155 INFO packaging: have _yum_lock for MainThread
>03:47:52,155 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,156 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:52,156 INFO packaging: have _yum_lock for MainThread
>03:47:52,156 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,157 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:52,158 INFO packaging: have _yum_lock for MainThread
>03:47:52,158 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,159 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:52,159 INFO packaging: have _yum_lock for MainThread
>03:47:52,160 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,161 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:52,161 INFO packaging: have _yum_lock for MainThread
>03:47:52,161 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,162 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:52,163 INFO packaging: have _yum_lock for MainThread
>03:47:52,163 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,164 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:52,164 INFO packaging: have _yum_lock for MainThread
>03:47:52,165 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,166 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:52,166 INFO packaging: have _yum_lock for MainThread
>03:47:52,166 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,167 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:52,168 INFO packaging: have _yum_lock for MainThread
>03:47:52,168 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,169 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:52,169 INFO packaging: have _yum_lock for MainThread
>03:47:52,169 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,171 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:52,171 INFO packaging: have _yum_lock for MainThread
>03:47:52,171 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,172 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:52,172 INFO packaging: have _yum_lock for MainThread
>03:47:52,173 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,174 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:52,174 INFO packaging: have _yum_lock for MainThread
>03:47:52,174 INFO packaging: gave up _yum_lock for MainThread
>03:47:52,175 INFO packaging: 
about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:52,176 INFO packaging: have _yum_lock for MainThread >03:47:52,176 INFO packaging: gave up _yum_lock for MainThread >03:47:52,177 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:52,177 INFO packaging: have _yum_lock for MainThread >03:47:52,178 INFO packaging: gave up _yum_lock for MainThread >03:47:52,179 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:52,179 INFO packaging: have _yum_lock for MainThread >03:47:52,179 INFO packaging: gave up _yum_lock for MainThread >03:47:52,180 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:52,180 INFO packaging: have _yum_lock for MainThread >03:47:52,181 INFO packaging: gave up _yum_lock for MainThread >03:47:52,182 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:52,182 INFO packaging: have _yum_lock for MainThread >03:47:52,182 INFO packaging: gave up _yum_lock for MainThread >03:47:52,183 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1166 (_groupHasInstallableMembers) >03:47:52,184 INFO packaging: have _yum_lock for MainThread >03:47:52,184 INFO packaging: gave up _yum_lock for MainThread >03:47:52,185 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:52,185 INFO packaging: have _yum_lock for MainThread >03:47:52,186 INFO packaging: gave up 
_yum_lock for MainThread >03:47:52,187 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:52,187 INFO packaging: have _yum_lock for MainThread >03:47:52,187 INFO packaging: gave up _yum_lock for MainThread >03:47:52,188 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:52,188 INFO packaging: have _yum_lock for MainThread >03:47:52,189 INFO packaging: gave up _yum_lock for MainThread >03:47:52,190 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:52,190 INFO packaging: have _yum_lock for MainThread >03:47:52,190 INFO packaging: gave up _yum_lock for MainThread >03:47:52,191 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:52,192 INFO packaging: have _yum_lock for MainThread >03:47:52,192 INFO packaging: gave up _yum_lock for MainThread >03:47:52,193 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:52,193 INFO packaging: have _yum_lock for MainThread >03:47:52,194 INFO packaging: gave up _yum_lock for MainThread >03:47:52,194 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:52,195 INFO packaging: have _yum_lock for MainThread >03:47:52,195 INFO packaging: gave up _yum_lock for MainThread >03:47:52,196 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:52,196 INFO packaging: have _yum_lock for 
MainThread >03:47:52,197 INFO packaging: gave up _yum_lock for MainThread >03:47:52,198 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:52,198 INFO packaging: have _yum_lock for MainThread >03:47:52,198 INFO packaging: gave up _yum_lock for MainThread >03:47:52,199 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:52,200 INFO packaging: have _yum_lock for MainThread >03:47:52,200 INFO packaging: gave up _yum_lock for MainThread >03:47:52,201 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:52,201 INFO packaging: have _yum_lock for MainThread >03:47:52,201 INFO packaging: gave up _yum_lock for MainThread >03:47:52,202 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:52,203 INFO packaging: have _yum_lock for MainThread >03:47:52,203 INFO packaging: gave up _yum_lock for MainThread >03:47:52,204 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:52,204 INFO packaging: have _yum_lock for MainThread >03:47:52,205 INFO packaging: gave up _yum_lock for MainThread >03:47:52,206 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:52,206 INFO packaging: have _yum_lock for MainThread >03:47:52,206 INFO packaging: gave up _yum_lock for MainThread >03:47:52,207 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) 
>03:47:52,207 INFO packaging: have _yum_lock for MainThread >03:47:52,208 INFO packaging: gave up _yum_lock for MainThread >03:47:52,209 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:52,209 INFO packaging: have _yum_lock for MainThread >03:47:52,209 INFO packaging: gave up _yum_lock for MainThread >03:47:52,210 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:52,210 INFO packaging: have _yum_lock for MainThread >03:47:52,211 INFO packaging: gave up _yum_lock for MainThread >03:47:52,212 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:52,212 INFO packaging: have _yum_lock for MainThread >03:47:52,212 INFO packaging: gave up _yum_lock for MainThread >03:47:52,213 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:52,214 INFO packaging: have _yum_lock for MainThread >03:47:52,214 INFO packaging: gave up _yum_lock for MainThread >03:47:52,215 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:52,215 INFO packaging: have _yum_lock for MainThread >03:47:52,215 INFO packaging: gave up _yum_lock for MainThread >03:47:52,216 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:52,217 INFO packaging: have _yum_lock for MainThread >03:47:52,217 INFO packaging: gave up _yum_lock for MainThread >03:47:52,218 INFO packaging: about to acquire _yum_lock for MainThread at 
/usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:52,218 INFO packaging: have _yum_lock for MainThread >03:47:52,218 INFO packaging: gave up _yum_lock for MainThread >03:47:52,219 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:52,220 INFO packaging: have _yum_lock for MainThread >03:47:52,220 INFO packaging: gave up _yum_lock for MainThread >03:47:52,221 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:52,221 INFO packaging: have _yum_lock for MainThread >03:47:52,222 INFO packaging: gave up _yum_lock for MainThread >03:47:52,223 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:52,223 INFO packaging: have _yum_lock for MainThread >03:47:52,223 INFO packaging: gave up _yum_lock for MainThread >03:47:52,224 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:52,224 INFO packaging: have _yum_lock for MainThread >03:47:52,225 INFO packaging: gave up _yum_lock for MainThread >03:47:52,226 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:52,226 INFO packaging: have _yum_lock for MainThread >03:47:52,226 INFO packaging: gave up _yum_lock for MainThread >03:47:52,227 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:52,227 INFO packaging: have _yum_lock for MainThread >03:47:52,228 INFO packaging: gave up _yum_lock for MainThread >03:47:52,229 INFO packaging: 
about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:52,229 INFO packaging: have _yum_lock for MainThread >03:47:52,229 INFO packaging: gave up _yum_lock for MainThread >03:47:52,231 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:52,231 INFO packaging: have _yum_lock for MainThread >03:47:52,231 INFO packaging: gave up _yum_lock for MainThread >03:47:52,232 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:52,232 INFO packaging: have _yum_lock for MainThread >03:47:52,233 INFO packaging: gave up _yum_lock for MainThread >03:47:52,234 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:52,234 INFO packaging: have _yum_lock for MainThread >03:47:52,234 INFO packaging: gave up _yum_lock for MainThread >03:47:52,235 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:52,235 INFO packaging: have _yum_lock for MainThread >03:47:52,236 INFO packaging: gave up _yum_lock for MainThread >03:47:52,237 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:52,237 INFO packaging: have _yum_lock for MainThread >03:47:52,237 INFO packaging: gave up _yum_lock for MainThread >03:47:52,238 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:52,239 INFO packaging: have _yum_lock for MainThread >03:47:52,239 INFO packaging: gave up _yum_lock 
for MainThread >03:47:52,240 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:52,240 INFO packaging: have _yum_lock for MainThread >03:47:52,240 INFO packaging: gave up _yum_lock for MainThread >03:47:52,241 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:52,242 INFO packaging: have _yum_lock for MainThread >03:47:52,242 INFO packaging: gave up _yum_lock for MainThread >03:47:52,243 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:52,243 INFO packaging: have _yum_lock for MainThread >03:47:52,243 INFO packaging: gave up _yum_lock for MainThread >03:47:52,244 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:52,245 INFO packaging: have _yum_lock for MainThread >03:47:52,245 INFO packaging: gave up _yum_lock for MainThread >03:47:52,246 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:52,246 INFO packaging: have _yum_lock for MainThread >03:47:52,247 INFO packaging: gave up _yum_lock for MainThread >03:47:52,247 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:52,248 INFO packaging: have _yum_lock for MainThread >03:47:52,248 INFO packaging: gave up _yum_lock for MainThread >03:47:52,249 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:52,249 INFO packaging: have _yum_lock for MainThread 
>03:47:52,249 INFO packaging: gave up _yum_lock for MainThread >03:47:52,251 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:52,251 INFO packaging: have _yum_lock for MainThread >03:47:52,251 INFO packaging: gave up _yum_lock for MainThread >03:47:52,252 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:52,252 INFO packaging: have _yum_lock for MainThread >03:47:52,253 INFO packaging: gave up _yum_lock for MainThread >03:47:52,254 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:52,254 INFO packaging: have _yum_lock for MainThread >03:47:52,254 INFO packaging: gave up _yum_lock for MainThread >03:47:52,255 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:52,256 INFO packaging: have _yum_lock for MainThread >03:47:52,256 INFO packaging: gave up _yum_lock for MainThread >03:47:52,257 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:52,257 INFO packaging: have _yum_lock for MainThread >03:47:52,257 INFO packaging: gave up _yum_lock for MainThread >03:47:52,258 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:52,259 INFO packaging: have _yum_lock for MainThread >03:47:52,259 INFO packaging: gave up _yum_lock for MainThread >03:47:52,260 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:52,260 INFO 
packaging: have _yum_lock for MainThread >03:47:52,260 INFO packaging: gave up _yum_lock for MainThread >03:47:52,261 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:52,262 INFO packaging: have _yum_lock for MainThread >03:47:52,262 INFO packaging: gave up _yum_lock for MainThread >03:47:52,263 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:52,263 INFO packaging: have _yum_lock for MainThread >03:47:52,264 INFO packaging: gave up _yum_lock for MainThread >03:47:52,265 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:52,265 INFO packaging: have _yum_lock for MainThread >03:47:52,265 INFO packaging: gave up _yum_lock for MainThread >03:47:52,266 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:52,266 INFO packaging: have _yum_lock for MainThread >03:47:52,266 INFO packaging: gave up _yum_lock for MainThread >03:47:52,268 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:52,268 INFO packaging: have _yum_lock for MainThread >03:47:52,268 INFO packaging: gave up _yum_lock for MainThread >03:47:52,269 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:52,269 INFO packaging: have _yum_lock for MainThread >03:47:52,270 INFO packaging: gave up _yum_lock for MainThread >03:47:52,271 INFO packaging: about to acquire _yum_lock for MainThread at 
/usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:52,271 INFO packaging: have _yum_lock for MainThread >03:47:52,271 INFO packaging: gave up _yum_lock for MainThread >03:47:52,272 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:52,272 INFO packaging: have _yum_lock for MainThread >03:47:52,273 INFO packaging: gave up _yum_lock for MainThread >03:47:52,274 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:52,274 INFO packaging: have _yum_lock for MainThread >03:47:52,274 INFO packaging: gave up _yum_lock for MainThread >03:47:52,275 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:52,275 INFO packaging: have _yum_lock for MainThread >03:47:52,276 INFO packaging: gave up _yum_lock for MainThread >03:47:52,277 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:52,277 INFO packaging: have _yum_lock for MainThread >03:47:52,277 INFO packaging: gave up _yum_lock for MainThread >03:47:52,278 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:52,278 INFO packaging: have _yum_lock for MainThread >03:47:52,279 INFO packaging: gave up _yum_lock for MainThread >03:47:52,280 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:52,280 INFO packaging: have _yum_lock for MainThread >03:47:52,280 INFO packaging: gave up _yum_lock for MainThread >03:47:52,281 INFO packaging: 
about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:52,282 INFO packaging: have _yum_lock for MainThread >03:47:52,282 INFO packaging: gave up _yum_lock for MainThread >03:47:52,283 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:52,283 INFO packaging: have _yum_lock for MainThread >03:47:52,283 INFO packaging: gave up _yum_lock for MainThread >03:47:52,284 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:52,285 INFO packaging: have _yum_lock for MainThread >03:47:52,285 INFO packaging: gave up _yum_lock for MainThread >03:47:52,286 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:52,286 INFO packaging: have _yum_lock for MainThread >03:47:52,286 INFO packaging: gave up _yum_lock for MainThread >03:47:52,287 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:52,288 INFO packaging: have _yum_lock for MainThread >03:47:52,288 INFO packaging: gave up _yum_lock for MainThread >03:47:52,289 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:52,289 INFO packaging: have _yum_lock for MainThread >03:47:52,290 INFO packaging: gave up _yum_lock for MainThread >03:47:52,291 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:52,291 INFO packaging: have _yum_lock for MainThread >03:47:52,291 INFO packaging: gave up _yum_lock for 
MainThread >03:47:52,292 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:52,292 INFO packaging: have _yum_lock for MainThread >03:47:52,293 INFO packaging: gave up _yum_lock for MainThread >03:47:52,294 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:52,294 INFO packaging: have _yum_lock for MainThread >03:47:52,294 INFO packaging: gave up _yum_lock for MainThread >03:47:52,295 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:52,295 INFO packaging: have _yum_lock for MainThread >03:47:52,296 INFO packaging: gave up _yum_lock for MainThread >03:47:52,297 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:52,297 INFO packaging: have _yum_lock for MainThread >03:47:52,297 INFO packaging: gave up _yum_lock for MainThread >03:47:52,298 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:52,299 INFO packaging: have _yum_lock for MainThread >03:47:52,299 INFO packaging: gave up _yum_lock for MainThread >03:47:52,300 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:52,300 INFO packaging: have _yum_lock for MainThread >03:47:52,300 INFO packaging: gave up _yum_lock for MainThread >03:47:52,301 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:52,302 INFO packaging: have _yum_lock for MainThread 
>03:47:52,302 INFO packaging: gave up _yum_lock for MainThread >03:47:52,303 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:52,303 INFO packaging: have _yum_lock for MainThread >03:47:52,303 INFO packaging: gave up _yum_lock for MainThread >03:47:52,305 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:52,305 INFO packaging: have _yum_lock for MainThread >03:47:52,305 INFO packaging: gave up _yum_lock for MainThread >03:47:52,306 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:52,306 INFO packaging: have _yum_lock for MainThread >03:47:52,306 INFO packaging: gave up _yum_lock for MainThread >03:47:52,308 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:52,308 INFO packaging: have _yum_lock for MainThread >03:47:52,308 INFO packaging: gave up _yum_lock for MainThread >03:47:52,309 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:52,309 INFO packaging: have _yum_lock for MainThread >03:47:52,310 INFO packaging: gave up _yum_lock for MainThread >03:47:52,311 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:52,311 INFO packaging: have _yum_lock for MainThread >03:47:52,311 INFO packaging: gave up _yum_lock for MainThread >03:47:52,312 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:52,312 
INFO packaging: have _yum_lock for MainThread >03:47:52,313 INFO packaging: gave up _yum_lock for MainThread >03:47:52,314 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:52,314 INFO packaging: have _yum_lock for MainThread >03:47:52,314 INFO packaging: gave up _yum_lock for MainThread >03:47:52,315 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:52,315 INFO packaging: have _yum_lock for MainThread >03:47:52,316 INFO packaging: gave up _yum_lock for MainThread >03:47:52,317 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:52,317 INFO packaging: have _yum_lock for MainThread >03:47:52,318 INFO packaging: gave up _yum_lock for MainThread >03:47:52,319 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:52,319 INFO packaging: have _yum_lock for MainThread >03:47:52,319 INFO packaging: gave up _yum_lock for MainThread >03:47:52,320 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:52,320 INFO packaging: have _yum_lock for MainThread >03:47:52,321 INFO packaging: gave up _yum_lock for MainThread >03:47:52,322 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:52,322 INFO packaging: have _yum_lock for MainThread >03:47:52,322 INFO packaging: gave up _yum_lock for MainThread >03:47:52,323 INFO packaging: about to acquire _yum_lock for MainThread at 
/usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:52,324 INFO packaging: have _yum_lock for MainThread >03:47:52,324 INFO packaging: gave up _yum_lock for MainThread >03:47:52,325 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:52,325 INFO packaging: have _yum_lock for MainThread >03:47:52,325 INFO packaging: gave up _yum_lock for MainThread >03:47:52,326 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:52,327 INFO packaging: have _yum_lock for MainThread >03:47:52,327 INFO packaging: gave up _yum_lock for MainThread >03:47:52,328 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:52,328 INFO packaging: have _yum_lock for MainThread >03:47:52,328 INFO packaging: gave up _yum_lock for MainThread >03:47:52,330 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1141 (groupDescription) >03:47:52,330 INFO packaging: have _yum_lock for MainThread >03:47:52,330 INFO packaging: gave up _yum_lock for MainThread >03:47:52,331 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:269 (_addAddon) >03:47:52,331 INFO packaging: have _yum_lock for MainThread >03:47:52,332 INFO packaging: gave up _yum_lock for MainThread >03:47:52,333 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1141 (groupDescription) >03:47:52,334 INFO packaging: have _yum_lock for MainThread >03:47:52,334 INFO packaging: gave up _yum_lock for MainThread >03:47:52,335 INFO packaging: about 
to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:269 (_addAddon) >03:47:52,335 INFO packaging: have _yum_lock for MainThread >03:47:52,336 INFO packaging: gave up _yum_lock for MainThread >03:47:52,337 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1141 (groupDescription) >03:47:52,338 INFO packaging: have _yum_lock for MainThread >03:47:52,338 INFO packaging: gave up _yum_lock for MainThread >03:47:52,339 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:269 (_addAddon) >03:47:52,339 INFO packaging: have _yum_lock for MainThread >03:47:52,340 INFO packaging: gave up _yum_lock for MainThread >03:47:52,341 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1141 (groupDescription) >03:47:52,342 INFO packaging: have _yum_lock for MainThread >03:47:52,342 INFO packaging: gave up _yum_lock for MainThread >03:47:52,343 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:269 (_addAddon) >03:47:52,343 INFO packaging: have _yum_lock for MainThread >03:47:52,344 INFO packaging: gave up _yum_lock for MainThread >03:47:52,345 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1141 (groupDescription) >03:47:52,345 INFO packaging: have _yum_lock for MainThread >03:47:52,345 INFO packaging: gave up _yum_lock for MainThread >03:47:52,347 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:269 (_addAddon) >03:47:52,347 INFO packaging: have _yum_lock for MainThread >03:47:52,347 INFO packaging: gave up _yum_lock for MainThread >03:47:52,349 
INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1141 (groupDescription) >03:47:52,349 INFO packaging: have _yum_lock for MainThread >03:47:52,349 INFO packaging: gave up _yum_lock for MainThread >03:47:52,350 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:269 (_addAddon) >03:47:52,350 INFO packaging: have _yum_lock for MainThread >03:47:52,351 INFO packaging: gave up _yum_lock for MainThread >03:47:52,352 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1141 (groupDescription) >03:47:52,353 INFO packaging: have _yum_lock for MainThread >03:47:52,353 INFO packaging: gave up _yum_lock for MainThread >03:47:52,354 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:269 (_addAddon) >03:47:52,354 INFO packaging: have _yum_lock for MainThread >03:47:52,355 INFO packaging: gave up _yum_lock for MainThread >03:47:52,356 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1141 (groupDescription) >03:47:52,356 INFO packaging: have _yum_lock for MainThread >03:47:52,357 INFO packaging: gave up _yum_lock for MainThread >03:47:52,358 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:269 (_addAddon) >03:47:52,358 INFO packaging: have _yum_lock for MainThread >03:47:52,358 INFO packaging: gave up _yum_lock for MainThread >03:47:55,314 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1079 (environmentGroups) >03:47:55,315 INFO packaging: have _yum_lock for MainThread >03:47:55,316 INFO packaging: gave up _yum_lock 
for MainThread >03:47:55,319 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:337 (on_environment_toggled) >03:47:55,319 INFO packaging: have _yum_lock for MainThread >03:47:55,320 INFO packaging: gave up _yum_lock for MainThread >03:47:55,321 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1114 (groups) >03:47:55,322 INFO packaging: have _yum_lock for MainThread >03:47:55,322 INFO packaging: gave up _yum_lock for MainThread >03:47:55,323 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:286 (refreshAddons) >03:47:55,323 INFO packaging: have _yum_lock for MainThread >03:47:55,324 INFO packaging: gave up _yum_lock for MainThread >03:47:55,325 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,325 INFO packaging: have _yum_lock for MainThread >03:47:55,325 INFO packaging: gave up _yum_lock for MainThread >03:47:55,326 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,327 INFO packaging: have _yum_lock for MainThread >03:47:55,327 INFO packaging: gave up _yum_lock for MainThread >03:47:55,328 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,328 INFO packaging: have _yum_lock for MainThread >03:47:55,329 INFO packaging: gave up _yum_lock for MainThread >03:47:55,330 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,330 INFO packaging: have _yum_lock for MainThread 
>03:47:55,330 INFO packaging: gave up _yum_lock for MainThread >03:47:55,332 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,332 INFO packaging: have _yum_lock for MainThread >03:47:55,332 INFO packaging: gave up _yum_lock for MainThread >03:47:55,333 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,333 INFO packaging: have _yum_lock for MainThread >03:47:55,334 INFO packaging: gave up _yum_lock for MainThread >03:47:55,335 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,335 INFO packaging: have _yum_lock for MainThread >03:47:55,335 INFO packaging: gave up _yum_lock for MainThread >03:47:55,336 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,337 INFO packaging: have _yum_lock for MainThread >03:47:55,337 INFO packaging: gave up _yum_lock for MainThread >03:47:55,338 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,338 INFO packaging: have _yum_lock for MainThread >03:47:55,339 INFO packaging: gave up _yum_lock for MainThread >03:47:55,340 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,340 INFO packaging: have _yum_lock for MainThread >03:47:55,340 INFO packaging: gave up _yum_lock for MainThread >03:47:55,342 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,342 
INFO packaging: have _yum_lock for MainThread >03:47:55,342 INFO packaging: gave up _yum_lock for MainThread >03:47:55,343 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,343 INFO packaging: have _yum_lock for MainThread >03:47:55,344 INFO packaging: gave up _yum_lock for MainThread >03:47:55,345 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,345 INFO packaging: have _yum_lock for MainThread >03:47:55,345 INFO packaging: gave up _yum_lock for MainThread >03:47:55,347 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,347 INFO packaging: have _yum_lock for MainThread >03:47:55,347 INFO packaging: gave up _yum_lock for MainThread >03:47:55,348 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,349 INFO packaging: have _yum_lock for MainThread >03:47:55,349 INFO packaging: gave up _yum_lock for MainThread >03:47:55,350 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,350 INFO packaging: have _yum_lock for MainThread >03:47:55,350 INFO packaging: gave up _yum_lock for MainThread >03:47:55,352 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,352 INFO packaging: have _yum_lock for MainThread >03:47:55,352 INFO packaging: gave up _yum_lock for MainThread >03:47:55,354 INFO packaging: about to acquire _yum_lock for MainThread at 
/usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,354 INFO packaging: have _yum_lock for MainThread >03:47:55,354 INFO packaging: gave up _yum_lock for MainThread >03:47:55,355 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,356 INFO packaging: have _yum_lock for MainThread >03:47:55,356 INFO packaging: gave up _yum_lock for MainThread >03:47:55,357 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,358 INFO packaging: have _yum_lock for MainThread >03:47:55,358 INFO packaging: gave up _yum_lock for MainThread >03:47:55,359 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,360 INFO packaging: have _yum_lock for MainThread >03:47:55,360 INFO packaging: gave up _yum_lock for MainThread >03:47:55,361 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,361 INFO packaging: have _yum_lock for MainThread >03:47:55,361 INFO packaging: gave up _yum_lock for MainThread >03:47:55,363 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,363 INFO packaging: have _yum_lock for MainThread >03:47:55,363 INFO packaging: gave up _yum_lock for MainThread >03:47:55,364 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,364 INFO packaging: have _yum_lock for MainThread >03:47:55,365 INFO packaging: gave up _yum_lock for MainThread >03:47:55,366 INFO packaging: about 
to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,366 INFO packaging: have _yum_lock for MainThread >03:47:55,366 INFO packaging: gave up _yum_lock for MainThread >03:47:55,368 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,368 INFO packaging: have _yum_lock for MainThread >03:47:55,368 INFO packaging: gave up _yum_lock for MainThread >03:47:55,369 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,369 INFO packaging: have _yum_lock for MainThread >03:47:55,370 INFO packaging: gave up _yum_lock for MainThread >03:47:55,371 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,371 INFO packaging: have _yum_lock for MainThread >03:47:55,371 INFO packaging: gave up _yum_lock for MainThread >03:47:55,373 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,373 INFO packaging: have _yum_lock for MainThread >03:47:55,373 INFO packaging: gave up _yum_lock for MainThread >03:47:55,374 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,374 INFO packaging: have _yum_lock for MainThread >03:47:55,375 INFO packaging: gave up _yum_lock for MainThread >03:47:55,376 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,376 INFO packaging: have _yum_lock for MainThread >03:47:55,376 INFO packaging: gave up _yum_lock for 
MainThread >03:47:55,378 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,378 INFO packaging: have _yum_lock for MainThread >03:47:55,378 INFO packaging: gave up _yum_lock for MainThread >03:47:55,379 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,379 INFO packaging: have _yum_lock for MainThread >03:47:55,380 INFO packaging: gave up _yum_lock for MainThread >03:47:55,381 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,381 INFO packaging: have _yum_lock for MainThread >03:47:55,381 INFO packaging: gave up _yum_lock for MainThread >03:47:55,382 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,383 INFO packaging: have _yum_lock for MainThread >03:47:55,383 INFO packaging: gave up _yum_lock for MainThread >03:47:55,384 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,385 INFO packaging: have _yum_lock for MainThread >03:47:55,385 INFO packaging: gave up _yum_lock for MainThread >03:47:55,386 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,386 INFO packaging: have _yum_lock for MainThread >03:47:55,386 INFO packaging: gave up _yum_lock for MainThread >03:47:55,388 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,388 INFO packaging: have _yum_lock for MainThread 
>03:47:55,388 INFO packaging: gave up _yum_lock for MainThread >03:47:55,389 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,390 INFO packaging: have _yum_lock for MainThread >03:47:55,390 INFO packaging: gave up _yum_lock for MainThread >03:47:55,391 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,391 INFO packaging: have _yum_lock for MainThread >03:47:55,391 INFO packaging: gave up _yum_lock for MainThread >03:47:55,393 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,393 INFO packaging: have _yum_lock for MainThread >03:47:55,393 INFO packaging: gave up _yum_lock for MainThread >03:47:55,394 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,395 INFO packaging: have _yum_lock for MainThread >03:47:55,395 INFO packaging: gave up _yum_lock for MainThread >03:47:55,396 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,397 INFO packaging: have _yum_lock for MainThread >03:47:55,397 INFO packaging: gave up _yum_lock for MainThread >03:47:55,398 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,398 INFO packaging: have _yum_lock for MainThread >03:47:55,398 INFO packaging: gave up _yum_lock for MainThread >03:47:55,400 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1166 (_groupHasInstallableMembers) 
>03:47:55,400 INFO packaging: have _yum_lock for MainThread >03:47:55,400 INFO packaging: gave up _yum_lock for MainThread >03:47:55,401 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,402 INFO packaging: have _yum_lock for MainThread >03:47:55,402 INFO packaging: gave up _yum_lock for MainThread >03:47:55,403 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,403 INFO packaging: have _yum_lock for MainThread >03:47:55,404 INFO packaging: gave up _yum_lock for MainThread >03:47:55,405 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,405 INFO packaging: have _yum_lock for MainThread >03:47:55,405 INFO packaging: gave up _yum_lock for MainThread >03:47:55,406 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,407 INFO packaging: have _yum_lock for MainThread >03:47:55,407 INFO packaging: gave up _yum_lock for MainThread >03:47:55,408 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,408 INFO packaging: have _yum_lock for MainThread >03:47:55,409 INFO packaging: gave up _yum_lock for MainThread >03:47:55,410 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,410 INFO packaging: have _yum_lock for MainThread >03:47:55,410 INFO packaging: gave up _yum_lock for MainThread >03:47:55,411 INFO packaging: about to acquire _yum_lock for MainThread at 
/usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,412 INFO packaging: have _yum_lock for MainThread >03:47:55,412 INFO packaging: gave up _yum_lock for MainThread >03:47:55,413 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,413 INFO packaging: have _yum_lock for MainThread >03:47:55,414 INFO packaging: gave up _yum_lock for MainThread >03:47:55,415 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,415 INFO packaging: have _yum_lock for MainThread >03:47:55,415 INFO packaging: gave up _yum_lock for MainThread >03:47:55,416 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,417 INFO packaging: have _yum_lock for MainThread >03:47:55,417 INFO packaging: gave up _yum_lock for MainThread >03:47:55,418 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,418 INFO packaging: have _yum_lock for MainThread >03:47:55,419 INFO packaging: gave up _yum_lock for MainThread >03:47:55,420 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,420 INFO packaging: have _yum_lock for MainThread >03:47:55,420 INFO packaging: gave up _yum_lock for MainThread >03:47:55,421 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,422 INFO packaging: have _yum_lock for MainThread >03:47:55,422 INFO packaging: gave up _yum_lock for MainThread >03:47:55,423 INFO packaging: about 
to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,423 INFO packaging: have _yum_lock for MainThread >03:47:55,424 INFO packaging: gave up _yum_lock for MainThread >03:47:55,425 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,425 INFO packaging: have _yum_lock for MainThread >03:47:55,425 INFO packaging: gave up _yum_lock for MainThread >03:47:55,426 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,427 INFO packaging: have _yum_lock for MainThread >03:47:55,427 INFO packaging: gave up _yum_lock for MainThread >03:47:55,428 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,428 INFO packaging: have _yum_lock for MainThread >03:47:55,429 INFO packaging: gave up _yum_lock for MainThread >03:47:55,430 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1166 (_groupHasInstallableMembers) >03:47:55,430 INFO packaging: have _yum_lock for MainThread >03:47:55,430 INFO packaging: gave up _yum_lock for MainThread >03:47:55,431 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,432 INFO packaging: have _yum_lock for MainThread >03:47:55,432 INFO packaging: gave up _yum_lock for MainThread >03:47:55,433 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,434 INFO packaging: have _yum_lock for MainThread >03:47:55,434 INFO packaging: gave up 
_yum_lock for MainThread >03:47:55,435 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,435 INFO packaging: have _yum_lock for MainThread >03:47:55,435 INFO packaging: gave up _yum_lock for MainThread >03:47:55,437 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,437 INFO packaging: have _yum_lock for MainThread >03:47:55,437 INFO packaging: gave up _yum_lock for MainThread >03:47:55,438 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,438 INFO packaging: have _yum_lock for MainThread >03:47:55,439 INFO packaging: gave up _yum_lock for MainThread >03:47:55,440 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,440 INFO packaging: have _yum_lock for MainThread >03:47:55,440 INFO packaging: gave up _yum_lock for MainThread >03:47:55,441 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,441 INFO packaging: have _yum_lock for MainThread >03:47:55,442 INFO packaging: gave up _yum_lock for MainThread >03:47:55,443 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,443 INFO packaging: have _yum_lock for MainThread >03:47:55,443 INFO packaging: gave up _yum_lock for MainThread >03:47:55,444 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,445 INFO packaging: have _yum_lock for MainThread 
>03:47:55,445 INFO packaging: gave up _yum_lock for MainThread >03:47:55,446 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1166 (_groupHasInstallableMembers) >03:47:55,446 INFO packaging: have _yum_lock for MainThread >03:47:55,447 INFO packaging: gave up _yum_lock for MainThread >03:47:55,448 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,448 INFO packaging: have _yum_lock for MainThread >03:47:55,448 INFO packaging: gave up _yum_lock for MainThread >03:47:55,449 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,450 INFO packaging: have _yum_lock for MainThread >03:47:55,450 INFO packaging: gave up _yum_lock for MainThread >03:47:55,451 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,451 INFO packaging: have _yum_lock for MainThread >03:47:55,451 INFO packaging: gave up _yum_lock for MainThread >03:47:55,453 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,453 INFO packaging: have _yum_lock for MainThread >03:47:55,453 INFO packaging: gave up _yum_lock for MainThread >03:47:55,454 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,454 INFO packaging: have _yum_lock for MainThread >03:47:55,455 INFO packaging: gave up _yum_lock for MainThread >03:47:55,456 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) 
>03:47:55,456 INFO packaging: have _yum_lock for MainThread >03:47:55,456 INFO packaging: gave up _yum_lock for MainThread >03:47:55,457 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,458 INFO packaging: have _yum_lock for MainThread >03:47:55,458 INFO packaging: gave up _yum_lock for MainThread >03:47:55,459 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,459 INFO packaging: have _yum_lock for MainThread >03:47:55,460 INFO packaging: gave up _yum_lock for MainThread >03:47:55,461 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,461 INFO packaging: have _yum_lock for MainThread >03:47:55,461 INFO packaging: gave up _yum_lock for MainThread >03:47:55,462 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,463 INFO packaging: have _yum_lock for MainThread >03:47:55,463 INFO packaging: gave up _yum_lock for MainThread >03:47:55,464 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,464 INFO packaging: have _yum_lock for MainThread >03:47:55,465 INFO packaging: gave up _yum_lock for MainThread >03:47:55,466 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,466 INFO packaging: have _yum_lock for MainThread >03:47:55,466 INFO packaging: gave up _yum_lock for MainThread >03:47:55,467 INFO packaging: about to acquire _yum_lock for MainThread at 
/usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,467 INFO packaging: have _yum_lock for MainThread >03:47:55,468 INFO packaging: gave up _yum_lock for MainThread >03:47:55,469 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,469 INFO packaging: have _yum_lock for MainThread >03:47:55,469 INFO packaging: gave up _yum_lock for MainThread >03:47:55,470 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,471 INFO packaging: have _yum_lock for MainThread >03:47:55,471 INFO packaging: gave up _yum_lock for MainThread >03:47:55,472 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,472 INFO packaging: have _yum_lock for MainThread >03:47:55,473 INFO packaging: gave up _yum_lock for MainThread >03:47:55,474 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,474 INFO packaging: have _yum_lock for MainThread >03:47:55,474 INFO packaging: gave up _yum_lock for MainThread >03:47:55,475 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,475 INFO packaging: have _yum_lock for MainThread >03:47:55,475 INFO packaging: gave up _yum_lock for MainThread >03:47:55,477 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,477 INFO packaging: have _yum_lock for MainThread >03:47:55,477 INFO packaging: gave up _yum_lock for MainThread >03:47:55,478 INFO packaging: 
about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:55,478 INFO packaging: have _yum_lock for MainThread
>03:47:55,479 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,480 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,480 INFO packaging: have _yum_lock for MainThread
>03:47:55,480 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,481 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:55,481 INFO packaging: have _yum_lock for MainThread
>03:47:55,482 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,483 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:55,483 INFO packaging: have _yum_lock for MainThread
>03:47:55,483 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,484 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:55,485 INFO packaging: have _yum_lock for MainThread
>03:47:55,485 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,486 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,486 INFO packaging: have _yum_lock for MainThread
>03:47:55,487 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,488 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:55,488 INFO packaging: have _yum_lock for MainThread
>03:47:55,488 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,489 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:55,489 INFO packaging: have _yum_lock for MainThread
>03:47:55,490 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,491 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:55,491 INFO packaging: have _yum_lock for MainThread
>03:47:55,491 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,492 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,493 INFO packaging: have _yum_lock for MainThread
>03:47:55,493 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,494 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:55,494 INFO packaging: have _yum_lock for MainThread
>03:47:55,495 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,496 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:55,496 INFO packaging: have _yum_lock for MainThread
>03:47:55,496 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,497 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:55,497 INFO packaging: have _yum_lock for MainThread
>03:47:55,498 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,499 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,499 INFO packaging: have _yum_lock for MainThread
>03:47:55,499 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,500 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1166 (_groupHasInstallableMembers)
>03:47:55,501 INFO packaging: have _yum_lock for MainThread
>03:47:55,501 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,502 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,502 INFO packaging: have _yum_lock for MainThread
>03:47:55,502 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,503 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:55,503 INFO packaging: have _yum_lock for MainThread
>03:47:55,504 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,505 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:55,505 INFO packaging: have _yum_lock for MainThread
>03:47:55,505 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,506 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:55,506 INFO packaging: have _yum_lock for MainThread
>03:47:55,507 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,508 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,508 INFO packaging: have _yum_lock for MainThread
>03:47:55,508 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,509 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:55,509 INFO packaging: have _yum_lock for MainThread
>03:47:55,510 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,511 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:55,511 INFO packaging: have _yum_lock for MainThread
>03:47:55,511 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,512 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:55,512 INFO packaging: have _yum_lock for MainThread
>03:47:55,513 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,514 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,514 INFO packaging: have _yum_lock for MainThread
>03:47:55,514 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,515 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:55,516 INFO packaging: have _yum_lock for MainThread
>03:47:55,516 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,517 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:55,517 INFO packaging: have _yum_lock for MainThread
>03:47:55,517 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,519 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:55,519 INFO packaging: have _yum_lock for MainThread
>03:47:55,519 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,520 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,520 INFO packaging: have _yum_lock for MainThread
>03:47:55,521 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,522 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:55,522 INFO packaging: have _yum_lock for MainThread
>03:47:55,522 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,523 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:55,524 INFO packaging: have _yum_lock for MainThread
>03:47:55,524 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,525 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:55,525 INFO packaging: have _yum_lock for MainThread
>03:47:55,526 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,527 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,527 INFO packaging: have _yum_lock for MainThread
>03:47:55,527 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,528 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:55,529 INFO packaging: have _yum_lock for MainThread
>03:47:55,529 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,530 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:55,530 INFO packaging: have _yum_lock for MainThread
>03:47:55,530 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,532 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:55,532 INFO packaging: have _yum_lock for MainThread
>03:47:55,532 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,533 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,533 INFO packaging: have _yum_lock for MainThread
>03:47:55,534 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,535 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:55,535 INFO packaging: have _yum_lock for MainThread
>03:47:55,535 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,536 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:55,537 INFO packaging: have _yum_lock for MainThread
>03:47:55,537 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,538 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:55,538 INFO packaging: have _yum_lock for MainThread
>03:47:55,538 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,539 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,540 INFO packaging: have _yum_lock for MainThread
>03:47:55,540 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,541 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:55,542 INFO packaging: have _yum_lock for MainThread
>03:47:55,542 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,543 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:55,543 INFO packaging: have _yum_lock for MainThread
>03:47:55,543 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,545 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:55,545 INFO packaging: have _yum_lock for MainThread
>03:47:55,545 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,546 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,546 INFO packaging: have _yum_lock for MainThread
>03:47:55,547 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,548 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:55,548 INFO packaging: have _yum_lock for MainThread
>03:47:55,548 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,549 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:55,549 INFO packaging: have _yum_lock for MainThread
>03:47:55,550 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,551 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:55,551 INFO packaging: have _yum_lock for MainThread
>03:47:55,551 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,552 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,553 INFO packaging: have _yum_lock for MainThread
>03:47:55,553 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,554 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:55,554 INFO packaging: have _yum_lock for MainThread
>03:47:55,555 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,556 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:55,556 INFO packaging: have _yum_lock for MainThread
>03:47:55,556 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,557 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:55,558 INFO packaging: have _yum_lock for MainThread
>03:47:55,558 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,559 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,559 INFO packaging: have _yum_lock for MainThread
>03:47:55,559 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,561 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:55,561 INFO packaging: have _yum_lock for MainThread
>03:47:55,561 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,562 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:55,562 INFO packaging: have _yum_lock for MainThread
>03:47:55,563 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,564 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:55,564 INFO packaging: have _yum_lock for MainThread
>03:47:55,564 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,565 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,566 INFO packaging: have _yum_lock for MainThread
>03:47:55,566 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,567 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:55,567 INFO packaging: have _yum_lock for MainThread
>03:47:55,568 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,569 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:55,569 INFO packaging: have _yum_lock for MainThread
>03:47:55,569 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,570 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:55,570 INFO packaging: have _yum_lock for MainThread
>03:47:55,571 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,572 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,572 INFO packaging: have _yum_lock for MainThread
>03:47:55,572 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,573 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:55,574 INFO packaging: have _yum_lock for MainThread
>03:47:55,574 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,575 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:55,575 INFO packaging: have _yum_lock for MainThread
>03:47:55,576 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,577 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:55,577 INFO packaging: have _yum_lock for MainThread
>03:47:55,577 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,578 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,578 INFO packaging: have _yum_lock for MainThread
>03:47:55,579 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,580 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:55,580 INFO packaging: have _yum_lock for MainThread
>03:47:55,580 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,582 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:55,582 INFO packaging: have _yum_lock for MainThread
>03:47:55,582 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,583 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:55,584 INFO packaging: have _yum_lock for MainThread
>03:47:55,584 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,585 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,585 INFO packaging: have _yum_lock for MainThread
>03:47:55,585 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,586 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:55,587 INFO packaging: have _yum_lock for MainThread
>03:47:55,587 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,588 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:55,588 INFO packaging: have _yum_lock for MainThread
>03:47:55,589 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,590 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:55,590 INFO packaging: have _yum_lock for MainThread
>03:47:55,590 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,591 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,591 INFO packaging: have _yum_lock for MainThread
>03:47:55,592 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,593 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:55,593 INFO packaging: have _yum_lock for MainThread
>03:47:55,593 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,594 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:55,595 INFO packaging: have _yum_lock for MainThread
>03:47:55,595 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,596 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:55,596 INFO packaging: have _yum_lock for MainThread
>03:47:55,596 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,597 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,598 INFO packaging: have _yum_lock for MainThread
>03:47:55,598 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,599 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:55,599 INFO packaging: have _yum_lock for MainThread
>03:47:55,599 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,601 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:55,601 INFO packaging: have _yum_lock for MainThread
>03:47:55,601 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,602 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:55,602 INFO packaging: have _yum_lock for MainThread
>03:47:55,603 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,604 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,604 INFO packaging: have _yum_lock for MainThread
>03:47:55,604 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,605 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:55,605 INFO packaging: have _yum_lock for MainThread
>03:47:55,606 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,607 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:55,607 INFO packaging: have _yum_lock for MainThread
>03:47:55,607 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,608 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:55,609 INFO packaging: have _yum_lock for MainThread
>03:47:55,609 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,610 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,610 INFO packaging: have _yum_lock for MainThread
>03:47:55,610 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,611 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:55,612 INFO packaging: have _yum_lock for MainThread
>03:47:55,612 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,613 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:55,613 INFO packaging: have _yum_lock for MainThread
>03:47:55,614 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,615 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:55,615 INFO packaging: have _yum_lock for MainThread
>03:47:55,615 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,616 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,616 INFO packaging: have _yum_lock for MainThread
>03:47:55,617 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,618 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:55,618 INFO packaging: have _yum_lock for MainThread
>03:47:55,618 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,619 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:55,619 INFO packaging: have _yum_lock for MainThread
>03:47:55,620 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,621 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:55,621 INFO packaging: have _yum_lock for MainThread
>03:47:55,621 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,622 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,623 INFO packaging: have _yum_lock for MainThread
>03:47:55,623 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,624 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:55,624 INFO packaging: have _yum_lock for MainThread
>03:47:55,625 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,625 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:55,626 INFO packaging: have _yum_lock for MainThread
>03:47:55,626 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,627 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:55,627 INFO packaging: have _yum_lock for MainThread
>03:47:55,627 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,628 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,629 INFO packaging: have _yum_lock for MainThread
>03:47:55,629 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,630 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:55,630 INFO packaging: have _yum_lock for MainThread
>03:47:55,631 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,632 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:55,632 INFO packaging: have _yum_lock for MainThread
>03:47:55,632 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,633 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:55,633 INFO packaging: have _yum_lock for MainThread
>03:47:55,634 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,635 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,635 INFO packaging: have _yum_lock for MainThread
>03:47:55,635 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,637 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:55,637 INFO packaging: have _yum_lock for MainThread
>03:47:55,637 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,638 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:55,639 INFO packaging: have _yum_lock for MainThread
>03:47:55,639 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,640 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:55,640 INFO packaging: have _yum_lock for MainThread
>03:47:55,640 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,641 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,642 INFO packaging: have _yum_lock for MainThread
>03:47:55,642 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,643 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:55,643 INFO packaging: have _yum_lock for MainThread
>03:47:55,644 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,645 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:55,645 INFO packaging: have _yum_lock for MainThread
>03:47:55,645 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,646 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:55,647 INFO packaging: have _yum_lock for MainThread
>03:47:55,647 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,648 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,648 INFO packaging: have _yum_lock for MainThread
>03:47:55,648 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,650 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:55,650 INFO packaging: have _yum_lock for MainThread
>03:47:55,650 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,651 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:55,651 INFO packaging: have _yum_lock for MainThread
>03:47:55,652 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,653 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:55,653 INFO packaging: have _yum_lock for MainThread
>03:47:55,654 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,655 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,655 INFO packaging: have _yum_lock for MainThread
>03:47:55,655 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,656 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:55,657 INFO packaging: have _yum_lock for MainThread
>03:47:55,657 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,658 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:55,658 INFO packaging: have _yum_lock for MainThread
>03:47:55,658 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,660 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:55,660 INFO packaging: have _yum_lock for MainThread
>03:47:55,660 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,661 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,661 INFO packaging: have _yum_lock for MainThread
>03:47:55,662 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,663 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:55,663 INFO packaging: have _yum_lock for MainThread
>03:47:55,663 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,664 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:55,664 INFO packaging: have _yum_lock for MainThread
>03:47:55,665 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,666 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:55,666 INFO packaging: have _yum_lock for MainThread
>03:47:55,666 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,667 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,668 INFO packaging: have _yum_lock for MainThread
>03:47:55,668 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,669 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:55,669 INFO packaging: have _yum_lock for MainThread
>03:47:55,670 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,671 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:55,671 INFO packaging: have _yum_lock for MainThread
>03:47:55,671 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,672 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:55,672 INFO packaging: have _yum_lock for MainThread
>03:47:55,673 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,674 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,674 INFO packaging: have _yum_lock for MainThread
>03:47:55,674 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,675 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:55,676 INFO packaging: have _yum_lock for MainThread
>03:47:55,676 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,677 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:55,677 INFO packaging: have _yum_lock for MainThread
>03:47:55,678 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,679 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:55,679 INFO packaging: have _yum_lock for MainThread
>03:47:55,679 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,680 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,681 INFO packaging: have _yum_lock for MainThread
>03:47:55,681 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,682 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:55,682 INFO packaging: have _yum_lock for MainThread
>03:47:55,682 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,683 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:55,684 INFO packaging: have _yum_lock for MainThread
>03:47:55,684 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,685 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:55,685 INFO packaging: have _yum_lock for MainThread
>03:47:55,686 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,687 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,687 INFO packaging: have _yum_lock for MainThread
>03:47:55,687 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,688 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:55,689 INFO packaging: have _yum_lock for MainThread
>03:47:55,689 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,690 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:55,690 INFO packaging: have _yum_lock for MainThread
>03:47:55,690 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,692 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:55,692 INFO packaging: have _yum_lock for MainThread
>03:47:55,692 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,693 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:55,694 INFO packaging: have _yum_lock for MainThread
>03:47:55,694 INFO packaging: gave up _yum_lock for MainThread
>03:47:55,695 INFO packaging: about
to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,695 INFO packaging: have _yum_lock for MainThread >03:47:55,695 INFO packaging: gave up _yum_lock for MainThread >03:47:55,696 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,697 INFO packaging: have _yum_lock for MainThread >03:47:55,697 INFO packaging: gave up _yum_lock for MainThread >03:47:55,698 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,698 INFO packaging: have _yum_lock for MainThread >03:47:55,698 INFO packaging: gave up _yum_lock for MainThread >03:47:55,699 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,700 INFO packaging: have _yum_lock for MainThread >03:47:55,700 INFO packaging: gave up _yum_lock for MainThread >03:47:55,701 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,701 INFO packaging: have _yum_lock for MainThread >03:47:55,701 INFO packaging: gave up _yum_lock for MainThread >03:47:55,702 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,703 INFO packaging: have _yum_lock for MainThread >03:47:55,703 INFO packaging: gave up _yum_lock for MainThread >03:47:55,704 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,704 INFO packaging: have _yum_lock for MainThread >03:47:55,705 INFO packaging: gave up _yum_lock for 
MainThread >03:47:55,706 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,706 INFO packaging: have _yum_lock for MainThread >03:47:55,706 INFO packaging: gave up _yum_lock for MainThread >03:47:55,707 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1166 (_groupHasInstallableMembers) >03:47:55,707 INFO packaging: have _yum_lock for MainThread >03:47:55,708 INFO packaging: gave up _yum_lock for MainThread >03:47:55,709 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,709 INFO packaging: have _yum_lock for MainThread >03:47:55,709 INFO packaging: gave up _yum_lock for MainThread >03:47:55,710 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,710 INFO packaging: have _yum_lock for MainThread >03:47:55,711 INFO packaging: gave up _yum_lock for MainThread >03:47:55,712 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,712 INFO packaging: have _yum_lock for MainThread >03:47:55,712 INFO packaging: gave up _yum_lock for MainThread >03:47:55,713 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,714 INFO packaging: have _yum_lock for MainThread >03:47:55,714 INFO packaging: gave up _yum_lock for MainThread >03:47:55,715 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,715 INFO packaging: have _yum_lock for MainThread 
>03:47:55,715 INFO packaging: gave up _yum_lock for MainThread >03:47:55,717 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,717 INFO packaging: have _yum_lock for MainThread >03:47:55,717 INFO packaging: gave up _yum_lock for MainThread >03:47:55,718 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,719 INFO packaging: have _yum_lock for MainThread >03:47:55,719 INFO packaging: gave up _yum_lock for MainThread >03:47:55,720 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,720 INFO packaging: have _yum_lock for MainThread >03:47:55,720 INFO packaging: gave up _yum_lock for MainThread >03:47:55,721 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,722 INFO packaging: have _yum_lock for MainThread >03:47:55,722 INFO packaging: gave up _yum_lock for MainThread >03:47:55,723 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,723 INFO packaging: have _yum_lock for MainThread >03:47:55,723 INFO packaging: gave up _yum_lock for MainThread >03:47:55,724 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,725 INFO packaging: have _yum_lock for MainThread >03:47:55,725 INFO packaging: gave up _yum_lock for MainThread >03:47:55,726 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,726 
INFO packaging: have _yum_lock for MainThread >03:47:55,727 INFO packaging: gave up _yum_lock for MainThread >03:47:55,728 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,728 INFO packaging: have _yum_lock for MainThread >03:47:55,728 INFO packaging: gave up _yum_lock for MainThread >03:47:55,729 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,729 INFO packaging: have _yum_lock for MainThread >03:47:55,730 INFO packaging: gave up _yum_lock for MainThread >03:47:55,731 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,731 INFO packaging: have _yum_lock for MainThread >03:47:55,731 INFO packaging: gave up _yum_lock for MainThread >03:47:55,732 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,732 INFO packaging: have _yum_lock for MainThread >03:47:55,733 INFO packaging: gave up _yum_lock for MainThread >03:47:55,734 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,734 INFO packaging: have _yum_lock for MainThread >03:47:55,734 INFO packaging: gave up _yum_lock for MainThread >03:47:55,736 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,736 INFO packaging: have _yum_lock for MainThread >03:47:55,736 INFO packaging: gave up _yum_lock for MainThread >03:47:55,737 INFO packaging: about to acquire _yum_lock for MainThread at 
/usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,737 INFO packaging: have _yum_lock for MainThread >03:47:55,738 INFO packaging: gave up _yum_lock for MainThread >03:47:55,739 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,739 INFO packaging: have _yum_lock for MainThread >03:47:55,739 INFO packaging: gave up _yum_lock for MainThread >03:47:55,740 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,741 INFO packaging: have _yum_lock for MainThread >03:47:55,741 INFO packaging: gave up _yum_lock for MainThread >03:47:55,742 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,742 INFO packaging: have _yum_lock for MainThread >03:47:55,743 INFO packaging: gave up _yum_lock for MainThread >03:47:55,744 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,744 INFO packaging: have _yum_lock for MainThread >03:47:55,744 INFO packaging: gave up _yum_lock for MainThread >03:47:55,746 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,746 INFO packaging: have _yum_lock for MainThread >03:47:55,746 INFO packaging: gave up _yum_lock for MainThread >03:47:55,747 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,747 INFO packaging: have _yum_lock for MainThread >03:47:55,748 INFO packaging: gave up _yum_lock for MainThread >03:47:55,749 INFO packaging: about 
to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,749 INFO packaging: have _yum_lock for MainThread >03:47:55,749 INFO packaging: gave up _yum_lock for MainThread >03:47:55,750 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,750 INFO packaging: have _yum_lock for MainThread >03:47:55,751 INFO packaging: gave up _yum_lock for MainThread >03:47:55,752 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,752 INFO packaging: have _yum_lock for MainThread >03:47:55,753 INFO packaging: gave up _yum_lock for MainThread >03:47:55,754 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,754 INFO packaging: have _yum_lock for MainThread >03:47:55,754 INFO packaging: gave up _yum_lock for MainThread >03:47:55,755 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,755 INFO packaging: have _yum_lock for MainThread >03:47:55,756 INFO packaging: gave up _yum_lock for MainThread >03:47:55,757 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,757 INFO packaging: have _yum_lock for MainThread >03:47:55,757 INFO packaging: gave up _yum_lock for MainThread >03:47:55,758 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,758 INFO packaging: have _yum_lock for MainThread >03:47:55,759 INFO packaging: gave up _yum_lock for 
MainThread >03:47:55,760 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,760 INFO packaging: have _yum_lock for MainThread >03:47:55,760 INFO packaging: gave up _yum_lock for MainThread >03:47:55,761 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,762 INFO packaging: have _yum_lock for MainThread >03:47:55,762 INFO packaging: gave up _yum_lock for MainThread >03:47:55,763 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,763 INFO packaging: have _yum_lock for MainThread >03:47:55,763 INFO packaging: gave up _yum_lock for MainThread >03:47:55,765 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,765 INFO packaging: have _yum_lock for MainThread >03:47:55,765 INFO packaging: gave up _yum_lock for MainThread >03:47:55,766 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,766 INFO packaging: have _yum_lock for MainThread >03:47:55,767 INFO packaging: gave up _yum_lock for MainThread >03:47:55,768 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,768 INFO packaging: have _yum_lock for MainThread >03:47:55,768 INFO packaging: gave up _yum_lock for MainThread >03:47:55,769 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,769 INFO packaging: have _yum_lock for MainThread 
>03:47:55,770 INFO packaging: gave up _yum_lock for MainThread >03:47:55,771 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,771 INFO packaging: have _yum_lock for MainThread >03:47:55,771 INFO packaging: gave up _yum_lock for MainThread >03:47:55,772 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,773 INFO packaging: have _yum_lock for MainThread >03:47:55,773 INFO packaging: gave up _yum_lock for MainThread >03:47:55,774 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,774 INFO packaging: have _yum_lock for MainThread >03:47:55,774 INFO packaging: gave up _yum_lock for MainThread >03:47:55,776 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,776 INFO packaging: have _yum_lock for MainThread >03:47:55,776 INFO packaging: gave up _yum_lock for MainThread >03:47:55,777 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,777 INFO packaging: have _yum_lock for MainThread >03:47:55,778 INFO packaging: gave up _yum_lock for MainThread >03:47:55,779 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,779 INFO packaging: have _yum_lock for MainThread >03:47:55,779 INFO packaging: gave up _yum_lock for MainThread >03:47:55,780 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,780 
INFO packaging: have _yum_lock for MainThread >03:47:55,781 INFO packaging: gave up _yum_lock for MainThread >03:47:55,782 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,782 INFO packaging: have _yum_lock for MainThread >03:47:55,782 INFO packaging: gave up _yum_lock for MainThread >03:47:55,783 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,784 INFO packaging: have _yum_lock for MainThread >03:47:55,784 INFO packaging: gave up _yum_lock for MainThread >03:47:55,785 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,785 INFO packaging: have _yum_lock for MainThread >03:47:55,785 INFO packaging: gave up _yum_lock for MainThread >03:47:55,787 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,787 INFO packaging: have _yum_lock for MainThread >03:47:55,787 INFO packaging: gave up _yum_lock for MainThread >03:47:55,788 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,789 INFO packaging: have _yum_lock for MainThread >03:47:55,789 INFO packaging: gave up _yum_lock for MainThread >03:47:55,790 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,790 INFO packaging: have _yum_lock for MainThread >03:47:55,790 INFO packaging: gave up _yum_lock for MainThread >03:47:55,791 INFO packaging: about to acquire _yum_lock for MainThread at 
/usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,792 INFO packaging: have _yum_lock for MainThread >03:47:55,792 INFO packaging: gave up _yum_lock for MainThread >03:47:55,793 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,793 INFO packaging: have _yum_lock for MainThread >03:47:55,793 INFO packaging: gave up _yum_lock for MainThread >03:47:55,794 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,795 INFO packaging: have _yum_lock for MainThread >03:47:55,795 INFO packaging: gave up _yum_lock for MainThread >03:47:55,796 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,796 INFO packaging: have _yum_lock for MainThread >03:47:55,796 INFO packaging: gave up _yum_lock for MainThread >03:47:55,798 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,798 INFO packaging: have _yum_lock for MainThread >03:47:55,798 INFO packaging: gave up _yum_lock for MainThread >03:47:55,799 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1166 (_groupHasInstallableMembers) >03:47:55,799 INFO packaging: have _yum_lock for MainThread >03:47:55,800 INFO packaging: gave up _yum_lock for MainThread >03:47:55,801 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,801 INFO packaging: have _yum_lock for MainThread >03:47:55,801 INFO packaging: gave up _yum_lock for MainThread >03:47:55,802 INFO 
packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,802 INFO packaging: have _yum_lock for MainThread >03:47:55,803 INFO packaging: gave up _yum_lock for MainThread >03:47:55,804 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,804 INFO packaging: have _yum_lock for MainThread >03:47:55,804 INFO packaging: gave up _yum_lock for MainThread >03:47:55,806 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,806 INFO packaging: have _yum_lock for MainThread >03:47:55,806 INFO packaging: gave up _yum_lock for MainThread >03:47:55,807 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,807 INFO packaging: have _yum_lock for MainThread >03:47:55,808 INFO packaging: gave up _yum_lock for MainThread >03:47:55,809 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,809 INFO packaging: have _yum_lock for MainThread >03:47:55,809 INFO packaging: gave up _yum_lock for MainThread >03:47:55,810 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,810 INFO packaging: have _yum_lock for MainThread >03:47:55,811 INFO packaging: gave up _yum_lock for MainThread >03:47:55,812 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,812 INFO packaging: have _yum_lock for MainThread >03:47:55,812 INFO packaging: gave up 
_yum_lock for MainThread >03:47:55,813 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,814 INFO packaging: have _yum_lock for MainThread >03:47:55,814 INFO packaging: gave up _yum_lock for MainThread >03:47:55,815 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,815 INFO packaging: have _yum_lock for MainThread >03:47:55,816 INFO packaging: gave up _yum_lock for MainThread >03:47:55,817 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,817 INFO packaging: have _yum_lock for MainThread >03:47:55,817 INFO packaging: gave up _yum_lock for MainThread >03:47:55,818 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,818 INFO packaging: have _yum_lock for MainThread >03:47:55,819 INFO packaging: gave up _yum_lock for MainThread >03:47:55,820 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,820 INFO packaging: have _yum_lock for MainThread >03:47:55,820 INFO packaging: gave up _yum_lock for MainThread >03:47:55,821 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,821 INFO packaging: have _yum_lock for MainThread >03:47:55,822 INFO packaging: gave up _yum_lock for MainThread >03:47:55,823 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,823 INFO packaging: have _yum_lock for 
MainThread >03:47:55,823 INFO packaging: gave up _yum_lock for MainThread >03:47:55,824 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,825 INFO packaging: have _yum_lock for MainThread >03:47:55,825 INFO packaging: gave up _yum_lock for MainThread >03:47:55,826 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,826 INFO packaging: have _yum_lock for MainThread >03:47:55,826 INFO packaging: gave up _yum_lock for MainThread >03:47:55,828 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,828 INFO packaging: have _yum_lock for MainThread >03:47:55,828 INFO packaging: gave up _yum_lock for MainThread >03:47:55,829 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,829 INFO packaging: have _yum_lock for MainThread >03:47:55,830 INFO packaging: gave up _yum_lock for MainThread >03:47:55,831 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,831 INFO packaging: have _yum_lock for MainThread >03:47:55,831 INFO packaging: gave up _yum_lock for MainThread >03:47:55,832 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,833 INFO packaging: have _yum_lock for MainThread >03:47:55,833 INFO packaging: gave up _yum_lock for MainThread >03:47:55,834 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) 
>03:47:55,834 INFO packaging: have _yum_lock for MainThread >03:47:55,835 INFO packaging: gave up _yum_lock for MainThread >03:47:55,836 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,836 INFO packaging: have _yum_lock for MainThread >03:47:55,836 INFO packaging: gave up _yum_lock for MainThread >03:47:55,837 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,838 INFO packaging: have _yum_lock for MainThread >03:47:55,838 INFO packaging: gave up _yum_lock for MainThread >03:47:55,839 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,839 INFO packaging: have _yum_lock for MainThread >03:47:55,840 INFO packaging: gave up _yum_lock for MainThread >03:47:55,841 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,841 INFO packaging: have _yum_lock for MainThread >03:47:55,841 INFO packaging: gave up _yum_lock for MainThread >03:47:55,842 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,842 INFO packaging: have _yum_lock for MainThread >03:47:55,843 INFO packaging: gave up _yum_lock for MainThread >03:47:55,844 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,844 INFO packaging: have _yum_lock for MainThread >03:47:55,844 INFO packaging: gave up _yum_lock for MainThread >03:47:55,845 INFO packaging: about to acquire _yum_lock for MainThread at 
/usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,846 INFO packaging: have _yum_lock for MainThread >03:47:55,846 INFO packaging: gave up _yum_lock for MainThread >03:47:55,847 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,847 INFO packaging: have _yum_lock for MainThread >03:47:55,848 INFO packaging: gave up _yum_lock for MainThread >03:47:55,849 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,849 INFO packaging: have _yum_lock for MainThread >03:47:55,849 INFO packaging: gave up _yum_lock for MainThread >03:47:55,850 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,851 INFO packaging: have _yum_lock for MainThread >03:47:55,851 INFO packaging: gave up _yum_lock for MainThread >03:47:55,852 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,852 INFO packaging: have _yum_lock for MainThread >03:47:55,852 INFO packaging: gave up _yum_lock for MainThread >03:47:55,854 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,854 INFO packaging: have _yum_lock for MainThread >03:47:55,854 INFO packaging: gave up _yum_lock for MainThread >03:47:55,855 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,856 INFO packaging: have _yum_lock for MainThread >03:47:55,856 INFO packaging: gave up _yum_lock for MainThread >03:47:55,857 INFO packaging: 
about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,857 INFO packaging: have _yum_lock for MainThread >03:47:55,857 INFO packaging: gave up _yum_lock for MainThread >03:47:55,858 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,859 INFO packaging: have _yum_lock for MainThread >03:47:55,859 INFO packaging: gave up _yum_lock for MainThread >03:47:55,860 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1166 (_groupHasInstallableMembers) >03:47:55,860 INFO packaging: have _yum_lock for MainThread >03:47:55,861 INFO packaging: gave up _yum_lock for MainThread >03:47:55,862 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,862 INFO packaging: have _yum_lock for MainThread >03:47:55,862 INFO packaging: gave up _yum_lock for MainThread >03:47:55,863 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,864 INFO packaging: have _yum_lock for MainThread >03:47:55,864 INFO packaging: gave up _yum_lock for MainThread >03:47:55,865 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,865 INFO packaging: have _yum_lock for MainThread >03:47:55,865 INFO packaging: gave up _yum_lock for MainThread >03:47:55,867 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,867 INFO packaging: have _yum_lock for MainThread >03:47:55,867 INFO packaging: gave up 
_yum_lock for MainThread >03:47:55,868 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,868 INFO packaging: have _yum_lock for MainThread >03:47:55,869 INFO packaging: gave up _yum_lock for MainThread >03:47:55,870 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,870 INFO packaging: have _yum_lock for MainThread >03:47:55,871 INFO packaging: gave up _yum_lock for MainThread >03:47:55,872 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,872 INFO packaging: have _yum_lock for MainThread >03:47:55,872 INFO packaging: gave up _yum_lock for MainThread >03:47:55,873 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,873 INFO packaging: have _yum_lock for MainThread >03:47:55,874 INFO packaging: gave up _yum_lock for MainThread >03:47:55,875 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,875 INFO packaging: have _yum_lock for MainThread >03:47:55,875 INFO packaging: gave up _yum_lock for MainThread >03:47:55,876 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,877 INFO packaging: have _yum_lock for MainThread >03:47:55,877 INFO packaging: gave up _yum_lock for MainThread >03:47:55,878 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,878 INFO packaging: have _yum_lock for 
MainThread >03:47:55,879 INFO packaging: gave up _yum_lock for MainThread >03:47:55,880 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,880 INFO packaging: have _yum_lock for MainThread >03:47:55,880 INFO packaging: gave up _yum_lock for MainThread >03:47:55,881 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,882 INFO packaging: have _yum_lock for MainThread >03:47:55,882 INFO packaging: gave up _yum_lock for MainThread >03:47:55,883 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,883 INFO packaging: have _yum_lock for MainThread >03:47:55,884 INFO packaging: gave up _yum_lock for MainThread >03:47:55,885 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,885 INFO packaging: have _yum_lock for MainThread >03:47:55,885 INFO packaging: gave up _yum_lock for MainThread >03:47:55,886 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,887 INFO packaging: have _yum_lock for MainThread >03:47:55,887 INFO packaging: gave up _yum_lock for MainThread >03:47:55,888 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,888 INFO packaging: have _yum_lock for MainThread >03:47:55,889 INFO packaging: gave up _yum_lock for MainThread >03:47:55,890 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) 
>03:47:55,890 INFO packaging: have _yum_lock for MainThread >03:47:55,890 INFO packaging: gave up _yum_lock for MainThread >03:47:55,891 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,892 INFO packaging: have _yum_lock for MainThread >03:47:55,892 INFO packaging: gave up _yum_lock for MainThread >03:47:55,893 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,893 INFO packaging: have _yum_lock for MainThread >03:47:55,894 INFO packaging: gave up _yum_lock for MainThread >03:47:55,895 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,895 INFO packaging: have _yum_lock for MainThread >03:47:55,895 INFO packaging: gave up _yum_lock for MainThread >03:47:55,896 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,897 INFO packaging: have _yum_lock for MainThread >03:47:55,897 INFO packaging: gave up _yum_lock for MainThread >03:47:55,898 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,898 INFO packaging: have _yum_lock for MainThread >03:47:55,898 INFO packaging: gave up _yum_lock for MainThread >03:47:55,899 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,900 INFO packaging: have _yum_lock for MainThread >03:47:55,900 INFO packaging: gave up _yum_lock for MainThread >03:47:55,901 INFO packaging: about to acquire _yum_lock for MainThread at 
/usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,901 INFO packaging: have _yum_lock for MainThread >03:47:55,902 INFO packaging: gave up _yum_lock for MainThread >03:47:55,903 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,903 INFO packaging: have _yum_lock for MainThread >03:47:55,903 INFO packaging: gave up _yum_lock for MainThread >03:47:55,904 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,905 INFO packaging: have _yum_lock for MainThread >03:47:55,905 INFO packaging: gave up _yum_lock for MainThread >03:47:55,906 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,906 INFO packaging: have _yum_lock for MainThread >03:47:55,907 INFO packaging: gave up _yum_lock for MainThread >03:47:55,908 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,908 INFO packaging: have _yum_lock for MainThread >03:47:55,908 INFO packaging: gave up _yum_lock for MainThread >03:47:55,909 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,909 INFO packaging: have _yum_lock for MainThread >03:47:55,910 INFO packaging: gave up _yum_lock for MainThread >03:47:55,911 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,911 INFO packaging: have _yum_lock for MainThread >03:47:55,911 INFO packaging: gave up _yum_lock for MainThread >03:47:55,912 INFO packaging: 
about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,913 INFO packaging: have _yum_lock for MainThread >03:47:55,913 INFO packaging: gave up _yum_lock for MainThread >03:47:55,914 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,915 INFO packaging: have _yum_lock for MainThread >03:47:55,915 INFO packaging: gave up _yum_lock for MainThread >03:47:55,916 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,916 INFO packaging: have _yum_lock for MainThread >03:47:55,916 INFO packaging: gave up _yum_lock for MainThread >03:47:55,917 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,918 INFO packaging: have _yum_lock for MainThread >03:47:55,918 INFO packaging: gave up _yum_lock for MainThread >03:47:55,919 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,919 INFO packaging: have _yum_lock for MainThread >03:47:55,920 INFO packaging: gave up _yum_lock for MainThread >03:47:55,921 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,921 INFO packaging: have _yum_lock for MainThread >03:47:55,921 INFO packaging: gave up _yum_lock for MainThread >03:47:55,922 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,923 INFO packaging: have _yum_lock for MainThread >03:47:55,923 INFO packaging: gave up _yum_lock 
for MainThread >03:47:55,924 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,924 INFO packaging: have _yum_lock for MainThread >03:47:55,925 INFO packaging: gave up _yum_lock for MainThread >03:47:55,926 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,926 INFO packaging: have _yum_lock for MainThread >03:47:55,926 INFO packaging: gave up _yum_lock for MainThread >03:47:55,927 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,927 INFO packaging: have _yum_lock for MainThread >03:47:55,928 INFO packaging: gave up _yum_lock for MainThread >03:47:55,929 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,929 INFO packaging: have _yum_lock for MainThread >03:47:55,929 INFO packaging: gave up _yum_lock for MainThread >03:47:55,930 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,931 INFO packaging: have _yum_lock for MainThread >03:47:55,931 INFO packaging: gave up _yum_lock for MainThread >03:47:55,932 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,932 INFO packaging: have _yum_lock for MainThread >03:47:55,932 INFO packaging: gave up _yum_lock for MainThread >03:47:55,934 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,934 INFO packaging: have _yum_lock for MainThread 
>03:47:55,934 INFO packaging: gave up _yum_lock for MainThread >03:47:55,935 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,936 INFO packaging: have _yum_lock for MainThread >03:47:55,936 INFO packaging: gave up _yum_lock for MainThread >03:47:55,937 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,937 INFO packaging: have _yum_lock for MainThread >03:47:55,937 INFO packaging: gave up _yum_lock for MainThread >03:47:55,938 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,939 INFO packaging: have _yum_lock for MainThread >03:47:55,939 INFO packaging: gave up _yum_lock for MainThread >03:47:55,940 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,940 INFO packaging: have _yum_lock for MainThread >03:47:55,941 INFO packaging: gave up _yum_lock for MainThread >03:47:55,942 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,942 INFO packaging: have _yum_lock for MainThread >03:47:55,942 INFO packaging: gave up _yum_lock for MainThread >03:47:55,943 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,943 INFO packaging: have _yum_lock for MainThread >03:47:55,943 INFO packaging: gave up _yum_lock for MainThread >03:47:55,945 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,945 
INFO packaging: have _yum_lock for MainThread >03:47:55,945 INFO packaging: gave up _yum_lock for MainThread >03:47:55,946 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,946 INFO packaging: have _yum_lock for MainThread >03:47:55,946 INFO packaging: gave up _yum_lock for MainThread >03:47:55,948 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,948 INFO packaging: have _yum_lock for MainThread >03:47:55,948 INFO packaging: gave up _yum_lock for MainThread >03:47:55,949 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,950 INFO packaging: have _yum_lock for MainThread >03:47:55,950 INFO packaging: gave up _yum_lock for MainThread >03:47:55,951 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,951 INFO packaging: have _yum_lock for MainThread >03:47:55,951 INFO packaging: gave up _yum_lock for MainThread >03:47:55,953 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,953 INFO packaging: have _yum_lock for MainThread >03:47:55,953 INFO packaging: gave up _yum_lock for MainThread >03:47:55,954 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,955 INFO packaging: have _yum_lock for MainThread >03:47:55,955 INFO packaging: gave up _yum_lock for MainThread >03:47:55,956 INFO packaging: about to acquire _yum_lock for MainThread at 
/usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,956 INFO packaging: have _yum_lock for MainThread >03:47:55,956 INFO packaging: gave up _yum_lock for MainThread >03:47:55,958 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,958 INFO packaging: have _yum_lock for MainThread >03:47:55,958 INFO packaging: gave up _yum_lock for MainThread >03:47:55,959 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,960 INFO packaging: have _yum_lock for MainThread >03:47:55,960 INFO packaging: gave up _yum_lock for MainThread >03:47:55,961 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,961 INFO packaging: have _yum_lock for MainThread >03:47:55,962 INFO packaging: gave up _yum_lock for MainThread >03:47:55,963 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,963 INFO packaging: have _yum_lock for MainThread >03:47:55,963 INFO packaging: gave up _yum_lock for MainThread >03:47:55,964 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,964 INFO packaging: have _yum_lock for MainThread >03:47:55,965 INFO packaging: gave up _yum_lock for MainThread >03:47:55,966 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,966 INFO packaging: have _yum_lock for MainThread >03:47:55,967 INFO packaging: gave up _yum_lock for MainThread >03:47:55,968 INFO packaging: 
about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,968 INFO packaging: have _yum_lock for MainThread >03:47:55,968 INFO packaging: gave up _yum_lock for MainThread >03:47:55,969 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,970 INFO packaging: have _yum_lock for MainThread >03:47:55,970 INFO packaging: gave up _yum_lock for MainThread >03:47:55,971 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,971 INFO packaging: have _yum_lock for MainThread >03:47:55,971 INFO packaging: gave up _yum_lock for MainThread >03:47:55,972 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,973 INFO packaging: have _yum_lock for MainThread >03:47:55,973 INFO packaging: gave up _yum_lock for MainThread >03:47:55,974 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,975 INFO packaging: have _yum_lock for MainThread >03:47:55,975 INFO packaging: gave up _yum_lock for MainThread >03:47:55,976 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,976 INFO packaging: have _yum_lock for MainThread >03:47:55,976 INFO packaging: gave up _yum_lock for MainThread >03:47:55,977 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,978 INFO packaging: have _yum_lock for MainThread >03:47:55,978 INFO packaging: gave up _yum_lock 
for MainThread >03:47:55,979 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,979 INFO packaging: have _yum_lock for MainThread >03:47:55,979 INFO packaging: gave up _yum_lock for MainThread >03:47:55,981 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,981 INFO packaging: have _yum_lock for MainThread >03:47:55,981 INFO packaging: gave up _yum_lock for MainThread >03:47:55,982 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,982 INFO packaging: have _yum_lock for MainThread >03:47:55,983 INFO packaging: gave up _yum_lock for MainThread >03:47:55,984 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,984 INFO packaging: have _yum_lock for MainThread >03:47:55,984 INFO packaging: gave up _yum_lock for MainThread >03:47:55,986 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,986 INFO packaging: have _yum_lock for MainThread >03:47:55,986 INFO packaging: gave up _yum_lock for MainThread >03:47:55,987 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,988 INFO packaging: have _yum_lock for MainThread >03:47:55,988 INFO packaging: gave up _yum_lock for MainThread >03:47:55,989 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,989 INFO packaging: have _yum_lock for MainThread 
>03:47:55,989 INFO packaging: gave up _yum_lock for MainThread >03:47:55,990 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,991 INFO packaging: have _yum_lock for MainThread >03:47:55,991 INFO packaging: gave up _yum_lock for MainThread >03:47:55,992 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,993 INFO packaging: have _yum_lock for MainThread >03:47:55,993 INFO packaging: gave up _yum_lock for MainThread >03:47:55,994 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:55,994 INFO packaging: have _yum_lock for MainThread >03:47:55,994 INFO packaging: gave up _yum_lock for MainThread >03:47:55,995 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:55,996 INFO packaging: have _yum_lock for MainThread >03:47:55,996 INFO packaging: gave up _yum_lock for MainThread >03:47:55,997 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:55,997 INFO packaging: have _yum_lock for MainThread >03:47:55,998 INFO packaging: gave up _yum_lock for MainThread >03:47:55,999 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:55,999 INFO packaging: have _yum_lock for MainThread >03:47:55,999 INFO packaging: gave up _yum_lock for MainThread >03:47:56,001 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,001 
INFO packaging: have _yum_lock for MainThread >03:47:56,001 INFO packaging: gave up _yum_lock for MainThread >03:47:56,002 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,002 INFO packaging: have _yum_lock for MainThread >03:47:56,003 INFO packaging: gave up _yum_lock for MainThread >03:47:56,004 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,004 INFO packaging: have _yum_lock for MainThread >03:47:56,004 INFO packaging: gave up _yum_lock for MainThread >03:47:56,005 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,006 INFO packaging: have _yum_lock for MainThread >03:47:56,006 INFO packaging: gave up _yum_lock for MainThread >03:47:56,007 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,007 INFO packaging: have _yum_lock for MainThread >03:47:56,008 INFO packaging: gave up _yum_lock for MainThread >03:47:56,009 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,009 INFO packaging: have _yum_lock for MainThread >03:47:56,009 INFO packaging: gave up _yum_lock for MainThread >03:47:56,011 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1141 (groupDescription) >03:47:56,011 INFO packaging: have _yum_lock for MainThread >03:47:56,011 INFO packaging: gave up _yum_lock for MainThread >03:47:56,012 INFO packaging: about to acquire _yum_lock for MainThread at 
/usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:269 (_addAddon) >03:47:56,013 INFO packaging: have _yum_lock for MainThread >03:47:56,013 INFO packaging: gave up _yum_lock for MainThread >03:47:56,015 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1141 (groupDescription) >03:47:56,015 INFO packaging: have _yum_lock for MainThread >03:47:56,016 INFO packaging: gave up _yum_lock for MainThread >03:47:56,017 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:269 (_addAddon) >03:47:56,017 INFO packaging: have _yum_lock for MainThread >03:47:56,017 INFO packaging: gave up _yum_lock for MainThread >03:47:56,019 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1141 (groupDescription) >03:47:56,019 INFO packaging: have _yum_lock for MainThread >03:47:56,020 INFO packaging: gave up _yum_lock for MainThread >03:47:56,021 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:269 (_addAddon) >03:47:56,021 INFO packaging: have _yum_lock for MainThread >03:47:56,021 INFO packaging: gave up _yum_lock for MainThread >03:47:56,023 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1141 (groupDescription) >03:47:56,023 INFO packaging: have _yum_lock for MainThread >03:47:56,024 INFO packaging: gave up _yum_lock for MainThread >03:47:56,025 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:269 (_addAddon) >03:47:56,025 INFO packaging: have _yum_lock for MainThread >03:47:56,025 INFO packaging: gave up _yum_lock for MainThread >03:47:56,027 INFO packaging: about to acquire 
_yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1141 (groupDescription) >03:47:56,027 INFO packaging: have _yum_lock for MainThread >03:47:56,028 INFO packaging: gave up _yum_lock for MainThread >03:47:56,029 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:269 (_addAddon) >03:47:56,029 INFO packaging: have _yum_lock for MainThread >03:47:56,029 INFO packaging: gave up _yum_lock for MainThread >03:47:56,031 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1141 (groupDescription) >03:47:56,031 INFO packaging: have _yum_lock for MainThread >03:47:56,032 INFO packaging: gave up _yum_lock for MainThread >03:47:56,033 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:269 (_addAddon) >03:47:56,033 INFO packaging: have _yum_lock for MainThread >03:47:56,033 INFO packaging: gave up _yum_lock for MainThread >03:47:56,035 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1141 (groupDescription) >03:47:56,035 INFO packaging: have _yum_lock for MainThread >03:47:56,036 INFO packaging: gave up _yum_lock for MainThread >03:47:56,037 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:269 (_addAddon) >03:47:56,037 INFO packaging: have _yum_lock for MainThread >03:47:56,037 INFO packaging: gave up _yum_lock for MainThread >03:47:56,042 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1079 (environmentGroups) >03:47:56,042 INFO packaging: have _yum_lock for MainThread >03:47:56,042 INFO packaging: gave up _yum_lock for MainThread >03:47:56,043 INFO 
packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:337 (on_environment_toggled) >03:47:56,044 INFO packaging: have _yum_lock for MainThread >03:47:56,044 INFO packaging: gave up _yum_lock for MainThread >03:47:56,046 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1114 (groups) >03:47:56,046 INFO packaging: have _yum_lock for MainThread >03:47:56,046 INFO packaging: gave up _yum_lock for MainThread >03:47:56,047 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:286 (refreshAddons) >03:47:56,048 INFO packaging: have _yum_lock for MainThread >03:47:56,048 INFO packaging: gave up _yum_lock for MainThread >03:47:56,049 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,049 INFO packaging: have _yum_lock for MainThread >03:47:56,050 INFO packaging: gave up _yum_lock for MainThread >03:47:56,051 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,051 INFO packaging: have _yum_lock for MainThread >03:47:56,051 INFO packaging: gave up _yum_lock for MainThread >03:47:56,053 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,053 INFO packaging: have _yum_lock for MainThread >03:47:56,053 INFO packaging: gave up _yum_lock for MainThread >03:47:56,054 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,054 INFO packaging: have _yum_lock for MainThread >03:47:56,055 INFO packaging: gave up 
_yum_lock for MainThread >03:47:56,056 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,056 INFO packaging: have _yum_lock for MainThread >03:47:56,056 INFO packaging: gave up _yum_lock for MainThread >03:47:56,057 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,058 INFO packaging: have _yum_lock for MainThread >03:47:56,058 INFO packaging: gave up _yum_lock for MainThread >03:47:56,059 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,059 INFO packaging: have _yum_lock for MainThread >03:47:56,060 INFO packaging: gave up _yum_lock for MainThread >03:47:56,061 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,061 INFO packaging: have _yum_lock for MainThread >03:47:56,061 INFO packaging: gave up _yum_lock for MainThread >03:47:56,063 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,063 INFO packaging: have _yum_lock for MainThread >03:47:56,063 INFO packaging: gave up _yum_lock for MainThread >03:47:56,064 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,064 INFO packaging: have _yum_lock for MainThread >03:47:56,065 INFO packaging: gave up _yum_lock for MainThread >03:47:56,066 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,066 INFO packaging: have _yum_lock for 
MainThread >03:47:56,066 INFO packaging: gave up _yum_lock for MainThread >03:47:56,067 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,068 INFO packaging: have _yum_lock for MainThread >03:47:56,068 INFO packaging: gave up _yum_lock for MainThread >03:47:56,069 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,069 INFO packaging: have _yum_lock for MainThread >03:47:56,070 INFO packaging: gave up _yum_lock for MainThread >03:47:56,071 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,071 INFO packaging: have _yum_lock for MainThread >03:47:56,071 INFO packaging: gave up _yum_lock for MainThread >03:47:56,072 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,073 INFO packaging: have _yum_lock for MainThread >03:47:56,073 INFO packaging: gave up _yum_lock for MainThread >03:47:56,074 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,074 INFO packaging: have _yum_lock for MainThread >03:47:56,075 INFO packaging: gave up _yum_lock for MainThread >03:47:56,076 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,076 INFO packaging: have _yum_lock for MainThread >03:47:56,076 INFO packaging: gave up _yum_lock for MainThread >03:47:56,077 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) 
>03:47:56,077 INFO packaging: have _yum_lock for MainThread >03:47:56,078 INFO packaging: gave up _yum_lock for MainThread >03:47:56,079 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,079 INFO packaging: have _yum_lock for MainThread >03:47:56,080 INFO packaging: gave up _yum_lock for MainThread >03:47:56,081 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,081 INFO packaging: have _yum_lock for MainThread >03:47:56,081 INFO packaging: gave up _yum_lock for MainThread >03:47:56,082 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,083 INFO packaging: have _yum_lock for MainThread >03:47:56,083 INFO packaging: gave up _yum_lock for MainThread >03:47:56,084 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,084 INFO packaging: have _yum_lock for MainThread >03:47:56,084 INFO packaging: gave up _yum_lock for MainThread >03:47:56,086 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,086 INFO packaging: have _yum_lock for MainThread >03:47:56,086 INFO packaging: gave up _yum_lock for MainThread >03:47:56,087 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,088 INFO packaging: have _yum_lock for MainThread >03:47:56,088 INFO packaging: gave up _yum_lock for MainThread >03:47:56,089 INFO packaging: about to acquire _yum_lock for MainThread at 
/usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,089 INFO packaging: have _yum_lock for MainThread >03:47:56,090 INFO packaging: gave up _yum_lock for MainThread >03:47:56,091 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,091 INFO packaging: have _yum_lock for MainThread >03:47:56,091 INFO packaging: gave up _yum_lock for MainThread >03:47:56,092 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,093 INFO packaging: have _yum_lock for MainThread >03:47:56,093 INFO packaging: gave up _yum_lock for MainThread >03:47:56,094 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,094 INFO packaging: have _yum_lock for MainThread >03:47:56,094 INFO packaging: gave up _yum_lock for MainThread >03:47:56,096 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,096 INFO packaging: have _yum_lock for MainThread >03:47:56,096 INFO packaging: gave up _yum_lock for MainThread >03:47:56,097 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,098 INFO packaging: have _yum_lock for MainThread >03:47:56,098 INFO packaging: gave up _yum_lock for MainThread >03:47:56,099 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,099 INFO packaging: have _yum_lock for MainThread >03:47:56,100 INFO packaging: gave up _yum_lock for MainThread >03:47:56,101 INFO packaging: 
about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,101 INFO packaging: have _yum_lock for MainThread >03:47:56,101 INFO packaging: gave up _yum_lock for MainThread >03:47:56,102 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,103 INFO packaging: have _yum_lock for MainThread >03:47:56,103 INFO packaging: gave up _yum_lock for MainThread >03:47:56,104 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,104 INFO packaging: have _yum_lock for MainThread >03:47:56,105 INFO packaging: gave up _yum_lock for MainThread >03:47:56,106 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,106 INFO packaging: have _yum_lock for MainThread >03:47:56,106 INFO packaging: gave up _yum_lock for MainThread >03:47:56,107 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,108 INFO packaging: have _yum_lock for MainThread >03:47:56,108 INFO packaging: gave up _yum_lock for MainThread >03:47:56,109 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,109 INFO packaging: have _yum_lock for MainThread >03:47:56,110 INFO packaging: gave up _yum_lock for MainThread >03:47:56,111 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,111 INFO packaging: have _yum_lock for MainThread >03:47:56,111 INFO packaging: gave up _yum_lock 
for MainThread >03:47:56,112 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,113 INFO packaging: have _yum_lock for MainThread >03:47:56,113 INFO packaging: gave up _yum_lock for MainThread >03:47:56,114 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,114 INFO packaging: have _yum_lock for MainThread >03:47:56,114 INFO packaging: gave up _yum_lock for MainThread >03:47:56,116 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,116 INFO packaging: have _yum_lock for MainThread >03:47:56,116 INFO packaging: gave up _yum_lock for MainThread >03:47:56,117 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,117 INFO packaging: have _yum_lock for MainThread >03:47:56,118 INFO packaging: gave up _yum_lock for MainThread >03:47:56,119 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,119 INFO packaging: have _yum_lock for MainThread >03:47:56,119 INFO packaging: gave up _yum_lock for MainThread >03:47:56,120 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,121 INFO packaging: have _yum_lock for MainThread >03:47:56,121 INFO packaging: gave up _yum_lock for MainThread >03:47:56,122 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1166 (_groupHasInstallableMembers) >03:47:56,122 INFO packaging: have _yum_lock for 
MainThread >03:47:56,123 INFO packaging: gave up _yum_lock for MainThread >03:47:56,124 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,124 INFO packaging: have _yum_lock for MainThread >03:47:56,124 INFO packaging: gave up _yum_lock for MainThread >03:47:56,125 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,126 INFO packaging: have _yum_lock for MainThread >03:47:56,126 INFO packaging: gave up _yum_lock for MainThread >03:47:56,127 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,127 INFO packaging: have _yum_lock for MainThread >03:47:56,127 INFO packaging: gave up _yum_lock for MainThread >03:47:56,129 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,129 INFO packaging: have _yum_lock for MainThread >03:47:56,129 INFO packaging: gave up _yum_lock for MainThread >03:47:56,130 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,131 INFO packaging: have _yum_lock for MainThread >03:47:56,131 INFO packaging: gave up _yum_lock for MainThread >03:47:56,132 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,132 INFO packaging: have _yum_lock for MainThread >03:47:56,133 INFO packaging: gave up _yum_lock for MainThread >03:47:56,134 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) 
>03:47:56,134 INFO packaging: have _yum_lock for MainThread >03:47:56,134 INFO packaging: gave up _yum_lock for MainThread >03:47:56,135 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,136 INFO packaging: have _yum_lock for MainThread >03:47:56,136 INFO packaging: gave up _yum_lock for MainThread >03:47:56,137 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,137 INFO packaging: have _yum_lock for MainThread >03:47:56,138 INFO packaging: gave up _yum_lock for MainThread >03:47:56,139 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,139 INFO packaging: have _yum_lock for MainThread >03:47:56,139 INFO packaging: gave up _yum_lock for MainThread >03:47:56,140 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,141 INFO packaging: have _yum_lock for MainThread >03:47:56,141 INFO packaging: gave up _yum_lock for MainThread >03:47:56,142 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,142 INFO packaging: have _yum_lock for MainThread >03:47:56,142 INFO packaging: gave up _yum_lock for MainThread >03:47:56,143 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,144 INFO packaging: have _yum_lock for MainThread >03:47:56,144 INFO packaging: gave up _yum_lock for MainThread >03:47:56,145 INFO packaging: about to acquire _yum_lock for MainThread at 
/usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,145 INFO packaging: have _yum_lock for MainThread >03:47:56,145 INFO packaging: gave up _yum_lock for MainThread >03:47:56,146 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,147 INFO packaging: have _yum_lock for MainThread >03:47:56,147 INFO packaging: gave up _yum_lock for MainThread >03:47:56,148 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,148 INFO packaging: have _yum_lock for MainThread >03:47:56,149 INFO packaging: gave up _yum_lock for MainThread >03:47:56,150 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,150 INFO packaging: have _yum_lock for MainThread >03:47:56,150 INFO packaging: gave up _yum_lock for MainThread >03:47:56,151 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1166 (_groupHasInstallableMembers) >03:47:56,152 INFO packaging: have _yum_lock for MainThread >03:47:56,152 INFO packaging: gave up _yum_lock for MainThread >03:47:56,153 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,153 INFO packaging: have _yum_lock for MainThread >03:47:56,154 INFO packaging: gave up _yum_lock for MainThread >03:47:56,155 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,155 INFO packaging: have _yum_lock for MainThread >03:47:56,155 INFO packaging: gave up _yum_lock for MainThread >03:47:56,156 INFO 
packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,157 INFO packaging: have _yum_lock for MainThread >03:47:56,157 INFO packaging: gave up _yum_lock for MainThread >03:47:56,158 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,158 INFO packaging: have _yum_lock for MainThread >03:47:56,159 INFO packaging: gave up _yum_lock for MainThread >03:47:56,160 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,160 INFO packaging: have _yum_lock for MainThread >03:47:56,160 INFO packaging: gave up _yum_lock for MainThread >03:47:56,161 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,162 INFO packaging: have _yum_lock for MainThread >03:47:56,162 INFO packaging: gave up _yum_lock for MainThread >03:47:56,163 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,163 INFO packaging: have _yum_lock for MainThread >03:47:56,163 INFO packaging: gave up _yum_lock for MainThread >03:47:56,165 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,165 INFO packaging: have _yum_lock for MainThread >03:47:56,165 INFO packaging: gave up _yum_lock for MainThread >03:47:56,166 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,167 INFO packaging: have _yum_lock for MainThread >03:47:56,167 INFO packaging: gave up 
_yum_lock for MainThread >03:47:56,168 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1166 (_groupHasInstallableMembers) >03:47:56,168 INFO packaging: have _yum_lock for MainThread >03:47:56,169 INFO packaging: gave up _yum_lock for MainThread >03:47:56,170 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,170 INFO packaging: have _yum_lock for MainThread >03:47:56,170 INFO packaging: gave up _yum_lock for MainThread >03:47:56,171 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,171 INFO packaging: have _yum_lock for MainThread >03:47:56,172 INFO packaging: gave up _yum_lock for MainThread >03:47:56,173 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,173 INFO packaging: have _yum_lock for MainThread >03:47:56,173 INFO packaging: gave up _yum_lock for MainThread >03:47:56,175 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,175 INFO packaging: have _yum_lock for MainThread >03:47:56,175 INFO packaging: gave up _yum_lock for MainThread >03:47:56,176 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,177 INFO packaging: have _yum_lock for MainThread >03:47:56,177 INFO packaging: gave up _yum_lock for MainThread >03:47:56,178 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,178 INFO packaging: have 
_yum_lock for MainThread >03:47:56,178 INFO packaging: gave up _yum_lock for MainThread >03:47:56,179 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,180 INFO packaging: have _yum_lock for MainThread >03:47:56,180 INFO packaging: gave up _yum_lock for MainThread >03:47:56,181 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,181 INFO packaging: have _yum_lock for MainThread >03:47:56,182 INFO packaging: gave up _yum_lock for MainThread >03:47:56,183 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,183 INFO packaging: have _yum_lock for MainThread >03:47:56,183 INFO packaging: gave up _yum_lock for MainThread >03:47:56,185 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,185 INFO packaging: have _yum_lock for MainThread >03:47:56,185 INFO packaging: gave up _yum_lock for MainThread >03:47:56,186 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,186 INFO packaging: have _yum_lock for MainThread >03:47:56,187 INFO packaging: gave up _yum_lock for MainThread >03:47:56,188 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,188 INFO packaging: have _yum_lock for MainThread >03:47:56,188 INFO packaging: gave up _yum_lock for MainThread >03:47:56,189 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 
(refreshAddons) >03:47:56,190 INFO packaging: have _yum_lock for MainThread >03:47:56,190 INFO packaging: gave up _yum_lock for MainThread >03:47:56,191 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,192 INFO packaging: have _yum_lock for MainThread >03:47:56,192 INFO packaging: gave up _yum_lock for MainThread >03:47:56,193 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,193 INFO packaging: have _yum_lock for MainThread >03:47:56,193 INFO packaging: gave up _yum_lock for MainThread >03:47:56,194 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,195 INFO packaging: have _yum_lock for MainThread >03:47:56,195 INFO packaging: gave up _yum_lock for MainThread >03:47:56,196 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,196 INFO packaging: have _yum_lock for MainThread >03:47:56,197 INFO packaging: gave up _yum_lock for MainThread >03:47:56,198 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,198 INFO packaging: have _yum_lock for MainThread >03:47:56,198 INFO packaging: gave up _yum_lock for MainThread >03:47:56,199 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,200 INFO packaging: have _yum_lock for MainThread >03:47:56,200 INFO packaging: gave up _yum_lock for MainThread >03:47:56,201 INFO packaging: about to acquire _yum_lock for MainThread at 
/usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,201 INFO packaging: have _yum_lock for MainThread >03:47:56,201 INFO packaging: gave up _yum_lock for MainThread >03:47:56,202 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,203 INFO packaging: have _yum_lock for MainThread >03:47:56,203 INFO packaging: gave up _yum_lock for MainThread >03:47:56,204 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,205 INFO packaging: have _yum_lock for MainThread >03:47:56,205 INFO packaging: gave up _yum_lock for MainThread >03:47:56,206 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,206 INFO packaging: have _yum_lock for MainThread >03:47:56,206 INFO packaging: gave up _yum_lock for MainThread >03:47:56,208 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,208 INFO packaging: have _yum_lock for MainThread >03:47:56,208 INFO packaging: gave up _yum_lock for MainThread >03:47:56,209 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,209 INFO packaging: have _yum_lock for MainThread >03:47:56,210 INFO packaging: gave up _yum_lock for MainThread >03:47:56,211 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,211 INFO packaging: have _yum_lock for MainThread >03:47:56,211 INFO packaging: gave up _yum_lock for MainThread >03:47:56,212 INFO packaging: 
about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,213 INFO packaging: have _yum_lock for MainThread >03:47:56,213 INFO packaging: gave up _yum_lock for MainThread >03:47:56,214 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,214 INFO packaging: have _yum_lock for MainThread >03:47:56,215 INFO packaging: gave up _yum_lock for MainThread >03:47:56,216 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,216 INFO packaging: have _yum_lock for MainThread >03:47:56,216 INFO packaging: gave up _yum_lock for MainThread >03:47:56,217 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,217 INFO packaging: have _yum_lock for MainThread >03:47:56,218 INFO packaging: gave up _yum_lock for MainThread >03:47:56,219 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,219 INFO packaging: have _yum_lock for MainThread >03:47:56,219 INFO packaging: gave up _yum_lock for MainThread >03:47:56,221 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,221 INFO packaging: have _yum_lock for MainThread >03:47:56,221 INFO packaging: gave up _yum_lock for MainThread >03:47:56,222 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,222 INFO packaging: have _yum_lock for MainThread >03:47:56,223 INFO packaging: gave up _yum_lock for 
MainThread >03:47:56,224 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1166 (_groupHasInstallableMembers) >03:47:56,224 INFO packaging: have _yum_lock for MainThread >03:47:56,224 INFO packaging: gave up _yum_lock for MainThread >03:47:56,225 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,225 INFO packaging: have _yum_lock for MainThread >03:47:56,226 INFO packaging: gave up _yum_lock for MainThread >03:47:56,227 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,227 INFO packaging: have _yum_lock for MainThread >03:47:56,227 INFO packaging: gave up _yum_lock for MainThread >03:47:56,229 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,229 INFO packaging: have _yum_lock for MainThread >03:47:56,229 INFO packaging: gave up _yum_lock for MainThread >03:47:56,230 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,231 INFO packaging: have _yum_lock for MainThread >03:47:56,231 INFO packaging: gave up _yum_lock for MainThread >03:47:56,232 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,232 INFO packaging: have _yum_lock for MainThread >03:47:56,232 INFO packaging: gave up _yum_lock for MainThread >03:47:56,233 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,234 INFO packaging: have _yum_lock for 
MainThread >03:47:56,234 INFO packaging: gave up _yum_lock for MainThread >03:47:56,235 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,235 INFO packaging: have _yum_lock for MainThread >03:47:56,236 INFO packaging: gave up _yum_lock for MainThread >03:47:56,237 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,237 INFO packaging: have _yum_lock for MainThread >03:47:56,237 INFO packaging: gave up _yum_lock for MainThread >03:47:56,238 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,239 INFO packaging: have _yum_lock for MainThread >03:47:56,239 INFO packaging: gave up _yum_lock for MainThread >03:47:56,240 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,240 INFO packaging: have _yum_lock for MainThread >03:47:56,240 INFO packaging: gave up _yum_lock for MainThread >03:47:56,241 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,242 INFO packaging: have _yum_lock for MainThread >03:47:56,242 INFO packaging: gave up _yum_lock for MainThread >03:47:56,243 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,243 INFO packaging: have _yum_lock for MainThread >03:47:56,244 INFO packaging: gave up _yum_lock for MainThread >03:47:56,245 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) 
>03:47:56,245 INFO packaging: have _yum_lock for MainThread >03:47:56,245 INFO packaging: gave up _yum_lock for MainThread >03:47:56,246 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,247 INFO packaging: have _yum_lock for MainThread >03:47:56,247 INFO packaging: gave up _yum_lock for MainThread >03:47:56,248 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,248 INFO packaging: have _yum_lock for MainThread >03:47:56,248 INFO packaging: gave up _yum_lock for MainThread >03:47:56,250 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,250 INFO packaging: have _yum_lock for MainThread >03:47:56,250 INFO packaging: gave up _yum_lock for MainThread >03:47:56,251 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,252 INFO packaging: have _yum_lock for MainThread >03:47:56,252 INFO packaging: gave up _yum_lock for MainThread >03:47:56,253 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,253 INFO packaging: have _yum_lock for MainThread >03:47:56,253 INFO packaging: gave up _yum_lock for MainThread >03:47:56,254 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,255 INFO packaging: have _yum_lock for MainThread >03:47:56,255 INFO packaging: gave up _yum_lock for MainThread >03:47:56,256 INFO packaging: about to acquire _yum_lock for MainThread at 
/usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,256 INFO packaging: have _yum_lock for MainThread >03:47:56,257 INFO packaging: gave up _yum_lock for MainThread >03:47:56,258 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,258 INFO packaging: have _yum_lock for MainThread >03:47:56,258 INFO packaging: gave up _yum_lock for MainThread >03:47:56,259 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,260 INFO packaging: have _yum_lock for MainThread >03:47:56,260 INFO packaging: gave up _yum_lock for MainThread >03:47:56,261 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,261 INFO packaging: have _yum_lock for MainThread >03:47:56,262 INFO packaging: gave up _yum_lock for MainThread >03:47:56,263 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,263 INFO packaging: have _yum_lock for MainThread >03:47:56,263 INFO packaging: gave up _yum_lock for MainThread >03:47:56,264 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,264 INFO packaging: have _yum_lock for MainThread >03:47:56,265 INFO packaging: gave up _yum_lock for MainThread >03:47:56,266 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,266 INFO packaging: have _yum_lock for MainThread >03:47:56,266 INFO packaging: gave up _yum_lock for MainThread >03:47:56,267 INFO packaging: 
about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,268 INFO packaging: have _yum_lock for MainThread >03:47:56,268 INFO packaging: gave up _yum_lock for MainThread >03:47:56,269 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,269 INFO packaging: have _yum_lock for MainThread >03:47:56,270 INFO packaging: gave up _yum_lock for MainThread >03:47:56,271 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,271 INFO packaging: have _yum_lock for MainThread >03:47:56,271 INFO packaging: gave up _yum_lock for MainThread >03:47:56,272 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,273 INFO packaging: have _yum_lock for MainThread >03:47:56,273 INFO packaging: gave up _yum_lock for MainThread >03:47:56,274 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,274 INFO packaging: have _yum_lock for MainThread >03:47:56,275 INFO packaging: gave up _yum_lock for MainThread >03:47:56,276 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,276 INFO packaging: have _yum_lock for MainThread >03:47:56,276 INFO packaging: gave up _yum_lock for MainThread >03:47:56,277 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,278 INFO packaging: have _yum_lock for MainThread >03:47:56,278 INFO packaging: gave up _yum_lock for 
MainThread >03:47:56,279 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,279 INFO packaging: have _yum_lock for MainThread >03:47:56,280 INFO packaging: gave up _yum_lock for MainThread >03:47:56,281 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,281 INFO packaging: have _yum_lock for MainThread >03:47:56,281 INFO packaging: gave up _yum_lock for MainThread >03:47:56,283 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,283 INFO packaging: have _yum_lock for MainThread >03:47:56,283 INFO packaging: gave up _yum_lock for MainThread >03:47:56,284 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,284 INFO packaging: have _yum_lock for MainThread >03:47:56,285 INFO packaging: gave up _yum_lock for MainThread >03:47:56,286 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,286 INFO packaging: have _yum_lock for MainThread >03:47:56,286 INFO packaging: gave up _yum_lock for MainThread >03:47:56,288 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,288 INFO packaging: have _yum_lock for MainThread >03:47:56,288 INFO packaging: gave up _yum_lock for MainThread >03:47:56,289 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,290 INFO packaging: have _yum_lock for MainThread 
>03:47:56,290 INFO packaging: gave up _yum_lock for MainThread >03:47:56,291 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,291 INFO packaging: have _yum_lock for MainThread >03:47:56,291 INFO packaging: gave up _yum_lock for MainThread >03:47:56,293 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,293 INFO packaging: have _yum_lock for MainThread >03:47:56,293 INFO packaging: gave up _yum_lock for MainThread >03:47:56,294 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,294 INFO packaging: have _yum_lock for MainThread >03:47:56,295 INFO packaging: gave up _yum_lock for MainThread >03:47:56,296 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,296 INFO packaging: have _yum_lock for MainThread >03:47:56,296 INFO packaging: gave up _yum_lock for MainThread >03:47:56,297 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,298 INFO packaging: have _yum_lock for MainThread >03:47:56,298 INFO packaging: gave up _yum_lock for MainThread >03:47:56,299 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,300 INFO packaging: have _yum_lock for MainThread >03:47:56,300 INFO packaging: gave up _yum_lock for MainThread >03:47:56,301 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,301 
INFO packaging: have _yum_lock for MainThread >03:47:56,301 INFO packaging: gave up _yum_lock for MainThread >03:47:56,303 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,303 INFO packaging: have _yum_lock for MainThread >03:47:56,303 INFO packaging: gave up _yum_lock for MainThread >03:47:56,304 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,304 INFO packaging: have _yum_lock for MainThread >03:47:56,305 INFO packaging: gave up _yum_lock for MainThread >03:47:56,306 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,306 INFO packaging: have _yum_lock for MainThread >03:47:56,307 INFO packaging: gave up _yum_lock for MainThread >03:47:56,308 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,308 INFO packaging: have _yum_lock for MainThread >03:47:56,308 INFO packaging: gave up _yum_lock for MainThread >03:47:56,309 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,309 INFO packaging: have _yum_lock for MainThread >03:47:56,310 INFO packaging: gave up _yum_lock for MainThread >03:47:56,311 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,311 INFO packaging: have _yum_lock for MainThread >03:47:56,311 INFO packaging: gave up _yum_lock for MainThread >03:47:56,313 INFO packaging: about to acquire _yum_lock for MainThread at 
/usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,313 INFO packaging: have _yum_lock for MainThread >03:47:56,313 INFO packaging: gave up _yum_lock for MainThread >03:47:56,314 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,314 INFO packaging: have _yum_lock for MainThread >03:47:56,315 INFO packaging: gave up _yum_lock for MainThread >03:47:56,316 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,316 INFO packaging: have _yum_lock for MainThread >03:47:56,316 INFO packaging: gave up _yum_lock for MainThread >03:47:56,318 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,318 INFO packaging: have _yum_lock for MainThread >03:47:56,318 INFO packaging: gave up _yum_lock for MainThread >03:47:56,319 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,320 INFO packaging: have _yum_lock for MainThread >03:47:56,320 INFO packaging: gave up _yum_lock for MainThread >03:47:56,321 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,321 INFO packaging: have _yum_lock for MainThread >03:47:56,321 INFO packaging: gave up _yum_lock for MainThread >03:47:56,323 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,323 INFO packaging: have _yum_lock for MainThread >03:47:56,323 INFO packaging: gave up _yum_lock for MainThread >03:47:56,324 INFO packaging: 
about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,325 INFO packaging: have _yum_lock for MainThread >03:47:56,325 INFO packaging: gave up _yum_lock for MainThread >03:47:56,326 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,326 INFO packaging: have _yum_lock for MainThread >03:47:56,326 INFO packaging: gave up _yum_lock for MainThread >03:47:56,328 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,328 INFO packaging: have _yum_lock for MainThread >03:47:56,328 INFO packaging: gave up _yum_lock for MainThread >03:47:56,329 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,330 INFO packaging: have _yum_lock for MainThread >03:47:56,330 INFO packaging: gave up _yum_lock for MainThread >03:47:56,331 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,331 INFO packaging: have _yum_lock for MainThread >03:47:56,332 INFO packaging: gave up _yum_lock for MainThread >03:47:56,333 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,333 INFO packaging: have _yum_lock for MainThread >03:47:56,333 INFO packaging: gave up _yum_lock for MainThread >03:47:56,334 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,335 INFO packaging: have _yum_lock for MainThread >03:47:56,335 INFO packaging: gave up _yum_lock 
for MainThread >03:47:56,336 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,336 INFO packaging: have _yum_lock for MainThread >03:47:56,337 INFO packaging: gave up _yum_lock for MainThread >03:47:56,338 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,338 INFO packaging: have _yum_lock for MainThread >03:47:56,338 INFO packaging: gave up _yum_lock for MainThread >03:47:56,339 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,340 INFO packaging: have _yum_lock for MainThread >03:47:56,340 INFO packaging: gave up _yum_lock for MainThread >03:47:56,342 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,342 INFO packaging: have _yum_lock for MainThread >03:47:56,343 INFO packaging: gave up _yum_lock for MainThread >03:47:56,344 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,344 INFO packaging: have _yum_lock for MainThread >03:47:56,345 INFO packaging: gave up _yum_lock for MainThread >03:47:56,346 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,346 INFO packaging: have _yum_lock for MainThread >03:47:56,346 INFO packaging: gave up _yum_lock for MainThread >03:47:56,348 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,348 INFO packaging: have _yum_lock for MainThread 
>03:47:56,348 INFO packaging: gave up _yum_lock for MainThread >03:47:56,349 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,349 INFO packaging: have _yum_lock for MainThread >03:47:56,350 INFO packaging: gave up _yum_lock for MainThread >03:47:56,351 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,351 INFO packaging: have _yum_lock for MainThread >03:47:56,351 INFO packaging: gave up _yum_lock for MainThread >03:47:56,352 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,353 INFO packaging: have _yum_lock for MainThread >03:47:56,353 INFO packaging: gave up _yum_lock for MainThread >03:47:56,354 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,354 INFO packaging: have _yum_lock for MainThread >03:47:56,354 INFO packaging: gave up _yum_lock for MainThread >03:47:56,355 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,356 INFO packaging: have _yum_lock for MainThread >03:47:56,356 INFO packaging: gave up _yum_lock for MainThread >03:47:56,357 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,357 INFO packaging: have _yum_lock for MainThread >03:47:56,358 INFO packaging: gave up _yum_lock for MainThread >03:47:56,359 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,359 INFO 
packaging: have _yum_lock for MainThread >03:47:56,359 INFO packaging: gave up _yum_lock for MainThread >03:47:56,361 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,361 INFO packaging: have _yum_lock for MainThread >03:47:56,361 INFO packaging: gave up _yum_lock for MainThread >03:47:56,363 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,363 INFO packaging: have _yum_lock for MainThread >03:47:56,363 INFO packaging: gave up _yum_lock for MainThread >03:47:56,365 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,365 INFO packaging: have _yum_lock for MainThread >03:47:56,365 INFO packaging: gave up _yum_lock for MainThread >03:47:56,367 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,367 INFO packaging: have _yum_lock for MainThread >03:47:56,367 INFO packaging: gave up _yum_lock for MainThread >03:47:56,368 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,368 INFO packaging: have _yum_lock for MainThread >03:47:56,369 INFO packaging: gave up _yum_lock for MainThread >03:47:56,370 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,370 INFO packaging: have _yum_lock for MainThread >03:47:56,370 INFO packaging: gave up _yum_lock for MainThread >03:47:56,371 INFO packaging: about to acquire _yum_lock for MainThread at 
/usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,372 INFO packaging: have _yum_lock for MainThread >03:47:56,372 INFO packaging: gave up _yum_lock for MainThread >03:47:56,373 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,373 INFO packaging: have _yum_lock for MainThread >03:47:56,373 INFO packaging: gave up _yum_lock for MainThread >03:47:56,375 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,375 INFO packaging: have _yum_lock for MainThread >03:47:56,375 INFO packaging: gave up _yum_lock for MainThread >03:47:56,376 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,377 INFO packaging: have _yum_lock for MainThread >03:47:56,377 INFO packaging: gave up _yum_lock for MainThread >03:47:56,378 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,378 INFO packaging: have _yum_lock for MainThread >03:47:56,378 INFO packaging: gave up _yum_lock for MainThread >03:47:56,379 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,380 INFO packaging: have _yum_lock for MainThread >03:47:56,380 INFO packaging: gave up _yum_lock for MainThread >03:47:56,381 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,381 INFO packaging: have _yum_lock for MainThread >03:47:56,382 INFO packaging: gave up _yum_lock for MainThread >03:47:56,383 INFO packaging: 
about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,383 INFO packaging: have _yum_lock for MainThread >03:47:56,383 INFO packaging: gave up _yum_lock for MainThread >03:47:56,384 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,384 INFO packaging: have _yum_lock for MainThread >03:47:56,385 INFO packaging: gave up _yum_lock for MainThread >03:47:56,386 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,386 INFO packaging: have _yum_lock for MainThread >03:47:56,386 INFO packaging: gave up _yum_lock for MainThread >03:47:56,387 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,388 INFO packaging: have _yum_lock for MainThread >03:47:56,388 INFO packaging: gave up _yum_lock for MainThread >03:47:56,389 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,389 INFO packaging: have _yum_lock for MainThread >03:47:56,389 INFO packaging: gave up _yum_lock for MainThread >03:47:56,391 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,391 INFO packaging: have _yum_lock for MainThread >03:47:56,391 INFO packaging: gave up _yum_lock for MainThread >03:47:56,392 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,392 INFO packaging: have _yum_lock for MainThread >03:47:56,393 INFO packaging: gave up _yum_lock for 
MainThread >03:47:56,394 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,394 INFO packaging: have _yum_lock for MainThread >03:47:56,394 INFO packaging: gave up _yum_lock for MainThread >03:47:56,395 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,396 INFO packaging: have _yum_lock for MainThread >03:47:56,396 INFO packaging: gave up _yum_lock for MainThread >03:47:56,397 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,397 INFO packaging: have _yum_lock for MainThread >03:47:56,397 INFO packaging: gave up _yum_lock for MainThread >03:47:56,399 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,399 INFO packaging: have _yum_lock for MainThread >03:47:56,399 INFO packaging: gave up _yum_lock for MainThread >03:47:56,401 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,401 INFO packaging: have _yum_lock for MainThread >03:47:56,401 INFO packaging: gave up _yum_lock for MainThread >03:47:56,402 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,403 INFO packaging: have _yum_lock for MainThread >03:47:56,403 INFO packaging: gave up _yum_lock for MainThread >03:47:56,404 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,404 INFO packaging: have _yum_lock for MainThread 
>03:47:56,405 INFO packaging: gave up _yum_lock for MainThread >03:47:56,406 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,406 INFO packaging: have _yum_lock for MainThread >03:47:56,406 INFO packaging: gave up _yum_lock for MainThread >03:47:56,408 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,408 INFO packaging: have _yum_lock for MainThread >03:47:56,408 INFO packaging: gave up _yum_lock for MainThread >03:47:56,409 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,410 INFO packaging: have _yum_lock for MainThread >03:47:56,410 INFO packaging: gave up _yum_lock for MainThread >03:47:56,411 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,411 INFO packaging: have _yum_lock for MainThread >03:47:56,412 INFO packaging: gave up _yum_lock for MainThread >03:47:56,413 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,413 INFO packaging: have _yum_lock for MainThread >03:47:56,413 INFO packaging: gave up _yum_lock for MainThread >03:47:56,414 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,415 INFO packaging: have _yum_lock for MainThread >03:47:56,415 INFO packaging: gave up _yum_lock for MainThread >03:47:56,416 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,416 
INFO packaging: have _yum_lock for MainThread >03:47:56,417 INFO packaging: gave up _yum_lock for MainThread >03:47:56,418 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,418 INFO packaging: have _yum_lock for MainThread >03:47:56,418 INFO packaging: gave up _yum_lock for MainThread >03:47:56,419 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,420 INFO packaging: have _yum_lock for MainThread >03:47:56,420 INFO packaging: gave up _yum_lock for MainThread >03:47:56,421 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,421 INFO packaging: have _yum_lock for MainThread >03:47:56,422 INFO packaging: gave up _yum_lock for MainThread >03:47:56,423 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,423 INFO packaging: have _yum_lock for MainThread >03:47:56,423 INFO packaging: gave up _yum_lock for MainThread >03:47:56,425 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,425 INFO packaging: have _yum_lock for MainThread >03:47:56,425 INFO packaging: gave up _yum_lock for MainThread >03:47:56,426 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,426 INFO packaging: have _yum_lock for MainThread >03:47:56,427 INFO packaging: gave up _yum_lock for MainThread >03:47:56,428 INFO packaging: about to acquire _yum_lock for MainThread at 
/usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption)
>03:47:56,428 INFO packaging: have _yum_lock for MainThread
>03:47:56,428 INFO packaging: gave up _yum_lock for MainThread
>03:47:56,429 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons)
>03:47:56,430 INFO packaging: have _yum_lock for MainThread
>03:47:56,430 INFO packaging: gave up _yum_lock for MainThread
>03:47:56,431 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible)
>03:47:56,431 INFO packaging: have _yum_lock for MainThread
>03:47:56,432 INFO packaging: gave up _yum_lock for MainThread
>03:47:56,433 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons)
>03:47:56,433 INFO packaging: have _yum_lock for MainThread
>03:47:56,433 INFO packaging: gave up _yum_lock for MainThread
>[... identical acquire / have / gave-up cycle repeated from 03:47:56,435 through 03:47:56,659, alternating between environmentHasOption (yumpayload.py:1023), refreshAddons (software.py:287 and software.py:289), _isGroupVisible (yumpayload.py:1154), and occasionally _groupHasInstallableMembers (yumpayload.py:1166), all on MainThread ...]
>03:47:56,660 INFO packaging: have _yum_lock for MainThread
>03:47:56,660 INFO packaging: gave up _yum_lock for MainThread
>03:47:56,661 INFO packaging: about to acquire _yum_lock for MainThread at 
/usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,662 INFO packaging: have _yum_lock for MainThread >03:47:56,662 INFO packaging: gave up _yum_lock for MainThread >03:47:56,663 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,663 INFO packaging: have _yum_lock for MainThread >03:47:56,664 INFO packaging: gave up _yum_lock for MainThread >03:47:56,665 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,665 INFO packaging: have _yum_lock for MainThread >03:47:56,665 INFO packaging: gave up _yum_lock for MainThread >03:47:56,666 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,666 INFO packaging: have _yum_lock for MainThread >03:47:56,667 INFO packaging: gave up _yum_lock for MainThread >03:47:56,668 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,668 INFO packaging: have _yum_lock for MainThread >03:47:56,668 INFO packaging: gave up _yum_lock for MainThread >03:47:56,670 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,670 INFO packaging: have _yum_lock for MainThread >03:47:56,670 INFO packaging: gave up _yum_lock for MainThread >03:47:56,671 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,672 INFO packaging: have _yum_lock for MainThread >03:47:56,672 INFO packaging: gave up _yum_lock for MainThread >03:47:56,673 INFO packaging: 
about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,673 INFO packaging: have _yum_lock for MainThread >03:47:56,674 INFO packaging: gave up _yum_lock for MainThread >03:47:56,675 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,675 INFO packaging: have _yum_lock for MainThread >03:47:56,675 INFO packaging: gave up _yum_lock for MainThread >03:47:56,676 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,677 INFO packaging: have _yum_lock for MainThread >03:47:56,677 INFO packaging: gave up _yum_lock for MainThread >03:47:56,678 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,678 INFO packaging: have _yum_lock for MainThread >03:47:56,679 INFO packaging: gave up _yum_lock for MainThread >03:47:56,680 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,680 INFO packaging: have _yum_lock for MainThread >03:47:56,680 INFO packaging: gave up _yum_lock for MainThread >03:47:56,681 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,681 INFO packaging: have _yum_lock for MainThread >03:47:56,682 INFO packaging: gave up _yum_lock for MainThread >03:47:56,683 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,683 INFO packaging: have _yum_lock for MainThread >03:47:56,683 INFO packaging: gave up _yum_lock 
for MainThread >03:47:56,685 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,685 INFO packaging: have _yum_lock for MainThread >03:47:56,685 INFO packaging: gave up _yum_lock for MainThread >03:47:56,686 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,686 INFO packaging: have _yum_lock for MainThread >03:47:56,687 INFO packaging: gave up _yum_lock for MainThread >03:47:56,688 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,688 INFO packaging: have _yum_lock for MainThread >03:47:56,688 INFO packaging: gave up _yum_lock for MainThread >03:47:56,689 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,690 INFO packaging: have _yum_lock for MainThread >03:47:56,690 INFO packaging: gave up _yum_lock for MainThread >03:47:56,691 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,691 INFO packaging: have _yum_lock for MainThread >03:47:56,691 INFO packaging: gave up _yum_lock for MainThread >03:47:56,693 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,693 INFO packaging: have _yum_lock for MainThread >03:47:56,693 INFO packaging: gave up _yum_lock for MainThread >03:47:56,694 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,695 INFO packaging: have _yum_lock for MainThread 
>03:47:56,695 INFO packaging: gave up _yum_lock for MainThread >03:47:56,696 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,696 INFO packaging: have _yum_lock for MainThread >03:47:56,697 INFO packaging: gave up _yum_lock for MainThread >03:47:56,698 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,698 INFO packaging: have _yum_lock for MainThread >03:47:56,698 INFO packaging: gave up _yum_lock for MainThread >03:47:56,699 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,699 INFO packaging: have _yum_lock for MainThread >03:47:56,700 INFO packaging: gave up _yum_lock for MainThread >03:47:56,701 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,701 INFO packaging: have _yum_lock for MainThread >03:47:56,701 INFO packaging: gave up _yum_lock for MainThread >03:47:56,702 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,703 INFO packaging: have _yum_lock for MainThread >03:47:56,703 INFO packaging: gave up _yum_lock for MainThread >03:47:56,704 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,705 INFO packaging: have _yum_lock for MainThread >03:47:56,705 INFO packaging: gave up _yum_lock for MainThread >03:47:56,706 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,706 INFO 
packaging: have _yum_lock for MainThread >03:47:56,706 INFO packaging: gave up _yum_lock for MainThread >03:47:56,707 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,708 INFO packaging: have _yum_lock for MainThread >03:47:56,708 INFO packaging: gave up _yum_lock for MainThread >03:47:56,709 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,709 INFO packaging: have _yum_lock for MainThread >03:47:56,710 INFO packaging: gave up _yum_lock for MainThread >03:47:56,711 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,711 INFO packaging: have _yum_lock for MainThread >03:47:56,711 INFO packaging: gave up _yum_lock for MainThread >03:47:56,713 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,713 INFO packaging: have _yum_lock for MainThread >03:47:56,713 INFO packaging: gave up _yum_lock for MainThread >03:47:56,714 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,714 INFO packaging: have _yum_lock for MainThread >03:47:56,715 INFO packaging: gave up _yum_lock for MainThread >03:47:56,716 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,716 INFO packaging: have _yum_lock for MainThread >03:47:56,716 INFO packaging: gave up _yum_lock for MainThread >03:47:56,718 INFO packaging: about to acquire _yum_lock for MainThread at 
/usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,718 INFO packaging: have _yum_lock for MainThread >03:47:56,718 INFO packaging: gave up _yum_lock for MainThread >03:47:56,719 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,719 INFO packaging: have _yum_lock for MainThread >03:47:56,720 INFO packaging: gave up _yum_lock for MainThread >03:47:56,721 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,721 INFO packaging: have _yum_lock for MainThread >03:47:56,721 INFO packaging: gave up _yum_lock for MainThread >03:47:56,722 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,722 INFO packaging: have _yum_lock for MainThread >03:47:56,722 INFO packaging: gave up _yum_lock for MainThread >03:47:56,724 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,724 INFO packaging: have _yum_lock for MainThread >03:47:56,724 INFO packaging: gave up _yum_lock for MainThread >03:47:56,725 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,725 INFO packaging: have _yum_lock for MainThread >03:47:56,726 INFO packaging: gave up _yum_lock for MainThread >03:47:56,727 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,727 INFO packaging: have _yum_lock for MainThread >03:47:56,727 INFO packaging: gave up _yum_lock for MainThread >03:47:56,728 INFO packaging: 
about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,729 INFO packaging: have _yum_lock for MainThread >03:47:56,729 INFO packaging: gave up _yum_lock for MainThread >03:47:56,730 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,730 INFO packaging: have _yum_lock for MainThread >03:47:56,731 INFO packaging: gave up _yum_lock for MainThread >03:47:56,732 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,732 INFO packaging: have _yum_lock for MainThread >03:47:56,732 INFO packaging: gave up _yum_lock for MainThread >03:47:56,734 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,734 INFO packaging: have _yum_lock for MainThread >03:47:56,734 INFO packaging: gave up _yum_lock for MainThread >03:47:56,735 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,735 INFO packaging: have _yum_lock for MainThread >03:47:56,736 INFO packaging: gave up _yum_lock for MainThread >03:47:56,737 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,737 INFO packaging: have _yum_lock for MainThread >03:47:56,737 INFO packaging: gave up _yum_lock for MainThread >03:47:56,738 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,739 INFO packaging: have _yum_lock for MainThread >03:47:56,739 INFO packaging: gave up _yum_lock for 
MainThread >03:47:56,740 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,740 INFO packaging: have _yum_lock for MainThread >03:47:56,741 INFO packaging: gave up _yum_lock for MainThread >03:47:56,742 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,742 INFO packaging: have _yum_lock for MainThread >03:47:56,742 INFO packaging: gave up _yum_lock for MainThread >03:47:56,743 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,743 INFO packaging: have _yum_lock for MainThread >03:47:56,744 INFO packaging: gave up _yum_lock for MainThread >03:47:56,745 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,745 INFO packaging: have _yum_lock for MainThread >03:47:56,745 INFO packaging: gave up _yum_lock for MainThread >03:47:56,747 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,747 INFO packaging: have _yum_lock for MainThread >03:47:56,747 INFO packaging: gave up _yum_lock for MainThread >03:47:56,748 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,748 INFO packaging: have _yum_lock for MainThread >03:47:56,749 INFO packaging: gave up _yum_lock for MainThread >03:47:56,750 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,750 INFO packaging: have _yum_lock for MainThread 
>03:47:56,750 INFO packaging: gave up _yum_lock for MainThread >03:47:56,751 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,752 INFO packaging: have _yum_lock for MainThread >03:47:56,752 INFO packaging: gave up _yum_lock for MainThread >03:47:56,753 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1023 (environmentHasOption) >03:47:56,753 INFO packaging: have _yum_lock for MainThread >03:47:56,754 INFO packaging: gave up _yum_lock for MainThread >03:47:56,755 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:287 (refreshAddons) >03:47:56,755 INFO packaging: have _yum_lock for MainThread >03:47:56,755 INFO packaging: gave up _yum_lock for MainThread >03:47:56,756 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1154 (_isGroupVisible) >03:47:56,757 INFO packaging: have _yum_lock for MainThread >03:47:56,757 INFO packaging: gave up _yum_lock for MainThread >03:47:56,758 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:289 (refreshAddons) >03:47:56,758 INFO packaging: have _yum_lock for MainThread >03:47:56,758 INFO packaging: gave up _yum_lock for MainThread >03:47:56,760 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1141 (groupDescription) >03:47:56,760 INFO packaging: have _yum_lock for MainThread >03:47:56,760 INFO packaging: gave up _yum_lock for MainThread >03:47:56,761 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:269 (_addAddon) >03:47:56,762 INFO 
packaging: have _yum_lock for MainThread >03:47:56,762 INFO packaging: gave up _yum_lock for MainThread >03:47:56,764 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1141 (groupDescription) >03:47:56,764 INFO packaging: have _yum_lock for MainThread >03:47:56,764 INFO packaging: gave up _yum_lock for MainThread >03:47:56,765 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:269 (_addAddon) >03:47:56,766 INFO packaging: have _yum_lock for MainThread >03:47:56,766 INFO packaging: gave up _yum_lock for MainThread >03:47:56,767 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1141 (groupDescription) >03:47:56,768 INFO packaging: have _yum_lock for MainThread >03:47:56,768 INFO packaging: gave up _yum_lock for MainThread >03:47:56,769 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:269 (_addAddon) >03:47:56,769 INFO packaging: have _yum_lock for MainThread >03:47:56,770 INFO packaging: gave up _yum_lock for MainThread >03:47:56,771 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1141 (groupDescription) >03:47:56,772 INFO packaging: have _yum_lock for MainThread >03:47:56,772 INFO packaging: gave up _yum_lock for MainThread >03:47:56,773 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:269 (_addAddon) >03:47:56,773 INFO packaging: have _yum_lock for MainThread >03:47:56,774 INFO packaging: gave up _yum_lock for MainThread >03:47:56,775 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1141 
(groupDescription) >03:47:56,776 INFO packaging: have _yum_lock for MainThread >03:47:56,776 INFO packaging: gave up _yum_lock for MainThread >03:47:56,777 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:269 (_addAddon) >03:47:56,777 INFO packaging: have _yum_lock for MainThread >03:47:56,778 INFO packaging: gave up _yum_lock for MainThread >03:47:56,779 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1141 (groupDescription) >03:47:56,780 INFO packaging: have _yum_lock for MainThread >03:47:56,780 INFO packaging: gave up _yum_lock for MainThread >03:47:56,781 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:269 (_addAddon) >03:47:56,781 INFO packaging: have _yum_lock for MainThread >03:47:56,782 INFO packaging: gave up _yum_lock for MainThread >03:47:56,783 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1141 (groupDescription) >03:47:56,784 INFO packaging: have _yum_lock for MainThread >03:47:56,784 INFO packaging: gave up _yum_lock for MainThread >03:47:56,785 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:269 (_addAddon) >03:47:56,785 INFO packaging: have _yum_lock for MainThread >03:47:56,785 INFO packaging: gave up _yum_lock for MainThread >03:47:58,095 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1051 (selectEnvironment) >03:47:58,097 INFO packaging: have _yum_lock for MainThread >03:47:58,097 INFO packaging: gave up _yum_lock for MainThread >03:47:58,098 INFO packaging: about to acquire _yum_lock for MainThread at 
/usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:79 (_apply) >03:47:58,098 INFO packaging: have _yum_lock for MainThread >03:47:58,099 INFO packaging: gave up _yum_lock for MainThread >03:47:58,107 INFO packaging: checking software selection >03:47:58,108 INFO packaging: about to acquire _yum_lock for AnaCheckSoftwareThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1370 (checkSoftwareSelection) >03:47:58,108 INFO packaging: have _yum_lock for AnaCheckSoftwareThread >03:47:58,108 DEBUG packaging: deleting package sacks >03:47:58,111 INFO packaging: gave up _yum_lock for AnaCheckSoftwareThread >03:47:58,113 INFO packaging: about to acquire _yum_lock for AnaCheckSoftwareThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1371 (checkSoftwareSelection) >03:47:58,115 INFO packaging: have _yum_lock for AnaCheckSoftwareThread >03:47:58,115 DEBUG packaging: deleting yum transaction info >03:47:58,149 INFO packaging: gave up _yum_lock for AnaCheckSoftwareThread >03:47:58,149 DEBUG packaging: select group core >03:47:58,150 INFO packaging: about to acquire _yum_lock for AnaCheckSoftwareThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1299 (_applyYumSelections) >03:47:58,151 INFO packaging: have _yum_lock for AnaCheckSoftwareThread >03:47:58,151 DEBUG yum.verbose.YumBase: Setting up Package Sacks >03:47:58,349 DEBUG yum.verbose.YumBase: rpmdb time: 0.000 >03:47:58,409 DEBUG yum.verbose.YumBase: pkgsack time: 0.258 >03:47:58,506 DEBUG yum.verbose.YumBase: group time: 0.355 >03:47:58,511 CRIT yum.YumBase: Warning: Group core does not have any packages to install. 
>03:47:58,512 INFO packaging: gave up _yum_lock for AnaCheckSoftwareThread >03:47:58,512 DEBUG packaging: select group core >03:47:58,513 INFO packaging: about to acquire _yum_lock for AnaCheckSoftwareThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1320 (_applyYumSelections) >03:47:58,514 INFO packaging: have _yum_lock for AnaCheckSoftwareThread >03:47:58,514 INFO packaging: gave up _yum_lock for AnaCheckSoftwareThread >03:47:58,514 DEBUG packaging: select package kernel-PAE >03:47:58,516 INFO packaging: about to acquire _yum_lock for AnaCheckSoftwareThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1412 (selectKernelPackage) >03:47:58,516 INFO packaging: have _yum_lock for AnaCheckSoftwareThread >03:47:58,535 DEBUG yum.verbose.YumBase: Checking for virtual provide or file-provide for kernel-PAE >03:47:58,537 INFO packaging: gave up _yum_lock for AnaCheckSoftwareThread >03:47:58,537 INFO packaging: no kernel-PAE package >03:47:58,537 DEBUG packaging: select package kernel >03:47:58,538 INFO packaging: about to acquire _yum_lock for AnaCheckSoftwareThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1412 (selectKernelPackage) >03:47:58,539 INFO packaging: have _yum_lock for AnaCheckSoftwareThread >03:47:58,607 DEBUG yum.verbose.YumBase: Obs Init time: 0.068 >03:47:58,611 INFO packaging: gave up _yum_lock for AnaCheckSoftwareThread >03:47:58,611 INFO packaging: selected kernel >03:47:58,612 INFO packaging: about to acquire _yum_lock for AnaCheckSoftwareThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1336 (_applyYumSelections) >03:47:58,613 INFO packaging: have _yum_lock for AnaCheckSoftwareThread >03:47:58,613 INFO packaging: gave up _yum_lock for AnaCheckSoftwareThread >03:47:58,614 INFO packaging: about to acquire _yum_lock for AnaCheckSoftwareThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:101 
(checkSoftwareSelection) >03:47:58,614 INFO packaging: have _yum_lock for AnaCheckSoftwareThread >03:47:58,614 INFO packaging: checking dependencies >03:47:58,618 DEBUG yum.verbose.YumBase: Building updates object >03:47:58,729 DEBUG yum.verbose.YumBase: up:simple updates time: 0.036 >03:47:58,732 DEBUG yum.verbose.YumBase: up:obs time: 0.002 >03:47:58,732 DEBUG yum.verbose.YumBase: up:condense time: 0.000 >03:47:58,733 DEBUG yum.verbose.YumBase: updates time: 0.114 >03:47:58,735 DEBUG yum.verbose.YumBase: TSINFO: Marking kmod-13-2.fc19.x86_64 as install for kernel-3.9.0-301.fc19.x86_64 >03:47:58,737 DEBUG yum.verbose.YumBase: TSINFO: Marking linux-firmware-20130201-0.5.git65a5163.fc19.noarch as install for kernel-3.9.0-301.fc19.x86_64 >03:47:58,740 DEBUG yum.verbose.YumBase: TSINFO: Marking initscripts-9.46-1.fc19.x86_64 as install for kernel-3.9.0-301.fc19.x86_64 >03:47:58,743 DEBUG yum.verbose.YumBase: TSINFO: Marking grubby-8.24-1.fc19.x86_64 as install for kernel-3.9.0-301.fc19.x86_64 >03:47:58,745 DEBUG yum.verbose.YumBase: TSINFO: Marking dracut-027-45.git20130430.fc19.x86_64 as install for kernel-3.9.0-301.fc19.x86_64 >03:47:58,748 DEBUG yum.verbose.YumBase: TSINFO: Marking coreutils-8.21-8.fc19.x86_64 as install for kernel-3.9.0-301.fc19.x86_64 >03:47:58,752 DEBUG yum.verbose.YumBase: TSINFO: Marking bash-4.2.45-1.fc19.x86_64 as install for kernel-3.9.0-301.fc19.x86_64 >03:47:58,753 DEBUG yum.verbose.YumBase: Quick matched bash-4.2.45-1.fc19.x86_64 to require for /bin/sh >03:47:58,761 DEBUG yum.verbose.YumBase: TSINFO: Marking glibc-2.17-4.fc19.x86_64 as install for bash-4.2.45-1.fc19.x86_64 >03:47:58,763 DEBUG yum.verbose.YumBase: Quick matched glibc-2.17-4.fc19.x86_64 to require for libdl.so.2(GLIBC_2.2.5)(64bit) >03:47:58,763 DEBUG yum.verbose.YumBase: Quick matched glibc-2.17-4.fc19.x86_64 to require for libc.so.6(GLIBC_2.15)(64bit) >03:47:58,765 DEBUG yum.verbose.YumBase: TSINFO: Marking ncurses-libs-5.9-10.20130413.fc19.x86_64 as install for 
bash-4.2.45-1.fc19.x86_64 >03:47:58,774 DEBUG yum.verbose.YumBase: TSINFO: Marking util-linux-2.23-1.fc19.x86_64 as install for coreutils-8.21-8.fc19.x86_64 >03:47:58,778 DEBUG yum.verbose.YumBase: TSINFO: Marking ncurses-5.9-10.20130413.fc19.x86_64 as install for coreutils-8.21-8.fc19.x86_64 >03:47:58,780 DEBUG yum.verbose.YumBase: TSINFO: Marking libattr-2.4.46-10.fc19.x86_64 as install for coreutils-8.21-8.fc19.x86_64 >03:47:58,782 DEBUG yum.verbose.YumBase: TSINFO: Marking libacl-2.2.51-9.fc19.x86_64 as install for coreutils-8.21-8.fc19.x86_64 >03:47:58,784 DEBUG yum.verbose.YumBase: TSINFO: Marking grep-2.14-3.fc19.x86_64 as install for coreutils-8.21-8.fc19.x86_64 >03:47:58,786 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:gmp-5.1.1-2.fc19.x86_64 as install for coreutils-8.21-8.fc19.x86_64 >03:47:58,788 DEBUG yum.verbose.YumBase: TSINFO: Marking info-5.1-1.fc19.x86_64 as install for coreutils-8.21-8.fc19.x86_64 >03:47:58,790 DEBUG yum.verbose.YumBase: TSINFO: Marking libselinux-2.1.13-12.fc19.x86_64 as install for coreutils-8.21-8.fc19.x86_64 >03:47:58,792 DEBUG yum.verbose.YumBase: TSINFO: Marking libcap-2.22-5.fc19.x86_64 as install for coreutils-8.21-8.fc19.x86_64 >03:47:58,797 DEBUG yum.verbose.YumBase: TSINFO: Marking systemd-203-2.fc19.x86_64 as install for dracut-027-45.git20130430.fc19.x86_64 >03:47:58,801 DEBUG yum.verbose.YumBase: TSINFO: Marking filesystem-3.2-9.fc19.x86_64 as install for dracut-027-45.git20130430.fc19.x86_64 >03:47:58,803 DEBUG yum.verbose.YumBase: TSINFO: Marking xz-5.1.2-4alpha.fc19.x86_64 as install for dracut-027-45.git20130430.fc19.x86_64 >03:47:58,805 DEBUG yum.verbose.YumBase: TSINFO: Marking sed-4.2.2-2.fc19.x86_64 as install for dracut-027-45.git20130430.fc19.x86_64 >03:47:58,807 DEBUG yum.verbose.YumBase: TSINFO: Marking libgcc-4.8.0-2.fc19.x86_64 as install for dracut-027-45.git20130430.fc19.x86_64 >03:47:58,808 DEBUG yum.verbose.YumBase: Quick matched libgcc-4.8.0-2.fc19.x86_64 to require for 
libgcc_s.so.1(GCC_3.0)(64bit) >03:47:58,809 DEBUG yum.verbose.YumBase: TSINFO: Marking kpartx-0.4.9-47.fc19.x86_64 as install for dracut-027-45.git20130430.fc19.x86_64 >03:47:58,811 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:hardlink-1.0-17.fc19.x86_64 as install for dracut-027-45.git20130430.fc19.x86_64 >03:47:58,813 DEBUG yum.verbose.YumBase: TSINFO: Marking gzip-1.5-4.fc19.x86_64 as install for dracut-027-45.git20130430.fc19.x86_64 >03:47:58,815 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:findutils-4.5.11-1.fc19.x86_64 as install for dracut-027-45.git20130430.fc19.x86_64 >03:47:58,817 DEBUG yum.verbose.YumBase: TSINFO: Marking cpio-2.11-20.fc19.x86_64 as install for dracut-027-45.git20130430.fc19.x86_64 >03:47:58,821 DEBUG yum.verbose.YumBase: TSINFO: Marking popt-1.13-14.fc19.x86_64 as install for grubby-8.24-1.fc19.x86_64 >03:47:58,822 DEBUG yum.verbose.YumBase: TSINFO: Marking libblkid-2.23-1.fc19.x86_64 as install for grubby-8.24-1.fc19.x86_64 >03:47:58,828 DEBUG yum.verbose.YumBase: TSINFO: Marking sysvinit-tools-2.88-10.dsf.fc19.x86_64 as install for initscripts-9.46-1.fc19.x86_64 >03:47:58,830 DEBUG yum.verbose.YumBase: TSINFO: Marking iproute-3.9.0-1.fc19.x86_64 as install for initscripts-9.46-1.fc19.x86_64 >03:47:58,832 DEBUG yum.verbose.YumBase: TSINFO: Marking hostname-3.12-4.fc19.x86_64 as install for initscripts-9.46-1.fc19.x86_64 >03:47:58,834 DEBUG yum.verbose.YumBase: TSINFO: Marking 2:shadow-utils-4.1.5.1-5.fc19.x86_64 as install for initscripts-9.46-1.fc19.x86_64 >03:47:58,836 DEBUG yum.verbose.YumBase: TSINFO: Marking procps-ng-3.3.7-3.fc19.x86_64 as install for initscripts-9.46-1.fc19.x86_64 >03:47:58,838 DEBUG yum.verbose.YumBase: TSINFO: Marking chkconfig-1.3.60-1.fc19.x86_64 as install for initscripts-9.46-1.fc19.x86_64 >03:47:58,841 DEBUG yum.verbose.YumBase: TSINFO: Marking iputils-20121221-2.fc19.x86_64 as install for initscripts-9.46-1.fc19.x86_64 >03:47:58,843 DEBUG yum.verbose.YumBase: TSINFO: Marking fedora-release-19-0.5.noarch 
as install for initscripts-9.46-1.fc19.x86_64 >03:47:58,845 DEBUG yum.verbose.YumBase: TSINFO: Marking gawk-4.0.2-2.fc19.x86_64 as install for initscripts-9.46-1.fc19.x86_64 >03:47:58,847 DEBUG yum.verbose.YumBase: TSINFO: Marking glib2-2.36.1-2.fc19.x86_64 as install for initscripts-9.46-1.fc19.x86_64 >03:47:58,851 DEBUG yum.verbose.YumBase: TSINFO: Marking kmod-libs-13-2.fc19.x86_64 as install for kmod-13-2.fc19.x86_64 >03:47:58,852 DEBUG yum.verbose.YumBase: Quick matched kmod-libs-13-2.fc19.x86_64 to require for libkmod.so.2(LIBKMOD_6)(64bit) >03:47:58,853 DEBUG yum.verbose.YumBase: Quick matched kmod-libs-13-2.fc19.x86_64 to require for libkmod.so.2(LIBKMOD_5)(64bit) >03:47:58,853 DEBUG yum.verbose.YumBase: Quick matched kmod-libs-13-2.fc19.x86_64 to require for libkmod.so.2()(64bit) >03:47:58,858 DEBUG yum.verbose.YumBase: TSINFO: Marking setup-2.8.69-1.fc19.noarch as install for filesystem-3.2-9.fc19.x86_64 >03:47:58,873 DEBUG yum.verbose.YumBase: TSINFO: Marking shared-mime-info-1.1-4.fc19.x86_64 as install for glib2-2.36.1-2.fc19.x86_64 >03:47:58,875 DEBUG yum.verbose.YumBase: TSINFO: Marking zlib-1.2.7-10.fc19.x86_64 as install for glib2-2.36.1-2.fc19.x86_64 >03:47:58,876 DEBUG yum.verbose.YumBase: Quick matched zlib-1.2.7-10.fc19.x86_64 to require for libz.so.1()(64bit) >03:47:58,878 DEBUG yum.verbose.YumBase: TSINFO: Marking libffi-3.0.13-1.fc19.x86_64 as install for glib2-2.36.1-2.fc19.x86_64 >03:47:58,882 DEBUG yum.verbose.YumBase: TSINFO: Marking glibc-common-2.17-4.fc19.x86_64 as install for glibc-2.17-4.fc19.x86_64 >03:47:58,884 DEBUG yum.verbose.YumBase: TSINFO: Marking nss-softokn-freebl-3.14.3-1.fc19.x86_64 as install for glibc-2.17-4.fc19.x86_64 >03:47:58,886 DEBUG yum.verbose.YumBase: TSINFO: Marking basesystem-10.0-8.fc19.noarch as install for glibc-2.17-4.fc19.x86_64 >03:47:58,891 DEBUG yum.verbose.YumBase: TSINFO: Marking libstdc++-4.8.0-2.fc19.x86_64 as install for 1:gmp-5.1.1-2.fc19.x86_64 >03:47:58,892 DEBUG yum.verbose.YumBase: Quick 
matched libstdc++-4.8.0-2.fc19.x86_64 to require for libstdc++.so.6(GLIBCXX_3.4)(64bit) >03:47:58,893 DEBUG yum.verbose.YumBase: Quick matched libstdc++-4.8.0-2.fc19.x86_64 to require for libstdc++.so.6(CXXABI_1.3)(64bit) >03:47:58,893 DEBUG yum.verbose.YumBase: Quick matched libstdc++-4.8.0-2.fc19.x86_64 to require for libstdc++.so.6()(64bit) >03:47:58,896 DEBUG yum.verbose.YumBase: TSINFO: Marking pcre-8.32-4.fc19.x86_64 as install for grep-2.14-3.fc19.x86_64 >03:47:58,912 DEBUG yum.verbose.YumBase: TSINFO: Marking iptables-1.4.18-1.fc19.x86_64 as install for iproute-3.9.0-1.fc19.x86_64 >03:47:58,914 DEBUG yum.verbose.YumBase: TSINFO: Marking libdb-5.3.21-9.fc19.x86_64 as install for iproute-3.9.0-1.fc19.x86_64 >03:47:58,916 DEBUG yum.verbose.YumBase: TSINFO: Marking linux-atm-libs-2.5.1-7.fc19.x86_64 as install for iproute-3.9.0-1.fc19.x86_64 >03:47:58,922 DEBUG yum.verbose.YumBase: TSINFO: Marking libidn-1.26-2.fc19.x86_64 as install for iputils-20121221-2.fc19.x86_64 >03:47:58,924 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:openssl-libs-1.0.1e-4.fc19.x86_64 as install for iputils-20121221-2.fc19.x86_64 >03:47:58,928 DEBUG yum.verbose.YumBase: TSINFO: Marking xz-libs-5.1.2-4alpha.fc19.x86_64 as install for kmod-libs-13-2.fc19.x86_64 >03:47:58,930 DEBUG yum.verbose.YumBase: Quick matched xz-libs-5.1.2-4alpha.fc19.x86_64 to require for liblzma.so.5()(64bit) >03:47:58,933 DEBUG yum.verbose.YumBase: TSINFO: Marking device-mapper-libs-1.02.77-8.fc19.x86_64 as install for kpartx-0.4.9-47.fc19.x86_64 >03:47:58,934 DEBUG yum.verbose.YumBase: Quick matched device-mapper-libs-1.02.77-8.fc19.x86_64 to require for libdevmapper.so.1.02()(64bit) >03:47:58,941 DEBUG yum.verbose.YumBase: TSINFO: Marking libuuid-2.23-1.fc19.x86_64 as install for libblkid-2.23-1.fc19.x86_64 >03:47:58,942 DEBUG yum.verbose.YumBase: Quick matched libuuid-2.23-1.fc19.x86_64 to require for libuuid.so.1(UUID_1.0)(64bit) >03:47:58,943 DEBUG yum.verbose.YumBase: Quick matched 
libuuid-2.23-1.fc19.x86_64 to require for libuuid.so.1()(64bit) >03:47:58,951 DEBUG yum.verbose.YumBase: TSINFO: Marking libsepol-2.1.9-1.fc19.x86_64 as install for libselinux-2.1.13-12.fc19.x86_64 >03:47:58,958 DEBUG yum.verbose.YumBase: TSINFO: Marking ncurses-base-5.9-10.20130413.fc19.noarch as install for ncurses-libs-5.9-10.20130413.fc19.x86_64 >03:47:58,968 DEBUG yum.verbose.YumBase: TSINFO: Marking audit-libs-2.3-2.fc19.x86_64 as install for 2:shadow-utils-4.1.5.1-5.fc19.x86_64 >03:47:58,970 DEBUG yum.verbose.YumBase: TSINFO: Marking libsemanage-2.1.10-4.fc19.x86_64 as install for 2:shadow-utils-4.1.5.1-5.fc19.x86_64 >03:47:58,971 DEBUG yum.verbose.YumBase: Quick matched libsemanage-2.1.10-4.fc19.x86_64 to require for libsemanage.so.1()(64bit) >03:47:58,985 DEBUG yum.verbose.YumBase: TSINFO: Marking systemd-libs-203-2.fc19.x86_64 as install for systemd-203-2.fc19.x86_64 >03:47:58,986 DEBUG yum.verbose.YumBase: Quick matched systemd-libs-203-2.fc19.x86_64 to require for libudev.so.1(LIBUDEV_189)(64bit) >03:47:58,987 DEBUG yum.verbose.YumBase: Quick matched systemd-libs-203-2.fc19.x86_64 to require for libudev.so.1(LIBUDEV_183)(64bit) >03:47:58,987 DEBUG yum.verbose.YumBase: Quick matched systemd-libs-203-2.fc19.x86_64 to require for libsystemd-journal.so.0(LIBSYSTEMD_JOURNAL_38)(64bit) >03:47:58,988 DEBUG yum.verbose.YumBase: Quick matched systemd-libs-203-2.fc19.x86_64 to require for libsystemd-journal.so.0(LIBSYSTEMD_JOURNAL_196)(64bit) >03:47:58,988 DEBUG yum.verbose.YumBase: Quick matched systemd-libs-203-2.fc19.x86_64 to require for libsystemd-journal.so.0(LIBSYSTEMD_JOURNAL_183)(64bit) >03:47:58,989 DEBUG yum.verbose.YumBase: Quick matched systemd-libs-203-2.fc19.x86_64 to require for libsystemd-id128.so.0(LIBSYSTEMD_ID128_38)(64bit) >03:47:58,989 DEBUG yum.verbose.YumBase: Quick matched systemd-libs-203-2.fc19.x86_64 to require for libsystemd-daemon.so.0(LIBSYSTEMD_DAEMON_31)(64bit) >03:47:58,992 DEBUG yum.verbose.YumBase: TSINFO: Marking 
pam-1.1.6-10.fc19.x86_64 as install for systemd-203-2.fc19.x86_64 >03:47:58,996 DEBUG yum.verbose.YumBase: TSINFO: Marking libgcrypt-1.5.2-1.fc19.x86_64 as install for systemd-203-2.fc19.x86_64 >03:47:58,998 DEBUG yum.verbose.YumBase: TSINFO: Marking cryptsetup-libs-1.6.1-1.fc19.x86_64 as install for systemd-203-2.fc19.x86_64 >03:47:59,004 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:dbus-1.6.8-5.fc19.x86_64 as install for systemd-203-2.fc19.x86_64 >03:47:59,006 DEBUG yum.verbose.YumBase: TSINFO: Marking acl-2.2.51-9.fc19.x86_64 as install for systemd-203-2.fc19.x86_64 >03:47:59,013 DEBUG yum.verbose.YumBase: TSINFO: Marking tcp_wrappers-libs-7.6-73.fc19.x86_64 as install for systemd-203-2.fc19.x86_64 >03:47:59,015 DEBUG yum.verbose.YumBase: Quick matched systemd-libs-203-2.fc19.x86_64 to require for libsystemd-journal.so.0()(64bit) >03:47:59,016 DEBUG yum.verbose.YumBase: Quick matched systemd-libs-203-2.fc19.x86_64 to require for libsystemd-id128.so.0()(64bit) >03:47:59,017 DEBUG yum.verbose.YumBase: Quick matched systemd-libs-203-2.fc19.x86_64 to require for libsystemd-daemon.so.0()(64bit) >03:47:59,023 DEBUG yum.verbose.YumBase: TSINFO: Marking qrencode-libs-3.4.1-1.fc19.x86_64 as install for systemd-203-2.fc19.x86_64 >03:47:59,025 DEBUG yum.verbose.YumBase: TSINFO: Marking libmicrohttpd-0.9.24-2.fc19.x86_64 as install for systemd-203-2.fc19.x86_64 >03:47:59,027 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:dbus-libs-1.6.8-5.fc19.x86_64 as install for systemd-203-2.fc19.x86_64 >03:47:59,171 DEBUG yum.verbose.YumBase: TSINFO: Marking libmount-2.23-1.fc19.x86_64 as install for util-linux-2.23-1.fc19.x86_64 >03:47:59,174 DEBUG yum.verbose.YumBase: TSINFO: Marking libutempter-1.1.6-2.fc19.x86_64 as install for util-linux-2.23-1.fc19.x86_64 >03:47:59,175 DEBUG yum.verbose.YumBase: Quick matched libmount-2.23-1.fc19.x86_64 to require for libmount.so.1(MOUNT_2.22)(64bit) >03:47:59,176 DEBUG yum.verbose.YumBase: Quick matched libmount-2.23-1.fc19.x86_64 to require for 
libmount.so.1(MOUNT_2.21)(64bit) >03:47:59,176 DEBUG yum.verbose.YumBase: Quick matched libmount-2.23-1.fc19.x86_64 to require for libmount.so.1(MOUNT_2.20)(64bit) >03:47:59,177 DEBUG yum.verbose.YumBase: Quick matched libmount-2.23-1.fc19.x86_64 to require for libmount.so.1(MOUNT_2.19)(64bit) >03:47:59,179 DEBUG yum.verbose.YumBase: TSINFO: Marking libuser-0.59-1.fc19.x86_64 as install for util-linux-2.23-1.fc19.x86_64 >03:47:59,182 DEBUG yum.verbose.YumBase: TSINFO: Marking libcap-ng-0.7.3-3.fc19.x86_64 as install for util-linux-2.23-1.fc19.x86_64 >03:47:59,194 DEBUG yum.verbose.YumBase: TSINFO: Marking fipscheck-lib-1.3.1-3.fc19.x86_64 as install for cryptsetup-libs-1.6.1-1.fc19.x86_64 >03:47:59,196 DEBUG yum.verbose.YumBase: TSINFO: Marking libgpg-error-1.11-1.fc19.x86_64 as install for cryptsetup-libs-1.6.1-1.fc19.x86_64 >03:47:59,205 DEBUG yum.verbose.YumBase: TSINFO: Marking expat-2.1.0-5.fc19.x86_64 as install for 1:dbus-1.6.8-5.fc19.x86_64 >03:47:59,214 DEBUG yum.verbose.YumBase: TSINFO: Marking device-mapper-1.02.77-8.fc19.x86_64 as install for device-mapper-libs-1.02.77-8.fc19.x86_64 >03:47:59,230 DEBUG yum.verbose.YumBase: TSINFO: Marking tzdata-2013b-2.fc19.noarch as install for glibc-common-2.17-4.fc19.x86_64 >03:47:59,252 DEBUG yum.verbose.YumBase: TSINFO: Marking gnutls-3.1.10-1.fc19.x86_64 as install for libmicrohttpd-0.9.24-2.fc19.x86_64 >03:47:59,253 DEBUG yum.verbose.YumBase: Quick matched gnutls-3.1.10-1.fc19.x86_64 to require for libgnutls.so.28()(64bit) >03:47:59,260 DEBUG yum.verbose.YumBase: TSINFO: Marking selinux-policy-3.12.1-42.fc19.noarch as install for libsemanage-2.1.10-4.fc19.x86_64 >03:47:59,263 DEBUG yum.verbose.YumBase: TSINFO: Marking ustr-1.0.4-13.fc18.x86_64 as install for libsemanage-2.1.10-4.fc19.x86_64 >03:47:59,264 DEBUG yum.verbose.YumBase: Quick matched ustr-1.0.4-13.fc18.x86_64 to require for libustr-1.0.so.1(USTR_1.0)(64bit) >03:47:59,265 DEBUG yum.verbose.YumBase: TSINFO: Marking bzip2-libs-1.0.6-8.fc19.x86_64 as 
install for libsemanage-2.1.10-4.fc19.x86_64 >03:47:59,279 DEBUG yum.verbose.YumBase: TSINFO: Marking openldap-2.4.35-1.fc19.x86_64 as install for libuser-0.59-1.fc19.x86_64 >03:47:59,294 DEBUG yum.verbose.YumBase: TSINFO: Marking ca-certificates-2012.87-10.1.fc19.noarch as install for 1:openssl-libs-1.0.1e-4.fc19.x86_64 >03:47:59,297 DEBUG yum.verbose.YumBase: TSINFO: Marking krb5-libs-1.11.2-2.fc19.x86_64 as install for 1:openssl-libs-1.0.1e-4.fc19.x86_64 >03:47:59,299 DEBUG yum.verbose.YumBase: Quick matched krb5-libs-1.11.2-2.fc19.x86_64 to require for libk5crypto.so.3(k5crypto_3_MIT)(64bit) >03:47:59,299 DEBUG yum.verbose.YumBase: Quick matched krb5-libs-1.11.2-2.fc19.x86_64 to require for libkrb5.so.3()(64bit) >03:47:59,300 DEBUG yum.verbose.YumBase: Quick matched krb5-libs-1.11.2-2.fc19.x86_64 to require for libk5crypto.so.3()(64bit) >03:47:59,300 DEBUG yum.verbose.YumBase: Quick matched krb5-libs-1.11.2-2.fc19.x86_64 to require for libgssapi_krb5.so.2()(64bit) >03:47:59,302 DEBUG yum.verbose.YumBase: TSINFO: Marking libcom_err-1.42.7-2.fc19.x86_64 as install for 1:openssl-libs-1.0.1e-4.fc19.x86_64 >03:47:59,313 DEBUG yum.verbose.YumBase: TSINFO: Marking libpwquality-1.2.1-2.fc19.x86_64 as install for pam-1.1.6-10.fc19.x86_64 >03:47:59,315 DEBUG yum.verbose.YumBase: TSINFO: Marking cracklib-dicts-2.8.22-3.fc19.x86_64 as install for pam-1.1.6-10.fc19.x86_64 >03:47:59,317 DEBUG yum.verbose.YumBase: TSINFO: Marking cracklib-2.8.22-3.fc19.x86_64 as install for pam-1.1.6-10.fc19.x86_64 >03:47:59,326 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:pkgconfig-0.27.1-1.fc19.x86_64 as install for shared-mime-info-1.1-4.fc19.x86_64 >03:47:59,330 DEBUG yum.verbose.YumBase: TSINFO: Marking libxml2-2.9.1-1.fc19.x86_64 as install for shared-mime-info-1.1-4.fc19.x86_64 >03:47:59,350 DEBUG yum.verbose.YumBase: TSINFO: Marking p11-kit-trust-0.18.1-1.fc19.x86_64 as install for ca-certificates-2012.87-10.1.fc19.noarch >03:47:59,353 DEBUG yum.verbose.YumBase: TSINFO: Marking 
p11-kit-0.18.1-1.fc19.x86_64 as install for ca-certificates-2012.87-10.1.fc19.noarch >03:47:59,364 DEBUG yum.verbose.YumBase: TSINFO: Marking fipscheck-1.3.1-3.fc19.x86_64 as install for fipscheck-lib-1.3.1-3.fc19.x86_64 >03:47:59,371 DEBUG yum.verbose.YumBase: TSINFO: Marking libtasn1-3.3-1.fc19.x86_64 as install for gnutls-3.1.10-1.fc19.x86_64 >03:47:59,372 DEBUG yum.verbose.YumBase: Quick matched libtasn1-3.3-1.fc19.x86_64 to require for libtasn1.so.6()(64bit) >03:47:59,374 DEBUG yum.verbose.YumBase: TSINFO: Marking nettle-2.6-2.fc19.x86_64 as install for gnutls-3.1.10-1.fc19.x86_64 >03:47:59,375 DEBUG yum.verbose.YumBase: Quick matched nettle-2.6-2.fc19.x86_64 to require for libhogweed.so.2()(64bit) >03:47:59,381 DEBUG yum.verbose.YumBase: TSINFO: Marking keyutils-libs-1.5.5-4.fc19.x86_64 as install for krb5-libs-1.11.2-2.fc19.x86_64 >03:47:59,383 DEBUG yum.verbose.YumBase: TSINFO: Marking libverto-0.2.5-2.fc19.x86_64 as install for krb5-libs-1.11.2-2.fc19.x86_64 >03:47:59,405 DEBUG yum.verbose.YumBase: TSINFO: Marking nss-tools-3.14.3-12.0.fc19.x86_64 as install for openldap-2.4.35-1.fc19.x86_64 >03:47:59,410 DEBUG yum.verbose.YumBase: TSINFO: Marking nss-3.14.3-12.0.fc19.x86_64 as install for openldap-2.4.35-1.fc19.x86_64 >03:47:59,412 DEBUG yum.verbose.YumBase: Quick matched nss-3.14.3-12.0.fc19.x86_64 to require for libssl3.so(NSS_3.4)(64bit) >03:47:59,412 DEBUG yum.verbose.YumBase: Quick matched nss-3.14.3-12.0.fc19.x86_64 to require for libssl3.so(NSS_3.2)(64bit) >03:47:59,413 DEBUG yum.verbose.YumBase: Quick matched nss-3.14.3-12.0.fc19.x86_64 to require for libnss3.so(NSS_3.9.3)(64bit) >03:47:59,413 DEBUG yum.verbose.YumBase: Quick matched nss-3.14.3-12.0.fc19.x86_64 to require for libnss3.so(NSS_3.9.2)(64bit) >03:47:59,414 DEBUG yum.verbose.YumBase: Quick matched nss-3.14.3-12.0.fc19.x86_64 to require for libnss3.so(NSS_3.8)(64bit) >03:47:59,415 DEBUG yum.verbose.YumBase: Quick matched nss-3.14.3-12.0.fc19.x86_64 to require for 
libnss3.so(NSS_3.6)(64bit) >03:47:59,416 DEBUG yum.verbose.YumBase: Quick matched nss-3.14.3-12.0.fc19.x86_64 to require for libnss3.so(NSS_3.4)(64bit) >03:47:59,416 DEBUG yum.verbose.YumBase: Quick matched nss-3.14.3-12.0.fc19.x86_64 to require for libnss3.so(NSS_3.3)(64bit) >03:47:59,417 DEBUG yum.verbose.YumBase: Quick matched nss-3.14.3-12.0.fc19.x86_64 to require for libnss3.so(NSS_3.2)(64bit) >03:47:59,417 DEBUG yum.verbose.YumBase: Quick matched nss-3.14.3-12.0.fc19.x86_64 to require for libnss3.so(NSS_3.12.9)(64bit) >03:47:59,418 DEBUG yum.verbose.YumBase: Quick matched nss-3.14.3-12.0.fc19.x86_64 to require for libnss3.so(NSS_3.12.5)(64bit) >03:47:59,418 DEBUG yum.verbose.YumBase: Quick matched nss-3.14.3-12.0.fc19.x86_64 to require for libnss3.so(NSS_3.12.1)(64bit) >03:47:59,419 DEBUG yum.verbose.YumBase: Quick matched nss-3.14.3-12.0.fc19.x86_64 to require for libnss3.so(NSS_3.12)(64bit) >03:47:59,419 DEBUG yum.verbose.YumBase: Quick matched nss-3.14.3-12.0.fc19.x86_64 to require for libnss3.so(NSS_3.11.1)(64bit) >03:47:59,420 DEBUG yum.verbose.YumBase: Quick matched nss-3.14.3-12.0.fc19.x86_64 to require for libnss3.so(NSS_3.11)(64bit) >03:47:59,420 DEBUG yum.verbose.YumBase: Quick matched nss-3.14.3-12.0.fc19.x86_64 to require for libnss3.so(NSS_3.10)(64bit) >03:47:59,421 DEBUG yum.verbose.YumBase: Quick matched nss-3.14.3-12.0.fc19.x86_64 to require for libssl3.so()(64bit) >03:47:59,421 DEBUG yum.verbose.YumBase: Quick matched nss-3.14.3-12.0.fc19.x86_64 to require for libsmime3.so()(64bit) >03:47:59,423 DEBUG yum.verbose.YumBase: TSINFO: Marking cyrus-sasl-lib-2.1.26-6.fc19.x86_64 as install for openldap-2.4.35-1.fc19.x86_64 >03:47:59,425 DEBUG yum.verbose.YumBase: TSINFO: Marking nspr-4.9.5-2.fc19.x86_64 as install for openldap-2.4.35-1.fc19.x86_64 >03:47:59,427 DEBUG yum.verbose.YumBase: Quick matched nspr-4.9.5-2.fc19.x86_64 to require for libplc4.so()(64bit) >03:47:59,428 DEBUG yum.verbose.YumBase: TSINFO: Marking nss-util-3.14.3-1.fc19.x86_64 as 
install for openldap-2.4.35-1.fc19.x86_64 >03:47:59,433 DEBUG yum.verbose.YumBase: TSINFO: Marking policycoreutils-2.1.14-37.fc19.x86_64 as install for selinux-policy-3.12.1-42.fc19.noarch >03:47:59,464 DEBUG yum.verbose.YumBase: TSINFO: Marking nss-softokn-3.14.3-1.fc19.x86_64 as install for nss-3.14.3-12.0.fc19.x86_64 >03:47:59,467 DEBUG yum.verbose.YumBase: TSINFO: Marking nss-sysinit-3.14.3-12.0.fc19.x86_64 as install for nss-3.14.3-12.0.fc19.x86_64 >03:47:59,497 DEBUG yum.verbose.YumBase: TSINFO: Marking libselinux-utils-2.1.13-12.fc19.x86_64 as install for policycoreutils-2.1.14-37.fc19.x86_64 >03:47:59,499 DEBUG yum.verbose.YumBase: TSINFO: Marking systemd-sysv-203-2.fc19.x86_64 as install for policycoreutils-2.1.14-37.fc19.x86_64 >03:47:59,501 DEBUG yum.verbose.YumBase: TSINFO: Marking rpm-4.11.0.1-1.fc19.x86_64 as install for policycoreutils-2.1.14-37.fc19.x86_64 >03:47:59,504 DEBUG yum.verbose.YumBase: TSINFO: Marking diffutils-3.3-1.fc19.x86_64 as install for policycoreutils-2.1.14-37.fc19.x86_64 >03:47:59,514 DEBUG yum.verbose.YumBase: TSINFO: Marking sqlite-3.7.16.2-1.fc19.x86_64 as install for nss-softokn-3.14.3-1.fc19.x86_64 >03:47:59,525 DEBUG yum.verbose.YumBase: TSINFO: Marking curl-7.29.0-6.fc19.x86_64 as install for rpm-4.11.0.1-1.fc19.x86_64 >03:47:59,528 DEBUG yum.verbose.YumBase: TSINFO: Marking libdb-utils-5.3.21-9.fc19.x86_64 as install for rpm-4.11.0.1-1.fc19.x86_64 >03:47:59,530 DEBUG yum.verbose.YumBase: TSINFO: Marking rpm-libs-4.11.0.1-1.fc19.x86_64 as install for rpm-4.11.0.1-1.fc19.x86_64 >03:47:59,532 DEBUG yum.verbose.YumBase: Quick matched rpm-libs-4.11.0.1-1.fc19.x86_64 to require for librpm.so.3()(64bit) >03:47:59,533 DEBUG yum.verbose.YumBase: TSINFO: Marking lua-5.1.4-12.fc19.x86_64 as install for rpm-4.11.0.1-1.fc19.x86_64 >03:47:59,535 DEBUG yum.verbose.YumBase: TSINFO: Marking elfutils-libelf-0.155-5.fc19.x86_64 as install for rpm-4.11.0.1-1.fc19.x86_64 >03:47:59,539 DEBUG yum.verbose.YumBase: TSINFO: Marking 
python-2.7.4-4.fc19.x86_64 as install for systemd-sysv-203-2.fc19.x86_64 >03:47:59,545 DEBUG yum.verbose.YumBase: TSINFO: Marking libcurl-7.29.0-6.fc19.x86_64 as install for curl-7.29.0-6.fc19.x86_64 >03:47:59,547 DEBUG yum.verbose.YumBase: Quick matched libcurl-7.29.0-6.fc19.x86_64 to require for libcurl.so.4()(64bit) >03:47:59,558 DEBUG yum.verbose.YumBase: TSINFO: Marking readline-6.2-6.fc19.x86_64 as install for lua-5.1.4-12.fc19.x86_64 >03:47:59,566 DEBUG yum.verbose.YumBase: TSINFO: Marking python-libs-2.7.4-4.fc19.x86_64 as install for python-2.7.4-4.fc19.x86_64 >03:47:59,567 DEBUG yum.verbose.YumBase: Quick matched python-libs-2.7.4-4.fc19.x86_64 to require for libpython2.7.so.1.0()(64bit) >03:47:59,590 DEBUG yum.verbose.YumBase: TSINFO: Marking libssh2-1.4.3-4.fc19.x86_64 as install for libcurl-7.29.0-6.fc19.x86_64 >03:47:59,591 DEBUG yum.verbose.YumBase: Quick matched libssh2-1.4.3-4.fc19.x86_64 to require for libssh2.so.1()(64bit) >03:47:59,605 DEBUG yum.verbose.YumBase: TSINFO: Marking gdbm-1.10-6.fc19.x86_64 as install for python-libs-2.7.4-4.fc19.x86_64 >03:47:59,606 DEBUG yum.verbose.YumBase: Quick matched gdbm-1.10-6.fc19.x86_64 to require for libgdbm.so.4()(64bit) >03:47:59,656 DEBUG yum.verbose.YumBase: Depsolve time: 1.041 >03:47:59,658 INFO packaging: about to acquire _yum_lock for AnaCheckSoftwareThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1383 (checkSoftwareSelection) >03:47:59,658 INFO packaging: have _yum_lock for AnaCheckSoftwareThread >03:47:59,659 INFO packaging: gave up _yum_lock for AnaCheckSoftwareThread >03:47:59,659 DEBUG packaging: success >03:47:59,660 INFO packaging: gave up _yum_lock for AnaCheckSoftwareThread >03:47:59,661 INFO packaging: about to acquire _yum_lock for AnaCheckSoftwareThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1398 (checkSoftwareSelection) >03:47:59,661 INFO packaging: have _yum_lock for AnaCheckSoftwareThread >03:47:59,666 INFO packaging: 
gave up _yum_lock for AnaCheckSoftwareThread >03:47:59,668 INFO packaging: about to acquire _yum_lock for AnaCheckSoftwareThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:101 (checkSoftwareSelection) >03:47:59,668 INFO packaging: have _yum_lock for AnaCheckSoftwareThread >03:47:59,669 INFO packaging: 126 packages selected totalling 622.82 MB >03:47:59,669 INFO packaging: gave up _yum_lock for AnaCheckSoftwareThread >03:48:00,005 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:152 (ready) >03:48:00,007 INFO packaging: have _yum_lock for MainThread >03:48:00,010 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo) >03:48:00,011 INFO packaging: have _yum_lock for MainThread >03:48:00,011 INFO packaging: gave up _yum_lock for MainThread >03:48:00,012 INFO packaging: gave up _yum_lock for MainThread >03:48:00,013 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:152 (ready) >03:48:00,013 INFO packaging: have _yum_lock for MainThread >03:48:00,015 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo) >03:48:00,015 INFO packaging: have _yum_lock for MainThread >03:48:00,016 INFO packaging: gave up _yum_lock for MainThread >03:48:00,016 INFO packaging: gave up _yum_lock for MainThread >03:48:00,018 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1038 (environmentDescription) >03:48:00,018 INFO packaging: have _yum_lock for MainThread >03:48:00,019 INFO packaging: gave up _yum_lock for MainThread >03:48:00,020 INFO packaging: about to acquire _yum_lock for MainThread at 
/usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:179 (status) >03:48:00,020 INFO packaging: have _yum_lock for MainThread >03:48:00,021 INFO packaging: gave up _yum_lock for MainThread >03:48:00,022 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:152 (ready) >03:48:00,023 INFO packaging: have _yum_lock for MainThread >03:48:00,024 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo) >03:48:00,025 INFO packaging: have _yum_lock for MainThread >03:48:00,025 INFO packaging: gave up _yum_lock for MainThread >03:48:00,026 INFO packaging: gave up _yum_lock for MainThread >03:48:00,027 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1038 (environmentDescription) >03:48:00,028 INFO packaging: have _yum_lock for MainThread >03:48:00,028 INFO packaging: gave up _yum_lock for MainThread >03:48:00,029 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:179 (status) >03:48:00,030 INFO packaging: have _yum_lock for MainThread >03:48:00,031 INFO packaging: gave up _yum_lock for MainThread >03:48:00,035 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/source.py:485 (status) >03:48:00,036 INFO packaging: have _yum_lock for MainThread >03:48:00,037 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo) >03:48:00,038 INFO packaging: have _yum_lock for MainThread >03:48:00,038 INFO packaging: gave up _yum_lock for MainThread >03:48:00,039 INFO packaging: gave up _yum_lock for MainThread >03:48:00,040 INFO packaging: about to acquire _yum_lock for MainThread at 
/usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/source.py:485 (status) >03:48:00,041 INFO packaging: have _yum_lock for MainThread >03:48:00,042 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo) >03:48:00,042 INFO packaging: have _yum_lock for MainThread >03:48:00,043 INFO packaging: gave up _yum_lock for MainThread >03:48:00,043 INFO packaging: gave up _yum_lock for MainThread >03:48:00,045 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/source.py:485 (status) >03:48:00,046 INFO packaging: have _yum_lock for MainThread >03:48:00,047 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo) >03:48:00,047 INFO packaging: have _yum_lock for MainThread >03:48:00,048 INFO packaging: gave up _yum_lock for MainThread >03:48:00,048 INFO packaging: gave up _yum_lock for MainThread >03:48:00,050 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/source.py:485 (status) >03:48:00,050 INFO packaging: have _yum_lock for MainThread >03:48:00,051 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo) >03:48:00,052 INFO packaging: have _yum_lock for MainThread >03:48:00,052 INFO packaging: gave up _yum_lock for MainThread >03:48:00,053 INFO packaging: gave up _yum_lock for MainThread >03:48:00,055 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/source.py:485 (status) >03:48:00,055 INFO packaging: have _yum_lock for MainThread >03:48:00,057 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo) 
>03:48:00,058 INFO packaging: have _yum_lock for MainThread >03:48:00,058 INFO packaging: gave up _yum_lock for MainThread >03:48:00,058 INFO packaging: gave up _yum_lock for MainThread >03:48:00,060 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/source.py:485 (status) >03:48:00,060 INFO packaging: have _yum_lock for MainThread >03:48:00,061 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo) >03:48:00,062 INFO packaging: have _yum_lock for MainThread >03:48:00,062 INFO packaging: gave up _yum_lock for MainThread >03:48:00,063 INFO packaging: gave up _yum_lock for MainThread >03:48:05,680 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/storage.py:187 (_software_is_ready) >03:48:05,680 INFO packaging: have _yum_lock for MainThread >03:48:05,682 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo) >03:48:05,682 INFO packaging: have _yum_lock for MainThread >03:48:05,683 INFO packaging: gave up _yum_lock for MainThread >03:48:05,683 INFO packaging: gave up _yum_lock for MainThread >03:48:08,775 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:152 (ready) >03:48:08,776 INFO packaging: have _yum_lock for MainThread >03:48:08,778 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo) >03:48:08,778 INFO packaging: have _yum_lock for MainThread >03:48:08,779 INFO packaging: gave up _yum_lock for MainThread >03:48:08,779 INFO packaging: gave up _yum_lock for MainThread >03:48:08,781 INFO packaging: about to acquire _yum_lock for MainThread at 
/usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:152 (ready)
>03:48:08,783 INFO packaging: have _yum_lock for MainThread
>03:48:08,785 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo)
>03:48:08,789 INFO packaging: have _yum_lock for MainThread
>03:48:08,789 INFO packaging: gave up _yum_lock for MainThread
>03:48:08,790 INFO packaging: gave up _yum_lock for MainThread
>03:48:08,792 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1038 (environmentDescription)
>03:48:08,796 INFO packaging: have _yum_lock for MainThread
>03:48:08,796 INFO packaging: gave up _yum_lock for MainThread
>03:48:08,797 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:179 (status)
>03:48:08,801 INFO packaging: have _yum_lock for MainThread
>03:48:08,801 INFO packaging: gave up _yum_lock for MainThread
>03:48:08,803 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:152 (ready)
>03:48:08,832 INFO packaging: have _yum_lock for MainThread
>03:48:08,846 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo)
>03:48:08,848 INFO packaging: have _yum_lock for MainThread
>03:48:08,850 INFO packaging: gave up _yum_lock for MainThread
>03:48:08,851 INFO packaging: gave up _yum_lock for MainThread
>03:48:08,853 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1038 (environmentDescription)
>03:48:08,853 INFO packaging: have _yum_lock for MainThread
>03:48:08,854 INFO packaging: gave up _yum_lock for MainThread
>03:48:08,855 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:179 (status)
>03:48:08,855 INFO packaging: have _yum_lock for MainThread
>03:48:08,856 INFO packaging: gave up _yum_lock for MainThread
>03:48:08,879 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/source.py:485 (status)
>03:48:08,885 INFO packaging: have _yum_lock for MainThread
>03:48:08,887 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo)
>03:48:08,887 INFO packaging: have _yum_lock for MainThread
>03:48:08,887 INFO packaging: gave up _yum_lock for MainThread
>03:48:08,888 INFO packaging: gave up _yum_lock for MainThread
>03:48:08,889 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/source.py:485 (status)
>03:48:08,892 INFO packaging: have _yum_lock for MainThread
>03:48:08,893 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo)
>03:48:08,896 INFO packaging: have _yum_lock for MainThread
>03:48:08,896 INFO packaging: gave up _yum_lock for MainThread
>03:48:08,897 INFO packaging: gave up _yum_lock for MainThread
>03:48:08,898 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/source.py:485 (status)
>03:48:08,899 INFO packaging: have _yum_lock for MainThread
>03:48:08,900 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo)
>03:48:08,903 INFO packaging: have _yum_lock for MainThread
>03:48:08,903 INFO packaging: gave up _yum_lock for MainThread
>03:48:08,904 INFO packaging: gave up _yum_lock for MainThread
>03:48:08,905 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/source.py:485 (status)
>03:48:08,906 INFO packaging: have _yum_lock for MainThread
>03:48:08,908 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo)
>03:48:08,910 INFO packaging: have _yum_lock for MainThread
>03:48:08,910 INFO packaging: gave up _yum_lock for MainThread
>03:48:08,913 INFO packaging: gave up _yum_lock for MainThread
>03:48:08,915 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/source.py:485 (status)
>03:48:08,917 INFO packaging: have _yum_lock for MainThread
>03:48:08,919 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo)
>03:48:08,922 INFO packaging: have _yum_lock for MainThread
>03:48:08,922 INFO packaging: gave up _yum_lock for MainThread
>03:48:08,923 INFO packaging: gave up _yum_lock for MainThread
>03:48:08,924 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/source.py:485 (status)
>03:48:08,924 INFO packaging: have _yum_lock for MainThread
>03:48:08,927 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo)
>03:48:08,930 INFO packaging: have _yum_lock for MainThread
>03:48:08,931 INFO packaging: gave up _yum_lock for MainThread
>03:48:08,932 INFO packaging: gave up _yum_lock for MainThread
>03:51:02,491 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:152 (ready)
>03:51:02,491 INFO packaging: have _yum_lock for MainThread
>03:51:02,493 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo)
>03:51:02,493 INFO packaging: have _yum_lock for MainThread
>03:51:02,494 INFO packaging: gave up _yum_lock for MainThread
>03:51:02,494 INFO packaging: gave up _yum_lock for MainThread
>03:51:02,496 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:152 (ready)
>03:51:02,499 INFO packaging: have _yum_lock for MainThread
>03:51:02,500 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo)
>03:51:02,501 INFO packaging: have _yum_lock for MainThread
>03:51:02,501 INFO packaging: gave up _yum_lock for MainThread
>03:51:02,501 INFO packaging: gave up _yum_lock for MainThread
>03:51:02,504 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1038 (environmentDescription)
>03:51:02,506 INFO packaging: have _yum_lock for MainThread
>03:51:02,506 INFO packaging: gave up _yum_lock for MainThread
>03:51:02,509 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:179 (status)
>03:51:02,509 INFO packaging: have _yum_lock for MainThread
>03:51:02,510 INFO packaging: gave up _yum_lock for MainThread
>03:51:02,514 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:152 (ready)
>03:51:02,515 INFO packaging: have _yum_lock for MainThread
>03:51:02,518 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo)
>03:51:02,519 INFO packaging: have _yum_lock for MainThread
>03:51:02,520 INFO packaging: gave up _yum_lock for MainThread
>03:51:02,521 INFO packaging: gave up _yum_lock for MainThread
>03:51:02,523 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1038 (environmentDescription)
>03:51:02,526 INFO packaging: have _yum_lock for MainThread
>03:51:02,526 INFO packaging: gave up _yum_lock for MainThread
>03:51:02,528 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/software.py:179 (status)
>03:51:02,530 INFO packaging: have _yum_lock for MainThread
>03:51:02,531 INFO packaging: gave up _yum_lock for MainThread
>03:51:02,544 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/source.py:485 (status)
>03:51:02,545 INFO packaging: have _yum_lock for MainThread
>03:51:02,546 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo)
>03:51:02,546 INFO packaging: have _yum_lock for MainThread
>03:51:02,547 INFO packaging: gave up _yum_lock for MainThread
>03:51:02,547 INFO packaging: gave up _yum_lock for MainThread
>03:51:02,549 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/source.py:485 (status)
>03:51:02,549 INFO packaging: have _yum_lock for MainThread
>03:51:02,551 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo)
>03:51:02,551 INFO packaging: have _yum_lock for MainThread
>03:51:02,552 INFO packaging: gave up _yum_lock for MainThread
>03:51:02,552 INFO packaging: gave up _yum_lock for MainThread
>03:51:02,554 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/source.py:485 (status)
>03:51:02,554 INFO packaging: have _yum_lock for MainThread
>03:51:02,556 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo)
>03:51:02,556 INFO packaging: have _yum_lock for MainThread
>03:51:02,557 INFO packaging: gave up _yum_lock for MainThread
>03:51:02,557 INFO packaging: gave up _yum_lock for MainThread
>03:51:02,559 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/source.py:485 (status)
>03:51:02,559 INFO packaging: have _yum_lock for MainThread
>03:51:02,561 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo)
>03:51:02,561 INFO packaging: have _yum_lock for MainThread
>03:51:02,561 INFO packaging: gave up _yum_lock for MainThread
>03:51:02,562 INFO packaging: gave up _yum_lock for MainThread
>03:51:02,564 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/source.py:485 (status)
>03:51:02,564 INFO packaging: have _yum_lock for MainThread
>03:51:02,566 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo)
>03:51:02,566 INFO packaging: have _yum_lock for MainThread
>03:51:02,567 INFO packaging: gave up _yum_lock for MainThread
>03:51:02,567 INFO packaging: gave up _yum_lock for MainThread
>03:51:02,569 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/ui/gui/spokes/source.py:485 (status)
>03:51:02,569 INFO packaging: have _yum_lock for MainThread
>03:51:02,571 INFO packaging: about to acquire _yum_lock for MainThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:412 (baseRepo)
>03:51:02,571 INFO packaging: have _yum_lock for MainThread
>03:51:02,572 INFO packaging: gave up _yum_lock for MainThread
>03:51:02,572 INFO packaging: gave up _yum_lock for MainThread
>03:51:04,509 INFO packaging: about to acquire _yum_lock for AnaInstallThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:388 (preStorage)
>03:51:04,509 INFO packaging: have _yum_lock for AnaInstallThread
>03:51:04,509 DEBUG packaging: deleting package sacks
>03:51:04,512 INFO packaging: gave up _yum_lock for AnaInstallThread
>03:51:04,513 INFO packaging: about to acquire _yum_lock for AnaInstallThread at /usr/lib64/python2.7/site-packages/pyanaconda/install.py:124 (doInstall)
>03:51:04,514 INFO packaging: have _yum_lock for AnaInstallThread
>03:51:04,514 INFO packaging: gave up _yum_lock for AnaInstallThread
>03:51:54,777 INFO packaging: about to acquire _yum_lock for AnaInstallThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1122 (languageGroups)
>03:51:54,778 INFO packaging: have _yum_lock for AnaInstallThread
>03:51:54,778 INFO packaging: gave up _yum_lock for AnaInstallThread
>03:51:54,779 INFO packaging: about to acquire _yum_lock for AnaInstallThread at /usr/lib64/python2.7/site-packages/pyanaconda/install.py:136 (doInstall)
>03:51:54,780 INFO packaging: have _yum_lock for AnaInstallThread
>03:51:54,781 INFO packaging: gave up _yum_lock for AnaInstallThread
>03:51:54,783 INFO packaging: about to acquire _yum_lock for AnaInstallThread at /usr/lib64/python2.7/site-packages/pyanaconda/install.py:136 (doInstall)
>03:51:54,784 INFO packaging: have _yum_lock for AnaInstallThread
>03:51:54,785 DEBUG packaging: configuring langpacks for ['en_US.UTF-8']
>03:51:54,789 INFO packaging: about to acquire _yum_lock for AnaInstallThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:354 (_writeInstallConfig)
>03:51:54,790 INFO packaging: have _yum_lock for AnaInstallThread
>03:51:54,813 DEBUG packaging: getting release version from tree at None (19)
>03:51:54,814 DEBUG packaging: got a release version of 19
>03:51:54,815 INFO packaging: gave up _yum_lock for AnaInstallThread
>03:51:54,815 DEBUG packaging: setting releasever to previous value of 19
>03:51:54,817 INFO packaging: about to acquire _yum_lock for AnaInstallThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:358 (_writeInstallConfig)
>03:51:54,817 INFO packaging: have _yum_lock for AnaInstallThread
>03:51:54,829 INFO_2 yum.verbose.YumPlugins: Loaded plugins: blacklist, fastestmirror, langpacks, whiteout
>03:51:54,830 INFO_2 yum.verbose.YumPlugins: No plugin match for: fastestmirror
>03:51:54,831 INFO_2 yum.verbose.YumPlugins: No plugin match for: langpacks
>03:51:54,832 DEBUG yum.verbose.plugin: Adding en_US.UTF-8 to language list
>03:51:54,836 DEBUG yum.verbose.YumBase: Config time: 0.018
>03:51:54,838 INFO packaging: gave up _yum_lock for AnaInstallThread
>03:51:54,838 INFO packaging: gathering repo metadata
>03:51:54,840 INFO packaging: about to acquire _yum_lock for AnaInstallThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:546 (gatherRepoMetadata)
>03:51:54,841 INFO packaging: have _yum_lock for AnaInstallThread
>03:51:54,851 INFO packaging: gave up _yum_lock for AnaInstallThread
>03:51:54,852 INFO packaging: about to acquire _yum_lock for AnaInstallThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:359 (_writeInstallConfig)
>03:51:54,853 INFO packaging: have _yum_lock for AnaInstallThread
>03:51:54,854 DEBUG packaging: getting repo metadata for updates
>03:51:54,856 INFO packaging: about to acquire _yum_lock for AnaInstallThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:551 (gatherRepoMetadata)
>03:51:54,856 INFO packaging: have _yum_lock for AnaInstallThread
>03:51:54,862 DEBUG packaging: getting group info for updates
>03:51:54,862 ERR packaging: failed to get groups for repo updates
>03:51:54,863 INFO packaging: gave up _yum_lock for AnaInstallThread
>03:51:54,864 INFO packaging: gave up _yum_lock for AnaInstallThread
>03:51:54,865 INFO packaging: about to acquire _yum_lock for AnaInstallThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:359 (_writeInstallConfig)
>03:51:54,865 INFO packaging: have _yum_lock for AnaInstallThread
>03:51:54,866 DEBUG packaging: getting repo metadata for anaconda
>03:51:54,867 INFO packaging: about to acquire _yum_lock for AnaInstallThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:551 (gatherRepoMetadata)
>03:51:54,868 INFO packaging: have _yum_lock for AnaInstallThread
>03:51:54,883 DEBUG packaging: getting group info for anaconda
>03:51:54,885 INFO packaging: gave up _yum_lock for AnaInstallThread
>03:51:54,886 INFO packaging: gave up _yum_lock for AnaInstallThread
>03:51:54,886 INFO packaging: metadata retrieval complete
>03:51:54,887 DEBUG packaging: installation yum config repos: anaconda,updates
>03:51:54,887 INFO packaging: gave up _yum_lock for AnaInstallThread
>03:51:54,888 INFO packaging: checking software selection
>03:51:54,889 INFO packaging: about to acquire _yum_lock for AnaInstallThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1370 (checkSoftwareSelection)
>03:51:54,890 INFO packaging: have _yum_lock for AnaInstallThread
>03:51:54,890 DEBUG packaging: deleting package sacks
>03:51:54,891 INFO packaging: gave up _yum_lock for AnaInstallThread
>03:51:54,892 INFO packaging: about to acquire _yum_lock for AnaInstallThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1371 (checkSoftwareSelection)
>03:51:54,892 INFO packaging: have _yum_lock for AnaInstallThread
>03:51:54,893 DEBUG packaging: deleting yum transaction info
>03:51:54,894 INFO packaging: gave up _yum_lock for AnaInstallThread
>03:51:54,895 DEBUG packaging: select group core
>03:51:54,897 INFO packaging: about to acquire _yum_lock for AnaInstallThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1299 (_applyYumSelections)
>03:51:54,897 INFO packaging: have _yum_lock for AnaInstallThread
>03:51:54,898 DEBUG yum.verbose.YumBase: Setting up Package Sacks
>03:51:54,899 INFO_2 yum.verbose.plugin: Loading mirror speeds from cached hostfile
>03:51:54,900 INFO_2 yum.verbose.plugin: * updates: mirror.globo.com
>03:51:55,813 DEBUG yum.verbose.YumBase: rpmdb time: 0.000
>03:51:55,889 DEBUG yum.verbose.YumBase: pkgsack time: 0.991
>03:51:56,175 DEBUG yum.verbose.YumBase: group time: 1.277
>03:51:56,410 DEBUG yum.verbose.YumBase: Obs Init time: 0.225
>03:51:56,450 DEBUG yum.verbose.YumBase: No package named ppc64-utils available to be installed
>03:51:56,451 INFO packaging: gave up _yum_lock for AnaInstallThread
>03:51:56,452 DEBUG packaging: select group core
>03:51:56,453 INFO packaging: about to acquire _yum_lock for AnaInstallThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1320 (_applyYumSelections)
>03:51:56,454 INFO packaging: have _yum_lock for AnaInstallThread
>03:51:56,454 INFO packaging: gave up _yum_lock for AnaInstallThread
>03:51:56,454 DEBUG packaging: select package kernel-PAE
>03:51:56,456 INFO packaging: about to acquire _yum_lock for AnaInstallThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1412 (selectKernelPackage)
>03:51:56,456 INFO packaging: have _yum_lock for AnaInstallThread
>03:51:56,552 DEBUG yum.verbose.YumBase: Checking for virtual provide or file-provide for kernel-PAE
>03:51:56,561 INFO packaging: gave up _yum_lock for AnaInstallThread
>03:51:56,563 INFO packaging: no kernel-PAE package
>03:51:56,565 DEBUG packaging: select package kernel
>03:51:56,568 INFO packaging: about to acquire _yum_lock for AnaInstallThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1412 (selectKernelPackage)
>03:51:56,570 INFO packaging: have _yum_lock for AnaInstallThread
>03:51:56,573 INFO packaging: gave up _yum_lock for AnaInstallThread
>03:51:56,577 INFO packaging: selected kernel
>03:51:56,582 INFO packaging: about to acquire _yum_lock for AnaInstallThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1336 (_applyYumSelections)
>03:51:56,584 INFO packaging: have _yum_lock for AnaInstallThread
>03:51:56,586 INFO packaging: gave up _yum_lock for AnaInstallThread
>03:51:56,587 DEBUG packaging: select package grub2
>03:51:56,590 INFO packaging: about to acquire _yum_lock for AnaInstallThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1434 (selectRequiredPackages)
>03:51:56,593 INFO packaging: have _yum_lock for AnaInstallThread
>03:51:56,598 INFO packaging: gave up _yum_lock for AnaInstallThread
>03:51:56,601 DEBUG packaging: select package mdadm
>03:51:56,604 INFO packaging: about to acquire _yum_lock for AnaInstallThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1434 (selectRequiredPackages)
>03:51:56,606 INFO packaging: have _yum_lock for AnaInstallThread
>03:51:56,636 INFO packaging: gave up _yum_lock for AnaInstallThread
>03:51:56,640 DEBUG packaging: select package e2fsprogs
>03:51:56,641 INFO packaging: about to acquire _yum_lock for AnaInstallThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1434 (selectRequiredPackages)
>03:51:56,661 INFO packaging: have _yum_lock for AnaInstallThread
>03:51:56,663 INFO packaging: gave up _yum_lock for AnaInstallThread
>03:51:56,664 DEBUG packaging: select package authconfig
>03:51:56,666 INFO packaging: about to acquire _yum_lock for AnaInstallThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1434 (selectRequiredPackages)
>03:51:56,667 INFO packaging: have _yum_lock for AnaInstallThread
>03:51:56,668 INFO packaging: gave up _yum_lock for AnaInstallThread
>03:51:56,668 DEBUG packaging: select package firewalld
>03:51:56,670 INFO packaging: about to acquire _yum_lock for AnaInstallThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1434 (selectRequiredPackages)
>03:51:56,671 INFO packaging: have _yum_lock for AnaInstallThread
>03:51:56,671 INFO packaging: gave up _yum_lock for AnaInstallThread
>03:51:56,672 INFO packaging: about to acquire _yum_lock for AnaInstallThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1460 (preInstall)
>03:51:56,673 INFO packaging: have _yum_lock for AnaInstallThread
>03:51:56,673 INFO packaging: checking dependencies
>03:51:56,992 DEBUG yum.verbose.YumBase: Building updates object
>03:51:57,374 DEBUG yum.verbose.YumBase: up:simple updates time: 0.112
>03:51:57,386 DEBUG yum.verbose.YumBase: up:obs time: 0.004
>03:51:57,390 DEBUG yum.verbose.YumBase: up:condense time: 0.000
>03:51:57,391 DEBUG yum.verbose.YumBase: updates time: 0.398
>03:51:57,400 DEBUG yum.verbose.YumBase: TSINFO: Marking ppp-2.4.5-28.fc19.x86_64 as install for 1:NetworkManager-0.9.8.1-1.git20130327.fc19.x86_64
>03:51:57,407 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:NetworkManager-glib-0.9.8.1-1.git20130327.fc19.x86_64 as install for 1:NetworkManager-0.9.8.1-1.git20130327.fc19.x86_64
>03:51:57,417 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:wpa_supplicant-1.1-1.fc19.x86_64 as install for 1:NetworkManager-0.9.8.1-1.git20130327.fc19.x86_64
>03:51:57,424 DEBUG yum.verbose.YumBase: TSINFO: Marking libnl3-3.2.21-1.fc19.x86_64 as install for 1:NetworkManager-0.9.8.1-1.git20130327.fc19.x86_64
>03:51:57,433 DEBUG yum.verbose.YumBase: TSINFO: Marking glib2-2.36.1-2.fc19.x86_64 as install for 1:NetworkManager-0.9.8.1-1.git20130327.fc19.x86_64
>03:51:57,439 DEBUG yum.verbose.YumBase: TSINFO: Marking dbus-glib-0.100-3.fc19.x86_64 as install for 1:NetworkManager-0.9.8.1-1.git20130327.fc19.x86_64
>03:51:57,481 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:dbus-1.6.8-5.fc19.x86_64 as install for 1:NetworkManager-0.9.8.1-1.git20130327.fc19.x86_64
>03:51:57,484 DEBUG yum.verbose.YumBase: TSINFO: Marking systemd-sysv-203-2.fc19.x86_64 as install for 1:NetworkManager-0.9.8.1-1.git20130327.fc19.x86_64
>03:51:57,499 DEBUG yum.verbose.YumBase: TSINFO: Marking iptables-1.4.18-1.fc19.x86_64 as install for 1:NetworkManager-0.9.8.1-1.git20130327.fc19.x86_64
>03:51:57,502 DEBUG yum.verbose.YumBase: TSINFO: Marking dnsmasq-2.66-3.fc19.x86_64 as install for 1:NetworkManager-0.9.8.1-1.git20130327.fc19.x86_64
>03:51:57,506 DEBUG yum.verbose.YumBase: TSINFO: Marking chkconfig-1.3.60-1.fc19.x86_64 as install for 1:NetworkManager-0.9.8.1-1.git20130327.fc19.x86_64
>03:51:57,508 DEBUG yum.verbose.YumBase: Quick matched chkconfig-1.3.60-1.fc19.x86_64 to require for chkconfig
>03:51:57,509 DEBUG yum.verbose.YumBase: TSINFO: Marking avahi-autoipd-0.6.31-11.fc19.x86_64 as install for 1:NetworkManager-0.9.8.1-1.git20130327.fc19.x86_64
>03:51:57,511 DEBUG yum.verbose.YumBase: TSINFO: Marking libuuid-2.23-1.fc19.x86_64 as install for 1:NetworkManager-0.9.8.1-1.git20130327.fc19.x86_64
>03:51:57,514 DEBUG yum.verbose.YumBase: TSINFO: Marking systemd-libs-203-2.fc19.x86_64 as install for 1:NetworkManager-0.9.8.1-1.git20130327.fc19.x86_64
>03:51:57,518 DEBUG yum.verbose.YumBase: TSINFO: Marking nss-3.14.3-12.0.fc19.x86_64 as install for 1:NetworkManager-0.9.8.1-1.git20130327.fc19.x86_64
>03:51:57,520 DEBUG yum.verbose.YumBase: Quick matched nss-3.14.3-12.0.fc19.x86_64 to require for libsmime3.so()(64bit)
>03:51:57,523 DEBUG yum.verbose.YumBase: TSINFO: Marking polkit-0.110-3.fc19.x86_64 as install for 1:NetworkManager-0.9.8.1-1.git20130327.fc19.x86_64
>03:51:57,526 DEBUG yum.verbose.YumBase: TSINFO: Marking nspr-4.9.5-2.fc19.x86_64 as install for 1:NetworkManager-0.9.8.1-1.git20130327.fc19.x86_64
>03:51:57,547 DEBUG yum.verbose.YumBase: Quick matched nspr-4.9.5-2.fc19.x86_64 to require for libplc4.so()(64bit)
>03:51:57,567 DEBUG yum.verbose.YumBase: TSINFO: Marking nss-util-3.14.3-1.fc19.x86_64 as install for 1:NetworkManager-0.9.8.1-1.git20130327.fc19.x86_64
>03:51:57,611 DEBUG yum.verbose.YumBase: Quick matched 1:NetworkManager-glib-0.9.8.1-1.git20130327.fc19.x86_64 to require for libnm-glib.so.4()(64bit)
>03:51:57,616 DEBUG yum.verbose.YumBase: Quick matched libnl3-3.2.21-1.fc19.x86_64 to require for libnl-genl-3.so.200()(64bit)
>03:51:57,617 DEBUG yum.verbose.YumBase: Quick matched libnl3-3.2.21-1.fc19.x86_64 to require for libnl-3.so.200()(64bit)
>03:51:57,620 DEBUG yum.verbose.YumBase: TSINFO: Marking libgudev1-203-2.fc19.x86_64 as install for 1:NetworkManager-0.9.8.1-1.git20130327.fc19.x86_64
>03:51:57,627 DEBUG yum.verbose.YumBase: Quick matched glib2-2.36.1-2.fc19.x86_64 to require for libgmodule-2.0.so.0()(64bit)
>03:51:57,632 DEBUG yum.verbose.YumBase: Quick matched glib2-2.36.1-2.fc19.x86_64 to require for libglib-2.0.so.0()(64bit)
>03:51:57,633 DEBUG yum.verbose.YumBase: Quick matched glib2-2.36.1-2.fc19.x86_64 to require for libgio-2.0.so.0()(64bit)
>03:51:57,637 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:dbus-libs-1.6.8-5.fc19.x86_64 as install for 1:NetworkManager-0.9.8.1-1.git20130327.fc19.x86_64
>03:51:57,668 DEBUG yum.verbose.YumBase: TSINFO: Marking audit-libs-2.3-2.fc19.x86_64 as install for audit-2.3-2.fc19.x86_64
>03:51:57,676 DEBUG yum.verbose.YumBase: TSINFO: Marking krb5-libs-1.11.2-2.fc19.x86_64 as install for audit-2.3-2.fc19.x86_64
>03:51:57,685 DEBUG yum.verbose.YumBase: Quick matched krb5-libs-1.11.2-2.fc19.x86_64 to require for libgssapi_krb5.so.2(gssapi_krb5_2_MIT)(64bit)
>03:51:57,718 DEBUG yum.verbose.YumBase: TSINFO: Marking tcp_wrappers-libs-7.6-73.fc19.x86_64 as install for audit-2.3-2.fc19.x86_64
>03:51:57,739 DEBUG yum.verbose.YumBase: Quick matched krb5-libs-1.11.2-2.fc19.x86_64 to require for libgssapi_krb5.so.2()(64bit)
>03:51:57,740 DEBUG yum.verbose.YumBase: Quick matched audit-libs-2.3-2.fc19.x86_64 to require for libaudit.so.1()(64bit)
>03:51:57,747 DEBUG yum.verbose.YumBase: TSINFO: Marking python-2.7.4-4.fc19.x86_64 as install for authconfig-6.2.6-2.fc19.x86_64
>03:51:57,750 DEBUG yum.verbose.YumBase: TSINFO: Marking libpwquality-1.2.1-2.fc19.x86_64 as install for authconfig-6.2.6-2.fc19.x86_64
>03:51:57,755 DEBUG yum.verbose.YumBase: TSINFO: Marking pam-1.1.6-10.fc19.x86_64 as install for authconfig-6.2.6-2.fc19.x86_64
>03:51:57,759 DEBUG yum.verbose.YumBase: TSINFO: Marking newt-python-0.52.15-1.fc19.x86_64 as install for authconfig-6.2.6-2.fc19.x86_64
>03:51:57,761 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:openssl-1.0.1e-4.fc19.x86_64 as install for authconfig-6.2.6-2.fc19.x86_64
>03:51:57,767 DEBUG yum.verbose.YumBase: TSINFO: Marking ncurses-libs-5.9-10.20130413.fc19.x86_64 as install for bash-4.2.45-1.fc19.x86_64
>03:51:57,772 DEBUG yum.verbose.YumBase: TSINFO: Marking pciutils-libs-3.1.10-3.fc19.x86_64 as install for biosdevname-0.4.1-4.fc19.x86_64
>03:51:57,775 DEBUG yum.verbose.YumBase: TSINFO: Marking zlib-1.2.7-10.fc19.x86_64 as install for biosdevname-0.4.1-4.fc19.x86_64
>03:51:57,784 DEBUG yum.verbose.YumBase: TSINFO: Marking libattr-2.4.46-10.fc19.x86_64 as install for coreutils-8.21-8.fc19.x86_64
>03:51:57,786 DEBUG yum.verbose.YumBase: TSINFO: Marking libacl-2.2.51-9.fc19.x86_64 as install for coreutils-8.21-8.fc19.x86_64
>03:51:57,790 DEBUG yum.verbose.YumBase: TSINFO: Marking grep-2.14-3.fc19.x86_64 as install for coreutils-8.21-8.fc19.x86_64
>03:51:57,792 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:gmp-5.1.1-2.fc19.x86_64 as install for coreutils-8.21-8.fc19.x86_64
>03:51:57,794 DEBUG yum.verbose.YumBase: TSINFO: Marking info-5.1-1.fc19.x86_64 as install for coreutils-8.21-8.fc19.x86_64
>03:51:57,796 DEBUG yum.verbose.YumBase: TSINFO: Marking libselinux-2.1.13-12.fc19.x86_64 as install for coreutils-8.21-8.fc19.x86_64
>03:51:57,799 DEBUG yum.verbose.YumBase: TSINFO: Marking libcap-2.22-5.fc19.x86_64 as install for coreutils-8.21-8.fc19.x86_64
>03:51:57,836 DEBUG yum.verbose.YumBase: TSINFO: Marking sed-4.2.2-2.fc19.x86_64 as install for cronie-1.4.10-4.fc19.x86_64
>03:51:57,881 DEBUG yum.verbose.YumBase: TSINFO: Marking cronie-anacron-1.4.10-4.fc19.x86_64 as install for cronie-1.4.10-4.fc19.x86_64
>03:51:57,899 DEBUG yum.verbose.YumBase: TSINFO: Marking libcurl-7.29.0-6.fc19.x86_64 as install for curl-7.29.0-6.fc19.x86_64
>03:51:57,906 DEBUG yum.verbose.YumBase: Quick matched libcurl-7.29.0-6.fc19.x86_64 to require for libcurl.so.4()(64bit)
>03:51:57,926 DEBUG yum.verbose.YumBase: TSINFO: Marking 12:dhcp-libs-4.2.5-10.fc19.x86_64 as install for 12:dhclient-4.2.5-10.fc19.x86_64
>03:51:57,931 DEBUG yum.verbose.YumBase: TSINFO: Marking 12:dhcp-common-4.2.5-10.fc19.x86_64 as install for 12:dhclient-4.2.5-10.fc19.x86_64
>03:51:57,937 DEBUG yum.verbose.YumBase: TSINFO: Marking openldap-2.4.35-1.fc19.x86_64 as install for 12:dhclient-4.2.5-10.fc19.x86_64
>03:51:57,946 DEBUG yum.verbose.YumBase: Quick matched openldap-2.4.35-1.fc19.x86_64 to require for liblber-2.4.so.2()(64bit)
>03:51:58,004 DEBUG yum.verbose.YumBase: TSINFO: Marking 32:bind-libs-lite-9.9.3-0.2.rc1.fc19.x86_64 as install for 12:dhclient-4.2.5-10.fc19.x86_64
>03:51:58,006 DEBUG yum.verbose.YumBase: Quick matched 32:bind-libs-lite-9.9.3-0.2.rc1.fc19.x86_64 to require for libdns-export.so.98()(64bit)
>03:51:58,008 DEBUG yum.verbose.YumBase: TSINFO: Marking libcap-ng-0.7.3-3.fc19.x86_64 as install for 12:dhclient-4.2.5-10.fc19.x86_64
>03:51:58,012 DEBUG yum.verbose.YumBase: TSINFO: Marking libss-1.42.7-2.fc19.x86_64 as install for e2fsprogs-1.42.7-2.fc19.x86_64
>03:51:58,014 DEBUG yum.verbose.YumBase: TSINFO: Marking libcom_err-1.42.7-2.fc19.x86_64 as install for e2fsprogs-1.42.7-2.fc19.x86_64
>03:51:58,016 DEBUG yum.verbose.YumBase: TSINFO: Marking e2fsprogs-libs-1.42.7-2.fc19.x86_64 as install for e2fsprogs-1.42.7-2.fc19.x86_64
>03:51:58,021 DEBUG yum.verbose.YumBase: TSINFO: Marking libblkid-2.23-1.fc19.x86_64 as install for e2fsprogs-1.42.7-2.fc19.x86_64
>03:51:58,022 DEBUG yum.verbose.YumBase: Quick matched libblkid-2.23-1.fc19.x86_64 to require for libblkid.so.1(BLKID_2.15)(64bit)
>03:51:58,023 DEBUG yum.verbose.YumBase: Quick matched libblkid-2.23-1.fc19.x86_64 to require for libblkid.so.1(BLKID_1.0)(64bit)
>03:51:58,024 DEBUG yum.verbose.YumBase: Quick matched e2fsprogs-libs-1.42.7-2.fc19.x86_64 to require for libe2p.so.2()(64bit)
>03:51:58,028 DEBUG yum.verbose.YumBase: TSINFO: Marking python-slip-dbus-0.4.0-1.fc19.noarch as install for firewalld-0.3.2-1.fc19.noarch
>03:51:58,030 DEBUG yum.verbose.YumBase: TSINFO: Marking python-decorator-3.4.0-2.fc19.noarch as install for firewalld-0.3.2-1.fc19.noarch
>03:51:58,032 DEBUG yum.verbose.YumBase: TSINFO: Marking pygobject3-base-3.8.1-2.fc19.x86_64 as install for firewalld-0.3.2-1.fc19.noarch
>03:51:58,036 DEBUG yum.verbose.YumBase: TSINFO: Marking ebtables-2.0.10-8.fc19.x86_64 as install for firewalld-0.3.2-1.fc19.noarch
>03:51:58,039 DEBUG yum.verbose.YumBase: TSINFO: Marking dbus-python-1.1.1-5.fc19.x86_64 as install for firewalld-0.3.2-1.fc19.noarch
>03:51:58,044 DEBUG yum.verbose.YumBase: TSINFO: Marking glibc-common-2.17-4.fc19.x86_64 as install for glibc-2.17-4.fc19.x86_64
>03:51:58,047 DEBUG yum.verbose.YumBase: TSINFO: Marking libgcc-4.8.0-2.fc19.x86_64 as install for glibc-2.17-4.fc19.x86_64
>03:51:58,049 DEBUG yum.verbose.YumBase: TSINFO: Marking nss-softokn-freebl-3.14.3-1.fc19.x86_64 as install for glibc-2.17-4.fc19.x86_64
>03:51:58,052 DEBUG yum.verbose.YumBase: Quick matched nss-softokn-freebl-3.14.3-1.fc19.x86_64 to require for libfreebl3.so()(64bit)
>03:51:58,055 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:grub2-tools-2.00-16.fc19.x86_64 as install for 1:grub2-2.00-16.fc19.x86_64
>03:51:58,057 DEBUG yum.verbose.YumBase: TSINFO: Marking which-2.20-5.fc19.x86_64 as install for 1:grub2-2.00-16.fc19.x86_64
>03:51:58,059 DEBUG yum.verbose.YumBase: TSINFO: Marking fedora-logos-19.0.1-1.fc19.noarch as install for 1:grub2-2.00-16.fc19.x86_64
>03:51:58,062 DEBUG yum.verbose.YumBase: TSINFO: Marking os-prober-1.58-1.fc19.x86_64 as install for 1:grub2-2.00-16.fc19.x86_64
>03:51:58,063 DEBUG yum.verbose.YumBase: TSINFO: Marking gettext-0.18.2.1-1.fc19.x86_64 as install for 1:grub2-2.00-16.fc19.x86_64
>03:51:58,066 DEBUG yum.verbose.YumBase: TSINFO: Marking file-5.11-9.fc19.x86_64 as install for 1:grub2-2.00-16.fc19.x86_64
>03:51:58,070 DEBUG yum.verbose.YumBase: TSINFO: Marking dracut-027-45.git20130430.fc19.x86_64 as install for 1:grub2-2.00-16.fc19.x86_64
>03:51:58,081 DEBUG yum.verbose.YumBase: TSINFO: Marking sysvinit-tools-2.88-10.dsf.fc19.x86_64 as install for initscripts-9.46-1.fc19.x86_64
>03:51:58,084 DEBUG yum.verbose.YumBase: TSINFO: Marking kmod-13-2.fc19.x86_64 as install for initscripts-9.46-1.fc19.x86_64
>03:51:58,091 DEBUG yum.verbose.YumBase: TSINFO: Marking popt-1.13-14.fc19.x86_64 as install for initscripts-9.46-1.fc19.x86_64
>03:51:58,093 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:findutils-4.5.11-1.fc19.x86_64 as install for initscripts-9.46-1.fc19.x86_64
>03:51:58,095 DEBUG yum.verbose.YumBase: TSINFO: Marking cpio-2.11-20.fc19.x86_64 as install for initscripts-9.46-1.fc19.x86_64
>03:51:58,098 DEBUG yum.verbose.YumBase: TSINFO: Marking fedora-release-19-0.5.noarch as install for initscripts-9.46-1.fc19.x86_64
>03:51:58,100 DEBUG yum.verbose.YumBase: TSINFO: Marking gawk-4.0.2-2.fc19.x86_64 as install for initscripts-9.46-1.fc19.x86_64
>03:51:58,204 DEBUG yum.verbose.YumBase: TSINFO: Marking libdb-5.3.21-9.fc19.x86_64 as install for iproute-3.9.0-1.fc19.x86_64
>03:51:58,215 DEBUG yum.verbose.YumBase: TSINFO: Marking linux-atm-libs-2.5.1-7.fc19.x86_64 as install for iproute-3.9.0-1.fc19.x86_64
>03:51:58,228 DEBUG yum.verbose.YumBase: TSINFO: Marking libsysfs-2.1.0-13.fc19.x86_64 as install for iprutils-2.3.13-2.fc19.x86_64
>03:51:58,245 DEBUG yum.verbose.YumBase: TSINFO: Marking libidn-1.26-2.fc19.x86_64 as install for iputils-20121221-2.fc19.x86_64
>03:51:58,255 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:openssl-libs-1.0.1e-4.fc19.x86_64 as install for iputils-20121221-2.fc19.x86_64
>03:51:58,308 DEBUG yum.verbose.YumBase: TSINFO: Marking kbd-misc-1.15.5-5.fc19.noarch as install for kbd-1.15.5-5.fc19.x86_64
>03:51:58,333 DEBUG yum.verbose.YumBase: TSINFO: Marking linux-firmware-20130201-0.5.git65a5163.fc19.noarch as install for kernel-3.9.0-301.fc19.x86_64
>03:51:58,336 DEBUG yum.verbose.YumBase: TSINFO: Marking grubby-8.24-1.fc19.x86_64 as install for kernel-3.9.0-301.fc19.x86_64
>03:51:58,340 DEBUG yum.verbose.YumBase: TSINFO: Marking groff-base-1.22.2-2.fc19.x86_64 as install for less-458-2.fc19.x86_64
>03:51:58,347 DEBUG yum.verbose.YumBase: TSINFO: Marking gzip-1.5-4.fc19.x86_64 as install for man-db-2.6.3-6.fc19.x86_64
>03:51:58,349 DEBUG yum.verbose.YumBase: TSINFO: Marking libpipeline-1.2.3-1.fc19.x86_64 as install for man-db-2.6.3-6.fc19.x86_64
>03:51:58,351 DEBUG yum.verbose.YumBase: TSINFO: Marking gdbm-1.10-6.fc19.x86_64 as install for man-db-2.6.3-6.fc19.x86_64
>03:51:58,357 DEBUG yum.verbose.YumBase: TSINFO: Marking libreport-filesystem-2.1.3-3.fc19.x86_64 as install for mdadm-3.2.6-15.fc19.x86_64
>03:51:58,369 DEBUG yum.verbose.YumBase: TSINFO: Marking openssh-6.2p1-4.fc19.x86_64 as install for openssh-clients-6.2p1-4.fc19.x86_64
>03:51:58,372 DEBUG yum.verbose.YumBase: TSINFO: Marking fipscheck-lib-1.3.1-3.fc19.x86_64 as install for openssh-clients-6.2p1-4.fc19.x86_64
>03:51:58,373 DEBUG yum.verbose.YumBase: Quick matched fipscheck-lib-1.3.1-3.fc19.x86_64 to require for libfipscheck.so.1()(64bit)
>03:51:58,374 DEBUG yum.verbose.YumBase: TSINFO: Marking libedit-3.0-10.20121213cvs.fc19.x86_64 as install for openssh-clients-6.2p1-4.fc19.x86_64
>03:51:58,487 DEBUG yum.verbose.YumBase: TSINFO: Marking device-mapper-libs-1.02.77-8.fc19.x86_64 as install for parted-3.1-12.fc19.x86_64
>03:51:58,492 DEBUG yum.verbose.YumBase: TSINFO: Marking libsepol-2.1.9-1.fc19.x86_64 as install for parted-3.1-12.fc19.x86_64
>03:51:58,501 DEBUG yum.verbose.YumBase: TSINFO: Marking readline-6.2-6.fc19.x86_64 as install for parted-3.1-12.fc19.x86_64
>03:51:58,525 DEBUG yum.verbose.YumBase: TSINFO: Marking libuser-0.59-1.fc19.x86_64 as install for passwd-0.78.99-4.fc19.x86_64
>03:51:58,577 DEBUG yum.verbose.YumBase: TSINFO: Marking plymouth-scripts-0.8.9-0.2013.03.26.0.fc19.x86_64 as install for plymouth-0.8.9-0.2013.03.26.0.fc19.x86_64
>03:51:58,579 DEBUG yum.verbose.YumBase: TSINFO: Marking plymouth-core-libs-0.8.9-0.2013.03.26.0.fc19.x86_64 as install for plymouth-0.8.9-0.2013.03.26.0.fc19.x86_64
>03:51:58,580 DEBUG yum.verbose.YumBase: Quick matched plymouth-core-libs-0.8.9-0.2013.03.26.0.fc19.x86_64 to require for libply-splash-core.so.2()(64bit)
>03:51:58,582 DEBUG yum.verbose.YumBase: TSINFO: Marking libdrm-2.4.44-2.fc19.x86_64 as install for plymouth-0.8.9-0.2013.03.26.0.fc19.x86_64
>03:51:58,590 DEBUG yum.verbose.YumBase: TSINFO: Marking libselinux-utils-2.1.13-12.fc19.x86_64 as install for policycoreutils-2.1.14-37.fc19.x86_64
>03:51:58,592 DEBUG yum.verbose.YumBase: TSINFO: Marking libsemanage-2.1.10-4.fc19.x86_64 as install for policycoreutils-2.1.14-37.fc19.x86_64
>03:51:58,594 DEBUG yum.verbose.YumBase: TSINFO: Marking diffutils-3.3-1.fc19.x86_64 as install for policycoreutils-2.1.14-37.fc19.x86_64
>03:51:58,603 DEBUG yum.verbose.YumBase: TSINFO: Marking libdb-utils-5.3.21-9.fc19.x86_64 as install for rpm-4.11.0.1-1.fc19.x86_64
>03:51:58,609 DEBUG yum.verbose.YumBase: TSINFO: Marking rpm-libs-4.11.0.1-1.fc19.x86_64 as install for rpm-4.11.0.1-1.fc19.x86_64
>03:51:58,611 DEBUG yum.verbose.YumBase: Quick matched rpm-libs-4.11.0.1-1.fc19.x86_64 to require for librpm.so.3()(64bit)
>03:51:58,612 DEBUG yum.verbose.YumBase: TSINFO: Marking xz-libs-5.1.2-4alpha.fc19.x86_64 as install for rpm-4.11.0.1-1.fc19.x86_64
>03:51:58,614 DEBUG yum.verbose.YumBase: TSINFO: Marking lua-5.1.4-12.fc19.x86_64 as install for rpm-4.11.0.1-1.fc19.x86_64
>03:51:58,616 DEBUG yum.verbose.YumBase: TSINFO: Marking elfutils-libelf-0.155-5.fc19.x86_64 as install for rpm-4.11.0.1-1.fc19.x86_64
>03:51:58,618 DEBUG yum.verbose.YumBase: TSINFO: Marking bzip2-libs-1.0.6-8.fc19.x86_64 as install for rpm-4.11.0.1-1.fc19.x86_64
>03:51:58,629 DEBUG yum.verbose.YumBase: TSINFO: Marking logrotate-3.8.4-1.fc19.x86_64 as install for rsyslog-7.2.6-1.fc19.x86_64
>03:51:58,631 DEBUG yum.verbose.YumBase: TSINFO: Marking liblognorm-0.3.5-1.fc19.x86_64 as install for rsyslog-7.2.6-1.fc19.x86_64
>03:51:58,633 DEBUG yum.verbose.YumBase: TSINFO: Marking json-c-0.10-3.fc19.x86_64 as install for rsyslog-7.2.6-1.fc19.x86_64
>03:51:58,637 DEBUG yum.verbose.YumBase: TSINFO: Marking libestr-0.1.5-1.fc19.x86_64 as install for rsyslog-7.2.6-1.fc19.x86_64
>03:51:58,641 DEBUG yum.verbose.YumBase: TSINFO: Marking libee-0.4.1-4.fc19.x86_64 as install for rsyslog-7.2.6-1.fc19.x86_64
>03:51:58,643 DEBUG yum.verbose.YumBase: TSINFO: Marking selinux-policy-3.12.1-42.fc19.noarch as install for selinux-policy-targeted-3.12.1-42.fc19.noarch
>03:51:58,645 DEBUG yum.verbose.YumBase: Quick matched selinux-policy-3.12.1-42.fc19.noarch to require for selinux-policy
>03:51:58,652 DEBUG yum.verbose.YumBase: TSINFO: Marking procmail-3.22-32.fc19.x86_64 as install for sendmail-8.14.7-1.fc19.x86_64
>03:51:58,656 DEBUG yum.verbose.YumBase: TSINFO: Marking cyrus-sasl-2.1.26-6.fc19.x86_64 as install for sendmail-8.14.7-1.fc19.x86_64
>03:51:58,659 DEBUG yum.verbose.YumBase: TSINFO: Marking cyrus-sasl-lib-2.1.26-6.fc19.x86_64 as install for sendmail-8.14.7-1.fc19.x86_64
>03:51:58,661 DEBUG yum.verbose.YumBase: TSINFO: Marking hesiod-3.2.1-1.fc19.x86_64 as install for sendmail-8.14.7-1.fc19.x86_64
>03:51:58,684 DEBUG yum.verbose.YumBase: TSINFO: Marking kmod-libs-13-2.fc19.x86_64 as install for systemd-203-2.fc19.x86_64
>03:51:58,686 DEBUG yum.verbose.YumBase: TSINFO: Marking libgcrypt-1.5.2-1.fc19.x86_64 as install for systemd-203-2.fc19.x86_64
>03:51:58,691 DEBUG yum.verbose.YumBase: TSINFO: Marking cryptsetup-libs-1.6.1-1.fc19.x86_64 as install for systemd-203-2.fc19.x86_64
>03:51:58,699 DEBUG yum.verbose.YumBase: TSINFO: Marking acl-2.2.51-9.fc19.x86_64 as install for systemd-203-2.fc19.x86_64
>03:51:58,711 DEBUG yum.verbose.YumBase: TSINFO: Marking qrencode-libs-3.4.1-1.fc19.x86_64 as install for 
systemd-203-2.fc19.x86_64 >03:51:58,721 DEBUG yum.verbose.YumBase: TSINFO: Marking libmicrohttpd-0.9.24-2.fc19.x86_64 as install for systemd-203-2.fc19.x86_64 >03:51:58,776 DEBUG yum.verbose.YumBase: TSINFO: Marking libmount-2.23-1.fc19.x86_64 as install for util-linux-2.23-1.fc19.x86_64 >03:51:58,788 DEBUG yum.verbose.YumBase: TSINFO: Marking libutempter-1.1.6-2.fc19.x86_64 as install for util-linux-2.23-1.fc19.x86_64 >03:51:58,795 DEBUG yum.verbose.YumBase: Quick matched libmount-2.23-1.fc19.x86_64 to require for libmount.so.1(MOUNT_2.22)(64bit) >03:51:58,797 DEBUG yum.verbose.YumBase: Quick matched libmount-2.23-1.fc19.x86_64 to require for libmount.so.1(MOUNT_2.21)(64bit) >03:51:58,801 DEBUG yum.verbose.YumBase: Quick matched libmount-2.23-1.fc19.x86_64 to require for libmount.so.1(MOUNT_2.20)(64bit) >03:51:58,804 DEBUG yum.verbose.YumBase: Quick matched libmount-2.23-1.fc19.x86_64 to require for libmount.so.1(MOUNT_2.19)(64bit) >03:51:58,813 DEBUG yum.verbose.YumBase: TSINFO: Marking yum-metadata-parser-1.1.4-8.fc19.x86_64 as install for yum-3.4.3-83.fc19.noarch >03:51:58,824 DEBUG yum.verbose.YumBase: TSINFO: Marking python-urlgrabber-3.9.1-26.fc19.noarch as install for yum-3.4.3-83.fc19.noarch >03:51:58,826 DEBUG yum.verbose.YumBase: TSINFO: Marking rpm-python-4.11.0.1-1.fc19.x86_64 as install for yum-3.4.3-83.fc19.noarch >03:51:58,828 DEBUG yum.verbose.YumBase: TSINFO: Marking pyxattr-0.5.1-3.fc19.x86_64 as install for yum-3.4.3-83.fc19.noarch >03:51:58,830 DEBUG yum.verbose.YumBase: TSINFO: Marking python-iniparse-0.4-7.fc19.noarch as install for yum-3.4.3-83.fc19.noarch >03:51:58,831 DEBUG yum.verbose.YumBase: TSINFO: Marking pyliblzma-0.5.3-8.fc19.x86_64 as install for yum-3.4.3-83.fc19.noarch >03:51:58,833 DEBUG yum.verbose.YumBase: TSINFO: Marking pygpgme-0.3-6.fc19.x86_64 as install for yum-3.4.3-83.fc19.noarch >03:51:58,851 DEBUG yum.verbose.YumBase: TSINFO: Marking avahi-libs-0.6.31-11.fc19.x86_64 as install for avahi-autoipd-0.6.31-11.fc19.x86_64 
>03:51:58,855 DEBUG yum.verbose.YumBase: TSINFO: Marking libdaemon-0.14-5.fc19.x86_64 as install for avahi-autoipd-0.6.31-11.fc19.x86_64 >03:51:58,863 DEBUG yum.verbose.YumBase: TSINFO: Marking 32:bind-license-9.9.3-0.2.rc1.fc19.noarch as install for 32:bind-libs-lite-9.9.3-0.2.rc1.fc19.x86_64 >03:51:58,865 DEBUG yum.verbose.YumBase: TSINFO: Marking libxml2-2.9.1-1.fc19.x86_64 as install for 32:bind-libs-lite-9.9.3-0.2.rc1.fc19.x86_64 >03:51:58,884 DEBUG yum.verbose.YumBase: TSINFO: Marking crontabs-1.11-5.20121102git.fc19.noarch as install for cronie-anacron-1.4.10-4.fc19.x86_64 >03:51:58,891 DEBUG yum.verbose.YumBase: TSINFO: Marking libgpg-error-1.11-1.fc19.x86_64 as install for cryptsetup-libs-1.6.1-1.fc19.x86_64 >03:51:58,913 DEBUG yum.verbose.YumBase: TSINFO: Marking expat-2.1.0-5.fc19.x86_64 as install for 1:dbus-1.6.8-5.fc19.x86_64 >03:51:58,927 DEBUG yum.verbose.YumBase: TSINFO: Marking device-mapper-1.02.77-8.fc19.x86_64 as install for device-mapper-libs-1.02.77-8.fc19.x86_64 >03:51:58,940 DEBUG yum.verbose.YumBase: TSINFO: Marking xz-5.1.2-4alpha.fc19.x86_64 as install for dracut-027-45.git20130430.fc19.x86_64 >03:51:58,943 DEBUG yum.verbose.YumBase: TSINFO: Marking kpartx-0.4.9-47.fc19.x86_64 as install for dracut-027-45.git20130430.fc19.x86_64 >03:51:58,945 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:hardlink-1.0-17.fc19.x86_64 as install for dracut-027-45.git20130430.fc19.x86_64 >03:51:58,956 DEBUG yum.verbose.YumBase: TSINFO: Marking file-libs-5.11-9.fc19.x86_64 as install for file-5.11-9.fc19.x86_64 >03:51:58,957 DEBUG yum.verbose.YumBase: Quick matched file-libs-5.11-9.fc19.x86_64 to require for libmagic.so.1()(64bit) >03:51:58,964 DEBUG yum.verbose.YumBase: TSINFO: Marking fipscheck-1.3.1-3.fc19.x86_64 as install for fipscheck-lib-1.3.1-3.fc19.x86_64 >03:51:58,981 DEBUG yum.verbose.YumBase: TSINFO: Marking libgomp-4.8.0-2.fc19.x86_64 as install for gettext-0.18.2.1-1.fc19.x86_64 >03:51:58,983 DEBUG yum.verbose.YumBase: TSINFO: Marking 
libunistring-0.9.3-7.fc19.x86_64 as install for gettext-0.18.2.1-1.fc19.x86_64 >03:51:58,985 DEBUG yum.verbose.YumBase: TSINFO: Marking gettext-libs-0.18.2.1-1.fc19.x86_64 as install for gettext-0.18.2.1-1.fc19.x86_64 >03:51:58,988 DEBUG yum.verbose.YumBase: Quick matched gettext-libs-0.18.2.1-1.fc19.x86_64 to require for libgettextlib-0.18.2.so()(64bit) >03:51:58,990 DEBUG yum.verbose.YumBase: TSINFO: Marking libcroco-0.6.8-2.fc19.x86_64 as install for gettext-0.18.2.1-1.fc19.x86_64 >03:51:58,998 DEBUG yum.verbose.YumBase: TSINFO: Marking shared-mime-info-1.1-4.fc19.x86_64 as install for glib2-2.36.1-2.fc19.x86_64 >03:51:59,001 DEBUG yum.verbose.YumBase: TSINFO: Marking libffi-3.0.13-1.fc19.x86_64 as install for glib2-2.36.1-2.fc19.x86_64 >03:51:59,017 DEBUG yum.verbose.YumBase: TSINFO: Marking tzdata-2013b-2.fc19.noarch as install for glibc-common-2.17-4.fc19.x86_64 >03:51:59,025 DEBUG yum.verbose.YumBase: TSINFO: Marking libstdc++-4.8.0-2.fc19.x86_64 as install for 1:gmp-5.1.1-2.fc19.x86_64 >03:51:59,027 DEBUG yum.verbose.YumBase: Quick matched libstdc++-4.8.0-2.fc19.x86_64 to require for libstdc++.so.6(GLIBCXX_3.4)(64bit) >03:51:59,028 DEBUG yum.verbose.YumBase: Quick matched libstdc++-4.8.0-2.fc19.x86_64 to require for libstdc++.so.6(CXXABI_1.3)(64bit) >03:51:59,028 DEBUG yum.verbose.YumBase: Quick matched libstdc++-4.8.0-2.fc19.x86_64 to require for libstdc++.so.6()(64bit) >03:51:59,031 DEBUG yum.verbose.YumBase: TSINFO: Marking pcre-8.32-4.fc19.x86_64 as install for grep-2.14-3.fc19.x86_64 >03:51:59,041 DEBUG yum.verbose.YumBase: TSINFO: Marking freetype-2.4.11-3.fc19.x86_64 as install for 1:grub2-tools-2.00-16.fc19.x86_64 >03:51:59,065 DEBUG yum.verbose.YumBase: TSINFO: Marking keyutils-libs-1.5.5-4.fc19.x86_64 as install for krb5-libs-1.11.2-2.fc19.x86_64 >03:51:59,067 DEBUG yum.verbose.YumBase: TSINFO: Marking libverto-0.2.5-2.fc19.x86_64 as install for krb5-libs-1.11.2-2.fc19.x86_64 >03:51:59,095 DEBUG yum.verbose.YumBase: TSINFO: Marking 
libssh2-1.4.3-4.fc19.x86_64 as install for libcurl-7.29.0-6.fc19.x86_64 >03:51:59,097 DEBUG yum.verbose.YumBase: Quick matched libssh2-1.4.3-4.fc19.x86_64 to require for libssh2.so.1()(64bit) >03:51:59,106 DEBUG yum.verbose.YumBase: TSINFO: Marking libpciaccess-0.13.1-3.fc19.x86_64 as install for libdrm-2.4.44-2.fc19.x86_64 >03:51:59,130 DEBUG yum.verbose.YumBase: TSINFO: Marking gnutls-3.1.10-1.fc19.x86_64 as install for libmicrohttpd-0.9.24-2.fc19.x86_64 >03:51:59,131 DEBUG yum.verbose.YumBase: Quick matched gnutls-3.1.10-1.fc19.x86_64 to require for libgnutls.so.28()(64bit) >03:51:59,145 DEBUG yum.verbose.YumBase: TSINFO: Marking cracklib-dicts-2.8.22-3.fc19.x86_64 as install for libpwquality-1.2.1-2.fc19.x86_64 >03:51:59,147 DEBUG yum.verbose.YumBase: TSINFO: Marking cracklib-2.8.22-3.fc19.x86_64 as install for libpwquality-1.2.1-2.fc19.x86_64 >03:51:59,160 DEBUG yum.verbose.YumBase: TSINFO: Marking ustr-1.0.4-13.fc18.x86_64 as install for libsemanage-2.1.10-4.fc19.x86_64 >03:51:59,161 DEBUG yum.verbose.YumBase: Quick matched ustr-1.0.4-13.fc18.x86_64 to require for libustr-1.0.so.1(USTR_1.0)(64bit) >03:51:59,161 DEBUG yum.verbose.YumBase: Quick matched ustr-1.0.4-13.fc18.x86_64 to require for libustr-1.0.so.1()(64bit) >03:51:59,208 DEBUG yum.verbose.YumBase: TSINFO: Marking ncurses-base-5.9-10.20130413.fc19.noarch as install for ncurses-libs-5.9-10.20130413.fc19.x86_64 >03:51:59,215 DEBUG yum.verbose.YumBase: TSINFO: Marking newt-0.52.15-1.fc19.x86_64 as install for newt-python-0.52.15-1.fc19.x86_64 >03:51:59,217 DEBUG yum.verbose.YumBase: Quick matched newt-0.52.15-1.fc19.x86_64 to require for libnewt.so.0.52(NEWT_0.52.6)(64bit) >03:51:59,217 DEBUG yum.verbose.YumBase: Quick matched newt-0.52.15-1.fc19.x86_64 to require for libnewt.so.0.52(NEWT_0.52.13)(64bit) >03:51:59,217 DEBUG yum.verbose.YumBase: Quick matched newt-0.52.15-1.fc19.x86_64 to require for libnewt.so.0.52(NEWT_0.52)(64bit) >03:51:59,219 DEBUG yum.verbose.YumBase: TSINFO: Marking 
slang-2.2.4-8.fc19.x86_64 as install for newt-python-0.52.15-1.fc19.x86_64 >03:51:59,222 DEBUG yum.verbose.YumBase: TSINFO: Marking python-libs-2.7.4-4.fc19.x86_64 as install for newt-python-0.52.15-1.fc19.x86_64 >03:51:59,237 DEBUG yum.verbose.YumBase: TSINFO: Marking nss-softokn-3.14.3-1.fc19.x86_64 as install for nss-3.14.3-12.0.fc19.x86_64 >03:51:59,241 DEBUG yum.verbose.YumBase: TSINFO: Marking nss-sysinit-3.14.3-12.0.fc19.x86_64 as install for nss-3.14.3-12.0.fc19.x86_64 >03:51:59,260 DEBUG yum.verbose.YumBase: TSINFO: Marking nss-tools-3.14.3-12.0.fc19.x86_64 as install for openldap-2.4.35-1.fc19.x86_64 >03:51:59,273 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:make-3.82-15.fc19.x86_64 as install for 1:openssl-1.0.1e-4.fc19.x86_64 >03:51:59,280 DEBUG yum.verbose.YumBase: TSINFO: Marking ca-certificates-2012.87-10.1.fc19.noarch as install for 1:openssl-libs-1.0.1e-4.fc19.x86_64 >03:51:59,317 DEBUG yum.verbose.YumBase: TSINFO: Marking 14:libpcap-1.3.0-4.fc19.x86_64 as install for ppp-2.4.5-28.fc19.x86_64 >03:51:59,318 DEBUG yum.verbose.YumBase: Quick matched 14:libpcap-1.3.0-4.fc19.x86_64 to require for libpcap.so.1()(64bit) >03:51:59,328 DEBUG yum.verbose.YumBase: TSINFO: Marking gobject-introspection-1.36.0-1.fc19.x86_64 as install for pygobject3-base-3.8.1-2.fc19.x86_64 >03:51:59,329 DEBUG yum.verbose.YumBase: Quick matched gobject-introspection-1.36.0-1.fc19.x86_64 to require for libgirepository-1.0.so.1()(64bit) >03:51:59,333 DEBUG yum.verbose.YumBase: TSINFO: Marking gpgme-1.3.2-3.fc19.x86_64 as install for pygpgme-0.3-6.fc19.x86_64 >03:51:59,334 DEBUG yum.verbose.YumBase: Quick matched gpgme-1.3.2-3.fc19.x86_64 to require for libgpgme.so.11(GPGME_1.0)(64bit) >03:51:59,335 DEBUG yum.verbose.YumBase: Quick matched gpgme-1.3.2-3.fc19.x86_64 to require for libgpgme.so.11()(64bit) >03:51:59,345 DEBUG yum.verbose.YumBase: TSINFO: Marking python-slip-0.4.0-1.fc19.noarch as install for python-slip-dbus-0.4.0-1.fc19.noarch >03:51:59,347 DEBUG 
yum.verbose.YumBase: TSINFO: Marking python-pycurl-7.19.0-15.1.fc19.x86_64 as install for python-urlgrabber-3.9.1-26.fc19.noarch >03:51:59,368 DEBUG yum.verbose.YumBase: TSINFO: Marking rpm-build-libs-4.11.0.1-1.fc19.x86_64 as install for rpm-python-4.11.0.1-1.fc19.x86_64 >03:51:59,370 DEBUG yum.verbose.YumBase: Quick matched rpm-build-libs-4.11.0.1-1.fc19.x86_64 to require for librpmbuild.so.3()(64bit) >03:51:59,398 DEBUG yum.verbose.YumBase: TSINFO: Marking sqlite-3.7.16.2-1.fc19.x86_64 as install for yum-metadata-parser-1.1.4-8.fc19.x86_64 >03:51:59,406 DEBUG yum.verbose.YumBase: TSINFO: Marking avahi-0.6.31-11.fc19.x86_64 as install for avahi-libs-0.6.31-11.fc19.x86_64 >03:51:59,412 DEBUG yum.verbose.YumBase: TSINFO: Marking p11-kit-trust-0.18.1-1.fc19.x86_64 as install for ca-certificates-2012.87-10.1.fc19.noarch >03:51:59,414 DEBUG yum.verbose.YumBase: TSINFO: Marking p11-kit-0.18.1-1.fc19.x86_64 as install for ca-certificates-2012.87-10.1.fc19.noarch >03:51:59,435 DEBUG yum.verbose.YumBase: TSINFO: Marking libtasn1-3.3-1.fc19.x86_64 as install for gnutls-3.1.10-1.fc19.x86_64 >03:51:59,436 DEBUG yum.verbose.YumBase: Quick matched libtasn1-3.3-1.fc19.x86_64 to require for libtasn1.so.6()(64bit) >03:51:59,437 DEBUG yum.verbose.YumBase: TSINFO: Marking nettle-2.6-2.fc19.x86_64 as install for gnutls-3.1.10-1.fc19.x86_64 >03:51:59,438 DEBUG yum.verbose.YumBase: Quick matched nettle-2.6-2.fc19.x86_64 to require for libhogweed.so.2()(64bit) >03:51:59,450 DEBUG yum.verbose.YumBase: TSINFO: Marking libassuan-2.0.3-5.fc19.x86_64 as install for gpgme-1.3.2-3.fc19.x86_64 >03:51:59,451 DEBUG yum.verbose.YumBase: TSINFO: Marking gnupg2-2.0.19-8.fc19.x86_64 as install for gpgme-1.3.2-3.fc19.x86_64 >03:51:59,471 DEBUG yum.verbose.YumBase: TSINFO: Marking hwdata-0.247-1.fc19.noarch as install for libpciaccess-0.13.1-3.fc19.x86_64 >03:51:59,534 DEBUG yum.verbose.YumBase: TSINFO: Marking libselinux-python-2.1.13-12.fc19.x86_64 as install for python-slip-0.4.0-1.fc19.noarch 
>03:51:59,545 DEBUG yum.verbose.YumBase: TSINFO: Marking 1:pkgconfig-0.27.1-1.fc19.x86_64 as install for shared-mime-info-1.1-4.fc19.x86_64 >03:51:59,571 DEBUG yum.verbose.YumBase: TSINFO: Marking pinentry-0.8.1-10.fc19.x86_64 as install for gnupg2-2.0.19-8.fc19.x86_64 >03:51:59,573 DEBUG yum.verbose.YumBase: TSINFO: Marking pth-2.0.7-19.fc19.x86_64 as install for gnupg2-2.0.19-8.fc19.x86_64 >03:51:59,674 DEBUG yum.verbose.YumBase: Depsolve time: 3.001 >03:51:59,676 INFO packaging: about to acquire _yum_lock for AnaInstallThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1383 (checkSoftwareSelection) >03:51:59,676 INFO packaging: have _yum_lock for AnaInstallThread >03:51:59,677 INFO packaging: gave up _yum_lock for AnaInstallThread >03:51:59,678 DEBUG packaging: success >03:51:59,679 INFO packaging: gave up _yum_lock for AnaInstallThread >03:51:59,680 INFO packaging: about to acquire _yum_lock for AnaInstallThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1398 (checkSoftwareSelection) >03:51:59,681 INFO packaging: have _yum_lock for AnaInstallThread >03:51:59,688 INFO packaging: gave up _yum_lock for AnaInstallThread >03:51:59,690 INFO packaging: about to acquire _yum_lock for AnaInstallThread at /usr/lib64/python2.7/site-packages/pyanaconda/packaging/yumpayload.py:1460 (preInstall) >03:51:59,690 INFO packaging: have _yum_lock for AnaInstallThread >03:51:59,692 INFO packaging: 233 packages selected totalling 806.95 MB >03:51:59,692 INFO packaging: gave up _yum_lock for AnaInstallThread >03:51:59,696 INFO packaging: about to acquire _yum_lock for AnaInstallThread at /usr/lib64/python2.7/site-packages/pyanaconda/install.py:137 (doInstall) >03:51:59,696 INFO packaging: have _yum_lock for AnaInstallThread >03:51:59,697 INFO packaging: gave up _yum_lock for AnaInstallThread > > >/tmp/program.log: >03:47:07,314 INFO program: Running... 
udevadm trigger --action=change --subsystem-match=block >03:47:07,339 DEBUG program: Return code: 0 >03:47:07,340 INFO program: Running... udevadm settle --timeout=300 >03:47:07,488 DEBUG program: Return code: 0 >03:47:07,495 INFO program: Running... modprobe fcoe >03:47:07,707 DEBUG program: Return code: 0 >03:47:07,708 INFO program: Running... /usr/libexec/fcoe/fcoe_edd.sh -i >03:47:07,765 INFO program: No FCoE boot disk information is found in EDD! >03:47:07,766 DEBUG program: Return code: 1 >03:47:09,760 INFO program: Running... metacity --display :1 --sm-disable >03:47:12,838 INFO program: Running... udevadm settle --timeout=300 >03:47:12,883 DEBUG program: Return code: 0 >03:47:12,895 INFO program: Running... udevadm settle --timeout=300 >03:47:12,924 DEBUG program: Return code: 0 >03:47:12,988 INFO program: Running... multipath -c /dev/sda >03:47:13,003 INFO program: /dev/sda is not a valid multipath device path >03:47:13,004 DEBUG program: Return code: 1 >03:47:13,102 INFO program: Running... mdadm --examine --export /dev/sda1 >03:47:13,120 INFO program: MD_LEVEL=raid1 >03:47:13,121 INFO program: MD_DEVICES=4 >03:47:13,121 INFO program: MD_NAME=dhcppc0:swap >03:47:13,122 INFO program: MD_UUID=c51936a3:08423708:8fcc5618:49cc057b >03:47:13,122 INFO program: MD_UPDATE_TIME=1368160021 >03:47:13,123 INFO program: MD_DEV_UUID=1a345fa1:00b8b5d0:45a17a38:75aad50b >03:47:13,123 INFO program: MD_EVENTS=19 >03:47:13,124 DEBUG program: Return code: 0 >03:47:13,124 INFO program: Running... mdadm --examine --brief /dev/sda1 >03:47:13,137 INFO program: ARRAY /dev/md/swap metadata=1.2 UUID=c51936a3:08423708:8fcc5618:49cc057b name=dhcppc0:swap >03:47:13,138 DEBUG program: Return code: 0 >03:47:13,148 INFO program: Running... udevadm settle --timeout=300 >03:47:13,168 DEBUG program: Return code: 0 >03:47:13,187 INFO program: Running... udevadm settle --timeout=300 >03:47:13,207 DEBUG program: Return code: 0 >03:47:13,208 INFO program: Running... 
mdadm --incremental --quiet /dev/sda1 >03:47:13,220 DEBUG program: Return code: 1 >03:47:13,440 INFO program: Running... mount -t btrfs -o subvolid=0 /dev/sda2 /tmp/btrfs-tmp.59SZ6dI >03:47:13,486 DEBUG program: Return code: 0 >03:47:13,492 INFO program: Running... btrfs subvol list /tmp/btrfs-tmp.59SZ6dI >03:47:13,559 INFO program: ID 256 gen 114 top level 5 path boot >03:47:13,560 INFO program: ID 259 gen 116 top level 5 path root >03:47:13,560 DEBUG program: Return code: 0 >03:47:13,561 INFO program: Running... umount /tmp/btrfs-tmp.59SZ6dI >03:47:15,630 DEBUG program: Return code: 0 >03:47:15,632 INFO program: Running... systemctl start chronyd.service >03:47:15,817 DEBUG program: Return code: 0 >03:47:15,838 INFO program: Running... multipath -c /dev/sdd >03:47:15,902 INFO program: /dev/sdd is not a valid multipath device path >03:47:15,903 DEBUG program: Return code: 1 >03:47:16,123 INFO program: Running... mdadm --examine --export /dev/sdd1 >03:47:16,160 INFO program: MD_LEVEL=raid1 >03:47:16,160 INFO program: MD_DEVICES=4 >03:47:16,161 INFO program: MD_NAME=dhcppc0:swap >03:47:16,162 INFO program: MD_UUID=c51936a3:08423708:8fcc5618:49cc057b >03:47:16,162 INFO program: MD_UPDATE_TIME=1368244033 >03:47:16,163 INFO program: MD_DEV_UUID=b2071e0a:9ea4b36c:9603da25:d8ba131c >03:47:16,163 INFO program: MD_EVENTS=19 >03:47:16,164 DEBUG program: Return code: 0 >03:47:16,164 INFO program: Running... mdadm --examine --brief /dev/sdd1 >03:47:16,300 INFO program: ARRAY /dev/md/swap metadata=1.2 UUID=c51936a3:08423708:8fcc5618:49cc057b name=dhcppc0:swap >03:47:16,301 DEBUG program: Return code: 0 >03:47:16,336 INFO program: Running... udevadm settle --timeout=300 >03:47:16,432 DEBUG program: Return code: 0 >03:47:16,433 INFO program: Running... mdadm --incremental --quiet /dev/sdd1 >03:47:16,454 DEBUG program: Return code: 1 >03:47:16,601 INFO program: Running... 
multipath -c /dev/sdc >03:47:16,635 INFO program: /dev/sdc is not a valid multipath device path >03:47:16,636 DEBUG program: Return code: 1 >03:47:16,732 INFO program: Running... mdadm --examine --export /dev/sdc1 >03:47:16,749 INFO program: MD_LEVEL=raid1 >03:47:16,751 INFO program: MD_DEVICES=4 >03:47:16,753 INFO program: MD_NAME=dhcppc0:swap >03:47:16,754 INFO program: MD_UUID=c51936a3:08423708:8fcc5618:49cc057b >03:47:16,756 INFO program: MD_UPDATE_TIME=1368244033 >03:47:16,757 INFO program: MD_DEV_UUID=9e04ee80:95066f3c:2aca750d:ff574837 >03:47:16,758 INFO program: MD_EVENTS=19 >03:47:16,761 DEBUG program: Return code: 0 >03:47:16,761 INFO program: Running... mdadm --examine --brief /dev/sdc1 >03:47:16,778 INFO program: ARRAY /dev/md/swap metadata=1.2 UUID=c51936a3:08423708:8fcc5618:49cc057b name=dhcppc0:swap >03:47:16,779 DEBUG program: Return code: 0 >03:47:16,812 INFO program: Running... udevadm settle --timeout=300 >03:47:16,848 DEBUG program: Return code: 0 >03:47:16,849 INFO program: Running... mdadm --incremental --quiet /dev/sdc1 >03:47:16,906 DEBUG program: Return code: 1 >03:47:17,099 INFO program: Running... multipath -c /dev/sdb >03:47:17,185 INFO program: /dev/sdb is not a valid multipath device path >03:47:17,186 DEBUG program: Return code: 1 >03:47:17,472 INFO program: Running... mdadm --examine --export /dev/sdb1 >03:47:17,517 INFO program: MD_LEVEL=raid1 >03:47:17,517 INFO program: MD_DEVICES=4 >03:47:17,518 INFO program: MD_NAME=dhcppc0:swap >03:47:17,518 INFO program: MD_UUID=c51936a3:08423708:8fcc5618:49cc057b >03:47:17,519 INFO program: MD_UPDATE_TIME=1368244033 >03:47:17,520 INFO program: MD_DEV_UUID=9e6e36f1:2e007c25:8a0af068:8cb14a98 >03:47:17,520 INFO program: MD_EVENTS=19 >03:47:17,521 DEBUG program: Return code: 0 >03:47:17,522 INFO program: Running... 
mdadm --examine --brief /dev/sdb1 >03:47:17,543 INFO program: ARRAY /dev/md/swap metadata=1.2 UUID=c51936a3:08423708:8fcc5618:49cc057b name=dhcppc0:swap >03:47:17,544 DEBUG program: Return code: 0 >03:47:17,569 INFO program: Running... udevadm settle --timeout=300 >03:47:17,612 DEBUG program: Return code: 0 >03:47:17,613 INFO program: Running... mdadm --incremental --quiet /dev/sdb1 >03:47:17,639 DEBUG program: Return code: 1 >03:47:17,939 INFO program: Running... dumpe2fs -h /dev/loop1 >03:47:17,989 INFO program: dumpe2fs 1.42.7 (21-Jan-2013) >03:47:17,990 INFO program: Filesystem volume name: Anaconda >03:47:17,990 INFO program: Last mounted on: / >03:47:17,991 INFO program: Filesystem UUID: 932a9ea8-7790-43fd-a10c-20d783f65a9d >03:47:17,991 INFO program: Filesystem magic number: 0xEF53 >03:47:17,992 INFO program: Filesystem revision #: 1 (dynamic) >03:47:17,992 INFO program: Filesystem features: has_journal ext_attr resize_inode dir_index filetype extent flex_bg sparse_super huge_file uninit_bg dir_nlink extra_isize >03:47:17,993 INFO program: Filesystem flags: signed_directory_hash >03:47:17,993 INFO program: Default mount options: user_xattr acl >03:47:17,994 INFO program: Filesystem state: clean >03:47:17,994 INFO program: Errors behavior: Continue >03:47:17,995 INFO program: Filesystem OS type: Linux >03:47:17,995 INFO program: Inode count: 65536 >03:47:17,995 INFO program: Block count: 1048576 >03:47:17,996 INFO program: Reserved block count: 0 >03:47:17,996 INFO program: Free blocks: 269898 >03:47:17,998 INFO program: Free inodes: 37797 >03:47:17,998 INFO program: First block: 1 >03:47:17,999 INFO program: Block size: 1024 >03:47:18,001 INFO program: Fragment size: 1024 >03:47:18,001 INFO program: Reserved GDT blocks: 256 >03:47:18,002 INFO program: Blocks per group: 8192 >03:47:18,002 INFO program: Fragments per group: 8192 >03:47:18,003 INFO program: Inodes per group: 512 >03:47:18,003 INFO program: Inode blocks per group: 128 >03:47:18,004 INFO program: 
Flex block group size: 16 >03:47:18,004 INFO program: Filesystem created: Fri May 10 15:47:01 2013 >03:47:18,005 INFO program: Last mount time: Fri May 10 15:48:14 2013 >03:47:18,006 INFO program: Last write time: Fri May 10 15:48:24 2013 >03:47:18,007 INFO program: Mount count: 2 >03:47:18,007 INFO program: Maximum mount count: -1 >03:47:18,008 INFO program: Last checked: Fri May 10 15:47:01 2013 >03:47:18,009 INFO program: Check interval: 0 (<none>) >03:47:18,010 INFO program: Lifetime writes: 32 MB >03:47:18,011 INFO program: Reserved blocks uid: 0 (user root) >03:47:18,011 INFO program: Reserved blocks gid: 0 (group root) >03:47:18,012 INFO program: First inode: 11 >03:47:18,012 INFO program: Inode size: 256 >03:47:18,014 INFO program: Required extra isize: 28 >03:47:18,015 INFO program: Desired extra isize: 28 >03:47:18,016 INFO program: Journal inode: 8 >03:47:18,016 INFO program: Default directory hash: half_md4 >03:47:18,017 INFO program: Directory Hash Seed: f77f1af6-7633-41c3-8e3b-055593212a76 >03:47:18,017 INFO program: Journal backup: inode blocks >03:47:18,018 INFO program: Journal features: (none) >03:47:18,018 INFO program: Journal size: 32M >03:47:18,019 INFO program: Journal length: 32768 >03:47:18,019 INFO program: Journal sequence: 0x00000009 >03:47:18,020 INFO program: Journal start: 0 >03:47:18,021 INFO program: >03:47:18,022 DEBUG program: Return code: 0 >03:47:18,022 INFO program: Running... resize2fs -P /dev/loop1 >03:47:18,068 INFO program: resize2fs 1.42.7 (21-Jan-2013) >03:47:18,069 INFO program: resize2fs: Device or resource busy while trying to open /dev/loop1 >03:47:18,070 INFO program: Couldn't find valid filesystem superblock. >03:47:18,070 DEBUG program: Return code: 1 >03:47:18,215 INFO program: Running... 
dumpe2fs -h /dev/mapper/live-rw >03:47:18,236 INFO program: dumpe2fs 1.42.7 (21-Jan-2013) >03:47:18,237 INFO program: Filesystem volume name: Anaconda >03:47:18,237 INFO program: Last mounted on: / >03:47:18,238 INFO program: Filesystem UUID: 932a9ea8-7790-43fd-a10c-20d783f65a9d >03:47:18,239 INFO program: Filesystem magic number: 0xEF53 >03:47:18,239 INFO program: Filesystem revision #: 1 (dynamic) >03:47:18,240 INFO program: Filesystem features: has_journal ext_attr resize_inode dir_index filetype needs_recovery extent flex_bg sparse_super huge_file uninit_bg dir_nlink extra_isize >03:47:18,240 INFO program: Filesystem flags: signed_directory_hash >03:47:18,241 INFO program: Default mount options: user_xattr acl >03:47:18,242 INFO program: Filesystem state: clean >03:47:18,242 INFO program: Errors behavior: Continue >03:47:18,243 INFO program: Filesystem OS type: Linux >03:47:18,243 INFO program: Inode count: 65536 >03:47:18,244 INFO program: Block count: 1048576 >03:47:18,244 INFO program: Reserved block count: 0 >03:47:18,245 INFO program: Free blocks: 269898 >03:47:18,247 INFO program: Free inodes: 37797 >03:47:18,248 INFO program: First block: 1 >03:47:18,249 INFO program: Block size: 1024 >03:47:18,249 INFO program: Fragment size: 1024 >03:47:18,250 INFO program: Reserved GDT blocks: 256 >03:47:18,250 INFO program: Blocks per group: 8192 >03:47:18,251 INFO program: Fragments per group: 8192 >03:47:18,251 INFO program: Inodes per group: 512 >03:47:18,252 INFO program: Inode blocks per group: 128 >03:47:18,252 INFO program: Flex block group size: 16 >03:47:18,253 INFO program: Filesystem created: Fri May 10 15:47:01 2013 >03:47:18,257 INFO program: Last mount time: Sat May 11 03:46:56 2013 >03:47:18,258 INFO program: Last write time: Sat May 11 03:46:56 2013 >03:47:18,258 INFO program: Mount count: 3 >03:47:18,259 INFO program: Maximum mount count: -1 >03:47:18,259 INFO program: Last checked: Fri May 10 15:47:01 2013 >03:47:18,260 INFO program: Check interval: 
0 (<none>) >03:47:18,260 INFO program: Lifetime writes: 32 MB >03:47:18,261 INFO program: Reserved blocks uid: 0 (user root) >03:47:18,262 INFO program: Reserved blocks gid: 0 (group root) >03:47:18,262 INFO program: First inode: 11 >03:47:18,263 INFO program: Inode size: 256 >03:47:18,263 INFO program: Required extra isize: 28 >03:47:18,264 INFO program: Desired extra isize: 28 >03:47:18,264 INFO program: Journal inode: 8 >03:47:18,265 INFO program: Default directory hash: half_md4 >03:47:18,265 INFO program: Directory Hash Seed: f77f1af6-7633-41c3-8e3b-055593212a76 >03:47:18,266 INFO program: Journal backup: inode blocks >03:47:18,266 INFO program: Journal features: (none) >03:47:18,268 INFO program: Journal size: 32M >03:47:18,268 INFO program: Journal length: 32768 >03:47:18,269 INFO program: Journal sequence: 0x0000000a >03:47:18,270 INFO program: Journal start: 1 >03:47:18,270 INFO program: >03:47:18,271 DEBUG program: Return code: 0 >03:47:18,272 INFO program: Running... resize2fs -P /dev/mapper/live-rw >03:47:18,294 INFO program: resize2fs 1.42.7 (21-Jan-2013) >03:47:18,295 INFO program: Estimated minimum size of the filesystem: 776082 >03:47:18,295 DEBUG program: Return code: 0 >03:47:18,374 INFO program: Running... udevadm settle --timeout=300 >03:47:18,410 DEBUG program: Return code: 0 >03:47:18,431 INFO program: Running... udevadm settle --timeout=300 >03:47:18,464 DEBUG program: Return code: 0 >03:47:18,471 INFO program: Running... udevadm settle --timeout=300 >03:47:18,506 DEBUG program: Return code: 0 >03:47:18,507 INFO program: Running... mdadm --stop /dev/md/dhcppc0:swap >03:47:18,735 INFO program: mdadm: stopped /dev/md/dhcppc0:swap >03:47:18,737 DEBUG program: Return code: 0 >03:47:18,749 INFO program: Running... udevadm settle --timeout=300 >03:47:18,787 DEBUG program: Return code: 0 >03:47:18,798 INFO program: Running... udevadm settle --timeout=300 >03:47:18,832 DEBUG program: Return code: 0 >03:47:18,843 INFO program: Running... 
udevadm settle --timeout=300 >03:47:18,882 DEBUG program: Return code: 0 >03:47:18,891 INFO program: Running... udevadm settle --timeout=300 >03:47:18,928 DEBUG program: Return code: 0 >03:47:18,940 INFO program: Running... udevadm settle --timeout=300 >03:47:18,974 DEBUG program: Return code: 0 >03:47:18,984 INFO program: Running... udevadm settle --timeout=300 >03:47:19,029 DEBUG program: Return code: 0 >03:47:19,038 INFO program: Running... udevadm settle --timeout=300 >03:47:19,073 DEBUG program: Return code: 0 >03:47:19,085 INFO program: Running... udevadm settle --timeout=300 >03:47:19,119 DEBUG program: Return code: 0 >03:47:19,122 INFO program: Running... udevadm settle --timeout=300 >03:47:19,155 DEBUG program: Return code: 0 >03:47:19,158 INFO program: Running... udevadm settle --timeout=300 >03:47:19,194 DEBUG program: Return code: 0 >03:47:19,197 INFO program: Running... udevadm settle --timeout=300 >03:47:19,231 DEBUG program: Return code: 0 >03:47:19,243 INFO program: Running... udevadm settle --timeout=300 >03:47:19,278 DEBUG program: Return code: 0 >03:47:19,281 INFO program: Running... udevadm settle --timeout=300 >03:47:19,317 DEBUG program: Return code: 0 >03:47:19,323 INFO program: Running... udevadm settle --timeout=300 >03:47:19,359 DEBUG program: Return code: 0 >03:47:19,362 INFO program: Running... udevadm settle --timeout=300 >03:47:19,395 DEBUG program: Return code: 0 >03:47:19,406 INFO program: Running... udevadm settle --timeout=300 >03:47:19,441 DEBUG program: Return code: 0 >03:47:19,444 INFO program: Running... udevadm settle --timeout=300 >03:47:19,479 DEBUG program: Return code: 0 >03:47:19,491 INFO program: Running... udevadm settle --timeout=300 >03:47:19,524 DEBUG program: Return code: 0 >03:47:19,527 INFO program: Running... udevadm settle --timeout=300 >03:47:19,559 DEBUG program: Return code: 0 >03:47:19,562 INFO program: Running... 
udevadm settle --timeout=300 >03:47:19,598 DEBUG program: Return code: 0 >03:47:19,601 INFO program: Running... udevadm settle --timeout=300 >03:47:19,633 DEBUG program: Return code: 0 >03:47:19,644 INFO program: Running... udevadm settle --timeout=300 >03:47:19,678 DEBUG program: Return code: 0 >03:47:19,682 INFO program: Running... udevadm settle --timeout=300 >03:47:19,717 DEBUG program: Return code: 0 >03:47:19,733 INFO program: Running... udevadm settle --timeout=300 >03:47:19,769 DEBUG program: Return code: 0 >03:47:19,777 INFO program: Running... udevadm settle --timeout=300 >03:47:19,814 DEBUG program: Return code: 0 >03:47:19,828 INFO program: Running... udevadm settle --timeout=300 >03:47:19,861 DEBUG program: Return code: 0 >03:47:19,864 INFO program: Running... udevadm settle --timeout=300 >03:47:19,900 DEBUG program: Return code: 0 >03:47:19,914 INFO program: Running... udevadm settle --timeout=300 >03:47:19,950 DEBUG program: Return code: 0 >03:47:19,966 INFO program: Running... mount -t btrfs -o subvol=boot,ro /dev/sda2 /mnt/sysimage >03:47:20,102 DEBUG program: Return code: 0 >03:47:20,106 INFO program: Running... umount /mnt/sysimage >03:47:20,129 DEBUG program: Return code: 0 >03:47:20,130 INFO program: Running... udevadm settle --timeout=300 >03:47:20,168 DEBUG program: Return code: 0 >03:47:20,171 INFO program: Running... udevadm settle --timeout=300 >03:47:20,210 DEBUG program: Return code: 0 >03:47:20,215 INFO program: Running... udevadm settle --timeout=300 >03:47:20,250 DEBUG program: Return code: 0 >03:47:20,261 INFO program: Running... udevadm settle --timeout=300 >03:47:20,295 DEBUG program: Return code: 0 >03:47:20,299 INFO program: Running... udevadm settle --timeout=300 >03:47:20,332 DEBUG program: Return code: 0 >03:47:20,343 INFO program: Running... udevadm settle --timeout=300 >03:47:20,376 DEBUG program: Return code: 0 >03:47:20,379 INFO program: Running... 
udevadm settle --timeout=300 >03:47:20,417 DEBUG program: Return code: 0 >03:47:20,429 INFO program: Running... udevadm settle --timeout=300 >03:47:20,462 DEBUG program: Return code: 0 >03:47:20,466 INFO program: Running... udevadm settle --timeout=300 >03:47:20,501 DEBUG program: Return code: 0 >03:47:20,513 INFO program: Running... udevadm settle --timeout=300 >03:47:20,551 DEBUG program: Return code: 0 >03:47:20,554 INFO program: Running... mount -t btrfs -o subvol=root,ro /dev/sda2 /mnt/sysimage >03:47:20,682 DEBUG program: Return code: 0 >03:47:20,705 INFO program: Running... arch >03:47:21,006 INFO program: x86_64 >03:47:21,012 DEBUG program: Return code: 0 >03:47:21,044 INFO program: Running... umount /mnt/sysimage >03:47:21,072 DEBUG program: Return code: 0 >03:47:21,073 INFO program: Running... udevadm settle --timeout=300 >03:47:21,124 DEBUG program: Return code: 0 >03:47:36,277 INFO program: Running... modprobe xfs >03:47:36,398 DEBUG program: Return code: 0 >03:47:36,418 INFO program: Running... modprobe vfat >03:47:36,472 DEBUG program: Return code: 0 >03:48:08,804 INFO program: Running... udevadm settle --timeout=300 >03:48:08,879 DEBUG program: Return code: 0 >03:48:14,512 INFO program: Running... udevadm settle --timeout=300 >03:48:14,536 DEBUG program: Return code: 0 >03:48:14,550 INFO program: Running... udevadm settle --timeout=300 >03:48:14,573 DEBUG program: Return code: 0 >03:48:14,586 INFO program: Running... udevadm settle --timeout=300 >03:48:14,609 DEBUG program: Return code: 0 >03:48:14,629 INFO program: Running... udevadm settle --timeout=300 >03:48:14,652 DEBUG program: Return code: 0 >03:48:14,670 INFO program: Running... udevadm settle --timeout=300 >03:48:14,693 DEBUG program: Return code: 0 >03:48:14,710 INFO program: Running... udevadm settle --timeout=300 >03:48:14,733 DEBUG program: Return code: 0 >03:48:14,750 INFO program: Running... 
udevadm settle --timeout=300 >03:48:14,774 DEBUG program: Return code: 0 >03:48:14,812 INFO program: Running... udevadm settle --timeout=300 >03:48:14,835 DEBUG program: Return code: 0 >03:48:14,875 INFO program: Running... udevadm settle --timeout=300 >03:48:14,899 DEBUG program: Return code: 0 >03:48:14,938 INFO program: Running... udevadm settle --timeout=300 >03:48:14,963 DEBUG program: Return code: 0 >03:48:15,004 INFO program: Running... udevadm settle --timeout=300 >03:48:15,030 DEBUG program: Return code: 0 >03:51:04,200 INFO program: Running... hwclock --systohc --utc >03:51:04,501 DEBUG program: Return code: 0 >03:51:04,520 INFO program: Running... udevadm settle --timeout=300 >03:51:04,554 DEBUG program: Return code: 0 >03:51:04,803 INFO program: Running... udevadm settle --timeout=300 >03:51:04,837 DEBUG program: Return code: 0 >03:51:04,844 INFO program: Running... udevadm settle --timeout=300 >03:51:04,874 DEBUG program: Return code: 0 >03:51:04,876 INFO program: Running... udevadm settle --timeout=300 >03:51:04,903 DEBUG program: Return code: 0 >03:51:04,914 INFO program: Running... udevadm settle --timeout=300 >03:51:04,946 DEBUG program: Return code: 0 >03:51:04,974 INFO program: Running... mount -t btrfs -o subvolid=0 /dev/sda2 /tmp/btrfs-tmp.5yIXAwt >03:51:05,008 DEBUG program: Return code: 0 >03:51:05,011 INFO program: Running... btrfs subvol delete /tmp/btrfs-tmp.5yIXAwt/boot >03:51:05,052 INFO program: Delete subvolume '/tmp/btrfs-tmp.5yIXAwt/boot' >03:51:05,053 DEBUG program: Return code: 0 >03:51:05,056 INFO program: Running... umount /tmp/btrfs-tmp.5yIXAwt >03:51:08,211 DEBUG program: Return code: 0 >03:51:08,214 INFO program: Running... udevadm settle --timeout=300 >03:51:08,259 DEBUG program: Return code: 0 >03:51:08,263 INFO program: Running... udevadm settle --timeout=300 >03:51:08,292 DEBUG program: Return code: 0 >03:51:08,298 INFO program: Running... 
udevadm settle --timeout=300 >03:51:08,327 DEBUG program: Return code: 0 >03:51:08,328 INFO program: Running... udevadm settle --timeout=300 >03:51:08,358 DEBUG program: Return code: 0 >03:51:08,367 INFO program: Running... udevadm settle --timeout=300 >03:51:08,397 DEBUG program: Return code: 0 >03:51:08,405 INFO program: Running... mount -t btrfs -o subvolid=0 /dev/sda2 /tmp/btrfs-tmp.5174F_f >03:51:08,454 DEBUG program: Return code: 0 >03:51:08,456 INFO program: Running... btrfs subvol delete /tmp/btrfs-tmp.5174F_f/root >03:51:08,529 INFO program: Delete subvolume '/tmp/btrfs-tmp.5174F_f/root' >03:51:08,531 DEBUG program: Return code: 0 >03:51:08,533 INFO program: Running... umount /tmp/btrfs-tmp.5174F_f >03:51:21,383 DEBUG program: Return code: 0 >03:51:21,384 INFO program: Running... udevadm settle --timeout=300 >03:51:21,415 DEBUG program: Return code: 0 >03:51:21,421 INFO program: Running... udevadm settle --timeout=300 >03:51:21,449 DEBUG program: Return code: 0 >03:51:21,457 INFO program: Running... udevadm settle --timeout=300 >03:51:21,484 DEBUG program: Return code: 0 >03:51:21,485 INFO program: Running... udevadm settle --timeout=300 >03:51:21,514 DEBUG program: Return code: 0 >03:51:21,522 INFO program: Running... udevadm settle --timeout=300 >03:51:21,553 DEBUG program: Return code: 0 >03:51:21,577 INFO program: Running... wipefs -f -a /dev/sda2 >03:51:21,679 INFO program: /dev/sda2: 8 bytes were erased at offset 0x00010040 (btrfs): 5f 42 48 52 66 53 5f 4d >03:51:21,681 INFO program: /dev/sda2: 8 bytes were erased at offset 0x04000040 (btrfs): 5f 42 48 52 66 53 5f 4d >03:51:21,681 DEBUG program: Return code: 0 >03:51:21,690 INFO program: Running... 
wipefs -f -a /dev/sdd2 >03:51:21,746 INFO program: /dev/sdd2: 8 bytes were erased at offset 0x00010040 (btrfs): 5f 42 48 52 66 53 5f 4d >03:51:21,749 INFO program: /dev/sdd2: 8 bytes were erased at offset 0x04000040 (btrfs): 5f 42 48 52 66 53 5f 4d >03:51:21,749 DEBUG program: Return code: 0 >03:51:21,754 INFO program: Running... wipefs -f -a /dev/sdc2 >03:51:21,839 INFO program: /dev/sdc2: 8 bytes were erased at offset 0x00010040 (btrfs): 5f 42 48 52 66 53 5f 4d >03:51:21,842 INFO program: /dev/sdc2: 8 bytes were erased at offset 0x04000040 (btrfs): 5f 42 48 52 66 53 5f 4d >03:51:21,842 DEBUG program: Return code: 0 >03:51:21,847 INFO program: Running... wipefs -f -a /dev/sdb2 >03:51:21,931 INFO program: /dev/sdb2: 8 bytes were erased at offset 0x00010040 (btrfs): 5f 42 48 52 66 53 5f 4d >03:51:21,932 INFO program: /dev/sdb2: 8 bytes were erased at offset 0x04000040 (btrfs): 5f 42 48 52 66 53 5f 4d >03:51:21,932 DEBUG program: Return code: 0 >03:51:21,933 INFO program: Running... udevadm settle --timeout=300 >03:51:21,965 DEBUG program: Return code: 0 >03:51:21,969 INFO program: Running... udevadm settle --timeout=300 >03:51:22,000 DEBUG program: Return code: 0 >03:51:22,007 INFO program: Running... udevadm settle --timeout=300 >03:51:22,038 DEBUG program: Return code: 0 >03:51:22,039 INFO program: Running... udevadm settle --timeout=300 >03:51:22,066 DEBUG program: Return code: 0 >03:51:22,077 INFO program: Running... udevadm settle --timeout=300 >03:51:22,109 DEBUG program: Return code: 0 >03:51:22,166 INFO program: Running... udevadm settle --timeout=300 >03:51:22,217 DEBUG program: Return code: 0 >03:51:22,220 INFO program: Running... udevadm settle --timeout=300 >03:51:22,248 DEBUG program: Return code: 0 >03:51:22,253 INFO program: Running... udevadm settle --timeout=300 >03:51:22,283 DEBUG program: Return code: 0 >03:51:22,291 INFO program: Running... 
udevadm settle --timeout=300 >03:51:22,320 DEBUG program: Return code: 0 >03:51:22,322 INFO program: Running... udevadm settle --timeout=300 >03:51:22,352 DEBUG program: Return code: 0 >03:51:22,363 INFO program: Running... udevadm settle --timeout=300 >03:51:22,394 DEBUG program: Return code: 0 >03:51:22,460 INFO program: Running... udevadm settle --timeout=300 >03:51:22,507 DEBUG program: Return code: 0 >03:51:22,509 INFO program: Running... udevadm settle --timeout=300 >03:51:22,539 DEBUG program: Return code: 0 >03:51:22,543 INFO program: Running... udevadm settle --timeout=300 >03:51:22,572 DEBUG program: Return code: 0 >03:51:22,579 INFO program: Running... udevadm settle --timeout=300 >03:51:22,611 DEBUG program: Return code: 0 >03:51:22,612 INFO program: Running... udevadm settle --timeout=300 >03:51:22,640 DEBUG program: Return code: 0 >03:51:22,650 INFO program: Running... udevadm settle --timeout=300 >03:51:22,680 DEBUG program: Return code: 0 >03:51:22,753 INFO program: Running... udevadm settle --timeout=300 >03:51:22,801 DEBUG program: Return code: 0 >03:51:22,802 INFO program: Running... udevadm settle --timeout=300 >03:51:22,832 DEBUG program: Return code: 0 >03:51:22,836 INFO program: Running... udevadm settle --timeout=300 >03:51:22,864 DEBUG program: Return code: 0 >03:51:22,870 INFO program: Running... udevadm settle --timeout=300 >03:51:22,900 DEBUG program: Return code: 0 >03:51:22,901 INFO program: Running... udevadm settle --timeout=300 >03:51:22,932 DEBUG program: Return code: 0 >03:51:22,941 INFO program: Running... udevadm settle --timeout=300 >03:51:22,970 DEBUG program: Return code: 0 >03:51:23,057 INFO program: Running... udevadm settle --timeout=300 >03:51:23,098 DEBUG program: Return code: 0 >03:51:23,099 INFO program: Running... udevadm settle --timeout=300 >03:51:23,130 DEBUG program: Return code: 0 >03:51:23,162 INFO program: Running... 
mdadm --assemble /dev/md/dhcppc0:swap --uuid=c51936a3:08423708:8fcc5618:49cc057b --run /dev/sda1 /dev/sdd1 /dev/sdc1 /dev/sdb1 >03:51:23,437 INFO program: mdadm: /dev/md/dhcppc0:swap has been started with 4 drives. >03:51:23,443 DEBUG program: Return code: 0 >03:51:23,445 INFO program: Running... udevadm settle --timeout=300 >03:51:23,484 DEBUG program: Return code: 0 >03:51:23,489 INFO program: Running... wipefs -f -a /dev/md/dhcppc0:swap >03:51:23,715 INFO program: /dev/md/dhcppc0:swap: 10 bytes were erased at offset 0x00000ff6 (swap): 53 57 41 50 53 50 41 43 45 32 >03:51:23,729 DEBUG program: Return code: 0 >03:51:23,730 INFO program: Running... udevadm settle --timeout=300 >03:51:23,771 DEBUG program: Return code: 0 >03:51:23,780 INFO program: Running... udevadm settle --timeout=300 >03:51:23,810 DEBUG program: Return code: 0 >03:51:23,811 INFO program: Running... mdadm --stop /dev/md/dhcppc0:swap >03:51:24,042 INFO program: mdadm: stopped /dev/md/dhcppc0:swap >03:51:24,042 DEBUG program: Return code: 0 >03:51:24,043 INFO program: Running... udevadm settle --timeout=300 >03:51:24,074 DEBUG program: Return code: 0 >03:51:24,080 INFO program: Running... udevadm settle --timeout=300 >03:51:24,111 DEBUG program: Return code: 0 >03:51:24,115 INFO program: Running... mdadm --zero-superblock /dev/sda1 >03:51:24,168 DEBUG program: Return code: 0 >03:51:24,172 INFO program: Running... udevadm settle --timeout=300 >03:51:24,231 DEBUG program: Return code: 0 >03:51:24,238 INFO program: Running... udevadm settle --timeout=300 >03:51:24,269 DEBUG program: Return code: 0 >03:51:24,271 INFO program: Running... udevadm settle --timeout=300 >03:51:24,307 DEBUG program: Return code: 0 >03:51:24,319 INFO program: Running... udevadm settle --timeout=300 >03:51:24,348 DEBUG program: Return code: 0 >03:51:24,390 INFO program: Running... udevadm settle --timeout=300 >03:51:24,434 DEBUG program: Return code: 0 >03:51:24,435 INFO program: Running... 
udevadm settle --timeout=300 >03:51:24,466 DEBUG program: Return code: 0 >03:51:24,502 INFO program: Running... udevadm settle --timeout=300 >03:51:24,558 DEBUG program: Return code: 0 >03:51:24,564 INFO program: Running... udevadm settle --timeout=300 >03:51:24,594 DEBUG program: Return code: 0 >03:51:24,595 INFO program: Running... udevadm settle --timeout=300 >03:51:24,628 DEBUG program: Return code: 0 >03:51:24,632 INFO program: Running... mdadm --zero-superblock /dev/sdd1 >03:51:24,684 DEBUG program: Return code: 0 >03:51:24,686 INFO program: Running... udevadm settle --timeout=300 >03:51:24,727 DEBUG program: Return code: 0 >03:51:24,734 INFO program: Running... udevadm settle --timeout=300 >03:51:24,764 DEBUG program: Return code: 0 >03:51:24,765 INFO program: Running... udevadm settle --timeout=300 >03:51:24,794 DEBUG program: Return code: 0 >03:51:24,807 INFO program: Running... udevadm settle --timeout=300 >03:51:24,836 DEBUG program: Return code: 0 >03:51:24,864 INFO program: Running... udevadm settle --timeout=300 >03:51:24,903 DEBUG program: Return code: 0 >03:51:24,905 INFO program: Running... udevadm settle --timeout=300 >03:51:24,932 DEBUG program: Return code: 0 >03:51:25,036 INFO program: Running... udevadm settle --timeout=300 >03:51:25,084 DEBUG program: Return code: 0 >03:51:25,092 INFO program: Running... udevadm settle --timeout=300 >03:51:25,123 DEBUG program: Return code: 0 >03:51:25,124 INFO program: Running... udevadm settle --timeout=300 >03:51:25,151 DEBUG program: Return code: 0 >03:51:25,155 INFO program: Running... mdadm --zero-superblock /dev/sdc1 >03:51:25,264 DEBUG program: Return code: 0 >03:51:25,266 INFO program: Running... udevadm settle --timeout=300 >03:51:25,304 DEBUG program: Return code: 0 >03:51:25,311 INFO program: Running... udevadm settle --timeout=300 >03:51:25,342 DEBUG program: Return code: 0 >03:51:25,343 INFO program: Running... 
udevadm settle --timeout=300 >03:51:25,373 DEBUG program: Return code: 0 >03:51:25,383 INFO program: Running... udevadm settle --timeout=300 >03:51:25,413 DEBUG program: Return code: 0 >03:51:25,496 INFO program: Running... udevadm settle --timeout=300 >03:51:25,543 DEBUG program: Return code: 0 >03:51:25,545 INFO program: Running... udevadm settle --timeout=300 >03:51:25,577 DEBUG program: Return code: 0 >03:51:25,777 INFO program: Running... udevadm settle --timeout=300 >03:51:25,825 DEBUG program: Return code: 0 >03:51:25,834 INFO program: Running... udevadm settle --timeout=300 >03:51:25,865 DEBUG program: Return code: 0 >03:51:25,866 INFO program: Running... udevadm settle --timeout=300 >03:51:25,896 DEBUG program: Return code: 0 >03:51:25,901 INFO program: Running... mdadm --zero-superblock /dev/sdb1 >03:51:26,000 DEBUG program: Return code: 0 >03:51:26,003 INFO program: Running... udevadm settle --timeout=300 >03:51:26,053 DEBUG program: Return code: 0 >03:51:26,063 INFO program: Running... udevadm settle --timeout=300 >03:51:26,094 DEBUG program: Return code: 0 >03:51:26,095 INFO program: Running... udevadm settle --timeout=300 >03:51:26,121 DEBUG program: Return code: 0 >03:51:26,134 INFO program: Running... udevadm settle --timeout=300 >03:51:26,162 DEBUG program: Return code: 0 >03:51:26,202 INFO program: Running... udevadm settle --timeout=300 >03:51:26,242 DEBUG program: Return code: 0 >03:51:26,243 INFO program: Running... udevadm settle --timeout=300 >03:51:26,272 DEBUG program: Return code: 0 >03:51:26,349 INFO program: Running... udevadm settle --timeout=300 >03:51:26,394 DEBUG program: Return code: 0 >03:51:26,399 INFO program: Running... udevadm settle --timeout=300 >03:51:26,429 DEBUG program: Return code: 0 >03:51:26,430 INFO program: Running... udevadm settle --timeout=300 >03:51:26,460 DEBUG program: Return code: 0 >03:51:26,528 INFO program: Running... 
udevadm settle --timeout=300 >03:51:26,572 DEBUG program: Return code: 0 >03:51:26,573 INFO program: Running... udevadm settle --timeout=300 >03:51:26,605 DEBUG program: Return code: 0 >03:51:26,610 INFO program: Running... udevadm settle --timeout=300 >03:51:26,649 DEBUG program: Return code: 0 >03:51:26,697 INFO program: Running... udevadm settle --timeout=300 >03:51:26,749 DEBUG program: Return code: 0 >03:51:26,759 INFO program: Running... wipefs -f -a /dev/sda1 >03:51:26,795 DEBUG program: Return code: 0 >03:51:26,805 INFO program: Running... udevadm settle --timeout=300 >03:51:26,838 DEBUG program: Return code: 0 >03:51:26,839 INFO program: Running... udevadm settle --timeout=300 >03:51:26,869 DEBUG program: Return code: 0 >03:51:26,907 INFO program: Running... udevadm settle --timeout=300 >03:51:26,968 DEBUG program: Return code: 0 >03:51:26,977 INFO program: Running... wipefs -f -a /dev/sda2 >03:51:27,023 DEBUG program: Return code: 0 >03:51:27,038 INFO program: Running... udevadm settle --timeout=300 >03:51:27,068 DEBUG program: Return code: 0 >03:51:27,069 INFO program: Running... udevadm settle --timeout=300 >03:51:27,095 DEBUG program: Return code: 0 >03:51:27,140 INFO program: Running... udevadm settle --timeout=300 >03:51:27,206 DEBUG program: Return code: 0 >03:51:27,214 INFO program: Running... wipefs -f -a /dev/sda3 >03:51:27,251 DEBUG program: Return code: 0 >03:51:27,260 INFO program: Running... udevadm settle --timeout=300 >03:51:27,291 DEBUG program: Return code: 0 >03:51:27,292 INFO program: Running... udevadm settle --timeout=300 >03:51:27,322 DEBUG program: Return code: 0 >03:51:27,619 INFO program: Running... udevadm settle --timeout=300 >03:51:27,659 DEBUG program: Return code: 0 >03:51:27,664 INFO program: Running... udevadm settle --timeout=300 >03:51:27,696 DEBUG program: Return code: 0 >03:51:27,753 INFO program: Running... udevadm settle --timeout=300 >03:51:27,791 DEBUG program: Return code: 0 >03:51:27,797 INFO program: Running... 
udevadm settle --timeout=300 >03:51:27,830 DEBUG program: Return code: 0 >03:51:27,884 INFO program: Running... udevadm settle --timeout=300 >03:51:27,935 DEBUG program: Return code: 0 >03:51:27,945 INFO program: Running... udevadm settle --timeout=300 >03:51:27,976 DEBUG program: Return code: 0 >03:51:28,041 INFO program: Running... udevadm settle --timeout=300 >03:51:28,081 DEBUG program: Return code: 0 >03:51:28,082 INFO program: Running... udevadm settle --timeout=300 >03:51:28,111 DEBUG program: Return code: 0 >03:51:28,117 INFO program: Running... udevadm settle --timeout=300 >03:51:28,149 DEBUG program: Return code: 0 >03:51:28,175 INFO program: Running... udevadm settle --timeout=300 >03:51:28,242 DEBUG program: Return code: 0 >03:51:28,248 INFO program: Running... wipefs -f -a /dev/sdd1 >03:51:28,303 DEBUG program: Return code: 0 >03:51:28,316 INFO program: Running... udevadm settle --timeout=300 >03:51:28,342 DEBUG program: Return code: 0 >03:51:28,343 INFO program: Running... udevadm settle --timeout=300 >03:51:28,374 DEBUG program: Return code: 0 >03:51:28,472 INFO program: Running... udevadm settle --timeout=300 >03:51:28,529 DEBUG program: Return code: 0 >03:51:28,535 INFO program: Running... wipefs -f -a /dev/sdd2 >03:51:28,593 DEBUG program: Return code: 0 >03:51:28,606 INFO program: Running... udevadm settle --timeout=300 >03:51:28,636 DEBUG program: Return code: 0 >03:51:28,637 INFO program: Running... udevadm settle --timeout=300 >03:51:28,665 DEBUG program: Return code: 0 >03:51:28,750 INFO program: Running... udevadm settle --timeout=300 >03:51:28,838 DEBUG program: Return code: 0 >03:51:28,845 INFO program: Running... wipefs -f -a /dev/sdd3 >03:51:28,889 DEBUG program: Return code: 0 >03:51:28,899 INFO program: Running... udevadm settle --timeout=300 >03:51:28,929 DEBUG program: Return code: 0 >03:51:28,930 INFO program: Running... udevadm settle --timeout=300 >03:51:28,959 DEBUG program: Return code: 0 >03:51:29,058 INFO program: Running... 
udevadm settle --timeout=300 >03:51:29,092 DEBUG program: Return code: 0 >03:51:29,097 INFO program: Running... udevadm settle --timeout=300 >03:51:29,127 DEBUG program: Return code: 0 >03:51:29,179 INFO program: Running... udevadm settle --timeout=300 >03:51:29,222 DEBUG program: Return code: 0 >03:51:29,227 INFO program: Running... udevadm settle --timeout=300 >03:51:29,257 DEBUG program: Return code: 0 >03:51:29,309 INFO program: Running... udevadm settle --timeout=300 >03:51:29,342 DEBUG program: Return code: 0 >03:51:29,351 INFO program: Running... udevadm settle --timeout=300 >03:51:29,382 DEBUG program: Return code: 0 >03:51:29,458 INFO program: Running... udevadm settle --timeout=300 >03:51:29,512 DEBUG program: Return code: 0 >03:51:29,513 INFO program: Running... udevadm settle --timeout=300 >03:51:29,544 DEBUG program: Return code: 0 >03:51:29,548 INFO program: Running... udevadm settle --timeout=300 >03:51:29,577 DEBUG program: Return code: 0 >03:51:29,609 INFO program: Running... udevadm settle --timeout=300 >03:51:29,656 DEBUG program: Return code: 0 >03:51:29,662 INFO program: Running... wipefs -f -a /dev/sdc1 >03:51:29,712 DEBUG program: Return code: 0 >03:51:29,721 INFO program: Running... udevadm settle --timeout=300 >03:51:29,752 DEBUG program: Return code: 0 >03:51:29,753 INFO program: Running... udevadm settle --timeout=300 >03:51:29,780 DEBUG program: Return code: 0 >03:51:29,811 INFO program: Running... udevadm settle --timeout=300 >03:51:29,878 DEBUG program: Return code: 0 >03:51:29,884 INFO program: Running... wipefs -f -a /dev/sdc2 >03:51:29,946 DEBUG program: Return code: 0 >03:51:29,954 INFO program: Running... udevadm settle --timeout=300 >03:51:29,985 DEBUG program: Return code: 0 >03:51:29,986 INFO program: Running... udevadm settle --timeout=300 >03:51:30,017 DEBUG program: Return code: 0 >03:51:30,094 INFO program: Running... 
udevadm settle --timeout=300 >03:51:30,180 DEBUG program: Return code: 0 >03:51:30,196 INFO program: Running... wipefs -f -a /dev/sdc3 >03:51:30,252 DEBUG program: Return code: 0 >03:51:30,264 INFO program: Running... udevadm settle --timeout=300 >03:51:30,291 DEBUG program: Return code: 0 >03:51:30,292 INFO program: Running... udevadm settle --timeout=300 >03:51:30,322 DEBUG program: Return code: 0 >03:51:30,416 INFO program: Running... udevadm settle --timeout=300 >03:51:30,454 DEBUG program: Return code: 0 >03:51:30,459 INFO program: Running... udevadm settle --timeout=300 >03:51:30,488 DEBUG program: Return code: 0 >03:51:30,546 INFO program: Running... udevadm settle --timeout=300 >03:51:30,578 DEBUG program: Return code: 0 >03:51:30,583 INFO program: Running... udevadm settle --timeout=300 >03:51:30,613 DEBUG program: Return code: 0 >03:51:30,732 INFO program: Running... udevadm settle --timeout=300 >03:51:30,785 DEBUG program: Return code: 0 >03:51:30,792 INFO program: Running... udevadm settle --timeout=300 >03:51:30,825 DEBUG program: Return code: 0 >03:51:30,883 INFO program: Running... udevadm settle --timeout=300 >03:51:30,937 DEBUG program: Return code: 0 >03:51:30,938 INFO program: Running... udevadm settle --timeout=300 >03:51:30,987 DEBUG program: Return code: 0 >03:51:30,995 INFO program: Running... udevadm settle --timeout=300 >03:51:31,030 DEBUG program: Return code: 0 >03:51:31,060 INFO program: Running... udevadm settle --timeout=300 >03:51:31,120 DEBUG program: Return code: 0 >03:51:31,129 INFO program: Running... wipefs -f -a /dev/sdb1 >03:51:31,171 DEBUG program: Return code: 0 >03:51:31,181 INFO program: Running... udevadm settle --timeout=300 >03:51:31,213 DEBUG program: Return code: 0 >03:51:31,214 INFO program: Running... udevadm settle --timeout=300 >03:51:31,257 DEBUG program: Return code: 0 >03:51:31,321 INFO program: Running... 
udevadm settle --timeout=300 >03:51:31,389 DEBUG program: Return code: 0 >03:51:31,396 INFO program: Running... wipefs -f -a /dev/sdb2 >03:51:31,433 DEBUG program: Return code: 0 >03:51:31,445 INFO program: Running... udevadm settle --timeout=300 >03:51:31,475 DEBUG program: Return code: 0 >03:51:31,477 INFO program: Running... udevadm settle --timeout=300 >03:51:31,503 DEBUG program: Return code: 0 >03:51:31,551 INFO program: Running... udevadm settle --timeout=300 >03:51:31,627 DEBUG program: Return code: 0 >03:51:31,638 INFO program: Running... wipefs -f -a /dev/sdb3 >03:51:31,675 DEBUG program: Return code: 0 >03:51:31,690 INFO program: Running... udevadm settle --timeout=300 >03:51:31,719 DEBUG program: Return code: 0 >03:51:31,721 INFO program: Running... udevadm settle --timeout=300 >03:51:31,749 DEBUG program: Return code: 0 >03:51:31,816 INFO program: Running... udevadm settle --timeout=300 >03:51:31,852 DEBUG program: Return code: 0 >03:51:31,858 INFO program: Running... udevadm settle --timeout=300 >03:51:31,889 DEBUG program: Return code: 0 >03:51:31,907 INFO program: Running... mdadm --create /dev/md/swap --run --level=10 --raid-devices=4 --metadata=default /dev/sda3 /dev/sdb3 /dev/sdc3 /dev/sdd3 >03:51:32,563 INFO program: mdadm: array /dev/md/swap started. >03:51:32,566 DEBUG program: Return code: 0 >03:51:32,572 INFO program: Running... udevadm settle --timeout=300 >03:51:32,603 DEBUG program: Return code: 0 >03:51:32,604 INFO program: Running... udevadm settle --timeout=300 >03:51:32,636 DEBUG program: Return code: 0 >03:51:32,645 INFO program: Running... mkswap -f /dev/md/swap >03:51:32,848 INFO program: Setting up swapspace version 1, size = 785404 KiB >03:51:32,850 INFO program: no label, UUID=e6bfd32a-4edf-45f8-88fc-b0d52bc2e20b >03:51:32,850 DEBUG program: Return code: 0 >03:51:32,857 INFO program: Running... udevadm settle --timeout=300 >03:51:32,911 DEBUG program: Return code: 0 >03:51:32,915 INFO program: Running... 
udevadm settle --timeout=300 >03:51:32,944 DEBUG program: Return code: 0 >03:51:33,026 INFO program: Running... udevadm settle --timeout=300 >03:51:33,063 DEBUG program: Return code: 0 >03:51:33,069 INFO program: Running... udevadm settle --timeout=300 >03:51:33,101 DEBUG program: Return code: 0 >03:51:33,120 INFO program: Running... mdadm --create /dev/md/boot --run --level=1 --raid-devices=4 --metadata=1.0 /dev/sda2 /dev/sdb2 /dev/sdc2 /dev/sdd2 >03:51:33,699 INFO program: mdadm: array /dev/md/boot started. >03:51:33,700 DEBUG program: Return code: 0 >03:51:33,705 INFO program: Running... udevadm settle --timeout=300 >03:51:33,739 DEBUG program: Return code: 0 >03:51:33,741 INFO program: Running... udevadm settle --timeout=300 >03:51:33,771 DEBUG program: Return code: 0 >03:51:33,780 INFO program: Running... mke2fs -t ext4 /dev/md/boot >03:51:37,294 INFO program: mke2fs 1.42.7 (21-Jan-2013) >03:51:37,296 INFO program: Filesystem label= >03:51:37,297 INFO program: OS type: Linux >03:51:37,298 INFO program: Block size=1024 (log=0) >03:51:37,298 INFO program: Fragment size=1024 (log=0) >03:51:37,299 INFO program: Stride=0 blocks, Stripe width=0 blocks >03:51:37,300 INFO program: 131072 inodes, 524224 blocks >03:51:37,301 INFO program: 26211 blocks (5.00%) reserved for the super user >03:51:37,302 INFO program: First data block=1 >03:51:37,303 INFO program: Maximum filesystem blocks=67633152 >03:51:37,304 INFO program: 64 block groups >03:51:37,305 INFO program: 8192 blocks per group, 8192 fragments per group >03:51:37,308 INFO program: 2048 inodes per group >03:51:37,310 INFO program: Superblock backups stored on blocks: >03:51:37,311 INFO program: 8193, 24577, 40961, 57345, 73729, 204801, 221185, 401409 >03:51:37,313 INFO program: >03:51:37,315 INFO program: Allocating group tables: 0/64 done >03:51:37,317 INFO program: Writing inode tables: 0/64 done >03:51:37,321 INFO program: Creating journal (8192 blocks): done >03:51:37,322 INFO program: Writing superblocks 
and filesystem accounting information: 0/64 done >03:51:37,323 INFO program: >03:51:37,325 DEBUG program: Return code: 0 >03:51:37,332 INFO program: Running... udevadm settle --timeout=300 >03:51:37,377 DEBUG program: Return code: 0 >03:51:37,381 INFO program: Running... udevadm settle --timeout=300 >03:51:37,410 DEBUG program: Return code: 0 >03:51:37,488 INFO program: Running... udevadm settle --timeout=300 >03:51:37,544 DEBUG program: Return code: 0 >03:51:37,550 INFO program: Running... udevadm settle --timeout=300 >03:51:37,589 DEBUG program: Return code: 0 >03:51:37,617 INFO program: Running... mdadm --create /dev/md/root --run --level=10 --raid-devices=4 --metadata=default --bitmap=internal /dev/sda1 /dev/sdb1 /dev/sdc1 /dev/sdd1 >03:51:38,520 INFO program: mdadm: array /dev/md/root started. >03:51:38,523 DEBUG program: Return code: 0 >03:51:38,529 INFO program: Running... udevadm settle --timeout=300 >03:51:38,561 DEBUG program: Return code: 0 >03:51:38,562 INFO program: Running... udevadm settle --timeout=300 >03:51:38,600 DEBUG program: Return code: 0 >03:51:38,613 INFO program: Running... 
mke2fs -t ext4 /dev/md/root >03:51:53,665 INFO program: mke2fs 1.42.7 (21-Jan-2013) >03:51:53,666 INFO program: Filesystem label= >03:51:53,667 INFO program: OS type: Linux >03:51:53,668 INFO program: Block size=4096 (log=2) >03:51:53,669 INFO program: Fragment size=4096 (log=2) >03:51:53,675 INFO program: Stride=128 blocks, Stripe width=256 blocks >03:51:53,676 INFO program: 384272 inodes, 1536768 blocks >03:51:53,678 INFO program: 76838 blocks (5.00%) reserved for the super user >03:51:53,679 INFO program: First data block=0 >03:51:53,680 INFO program: Maximum filesystem blocks=1577058304 >03:51:53,682 INFO program: 47 block groups >03:51:53,684 INFO program: 32768 blocks per group, 32768 fragments per group >03:51:53,685 INFO program: 8176 inodes per group >03:51:53,690 INFO program: Superblock backups stored on blocks: >03:51:53,691 INFO program: 32768, 98304, 163840, 229376, 294912, 819200, 884736 >03:51:53,692 INFO program: >03:51:53,693 INFO program: Allocating group tables: 0/47 done >03:51:53,694 INFO program: Writing inode tables: 0/47 done >03:51:53,695 INFO program: Creating journal (32768 blocks): done >03:51:53,695 INFO program: Writing superblocks and filesystem accounting information: 0/47 done >03:51:53,696 INFO program: >03:51:53,697 DEBUG program: Return code: 0 >03:51:53,704 INFO program: Running... udevadm settle --timeout=300 >03:51:53,753 DEBUG program: Return code: 0 >03:51:53,759 INFO program: Running... udevadm settle --timeout=300 >03:51:53,786 DEBUG program: Return code: 0 >03:51:54,142 INFO program: Running... swapon /dev/md/swap >03:51:54,173 DEBUG program: Return code: 0 >03:51:54,221 INFO program: Running... mount -t ext4 -o defaults /dev/md/root /mnt/sysimage >03:51:54,336 DEBUG program: Return code: 0 >03:51:54,347 INFO program: Running... mount -t ext4 -o defaults /dev/md/boot /mnt/sysimage/boot >03:51:54,504 DEBUG program: Return code: 0 >03:51:54,511 INFO program: Running... 
mount -t bind -o bind,defaults /dev /mnt/sysimage/dev >03:51:54,534 DEBUG program: Return code: 0 >03:51:54,539 INFO program: Running... mount -t devpts -o gid=5,mode=620 devpts /mnt/sysimage/dev/pts >03:51:54,562 DEBUG program: Return code: 0 >03:51:54,567 INFO program: Running... mount -t tmpfs -o defaults tmpfs /mnt/sysimage/dev/shm >03:51:54,592 DEBUG program: Return code: 0 >03:51:54,597 INFO program: Running... mount -t proc -o defaults proc /mnt/sysimage/proc >03:51:54,619 DEBUG program: Return code: 0 >03:51:54,688 INFO program: Running... mount -t bind -o bind,defaults /run /mnt/sysimage/run >03:51:54,710 DEBUG program: Return code: 0 >03:51:54,717 INFO program: Running... mount -t sysfs -o defaults sysfs /mnt/sysimage/sys >03:51:54,739 DEBUG program: Return code: 0 >03:51:54,744 INFO program: Running... mount -t selinuxfs -o defaults selinuxfs /mnt/sysimage/sys/fs/selinux >03:51:54,765 DEBUG program: Return code: 0 > > >/tmp/storage.log: >03:47:07,492 INFO blivet: ISCSID is /sbin/iscsid >03:47:07,493 INFO blivet: no initiator set >03:47:07,767 INFO blivet: No FCoE EDD info found: No FCoE boot disk information is found in EDD! 
>03:47:07,767 INFO blivet: no /etc/zfcp.conf; not configuring zfcp >03:47:11,504 DEBUG blivet: trying to set new default fstype to 'ext4' >03:47:11,542 DEBUG blivet: Ext4FS.supported: supported: True ; >03:47:11,543 DEBUG blivet: getFormat('ext4') returning Ext4FS instance >03:47:11,545 DEBUG blivet: Ext4FS.supported: supported: True ; >03:47:12,394 INFO blivet: Detected 2016M of memory >03:47:12,395 INFO blivet: Swap attempt of 4032M >03:47:12,836 INFO blivet: resetting Blivet (version 0.13) instance <blivet.Blivet object at 0x7fae1bf4f590> >03:47:12,837 INFO blivet: no initiator set >03:47:12,837 INFO blivet: not going to create backup copy of non-existent /etc/mdadm.conf >03:47:12,838 INFO blivet: DeviceTree.populate: ignoredDisks is [] ; exclusiveDisks is [] >03:47:12,893 DEBUG blivet: protected device spec LABEL=Fedorax2019-Beta-TC4x20x86_64 resolved to None >03:47:12,933 INFO blivet: devices to scan: ['sr0', 'sda', 'sda1', 'sda2', 'sdd', 'sdd1', 'sdd2', 'sdc', 'sdc1', 'sdc2', 'sdb', 'sdb1', 'sdb2', 'loop0', 'loop1', 'loop2', 'loop3', 'loop4', 'loop5', 'loop6', 'loop7', 'dm-0', 'md127'] >03:47:12,940 DEBUG blivet: DeviceTree.addUdevDevice: info: {'DEVLINKS': '/dev/cdrom /dev/disk/by-id/ata-QEMU_DVD-ROM_QM00003 /dev/disk/by-label/Fedora\\x2019-Beta-TC4\\x20x86_64 /dev/disk/by-uuid/2013-05-10-11-54-01-00', > 'DEVNAME': 'sr0', > 'DEVPATH': '/devices/pci0000:00/0000:00:01.1/ata2/host1/target1:0:0/1:0:0:0/block/sr0', > 'DEVTYPE': 'disk', > 'ID_ATA': '1', > 'ID_BUS': 'ata', > 'ID_CDROM': '1', > 'ID_CDROM_DVD': '1', > 'ID_CDROM_MEDIA': '1', > 'ID_CDROM_MEDIA_DVD': '1', > 'ID_CDROM_MEDIA_TRACK_COUNT_DATA': '1', > 'ID_FS_APPLICATION_ID': 'GENISOIMAGE\\x20ISO\\x209660\\x2fHFS\\x20FILESYSTEM\\x20CREATOR\\x20\\x28C\\x29\\x201993\\x20E.YOUNGDALE\\x20\\x28C\\x29\\x201997-2006\\x20J.PEARSON\\x2fJ.SCHILLING\\x20\\x28C\\x29\\x202006-2007\\x20CDRKIT\\x20TEAM', > 'ID_FS_BOOT_SYSTEM_ID': 'EL\\x20TORITO\\x20SPECIFICATION', > 'ID_FS_LABEL': 'Fedora_19-Beta-TC4_x86_64', > 
'ID_FS_LABEL_ENC': 'Fedora\\x2019-Beta-TC4\\x20x86_64', > 'ID_FS_SYSTEM_ID': 'LINUX', > 'ID_FS_TYPE': 'iso9660', > 'ID_FS_USAGE': 'filesystem', > 'ID_FS_UUID': '2013-05-10-11-54-01-00', > 'ID_FS_UUID_ENC': '2013-05-10-11-54-01-00', > 'ID_FS_VERSION': 'Joliet Extension', > 'ID_MODEL': 'QEMU_DVD-ROM', > 'ID_MODEL_ENC': 'QEMU\\x20DVD-ROM\\x20\\x20\\x20\\x20\\x20\\x20\\x20\\x20\\x20\\x20\\x20\\x20\\x20\\x20\\x20\\x20\\x20\\x20\\x20\\x20\\x20\\x20\\x20\\x20\\x20\\x20\\x20\\x20', > 'ID_PART_TABLE_TYPE': 'dos', > 'ID_REVISION': '1.0.1', > 'ID_SERIAL': 'QEMU_DVD-ROM_QM00003', > 'ID_SERIAL_SHORT': 'QM00003', > 'ID_TYPE': 'cd', > 'MAJOR': '11', > 'MINOR': '0', > 'MPATH_SBIN_PATH': '/sbin', > 'SUBSYSTEM': 'block', > 'TAGS': ':seat:systemd:uaccess:', > 'USEC_INITIALIZED': '45443', > 'name': 'sr0', > 'symlinks': ['/dev/cdrom', > '/dev/disk/by-id/ata-QEMU_DVD-ROM_QM00003', > '/dev/disk/by-label/Fedora\\x2019-Beta-TC4\\x20x86_64', > '/dev/disk/by-uuid/2013-05-10-11-54-01-00'], > 'sysfs_path': '/devices/pci0000:00/0000:00:01.1/ata2/host1/target1:0:0/1:0:0:0/block/sr0'} ; name: sr0 ; >03:47:12,941 INFO blivet: scanning sr0 (/devices/pci0000:00/0000:00:01.1/ata2/host1/target1:0:0/1:0:0:0/block/sr0)... 
>03:47:12,943 DEBUG blivet: DeviceTree.getDeviceByName: name: sr0 ; >03:47:12,944 DEBUG blivet: DeviceTree.getDeviceByName returned None >03:47:12,945 INFO blivet: sr0 is a cdrom >03:47:12,946 DEBUG blivet: DeviceTree.addUdevOpticalDevice: >03:47:12,947 DEBUG blivet: getFormat('None') returning DeviceFormat instance >03:47:12,949 DEBUG blivet: OpticalDevice._setFormat: sr0 ; current: None ; type: None ; >03:47:12,950 INFO blivet: added cdrom sr0 (id 0) to device tree >03:47:12,951 DEBUG blivet: OpticalDevice.mediaPresent: sr0 ; status: True ; >03:47:12,955 DEBUG blivet: DeviceTree.handleUdevDeviceFormat: name: sr0 ; >03:47:12,955 INFO blivet: type detected on 'sr0' is 'iso9660' >03:47:12,957 DEBUG blivet: Iso9660FS.supported: supported: True ; >03:47:12,958 DEBUG blivet: getFormat('iso9660') returning Iso9660FS instance >03:47:12,959 DEBUG blivet: OpticalDevice._setFormat: sr0 ; current: None ; type: iso9660 ; >03:47:12,961 DEBUG blivet: OpticalDevice.mediaPresent: sr0 ; status: True ; >03:47:12,963 DEBUG blivet: looking up parted Device: /dev/sr0 >03:47:12,970 INFO blivet: got device: OpticalDevice instance (0x7fae1279ad10) -- > name = sr0 status = True kids = 0 id = 0 > parents = [] > uuid = None size = 4585.0 > format = existing iso9660 filesystem > major = 11 minor = 0 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:01.1/ata2/host1/target1:0:0/1:0:0:0/block/sr0 partedDevice = parted.Device instance -- > model: QEMU QEMU DVD-ROM path: /dev/sr0 type: 1 > sectorSize: 2048 physicalSectorSize: 2048 > length: 2347520 openCount: 0 readOnly: True > externalMode: False dirty: False bootDirty: False > host: 2 did: 0 busy: True > hardwareGeometry: (146, 255, 63) biosGeometry: (146, 255, 63) > PedDevice: <_ped.Device object at 0x7fae127ea9e0> > target size = 0 path = /dev/sr0 > format args = [] originalFormat = None >03:47:12,972 DEBUG blivet: Iso9660FS.supported: supported: True ; >03:47:12,973 INFO blivet: got format: Iso9660FS instance 
(0x7fae27c412d0) -- > type = iso9660 name = iso9660 status = False > device = /dev/sr0 uuid = 2013-05-10-11-54-01-00 exists = True > options = ro supported = True formattable = False resizable = False > mountpoint = None mountopts = None > label = Fedora_19-Beta-TC4_x86_64 size = 0.0 targetSize = 0.0 > >03:47:12,977 DEBUG blivet: DeviceTree.addUdevDevice: info: {'DEVLINKS': '/dev/disk/by-id/scsi-0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-0 /dev/disk/by-path/pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:0', > 'DEVNAME': 'sda', > 'DEVPATH': '/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda', > 'DEVTYPE': 'disk', > 'ID_BUS': 'scsi', > 'ID_MODEL': 'QEMU_HARDDISK', > 'ID_MODEL_ENC': 'QEMU\\x20HARDDISK\\x20\\x20\\x20', > 'ID_PART_TABLE_TYPE': 'dos', > 'ID_PATH': 'pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:0', > 'ID_PATH_TAG': 'pci-0000_00_06_0-virtio-pci-virtio2-scsi-0_0_0_0', > 'ID_REVISION': '1.0.', > 'ID_SCSI': '1', > 'ID_SERIAL': '0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-0', > 'ID_SERIAL_SHORT': 'drive-scsi0-0-0-0', > 'ID_TYPE': 'disk', > 'ID_VENDOR': 'QEMU', > 'ID_VENDOR_ENC': 'QEMU\\x20\\x20\\x20\\x20', > 'MAJOR': '8', > 'MINOR': '0', > 'MPATH_SBIN_PATH': '/sbin', > 'SUBSYSTEM': 'block', > 'TAGS': ':systemd:', > 'USEC_INITIALIZED': '47040', > 'name': 'sda', > 'symlinks': ['/dev/disk/by-id/scsi-0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-0', > '/dev/disk/by-path/pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:0'], > 'sysfs_path': '/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda'} ; name: sda ; >03:47:12,979 INFO blivet: scanning sda (/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda)... 
>03:47:12,980 DEBUG blivet: DeviceTree.getDeviceByName: name: sda ; >03:47:12,981 DEBUG blivet: DeviceTree.getDeviceByName returned None >03:47:12,983 DEBUG blivet: DeviceTree.addUdevDiskDevice: name: sda ; >03:47:12,985 INFO blivet: sda is a disk >03:47:12,985 DEBUG blivet: getFormat('None') returning DeviceFormat instance >03:47:12,987 DEBUG blivet: DiskDevice._setFormat: sda ; current: None ; type: None ; >03:47:13,004 INFO blivet: added disk sda (id 1) to device tree >03:47:13,005 DEBUG blivet: looking up parted Device: /dev/sda >03:47:13,012 DEBUG blivet: DeviceTree.handleUdevDeviceFormat: name: sda ; >03:47:13,015 DEBUG blivet: DeviceTree.handleUdevDiskLabelFormat: device: sda ; label_type: dos ; >03:47:13,016 DEBUG blivet: DiskDevice.setup: sda ; status: True ; controllable: True ; orig: False ; >03:47:13,017 DEBUG blivet: required disklabel type for sda (1) is None >03:47:13,017 DEBUG blivet: default disklabel type for sda is msdos >03:47:13,018 DEBUG blivet: selecting msdos disklabel for sda based on size >03:47:13,019 DEBUG blivet: DiskLabel.__init__: device: /dev/sda ; labelType: msdos ; exists: True ; >03:47:13,041 DEBUG blivet: Did not change pmbr_boot on parted.Disk instance -- > type: msdos primaryPartitionCount: 2 > lastPartitionNumber: 2 maxPrimaryPartitionCount: 4 > partitions: [<parted.partition.Partition object at 0x7fae27c4d790>, <parted.partition.Partition object at 0x7fae27c4db50>] > device: <parted.device.Device object at 0x7fae27c4d5d0> > PedDisk: <_ped.Disk object at 0x7fae27c44098> >03:47:13,042 DEBUG blivet: getFormat('disklabel') returning DiskLabel instance >03:47:13,044 DEBUG blivet: DiskDevice._setFormat: sda ; current: None ; type: disklabel ; >03:47:13,046 INFO blivet: got device: DiskDevice instance (0x7fae27c41550) -- > name = sda status = True kids = 0 id = 1 > parents = [] > uuid = None size = 12000.0 > format = existing msdos disklabel > major = 8 minor = 0 exists = True protected = False > sysfs path = 
/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sda type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 0 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae127eacb0> > target size = 0 path = /dev/sda > format args = [] originalFormat = None removable = False partedDevice = <parted.device.Device object at 0x7fae1279ad50> >03:47:13,048 INFO blivet: got format: DiskLabel instance (0x7fae27c413d0) -- > type = disklabel name = partition table (MSDOS) status = False > device = /dev/sda uuid = None exists = True > options = None supported = False formattable = True resizable = False > type = msdos partition count = 2 sectorSize = 512 > align_offset = 0 align_grain = 2048 > partedDisk = parted.Disk instance -- > type: msdos primaryPartitionCount: 2 > lastPartitionNumber: 2 maxPrimaryPartitionCount: 4 > partitions: [<parted.partition.Partition object at 0x7fae27c4d790>, <parted.partition.Partition object at 0x7fae27c4db50>] > device: <parted.device.Device object at 0x7fae27c4d5d0> > PedDisk: <_ped.Disk object at 0x7fae27c44098> > origPartedDisk = <parted.disk.Disk object at 0x7fae27c4dd50> > partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sda type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 0 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae127eadd0> > >03:47:13,054 DEBUG blivet: DeviceTree.addUdevDevice: info: {'DEVLINKS': '/dev/disk/by-id/scsi-0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-0-part1 /dev/disk/by-path/pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:0-part1', > 
'DEVNAME': 'sda1', > 'DEVPATH': '/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda/sda1', > 'DEVTYPE': 'partition', > 'ID_BUS': 'scsi', > 'ID_FS_LABEL': 'dhcppc0:swap', > 'ID_FS_LABEL_ENC': 'dhcppc0:swap', > 'ID_FS_TYPE': 'linux_raid_member', > 'ID_FS_USAGE': 'raid', > 'ID_FS_UUID': 'c51936a3-0842-3708-8fcc-561849cc057b', > 'ID_FS_UUID_ENC': 'c51936a3-0842-3708-8fcc-561849cc057b', > 'ID_FS_UUID_SUB': '1a345fa1-00b8-b5d0-45a1-7a3875aad50b', > 'ID_FS_UUID_SUB_ENC': '1a345fa1-00b8-b5d0-45a1-7a3875aad50b', > 'ID_FS_VERSION': '1.2', > 'ID_MODEL': 'QEMU_HARDDISK', > 'ID_MODEL_ENC': 'QEMU\\x20HARDDISK\\x20\\x20\\x20', > 'ID_PART_ENTRY_DISK': '8:0', > 'ID_PART_ENTRY_NUMBER': '1', > 'ID_PART_ENTRY_OFFSET': '2048', > 'ID_PART_ENTRY_SCHEME': 'dos', > 'ID_PART_ENTRY_SIZE': '4179968', > 'ID_PART_ENTRY_TYPE': '0xfd', > 'ID_PART_TABLE_TYPE': 'dos', > 'ID_PATH': 'pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:0', > 'ID_PATH_TAG': 'pci-0000_00_06_0-virtio-pci-virtio2-scsi-0_0_0_0', > 'ID_REVISION': '1.0.', > 'ID_SCSI': '1', > 'ID_SERIAL': '0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-0', > 'ID_SERIAL_SHORT': 'drive-scsi0-0-0-0', > 'ID_TYPE': 'disk', > 'ID_VENDOR': 'QEMU', > 'ID_VENDOR_ENC': 'QEMU\\x20\\x20\\x20\\x20', > 'MAJOR': '8', > 'MINOR': '1', > 'MPATH_SBIN_PATH': '/sbin', > 'SUBSYSTEM': 'block', > 'TAGS': ':systemd:', > 'USEC_INITIALIZED': '47070', > 'name': 'sda1', > 'symlinks': ['/dev/disk/by-id/scsi-0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-0-part1', > '/dev/disk/by-path/pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:0-part1'], > 'sysfs_path': '/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda/sda1'} ; name: sda1 ; >03:47:13,056 INFO blivet: scanning sda1 (/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda/sda1)... 
>03:47:13,057 DEBUG blivet: DeviceTree.getDeviceByName: name: sda1 ; >03:47:13,058 DEBUG blivet: DeviceTree.getDeviceByName returned None >03:47:13,059 INFO blivet: sda1 is a partition >03:47:13,060 DEBUG blivet: DeviceTree.addUdevPartitionDevice: name: sda1 ; >03:47:13,062 DEBUG blivet: DeviceTree.getDeviceByName: name: sda ; >03:47:13,063 DEBUG blivet: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with existing msdos disklabel >03:47:13,065 DEBUG blivet: DiskDevice.addChild: kids: 0 ; name: sda ; >03:47:13,067 DEBUG blivet: PartitionDevice._setFormat: sda1 ; >03:47:13,067 DEBUG blivet: getFormat('None') returning DeviceFormat instance >03:47:13,069 DEBUG blivet: PartitionDevice._setFormat: sda1 ; current: None ; type: None ; >03:47:13,070 DEBUG blivet: looking up parted Partition: /dev/sda1 >03:47:13,071 DEBUG blivet: PartitionDevice.probe: sda1 ; exists: True ; >03:47:13,073 DEBUG blivet: PartitionDevice.getFlag: path: /dev/sda1 ; flag: 1 ; >03:47:13,075 DEBUG blivet: PartitionDevice.getFlag: path: /dev/sda1 ; flag: 10 ; >03:47:13,076 DEBUG blivet: PartitionDevice.getFlag: path: /dev/sda1 ; flag: 12 ; >03:47:13,077 INFO blivet: added partition sda1 (id 2) to device tree >03:47:13,079 DEBUG blivet: DeviceTree.handleUdevDeviceFormat: name: sda1 ; >03:47:13,080 DEBUG blivet: DeviceTree.handleUdevDiskLabelFormat: device: sda1 ; label_type: dos ; >03:47:13,081 DEBUG blivet: PartitionDevice.setup: sda1 ; status: True ; controllable: True ; orig: False ; >03:47:13,083 DEBUG blivet: DiskLabel.__init__: device: /dev/sda1 ; labelType: dos ; exists: True ; >03:47:13,101 WARN blivet: disklabel detected but not usable on sda1 >03:47:13,139 INFO blivet: type detected on 'sda1' is 'linux_raid_member' >03:47:13,141 DEBUG blivet: MDRaidMember.__init__: uuid: c51936a3-0842-3708-8fcc-561849cc057b ; exists: True ; label: dhcppc0:swap ; device: /dev/sda1 ; serial: drive-scsi0-0-0-0 ; mdUuid: c51936a3:08423708:8fcc5618:49cc057b ; biosraid: False ; >03:47:13,141 
DEBUG blivet: getFormat('linux_raid_member') returning MDRaidMember instance >03:47:13,143 DEBUG blivet: PartitionDevice._setFormat: sda1 ; >03:47:13,145 DEBUG blivet: PartitionDevice._setFormat: sda1 ; current: None ; type: mdmember ; >03:47:13,146 DEBUG blivet: DeviceTree.handleUdevMDMemberFormat: type: mdmember ; name: sda1 ; >03:47:13,148 DEBUG blivet: DeviceTree.getDeviceByUuid returned None >03:47:13,177 INFO blivet: using name dhcppc0:swap for md array containing member sda1 >03:47:13,178 DEBUG blivet: getFormat('None') returning DeviceFormat instance >03:47:13,180 DEBUG blivet: MDRaidArrayDevice._setFormat: dhcppc0:swap ; current: None ; type: None ; >03:47:13,182 DEBUG blivet: MDRaidArrayDevice.updateSysfsPath: dhcppc0:swap ; status: False ; >03:47:13,184 DEBUG blivet: MDRaidArrayDevice._addDevice: dhcppc0:swap ; device: sda1 ; status: True ; >03:47:13,185 DEBUG blivet: PartitionDevice.addChild: kids: 0 ; name: sda1 ; >03:47:13,187 DEBUG blivet: PartitionDevice.setup: sda1 ; status: True ; controllable: True ; orig: False ; >03:47:13,221 WARN blivet: failed to add member /dev/sda1 to md array /dev/md/dhcppc0:swap: mdadd failed for /dev/sda1: running mdadm --incremental --quiet /dev/sda1 failed >03:47:13,222 DEBUG blivet: looking up parted Device: /dev/md/dhcppc0:swap >03:47:13,333 INFO blivet: added mdarray dhcppc0:swap (id 3) to device tree >03:47:13,338 DEBUG blivet: looking up parted Device: /dev/sda1 >03:47:13,346 INFO blivet: got device: PartitionDevice instance (0x7fae27c4df10) -- > name = sda1 status = True kids = 1 id = 2 > parents = ['existing 12000MB disk sda (1) with existing msdos disklabel'] > uuid = None size = 2041.0 > format = existing mdmember > major = 8 minor = 1 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda/sda1 partedDevice = parted.Device instance -- > model: Unknown path: /dev/sda1 type: 0 > sectorSize: 512 physicalSectorSize: 512 > length: 4179968 
openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 13107 did: 13107 busy: False > hardwareGeometry: (2041, 64, 32) biosGeometry: (260, 255, 63) > PedDevice: <_ped.Device object at 0x7fae27c59050> > target size = 0 path = /dev/sda1 > format args = [] originalFormat = None grow = None max size = 0 bootable = None > part type = 0 primary = None > partedPartition = parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae27c4d690> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae27c4d910> PedPartition: <_ped.Partition object at 0x7fae27c40d70> > disk = existing 12000MB disk sda (1) with existing msdos disklabel > start = 2048 end = 4182015 length = 4179968 > flags = raid >03:47:13,349 INFO blivet: got format: MDRaidMember instance (0x7fae27c54250) -- > type = mdmember name = software RAID status = False > device = /dev/sda1 uuid = c51936a3-0842-3708-8fcc-561849cc057b exists = True > options = None supported = True formattable = True resizable = False > mdUUID = c51936a3:08423708:8fcc5618:49cc057b biosraid = False >03:47:13,365 DEBUG blivet: DeviceTree.addUdevDevice: info: {'DEVLINKS': '/dev/disk/by-id/scsi-0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-0-part2 /dev/disk/by-label/fedora_dhcppc0 /dev/disk/by-path/pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:0-part2 /dev/disk/by-uuid/852bfcd3-84c3-4cb0-92cc-787d2f56d51c', > 'DEVNAME': 'sda2', > 'DEVPATH': '/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda/sda2', > 'DEVTYPE': 'partition', > 'ID_BTRFS_READY': '1', > 'ID_BUS': 'scsi', > 'ID_FS_LABEL': 'fedora_dhcppc0', > 'ID_FS_LABEL_ENC': 'fedora_dhcppc0', > 'ID_FS_TYPE': 'btrfs', > 'ID_FS_USAGE': 'filesystem', > 'ID_FS_UUID': '852bfcd3-84c3-4cb0-92cc-787d2f56d51c', > 'ID_FS_UUID_ENC': '852bfcd3-84c3-4cb0-92cc-787d2f56d51c', > 'ID_FS_UUID_SUB': 'c2bf2d26-7177-4e7b-b298-6e134a95e913', > 'ID_FS_UUID_SUB_ENC': 
'c2bf2d26-7177-4e7b-b298-6e134a95e913', > 'ID_MODEL': 'QEMU_HARDDISK', > 'ID_MODEL_ENC': 'QEMU\\x20HARDDISK\\x20\\x20\\x20', > 'ID_PART_ENTRY_DISK': '8:0', > 'ID_PART_ENTRY_NUMBER': '2', > 'ID_PART_ENTRY_OFFSET': '4182016', > 'ID_PART_ENTRY_SCHEME': 'dos', > 'ID_PART_ENTRY_SIZE': '20393984', > 'ID_PART_ENTRY_TYPE': '0x83', > 'ID_PART_TABLE_TYPE': 'dos', > 'ID_PATH': 'pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:0', > 'ID_PATH_TAG': 'pci-0000_00_06_0-virtio-pci-virtio2-scsi-0_0_0_0', > 'ID_REVISION': '1.0.', > 'ID_SCSI': '1', > 'ID_SERIAL': '0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-0', > 'ID_SERIAL_SHORT': 'drive-scsi0-0-0-0', > 'ID_TYPE': 'disk', > 'ID_VENDOR': 'QEMU', > 'ID_VENDOR_ENC': 'QEMU\\x20\\x20\\x20\\x20', > 'MAJOR': '8', > 'MINOR': '2', > 'MPATH_SBIN_PATH': '/sbin', > 'SUBSYSTEM': 'block', > 'TAGS': ':systemd:', > 'USEC_INITIALIZED': '47099', > 'name': 'sda2', > 'symlinks': ['/dev/disk/by-id/scsi-0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-0-part2', > '/dev/disk/by-label/fedora_dhcppc0', > '/dev/disk/by-path/pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:0-part2', > '/dev/disk/by-uuid/852bfcd3-84c3-4cb0-92cc-787d2f56d51c'], > 'sysfs_path': '/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda/sda2'} ; name: sda2 ; >03:47:13,367 INFO blivet: scanning sda2 (/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda/sda2)... 
>03:47:13,368 DEBUG blivet: DeviceTree.getDeviceByName: name: sda2 ; >03:47:13,370 DEBUG blivet: DeviceTree.getDeviceByName returned None >03:47:13,370 INFO blivet: sda2 is a partition >03:47:13,372 DEBUG blivet: DeviceTree.addUdevPartitionDevice: name: sda2 ; >03:47:13,373 DEBUG blivet: DeviceTree.getDeviceByName: name: sda ; >03:47:13,375 DEBUG blivet: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with existing msdos disklabel >03:47:13,376 DEBUG blivet: DiskDevice.addChild: kids: 1 ; name: sda ; >03:47:13,378 DEBUG blivet: PartitionDevice._setFormat: sda2 ; >03:47:13,378 DEBUG blivet: getFormat('None') returning DeviceFormat instance >03:47:13,380 DEBUG blivet: PartitionDevice._setFormat: sda2 ; current: None ; type: None ; >03:47:13,381 DEBUG blivet: looking up parted Partition: /dev/sda2 >03:47:13,382 DEBUG blivet: PartitionDevice.probe: sda2 ; exists: True ; >03:47:13,384 DEBUG blivet: PartitionDevice.getFlag: path: /dev/sda2 ; flag: 1 ; >03:47:13,385 DEBUG blivet: PartitionDevice.getFlag: path: /dev/sda2 ; flag: 10 ; >03:47:13,387 DEBUG blivet: PartitionDevice.getFlag: path: /dev/sda2 ; flag: 12 ; >03:47:13,387 INFO blivet: added partition sda2 (id 4) to device tree >03:47:13,389 DEBUG blivet: DeviceTree.handleUdevDeviceFormat: name: sda2 ; >03:47:13,390 DEBUG blivet: DeviceTree.handleUdevDiskLabelFormat: device: sda2 ; label_type: dos ; >03:47:13,392 DEBUG blivet: PartitionDevice.setup: sda2 ; status: True ; controllable: True ; orig: False ; >03:47:13,393 DEBUG blivet: DiskLabel.__init__: device: /dev/sda2 ; labelType: dos ; exists: True ; >03:47:13,404 WARN blivet: disklabel detected but not usable on sda2 >03:47:13,406 INFO blivet: type detected on 'sda2' is 'btrfs' >03:47:13,409 DEBUG blivet: BTRFS.supported: supported: True ; >03:47:13,410 DEBUG blivet: getFormat('btrfs') returning BTRFS instance >03:47:13,411 DEBUG blivet: PartitionDevice._setFormat: sda2 ; >03:47:13,413 DEBUG blivet: PartitionDevice._setFormat: sda2 ; current: 
None ; type: btrfs ; >03:47:13,415 DEBUG blivet: DeviceTree.handleBTRFSFormat: name: sda2 ; >03:47:13,415 INFO blivet: creating btrfs volume btrfs.fedora_dhcppc0 >03:47:13,417 DEBUG blivet: PartitionDevice.addChild: kids: 0 ; name: sda2 ; >03:47:13,417 DEBUG blivet: getFormat('None') returning DeviceFormat instance >03:47:13,420 DEBUG blivet: BTRFSVolumeDevice._setFormat: fedora_dhcppc0 ; current: None ; type: None ; >03:47:13,422 DEBUG blivet: BTRFS.supported: supported: True ; >03:47:13,422 DEBUG blivet: getFormat('btrfs') returning BTRFS instance >03:47:13,424 DEBUG blivet: BTRFSVolumeDevice._setFormat: btrfs.5 ; current: None ; type: btrfs ; >03:47:13,425 INFO blivet: added btrfs volume fedora_dhcppc0 (id 5) to device tree >03:47:13,426 DEBUG blivet: BTRFSVolumeDevice.setup: fedora_dhcppc0 ; status: True ; controllable: True ; orig: True ; >03:47:13,439 INFO blivet: failed to get default SELinux context for /tmp/btrfs-tmp.59SZ6dI: [Errno 2] No such file or directory >03:47:13,440 INFO blivet: set SELinux context for mountpoint /tmp/btrfs-tmp.59SZ6dI to None >03:47:13,491 INFO blivet: failed to get default SELinux context for /tmp/btrfs-tmp.59SZ6dI: [Errno 2] No such file or directory >03:47:13,491 INFO blivet: set SELinux context for newly mounted filesystem root at /tmp/btrfs-tmp.59SZ6dI to None >03:47:15,650 DEBUG blivet: BTRFS.supported: supported: True ; >03:47:15,651 DEBUG blivet: getFormat('btrfs') returning BTRFS instance >03:47:15,654 DEBUG blivet: BTRFSVolumeDevice.addChild: kids: 0 ; name: fedora_dhcppc0 ; >03:47:15,667 DEBUG blivet: BTRFSSubVolumeDevice._setFormat: boot ; current: None ; type: btrfs ; >03:47:15,668 INFO blivet: added btrfs subvolume boot (id 6) to device tree >03:47:15,673 DEBUG blivet: BTRFS.supported: supported: True ; >03:47:15,684 DEBUG blivet: getFormat('btrfs') returning BTRFS instance >03:47:15,693 DEBUG blivet: BTRFSVolumeDevice.addChild: kids: 1 ; name: fedora_dhcppc0 ; >03:47:15,696 DEBUG blivet: 
BTRFSSubVolumeDevice._setFormat: root ; current: None ; type: btrfs ; >03:47:15,696 INFO blivet: added btrfs subvolume root (id 7) to device tree >03:47:15,698 DEBUG blivet: looking up parted Device: /dev/sda2 >03:47:15,700 INFO blivet: got device: PartitionDevice instance (0x7fae27c4de50) -- > name = sda2 status = True kids = 1 id = 4 > parents = ['existing 12000MB disk sda (1) with existing msdos disklabel'] > uuid = None size = 9958.0 > format = existing btrfs filesystem > major = 8 minor = 2 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda/sda2 partedDevice = parted.Device instance -- > model: Unknown path: /dev/sda2 type: 0 > sectorSize: 512 physicalSectorSize: 512 > length: 20393984 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 13107 did: 13107 busy: False > hardwareGeometry: (9958, 64, 32) biosGeometry: (1269, 255, 63) > PedDevice: <_ped.Device object at 0x7fae27c59170> > target size = 0 path = /dev/sda2 > format args = [] originalFormat = None grow = None max size = 0 bootable = None > part type = 0 primary = None > partedPartition = parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae27c4d690> fileSystem: None > number: 2 path: /dev/sda2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae27c4dc10> PedPartition: <_ped.Partition object at 0x7fae27c40dd0> > disk = existing 12000MB disk sda (1) with existing msdos disklabel > start = 4182016 end = 24575999 length = 20393984 > flags = >03:47:15,728 DEBUG blivet: BTRFS.supported: supported: True ; >03:47:15,728 INFO blivet: got format: BTRFS instance (0x7fae27c54790) -- > type = btrfs name = btrfs status = False > device = /dev/sda2 uuid = c2bf2d26-7177-4e7b-b298-6e134a95e913 exists = True > options = defaults supported = True formattable = True resizable = False > mountpoint = None mountopts = None > label = fedora_dhcppc0 size 
= 0.0 targetSize = 0.0 > >03:47:15,733 DEBUG blivet: DeviceTree.addUdevDevice: info: {'DEVLINKS': '/dev/disk/by-id/scsi-0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-1 /dev/disk/by-path/pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:1', > 'DEVNAME': 'sdd', > 'DEVPATH': '/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd', > 'DEVTYPE': 'disk', > 'ID_BUS': 'scsi', > 'ID_MODEL': 'QEMU_HARDDISK', > 'ID_MODEL_ENC': 'QEMU\\x20HARDDISK\\x20\\x20\\x20', > 'ID_PART_TABLE_TYPE': 'dos', > 'ID_PATH': 'pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:1', > 'ID_PATH_TAG': 'pci-0000_00_06_0-virtio-pci-virtio2-scsi-0_0_0_1', > 'ID_REVISION': '1.0.', > 'ID_SCSI': '1', > 'ID_SERIAL': '0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-1', > 'ID_SERIAL_SHORT': 'drive-scsi0-0-0-1', > 'ID_TYPE': 'disk', > 'ID_VENDOR': 'QEMU', > 'ID_VENDOR_ENC': 'QEMU\\x20\\x20\\x20\\x20', > 'MAJOR': '8', > 'MINOR': '48', > 'MPATH_SBIN_PATH': '/sbin', > 'SUBSYSTEM': 'block', > 'TAGS': ':systemd:', > 'USEC_INITIALIZED': '47275', > 'name': 'sdd', > 'symlinks': ['/dev/disk/by-id/scsi-0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-1', > '/dev/disk/by-path/pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:1'], > 'sysfs_path': '/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd'} ; name: sdd ; >03:47:15,736 INFO blivet: scanning sdd (/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd)... 
>03:47:15,739 DEBUG blivet: DeviceTree.getDeviceByName: name: sdd ; >03:47:15,742 DEBUG blivet: DeviceTree.getDeviceByName returned None >03:47:15,745 DEBUG blivet: DeviceTree.addUdevDiskDevice: name: sdd ; >03:47:15,748 INFO blivet: sdd is a disk >03:47:15,754 DEBUG blivet: getFormat('None') returning DeviceFormat instance >03:47:15,762 DEBUG blivet: DiskDevice._setFormat: sdd ; current: None ; type: None ; >03:47:15,904 INFO blivet: added disk sdd (id 8) to device tree >03:47:15,904 DEBUG blivet: looking up parted Device: /dev/sdd >03:47:15,912 DEBUG blivet: DeviceTree.handleUdevDeviceFormat: name: sdd ; >03:47:15,923 DEBUG blivet: DeviceTree.handleUdevDiskLabelFormat: device: sdd ; label_type: dos ; >03:47:15,925 DEBUG blivet: DiskDevice.setup: sdd ; status: True ; controllable: True ; orig: False ; >03:47:15,925 DEBUG blivet: required disklabel type for sdd (1) is None >03:47:15,926 DEBUG blivet: default disklabel type for sdd is msdos >03:47:15,927 DEBUG blivet: selecting msdos disklabel for sdd based on size >03:47:15,928 DEBUG blivet: DiskLabel.__init__: device: /dev/sdd ; labelType: msdos ; exists: True ; >03:47:15,973 DEBUG blivet: Did not change pmbr_boot on parted.Disk instance -- > type: msdos primaryPartitionCount: 2 > lastPartitionNumber: 2 maxPrimaryPartitionCount: 4 > partitions: [<parted.partition.Partition object at 0x7fae0f75f290>, <parted.partition.Partition object at 0x7fae0f75f650>] > device: <parted.device.Device object at 0x7fae0f745f50> > PedDisk: <_ped.Disk object at 0x7fae27c5f368> >03:47:15,974 DEBUG blivet: getFormat('disklabel') returning DiskLabel instance >03:47:15,975 DEBUG blivet: DiskDevice._setFormat: sdd ; current: None ; type: disklabel ; >03:47:15,977 INFO blivet: got device: DiskDevice instance (0x7fae27c54bd0) -- > name = sdd status = True kids = 0 id = 8 > parents = [] > uuid = None size = 12000.0 > format = existing msdos disklabel > major = 8 minor = 48 exists = True protected = False > sysfs path = 
/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdd type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 256 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae27c59680> > target size = 0 path = /dev/sdd > format args = [] originalFormat = None removable = False partedDevice = <parted.device.Device object at 0x7fae27c54dd0> >03:47:15,983 INFO blivet: got format: DiskLabel instance (0x7fae0f745d50) -- > type = disklabel name = partition table (MSDOS) status = False > device = /dev/sdd uuid = None exists = True > options = None supported = False formattable = True resizable = False > type = msdos partition count = 2 sectorSize = 512 > align_offset = 0 align_grain = 2048 > partedDisk = parted.Disk instance -- > type: msdos primaryPartitionCount: 2 > lastPartitionNumber: 2 maxPrimaryPartitionCount: 4 > partitions: [<parted.partition.Partition object at 0x7fae0f75f290>, <parted.partition.Partition object at 0x7fae0f75f650>] > device: <parted.device.Device object at 0x7fae0f745f50> > PedDisk: <_ped.Disk object at 0x7fae27c5f368> > origPartedDisk = <parted.disk.Disk object at 0x7fae0f75f850> > partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdd type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 256 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae27c59b00> > >03:47:15,990 DEBUG blivet: DeviceTree.addUdevDevice: info: {'DEVLINKS': '/dev/disk/by-id/scsi-0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-1-part1 /dev/disk/by-path/pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:1-part1', > 
'DEVNAME': 'sdd1', > 'DEVPATH': '/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd/sdd1', > 'DEVTYPE': 'partition', > 'ID_BUS': 'scsi', > 'ID_FS_LABEL': 'dhcppc0:swap', > 'ID_FS_LABEL_ENC': 'dhcppc0:swap', > 'ID_FS_TYPE': 'linux_raid_member', > 'ID_FS_USAGE': 'raid', > 'ID_FS_UUID': 'c51936a3-0842-3708-8fcc-561849cc057b', > 'ID_FS_UUID_ENC': 'c51936a3-0842-3708-8fcc-561849cc057b', > 'ID_FS_UUID_SUB': 'b2071e0a-9ea4-b36c-9603-da25d8ba131c', > 'ID_FS_UUID_SUB_ENC': 'b2071e0a-9ea4-b36c-9603-da25d8ba131c', > 'ID_FS_VERSION': '1.2', > 'ID_MODEL': 'QEMU_HARDDISK', > 'ID_MODEL_ENC': 'QEMU\\x20HARDDISK\\x20\\x20\\x20', > 'ID_PART_ENTRY_DISK': '8:48', > 'ID_PART_ENTRY_NUMBER': '1', > 'ID_PART_ENTRY_OFFSET': '2048', > 'ID_PART_ENTRY_SCHEME': 'dos', > 'ID_PART_ENTRY_SIZE': '4179968', > 'ID_PART_ENTRY_TYPE': '0xfd', > 'ID_PART_TABLE_TYPE': 'dos', > 'ID_PATH': 'pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:1', > 'ID_PATH_TAG': 'pci-0000_00_06_0-virtio-pci-virtio2-scsi-0_0_0_1', > 'ID_REVISION': '1.0.', > 'ID_SCSI': '1', > 'ID_SERIAL': '0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-1', > 'ID_SERIAL_SHORT': 'drive-scsi0-0-0-1', > 'ID_TYPE': 'disk', > 'ID_VENDOR': 'QEMU', > 'ID_VENDOR_ENC': 'QEMU\\x20\\x20\\x20\\x20', > 'MAJOR': '8', > 'MINOR': '49', > 'MPATH_SBIN_PATH': '/sbin', > 'SUBSYSTEM': 'block', > 'TAGS': ':systemd:', > 'USEC_INITIALIZED': '47310', > 'name': 'sdd1', > 'symlinks': ['/dev/disk/by-id/scsi-0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-1-part1', > '/dev/disk/by-path/pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:1-part1'], > 'sysfs_path': '/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd/sdd1'} ; name: sdd1 ; >03:47:15,996 INFO blivet: scanning sdd1 (/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd/sdd1)... 
>03:47:15,997 DEBUG blivet: DeviceTree.getDeviceByName: name: sdd1 ; >03:47:15,999 DEBUG blivet: DeviceTree.getDeviceByName returned None >03:47:16,005 INFO blivet: sdd1 is a partition >03:47:16,007 DEBUG blivet: DeviceTree.addUdevPartitionDevice: name: sdd1 ; >03:47:16,008 DEBUG blivet: DeviceTree.getDeviceByName: name: sdd ; >03:47:16,010 DEBUG blivet: DeviceTree.getDeviceByName returned existing 12000MB disk sdd (8) with existing msdos disklabel >03:47:16,014 DEBUG blivet: DiskDevice.addChild: kids: 0 ; name: sdd ; >03:47:16,015 DEBUG blivet: PartitionDevice._setFormat: sdd1 ; >03:47:16,026 DEBUG blivet: getFormat('None') returning DeviceFormat instance >03:47:16,028 DEBUG blivet: PartitionDevice._setFormat: sdd1 ; current: None ; type: None ; >03:47:16,054 DEBUG blivet: looking up parted Partition: /dev/sdd1 >03:47:16,056 DEBUG blivet: PartitionDevice.probe: sdd1 ; exists: True ; >03:47:16,058 DEBUG blivet: PartitionDevice.getFlag: path: /dev/sdd1 ; flag: 1 ; >03:47:16,059 DEBUG blivet: PartitionDevice.getFlag: path: /dev/sdd1 ; flag: 10 ; >03:47:16,080 DEBUG blivet: PartitionDevice.getFlag: path: /dev/sdd1 ; flag: 12 ; >03:47:16,080 INFO blivet: added partition sdd1 (id 9) to device tree >03:47:16,082 DEBUG blivet: DeviceTree.handleUdevDeviceFormat: name: sdd1 ; >03:47:16,087 DEBUG blivet: DeviceTree.handleUdevDiskLabelFormat: device: sdd1 ; label_type: dos ; >03:47:16,088 DEBUG blivet: PartitionDevice.setup: sdd1 ; status: True ; controllable: True ; orig: False ; >03:47:16,090 DEBUG blivet: DiskLabel.__init__: device: /dev/sdd1 ; labelType: dos ; exists: True ; >03:47:16,122 WARN blivet: disklabel detected but not usable on sdd1 >03:47:16,311 INFO blivet: type detected on 'sdd1' is 'linux_raid_member' >03:47:16,313 DEBUG blivet: MDRaidMember.__init__: uuid: c51936a3-0842-3708-8fcc-561849cc057b ; exists: True ; label: dhcppc0:swap ; device: /dev/sdd1 ; serial: drive-scsi0-0-0-1 ; mdUuid: c51936a3:08423708:8fcc5618:49cc057b ; biosraid: False ; >03:47:16,314 
DEBUG blivet: getFormat('linux_raid_member') returning MDRaidMember instance >03:47:16,315 DEBUG blivet: PartitionDevice._setFormat: sdd1 ; >03:47:16,317 DEBUG blivet: PartitionDevice._setFormat: sdd1 ; current: None ; type: mdmember ; >03:47:16,325 DEBUG blivet: DeviceTree.handleUdevMDMemberFormat: type: mdmember ; name: sdd1 ; >03:47:16,327 DEBUG blivet: raw RAID 1 size == 2041.0 >03:47:16,327 INFO blivet: Using 1MB superBlockSize >03:47:16,328 DEBUG blivet: existing RAID 1 size == 2039.9375 >03:47:16,329 DEBUG blivet: DeviceTree.getDeviceByUuid returned existing 2039MB mdarray dhcppc0:swap (3) >03:47:16,332 DEBUG blivet: MDRaidArrayDevice._addDevice: dhcppc0:swap ; device: sdd1 ; status: True ; >03:47:16,333 DEBUG blivet: PartitionDevice.addChild: kids: 0 ; name: sdd1 ; >03:47:16,335 DEBUG blivet: PartitionDevice.setup: sdd1 ; status: True ; controllable: True ; orig: False ; >03:47:16,455 WARN blivet: failed to add member /dev/sdd1 to md array /dev/md/dhcppc0:swap: mdadd failed for /dev/sdd1: running mdadm --incremental --quiet /dev/sdd1 failed >03:47:16,457 DEBUG blivet: looking up parted Device: /dev/sdd1 >03:47:16,459 INFO blivet: got device: PartitionDevice instance (0x7fae0f7616d0) -- > name = sdd1 status = True kids = 1 id = 9 > parents = ['existing 12000MB disk sdd (8) with existing msdos disklabel'] > uuid = None size = 2041.0 > format = existing mdmember > major = 8 minor = 49 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd/sdd1 partedDevice = parted.Device instance -- > model: Unknown path: /dev/sdd1 type: 0 > sectorSize: 512 physicalSectorSize: 512 > length: 4179968 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 13107 did: 13107 busy: False > hardwareGeometry: (2041, 64, 32) biosGeometry: (260, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f674560> > target size = 0 path = /dev/sdd1 > format args = [] originalFormat = None grow 
= None max size = 0 bootable = None > part type = 0 primary = None > partedPartition = parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae0f752bd0> fileSystem: None > number: 1 path: /dev/sdd1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae0f75f410> PedPartition: <_ped.Partition object at 0x7fae0f741f50> > disk = existing 12000MB disk sdd (8) with existing msdos disklabel > start = 2048 end = 4182015 length = 4179968 > flags = raid >03:47:16,460 INFO blivet: got format: MDRaidMember instance (0x7fae0c01d6d0) -- > type = mdmember name = software RAID status = False > device = /dev/sdd1 uuid = c51936a3-0842-3708-8fcc-561849cc057b exists = True > options = None supported = True formattable = True resizable = False > mdUUID = c51936a3:08423708:8fcc5618:49cc057b biosraid = False >03:47:16,467 DEBUG blivet: DeviceTree.addUdevDevice: info: {'DEVLINKS': '/dev/disk/by-id/scsi-0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-1-part2 /dev/disk/by-label/fedora_dhcppc0 /dev/disk/by-path/pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:1-part2 /dev/disk/by-uuid/852bfcd3-84c3-4cb0-92cc-787d2f56d51c', > 'DEVNAME': 'sdd2', > 'DEVPATH': '/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd/sdd2', > 'DEVTYPE': 'partition', > 'ID_BTRFS_READY': '1', > 'ID_BUS': 'scsi', > 'ID_FS_LABEL': 'fedora_dhcppc0', > 'ID_FS_LABEL_ENC': 'fedora_dhcppc0', > 'ID_FS_TYPE': 'btrfs', > 'ID_FS_USAGE': 'filesystem', > 'ID_FS_UUID': '852bfcd3-84c3-4cb0-92cc-787d2f56d51c', > 'ID_FS_UUID_ENC': '852bfcd3-84c3-4cb0-92cc-787d2f56d51c', > 'ID_FS_UUID_SUB': '4f0bfd9e-856d-44e8-81d1-e1ee467c09c7', > 'ID_FS_UUID_SUB_ENC': '4f0bfd9e-856d-44e8-81d1-e1ee467c09c7', > 'ID_MODEL': 'QEMU_HARDDISK', > 'ID_MODEL_ENC': 'QEMU\\x20HARDDISK\\x20\\x20\\x20', > 'ID_PART_ENTRY_DISK': '8:48', > 'ID_PART_ENTRY_NUMBER': '2', > 'ID_PART_ENTRY_OFFSET': '4182016', > 'ID_PART_ENTRY_SCHEME': 'dos', > 'ID_PART_ENTRY_SIZE': '20393984', > 'ID_PART_ENTRY_TYPE': 
'0x83', > 'ID_PART_TABLE_TYPE': 'dos', > 'ID_PATH': 'pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:1', > 'ID_PATH_TAG': 'pci-0000_00_06_0-virtio-pci-virtio2-scsi-0_0_0_1', > 'ID_REVISION': '1.0.', > 'ID_SCSI': '1', > 'ID_SERIAL': '0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-1', > 'ID_SERIAL_SHORT': 'drive-scsi0-0-0-1', > 'ID_TYPE': 'disk', > 'ID_VENDOR': 'QEMU', > 'ID_VENDOR_ENC': 'QEMU\\x20\\x20\\x20\\x20', > 'MAJOR': '8', > 'MINOR': '50', > 'MPATH_SBIN_PATH': '/sbin', > 'SUBSYSTEM': 'block', > 'TAGS': ':systemd:', > 'USEC_INITIALIZED': '47345', > 'name': 'sdd2', > 'symlinks': ['/dev/disk/by-id/scsi-0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-1-part2', > '/dev/disk/by-label/fedora_dhcppc0', > '/dev/disk/by-path/pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:1-part2', > '/dev/disk/by-uuid/852bfcd3-84c3-4cb0-92cc-787d2f56d51c'], > 'sysfs_path': '/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd/sdd2'} ; name: sdd2 ; >03:47:16,474 INFO blivet: scanning sdd2 (/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd/sdd2)... 
>03:47:16,476 DEBUG blivet: DeviceTree.getDeviceByName: name: sdd2 ; >03:47:16,477 DEBUG blivet: DeviceTree.getDeviceByName returned None >03:47:16,482 INFO blivet: sdd2 is a partition >03:47:16,483 DEBUG blivet: DeviceTree.addUdevPartitionDevice: name: sdd2 ; >03:47:16,485 DEBUG blivet: DeviceTree.getDeviceByName: name: sdd ; >03:47:16,486 DEBUG blivet: DeviceTree.getDeviceByName returned existing 12000MB disk sdd (8) with existing msdos disklabel >03:47:16,492 DEBUG blivet: DiskDevice.addChild: kids: 1 ; name: sdd ; >03:47:16,494 DEBUG blivet: PartitionDevice._setFormat: sdd2 ; >03:47:16,495 DEBUG blivet: getFormat('None') returning DeviceFormat instance >03:47:16,496 DEBUG blivet: PartitionDevice._setFormat: sdd2 ; current: None ; type: None ; >03:47:16,501 DEBUG blivet: looking up parted Partition: /dev/sdd2 >03:47:16,502 DEBUG blivet: PartitionDevice.probe: sdd2 ; exists: True ; >03:47:16,504 DEBUG blivet: PartitionDevice.getFlag: path: /dev/sdd2 ; flag: 1 ; >03:47:16,505 DEBUG blivet: PartitionDevice.getFlag: path: /dev/sdd2 ; flag: 10 ; >03:47:16,509 DEBUG blivet: PartitionDevice.getFlag: path: /dev/sdd2 ; flag: 12 ; >03:47:16,509 INFO blivet: added partition sdd2 (id 10) to device tree >03:47:16,512 DEBUG blivet: DeviceTree.handleUdevDeviceFormat: name: sdd2 ; >03:47:16,518 DEBUG blivet: DeviceTree.handleUdevDiskLabelFormat: device: sdd2 ; label_type: dos ; >03:47:16,519 DEBUG blivet: PartitionDevice.setup: sdd2 ; status: True ; controllable: True ; orig: False ; >03:47:16,521 DEBUG blivet: DiskLabel.__init__: device: /dev/sdd2 ; labelType: dos ; exists: True ; >03:47:16,554 WARN blivet: disklabel detected but not usable on sdd2 >03:47:16,555 INFO blivet: type detected on 'sdd2' is 'btrfs' >03:47:16,558 DEBUG blivet: BTRFS.supported: supported: True ; >03:47:16,564 DEBUG blivet: getFormat('btrfs') returning BTRFS instance >03:47:16,565 DEBUG blivet: PartitionDevice._setFormat: sdd2 ; >03:47:16,568 DEBUG blivet: PartitionDevice._setFormat: sdd2 ; current: 
None ; type: btrfs ; >03:47:16,571 DEBUG blivet: DeviceTree.handleBTRFSFormat: name: sdd2 ; >03:47:16,572 INFO blivet: found btrfs volume fedora_dhcppc0 >03:47:16,573 DEBUG blivet: BTRFSVolumeDevice._addDevice: fedora_dhcppc0 ; device: sdd2 ; status: True ; >03:47:16,574 DEBUG blivet: PartitionDevice.addChild: kids: 0 ; name: sdd2 ; >03:47:16,575 DEBUG blivet: looking up parted Device: /dev/sdd2 >03:47:16,577 INFO blivet: got device: PartitionDevice instance (0x7fae0f68bd50) -- > name = sdd2 status = True kids = 1 id = 10 > parents = ['existing 12000MB disk sdd (8) with existing msdos disklabel'] > uuid = None size = 9958.0 > format = existing btrfs filesystem > major = 8 minor = 50 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd/sdd2 partedDevice = parted.Device instance -- > model: Unknown path: /dev/sdd2 type: 0 > sectorSize: 512 physicalSectorSize: 512 > length: 20393984 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 13107 did: 13107 busy: False > hardwareGeometry: (9958, 64, 32) biosGeometry: (1269, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f674680> > target size = 0 path = /dev/sdd2 > format args = [] originalFormat = None grow = None max size = 0 bootable = None > part type = 0 primary = None > partedPartition = parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae0f752bd0> fileSystem: None > number: 2 path: /dev/sdd2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae0f75f710> PedPartition: <_ped.Partition object at 0x7fae0f741170> > disk = existing 12000MB disk sdd (8) with existing msdos disklabel > start = 4182016 end = 24575999 length = 20393984 > flags = >03:47:16,579 DEBUG blivet: BTRFS.supported: supported: True ; >03:47:16,580 INFO blivet: got format: BTRFS instance (0x7fae0f6a4410) -- > type = btrfs name = btrfs status = False > device = /dev/sdd2 uuid = 
4f0bfd9e-856d-44e8-81d1-e1ee467c09c7 exists = True > options = defaults supported = True formattable = True resizable = False > mountpoint = None mountopts = None > label = fedora_dhcppc0 size = 0.0 targetSize = 0.0 > >03:47:16,584 DEBUG blivet: DeviceTree.addUdevDevice: info: {'DEVLINKS': '/dev/disk/by-id/scsi-0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-2 /dev/disk/by-path/pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:2', > 'DEVNAME': 'sdc', > 'DEVPATH': '/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc', > 'DEVTYPE': 'disk', > 'ID_BUS': 'scsi', > 'ID_MODEL': 'QEMU_HARDDISK', > 'ID_MODEL_ENC': 'QEMU\\x20HARDDISK\\x20\\x20\\x20', > 'ID_PART_TABLE_TYPE': 'dos', > 'ID_PATH': 'pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:2', > 'ID_PATH_TAG': 'pci-0000_00_06_0-virtio-pci-virtio2-scsi-0_0_0_2', > 'ID_REVISION': '1.0.', > 'ID_SCSI': '1', > 'ID_SERIAL': '0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-2', > 'ID_SERIAL_SHORT': 'drive-scsi0-0-0-2', > 'ID_TYPE': 'disk', > 'ID_VENDOR': 'QEMU', > 'ID_VENDOR_ENC': 'QEMU\\x20\\x20\\x20\\x20', > 'MAJOR': '8', > 'MINOR': '32', > 'MPATH_SBIN_PATH': '/sbin', > 'SUBSYSTEM': 'block', > 'TAGS': ':systemd:', > 'USEC_INITIALIZED': '47531', > 'name': 'sdc', > 'symlinks': ['/dev/disk/by-id/scsi-0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-2', > '/dev/disk/by-path/pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:2'], > 'sysfs_path': '/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc'} ; name: sdc ; >03:47:16,585 INFO blivet: scanning sdc (/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc)... 
>03:47:16,587 DEBUG blivet: DeviceTree.getDeviceByName: name: sdc ; >03:47:16,588 DEBUG blivet: DeviceTree.getDeviceByName returned None >03:47:16,590 DEBUG blivet: DeviceTree.addUdevDiskDevice: name: sdc ; >03:47:16,591 INFO blivet: sdc is a disk >03:47:16,591 DEBUG blivet: getFormat('None') returning DeviceFormat instance >03:47:16,601 DEBUG blivet: DiskDevice._setFormat: sdc ; current: None ; type: None ; >03:47:16,636 INFO blivet: added disk sdc (id 11) to device tree >03:47:16,637 DEBUG blivet: looking up parted Device: /dev/sdc >03:47:16,642 DEBUG blivet: DeviceTree.handleUdevDeviceFormat: name: sdc ; >03:47:16,643 DEBUG blivet: DeviceTree.handleUdevDiskLabelFormat: device: sdc ; label_type: dos ; >03:47:16,644 DEBUG blivet: DiskDevice.setup: sdc ; status: True ; controllable: True ; orig: False ; >03:47:16,645 DEBUG blivet: required disklabel type for sdc (1) is None >03:47:16,645 DEBUG blivet: default disklabel type for sdc is msdos >03:47:16,646 DEBUG blivet: selecting msdos disklabel for sdc based on size >03:47:16,647 DEBUG blivet: DiskLabel.__init__: device: /dev/sdc ; labelType: msdos ; exists: True ; >03:47:16,661 DEBUG blivet: Did not change pmbr_boot on parted.Disk instance -- > type: msdos primaryPartitionCount: 2 > lastPartitionNumber: 2 maxPrimaryPartitionCount: 4 > partitions: [<parted.partition.Partition object at 0x7fae0f6a4750>, <parted.partition.Partition object at 0x7fae0f6a48d0>] > device: <parted.device.Device object at 0x7fae0f6a4110> > PedDisk: <_ped.Disk object at 0x7fae0f686128> >03:47:16,662 DEBUG blivet: getFormat('disklabel') returning DiskLabel instance >03:47:16,663 DEBUG blivet: DiskDevice._setFormat: sdc ; current: None ; type: disklabel ; >03:47:16,669 INFO blivet: got device: DiskDevice instance (0x7fae0f7615d0) -- > name = sdc status = True kids = 0 id = 11 > parents = [] > uuid = None size = 12000.0 > format = existing msdos disklabel > major = 8 minor = 32 exists = True protected = False > sysfs path = 
/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdc type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 512 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f674170> > target size = 0 path = /dev/sdc > format args = [] originalFormat = None removable = False partedDevice = <parted.device.Device object at 0x7fae0f68bed0> >03:47:16,674 INFO blivet: got format: DiskLabel instance (0x7fae0f6a4590) -- > type = disklabel name = partition table (MSDOS) status = False > device = /dev/sdc uuid = None exists = True > options = None supported = False formattable = True resizable = False > type = msdos partition count = 2 sectorSize = 512 > align_offset = 0 align_grain = 2048 > partedDisk = parted.Disk instance -- > type: msdos primaryPartitionCount: 2 > lastPartitionNumber: 2 maxPrimaryPartitionCount: 4 > partitions: [<parted.partition.Partition object at 0x7fae0f6a4750>, <parted.partition.Partition object at 0x7fae0f6a48d0>] > device: <parted.device.Device object at 0x7fae0f6a4110> > PedDisk: <_ped.Disk object at 0x7fae0f686128> > origPartedDisk = <parted.disk.Disk object at 0x7fae0f6a4ad0> > partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdc type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 512 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f6747a0> > >03:47:16,695 DEBUG blivet: DeviceTree.addUdevDevice: info: {'DEVLINKS': '/dev/disk/by-id/scsi-0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-2-part1 /dev/disk/by-path/pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:2-part1', > 
'DEVNAME': 'sdc1', > 'DEVPATH': '/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc/sdc1', > 'DEVTYPE': 'partition', > 'ID_BUS': 'scsi', > 'ID_FS_LABEL': 'dhcppc0:swap', > 'ID_FS_LABEL_ENC': 'dhcppc0:swap', > 'ID_FS_TYPE': 'linux_raid_member', > 'ID_FS_USAGE': 'raid', > 'ID_FS_UUID': 'c51936a3-0842-3708-8fcc-561849cc057b', > 'ID_FS_UUID_ENC': 'c51936a3-0842-3708-8fcc-561849cc057b', > 'ID_FS_UUID_SUB': '9e04ee80-9506-6f3c-2aca-750dff574837', > 'ID_FS_UUID_SUB_ENC': '9e04ee80-9506-6f3c-2aca-750dff574837', > 'ID_FS_VERSION': '1.2', > 'ID_MODEL': 'QEMU_HARDDISK', > 'ID_MODEL_ENC': 'QEMU\\x20HARDDISK\\x20\\x20\\x20', > 'ID_PART_ENTRY_DISK': '8:32', > 'ID_PART_ENTRY_NUMBER': '1', > 'ID_PART_ENTRY_OFFSET': '2048', > 'ID_PART_ENTRY_SCHEME': 'dos', > 'ID_PART_ENTRY_SIZE': '4179968', > 'ID_PART_ENTRY_TYPE': '0xfd', > 'ID_PART_TABLE_TYPE': 'dos', > 'ID_PATH': 'pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:2', > 'ID_PATH_TAG': 'pci-0000_00_06_0-virtio-pci-virtio2-scsi-0_0_0_2', > 'ID_REVISION': '1.0.', > 'ID_SCSI': '1', > 'ID_SERIAL': '0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-2', > 'ID_SERIAL_SHORT': 'drive-scsi0-0-0-2', > 'ID_TYPE': 'disk', > 'ID_VENDOR': 'QEMU', > 'ID_VENDOR_ENC': 'QEMU\\x20\\x20\\x20\\x20', > 'MAJOR': '8', > 'MINOR': '33', > 'MPATH_SBIN_PATH': '/sbin', > 'SUBSYSTEM': 'block', > 'TAGS': ':systemd:', > 'USEC_INITIALIZED': '47563', > 'name': 'sdc1', > 'symlinks': ['/dev/disk/by-id/scsi-0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-2-part1', > '/dev/disk/by-path/pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:2-part1'], > 'sysfs_path': '/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc/sdc1'} ; name: sdc1 ; >03:47:16,697 INFO blivet: scanning sdc1 (/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc/sdc1)... 
>03:47:16,699 DEBUG blivet: DeviceTree.getDeviceByName: name: sdc1 ; >03:47:16,700 DEBUG blivet: DeviceTree.getDeviceByName returned None >03:47:16,701 INFO blivet: sdc1 is a partition >03:47:16,702 DEBUG blivet: DeviceTree.addUdevPartitionDevice: name: sdc1 ; >03:47:16,703 DEBUG blivet: DeviceTree.getDeviceByName: name: sdc ; >03:47:16,705 DEBUG blivet: DeviceTree.getDeviceByName returned existing 12000MB disk sdc (11) with existing msdos disklabel >03:47:16,706 DEBUG blivet: DiskDevice.addChild: kids: 0 ; name: sdc ; >03:47:16,708 DEBUG blivet: PartitionDevice._setFormat: sdc1 ; >03:47:16,708 DEBUG blivet: getFormat('None') returning DeviceFormat instance >03:47:16,710 DEBUG blivet: PartitionDevice._setFormat: sdc1 ; current: None ; type: None ; >03:47:16,710 DEBUG blivet: looking up parted Partition: /dev/sdc1 >03:47:16,712 DEBUG blivet: PartitionDevice.probe: sdc1 ; exists: True ; >03:47:16,713 DEBUG blivet: PartitionDevice.getFlag: path: /dev/sdc1 ; flag: 1 ; >03:47:16,714 DEBUG blivet: PartitionDevice.getFlag: path: /dev/sdc1 ; flag: 10 ; >03:47:16,716 DEBUG blivet: PartitionDevice.getFlag: path: /dev/sdc1 ; flag: 12 ; >03:47:16,716 INFO blivet: added partition sdc1 (id 12) to device tree >03:47:16,718 DEBUG blivet: DeviceTree.handleUdevDeviceFormat: name: sdc1 ; >03:47:16,719 DEBUG blivet: DeviceTree.handleUdevDiskLabelFormat: device: sdc1 ; label_type: dos ; >03:47:16,720 DEBUG blivet: PartitionDevice.setup: sdc1 ; status: True ; controllable: True ; orig: False ; >03:47:16,722 DEBUG blivet: DiskLabel.__init__: device: /dev/sdc1 ; labelType: dos ; exists: True ; >03:47:16,731 WARN blivet: disklabel detected but not usable on sdc1 >03:47:16,779 INFO blivet: type detected on 'sdc1' is 'linux_raid_member' >03:47:16,781 DEBUG blivet: MDRaidMember.__init__: uuid: c51936a3-0842-3708-8fcc-561849cc057b ; exists: True ; label: dhcppc0:swap ; device: /dev/sdc1 ; serial: drive-scsi0-0-0-2 ; mdUuid: c51936a3:08423708:8fcc5618:49cc057b ; biosraid: False ; >03:47:16,782 
DEBUG blivet: getFormat('linux_raid_member') returning MDRaidMember instance >03:47:16,783 DEBUG blivet: PartitionDevice._setFormat: sdc1 ; >03:47:16,794 DEBUG blivet: PartitionDevice._setFormat: sdc1 ; current: None ; type: mdmember ; >03:47:16,799 DEBUG blivet: DeviceTree.handleUdevMDMemberFormat: type: mdmember ; name: sdc1 ; >03:47:16,805 DEBUG blivet: raw RAID 1 size == 2041.0 >03:47:16,806 INFO blivet: Using 1MB superBlockSize >03:47:16,806 DEBUG blivet: existing RAID 1 size == 2039.9375 >03:47:16,807 DEBUG blivet: DeviceTree.getDeviceByUuid returned existing 2039MB mdarray dhcppc0:swap (3) >03:47:16,808 DEBUG blivet: MDRaidArrayDevice._addDevice: dhcppc0:swap ; device: sdc1 ; status: True ; >03:47:16,810 DEBUG blivet: PartitionDevice.addChild: kids: 0 ; name: sdc1 ; >03:47:16,812 DEBUG blivet: PartitionDevice.setup: sdc1 ; status: True ; controllable: True ; orig: False ; >03:47:16,907 WARN blivet: failed to add member /dev/sdc1 to md array /dev/md/dhcppc0:swap: mdadd failed for /dev/sdc1: running mdadm --incremental --quiet /dev/sdc1 failed >03:47:16,909 DEBUG blivet: looking up parted Device: /dev/sdc1 >03:47:16,910 INFO blivet: got device: PartitionDevice instance (0x7fae0f68be50) -- > name = sdc1 status = True kids = 1 id = 12 > parents = ['existing 12000MB disk sdc (11) with existing msdos disklabel'] > uuid = None size = 2041.0 > format = existing mdmember > major = 8 minor = 33 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc/sdc1 partedDevice = parted.Device instance -- > model: Unknown path: /dev/sdc1 type: 0 > sectorSize: 512 physicalSectorSize: 512 > length: 4179968 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 13107 did: 13107 busy: False > hardwareGeometry: (2041, 64, 32) biosGeometry: (260, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f6749e0> > target size = 0 path = /dev/sdc1 > format args = [] originalFormat = None 
grow = None max size = 0 bootable = None > part type = 0 primary = None > partedPartition = parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae0f6a4690> fileSystem: None > number: 1 path: /dev/sdc1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae0f6a4810> PedPartition: <_ped.Partition object at 0x7fae1285fef0> > disk = existing 12000MB disk sdc (11) with existing msdos disklabel > start = 2048 end = 4182015 length = 4179968 > flags = raid >03:47:16,912 INFO blivet: got format: MDRaidMember instance (0x7fae0f6a4f90) -- > type = mdmember name = software RAID status = False > device = /dev/sdc1 uuid = c51936a3-0842-3708-8fcc-561849cc057b exists = True > options = None supported = True formattable = True resizable = False > mdUUID = c51936a3:08423708:8fcc5618:49cc057b biosraid = False >03:47:16,921 DEBUG blivet: DeviceTree.addUdevDevice: info: {'DEVLINKS': '/dev/disk/by-id/scsi-0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-2-part2 /dev/disk/by-label/fedora_dhcppc0 /dev/disk/by-path/pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:2-part2 /dev/disk/by-uuid/852bfcd3-84c3-4cb0-92cc-787d2f56d51c', > 'DEVNAME': 'sdc2', > 'DEVPATH': '/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc/sdc2', > 'DEVTYPE': 'partition', > 'ID_BTRFS_READY': '1', > 'ID_BUS': 'scsi', > 'ID_FS_LABEL': 'fedora_dhcppc0', > 'ID_FS_LABEL_ENC': 'fedora_dhcppc0', > 'ID_FS_TYPE': 'btrfs', > 'ID_FS_USAGE': 'filesystem', > 'ID_FS_UUID': '852bfcd3-84c3-4cb0-92cc-787d2f56d51c', > 'ID_FS_UUID_ENC': '852bfcd3-84c3-4cb0-92cc-787d2f56d51c', > 'ID_FS_UUID_SUB': 'f54da582-0e57-46d0-a99b-420d668f19e5', > 'ID_FS_UUID_SUB_ENC': 'f54da582-0e57-46d0-a99b-420d668f19e5', > 'ID_MODEL': 'QEMU_HARDDISK', > 'ID_MODEL_ENC': 'QEMU\\x20HARDDISK\\x20\\x20\\x20', > 'ID_PART_ENTRY_DISK': '8:32', > 'ID_PART_ENTRY_NUMBER': '2', > 'ID_PART_ENTRY_OFFSET': '4182016', > 'ID_PART_ENTRY_SCHEME': 'dos', > 'ID_PART_ENTRY_SIZE': '20393984', > 
'ID_PART_ENTRY_TYPE': '0x83', > 'ID_PART_TABLE_TYPE': 'dos', > 'ID_PATH': 'pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:2', > 'ID_PATH_TAG': 'pci-0000_00_06_0-virtio-pci-virtio2-scsi-0_0_0_2', > 'ID_REVISION': '1.0.', > 'ID_SCSI': '1', > 'ID_SERIAL': '0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-2', > 'ID_SERIAL_SHORT': 'drive-scsi0-0-0-2', > 'ID_TYPE': 'disk', > 'ID_VENDOR': 'QEMU', > 'ID_VENDOR_ENC': 'QEMU\\x20\\x20\\x20\\x20', > 'MAJOR': '8', > 'MINOR': '34', > 'MPATH_SBIN_PATH': '/sbin', > 'SUBSYSTEM': 'block', > 'TAGS': ':systemd:', > 'USEC_INITIALIZED': '47597', > 'name': 'sdc2', > 'symlinks': ['/dev/disk/by-id/scsi-0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-2-part2', > '/dev/disk/by-label/fedora_dhcppc0', > '/dev/disk/by-path/pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:2-part2', > '/dev/disk/by-uuid/852bfcd3-84c3-4cb0-92cc-787d2f56d51c'], > 'sysfs_path': '/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc/sdc2'} ; name: sdc2 ; >03:47:16,922 INFO blivet: scanning sdc2 (/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc/sdc2)... 
>03:47:16,924 DEBUG blivet: DeviceTree.getDeviceByName: name: sdc2 ; >03:47:16,925 DEBUG blivet: DeviceTree.getDeviceByName returned None >03:47:16,926 INFO blivet: sdc2 is a partition >03:47:16,928 DEBUG blivet: DeviceTree.addUdevPartitionDevice: name: sdc2 ; >03:47:16,967 DEBUG blivet: DeviceTree.getDeviceByName: name: sdc ; >03:47:16,969 DEBUG blivet: DeviceTree.getDeviceByName returned existing 12000MB disk sdc (11) with existing msdos disklabel >03:47:16,970 DEBUG blivet: DiskDevice.addChild: kids: 1 ; name: sdc ; >03:47:16,974 DEBUG blivet: PartitionDevice._setFormat: sdc2 ; >03:47:16,974 DEBUG blivet: getFormat('None') returning DeviceFormat instance >03:47:16,977 DEBUG blivet: PartitionDevice._setFormat: sdc2 ; current: None ; type: None ; >03:47:16,978 DEBUG blivet: looking up parted Partition: /dev/sdc2 >03:47:16,980 DEBUG blivet: PartitionDevice.probe: sdc2 ; exists: True ; >03:47:16,987 DEBUG blivet: PartitionDevice.getFlag: path: /dev/sdc2 ; flag: 1 ; >03:47:16,988 DEBUG blivet: PartitionDevice.getFlag: path: /dev/sdc2 ; flag: 10 ; >03:47:16,990 DEBUG blivet: PartitionDevice.getFlag: path: /dev/sdc2 ; flag: 12 ; >03:47:16,990 INFO blivet: added partition sdc2 (id 13) to device tree >03:47:16,992 DEBUG blivet: DeviceTree.handleUdevDeviceFormat: name: sdc2 ; >03:47:16,997 DEBUG blivet: DeviceTree.handleUdevDiskLabelFormat: device: sdc2 ; label_type: dos ; >03:47:17,020 DEBUG blivet: PartitionDevice.setup: sdc2 ; status: True ; controllable: True ; orig: False ; >03:47:17,023 DEBUG blivet: DiskLabel.__init__: device: /dev/sdc2 ; labelType: dos ; exists: True ; >03:47:17,048 WARN blivet: disklabel detected but not usable on sdc2 >03:47:17,049 INFO blivet: type detected on 'sdc2' is 'btrfs' >03:47:17,052 DEBUG blivet: BTRFS.supported: supported: True ; >03:47:17,057 DEBUG blivet: getFormat('btrfs') returning BTRFS instance >03:47:17,060 DEBUG blivet: PartitionDevice._setFormat: sdc2 ; >03:47:17,061 DEBUG blivet: PartitionDevice._setFormat: sdc2 ; current: 
None ; type: btrfs ; >03:47:17,063 DEBUG blivet: DeviceTree.handleBTRFSFormat: name: sdc2 ; >03:47:17,063 INFO blivet: found btrfs volume fedora_dhcppc0 >03:47:17,065 DEBUG blivet: BTRFSVolumeDevice._addDevice: fedora_dhcppc0 ; device: sdc2 ; status: True ; >03:47:17,066 DEBUG blivet: PartitionDevice.addChild: kids: 0 ; name: sdc2 ; >03:47:17,067 DEBUG blivet: looking up parted Device: /dev/sdc2 >03:47:17,069 INFO blivet: got device: PartitionDevice instance (0x7fae0f6a4f10) -- > name = sdc2 status = True kids = 1 id = 13 > parents = ['existing 12000MB disk sdc (11) with existing msdos disklabel'] > uuid = None size = 9958.0 > format = existing btrfs filesystem > major = 8 minor = 34 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc/sdc2 partedDevice = parted.Device instance -- > model: Unknown path: /dev/sdc2 type: 0 > sectorSize: 512 physicalSectorSize: 512 > length: 20393984 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 13107 did: 13107 busy: False > hardwareGeometry: (9958, 64, 32) biosGeometry: (1269, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f674b00> > target size = 0 path = /dev/sdc2 > format args = [] originalFormat = None grow = None max size = 0 bootable = None > part type = 0 primary = None > partedPartition = parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae0f6a4690> fileSystem: None > number: 2 path: /dev/sdc2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae0f6a4990> PedPartition: <_ped.Partition object at 0x7fae1285fa10> > disk = existing 12000MB disk sdc (11) with existing msdos disklabel > start = 4182016 end = 24575999 length = 20393984 > flags = >03:47:17,071 DEBUG blivet: BTRFS.supported: supported: True ; >03:47:17,071 INFO blivet: got format: BTRFS instance (0x7fae0f6a43d0) -- > type = btrfs name = btrfs status = False > device = /dev/sdc2 uuid = 
f54da582-0e57-46d0-a99b-420d668f19e5 exists = True > options = defaults supported = True formattable = True resizable = False > mountpoint = None mountopts = None > label = fedora_dhcppc0 size = 0.0 targetSize = 0.0 > >03:47:17,076 DEBUG blivet: DeviceTree.addUdevDevice: info: {'DEVLINKS': '/dev/disk/by-id/scsi-0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-3 /dev/disk/by-path/pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:3', > 'DEVNAME': 'sdb', > 'DEVPATH': '/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb', > 'DEVTYPE': 'disk', > 'ID_BUS': 'scsi', > 'ID_MODEL': 'QEMU_HARDDISK', > 'ID_MODEL_ENC': 'QEMU\\x20HARDDISK\\x20\\x20\\x20', > 'ID_PART_TABLE_TYPE': 'dos', > 'ID_PATH': 'pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:3', > 'ID_PATH_TAG': 'pci-0000_00_06_0-virtio-pci-virtio2-scsi-0_0_0_3', > 'ID_REVISION': '1.0.', > 'ID_SCSI': '1', > 'ID_SERIAL': '0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-3', > 'ID_SERIAL_SHORT': 'drive-scsi0-0-0-3', > 'ID_TYPE': 'disk', > 'ID_VENDOR': 'QEMU', > 'ID_VENDOR_ENC': 'QEMU\\x20\\x20\\x20\\x20', > 'MAJOR': '8', > 'MINOR': '16', > 'MPATH_SBIN_PATH': '/sbin', > 'SUBSYSTEM': 'block', > 'TAGS': ':systemd:', > 'USEC_INITIALIZED': '47815', > 'name': 'sdb', > 'symlinks': ['/dev/disk/by-id/scsi-0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-3', > '/dev/disk/by-path/pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:3'], > 'sysfs_path': '/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb'} ; name: sdb ; >03:47:17,090 INFO blivet: scanning sdb (/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb)... 
>03:47:17,092 DEBUG blivet: DeviceTree.getDeviceByName: name: sdb ; >03:47:17,093 DEBUG blivet: DeviceTree.getDeviceByName returned None >03:47:17,095 DEBUG blivet: DeviceTree.addUdevDiskDevice: name: sdb ; >03:47:17,096 INFO blivet: sdb is a disk >03:47:17,097 DEBUG blivet: getFormat('None') returning DeviceFormat instance >03:47:17,098 DEBUG blivet: DiskDevice._setFormat: sdb ; current: None ; type: None ; >03:47:17,186 INFO blivet: added disk sdb (id 14) to device tree >03:47:17,187 DEBUG blivet: looking up parted Device: /dev/sdb >03:47:17,199 DEBUG blivet: DeviceTree.handleUdevDeviceFormat: name: sdb ; >03:47:17,208 DEBUG blivet: DeviceTree.handleUdevDiskLabelFormat: device: sdb ; label_type: dos ; >03:47:17,254 DEBUG blivet: DiskDevice.setup: sdb ; status: True ; controllable: True ; orig: False ; >03:47:17,320 DEBUG blivet: required disklabel type for sdb (1) is None >03:47:17,337 DEBUG blivet: default disklabel type for sdb is msdos >03:47:17,346 DEBUG blivet: selecting msdos disklabel for sdb based on size >03:47:17,358 DEBUG blivet: DiskLabel.__init__: device: /dev/sdb ; labelType: msdos ; exists: True ; >03:47:17,385 DEBUG blivet: Did not change pmbr_boot on parted.Disk instance -- > type: msdos primaryPartitionCount: 2 > lastPartitionNumber: 2 maxPrimaryPartitionCount: 4 > partitions: [<parted.partition.Partition object at 0x7fae0c00b610>, <parted.partition.Partition object at 0x7fae0c00b790>] > device: <parted.device.Device object at 0x7fae0c00b450> > PedDisk: <_ped.Disk object at 0x7fae0c010128> >03:47:17,386 DEBUG blivet: getFormat('disklabel') returning DiskLabel instance >03:47:17,392 DEBUG blivet: DiskDevice._setFormat: sdb ; current: None ; type: disklabel ; >03:47:17,394 INFO blivet: got device: DiskDevice instance (0x7fae0f6a4e50) -- > name = sdb status = True kids = 0 id = 14 > parents = [] > uuid = None size = 12000.0 > format = existing msdos disklabel > major = 8 minor = 16 exists = True protected = False > sysfs path = 
/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdb type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 768 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f674a70> > target size = 0 path = /dev/sdb > format args = [] originalFormat = None removable = False partedDevice = <parted.device.Device object at 0x7fae0c00b1d0> >03:47:17,396 INFO blivet: got format: DiskLabel instance (0x7fae0c00b210) -- > type = disklabel name = partition table (MSDOS) status = False > device = /dev/sdb uuid = None exists = True > options = None supported = False formattable = True resizable = False > type = msdos partition count = 2 sectorSize = 512 > align_offset = 0 align_grain = 2048 > partedDisk = parted.Disk instance -- > type: msdos primaryPartitionCount: 2 > lastPartitionNumber: 2 maxPrimaryPartitionCount: 4 > partitions: [<parted.partition.Partition object at 0x7fae0c00b610>, <parted.partition.Partition object at 0x7fae0c00b790>] > device: <parted.device.Device object at 0x7fae0c00b450> > PedDisk: <_ped.Disk object at 0x7fae0c010128> > origPartedDisk = <parted.disk.Disk object at 0x7fae0c00b990> > partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdb type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 768 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f674c20> > >03:47:17,410 DEBUG blivet: DeviceTree.addUdevDevice: info: {'DEVLINKS': '/dev/disk/by-id/scsi-0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-3-part1 /dev/disk/by-path/pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:3-part1', > 
'DEVNAME': 'sdb1', > 'DEVPATH': '/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb/sdb1', > 'DEVTYPE': 'partition', > 'ID_BUS': 'scsi', > 'ID_FS_LABEL': 'dhcppc0:swap', > 'ID_FS_LABEL_ENC': 'dhcppc0:swap', > 'ID_FS_TYPE': 'linux_raid_member', > 'ID_FS_USAGE': 'raid', > 'ID_FS_UUID': 'c51936a3-0842-3708-8fcc-561849cc057b', > 'ID_FS_UUID_ENC': 'c51936a3-0842-3708-8fcc-561849cc057b', > 'ID_FS_UUID_SUB': '9e6e36f1-2e00-7c25-8a0a-f0688cb14a98', > 'ID_FS_UUID_SUB_ENC': '9e6e36f1-2e00-7c25-8a0a-f0688cb14a98', > 'ID_FS_VERSION': '1.2', > 'ID_MODEL': 'QEMU_HARDDISK', > 'ID_MODEL_ENC': 'QEMU\\x20HARDDISK\\x20\\x20\\x20', > 'ID_PART_ENTRY_DISK': '8:16', > 'ID_PART_ENTRY_NUMBER': '1', > 'ID_PART_ENTRY_OFFSET': '2048', > 'ID_PART_ENTRY_SCHEME': 'dos', > 'ID_PART_ENTRY_SIZE': '4179968', > 'ID_PART_ENTRY_TYPE': '0xfd', > 'ID_PART_TABLE_TYPE': 'dos', > 'ID_PATH': 'pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:3', > 'ID_PATH_TAG': 'pci-0000_00_06_0-virtio-pci-virtio2-scsi-0_0_0_3', > 'ID_REVISION': '1.0.', > 'ID_SCSI': '1', > 'ID_SERIAL': '0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-3', > 'ID_SERIAL_SHORT': 'drive-scsi0-0-0-3', > 'ID_TYPE': 'disk', > 'ID_VENDOR': 'QEMU', > 'ID_VENDOR_ENC': 'QEMU\\x20\\x20\\x20\\x20', > 'MAJOR': '8', > 'MINOR': '17', > 'MPATH_SBIN_PATH': '/sbin', > 'SUBSYSTEM': 'block', > 'TAGS': ':systemd:', > 'USEC_INITIALIZED': '47853', > 'name': 'sdb1', > 'symlinks': ['/dev/disk/by-id/scsi-0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-3-part1', > '/dev/disk/by-path/pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:3-part1'], > 'sysfs_path': '/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb/sdb1'} ; name: sdb1 ; >03:47:17,412 INFO blivet: scanning sdb1 (/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb/sdb1)... 
>03:47:17,414 DEBUG blivet: DeviceTree.getDeviceByName: name: sdb1 ; >03:47:17,415 DEBUG blivet: DeviceTree.getDeviceByName returned None >03:47:17,420 INFO blivet: sdb1 is a partition >03:47:17,421 DEBUG blivet: DeviceTree.addUdevPartitionDevice: name: sdb1 ; >03:47:17,423 DEBUG blivet: DeviceTree.getDeviceByName: name: sdb ; >03:47:17,424 DEBUG blivet: DeviceTree.getDeviceByName returned existing 12000MB disk sdb (14) with existing msdos disklabel >03:47:17,427 DEBUG blivet: DiskDevice.addChild: kids: 0 ; name: sdb ; >03:47:17,430 DEBUG blivet: PartitionDevice._setFormat: sdb1 ; >03:47:17,431 DEBUG blivet: getFormat('None') returning DeviceFormat instance >03:47:17,433 DEBUG blivet: PartitionDevice._setFormat: sdb1 ; current: None ; type: None ; >03:47:17,435 DEBUG blivet: looking up parted Partition: /dev/sdb1 >03:47:17,441 DEBUG blivet: PartitionDevice.probe: sdb1 ; exists: True ; >03:47:17,442 DEBUG blivet: PartitionDevice.getFlag: path: /dev/sdb1 ; flag: 1 ; >03:47:17,444 DEBUG blivet: PartitionDevice.getFlag: path: /dev/sdb1 ; flag: 10 ; >03:47:17,446 DEBUG blivet: PartitionDevice.getFlag: path: /dev/sdb1 ; flag: 12 ; >03:47:17,447 INFO blivet: added partition sdb1 (id 15) to device tree >03:47:17,448 DEBUG blivet: DeviceTree.handleUdevDeviceFormat: name: sdb1 ; >03:47:17,452 DEBUG blivet: DeviceTree.handleUdevDiskLabelFormat: device: sdb1 ; label_type: dos ; >03:47:17,453 DEBUG blivet: PartitionDevice.setup: sdb1 ; status: True ; controllable: True ; orig: False ; >03:47:17,459 DEBUG blivet: DiskLabel.__init__: device: /dev/sdb1 ; labelType: dos ; exists: True ; >03:47:17,471 WARN blivet: disklabel detected but not usable on sdb1 >03:47:17,544 INFO blivet: type detected on 'sdb1' is 'linux_raid_member' >03:47:17,546 DEBUG blivet: MDRaidMember.__init__: uuid: c51936a3-0842-3708-8fcc-561849cc057b ; exists: True ; label: dhcppc0:swap ; device: /dev/sdb1 ; serial: drive-scsi0-0-0-3 ; mdUuid: c51936a3:08423708:8fcc5618:49cc057b ; biosraid: False ; >03:47:17,547 
DEBUG blivet: getFormat('linux_raid_member') returning MDRaidMember instance >03:47:17,549 DEBUG blivet: PartitionDevice._setFormat: sdb1 ; >03:47:17,551 DEBUG blivet: PartitionDevice._setFormat: sdb1 ; current: None ; type: mdmember ; >03:47:17,554 DEBUG blivet: DeviceTree.handleUdevMDMemberFormat: type: mdmember ; name: sdb1 ; >03:47:17,561 DEBUG blivet: raw RAID 1 size == 2041.0 >03:47:17,561 INFO blivet: Using 1MB superBlockSize >03:47:17,562 DEBUG blivet: existing RAID 1 size == 2039.9375 >03:47:17,562 DEBUG blivet: DeviceTree.getDeviceByUuid returned existing 2039MB mdarray dhcppc0:swap (3) >03:47:17,564 DEBUG blivet: MDRaidArrayDevice._addDevice: dhcppc0:swap ; device: sdb1 ; status: True ; >03:47:17,565 DEBUG blivet: PartitionDevice.addChild: kids: 0 ; name: sdb1 ; >03:47:17,569 DEBUG blivet: PartitionDevice.setup: sdb1 ; status: True ; controllable: True ; orig: False ; >03:47:17,640 WARN blivet: failed to add member /dev/sdb1 to md array /dev/md/dhcppc0:swap: mdadd failed for /dev/sdb1: running mdadm --incremental --quiet /dev/sdb1 failed >03:47:17,642 DEBUG blivet: looking up parted Device: /dev/sdb1 >03:47:17,644 INFO blivet: got device: PartitionDevice instance (0x7fae0f6a4350) -- > name = sdb1 status = True kids = 1 id = 15 > parents = ['existing 12000MB disk sdb (14) with existing msdos disklabel'] > uuid = None size = 2041.0 > format = existing mdmember > major = 8 minor = 17 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb/sdb1 partedDevice = parted.Device instance -- > model: Unknown path: /dev/sdb1 type: 0 > sectorSize: 512 physicalSectorSize: 512 > length: 4179968 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 13107 did: 13107 busy: False > hardwareGeometry: (2041, 64, 32) biosGeometry: (260, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f674e60> > target size = 0 path = /dev/sdb1 > format args = [] originalFormat = None 
grow = None max size = 0 bootable = None > part type = 0 primary = None > partedPartition = parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae0c00b510> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae0c00b6d0> PedPartition: <_ped.Partition object at 0x7fae1285f9b0> > disk = existing 12000MB disk sdb (14) with existing msdos disklabel > start = 2048 end = 4182015 length = 4179968 > flags = raid >03:47:17,649 INFO blivet: got format: MDRaidMember instance (0x7fae0c00bb10) -- > type = mdmember name = software RAID status = False > device = /dev/sdb1 uuid = c51936a3-0842-3708-8fcc-561849cc057b exists = True > options = None supported = True formattable = True resizable = False > mdUUID = c51936a3:08423708:8fcc5618:49cc057b biosraid = False >03:47:17,655 DEBUG blivet: DeviceTree.addUdevDevice: info: {'DEVLINKS': '/dev/disk/by-id/scsi-0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-3-part2 /dev/disk/by-label/fedora_dhcppc0 /dev/disk/by-path/pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:3-part2 /dev/disk/by-uuid/852bfcd3-84c3-4cb0-92cc-787d2f56d51c', > 'DEVNAME': 'sdb2', > 'DEVPATH': '/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb/sdb2', > 'DEVTYPE': 'partition', > 'ID_BTRFS_READY': '1', > 'ID_BUS': 'scsi', > 'ID_FS_LABEL': 'fedora_dhcppc0', > 'ID_FS_LABEL_ENC': 'fedora_dhcppc0', > 'ID_FS_TYPE': 'btrfs', > 'ID_FS_USAGE': 'filesystem', > 'ID_FS_UUID': '852bfcd3-84c3-4cb0-92cc-787d2f56d51c', > 'ID_FS_UUID_ENC': '852bfcd3-84c3-4cb0-92cc-787d2f56d51c', > 'ID_FS_UUID_SUB': 'ecfc79bc-a494-44e4-b0e3-463ef6a52a1a', > 'ID_FS_UUID_SUB_ENC': 'ecfc79bc-a494-44e4-b0e3-463ef6a52a1a', > 'ID_MODEL': 'QEMU_HARDDISK', > 'ID_MODEL_ENC': 'QEMU\\x20HARDDISK\\x20\\x20\\x20', > 'ID_PART_ENTRY_DISK': '8:16', > 'ID_PART_ENTRY_NUMBER': '2', > 'ID_PART_ENTRY_OFFSET': '4182016', > 'ID_PART_ENTRY_SCHEME': 'dos', > 'ID_PART_ENTRY_SIZE': '20393984', > 
'ID_PART_ENTRY_TYPE': '0x83', > 'ID_PART_TABLE_TYPE': 'dos', > 'ID_PATH': 'pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:3', > 'ID_PATH_TAG': 'pci-0000_00_06_0-virtio-pci-virtio2-scsi-0_0_0_3', > 'ID_REVISION': '1.0.', > 'ID_SCSI': '1', > 'ID_SERIAL': '0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-3', > 'ID_SERIAL_SHORT': 'drive-scsi0-0-0-3', > 'ID_TYPE': 'disk', > 'ID_VENDOR': 'QEMU', > 'ID_VENDOR_ENC': 'QEMU\\x20\\x20\\x20\\x20', > 'MAJOR': '8', > 'MINOR': '18', > 'MPATH_SBIN_PATH': '/sbin', > 'SUBSYSTEM': 'block', > 'TAGS': ':systemd:', > 'USEC_INITIALIZED': '47892', > 'name': 'sdb2', > 'symlinks': ['/dev/disk/by-id/scsi-0QEMU_QEMU_HARDDISK_drive-scsi0-0-0-3-part2', > '/dev/disk/by-label/fedora_dhcppc0', > '/dev/disk/by-path/pci-0000:00:06.0-virtio-pci-virtio2-scsi-0:0:0:3-part2', > '/dev/disk/by-uuid/852bfcd3-84c3-4cb0-92cc-787d2f56d51c'], > 'sysfs_path': '/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb/sdb2'} ; name: sdb2 ; >03:47:17,659 INFO blivet: scanning sdb2 (/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb/sdb2)... 
>03:47:17,661 DEBUG blivet: DeviceTree.getDeviceByName: name: sdb2 ; >03:47:17,664 DEBUG blivet: DeviceTree.getDeviceByName returned None >03:47:17,666 INFO blivet: sdb2 is a partition >03:47:17,677 DEBUG blivet: DeviceTree.addUdevPartitionDevice: name: sdb2 ; >03:47:17,680 DEBUG blivet: DeviceTree.getDeviceByName: name: sdb ; >03:47:17,682 DEBUG blivet: DeviceTree.getDeviceByName returned existing 12000MB disk sdb (14) with existing msdos disklabel >03:47:17,686 DEBUG blivet: DiskDevice.addChild: kids: 1 ; name: sdb ; >03:47:17,688 DEBUG blivet: PartitionDevice._setFormat: sdb2 ; >03:47:17,690 DEBUG blivet: getFormat('None') returning DeviceFormat instance >03:47:17,693 DEBUG blivet: PartitionDevice._setFormat: sdb2 ; current: None ; type: None ; >03:47:17,695 DEBUG blivet: looking up parted Partition: /dev/sdb2 >03:47:17,698 DEBUG blivet: PartitionDevice.probe: sdb2 ; exists: True ; >03:47:17,705 DEBUG blivet: PartitionDevice.getFlag: path: /dev/sdb2 ; flag: 1 ; >03:47:17,707 DEBUG blivet: PartitionDevice.getFlag: path: /dev/sdb2 ; flag: 10 ; >03:47:17,708 DEBUG blivet: PartitionDevice.getFlag: path: /dev/sdb2 ; flag: 12 ; >03:47:17,709 INFO blivet: added partition sdb2 (id 16) to device tree >03:47:17,714 DEBUG blivet: DeviceTree.handleUdevDeviceFormat: name: sdb2 ; >03:47:17,716 DEBUG blivet: DeviceTree.handleUdevDiskLabelFormat: device: sdb2 ; label_type: dos ; >03:47:17,725 DEBUG blivet: PartitionDevice.setup: sdb2 ; status: True ; controllable: True ; orig: False ; >03:47:17,731 DEBUG blivet: DiskLabel.__init__: device: /dev/sdb2 ; labelType: dos ; exists: True ; >03:47:17,805 WARN blivet: disklabel detected but not usable on sdb2 >03:47:17,806 INFO blivet: type detected on 'sdb2' is 'btrfs' >03:47:17,808 DEBUG blivet: BTRFS.supported: supported: True ; >03:47:17,809 DEBUG blivet: getFormat('btrfs') returning BTRFS instance >03:47:17,816 DEBUG blivet: PartitionDevice._setFormat: sdb2 ; >03:47:17,819 DEBUG blivet: PartitionDevice._setFormat: sdb2 ; current: 
None ; type: btrfs ; >03:47:17,822 DEBUG blivet: DeviceTree.handleBTRFSFormat: name: sdb2 ; >03:47:17,826 INFO blivet: found btrfs volume fedora_dhcppc0 >03:47:17,828 DEBUG blivet: BTRFSVolumeDevice._addDevice: fedora_dhcppc0 ; device: sdb2 ; status: True ; >03:47:17,829 DEBUG blivet: PartitionDevice.addChild: kids: 0 ; name: sdb2 ; >03:47:17,835 DEBUG blivet: looking up parted Device: /dev/sdb2 >03:47:17,837 INFO blivet: got device: PartitionDevice instance (0x7fae0c02ca90) -- > name = sdb2 status = True kids = 1 id = 16 > parents = ['existing 12000MB disk sdb (14) with existing msdos disklabel'] > uuid = None size = 9958.0 > format = existing btrfs filesystem > major = 8 minor = 18 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb/sdb2 partedDevice = parted.Device instance -- > model: Unknown path: /dev/sdb2 type: 0 > sectorSize: 512 physicalSectorSize: 512 > length: 20393984 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 13107 did: 13107 busy: False > hardwareGeometry: (9958, 64, 32) biosGeometry: (1269, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0c0247a0> > target size = 0 path = /dev/sdb2 > format args = [] originalFormat = None grow = None max size = 0 bootable = None > part type = 0 primary = None > partedPartition = parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae0c00b510> fileSystem: None > number: 2 path: /dev/sdb2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae0c00b850> PedPartition: <_ped.Partition object at 0x7fae1285f770> > disk = existing 12000MB disk sdb (14) with existing msdos disklabel > start = 4182016 end = 24575999 length = 20393984 > flags = >03:47:17,840 DEBUG blivet: BTRFS.supported: supported: True ; >03:47:17,844 INFO blivet: got format: BTRFS instance (0x7fae0c02f290) -- > type = btrfs name = btrfs status = False > device = /dev/sdb2 uuid = 
ecfc79bc-a494-44e4-b0e3-463ef6a52a1a exists = True > options = defaults supported = True formattable = True resizable = False > mountpoint = None mountopts = None > label = fedora_dhcppc0 size = 0.0 targetSize = 0.0 > >03:47:17,847 DEBUG blivet: DeviceTree.addUdevDevice: info: {'DEVNAME': 'loop0', > 'DEVPATH': '/devices/virtual/block/loop0', > 'DEVTYPE': 'disk', > 'ID_FS_TYPE': 'squashfs', > 'ID_FS_USAGE': 'filesystem', > 'ID_FS_VERSION': '1024.0', > 'MAJOR': '7', > 'MINOR': '0', > 'MPATH_SBIN_PATH': '/sbin', > 'SUBSYSTEM': 'block', > 'USEC_INITIALIZED': '53039', > 'name': 'loop0', > 'symlinks': [], > 'sysfs_path': '/devices/virtual/block/loop0'} ; name: loop0 ; >03:47:17,848 INFO blivet: scanning loop0 (/devices/virtual/block/loop0)... >03:47:17,849 DEBUG blivet: DeviceTree.getDeviceByName: name: loop0 ; >03:47:17,855 DEBUG blivet: DeviceTree.getDeviceByName returned None >03:47:17,856 INFO blivet: loop0 is a loop device >03:47:17,857 DEBUG blivet: DeviceTree.addUdevLoopDevice: name: loop0 ; >03:47:17,860 DEBUG blivet: DeviceTree.getDeviceByName: name: /run/install/repo/LiveOS/squashfs.img ; >03:47:17,862 DEBUG blivet: DeviceTree.getDeviceByName returned None >03:47:17,867 DEBUG blivet: getFormat('None') returning DeviceFormat instance >03:47:17,869 DEBUG blivet: FileDevice._setFormat: /run/install/repo/LiveOS/squashfs.img ; current: None ; type: None ; >03:47:17,870 INFO blivet: added file /run/install/repo/LiveOS/squashfs.img (id 17) to device tree >03:47:17,871 DEBUG blivet: FileDevice.addChild: kids: 0 ; name: /run/install/repo/LiveOS/squashfs.img ; >03:47:17,876 DEBUG blivet: getFormat('None') returning DeviceFormat instance >03:47:17,878 DEBUG blivet: LoopDevice._setFormat: loop0 ; current: None ; type: None ; >03:47:17,879 INFO blivet: added loop loop0 (id 18) to device tree >03:47:17,880 DEBUG blivet: DeviceTree.handleUdevDeviceFormat: name: loop0 ; >03:47:17,885 DEBUG blivet: DeviceTree.handleUdevDiskLabelFormat: device: loop0 ; label_type: None ; 
>03:47:17,885 DEBUG blivet: getFormat('squashfs') returning DeviceFormat instance >03:47:17,886 DEBUG blivet: device loop0 does not contain a disklabel >03:47:17,886 INFO blivet: type detected on 'loop0' is 'squashfs' >03:47:17,887 DEBUG blivet: getFormat('squashfs') returning DeviceFormat instance >03:47:17,893 DEBUG blivet: LoopDevice._setFormat: loop0 ; current: None ; type: squashfs ; >03:47:17,894 INFO blivet: got device: LoopDevice instance (0x7fae0c02fe10) -- > name = loop0 status = False kids = 0 id = 18 > parents = ['existing 0MB file /run/install/repo/LiveOS/squashfs.img (17)'] > uuid = None size = 0 > format = existing squashfs > major = 0 minor = 0 exists = True protected = False > sysfs path = partedDevice = None > target size = 0 path = /dev/loop0 > format args = [] originalFormat = None >03:47:17,895 INFO blivet: got format: DeviceFormat instance (0x7fae0c02ff10) -- > type = squashfs name = squashfs status = False > device = /dev/loop0 uuid = None exists = True > options = None supported = False formattable = False resizable = False > >03:47:17,898 DEBUG blivet: DeviceTree.addUdevDevice: info: {'DEVLINKS': '/dev/disk/by-label/Anaconda /dev/disk/by-uuid/932a9ea8-7790-43fd-a10c-20d783f65a9d', > 'DEVNAME': 'loop1', > 'DEVPATH': '/devices/virtual/block/loop1', > 'DEVTYPE': 'disk', > 'ID_FS_LABEL': 'Anaconda', > 'ID_FS_LABEL_ENC': 'Anaconda', > 'ID_FS_TYPE': 'ext4', > 'ID_FS_USAGE': 'filesystem', > 'ID_FS_UUID': '932a9ea8-7790-43fd-a10c-20d783f65a9d', > 'ID_FS_UUID_ENC': '932a9ea8-7790-43fd-a10c-20d783f65a9d', > 'ID_FS_VERSION': '1.0', > 'MAJOR': '7', > 'MINOR': '1', > 'MPATH_SBIN_PATH': '/sbin', > 'SUBSYSTEM': 'block', > 'USEC_INITIALIZED': '53229', > 'name': 'loop1', > 'symlinks': ['/dev/disk/by-label/Anaconda', > '/dev/disk/by-uuid/932a9ea8-7790-43fd-a10c-20d783f65a9d'], > 'sysfs_path': '/devices/virtual/block/loop1'} ; name: loop1 ; >03:47:17,903 INFO blivet: scanning loop1 (/devices/virtual/block/loop1)... 
>03:47:17,905 DEBUG blivet: DeviceTree.getDeviceByName: name: loop1 ; >03:47:17,906 DEBUG blivet: DeviceTree.getDeviceByName returned None >03:47:17,906 INFO blivet: loop1 is a loop device >03:47:17,908 DEBUG blivet: DeviceTree.addUdevLoopDevice: name: loop1 ; >03:47:17,911 DEBUG blivet: DeviceTree.getDeviceByName: name: /LiveOS/rootfs.img ; >03:47:17,913 DEBUG blivet: DeviceTree.getDeviceByName returned None >03:47:17,915 DEBUG blivet: getFormat('None') returning DeviceFormat instance >03:47:17,917 DEBUG blivet: FileDevice._setFormat: /LiveOS/rootfs.img ; current: None ; type: None ; >03:47:17,922 INFO blivet: added file /LiveOS/rootfs.img (id 19) to device tree >03:47:17,924 DEBUG blivet: FileDevice.addChild: kids: 0 ; name: /LiveOS/rootfs.img ; >03:47:17,924 DEBUG blivet: getFormat('None') returning DeviceFormat instance >03:47:17,926 DEBUG blivet: LoopDevice._setFormat: loop1 ; current: None ; type: None ; >03:47:17,927 INFO blivet: added loop loop1 (id 20) to device tree >03:47:17,929 DEBUG blivet: DeviceTree.handleUdevDeviceFormat: name: loop1 ; >03:47:17,932 DEBUG blivet: DeviceTree.handleUdevDiskLabelFormat: device: loop1 ; label_type: None ; >03:47:17,934 DEBUG blivet: Ext4FS.supported: supported: True ; >03:47:17,936 DEBUG blivet: getFormat('ext4') returning Ext4FS instance >03:47:17,937 DEBUG blivet: device loop1 does not contain a disklabel >03:47:17,938 INFO blivet: type detected on 'loop1' is 'ext4' >03:47:18,071 DEBUG blivet: using current size 0 as min size >03:47:18,073 DEBUG blivet: Ext4FS.supported: supported: True ; >03:47:18,073 DEBUG blivet: getFormat('ext4') returning Ext4FS instance >03:47:18,075 DEBUG blivet: LoopDevice._setFormat: loop1 ; current: None ; type: ext4 ; >03:47:18,081 INFO blivet: got device: LoopDevice instance (0x7fae0c02f990) -- > name = loop1 status = False kids = 0 id = 20 > parents = ['existing 0MB file /LiveOS/rootfs.img (19)'] > uuid = None size = 0 > format = existing ext4 filesystem > major = 0 minor = 0 exists = 
True protected = False > sysfs path = partedDevice = None > target size = 0 path = /dev/loop1 > format args = [] originalFormat = None >03:47:18,082 DEBUG blivet: Ext4FS.supported: supported: True ; >03:47:18,083 INFO blivet: got format: Ext4FS instance (0x7fae0c033710) -- > type = ext4 name = ext4 status = False > device = /dev/loop1 uuid = 932a9ea8-7790-43fd-a10c-20d783f65a9d exists = True > options = defaults supported = True formattable = True resizable = True > mountpoint = None mountopts = None > label = Anaconda size = 1024.0 targetSize = 1024.0 > >03:47:18,086 DEBUG blivet: DeviceTree.addUdevDevice: info: {'DEVNAME': 'loop2', > 'DEVPATH': '/devices/virtual/block/loop2', > 'DEVTYPE': 'disk', > 'ID_FS_TYPE': 'DM_snapshot_cow', > 'ID_FS_USAGE': 'other', > 'MAJOR': '7', > 'MINOR': '2', > 'MPATH_SBIN_PATH': '/sbin', > 'SUBSYSTEM': 'block', > 'USEC_INITIALIZED': '53418', > 'name': 'loop2', > 'symlinks': [], > 'sysfs_path': '/devices/virtual/block/loop2'} ; name: loop2 ; >03:47:18,091 INFO blivet: scanning loop2 (/devices/virtual/block/loop2)... 
>03:47:18,092 DEBUG blivet: DeviceTree.getDeviceByName: name: loop2 ; >03:47:18,093 DEBUG blivet: DeviceTree.getDeviceByName returned None >03:47:18,094 INFO blivet: loop2 is a loop device >03:47:18,096 DEBUG blivet: DeviceTree.addUdevLoopDevice: name: loop2 ; >03:47:18,098 DEBUG blivet: DeviceTree.getDeviceByName: name: /overlay (deleted) ; >03:47:18,103 DEBUG blivet: DeviceTree.getDeviceByName returned None >03:47:18,104 DEBUG blivet: getFormat('None') returning DeviceFormat instance >03:47:18,106 DEBUG blivet: FileDevice._setFormat: /overlay (deleted) ; current: None ; type: None ; >03:47:18,108 INFO blivet: added file /overlay (deleted) (id 21) to device tree >03:47:18,110 DEBUG blivet: FileDevice.addChild: kids: 0 ; name: /overlay (deleted) ; >03:47:18,110 DEBUG blivet: getFormat('None') returning DeviceFormat instance >03:47:18,113 DEBUG blivet: LoopDevice._setFormat: loop2 ; current: None ; type: None ; >03:47:18,114 INFO blivet: added loop loop2 (id 22) to device tree >03:47:18,115 DEBUG blivet: DeviceTree.handleUdevDeviceFormat: name: loop2 ; >03:47:18,122 DEBUG blivet: DeviceTree.handleUdevDiskLabelFormat: device: loop2 ; label_type: None ; >03:47:18,123 DEBUG blivet: getFormat('DM_snapshot_cow') returning DeviceFormat instance >03:47:18,123 DEBUG blivet: device loop2 does not contain a disklabel >03:47:18,124 INFO blivet: type detected on 'loop2' is 'DM_snapshot_cow' >03:47:18,124 DEBUG blivet: getFormat('DM_snapshot_cow') returning DeviceFormat instance >03:47:18,126 DEBUG blivet: LoopDevice._setFormat: loop2 ; current: None ; type: DM_snapshot_cow ; >03:47:18,126 INFO blivet: got device: LoopDevice instance (0x7fae0c03c410) -- > name = loop2 status = False kids = 0 id = 22 > parents = ['existing 0MB file /overlay (deleted) (21)'] > uuid = None size = 0 > format = existing DM_snapshot_cow > major = 0 minor = 0 exists = True protected = False > sysfs path = partedDevice = None > target size = 0 path = /dev/loop2 > format args = [] originalFormat = None 
>03:47:18,131 INFO blivet: got format: DeviceFormat instance (0x7fae0c03c550) -- > type = DM_snapshot_cow name = DM_snapshot_cow status = False > device = /dev/loop2 uuid = None exists = True > options = None supported = False formattable = False resizable = False > >03:47:18,133 DEBUG blivet: DeviceTree.addUdevDevice: info: {'DEVNAME': 'loop3', > 'DEVPATH': '/devices/virtual/block/loop3', > 'DEVTYPE': 'disk', > 'MAJOR': '7', > 'MINOR': '3', > 'MPATH_SBIN_PATH': '/sbin', > 'SUBSYSTEM': 'block', > 'USEC_INITIALIZED': '53622', > 'name': 'loop3', > 'symlinks': [], > 'sysfs_path': '/devices/virtual/block/loop3'} ; name: loop3 ; >03:47:18,134 INFO blivet: ignoring loop3 (/devices/virtual/block/loop3) >03:47:18,135 DEBUG blivet: lvm filter: adding loop3 to the reject list >03:47:18,137 DEBUG blivet: DeviceTree.addUdevDevice: info: {'DEVNAME': 'loop4', > 'DEVPATH': '/devices/virtual/block/loop4', > 'DEVTYPE': 'disk', > 'MAJOR': '7', > 'MINOR': '4', > 'MPATH_SBIN_PATH': '/sbin', > 'SUBSYSTEM': 'block', > 'USEC_INITIALIZED': '53825', > 'name': 'loop4', > 'symlinks': [], > 'sysfs_path': '/devices/virtual/block/loop4'} ; name: loop4 ; >03:47:18,141 INFO blivet: ignoring loop4 (/devices/virtual/block/loop4) >03:47:18,142 DEBUG blivet: lvm filter: adding loop4 to the reject list >03:47:18,144 DEBUG blivet: DeviceTree.addUdevDevice: info: {'DEVNAME': 'loop5', > 'DEVPATH': '/devices/virtual/block/loop5', > 'DEVTYPE': 'disk', > 'MAJOR': '7', > 'MINOR': '5', > 'MPATH_SBIN_PATH': '/sbin', > 'SUBSYSTEM': 'block', > 'USEC_INITIALIZED': '54039', > 'name': 'loop5', > 'symlinks': [], > 'sysfs_path': '/devices/virtual/block/loop5'} ; name: loop5 ; >03:47:18,145 INFO blivet: ignoring loop5 (/devices/virtual/block/loop5) >03:47:18,145 DEBUG blivet: lvm filter: adding loop5 to the reject list >03:47:18,147 DEBUG blivet: DeviceTree.addUdevDevice: info: {'DEVNAME': 'loop6', > 'DEVPATH': '/devices/virtual/block/loop6', > 'DEVTYPE': 'disk', > 'MAJOR': '7', > 'MINOR': '6', > 'MPATH_SBIN_PATH': 
'/sbin', > 'SUBSYSTEM': 'block', > 'USEC_INITIALIZED': '54249', > 'name': 'loop6', > 'symlinks': [], > 'sysfs_path': '/devices/virtual/block/loop6'} ; name: loop6 ; >03:47:18,153 INFO blivet: ignoring loop6 (/devices/virtual/block/loop6) >03:47:18,154 DEBUG blivet: lvm filter: adding loop6 to the reject list >03:47:18,156 DEBUG blivet: DeviceTree.addUdevDevice: info: {'DEVNAME': 'loop7', > 'DEVPATH': '/devices/virtual/block/loop7', > 'DEVTYPE': 'disk', > 'MAJOR': '7', > 'MINOR': '7', > 'MPATH_SBIN_PATH': '/sbin', > 'SUBSYSTEM': 'block', > 'USEC_INITIALIZED': '54461', > 'name': 'loop7', > 'symlinks': [], > 'sysfs_path': '/devices/virtual/block/loop7'} ; name: loop7 ; >03:47:18,156 INFO blivet: ignoring loop7 (/devices/virtual/block/loop7) >03:47:18,157 DEBUG blivet: lvm filter: adding loop7 to the reject list >03:47:18,161 DEBUG blivet: DeviceTree.addUdevDevice: info: {'DEVLINKS': '/dev/disk/by-id/dm-name-live-rw /dev/disk/by-label/Anaconda /dev/disk/by-uuid/932a9ea8-7790-43fd-a10c-20d783f65a9d /dev/mapper/live-rw', > 'DEVNAME': 'dm-0', > 'DEVPATH': '/devices/virtual/block/dm-0', > 'DEVTYPE': 'disk', > 'DM_NAME': 'live-rw', > 'DM_SUSPENDED': '0', > 'DM_UDEV_DISABLE_LIBRARY_FALLBACK_FLAG': '1', > 'DM_UDEV_PRIMARY_SOURCE_FLAG': '1', > 'DM_UDEV_RULES_VSN': '2', > 'ID_FS_LABEL': 'Anaconda', > 'ID_FS_LABEL_ENC': 'Anaconda', > 'ID_FS_TYPE': 'ext4', > 'ID_FS_USAGE': 'filesystem', > 'ID_FS_UUID': '932a9ea8-7790-43fd-a10c-20d783f65a9d', > 'ID_FS_UUID_ENC': '932a9ea8-7790-43fd-a10c-20d783f65a9d', > 'ID_FS_VERSION': '1.0', > 'MAJOR': '253', > 'MINOR': '0', > 'MPATH_SBIN_PATH': '/sbin', > 'SUBSYSTEM': 'block', > 'TAGS': ':systemd:', > 'USEC_INITIALIZED': '40683', > 'name': 'dm-0', > 'symlinks': ['/dev/disk/by-id/dm-name-live-rw', > '/dev/disk/by-label/Anaconda', > '/dev/disk/by-uuid/932a9ea8-7790-43fd-a10c-20d783f65a9d', > '/dev/mapper/live-rw'], > 'sysfs_path': '/devices/virtual/block/dm-0'} ; name: live-rw ; >03:47:18,168 INFO blivet: scanning live-rw 
(/devices/virtual/block/dm-0)... >03:47:18,170 DEBUG blivet: DeviceTree.getDeviceByName: name: live-rw ; >03:47:18,173 DEBUG blivet: DeviceTree.getDeviceByName returned None >03:47:18,178 INFO blivet: live-rw is a device-mapper device >03:47:18,180 DEBUG blivet: DeviceTree.addUdevDMDevice: name: live-rw ; >03:47:18,181 DEBUG blivet: DeviceTree.getDeviceByName: name: loop1 ; >03:47:18,185 DEBUG blivet: DeviceTree.getDeviceByName returned existing 0MB loop loop1 (20) with existing ext4 filesystem >03:47:18,187 DEBUG blivet: DeviceTree.getDeviceByName: name: loop2 ; >03:47:18,189 DEBUG blivet: DeviceTree.getDeviceByName returned existing 0MB loop loop2 (22) with existing DM_snapshot_cow >03:47:18,193 DEBUG blivet: DeviceTree.getDeviceByName: name: live-rw ; >03:47:18,195 DEBUG blivet: DeviceTree.getDeviceByName returned None >03:47:18,199 DEBUG blivet: LoopDevice.addChild: kids: 0 ; name: loop2 ; >03:47:18,199 DEBUG blivet: getFormat('None') returning DeviceFormat instance >03:47:18,201 DEBUG blivet: DMDevice._setFormat: live-rw ; current: None ; type: None ; >03:47:18,206 INFO blivet: added dm live-rw (id 23) to device tree >03:47:18,208 DEBUG blivet: DeviceTree.handleUdevDeviceFormat: name: live-rw ; >03:47:18,209 DEBUG blivet: DeviceTree.handleUdevDiskLabelFormat: device: live-rw ; label_type: None ; >03:47:18,211 DEBUG blivet: Ext4FS.supported: supported: True ; >03:47:18,213 DEBUG blivet: getFormat('ext4') returning Ext4FS instance >03:47:18,214 DEBUG blivet: device live-rw does not contain a disklabel >03:47:18,214 INFO blivet: type detected on 'live-rw' is 'ext4' >03:47:18,296 DEBUG blivet: padding min size from 758 up to 833 >03:47:18,298 DEBUG blivet: Ext4FS.supported: supported: True ; >03:47:18,298 DEBUG blivet: getFormat('ext4') returning Ext4FS instance >03:47:18,301 DEBUG blivet: DMDevice._setFormat: live-rw ; current: None ; type: ext4 ; >03:47:18,303 DEBUG blivet: looking up parted Device: /dev/mapper/live-rw >03:47:18,317 INFO blivet: got device: 
DMDevice instance (0x7fae0c033350) -- > name = live-rw status = True kids = 0 id = 23 > parents = ['existing 0MB loop loop2 (22) with existing DM_snapshot_cow'] > uuid = None size = 1024.0 > format = existing ext4 filesystem > major = 0 minor = 0 exists = True protected = True > sysfs path = /devices/virtual/block/dm-0 partedDevice = parted.Device instance -- > model: Linux device-mapper (snapshot) path: /dev/mapper/live-rw type: 12 > sectorSize: 512 physicalSectorSize: 512 > length: 2097152 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 13107 did: 13107 busy: True > hardwareGeometry: (130, 255, 63) biosGeometry: (130, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0c024e60> > target size = 0 path = /dev/mapper/live-rw > format args = [] originalFormat = None target = None dmUuid = None >03:47:18,323 DEBUG blivet: Ext4FS.supported: supported: True ; >03:47:18,324 INFO blivet: got format: Ext4FS instance (0x7fae0c03d8d0) -- > type = ext4 name = ext4 status = False > device = /dev/mapper/live-rw uuid = 932a9ea8-7790-43fd-a10c-20d783f65a9d exists = True > options = defaults supported = True formattable = True resizable = True > mountpoint = None mountopts = None > label = Anaconda size = 1024.0 targetSize = 1024.0 > >03:47:18,328 DEBUG blivet: DeviceTree.addUdevDevice: info: {'DEVLINKS': '/dev/disk/by-id/md-name-dhcppc0:swap /dev/disk/by-id/md-uuid-c51936a3:08423708:8fcc5618:49cc057b /dev/disk/by-label/swap-fs /dev/disk/by-uuid/3cae7094-037f-4b48-bd37-90b60795ca6e /dev/md/dhcppc0:swap', > 'DEVNAME': 'md127', > 'DEVPATH': '/devices/virtual/block/md127', > 'DEVTYPE': 'disk', > 'ID_FS_LABEL': 'swap-fs', > 'ID_FS_LABEL_ENC': 'swap-fs', > 'ID_FS_TYPE': 'swap', > 'ID_FS_USAGE': 'other', > 'ID_FS_UUID': '3cae7094-037f-4b48-bd37-90b60795ca6e', > 'ID_FS_UUID_ENC': '3cae7094-037f-4b48-bd37-90b60795ca6e', > 'ID_FS_VERSION': '2', > 'MAJOR': '9', > 'MD_DEVICES': '4', > 'MD_DEVNAME': 'dhcppc0:swap', > 'MD_LEVEL': 'raid1', > 
'MD_METADATA': '1.2',
> 'MD_NAME': 'dhcppc0:swap',
> 'MD_UUID': 'c51936a3:08423708:8fcc5618:49cc057b',
> 'MINOR': '127',
> 'MPATH_SBIN_PATH': '/sbin',
> 'SUBSYSTEM': 'block',
> 'TAGS': ':systemd:',
> 'USEC_INITIALIZED': '221281',
> 'name': 'md127',
> 'symlinks': ['/dev/disk/by-id/md-name-dhcppc0:swap',
> '/dev/disk/by-id/md-uuid-c51936a3:08423708:8fcc5618:49cc057b',
> '/dev/disk/by-label/swap-fs',
> '/dev/disk/by-uuid/3cae7094-037f-4b48-bd37-90b60795ca6e',
> '/dev/md/dhcppc0:swap'],
> 'sysfs_path': '/devices/virtual/block/md127'} ; name: md127 ;
>03:47:18,334 INFO blivet: scanning md127 (/devices/virtual/block/md127)...
>03:47:18,336 DEBUG blivet: DeviceTree.getDeviceByName: name: md127 ;
>03:47:18,337 DEBUG blivet: DeviceTree.getDeviceByName returned None
>03:47:18,340 DEBUG blivet: DeviceTree.getDeviceByName: name: dhcppc0:swap ;
>03:47:18,343 DEBUG blivet: raw RAID 1 size == 2041.0
>03:47:18,343 INFO blivet: Using 1MB superBlockSize
>03:47:18,344 DEBUG blivet: existing RAID 1 size == 2039.9375
>03:47:18,347 DEBUG blivet: DeviceTree.getDeviceByName returned existing 2039MB mdarray dhcppc0:swap (3)
>03:47:18,348 DEBUG blivet: DeviceTree.handleUdevDeviceFormat: name: dhcppc0:swap ;
>03:47:18,351 DEBUG blivet: DeviceTree.handleUdevDiskLabelFormat: device: dhcppc0:swap ; label_type: None ;
>03:47:18,354 DEBUG blivet: SwapSpace.__init__:
>03:47:18,356 DEBUG blivet: getFormat('swap') returning SwapSpace instance
>03:47:18,357 DEBUG blivet: device dhcppc0:swap does not contain a disklabel
>03:47:18,357 INFO blivet: type detected on 'md127' is 'swap'
>03:47:18,359 DEBUG blivet: SwapSpace.__init__: device: /dev/md/dhcppc0:swap ; serial: None ; uuid: 3cae7094-037f-4b48-bd37-90b60795ca6e ; exists: True ; label: swap-fs ;
>03:47:18,361 DEBUG blivet: getFormat('swap') returning SwapSpace instance
>03:47:18,363 DEBUG blivet: MDRaidArrayDevice._setFormat: dhcppc0:swap ; current: None ; type: swap ;
>03:47:18,369 DEBUG blivet: raw RAID 1 size == 2041.0
>03:47:18,369 INFO blivet: Using 1MB superBlockSize
>03:47:18,370 DEBUG blivet: existing RAID 1 size == 2039.9375
>03:47:18,371 INFO blivet: got device: MDRaidArrayDevice instance (0x7fae1279ad90) --
> name = dhcppc0:swap status = True kids = 0 id = 3
> parents = ['existing 2041MB partition sda1 (2) with existing mdmember',
> 'existing 2041MB partition sdd1 (9) with existing mdmember',
> 'existing 2041MB partition sdc1 (12) with existing mdmember',
> 'existing 2041MB partition sdb1 (15) with existing mdmember']
> uuid = c51936a3:08423708:8fcc5618:49cc057b size = 2039.9375
> format = existing swap
> major = 0 minor = 0 exists = True protected = False
> sysfs path = /devices/virtual/block/md127 partedDevice = parted.Device instance --
> model: Linux Software RAID Array path: /dev/md/dhcppc0:swap type: 17
> sectorSize: 512 physicalSectorSize: 512
> length: 4177792 openCount: 0 readOnly: False
> externalMode: False dirty: False bootDirty: False
> host: 13107 did: 13107 busy: False
> hardwareGeometry: (522224, 2, 4) biosGeometry: (260, 255, 63)
> PedDevice: <_ped.Device object at 0x7fae127ead40>
> target size = 0 path = /dev/md/dhcppc0:swap
> format args = None originalFormat = None level = 1 spares = 0
> members = 4
> total devices = 4 metadata version = 1.2
>03:47:18,372 INFO blivet: got format: SwapSpace instance (0x7fae0f79b9d0) --
> type = swap name = swap status = False
> device = /dev/md/dhcppc0:swap uuid = 3cae7094-037f-4b48-bd37-90b60795ca6e exists = True
> options = supported = True formattable = True resizable = False
> priority = None label = swap-fs
>03:47:18,430 DEBUG blivet: OpticalDevice.teardown: sr0 ; status: True ; controllable: True ;
>03:47:18,466 DEBUG blivet: MDRaidArrayDevice.teardown: dhcppc0:swap ; status: True ; controllable: True ;
>03:47:18,467 DEBUG blivet: SwapSpace.teardown: device: /dev/md/dhcppc0:swap ; status: False ; type: swap ;
>03:47:18,470 DEBUG blivet: SwapSpace.teardown: device: /dev/md/dhcppc0:swap ; status: False ; type: swap ;
>03:47:18,739 DEBUG blivet: PartitionDevice.teardown: sda1 ; status: True ; controllable: True ;
>03:47:18,740 DEBUG blivet: MDRaidMember.teardown: device: /dev/sda1 ; status: False ; type: mdmember ;
>03:47:18,748 DEBUG blivet: MDRaidMember.teardown: device: /dev/sda1 ; status: False ; type: mdmember ;
>03:47:18,790 DEBUG blivet: DiskDevice.teardown: sda ; status: True ; controllable: True ;
>03:47:18,792 DEBUG blivet: DiskLabel.teardown: device: /dev/sda ; status: False ; type: disklabel ;
>03:47:18,795 DEBUG blivet: DiskLabel.teardown: device: /dev/sda ; status: False ; type: disklabel ;
>03:47:18,835 DEBUG blivet: PartitionDevice.teardown: sdd1 ; status: True ; controllable: True ;
>03:47:18,839 DEBUG blivet: MDRaidMember.teardown: device: /dev/sdd1 ; status: False ; type: mdmember ;
>03:47:18,842 DEBUG blivet: MDRaidMember.teardown: device: /dev/sdd1 ; status: False ; type: mdmember ;
>03:47:18,885 DEBUG blivet: DiskDevice.teardown: sdd ; status: True ; controllable: True ;
>03:47:18,887 DEBUG blivet: DiskLabel.teardown: device: /dev/sdd ; status: False ; type: disklabel ;
>03:47:18,889 DEBUG blivet: DiskLabel.teardown: device: /dev/sdd ; status: False ; type: disklabel ;
>03:47:18,931 DEBUG blivet: PartitionDevice.teardown: sdc1 ; status: True ; controllable: True ;
>03:47:18,933 DEBUG blivet: MDRaidMember.teardown: device: /dev/sdc1 ; status: False ; type: mdmember ;
>03:47:18,938 DEBUG blivet: MDRaidMember.teardown: device: /dev/sdc1 ; status: False ; type: mdmember ;
>03:47:18,976 DEBUG blivet: DiskDevice.teardown: sdc ; status: True ; controllable: True ;
>03:47:18,978 DEBUG blivet: DiskLabel.teardown: device: /dev/sdc ; status: False ; type: disklabel ;
>03:47:18,983 DEBUG blivet: DiskLabel.teardown: device: /dev/sdc ; status: False ; type: disklabel ;
>03:47:19,031 DEBUG blivet: PartitionDevice.teardown: sdb1 ; status: True ; controllable: True ;
>03:47:19,032 DEBUG blivet: MDRaidMember.teardown: device: /dev/sdb1 ; status: False ; type: mdmember ;
>03:47:19,038 DEBUG blivet: MDRaidMember.teardown: device: /dev/sdb1 ; status: False ; type: mdmember ;
>03:47:19,075 DEBUG blivet: DiskDevice.teardown: sdb ; status: True ; controllable: True ;
>03:47:19,077 DEBUG blivet: DiskLabel.teardown: device: /dev/sdb ; status: False ; type: disklabel ;
>03:47:19,080 DEBUG blivet: DiskLabel.teardown: device: /dev/sdb ; status: False ; type: disklabel ;
>03:47:19,121 DEBUG blivet: BTRFSSubVolumeDevice.teardown: boot ; status: True ; controllable: True ;
>03:47:19,157 DEBUG blivet: BTRFSVolumeDevice.teardown: fedora_dhcppc0 ; status: True ; controllable: True ;
>03:47:19,196 DEBUG blivet: PartitionDevice.teardown: sda2 ; status: True ; controllable: True ;
>03:47:19,233 DEBUG blivet: DiskDevice.teardown: sda ; status: True ; controllable: True ;
>03:47:19,235 DEBUG blivet: DiskLabel.teardown: device: /dev/sda ; status: False ; type: disklabel ;
>03:47:19,238 DEBUG blivet: DiskLabel.teardown: device: /dev/sda ; status: False ; type: disklabel ;
>03:47:19,280 DEBUG blivet: PartitionDevice.teardown: sdd2 ; status: True ; controllable: True ;
>03:47:19,319 DEBUG blivet: DiskDevice.teardown: sdd ; status: True ; controllable: True ;
>03:47:19,321 DEBUG blivet: DiskLabel.teardown: device: /dev/sdd ; status: False ; type: disklabel ;
>03:47:19,323 DEBUG blivet: DiskLabel.teardown: device: /dev/sdd ; status: False ; type: disklabel ;
>03:47:19,361 DEBUG blivet: PartitionDevice.teardown: sdc2 ; status: True ; controllable: True ;
>03:47:19,397 DEBUG blivet: DiskDevice.teardown: sdc ; status: True ; controllable: True ;
>03:47:19,399 DEBUG blivet: DiskLabel.teardown: device: /dev/sdc ; status: False ; type: disklabel ;
>03:47:19,406 DEBUG blivet: DiskLabel.teardown: device: /dev/sdc ; status: False ; type: disklabel ;
>03:47:19,443 DEBUG blivet: PartitionDevice.teardown: sdb2 ; status: True ; controllable: True ;
>03:47:19,481 DEBUG blivet: DiskDevice.teardown: sdb ; status: True ; controllable: True ;
>03:47:19,486 DEBUG blivet: DiskLabel.teardown: device: /dev/sdb ; status: False ; type: disklabel ;
>03:47:19,489 DEBUG blivet: DiskLabel.teardown: device: /dev/sdb ; status: False ; type: disklabel ;
>03:47:19,526 DEBUG blivet: BTRFSSubVolumeDevice.teardown: root ; status: True ; controllable: True ;
>03:47:19,561 DEBUG blivet: BTRFSVolumeDevice.teardown: fedora_dhcppc0 ; status: True ; controllable: True ;
>03:47:19,600 DEBUG blivet: PartitionDevice.teardown: sda2 ; status: True ; controllable: True ;
>03:47:19,635 DEBUG blivet: DiskDevice.teardown: sda ; status: True ; controllable: True ;
>03:47:19,637 DEBUG blivet: DiskLabel.teardown: device: /dev/sda ; status: False ; type: disklabel ;
>03:47:19,643 DEBUG blivet: DiskLabel.teardown: device: /dev/sda ; status: False ; type: disklabel ;
>03:47:19,680 DEBUG blivet: PartitionDevice.teardown: sdd2 ; status: True ; controllable: True ;
>03:47:19,720 DEBUG blivet: DiskDevice.teardown: sdd ; status: True ; controllable: True ;
>03:47:19,722 DEBUG blivet: DiskLabel.teardown: device: /dev/sdd ; status: False ; type: disklabel ;
>03:47:19,728 DEBUG blivet: DiskLabel.teardown: device: /dev/sdd ; status: False ; type: disklabel ;
>03:47:19,772 DEBUG blivet: PartitionDevice.teardown: sdc2 ; status: True ; controllable: True ;
>03:47:19,817 DEBUG blivet: DiskDevice.teardown: sdc ; status: True ; controllable: True ;
>03:47:19,819 DEBUG blivet: DiskLabel.teardown: device: /dev/sdc ; status: False ; type: disklabel ;
>03:47:19,828 DEBUG blivet: DiskLabel.teardown: device: /dev/sdc ; status: False ; type: disklabel ;
>03:47:19,863 DEBUG blivet: PartitionDevice.teardown: sdb2 ; status: True ; controllable: True ;
>03:47:19,902 DEBUG blivet: DiskDevice.teardown: sdb ; status: True ; controllable: True ;
>03:47:19,906 DEBUG blivet: DiskLabel.teardown: device: /dev/sdb ; status: False ; type: disklabel ;
>03:47:19,909 DEBUG blivet: DiskLabel.teardown: device: /dev/sdb ; status: False ; type: disklabel ;
>03:47:19,952 DEBUG blivet: LoopDevice.teardown: loop0 ; status: False ; controllable: False ;
>03:47:19,953 DEBUG blivet: LoopDevice.teardown: loop1 ; status: False ; controllable: False ;
>03:47:19,954 INFO blivet: not going to restore from backup of non-existent /etc/mdadm.conf
>03:47:19,958 INFO blivet: edd: collected mbr signatures: {'sdd': '0x0009965e', 'sda': '0x0009107f', 'sdb': '0x000950b2', 'sdc': '0x00080b43'}
>03:47:19,959 DEBUG blivet: edd: data extracted from 0x80:
> type: ATA, ata_device: 0
> channel: 0, mbr_signature: 0x0009107f
> pci_dev: 00:06.0, scsi_id: None
> scsi_lun: None, sectors: 24576000
>03:47:19,959 WARN blivet: edd: directory does not exist: /sys/devices/pci0000:00/0000:00:06.0/host0/target0:0:0/0:0:0:0/block
>03:47:19,960 INFO blivet: edd: matched 0x80 to sda using MBR sig
>03:47:19,962 DEBUG blivet: BTRFSSubVolumeDevice.setup: boot ; status: True ; controllable: True ; orig: False ;
>03:47:19,966 INFO blivet: set SELinux context for mountpoint /mnt/sysimage to system_u:object_r:mnt_t:s0
>03:47:20,104 DEBUG blivet: BTRFSSubVolumeDevice.teardown: boot ; status: True ; controllable: True ;
>03:47:20,170 DEBUG blivet: BTRFSVolumeDevice.teardown: fedora_dhcppc0 ; status: True ; controllable: True ;
>03:47:20,212 DEBUG blivet: PartitionDevice.teardown: sda2 ; status: True ; controllable: True ;
>03:47:20,253 DEBUG blivet: DiskDevice.teardown: sda ; status: True ; controllable: True ;
>03:47:20,255 DEBUG blivet: DiskLabel.teardown: device: /dev/sda ; status: False ; type: disklabel ;
>03:47:20,261 DEBUG blivet: DiskLabel.teardown: device: /dev/sda ; status: False ; type: disklabel ;
>03:47:20,297 DEBUG blivet: PartitionDevice.teardown: sdd2 ; status: True ; controllable: True ;
>03:47:20,334 DEBUG blivet: DiskDevice.teardown: sdd ; status: True ; controllable: True ;
>03:47:20,336 DEBUG blivet: DiskLabel.teardown: device: /dev/sdd ; status: False ; type: disklabel ;
>03:47:20,342 DEBUG blivet: DiskLabel.teardown: device: /dev/sdd ; status: False ; type: disklabel ;
>03:47:20,378 DEBUG blivet: PartitionDevice.teardown: sdc2 ; status: True ; controllable: True ;
>03:47:20,419 DEBUG blivet: DiskDevice.teardown: sdc ; status: True ; controllable: True ;
>03:47:20,421 DEBUG blivet: DiskLabel.teardown: device: /dev/sdc ; status: False ; type: disklabel ;
>03:47:20,428 DEBUG blivet: DiskLabel.teardown: device: /dev/sdc ; status: False ; type: disklabel ;
>03:47:20,464 DEBUG blivet: PartitionDevice.teardown: sdb2 ; status: True ; controllable: True ;
>03:47:20,504 DEBUG blivet: DiskDevice.teardown: sdb ; status: True ; controllable: True ;
>03:47:20,507 DEBUG blivet: DiskLabel.teardown: device: /dev/sdb ; status: False ; type: disklabel ;
>03:47:20,511 DEBUG blivet: DiskLabel.teardown: device: /dev/sdb ; status: False ; type: disklabel ;
>03:47:20,552 DEBUG blivet: BTRFSSubVolumeDevice.setup: root ; status: True ; controllable: True ; orig: False ;
>03:47:20,553 INFO blivet: set SELinux context for mountpoint /mnt/sysimage to system_u:object_r:mnt_t:s0
>03:47:21,029 DEBUG blivet: parsing /mnt/sysimage/etc/blkid/blkid.tab
>03:47:21,036 INFO blivet: error parsing blkid.tab: [Errno 2] No such file or directory: '/mnt/sysimage/etc/blkid/blkid.tab'
>03:47:21,037 DEBUG blivet: parsing /mnt/sysimage/etc/crypttab
>03:47:21,038 DEBUG blivet: parsing /mnt/sysimage/etc/blkid/blkid.tab
>03:47:21,038 DEBUG blivet: crypttab maps: []
>03:47:21,039 DEBUG blivet: parsing /mnt/sysimage/etc/fstab
>03:47:21,040 DEBUG blivet: resolved 'UUID=852bfcd3-84c3-4cb0-92cc-787d2f56d51c' to 'root' (btrfs subvolume)
>03:47:21,040 DEBUG blivet: resolved 'UUID=852bfcd3-84c3-4cb0-92cc-787d2f56d51c' to 'boot' (btrfs subvolume)
>03:47:21,041 DEBUG blivet: resolved 'UUID=3cae7094-037f-4b48-bd37-90b60795ca6e' to 'dhcppc0:swap' (mdarray)
>03:47:21,042 DEBUG blivet: BTRFSSubVolumeDevice.teardown: root ; status: True ; controllable: True ;
>03:47:21,219 DEBUG blivet: BTRFS.supported: supported: True ;
>03:47:21,226 DEBUG blivet: raw RAID 1 size == 2041.0
>03:47:21,234 INFO blivet: Using 1MB superBlockSize
>03:47:21,234 DEBUG blivet: existing RAID 1 size == 2039.9375
>03:47:21,236 DEBUG blivet: BTRFS.supported: supported: True ;
>03:47:21,238 DEBUG blivet: Ext4FS.supported: supported: True ;
>03:47:21,241 DEBUG blivet: Ext4FS.supported: supported: True ;
>03:47:21,251 DEBUG blivet: BTRFS.supported: supported: True ;
>03:47:21,306 DEBUG blivet: BTRFS.supported: supported: True ;
>03:47:21,349 DEBUG blivet: BTRFS.supported: supported: True ;
>03:47:21,353 DEBUG blivet: BTRFS.supported: supported: True ;
>03:47:21,360 DEBUG blivet: BTRFS.supported: supported: True ;
>03:47:21,363 DEBUG blivet: OpticalDevice.mediaPresent: sr0 ; status: True ;
>03:47:21,460 DEBUG blivet: Iso9660FS.supported: supported: True ;
>03:47:21,622 DEBUG blivet: /dev/sr0 is mounted on /run/install/repo
>03:47:21,623 DEBUG blivet: /dev/sr0 is mounted on /run/install/repo
>03:47:21,625 DEBUG blivet: DeviceTree.getDeviceByPath: path: /dev/sr0 ;
>03:47:21,627 DEBUG blivet: OpticalDevice.mediaPresent: sr0 ; status: True ;
>03:47:21,635 DEBUG blivet: DeviceTree.getDeviceByPath returned existing 4585MB cdrom sr0 (0) with existing iso9660 filesystem
>03:47:35,644 DEBUG blivet: Iso9660FS.supported: supported: True ;
>03:47:35,648 DEBUG blivet: Iso9660FS.supported: supported: True ;
>03:47:35,649 DEBUG blivet: NFSv4.supported: supported: False ;
>03:47:35,656 DEBUG blivet: NFSv4.supported: supported: False ;
>03:47:35,658 DEBUG blivet: SELinuxFS.supported: supported: False ;
>03:47:35,659 DEBUG blivet: SELinuxFS.supported: supported: False ;
>03:47:35,667 DEBUG blivet: Ext4FS.supported: supported: True ;
>03:47:35,669 DEBUG blivet: Ext4FS.supported: supported: True ;
>03:47:36,086 DEBUG blivet: Ext3FS.supported: supported: True ;
>03:47:36,092 DEBUG blivet: Ext3FS.supported: supported: True ;
>03:47:36,098 DEBUG blivet: Ext2FS.supported: supported: True ;
>03:47:36,100 DEBUG blivet: Ext2FS.supported: supported: True ;
>03:47:36,180 DEBUG blivet: SysFS.supported: supported: False ;
>03:47:36,181 DEBUG blivet: SysFS.supported: supported: False ;
>03:47:36,185 DEBUG blivet: MultipathMember.__init__:
>03:47:36,190 DEBUG blivet: SwapSpace.__init__:
>03:47:36,249 DEBUG blivet: ProcFS.supported: supported: False ;
>03:47:36,250 DEBUG blivet: ProcFS.supported: supported: False ;
>03:47:36,251 DEBUG blivet: NoDevFS.supported: supported: False ;
>03:47:36,253 DEBUG blivet: NoDevFS.supported: supported: False ;
>03:47:36,256 DEBUG blivet: DevPtsFS.supported: supported: False ;
>03:47:36,257 DEBUG blivet: DevPtsFS.supported: supported: False ;
>03:47:36,258 DEBUG blivet: BTRFS.supported: supported: True ;
>03:47:36,265 DEBUG blivet: USBFS.supported: supported: False ;
>03:47:36,266 DEBUG blivet: USBFS.supported: supported: False ;
>03:47:36,267 DEBUG blivet: DiskLabel.__init__:
>03:47:36,268 INFO blivet: DiskLabel.partedDevice returning None
>03:47:36,269 DEBUG blivet: HFSPlus.supported: supported: False ;
>03:47:36,275 DEBUG blivet: HFSPlus.supported: supported: False ;
>03:47:36,276 DEBUG blivet: XFS.supported: supported: True ;
>03:47:36,400 DEBUG blivet: XFS.supported: supported: True ;
>03:47:36,403 DEBUG blivet: TmpFS.supported: supported: False ;
>03:47:36,404 DEBUG blivet: TmpFS.supported: supported: False ;
>03:47:36,405 DEBUG blivet: LUKS.__init__:
>03:47:36,406 DEBUG blivet: NTFS.supported: supported: False ;
>03:47:36,408 DEBUG blivet: NTFS.supported: supported: False ;
>03:47:36,409 DEBUG blivet: BindFS.supported: supported: False ;
>03:47:36,410 DEBUG blivet: BindFS.supported: supported: False ;
>03:47:36,411 DEBUG blivet: HFS.supported: supported: False ;
>03:47:36,412 DEBUG blivet: HFS.supported: supported: False ;
>03:47:36,414 DEBUG blivet: LVMPhysicalVolume.__init__:
>03:47:36,415 DEBUG blivet: NFS.supported: supported: False ;
>03:47:36,416 DEBUG blivet: NFS.supported: supported: False ;
>03:47:36,417 DEBUG blivet: FATFS.supported: supported: True ;
>03:47:36,474 DEBUG blivet: FATFS.supported: supported: True ;
>03:47:36,476 DEBUG blivet: DMRaidMember.__init__:
>03:47:36,478 DEBUG blivet: MDRaidMember.__init__:
>03:48:08,782 DEBUG blivet: clearpart: looking at sda2
>03:48:08,783 DEBUG blivet: clearpart: looking at sdb2
>03:48:08,784 DEBUG blivet: clearpart: looking at sdc2
>03:48:08,786 DEBUG blivet: clearpart: looking at sdd2
>03:48:08,786 DEBUG blivet: clearpart: looking at sda1
>03:48:08,787 DEBUG blivet: clearpart: looking at sdb1
>03:48:08,787 DEBUG blivet: clearpart: looking at sdc1
>03:48:08,788 DEBUG blivet: clearpart: looking at sdd1
>03:48:08,790 DEBUG blivet: checking whether disk sda has an empty extended
>03:48:08,793 DEBUG blivet: extended is None ; logicals is []
>03:48:08,794 DEBUG blivet: checking whether disk sdb has an empty extended
>03:48:08,794 DEBUG blivet: extended is None ; logicals is []
>03:48:08,795 DEBUG blivet: checking whether disk sdc has an empty extended
>03:48:08,796 DEBUG blivet: extended is None ; logicals is []
>03:48:08,798 DEBUG blivet: checking whether disk sdd has an empty extended
>03:48:08,799 DEBUG blivet: extended is None ; logicals is []
>03:48:08,906 DEBUG blivet: DeviceTree.getDeviceByName: name: sda ;
>03:48:08,909 DEBUG blivet: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with existing msdos disklabel
>03:48:08,913 DEBUG blivet: DeviceTree.getDeviceByName: name: sda ;
>03:48:08,917 DEBUG blivet: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with existing msdos disklabel
>03:48:08,919 DEBUG blivet: resolved 'sda' to 'sda' (disk)
>03:48:08,946 DEBUG blivet: starting Blivet copy
>03:48:08,988 DEBUG blivet: PartitionDevice._setPartedPartition: sda1 ;
>03:48:08,990 DEBUG blivet: device sda1 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05319b90> fileSystem: None
> number: 1 path: /dev/sda1 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae05b16750> PedPartition: <_ped.Partition object at 0x7fae05b14350>
>03:48:08,992 DEBUG blivet: PartitionDevice._setPartedPartition: sda2 ;
>03:48:08,993 DEBUG blivet: device sda2 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05319b90> fileSystem: None
> number: 2 path: /dev/sda2 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae05b168d0> PedPartition: <_ped.Partition object at 0x7fae05b14290>
>03:48:08,996 DEBUG blivet: PartitionDevice._setPartedPartition: sdb1 ;
>03:48:08,997 DEBUG blivet: device sdb1 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05b11910> fileSystem: None
> number: 1 path: /dev/sdb1 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae05b169d0> PedPartition: <_ped.Partition object at 0x7fae05b142f0>
>03:48:08,999 DEBUG blivet: PartitionDevice._setPartedPartition: sdb2 ;
>03:48:09,000 DEBUG blivet: device sdb2 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05b11910> fileSystem: None
> number: 2 path: /dev/sdb2 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae05b16b50> PedPartition: <_ped.Partition object at 0x7fae05b14470>
>03:48:09,002 DEBUG blivet: PartitionDevice._setPartedPartition: sdc1 ;
>03:48:09,003 DEBUG blivet: device sdc1 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05aeae10> fileSystem: None
> number: 1 path: /dev/sdc1 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae05b16c50> PedPartition: <_ped.Partition object at 0x7fae05b143b0>
>03:48:09,005 DEBUG blivet: PartitionDevice._setPartedPartition: sdc2 ;
>03:48:09,006 DEBUG blivet: device sdc2 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05aeae10> fileSystem: None
> number: 2 path: /dev/sdc2 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae05b16dd0> PedPartition: <_ped.Partition object at 0x7fae05b14410>
>03:48:09,008 DEBUG blivet: PartitionDevice._setPartedPartition: sdd1 ;
>03:48:09,010 DEBUG blivet: device sdd1 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05aea3d0> fileSystem: None
> number: 1 path: /dev/sdd1 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae05b16ed0> PedPartition: <_ped.Partition object at 0x7fae05b144d0>
>03:48:09,011 DEBUG blivet: PartitionDevice._setPartedPartition: sdd2 ;
>03:48:09,013 DEBUG blivet: device sdd2 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05aea3d0> fileSystem: None
> number: 2 path: /dev/sdd2 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae05b1e090> PedPartition: <_ped.Partition object at 0x7fae05b14530>
>03:48:09,013 DEBUG blivet: finished Blivet copy
>03:48:09,055 DEBUG blivet: raw RAID 1 size == 2041.0
>03:48:09,056 INFO blivet: Using 1MB superBlockSize
>03:48:09,056 DEBUG blivet: existing RAID 1 size == 2039.9375
>03:48:10,600 DEBUG blivet: BTRFS.supported: supported: True ;
>03:48:10,601 DEBUG blivet: getFormat('btrfs') returning BTRFS instance
>03:48:10,609 DEBUG blivet: BTRFS.supported: supported: True ;
>03:48:10,609 DEBUG blivet: getFormat('btrfs') returning BTRFS instance
>03:48:10,611 DEBUG blivet: BTRFS.supported: supported: True ;
>03:48:10,612 DEBUG storage.ui: instantiating <class 'blivet.devicefactory.BTRFSFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 0, [], {}
>03:48:10,630 DEBUG storage.ui: instantiating <class 'blivet.devicefactory.BTRFSFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 0, [], {}
>03:48:11,380 DEBUG blivet: BTRFS.supported: supported: True ;
>03:48:11,382 DEBUG blivet: getFormat('btrfs') returning BTRFS instance
>03:48:11,388 DEBUG storage.ui: instantiating <class 'blivet.devicefactory.BTRFSFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 39832, ['sda', 'sdd', 'sdc', 'sdb'], {'encrypted': False, 'raid_level': None}
>03:48:11,404 DEBUG blivet: BTRFS.supported: supported: True ;
>03:48:11,404 DEBUG blivet: getFormat('btrfs') returning BTRFS instance
>03:48:11,412 DEBUG storage.ui: instantiating <class 'blivet.devicefactory.BTRFSFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 0, [], {}
>03:48:11,417 DEBUG blivet: raw RAID 1 size == 2041.0
>03:48:11,418 INFO blivet: Using 1MB superBlockSize
>03:48:11,418 DEBUG blivet: existing RAID 1 size == 2039.9375
>03:48:11,420 DEBUG blivet: raw RAID 1 size == 2041.0
>03:48:11,421 INFO blivet: Using 1MB superBlockSize
>03:48:11,422 DEBUG blivet: existing RAID 1 size == 2039.9375
>03:48:11,427 DEBUG blivet: raw RAID 1 size == 2041.0
>03:48:11,428 INFO blivet: Using 1MB superBlockSize
>03:48:11,428 DEBUG blivet: existing RAID 1 size == 2039.9375
>03:48:11,433 DEBUG blivet: SwapSpace.__init__:
>03:48:11,434 DEBUG blivet: getFormat('swap') returning SwapSpace instance
>03:48:12,653 DEBUG blivet: raw RAID 1 size == 2041.0
>03:48:12,655 INFO blivet: Using 1MB superBlockSize
>03:48:12,658 DEBUG blivet: existing RAID 1 size == 2039.9375
>03:48:14,502 DEBUG storage.ui: getFormat('None') returning DeviceFormat instance
>03:48:14,506 DEBUG storage.ui: BTRFSSubVolumeDevice._setFormat: boot ; current: btrfs ; type: None ;
>03:48:14,507 INFO storage.ui: registered action: [0] Destroy Format btrfs filesystem on btrfs subvolume boot (id 6)
>03:48:14,508 DEBUG storage.ui: BTRFSSubVolumeDevice.teardown: boot ; status: True ; controllable: True ;
>03:48:14,511 DEBUG storage.ui: DeviceFormat.teardown: device: /dev/sda2 ; status: False ; type: None ;
>03:48:14,537 INFO storage.ui: removed btrfs subvolume boot (id 6) from device tree
>03:48:14,540 DEBUG storage.ui: BTRFSVolumeDevice.removeChild: kids: 2 ; name: fedora_dhcppc0 ;
>03:48:14,540 INFO storage.ui: registered action: [1] Destroy Device btrfs subvolume boot (id 6)
>03:48:14,542 DEBUG storage.ui: getFormat('None') returning DeviceFormat instance
>03:48:14,543 DEBUG storage.ui: BTRFSSubVolumeDevice._setFormat: root ; current: btrfs ; type: None ;
>03:48:14,544 INFO storage.ui: registered action: [2] Destroy Format btrfs filesystem on btrfs subvolume root (id 7)
>03:48:14,546 DEBUG storage.ui: BTRFSSubVolumeDevice.teardown: root ; status: True ; controllable: True ;
>03:48:14,549 DEBUG storage.ui: DeviceFormat.teardown: device: /dev/sda2 ; status: False ; type: None ;
>03:48:14,574 INFO storage.ui: removed btrfs subvolume root (id 7) from device tree
>03:48:14,576 DEBUG storage.ui: BTRFSVolumeDevice.removeChild: kids: 1 ; name: fedora_dhcppc0 ;
>03:48:14,577 INFO storage.ui: registered action: [3] Destroy Device btrfs subvolume root (id 7)
>03:48:14,578 DEBUG storage.ui: getFormat('None') returning DeviceFormat instance
>03:48:14,580 DEBUG storage.ui: BTRFSVolumeDevice._setFormat: fedora_dhcppc0 ; current: btrfs ; type: None ;
>03:48:14,581 INFO storage.ui: registered action: [4] Destroy Format btrfs filesystem on btrfs volume btrfs.5 (id 5)
>03:48:14,583 DEBUG storage.ui: BTRFSVolumeDevice.teardown: btrfs.5 ; status: True ; controllable: True ;
>03:48:14,585 DEBUG storage.ui: DeviceFormat.teardown: device: /dev/sda2 ; status: False ; type: None ;
>03:48:14,610 INFO storage.ui: removed btrfs volume btrfs.5 (id 5) from device tree
>03:48:14,612 DEBUG storage.ui: PartitionDevice.removeChild: kids: 1 ; name: sda2 ;
>03:48:14,614 DEBUG storage.ui: PartitionDevice.removeChild: kids: 1 ; name: sdd2 ;
>03:48:14,615 DEBUG storage.ui: PartitionDevice.removeChild: kids: 1 ; name: sdc2 ;
>03:48:14,617 DEBUG storage.ui: PartitionDevice.removeChild: kids: 1 ; name: sdb2 ;
>03:48:14,618 INFO storage.ui: registered action: [5] Destroy Device btrfs volume btrfs.5 (id 5)
>03:48:14,620 DEBUG storage.ui: PartitionDevice._setFormat: sda2 ;
>03:48:14,621 DEBUG storage.ui: getFormat('None') returning DeviceFormat instance
>03:48:14,623 DEBUG storage.ui: PartitionDevice._setFormat: sda2 ; current: btrfs ; type: None ;
>03:48:14,623 INFO storage.ui: registered action: [6] Destroy Format btrfs filesystem on partition sda2 (id 4)
>03:48:14,625 DEBUG storage.ui: PartitionDevice.teardown: sda2 ; status: True ; controllable: True ;
>03:48:14,628 DEBUG storage.ui: DeviceFormat.teardown: device: /dev/sda2 ; status: False ; type: None ;
>03:48:14,654 INFO storage.ui: removed partition sda2 (id 4) from device tree
>03:48:14,656 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sda ;
>03:48:14,657 INFO storage.ui: registered action: [7] Destroy Device partition sda2 (id 4)
>03:48:14,660 DEBUG storage.ui: PartitionDevice._setFormat: sdd2 ;
>03:48:14,661 DEBUG storage.ui: getFormat('None') returning DeviceFormat instance
>03:48:14,663 DEBUG storage.ui: PartitionDevice._setFormat: sdd2 ; current: btrfs ; type: None ;
>03:48:14,664 INFO storage.ui: registered action: [8] Destroy Format btrfs filesystem on partition sdd2 (id 10)
>03:48:14,666 DEBUG storage.ui: PartitionDevice.teardown: sdd2 ; status: True ; controllable: True ;
>03:48:14,669 DEBUG storage.ui: DeviceFormat.teardown: device: /dev/sdd2 ; status: False ; type: None ;
>03:48:14,695 INFO storage.ui: removed partition sdd2 (id 10) from device tree
>03:48:14,697 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdd ;
>03:48:14,698 INFO storage.ui: registered action: [9] Destroy Device partition sdd2 (id 10)
>03:48:14,700 DEBUG storage.ui: PartitionDevice._setFormat: sdc2 ;
>03:48:14,701 DEBUG storage.ui: getFormat('None') returning DeviceFormat instance
>03:48:14,703 DEBUG storage.ui: PartitionDevice._setFormat: sdc2 ; current: btrfs ; type: None ;
>03:48:14,704 INFO storage.ui: registered action: [10] Destroy Format btrfs filesystem on partition sdc2 (id 13)
>03:48:14,706 DEBUG storage.ui: PartitionDevice.teardown: sdc2 ; status: True ; controllable: True ;
>03:48:14,709 DEBUG storage.ui: DeviceFormat.teardown: device: /dev/sdc2 ; status: False ; type: None ;
>03:48:14,734 INFO storage.ui: removed partition sdc2 (id 13) from device tree
>03:48:14,737 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdc ;
>03:48:14,738 INFO storage.ui: registered action: [11] Destroy Device partition sdc2 (id 13)
>03:48:14,740 DEBUG storage.ui: PartitionDevice._setFormat: sdb2 ;
>03:48:14,741 DEBUG storage.ui: getFormat('None') returning DeviceFormat instance
>03:48:14,743 DEBUG storage.ui: PartitionDevice._setFormat: sdb2 ; current: btrfs ; type: None ;
>03:48:14,744 INFO storage.ui: registered action: [12] Destroy Format btrfs filesystem on partition sdb2 (id 16)
>03:48:14,746 DEBUG storage.ui: PartitionDevice.teardown: sdb2 ; status: True ; controllable: True ;
>03:48:14,749 DEBUG storage.ui: DeviceFormat.teardown: device: /dev/sdb2 ; status: False ; type: None ;
>03:48:14,776 INFO storage.ui: removed partition sdb2 (id 16) from device tree
>03:48:14,778 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdb ;
>03:48:14,779 INFO storage.ui: registered action: [13] Destroy Device partition sdb2 (id 16)
>03:48:14,781 DEBUG storage.ui: SwapSpace.teardown: device: /dev/md/dhcppc0:swap ; status: False ; type: swap ;
>03:48:14,782 DEBUG storage.ui: getFormat('None') returning DeviceFormat instance
>03:48:14,784 DEBUG storage.ui: MDRaidArrayDevice._setFormat: dhcppc0:swap ; current: swap ; type: None ;
>03:48:14,785 INFO storage.ui: registered action: [14] Destroy Format swap on mdarray dhcppc0:swap (id 3)
>03:48:14,787 DEBUG storage.ui: MDRaidArrayDevice.teardown: dhcppc0:swap ; status: False ; controllable: True ;
>03:48:14,788 INFO storage.ui: removed mdarray dhcppc0:swap (id 3) from device tree
>03:48:14,790 DEBUG storage.ui: PartitionDevice.removeChild: kids: 1 ; name: sda1 ;
>03:48:14,792 DEBUG storage.ui: PartitionDevice.removeChild: kids: 1 ; name: sdd1 ;
>03:48:14,794 DEBUG storage.ui: PartitionDevice.removeChild: kids: 1 ; name: sdc1 ;
>03:48:14,796 DEBUG storage.ui: PartitionDevice.removeChild: kids: 1 ; name: sdb1 ;
>03:48:14,796 INFO storage.ui: registered action: [15] Destroy Device mdarray dhcppc0:swap (id 3)
>03:48:14,799 DEBUG storage.ui: MDRaidMember.teardown: device: /dev/sda1 ; status: False ; type: mdmember ;
>03:48:14,801 DEBUG storage.ui: PartitionDevice._setFormat: sda1 ;
>03:48:14,801 DEBUG storage.ui: getFormat('None') returning DeviceFormat instance
>03:48:14,803 DEBUG storage.ui: PartitionDevice._setFormat: sda1 ; current: mdmember ; type: None ;
>03:48:14,804 INFO storage.ui: registered action: [16] Destroy Format mdmember on partition sda1 (id 2)
>03:48:14,806 DEBUG storage.ui: PartitionDevice.teardown: sda1 ; status: True ; controllable: True ;
>03:48:14,808 DEBUG storage.ui: MDRaidMember.teardown: device: /dev/sda1 ; status: False ; type: mdmember ;
>03:48:14,811 DEBUG storage.ui: DeviceFormat.teardown: device: /dev/sda1 ; status: False ; type: None ;
>03:48:14,837 INFO storage.ui: removed partition sda1 (id 2) from device tree
>03:48:14,839 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sda ;
>03:48:14,839 INFO storage.ui: registered action: [17] Destroy Device partition sda1 (id 2)
>03:48:14,842 DEBUG storage.ui: DiskLabel.teardown: device: /dev/sda ; status: False ; type: disklabel ;
>03:48:14,843 DEBUG storage.ui: getFormat('None') returning DeviceFormat instance
>03:48:14,846 DEBUG storage.ui: DiskDevice._setFormat: sda ; current: disklabel ; type: None ;
>03:48:14,847 INFO storage.ui: registered action: [18] Destroy Format msdos disklabel on disk sda (id 1)
>03:48:14,847 DEBUG storage.ui: required disklabel type for sda (1) is None
>03:48:14,848 DEBUG storage.ui: default disklabel type for sda is msdos
>03:48:14,849 DEBUG storage.ui: selecting msdos disklabel for sda based on size
>03:48:14,851 DEBUG storage.ui: DiskLabel.__init__: device: /dev/sda ; labelType: msdos ;
>03:48:14,853 DEBUG storage.ui: DiskLabel.freshPartedDisk: device: /dev/sda ; labelType: msdos ;
>03:48:14,854 DEBUG storage.ui: Did not change pmbr_boot on parted.Disk instance --
> type: msdos primaryPartitionCount: 0
> lastPartitionNumber: -1 maxPrimaryPartitionCount: 4
> partitions: []
> device: <parted.device.Device object at 0x7fae05b094d0>
> PedDisk: <_ped.Disk object at 0x7fae04e47518>
>03:48:14,855 DEBUG storage.ui: getFormat('disklabel') returning DiskLabel instance
>03:48:14,857 DEBUG storage.ui: DeviceFormat.teardown: device: /dev/sda ; status: False ; type: None ;
>03:48:14,859 DEBUG storage.ui: DiskDevice._setFormat: sda ; current: None ; type: disklabel ;
>03:48:14,860 INFO storage.ui: registered action: [19] Create Format msdos disklabel on disk sda (id 1)
>03:48:14,862 DEBUG storage.ui: MDRaidMember.teardown: device: /dev/sdd1 ; status: False ; type: mdmember ;
>03:48:14,864 DEBUG storage.ui: PartitionDevice._setFormat: sdd1 ;
>03:48:14,865 DEBUG storage.ui: getFormat('None') returning DeviceFormat instance
>03:48:14,867 DEBUG storage.ui: PartitionDevice._setFormat: sdd1 ; current: mdmember ; type: None ;
>03:48:14,867 INFO storage.ui: registered action: [20] Destroy Format mdmember on partition sdd1 (id 9)
>03:48:14,869 DEBUG storage.ui: PartitionDevice.teardown: sdd1 ; status: True ; controllable: True ;
>03:48:14,872 DEBUG storage.ui: MDRaidMember.teardown: device: /dev/sdd1 ; status: False ; type: mdmember ;
>03:48:14,874 DEBUG storage.ui: DeviceFormat.teardown: device: /dev/sdd1 ; status: False ; type: None ;
>03:48:14,900 INFO storage.ui: removed partition sdd1 (id 9) from device tree
>03:48:14,903 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdd ;
>03:48:14,903 INFO storage.ui: registered action: [21] Destroy Device partition sdd1 (id 9)
>03:48:14,906 DEBUG storage.ui: DiskLabel.teardown: device: /dev/sdd ; status: False ; type: disklabel ;
>03:48:14,907 DEBUG storage.ui: getFormat('None') returning DeviceFormat instance
>03:48:14,908 DEBUG storage.ui: DiskDevice._setFormat: sdd ; current: disklabel ; type: None ;
>03:48:14,909 INFO storage.ui: registered action: [22] Destroy Format msdos disklabel on disk sdd (id 8)
>03:48:14,910 DEBUG storage.ui: required disklabel type for sdd (1) is None
>03:48:14,910 DEBUG storage.ui: default disklabel type for sdd is msdos
>03:48:14,911 DEBUG storage.ui: selecting msdos disklabel for sdd based on size
>03:48:14,913 DEBUG storage.ui: DiskLabel.__init__: device: /dev/sdd ; labelType: msdos ;
>03:48:14,915 DEBUG storage.ui: DiskLabel.freshPartedDisk: device: /dev/sdd ; labelType: msdos ;
>03:48:14,917 DEBUG storage.ui: Did not change pmbr_boot on parted.Disk instance --
> type: msdos primaryPartitionCount: 0
> lastPartitionNumber: -1 maxPrimaryPartitionCount: 4
> partitions: []
> device: <parted.device.Device object at 0x7fae05b09f10>
> PedDisk: <_ped.Disk object at 0x7fae04c84d40>
>03:48:14,918 DEBUG storage.ui: getFormat('disklabel') returning DiskLabel instance
>03:48:14,919 DEBUG storage.ui: DeviceFormat.teardown: device: /dev/sdd ; status: False ; type: None ;
>03:48:14,921 DEBUG storage.ui: DiskDevice._setFormat: sdd ; current: None ; type: disklabel ;
>03:48:14,922 INFO storage.ui: registered action: [23] Create Format msdos disklabel on disk sdd (id 8)
>03:48:14,924 DEBUG storage.ui: MDRaidMember.teardown: device: /dev/sdc1 ; status: False ; type: mdmember ;
>03:48:14,926 DEBUG storage.ui: PartitionDevice._setFormat: sdc1 ;
>03:48:14,927 DEBUG storage.ui: getFormat('None') returning DeviceFormat instance
>03:48:14,929 DEBUG storage.ui: PartitionDevice._setFormat: sdc1 ; current: mdmember ; type: None ;
>03:48:14,930 INFO storage.ui: registered action: [24] Destroy Format mdmember on partition sdc1 (id 12)
>03:48:14,932 DEBUG storage.ui: PartitionDevice.teardown: sdc1 ; status: True ; controllable: True ;
>03:48:14,934 DEBUG storage.ui: MDRaidMember.teardown: device: /dev/sdc1 ; status: False ; type: mdmember ;
>03:48:14,937 DEBUG storage.ui: DeviceFormat.teardown: device: /dev/sdc1 ; status: False ; type: None ;
>03:48:14,965 INFO storage.ui: removed partition sdc1 (id 12) from device tree
>03:48:14,967 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdc ;
>03:48:14,967 INFO storage.ui: registered action: [25] Destroy Device partition sdc1 (id 12)
>03:48:14,970 DEBUG storage.ui: DiskLabel.teardown: device: /dev/sdc ; status: False ; type: disklabel ;
>03:48:14,971 DEBUG storage.ui: getFormat('None') returning DeviceFormat instance
>03:48:14,973 DEBUG storage.ui: DiskDevice._setFormat: sdc ; current: disklabel ; type: None ;
>03:48:14,974 INFO storage.ui: registered action: [26] Destroy Format msdos disklabel on disk sdc (id 11)
>03:48:14,974 DEBUG storage.ui: required disklabel type for sdc (1) is None
>03:48:14,975 DEBUG storage.ui: default disklabel type for sdc is msdos
>03:48:14,976 DEBUG storage.ui: selecting msdos disklabel for sdc based on size
>03:48:14,978 DEBUG storage.ui: DiskLabel.__init__: device: /dev/sdc ; labelType: msdos ;
>03:48:14,980 DEBUG storage.ui: DiskLabel.freshPartedDisk: device: /dev/sdc ; labelType: msdos ;
>03:48:14,982 DEBUG storage.ui: Did not change pmbr_boot on parted.Disk instance --
> type: msdos primaryPartitionCount: 0
> lastPartitionNumber: -1 maxPrimaryPartitionCount: 4
> partitions: []
> device: <parted.device.Device object at 0x7fae05aee710>
> PedDisk: <_ped.Disk object at 0x7fae0df19d40>
>03:48:14,982 DEBUG storage.ui: getFormat('disklabel') returning DiskLabel instance
>03:48:14,984 DEBUG storage.ui: DeviceFormat.teardown: device: /dev/sdc ; status: False ; type: None ;
>03:48:14,986 DEBUG storage.ui: DiskDevice._setFormat: sdc ; current: None ; type: disklabel ;
>03:48:14,987 INFO storage.ui: registered action: [27] Create Format msdos disklabel on disk sdc (id 11)
>03:48:14,990 DEBUG storage.ui: MDRaidMember.teardown: device: /dev/sdb1 ; status: False ; type: mdmember ;
>03:48:14,992 DEBUG storage.ui: PartitionDevice._setFormat: sdb1 ;
>03:48:14,993 DEBUG storage.ui: getFormat('None') returning DeviceFormat instance
>03:48:14,995 DEBUG storage.ui: 
PartitionDevice._setFormat: sdb1 ; current: mdmember ; type: None ; >03:48:14,996 INFO storage.ui: registered action: [28] Destroy Format mdmember on partition sdb1 (id 15) >03:48:14,998 DEBUG storage.ui: PartitionDevice.teardown: sdb1 ; status: True ; controllable: True ; >03:48:15,000 DEBUG storage.ui: MDRaidMember.teardown: device: /dev/sdb1 ; status: False ; type: mdmember ; >03:48:15,003 DEBUG storage.ui: DeviceFormat.teardown: device: /dev/sdb1 ; status: False ; type: None ; >03:48:15,032 INFO storage.ui: removed partition sdb1 (id 15) from device tree >03:48:15,034 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdb ; >03:48:15,035 INFO storage.ui: registered action: [29] Destroy Device partition sdb1 (id 15) >03:48:15,037 DEBUG storage.ui: DiskLabel.teardown: device: /dev/sdb ; status: False ; type: disklabel ; >03:48:15,038 DEBUG storage.ui: getFormat('None') returning DeviceFormat instance >03:48:15,040 DEBUG storage.ui: DiskDevice._setFormat: sdb ; current: disklabel ; type: None ; >03:48:15,041 INFO storage.ui: registered action: [30] Destroy Format msdos disklabel on disk sdb (id 14) >03:48:15,041 DEBUG storage.ui: required disklabel type for sdb (1) is None >03:48:15,042 DEBUG storage.ui: default disklabel type for sdb is msdos >03:48:15,042 DEBUG storage.ui: selecting msdos disklabel for sdb based on size >03:48:15,044 DEBUG storage.ui: DiskLabel.__init__: device: /dev/sdb ; labelType: msdos ; >03:48:15,046 DEBUG storage.ui: DiskLabel.freshPartedDisk: device: /dev/sdb ; labelType: msdos ; >03:48:15,048 DEBUG storage.ui: Did not change pmbr_boot on parted.Disk instance -- > type: msdos primaryPartitionCount: 0 > lastPartitionNumber: -1 maxPrimaryPartitionCount: 4 > partitions: [] > device: <parted.device.Device object at 0x7fae05aee3d0> > PedDisk: <_ped.Disk object at 0x7fae04d993f8> >03:48:15,048 DEBUG storage.ui: getFormat('disklabel') returning DiskLabel instance >03:48:15,050 DEBUG storage.ui: DeviceFormat.teardown: device: /dev/sdb ; 
status: False ; type: None ; >03:48:15,052 DEBUG storage.ui: DiskDevice._setFormat: sdb ; current: None ; type: disklabel ; >03:48:15,053 INFO storage.ui: registered action: [31] Create Format msdos disklabel on disk sdb (id 14) >03:48:28,832 DEBUG storage.ui: instantiating <class 'blivet.devicefactory.PartitionFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 768.0, [], {} >03:48:28,838 DEBUG storage.ui: Blivet.factoryDevice: 2 ; 768.0 ; mountpoint: None ; disks: [DiskDevice instance (0x7fae05319950) -- > name = sda status = True kids = 0 id = 1 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 0 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sda type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 0 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae127eacb0> > target size = 0 path = /dev/sda > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae053199d0>, DiskDevice instance (0x7fae05b116d0) -- > name = sdb status = True kids = 0 id = 14 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 16 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdb type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 768 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: 
<_ped.Device object at 0x7fae0f674a70> > target size = 0 path = /dev/sdb > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05b11750>, DiskDevice instance (0x7fae05aeabd0) -- > name = sdc status = True kids = 0 id = 11 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 32 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdc type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 512 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f674170> > target size = 0 path = /dev/sdc > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05aeac50>, DiskDevice instance (0x7fae05aea190) -- > name = sdd status = True kids = 0 id = 8 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 48 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdd type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 256 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae27c59680> > target size = 0 path = /dev/sdd > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05aea210>] ; fstype: swap ; encrypted: False ; >03:48:28,839 DEBUG storage.ui: 
instantiating <class 'blivet.devicefactory.PartitionFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 768.0, ['sda', 'sdb', 'sdc', 'sdd'], {'mountpoint': None, 'fstype': 'swap', 'encrypted': False} >03:48:28,842 DEBUG storage.ui: PartitionFactory.configure: parent_factory: None ; >03:48:28,842 DEBUG storage.ui: starting Blivet copy >03:48:28,889 DEBUG storage.ui: finished Blivet copy >03:48:28,892 DEBUG storage.ui: SwapSpace.__init__: >03:48:28,893 DEBUG storage.ui: getFormat('swap') returning SwapSpace instance >03:48:28,895 DEBUG storage.ui: SwapSpace.__init__: >03:48:28,896 DEBUG storage.ui: getFormat('swap') returning SwapSpace instance >03:48:28,898 DEBUG storage.ui: SwapSpace.__init__: mountpoint: None ; >03:48:28,899 DEBUG storage.ui: getFormat('swap') returning SwapSpace instance >03:48:28,901 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sda ; >03:48:28,903 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdb ; >03:48:28,906 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdc ; >03:48:28,908 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdd ; >03:48:28,910 DEBUG storage.ui: PartitionDevice._setFormat: req0 ; >03:48:28,912 DEBUG storage.ui: PartitionDevice._setFormat: req0 ; current: None ; type: swap ; >03:48:28,914 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sda ; >03:48:28,916 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdb ; >03:48:28,919 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdc ; >03:48:28,921 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdd ; >03:48:28,921 INFO storage.ui: added partition req0 (id 24) to device tree >03:48:28,922 INFO storage.ui: registered action: [32] Create Device partition req0 (id 24) >03:48:28,922 DEBUG storage.ui: getFormat('None') returning DeviceFormat instance >03:48:28,923 INFO storage.ui: registered action: [33] Create Format swap on partition req0 (id 24) >03:48:28,926 DEBUG storage.ui: DiskDevice.setup: sda ; status: True ; 
controllable: True ; orig: False ; >03:48:28,928 DEBUG storage.ui: DiskDevice.setup: sdb ; status: True ; controllable: True ; orig: False ; >03:48:28,930 DEBUG storage.ui: DiskDevice.setup: sdc ; status: True ; controllable: True ; orig: False ; >03:48:28,932 DEBUG storage.ui: DiskDevice.setup: sdd ; status: True ; controllable: True ; orig: False ; >03:48:28,933 DEBUG storage.ui: removing all non-preexisting partitions ['req0(id 24)'] from disk(s) ['sda', 'sdb', 'sdc', 'sdd'] >03:48:28,935 DEBUG storage.ui: allocatePartitions: disks=['sda', 'sdb', 'sdc', 'sdd'] ; partitions=['req0(id 24)'] >03:48:28,935 DEBUG storage.ui: removing all non-preexisting partitions ['req0(id 24)'] from disk(s) ['sda', 'sdb', 'sdc', 'sdd'] >03:48:28,938 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:28,940 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:28,941 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:28,943 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:28,945 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:28,946 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:28,948 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:28,950 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:28,951 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:28,953 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:28,956 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:28,956 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:28,958 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:28,961 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos 
disklabel >03:48:28,961 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:28,963 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:28,967 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:28,967 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:28,970 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:28,972 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:28,973 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:28,975 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:28,978 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:28,978 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:28,979 DEBUG storage.ui: allocating partition: req0 ; id: 24 ; disks: ['sda', 'sdb', 'sdc', 'sdd'] ; >boot: False ; primary: False ; size: 1MB ; grow: True ; max_size: 768.0 >03:48:28,980 DEBUG storage.ui: checking freespace on sda >03:48:28,981 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sda part_type=0 req_size=1MB boot=False best=None grow=True >03:48:28,981 DEBUG storage.ui: current free range is 63-24575999 (11999MB) >03:48:28,982 DEBUG storage.ui: evaluating growth potential for new layout >03:48:28,983 DEBUG storage.ui: calculating growth for disk /dev/sdd >03:48:28,983 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999 >sectorSize = 512 > >03:48:28,984 DEBUG storage.ui: disk /dev/sdd growth: 0 (0MB) >03:48:28,984 DEBUG storage.ui: calculating growth for disk /dev/sdb >03:48:28,985 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999 >sectorSize = 512 > >03:48:28,986 DEBUG storage.ui: disk /dev/sdb growth: 0 (0MB) >03:48:28,986 DEBUG storage.ui: calculating growth for disk /dev/sdc >03:48:28,987 DEBUG storage.ui: 
Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999 >sectorSize = 512 > >03:48:28,988 DEBUG storage.ui: disk /dev/sdc growth: 0 (0MB) >03:48:28,988 DEBUG storage.ui: calculating growth for disk /dev/sda >03:48:28,989 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:48:28,992 DEBUG storage.ui: PartitionDevice._setPartedPartition: req0 ; >03:48:28,993 DEBUG storage.ui: device req0 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f81610> PedPartition: <_ped.Partition object at 0x7fae04fb37d0> >03:48:28,996 DEBUG storage.ui: PartitionDevice._setDisk: sda1 ; new: sda ; old: None ; >03:48:28,998 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sda ; >03:48:28,999 DEBUG storage.ui: adding request 24 to chunk 24575937 (63-24575999) on /dev/sda >03:48:29,000 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > >03:48:29,000 DEBUG storage.ui: req: PartitionRequest instance -- >id = 24 name = sda1 growable = True >base = 2048 growth = 0 max_grow = 1570816 >done = False >03:48:29,001 DEBUG storage.ui: 1 requests and 24573889 (11998MB) left in chunk >03:48:29,001 DEBUG storage.ui: adding 24573889 (11998MB) to 24 (sda1) >03:48:29,002 DEBUG storage.ui: taking back 23003073 (11231MB) from 24 (sda1) >03:48:29,002 DEBUG storage.ui: new grow amount for request 24 (sda1) is 1570816 units, or 767MB >03:48:29,003 DEBUG storage.ui: request 24 (sda1) growth: 1570816 (767MB) size: 768MB >03:48:29,004 DEBUG storage.ui: disk /dev/sda growth: 1570816 (767MB) >03:48:29,006 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:48:29,007 DEBUG storage.ui: device sda1 new partedPartition None >03:48:29,009 DEBUG storage.ui: PartitionDevice._setDisk: req0 ; new: None ; old: sda ; >03:48:29,011 DEBUG 
storage.ui: DiskDevice.removeChild: kids: 1 ; name: sda ; >03:48:29,011 DEBUG storage.ui: total growth: 1570816 sectors >03:48:29,012 DEBUG storage.ui: updating use_disk to sda, type: 0 >03:48:29,013 DEBUG storage.ui: new free: 63-24575999 / 11999MB >03:48:29,013 DEBUG storage.ui: new free allows for 1570816 sectors of growth >03:48:29,014 DEBUG storage.ui: checking freespace on sdb >03:48:29,015 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdb part_type=0 req_size=1MB boot=False best=None grow=True >03:48:29,016 DEBUG storage.ui: current free range is 63-24575999 (11999MB) >03:48:29,016 DEBUG storage.ui: evaluating growth potential for new layout >03:48:29,017 DEBUG storage.ui: calculating growth for disk /dev/sdd >03:48:29,018 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999 >sectorSize = 512 > >03:48:29,018 DEBUG storage.ui: disk /dev/sdd growth: 0 (0MB) >03:48:29,019 DEBUG storage.ui: calculating growth for disk /dev/sdb >03:48:29,020 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:48:29,023 DEBUG storage.ui: PartitionDevice._setPartedPartition: req0 ; >03:48:29,024 DEBUG storage.ui: device req0 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f81a90> PedPartition: <_ped.Partition object at 0x7fae04fb3830> >03:48:29,026 DEBUG storage.ui: PartitionDevice._setDisk: sdb1 ; new: sdb ; old: None ; >03:48:29,029 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdb ; >03:48:29,030 DEBUG storage.ui: adding request 24 to chunk 24575937 (63-24575999) on /dev/sdb >03:48:29,030 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999 >sectorSize = 512 > >03:48:29,031 DEBUG storage.ui: req: PartitionRequest instance -- >id = 24 name = sdb1 growable = True >base = 2048 growth = 0 max_grow = 
1570816 >done = False >03:48:29,031 DEBUG storage.ui: 1 requests and 24573889 (11998MB) left in chunk >03:48:29,032 DEBUG storage.ui: adding 24573889 (11998MB) to 24 (sdb1) >03:48:29,033 DEBUG storage.ui: taking back 23003073 (11231MB) from 24 (sdb1) >03:48:29,033 DEBUG storage.ui: new grow amount for request 24 (sdb1) is 1570816 units, or 767MB >03:48:29,034 DEBUG storage.ui: request 24 (sdb1) growth: 1570816 (767MB) size: 768MB >03:48:29,035 DEBUG storage.ui: disk /dev/sdb growth: 1570816 (767MB) >03:48:29,035 DEBUG storage.ui: calculating growth for disk /dev/sdc >03:48:29,036 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999 >sectorSize = 512 > >03:48:29,037 DEBUG storage.ui: disk /dev/sdc growth: 0 (0MB) >03:48:29,037 DEBUG storage.ui: calculating growth for disk /dev/sda >03:48:29,038 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > >03:48:29,038 DEBUG storage.ui: disk /dev/sda growth: 0 (0MB) >03:48:29,041 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ; >03:48:29,041 DEBUG storage.ui: device sdb1 new partedPartition None >03:48:29,043 DEBUG storage.ui: PartitionDevice._setDisk: req0 ; new: None ; old: sdb ; >03:48:29,046 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdb ; >03:48:29,046 DEBUG storage.ui: total growth: 1570816 sectors >03:48:29,047 DEBUG storage.ui: keeping old free: 1570816 <= 1570816 >03:48:29,048 DEBUG storage.ui: checking freespace on sdc >03:48:29,048 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdc part_type=0 req_size=1MB boot=False best=None grow=True >03:48:29,049 DEBUG storage.ui: current free range is 63-24575999 (11999MB) >03:48:29,050 DEBUG storage.ui: evaluating growth potential for new layout >03:48:29,051 DEBUG storage.ui: calculating growth for disk /dev/sdd >03:48:29,051 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999 >sectorSize = 512 > >03:48:29,052 DEBUG 
storage.ui: disk /dev/sdd growth: 0 (0MB) >03:48:29,053 DEBUG storage.ui: calculating growth for disk /dev/sdb >03:48:29,053 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999 >sectorSize = 512 > >03:48:29,054 DEBUG storage.ui: disk /dev/sdb growth: 0 (0MB) >03:48:29,054 DEBUG storage.ui: calculating growth for disk /dev/sdc >03:48:29,055 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:48:29,058 DEBUG storage.ui: PartitionDevice._setPartedPartition: req0 ; >03:48:29,059 DEBUG storage.ui: device req0 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 1 path: /dev/sdc1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f81d90> PedPartition: <_ped.Partition object at 0x7fae04fb3890> >03:48:29,062 DEBUG storage.ui: PartitionDevice._setDisk: sdc1 ; new: sdc ; old: None ; >03:48:29,064 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdc ; >03:48:29,065 DEBUG storage.ui: adding request 24 to chunk 24575937 (63-24575999) on /dev/sdc >03:48:29,066 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999 >sectorSize = 512 > >03:48:29,066 DEBUG storage.ui: req: PartitionRequest instance -- >id = 24 name = sdc1 growable = True >base = 2048 growth = 0 max_grow = 1570816 >done = False >03:48:29,067 DEBUG storage.ui: 1 requests and 24573889 (11998MB) left in chunk >03:48:29,067 DEBUG storage.ui: adding 24573889 (11998MB) to 24 (sdc1) >03:48:29,068 DEBUG storage.ui: taking back 23003073 (11231MB) from 24 (sdc1) >03:48:29,069 DEBUG storage.ui: new grow amount for request 24 (sdc1) is 1570816 units, or 767MB >03:48:29,069 DEBUG storage.ui: request 24 (sdc1) growth: 1570816 (767MB) size: 768MB >03:48:29,070 DEBUG storage.ui: disk /dev/sdc growth: 1570816 (767MB) >03:48:29,071 DEBUG storage.ui: calculating growth for disk /dev/sda >03:48:29,071 DEBUG storage.ui: 
Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > >03:48:29,072 DEBUG storage.ui: disk /dev/sda growth: 0 (0MB) >03:48:29,074 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ; >03:48:29,075 DEBUG storage.ui: device sdc1 new partedPartition None >03:48:29,077 DEBUG storage.ui: PartitionDevice._setDisk: req0 ; new: None ; old: sdc ; >03:48:29,079 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdc ; >03:48:29,079 DEBUG storage.ui: total growth: 1570816 sectors >03:48:29,080 DEBUG storage.ui: keeping old free: 1570816 <= 1570816 >03:48:29,080 DEBUG storage.ui: checking freespace on sdd >03:48:29,081 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdd part_type=0 req_size=1MB boot=False best=None grow=True >03:48:29,082 DEBUG storage.ui: current free range is 63-24575999 (11999MB) >03:48:29,083 DEBUG storage.ui: evaluating growth potential for new layout >03:48:29,083 DEBUG storage.ui: calculating growth for disk /dev/sdd >03:48:29,084 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:48:29,086 DEBUG storage.ui: PartitionDevice._setPartedPartition: req0 ; >03:48:29,087 DEBUG storage.ui: device req0 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 1 path: /dev/sdd1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fb00d0> PedPartition: <_ped.Partition object at 0x7fae04fb3950> >03:48:29,090 DEBUG storage.ui: PartitionDevice._setDisk: sdd1 ; new: sdd ; old: None ; >03:48:29,092 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdd ; >03:48:29,092 DEBUG storage.ui: adding request 24 to chunk 24575937 (63-24575999) on /dev/sdd >03:48:29,093 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999 >sectorSize = 512 > >03:48:29,094 DEBUG storage.ui: req: PartitionRequest instance -- >id = 24 name = sdd1 growable = True >base = 2048 growth = 0 
max_grow = 1570816 >done = False >03:48:29,094 DEBUG storage.ui: 1 requests and 24573889 (11998MB) left in chunk >03:48:29,095 DEBUG storage.ui: adding 24573889 (11998MB) to 24 (sdd1) >03:48:29,096 DEBUG storage.ui: taking back 23003073 (11231MB) from 24 (sdd1) >03:48:29,096 DEBUG storage.ui: new grow amount for request 24 (sdd1) is 1570816 units, or 767MB >03:48:29,097 DEBUG storage.ui: request 24 (sdd1) growth: 1570816 (767MB) size: 768MB >03:48:29,097 DEBUG storage.ui: disk /dev/sdd growth: 1570816 (767MB) >03:48:29,098 DEBUG storage.ui: calculating growth for disk /dev/sdb >03:48:29,098 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999 >sectorSize = 512 > >03:48:29,099 DEBUG storage.ui: disk /dev/sdb growth: 0 (0MB) >03:48:29,100 DEBUG storage.ui: calculating growth for disk /dev/sdc >03:48:29,100 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999 >sectorSize = 512 > >03:48:29,101 DEBUG storage.ui: disk /dev/sdc growth: 0 (0MB) >03:48:29,101 DEBUG storage.ui: calculating growth for disk /dev/sda >03:48:29,102 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > >03:48:29,103 DEBUG storage.ui: disk /dev/sda growth: 0 (0MB) >03:48:29,105 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ; >03:48:29,105 DEBUG storage.ui: device sdd1 new partedPartition None >03:48:29,107 DEBUG storage.ui: PartitionDevice._setDisk: req0 ; new: None ; old: sdd ; >03:48:29,110 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdd ; >03:48:29,111 DEBUG storage.ui: total growth: 1570816 sectors >03:48:29,111 DEBUG storage.ui: keeping old free: 1570816 <= 1570816 >03:48:29,112 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:48:29,113 DEBUG storage.ui: created partition sda1 of 1MB and added it to /dev/sda >03:48:29,115 DEBUG storage.ui: PartitionDevice._setPartedPartition: req0 ; >03:48:29,116 DEBUG storage.ui: device req0 new 
partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f817d0> PedPartition: <_ped.Partition object at 0x7fae04fb3890> >03:48:29,118 DEBUG storage.ui: PartitionDevice._setDisk: sda1 ; new: sda ; old: None ; >03:48:29,120 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sda ; >03:48:29,123 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:48:29,124 DEBUG storage.ui: device sda1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f81410> PedPartition: <_ped.Partition object at 0x7fae04fb3830> >03:48:29,124 DEBUG storage.ui: growPartitions: disks=['sda', 'sdb', 'sdc', 'sdd'], partitions=['sda1(id 24)'] >03:48:29,125 DEBUG storage.ui: growable partitions are ['sda1'] >03:48:29,126 DEBUG storage.ui: adding request 24 to chunk 24575937 (63-24575999) on /dev/sda >03:48:29,126 DEBUG storage.ui: disk sda has 1 chunks >03:48:29,127 DEBUG storage.ui: disk sdb has 1 chunks >03:48:29,128 DEBUG storage.ui: disk sdc has 1 chunks >03:48:29,128 DEBUG storage.ui: disk sdd has 1 chunks >03:48:29,129 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > >03:48:29,129 DEBUG storage.ui: req: PartitionRequest instance -- >id = 24 name = sda1 growable = True >base = 2048 growth = 0 max_grow = 1570816 >done = False >03:48:29,130 DEBUG storage.ui: 1 requests and 24573889 (11998MB) left in chunk >03:48:29,131 DEBUG storage.ui: adding 24573889 (11998MB) to 24 (sda1) >03:48:29,131 DEBUG storage.ui: taking back 23003073 (11231MB) from 24 (sda1) >03:48:29,132 DEBUG storage.ui: new grow amount for request 24 (sda1) is 1570816 units, or 767MB 
>03:48:29,133 DEBUG storage.ui: growing partitions on sda >03:48:29,133 DEBUG storage.ui: partition sda1 (24): 0 >03:48:29,134 DEBUG storage.ui: new geometry for sda1: parted.Geometry instance -- > start: 2048 end: 1574911 length: 1572864 > device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae04f81390> >03:48:29,135 DEBUG storage.ui: removing all non-preexisting partitions ['sda1(id 24)'] from disk(s) ['sda'] >03:48:29,137 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:48:29,138 DEBUG storage.ui: device sda1 new partedPartition None >03:48:29,140 DEBUG storage.ui: PartitionDevice._setDisk: req0 ; new: None ; old: sda ; >03:48:29,142 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sda ; >03:48:29,143 DEBUG storage.ui: back from removeNewPartitions >03:48:29,143 DEBUG storage.ui: extended: None >03:48:29,144 DEBUG storage.ui: setting req0 new geometry: parted.Geometry instance -- > start: 2048 end: 1574911 length: 1572864 > device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae04f81390> >03:48:29,146 DEBUG storage.ui: PartitionDevice._setPartedPartition: req0 ; >03:48:29,148 DEBUG storage.ui: device req0 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fb0150> PedPartition: <_ped.Partition object at 0x7fae04fb37d0> >03:48:29,150 DEBUG storage.ui: PartitionDevice._setDisk: sda1 ; new: sda ; old: None ; >03:48:29,152 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sda ; >03:48:29,154 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:48:29,156 DEBUG storage.ui: device sda1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: 
None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fb0310> PedPartition: <_ped.Partition object at 0x7fae04fb39b0> >03:48:29,156 DEBUG storage.ui: growing partitions on sdb >03:48:29,157 DEBUG storage.ui: growing partitions on sdc >03:48:29,158 DEBUG storage.ui: growing partitions on sdd >03:48:29,158 DEBUG storage.ui: fixing size of non-existent 768MB partition sda1 (24) with non-existent swap at 768.00 >03:48:29,184 DEBUG blivet: SwapSpace.__init__: >03:48:29,185 DEBUG blivet: getFormat('swap') returning SwapSpace instance >03:48:29,222 DEBUG blivet: SwapSpace.__init__: >03:48:29,223 DEBUG blivet: getFormat('swap') returning SwapSpace instance >03:48:29,229 DEBUG storage.ui: instantiating <class 'blivet.devicefactory.PartitionFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 768, ['sda'], {'encrypted': False, 'raid_level': None} >03:48:29,242 DEBUG storage.ui: Blivet.factoryDevice: 2 ; 768 ; container_raid_level: None ; name: None ; encrypted: False ; container_encrypted: False ; disks: [DiskDevice instance (0x7fae05319950) -- > name = sda status = True kids = 1 id = 1 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 0 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sda type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 0 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae127eacb0> > target size = 0 path = /dev/sda > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae053199d0>, DiskDevice instance (0x7fae05b116d0) -- > name = sdb status = True kids = 0 id = 14 > parents = [] > 
uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 16 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdb type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 768 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f674a70> > target size = 0 path = /dev/sdb > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05b11750>, DiskDevice instance (0x7fae05aeabd0) -- > name = sdc status = True kids = 0 id = 11 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 32 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdc type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 512 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f674170> > target size = 0 path = /dev/sdc > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05aeac50>, DiskDevice instance (0x7fae05aea190) -- > name = sdd status = True kids = 0 id = 8 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 48 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd partedDevice = parted.Device instance -- > model: QEMU QEMU 
HARDDISK path: /dev/sdd type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 256 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae27c59680> > target size = 0 path = /dev/sdd > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05aea210>] ; raid_level: None ; label: ; container_name: None ; device: non-existent 768MB partition sda1 (24) with non-existent swap ; mountpoint: None ; fstype: swap ; container_size: 0 ; >03:48:29,245 DEBUG storage.ui: instantiating <class 'blivet.devicefactory.PartitionFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 768, ['sda', 'sdb', 'sdc', 'sdd'], {'container_raid_level': None, 'name': None, 'encrypted': False, 'container_encrypted': False, 'raid_level': None, 'label': '', 'container_name': None, 'device': PartitionDevice instance (0x7fae04f4bf90) -- > name = sda1 status = False kids = 0 id = 24 > parents = ['existing 12000MB disk sda (1) with non-existent msdos disklabel'] > uuid = None size = 768.0 > format = non-existent swap > major = 0 minor = 0 exists = False protected = False > sysfs path = partedDevice = None > target size = 1 path = /dev/sda1 > format args = [] originalFormat = swap grow = False max size = 768.0 bootable = False > part type = 0 primary = False > partedPartition = parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fb0310> PedPartition: <_ped.Partition object at 0x7fae04fb39b0> > disk = existing 12000MB disk sda (1) with non-existent msdos disklabel > start = 2048 end = 1574911 length = 1572864 > flags = , 'mountpoint': None, 'fstype': 'swap', 'container_size': 0} >03:48:29,248 DEBUG 
storage.ui: PartitionFactory.configure: parent_factory: None ; >03:48:29,249 DEBUG storage.ui: starting Blivet copy >03:48:29,293 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:48:29,295 DEBUG storage.ui: device sda1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04fb0b90> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f76910> PedPartition: <_ped.Partition object at 0x7fae04fb37d0> >03:48:29,296 DEBUG storage.ui: finished Blivet copy >03:48:29,299 DEBUG storage.ui: DiskDevice.setup: sda ; status: True ; controllable: True ; orig: False ; >03:48:29,301 DEBUG storage.ui: DiskDevice.setup: sdb ; status: True ; controllable: True ; orig: False ; >03:48:29,304 DEBUG storage.ui: DiskDevice.setup: sdc ; status: True ; controllable: True ; orig: False ; >03:48:29,306 DEBUG storage.ui: DiskDevice.setup: sdd ; status: True ; controllable: True ; orig: False ; >03:48:29,307 DEBUG storage.ui: removing all non-preexisting partitions ['sda1(id 24)'] from disk(s) ['sda', 'sdb', 'sdc', 'sdd'] >03:48:29,310 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:48:29,311 DEBUG storage.ui: device sda1 new partedPartition None >03:48:29,313 DEBUG storage.ui: PartitionDevice._setDisk: req0 ; new: None ; old: sda ; >03:48:29,316 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sda ; >03:48:29,317 DEBUG storage.ui: allocatePartitions: disks=['sda', 'sdb', 'sdc', 'sdd'] ; partitions=['req0(id 24)'] >03:48:29,318 DEBUG storage.ui: removing all non-preexisting partitions ['req0(id 24)'] from disk(s) ['sda', 'sdb', 'sdc', 'sdd'] >03:48:29,320 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:29,322 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:29,323 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:29,325 DEBUG 
storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:29,327 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:29,327 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:29,330 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:29,332 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:29,333 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:29,335 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:29,338 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:29,338 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:29,340 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:29,343 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:29,343 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:29,346 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:29,348 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:29,349 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:29,351 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:29,354 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:29,354 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:29,356 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:29,359 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:29,360 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:29,360 DEBUG storage.ui: allocating partition: req0 ; id: 24 ; disks: ['sda', 'sdb', 'sdc', 'sdd'] ; >boot: False ; primary: False ; size: 
768MB ; grow: False ; max_size: 768.0 >03:48:29,361 DEBUG storage.ui: checking freespace on sda >03:48:29,362 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sda part_type=0 req_size=768MB boot=False best=None grow=False >03:48:29,362 DEBUG storage.ui: current free range is 63-24575999 (11999MB) >03:48:29,363 DEBUG storage.ui: updating use_disk to sda, type: 0 >03:48:29,363 DEBUG storage.ui: new free: 63-24575999 / 11999MB >03:48:29,363 DEBUG storage.ui: new free allows for 0 sectors of growth >03:48:29,364 DEBUG storage.ui: checking freespace on sdb >03:48:29,365 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdb part_type=0 req_size=768MB boot=False best=parted.Geometry instance -- > start: 63 end: 24575999 length: 24575937 > device: <parted.device.Device object at 0x7fae04f76550> PedGeometry: <_ped.Geometry object at 0x7fae04f76790> grow=False >03:48:29,365 DEBUG storage.ui: current free range is 63-24575999 (11999MB) >03:48:29,366 DEBUG storage.ui: not enough free space for primary -- trying logical >03:48:29,366 DEBUG storage.ui: checking freespace on sdc >03:48:29,367 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdc part_type=0 req_size=768MB boot=False best=parted.Geometry instance -- > start: 63 end: 24575999 length: 24575937 > device: <parted.device.Device object at 0x7fae04f76550> PedGeometry: <_ped.Geometry object at 0x7fae04f76790> grow=False >03:48:29,368 DEBUG storage.ui: current free range is 63-24575999 (11999MB) >03:48:29,368 DEBUG storage.ui: not enough free space for primary -- trying logical >03:48:29,369 DEBUG storage.ui: checking freespace on sdd >03:48:29,370 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdd part_type=0 req_size=768MB boot=False best=parted.Geometry instance -- > start: 63 end: 24575999 length: 24575937 > device: <parted.device.Device object at 0x7fae04f76550> PedGeometry: <_ped.Geometry object at 0x7fae04f76790> grow=False >03:48:29,370 DEBUG storage.ui: current free range is 63-24575999 (11999MB) 
>03:48:29,371 DEBUG storage.ui: not enough free space for primary -- trying logical >03:48:29,372 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:48:29,372 DEBUG storage.ui: created partition sda1 of 768MB and added it to /dev/sda >03:48:29,374 DEBUG storage.ui: PartitionDevice._setPartedPartition: req0 ; >03:48:29,375 DEBUG storage.ui: device req0 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fb0310> PedPartition: <_ped.Partition object at 0x7fae04fb3830> >03:48:29,378 DEBUG storage.ui: PartitionDevice._setDisk: sda1 ; new: sda ; old: None ; >03:48:29,380 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sda ; >03:48:29,383 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:48:29,384 DEBUG storage.ui: device sda1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fb01d0> PedPartition: <_ped.Partition object at 0x7fae04fb39b0> >03:48:29,384 DEBUG storage.ui: growPartitions: disks=['sda', 'sdb', 'sdc', 'sdd'], partitions=['sda1(id 24)'] >03:48:29,385 DEBUG storage.ui: no growable partitions >03:48:29,385 DEBUG storage.ui: fixing size of non-existent 768MB partition sda1 (24) with non-existent swap at 768.00 >03:48:37,711 DEBUG blivet: SwapSpace.__init__: >03:48:37,712 DEBUG blivet: getFormat('swap') returning SwapSpace instance >03:48:37,715 DEBUG storage.ui: instantiating <class 'blivet.devicefactory.MDFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 768, ['sda'], {'encrypted': False, 'raid_level': 'raid10'} >03:48:37,720 INFO storage.ui: removed partition sda1 (id 24) from device tree >03:48:37,722 DEBUG storage.ui: 
DiskDevice.removeChild: kids: 1 ; name: sda ; >03:48:37,722 INFO storage.ui: registered action: [34] Destroy Device partition sda1 (id 24) >03:48:37,728 DEBUG storage.ui: Blivet.factoryDevice: 1 ; 768 ; container_raid_level: None ; name: swap ; encrypted: False ; container_encrypted: False ; disks: [DiskDevice instance (0x7fae05319950) -- > name = sda status = True kids = 0 id = 1 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 0 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sda type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 0 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae127eacb0> > target size = 0 path = /dev/sda > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae053199d0>, DiskDevice instance (0x7fae05b116d0) -- > name = sdb status = True kids = 0 id = 14 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 16 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdb type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 768 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f674a70> > target size = 0 path = /dev/sdb > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 
0x7fae05b11750>, DiskDevice instance (0x7fae05aeabd0) -- > name = sdc status = True kids = 0 id = 11 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 32 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdc type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 512 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f674170> > target size = 0 path = /dev/sdc > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05aeac50>, DiskDevice instance (0x7fae05aea190) -- > name = sdd status = True kids = 0 id = 8 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 48 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdd type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 256 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae27c59680> > target size = 0 path = /dev/sdd > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05aea210>] ; raid_level: raid10 ; label: ; container_name: None ; device: None ; mountpoint: None ; fstype: swap ; container_size: 0 ; >03:48:37,730 DEBUG storage.ui: instantiating <class 'blivet.devicefactory.MDFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 768, 
['sda', 'sdb', 'sdc', 'sdd'], {'container_raid_level': None, 'name': 'swap', 'encrypted': False, 'container_encrypted': False, 'raid_level': 'raid10', 'label': '', 'container_name': None, 'device': None, 'mountpoint': None, 'fstype': 'swap', 'container_size': 0} >03:48:37,732 DEBUG storage.ui: MDFactory.configure: parent_factory: None ; >03:48:37,732 DEBUG storage.ui: starting Blivet copy >03:48:37,770 DEBUG storage.ui: finished Blivet copy >03:48:37,771 INFO storage.ui: Using 0MB superBlockSize >03:48:37,772 DEBUG storage.ui: child factory class: <class 'blivet.devicefactory.PartitionSetFactory'> >03:48:37,777 DEBUG storage.ui: child factory args: [<blivet.Blivet object at 0x7fae05326d50>, 1536.0, [DiskDevice instance (0x7fae05319950) -- > name = sda status = True kids = 0 id = 1 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 0 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sda type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 0 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae127eacb0> > target size = 0 path = /dev/sda > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae053199d0>, DiskDevice instance (0x7fae05b116d0) -- > name = sdb status = True kids = 0 id = 14 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 16 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdb type: 1 > sectorSize: 512 
physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 768 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f674a70> > target size = 0 path = /dev/sdb > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05b11750>, DiskDevice instance (0x7fae05aeabd0) -- > name = sdc status = True kids = 0 id = 11 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 32 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdc type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 512 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f674170> > target size = 0 path = /dev/sdc > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05aeac50>, DiskDevice instance (0x7fae05aea190) -- > name = sdd status = True kids = 0 id = 8 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 48 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdd type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 256 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae27c59680> > 
target size = 0 path = /dev/sdd > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05aea210>]] >03:48:37,780 DEBUG storage.ui: child factory kwargs: {'fstype': 'mdmember'} >03:48:37,782 DEBUG storage.ui: PartitionSetFactory.configure: parent_factory: <blivet.devicefactory.MDFactory object at 0x7fae04fb04d0> ; >03:48:37,782 DEBUG storage.ui: parent factory container: None >03:48:37,783 DEBUG storage.ui: members: [] >03:48:37,784 DEBUG storage.ui: add_disks: ['sda', 'sdb', 'sdc', 'sdd'] >03:48:37,784 DEBUG storage.ui: remove_disks: [] >03:48:37,786 DEBUG storage.ui: MDRaidMember.__init__: >03:48:37,786 DEBUG storage.ui: getFormat('mdmember') returning MDRaidMember instance >03:48:37,788 DEBUG storage.ui: MDRaidMember.__init__: mountpoint: None ; >03:48:37,788 DEBUG storage.ui: getFormat('mdmember') returning MDRaidMember instance >03:48:37,790 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sda ; >03:48:37,792 DEBUG storage.ui: PartitionDevice._setFormat: req1 ; >03:48:37,795 DEBUG storage.ui: PartitionDevice._setFormat: req1 ; current: None ; type: mdmember ; >03:48:37,796 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sda ; >03:48:37,797 INFO storage.ui: added partition req1 (id 25) to device tree >03:48:37,797 INFO storage.ui: registered action: [35] Create Device partition req1 (id 25) >03:48:37,798 DEBUG storage.ui: getFormat('None') returning DeviceFormat instance >03:48:37,798 INFO storage.ui: registered action: [36] Create Format mdmember on partition req1 (id 25) >03:48:37,801 DEBUG storage.ui: MDRaidMember.__init__: mountpoint: None ; >03:48:37,801 DEBUG storage.ui: getFormat('mdmember') returning MDRaidMember instance >03:48:37,803 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdb ; >03:48:37,805 DEBUG storage.ui: PartitionDevice._setFormat: req2 ; >03:48:37,807 DEBUG storage.ui: PartitionDevice._setFormat: req2 ; current: None ; type: mdmember ; >03:48:37,809 
DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdb ; >03:48:37,809 INFO storage.ui: added partition req2 (id 26) to device tree >03:48:37,810 INFO storage.ui: registered action: [37] Create Device partition req2 (id 26) >03:48:37,810 DEBUG storage.ui: getFormat('None') returning DeviceFormat instance >03:48:37,811 INFO storage.ui: registered action: [38] Create Format mdmember on partition req2 (id 26) >03:48:37,813 DEBUG storage.ui: MDRaidMember.__init__: mountpoint: None ; >03:48:37,813 DEBUG storage.ui: getFormat('mdmember') returning MDRaidMember instance >03:48:37,815 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdc ; >03:48:37,817 DEBUG storage.ui: PartitionDevice._setFormat: req3 ; >03:48:37,819 DEBUG storage.ui: PartitionDevice._setFormat: req3 ; current: None ; type: mdmember ; >03:48:37,821 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdc ; >03:48:37,821 INFO storage.ui: added partition req3 (id 27) to device tree >03:48:37,822 INFO storage.ui: registered action: [39] Create Device partition req3 (id 27) >03:48:37,823 DEBUG storage.ui: getFormat('None') returning DeviceFormat instance >03:48:37,823 INFO storage.ui: registered action: [40] Create Format mdmember on partition req3 (id 27) >03:48:37,825 DEBUG storage.ui: MDRaidMember.__init__: mountpoint: None ; >03:48:37,826 DEBUG storage.ui: getFormat('mdmember') returning MDRaidMember instance >03:48:37,828 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdd ; >03:48:37,830 DEBUG storage.ui: PartitionDevice._setFormat: req4 ; >03:48:37,832 DEBUG storage.ui: PartitionDevice._setFormat: req4 ; current: None ; type: mdmember ; >03:48:37,834 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdd ; >03:48:37,834 INFO storage.ui: added partition req4 (id 28) to device tree >03:48:37,835 INFO storage.ui: registered action: [41] Create Device partition req4 (id 28) >03:48:37,835 DEBUG storage.ui: getFormat('None') returning DeviceFormat instance >03:48:37,836 INFO 
storage.ui: registered action: [42] Create Format mdmember on partition req4 (id 28) >03:48:37,836 INFO storage.ui: Using 0MB superBlockSize >03:48:37,837 DEBUG storage.ui: adding a SameSizeSet with size 1536 >03:48:37,839 DEBUG storage.ui: DiskDevice.setup: sda ; status: True ; controllable: True ; orig: False ; >03:48:37,841 DEBUG storage.ui: DiskDevice.setup: sdb ; status: True ; controllable: True ; orig: False ; >03:48:37,842 DEBUG storage.ui: DiskDevice.setup: sdc ; status: True ; controllable: True ; orig: False ; >03:48:37,844 DEBUG storage.ui: DiskDevice.setup: sdd ; status: True ; controllable: True ; orig: False ; >03:48:37,845 DEBUG storage.ui: removing all non-preexisting partitions ['req1(id 25)', 'req2(id 26)', 'req3(id 27)', 'req4(id 28)'] from disk(s) ['sda', 'sdb', 'sdc', 'sdd'] >03:48:37,846 DEBUG storage.ui: allocatePartitions: disks=['sda', 'sdb', 'sdc', 'sdd'] ; partitions=['req1(id 25)', 'req2(id 26)', 'req3(id 27)', 'req4(id 28)'] >03:48:37,847 DEBUG storage.ui: removing all non-preexisting partitions ['req1(id 25)', 'req2(id 26)', 'req3(id 27)', 'req4(id 28)'] from disk(s) ['sda', 'sdb', 'sdc', 'sdd'] >03:48:37,849 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:37,851 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:37,852 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:37,854 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:37,856 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:37,856 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:37,857 DEBUG storage.ui: allocating partition: req1 ; id: 25 ; disks: ['sda'] ; >boot: False ; primary: False ; size: 1MB ; grow: True ; max_size: 384 >03:48:37,857 DEBUG storage.ui: checking freespace on sda >03:48:37,858 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sda part_type=0 req_size=1MB boot=False 
best=None grow=True >03:48:37,859 DEBUG storage.ui: current free range is 63-24575999 (11999MB) >03:48:37,859 DEBUG storage.ui: evaluating growth potential for new layout >03:48:37,860 DEBUG storage.ui: calculating growth for disk /dev/sdd >03:48:37,860 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999 >sectorSize = 512 > >03:48:37,860 DEBUG storage.ui: disk /dev/sdd growth: 0 (0MB) >03:48:37,861 DEBUG storage.ui: calculating growth for disk /dev/sdb >03:48:37,861 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999 >sectorSize = 512 > >03:48:37,861 DEBUG storage.ui: disk /dev/sdb growth: 0 (0MB) >03:48:37,862 DEBUG storage.ui: calculating growth for disk /dev/sdc >03:48:37,862 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999 >sectorSize = 512 > >03:48:37,862 DEBUG storage.ui: disk /dev/sdc growth: 0 (0MB) >03:48:37,863 DEBUG storage.ui: calculating growth for disk /dev/sda >03:48:37,863 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:48:37,865 DEBUG storage.ui: PartitionDevice._setPartedPartition: req1 ; >03:48:37,866 DEBUG storage.ui: device req1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f898d0> PedPartition: <_ped.Partition object at 0x7fae05b0b110> >03:48:37,868 DEBUG storage.ui: PartitionDevice._setDisk: sda1 ; new: sda ; old: None ; >03:48:37,870 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sda ; >03:48:37,871 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda >03:48:37,871 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > >03:48:37,872 DEBUG storage.ui: req: PartitionRequest instance -- >id = 25 name = sda1 growable = True >base = 2048 growth = 0 
max_grow = 784384 >done = False >03:48:37,872 DEBUG storage.ui: 1 requests and 24573889 (11998MB) left in chunk >03:48:37,872 DEBUG storage.ui: adding 24573889 (11998MB) to 25 (sda1) >03:48:37,873 DEBUG storage.ui: taking back 23789505 (11615MB) from 25 (sda1) >03:48:37,873 DEBUG storage.ui: new grow amount for request 25 (sda1) is 784384 units, or 383MB >03:48:37,874 DEBUG storage.ui: request 25 (sda1) growth: 784384 (383MB) size: 384MB >03:48:37,874 DEBUG storage.ui: disk /dev/sda growth: 784384 (383MB) >03:48:37,876 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:48:37,876 DEBUG storage.ui: device sda1 new partedPartition None >03:48:37,878 DEBUG storage.ui: PartitionDevice._setDisk: req1 ; new: None ; old: sda ; >03:48:37,880 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sda ; >03:48:37,881 DEBUG storage.ui: total growth: 784384 sectors >03:48:37,881 DEBUG storage.ui: updating use_disk to sda, type: 0 >03:48:37,882 DEBUG storage.ui: new free: 63-24575999 / 11999MB >03:48:37,882 DEBUG storage.ui: new free allows for 784384 sectors of growth >03:48:37,882 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:48:37,883 DEBUG storage.ui: created partition sda1 of 1MB and added it to /dev/sda >03:48:37,885 DEBUG storage.ui: PartitionDevice._setPartedPartition: req1 ; >03:48:37,886 DEBUG storage.ui: device req1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f89b10> PedPartition: <_ped.Partition object at 0x7fae04fb3c50> >03:48:37,888 DEBUG storage.ui: PartitionDevice._setDisk: sda1 ; new: sda ; old: None ; >03:48:37,890 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sda ; >03:48:37,892 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:48:37,893 DEBUG storage.ui: device sda1 new partedPartition parted.Partition 
instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f89990> PedPartition: <_ped.Partition object at 0x7fae04fb3890> >03:48:37,895 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:37,898 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:37,898 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:37,900 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:37,902 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:37,903 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:37,903 DEBUG storage.ui: allocating partition: req2 ; id: 26 ; disks: ['sdb'] ; >boot: False ; primary: False ; size: 1MB ; grow: True ; max_size: 384 >03:48:37,903 DEBUG storage.ui: checking freespace on sdb >03:48:37,904 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdb part_type=0 req_size=1MB boot=False best=None grow=True >03:48:37,905 DEBUG storage.ui: current free range is 63-24575999 (11999MB) >03:48:37,905 DEBUG storage.ui: evaluating growth potential for new layout >03:48:37,905 DEBUG storage.ui: calculating growth for disk /dev/sdd >03:48:37,906 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999 >sectorSize = 512 > >03:48:37,906 DEBUG storage.ui: disk /dev/sdd growth: 0 (0MB) >03:48:37,906 DEBUG storage.ui: calculating growth for disk /dev/sdb >03:48:37,907 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:48:37,909 DEBUG storage.ui: PartitionDevice._setPartedPartition: req2 ; >03:48:37,910 DEBUG storage.ui: device req2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: 
False > geometry: <parted.geometry.Geometry object at 0x7fae04fa5050> PedPartition: <_ped.Partition object at 0x7fae04fb3b30> >03:48:37,912 DEBUG storage.ui: PartitionDevice._setDisk: sdb1 ; new: sdb ; old: None ; >03:48:37,914 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdb ; >03:48:37,915 DEBUG storage.ui: adding request 26 to chunk 24575937 (63-24575999) on /dev/sdb >03:48:37,915 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999 >sectorSize = 512 > >03:48:37,916 DEBUG storage.ui: req: PartitionRequest instance -- >id = 26 name = sdb1 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:48:37,916 DEBUG storage.ui: 1 requests and 24573889 (11998MB) left in chunk >03:48:37,917 DEBUG storage.ui: adding 24573889 (11998MB) to 26 (sdb1) >03:48:37,917 DEBUG storage.ui: taking back 23789505 (11615MB) from 26 (sdb1) >03:48:37,917 DEBUG storage.ui: new grow amount for request 26 (sdb1) is 784384 units, or 383MB >03:48:37,918 DEBUG storage.ui: request 26 (sdb1) growth: 784384 (383MB) size: 384MB >03:48:37,918 DEBUG storage.ui: disk /dev/sdb growth: 784384 (383MB) >03:48:37,918 DEBUG storage.ui: calculating growth for disk /dev/sdc >03:48:37,919 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999 >sectorSize = 512 > >03:48:37,919 DEBUG storage.ui: disk /dev/sdc growth: 0 (0MB) >03:48:37,919 DEBUG storage.ui: calculating growth for disk /dev/sda >03:48:37,920 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda >03:48:37,920 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > >03:48:37,921 DEBUG storage.ui: req: PartitionRequest instance -- >id = 25 name = sda1 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:48:37,921 DEBUG storage.ui: 1 requests and 24573889 (11998MB) left in chunk >03:48:37,921 DEBUG storage.ui: adding 24573889 (11998MB) to 25 (sda1) >03:48:37,922 
DEBUG storage.ui: taking back 23789505 (11615MB) from 25 (sda1) >03:48:37,922 DEBUG storage.ui: new grow amount for request 25 (sda1) is 784384 units, or 383MB >03:48:37,922 DEBUG storage.ui: request 25 (sda1) growth: 784384 (383MB) size: 384MB >03:48:37,923 DEBUG storage.ui: disk /dev/sda growth: 784384 (383MB) >03:48:37,925 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ; >03:48:37,925 DEBUG storage.ui: device sdb1 new partedPartition None >03:48:37,927 DEBUG storage.ui: PartitionDevice._setDisk: req2 ; new: None ; old: sdb ; >03:48:37,929 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdb ; >03:48:37,929 DEBUG storage.ui: total growth: 1568768 sectors >03:48:37,929 DEBUG storage.ui: updating use_disk to sdb, type: 0 >03:48:37,930 DEBUG storage.ui: new free: 63-24575999 / 11999MB >03:48:37,930 DEBUG storage.ui: new free allows for 1568768 sectors of growth >03:48:37,930 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:48:37,931 DEBUG storage.ui: created partition sdb1 of 1MB and added it to /dev/sdb >03:48:37,933 DEBUG storage.ui: PartitionDevice._setPartedPartition: req2 ; >03:48:37,934 DEBUG storage.ui: device req2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fa5250> PedPartition: <_ped.Partition object at 0x7fae04fb3a70> >03:48:37,936 DEBUG storage.ui: PartitionDevice._setDisk: sdb1 ; new: sdb ; old: None ; >03:48:37,938 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdb ; >03:48:37,940 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ; >03:48:37,941 DEBUG storage.ui: device sdb1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry 
object at 0x7fae04fa5210> PedPartition: <_ped.Partition object at 0x7fae04fb3ad0> >03:48:37,944 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:37,946 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:37,947 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:37,949 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:37,951 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:37,951 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:37,951 DEBUG storage.ui: allocating partition: req3 ; id: 27 ; disks: ['sdc'] ; >boot: False ; primary: False ; size: 1MB ; grow: True ; max_size: 384 >03:48:37,952 DEBUG storage.ui: checking freespace on sdc >03:48:37,952 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdc part_type=0 req_size=1MB boot=False best=None grow=True >03:48:37,953 DEBUG storage.ui: current free range is 63-24575999 (11999MB) >03:48:37,953 DEBUG storage.ui: evaluating growth potential for new layout >03:48:37,954 DEBUG storage.ui: calculating growth for disk /dev/sdd >03:48:37,954 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999 >sectorSize = 512 > >03:48:37,954 DEBUG storage.ui: disk /dev/sdd growth: 0 (0MB) >03:48:37,955 DEBUG storage.ui: calculating growth for disk /dev/sdb >03:48:37,955 DEBUG storage.ui: adding request 26 to chunk 24575937 (63-24575999) on /dev/sdb >03:48:37,956 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999 >sectorSize = 512 > >03:48:37,956 DEBUG storage.ui: req: PartitionRequest instance -- >id = 26 name = sdb1 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:48:37,956 DEBUG storage.ui: 1 requests and 24573889 (11998MB) left in chunk >03:48:37,957 DEBUG storage.ui: adding 24573889 (11998MB) to 26 (sdb1) >03:48:37,957 DEBUG storage.ui: taking back 
23789505 (11615MB) from 26 (sdb1) >03:48:37,958 DEBUG storage.ui: new grow amount for request 26 (sdb1) is 784384 units, or 383MB >03:48:37,959 DEBUG storage.ui: request 26 (sdb1) growth: 784384 (383MB) size: 384MB >03:48:37,959 DEBUG storage.ui: disk /dev/sdb growth: 784384 (383MB) >03:48:37,960 DEBUG storage.ui: calculating growth for disk /dev/sdc >03:48:37,960 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:48:37,962 DEBUG storage.ui: PartitionDevice._setPartedPartition: req3 ; >03:48:37,963 DEBUG storage.ui: device req3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 1 path: /dev/sdc1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f89610> PedPartition: <_ped.Partition object at 0x7fae05b0b110> >03:48:37,966 DEBUG storage.ui: PartitionDevice._setDisk: sdc1 ; new: sdc ; old: None ; >03:48:37,968 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdc ; >03:48:37,969 DEBUG storage.ui: adding request 27 to chunk 24575937 (63-24575999) on /dev/sdc >03:48:37,969 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999 >sectorSize = 512 > >03:48:37,969 DEBUG storage.ui: req: PartitionRequest instance -- >id = 27 name = sdc1 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:48:37,970 DEBUG storage.ui: 1 requests and 24573889 (11998MB) left in chunk >03:48:37,970 DEBUG storage.ui: adding 24573889 (11998MB) to 27 (sdc1) >03:48:37,970 DEBUG storage.ui: taking back 23789505 (11615MB) from 27 (sdc1) >03:48:37,971 DEBUG storage.ui: new grow amount for request 27 (sdc1) is 784384 units, or 383MB >03:48:37,971 DEBUG storage.ui: request 27 (sdc1) growth: 784384 (383MB) size: 384MB >03:48:37,972 DEBUG storage.ui: disk /dev/sdc growth: 784384 (383MB) >03:48:37,972 DEBUG storage.ui: calculating growth for disk /dev/sda >03:48:37,972 DEBUG storage.ui: adding request 25 to chunk 
24575937 (63-24575999) on /dev/sda >03:48:37,973 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > >03:48:37,973 DEBUG storage.ui: req: PartitionRequest instance -- >id = 25 name = sda1 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:48:37,974 DEBUG storage.ui: 1 requests and 24573889 (11998MB) left in chunk >03:48:37,974 DEBUG storage.ui: adding 24573889 (11998MB) to 25 (sda1) >03:48:37,975 DEBUG storage.ui: taking back 23789505 (11615MB) from 25 (sda1) >03:48:37,975 DEBUG storage.ui: new grow amount for request 25 (sda1) is 784384 units, or 383MB >03:48:37,975 DEBUG storage.ui: request 25 (sda1) growth: 784384 (383MB) size: 384MB >03:48:37,976 DEBUG storage.ui: disk /dev/sda growth: 784384 (383MB) >03:48:37,977 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ; >03:48:37,978 DEBUG storage.ui: device sdc1 new partedPartition None >03:48:37,980 DEBUG storage.ui: PartitionDevice._setDisk: req3 ; new: None ; old: sdc ; >03:48:37,982 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdc ; >03:48:37,982 DEBUG storage.ui: total growth: 2353152 sectors >03:48:37,983 DEBUG storage.ui: updating use_disk to sdc, type: 0 >03:48:37,983 DEBUG storage.ui: new free: 63-24575999 / 11999MB >03:48:37,984 DEBUG storage.ui: new free allows for 2353152 sectors of growth >03:48:37,984 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:48:37,985 DEBUG storage.ui: created partition sdc1 of 1MB and added it to /dev/sdc >03:48:37,987 DEBUG storage.ui: PartitionDevice._setPartedPartition: req3 ; >03:48:37,988 DEBUG storage.ui: device req3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 1 path: /dev/sdc1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f86d10> PedPartition: <_ped.Partition object at 0x7fae04fb3b30> >03:48:37,990 DEBUG storage.ui: 
PartitionDevice._setDisk: sdc1 ; new: sdc ; old: None ; >03:48:37,992 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdc ; >03:48:37,995 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ; >03:48:37,996 DEBUG storage.ui: device sdc1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 1 path: /dev/sdc1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f89910> PedPartition: <_ped.Partition object at 0x7fae04fb3a70> >03:48:37,998 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:38,000 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:38,000 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:38,003 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:38,005 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:38,005 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:38,006 DEBUG storage.ui: allocating partition: req4 ; id: 28 ; disks: ['sdd'] ; >boot: False ; primary: False ; size: 1MB ; grow: True ; max_size: 384 >03:48:38,006 DEBUG storage.ui: checking freespace on sdd >03:48:38,007 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdd part_type=0 req_size=1MB boot=False best=None grow=True >03:48:38,007 DEBUG storage.ui: current free range is 63-24575999 (11999MB) >03:48:38,008 DEBUG storage.ui: evaluating growth potential for new layout >03:48:38,008 DEBUG storage.ui: calculating growth for disk /dev/sdd >03:48:38,009 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:48:38,011 DEBUG storage.ui: PartitionDevice._setPartedPartition: req4 ; >03:48:38,012 DEBUG storage.ui: device req4 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 1 path: /dev/sdd1 
type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fa5390> PedPartition: <_ped.Partition object at 0x7fae04fb3c50> >03:48:38,014 DEBUG storage.ui: PartitionDevice._setDisk: sdd1 ; new: sdd ; old: None ; >03:48:38,016 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdd ; >03:48:38,017 DEBUG storage.ui: adding request 28 to chunk 24575937 (63-24575999) on /dev/sdd >03:48:38,017 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999 >sectorSize = 512 > >03:48:38,018 DEBUG storage.ui: req: PartitionRequest instance -- >id = 28 name = sdd1 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:48:38,018 DEBUG storage.ui: 1 requests and 24573889 (11998MB) left in chunk >03:48:38,018 DEBUG storage.ui: adding 24573889 (11998MB) to 28 (sdd1) >03:48:38,019 DEBUG storage.ui: taking back 23789505 (11615MB) from 28 (sdd1) >03:48:38,019 DEBUG storage.ui: new grow amount for request 28 (sdd1) is 784384 units, or 383MB >03:48:38,019 DEBUG storage.ui: request 28 (sdd1) growth: 784384 (383MB) size: 384MB >03:48:38,020 DEBUG storage.ui: disk /dev/sdd growth: 784384 (383MB) >03:48:38,020 DEBUG storage.ui: calculating growth for disk /dev/sdb >03:48:38,021 DEBUG storage.ui: adding request 26 to chunk 24575937 (63-24575999) on /dev/sdb >03:48:38,021 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999 >sectorSize = 512 > >03:48:38,022 DEBUG storage.ui: req: PartitionRequest instance -- >id = 26 name = sdb1 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:48:38,022 DEBUG storage.ui: 1 requests and 24573889 (11998MB) left in chunk >03:48:38,022 DEBUG storage.ui: adding 24573889 (11998MB) to 26 (sdb1) >03:48:38,023 DEBUG storage.ui: taking back 23789505 (11615MB) from 26 (sdb1) >03:48:38,023 DEBUG storage.ui: new grow amount for request 26 (sdb1) is 784384 units, or 383MB >03:48:38,024 DEBUG storage.ui: request 26 (sdb1) 
growth: 784384 (383MB) size: 384MB >03:48:38,024 DEBUG storage.ui: disk /dev/sdb growth: 784384 (383MB) >03:48:38,024 DEBUG storage.ui: calculating growth for disk /dev/sdc >03:48:38,025 DEBUG storage.ui: adding request 27 to chunk 24575937 (63-24575999) on /dev/sdc >03:48:38,025 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999 >sectorSize = 512 > >03:48:38,025 DEBUG storage.ui: req: PartitionRequest instance -- >id = 27 name = sdc1 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:48:38,026 DEBUG storage.ui: 1 requests and 24573889 (11998MB) left in chunk >03:48:38,026 DEBUG storage.ui: adding 24573889 (11998MB) to 27 (sdc1) >03:48:38,027 DEBUG storage.ui: taking back 23789505 (11615MB) from 27 (sdc1) >03:48:38,027 DEBUG storage.ui: new grow amount for request 27 (sdc1) is 784384 units, or 383MB >03:48:38,027 DEBUG storage.ui: request 27 (sdc1) growth: 784384 (383MB) size: 384MB >03:48:38,028 DEBUG storage.ui: disk /dev/sdc growth: 784384 (383MB) >03:48:38,028 DEBUG storage.ui: calculating growth for disk /dev/sda >03:48:38,029 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda >03:48:38,029 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > >03:48:38,030 DEBUG storage.ui: req: PartitionRequest instance -- >id = 25 name = sda1 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:48:38,030 DEBUG storage.ui: 1 requests and 24573889 (11998MB) left in chunk >03:48:38,031 DEBUG storage.ui: adding 24573889 (11998MB) to 25 (sda1) >03:48:38,031 DEBUG storage.ui: taking back 23789505 (11615MB) from 25 (sda1) >03:48:38,031 DEBUG storage.ui: new grow amount for request 25 (sda1) is 784384 units, or 383MB >03:48:38,032 DEBUG storage.ui: request 25 (sda1) growth: 784384 (383MB) size: 384MB >03:48:38,032 DEBUG storage.ui: disk /dev/sda growth: 784384 (383MB) >03:48:38,034 DEBUG storage.ui: 
PartitionDevice._setPartedPartition: sdd1 ; >03:48:38,035 DEBUG storage.ui: device sdd1 new partedPartition None >03:48:38,036 DEBUG storage.ui: PartitionDevice._setDisk: req4 ; new: None ; old: sdd ; >03:48:38,038 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdd ; >03:48:38,039 DEBUG storage.ui: total growth: 3137536 sectors >03:48:38,039 DEBUG storage.ui: updating use_disk to sdd, type: 0 >03:48:38,039 DEBUG storage.ui: new free: 63-24575999 / 11999MB >03:48:38,040 DEBUG storage.ui: new free allows for 3137536 sectors of growth >03:48:38,040 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:48:38,041 DEBUG storage.ui: created partition sdd1 of 1MB and added it to /dev/sdd >03:48:38,043 DEBUG storage.ui: PartitionDevice._setPartedPartition: req4 ; >03:48:38,044 DEBUG storage.ui: device req4 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 1 path: /dev/sdd1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fa5590> PedPartition: <_ped.Partition object at 0x7fae04fb3950> >03:48:38,046 DEBUG storage.ui: PartitionDevice._setDisk: sdd1 ; new: sdd ; old: None ; >03:48:38,048 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdd ; >03:48:38,050 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ; >03:48:38,051 DEBUG storage.ui: device sdd1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 1 path: /dev/sdd1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fa5410> PedPartition: <_ped.Partition object at 0x7fae04fb38f0> >03:48:38,052 DEBUG storage.ui: growPartitions: disks=['sda', 'sdb', 'sdc', 'sdd'], partitions=['sda1(id 25)', 'sdb1(id 26)', 'sdc1(id 27)', 'sdd1(id 28)'] >03:48:38,052 DEBUG storage.ui: growable partitions are ['sda1', 'sdb1', 'sdc1', 'sdd1'] >03:48:38,052 DEBUG 
storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda >03:48:38,053 DEBUG storage.ui: disk sda has 1 chunks >03:48:38,053 DEBUG storage.ui: adding request 26 to chunk 24575937 (63-24575999) on /dev/sdb >03:48:38,054 DEBUG storage.ui: disk sdb has 1 chunks >03:48:38,054 DEBUG storage.ui: adding request 27 to chunk 24575937 (63-24575999) on /dev/sdc >03:48:38,055 DEBUG storage.ui: disk sdc has 1 chunks >03:48:38,055 DEBUG storage.ui: adding request 28 to chunk 24575937 (63-24575999) on /dev/sdd >03:48:38,055 DEBUG storage.ui: disk sdd has 1 chunks >03:48:38,056 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > >03:48:38,056 DEBUG storage.ui: req: PartitionRequest instance -- >id = 25 name = sda1 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:48:38,057 DEBUG storage.ui: 1 requests and 24573889 (11998MB) left in chunk >03:48:38,057 DEBUG storage.ui: adding 24573889 (11998MB) to 25 (sda1) >03:48:38,057 DEBUG storage.ui: taking back 23789505 (11615MB) from 25 (sda1) >03:48:38,058 DEBUG storage.ui: new grow amount for request 25 (sda1) is 784384 units, or 383MB >03:48:38,058 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999 >sectorSize = 512 > >03:48:38,058 DEBUG storage.ui: req: PartitionRequest instance -- >id = 26 name = sdb1 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:48:38,059 DEBUG storage.ui: 1 requests and 24573889 (11998MB) left in chunk >03:48:38,059 DEBUG storage.ui: adding 24573889 (11998MB) to 26 (sdb1) >03:48:38,059 DEBUG storage.ui: taking back 23789505 (11615MB) from 26 (sdb1) >03:48:38,060 DEBUG storage.ui: new grow amount for request 26 (sdb1) is 784384 units, or 383MB >03:48:38,060 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999 >sectorSize = 512 > >03:48:38,060 DEBUG storage.ui: req: PartitionRequest instance -- >id = 27 name = sdc1 growable = True 
>base = 2048 growth = 0 max_grow = 784384 >done = False >03:48:38,061 DEBUG storage.ui: 1 requests and 24573889 (11998MB) left in chunk >03:48:38,061 DEBUG storage.ui: adding 24573889 (11998MB) to 27 (sdc1) >03:48:38,061 DEBUG storage.ui: taking back 23789505 (11615MB) from 27 (sdc1) >03:48:38,062 DEBUG storage.ui: new grow amount for request 27 (sdc1) is 784384 units, or 383MB >03:48:38,062 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999 >sectorSize = 512 > >03:48:38,062 DEBUG storage.ui: req: PartitionRequest instance -- >id = 28 name = sdd1 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:48:38,063 DEBUG storage.ui: 1 requests and 24573889 (11998MB) left in chunk >03:48:38,063 DEBUG storage.ui: adding 24573889 (11998MB) to 28 (sdd1) >03:48:38,063 DEBUG storage.ui: taking back 23789505 (11615MB) from 28 (sdd1) >03:48:38,064 DEBUG storage.ui: new grow amount for request 28 (sdd1) is 784384 units, or 383MB >03:48:38,064 DEBUG storage.ui: set: ['sda1', 'sdb1', 'sdc1', 'sdd1'] 384 >03:48:38,065 DEBUG storage.ui: min growth is 784384 >03:48:38,065 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 25 name = sda1 growable = True >base = 2048 growth = 784384 max_grow = 784384 >done = True is 784384 >03:48:38,065 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 26 name = sdb1 growable = True >base = 2048 growth = 784384 max_grow = 784384 >done = True is 784384 >03:48:38,066 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 27 name = sdc1 growable = True >base = 2048 growth = 784384 max_grow = 784384 >done = True is 784384 >03:48:38,066 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 28 name = sdd1 growable = True >base = 2048 growth = 784384 max_grow = 784384 >done = True is 784384 >03:48:38,066 DEBUG storage.ui: set: ['sda1', 'sdb1', 'sdc1', 'sdd1'] 384 >03:48:38,067 DEBUG storage.ui: min growth is 784384 >03:48:38,067 DEBUG 
storage.ui: max growth for PartitionRequest instance -- >id = 25 name = sda1 growable = True >base = 2048 growth = 784384 max_grow = 784384 >done = True is 784384 >03:48:38,067 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 26 name = sdb1 growable = True >base = 2048 growth = 784384 max_grow = 784384 >done = True is 784384 >03:48:38,068 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 27 name = sdc1 growable = True >base = 2048 growth = 784384 max_grow = 784384 >done = True is 784384 >03:48:38,068 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 28 name = sdd1 growable = True >base = 2048 growth = 784384 max_grow = 784384 >done = True is 784384 >03:48:38,069 DEBUG storage.ui: growing partitions on sda >03:48:38,069 DEBUG storage.ui: partition sda1 (25): 0 >03:48:38,070 DEBUG storage.ui: new geometry for sda1: parted.Geometry instance -- > start: 2048 end: 788479 length: 786432 > device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae04fa5810> >03:48:38,070 DEBUG storage.ui: removing all non-preexisting partitions ['sda1(id 25)'] from disk(s) ['sda'] >03:48:38,073 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:48:38,073 DEBUG storage.ui: device sda1 new partedPartition None >03:48:38,075 DEBUG storage.ui: PartitionDevice._setDisk: req1 ; new: None ; old: sda ; >03:48:38,077 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sda ; >03:48:38,078 DEBUG storage.ui: back from removeNewPartitions >03:48:38,078 DEBUG storage.ui: extended: None >03:48:38,078 DEBUG storage.ui: setting req1 new geometry: parted.Geometry instance -- > start: 2048 end: 788479 length: 786432 > device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae04fa5810> >03:48:38,080 DEBUG storage.ui: PartitionDevice._setPartedPartition: req1 ; >03:48:38,081 DEBUG storage.ui: device req1 new partedPartition parted.Partition instance -- 
> disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f89690> PedPartition: <_ped.Partition object at 0x7fae04fb3b90> >03:48:38,083 DEBUG storage.ui: PartitionDevice._setDisk: sda1 ; new: sda ; old: None ; >03:48:38,085 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sda ; >03:48:38,088 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:48:38,089 DEBUG storage.ui: device sda1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f897d0> PedPartition: <_ped.Partition object at 0x7fae04fb3b30> >03:48:38,089 DEBUG storage.ui: growing partitions on sdb >03:48:38,090 DEBUG storage.ui: partition sdb1 (26): 0 >03:48:38,090 DEBUG storage.ui: new geometry for sdb1: parted.Geometry instance -- > start: 2048 end: 788479 length: 786432 > device: <parted.device.Device object at 0x7fae05b09ad0> PedGeometry: <_ped.Geometry object at 0x7fae04f89990> >03:48:38,091 DEBUG storage.ui: removing all non-preexisting partitions ['sdb1(id 26)'] from disk(s) ['sdb'] >03:48:38,093 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ; >03:48:38,093 DEBUG storage.ui: device sdb1 new partedPartition None >03:48:38,095 DEBUG storage.ui: PartitionDevice._setDisk: req2 ; new: None ; old: sdb ; >03:48:38,097 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdb ; >03:48:38,097 DEBUG storage.ui: back from removeNewPartitions >03:48:38,098 DEBUG storage.ui: extended: None >03:48:38,098 DEBUG storage.ui: setting req2 new geometry: parted.Geometry instance -- > start: 2048 end: 788479 length: 786432 > device: <parted.device.Device object at 0x7fae05b09ad0> PedGeometry: <_ped.Geometry object at 0x7fae04f89990> >03:48:38,100 DEBUG 
storage.ui: PartitionDevice._setPartedPartition: req2 ; >03:48:38,101 DEBUG storage.ui: device req2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f895d0> PedPartition: <_ped.Partition object at 0x7fae04fb3890> >03:48:38,104 DEBUG storage.ui: PartitionDevice._setDisk: sdb1 ; new: sdb ; old: None ; >03:48:38,106 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdb ; >03:48:38,108 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ; >03:48:38,109 DEBUG storage.ui: device sdb1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f89fd0> PedPartition: <_ped.Partition object at 0x7fae05b0b110> >03:48:38,110 DEBUG storage.ui: growing partitions on sdc >03:48:38,110 DEBUG storage.ui: partition sdc1 (27): 0 >03:48:38,111 DEBUG storage.ui: new geometry for sdc1: parted.Geometry instance -- > start: 2048 end: 788479 length: 786432 > device: <parted.device.Device object at 0x7fae05aee390> PedGeometry: <_ped.Geometry object at 0x7fae04f89b90> >03:48:38,111 DEBUG storage.ui: removing all non-preexisting partitions ['sdc1(id 27)'] from disk(s) ['sdc'] >03:48:38,113 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ; >03:48:38,113 DEBUG storage.ui: device sdc1 new partedPartition None >03:48:38,115 DEBUG storage.ui: PartitionDevice._setDisk: req3 ; new: None ; old: sdc ; >03:48:38,117 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdc ; >03:48:38,118 DEBUG storage.ui: back from removeNewPartitions >03:48:38,118 DEBUG storage.ui: extended: None >03:48:38,119 DEBUG storage.ui: setting req3 new geometry: parted.Geometry instance -- > start: 2048 end: 788479 
length: 786432 > device: <parted.device.Device object at 0x7fae05aee390> PedGeometry: <_ped.Geometry object at 0x7fae04f89b90> >03:48:38,121 DEBUG storage.ui: PartitionDevice._setPartedPartition: req3 ; >03:48:38,122 DEBUG storage.ui: device req3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 1 path: /dev/sdc1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f89c10> PedPartition: <_ped.Partition object at 0x7fae04fb3ad0> >03:48:38,124 DEBUG storage.ui: PartitionDevice._setDisk: sdc1 ; new: sdc ; old: None ; >03:48:38,126 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdc ; >03:48:38,128 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ; >03:48:38,129 DEBUG storage.ui: device sdc1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 1 path: /dev/sdc1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fa5810> PedPartition: <_ped.Partition object at 0x7fae04fb3a10> >03:48:38,129 DEBUG storage.ui: growing partitions on sdd >03:48:38,130 DEBUG storage.ui: partition sdd1 (28): 0 >03:48:38,130 DEBUG storage.ui: new geometry for sdd1: parted.Geometry instance -- > start: 2048 end: 788479 length: 786432 > device: <parted.device.Device object at 0x7fae05b09b90> PedGeometry: <_ped.Geometry object at 0x7fae04f89cd0> >03:48:38,131 DEBUG storage.ui: removing all non-preexisting partitions ['sdd1(id 28)'] from disk(s) ['sdd'] >03:48:38,133 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ; >03:48:38,134 DEBUG storage.ui: device sdd1 new partedPartition None >03:48:38,136 DEBUG storage.ui: PartitionDevice._setDisk: req4 ; new: None ; old: sdd ; >03:48:38,138 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdd ; >03:48:38,138 DEBUG storage.ui: back from removeNewPartitions >03:48:38,139 
DEBUG storage.ui: extended: None >03:48:38,139 DEBUG storage.ui: setting req4 new geometry: parted.Geometry instance -- > start: 2048 end: 788479 length: 786432 > device: <parted.device.Device object at 0x7fae05b09b90> PedGeometry: <_ped.Geometry object at 0x7fae04f89cd0> >03:48:38,141 DEBUG storage.ui: PartitionDevice._setPartedPartition: req4 ; >03:48:38,142 DEBUG storage.ui: device req4 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 1 path: /dev/sdd1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fa5a50> PedPartition: <_ped.Partition object at 0x7fae04fb3a70> >03:48:38,144 DEBUG storage.ui: PartitionDevice._setDisk: sdd1 ; new: sdd ; old: None ; >03:48:38,146 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdd ; >03:48:38,149 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ; >03:48:38,150 DEBUG storage.ui: device sdd1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 1 path: /dev/sdd1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fa5b10> PedPartition: <_ped.Partition object at 0x7fae04fb3cb0> >03:48:38,150 DEBUG storage.ui: fixing size of non-existent 384MB partition sda1 (25) with non-existent mdmember at 384.00 >03:48:38,151 DEBUG storage.ui: fixing size of non-existent 384MB partition sdb1 (26) with non-existent mdmember at 384.00 >03:48:38,152 DEBUG storage.ui: fixing size of non-existent 384MB partition sdc1 (27) with non-existent mdmember at 384.00 >03:48:38,152 DEBUG storage.ui: fixing size of non-existent 384MB partition sdd1 (28) with non-existent mdmember at 384.00 >03:48:38,162 DEBUG storage.ui: SwapSpace.__init__: mountpoint: None ; >03:48:38,162 DEBUG storage.ui: getFormat('swap') returning SwapSpace instance >03:48:38,164 DEBUG storage.ui: PartitionDevice.addChild: 
kids: 0 ; name: sda1 ; >03:48:38,166 DEBUG storage.ui: PartitionDevice.addChild: kids: 0 ; name: sdb1 ; >03:48:38,168 DEBUG storage.ui: PartitionDevice.addChild: kids: 0 ; name: sdc1 ; >03:48:38,170 DEBUG storage.ui: PartitionDevice.addChild: kids: 0 ; name: sdd1 ; >03:48:38,172 DEBUG storage.ui: MDRaidArrayDevice._setFormat: swap ; current: None ; type: swap ; >03:48:38,173 INFO storage.ui: added mdarray swap (id 29) to device tree >03:48:38,174 INFO storage.ui: registered action: [43] Create Device mdarray swap (id 29) >03:48:38,174 DEBUG storage.ui: getFormat('None') returning DeviceFormat instance >03:48:38,175 INFO storage.ui: registered action: [44] Create Format swap on mdarray swap (id 29) >03:48:38,176 DEBUG storage.ui: raw RAID 10 size == 768.0 >03:48:38,176 INFO storage.ui: Using 0MB superBlockSize >03:48:38,177 DEBUG storage.ui: non-existent RAID 10 size == 768.0 >03:48:38,178 DEBUG storage.ui: raw RAID 10 size == 768.0 >03:48:38,178 INFO storage.ui: Using 0MB superBlockSize >03:48:38,179 DEBUG storage.ui: non-existent RAID 10 size == 768.0 >03:48:38,182 DEBUG blivet: raw RAID 10 size == 768.0 >03:48:38,182 INFO blivet: Using 0MB superBlockSize >03:48:38,183 DEBUG blivet: non-existent RAID 10 size == 768.0 >03:48:38,196 DEBUG blivet: raw RAID 10 size == 768.0 >03:48:38,197 INFO blivet: Using 0MB superBlockSize >03:48:38,197 DEBUG blivet: non-existent RAID 10 size == 768.0 >03:48:38,201 DEBUG blivet: raw RAID 10 size == 768.0 >03:48:38,201 INFO blivet: Using 0MB superBlockSize >03:48:38,202 DEBUG blivet: non-existent RAID 10 size == 768.0 >03:48:39,763 DEBUG blivet: raw RAID 10 size == 768.0 >03:48:39,765 INFO blivet: Using 0MB superBlockSize >03:48:39,767 DEBUG blivet: non-existent RAID 10 size == 768.0 >03:48:39,776 DEBUG blivet: SwapSpace.__init__: >03:48:39,777 DEBUG blivet: getFormat('swap') returning SwapSpace instance >03:48:39,783 DEBUG storage.ui: instantiating <class 'blivet.devicefactory.MDFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 
768, ['sda', 'sdb', 'sdc', 'sdd'], {'encrypted': False, 'raid_level': 'raid10'} >03:48:39,796 DEBUG storage.ui: raw RAID 10 size == 768.0 >03:48:39,797 INFO storage.ui: Using 0MB superBlockSize >03:48:39,798 DEBUG storage.ui: non-existent RAID 10 size == 768.0 >03:48:39,798 DEBUG storage.ui: Blivet.factoryDevice: 1 ; 768 ; container_raid_level: None ; name: swap ; encrypted: False ; container_encrypted: False ; disks: [DiskDevice instance (0x7fae05319950) -- > name = sda status = True kids = 1 id = 1 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 0 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sda type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 0 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae127eacb0> > target size = 0 path = /dev/sda > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae053199d0>, DiskDevice instance (0x7fae05b116d0) -- > name = sdb status = True kids = 1 id = 14 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 16 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdb type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 768 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f674a70> > target size = 0 path 
= /dev/sdb > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05b11750>, DiskDevice instance (0x7fae05aeabd0) -- > name = sdc status = True kids = 1 id = 11 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 32 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdc type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 512 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f674170> > target size = 0 path = /dev/sdc > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05aeac50>, DiskDevice instance (0x7fae05aea190) -- > name = sdd status = True kids = 1 id = 8 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 48 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdd type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 256 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae27c59680> > target size = 0 path = /dev/sdd > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05aea210>] ; raid_level: raid10 ; label: ; container_name: None ; device: non-existent 768MB mdarray swap (29) with non-existent swap ; mountpoint: None 
; fstype: swap ; container_size: 0 ; >03:48:39,802 DEBUG storage.ui: raw RAID 10 size == 768.0 >03:48:39,802 INFO storage.ui: Using 0MB superBlockSize >03:48:39,803 DEBUG storage.ui: non-existent RAID 10 size == 768.0 >03:48:39,804 DEBUG storage.ui: instantiating <class 'blivet.devicefactory.MDFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 768, ['sda', 'sdb', 'sdc', 'sdd'], {'container_raid_level': None, 'name': 'swap', 'encrypted': False, 'container_encrypted': False, 'raid_level': 'raid10', 'label': '', 'container_name': None, 'device': MDRaidArrayDevice instance (0x7fae04f96bd0) -- > name = swap status = False kids = 0 id = 29 > parents = ['non-existent 384MB partition sda1 (25) with non-existent mdmember', > 'non-existent 384MB partition sdb1 (26) with non-existent mdmember', > 'non-existent 384MB partition sdc1 (27) with non-existent mdmember', > 'non-existent 384MB partition sdd1 (28) with non-existent mdmember'] > uuid = None size = 768.0 > format = non-existent swap > major = 0 minor = 0 exists = False protected = False > sysfs path = partedDevice = None > target size = 768 path = /dev/md/swap > format args = None originalFormat = swap level = 10 spares = 0 > members = 4 > total devices = 4 metadata version = default, 'mountpoint': None, 'fstype': 'swap', 'container_size': 0} >03:48:39,806 DEBUG storage.ui: MDFactory.configure: parent_factory: None ; >03:48:39,807 DEBUG storage.ui: starting Blivet copy >03:48:39,849 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:48:39,850 DEBUG storage.ui: device sda1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04f78350> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fa0c50> PedPartition: <_ped.Partition object at 0x7fae05afbf50> >03:48:39,853 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ; >03:48:39,855 DEBUG storage.ui: device sdb1 new 
partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04f4b5d0> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fa0cd0> PedPartition: <_ped.Partition object at 0x7fae04fb3bf0> >03:48:39,857 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ; >03:48:39,858 DEBUG storage.ui: device sdc1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04f85f90> fileSystem: None > number: 1 path: /dev/sdc1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fa0e10> PedPartition: <_ped.Partition object at 0x7fae04fb3950> >03:48:39,861 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ; >03:48:39,862 DEBUG storage.ui: device sdd1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04f93c90> fileSystem: None > number: 1 path: /dev/sdd1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fae050> PedPartition: <_ped.Partition object at 0x7fae04fb3d10> >03:48:39,863 DEBUG storage.ui: finished Blivet copy >03:48:39,864 INFO storage.ui: Using 0MB superBlockSize >03:48:39,865 DEBUG storage.ui: child factory class: <class 'blivet.devicefactory.PartitionSetFactory'> >03:48:39,869 DEBUG storage.ui: child factory args: [<blivet.Blivet object at 0x7fae05326d50>, 1536.0, [DiskDevice instance (0x7fae05319950) -- > name = sda status = True kids = 1 id = 1 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 0 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sda type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: 
False dirty: False bootDirty: False > host: 0 did: 0 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae127eacb0> > target size = 0 path = /dev/sda > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae053199d0>, DiskDevice instance (0x7fae05b116d0) -- > name = sdb status = True kids = 1 id = 14 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 16 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdb type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 768 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f674a70> > target size = 0 path = /dev/sdb > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05b11750>, DiskDevice instance (0x7fae05aeabd0) -- > name = sdc status = True kids = 1 id = 11 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 32 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdc type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 512 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f674170> > target size = 0 path = /dev/sdc > format args = [] originalFormat = disklabel removable = 
False partedDevice = <parted.device.Device object at 0x7fae05aeac50>, DiskDevice instance (0x7fae05aea190) -- > name = sdd status = True kids = 1 id = 8 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 48 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdd type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 256 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae27c59680> > target size = 0 path = /dev/sdd > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05aea210>]] >03:48:39,871 DEBUG storage.ui: child factory kwargs: {'fstype': 'mdmember'} >03:48:39,873 DEBUG storage.ui: PartitionSetFactory.configure: parent_factory: <blivet.devicefactory.MDFactory object at 0x7fae04f78d50> ; >03:48:39,875 DEBUG storage.ui: raw RAID 10 size == 768.0 >03:48:39,875 INFO storage.ui: Using 0MB superBlockSize >03:48:39,876 DEBUG storage.ui: non-existent RAID 10 size == 768.0 >03:48:39,876 DEBUG storage.ui: parent factory container: non-existent 768MB mdarray swap (29) with non-existent swap >03:48:39,877 DEBUG storage.ui: members: ['sda1', 'sdb1', 'sdc1', 'sdd1'] >03:48:39,878 DEBUG storage.ui: add_disks: [] >03:48:39,878 DEBUG storage.ui: remove_disks: [] >03:48:39,880 DEBUG storage.ui: MDRaidMember.__init__: >03:48:39,881 DEBUG storage.ui: getFormat('mdmember') returning MDRaidMember instance >03:48:39,882 INFO storage.ui: Using 0MB superBlockSize >03:48:39,882 DEBUG storage.ui: adding a SameSizeSet with size 1536 >03:48:39,885 DEBUG storage.ui: DiskDevice.setup: sda ; status: True ; controllable: True ; orig: False ; >03:48:39,887 
DEBUG storage.ui: DiskDevice.setup: sdb ; status: True ; controllable: True ; orig: False ; >03:48:39,889 DEBUG storage.ui: DiskDevice.setup: sdc ; status: True ; controllable: True ; orig: False ; >03:48:39,891 DEBUG storage.ui: DiskDevice.setup: sdd ; status: True ; controllable: True ; orig: False ; >03:48:39,893 DEBUG storage.ui: removing all non-preexisting partitions ['sda1(id 25)', 'sdb1(id 26)', 'sdc1(id 27)', 'sdd1(id 28)'] from disk(s) ['sda', 'sdb', 'sdc', 'sdd'] >03:48:39,896 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:48:39,896 DEBUG storage.ui: device sda1 new partedPartition None >03:48:39,898 DEBUG storage.ui: PartitionDevice._setDisk: req1 ; new: None ; old: sda ; >03:48:39,901 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sda ; >03:48:39,903 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ; >03:48:39,904 DEBUG storage.ui: device sdb1 new partedPartition None >03:48:39,906 DEBUG storage.ui: PartitionDevice._setDisk: req2 ; new: None ; old: sdb ; >03:48:39,909 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdb ; >03:48:39,911 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ; >03:48:39,912 DEBUG storage.ui: device sdc1 new partedPartition None >03:48:39,914 DEBUG storage.ui: PartitionDevice._setDisk: req3 ; new: None ; old: sdc ; >03:48:39,916 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdc ; >03:48:39,918 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ; >03:48:39,919 DEBUG storage.ui: device sdd1 new partedPartition None >03:48:39,921 DEBUG storage.ui: PartitionDevice._setDisk: req4 ; new: None ; old: sdd ; >03:48:39,923 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdd ; >03:48:39,925 DEBUG storage.ui: allocatePartitions: disks=['sda', 'sdb', 'sdc', 'sdd'] ; partitions=['req1(id 25)', 'req2(id 26)', 'req3(id 27)', 'req4(id 28)'] >03:48:39,925 DEBUG storage.ui: removing all non-preexisting partitions ['req1(id 25)', 'req2(id 26)', 'req3(id 
27)', 'req4(id 28)'] from disk(s) ['sda', 'sdb', 'sdc', 'sdd'] >03:48:39,928 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:39,930 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:39,931 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:39,933 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:39,936 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:39,936 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:39,937 DEBUG storage.ui: allocating partition: req1 ; id: 25 ; disks: ['sda'] ; >boot: False ; primary: False ; size: 1MB ; grow: True ; max_size: 384 >03:48:39,937 DEBUG storage.ui: checking freespace on sda >03:48:39,938 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sda part_type=0 req_size=1MB boot=False best=None grow=True >03:48:39,939 DEBUG storage.ui: current free range is 63-24575999 (11999MB) >03:48:39,940 DEBUG storage.ui: evaluating growth potential for new layout >03:48:39,940 DEBUG storage.ui: calculating growth for disk /dev/sdd >03:48:39,941 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999 >sectorSize = 512 > >03:48:39,942 DEBUG storage.ui: disk /dev/sdd growth: 0 (0MB) >03:48:39,942 DEBUG storage.ui: calculating growth for disk /dev/sdb >03:48:39,943 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999 >sectorSize = 512 > >03:48:39,943 DEBUG storage.ui: disk /dev/sdb growth: 0 (0MB) >03:48:39,944 DEBUG storage.ui: calculating growth for disk /dev/sdc >03:48:39,945 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999 >sectorSize = 512 > >03:48:39,945 DEBUG storage.ui: disk /dev/sdc growth: 0 (0MB) >03:48:39,946 DEBUG storage.ui: calculating growth for disk /dev/sda >03:48:39,946 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:48:39,949 DEBUG 
storage.ui: PartitionDevice._setPartedPartition: req1 ; >03:48:39,951 DEBUG storage.ui: device req1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04faed90> PedPartition: <_ped.Partition object at 0x7fae04fb3b30> >03:48:39,953 DEBUG storage.ui: PartitionDevice._setDisk: sda1 ; new: sda ; old: None ; >03:48:39,956 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sda ; >03:48:39,956 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda >03:48:39,957 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > >03:48:39,958 DEBUG storage.ui: req: PartitionRequest instance -- >id = 25 name = sda1 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:48:39,958 DEBUG storage.ui: 1 requests and 24573889 (11998MB) left in chunk >03:48:39,959 DEBUG storage.ui: adding 24573889 (11998MB) to 25 (sda1) >03:48:39,960 DEBUG storage.ui: taking back 23789505 (11615MB) from 25 (sda1) >03:48:39,960 DEBUG storage.ui: new grow amount for request 25 (sda1) is 784384 units, or 383MB >03:48:39,961 DEBUG storage.ui: request 25 (sda1) growth: 784384 (383MB) size: 384MB >03:48:39,961 DEBUG storage.ui: disk /dev/sda growth: 784384 (383MB) >03:48:39,963 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:48:39,964 DEBUG storage.ui: device sda1 new partedPartition None >03:48:39,966 DEBUG storage.ui: PartitionDevice._setDisk: req1 ; new: None ; old: sda ; >03:48:39,969 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sda ; >03:48:39,969 DEBUG storage.ui: total growth: 784384 sectors >03:48:39,970 DEBUG storage.ui: updating use_disk to sda, type: 0 >03:48:39,970 DEBUG storage.ui: new free: 63-24575999 / 11999MB >03:48:39,971 DEBUG storage.ui: new free allows for 784384 sectors of 
growth >03:48:39,972 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:48:39,973 DEBUG storage.ui: created partition sda1 of 1MB and added it to /dev/sda >03:48:39,975 DEBUG storage.ui: PartitionDevice._setPartedPartition: req1 ; >03:48:39,976 DEBUG storage.ui: device req1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fae290> PedPartition: <_ped.Partition object at 0x7fae04fb3a70> >03:48:39,978 DEBUG storage.ui: PartitionDevice._setDisk: sda1 ; new: sda ; old: None ; >03:48:39,981 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sda ; >03:48:39,983 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:48:39,984 DEBUG storage.ui: device sda1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fae8d0> PedPartition: <_ped.Partition object at 0x7fae04fb3830> >03:48:39,987 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:39,990 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:39,990 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:39,993 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:39,995 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:39,995 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:39,996 DEBUG storage.ui: allocating partition: req2 ; id: 26 ; disks: ['sdb'] ; >boot: False ; primary: False ; size: 1MB ; grow: True ; max_size: 384 >03:48:39,997 DEBUG storage.ui: checking freespace on sdb >03:48:39,998 DEBUG storage.ui: getBestFreeSpaceRegion: 
disk=/dev/sdb part_type=0 req_size=1MB boot=False best=None grow=True >03:48:39,999 DEBUG storage.ui: current free range is 63-24575999 (11999MB) >03:48:40,000 DEBUG storage.ui: evaluating growth potential for new layout >03:48:40,000 DEBUG storage.ui: calculating growth for disk /dev/sdd >03:48:40,001 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999 >sectorSize = 512 > >03:48:40,001 DEBUG storage.ui: disk /dev/sdd growth: 0 (0MB) >03:48:40,002 DEBUG storage.ui: calculating growth for disk /dev/sdb >03:48:40,003 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:48:40,005 DEBUG storage.ui: PartitionDevice._setPartedPartition: req2 ; >03:48:40,007 DEBUG storage.ui: device req2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04faecd0> PedPartition: <_ped.Partition object at 0x7fae05b0b110> >03:48:40,009 DEBUG storage.ui: PartitionDevice._setDisk: sdb1 ; new: sdb ; old: None ; >03:48:40,011 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdb ; >03:48:40,013 DEBUG storage.ui: adding request 26 to chunk 24575937 (63-24575999) on /dev/sdb >03:48:40,013 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999 >sectorSize = 512 > >03:48:40,014 DEBUG storage.ui: req: PartitionRequest instance -- >id = 26 name = sdb1 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:48:40,014 DEBUG storage.ui: 1 requests and 24573889 (11998MB) left in chunk >03:48:40,015 DEBUG storage.ui: adding 24573889 (11998MB) to 26 (sdb1) >03:48:40,015 DEBUG storage.ui: taking back 23789505 (11615MB) from 26 (sdb1) >03:48:40,016 DEBUG storage.ui: new grow amount for request 26 (sdb1) is 784384 units, or 383MB >03:48:40,017 DEBUG storage.ui: request 26 (sdb1) growth: 784384 (383MB) size: 384MB 
>03:48:40,017 DEBUG storage.ui: disk /dev/sdb growth: 784384 (383MB) >03:48:40,018 DEBUG storage.ui: calculating growth for disk /dev/sdc >03:48:40,018 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999 >sectorSize = 512 > >03:48:40,019 DEBUG storage.ui: disk /dev/sdc growth: 0 (0MB) >03:48:40,019 DEBUG storage.ui: calculating growth for disk /dev/sda >03:48:40,020 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda >03:48:40,021 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > >03:48:40,022 DEBUG storage.ui: req: PartitionRequest instance -- >id = 25 name = sda1 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:48:40,022 DEBUG storage.ui: 1 requests and 24573889 (11998MB) left in chunk >03:48:40,023 DEBUG storage.ui: adding 24573889 (11998MB) to 25 (sda1) >03:48:40,024 DEBUG storage.ui: taking back 23789505 (11615MB) from 25 (sda1) >03:48:40,024 DEBUG storage.ui: new grow amount for request 25 (sda1) is 784384 units, or 383MB >03:48:40,025 DEBUG storage.ui: request 25 (sda1) growth: 784384 (383MB) size: 384MB >03:48:40,026 DEBUG storage.ui: disk /dev/sda growth: 784384 (383MB) >03:48:40,028 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ; >03:48:40,028 DEBUG storage.ui: device sdb1 new partedPartition None >03:48:40,030 DEBUG storage.ui: PartitionDevice._setDisk: req2 ; new: None ; old: sdb ; >03:48:40,033 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdb ; >03:48:40,033 DEBUG storage.ui: total growth: 1568768 sectors >03:48:40,034 DEBUG storage.ui: updating use_disk to sdb, type: 0 >03:48:40,035 DEBUG storage.ui: new free: 63-24575999 / 11999MB >03:48:40,035 DEBUG storage.ui: new free allows for 1568768 sectors of growth >03:48:40,036 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:48:40,037 DEBUG storage.ui: created partition sdb1 of 1MB and added it to /dev/sdb >03:48:40,039 DEBUG 
storage.ui: PartitionDevice._setPartedPartition: req2 ; >03:48:40,040 DEBUG storage.ui: device req2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04faef10> PedPartition: <_ped.Partition object at 0x7fae04fb3b90> >03:48:40,043 DEBUG storage.ui: PartitionDevice._setDisk: sdb1 ; new: sdb ; old: None ; >03:48:40,045 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdb ; >03:48:40,048 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ; >03:48:40,049 DEBUG storage.ui: device sdb1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04faee10> PedPartition: <_ped.Partition object at 0x7fae04fb3890> >03:48:40,052 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:40,055 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:40,055 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:40,058 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:40,060 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:40,061 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:40,061 DEBUG storage.ui: allocating partition: req3 ; id: 27 ; disks: ['sdc'] ; >boot: False ; primary: False ; size: 1MB ; grow: True ; max_size: 384 >03:48:40,062 DEBUG storage.ui: checking freespace on sdc >03:48:40,063 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdc part_type=0 req_size=1MB boot=False best=None grow=True >03:48:40,065 DEBUG storage.ui: current free range is 63-24575999 (11999MB) >03:48:40,065 DEBUG storage.ui: 
evaluating growth potential for new layout >03:48:40,066 DEBUG storage.ui: calculating growth for disk /dev/sdd >03:48:40,066 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999 >sectorSize = 512 > >03:48:40,067 DEBUG storage.ui: disk /dev/sdd growth: 0 (0MB) >03:48:40,067 DEBUG storage.ui: calculating growth for disk /dev/sdb >03:48:40,068 DEBUG storage.ui: adding request 26 to chunk 24575937 (63-24575999) on /dev/sdb >03:48:40,069 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999 >sectorSize = 512 > >03:48:40,069 DEBUG storage.ui: req: PartitionRequest instance -- >id = 26 name = sdb1 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:48:40,070 DEBUG storage.ui: 1 requests and 24573889 (11998MB) left in chunk >03:48:40,070 DEBUG storage.ui: adding 24573889 (11998MB) to 26 (sdb1) >03:48:40,071 DEBUG storage.ui: taking back 23789505 (11615MB) from 26 (sdb1) >03:48:40,071 DEBUG storage.ui: new grow amount for request 26 (sdb1) is 784384 units, or 383MB >03:48:40,072 DEBUG storage.ui: request 26 (sdb1) growth: 784384 (383MB) size: 384MB >03:48:40,073 DEBUG storage.ui: disk /dev/sdb growth: 784384 (383MB) >03:48:40,073 DEBUG storage.ui: calculating growth for disk /dev/sdc >03:48:40,074 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:48:40,076 DEBUG storage.ui: PartitionDevice._setPartedPartition: req3 ; >03:48:40,077 DEBUG storage.ui: device req3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 1 path: /dev/sdc1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fb2150> PedPartition: <_ped.Partition object at 0x7fae04fb3a10> >03:48:40,080 DEBUG storage.ui: PartitionDevice._setDisk: sdc1 ; new: sdc ; old: None ; >03:48:40,083 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdc ; >03:48:40,084 DEBUG storage.ui: adding request 27 to 
chunk 24575937 (63-24575999) on /dev/sdc >03:48:40,084 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999 >sectorSize = 512 > >03:48:40,085 DEBUG storage.ui: req: PartitionRequest instance -- >id = 27 name = sdc1 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:48:40,085 DEBUG storage.ui: 1 requests and 24573889 (11998MB) left in chunk >03:48:40,086 DEBUG storage.ui: adding 24573889 (11998MB) to 27 (sdc1) >03:48:40,087 DEBUG storage.ui: taking back 23789505 (11615MB) from 27 (sdc1) >03:48:40,087 DEBUG storage.ui: new grow amount for request 27 (sdc1) is 784384 units, or 383MB >03:48:40,088 DEBUG storage.ui: request 27 (sdc1) growth: 784384 (383MB) size: 384MB >03:48:40,088 DEBUG storage.ui: disk /dev/sdc growth: 784384 (383MB) >03:48:40,089 DEBUG storage.ui: calculating growth for disk /dev/sda >03:48:40,089 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda >03:48:40,090 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > >03:48:40,090 DEBUG storage.ui: req: PartitionRequest instance -- >id = 25 name = sda1 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:48:40,091 DEBUG storage.ui: 1 requests and 24573889 (11998MB) left in chunk >03:48:40,092 DEBUG storage.ui: adding 24573889 (11998MB) to 25 (sda1) >03:48:40,092 DEBUG storage.ui: taking back 23789505 (11615MB) from 25 (sda1) >03:48:40,093 DEBUG storage.ui: new grow amount for request 25 (sda1) is 784384 units, or 383MB >03:48:40,093 DEBUG storage.ui: request 25 (sda1) growth: 784384 (383MB) size: 384MB >03:48:40,094 DEBUG storage.ui: disk /dev/sda growth: 784384 (383MB) >03:48:40,096 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ; >03:48:40,097 DEBUG storage.ui: device sdc1 new partedPartition None >03:48:40,099 DEBUG storage.ui: PartitionDevice._setDisk: req3 ; new: None ; old: sdc ; >03:48:40,101 DEBUG storage.ui: 
DiskDevice.removeChild: kids: 1 ; name: sdc ; >03:48:40,101 DEBUG storage.ui: total growth: 2353152 sectors >03:48:40,102 DEBUG storage.ui: updating use_disk to sdc, type: 0 >03:48:40,102 DEBUG storage.ui: new free: 63-24575999 / 11999MB >03:48:40,103 DEBUG storage.ui: new free allows for 2353152 sectors of growth >03:48:40,104 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:48:40,105 DEBUG storage.ui: created partition sdc1 of 1MB and added it to /dev/sdc >03:48:40,107 DEBUG storage.ui: PartitionDevice._setPartedPartition: req3 ; >03:48:40,107 DEBUG storage.ui: device req3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 1 path: /dev/sdc1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fae950> PedPartition: <_ped.Partition object at 0x7fae05b0b110> >03:48:40,110 DEBUG storage.ui: PartitionDevice._setDisk: sdc1 ; new: sdc ; old: None ; >03:48:40,112 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdc ; >03:48:40,115 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ; >03:48:40,116 DEBUG storage.ui: device sdc1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 1 path: /dev/sdc1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04faef10> PedPartition: <_ped.Partition object at 0x7fae04fb3b90> >03:48:40,118 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:40,120 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:40,121 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:40,123 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:40,126 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:40,126 DEBUG 
storage.ui: resolved 'sda' to 'sda' (disk) >03:48:40,127 DEBUG storage.ui: allocating partition: req4 ; id: 28 ; disks: ['sdd'] ; >boot: False ; primary: False ; size: 1MB ; grow: True ; max_size: 384 >03:48:40,127 DEBUG storage.ui: checking freespace on sdd >03:48:40,129 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdd part_type=0 req_size=1MB boot=False best=None grow=True >03:48:40,129 DEBUG storage.ui: current free range is 63-24575999 (11999MB) >03:48:40,130 DEBUG storage.ui: evaluating growth potential for new layout >03:48:40,131 DEBUG storage.ui: calculating growth for disk /dev/sdd >03:48:40,131 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:48:40,134 DEBUG storage.ui: PartitionDevice._setPartedPartition: req4 ; >03:48:40,135 DEBUG storage.ui: device req4 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 1 path: /dev/sdd1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f89650> PedPartition: <_ped.Partition object at 0x7fae04fb3cb0> >03:48:40,137 DEBUG storage.ui: PartitionDevice._setDisk: sdd1 ; new: sdd ; old: None ; >03:48:40,140 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdd ; >03:48:40,141 DEBUG storage.ui: adding request 28 to chunk 24575937 (63-24575999) on /dev/sdd >03:48:40,141 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999 >sectorSize = 512 > >03:48:40,142 DEBUG storage.ui: req: PartitionRequest instance -- >id = 28 name = sdd1 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:48:40,142 DEBUG storage.ui: 1 requests and 24573889 (11998MB) left in chunk >03:48:40,143 DEBUG storage.ui: adding 24573889 (11998MB) to 28 (sdd1) >03:48:40,144 DEBUG storage.ui: taking back 23789505 (11615MB) from 28 (sdd1) >03:48:40,144 DEBUG storage.ui: new grow amount for request 28 (sdd1) is 784384 units, or 383MB >03:48:40,145 DEBUG storage.ui: 
request 28 (sdd1) growth: 784384 (383MB) size: 384MB >03:48:40,145 DEBUG storage.ui: disk /dev/sdd growth: 784384 (383MB) >03:48:40,146 DEBUG storage.ui: calculating growth for disk /dev/sdb >03:48:40,147 DEBUG storage.ui: adding request 26 to chunk 24575937 (63-24575999) on /dev/sdb >03:48:40,148 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999 >sectorSize = 512 > >03:48:40,148 DEBUG storage.ui: req: PartitionRequest instance -- >id = 26 name = sdb1 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:48:40,149 DEBUG storage.ui: 1 requests and 24573889 (11998MB) left in chunk >03:48:40,149 DEBUG storage.ui: adding 24573889 (11998MB) to 26 (sdb1) >03:48:40,150 DEBUG storage.ui: taking back 23789505 (11615MB) from 26 (sdb1) >03:48:40,151 DEBUG storage.ui: new grow amount for request 26 (sdb1) is 784384 units, or 383MB >03:48:40,151 DEBUG storage.ui: request 26 (sdb1) growth: 784384 (383MB) size: 384MB >03:48:40,152 DEBUG storage.ui: disk /dev/sdb growth: 784384 (383MB) >03:48:40,152 DEBUG storage.ui: calculating growth for disk /dev/sdc >03:48:40,153 DEBUG storage.ui: adding request 27 to chunk 24575937 (63-24575999) on /dev/sdc >03:48:40,154 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999 >sectorSize = 512 > >03:48:40,154 DEBUG storage.ui: req: PartitionRequest instance -- >id = 27 name = sdc1 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:48:40,155 DEBUG storage.ui: 1 requests and 24573889 (11998MB) left in chunk >03:48:40,156 DEBUG storage.ui: adding 24573889 (11998MB) to 27 (sdc1) >03:48:40,156 DEBUG storage.ui: taking back 23789505 (11615MB) from 27 (sdc1) >03:48:40,157 DEBUG storage.ui: new grow amount for request 27 (sdc1) is 784384 units, or 383MB >03:48:40,157 DEBUG storage.ui: request 27 (sdc1) growth: 784384 (383MB) size: 384MB >03:48:40,157 DEBUG storage.ui: disk /dev/sdc growth: 784384 (383MB) >03:48:40,158 DEBUG storage.ui: 
calculating growth for disk /dev/sda >03:48:40,158 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda >03:48:40,159 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > >03:48:40,159 DEBUG storage.ui: req: PartitionRequest instance -- >id = 25 name = sda1 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:48:40,159 DEBUG storage.ui: 1 requests and 24573889 (11998MB) left in chunk >03:48:40,160 DEBUG storage.ui: adding 24573889 (11998MB) to 25 (sda1) >03:48:40,160 DEBUG storage.ui: taking back 23789505 (11615MB) from 25 (sda1) >03:48:40,160 DEBUG storage.ui: new grow amount for request 25 (sda1) is 784384 units, or 383MB >03:48:40,161 DEBUG storage.ui: request 25 (sda1) growth: 784384 (383MB) size: 384MB >03:48:40,161 DEBUG storage.ui: disk /dev/sda growth: 784384 (383MB) >03:48:40,163 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ; >03:48:40,164 DEBUG storage.ui: device sdd1 new partedPartition None >03:48:40,166 DEBUG storage.ui: PartitionDevice._setDisk: req4 ; new: None ; old: sdd ; >03:48:40,168 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdd ; >03:48:40,168 DEBUG storage.ui: total growth: 3137536 sectors >03:48:40,168 DEBUG storage.ui: updating use_disk to sdd, type: 0 >03:48:40,169 DEBUG storage.ui: new free: 63-24575999 / 11999MB >03:48:40,169 DEBUG storage.ui: new free allows for 3137536 sectors of growth >03:48:40,170 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:48:40,171 DEBUG storage.ui: created partition sdd1 of 1MB and added it to /dev/sdd >03:48:40,173 DEBUG storage.ui: PartitionDevice._setPartedPartition: req4 ; >03:48:40,174 DEBUG storage.ui: device req4 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 1 path: /dev/sdd1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 
0x7fae04f897d0> PedPartition: <_ped.Partition object at 0x7fae04fb3a10> >03:48:40,176 DEBUG storage.ui: PartitionDevice._setDisk: sdd1 ; new: sdd ; old: None ; >03:48:40,178 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdd ; >03:48:40,180 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ; >03:48:40,181 DEBUG storage.ui: device sdd1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 1 path: /dev/sdd1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f89e90> PedPartition: <_ped.Partition object at 0x7fae04fb3c50> >03:48:40,182 DEBUG storage.ui: growPartitions: disks=['sda', 'sdb', 'sdc', 'sdd'], partitions=['sda1(id 25)', 'sdb1(id 26)', 'sdc1(id 27)', 'sdd1(id 28)'] >03:48:40,182 DEBUG storage.ui: growable partitions are ['sda1', 'sdb1', 'sdc1', 'sdd1'] >03:48:40,183 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda >03:48:40,183 DEBUG storage.ui: disk sda has 1 chunks >03:48:40,184 DEBUG storage.ui: adding request 26 to chunk 24575937 (63-24575999) on /dev/sdb >03:48:40,184 DEBUG storage.ui: disk sdb has 1 chunks >03:48:40,185 DEBUG storage.ui: adding request 27 to chunk 24575937 (63-24575999) on /dev/sdc >03:48:40,185 DEBUG storage.ui: disk sdc has 1 chunks >03:48:40,185 DEBUG storage.ui: adding request 28 to chunk 24575937 (63-24575999) on /dev/sdd >03:48:40,186 DEBUG storage.ui: disk sdd has 1 chunks >03:48:40,186 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > >03:48:40,187 DEBUG storage.ui: req: PartitionRequest instance -- >id = 25 name = sda1 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:48:40,187 DEBUG storage.ui: 1 requests and 24573889 (11998MB) left in chunk >03:48:40,187 DEBUG storage.ui: adding 24573889 (11998MB) to 25 (sda1) >03:48:40,188 DEBUG storage.ui: taking back 23789505 (11615MB) 
from 25 (sda1) >03:48:40,188 DEBUG storage.ui: new grow amount for request 25 (sda1) is 784384 units, or 383MB >03:48:40,188 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999 >sectorSize = 512 > >03:48:40,189 DEBUG storage.ui: req: PartitionRequest instance -- >id = 26 name = sdb1 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:48:40,189 DEBUG storage.ui: 1 requests and 24573889 (11998MB) left in chunk >03:48:40,189 DEBUG storage.ui: adding 24573889 (11998MB) to 26 (sdb1) >03:48:40,190 DEBUG storage.ui: taking back 23789505 (11615MB) from 26 (sdb1) >03:48:40,190 DEBUG storage.ui: new grow amount for request 26 (sdb1) is 784384 units, or 383MB >03:48:40,191 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999 >sectorSize = 512 > >03:48:40,191 DEBUG storage.ui: req: PartitionRequest instance -- >id = 27 name = sdc1 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:48:40,191 DEBUG storage.ui: 1 requests and 24573889 (11998MB) left in chunk >03:48:40,192 DEBUG storage.ui: adding 24573889 (11998MB) to 27 (sdc1) >03:48:40,192 DEBUG storage.ui: taking back 23789505 (11615MB) from 27 (sdc1) >03:48:40,193 DEBUG storage.ui: new grow amount for request 27 (sdc1) is 784384 units, or 383MB >03:48:40,193 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999 >sectorSize = 512 > >03:48:40,193 DEBUG storage.ui: req: PartitionRequest instance -- >id = 28 name = sdd1 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:48:40,194 DEBUG storage.ui: 1 requests and 24573889 (11998MB) left in chunk >03:48:40,194 DEBUG storage.ui: adding 24573889 (11998MB) to 28 (sdd1) >03:48:40,194 DEBUG storage.ui: taking back 23789505 (11615MB) from 28 (sdd1) >03:48:40,195 DEBUG storage.ui: new grow amount for request 28 (sdd1) is 784384 units, or 383MB >03:48:40,195 DEBUG storage.ui: set: ['sda1', 'sdb1', 'sdc1', 'sdd1'] 384 
>03:48:40,195 DEBUG storage.ui: min growth is 784384 >03:48:40,196 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 25 name = sda1 growable = True >base = 2048 growth = 784384 max_grow = 784384 >done = True is 784384 >03:48:40,196 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 26 name = sdb1 growable = True >base = 2048 growth = 784384 max_grow = 784384 >done = True is 784384 >03:48:40,197 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 27 name = sdc1 growable = True >base = 2048 growth = 784384 max_grow = 784384 >done = True is 784384 >03:48:40,197 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 28 name = sdd1 growable = True >base = 2048 growth = 784384 max_grow = 784384 >done = True is 784384 >03:48:40,197 DEBUG storage.ui: set: ['sda1', 'sdb1', 'sdc1', 'sdd1'] 384 >03:48:40,198 DEBUG storage.ui: min growth is 784384 >03:48:40,198 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 25 name = sda1 growable = True >base = 2048 growth = 784384 max_grow = 784384 >done = True is 784384 >03:48:40,198 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 26 name = sdb1 growable = True >base = 2048 growth = 784384 max_grow = 784384 >done = True is 784384 >03:48:40,199 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 27 name = sdc1 growable = True >base = 2048 growth = 784384 max_grow = 784384 >done = True is 784384 >03:48:40,199 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 28 name = sdd1 growable = True >base = 2048 growth = 784384 max_grow = 784384 >done = True is 784384 >03:48:40,200 DEBUG storage.ui: growing partitions on sda >03:48:40,200 DEBUG storage.ui: partition sda1 (25): 0 >03:48:40,201 DEBUG storage.ui: new geometry for sda1: parted.Geometry instance -- > start: 2048 end: 788479 length: 786432 > device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae04fb20d0> 
>03:48:40,201 DEBUG storage.ui: removing all non-preexisting partitions ['sda1(id 25)'] from disk(s) ['sda'] >03:48:40,204 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:48:40,204 DEBUG storage.ui: device sda1 new partedPartition None >03:48:40,206 DEBUG storage.ui: PartitionDevice._setDisk: req1 ; new: None ; old: sda ; >03:48:40,208 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sda ; >03:48:40,208 DEBUG storage.ui: back from removeNewPartitions >03:48:40,209 DEBUG storage.ui: extended: None >03:48:40,209 DEBUG storage.ui: setting req1 new geometry: parted.Geometry instance -- > start: 2048 end: 788479 length: 786432 > device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae04fb20d0> >03:48:40,211 DEBUG storage.ui: PartitionDevice._setPartedPartition: req1 ; >03:48:40,212 DEBUG storage.ui: device req1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fb2650> PedPartition: <_ped.Partition object at 0x7fae04fb3a70> >03:48:40,214 DEBUG storage.ui: PartitionDevice._setDisk: sda1 ; new: sda ; old: None ; >03:48:40,217 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sda ; >03:48:40,219 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:48:40,220 DEBUG storage.ui: device sda1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fb2850> PedPartition: <_ped.Partition object at 0x7fae04fb3d70> >03:48:40,220 DEBUG storage.ui: growing partitions on sdb >03:48:40,221 DEBUG storage.ui: partition sdb1 (26): 0 >03:48:40,221 DEBUG storage.ui: new geometry for sdb1: parted.Geometry instance -- > 
start: 2048 end: 788479 length: 786432 > device: <parted.device.Device object at 0x7fae05b09ad0> PedGeometry: <_ped.Geometry object at 0x7fae04fa0990> >03:48:40,222 DEBUG storage.ui: removing all non-preexisting partitions ['sdb1(id 26)'] from disk(s) ['sdb'] >03:48:40,224 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ; >03:48:40,225 DEBUG storage.ui: device sdb1 new partedPartition None >03:48:40,227 DEBUG storage.ui: PartitionDevice._setDisk: req2 ; new: None ; old: sdb ; >03:48:40,229 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdb ; >03:48:40,230 DEBUG storage.ui: back from removeNewPartitions >03:48:40,230 DEBUG storage.ui: extended: None >03:48:40,230 DEBUG storage.ui: setting req2 new geometry: parted.Geometry instance -- > start: 2048 end: 788479 length: 786432 > device: <parted.device.Device object at 0x7fae05b09ad0> PedGeometry: <_ped.Geometry object at 0x7fae04fa0990> >03:48:40,232 DEBUG storage.ui: PartitionDevice._setPartedPartition: req2 ; >03:48:40,233 DEBUG storage.ui: device req2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04faef90> PedPartition: <_ped.Partition object at 0x7fae04fb3830> >03:48:40,235 DEBUG storage.ui: PartitionDevice._setDisk: sdb1 ; new: sdb ; old: None ; >03:48:40,237 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdb ; >03:48:40,240 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ; >03:48:40,240 DEBUG storage.ui: device sdb1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fb2590> PedPartition: <_ped.Partition object at 0x7fae04fb3ad0> >03:48:40,241 DEBUG storage.ui: growing partitions 
on sdc >03:48:40,241 DEBUG storage.ui: partition sdc1 (27): 0 >03:48:40,242 DEBUG storage.ui: new geometry for sdc1: parted.Geometry instance -- > start: 2048 end: 788479 length: 786432 > device: <parted.device.Device object at 0x7fae05aee390> PedGeometry: <_ped.Geometry object at 0x7fae04faebd0> >03:48:40,242 DEBUG storage.ui: removing all non-preexisting partitions ['sdc1(id 27)'] from disk(s) ['sdc'] >03:48:40,244 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ; >03:48:40,245 DEBUG storage.ui: device sdc1 new partedPartition None >03:48:40,247 DEBUG storage.ui: PartitionDevice._setDisk: req3 ; new: None ; old: sdc ; >03:48:40,249 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdc ; >03:48:40,249 DEBUG storage.ui: back from removeNewPartitions >03:48:40,249 DEBUG storage.ui: extended: None >03:48:40,250 DEBUG storage.ui: setting req3 new geometry: parted.Geometry instance -- > start: 2048 end: 788479 length: 786432 > device: <parted.device.Device object at 0x7fae05aee390> PedGeometry: <_ped.Geometry object at 0x7fae04faebd0> >03:48:40,252 DEBUG storage.ui: PartitionDevice._setPartedPartition: req3 ; >03:48:40,253 DEBUG storage.ui: device req3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 1 path: /dev/sdc1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04faee50> PedPartition: <_ped.Partition object at 0x7fae04fb3890> >03:48:40,255 DEBUG storage.ui: PartitionDevice._setDisk: sdc1 ; new: sdc ; old: None ; >03:48:40,257 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdc ; >03:48:40,259 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ; >03:48:40,260 DEBUG storage.ui: device sdc1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 1 path: /dev/sdc1 type: 0 > name: None active: True busy: False > geometry: 
<parted.geometry.Geometry object at 0x7fae04fae910> PedPartition: <_ped.Partition object at 0x7fae05b0b110> >03:48:40,261 DEBUG storage.ui: growing partitions on sdd >03:48:40,261 DEBUG storage.ui: partition sdd1 (28): 0 >03:48:40,262 DEBUG storage.ui: new geometry for sdd1: parted.Geometry instance -- > start: 2048 end: 788479 length: 786432 > device: <parted.device.Device object at 0x7fae05b09b90> PedGeometry: <_ped.Geometry object at 0x7fae04faef10> >03:48:40,262 DEBUG storage.ui: removing all non-preexisting partitions ['sdd1(id 28)'] from disk(s) ['sdd'] >03:48:40,264 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ; >03:48:40,264 DEBUG storage.ui: device sdd1 new partedPartition None >03:48:40,266 DEBUG storage.ui: PartitionDevice._setDisk: req4 ; new: None ; old: sdd ; >03:48:40,268 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdd ; >03:48:40,269 DEBUG storage.ui: back from removeNewPartitions >03:48:40,269 DEBUG storage.ui: extended: None >03:48:40,269 DEBUG storage.ui: setting req4 new geometry: parted.Geometry instance -- > start: 2048 end: 788479 length: 786432 > device: <parted.device.Device object at 0x7fae05b09b90> PedGeometry: <_ped.Geometry object at 0x7fae04faef10> >03:48:40,272 DEBUG storage.ui: PartitionDevice._setPartedPartition: req4 ; >03:48:40,273 DEBUG storage.ui: device req4 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 1 path: /dev/sdd1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fb25d0> PedPartition: <_ped.Partition object at 0x7fae04fb3b90> >03:48:40,275 DEBUG storage.ui: PartitionDevice._setDisk: sdd1 ; new: sdd ; old: None ; >03:48:40,277 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdd ; >03:48:40,279 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ; >03:48:40,280 DEBUG storage.ui: device sdd1 new partedPartition parted.Partition instance -- > disk: 
<parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 1 path: /dev/sdd1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fb2510> PedPartition: <_ped.Partition object at 0x7fae04fb3a70> >03:48:40,281 DEBUG storage.ui: fixing size of non-existent 384MB partition sda1 (25) with non-existent mdmember at 384.00 >03:48:40,282 DEBUG storage.ui: fixing size of non-existent 384MB partition sdb1 (26) with non-existent mdmember at 384.00 >03:48:40,282 DEBUG storage.ui: fixing size of non-existent 384MB partition sdc1 (27) with non-existent mdmember at 384.00 >03:48:40,283 DEBUG storage.ui: fixing size of non-existent 384MB partition sdd1 (28) with non-existent mdmember at 384.00 >03:48:40,284 DEBUG storage.ui: new member set: ['sda1', 'sdb1', 'sdc1', 'sdd1'] >03:48:40,285 DEBUG storage.ui: old member set: ['sda1', 'sdb1', 'sdc1', 'sdd1'] >03:48:40,286 DEBUG storage.ui: raw RAID 10 size == 768.0 >03:48:40,286 INFO storage.ui: Using 0MB superBlockSize >03:48:40,287 DEBUG storage.ui: non-existent RAID 10 size == 768.0 >03:48:40,288 DEBUG storage.ui: raw RAID 10 size == 768.0 >03:48:40,288 INFO storage.ui: Using 0MB superBlockSize >03:48:40,289 DEBUG storage.ui: non-existent RAID 10 size == 768.0 >03:48:40,292 DEBUG blivet: raw RAID 10 size == 768.0 >03:48:40,292 INFO blivet: Using 0MB superBlockSize >03:48:40,293 DEBUG blivet: non-existent RAID 10 size == 768.0 >03:48:40,296 DEBUG blivet: raw RAID 10 size == 768.0 >03:48:40,296 INFO blivet: Using 0MB superBlockSize >03:48:40,297 DEBUG blivet: non-existent RAID 10 size == 768.0 >03:48:40,311 DEBUG blivet: raw RAID 10 size == 768.0 >03:48:40,311 INFO blivet: Using 0MB superBlockSize >03:48:40,312 DEBUG blivet: non-existent RAID 10 size == 768.0 >03:48:40,316 DEBUG blivet: raw RAID 10 size == 768.0 >03:48:40,316 INFO blivet: Using 0MB superBlockSize >03:48:40,316 DEBUG blivet: non-existent RAID 10 size == 768.0 >03:48:47,273 DEBUG storage.ui: instantiating <class 
'blivet.devicefactory.PartitionFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 512.0, [], {} >03:48:47,279 DEBUG storage.ui: Blivet.factoryDevice: 2 ; 512.0 ; mountpoint: /boot ; disks: [DiskDevice instance (0x7fae05319950) -- > name = sda status = True kids = 1 id = 1 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 0 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sda type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 0 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae127eacb0> > target size = 0 path = /dev/sda > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae053199d0>, DiskDevice instance (0x7fae05b116d0) -- > name = sdb status = True kids = 1 id = 14 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 16 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdb type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 768 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f674a70> > target size = 0 path = /dev/sdb > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05b11750>, DiskDevice instance (0x7fae05aeabd0) -- > name = sdc status = True kids = 1 id = 11 > 
parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 32 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdc type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 512 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f674170> > target size = 0 path = /dev/sdc > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05aeac50>, DiskDevice instance (0x7fae05aea190) -- > name = sdd status = True kids = 1 id = 8 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 48 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdd type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 256 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae27c59680> > target size = 0 path = /dev/sdd > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05aea210>] ; fstype: ext4 ; encrypted: False ; >03:48:47,281 DEBUG storage.ui: instantiating <class 'blivet.devicefactory.PartitionFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 512.0, ['sda', 'sdb', 'sdc', 'sdd'], {'mountpoint': '/boot', 'fstype': 'ext4', 'encrypted': False} >03:48:47,282 DEBUG storage.ui: PartitionFactory.configure: parent_factory: None ; 
>03:48:47,283 DEBUG storage.ui: starting Blivet copy >03:48:47,328 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:48:47,329 DEBUG storage.ui: device sda1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04fa30d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04efc710> PedPartition: <_ped.Partition object at 0x7fae04fb3fb0> >03:48:47,331 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ; >03:48:47,333 DEBUG storage.ui: device sdb1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04f98650> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04efc790> PedPartition: <_ped.Partition object at 0x7fae04fb3890> >03:48:47,335 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ; >03:48:47,336 DEBUG storage.ui: device sdc1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04fa3c90> fileSystem: None > number: 1 path: /dev/sdc1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04efc8d0> PedPartition: <_ped.Partition object at 0x7fae04fb3830> >03:48:47,339 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ; >03:48:47,340 DEBUG storage.ui: device sdd1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04fa3a50> fileSystem: None > number: 1 path: /dev/sdd1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04efca10> PedPartition: <_ped.Partition object at 0x7fae04fb3e30> >03:48:47,340 DEBUG storage.ui: finished Blivet copy >03:48:47,342 DEBUG storage.ui: Ext4FS.supported: supported: True ; >03:48:47,343 DEBUG storage.ui: getFormat('ext4') returning Ext4FS instance >03:48:47,345 DEBUG 
storage.ui: Ext4FS.supported: supported: True ; >03:48:47,346 DEBUG storage.ui: getFormat('ext4') returning Ext4FS instance >03:48:47,347 DEBUG storage.ui: Ext4FS.supported: supported: True ; >03:48:47,348 DEBUG storage.ui: getFormat('ext4') returning Ext4FS instance >03:48:47,350 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sda ; >03:48:47,352 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdb ; >03:48:47,354 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdc ; >03:48:47,356 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdd ; >03:48:47,358 DEBUG storage.ui: PartitionDevice._setFormat: req5 ; >03:48:47,360 DEBUG storage.ui: PartitionDevice._setFormat: req5 ; current: None ; type: ext4 ; >03:48:47,362 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sda ; >03:48:47,364 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdb ; >03:48:47,365 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdc ; >03:48:47,367 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdd ; >03:48:47,368 INFO storage.ui: added partition req5 (id 30) to device tree >03:48:47,369 INFO storage.ui: registered action: [45] Create Device partition req5 (id 30) >03:48:47,369 DEBUG storage.ui: getFormat('None') returning DeviceFormat instance >03:48:47,370 INFO storage.ui: registered action: [46] Create Format ext4 filesystem mounted at /boot on partition req5 (id 30) >03:48:47,372 DEBUG storage.ui: DiskDevice.setup: sda ; status: True ; controllable: True ; orig: False ; >03:48:47,374 DEBUG storage.ui: DiskDevice.setup: sdb ; status: True ; controllable: True ; orig: False ; >03:48:47,376 DEBUG storage.ui: DiskDevice.setup: sdc ; status: True ; controllable: True ; orig: False ; >03:48:47,377 DEBUG storage.ui: DiskDevice.setup: sdd ; status: True ; controllable: True ; orig: False ; >03:48:47,379 DEBUG storage.ui: removing all non-preexisting partitions ['req5(id 30)', 'sda1(id 25)', 'sdb1(id 26)', 'sdc1(id 27)', 'sdd1(id 28)'] from 
disk(s) ['sda', 'sdb', 'sdc', 'sdd'] >03:48:47,381 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:48:47,381 DEBUG storage.ui: device sda1 new partedPartition None >03:48:47,383 DEBUG storage.ui: PartitionDevice._setDisk: req1 ; new: None ; old: sda ; >03:48:47,389 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sda ; >03:48:47,391 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ; >03:48:47,392 DEBUG storage.ui: device sdb1 new partedPartition None >03:48:47,394 DEBUG storage.ui: PartitionDevice._setDisk: req2 ; new: None ; old: sdb ; >03:48:47,396 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdb ; >03:48:47,398 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ; >03:48:47,398 DEBUG storage.ui: device sdc1 new partedPartition None >03:48:47,400 DEBUG storage.ui: PartitionDevice._setDisk: req3 ; new: None ; old: sdc ; >03:48:47,402 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdc ; >03:48:47,404 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ; >03:48:47,404 DEBUG storage.ui: device sdd1 new partedPartition None >03:48:47,406 DEBUG storage.ui: PartitionDevice._setDisk: req4 ; new: None ; old: sdd ; >03:48:47,408 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdd ; >03:48:47,409 DEBUG storage.ui: allocatePartitions: disks=['sda', 'sdb', 'sdc', 'sdd'] ; partitions=['req5(id 30)', 'req1(id 25)', 'req2(id 26)', 'req3(id 27)', 'req4(id 28)'] >03:48:47,410 DEBUG storage.ui: removing all non-preexisting partitions ['req5(id 30)', 'req1(id 25)', 'req2(id 26)', 'req3(id 27)', 'req4(id 28)'] from disk(s) ['sda', 'sdb', 'sdc', 'sdd'] >03:48:47,412 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:47,414 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:47,415 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:47,417 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:47,418 
DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:47,419 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:47,421 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:47,423 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:47,423 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:47,425 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:47,428 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:47,428 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:47,430 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:47,432 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:47,432 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:47,434 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:47,437 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:47,437 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:47,439 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:47,441 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:47,441 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:47,443 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:47,445 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:47,446 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:47,446 DEBUG storage.ui: allocating partition: req5 ; id: 30 ; disks: ['sda', 'sdb', 'sdc', 'sdd'] ; >boot: True ; primary: False ; size: 1MB ; grow: True ; max_size: 512.0 >03:48:47,447 DEBUG storage.ui: 
checking freespace on sda >03:48:47,448 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sda part_type=0 req_size=1MB boot=True best=None grow=True >03:48:47,449 DEBUG storage.ui: current free range is 63-24575999 (11999MB) >03:48:47,449 DEBUG storage.ui: evaluating growth potential for new layout >03:48:47,449 DEBUG storage.ui: calculating growth for disk /dev/sdd >03:48:47,450 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999 >sectorSize = 512 > >03:48:47,450 DEBUG storage.ui: disk /dev/sdd growth: 0 (0MB) >03:48:47,450 DEBUG storage.ui: calculating growth for disk /dev/sdb >03:48:47,451 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999 >sectorSize = 512 > >03:48:47,451 DEBUG storage.ui: disk /dev/sdb growth: 0 (0MB) >03:48:47,451 DEBUG storage.ui: calculating growth for disk /dev/sdc >03:48:47,452 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999 >sectorSize = 512 > >03:48:47,452 DEBUG storage.ui: disk /dev/sdc growth: 0 (0MB) >03:48:47,453 DEBUG storage.ui: calculating growth for disk /dev/sda >03:48:47,453 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:48:47,456 DEBUG storage.ui: PartitionDevice._setPartedPartition: req5 ; >03:48:47,456 DEBUG storage.ui: device req5 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04efa610> PedPartition: <_ped.Partition object at 0x7fae05afbf50> >03:48:47,459 DEBUG storage.ui: PartitionDevice._setDisk: sda1 ; new: sda ; old: None ; >03:48:47,461 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sda ; >03:48:47,461 DEBUG storage.ui: adding request 30 to chunk 24575937 (63-24575999) on /dev/sda >03:48:47,462 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > 
>03:48:47,462 DEBUG storage.ui: req: PartitionRequest instance -- >id = 30 name = sda1 growable = True >base = 2048 growth = 0 max_grow = 1046528 >done = False >03:48:47,463 DEBUG storage.ui: 1 requests and 24573889 (11998MB) left in chunk >03:48:47,463 DEBUG storage.ui: adding 24573889 (11998MB) to 30 (sda1) >03:48:47,463 DEBUG storage.ui: taking back 23527361 (11487MB) from 30 (sda1) >03:48:47,464 DEBUG storage.ui: new grow amount for request 30 (sda1) is 1046528 units, or 511MB >03:48:47,464 DEBUG storage.ui: request 30 (sda1) growth: 1046528 (511MB) size: 512MB >03:48:47,464 DEBUG storage.ui: disk /dev/sda growth: 1046528 (511MB) >03:48:47,466 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:48:47,467 DEBUG storage.ui: device sda1 new partedPartition None >03:48:47,468 DEBUG storage.ui: PartitionDevice._setDisk: req5 ; new: None ; old: sda ; >03:48:47,471 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sda ; >03:48:47,471 DEBUG storage.ui: total growth: 1046528 sectors >03:48:47,471 DEBUG storage.ui: updating use_disk to sda, type: 0 >03:48:47,472 DEBUG storage.ui: new free: 63-24575999 / 11999MB >03:48:47,472 DEBUG storage.ui: new free allows for 1046528 sectors of growth >03:48:47,473 DEBUG storage.ui: found free space for bootable request >03:48:47,473 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:48:47,474 DEBUG storage.ui: created partition sda1 of 1MB and added it to /dev/sda >03:48:47,476 DEBUG storage.ui: PartitionDevice._setPartedPartition: req5 ; >03:48:47,477 DEBUG storage.ui: device req5 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04efae90> PedPartition: <_ped.Partition object at 0x7fae04fb3d70> >03:48:47,479 DEBUG storage.ui: PartitionDevice._setDisk: sda1 ; new: sda ; old: None ; >03:48:47,482 DEBUG 
storage.ui: DiskDevice.addChild: kids: 0 ; name: sda ; >03:48:47,484 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:48:47,485 DEBUG storage.ui: device sda1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04efa150> PedPartition: <_ped.Partition object at 0x7fae04fb3dd0> >03:48:47,487 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:47,489 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:47,489 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:47,491 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:47,493 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:47,494 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:47,494 DEBUG storage.ui: allocating partition: req1 ; id: 25 ; disks: ['sda'] ; >boot: False ; primary: False ; size: 384MB ; grow: False ; max_size: 384 >03:48:47,495 DEBUG storage.ui: checking freespace on sda >03:48:47,495 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sda part_type=0 req_size=384MB boot=False best=None grow=False >03:48:47,496 DEBUG storage.ui: current free range is 63-2047 (0MB) >03:48:47,497 DEBUG storage.ui: current free range is 4096-24575999 (11998MB) >03:48:47,497 DEBUG storage.ui: evaluating growth potential for new layout >03:48:47,497 DEBUG storage.ui: calculating growth for disk /dev/sdd >03:48:47,498 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999 >sectorSize = 512 > >03:48:47,498 DEBUG storage.ui: disk /dev/sdd growth: 0 (0MB) >03:48:47,499 DEBUG storage.ui: calculating growth for disk /dev/sdb >03:48:47,499 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 
24575999 >sectorSize = 512 > >03:48:47,499 DEBUG storage.ui: disk /dev/sdb growth: 0 (0MB) >03:48:47,500 DEBUG storage.ui: calculating growth for disk /dev/sdc >03:48:47,500 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999 >sectorSize = 512 > >03:48:47,500 DEBUG storage.ui: disk /dev/sdc growth: 0 (0MB) >03:48:47,501 DEBUG storage.ui: calculating growth for disk /dev/sda >03:48:47,503 DEBUG storage.ui: PartitionDevice._setPartedPartition: req1 ; >03:48:47,504 DEBUG storage.ui: device req1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 2 path: /dev/sda2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fb20d0> PedPartition: <_ped.Partition object at 0x7fae04fb3b30> >03:48:47,506 DEBUG storage.ui: PartitionDevice._setDisk: sda2 ; new: sda ; old: None ; >03:48:47,508 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sda ; >03:48:47,509 DEBUG storage.ui: adding request 30 to chunk 24575937 (63-24575999) on /dev/sda >03:48:47,509 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda >03:48:47,509 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > >03:48:47,510 DEBUG storage.ui: req: PartitionRequest instance -- >id = 30 name = sda1 growable = True >base = 2048 growth = 0 max_grow = 1046528 >done = False >03:48:47,510 DEBUG storage.ui: req: PartitionRequest instance -- >id = 25 name = sda2 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:48:47,511 DEBUG storage.ui: 1 requests and 23787457 (11614MB) left in chunk >03:48:47,511 DEBUG storage.ui: adding 23787457 (11614MB) to 30 (sda1) >03:48:47,512 DEBUG storage.ui: taking back 22740929 (11103MB) from 30 (sda1) >03:48:47,512 DEBUG storage.ui: new grow amount for request 30 (sda1) is 1046528 units, or 511MB >03:48:47,512 DEBUG storage.ui: request 30 
(sda1) growth: 1046528 (511MB) size: 512MB >03:48:47,513 DEBUG storage.ui: request 25 (sda2) growth: 0 (0MB) size: 384MB >03:48:47,513 DEBUG storage.ui: disk /dev/sda growth: 1046528 (511MB) >03:48:47,515 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda2 ; >03:48:47,516 DEBUG storage.ui: device sda2 new partedPartition None >03:48:47,517 DEBUG storage.ui: PartitionDevice._setDisk: req1 ; new: None ; old: sda ; >03:48:47,519 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sda ; >03:48:47,520 DEBUG storage.ui: total growth: 1046528 sectors >03:48:47,520 DEBUG storage.ui: updating use_disk to sda, type: 0 >03:48:47,521 DEBUG storage.ui: new free: 4096-24575999 / 11998MB >03:48:47,521 DEBUG storage.ui: new free allows for 1046528 sectors of growth >03:48:47,522 DEBUG storage.ui: created partition sda2 of 384MB and added it to /dev/sda >03:48:47,524 DEBUG storage.ui: PartitionDevice._setPartedPartition: req1 ; >03:48:47,525 DEBUG storage.ui: device req1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 2 path: /dev/sda2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fb2710> PedPartition: <_ped.Partition object at 0x7fae05afbf50> >03:48:47,527 DEBUG storage.ui: PartitionDevice._setDisk: sda2 ; new: sda ; old: None ; >03:48:47,529 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sda ; >03:48:47,532 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda2 ; >03:48:47,532 DEBUG storage.ui: device sda2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 2 path: /dev/sda2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04efa650> PedPartition: <_ped.Partition object at 0x7fae04fb3b90> >03:48:47,535 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:47,537 DEBUG storage.ui: 
DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:47,537 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:47,539 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:47,541 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:47,542 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:47,542 DEBUG storage.ui: allocating partition: req2 ; id: 26 ; disks: ['sdb'] ; >boot: False ; primary: False ; size: 384MB ; grow: False ; max_size: 384 >03:48:47,542 DEBUG storage.ui: checking freespace on sdb >03:48:47,543 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdb part_type=0 req_size=384MB boot=False best=None grow=False >03:48:47,544 DEBUG storage.ui: current free range is 63-24575999 (11999MB) >03:48:47,544 DEBUG storage.ui: evaluating growth potential for new layout >03:48:47,545 DEBUG storage.ui: calculating growth for disk /dev/sdd >03:48:47,545 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999 >sectorSize = 512 > >03:48:47,546 DEBUG storage.ui: disk /dev/sdd growth: 0 (0MB) >03:48:47,546 DEBUG storage.ui: calculating growth for disk /dev/sdb >03:48:47,546 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:48:47,549 DEBUG storage.ui: PartitionDevice._setPartedPartition: req2 ; >03:48:47,550 DEBUG storage.ui: device req2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fb2190> PedPartition: <_ped.Partition object at 0x7fae04fb3ad0> >03:48:47,552 DEBUG storage.ui: PartitionDevice._setDisk: sdb1 ; new: sdb ; old: None ; >03:48:47,554 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdb ; >03:48:47,554 DEBUG storage.ui: adding request 26 to chunk 24575937 (63-24575999) on 
/dev/sdb >03:48:47,555 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999 >sectorSize = 512 > >03:48:47,555 DEBUG storage.ui: req: PartitionRequest instance -- >id = 26 name = sdb1 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:48:47,556 DEBUG storage.ui: request 26 (sdb1) growth: 0 (0MB) size: 384MB >03:48:47,556 DEBUG storage.ui: disk /dev/sdb growth: 0 (0MB) >03:48:47,556 DEBUG storage.ui: calculating growth for disk /dev/sdc >03:48:47,557 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999 >sectorSize = 512 > >03:48:47,557 DEBUG storage.ui: disk /dev/sdc growth: 0 (0MB) >03:48:47,557 DEBUG storage.ui: calculating growth for disk /dev/sda >03:48:47,558 DEBUG storage.ui: adding request 30 to chunk 24575937 (63-24575999) on /dev/sda >03:48:47,558 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda >03:48:47,559 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > >03:48:47,559 DEBUG storage.ui: req: PartitionRequest instance -- >id = 30 name = sda1 growable = True >base = 2048 growth = 0 max_grow = 1046528 >done = False >03:48:47,559 DEBUG storage.ui: req: PartitionRequest instance -- >id = 25 name = sda2 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:48:47,560 DEBUG storage.ui: 1 requests and 23787457 (11614MB) left in chunk >03:48:47,560 DEBUG storage.ui: adding 23787457 (11614MB) to 30 (sda1) >03:48:47,561 DEBUG storage.ui: taking back 22740929 (11103MB) from 30 (sda1) >03:48:47,561 DEBUG storage.ui: new grow amount for request 30 (sda1) is 1046528 units, or 511MB >03:48:47,562 DEBUG storage.ui: request 30 (sda1) growth: 1046528 (511MB) size: 512MB >03:48:47,562 DEBUG storage.ui: request 25 (sda2) growth: 0 (0MB) size: 384MB >03:48:47,562 DEBUG storage.ui: disk /dev/sda growth: 1046528 (511MB) >03:48:47,564 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 
; >03:48:47,565 DEBUG storage.ui: device sdb1 new partedPartition None >03:48:47,566 DEBUG storage.ui: PartitionDevice._setDisk: req2 ; new: None ; old: sdb ; >03:48:47,568 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdb ; >03:48:47,569 DEBUG storage.ui: total growth: 1046528 sectors >03:48:47,569 DEBUG storage.ui: updating use_disk to sdb, type: 0 >03:48:47,569 DEBUG storage.ui: new free: 63-24575999 / 11999MB >03:48:47,570 DEBUG storage.ui: new free allows for 1046528 sectors of growth >03:48:47,570 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:48:47,571 DEBUG storage.ui: created partition sdb1 of 384MB and added it to /dev/sdb >03:48:47,573 DEBUG storage.ui: PartitionDevice._setPartedPartition: req2 ; >03:48:47,574 DEBUG storage.ui: device req2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fa5a10> PedPartition: <_ped.Partition object at 0x7fae04fb3cb0> >03:48:47,576 DEBUG storage.ui: PartitionDevice._setDisk: sdb1 ; new: sdb ; old: None ; >03:48:47,578 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdb ; >03:48:47,581 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ; >03:48:47,581 DEBUG storage.ui: device sdb1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fb2890> PedPartition: <_ped.Partition object at 0x7fae04fb3a10> >03:48:47,584 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:47,586 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:47,586 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:47,589 DEBUG storage.ui: 
DeviceTree.getDeviceByName: name: sda ; >03:48:47,591 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:47,591 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:47,591 DEBUG storage.ui: allocating partition: req3 ; id: 27 ; disks: ['sdc'] ; >boot: False ; primary: False ; size: 384MB ; grow: False ; max_size: 384 >03:48:47,592 DEBUG storage.ui: checking freespace on sdc >03:48:47,593 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdc part_type=0 req_size=384MB boot=False best=None grow=False >03:48:47,593 DEBUG storage.ui: current free range is 63-24575999 (11999MB) >03:48:47,594 DEBUG storage.ui: evaluating growth potential for new layout >03:48:47,594 DEBUG storage.ui: calculating growth for disk /dev/sdd >03:48:47,594 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999 >sectorSize = 512 > >03:48:47,595 DEBUG storage.ui: disk /dev/sdd growth: 0 (0MB) >03:48:47,595 DEBUG storage.ui: calculating growth for disk /dev/sdb >03:48:47,596 DEBUG storage.ui: adding request 26 to chunk 24575937 (63-24575999) on /dev/sdb >03:48:47,596 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999 >sectorSize = 512 > >03:48:47,597 DEBUG storage.ui: req: PartitionRequest instance -- >id = 26 name = sdb1 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:48:47,597 DEBUG storage.ui: request 26 (sdb1) growth: 0 (0MB) size: 384MB >03:48:47,598 DEBUG storage.ui: disk /dev/sdb growth: 0 (0MB) >03:48:47,598 DEBUG storage.ui: calculating growth for disk /dev/sdc >03:48:47,598 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:48:47,601 DEBUG storage.ui: PartitionDevice._setPartedPartition: req3 ; >03:48:47,601 DEBUG storage.ui: device req3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 1 path: /dev/sdc1 type: 0 > name: None active: 
True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fb2950> PedPartition: <_ped.Partition object at 0x7fae05b0b110> >03:48:47,603 DEBUG storage.ui: PartitionDevice._setDisk: sdc1 ; new: sdc ; old: None ; >03:48:47,605 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdc ; >03:48:47,606 DEBUG storage.ui: adding request 27 to chunk 24575937 (63-24575999) on /dev/sdc >03:48:47,606 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999 >sectorSize = 512 > >03:48:47,607 DEBUG storage.ui: req: PartitionRequest instance -- >id = 27 name = sdc1 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:48:47,607 DEBUG storage.ui: request 27 (sdc1) growth: 0 (0MB) size: 384MB >03:48:47,607 DEBUG storage.ui: disk /dev/sdc growth: 0 (0MB) >03:48:47,608 DEBUG storage.ui: calculating growth for disk /dev/sda >03:48:47,608 DEBUG storage.ui: adding request 30 to chunk 24575937 (63-24575999) on /dev/sda >03:48:47,609 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda >03:48:47,609 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > >03:48:47,609 DEBUG storage.ui: req: PartitionRequest instance -- >id = 30 name = sda1 growable = True >base = 2048 growth = 0 max_grow = 1046528 >done = False >03:48:47,610 DEBUG storage.ui: req: PartitionRequest instance -- >id = 25 name = sda2 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:48:47,610 DEBUG storage.ui: 1 requests and 23787457 (11614MB) left in chunk >03:48:47,610 DEBUG storage.ui: adding 23787457 (11614MB) to 30 (sda1) >03:48:47,611 DEBUG storage.ui: taking back 22740929 (11103MB) from 30 (sda1) >03:48:47,611 DEBUG storage.ui: new grow amount for request 30 (sda1) is 1046528 units, or 511MB >03:48:47,612 DEBUG storage.ui: request 30 (sda1) growth: 1046528 (511MB) size: 512MB >03:48:47,612 DEBUG storage.ui: request 25 (sda2) growth: 0 (0MB) size: 384MB 
>03:48:47,612 DEBUG storage.ui: disk /dev/sda growth: 1046528 (511MB) >03:48:47,614 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ; >03:48:47,614 DEBUG storage.ui: device sdc1 new partedPartition None >03:48:47,616 DEBUG storage.ui: PartitionDevice._setDisk: req3 ; new: None ; old: sdc ; >03:48:47,618 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdc ; >03:48:47,618 DEBUG storage.ui: total growth: 1046528 sectors >03:48:47,619 DEBUG storage.ui: updating use_disk to sdc, type: 0 >03:48:47,619 DEBUG storage.ui: new free: 63-24575999 / 11999MB >03:48:47,619 DEBUG storage.ui: new free allows for 1046528 sectors of growth >03:48:47,620 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:48:47,621 DEBUG storage.ui: created partition sdc1 of 384MB and added it to /dev/sdc >03:48:47,623 DEBUG storage.ui: PartitionDevice._setPartedPartition: req3 ; >03:48:47,623 DEBUG storage.ui: device req3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 1 path: /dev/sdc1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fb21d0> PedPartition: <_ped.Partition object at 0x7fae05afbf50> >03:48:47,625 DEBUG storage.ui: PartitionDevice._setDisk: sdc1 ; new: sdc ; old: None ; >03:48:47,627 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdc ; >03:48:47,630 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ; >03:48:47,630 DEBUG storage.ui: device sdc1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 1 path: /dev/sdc1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fb2b10> PedPartition: <_ped.Partition object at 0x7fae04fb3cb0> >03:48:47,632 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:47,634 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk 
sda (1) with non-existent msdos disklabel >03:48:47,635 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:47,637 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:47,639 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:47,639 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:47,640 DEBUG storage.ui: allocating partition: req4 ; id: 28 ; disks: ['sdd'] ; >boot: False ; primary: False ; size: 384MB ; grow: False ; max_size: 384 >03:48:47,640 DEBUG storage.ui: checking freespace on sdd >03:48:47,641 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdd part_type=0 req_size=384MB boot=False best=None grow=False >03:48:47,642 DEBUG storage.ui: current free range is 63-24575999 (11999MB) >03:48:47,642 DEBUG storage.ui: evaluating growth potential for new layout >03:48:47,642 DEBUG storage.ui: calculating growth for disk /dev/sdd >03:48:47,643 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:48:47,645 DEBUG storage.ui: PartitionDevice._setPartedPartition: req4 ; >03:48:47,646 DEBUG storage.ui: device req4 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 1 path: /dev/sdd1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fa5c10> PedPartition: <_ped.Partition object at 0x7fae04fb3a70> >03:48:47,648 DEBUG storage.ui: PartitionDevice._setDisk: sdd1 ; new: sdd ; old: None ; >03:48:47,650 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdd ; >03:48:47,650 DEBUG storage.ui: adding request 28 to chunk 24575937 (63-24575999) on /dev/sdd >03:48:47,650 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999 >sectorSize = 512 > >03:48:47,651 DEBUG storage.ui: req: PartitionRequest instance -- >id = 28 name = sdd1 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:48:47,651 DEBUG 
storage.ui: request 28 (sdd1) growth: 0 (0MB) size: 384MB >03:48:47,652 DEBUG storage.ui: disk /dev/sdd growth: 0 (0MB) >03:48:47,652 DEBUG storage.ui: calculating growth for disk /dev/sdb >03:48:47,652 DEBUG storage.ui: adding request 26 to chunk 24575937 (63-24575999) on /dev/sdb >03:48:47,653 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999 >sectorSize = 512 > >03:48:47,653 DEBUG storage.ui: req: PartitionRequest instance -- >id = 26 name = sdb1 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:48:47,654 DEBUG storage.ui: request 26 (sdb1) growth: 0 (0MB) size: 384MB >03:48:47,654 DEBUG storage.ui: disk /dev/sdb growth: 0 (0MB) >03:48:47,654 DEBUG storage.ui: calculating growth for disk /dev/sdc >03:48:47,655 DEBUG storage.ui: adding request 27 to chunk 24575937 (63-24575999) on /dev/sdc >03:48:47,655 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999 >sectorSize = 512 > >03:48:47,656 DEBUG storage.ui: req: PartitionRequest instance -- >id = 27 name = sdc1 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:48:47,656 DEBUG storage.ui: request 27 (sdc1) growth: 0 (0MB) size: 384MB >03:48:47,656 DEBUG storage.ui: disk /dev/sdc growth: 0 (0MB) >03:48:47,657 DEBUG storage.ui: calculating growth for disk /dev/sda >03:48:47,657 DEBUG storage.ui: adding request 30 to chunk 24575937 (63-24575999) on /dev/sda >03:48:47,658 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda >03:48:47,658 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > >03:48:47,659 DEBUG storage.ui: req: PartitionRequest instance -- >id = 30 name = sda1 growable = True >base = 2048 growth = 0 max_grow = 1046528 >done = False >03:48:47,659 DEBUG storage.ui: req: PartitionRequest instance -- >id = 25 name = sda2 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:48:47,659 DEBUG storage.ui: 
1 requests and 23787457 (11614MB) left in chunk >03:48:47,660 DEBUG storage.ui: adding 23787457 (11614MB) to 30 (sda1) >03:48:47,660 DEBUG storage.ui: taking back 22740929 (11103MB) from 30 (sda1) >03:48:47,661 DEBUG storage.ui: new grow amount for request 30 (sda1) is 1046528 units, or 511MB >03:48:47,661 DEBUG storage.ui: request 30 (sda1) growth: 1046528 (511MB) size: 512MB >03:48:47,662 DEBUG storage.ui: request 25 (sda2) growth: 0 (0MB) size: 384MB >03:48:47,662 DEBUG storage.ui: disk /dev/sda growth: 1046528 (511MB) >03:48:47,664 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ; >03:48:47,665 DEBUG storage.ui: device sdd1 new partedPartition None >03:48:47,667 DEBUG storage.ui: PartitionDevice._setDisk: req4 ; new: None ; old: sdd ; >03:48:47,669 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdd ; >03:48:47,669 DEBUG storage.ui: total growth: 1046528 sectors >03:48:47,669 DEBUG storage.ui: updating use_disk to sdd, type: 0 >03:48:47,670 DEBUG storage.ui: new free: 63-24575999 / 11999MB >03:48:47,670 DEBUG storage.ui: new free allows for 1046528 sectors of growth >03:48:47,671 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:48:47,672 DEBUG storage.ui: created partition sdd1 of 384MB and added it to /dev/sdd >03:48:47,674 DEBUG storage.ui: PartitionDevice._setPartedPartition: req4 ; >03:48:47,675 DEBUG storage.ui: device req4 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 1 path: /dev/sdd1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fa5c90> PedPartition: <_ped.Partition object at 0x7fae04fb3b30> >03:48:47,676 DEBUG storage.ui: PartitionDevice._setDisk: sdd1 ; new: sdd ; old: None ; >03:48:47,679 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdd ; >03:48:47,681 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ; >03:48:47,682 DEBUG storage.ui: device sdd1 new 
partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 1 path: /dev/sdd1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fa5990> PedPartition: <_ped.Partition object at 0x7fae04fb3ad0> >03:48:47,682 DEBUG storage.ui: growPartitions: disks=['sda', 'sdb', 'sdc', 'sdd'], partitions=['sda1(id 30)', 'sda2(id 25)', 'sdb1(id 26)', 'sdc1(id 27)', 'sdd1(id 28)'] >03:48:47,683 DEBUG storage.ui: growable partitions are ['sda1'] >03:48:47,683 DEBUG storage.ui: adding request 30 to chunk 24575937 (63-24575999) on /dev/sda >03:48:47,684 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda >03:48:47,684 DEBUG storage.ui: disk sda has 1 chunks >03:48:47,684 DEBUG storage.ui: adding request 26 to chunk 24575937 (63-24575999) on /dev/sdb >03:48:47,685 DEBUG storage.ui: disk sdb has 1 chunks >03:48:47,685 DEBUG storage.ui: adding request 27 to chunk 24575937 (63-24575999) on /dev/sdc >03:48:47,686 DEBUG storage.ui: disk sdc has 1 chunks >03:48:47,686 DEBUG storage.ui: adding request 28 to chunk 24575937 (63-24575999) on /dev/sdd >03:48:47,687 DEBUG storage.ui: disk sdd has 1 chunks >03:48:47,687 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > >03:48:47,687 DEBUG storage.ui: req: PartitionRequest instance -- >id = 30 name = sda1 growable = True >base = 2048 growth = 0 max_grow = 1046528 >done = False >03:48:47,688 DEBUG storage.ui: req: PartitionRequest instance -- >id = 25 name = sda2 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:48:47,688 DEBUG storage.ui: 1 requests and 23787457 (11614MB) left in chunk >03:48:47,688 DEBUG storage.ui: adding 23787457 (11614MB) to 30 (sda1) >03:48:47,689 DEBUG storage.ui: taking back 22740929 (11103MB) from 30 (sda1) >03:48:47,689 DEBUG storage.ui: new grow amount for request 30 (sda1) is 1046528 units, or 511MB 
>03:48:47,689 DEBUG storage.ui: growing partitions on sda >03:48:47,690 DEBUG storage.ui: partition sda1 (30): 0 >03:48:47,690 DEBUG storage.ui: new geometry for sda1: parted.Geometry instance -- > start: 2048 end: 1050623 length: 1048576 > device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae04fa5c50> >03:48:47,691 DEBUG storage.ui: partition sda2 (25): 0 >03:48:47,691 DEBUG storage.ui: new geometry for sda2: parted.Geometry instance -- > start: 1050624 end: 1837055 length: 786432 > device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae04fa5450> >03:48:47,692 DEBUG storage.ui: removing all non-preexisting partitions ['sda1(id 30)', 'sda2(id 25)'] from disk(s) ['sda'] >03:48:47,694 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:48:47,694 DEBUG storage.ui: device sda1 new partedPartition None >03:48:47,696 DEBUG storage.ui: PartitionDevice._setDisk: req5 ; new: None ; old: sda ; >03:48:47,698 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sda ; >03:48:47,700 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda2 ; >03:48:47,700 DEBUG storage.ui: device sda2 new partedPartition None >03:48:47,702 DEBUG storage.ui: PartitionDevice._setDisk: req1 ; new: None ; old: sda ; >03:48:47,704 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sda ; >03:48:47,705 DEBUG storage.ui: back from removeNewPartitions >03:48:47,705 DEBUG storage.ui: extended: None >03:48:47,705 DEBUG storage.ui: setting req5 new geometry: parted.Geometry instance -- > start: 2048 end: 1050623 length: 1048576 > device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae04fa5c50> >03:48:47,709 DEBUG storage.ui: PartitionDevice._setPartedPartition: req5 ; >03:48:47,710 DEBUG storage.ui: device req5 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: 
/dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04efa710> PedPartition: <_ped.Partition object at 0x7fae04fb3bf0> >03:48:47,712 DEBUG storage.ui: PartitionDevice._setDisk: sda1 ; new: sda ; old: None ; >03:48:47,714 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sda ; >03:48:47,716 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:48:47,717 DEBUG storage.ui: device sda1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fb2290> PedPartition: <_ped.Partition object at 0x7fae05afbf50> >03:48:47,717 DEBUG storage.ui: setting req1 new geometry: parted.Geometry instance -- > start: 1050624 end: 1837055 length: 786432 > device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae04fa5450> >03:48:47,720 DEBUG storage.ui: PartitionDevice._setPartedPartition: req1 ; >03:48:47,720 DEBUG storage.ui: device req1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 2 path: /dev/sda2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f00810> PedPartition: <_ped.Partition object at 0x7fae04fb3d10> >03:48:47,723 DEBUG storage.ui: PartitionDevice._setDisk: sda2 ; new: sda ; old: None ; >03:48:47,725 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sda ; >03:48:47,727 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda2 ; >03:48:47,728 DEBUG storage.ui: device sda2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 2 path: /dev/sda2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fa5e90> PedPartition: 
<_ped.Partition object at 0x7fae04fb3d70> >03:48:47,729 DEBUG storage.ui: growing partitions on sdb >03:48:47,729 DEBUG storage.ui: growing partitions on sdc >03:48:47,730 DEBUG storage.ui: growing partitions on sdd >03:48:47,730 DEBUG storage.ui: fixing size of non-existent 512MB partition sda1 (30) with non-existent ext4 filesystem mounted at /boot at 512.00 >03:48:47,731 DEBUG storage.ui: fixing size of non-existent 384MB partition sda2 (25) with non-existent mdmember at 384.00 >03:48:47,732 DEBUG storage.ui: fixing size of non-existent 384MB partition sdb1 (26) with non-existent mdmember at 384.00 >03:48:47,733 DEBUG storage.ui: fixing size of non-existent 384MB partition sdc1 (27) with non-existent mdmember at 384.00 >03:48:47,733 DEBUG storage.ui: fixing size of non-existent 384MB partition sdd1 (28) with non-existent mdmember at 384.00 >03:48:47,745 DEBUG blivet: raw RAID 10 size == 768.0 >03:48:47,745 INFO blivet: Using 0MB superBlockSize >03:48:47,746 DEBUG blivet: non-existent RAID 10 size == 768.0 >03:48:47,757 DEBUG blivet: Ext4FS.supported: supported: True ; >03:48:47,758 DEBUG blivet: getFormat('ext4') returning Ext4FS instance >03:48:47,792 DEBUG blivet: Ext4FS.supported: supported: True ; >03:48:47,793 DEBUG blivet: getFormat('ext4') returning Ext4FS instance >03:48:47,797 DEBUG storage.ui: instantiating <class 'blivet.devicefactory.PartitionFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 512, ['sda'], {'encrypted': False, 'raid_level': None} >03:48:59,610 DEBUG blivet: Ext4FS.supported: supported: True ; >03:48:59,611 DEBUG blivet: getFormat('ext4') returning Ext4FS instance >03:48:59,618 DEBUG storage.ui: instantiating <class 'blivet.devicefactory.MDFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 512, ['sda'], {'encrypted': False, 'raid_level': 'raid1'} >03:48:59,624 INFO storage.ui: removed partition sda1 (id 30) from device tree >03:48:59,626 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sda ; >03:48:59,626 INFO storage.ui: 
registered action: [47] Destroy Device partition sda1 (id 30) >03:48:59,633 DEBUG storage.ui: Blivet.factoryDevice: 1 ; 512 ; container_raid_level: None ; name: boot ; encrypted: False ; container_encrypted: False ; disks: [DiskDevice instance (0x7fae05319950) -- > name = sda status = True kids = 1 id = 1 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 0 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sda type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 0 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae127eacb0> > target size = 0 path = /dev/sda > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae053199d0>, DiskDevice instance (0x7fae05b116d0) -- > name = sdb status = True kids = 1 id = 14 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 16 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdb type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 768 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f674a70> > target size = 0 path = /dev/sdb > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05b11750>, DiskDevice instance (0x7fae05aeabd0) -- > name = sdc status 
= True kids = 1 id = 11 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 32 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdc type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 512 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f674170> > target size = 0 path = /dev/sdc > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05aeac50>, DiskDevice instance (0x7fae05aea190) -- > name = sdd status = True kids = 1 id = 8 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 48 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdd type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 256 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae27c59680> > target size = 0 path = /dev/sdd > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05aea210>] ; raid_level: raid1 ; label: ; container_name: None ; device: None ; mountpoint: /boot ; fstype: ext4 ; container_size: 0 ; >03:48:59,635 DEBUG storage.ui: instantiating <class 'blivet.devicefactory.MDFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 512, ['sda', 'sdb', 'sdc', 'sdd'], {'container_raid_level': None, 'name': 'boot', 
'encrypted': False, 'container_encrypted': False, 'raid_level': 'raid1', 'label': '', 'container_name': None, 'device': None, 'mountpoint': '/boot', 'fstype': 'ext4', 'container_size': 0} >03:48:59,636 DEBUG storage.ui: MDFactory.configure: parent_factory: None ; >03:48:59,637 DEBUG storage.ui: starting Blivet copy >03:48:59,679 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda2 ; >03:48:59,680 DEBUG storage.ui: device sda2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04fae790> fileSystem: None > number: 2 path: /dev/sda2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f89950> PedPartition: <_ped.Partition object at 0x7fae05b0b110> >03:48:59,683 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ; >03:48:59,684 DEBUG storage.ui: device sdb1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04f85fd0> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f89ad0> PedPartition: <_ped.Partition object at 0x7fae04fb3d10> >03:48:59,687 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ; >03:48:59,688 DEBUG storage.ui: device sdc1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04f86990> fileSystem: None > number: 1 path: /dev/sdc1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f89890> PedPartition: <_ped.Partition object at 0x7fae04fb3bf0> >03:48:59,691 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ; >03:48:59,692 DEBUG storage.ui: device sdd1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04fa0cd0> fileSystem: None > number: 1 path: /dev/sdd1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f89bd0> 
PedPartition: <_ped.Partition object at 0x7fae04fb3c50> >03:48:59,692 DEBUG storage.ui: finished Blivet copy >03:48:59,693 INFO storage.ui: Using 0MB superBlockSize >03:48:59,694 DEBUG storage.ui: child factory class: <class 'blivet.devicefactory.PartitionSetFactory'> >03:48:59,699 DEBUG storage.ui: child factory args: [<blivet.Blivet object at 0x7fae05326d50>, 2048, [DiskDevice instance (0x7fae05319950) -- > name = sda status = True kids = 1 id = 1 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 0 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sda type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 0 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae127eacb0> > target size = 0 path = /dev/sda > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae053199d0>, DiskDevice instance (0x7fae05b116d0) -- > name = sdb status = True kids = 1 id = 14 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 16 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdb type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 768 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f674a70> > target size = 0 path = /dev/sdb > format args = [] originalFormat = 
disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05b11750>, DiskDevice instance (0x7fae05aeabd0) -- > name = sdc status = True kids = 1 id = 11 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 32 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdc type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 512 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f674170> > target size = 0 path = /dev/sdc > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05aeac50>, DiskDevice instance (0x7fae05aea190) -- > name = sdd status = True kids = 1 id = 8 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 48 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdd type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 256 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae27c59680> > target size = 0 path = /dev/sdd > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05aea210>]] >03:48:59,701 DEBUG storage.ui: child factory kwargs: {'fstype': 'mdmember'} >03:48:59,703 DEBUG storage.ui: PartitionSetFactory.configure: parent_factory: <blivet.devicefactory.MDFactory 
object at 0x7fae05b09d50> ; >03:48:59,703 DEBUG storage.ui: parent factory container: None >03:48:59,704 DEBUG storage.ui: members: [] >03:48:59,706 DEBUG storage.ui: add_disks: ['sda', 'sdb', 'sdc', 'sdd'] >03:48:59,707 DEBUG storage.ui: remove_disks: [] >03:48:59,709 DEBUG storage.ui: MDRaidMember.__init__: >03:48:59,709 DEBUG storage.ui: getFormat('mdmember') returning MDRaidMember instance >03:48:59,711 DEBUG storage.ui: MDRaidMember.__init__: mountpoint: None ; >03:48:59,712 DEBUG storage.ui: getFormat('mdmember') returning MDRaidMember instance >03:48:59,714 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sda ; >03:48:59,716 DEBUG storage.ui: PartitionDevice._setFormat: req6 ; >03:48:59,719 DEBUG storage.ui: PartitionDevice._setFormat: req6 ; current: None ; type: mdmember ; >03:48:59,721 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sda ; >03:48:59,722 INFO storage.ui: added partition req6 (id 31) to device tree >03:48:59,722 INFO storage.ui: registered action: [48] Create Device partition req6 (id 31) >03:48:59,723 DEBUG storage.ui: getFormat('None') returning DeviceFormat instance >03:48:59,724 INFO storage.ui: registered action: [49] Create Format mdmember on partition req6 (id 31) >03:48:59,726 DEBUG storage.ui: MDRaidMember.__init__: mountpoint: None ; >03:48:59,726 DEBUG storage.ui: getFormat('mdmember') returning MDRaidMember instance >03:48:59,729 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdb ; >03:48:59,731 DEBUG storage.ui: PartitionDevice._setFormat: req7 ; >03:48:59,733 DEBUG storage.ui: PartitionDevice._setFormat: req7 ; current: None ; type: mdmember ; >03:48:59,735 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdb ; >03:48:59,736 INFO storage.ui: added partition req7 (id 32) to device tree >03:48:59,737 INFO storage.ui: registered action: [50] Create Device partition req7 (id 32) >03:48:59,737 DEBUG storage.ui: getFormat('None') returning DeviceFormat instance >03:48:59,738 INFO storage.ui: 
registered action: [51] Create Format mdmember on partition req7 (id 32) >03:48:59,740 DEBUG storage.ui: MDRaidMember.__init__: mountpoint: None ; >03:48:59,741 DEBUG storage.ui: getFormat('mdmember') returning MDRaidMember instance >03:48:59,743 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdc ; >03:48:59,745 DEBUG storage.ui: PartitionDevice._setFormat: req8 ; >03:48:59,747 DEBUG storage.ui: PartitionDevice._setFormat: req8 ; current: None ; type: mdmember ; >03:48:59,749 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdc ; >03:48:59,750 INFO storage.ui: added partition req8 (id 33) to device tree >03:48:59,751 INFO storage.ui: registered action: [52] Create Device partition req8 (id 33) >03:48:59,751 DEBUG storage.ui: getFormat('None') returning DeviceFormat instance >03:48:59,752 INFO storage.ui: registered action: [53] Create Format mdmember on partition req8 (id 33) >03:48:59,754 DEBUG storage.ui: MDRaidMember.__init__: mountpoint: None ; >03:48:59,755 DEBUG storage.ui: getFormat('mdmember') returning MDRaidMember instance >03:48:59,757 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdd ; >03:48:59,759 DEBUG storage.ui: PartitionDevice._setFormat: req9 ; >03:48:59,761 DEBUG storage.ui: PartitionDevice._setFormat: req9 ; current: None ; type: mdmember ; >03:48:59,763 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdd ; >03:48:59,764 INFO storage.ui: added partition req9 (id 34) to device tree >03:48:59,765 INFO storage.ui: registered action: [54] Create Device partition req9 (id 34) >03:48:59,765 DEBUG storage.ui: getFormat('None') returning DeviceFormat instance >03:48:59,766 INFO storage.ui: registered action: [55] Create Format mdmember on partition req9 (id 34) >03:48:59,767 INFO storage.ui: Using 0MB superBlockSize >03:48:59,767 DEBUG storage.ui: adding a SameSizeSet with size 2048 >03:48:59,770 DEBUG storage.ui: DiskDevice.setup: sda ; status: True ; controllable: True ; orig: False ; >03:48:59,772 DEBUG 
storage.ui: DiskDevice.setup: sdb ; status: True ; controllable: True ; orig: False ; >03:48:59,775 DEBUG storage.ui: DiskDevice.setup: sdc ; status: True ; controllable: True ; orig: False ; >03:48:59,777 DEBUG storage.ui: DiskDevice.setup: sdd ; status: True ; controllable: True ; orig: False ; >03:48:59,778 DEBUG storage.ui: removing all non-preexisting partitions ['req6(id 31)', 'req7(id 32)', 'req8(id 33)', 'req9(id 34)', 'sda2(id 25)', 'sdb1(id 26)', 'sdc1(id 27)', 'sdd1(id 28)'] from disk(s) ['sda', 'sdb', 'sdc', 'sdd'] >03:48:59,781 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda2 ; >03:48:59,782 DEBUG storage.ui: device sda2 new partedPartition None >03:48:59,784 DEBUG storage.ui: PartitionDevice._setDisk: req1 ; new: None ; old: sda ; >03:48:59,786 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sda ; >03:48:59,789 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ; >03:48:59,789 DEBUG storage.ui: device sdb1 new partedPartition None >03:48:59,791 DEBUG storage.ui: PartitionDevice._setDisk: req2 ; new: None ; old: sdb ; >03:48:59,795 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdb ; >03:48:59,797 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ; >03:48:59,798 DEBUG storage.ui: device sdc1 new partedPartition None >03:48:59,800 DEBUG storage.ui: PartitionDevice._setDisk: req3 ; new: None ; old: sdc ; >03:48:59,802 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdc ; >03:48:59,805 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ; >03:48:59,805 DEBUG storage.ui: device sdd1 new partedPartition None >03:48:59,808 DEBUG storage.ui: PartitionDevice._setDisk: req4 ; new: None ; old: sdd ; >03:48:59,810 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdd ; >03:48:59,812 DEBUG storage.ui: allocatePartitions: disks=['sda', 'sdb', 'sdc', 'sdd'] ; partitions=['req6(id 31)', 'req7(id 32)', 'req8(id 33)', 'req9(id 34)', 'req1(id 25)', 'req2(id 26)', 'req3(id 27)', 'req4(id 28)'] 
>03:48:59,813 DEBUG storage.ui: removing all non-preexisting partitions ['req1(id 25)', 'req2(id 26)', 'req3(id 27)', 'req4(id 28)', 'req6(id 31)', 'req7(id 32)', 'req8(id 33)', 'req9(id 34)'] from disk(s) ['sda', 'sdb', 'sdc', 'sdd'] >03:48:59,815 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:59,818 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:59,818 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:59,821 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:59,823 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:59,824 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:59,824 DEBUG storage.ui: allocating partition: req1 ; id: 25 ; disks: ['sda'] ; >boot: False ; primary: False ; size: 384MB ; grow: False ; max_size: 384 >03:48:59,825 DEBUG storage.ui: checking freespace on sda >03:48:59,826 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sda part_type=0 req_size=384MB boot=False best=None grow=False >03:48:59,827 DEBUG storage.ui: current free range is 63-24575999 (11999MB) >03:48:59,828 DEBUG storage.ui: updating use_disk to sda, type: 0 >03:48:59,828 DEBUG storage.ui: new free: 63-24575999 / 11999MB >03:48:59,829 DEBUG storage.ui: new free allows for 0 sectors of growth >03:48:59,830 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:48:59,831 DEBUG storage.ui: created partition sda1 of 384MB and added it to /dev/sda >03:48:59,833 DEBUG storage.ui: PartitionDevice._setPartedPartition: req1 ; >03:48:59,834 DEBUG storage.ui: device req1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f78f90> PedPartition: <_ped.Partition object at 0x7fae04fb3dd0> >03:48:59,837 
DEBUG storage.ui: PartitionDevice._setDisk: sda1 ; new: sda ; old: None ; >03:48:59,839 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sda ; >03:48:59,842 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:48:59,843 DEBUG storage.ui: device sda1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f78e10> PedPartition: <_ped.Partition object at 0x7fae04fb3d70> >03:48:59,845 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:59,848 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:59,848 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:59,851 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:59,853 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:59,854 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:59,854 DEBUG storage.ui: allocating partition: req2 ; id: 26 ; disks: ['sdb'] ; >boot: False ; primary: False ; size: 384MB ; grow: False ; max_size: 384 >03:48:59,855 DEBUG storage.ui: checking freespace on sdb >03:48:59,856 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdb part_type=0 req_size=384MB boot=False best=None grow=False >03:48:59,857 DEBUG storage.ui: current free range is 63-24575999 (11999MB) >03:48:59,858 DEBUG storage.ui: updating use_disk to sdb, type: 0 >03:48:59,858 DEBUG storage.ui: new free: 63-24575999 / 11999MB >03:48:59,859 DEBUG storage.ui: new free allows for 0 sectors of growth >03:48:59,859 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:48:59,860 DEBUG storage.ui: created partition sdb1 of 384MB and added it to /dev/sdb >03:48:59,863 DEBUG storage.ui: PartitionDevice._setPartedPartition: req2 ; >03:48:59,864 DEBUG 
storage.ui: device req2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04efcb90> PedPartition: <_ped.Partition object at 0x7fae04fb3a10> >03:48:59,867 DEBUG storage.ui: PartitionDevice._setDisk: sdb1 ; new: sdb ; old: None ; >03:48:59,869 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdb ; >03:48:59,872 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ; >03:48:59,874 DEBUG storage.ui: device sdb1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fa5e90> PedPartition: <_ped.Partition object at 0x7fae04fb3ef0> >03:48:59,876 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:59,879 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:59,879 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:59,882 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:59,884 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:59,884 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:59,885 DEBUG storage.ui: allocating partition: req3 ; id: 27 ; disks: ['sdc'] ; >boot: False ; primary: False ; size: 384MB ; grow: False ; max_size: 384 >03:48:59,886 DEBUG storage.ui: checking freespace on sdc >03:48:59,887 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdc part_type=0 req_size=384MB boot=False best=None grow=False >03:48:59,888 DEBUG storage.ui: current free range is 63-24575999 (11999MB) >03:48:59,888 DEBUG storage.ui: updating use_disk to sdc, type: 0 >03:48:59,889 DEBUG storage.ui: new free: 
63-24575999 / 11999MB >03:48:59,890 DEBUG storage.ui: new free allows for 0 sectors of growth >03:48:59,890 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:48:59,892 DEBUG storage.ui: created partition sdc1 of 384MB and added it to /dev/sdc >03:48:59,894 DEBUG storage.ui: PartitionDevice._setPartedPartition: req3 ; >03:48:59,895 DEBUG storage.ui: device req3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 1 path: /dev/sdc1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fae9d0> PedPartition: <_ped.Partition object at 0x7fae04fb3cb0> >03:48:59,897 DEBUG storage.ui: PartitionDevice._setDisk: sdc1 ; new: sdc ; old: None ; >03:48:59,900 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdc ; >03:48:59,903 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ; >03:48:59,904 DEBUG storage.ui: device sdc1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 1 path: /dev/sdc1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f78b90> PedPartition: <_ped.Partition object at 0x7fae04fb3a10> >03:48:59,906 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:59,909 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:59,909 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:59,912 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:59,914 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:59,915 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:59,916 DEBUG storage.ui: allocating partition: req4 ; id: 28 ; disks: ['sdd'] ; >boot: False ; primary: False ; size: 384MB ; grow: False ; max_size: 384 >03:48:59,916 DEBUG 
storage.ui: checking freespace on sdd >03:48:59,917 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdd part_type=0 req_size=384MB boot=False best=None grow=False >03:48:59,918 DEBUG storage.ui: current free range is 63-24575999 (11999MB) >03:48:59,919 DEBUG storage.ui: updating use_disk to sdd, type: 0 >03:48:59,920 DEBUG storage.ui: new free: 63-24575999 / 11999MB >03:48:59,920 DEBUG storage.ui: new free allows for 0 sectors of growth >03:48:59,921 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:48:59,922 DEBUG storage.ui: created partition sdd1 of 384MB and added it to /dev/sdd >03:48:59,924 DEBUG storage.ui: PartitionDevice._setPartedPartition: req4 ; >03:48:59,925 DEBUG storage.ui: device req4 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 1 path: /dev/sdd1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04efc590> PedPartition: <_ped.Partition object at 0x7fae04fb3ad0> >03:48:59,928 DEBUG storage.ui: PartitionDevice._setDisk: sdd1 ; new: sdd ; old: None ; >03:48:59,930 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdd ; >03:48:59,933 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ; >03:48:59,934 DEBUG storage.ui: device sdd1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 1 path: /dev/sdd1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fb2b10> PedPartition: <_ped.Partition object at 0x7fae04fb3b90> >03:48:59,936 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:59,939 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:59,939 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:59,942 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:48:59,944 DEBUG 
storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:48:59,945 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:48:59,946 DEBUG storage.ui: allocating partition: req6 ; id: 31 ; disks: ['sda'] ; >boot: False ; primary: False ; size: 1MB ; grow: True ; max_size: 512 >03:48:59,946 DEBUG storage.ui: checking freespace on sda >03:48:59,947 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sda part_type=0 req_size=1MB boot=False best=None grow=True >03:48:59,948 DEBUG storage.ui: current free range is 63-2047 (0MB) >03:48:59,949 DEBUG storage.ui: current free range is 788480-24575999 (11615MB) >03:48:59,949 DEBUG storage.ui: evaluating growth potential for new layout >03:48:59,950 DEBUG storage.ui: calculating growth for disk /dev/sdd >03:48:59,951 DEBUG storage.ui: adding request 28 to chunk 24575937 (63-24575999) on /dev/sdd >03:48:59,951 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999 >sectorSize = 512 > >03:48:59,952 DEBUG storage.ui: req: PartitionRequest instance -- >id = 28 name = sdd1 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:48:59,953 DEBUG storage.ui: request 28 (sdd1) growth: 0 (0MB) size: 384MB >03:48:59,953 DEBUG storage.ui: disk /dev/sdd growth: 0 (0MB) >03:48:59,954 DEBUG storage.ui: calculating growth for disk /dev/sdb >03:48:59,954 DEBUG storage.ui: adding request 26 to chunk 24575937 (63-24575999) on /dev/sdb >03:48:59,955 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999 >sectorSize = 512 > >03:48:59,956 DEBUG storage.ui: req: PartitionRequest instance -- >id = 26 name = sdb1 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:48:59,956 DEBUG storage.ui: request 26 (sdb1) growth: 0 (0MB) size: 384MB >03:48:59,957 DEBUG storage.ui: disk /dev/sdb growth: 0 (0MB) >03:48:59,957 DEBUG storage.ui: calculating growth for disk /dev/sdc >03:48:59,958 DEBUG storage.ui: 
adding request 27 to chunk 24575937 (63-24575999) on /dev/sdc >03:48:59,959 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999 >sectorSize = 512 > >03:48:59,959 DEBUG storage.ui: req: PartitionRequest instance -- >id = 27 name = sdc1 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:48:59,960 DEBUG storage.ui: request 27 (sdc1) growth: 0 (0MB) size: 384MB >03:48:59,961 DEBUG storage.ui: disk /dev/sdc growth: 0 (0MB) >03:48:59,961 DEBUG storage.ui: calculating growth for disk /dev/sda >03:48:59,964 DEBUG storage.ui: PartitionDevice._setPartedPartition: req6 ; >03:48:59,965 DEBUG storage.ui: device req6 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 2 path: /dev/sda2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f17050> PedPartition: <_ped.Partition object at 0x7fae04fb3950> >03:48:59,967 DEBUG storage.ui: PartitionDevice._setDisk: sda2 ; new: sda ; old: None ; >03:48:59,970 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sda ; >03:48:59,971 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda >03:48:59,971 DEBUG storage.ui: adding request 31 to chunk 24575937 (63-24575999) on /dev/sda >03:48:59,972 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > >03:48:59,972 DEBUG storage.ui: req: PartitionRequest instance -- >id = 25 name = sda1 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:48:59,973 DEBUG storage.ui: req: PartitionRequest instance -- >id = 31 name = sda2 growable = True >base = 2048 growth = 0 max_grow = 1046528 >done = False >03:48:59,974 DEBUG storage.ui: 1 requests and 23787457 (11614MB) left in chunk >03:48:59,974 DEBUG storage.ui: adding 23787457 (11614MB) to 31 (sda2) >03:48:59,975 DEBUG storage.ui: taking back 22740929 (11103MB) from 31 (sda2) 
>03:48:59,975 DEBUG storage.ui: new grow amount for request 31 (sda2) is 1046528 units, or 511MB >03:48:59,976 DEBUG storage.ui: request 25 (sda1) growth: 0 (0MB) size: 384MB >03:48:59,977 DEBUG storage.ui: request 31 (sda2) growth: 1046528 (511MB) size: 512MB >03:48:59,977 DEBUG storage.ui: disk /dev/sda growth: 1046528 (511MB) >03:48:59,979 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda2 ; >03:48:59,980 DEBUG storage.ui: device sda2 new partedPartition None >03:48:59,982 DEBUG storage.ui: PartitionDevice._setDisk: req6 ; new: None ; old: sda ; >03:48:59,985 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sda ; >03:48:59,985 DEBUG storage.ui: total growth: 1046528 sectors >03:48:59,986 DEBUG storage.ui: updating use_disk to sda, type: 0 >03:48:59,987 DEBUG storage.ui: new free: 788480-24575999 / 11615MB >03:48:59,987 DEBUG storage.ui: new free allows for 1046528 sectors of growth >03:48:59,988 DEBUG storage.ui: created partition sda2 of 1MB and added it to /dev/sda >03:48:59,991 DEBUG storage.ui: PartitionDevice._setPartedPartition: req6 ; >03:48:59,992 DEBUG storage.ui: device req6 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 2 path: /dev/sda2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f17290> PedPartition: <_ped.Partition object at 0x7fae04fb3f50> >03:48:59,994 DEBUG storage.ui: PartitionDevice._setDisk: sda2 ; new: sda ; old: None ; >03:48:59,997 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sda ; >03:49:00,000 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda2 ; >03:49:00,001 DEBUG storage.ui: device sda2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 2 path: /dev/sda2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f89f90> PedPartition: 
<_ped.Partition object at 0x7fae04fb3cb0> >03:49:00,003 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:49:00,006 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:49:00,006 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:49:00,009 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:49:00,011 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:49:00,012 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:49:00,012 DEBUG storage.ui: allocating partition: req7 ; id: 32 ; disks: ['sdb'] ; >boot: False ; primary: False ; size: 1MB ; grow: True ; max_size: 512 >03:49:00,013 DEBUG storage.ui: checking freespace on sdb >03:49:00,014 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdb part_type=0 req_size=1MB boot=False best=None grow=True >03:49:00,014 DEBUG storage.ui: current free range is 63-2047 (0MB) >03:49:00,016 DEBUG storage.ui: current free range is 788480-24575999 (11615MB) >03:49:00,016 DEBUG storage.ui: evaluating growth potential for new layout >03:49:00,017 DEBUG storage.ui: calculating growth for disk /dev/sdd >03:49:00,018 DEBUG storage.ui: adding request 28 to chunk 24575937 (63-24575999) on /dev/sdd >03:49:00,018 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999 >sectorSize = 512 > >03:49:00,019 DEBUG storage.ui: req: PartitionRequest instance -- >id = 28 name = sdd1 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:49:00,019 DEBUG storage.ui: request 28 (sdd1) growth: 0 (0MB) size: 384MB >03:49:00,020 DEBUG storage.ui: disk /dev/sdd growth: 0 (0MB) >03:49:00,020 DEBUG storage.ui: calculating growth for disk /dev/sdb >03:49:00,023 DEBUG storage.ui: PartitionDevice._setPartedPartition: req7 ; >03:49:00,024 DEBUG storage.ui: device req7 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 
0x7fae05aee350> fileSystem: None > number: 2 path: /dev/sdb2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04efc350> PedPartition: <_ped.Partition object at 0x7fae04fb3dd0> >03:49:00,026 DEBUG storage.ui: PartitionDevice._setDisk: sdb2 ; new: sdb ; old: None ; >03:49:00,029 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdb ; >03:49:00,029 DEBUG storage.ui: adding request 26 to chunk 24575937 (63-24575999) on /dev/sdb >03:49:00,030 DEBUG storage.ui: adding request 32 to chunk 24575937 (63-24575999) on /dev/sdb >03:49:00,031 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999 >sectorSize = 512 > >03:49:00,031 DEBUG storage.ui: req: PartitionRequest instance -- >id = 26 name = sdb1 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:49:00,032 DEBUG storage.ui: req: PartitionRequest instance -- >id = 32 name = sdb2 growable = True >base = 2048 growth = 0 max_grow = 1046528 >done = False >03:49:00,032 DEBUG storage.ui: 1 requests and 23787457 (11614MB) left in chunk >03:49:00,033 DEBUG storage.ui: adding 23787457 (11614MB) to 32 (sdb2) >03:49:00,034 DEBUG storage.ui: taking back 22740929 (11103MB) from 32 (sdb2) >03:49:00,034 DEBUG storage.ui: new grow amount for request 32 (sdb2) is 1046528 units, or 511MB >03:49:00,035 DEBUG storage.ui: request 26 (sdb1) growth: 0 (0MB) size: 384MB >03:49:00,035 DEBUG storage.ui: request 32 (sdb2) growth: 1046528 (511MB) size: 512MB >03:49:00,036 DEBUG storage.ui: disk /dev/sdb growth: 1046528 (511MB) >03:49:00,037 DEBUG storage.ui: calculating growth for disk /dev/sdc >03:49:00,037 DEBUG storage.ui: adding request 27 to chunk 24575937 (63-24575999) on /dev/sdc >03:49:00,038 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999 >sectorSize = 512 > >03:49:00,038 DEBUG storage.ui: req: PartitionRequest instance -- >id = 27 name = sdc1 growable = False >base = 786432 growth = 0 max_grow = 0 
>done = True >03:49:00,039 DEBUG storage.ui: request 27 (sdc1) growth: 0 (0MB) size: 384MB >03:49:00,040 DEBUG storage.ui: disk /dev/sdc growth: 0 (0MB) >03:49:00,040 DEBUG storage.ui: calculating growth for disk /dev/sda >03:49:00,041 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda >03:49:00,042 DEBUG storage.ui: adding request 31 to chunk 24575937 (63-24575999) on /dev/sda >03:49:00,042 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > >03:49:00,043 DEBUG storage.ui: req: PartitionRequest instance -- >id = 25 name = sda1 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:49:00,043 DEBUG storage.ui: req: PartitionRequest instance -- >id = 31 name = sda2 growable = True >base = 2048 growth = 0 max_grow = 1046528 >done = False >03:49:00,044 DEBUG storage.ui: 1 requests and 23787457 (11614MB) left in chunk >03:49:00,044 DEBUG storage.ui: adding 23787457 (11614MB) to 31 (sda2) >03:49:00,045 DEBUG storage.ui: taking back 22740929 (11103MB) from 31 (sda2) >03:49:00,045 DEBUG storage.ui: new grow amount for request 31 (sda2) is 1046528 units, or 511MB >03:49:00,045 DEBUG storage.ui: request 25 (sda1) growth: 0 (0MB) size: 384MB >03:49:00,046 DEBUG storage.ui: request 31 (sda2) growth: 1046528 (511MB) size: 512MB >03:49:00,046 DEBUG storage.ui: disk /dev/sda growth: 1046528 (511MB) >03:49:00,048 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb2 ; >03:49:00,049 DEBUG storage.ui: device sdb2 new partedPartition None >03:49:00,050 DEBUG storage.ui: PartitionDevice._setDisk: req7 ; new: None ; old: sdb ; >03:49:00,052 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdb ; >03:49:00,053 DEBUG storage.ui: total growth: 2093056 sectors >03:49:00,053 DEBUG storage.ui: updating use_disk to sdb, type: 0 >03:49:00,053 DEBUG storage.ui: new free: 788480-24575999 / 11615MB >03:49:00,054 DEBUG storage.ui: new free allows for 2093056 sectors of growth 
>03:49:00,055 DEBUG storage.ui: created partition sdb2 of 1MB and added it to /dev/sdb >03:49:00,056 DEBUG storage.ui: PartitionDevice._setPartedPartition: req7 ; >03:49:00,057 DEBUG storage.ui: device req7 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 2 path: /dev/sdb2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f17250> PedPartition: <_ped.Partition object at 0x7fae04fb3950> >03:49:00,059 DEBUG storage.ui: PartitionDevice._setDisk: sdb2 ; new: sdb ; old: None ; >03:49:00,061 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdb ; >03:49:00,064 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb2 ; >03:49:00,065 DEBUG storage.ui: device sdb2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 2 path: /dev/sdb2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f17290> PedPartition: <_ped.Partition object at 0x7fae04fb3e90> >03:49:00,067 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:49:00,069 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:49:00,069 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:49:00,072 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:49:00,074 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:49:00,074 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:49:00,074 DEBUG storage.ui: allocating partition: req8 ; id: 33 ; disks: ['sdc'] ; >boot: False ; primary: False ; size: 1MB ; grow: True ; max_size: 512 >03:49:00,075 DEBUG storage.ui: checking freespace on sdc >03:49:00,075 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdc part_type=0 req_size=1MB boot=False best=None grow=True 
>03:49:00,076 DEBUG storage.ui: current free range is 63-2047 (0MB) >03:49:00,077 DEBUG storage.ui: current free range is 788480-24575999 (11615MB) >03:49:00,077 DEBUG storage.ui: evaluating growth potential for new layout >03:49:00,077 DEBUG storage.ui: calculating growth for disk /dev/sdd >03:49:00,078 DEBUG storage.ui: adding request 28 to chunk 24575937 (63-24575999) on /dev/sdd >03:49:00,078 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999 >sectorSize = 512 > >03:49:00,079 DEBUG storage.ui: req: PartitionRequest instance -- >id = 28 name = sdd1 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:49:00,079 DEBUG storage.ui: request 28 (sdd1) growth: 0 (0MB) size: 384MB >03:49:00,079 DEBUG storage.ui: disk /dev/sdd growth: 0 (0MB) >03:49:00,080 DEBUG storage.ui: calculating growth for disk /dev/sdb >03:49:00,080 DEBUG storage.ui: adding request 26 to chunk 24575937 (63-24575999) on /dev/sdb >03:49:00,081 DEBUG storage.ui: adding request 32 to chunk 24575937 (63-24575999) on /dev/sdb >03:49:00,081 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999 >sectorSize = 512 > >03:49:00,082 DEBUG storage.ui: req: PartitionRequest instance -- >id = 26 name = sdb1 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:49:00,082 DEBUG storage.ui: req: PartitionRequest instance -- >id = 32 name = sdb2 growable = True >base = 2048 growth = 0 max_grow = 1046528 >done = False >03:49:00,082 DEBUG storage.ui: 1 requests and 23787457 (11614MB) left in chunk >03:49:00,083 DEBUG storage.ui: adding 23787457 (11614MB) to 32 (sdb2) >03:49:00,083 DEBUG storage.ui: taking back 22740929 (11103MB) from 32 (sdb2) >03:49:00,083 DEBUG storage.ui: new grow amount for request 32 (sdb2) is 1046528 units, or 511MB >03:49:00,084 DEBUG storage.ui: request 26 (sdb1) growth: 0 (0MB) size: 384MB >03:49:00,084 DEBUG storage.ui: request 32 (sdb2) growth: 1046528 (511MB) size: 512MB >03:49:00,085 
DEBUG storage.ui: disk /dev/sdb growth: 1046528 (511MB) >03:49:00,085 DEBUG storage.ui: calculating growth for disk /dev/sdc >03:49:00,087 DEBUG storage.ui: PartitionDevice._setPartedPartition: req8 ; >03:49:00,088 DEBUG storage.ui: device req8 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 2 path: /dev/sdc2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f17690> PedPartition: <_ped.Partition object at 0x7fae04f8e050> >03:49:00,091 DEBUG storage.ui: PartitionDevice._setDisk: sdc2 ; new: sdc ; old: None ; >03:49:00,093 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdc ; >03:49:00,093 DEBUG storage.ui: adding request 27 to chunk 24575937 (63-24575999) on /dev/sdc >03:49:00,094 DEBUG storage.ui: adding request 33 to chunk 24575937 (63-24575999) on /dev/sdc >03:49:00,094 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999 >sectorSize = 512 > >03:49:00,094 DEBUG storage.ui: req: PartitionRequest instance -- >id = 27 name = sdc1 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:49:00,095 DEBUG storage.ui: req: PartitionRequest instance -- >id = 33 name = sdc2 growable = True >base = 2048 growth = 0 max_grow = 1046528 >done = False >03:49:00,095 DEBUG storage.ui: 1 requests and 23787457 (11614MB) left in chunk >03:49:00,096 DEBUG storage.ui: adding 23787457 (11614MB) to 33 (sdc2) >03:49:00,096 DEBUG storage.ui: taking back 22740929 (11103MB) from 33 (sdc2) >03:49:00,096 DEBUG storage.ui: new grow amount for request 33 (sdc2) is 1046528 units, or 511MB >03:49:00,097 DEBUG storage.ui: request 27 (sdc1) growth: 0 (0MB) size: 384MB >03:49:00,097 DEBUG storage.ui: request 33 (sdc2) growth: 1046528 (511MB) size: 512MB >03:49:00,097 DEBUG storage.ui: disk /dev/sdc growth: 1046528 (511MB) >03:49:00,098 DEBUG storage.ui: calculating growth for disk /dev/sda >03:49:00,098 DEBUG 
storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda >03:49:00,099 DEBUG storage.ui: adding request 31 to chunk 24575937 (63-24575999) on /dev/sda >03:49:00,099 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > >03:49:00,099 DEBUG storage.ui: req: PartitionRequest instance -- >id = 25 name = sda1 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:49:00,100 DEBUG storage.ui: req: PartitionRequest instance -- >id = 31 name = sda2 growable = True >base = 2048 growth = 0 max_grow = 1046528 >done = False >03:49:00,100 DEBUG storage.ui: 1 requests and 23787457 (11614MB) left in chunk >03:49:00,100 DEBUG storage.ui: adding 23787457 (11614MB) to 31 (sda2) >03:49:00,101 DEBUG storage.ui: taking back 22740929 (11103MB) from 31 (sda2) >03:49:00,101 DEBUG storage.ui: new grow amount for request 31 (sda2) is 1046528 units, or 511MB >03:49:00,102 DEBUG storage.ui: request 25 (sda1) growth: 0 (0MB) size: 384MB >03:49:00,102 DEBUG storage.ui: request 31 (sda2) growth: 1046528 (511MB) size: 512MB >03:49:00,102 DEBUG storage.ui: disk /dev/sda growth: 1046528 (511MB) >03:49:00,104 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc2 ; >03:49:00,105 DEBUG storage.ui: device sdc2 new partedPartition None >03:49:00,107 DEBUG storage.ui: PartitionDevice._setDisk: req8 ; new: None ; old: sdc ; >03:49:00,109 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdc ; >03:49:00,109 DEBUG storage.ui: total growth: 3139584 sectors >03:49:00,110 DEBUG storage.ui: updating use_disk to sdc, type: 0 >03:49:00,110 DEBUG storage.ui: new free: 788480-24575999 / 11615MB >03:49:00,110 DEBUG storage.ui: new free allows for 3139584 sectors of growth >03:49:00,111 DEBUG storage.ui: created partition sdc2 of 1MB and added it to /dev/sdc >03:49:00,113 DEBUG storage.ui: PartitionDevice._setPartedPartition: req8 ; >03:49:00,114 DEBUG storage.ui: device req8 new partedPartition parted.Partition 
instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 2 path: /dev/sdc2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f17890> PedPartition: <_ped.Partition object at 0x7fae04f8e0b0> >03:49:00,116 DEBUG storage.ui: PartitionDevice._setDisk: sdc2 ; new: sdc ; old: None ; >03:49:00,118 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdc ; >03:49:00,121 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc2 ; >03:49:00,122 DEBUG storage.ui: device sdc2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 2 path: /dev/sdc2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f17950> PedPartition: <_ped.Partition object at 0x7fae04f8e170> >03:49:00,125 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:49:00,127 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:49:00,127 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:49:00,129 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:49:00,131 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:49:00,131 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:49:00,132 DEBUG storage.ui: allocating partition: req9 ; id: 34 ; disks: ['sdd'] ; >boot: False ; primary: False ; size: 1MB ; grow: True ; max_size: 512 >03:49:00,132 DEBUG storage.ui: checking freespace on sdd >03:49:00,133 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdd part_type=0 req_size=1MB boot=False best=None grow=True >03:49:00,134 DEBUG storage.ui: current free range is 63-2047 (0MB) >03:49:00,134 DEBUG storage.ui: current free range is 788480-24575999 (11615MB) >03:49:00,135 DEBUG storage.ui: evaluating growth potential for new layout >03:49:00,135 DEBUG 
storage.ui: calculating growth for disk /dev/sdd >03:49:00,137 DEBUG storage.ui: PartitionDevice._setPartedPartition: req9 ; >03:49:00,138 DEBUG storage.ui: device req9 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 2 path: /dev/sdd2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f174d0> PedPartition: <_ped.Partition object at 0x7fae04fb3dd0> >03:49:00,140 DEBUG storage.ui: PartitionDevice._setDisk: sdd2 ; new: sdd ; old: None ; >03:49:00,142 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdd ; >03:49:00,143 DEBUG storage.ui: adding request 28 to chunk 24575937 (63-24575999) on /dev/sdd >03:49:00,143 DEBUG storage.ui: adding request 34 to chunk 24575937 (63-24575999) on /dev/sdd >03:49:00,144 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999 >sectorSize = 512 > >03:49:00,144 DEBUG storage.ui: req: PartitionRequest instance -- >id = 28 name = sdd1 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:49:00,145 DEBUG storage.ui: req: PartitionRequest instance -- >id = 34 name = sdd2 growable = True >base = 2048 growth = 0 max_grow = 1046528 >done = False >03:49:00,145 DEBUG storage.ui: 1 requests and 23787457 (11614MB) left in chunk >03:49:00,145 DEBUG storage.ui: adding 23787457 (11614MB) to 34 (sdd2) >03:49:00,146 DEBUG storage.ui: taking back 22740929 (11103MB) from 34 (sdd2) >03:49:00,146 DEBUG storage.ui: new grow amount for request 34 (sdd2) is 1046528 units, or 511MB >03:49:00,147 DEBUG storage.ui: request 28 (sdd1) growth: 0 (0MB) size: 384MB >03:49:00,147 DEBUG storage.ui: request 34 (sdd2) growth: 1046528 (511MB) size: 512MB >03:49:00,147 DEBUG storage.ui: disk /dev/sdd growth: 1046528 (511MB) >03:49:00,148 DEBUG storage.ui: calculating growth for disk /dev/sdb >03:49:00,148 DEBUG storage.ui: adding request 26 to chunk 24575937 (63-24575999) on /dev/sdb 
>03:49:00,149 DEBUG storage.ui: adding request 32 to chunk 24575937 (63-24575999) on /dev/sdb >03:49:00,149 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999 >sectorSize = 512 > >03:49:00,150 DEBUG storage.ui: req: PartitionRequest instance -- >id = 26 name = sdb1 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:49:00,150 DEBUG storage.ui: req: PartitionRequest instance -- >id = 32 name = sdb2 growable = True >base = 2048 growth = 0 max_grow = 1046528 >done = False >03:49:00,150 DEBUG storage.ui: 1 requests and 23787457 (11614MB) left in chunk >03:49:00,151 DEBUG storage.ui: adding 23787457 (11614MB) to 32 (sdb2) >03:49:00,151 DEBUG storage.ui: taking back 22740929 (11103MB) from 32 (sdb2) >03:49:00,151 DEBUG storage.ui: new grow amount for request 32 (sdb2) is 1046528 units, or 511MB >03:49:00,152 DEBUG storage.ui: request 26 (sdb1) growth: 0 (0MB) size: 384MB >03:49:00,152 DEBUG storage.ui: request 32 (sdb2) growth: 1046528 (511MB) size: 512MB >03:49:00,152 DEBUG storage.ui: disk /dev/sdb growth: 1046528 (511MB) >03:49:00,153 DEBUG storage.ui: calculating growth for disk /dev/sdc >03:49:00,153 DEBUG storage.ui: adding request 27 to chunk 24575937 (63-24575999) on /dev/sdc >03:49:00,154 DEBUG storage.ui: adding request 33 to chunk 24575937 (63-24575999) on /dev/sdc >03:49:00,154 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999 >sectorSize = 512 > >03:49:00,155 DEBUG storage.ui: req: PartitionRequest instance -- >id = 27 name = sdc1 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:49:00,155 DEBUG storage.ui: req: PartitionRequest instance -- >id = 33 name = sdc2 growable = True >base = 2048 growth = 0 max_grow = 1046528 >done = False >03:49:00,156 DEBUG storage.ui: 1 requests and 23787457 (11614MB) left in chunk >03:49:00,156 DEBUG storage.ui: adding 23787457 (11614MB) to 33 (sdc2) >03:49:00,156 DEBUG storage.ui: taking back 22740929 (11103MB) from 
33 (sdc2) >03:49:00,157 DEBUG storage.ui: new grow amount for request 33 (sdc2) is 1046528 units, or 511MB >03:49:00,157 DEBUG storage.ui: request 27 (sdc1) growth: 0 (0MB) size: 384MB >03:49:00,157 DEBUG storage.ui: request 33 (sdc2) growth: 1046528 (511MB) size: 512MB >03:49:00,158 DEBUG storage.ui: disk /dev/sdc growth: 1046528 (511MB) >03:49:00,158 DEBUG storage.ui: calculating growth for disk /dev/sda >03:49:00,158 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda >03:49:00,159 DEBUG storage.ui: adding request 31 to chunk 24575937 (63-24575999) on /dev/sda >03:49:00,159 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > >03:49:00,160 DEBUG storage.ui: req: PartitionRequest instance -- >id = 25 name = sda1 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:49:00,160 DEBUG storage.ui: req: PartitionRequest instance -- >id = 31 name = sda2 growable = True >base = 2048 growth = 0 max_grow = 1046528 >done = False >03:49:00,160 DEBUG storage.ui: 1 requests and 23787457 (11614MB) left in chunk >03:49:00,161 DEBUG storage.ui: adding 23787457 (11614MB) to 31 (sda2) >03:49:00,161 DEBUG storage.ui: taking back 22740929 (11103MB) from 31 (sda2) >03:49:00,162 DEBUG storage.ui: new grow amount for request 31 (sda2) is 1046528 units, or 511MB >03:49:00,162 DEBUG storage.ui: request 25 (sda1) growth: 0 (0MB) size: 384MB >03:49:00,162 DEBUG storage.ui: request 31 (sda2) growth: 1046528 (511MB) size: 512MB >03:49:00,163 DEBUG storage.ui: disk /dev/sda growth: 1046528 (511MB) >03:49:00,165 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd2 ; >03:49:00,165 DEBUG storage.ui: device sdd2 new partedPartition None >03:49:00,167 DEBUG storage.ui: PartitionDevice._setDisk: req9 ; new: None ; old: sdd ; >03:49:00,169 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdd ; >03:49:00,170 DEBUG storage.ui: total growth: 4186112 sectors >03:49:00,170 DEBUG storage.ui: 
updating use_disk to sdd, type: 0 >03:49:00,170 DEBUG storage.ui: new free: 788480-24575999 / 11615MB >03:49:00,171 DEBUG storage.ui: new free allows for 4186112 sectors of growth >03:49:00,171 DEBUG storage.ui: created partition sdd2 of 1MB and added it to /dev/sdd >03:49:00,173 DEBUG storage.ui: PartitionDevice._setPartedPartition: req9 ; >03:49:00,174 DEBUG storage.ui: device req9 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 2 path: /dev/sdd2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f179d0> PedPartition: <_ped.Partition object at 0x7fae04fb3950> >03:49:00,176 DEBUG storage.ui: PartitionDevice._setDisk: sdd2 ; new: sdd ; old: None ; >03:49:00,178 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdd ; >03:49:00,181 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd2 ; >03:49:00,181 DEBUG storage.ui: device sdd2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 2 path: /dev/sdd2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f17510> PedPartition: <_ped.Partition object at 0x7fae04f8e230> >03:49:00,182 DEBUG storage.ui: growPartitions: disks=['sda', 'sdb', 'sdc', 'sdd'], partitions=['sda2(id 31)', 'sdb2(id 32)', 'sdc2(id 33)', 'sdd2(id 34)', 'sda1(id 25)', 'sdb1(id 26)', 'sdc1(id 27)', 'sdd1(id 28)'] >03:49:00,182 DEBUG storage.ui: growable partitions are ['sda2', 'sdb2', 'sdc2', 'sdd2'] >03:49:00,183 DEBUG storage.ui: adding request 31 to chunk 24575937 (63-24575999) on /dev/sda >03:49:00,183 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda >03:49:00,183 DEBUG storage.ui: disk sda has 1 chunks >03:49:00,184 DEBUG storage.ui: adding request 32 to chunk 24575937 (63-24575999) on /dev/sdb >03:49:00,184 DEBUG storage.ui: adding request 26 to chunk 
24575937 (63-24575999) on /dev/sdb >03:49:00,185 DEBUG storage.ui: disk sdb has 1 chunks >03:49:00,185 DEBUG storage.ui: adding request 33 to chunk 24575937 (63-24575999) on /dev/sdc >03:49:00,185 DEBUG storage.ui: adding request 27 to chunk 24575937 (63-24575999) on /dev/sdc >03:49:00,186 DEBUG storage.ui: disk sdc has 1 chunks >03:49:00,186 DEBUG storage.ui: adding request 34 to chunk 24575937 (63-24575999) on /dev/sdd >03:49:00,187 DEBUG storage.ui: adding request 28 to chunk 24575937 (63-24575999) on /dev/sdd >03:49:00,187 DEBUG storage.ui: disk sdd has 1 chunks >03:49:00,187 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > >03:49:00,188 DEBUG storage.ui: req: PartitionRequest instance -- >id = 25 name = sda1 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:49:00,188 DEBUG storage.ui: req: PartitionRequest instance -- >id = 31 name = sda2 growable = True >base = 2048 growth = 0 max_grow = 1046528 >done = False >03:49:00,189 DEBUG storage.ui: 1 requests and 23787457 (11614MB) left in chunk >03:49:00,189 DEBUG storage.ui: adding 23787457 (11614MB) to 31 (sda2) >03:49:00,189 DEBUG storage.ui: taking back 22740929 (11103MB) from 31 (sda2) >03:49:00,190 DEBUG storage.ui: new grow amount for request 31 (sda2) is 1046528 units, or 511MB >03:49:00,190 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999 >sectorSize = 512 > >03:49:00,190 DEBUG storage.ui: req: PartitionRequest instance -- >id = 26 name = sdb1 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:49:00,191 DEBUG storage.ui: req: PartitionRequest instance -- >id = 32 name = sdb2 growable = True >base = 2048 growth = 0 max_grow = 1046528 >done = False >03:49:00,191 DEBUG storage.ui: 1 requests and 23787457 (11614MB) left in chunk >03:49:00,192 DEBUG storage.ui: adding 23787457 (11614MB) to 32 (sdb2) >03:49:00,192 DEBUG storage.ui: taking back 22740929 (11103MB) from 32 (sdb2) 
>03:49:00,192 DEBUG storage.ui: new grow amount for request 32 (sdb2) is 1046528 units, or 511MB >03:49:00,193 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999 >sectorSize = 512 > >03:49:00,193 DEBUG storage.ui: req: PartitionRequest instance -- >id = 27 name = sdc1 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:49:00,194 DEBUG storage.ui: req: PartitionRequest instance -- >id = 33 name = sdc2 growable = True >base = 2048 growth = 0 max_grow = 1046528 >done = False >03:49:00,194 DEBUG storage.ui: 1 requests and 23787457 (11614MB) left in chunk >03:49:00,194 DEBUG storage.ui: adding 23787457 (11614MB) to 33 (sdc2) >03:49:00,195 DEBUG storage.ui: taking back 22740929 (11103MB) from 33 (sdc2) >03:49:00,195 DEBUG storage.ui: new grow amount for request 33 (sdc2) is 1046528 units, or 511MB >03:49:00,195 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999 >sectorSize = 512 > >03:49:00,196 DEBUG storage.ui: req: PartitionRequest instance -- >id = 28 name = sdd1 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:49:00,196 DEBUG storage.ui: req: PartitionRequest instance -- >id = 34 name = sdd2 growable = True >base = 2048 growth = 0 max_grow = 1046528 >done = False >03:49:00,197 DEBUG storage.ui: 1 requests and 23787457 (11614MB) left in chunk >03:49:00,197 DEBUG storage.ui: adding 23787457 (11614MB) to 34 (sdd2) >03:49:00,197 DEBUG storage.ui: taking back 22740929 (11103MB) from 34 (sdd2) >03:49:00,198 DEBUG storage.ui: new grow amount for request 34 (sdd2) is 1046528 units, or 511MB >03:49:00,198 DEBUG storage.ui: set: ['sda2', 'sdb2', 'sdc2', 'sdd2'] 512 >03:49:00,198 DEBUG storage.ui: min growth is 1046528 >03:49:00,199 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 31 name = sda2 growable = True >base = 2048 growth = 1046528 max_grow = 1046528 >done = True is 1046528 >03:49:00,199 DEBUG storage.ui: max growth for PartitionRequest 
instance -- >id = 32 name = sdb2 growable = True >base = 2048 growth = 1046528 max_grow = 1046528 >done = True is 1046528 >03:49:00,200 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 33 name = sdc2 growable = True >base = 2048 growth = 1046528 max_grow = 1046528 >done = True is 1046528 >03:49:00,200 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 34 name = sdd2 growable = True >base = 2048 growth = 1046528 max_grow = 1046528 >done = True is 1046528 >03:49:00,201 DEBUG storage.ui: set: ['sda2', 'sdb2', 'sdc2', 'sdd2'] 512 >03:49:00,201 DEBUG storage.ui: min growth is 1046528 >03:49:00,201 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 31 name = sda2 growable = True >base = 2048 growth = 1046528 max_grow = 1046528 >done = True is 1046528 >03:49:00,202 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 32 name = sdb2 growable = True >base = 2048 growth = 1046528 max_grow = 1046528 >done = True is 1046528 >03:49:00,202 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 33 name = sdc2 growable = True >base = 2048 growth = 1046528 max_grow = 1046528 >done = True is 1046528 >03:49:00,203 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 34 name = sdd2 growable = True >base = 2048 growth = 1046528 max_grow = 1046528 >done = True is 1046528 >03:49:00,203 DEBUG storage.ui: growing partitions on sda >03:49:00,204 DEBUG storage.ui: partition sda1 (25): 0 >03:49:00,204 DEBUG storage.ui: new geometry for sda1: parted.Geometry instance -- > start: 2048 end: 788479 length: 786432 > device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae04f17b10> >03:49:00,205 DEBUG storage.ui: partition sda2 (31): 0 >03:49:00,205 DEBUG storage.ui: new geometry for sda2: parted.Geometry instance -- > start: 788480 end: 1837055 length: 1048576 > device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 
0x7fae04f17c90> >03:49:00,205 DEBUG storage.ui: removing all non-preexisting partitions ['sda1(id 25)', 'sda2(id 31)'] from disk(s) ['sda'] >03:49:00,208 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:49:00,208 DEBUG storage.ui: device sda1 new partedPartition None >03:49:00,210 DEBUG storage.ui: PartitionDevice._setDisk: req1 ; new: None ; old: sda ; >03:49:00,212 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sda ; >03:49:00,214 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda2 ; >03:49:00,214 DEBUG storage.ui: device sda2 new partedPartition None >03:49:00,216 DEBUG storage.ui: PartitionDevice._setDisk: req6 ; new: None ; old: sda ; >03:49:00,218 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sda ; >03:49:00,219 DEBUG storage.ui: back from removeNewPartitions >03:49:00,219 DEBUG storage.ui: extended: None >03:49:00,219 DEBUG storage.ui: setting req1 new geometry: parted.Geometry instance -- > start: 2048 end: 788479 length: 786432 > device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae04f17b10> >03:49:00,221 DEBUG storage.ui: PartitionDevice._setPartedPartition: req1 ; >03:49:00,222 DEBUG storage.ui: device req1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f040d0> PedPartition: <_ped.Partition object at 0x7fae04f8e050> >03:49:00,224 DEBUG storage.ui: PartitionDevice._setDisk: sda1 ; new: sda ; old: None ; >03:49:00,226 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sda ; >03:49:00,229 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:49:00,230 DEBUG storage.ui: device sda1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None 
active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f04250> PedPartition: <_ped.Partition object at 0x7fae04f8e2f0> >03:49:00,230 DEBUG storage.ui: setting req6 new geometry: parted.Geometry instance -- > start: 788480 end: 1837055 length: 1048576 > device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae04f17c90> >03:49:00,233 DEBUG storage.ui: PartitionDevice._setPartedPartition: req6 ; >03:49:00,233 DEBUG storage.ui: device req6 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 2 path: /dev/sda2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f17e90> PedPartition: <_ped.Partition object at 0x7fae04f8e290> >03:49:00,235 DEBUG storage.ui: PartitionDevice._setDisk: sda2 ; new: sda ; old: None ; >03:49:00,238 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sda ; >03:49:00,246 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda2 ; >03:49:00,247 DEBUG storage.ui: device sda2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 2 path: /dev/sda2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f04490> PedPartition: <_ped.Partition object at 0x7fae04fb3cb0> >03:49:00,247 DEBUG storage.ui: growing partitions on sdb >03:49:00,248 DEBUG storage.ui: partition sdb1 (26): 0 >03:49:00,248 DEBUG storage.ui: new geometry for sdb1: parted.Geometry instance -- > start: 2048 end: 788479 length: 786432 > device: <parted.device.Device object at 0x7fae05b09ad0> PedGeometry: <_ped.Geometry object at 0x7fae04f17c10> >03:49:00,249 DEBUG storage.ui: partition sdb2 (32): 0 >03:49:00,249 DEBUG storage.ui: new geometry for sdb2: parted.Geometry instance -- > start: 788480 end: 1837055 length: 1048576 > device: <parted.device.Device object at 
0x7fae05b09ad0> PedGeometry: <_ped.Geometry object at 0x7fae04f17ad0> >03:49:00,250 DEBUG storage.ui: removing all non-preexisting partitions ['sdb1(id 26)', 'sdb2(id 32)'] from disk(s) ['sdb'] >03:49:00,253 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ; >03:49:00,253 DEBUG storage.ui: device sdb1 new partedPartition None >03:49:00,255 DEBUG storage.ui: PartitionDevice._setDisk: req2 ; new: None ; old: sdb ; >03:49:00,258 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdb ; >03:49:00,260 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb2 ; >03:49:00,260 DEBUG storage.ui: device sdb2 new partedPartition None >03:49:00,262 DEBUG storage.ui: PartitionDevice._setDisk: req7 ; new: None ; old: sdb ; >03:49:00,264 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdb ; >03:49:00,265 DEBUG storage.ui: back from removeNewPartitions >03:49:00,265 DEBUG storage.ui: extended: None >03:49:00,266 DEBUG storage.ui: setting req2 new geometry: parted.Geometry instance -- > start: 2048 end: 788479 length: 786432 > device: <parted.device.Device object at 0x7fae05b09ad0> PedGeometry: <_ped.Geometry object at 0x7fae04f17c10> >03:49:00,268 DEBUG storage.ui: PartitionDevice._setPartedPartition: req2 ; >03:49:00,269 DEBUG storage.ui: device req2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f42550> PedPartition: <_ped.Partition object at 0x7fae04fb3950> >03:49:00,271 DEBUG storage.ui: PartitionDevice._setDisk: sdb1 ; new: sdb ; old: None ; >03:49:00,273 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdb ; >03:49:00,275 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ; >03:49:00,276 DEBUG storage.ui: device sdb1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > 
number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f42910> PedPartition: <_ped.Partition object at 0x7fae04fb3ad0> >03:49:00,277 DEBUG storage.ui: setting req7 new geometry: parted.Geometry instance -- > start: 788480 end: 1837055 length: 1048576 > device: <parted.device.Device object at 0x7fae05b09ad0> PedGeometry: <_ped.Geometry object at 0x7fae04f17ad0> >03:49:00,279 DEBUG storage.ui: PartitionDevice._setPartedPartition: req7 ; >03:49:00,280 DEBUG storage.ui: device req7 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 2 path: /dev/sdb2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f3e950> PedPartition: <_ped.Partition object at 0x7fae04fb3dd0> >03:49:00,282 DEBUG storage.ui: PartitionDevice._setDisk: sdb2 ; new: sdb ; old: None ; >03:49:00,285 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdb ; >03:49:00,287 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb2 ; >03:49:00,288 DEBUG storage.ui: device sdb2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 2 path: /dev/sdb2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f42150> PedPartition: <_ped.Partition object at 0x7fae04fb3e90> >03:49:00,289 DEBUG storage.ui: growing partitions on sdc >03:49:00,289 DEBUG storage.ui: partition sdc1 (27): 0 >03:49:00,290 DEBUG storage.ui: new geometry for sdc1: parted.Geometry instance -- > start: 2048 end: 788479 length: 786432 > device: <parted.device.Device object at 0x7fae05aee390> PedGeometry: <_ped.Geometry object at 0x7fae04f172d0> >03:49:00,290 DEBUG storage.ui: partition sdc2 (33): 0 >03:49:00,291 DEBUG storage.ui: new geometry for sdc2: parted.Geometry instance -- > start: 788480 end: 1837055 length: 1048576 > 
device: <parted.device.Device object at 0x7fae05aee390> PedGeometry: <_ped.Geometry object at 0x7fae04f420d0> >03:49:00,291 DEBUG storage.ui: removing all non-preexisting partitions ['sdc1(id 27)', 'sdc2(id 33)'] from disk(s) ['sdc'] >03:49:00,293 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ; >03:49:00,294 DEBUG storage.ui: device sdc1 new partedPartition None >03:49:00,296 DEBUG storage.ui: PartitionDevice._setDisk: req3 ; new: None ; old: sdc ; >03:49:00,298 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdc ; >03:49:00,300 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc2 ; >03:49:00,300 DEBUG storage.ui: device sdc2 new partedPartition None >03:49:00,303 DEBUG storage.ui: PartitionDevice._setDisk: req8 ; new: None ; old: sdc ; >03:49:00,305 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdc ; >03:49:00,305 DEBUG storage.ui: back from removeNewPartitions >03:49:00,306 DEBUG storage.ui: extended: None >03:49:00,306 DEBUG storage.ui: setting req3 new geometry: parted.Geometry instance -- > start: 2048 end: 788479 length: 786432 > device: <parted.device.Device object at 0x7fae05aee390> PedGeometry: <_ped.Geometry object at 0x7fae04f172d0> >03:49:00,308 DEBUG storage.ui: PartitionDevice._setPartedPartition: req3 ; >03:49:00,309 DEBUG storage.ui: device req3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 1 path: /dev/sdc1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f42b50> PedPartition: <_ped.Partition object at 0x7fae04fb38f0> >03:49:00,312 DEBUG storage.ui: PartitionDevice._setDisk: sdc1 ; new: sdc ; old: None ; >03:49:00,314 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdc ; >03:49:00,316 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ; >03:49:00,317 DEBUG storage.ui: device sdc1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk 
object at 0x7fae05aee6d0> fileSystem: None > number: 1 path: /dev/sdc1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f42cd0> PedPartition: <_ped.Partition object at 0x7fae04fb3f50> >03:49:00,318 DEBUG storage.ui: setting req8 new geometry: parted.Geometry instance -- > start: 788480 end: 1837055 length: 1048576 > device: <parted.device.Device object at 0x7fae05aee390> PedGeometry: <_ped.Geometry object at 0x7fae04f420d0> >03:49:00,320 DEBUG storage.ui: PartitionDevice._setPartedPartition: req8 ; >03:49:00,321 DEBUG storage.ui: device req8 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 2 path: /dev/sdc2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f89b90> PedPartition: <_ped.Partition object at 0x7fae04fb3b30> >03:49:00,323 DEBUG storage.ui: PartitionDevice._setDisk: sdc2 ; new: sdc ; old: None ; >03:49:00,326 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdc ; >03:49:00,328 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc2 ; >03:49:00,329 DEBUG storage.ui: device sdc2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 2 path: /dev/sdc2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04efcc90> PedPartition: <_ped.Partition object at 0x7fae04f8e110> >03:49:00,330 DEBUG storage.ui: growing partitions on sdd >03:49:00,331 DEBUG storage.ui: partition sdd1 (28): 0 >03:49:00,331 DEBUG storage.ui: new geometry for sdd1: parted.Geometry instance -- > start: 2048 end: 788479 length: 786432 > device: <parted.device.Device object at 0x7fae05b09b90> PedGeometry: <_ped.Geometry object at 0x7fae04f17910> >03:49:00,332 DEBUG storage.ui: partition sdd2 (34): 0 >03:49:00,333 DEBUG storage.ui: new geometry for sdd2: parted.Geometry instance -- > 
start: 788480 end: 1837055 length: 1048576 > device: <parted.device.Device object at 0x7fae05b09b90> PedGeometry: <_ped.Geometry object at 0x7fae04efc210> >03:49:00,333 DEBUG storage.ui: removing all non-preexisting partitions ['sdd1(id 28)', 'sdd2(id 34)'] from disk(s) ['sdd'] >03:49:00,336 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ; >03:49:00,336 DEBUG storage.ui: device sdd1 new partedPartition None >03:49:00,338 DEBUG storage.ui: PartitionDevice._setDisk: req4 ; new: None ; old: sdd ; >03:49:00,341 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdd ; >03:49:00,343 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd2 ; >03:49:00,343 DEBUG storage.ui: device sdd2 new partedPartition None >03:49:00,345 DEBUG storage.ui: PartitionDevice._setDisk: req9 ; new: None ; old: sdd ; >03:49:00,347 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdd ; >03:49:00,348 DEBUG storage.ui: back from removeNewPartitions >03:49:00,348 DEBUG storage.ui: extended: None >03:49:00,348 DEBUG storage.ui: setting req4 new geometry: parted.Geometry instance -- > start: 2048 end: 788479 length: 786432 > device: <parted.device.Device object at 0x7fae05b09b90> PedGeometry: <_ped.Geometry object at 0x7fae04f17910> >03:49:00,351 DEBUG storage.ui: PartitionDevice._setPartedPartition: req4 ; >03:49:00,352 DEBUG storage.ui: device req4 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 1 path: /dev/sdd1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f89b50> PedPartition: <_ped.Partition object at 0x7fae04f8e170> >03:49:00,355 DEBUG storage.ui: PartitionDevice._setDisk: sdd1 ; new: sdd ; old: None ; >03:49:00,357 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdd ; >03:49:00,359 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ; >03:49:00,360 DEBUG storage.ui: device sdd1 new partedPartition 
parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 1 path: /dev/sdd1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f17710> PedPartition: <_ped.Partition object at 0x7fae04fb3a70> >03:49:00,361 DEBUG storage.ui: setting req9 new geometry: parted.Geometry instance -- > start: 788480 end: 1837055 length: 1048576 > device: <parted.device.Device object at 0x7fae05b09b90> PedGeometry: <_ped.Geometry object at 0x7fae04efc210> >03:49:00,363 DEBUG storage.ui: PartitionDevice._setPartedPartition: req9 ; >03:49:00,364 DEBUG storage.ui: device req9 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 2 path: /dev/sdd2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f3e950> PedPartition: <_ped.Partition object at 0x7fae04f8e290> >03:49:00,366 DEBUG storage.ui: PartitionDevice._setDisk: sdd2 ; new: sdd ; old: None ; >03:49:00,368 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdd ; >03:49:00,371 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd2 ; >03:49:00,372 DEBUG storage.ui: device sdd2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 2 path: /dev/sdd2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f17c90> PedPartition: <_ped.Partition object at 0x7fae04fb38f0> >03:49:00,373 DEBUG storage.ui: fixing size of non-existent 384MB partition sda1 (25) with non-existent mdmember at 384.00 >03:49:00,373 DEBUG storage.ui: fixing size of non-existent 512MB partition sda2 (31) with non-existent mdmember at 512.00 >03:49:00,374 DEBUG storage.ui: fixing size of non-existent 384MB partition sdb1 (26) with non-existent mdmember at 384.00 >03:49:00,375 DEBUG storage.ui: fixing size of non-existent 512MB 
partition sdb2 (32) with non-existent mdmember at 512.00 >03:49:00,375 DEBUG storage.ui: fixing size of non-existent 384MB partition sdc1 (27) with non-existent mdmember at 384.00 >03:49:00,376 DEBUG storage.ui: fixing size of non-existent 512MB partition sdc2 (33) with non-existent mdmember at 512.00 >03:49:00,377 DEBUG storage.ui: fixing size of non-existent 384MB partition sdd1 (28) with non-existent mdmember at 384.00 >03:49:00,378 DEBUG storage.ui: fixing size of non-existent 512MB partition sdd2 (34) with non-existent mdmember at 512.00 >03:49:00,381 DEBUG storage.ui: Ext4FS.supported: supported: True ; >03:49:00,382 DEBUG storage.ui: getFormat('ext4') returning Ext4FS instance >03:49:00,384 DEBUG storage.ui: PartitionDevice.addChild: kids: 0 ; name: sda2 ; >03:49:00,386 DEBUG storage.ui: PartitionDevice.addChild: kids: 0 ; name: sdb2 ; >03:49:00,388 DEBUG storage.ui: PartitionDevice.addChild: kids: 0 ; name: sdc2 ; >03:49:00,390 DEBUG storage.ui: PartitionDevice.addChild: kids: 0 ; name: sdd2 ; >03:49:00,392 DEBUG storage.ui: MDRaidArrayDevice._setFormat: boot ; current: None ; type: ext4 ; >03:49:00,393 INFO storage.ui: added mdarray boot (id 35) to device tree >03:49:00,394 INFO storage.ui: registered action: [56] Create Device mdarray boot (id 35) >03:49:00,394 DEBUG storage.ui: getFormat('None') returning DeviceFormat instance >03:49:00,395 INFO storage.ui: registered action: [57] Create Format ext4 filesystem mounted at /boot on mdarray boot (id 35) >03:49:00,396 DEBUG storage.ui: raw RAID 1 size == 512.0 >03:49:00,397 INFO storage.ui: Using 0MB superBlockSize >03:49:00,397 DEBUG storage.ui: non-existent RAID 1 size == 512.0 >03:49:00,399 DEBUG storage.ui: raw RAID 1 size == 512.0 >03:49:00,399 INFO storage.ui: Using 0MB superBlockSize >03:49:00,400 DEBUG storage.ui: non-existent RAID 1 size == 512.0 >03:49:00,403 DEBUG blivet: raw RAID 1 size == 512.0 >03:49:00,403 INFO blivet: Using 0MB superBlockSize >03:49:00,404 DEBUG blivet: non-existent RAID 1 
size == 512.0 >03:49:00,406 DEBUG blivet: raw RAID 10 size == 768.0 >03:49:00,407 INFO blivet: Using 0MB superBlockSize >03:49:00,408 DEBUG blivet: non-existent RAID 10 size == 768.0 >03:49:00,422 DEBUG blivet: raw RAID 1 size == 512.0 >03:49:00,422 INFO blivet: Using 0MB superBlockSize >03:49:00,423 DEBUG blivet: non-existent RAID 1 size == 512.0 >03:49:00,426 DEBUG blivet: raw RAID 1 size == 512.0 >03:49:00,427 INFO blivet: Using 0MB superBlockSize >03:49:00,427 DEBUG blivet: non-existent RAID 1 size == 512.0 >03:49:01,890 DEBUG blivet: raw RAID 1 size == 512.0 >03:49:01,891 INFO blivet: Using 0MB superBlockSize >03:49:01,891 DEBUG blivet: non-existent RAID 1 size == 512.0 >03:49:01,900 DEBUG blivet: Ext4FS.supported: supported: True ; >03:49:01,901 DEBUG blivet: getFormat('ext4') returning Ext4FS instance >03:49:01,908 DEBUG storage.ui: instantiating <class 'blivet.devicefactory.MDFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 512, ['sda', 'sdb', 'sdc', 'sdd'], {'encrypted': False, 'raid_level': 'raid1'} >03:49:12,545 DEBUG storage.ui: instantiating <class 'blivet.devicefactory.PartitionFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 6000.0, [], {} >03:49:12,551 DEBUG storage.ui: Blivet.factoryDevice: 2 ; 6000.0 ; mountpoint: / ; disks: [DiskDevice instance (0x7fae05319950) -- > name = sda status = True kids = 2 id = 1 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 0 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sda type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 0 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae127eacb0> > target size = 0 path = /dev/sda 
> format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae053199d0>, DiskDevice instance (0x7fae05b116d0) -- > name = sdb status = True kids = 2 id = 14 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 16 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdb type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 768 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f674a70> > target size = 0 path = /dev/sdb > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05b11750>, DiskDevice instance (0x7fae05aeabd0) -- > name = sdc status = True kids = 2 id = 11 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 32 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdc type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 512 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f674170> > target size = 0 path = /dev/sdc > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05aeac50>, DiskDevice instance (0x7fae05aea190) -- > name = sdd status = True kids = 2 id = 8 > parents = [] > uuid = None size = 12000.0 > format = non-existent 
msdos disklabel > major = 8 minor = 48 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdd type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 256 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae27c59680> > target size = 0 path = /dev/sdd > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05aea210>] ; fstype: ext4 ; encrypted: False ; >03:49:12,553 DEBUG storage.ui: instantiating <class 'blivet.devicefactory.PartitionFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 6000.0, ['sda', 'sdb', 'sdc', 'sdd'], {'mountpoint': '/', 'fstype': 'ext4', 'encrypted': False} >03:49:12,555 DEBUG storage.ui: PartitionFactory.configure: parent_factory: None ; >03:49:12,555 DEBUG storage.ui: starting Blivet copy >03:49:12,600 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:49:12,602 DEBUG storage.ui: device sda1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04f78d10> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04efd1d0> PedPartition: <_ped.Partition object at 0x7fae04fb3dd0> >03:49:12,604 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda2 ; >03:49:12,605 DEBUG storage.ui: device sda2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04f78d10> fileSystem: None > number: 2 path: /dev/sda2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04efd350> PedPartition: <_ped.Partition object at 0x7fae04f8e050> >03:49:12,608 DEBUG 
storage.ui: PartitionDevice._setPartedPartition: sdb1 ; >03:49:12,609 DEBUG storage.ui: device sdb1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04f15790> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04efd450> PedPartition: <_ped.Partition object at 0x7fae04f8e1d0> >03:49:12,611 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb2 ; >03:49:12,612 DEBUG storage.ui: device sdb2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04f15790> fileSystem: None > number: 2 path: /dev/sdb2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04efd5d0> PedPartition: <_ped.Partition object at 0x7fae04f8e350> >03:49:12,615 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ; >03:49:12,616 DEBUG storage.ui: device sdc1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04f17050> fileSystem: None > number: 1 path: /dev/sdc1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04efd6d0> PedPartition: <_ped.Partition object at 0x7fae04f8e3b0> >03:49:12,618 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc2 ; >03:49:12,619 DEBUG storage.ui: device sdc2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04f17050> fileSystem: None > number: 2 path: /dev/sdc2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04efd850> PedPartition: <_ped.Partition object at 0x7fae04f8e410> >03:49:12,622 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ; >03:49:12,623 DEBUG storage.ui: device sdd1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04f983d0> fileSystem: None > number: 1 path: /dev/sdd1 type: 0 > name: None 
active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04efd950> PedPartition: <_ped.Partition object at 0x7fae04f8e0b0> >03:49:12,625 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd2 ; >03:49:12,626 DEBUG storage.ui: device sdd2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04f983d0> fileSystem: None > number: 2 path: /dev/sdd2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04efdad0> PedPartition: <_ped.Partition object at 0x7fae04f8e230> >03:49:12,627 DEBUG storage.ui: finished Blivet copy >03:49:12,630 DEBUG storage.ui: Ext4FS.supported: supported: True ; >03:49:12,630 DEBUG storage.ui: getFormat('ext4') returning Ext4FS instance >03:49:12,633 DEBUG storage.ui: Ext4FS.supported: supported: True ; >03:49:12,633 DEBUG storage.ui: getFormat('ext4') returning Ext4FS instance >03:49:12,636 DEBUG storage.ui: Ext4FS.supported: supported: True ; >03:49:12,637 DEBUG storage.ui: getFormat('ext4') returning Ext4FS instance >03:49:12,639 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sda ; >03:49:12,641 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdb ; >03:49:12,643 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdc ; >03:49:12,646 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdd ; >03:49:12,648 DEBUG storage.ui: PartitionDevice._setFormat: req10 ; >03:49:12,650 DEBUG storage.ui: PartitionDevice._setFormat: req10 ; current: None ; type: ext4 ; >03:49:12,652 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sda ; >03:49:12,654 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sdb ; >03:49:12,657 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sdc ; >03:49:12,659 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sdd ; >03:49:12,660 INFO storage.ui: added partition req10 (id 36) to device tree >03:49:12,661 INFO storage.ui: registered action: [58] Create Device partition req10 
(id 36) >03:49:12,661 DEBUG storage.ui: getFormat('None') returning DeviceFormat instance >03:49:12,662 INFO storage.ui: registered action: [59] Create Format ext4 filesystem mounted at / on partition req10 (id 36) >03:49:12,665 DEBUG storage.ui: DiskDevice.setup: sda ; status: True ; controllable: True ; orig: False ; >03:49:12,667 DEBUG storage.ui: DiskDevice.setup: sdb ; status: True ; controllable: True ; orig: False ; >03:49:12,669 DEBUG storage.ui: DiskDevice.setup: sdc ; status: True ; controllable: True ; orig: False ; >03:49:12,671 DEBUG storage.ui: DiskDevice.setup: sdd ; status: True ; controllable: True ; orig: False ; >03:49:12,672 DEBUG storage.ui: removing all non-preexisting partitions ['req10(id 36)', 'sda1(id 25)', 'sda2(id 31)', 'sdb1(id 26)', 'sdb2(id 32)', 'sdc1(id 27)', 'sdc2(id 33)', 'sdd1(id 28)', 'sdd2(id 34)'] from disk(s) ['sda', 'sdb', 'sdc', 'sdd'] >03:49:12,674 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:49:12,675 DEBUG storage.ui: device sda1 new partedPartition None >03:49:12,677 DEBUG storage.ui: PartitionDevice._setDisk: req1 ; new: None ; old: sda ; >03:49:12,680 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sda ; >03:49:12,682 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda2 ; >03:49:12,682 DEBUG storage.ui: device sda2 new partedPartition None >03:49:12,685 DEBUG storage.ui: PartitionDevice._setDisk: req6 ; new: None ; old: sda ; >03:49:12,687 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sda ; >03:49:12,689 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ; >03:49:12,690 DEBUG storage.ui: device sdb1 new partedPartition None >03:49:12,692 DEBUG storage.ui: PartitionDevice._setDisk: req2 ; new: None ; old: sdb ; >03:49:12,694 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdb ; >03:49:12,696 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb2 ; >03:49:12,697 DEBUG storage.ui: device sdb2 new partedPartition None >03:49:12,699 DEBUG storage.ui: 
PartitionDevice._setDisk: req7 ; new: None ; old: sdb ; >03:49:12,701 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdb ; >03:49:12,703 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ; >03:49:12,704 DEBUG storage.ui: device sdc1 new partedPartition None >03:49:12,706 DEBUG storage.ui: PartitionDevice._setDisk: req3 ; new: None ; old: sdc ; >03:49:12,708 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdc ; >03:49:12,710 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc2 ; >03:49:12,711 DEBUG storage.ui: device sdc2 new partedPartition None >03:49:12,713 DEBUG storage.ui: PartitionDevice._setDisk: req8 ; new: None ; old: sdc ; >03:49:12,715 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdc ; >03:49:12,717 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ; >03:49:12,718 DEBUG storage.ui: device sdd1 new partedPartition None >03:49:12,720 DEBUG storage.ui: PartitionDevice._setDisk: req4 ; new: None ; old: sdd ; >03:49:12,722 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdd ; >03:49:12,725 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd2 ; >03:49:12,725 DEBUG storage.ui: device sdd2 new partedPartition None >03:49:12,728 DEBUG storage.ui: PartitionDevice._setDisk: req9 ; new: None ; old: sdd ; >03:49:12,730 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdd ; >03:49:12,731 DEBUG storage.ui: allocatePartitions: disks=['sda', 'sdb', 'sdc', 'sdd'] ; partitions=['req10(id 36)', 'req1(id 25)', 'req6(id 31)', 'req2(id 26)', 'req7(id 32)', 'req3(id 27)', 'req8(id 33)', 'req4(id 28)', 'req9(id 34)'] >03:49:12,732 DEBUG storage.ui: removing all non-preexisting partitions ['req6(id 31)', 'req7(id 32)', 'req8(id 33)', 'req9(id 34)', 'req1(id 25)', 'req2(id 26)', 'req3(id 27)', 'req4(id 28)', 'req10(id 36)'] from disk(s) ['sda', 'sdb', 'sdc', 'sdd'] >03:49:12,734 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:49:12,737 DEBUG storage.ui: DeviceTree.getDeviceByName 
returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:49:12,737 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:49:12,739 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:49:12,742 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:49:12,742 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:49:12,743 DEBUG storage.ui: allocating partition: req6 ; id: 31 ; disks: ['sda'] ; >boot: False ; primary: False ; size: 512MB ; grow: False ; max_size: 512 >03:49:12,743 DEBUG storage.ui: checking freespace on sda >03:49:12,744 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sda part_type=0 req_size=512MB boot=False best=None grow=False >03:49:12,745 DEBUG storage.ui: current free range is 63-24575999 (11999MB) >03:49:12,746 DEBUG storage.ui: updating use_disk to sda, type: 0 >03:49:12,746 DEBUG storage.ui: new free: 63-24575999 / 11999MB >03:49:12,747 DEBUG storage.ui: new free allows for 0 sectors of growth >03:49:12,747 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:49:12,748 DEBUG storage.ui: created partition sda1 of 512MB and added it to /dev/sda >03:49:12,750 DEBUG storage.ui: PartitionDevice._setPartedPartition: req6 ; >03:49:12,752 DEBUG storage.ui: device req6 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04efddd0> PedPartition: <_ped.Partition object at 0x7fae04fb3d70> >03:49:12,754 DEBUG storage.ui: PartitionDevice._setDisk: sda1 ; new: sda ; old: None ; >03:49:12,757 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sda ; >03:49:12,759 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:49:12,760 DEBUG storage.ui: device sda1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 
0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f42a10> PedPartition: <_ped.Partition object at 0x7fae04fb3cb0> >03:49:12,763 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:49:12,765 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:49:12,766 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:49:12,768 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:49:12,770 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:49:12,771 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:49:12,771 DEBUG storage.ui: allocating partition: req7 ; id: 32 ; disks: ['sdb'] ; >boot: False ; primary: False ; size: 512MB ; grow: False ; max_size: 512 >03:49:12,772 DEBUG storage.ui: checking freespace on sdb >03:49:12,773 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdb part_type=0 req_size=512MB boot=False best=None grow=False >03:49:12,774 DEBUG storage.ui: current free range is 63-24575999 (11999MB) >03:49:12,775 DEBUG storage.ui: updating use_disk to sdb, type: 0 >03:49:12,775 DEBUG storage.ui: new free: 63-24575999 / 11999MB >03:49:12,776 DEBUG storage.ui: new free allows for 0 sectors of growth >03:49:12,777 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:49:12,778 DEBUG storage.ui: created partition sdb1 of 512MB and added it to /dev/sdb >03:49:12,780 DEBUG storage.ui: PartitionDevice._setPartedPartition: req7 ; >03:49:12,781 DEBUG storage.ui: device req7 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f041d0> PedPartition: <_ped.Partition object at 0x7fae04fb3ef0> >03:49:12,783 DEBUG 
storage.ui: PartitionDevice._setDisk: sdb1 ; new: sdb ; old: None ; >03:49:12,786 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdb ; >03:49:12,788 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ; >03:49:12,789 DEBUG storage.ui: device sdb1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f17ed0> PedPartition: <_ped.Partition object at 0x7fae04fb3d70> >03:49:12,792 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:49:12,794 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:49:12,795 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:49:12,797 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:49:12,799 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:49:12,800 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:49:12,800 DEBUG storage.ui: allocating partition: req8 ; id: 33 ; disks: ['sdc'] ; >boot: False ; primary: False ; size: 512MB ; grow: False ; max_size: 512 >03:49:12,801 DEBUG storage.ui: checking freespace on sdc >03:49:12,802 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdc part_type=0 req_size=512MB boot=False best=None grow=False >03:49:12,803 DEBUG storage.ui: current free range is 63-24575999 (11999MB) >03:49:12,804 DEBUG storage.ui: updating use_disk to sdc, type: 0 >03:49:12,805 DEBUG storage.ui: new free: 63-24575999 / 11999MB >03:49:12,805 DEBUG storage.ui: new free allows for 0 sectors of growth >03:49:12,806 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:49:12,807 DEBUG storage.ui: created partition sdc1 of 512MB and added it to /dev/sdc >03:49:12,809 DEBUG storage.ui: PartitionDevice._setPartedPartition: req8 ; >03:49:12,810 DEBUG 
storage.ui: device req8 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None
> number: 1 path: /dev/sdc1 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae04f04450> PedPartition: <_ped.Partition object at 0x7fae04fb3a10>
>03:49:12,812 DEBUG storage.ui: PartitionDevice._setDisk: sdc1 ; new: sdc ; old: None ;
>03:49:12,815 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdc ;
>03:49:12,817 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ;
>03:49:12,818 DEBUG storage.ui: device sdc1 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None
> number: 1 path: /dev/sdc1 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae04f42150> PedPartition: <_ped.Partition object at 0x7fae04fb3a70>
>03:49:12,821 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:12,823 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:12,824 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:12,826 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:12,828 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:12,829 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:12,829 DEBUG storage.ui: allocating partition: req9 ; id: 34 ; disks: ['sdd'] ;
>boot: False ; primary: False ; size: 512MB ; grow: False ; max_size: 512
>03:49:12,830 DEBUG storage.ui: checking freespace on sdd
>03:49:12,831 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdd part_type=0 req_size=512MB boot=False best=None grow=False
>03:49:12,832 DEBUG storage.ui: current free range is 63-24575999 (11999MB)
>03:49:12,832 DEBUG storage.ui: updating use_disk to sdd, type: 0
>03:49:12,833 DEBUG storage.ui: new free: 63-24575999 / 11999MB
>03:49:12,833 DEBUG storage.ui: new free allows for 0 sectors of growth
>03:49:12,834 DEBUG storage.ui: adjusted start sector from 63 to 2048
>03:49:12,835 DEBUG storage.ui: created partition sdd1 of 512MB and added it to /dev/sdd
>03:49:12,837 DEBUG storage.ui: PartitionDevice._setPartedPartition: req9 ;
>03:49:12,838 DEBUG storage.ui: device req9 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None
> number: 1 path: /dev/sdd1 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae04f04090> PedPartition: <_ped.Partition object at 0x7fae04fb3b30>
>03:49:12,840 DEBUG storage.ui: PartitionDevice._setDisk: sdd1 ; new: sdd ; old: None ;
>03:49:12,842 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdd ;
>03:49:12,845 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ;
>03:49:12,846 DEBUG storage.ui: device sdd1 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None
> number: 1 path: /dev/sdd1 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae04f04410> PedPartition: <_ped.Partition object at 0x7fae04fb3f50>
>03:49:12,848 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:12,851 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:12,851 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:12,853 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:12,855 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:12,856 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:12,856 DEBUG storage.ui: allocating partition: req1 ; id: 25 ; disks: ['sda'] ;
>boot: False ; primary: False ; size: 384MB ; grow: False ; max_size: 384
>03:49:12,857 DEBUG storage.ui: checking freespace on sda
>03:49:12,858 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sda part_type=0 req_size=384MB boot=False best=None grow=False
>03:49:12,859 DEBUG storage.ui: current free range is 63-2047 (0MB)
>03:49:12,859 DEBUG storage.ui: current free range is 1050624-24575999 (11487MB)
>03:49:12,860 DEBUG storage.ui: updating use_disk to sda, type: 0
>03:49:12,861 DEBUG storage.ui: new free: 1050624-24575999 / 11487MB
>03:49:12,861 DEBUG storage.ui: new free allows for 0 sectors of growth
>03:49:12,862 DEBUG storage.ui: created partition sda2 of 384MB and added it to /dev/sda
>03:49:12,865 DEBUG storage.ui: PartitionDevice._setPartedPartition: req1 ;
>03:49:12,866 DEBUG storage.ui: device req1 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None
> number: 2 path: /dev/sda2 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae04efde90> PedPartition: <_ped.Partition object at 0x7fae04fb3ad0>
>03:49:12,868 DEBUG storage.ui: PartitionDevice._setDisk: sda2 ; new: sda ; old: None ;
>03:49:12,870 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sda ;
>03:49:12,873 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda2 ;
>03:49:12,874 DEBUG storage.ui: device sda2 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None
> number: 2 path: /dev/sda2 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae04f04490> PedPartition: <_ped.Partition object at 0x7fae04fb3a10>
>03:49:12,876 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:12,878 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:12,879 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:12,881 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:12,883 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:12,884 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:12,884 DEBUG storage.ui: allocating partition: req2 ; id: 26 ; disks: ['sdb'] ;
>boot: False ; primary: False ; size: 384MB ; grow: False ; max_size: 384
>03:49:12,885 DEBUG storage.ui: checking freespace on sdb
>03:49:12,886 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdb part_type=0 req_size=384MB boot=False best=None grow=False
>03:49:12,887 DEBUG storage.ui: current free range is 63-2047 (0MB)
>03:49:12,887 DEBUG storage.ui: current free range is 1050624-24575999 (11487MB)
>03:49:12,888 DEBUG storage.ui: updating use_disk to sdb, type: 0
>03:49:12,889 DEBUG storage.ui: new free: 1050624-24575999 / 11487MB
>03:49:12,889 DEBUG storage.ui: new free allows for 0 sectors of growth
>03:49:12,890 DEBUG storage.ui: created partition sdb2 of 384MB and added it to /dev/sdb
>03:49:12,892 DEBUG storage.ui: PartitionDevice._setPartedPartition: req2 ;
>03:49:12,893 DEBUG storage.ui: device req2 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None
> number: 2 path: /dev/sdb2 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae04efdb50> PedPartition: <_ped.Partition object at 0x7fae04fb3e90>
>03:49:12,896 DEBUG storage.ui: PartitionDevice._setDisk: sdb2 ; new: sdb ; old: None ;
>03:49:12,898 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdb ;
>03:49:12,901 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb2 ;
>03:49:12,902 DEBUG storage.ui: device sdb2 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None
> number: 2 path: /dev/sdb2 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae04efde10> PedPartition: <_ped.Partition object at 0x7fae04fb3ef0>
>03:49:12,904 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:12,907 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:12,907 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:12,910 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:12,912 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:12,913 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:12,913 DEBUG storage.ui: allocating partition: req3 ; id: 27 ; disks: ['sdc'] ;
>boot: False ; primary: False ; size: 384MB ; grow: False ; max_size: 384
>03:49:12,914 DEBUG storage.ui: checking freespace on sdc
>03:49:12,915 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdc part_type=0 req_size=384MB boot=False best=None grow=False
>03:49:12,916 DEBUG storage.ui: current free range is 63-2047 (0MB)
>03:49:12,916 DEBUG storage.ui: current free range is 1050624-24575999 (11487MB)
>03:49:12,917 DEBUG storage.ui: updating use_disk to sdc, type: 0
>03:49:12,918 DEBUG storage.ui: new free: 1050624-24575999 / 11487MB
>03:49:12,918 DEBUG storage.ui: new free allows for 0 sectors of growth
>03:49:12,920 DEBUG storage.ui: created partition sdc2 of 384MB and added it to /dev/sdc
>03:49:12,922 DEBUG storage.ui: PartitionDevice._setPartedPartition: req3 ;
>03:49:12,924 DEBUG storage.ui: device req3 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None
> number: 2 path: /dev/sdc2 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae04eff150> PedPartition: <_ped.Partition object at 0x7fae04f8e2f0>
>03:49:12,926 DEBUG storage.ui: PartitionDevice._setDisk: sdc2 ; new: sdc ; old: None ;
>03:49:12,929 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdc ;
>03:49:12,932 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc2 ;
>03:49:12,933 DEBUG storage.ui: device sdc2 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None
> number: 2 path: /dev/sdc2 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae04eff1d0> PedPartition: <_ped.Partition object at 0x7fae04f8e530>
>03:49:12,935 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:12,937 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:12,938 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:12,940 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:12,942 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:12,943 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:12,943 DEBUG storage.ui: allocating partition: req4 ; id: 28 ; disks: ['sdd'] ;
>boot: False ; primary: False ; size: 384MB ; grow: False ; max_size: 384
>03:49:12,944 DEBUG storage.ui: checking freespace on sdd
>03:49:12,945 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdd part_type=0 req_size=384MB boot=False best=None grow=False
>03:49:12,946 DEBUG storage.ui: current free range is 63-2047 (0MB)
>03:49:12,947 DEBUG storage.ui: current free range is 1050624-24575999 (11487MB)
>03:49:12,947 DEBUG storage.ui: updating use_disk to sdd, type: 0
>03:49:12,948 DEBUG storage.ui: new free: 1050624-24575999 / 11487MB
>03:49:12,948 DEBUG storage.ui: new free allows for 0 sectors of growth
>03:49:12,950 DEBUG storage.ui: created partition sdd2 of 384MB and added it to /dev/sdd
>03:49:12,952 DEBUG storage.ui: PartitionDevice._setPartedPartition: req4 ;
>03:49:12,953 DEBUG storage.ui: device req4 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None
> number: 2 path: /dev/sdd2 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae04eff610> PedPartition: <_ped.Partition object at 0x7fae04f8e470>
>03:49:12,955 DEBUG storage.ui: PartitionDevice._setDisk: sdd2 ; new: sdd ; old: None ;
>03:49:12,958 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdd ;
>03:49:12,960 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd2 ;
>03:49:12,961 DEBUG storage.ui: device sdd2 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None
> number: 2 path: /dev/sdd2 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae04eff690> PedPartition: <_ped.Partition object at 0x7fae04f8e5f0>
>03:49:12,964 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:12,966 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:12,966 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:12,969 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:12,971 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:12,972 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:12,974 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:12,976 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:12,977 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:12,980 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:12,982 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:12,982 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:12,985 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:12,987 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:12,987 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:12,990 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:12,992 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:12,993 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:12,995 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:12,997 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:12,998 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:13,000 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:13,002 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:13,003 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:13,003 DEBUG storage.ui: allocating partition: req10 ; id: 36 ; disks: ['sda', 'sdb', 'sdc', 'sdd'] ;
>boot: False ; primary: False ; size: 1MB ; grow: True ; max_size: 6000.0
>03:49:13,004 DEBUG storage.ui: checking freespace on sda
>03:49:13,005 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sda part_type=0 req_size=1MB boot=False best=None grow=True
>03:49:13,006 DEBUG storage.ui: current free range is 63-2047 (0MB)
>03:49:13,007 DEBUG storage.ui: current free range is 1837056-24575999 (11103MB)
>03:49:13,008 DEBUG storage.ui: evaluating growth potential for new layout
>03:49:13,008 DEBUG storage.ui: calculating growth for disk /dev/sdd
>03:49:13,009 DEBUG storage.ui: adding request 34 to chunk 24575937 (63-24575999) on /dev/sdd
>03:49:13,010 DEBUG storage.ui: adding request 28 to chunk 24575937 (63-24575999) on /dev/sdd
>03:49:13,010 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999
>sectorSize = 512
>
>03:49:13,011 DEBUG storage.ui: req: PartitionRequest instance --
>id = 34 name = sdd1 growable = False
>base = 1048576 growth = 0 max_grow = 0
>done = True
>03:49:13,012 DEBUG storage.ui: req: PartitionRequest instance --
>id = 28 name = sdd2 growable = False
>base = 786432 growth = 0 max_grow = 0
>done = True
>03:49:13,012 DEBUG storage.ui: request 34 (sdd1) growth: 0 (0MB) size: 512MB
>03:49:13,013 DEBUG storage.ui: request 28 (sdd2) growth: 0 (0MB) size: 384MB
>03:49:13,014 DEBUG storage.ui: disk /dev/sdd growth: 0 (0MB)
>03:49:13,014 DEBUG storage.ui: calculating growth for disk /dev/sdb
>03:49:13,015 DEBUG storage.ui: adding request 32 to chunk 24575937 (63-24575999) on /dev/sdb
>03:49:13,016 DEBUG storage.ui: adding request 26 to chunk 24575937 (63-24575999) on /dev/sdb
>03:49:13,016 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999
>sectorSize = 512
>
>03:49:13,017 DEBUG storage.ui: req: PartitionRequest instance --
>id = 32 name = sdb1 growable = False
>base = 1048576 growth = 0 max_grow = 0
>done = True
>03:49:13,018 DEBUG storage.ui: req: PartitionRequest instance --
>id = 26 name = sdb2 growable = False
>base = 786432 growth = 0 max_grow = 0
>done = True
>03:49:13,018 DEBUG storage.ui: request 32 (sdb1) growth: 0 (0MB) size: 512MB
>03:49:13,019 DEBUG storage.ui: request 26 (sdb2) growth: 0 (0MB) size: 384MB
>03:49:13,019 DEBUG storage.ui: disk /dev/sdb growth: 0 (0MB)
>03:49:13,020 DEBUG storage.ui: calculating growth for disk /dev/sdc
>03:49:13,021 DEBUG storage.ui: adding request 33 to chunk 24575937 (63-24575999) on /dev/sdc
>03:49:13,021 DEBUG storage.ui: adding request 27 to chunk 24575937 (63-24575999) on /dev/sdc
>03:49:13,022 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999
>sectorSize = 512
>
>03:49:13,023 DEBUG storage.ui: req: PartitionRequest instance --
>id = 33 name = sdc1 growable = False
>base = 1048576 growth = 0 max_grow = 0
>done = True
>03:49:13,023 DEBUG storage.ui: req: PartitionRequest instance --
>id = 27 name = sdc2 growable = False
>base = 786432 growth = 0 max_grow = 0
>done = True
>03:49:13,024 DEBUG storage.ui: request 33 (sdc1) growth: 0 (0MB) size: 512MB
>03:49:13,024 DEBUG storage.ui: request 27 (sdc2) growth: 0 (0MB) size: 384MB
>03:49:13,025 DEBUG storage.ui: disk /dev/sdc growth: 0 (0MB)
>03:49:13,025 DEBUG storage.ui: calculating growth for disk /dev/sda
>03:49:13,028 DEBUG storage.ui: PartitionDevice._setPartedPartition: req10 ;
>03:49:13,028 DEBUG storage.ui: device req10 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None
> number: 3 path: /dev/sda3 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae04eff850> PedPartition: <_ped.Partition object at 0x7fae04fb3e90>
>03:49:13,030 DEBUG storage.ui: PartitionDevice._setDisk: sda3 ; new: sda ; old: None ;
>03:49:13,032 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sda ;
>03:49:13,033 DEBUG storage.ui: adding request 31 to chunk 24575937 (63-24575999) on /dev/sda
>03:49:13,033 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda
>03:49:13,034 DEBUG storage.ui: adding request 36 to chunk 24575937 (63-24575999) on /dev/sda
>03:49:13,034 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999
>sectorSize = 512
>
>03:49:13,035 DEBUG storage.ui: req: PartitionRequest instance --
>id = 31 name = sda1 growable = False
>base = 1048576 growth = 0 max_grow = 0
>done = True
>03:49:13,035 DEBUG storage.ui: req: PartitionRequest instance --
>id = 25 name = sda2 growable = False
>base = 786432 growth = 0 max_grow = 0
>done = True
>03:49:13,036 DEBUG storage.ui: req: PartitionRequest instance --
>id = 36 name = sda3 growable = True
>base = 2048 growth = 0 max_grow = 12285952
>done = False
>03:49:13,036 DEBUG storage.ui: 1 requests and 22738881 (11102MB) left in chunk
>03:49:13,037 DEBUG storage.ui: adding 22738881 (11102MB) to 36 (sda3)
>03:49:13,037 DEBUG storage.ui: taking back 10452929 (5103MB) from 36 (sda3)
>03:49:13,037 DEBUG storage.ui: new grow amount for request 36 (sda3) is 12285952 units, or 5999MB
>03:49:13,038 DEBUG storage.ui: request 31 (sda1) growth: 0 (0MB) size: 512MB
>03:49:13,038 DEBUG storage.ui: request 25 (sda2) growth: 0 (0MB) size: 384MB
>03:49:13,038 DEBUG storage.ui: request 36 (sda3) growth: 12285952 (5999MB) size: 6000MB
>03:49:13,039 DEBUG storage.ui: disk /dev/sda growth: 12285952 (5999MB)
>03:49:13,041 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda3 ;
>03:49:13,041 DEBUG storage.ui: device sda3 new partedPartition None
>03:49:13,043 DEBUG storage.ui: PartitionDevice._setDisk: req10 ; new: None ; old: sda ;
>03:49:13,045 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sda ;
>03:49:13,045 DEBUG storage.ui: total growth: 12285952 sectors
>03:49:13,046 DEBUG storage.ui: updating use_disk to sda, type: 0
>03:49:13,046 DEBUG storage.ui: new free: 1837056-24575999 / 11103MB
>03:49:13,047 DEBUG storage.ui: new free allows for 12285952 sectors of growth
>03:49:13,047 DEBUG storage.ui: checking freespace on sdb
>03:49:13,048 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdb part_type=0 req_size=1MB boot=False best=None grow=True
>03:49:13,048 DEBUG storage.ui: current free range is 63-2047 (0MB)
>03:49:13,049 DEBUG storage.ui: current free range is 1837056-24575999 (11103MB)
>03:49:13,049 DEBUG storage.ui: evaluating growth potential for new layout
>03:49:13,050 DEBUG storage.ui: calculating growth for disk /dev/sdd
>03:49:13,050 DEBUG storage.ui: adding request 34 to chunk 24575937 (63-24575999) on /dev/sdd
>03:49:13,050 DEBUG storage.ui: adding request 28 to chunk 24575937 (63-24575999) on /dev/sdd
>03:49:13,051 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999
>sectorSize = 512
>
>03:49:13,051 DEBUG storage.ui: req: PartitionRequest instance --
>id = 34 name = sdd1 growable = False
>base = 1048576 growth = 0 max_grow = 0
>done = True
>03:49:13,052 DEBUG storage.ui: req: PartitionRequest instance --
>id = 28 name = sdd2 growable = False
>base = 786432 growth = 0 max_grow = 0
>done = True
>03:49:13,052 DEBUG storage.ui: request 34 (sdd1) growth: 0 (0MB) size: 512MB
>03:49:13,053 DEBUG storage.ui: request 28 (sdd2) growth: 0 (0MB) size: 384MB
>03:49:13,053 DEBUG storage.ui: disk /dev/sdd growth: 0 (0MB)
>03:49:13,053 DEBUG storage.ui: calculating growth for disk /dev/sdb
>03:49:13,056 DEBUG storage.ui: PartitionDevice._setPartedPartition: req10 ;
>03:49:13,056 DEBUG storage.ui: device req10 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None
> number: 3 path: /dev/sdb3 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae04effb90> PedPartition: <_ped.Partition object at 0x7fae04f8e2f0>
>03:49:13,058 DEBUG storage.ui: PartitionDevice._setDisk: sdb3 ; new: sdb ; old: None ;
>03:49:13,060 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdb ;
>03:49:13,061 DEBUG storage.ui: adding request 32 to chunk 24575937 (63-24575999) on /dev/sdb
>03:49:13,061 DEBUG storage.ui: adding request 26 to chunk 24575937 (63-24575999) on /dev/sdb
>03:49:13,062 DEBUG storage.ui: adding request 36 to chunk 24575937 (63-24575999) on /dev/sdb
>03:49:13,062 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999
>sectorSize = 512
>
>03:49:13,063 DEBUG storage.ui: req: PartitionRequest instance --
>id = 32 name = sdb1 growable = False
>base = 1048576 growth = 0 max_grow = 0
>done = True
>03:49:13,063 DEBUG storage.ui: req: PartitionRequest instance --
>id = 26 name = sdb2 growable = False
>base = 786432 growth = 0 max_grow = 0
>done = True
>03:49:13,064 DEBUG storage.ui: req: PartitionRequest instance --
>id = 36 name = sdb3 growable = True
>base = 2048 growth = 0 max_grow = 12285952
>done = False
>03:49:13,064 DEBUG storage.ui: 1 requests and 22738881 (11102MB) left in chunk
>03:49:13,064 DEBUG storage.ui: adding 22738881 (11102MB) to 36 (sdb3)
>03:49:13,065 DEBUG storage.ui: taking back 10452929 (5103MB) from 36 (sdb3)
>03:49:13,065 DEBUG storage.ui: new grow amount for request 36 (sdb3) is 12285952 units, or 5999MB
>03:49:13,066 DEBUG storage.ui: request 32 (sdb1) growth: 0 (0MB) size: 512MB
>03:49:13,066 DEBUG storage.ui: request 26 (sdb2) growth: 0 (0MB) size: 384MB
>03:49:13,066 DEBUG storage.ui: request 36 (sdb3) growth: 12285952 (5999MB) size: 6000MB
>03:49:13,067 DEBUG storage.ui: disk /dev/sdb growth: 12285952 (5999MB)
>03:49:13,067 DEBUG storage.ui: calculating growth for disk /dev/sdc
>03:49:13,067 DEBUG storage.ui: adding request 33 to chunk 24575937 (63-24575999) on /dev/sdc
>03:49:13,068 DEBUG storage.ui: adding request 27 to chunk 24575937 (63-24575999) on /dev/sdc
>03:49:13,068 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999
>sectorSize = 512
>
>03:49:13,069 DEBUG storage.ui: req: PartitionRequest instance --
>id = 33 name = sdc1 growable = False
>base = 1048576 growth = 0 max_grow = 0
>done = True
>03:49:13,069 DEBUG storage.ui: req: PartitionRequest instance --
>id = 27 name = sdc2 growable = False
>base = 786432 growth = 0 max_grow = 0
>done = True
>03:49:13,070 DEBUG storage.ui: request 33 (sdc1) growth: 0 (0MB) size: 512MB
>03:49:13,070 DEBUG storage.ui: request 27 (sdc2) growth: 0 (0MB) size: 384MB
>03:49:13,070 DEBUG storage.ui: disk /dev/sdc growth: 0 (0MB)
>03:49:13,071 DEBUG storage.ui: calculating growth for disk /dev/sda
>03:49:13,071 DEBUG storage.ui: adding request 31 to chunk 24575937 (63-24575999) on /dev/sda
>03:49:13,072 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda
>03:49:13,072 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999
>sectorSize = 512
>
>03:49:13,073 DEBUG storage.ui: req: PartitionRequest instance --
>id = 31 name = sda1 growable = False
>base = 1048576 growth = 0 max_grow = 0
>done = True
>03:49:13,073 DEBUG storage.ui: req: PartitionRequest instance --
>id = 25 name = sda2 growable = False
>base = 786432 growth = 0 max_grow = 0
>done = True
>03:49:13,073 DEBUG storage.ui: request 31 (sda1) growth: 0 (0MB) size: 512MB
>03:49:13,074 DEBUG storage.ui: request 25 (sda2) growth: 0 (0MB) size: 384MB
>03:49:13,074 DEBUG storage.ui: disk /dev/sda growth: 0 (0MB)
>03:49:13,076 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb3 ;
>03:49:13,076 DEBUG storage.ui: device sdb3 new partedPartition None
>03:49:13,079 DEBUG storage.ui: PartitionDevice._setDisk: req10 ; new: None ; old: sdb ;
>03:49:13,081 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sdb ;
>03:49:13,081 DEBUG storage.ui: total growth: 12285952 sectors
>03:49:13,081 DEBUG storage.ui: keeping old free: 12285952 <= 12285952
>03:49:13,082 DEBUG storage.ui: checking freespace on sdc
>03:49:13,082 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdc part_type=0 req_size=1MB boot=False best=None grow=True
>03:49:13,083 DEBUG storage.ui: current free range is 63-2047 (0MB)
>03:49:13,084 DEBUG storage.ui: current free range is 1837056-24575999 (11103MB)
>03:49:13,084 DEBUG storage.ui: evaluating growth potential for new layout
>03:49:13,084 DEBUG storage.ui: calculating growth for disk /dev/sdd
>03:49:13,085 DEBUG storage.ui: adding request 34 to chunk 24575937 (63-24575999) on /dev/sdd
>03:49:13,085 DEBUG storage.ui: adding request 28 to chunk 24575937 (63-24575999) on /dev/sdd
>03:49:13,086 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999
>sectorSize = 512
>
>03:49:13,086 DEBUG storage.ui: req: PartitionRequest instance --
>id = 34 name = sdd1 growable = False
>base = 1048576 growth = 0 max_grow = 0
>done = True
>03:49:13,087 DEBUG storage.ui: req: PartitionRequest instance --
>id = 28 name = sdd2 growable = False
>base = 786432 growth = 0 max_grow = 0
>done = True
>03:49:13,087 DEBUG storage.ui: request 34 (sdd1) growth: 0 (0MB) size: 512MB
>03:49:13,087 DEBUG storage.ui: request 28 (sdd2) growth: 0 (0MB) size: 384MB
>03:49:13,088 DEBUG storage.ui: disk /dev/sdd growth: 0 (0MB)
>03:49:13,088 DEBUG storage.ui: calculating growth for disk /dev/sdb
>03:49:13,089 DEBUG storage.ui: adding request 32 to chunk 24575937 (63-24575999) on /dev/sdb
>03:49:13,089 DEBUG storage.ui: adding request 26 to chunk 24575937 (63-24575999) on /dev/sdb
>03:49:13,090 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999
>sectorSize = 512
>
>03:49:13,090 DEBUG storage.ui: req: PartitionRequest instance --
>id = 32 name = sdb1 growable = False
>base = 1048576 growth = 0 max_grow = 0
>done = True
>03:49:13,090 DEBUG storage.ui: req: PartitionRequest instance --
>id = 26 name = sdb2 growable = False
>base = 786432 growth = 0 max_grow = 0
>done = True
>03:49:13,091 DEBUG storage.ui: request 32 (sdb1) growth: 0 (0MB) size: 512MB
>03:49:13,091 DEBUG storage.ui: request 26 (sdb2) growth: 0 (0MB) size: 384MB
>03:49:13,092 DEBUG storage.ui: disk /dev/sdb growth: 0 (0MB)
>03:49:13,092 DEBUG storage.ui: calculating growth for disk /dev/sdc
>03:49:13,094 DEBUG storage.ui: PartitionDevice._setPartedPartition: req10 ;
>03:49:13,095 DEBUG storage.ui: device req10 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None
> number: 3 path: /dev/sdc3 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae04effd90> PedPartition: <_ped.Partition object at 0x7fae04f8e6b0>
>03:49:13,097 DEBUG storage.ui: PartitionDevice._setDisk: sdc3 ; new: sdc ; old: None ;
>03:49:13,099 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdc ;
>03:49:13,100 DEBUG storage.ui: adding request 33 to chunk 24575937 (63-24575999) on /dev/sdc
>03:49:13,100 DEBUG storage.ui: adding request 27 to chunk 24575937 (63-24575999) on /dev/sdc
>03:49:13,101 DEBUG storage.ui: adding request 36 to chunk 24575937 (63-24575999) on /dev/sdc
>03:49:13,101 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999
>sectorSize = 512
>
>03:49:13,101 DEBUG storage.ui: req: PartitionRequest instance --
>id = 33 name = sdc1 growable = False
>base = 1048576 growth = 0 max_grow = 0
>done = True
>03:49:13,102 DEBUG storage.ui: req: PartitionRequest instance --
>id = 27 name = sdc2 growable = False
>base = 786432 growth = 0 max_grow = 0
>done = True
>03:49:13,102 DEBUG storage.ui: req: PartitionRequest instance --
>id = 36 name = sdc3 growable = True
>base = 2048 growth = 0 max_grow = 12285952
>done = False
>03:49:13,103 DEBUG storage.ui: 1 requests and 22738881 (11102MB) left in chunk
>03:49:13,103 DEBUG storage.ui: adding 22738881 (11102MB) to 36 (sdc3)
>03:49:13,103 DEBUG storage.ui: taking back 10452929 (5103MB) from 36 (sdc3)
>03:49:13,104 DEBUG storage.ui: new grow amount for request 36 (sdc3) is 12285952 units, or 5999MB
>03:49:13,104 DEBUG storage.ui: request 33 (sdc1) growth: 0 (0MB) size: 512MB
>03:49:13,105 DEBUG storage.ui: request 27 (sdc2) growth: 0 (0MB) size: 384MB
>03:49:13,105 DEBUG storage.ui: request 36 (sdc3) growth: 12285952 (5999MB) size: 6000MB
>03:49:13,105 DEBUG storage.ui: disk /dev/sdc growth: 12285952 (5999MB)
>03:49:13,106 DEBUG storage.ui: calculating growth for disk /dev/sda
>03:49:13,106 DEBUG storage.ui: adding request 31 to chunk 24575937 (63-24575999) on /dev/sda
>03:49:13,107 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda
>03:49:13,107 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999
>sectorSize = 512
>
>03:49:13,107 DEBUG storage.ui: req: PartitionRequest instance --
>id = 31 name = sda1 growable = False
>base = 1048576 growth = 0 max_grow = 0
>done = True
>03:49:13,108 DEBUG storage.ui: req: PartitionRequest instance --
>id = 25 name = sda2 growable = False
>base = 786432 growth = 0 max_grow = 0
>done = True
>03:49:13,108 DEBUG storage.ui: request 31 (sda1) growth: 0 (0MB) size: 512MB
>03:49:13,109 DEBUG storage.ui: request 25 (sda2) growth: 0 (0MB) size: 384MB
>03:49:13,109 DEBUG storage.ui: disk /dev/sda growth: 0 (0MB)
>03:49:13,111 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc3 ;
>03:49:13,111 DEBUG storage.ui: device sdc3 new partedPartition None
>03:49:13,113 DEBUG storage.ui: PartitionDevice._setDisk: req10 ; new: None ; old: sdc ;
>03:49:13,115 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sdc ;
>03:49:13,116 DEBUG storage.ui: total growth: 12285952 sectors
>03:49:13,116 DEBUG storage.ui: keeping old free: 12285952 <= 12285952
>03:49:13,116 DEBUG storage.ui: checking freespace on sdd
>03:49:13,117 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdd part_type=0 req_size=1MB boot=False best=None grow=True
>03:49:13,118 DEBUG storage.ui: current free range is 63-2047 (0MB)
>03:49:13,118 DEBUG storage.ui: current free range is 1837056-24575999 (11103MB)
>03:49:13,119 DEBUG storage.ui: evaluating growth potential for new layout
>03:49:13,119 DEBUG storage.ui: calculating growth for disk /dev/sdd
>03:49:13,122 DEBUG storage.ui: PartitionDevice._setPartedPartition: req10 ;
>03:49:13,122 DEBUG storage.ui: device req10 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None
> number: 3 path: /dev/sdd3 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae04efff90> PedPartition: <_ped.Partition object at 0x7fae04f8e710>
>03:49:13,125 DEBUG storage.ui: PartitionDevice._setDisk: sdd3 ; new: sdd ; old: None ;
>03:49:13,127 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdd ;
>03:49:13,127 DEBUG storage.ui: adding request 34 to chunk 24575937 (63-24575999) on /dev/sdd
>03:49:13,128 DEBUG storage.ui: adding request 28 to chunk 24575937 (63-24575999) on /dev/sdd
>03:49:13,128 DEBUG storage.ui: adding request 36 to chunk 24575937 (63-24575999) on /dev/sdd
>03:49:13,128 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999
>sectorSize = 512
>
>03:49:13,129 DEBUG storage.ui: req: PartitionRequest instance --
>id = 34 name = sdd1 growable = False
>base = 1048576 growth = 0 max_grow = 0
>done = True
>03:49:13,129 DEBUG storage.ui: req: PartitionRequest instance --
>id = 28 name = sdd2 growable = False
>base = 786432 growth = 0 max_grow = 0
>done = True
>03:49:13,130 DEBUG storage.ui: req: PartitionRequest instance --
>id = 36 name = sdd3 growable = True
>base = 2048 growth = 0 max_grow = 12285952
>done = False
>03:49:13,130 DEBUG storage.ui: 1 requests and 22738881 (11102MB) left in chunk
>03:49:13,130 DEBUG storage.ui: adding 22738881 (11102MB) to 36 (sdd3)
>03:49:13,131 DEBUG storage.ui: taking back 10452929 (5103MB) from 36 (sdd3)
>03:49:13,131 DEBUG storage.ui: new grow amount for request 36 (sdd3) is 12285952 units, or 5999MB
>03:49:13,132 DEBUG storage.ui: request 34 (sdd1) growth: 0 (0MB) size: 512MB
>03:49:13,132 DEBUG storage.ui: request 28 (sdd2) growth: 0 (0MB) size: 384MB
>03:49:13,132 DEBUG storage.ui: request 36 (sdd3) growth: 12285952 (5999MB) size: 6000MB
>03:49:13,133 DEBUG storage.ui: disk /dev/sdd growth: 12285952 (5999MB)
>03:49:13,133 DEBUG storage.ui: calculating growth for disk /dev/sdb
>03:49:13,134 DEBUG storage.ui: adding request 32 to chunk 24575937 (63-24575999) on /dev/sdb
>03:49:13,134 DEBUG storage.ui: adding request 26 to chunk 24575937 (63-24575999) on /dev/sdb
>03:49:13,134 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999
>sectorSize = 512
>
>03:49:13,135 DEBUG storage.ui: req: PartitionRequest instance --
>id = 32 name = sdb1 growable = False
>base = 1048576 growth = 0 max_grow = 0
>done = True
>03:49:13,135 DEBUG storage.ui: req: PartitionRequest instance --
>id = 26 name = sdb2 growable = False
>base = 786432 growth = 0 max_grow = 0
>done = True
>03:49:13,136 DEBUG storage.ui: request 32 (sdb1) growth: 0 (0MB) size: 512MB
>03:49:13,136 DEBUG storage.ui: request 26 (sdb2) growth: 0 (0MB) size: 384MB
>03:49:13,136 DEBUG storage.ui: disk /dev/sdb growth: 0 (0MB)
>03:49:13,137 DEBUG storage.ui: calculating growth for disk /dev/sdc
>03:49:13,137 DEBUG storage.ui: adding request 33 to chunk 24575937 (63-24575999) on /dev/sdc
>03:49:13,138 DEBUG storage.ui: adding request 27 to chunk 24575937 (63-24575999) on /dev/sdc
>03:49:13,138 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999
>sectorSize = 512
>
>03:49:13,139 DEBUG storage.ui: req: PartitionRequest instance --
>id = 33 name = sdc1 growable = False
>base = 1048576 growth = 0 max_grow = 0
>done = True
>03:49:13,139 DEBUG storage.ui: req: PartitionRequest instance --
>id = 27 name = sdc2 growable = False
>base = 786432 growth = 0 max_grow = 0
>done = True
>03:49:13,140 DEBUG storage.ui: request 33 (sdc1) growth: 0 (0MB) size: 512MB
>03:49:13,140 DEBUG storage.ui: request 27 (sdc2) growth: 0 (0MB) size: 384MB
>03:49:13,140 DEBUG storage.ui: disk /dev/sdc growth: 0 (0MB)
>03:49:13,141 DEBUG storage.ui: calculating growth for disk /dev/sda
>03:49:13,141 DEBUG storage.ui: adding request 31 to chunk 24575937 (63-24575999) on /dev/sda
>03:49:13,142 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda
>03:49:13,142 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999
>sectorSize = 512
>
>03:49:13,143 DEBUG storage.ui: req: PartitionRequest instance --
>id = 31 name = sda1 growable = False
>base = 1048576 growth = 0 max_grow = 0
>done = True
>03:49:13,143 DEBUG storage.ui: req: PartitionRequest instance --
>id = 25 name = sda2 growable = False
>base = 786432 growth = 0 max_grow = 0
>done = True
>03:49:13,143 DEBUG storage.ui: request 31 (sda1) growth: 0 (0MB) size: 512MB
>03:49:13,144 DEBUG storage.ui: request 25 (sda2) growth: 0 (0MB) size: 384MB
>03:49:13,144 DEBUG storage.ui: disk /dev/sda growth: 0 (0MB)
>03:49:13,146 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd3 ;
>03:49:13,146 DEBUG storage.ui: device sdd3 new partedPartition None
>03:49:13,149 DEBUG storage.ui: PartitionDevice._setDisk: req10 ; new: None ; old: sdd ;
>03:49:13,151 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sdd ;
>03:49:13,151 DEBUG storage.ui: total growth: 12285952 sectors
>03:49:13,152 DEBUG storage.ui: keeping old free: 12285952 <= 12285952
>03:49:13,153 DEBUG storage.ui: created partition sda3 of 1MB and added it to /dev/sda
>03:49:13,155 DEBUG storage.ui: PartitionDevice._setPartedPartition: req10 ;
>03:49:13,155 DEBUG storage.ui: device req10 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None
> number: 3 path: /dev/sda3 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae04eff950> PedPartition: <_ped.Partition object at 0x7fae04fb3e90>
>03:49:13,157 DEBUG storage.ui: PartitionDevice._setDisk: sda3 ; new: sda ; old: None ;
>03:49:13,159 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sda ;
>03:49:13,161 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda3 ;
>03:49:13,162 DEBUG storage.ui: device sda3 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None
> number: 3 path: /dev/sda3 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae04eff810> PedPartition: <_ped.Partition object at 0x7fae04f8e2f0>
>03:49:13,163 DEBUG storage.ui: growPartitions: disks=['sda', 'sdb', 'sdc', 'sdd'], partitions=['sda3(id 36)', 'sda2(id 25)', 'sda1(id 31)', 'sdb2(id 26)', 'sdb1(id 32)', 'sdc2(id 27)', 'sdc1(id 33)', 'sdd2(id 28)', 'sdd1(id 34)']
>03:49:13,163 DEBUG storage.ui: growable partitions are ['sda3']
>03:49:13,164 DEBUG storage.ui: adding request 36 to chunk 24575937 (63-24575999) on /dev/sda
>03:49:13,164 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda
>03:49:13,165 DEBUG storage.ui: adding request 31 to chunk 24575937 (63-24575999) on /dev/sda
>03:49:13,165 DEBUG storage.ui: disk sda has 1 chunks
>03:49:13,166 DEBUG storage.ui: adding request 26 to chunk 24575937 (63-24575999) on /dev/sdb
>03:49:13,166 DEBUG storage.ui: adding request 32 to chunk 24575937 (63-24575999) on /dev/sdb
>03:49:13,166 DEBUG storage.ui: disk sdb has 1 chunks
>03:49:13,167 DEBUG storage.ui: adding request 27 to chunk 24575937 (63-24575999) on /dev/sdc
>03:49:13,167 DEBUG storage.ui: adding request 33 to chunk 24575937 (63-24575999) on /dev/sdc
>03:49:13,168 DEBUG storage.ui: disk sdc has 1 chunks
>03:49:13,168 DEBUG storage.ui: adding request 28 to chunk 24575937 (63-24575999) on /dev/sdd
>03:49:13,169 DEBUG storage.ui: adding request 34 to chunk 24575937 (63-24575999) on /dev/sdd
>03:49:13,169 DEBUG storage.ui: disk sdd has 1 chunks
>03:49:13,170 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999
>sectorSize = 512
>
>03:49:13,170 DEBUG storage.ui: req: PartitionRequest instance --
>id = 31 name = sda1 growable = False
>base = 1048576 growth = 0 max_grow = 0
>done = True
>03:49:13,171 DEBUG storage.ui: req: PartitionRequest instance --
>id = 25 name = sda2 growable = False
>base = 786432 growth = 0 max_grow = 0
>done = True
>03:49:13,171 DEBUG storage.ui: req: PartitionRequest instance --
>id = 36 name = sda3 growable = True
>base = 2048 growth = 0 max_grow = 12285952
>done = False
>03:49:13,171 DEBUG storage.ui: 1 requests and 22738881 (11102MB) left in chunk
>03:49:13,172 DEBUG storage.ui: adding 22738881 (11102MB) to 36 (sda3)
>03:49:13,172 DEBUG storage.ui: taking back 10452929 (5103MB) from 36 (sda3)
>03:49:13,173 DEBUG storage.ui: new grow amount for request 36 (sda3) is 12285952 units, or 5999MB
>03:49:13,173 DEBUG storage.ui: growing partitions on sda
>03:49:13,174 DEBUG storage.ui: partition sda1 (31): 0
>03:49:13,174 DEBUG storage.ui: new geometry for sda1: parted.Geometry instance --
> start: 2048 end: 1050623 length: 1048576
> device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae04effa10>
>03:49:13,175 DEBUG storage.ui: partition sda2 (25): 0 >03:49:13,175 DEBUG storage.ui: new geometry for sda2: parted.Geometry instance -- > start: 1050624 end: 1837055 length: 786432 > device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae04effb50> >03:49:13,175 DEBUG storage.ui: partition sda3 (36): 0 >03:49:13,176 DEBUG storage.ui: new geometry for sda3: parted.Geometry instance -- > start: 1837056 end: 14125055 length: 12288000 > device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae04ef7190> >03:49:13,176 DEBUG storage.ui: removing all non-preexisting partitions ['sda1(id 31)', 'sda2(id 25)', 'sda3(id 36)'] from disk(s) ['sda'] >03:49:13,178 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:49:13,179 DEBUG storage.ui: device sda1 new partedPartition None >03:49:13,180 DEBUG storage.ui: PartitionDevice._setDisk: req6 ; new: None ; old: sda ; >03:49:13,182 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sda ; >03:49:13,184 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda2 ; >03:49:13,185 DEBUG storage.ui: device sda2 new partedPartition None >03:49:13,187 DEBUG storage.ui: PartitionDevice._setDisk: req1 ; new: None ; old: sda ; >03:49:13,189 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sda ; >03:49:13,191 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda3 ; >03:49:13,191 DEBUG storage.ui: device sda3 new partedPartition None >03:49:13,193 DEBUG storage.ui: PartitionDevice._setDisk: req10 ; new: None ; old: sda ; >03:49:13,195 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sda ; >03:49:13,195 DEBUG storage.ui: back from removeNewPartitions >03:49:13,196 DEBUG storage.ui: extended: None >03:49:13,196 DEBUG storage.ui: setting req6 new geometry: parted.Geometry instance -- > start: 2048 end: 1050623 length: 1048576 > device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object 
at 0x7fae04effa10> >03:49:13,198 DEBUG storage.ui: PartitionDevice._setPartedPartition: req6 ; >03:49:13,199 DEBUG storage.ui: device req6 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04ef72d0> PedPartition: <_ped.Partition object at 0x7fae04fb3b30> >03:49:13,201 DEBUG storage.ui: PartitionDevice._setDisk: sda1 ; new: sda ; old: None ; >03:49:13,203 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sda ; >03:49:13,205 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:49:13,206 DEBUG storage.ui: device sda1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04ef7410> PedPartition: <_ped.Partition object at 0x7fae04fb3cb0> >03:49:13,207 DEBUG storage.ui: setting req1 new geometry: parted.Geometry instance -- > start: 1050624 end: 1837055 length: 786432 > device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae04effb50> >03:49:13,209 DEBUG storage.ui: PartitionDevice._setPartedPartition: req1 ; >03:49:13,210 DEBUG storage.ui: device req1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 2 path: /dev/sda2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f17d90> PedPartition: <_ped.Partition object at 0x7fae04f8e7d0> >03:49:13,212 DEBUG storage.ui: PartitionDevice._setDisk: sda2 ; new: sda ; old: None ; >03:49:13,214 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sda ; >03:49:13,216 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda2 ; >03:49:13,217 DEBUG storage.ui: device 
sda2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 2 path: /dev/sda2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04effd90> PedPartition: <_ped.Partition object at 0x7fae04f8e6b0> >03:49:13,218 DEBUG storage.ui: setting req10 new geometry: parted.Geometry instance -- > start: 1837056 end: 14125055 length: 12288000 > device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae04ef7190> >03:49:13,220 DEBUG storage.ui: PartitionDevice._setPartedPartition: req10 ; >03:49:13,221 DEBUG storage.ui: device req10 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 3 path: /dev/sda3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04eff610> PedPartition: <_ped.Partition object at 0x7fae04f8e770> >03:49:13,223 DEBUG storage.ui: PartitionDevice._setDisk: sda3 ; new: sda ; old: None ; >03:49:13,225 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sda ; >03:49:13,227 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda3 ; >03:49:13,228 DEBUG storage.ui: device sda3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 3 path: /dev/sda3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04ef7990> PedPartition: <_ped.Partition object at 0x7fae04f8e890> >03:49:13,229 DEBUG storage.ui: growing partitions on sdb >03:49:13,229 DEBUG storage.ui: growing partitions on sdc >03:49:13,229 DEBUG storage.ui: growing partitions on sdd >03:49:13,230 DEBUG storage.ui: fixing size of non-existent 512MB partition sda1 (31) with non-existent mdmember at 512.00 >03:49:13,231 DEBUG storage.ui: fixing size of non-existent 384MB partition sda2 (25) with non-existent 
mdmember at 384.00 >03:49:13,232 DEBUG storage.ui: fixing size of non-existent 6000MB partition sda3 (36) with non-existent ext4 filesystem mounted at / at 6000.00 >03:49:13,232 DEBUG storage.ui: fixing size of non-existent 512MB partition sdb1 (32) with non-existent mdmember at 512.00 >03:49:13,233 DEBUG storage.ui: fixing size of non-existent 384MB partition sdb2 (26) with non-existent mdmember at 384.00 >03:49:13,234 DEBUG storage.ui: fixing size of non-existent 512MB partition sdc1 (33) with non-existent mdmember at 512.00 >03:49:13,234 DEBUG storage.ui: fixing size of non-existent 384MB partition sdc2 (27) with non-existent mdmember at 384.00 >03:49:13,235 DEBUG storage.ui: fixing size of non-existent 512MB partition sdd1 (34) with non-existent mdmember at 512.00 >03:49:13,236 DEBUG storage.ui: fixing size of non-existent 384MB partition sdd2 (28) with non-existent mdmember at 384.00 >03:49:13,246 DEBUG blivet: raw RAID 1 size == 512.0 >03:49:13,246 INFO blivet: Using 0MB superBlockSize >03:49:13,247 DEBUG blivet: non-existent RAID 1 size == 512.0 >03:49:13,253 DEBUG blivet: raw RAID 10 size == 768.0 >03:49:13,253 INFO blivet: Using 0MB superBlockSize >03:49:13,254 DEBUG blivet: non-existent RAID 10 size == 768.0 >03:49:13,260 DEBUG blivet: raw RAID 1 size == 512.0 >03:49:13,260 INFO blivet: Using 0MB superBlockSize >03:49:13,261 DEBUG blivet: non-existent RAID 1 size == 512.0 >03:49:13,265 DEBUG blivet: raw RAID 1 size == 512.0 >03:49:13,265 INFO blivet: Using 0MB superBlockSize >03:49:13,266 DEBUG blivet: non-existent RAID 1 size == 512.0 >03:49:13,277 DEBUG blivet: raw RAID 1 size == 512.0 >03:49:13,277 INFO blivet: Using 0MB superBlockSize >03:49:13,278 DEBUG blivet: non-existent RAID 1 size == 512.0 >03:49:13,281 DEBUG blivet: raw RAID 1 size == 512.0 >03:49:13,282 INFO blivet: Using 0MB superBlockSize >03:49:13,282 DEBUG blivet: non-existent RAID 1 size == 512.0 >03:49:13,290 DEBUG blivet: Ext4FS.supported: supported: True ; >03:49:13,290 DEBUG blivet: 
getFormat('ext4') returning Ext4FS instance >03:49:13,295 DEBUG storage.ui: instantiating <class 'blivet.devicefactory.MDFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 512, ['sda', 'sdb', 'sdc', 'sdd'], {'encrypted': False, 'raid_level': 'raid1'} >03:49:19,759 DEBUG blivet: Ext4FS.supported: supported: True ; >03:49:19,760 DEBUG blivet: getFormat('ext4') returning Ext4FS instance >03:49:19,765 DEBUG storage.ui: instantiating <class 'blivet.devicefactory.MDFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 6000, ['sda'], {'encrypted': False, 'raid_level': 'raid10'} >03:49:19,769 INFO storage.ui: removed partition sda3 (id 36) from device tree >03:49:19,772 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sda ; >03:49:19,772 INFO storage.ui: registered action: [60] Destroy Device partition sda3 (id 36) >03:49:19,779 DEBUG storage.ui: Blivet.factoryDevice: 1 ; 6000 ; container_raid_level: None ; name: root ; encrypted: False ; container_encrypted: False ; disks: [DiskDevice instance (0x7fae05319950) -- > name = sda status = True kids = 2 id = 1 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 0 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sda type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 0 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae127eacb0> > target size = 0 path = /dev/sda > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae053199d0>, DiskDevice instance (0x7fae05b116d0) -- > name = sdb status = True kids = 2 id = 14 > parents = [] > uuid = None size = 12000.0 > format = non-existent 
msdos disklabel > major = 8 minor = 16 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdb type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 768 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f674a70> > target size = 0 path = /dev/sdb > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05b11750>, DiskDevice instance (0x7fae05aeabd0) -- > name = sdc status = True kids = 2 id = 11 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 32 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdc type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 512 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f674170> > target size = 0 path = /dev/sdc > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05aeac50>, DiskDevice instance (0x7fae05aea190) -- > name = sdd status = True kids = 2 id = 8 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 48 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdd type: 1 > sectorSize: 512 
physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 256 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae27c59680> > target size = 0 path = /dev/sdd > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05aea210>] ; raid_level: raid10 ; label: ; container_name: None ; device: None ; mountpoint: / ; fstype: ext4 ; container_size: 0 ; >03:49:19,781 DEBUG storage.ui: instantiating <class 'blivet.devicefactory.MDFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 6000, ['sda', 'sdb', 'sdc', 'sdd'], {'container_raid_level': None, 'name': 'root', 'encrypted': False, 'container_encrypted': False, 'raid_level': 'raid10', 'label': '', 'container_name': None, 'device': None, 'mountpoint': '/', 'fstype': 'ext4', 'container_size': 0} >03:49:19,783 DEBUG storage.ui: MDFactory.configure: parent_factory: None ; >03:49:19,783 DEBUG storage.ui: starting Blivet copy >03:49:19,840 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:49:19,841 DEBUG storage.ui: device sda1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04eff3d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f157d0> PedPartition: <_ped.Partition object at 0x7fae04fb3e90> >03:49:19,844 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda2 ; >03:49:19,845 DEBUG storage.ui: device sda2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04eff3d0> fileSystem: None > number: 2 path: /dev/sda2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f152d0> PedPartition: <_ped.Partition object at 0x7fae04fb3dd0> >03:49:19,847 DEBUG storage.ui: 
PartitionDevice._setPartedPartition: sdb1 ; >03:49:19,848 DEBUG storage.ui: device sdb1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04f0cf10> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f15390> PedPartition: <_ped.Partition object at 0x7fae04fb3950> >03:49:19,850 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb2 ; >03:49:19,851 DEBUG storage.ui: device sdb2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04f0cf10> fileSystem: None > number: 2 path: /dev/sdb2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f151d0> PedPartition: <_ped.Partition object at 0x7fae04fb38f0> >03:49:19,854 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ; >03:49:19,855 DEBUG storage.ui: device sdc1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04f0c590> fileSystem: None > number: 1 path: /dev/sdc1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f15b50> PedPartition: <_ped.Partition object at 0x7fae04fb3b90> >03:49:19,857 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc2 ; >03:49:19,858 DEBUG storage.ui: device sdc2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04f0c590> fileSystem: None > number: 2 path: /dev/sdc2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f15550> PedPartition: <_ped.Partition object at 0x7fae04fb3a10> >03:49:19,861 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ; >03:49:19,862 DEBUG storage.ui: device sdd1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04ef7bd0> fileSystem: None > number: 1 path: /dev/sdd1 type: 0 > name: None active: True 
busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f15910> PedPartition: <_ped.Partition object at 0x7fae04fb3b30> >03:49:19,864 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd2 ; >03:49:19,865 DEBUG storage.ui: device sdd2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04ef7bd0> fileSystem: None > number: 2 path: /dev/sdd2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f12350> PedPartition: <_ped.Partition object at 0x7fae04f8e710> >03:49:19,865 DEBUG storage.ui: finished Blivet copy >03:49:19,866 INFO storage.ui: Using 4MB superBlockSize >03:49:19,866 DEBUG storage.ui: child factory class: <class 'blivet.devicefactory.PartitionSetFactory'> >03:49:19,871 DEBUG storage.ui: child factory args: [<blivet.Blivet object at 0x7fae05326d50>, 12016.0, [DiskDevice instance (0x7fae05319950) -- > name = sda status = True kids = 2 id = 1 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 0 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sda type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 0 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae127eacb0> > target size = 0 path = /dev/sda > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae053199d0>, DiskDevice instance (0x7fae05b116d0) -- > name = sdb status = True kids = 2 id = 14 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 16 exists = True protected = False > sysfs path = 
/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdb type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 768 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f674a70> > target size = 0 path = /dev/sdb > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05b11750>, DiskDevice instance (0x7fae05aeabd0) -- > name = sdc status = True kids = 2 id = 11 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 32 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdc type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 512 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f674170> > target size = 0 path = /dev/sdc > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05aeac50>, DiskDevice instance (0x7fae05aea190) -- > name = sdd status = True kids = 2 id = 8 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 48 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdd type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False 
dirty: False bootDirty: False > host: 0 did: 256 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae27c59680> > target size = 0 path = /dev/sdd > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05aea210>]] >03:49:19,873 DEBUG storage.ui: child factory kwargs: {'fstype': 'mdmember'} >03:49:19,875 DEBUG storage.ui: PartitionSetFactory.configure: parent_factory: <blivet.devicefactory.MDFactory object at 0x7fae04f42850> ; >03:49:19,875 DEBUG storage.ui: parent factory container: None >03:49:19,876 DEBUG storage.ui: members: [] >03:49:19,877 DEBUG storage.ui: add_disks: ['sda', 'sdb', 'sdc', 'sdd'] >03:49:19,878 DEBUG storage.ui: remove_disks: [] >03:49:19,880 DEBUG storage.ui: MDRaidMember.__init__: >03:49:19,880 DEBUG storage.ui: getFormat('mdmember') returning MDRaidMember instance >03:49:19,882 DEBUG storage.ui: MDRaidMember.__init__: mountpoint: None ; >03:49:19,882 DEBUG storage.ui: getFormat('mdmember') returning MDRaidMember instance >03:49:19,884 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sda ; >03:49:19,886 DEBUG storage.ui: PartitionDevice._setFormat: req11 ; >03:49:19,888 DEBUG storage.ui: PartitionDevice._setFormat: req11 ; current: None ; type: mdmember ; >03:49:19,890 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sda ; >03:49:19,891 INFO storage.ui: added partition req11 (id 37) to device tree >03:49:19,891 INFO storage.ui: registered action: [61] Create Device partition req11 (id 37) >03:49:19,892 DEBUG storage.ui: getFormat('None') returning DeviceFormat instance >03:49:19,892 INFO storage.ui: registered action: [62] Create Format mdmember on partition req11 (id 37) >03:49:19,895 DEBUG storage.ui: MDRaidMember.__init__: mountpoint: None ; >03:49:19,895 DEBUG storage.ui: getFormat('mdmember') returning MDRaidMember instance >03:49:19,897 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdb ; 
>03:49:19,899 DEBUG storage.ui: PartitionDevice._setFormat: req12 ; >03:49:19,901 DEBUG storage.ui: PartitionDevice._setFormat: req12 ; current: None ; type: mdmember ; >03:49:19,903 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sdb ; >03:49:19,903 INFO storage.ui: added partition req12 (id 38) to device tree >03:49:19,904 INFO storage.ui: registered action: [63] Create Device partition req12 (id 38) >03:49:19,904 DEBUG storage.ui: getFormat('None') returning DeviceFormat instance >03:49:19,905 INFO storage.ui: registered action: [64] Create Format mdmember on partition req12 (id 38) >03:49:19,907 DEBUG storage.ui: MDRaidMember.__init__: mountpoint: None ; >03:49:19,908 DEBUG storage.ui: getFormat('mdmember') returning MDRaidMember instance >03:49:19,910 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdc ; >03:49:19,912 DEBUG storage.ui: PartitionDevice._setFormat: req13 ; >03:49:19,914 DEBUG storage.ui: PartitionDevice._setFormat: req13 ; current: None ; type: mdmember ; >03:49:19,916 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sdc ; >03:49:19,916 INFO storage.ui: added partition req13 (id 39) to device tree >03:49:19,917 INFO storage.ui: registered action: [65] Create Device partition req13 (id 39) >03:49:19,917 DEBUG storage.ui: getFormat('None') returning DeviceFormat instance >03:49:19,918 INFO storage.ui: registered action: [66] Create Format mdmember on partition req13 (id 39) >03:49:19,920 DEBUG storage.ui: MDRaidMember.__init__: mountpoint: None ; >03:49:19,920 DEBUG storage.ui: getFormat('mdmember') returning MDRaidMember instance >03:49:19,923 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdd ; >03:49:19,926 DEBUG storage.ui: PartitionDevice._setFormat: req14 ; >03:49:19,928 DEBUG storage.ui: PartitionDevice._setFormat: req14 ; current: None ; type: mdmember ; >03:49:19,930 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sdd ; >03:49:19,930 INFO storage.ui: added partition req14 (id 40) to device tree 
>03:49:19,931 INFO storage.ui: registered action: [67] Create Device partition req14 (id 40) >03:49:19,932 DEBUG storage.ui: getFormat('None') returning DeviceFormat instance >03:49:19,932 INFO storage.ui: registered action: [68] Create Format mdmember on partition req14 (id 40) >03:49:19,933 INFO storage.ui: Using 4MB superBlockSize >03:49:19,934 DEBUG storage.ui: adding a SameSizeSet with size 12016 >03:49:19,937 DEBUG storage.ui: DiskDevice.setup: sda ; status: True ; controllable: True ; orig: False ; >03:49:19,939 DEBUG storage.ui: DiskDevice.setup: sdb ; status: True ; controllable: True ; orig: False ; >03:49:19,941 DEBUG storage.ui: DiskDevice.setup: sdc ; status: True ; controllable: True ; orig: False ; >03:49:19,943 DEBUG storage.ui: DiskDevice.setup: sdd ; status: True ; controllable: True ; orig: False ; >03:49:19,944 DEBUG storage.ui: removing all non-preexisting partitions ['req11(id 37)', 'req12(id 38)', 'req13(id 39)', 'req14(id 40)', 'sda1(id 31)', 'sda2(id 25)', 'sdb1(id 32)', 'sdb2(id 26)', 'sdc1(id 33)', 'sdc2(id 27)', 'sdd1(id 34)', 'sdd2(id 28)'] from disk(s) ['sda', 'sdb', 'sdc', 'sdd'] >03:49:19,947 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:49:19,947 DEBUG storage.ui: device sda1 new partedPartition None >03:49:19,949 DEBUG storage.ui: PartitionDevice._setDisk: req6 ; new: None ; old: sda ; >03:49:19,951 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sda ; >03:49:19,953 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda2 ; >03:49:19,954 DEBUG storage.ui: device sda2 new partedPartition None >03:49:19,956 DEBUG storage.ui: PartitionDevice._setDisk: req1 ; new: None ; old: sda ; >03:49:19,958 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sda ; >03:49:19,960 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ; >03:49:19,960 DEBUG storage.ui: device sdb1 new partedPartition None >03:49:19,962 DEBUG storage.ui: PartitionDevice._setDisk: req7 ; new: None ; old: sdb ; >03:49:19,964 
DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdb ; >03:49:19,967 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb2 ; >03:49:19,967 DEBUG storage.ui: device sdb2 new partedPartition None >03:49:19,969 DEBUG storage.ui: PartitionDevice._setDisk: req2 ; new: None ; old: sdb ; >03:49:19,971 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdb ; >03:49:19,974 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ; >03:49:19,974 DEBUG storage.ui: device sdc1 new partedPartition None >03:49:19,976 DEBUG storage.ui: PartitionDevice._setDisk: req8 ; new: None ; old: sdc ; >03:49:19,979 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdc ; >03:49:19,981 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc2 ; >03:49:19,981 DEBUG storage.ui: device sdc2 new partedPartition None >03:49:19,983 DEBUG storage.ui: PartitionDevice._setDisk: req3 ; new: None ; old: sdc ; >03:49:19,985 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdc ; >03:49:19,987 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ; >03:49:19,988 DEBUG storage.ui: device sdd1 new partedPartition None >03:49:19,990 DEBUG storage.ui: PartitionDevice._setDisk: req9 ; new: None ; old: sdd ; >03:49:19,992 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdd ; >03:49:19,994 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd2 ; >03:49:19,994 DEBUG storage.ui: device sdd2 new partedPartition None >03:49:19,996 DEBUG storage.ui: PartitionDevice._setDisk: req4 ; new: None ; old: sdd ; >03:49:19,998 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdd ; >03:49:19,999 DEBUG storage.ui: allocatePartitions: disks=['sda', 'sdb', 'sdc', 'sdd'] ; partitions=['req11(id 37)', 'req12(id 38)', 'req13(id 39)', 'req14(id 40)', 'req6(id 31)', 'req1(id 25)', 'req7(id 32)', 'req2(id 26)', 'req8(id 33)', 'req3(id 27)', 'req9(id 34)', 'req4(id 28)'] >03:49:20,000 DEBUG storage.ui: removing all non-preexisting partitions ['req6(id 31)', 
'req7(id 32)', 'req8(id 33)', 'req9(id 34)', 'req1(id 25)', 'req2(id 26)', 'req3(id 27)', 'req4(id 28)', 'req11(id 37)', 'req12(id 38)', 'req13(id 39)', 'req14(id 40)'] from disk(s) ['sda', 'sdb', 'sdc', 'sdd'] >03:49:20,002 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:49:20,005 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:49:20,006 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:49:20,008 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:49:20,010 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:49:20,010 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:49:20,010 DEBUG storage.ui: allocating partition: req6 ; id: 31 ; disks: ['sda'] ; >boot: False ; primary: False ; size: 512MB ; grow: False ; max_size: 512 >03:49:20,011 DEBUG storage.ui: checking freespace on sda >03:49:20,012 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sda part_type=0 req_size=512MB boot=False best=None grow=False >03:49:20,012 DEBUG storage.ui: current free range is 63-24575999 (11999MB) >03:49:20,013 DEBUG storage.ui: updating use_disk to sda, type: 0 >03:49:20,013 DEBUG storage.ui: new free: 63-24575999 / 11999MB >03:49:20,014 DEBUG storage.ui: new free allows for 0 sectors of growth >03:49:20,014 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:49:20,015 DEBUG storage.ui: created partition sda1 of 512MB and added it to /dev/sda >03:49:20,017 DEBUG storage.ui: PartitionDevice._setPartedPartition: req6 ; >03:49:20,018 DEBUG storage.ui: device req6 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04effdd0> PedPartition: <_ped.Partition object at 0x7fae04fb3a70> >03:49:20,020 DEBUG storage.ui: 
PartitionDevice._setDisk: sda1 ; new: sda ; old: None ;
>03:49:20,022 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sda ;
>03:49:20,024 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ;
>03:49:20,025 DEBUG storage.ui: device sda1 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05b095d0>  fileSystem: None
>  number: 1  path: /dev/sda1  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae04efdc50>  PedPartition: <_ped.Partition object at 0x7fae04fb3f50>
>03:49:20,027 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:20,030 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:20,030 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:20,032 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:20,034 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:20,035 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:20,035 DEBUG storage.ui: allocating partition: req7 ; id: 32 ; disks: ['sdb'] ;
>boot: False ; primary: False ; size: 512MB ; grow: False ; max_size: 512
>03:49:20,035 DEBUG storage.ui: checking freespace on sdb
>03:49:20,036 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdb part_type=0 req_size=512MB boot=False best=None grow=False
>03:49:20,037 DEBUG storage.ui: current free range is 63-24575999 (11999MB)
>03:49:20,037 DEBUG storage.ui: updating use_disk to sdb, type: 0
>03:49:20,038 DEBUG storage.ui: new free: 63-24575999 / 11999MB
>03:49:20,038 DEBUG storage.ui: new free allows for 0 sectors of growth
>03:49:20,038 DEBUG storage.ui: adjusted start sector from 63 to 2048
>03:49:20,039 DEBUG storage.ui: created partition sdb1 of 512MB and added it to /dev/sdb
>03:49:20,041 DEBUG storage.ui: PartitionDevice._setPartedPartition: req7 ;
>03:49:20,042 DEBUG storage.ui: device
req7 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05aee350>  fileSystem: None
>  number: 1  path: /dev/sdb1  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae04efa950>  PedPartition: <_ped.Partition object at 0x7fae04fb3ad0>
>03:49:20,044 DEBUG storage.ui: PartitionDevice._setDisk: sdb1 ; new: sdb ; old: None ;
>03:49:20,046 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdb ;
>03:49:20,049 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ;
>03:49:20,050 DEBUG storage.ui: device sdb1 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05aee350>  fileSystem: None
>  number: 1  path: /dev/sdb1  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae04efac10>  PedPartition: <_ped.Partition object at 0x7fae04fb3ef0>
>03:49:20,052 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:20,054 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:20,054 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:20,056 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:20,059 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:20,059 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:20,059 DEBUG storage.ui: allocating partition: req8 ; id: 33 ; disks: ['sdc'] ;
>boot: False ; primary: False ; size: 512MB ; grow: False ; max_size: 512
>03:49:20,060 DEBUG storage.ui: checking freespace on sdc
>03:49:20,061 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdc part_type=0 req_size=512MB boot=False best=None grow=False
>03:49:20,061 DEBUG storage.ui: current free range is 63-24575999 (11999MB)
>03:49:20,062 DEBUG storage.ui: updating use_disk to sdc, type: 0
>03:49:20,062 DEBUG storage.ui: new free: 63-24575999 / 11999MB
>03:49:20,062 DEBUG storage.ui: new free allows for 0 sectors of growth
>03:49:20,063 DEBUG storage.ui: adjusted start sector from 63 to 2048
>03:49:20,064 DEBUG storage.ui: created partition sdc1 of 512MB and added it to /dev/sdc
>03:49:20,066 DEBUG storage.ui: PartitionDevice._setPartedPartition: req8 ;
>03:49:20,067 DEBUG storage.ui: device req8 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05aee6d0>  fileSystem: None
>  number: 1  path: /dev/sdc1  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae04fa5250>  PedPartition: <_ped.Partition object at 0x7fae04fb3d70>
>03:49:20,069 DEBUG storage.ui: PartitionDevice._setDisk: sdc1 ; new: sdc ; old: None ;
>03:49:20,071 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdc ;
>03:49:20,073 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ;
>03:49:20,074 DEBUG storage.ui: device sdc1 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05aee6d0>  fileSystem: None
>  number: 1  path: /dev/sdc1  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae04ef77d0>  PedPartition: <_ped.Partition object at 0x7fae04fb3cb0>
>03:49:20,076 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:20,079 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:20,079 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:20,081 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:20,083 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:20,083 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:20,084 DEBUG storage.ui: allocating partition: req9 ; id: 34 ; disks: ['sdd'] ;
>boot: False ; primary: False ; size: 512MB ; grow: False ; max_size: 512
>03:49:20,084 DEBUG storage.ui: checking
freespace on sdd
>03:49:20,085 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdd part_type=0 req_size=512MB boot=False best=None grow=False
>03:49:20,086 DEBUG storage.ui: current free range is 63-24575999 (11999MB)
>03:49:20,086 DEBUG storage.ui: updating use_disk to sdd, type: 0
>03:49:20,087 DEBUG storage.ui: new free: 63-24575999 / 11999MB
>03:49:20,087 DEBUG storage.ui: new free allows for 0 sectors of growth
>03:49:20,088 DEBUG storage.ui: adjusted start sector from 63 to 2048
>03:49:20,089 DEBUG storage.ui: created partition sdd1 of 512MB and added it to /dev/sdd
>03:49:20,091 DEBUG storage.ui: PartitionDevice._setPartedPartition: req9 ;
>03:49:20,091 DEBUG storage.ui: device req9 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05b09ed0>  fileSystem: None
>  number: 1  path: /dev/sdd1  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae04fa5ed0>  PedPartition: <_ped.Partition object at 0x7fae04f8e590>
>03:49:20,094 DEBUG storage.ui: PartitionDevice._setDisk: sdd1 ; new: sdd ; old: None ;
>03:49:20,096 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdd ;
>03:49:20,099 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ;
>03:49:20,099 DEBUG storage.ui: device sdd1 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05b09ed0>  fileSystem: None
>  number: 1  path: /dev/sdd1  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae04eff1d0>  PedPartition: <_ped.Partition object at 0x7fae04f8e5f0>
>03:49:20,102 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:20,105 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:20,105 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:20,107 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:20,109 DEBUG storage.ui:
DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:20,110 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:20,110 DEBUG storage.ui: allocating partition: req1 ; id: 25 ; disks: ['sda'] ;
>boot: False ; primary: False ; size: 384MB ; grow: False ; max_size: 384
>03:49:20,110 DEBUG storage.ui: checking freespace on sda
>03:49:20,111 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sda part_type=0 req_size=384MB boot=False best=None grow=False
>03:49:20,112 DEBUG storage.ui: current free range is 63-2047 (0MB)
>03:49:20,112 DEBUG storage.ui: current free range is 1050624-24575999 (11487MB)
>03:49:20,113 DEBUG storage.ui: updating use_disk to sda, type: 0
>03:49:20,113 DEBUG storage.ui: new free: 1050624-24575999 / 11487MB
>03:49:20,114 DEBUG storage.ui: new free allows for 0 sectors of growth
>03:49:20,115 DEBUG storage.ui: created partition sda2 of 384MB and added it to /dev/sda
>03:49:20,116 DEBUG storage.ui: PartitionDevice._setPartedPartition: req1 ;
>03:49:20,117 DEBUG storage.ui: device req1 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05b095d0>  fileSystem: None
>  number: 2  path: /dev/sda2  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae04ef7710>  PedPartition: <_ped.Partition object at 0x7fae04fb3ad0>
>03:49:20,119 DEBUG storage.ui: PartitionDevice._setDisk: sda2 ; new: sda ; old: None ;
>03:49:20,122 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sda ;
>03:49:20,124 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda2 ;
>03:49:20,125 DEBUG storage.ui: device sda2 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05b095d0>  fileSystem: None
>  number: 2  path: /dev/sda2  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae04ef7850>  PedPartition: <_ped.Partition object at 0x7fae04fb3a70>
>03:49:20,127
DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:20,129 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:20,130 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:20,132 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:20,134 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:20,134 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:20,135 DEBUG storage.ui: allocating partition: req2 ; id: 26 ; disks: ['sdb'] ;
>boot: False ; primary: False ; size: 384MB ; grow: False ; max_size: 384
>03:49:20,135 DEBUG storage.ui: checking freespace on sdb
>03:49:20,135 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdb part_type=0 req_size=384MB boot=False best=None grow=False
>03:49:20,136 DEBUG storage.ui: current free range is 63-2047 (0MB)
>03:49:20,137 DEBUG storage.ui: current free range is 1050624-24575999 (11487MB)
>03:49:20,137 DEBUG storage.ui: updating use_disk to sdb, type: 0
>03:49:20,137 DEBUG storage.ui: new free: 1050624-24575999 / 11487MB
>03:49:20,138 DEBUG storage.ui: new free allows for 0 sectors of growth
>03:49:20,139 DEBUG storage.ui: created partition sdb2 of 384MB and added it to /dev/sdb
>03:49:20,141 DEBUG storage.ui: PartitionDevice._setPartedPartition: req2 ;
>03:49:20,141 DEBUG storage.ui: device req2 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05aee350>  fileSystem: None
>  number: 2  path: /dev/sdb2  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae04fa5990>  PedPartition: <_ped.Partition object at 0x7fae04f8e530>
>03:49:20,144 DEBUG storage.ui: PartitionDevice._setDisk: sdb2 ; new: sdb ; old: None ;
>03:49:20,146 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdb ;
>03:49:20,148 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb2 ;
>03:49:20,149
DEBUG storage.ui: device sdb2 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05aee350>  fileSystem: None
>  number: 2  path: /dev/sdb2  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae04fa5250>  PedPartition: <_ped.Partition object at 0x7fae04f8e110>
>03:49:20,151 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:20,154 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:20,154 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:20,156 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:20,158 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:20,159 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:20,159 DEBUG storage.ui: allocating partition: req3 ; id: 27 ; disks: ['sdc'] ;
>boot: False ; primary: False ; size: 384MB ; grow: False ; max_size: 384
>03:49:20,159 DEBUG storage.ui: checking freespace on sdc
>03:49:20,160 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdc part_type=0 req_size=384MB boot=False best=None grow=False
>03:49:20,161 DEBUG storage.ui: current free range is 63-2047 (0MB)
>03:49:20,161 DEBUG storage.ui: current free range is 1050624-24575999 (11487MB)
>03:49:20,162 DEBUG storage.ui: updating use_disk to sdc, type: 0
>03:49:20,162 DEBUG storage.ui: new free: 1050624-24575999 / 11487MB
>03:49:20,162 DEBUG storage.ui: new free allows for 0 sectors of growth
>03:49:20,163 DEBUG storage.ui: created partition sdc2 of 384MB and added it to /dev/sdc
>03:49:20,165 DEBUG storage.ui: PartitionDevice._setPartedPartition: req3 ;
>03:49:20,166 DEBUG storage.ui: device req3 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05aee6d0>  fileSystem: None
>  number: 2  path: /dev/sdc2  type: 0
>  name: None  active: True  busy: False
>  geometry:
<parted.geometry.Geometry object at 0x7fae04fb2d50>  PedPartition: <_ped.Partition object at 0x7fae04f8e830>
>03:49:20,168 DEBUG storage.ui: PartitionDevice._setDisk: sdc2 ; new: sdc ; old: None ;
>03:49:20,170 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdc ;
>03:49:20,173 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc2 ;
>03:49:20,174 DEBUG storage.ui: device sdc2 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05aee6d0>  fileSystem: None
>  number: 2  path: /dev/sdc2  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae04fb2bd0>  PedPartition: <_ped.Partition object at 0x7fae04f8e6b0>
>03:49:20,176 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:20,178 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:20,179 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:20,181 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:20,183 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:20,183 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:20,184 DEBUG storage.ui: allocating partition: req4 ; id: 28 ; disks: ['sdd'] ;
>boot: False ; primary: False ; size: 384MB ; grow: False ; max_size: 384
>03:49:20,184 DEBUG storage.ui: checking freespace on sdd
>03:49:20,185 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdd part_type=0 req_size=384MB boot=False best=None grow=False
>03:49:20,185 DEBUG storage.ui: current free range is 63-2047 (0MB)
>03:49:20,186 DEBUG storage.ui: current free range is 1050624-24575999 (11487MB)
>03:49:20,186 DEBUG storage.ui: updating use_disk to sdd, type: 0
>03:49:20,187 DEBUG storage.ui: new free: 1050624-24575999 / 11487MB
>03:49:20,187 DEBUG storage.ui: new free allows for 0 sectors of growth
>03:49:20,188 DEBUG storage.ui: created partition sdd2 of
384MB and added it to /dev/sdd
>03:49:20,190 DEBUG storage.ui: PartitionDevice._setPartedPartition: req4 ;
>03:49:20,191 DEBUG storage.ui: device req4 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05b09ed0>  fileSystem: None
>  number: 2  path: /dev/sdd2  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae04fb2d90>  PedPartition: <_ped.Partition object at 0x7fae04f8e3b0>
>03:49:20,193 DEBUG storage.ui: PartitionDevice._setDisk: sdd2 ; new: sdd ; old: None ;
>03:49:20,195 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdd ;
>03:49:20,198 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd2 ;
>03:49:20,199 DEBUG storage.ui: device sdd2 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05b09ed0>  fileSystem: None
>  number: 2  path: /dev/sdd2  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae04eff450>  PedPartition: <_ped.Partition object at 0x7fae04f8e830>
>03:49:20,201 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:20,203 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:20,204 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:20,206 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:20,208 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:20,209 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:20,209 DEBUG storage.ui: allocating partition: req11 ; id: 37 ; disks: ['sda'] ;
>boot: False ; primary: False ; size: 1MB ; grow: True ; max_size: 3004
>03:49:20,209 DEBUG storage.ui: checking freespace on sda
>03:49:20,210 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sda part_type=0 req_size=1MB boot=False best=None grow=True
>03:49:20,211 DEBUG storage.ui: current free range is 63-2047
(0MB)
>03:49:20,211 DEBUG storage.ui: current free range is 1837056-24575999 (11103MB)
>03:49:20,212 DEBUG storage.ui: evaluating growth potential for new layout
>03:49:20,212 DEBUG storage.ui: calculating growth for disk /dev/sdd
>03:49:20,213 DEBUG storage.ui: adding request 34 to chunk 24575937 (63-24575999) on /dev/sdd
>03:49:20,213 DEBUG storage.ui: adding request 28 to chunk 24575937 (63-24575999) on /dev/sdd
>03:49:20,214 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999
>sectorSize = 512
>
>03:49:20,214 DEBUG storage.ui: req: PartitionRequest instance --
>id = 34  name = sdd1  growable = False
>base = 1048576  growth = 0  max_grow = 0
>done = True
>03:49:20,214 DEBUG storage.ui: req: PartitionRequest instance --
>id = 28  name = sdd2  growable = False
>base = 786432  growth = 0  max_grow = 0
>done = True
>03:49:20,215 DEBUG storage.ui: request 34 (sdd1) growth: 0 (0MB) size: 512MB
>03:49:20,215 DEBUG storage.ui: request 28 (sdd2) growth: 0 (0MB) size: 384MB
>03:49:20,216 DEBUG storage.ui: disk /dev/sdd growth: 0 (0MB)
>03:49:20,216 DEBUG storage.ui: calculating growth for disk /dev/sdb
>03:49:20,216 DEBUG storage.ui: adding request 32 to chunk 24575937 (63-24575999) on /dev/sdb
>03:49:20,217 DEBUG storage.ui: adding request 26 to chunk 24575937 (63-24575999) on /dev/sdb
>03:49:20,217 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999
>sectorSize = 512
>
>03:49:20,218 DEBUG storage.ui: req: PartitionRequest instance --
>id = 32  name = sdb1  growable = False
>base = 1048576  growth = 0  max_grow = 0
>done = True
>03:49:20,218 DEBUG storage.ui: req: PartitionRequest instance --
>id = 26  name = sdb2  growable = False
>base = 786432  growth = 0  max_grow = 0
>done = True
>03:49:20,218 DEBUG storage.ui: request 32 (sdb1) growth: 0 (0MB) size: 512MB
>03:49:20,219 DEBUG storage.ui: request 26 (sdb2) growth: 0 (0MB) size: 384MB
>03:49:20,219 DEBUG storage.ui: disk /dev/sdb growth: 0 (0MB)
>03:49:20,219 DEBUG
storage.ui: calculating growth for disk /dev/sdc
>03:49:20,220 DEBUG storage.ui: adding request 33 to chunk 24575937 (63-24575999) on /dev/sdc
>03:49:20,220 DEBUG storage.ui: adding request 27 to chunk 24575937 (63-24575999) on /dev/sdc
>03:49:20,221 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999
>sectorSize = 512
>
>03:49:20,221 DEBUG storage.ui: req: PartitionRequest instance --
>id = 33  name = sdc1  growable = False
>base = 1048576  growth = 0  max_grow = 0
>done = True
>03:49:20,222 DEBUG storage.ui: req: PartitionRequest instance --
>id = 27  name = sdc2  growable = False
>base = 786432  growth = 0  max_grow = 0
>done = True
>03:49:20,222 DEBUG storage.ui: request 33 (sdc1) growth: 0 (0MB) size: 512MB
>03:49:20,222 DEBUG storage.ui: request 27 (sdc2) growth: 0 (0MB) size: 384MB
>03:49:20,223 DEBUG storage.ui: disk /dev/sdc growth: 0 (0MB)
>03:49:20,223 DEBUG storage.ui: calculating growth for disk /dev/sda
>03:49:20,225 DEBUG storage.ui: PartitionDevice._setPartedPartition: req11 ;
>03:49:20,226 DEBUG storage.ui: device req11 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05b095d0>  fileSystem: None
>  number: 3  path: /dev/sda3  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae04fb2610>  PedPartition: <_ped.Partition object at 0x7fae04f8e530>
>03:49:20,228 DEBUG storage.ui: PartitionDevice._setDisk: sda3 ; new: sda ; old: None ;
>03:49:20,230 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sda ;
>03:49:20,231 DEBUG storage.ui: adding request 31 to chunk 24575937 (63-24575999) on /dev/sda
>03:49:20,231 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda
>03:49:20,232 DEBUG storage.ui: adding request 37 to chunk 24575937 (63-24575999) on /dev/sda
>03:49:20,232 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999
>sectorSize = 512
>
>03:49:20,233 DEBUG storage.ui: req:
PartitionRequest instance --
>id = 31  name = sda1  growable = False
>base = 1048576  growth = 0  max_grow = 0
>done = True
>03:49:20,233 DEBUG storage.ui: req: PartitionRequest instance --
>id = 25  name = sda2  growable = False
>base = 786432  growth = 0  max_grow = 0
>done = True
>03:49:20,233 DEBUG storage.ui: req: PartitionRequest instance --
>id = 37  name = sda3  growable = True
>base = 2048  growth = 0  max_grow = 6150144
>done = False
>03:49:20,234 DEBUG storage.ui: 1 requests and 22738881 (11102MB) left in chunk
>03:49:20,234 DEBUG storage.ui: adding 22738881 (11102MB) to 37 (sda3)
>03:49:20,234 DEBUG storage.ui: taking back 16588737 (8099MB) from 37 (sda3)
>03:49:20,235 DEBUG storage.ui: new grow amount for request 37 (sda3) is 6150144 units, or 3003MB
>03:49:20,235 DEBUG storage.ui: request 31 (sda1) growth: 0 (0MB) size: 512MB
>03:49:20,235 DEBUG storage.ui: request 25 (sda2) growth: 0 (0MB) size: 384MB
>03:49:20,236 DEBUG storage.ui: request 37 (sda3) growth: 6150144 (3003MB) size: 3004MB
>03:49:20,236 DEBUG storage.ui: disk /dev/sda growth: 6150144 (3003MB)
>03:49:20,238 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda3 ;
>03:49:20,238 DEBUG storage.ui: device sda3 new partedPartition None
>03:49:20,240 DEBUG storage.ui: PartitionDevice._setDisk: req11 ; new: None ; old: sda ;
>03:49:20,243 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sda ;
>03:49:20,243 DEBUG storage.ui: total growth: 6150144 sectors
>03:49:20,243 DEBUG storage.ui: updating use_disk to sda, type: 0
>03:49:20,244 DEBUG storage.ui: new free: 1837056-24575999 / 11103MB
>03:49:20,244 DEBUG storage.ui: new free allows for 6150144 sectors of growth
>03:49:20,245 DEBUG storage.ui: created partition sda3 of 1MB and added it to /dev/sda
>03:49:20,247 DEBUG storage.ui: PartitionDevice._setPartedPartition: req11 ;
>03:49:20,247 DEBUG storage.ui: device req11 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05b095d0>  fileSystem: None
>
number: 3  path: /dev/sda3  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae04fb2e10>  PedPartition: <_ped.Partition object at 0x7fae04f8e1d0>
>03:49:20,249 DEBUG storage.ui: PartitionDevice._setDisk: sda3 ; new: sda ; old: None ;
>03:49:20,252 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sda ;
>03:49:20,254 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda3 ;
>03:49:20,255 DEBUG storage.ui: device sda3 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05b095d0>  fileSystem: None
>  number: 3  path: /dev/sda3  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae04fb2c10>  PedPartition: <_ped.Partition object at 0x7fae04f8e410>
>03:49:20,257 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:20,259 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:20,260 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:20,262 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:20,264 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:20,264 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:20,264 DEBUG storage.ui: allocating partition: req12 ; id: 38 ; disks: ['sdb'] ;
>boot: False ; primary: False ; size: 1MB ; grow: True ; max_size: 3004
>03:49:20,265 DEBUG storage.ui: checking freespace on sdb
>03:49:20,266 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdb part_type=0 req_size=1MB boot=False best=None grow=True
>03:49:20,266 DEBUG storage.ui: current free range is 63-2047 (0MB)
>03:49:20,267 DEBUG storage.ui: current free range is 1837056-24575999 (11103MB)
>03:49:20,267 DEBUG storage.ui: evaluating growth potential for new layout
>03:49:20,268 DEBUG storage.ui: calculating growth for disk /dev/sdd
>03:49:20,268 DEBUG storage.ui: adding
request 34 to chunk 24575937 (63-24575999) on /dev/sdd
>03:49:20,269 DEBUG storage.ui: adding request 28 to chunk 24575937 (63-24575999) on /dev/sdd
>03:49:20,269 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999
>sectorSize = 512
>
>03:49:20,269 DEBUG storage.ui: req: PartitionRequest instance --
>id = 34  name = sdd1  growable = False
>base = 1048576  growth = 0  max_grow = 0
>done = True
>03:49:20,270 DEBUG storage.ui: req: PartitionRequest instance --
>id = 28  name = sdd2  growable = False
>base = 786432  growth = 0  max_grow = 0
>done = True
>03:49:20,270 DEBUG storage.ui: request 34 (sdd1) growth: 0 (0MB) size: 512MB
>03:49:20,271 DEBUG storage.ui: request 28 (sdd2) growth: 0 (0MB) size: 384MB
>03:49:20,271 DEBUG storage.ui: disk /dev/sdd growth: 0 (0MB)
>03:49:20,271 DEBUG storage.ui: calculating growth for disk /dev/sdb
>03:49:20,274 DEBUG storage.ui: PartitionDevice._setPartedPartition: req12 ;
>03:49:20,274 DEBUG storage.ui: device req12 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05aee350>  fileSystem: None
>  number: 3  path: /dev/sdb3  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae04f04090>  PedPartition: <_ped.Partition object at 0x7fae04fb3d70>
>03:49:20,277 DEBUG storage.ui: PartitionDevice._setDisk: sdb3 ; new: sdb ; old: None ;
>03:49:20,279 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdb ;
>03:49:20,279 DEBUG storage.ui: adding request 32 to chunk 24575937 (63-24575999) on /dev/sdb
>03:49:20,280 DEBUG storage.ui: adding request 26 to chunk 24575937 (63-24575999) on /dev/sdb
>03:49:20,280 DEBUG storage.ui: adding request 38 to chunk 24575937 (63-24575999) on /dev/sdb
>03:49:20,280 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999
>sectorSize = 512
>
>03:49:20,281 DEBUG storage.ui: req: PartitionRequest instance --
>id = 32  name = sdb1  growable = False
>base = 1048576  growth = 0
max_grow = 0
>done = True
>03:49:20,281 DEBUG storage.ui: req: PartitionRequest instance --
>id = 26  name = sdb2  growable = False
>base = 786432  growth = 0  max_grow = 0
>done = True
>03:49:20,282 DEBUG storage.ui: req: PartitionRequest instance --
>id = 38  name = sdb3  growable = True
>base = 2048  growth = 0  max_grow = 6150144
>done = False
>03:49:20,282 DEBUG storage.ui: 1 requests and 22738881 (11102MB) left in chunk
>03:49:20,282 DEBUG storage.ui: adding 22738881 (11102MB) to 38 (sdb3)
>03:49:20,283 DEBUG storage.ui: taking back 16588737 (8099MB) from 38 (sdb3)
>03:49:20,283 DEBUG storage.ui: new grow amount for request 38 (sdb3) is 6150144 units, or 3003MB
>03:49:20,284 DEBUG storage.ui: request 32 (sdb1) growth: 0 (0MB) size: 512MB
>03:49:20,284 DEBUG storage.ui: request 26 (sdb2) growth: 0 (0MB) size: 384MB
>03:49:20,284 DEBUG storage.ui: request 38 (sdb3) growth: 6150144 (3003MB) size: 3004MB
>03:49:20,285 DEBUG storage.ui: disk /dev/sdb growth: 6150144 (3003MB)
>03:49:20,285 DEBUG storage.ui: calculating growth for disk /dev/sdc
>03:49:20,285 DEBUG storage.ui: adding request 33 to chunk 24575937 (63-24575999) on /dev/sdc
>03:49:20,286 DEBUG storage.ui: adding request 27 to chunk 24575937 (63-24575999) on /dev/sdc
>03:49:20,286 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999
>sectorSize = 512
>
>03:49:20,287 DEBUG storage.ui: req: PartitionRequest instance --
>id = 33  name = sdc1  growable = False
>base = 1048576  growth = 0  max_grow = 0
>done = True
>03:49:20,287 DEBUG storage.ui: req: PartitionRequest instance --
>id = 27  name = sdc2  growable = False
>base = 786432  growth = 0  max_grow = 0
>done = True
>03:49:20,287 DEBUG storage.ui: request 33 (sdc1) growth: 0 (0MB) size: 512MB
>03:49:20,288 DEBUG storage.ui: request 27 (sdc2) growth: 0 (0MB) size: 384MB
>03:49:20,288 DEBUG storage.ui: disk /dev/sdc growth: 0 (0MB)
>03:49:20,288 DEBUG storage.ui: calculating growth for disk /dev/sda
>03:49:20,289 DEBUG storage.ui: adding
request 31 to chunk 24575937 (63-24575999) on /dev/sda
>03:49:20,289 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda
>03:49:20,290 DEBUG storage.ui: adding request 37 to chunk 24575937 (63-24575999) on /dev/sda
>03:49:20,290 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999
>sectorSize = 512
>
>03:49:20,291 DEBUG storage.ui: req: PartitionRequest instance --
>id = 31  name = sda1  growable = False
>base = 1048576  growth = 0  max_grow = 0
>done = True
>03:49:20,291 DEBUG storage.ui: req: PartitionRequest instance --
>id = 25  name = sda2  growable = False
>base = 786432  growth = 0  max_grow = 0
>done = True
>03:49:20,292 DEBUG storage.ui: req: PartitionRequest instance --
>id = 37  name = sda3  growable = True
>base = 2048  growth = 0  max_grow = 6150144
>done = False
>03:49:20,292 DEBUG storage.ui: 1 requests and 22738881 (11102MB) left in chunk
>03:49:20,292 DEBUG storage.ui: adding 22738881 (11102MB) to 37 (sda3)
>03:49:20,293 DEBUG storage.ui: taking back 16588737 (8099MB) from 37 (sda3)
>03:49:20,293 DEBUG storage.ui: new grow amount for request 37 (sda3) is 6150144 units, or 3003MB
>03:49:20,293 DEBUG storage.ui: request 31 (sda1) growth: 0 (0MB) size: 512MB
>03:49:20,294 DEBUG storage.ui: request 25 (sda2) growth: 0 (0MB) size: 384MB
>03:49:20,294 DEBUG storage.ui: request 37 (sda3) growth: 6150144 (3003MB) size: 3004MB
>03:49:20,294 DEBUG storage.ui: disk /dev/sda growth: 6150144 (3003MB)
>03:49:20,296 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb3 ;
>03:49:20,297 DEBUG storage.ui: device sdb3 new partedPartition None
>03:49:20,299 DEBUG storage.ui: PartitionDevice._setDisk: req12 ; new: None ; old: sdb ;
>03:49:20,301 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sdb ;
>03:49:20,301 DEBUG storage.ui: total growth: 12300288 sectors
>03:49:20,302 DEBUG storage.ui: updating use_disk to sdb, type: 0
>03:49:20,302 DEBUG storage.ui: new free: 1837056-24575999 / 11103MB
>03:49:20,302
DEBUG storage.ui: new free allows for 12300288 sectors of growth
>03:49:20,303 DEBUG storage.ui: created partition sdb3 of 1MB and added it to /dev/sdb
>03:49:20,305 DEBUG storage.ui: PartitionDevice._setPartedPartition: req12 ;
>03:49:20,306 DEBUG storage.ui: device req12 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05aee350>  fileSystem: None
>  number: 3  path: /dev/sdb3  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae04f04e10>  PedPartition: <_ped.Partition object at 0x7fae04f8e0b0>
>03:49:20,308 DEBUG storage.ui: PartitionDevice._setDisk: sdb3 ; new: sdb ; old: None ;
>03:49:20,311 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdb ;
>03:49:20,313 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb3 ;
>03:49:20,314 DEBUG storage.ui: device sdb3 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05aee350>  fileSystem: None
>  number: 3  path: /dev/sdb3  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae04fb2a50>  PedPartition: <_ped.Partition object at 0x7fae04f8e230>
>03:49:20,316 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:20,318 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:20,319 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:20,321 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:20,323 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:20,323 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:20,324 DEBUG storage.ui: allocating partition: req13 ; id: 39 ; disks: ['sdc'] ;
>boot: False ; primary: False ; size: 1MB ; grow: True ; max_size: 3004
>03:49:20,324 DEBUG storage.ui: checking freespace on sdc
>03:49:20,325 DEBUG storage.ui: getBestFreeSpaceRegion:
disk=/dev/sdc part_type=0 req_size=1MB boot=False best=None grow=True >03:49:20,325 DEBUG storage.ui: current free range is 63-2047 (0MB) >03:49:20,326 DEBUG storage.ui: current free range is 1837056-24575999 (11103MB) >03:49:20,326 DEBUG storage.ui: evaluating growth potential for new layout >03:49:20,326 DEBUG storage.ui: calculating growth for disk /dev/sdd >03:49:20,327 DEBUG storage.ui: adding request 34 to chunk 24575937 (63-24575999) on /dev/sdd >03:49:20,327 DEBUG storage.ui: adding request 28 to chunk 24575937 (63-24575999) on /dev/sdd >03:49:20,328 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999 >sectorSize = 512 > >03:49:20,328 DEBUG storage.ui: req: PartitionRequest instance -- >id = 34 name = sdd1 growable = False >base = 1048576 growth = 0 max_grow = 0 >done = True >03:49:20,329 DEBUG storage.ui: req: PartitionRequest instance -- >id = 28 name = sdd2 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:49:20,329 DEBUG storage.ui: request 34 (sdd1) growth: 0 (0MB) size: 512MB >03:49:20,329 DEBUG storage.ui: request 28 (sdd2) growth: 0 (0MB) size: 384MB >03:49:20,330 DEBUG storage.ui: disk /dev/sdd growth: 0 (0MB) >03:49:20,330 DEBUG storage.ui: calculating growth for disk /dev/sdb >03:49:20,331 DEBUG storage.ui: adding request 32 to chunk 24575937 (63-24575999) on /dev/sdb >03:49:20,331 DEBUG storage.ui: adding request 26 to chunk 24575937 (63-24575999) on /dev/sdb >03:49:20,332 DEBUG storage.ui: adding request 38 to chunk 24575937 (63-24575999) on /dev/sdb >03:49:20,332 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999 >sectorSize = 512 > >03:49:20,332 DEBUG storage.ui: req: PartitionRequest instance -- >id = 32 name = sdb1 growable = False >base = 1048576 growth = 0 max_grow = 0 >done = True >03:49:20,333 DEBUG storage.ui: req: PartitionRequest instance -- >id = 26 name = sdb2 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:49:20,333 
DEBUG storage.ui: req: PartitionRequest instance -- >id = 38 name = sdb3 growable = True >base = 2048 growth = 0 max_grow = 6150144 >done = False >03:49:20,333 DEBUG storage.ui: 1 requests and 22738881 (11102MB) left in chunk >03:49:20,334 DEBUG storage.ui: adding 22738881 (11102MB) to 38 (sdb3) >03:49:20,334 DEBUG storage.ui: taking back 16588737 (8099MB) from 38 (sdb3) >03:49:20,335 DEBUG storage.ui: new grow amount for request 38 (sdb3) is 6150144 units, or 3003MB >03:49:20,335 DEBUG storage.ui: request 32 (sdb1) growth: 0 (0MB) size: 512MB >03:49:20,335 DEBUG storage.ui: request 26 (sdb2) growth: 0 (0MB) size: 384MB >03:49:20,336 DEBUG storage.ui: request 38 (sdb3) growth: 6150144 (3003MB) size: 3004MB >03:49:20,336 DEBUG storage.ui: disk /dev/sdb growth: 6150144 (3003MB) >03:49:20,336 DEBUG storage.ui: calculating growth for disk /dev/sdc >03:49:20,339 DEBUG storage.ui: PartitionDevice._setPartedPartition: req13 ; >03:49:20,339 DEBUG storage.ui: device req13 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 3 path: /dev/sdc3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f046d0> PedPartition: <_ped.Partition object at 0x7fae04f8e590> >03:49:20,341 DEBUG storage.ui: PartitionDevice._setDisk: sdc3 ; new: sdc ; old: None ; >03:49:20,344 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdc ; >03:49:20,344 DEBUG storage.ui: adding request 33 to chunk 24575937 (63-24575999) on /dev/sdc >03:49:20,344 DEBUG storage.ui: adding request 27 to chunk 24575937 (63-24575999) on /dev/sdc >03:49:20,345 DEBUG storage.ui: adding request 39 to chunk 24575937 (63-24575999) on /dev/sdc >03:49:20,345 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999 >sectorSize = 512 > >03:49:20,346 DEBUG storage.ui: req: PartitionRequest instance -- >id = 33 name = sdc1 growable = False >base = 1048576 growth = 0 max_grow = 0 
>done = True >03:49:20,346 DEBUG storage.ui: req: PartitionRequest instance -- >id = 27 name = sdc2 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:49:20,346 DEBUG storage.ui: req: PartitionRequest instance -- >id = 39 name = sdc3 growable = True >base = 2048 growth = 0 max_grow = 6150144 >done = False >03:49:20,347 DEBUG storage.ui: 1 requests and 22738881 (11102MB) left in chunk >03:49:20,347 DEBUG storage.ui: adding 22738881 (11102MB) to 39 (sdc3) >03:49:20,348 DEBUG storage.ui: taking back 16588737 (8099MB) from 39 (sdc3) >03:49:20,348 DEBUG storage.ui: new grow amount for request 39 (sdc3) is 6150144 units, or 3003MB >03:49:20,348 DEBUG storage.ui: request 33 (sdc1) growth: 0 (0MB) size: 512MB >03:49:20,349 DEBUG storage.ui: request 27 (sdc2) growth: 0 (0MB) size: 384MB >03:49:20,349 DEBUG storage.ui: request 39 (sdc3) growth: 6150144 (3003MB) size: 3004MB >03:49:20,349 DEBUG storage.ui: disk /dev/sdc growth: 6150144 (3003MB) >03:49:20,350 DEBUG storage.ui: calculating growth for disk /dev/sda >03:49:20,350 DEBUG storage.ui: adding request 31 to chunk 24575937 (63-24575999) on /dev/sda >03:49:20,350 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda >03:49:20,351 DEBUG storage.ui: adding request 37 to chunk 24575937 (63-24575999) on /dev/sda >03:49:20,351 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > >03:49:20,352 DEBUG storage.ui: req: PartitionRequest instance -- >id = 31 name = sda1 growable = False >base = 1048576 growth = 0 max_grow = 0 >done = True >03:49:20,352 DEBUG storage.ui: req: PartitionRequest instance -- >id = 25 name = sda2 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:49:20,352 DEBUG storage.ui: req: PartitionRequest instance -- >id = 37 name = sda3 growable = True >base = 2048 growth = 0 max_grow = 6150144 >done = False >03:49:20,353 DEBUG storage.ui: 1 requests and 22738881 (11102MB) left in chunk 
>03:49:20,353 DEBUG storage.ui: adding 22738881 (11102MB) to 37 (sda3) >03:49:20,354 DEBUG storage.ui: taking back 16588737 (8099MB) from 37 (sda3) >03:49:20,354 DEBUG storage.ui: new grow amount for request 37 (sda3) is 6150144 units, or 3003MB >03:49:20,354 DEBUG storage.ui: request 31 (sda1) growth: 0 (0MB) size: 512MB >03:49:20,355 DEBUG storage.ui: request 25 (sda2) growth: 0 (0MB) size: 384MB >03:49:20,355 DEBUG storage.ui: request 37 (sda3) growth: 6150144 (3003MB) size: 3004MB >03:49:20,355 DEBUG storage.ui: disk /dev/sda growth: 6150144 (3003MB) >03:49:20,357 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc3 ; >03:49:20,358 DEBUG storage.ui: device sdc3 new partedPartition None >03:49:20,360 DEBUG storage.ui: PartitionDevice._setDisk: req13 ; new: None ; old: sdc ; >03:49:20,361 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sdc ; >03:49:20,362 DEBUG storage.ui: total growth: 18450432 sectors >03:49:20,362 DEBUG storage.ui: updating use_disk to sdc, type: 0 >03:49:20,363 DEBUG storage.ui: new free: 1837056-24575999 / 11103MB >03:49:20,363 DEBUG storage.ui: new free allows for 18450432 sectors of growth >03:49:20,364 DEBUG storage.ui: created partition sdc3 of 1MB and added it to /dev/sdc >03:49:20,366 DEBUG storage.ui: PartitionDevice._setPartedPartition: req13 ; >03:49:20,366 DEBUG storage.ui: device req13 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 3 path: /dev/sdc3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f04890> PedPartition: <_ped.Partition object at 0x7fae04fb3d70> >03:49:20,368 DEBUG storage.ui: PartitionDevice._setDisk: sdc3 ; new: sdc ; old: None ; >03:49:20,371 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdc ; >03:49:20,373 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc3 ; >03:49:20,374 DEBUG storage.ui: device sdc3 new partedPartition parted.Partition instance 
-- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 3 path: /dev/sdc3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f04cd0> PedPartition: <_ped.Partition object at 0x7fae04f8e050> >03:49:20,376 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:49:20,378 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:49:20,379 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:49:20,381 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:49:20,383 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:49:20,383 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:49:20,383 DEBUG storage.ui: allocating partition: req14 ; id: 40 ; disks: ['sdd'] ; >boot: False ; primary: False ; size: 1MB ; grow: True ; max_size: 3004 >03:49:20,384 DEBUG storage.ui: checking freespace on sdd >03:49:20,384 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdd part_type=0 req_size=1MB boot=False best=None grow=True >03:49:20,385 DEBUG storage.ui: current free range is 63-2047 (0MB) >03:49:20,385 DEBUG storage.ui: current free range is 1837056-24575999 (11103MB) >03:49:20,386 DEBUG storage.ui: evaluating growth potential for new layout >03:49:20,386 DEBUG storage.ui: calculating growth for disk /dev/sdd >03:49:20,389 DEBUG storage.ui: PartitionDevice._setPartedPartition: req14 ; >03:49:20,389 DEBUG storage.ui: device req14 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 3 path: /dev/sdd3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f04710> PedPartition: <_ped.Partition object at 0x7fae04f8e470> >03:49:20,391 DEBUG storage.ui: PartitionDevice._setDisk: sdd3 ; new: sdd ; old: None ; >03:49:20,393 DEBUG storage.ui: 
DiskDevice.addChild: kids: 2 ; name: sdd ; >03:49:20,394 DEBUG storage.ui: adding request 34 to chunk 24575937 (63-24575999) on /dev/sdd >03:49:20,394 DEBUG storage.ui: adding request 28 to chunk 24575937 (63-24575999) on /dev/sdd >03:49:20,395 DEBUG storage.ui: adding request 40 to chunk 24575937 (63-24575999) on /dev/sdd >03:49:20,395 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999 >sectorSize = 512 > >03:49:20,396 DEBUG storage.ui: req: PartitionRequest instance -- >id = 34 name = sdd1 growable = False >base = 1048576 growth = 0 max_grow = 0 >done = True >03:49:20,396 DEBUG storage.ui: req: PartitionRequest instance -- >id = 28 name = sdd2 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:49:20,396 DEBUG storage.ui: req: PartitionRequest instance -- >id = 40 name = sdd3 growable = True >base = 2048 growth = 0 max_grow = 6150144 >done = False >03:49:20,397 DEBUG storage.ui: 1 requests and 22738881 (11102MB) left in chunk >03:49:20,397 DEBUG storage.ui: adding 22738881 (11102MB) to 40 (sdd3) >03:49:20,398 DEBUG storage.ui: taking back 16588737 (8099MB) from 40 (sdd3) >03:49:20,398 DEBUG storage.ui: new grow amount for request 40 (sdd3) is 6150144 units, or 3003MB >03:49:20,398 DEBUG storage.ui: request 34 (sdd1) growth: 0 (0MB) size: 512MB >03:49:20,399 DEBUG storage.ui: request 28 (sdd2) growth: 0 (0MB) size: 384MB >03:49:20,399 DEBUG storage.ui: request 40 (sdd3) growth: 6150144 (3003MB) size: 3004MB >03:49:20,399 DEBUG storage.ui: disk /dev/sdd growth: 6150144 (3003MB) >03:49:20,400 DEBUG storage.ui: calculating growth for disk /dev/sdb >03:49:20,400 DEBUG storage.ui: adding request 32 to chunk 24575937 (63-24575999) on /dev/sdb >03:49:20,401 DEBUG storage.ui: adding request 26 to chunk 24575937 (63-24575999) on /dev/sdb >03:49:20,401 DEBUG storage.ui: adding request 38 to chunk 24575937 (63-24575999) on /dev/sdb >03:49:20,401 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 
24575999 >sectorSize = 512 > >03:49:20,402 DEBUG storage.ui: req: PartitionRequest instance -- >id = 32 name = sdb1 growable = False >base = 1048576 growth = 0 max_grow = 0 >done = True >03:49:20,402 DEBUG storage.ui: req: PartitionRequest instance -- >id = 26 name = sdb2 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:49:20,403 DEBUG storage.ui: req: PartitionRequest instance -- >id = 38 name = sdb3 growable = True >base = 2048 growth = 0 max_grow = 6150144 >done = False >03:49:20,403 DEBUG storage.ui: 1 requests and 22738881 (11102MB) left in chunk >03:49:20,403 DEBUG storage.ui: adding 22738881 (11102MB) to 38 (sdb3) >03:49:20,404 DEBUG storage.ui: taking back 16588737 (8099MB) from 38 (sdb3) >03:49:20,404 DEBUG storage.ui: new grow amount for request 38 (sdb3) is 6150144 units, or 3003MB >03:49:20,405 DEBUG storage.ui: request 32 (sdb1) growth: 0 (0MB) size: 512MB >03:49:20,405 DEBUG storage.ui: request 26 (sdb2) growth: 0 (0MB) size: 384MB >03:49:20,405 DEBUG storage.ui: request 38 (sdb3) growth: 6150144 (3003MB) size: 3004MB >03:49:20,406 DEBUG storage.ui: disk /dev/sdb growth: 6150144 (3003MB) >03:49:20,406 DEBUG storage.ui: calculating growth for disk /dev/sdc >03:49:20,406 DEBUG storage.ui: adding request 33 to chunk 24575937 (63-24575999) on /dev/sdc >03:49:20,407 DEBUG storage.ui: adding request 27 to chunk 24575937 (63-24575999) on /dev/sdc >03:49:20,407 DEBUG storage.ui: adding request 39 to chunk 24575937 (63-24575999) on /dev/sdc >03:49:20,408 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999 >sectorSize = 512 > >03:49:20,408 DEBUG storage.ui: req: PartitionRequest instance -- >id = 33 name = sdc1 growable = False >base = 1048576 growth = 0 max_grow = 0 >done = True >03:49:20,409 DEBUG storage.ui: req: PartitionRequest instance -- >id = 27 name = sdc2 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:49:20,409 DEBUG storage.ui: req: PartitionRequest instance -- >id = 
39 name = sdc3 growable = True >base = 2048 growth = 0 max_grow = 6150144 >done = False >03:49:20,409 DEBUG storage.ui: 1 requests and 22738881 (11102MB) left in chunk >03:49:20,410 DEBUG storage.ui: adding 22738881 (11102MB) to 39 (sdc3) >03:49:20,410 DEBUG storage.ui: taking back 16588737 (8099MB) from 39 (sdc3) >03:49:20,411 DEBUG storage.ui: new grow amount for request 39 (sdc3) is 6150144 units, or 3003MB >03:49:20,411 DEBUG storage.ui: request 33 (sdc1) growth: 0 (0MB) size: 512MB >03:49:20,411 DEBUG storage.ui: request 27 (sdc2) growth: 0 (0MB) size: 384MB >03:49:20,412 DEBUG storage.ui: request 39 (sdc3) growth: 6150144 (3003MB) size: 3004MB >03:49:20,412 DEBUG storage.ui: disk /dev/sdc growth: 6150144 (3003MB) >03:49:20,412 DEBUG storage.ui: calculating growth for disk /dev/sda >03:49:20,413 DEBUG storage.ui: adding request 31 to chunk 24575937 (63-24575999) on /dev/sda >03:49:20,413 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda >03:49:20,414 DEBUG storage.ui: adding request 37 to chunk 24575937 (63-24575999) on /dev/sda >03:49:20,414 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > >03:49:20,414 DEBUG storage.ui: req: PartitionRequest instance -- >id = 31 name = sda1 growable = False >base = 1048576 growth = 0 max_grow = 0 >done = True >03:49:20,415 DEBUG storage.ui: req: PartitionRequest instance -- >id = 25 name = sda2 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:49:20,415 DEBUG storage.ui: req: PartitionRequest instance -- >id = 37 name = sda3 growable = True >base = 2048 growth = 0 max_grow = 6150144 >done = False >03:49:20,416 DEBUG storage.ui: 1 requests and 22738881 (11102MB) left in chunk >03:49:20,416 DEBUG storage.ui: adding 22738881 (11102MB) to 37 (sda3) >03:49:20,416 DEBUG storage.ui: taking back 16588737 (8099MB) from 37 (sda3) >03:49:20,417 DEBUG storage.ui: new grow amount for request 37 (sda3) is 6150144 units, or 3003MB 
>03:49:20,417 DEBUG storage.ui: request 31 (sda1) growth: 0 (0MB) size: 512MB >03:49:20,417 DEBUG storage.ui: request 25 (sda2) growth: 0 (0MB) size: 384MB >03:49:20,418 DEBUG storage.ui: request 37 (sda3) growth: 6150144 (3003MB) size: 3004MB >03:49:20,418 DEBUG storage.ui: disk /dev/sda growth: 6150144 (3003MB) >03:49:20,420 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd3 ; >03:49:20,420 DEBUG storage.ui: device sdd3 new partedPartition None >03:49:20,422 DEBUG storage.ui: PartitionDevice._setDisk: req14 ; new: None ; old: sdd ; >03:49:20,424 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sdd ; >03:49:20,424 DEBUG storage.ui: total growth: 24600576 sectors >03:49:20,425 DEBUG storage.ui: updating use_disk to sdd, type: 0 >03:49:20,425 DEBUG storage.ui: new free: 1837056-24575999 / 11103MB >03:49:20,425 DEBUG storage.ui: new free allows for 24600576 sectors of growth >03:49:20,426 DEBUG storage.ui: created partition sdd3 of 1MB and added it to /dev/sdd >03:49:20,428 DEBUG storage.ui: PartitionDevice._setPartedPartition: req14 ; >03:49:20,429 DEBUG storage.ui: device req14 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 3 path: /dev/sdd3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f04c50> PedPartition: <_ped.Partition object at 0x7fae04f8e170> >03:49:20,431 DEBUG storage.ui: PartitionDevice._setDisk: sdd3 ; new: sdd ; old: None ; >03:49:20,433 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdd ; >03:49:20,436 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd3 ; >03:49:20,437 DEBUG storage.ui: device sdd3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 3 path: /dev/sdd3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f04790> PedPartition: <_ped.Partition 
object at 0x7fae04f8e7d0> >03:49:20,437 DEBUG storage.ui: growPartitions: disks=['sda', 'sdb', 'sdc', 'sdd'], partitions=['sda3(id 37)', 'sdb3(id 38)', 'sdc3(id 39)', 'sdd3(id 40)', 'sda1(id 31)', 'sda2(id 25)', 'sdb1(id 32)', 'sdb2(id 26)', 'sdc1(id 33)', 'sdc2(id 27)', 'sdd1(id 34)', 'sdd2(id 28)'] >03:49:20,438 DEBUG storage.ui: growable partitions are ['sda3', 'sdb3', 'sdc3', 'sdd3'] >03:49:20,438 DEBUG storage.ui: adding request 37 to chunk 24575937 (63-24575999) on /dev/sda >03:49:20,439 DEBUG storage.ui: adding request 31 to chunk 24575937 (63-24575999) on /dev/sda >03:49:20,439 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda >03:49:20,440 DEBUG storage.ui: disk sda has 1 chunks >03:49:20,440 DEBUG storage.ui: adding request 38 to chunk 24575937 (63-24575999) on /dev/sdb >03:49:20,441 DEBUG storage.ui: adding request 32 to chunk 24575937 (63-24575999) on /dev/sdb >03:49:20,441 DEBUG storage.ui: adding request 26 to chunk 24575937 (63-24575999) on /dev/sdb >03:49:20,441 DEBUG storage.ui: disk sdb has 1 chunks >03:49:20,442 DEBUG storage.ui: adding request 39 to chunk 24575937 (63-24575999) on /dev/sdc >03:49:20,442 DEBUG storage.ui: adding request 33 to chunk 24575937 (63-24575999) on /dev/sdc >03:49:20,443 DEBUG storage.ui: adding request 27 to chunk 24575937 (63-24575999) on /dev/sdc >03:49:20,443 DEBUG storage.ui: disk sdc has 1 chunks >03:49:20,444 DEBUG storage.ui: adding request 40 to chunk 24575937 (63-24575999) on /dev/sdd >03:49:20,444 DEBUG storage.ui: adding request 34 to chunk 24575937 (63-24575999) on /dev/sdd >03:49:20,445 DEBUG storage.ui: adding request 28 to chunk 24575937 (63-24575999) on /dev/sdd >03:49:20,445 DEBUG storage.ui: disk sdd has 1 chunks >03:49:20,445 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > >03:49:20,446 DEBUG storage.ui: req: PartitionRequest instance -- >id = 31 name = sda1 growable = False >base = 1048576 growth = 0 max_grow = 0 
>done = True >03:49:20,446 DEBUG storage.ui: req: PartitionRequest instance -- >id = 25 name = sda2 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:49:20,447 DEBUG storage.ui: req: PartitionRequest instance -- >id = 37 name = sda3 growable = True >base = 2048 growth = 0 max_grow = 6150144 >done = False >03:49:20,447 DEBUG storage.ui: 1 requests and 22738881 (11102MB) left in chunk >03:49:20,447 DEBUG storage.ui: adding 22738881 (11102MB) to 37 (sda3) >03:49:20,448 DEBUG storage.ui: taking back 16588737 (8099MB) from 37 (sda3) >03:49:20,448 DEBUG storage.ui: new grow amount for request 37 (sda3) is 6150144 units, or 3003MB >03:49:20,448 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999 >sectorSize = 512 > >03:49:20,449 DEBUG storage.ui: req: PartitionRequest instance -- >id = 32 name = sdb1 growable = False >base = 1048576 growth = 0 max_grow = 0 >done = True >03:49:20,449 DEBUG storage.ui: req: PartitionRequest instance -- >id = 26 name = sdb2 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:49:20,450 DEBUG storage.ui: req: PartitionRequest instance -- >id = 38 name = sdb3 growable = True >base = 2048 growth = 0 max_grow = 6150144 >done = False >03:49:20,450 DEBUG storage.ui: 1 requests and 22738881 (11102MB) left in chunk >03:49:20,450 DEBUG storage.ui: adding 22738881 (11102MB) to 38 (sdb3) >03:49:20,451 DEBUG storage.ui: taking back 16588737 (8099MB) from 38 (sdb3) >03:49:20,451 DEBUG storage.ui: new grow amount for request 38 (sdb3) is 6150144 units, or 3003MB >03:49:20,451 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999 >sectorSize = 512 > >03:49:20,452 DEBUG storage.ui: req: PartitionRequest instance -- >id = 33 name = sdc1 growable = False >base = 1048576 growth = 0 max_grow = 0 >done = True >03:49:20,452 DEBUG storage.ui: req: PartitionRequest instance -- >id = 27 name = sdc2 growable = False >base = 786432 growth = 0 max_grow = 0 >done = 
True >03:49:20,452 DEBUG storage.ui: req: PartitionRequest instance -- >id = 39 name = sdc3 growable = True >base = 2048 growth = 0 max_grow = 6150144 >done = False >03:49:20,453 DEBUG storage.ui: 1 requests and 22738881 (11102MB) left in chunk >03:49:20,453 DEBUG storage.ui: adding 22738881 (11102MB) to 39 (sdc3) >03:49:20,454 DEBUG storage.ui: taking back 16588737 (8099MB) from 39 (sdc3) >03:49:20,454 DEBUG storage.ui: new grow amount for request 39 (sdc3) is 6150144 units, or 3003MB >03:49:20,454 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999 >sectorSize = 512 > >03:49:20,455 DEBUG storage.ui: req: PartitionRequest instance -- >id = 34 name = sdd1 growable = False >base = 1048576 growth = 0 max_grow = 0 >done = True >03:49:20,455 DEBUG storage.ui: req: PartitionRequest instance -- >id = 28 name = sdd2 growable = False >base = 786432 growth = 0 max_grow = 0 >done = True >03:49:20,455 DEBUG storage.ui: req: PartitionRequest instance -- >id = 40 name = sdd3 growable = True >base = 2048 growth = 0 max_grow = 6150144 >done = False >03:49:20,456 DEBUG storage.ui: 1 requests and 22738881 (11102MB) left in chunk >03:49:20,456 DEBUG storage.ui: adding 22738881 (11102MB) to 40 (sdd3) >03:49:20,457 DEBUG storage.ui: taking back 16588737 (8099MB) from 40 (sdd3) >03:49:20,457 DEBUG storage.ui: new grow amount for request 40 (sdd3) is 6150144 units, or 3003MB >03:49:20,457 DEBUG storage.ui: set: ['sda3', 'sdb3', 'sdc3', 'sdd3'] 3004 >03:49:20,457 DEBUG storage.ui: min growth is 6150144 >03:49:20,458 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 37 name = sda3 growable = True >base = 2048 growth = 6150144 max_grow = 6150144 >done = True is 6150144 >03:49:20,458 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 38 name = sdb3 growable = True >base = 2048 growth = 6150144 max_grow = 6150144 >done = True is 6150144 >03:49:20,459 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 39 name 
= sdc3 growable = True >base = 2048 growth = 6150144 max_grow = 6150144 >done = True is 6150144 >03:49:20,459 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 40 name = sdd3 growable = True >base = 2048 growth = 6150144 max_grow = 6150144 >done = True is 6150144 >03:49:20,459 DEBUG storage.ui: set: ['sda3', 'sdb3', 'sdc3', 'sdd3'] 3004 >03:49:20,460 DEBUG storage.ui: min growth is 6150144 >03:49:20,460 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 37 name = sda3 growable = True >base = 2048 growth = 6150144 max_grow = 6150144 >done = True is 6150144 >03:49:20,460 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 38 name = sdb3 growable = True >base = 2048 growth = 6150144 max_grow = 6150144 >done = True is 6150144 >03:49:20,461 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 39 name = sdc3 growable = True >base = 2048 growth = 6150144 max_grow = 6150144 >done = True is 6150144 >03:49:20,461 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 40 name = sdd3 growable = True >base = 2048 growth = 6150144 max_grow = 6150144 >done = True is 6150144 >03:49:20,462 DEBUG storage.ui: growing partitions on sda >03:49:20,462 DEBUG storage.ui: partition sda1 (31): 0 >03:49:20,462 DEBUG storage.ui: new geometry for sda1: parted.Geometry instance -- > start: 2048 end: 1050623 length: 1048576 > device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae04fb2c90> >03:49:20,463 DEBUG storage.ui: partition sda2 (25): 0 >03:49:20,463 DEBUG storage.ui: new geometry for sda2: parted.Geometry instance -- > start: 1050624 end: 1837055 length: 786432 > device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae04fb2310> >03:49:20,464 DEBUG storage.ui: partition sda3 (37): 0 >03:49:20,464 DEBUG storage.ui: new geometry for sda3: parted.Geometry instance -- > start: 1837056 end: 7989247 length: 6152192 > device: 
<parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae04f04110> >03:49:20,465 DEBUG storage.ui: removing all non-preexisting partitions ['sda1(id 31)', 'sda2(id 25)', 'sda3(id 37)'] from disk(s) ['sda'] >03:49:20,467 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:49:20,467 DEBUG storage.ui: device sda1 new partedPartition None >03:49:20,469 DEBUG storage.ui: PartitionDevice._setDisk: req6 ; new: None ; old: sda ; >03:49:20,471 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sda ; >03:49:20,473 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda2 ; >03:49:20,474 DEBUG storage.ui: device sda2 new partedPartition None >03:49:20,476 DEBUG storage.ui: PartitionDevice._setDisk: req1 ; new: None ; old: sda ; >03:49:20,478 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sda ; >03:49:20,480 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda3 ; >03:49:20,480 DEBUG storage.ui: device sda3 new partedPartition None >03:49:20,482 DEBUG storage.ui: PartitionDevice._setDisk: req11 ; new: None ; old: sda ; >03:49:20,485 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sda ; >03:49:20,485 DEBUG storage.ui: back from removeNewPartitions >03:49:20,485 DEBUG storage.ui: extended: None >03:49:20,486 DEBUG storage.ui: setting req6 new geometry: parted.Geometry instance -- > start: 2048 end: 1050623 length: 1048576 > device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae04fb2c90> >03:49:20,488 DEBUG storage.ui: PartitionDevice._setPartedPartition: req6 ; >03:49:20,489 DEBUG storage.ui: device req6 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f04c90> PedPartition: <_ped.Partition object at 0x7fae04fb3ad0> >03:49:20,491 DEBUG storage.ui: 
PartitionDevice._setDisk: sda1 ; new: sda ; old: None ; >03:49:20,493 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sda ; >03:49:20,496 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:49:20,497 DEBUG storage.ui: device sda1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f00550> PedPartition: <_ped.Partition object at 0x7fae04fb3f50> >03:49:20,497 DEBUG storage.ui: setting req1 new geometry: parted.Geometry instance -- > start: 1050624 end: 1837055 length: 786432 > device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae04fb2310> >03:49:20,500 DEBUG storage.ui: PartitionDevice._setPartedPartition: req1 ; >03:49:20,500 DEBUG storage.ui: device req1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 2 path: /dev/sda2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04eff410> PedPartition: <_ped.Partition object at 0x7fae04fb3d70> >03:49:20,503 DEBUG storage.ui: PartitionDevice._setDisk: sda2 ; new: sda ; old: None ; >03:49:20,505 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sda ; >03:49:20,507 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda2 ; >03:49:20,508 DEBUG storage.ui: device sda2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 2 path: /dev/sda2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fb2850> PedPartition: <_ped.Partition object at 0x7fae04f8e350> >03:49:20,508 DEBUG storage.ui: setting req11 new geometry: parted.Geometry instance -- > start: 1837056 end: 7989247 length: 6152192 > device: <parted.device.Device 
object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae04f04110> >03:49:20,510 DEBUG storage.ui: PartitionDevice._setPartedPartition: req11 ; >03:49:20,511 DEBUG storage.ui: device req11 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 3 path: /dev/sda3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04efa510> PedPartition: <_ped.Partition object at 0x7fae04f8e590> >03:49:20,513 DEBUG storage.ui: PartitionDevice._setDisk: sda3 ; new: sda ; old: None ; >03:49:20,515 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sda ; >03:49:20,517 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda3 ; >03:49:20,518 DEBUG storage.ui: device sda3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 3 path: /dev/sda3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f00990> PedPartition: <_ped.Partition object at 0x7fae04f8e2f0> >03:49:20,518 DEBUG storage.ui: growing partitions on sdb >03:49:20,519 DEBUG storage.ui: partition sdb1 (32): 0 >03:49:20,519 DEBUG storage.ui: new geometry for sdb1: parted.Geometry instance -- > start: 2048 end: 1050623 length: 1048576 > device: <parted.device.Device object at 0x7fae05b09ad0> PedGeometry: <_ped.Geometry object at 0x7fae04f04a10> >03:49:20,520 DEBUG storage.ui: partition sdb2 (26): 0 >03:49:20,520 DEBUG storage.ui: new geometry for sdb2: parted.Geometry instance -- > start: 1050624 end: 1837055 length: 786432 > device: <parted.device.Device object at 0x7fae05b09ad0> PedGeometry: <_ped.Geometry object at 0x7fae04fb2c10> >03:49:20,521 DEBUG storage.ui: partition sdb3 (38): 0 >03:49:20,522 DEBUG storage.ui: new geometry for sdb3: parted.Geometry instance -- > start: 1837056 end: 7989247 length: 6152192 > device: <parted.device.Device object at 0x7fae05b09ad0> 
PedGeometry: <_ped.Geometry object at 0x7fae04f00850> >03:49:20,522 DEBUG storage.ui: removing all non-preexisting partitions ['sdb1(id 32)', 'sdb2(id 26)', 'sdb3(id 38)'] from disk(s) ['sdb'] >03:49:20,524 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ; >03:49:20,525 DEBUG storage.ui: device sdb1 new partedPartition None >03:49:20,527 DEBUG storage.ui: PartitionDevice._setDisk: req7 ; new: None ; old: sdb ; >03:49:20,529 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sdb ; >03:49:20,531 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb2 ; >03:49:20,531 DEBUG storage.ui: device sdb2 new partedPartition None >03:49:20,533 DEBUG storage.ui: PartitionDevice._setDisk: req2 ; new: None ; old: sdb ; >03:49:20,536 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdb ; >03:49:20,538 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb3 ; >03:49:20,538 DEBUG storage.ui: device sdb3 new partedPartition None >03:49:20,541 DEBUG storage.ui: PartitionDevice._setDisk: req12 ; new: None ; old: sdb ; >03:49:20,542 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdb ; >03:49:20,543 DEBUG storage.ui: back from removeNewPartitions >03:49:20,543 DEBUG storage.ui: extended: None >03:49:20,544 DEBUG storage.ui: setting req7 new geometry: parted.Geometry instance -- > start: 2048 end: 1050623 length: 1048576 > device: <parted.device.Device object at 0x7fae05b09ad0> PedGeometry: <_ped.Geometry object at 0x7fae04f04a10> >03:49:20,546 DEBUG storage.ui: PartitionDevice._setPartedPartition: req7 ; >03:49:20,547 DEBUG storage.ui: device req7 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fb2890> PedPartition: <_ped.Partition object at 0x7fae04fb3a70> >03:49:20,549 DEBUG storage.ui: PartitionDevice._setDisk: sdb1 ; new: sdb ; old: None ; 
>03:49:20,551 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdb ; >03:49:20,553 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ; >03:49:20,554 DEBUG storage.ui: device sdb1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fb2510> PedPartition: <_ped.Partition object at 0x7fae04fb3ad0> >03:49:20,555 DEBUG storage.ui: setting req2 new geometry: parted.Geometry instance -- > start: 1050624 end: 1837055 length: 786432 > device: <parted.device.Device object at 0x7fae05b09ad0> PedGeometry: <_ped.Geometry object at 0x7fae04fb2c10> >03:49:20,557 DEBUG storage.ui: PartitionDevice._setPartedPartition: req2 ; >03:49:20,558 DEBUG storage.ui: device req2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 2 path: /dev/sdb2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04eff410> PedPartition: <_ped.Partition object at 0x7fae04f8e950> >03:49:20,560 DEBUG storage.ui: PartitionDevice._setDisk: sdb2 ; new: sdb ; old: None ; >03:49:20,562 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdb ; >03:49:20,564 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb2 ; >03:49:20,565 DEBUG storage.ui: device sdb2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 2 path: /dev/sdb2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fb2950> PedPartition: <_ped.Partition object at 0x7fae04f8e1d0> >03:49:20,566 DEBUG storage.ui: setting req12 new geometry: parted.Geometry instance -- > start: 1837056 end: 7989247 length: 6152192 > device: <parted.device.Device object at 0x7fae05b09ad0> PedGeometry: <_ped.Geometry 
object at 0x7fae04f00850> >03:49:20,568 DEBUG storage.ui: PartitionDevice._setPartedPartition: req12 ; >03:49:20,569 DEBUG storage.ui: device req12 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 3 path: /dev/sdb3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04ef7810> PedPartition: <_ped.Partition object at 0x7fae04f8e8f0> >03:49:20,571 DEBUG storage.ui: PartitionDevice._setDisk: sdb3 ; new: sdb ; old: None ; >03:49:20,573 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdb ; >03:49:20,576 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb3 ; >03:49:20,576 DEBUG storage.ui: device sdb3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 3 path: /dev/sdb3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04efac10> PedPartition: <_ped.Partition object at 0x7fae04f8e410> >03:49:20,577 DEBUG storage.ui: growing partitions on sdc >03:49:20,577 DEBUG storage.ui: partition sdc1 (33): 0 >03:49:20,578 DEBUG storage.ui: new geometry for sdc1: parted.Geometry instance -- > start: 2048 end: 1050623 length: 1048576 > device: <parted.device.Device object at 0x7fae05aee390> PedGeometry: <_ped.Geometry object at 0x7fae04ef7350> >03:49:20,578 DEBUG storage.ui: partition sdc2 (27): 0 >03:49:20,579 DEBUG storage.ui: new geometry for sdc2: parted.Geometry instance -- > start: 1050624 end: 1837055 length: 786432 > device: <parted.device.Device object at 0x7fae05aee390> PedGeometry: <_ped.Geometry object at 0x7fae04fb2e50> >03:49:20,579 DEBUG storage.ui: partition sdc3 (39): 0 >03:49:20,580 DEBUG storage.ui: new geometry for sdc3: parted.Geometry instance -- > start: 1837056 end: 7989247 length: 6152192 > device: <parted.device.Device object at 0x7fae05aee390> PedGeometry: <_ped.Geometry object at 0x7fae04f00290> 
>03:49:20,580 DEBUG storage.ui: removing all non-preexisting partitions ['sdc1(id 33)', 'sdc2(id 27)', 'sdc3(id 39)'] from disk(s) ['sdc'] >03:49:20,582 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ; >03:49:20,583 DEBUG storage.ui: device sdc1 new partedPartition None >03:49:20,585 DEBUG storage.ui: PartitionDevice._setDisk: req8 ; new: None ; old: sdc ; >03:49:20,587 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sdc ; >03:49:20,589 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc2 ; >03:49:20,589 DEBUG storage.ui: device sdc2 new partedPartition None >03:49:20,591 DEBUG storage.ui: PartitionDevice._setDisk: req3 ; new: None ; old: sdc ; >03:49:20,593 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdc ; >03:49:20,596 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc3 ; >03:49:20,596 DEBUG storage.ui: device sdc3 new partedPartition None >03:49:20,598 DEBUG storage.ui: PartitionDevice._setDisk: req13 ; new: None ; old: sdc ; >03:49:20,600 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdc ; >03:49:20,600 DEBUG storage.ui: back from removeNewPartitions >03:49:20,600 DEBUG storage.ui: extended: None >03:49:20,601 DEBUG storage.ui: setting req8 new geometry: parted.Geometry instance -- > start: 2048 end: 1050623 length: 1048576 > device: <parted.device.Device object at 0x7fae05aee390> PedGeometry: <_ped.Geometry object at 0x7fae04ef7350> >03:49:20,603 DEBUG storage.ui: PartitionDevice._setPartedPartition: req8 ; >03:49:20,604 DEBUG storage.ui: device req8 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 1 path: /dev/sdc1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f04890> PedPartition: <_ped.Partition object at 0x7fae04fb3d70> >03:49:20,606 DEBUG storage.ui: PartitionDevice._setDisk: sdc1 ; new: sdc ; old: None ; >03:49:20,608 DEBUG storage.ui: DiskDevice.addChild: 
kids: 0 ; name: sdc ; >03:49:20,611 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ; >03:49:20,611 DEBUG storage.ui: device sdc1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 1 path: /dev/sdc1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f04290> PedPartition: <_ped.Partition object at 0x7fae04f8e530> >03:49:20,612 DEBUG storage.ui: setting req3 new geometry: parted.Geometry instance -- > start: 1050624 end: 1837055 length: 786432 > device: <parted.device.Device object at 0x7fae05aee390> PedGeometry: <_ped.Geometry object at 0x7fae04fb2e50> >03:49:20,614 DEBUG storage.ui: PartitionDevice._setPartedPartition: req3 ; >03:49:20,615 DEBUG storage.ui: device req3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 2 path: /dev/sdc2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04ef77d0> PedPartition: <_ped.Partition object at 0x7fae04f8e590> >03:49:20,618 DEBUG storage.ui: PartitionDevice._setDisk: sdc2 ; new: sdc ; old: None ; >03:49:20,620 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdc ; >03:49:20,622 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc2 ; >03:49:20,623 DEBUG storage.ui: device sdc2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 2 path: /dev/sdc2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04ef7750> PedPartition: <_ped.Partition object at 0x7fae04fb3cb0> >03:49:20,624 DEBUG storage.ui: setting req13 new geometry: parted.Geometry instance -- > start: 1837056 end: 7989247 length: 6152192 > device: <parted.device.Device object at 0x7fae05aee390> PedGeometry: <_ped.Geometry object at 0x7fae04f00290> >03:49:20,626 DEBUG 
storage.ui: PartitionDevice._setPartedPartition: req13 ; >03:49:20,627 DEBUG storage.ui: device req13 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 3 path: /dev/sdc3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04efab90> PedPartition: <_ped.Partition object at 0x7fae04f8e110> >03:49:20,629 DEBUG storage.ui: PartitionDevice._setDisk: sdc3 ; new: sdc ; old: None ; >03:49:20,631 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdc ; >03:49:20,634 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc3 ; >03:49:20,634 DEBUG storage.ui: device sdc3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 3 path: /dev/sdc3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f04b10> PedPartition: <_ped.Partition object at 0x7fae04f8e290> >03:49:20,635 DEBUG storage.ui: growing partitions on sdd >03:49:20,635 DEBUG storage.ui: partition sdd1 (34): 0 >03:49:20,636 DEBUG storage.ui: new geometry for sdd1: parted.Geometry instance -- > start: 2048 end: 1050623 length: 1048576 > device: <parted.device.Device object at 0x7fae05b09b90> PedGeometry: <_ped.Geometry object at 0x7fae04fb2a50> >03:49:20,636 DEBUG storage.ui: partition sdd2 (28): 0 >03:49:20,636 DEBUG storage.ui: new geometry for sdd2: parted.Geometry instance -- > start: 1050624 end: 1837055 length: 786432 > device: <parted.device.Device object at 0x7fae05b09b90> PedGeometry: <_ped.Geometry object at 0x7fae04ef77d0> >03:49:20,637 DEBUG storage.ui: partition sdd3 (40): 0 >03:49:20,637 DEBUG storage.ui: new geometry for sdd3: parted.Geometry instance -- > start: 1837056 end: 7989247 length: 6152192 > device: <parted.device.Device object at 0x7fae05b09b90> PedGeometry: <_ped.Geometry object at 0x7fae04effb50> >03:49:20,638 DEBUG storage.ui: removing all 
non-preexisting partitions ['sdd1(id 34)', 'sdd2(id 28)', 'sdd3(id 40)'] from disk(s) ['sdd'] >03:49:20,640 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ; >03:49:20,640 DEBUG storage.ui: device sdd1 new partedPartition None >03:49:20,642 DEBUG storage.ui: PartitionDevice._setDisk: req9 ; new: None ; old: sdd ; >03:49:20,644 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sdd ; >03:49:20,646 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd2 ; >03:49:20,646 DEBUG storage.ui: device sdd2 new partedPartition None >03:49:20,648 DEBUG storage.ui: PartitionDevice._setDisk: req4 ; new: None ; old: sdd ; >03:49:20,650 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdd ; >03:49:20,653 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd3 ; >03:49:20,653 DEBUG storage.ui: device sdd3 new partedPartition None >03:49:20,655 DEBUG storage.ui: PartitionDevice._setDisk: req14 ; new: None ; old: sdd ; >03:49:20,657 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdd ; >03:49:20,657 DEBUG storage.ui: back from removeNewPartitions >03:49:20,658 DEBUG storage.ui: extended: None >03:49:20,658 DEBUG storage.ui: setting req9 new geometry: parted.Geometry instance -- > start: 2048 end: 1050623 length: 1048576 > device: <parted.device.Device object at 0x7fae05b09b90> PedGeometry: <_ped.Geometry object at 0x7fae04fb2a50> >03:49:20,660 DEBUG storage.ui: PartitionDevice._setPartedPartition: req9 ; >03:49:20,661 DEBUG storage.ui: device req9 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 1 path: /dev/sdd1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fb23d0> PedPartition: <_ped.Partition object at 0x7fae04fb3d70> >03:49:20,663 DEBUG storage.ui: PartitionDevice._setDisk: sdd1 ; new: sdd ; old: None ; >03:49:20,665 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdd ; >03:49:20,668 DEBUG 
storage.ui: PartitionDevice._setPartedPartition: sdd1 ; >03:49:20,669 DEBUG storage.ui: device sdd1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 1 path: /dev/sdd1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fb2bd0> PedPartition: <_ped.Partition object at 0x7fae04f8e8f0> >03:49:20,669 DEBUG storage.ui: setting req4 new geometry: parted.Geometry instance -- > start: 1050624 end: 1837055 length: 786432 > device: <parted.device.Device object at 0x7fae05b09b90> PedGeometry: <_ped.Geometry object at 0x7fae04ef77d0> >03:49:20,671 DEBUG storage.ui: PartitionDevice._setPartedPartition: req4 ; >03:49:20,672 DEBUG storage.ui: device req4 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 2 path: /dev/sdd2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04eff1d0> PedPartition: <_ped.Partition object at 0x7fae04fb3a70> >03:49:20,675 DEBUG storage.ui: PartitionDevice._setDisk: sdd2 ; new: sdd ; old: None ; >03:49:20,677 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdd ; >03:49:20,679 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd2 ; >03:49:20,680 DEBUG storage.ui: device sdd2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 2 path: /dev/sdd2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f040d0> PedPartition: <_ped.Partition object at 0x7fae04f8e770> >03:49:20,681 DEBUG storage.ui: setting req14 new geometry: parted.Geometry instance -- > start: 1837056 end: 7989247 length: 6152192 > device: <parted.device.Device object at 0x7fae05b09b90> PedGeometry: <_ped.Geometry object at 0x7fae04effb50> >03:49:20,683 DEBUG storage.ui: PartitionDevice._setPartedPartition: 
req14 ; >03:49:20,683 DEBUG storage.ui: device req14 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 3 path: /dev/sdd3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04ef7410> PedPartition: <_ped.Partition object at 0x7fae04f8e590> >03:49:20,686 DEBUG storage.ui: PartitionDevice._setDisk: sdd3 ; new: sdd ; old: None ; >03:49:20,688 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdd ; >03:49:20,690 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd3 ; >03:49:20,691 DEBUG storage.ui: device sdd3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 3 path: /dev/sdd3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f00c90> PedPartition: <_ped.Partition object at 0x7fae04f8e050> >03:49:20,692 DEBUG storage.ui: fixing size of non-existent 512MB partition sda1 (31) with non-existent mdmember at 512.00 >03:49:20,692 DEBUG storage.ui: fixing size of non-existent 384MB partition sda2 (25) with non-existent mdmember at 384.00 >03:49:20,693 DEBUG storage.ui: fixing size of non-existent 3004MB partition sda3 (37) with non-existent mdmember at 3004.00 >03:49:20,694 DEBUG storage.ui: fixing size of non-existent 512MB partition sdb1 (32) with non-existent mdmember at 512.00 >03:49:20,694 DEBUG storage.ui: fixing size of non-existent 384MB partition sdb2 (26) with non-existent mdmember at 384.00 >03:49:20,695 DEBUG storage.ui: fixing size of non-existent 3004MB partition sdb3 (38) with non-existent mdmember at 3004.00 >03:49:20,696 DEBUG storage.ui: fixing size of non-existent 512MB partition sdc1 (33) with non-existent mdmember at 512.00 >03:49:20,696 DEBUG storage.ui: fixing size of non-existent 384MB partition sdc2 (27) with non-existent mdmember at 384.00 >03:49:20,697 DEBUG storage.ui: fixing size of 
non-existent 3004MB partition sdc3 (39) with non-existent mdmember at 3004.00 >03:49:20,698 DEBUG storage.ui: fixing size of non-existent 512MB partition sdd1 (34) with non-existent mdmember at 512.00 >03:49:20,698 DEBUG storage.ui: fixing size of non-existent 384MB partition sdd2 (28) with non-existent mdmember at 384.00 >03:49:20,699 DEBUG storage.ui: fixing size of non-existent 3004MB partition sdd3 (40) with non-existent mdmember at 3004.00 >03:49:20,703 DEBUG storage.ui: Ext4FS.supported: supported: True ; >03:49:20,703 DEBUG storage.ui: getFormat('ext4') returning Ext4FS instance >03:49:20,708 DEBUG storage.ui: PartitionDevice.addChild: kids: 0 ; name: sda3 ; >03:49:20,710 DEBUG storage.ui: PartitionDevice.addChild: kids: 0 ; name: sdb3 ; >03:49:20,712 DEBUG storage.ui: PartitionDevice.addChild: kids: 0 ; name: sdc3 ; >03:49:20,714 DEBUG storage.ui: PartitionDevice.addChild: kids: 0 ; name: sdd3 ; >03:49:20,716 DEBUG storage.ui: MDRaidArrayDevice._setFormat: root ; current: None ; type: ext4 ; >03:49:20,716 INFO storage.ui: added mdarray root (id 41) to device tree >03:49:20,717 INFO storage.ui: registered action: [69] Create Device mdarray root (id 41) >03:49:20,718 DEBUG storage.ui: getFormat('None') returning DeviceFormat instance >03:49:20,718 INFO storage.ui: registered action: [70] Create Format ext4 filesystem mounted at / on mdarray root (id 41) >03:49:20,720 DEBUG storage.ui: raw RAID 10 size == 6008.0 >03:49:20,720 INFO storage.ui: Using 4MB superBlockSize >03:49:20,721 DEBUG storage.ui: non-existent RAID 10 size == 6000.0 >03:49:20,722 DEBUG storage.ui: raw RAID 10 size == 6008.0 >03:49:20,722 INFO storage.ui: Using 4MB superBlockSize >03:49:20,723 DEBUG storage.ui: non-existent RAID 10 size == 6000.0 >03:49:20,726 DEBUG blivet: raw RAID 1 size == 512.0 >03:49:20,726 INFO blivet: Using 0MB superBlockSize >03:49:20,727 DEBUG blivet: non-existent RAID 1 size == 512.0 >03:49:20,730 DEBUG blivet: raw RAID 10 size == 6008.0 >03:49:20,730 INFO blivet: 
Using 4MB superBlockSize >03:49:20,731 DEBUG blivet: non-existent RAID 10 size == 6000.0 >03:49:20,733 DEBUG blivet: raw RAID 10 size == 768.0 >03:49:20,734 INFO blivet: Using 0MB superBlockSize >03:49:20,734 DEBUG blivet: non-existent RAID 10 size == 768.0 >03:49:20,748 DEBUG blivet: raw RAID 10 size == 6008.0 >03:49:20,749 INFO blivet: Using 4MB superBlockSize >03:49:20,749 DEBUG blivet: non-existent RAID 10 size == 6000.0 >03:49:20,753 DEBUG blivet: raw RAID 10 size == 6008.0 >03:49:20,753 INFO blivet: Using 4MB superBlockSize >03:49:20,754 DEBUG blivet: non-existent RAID 10 size == 6000.0 >03:49:37,905 DEBUG blivet: raw RAID 10 size == 6008.0 >03:49:37,906 INFO blivet: Using 4MB superBlockSize >03:49:37,907 DEBUG blivet: non-existent RAID 10 size == 6000.0 >03:49:37,916 DEBUG blivet: Ext4FS.supported: supported: True ; >03:49:37,916 DEBUG blivet: getFormat('ext4') returning Ext4FS instance >03:49:37,923 DEBUG storage.ui: instantiating <class 'blivet.devicefactory.MDFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 6000, ['sda', 'sdb', 'sdc', 'sdd'], {'encrypted': False, 'raid_level': 'raid10'} >03:49:42,577 DEBUG blivet: raw RAID 10 size == 6008.0 >03:49:42,579 INFO blivet: Using 4MB superBlockSize >03:49:42,581 DEBUG blivet: non-existent RAID 10 size == 6000.0 >03:49:42,591 DEBUG blivet: raw RAID 10 size == 6008.0 >03:49:42,591 INFO blivet: Using 4MB superBlockSize >03:49:42,592 DEBUG blivet: non-existent RAID 10 size == 6000.0 >03:49:42,602 DEBUG blivet: Ext4FS.supported: supported: True ; >03:49:42,602 DEBUG blivet: getFormat('ext4') returning Ext4FS instance >03:49:42,610 DEBUG storage.ui: instantiating <class 'blivet.devicefactory.MDFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 6000, ['sda', 'sdb', 'sdc', 'sdd'], {'encrypted': False, 'raid_level': 'raid10'} >03:49:42,618 DEBUG blivet: raw RAID 10 size == 6008.0 >03:49:42,619 INFO blivet: Using 4MB superBlockSize >03:49:42,619 DEBUG blivet: non-existent RAID 10 size == 6000.0 >03:49:42,621 DEBUG 
blivet: raw RAID 10 size == 6008.0 >03:49:42,622 INFO blivet: Using 4MB superBlockSize >03:49:42,622 DEBUG blivet: non-existent RAID 10 size == 6000.0 >03:49:42,628 DEBUG blivet: raw RAID 10 size == 6008.0 >03:49:42,628 INFO blivet: Using 4MB superBlockSize >03:49:42,629 DEBUG blivet: non-existent RAID 10 size == 6000.0 >03:49:43,843 DEBUG blivet: raw RAID 10 size == 6008.0 >03:49:43,845 INFO blivet: Using 4MB superBlockSize >03:49:43,846 DEBUG blivet: non-existent RAID 10 size == 6000.0 >03:49:43,857 DEBUG blivet: raw RAID 10 size == 6008.0 >03:49:43,858 INFO blivet: Using 4MB superBlockSize >03:49:43,858 DEBUG blivet: non-existent RAID 10 size == 6000.0 >03:49:43,868 DEBUG blivet: Ext4FS.supported: supported: True ; >03:49:43,868 DEBUG blivet: getFormat('ext4') returning Ext4FS instance >03:49:43,875 DEBUG storage.ui: instantiating <class 'blivet.devicefactory.MDFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 6000, ['sda', 'sdb', 'sdc', 'sdd'], {'encrypted': False, 'raid_level': 'raid10'} >03:49:43,883 DEBUG blivet: raw RAID 1 size == 512.0 >03:49:43,883 INFO blivet: Using 0MB superBlockSize >03:49:43,884 DEBUG blivet: non-existent RAID 1 size == 512.0 >03:49:43,886 DEBUG blivet: raw RAID 1 size == 512.0 >03:49:43,886 INFO blivet: Using 0MB superBlockSize >03:49:43,887 DEBUG blivet: non-existent RAID 1 size == 512.0 >03:49:43,893 DEBUG blivet: raw RAID 1 size == 512.0 >03:49:43,893 INFO blivet: Using 0MB superBlockSize >03:49:43,894 DEBUG blivet: non-existent RAID 1 size == 512.0 >03:49:46,633 DEBUG blivet: raw RAID 1 size == 512.0 >03:49:46,635 INFO blivet: Using 0MB superBlockSize >03:49:46,636 DEBUG blivet: non-existent RAID 1 size == 512.0 >03:49:46,643 DEBUG blivet: raw RAID 1 size == 512.0 >03:49:46,644 INFO blivet: Using 0MB superBlockSize >03:49:46,644 DEBUG blivet: non-existent RAID 1 size == 512.0 >03:49:46,653 DEBUG blivet: Ext4FS.supported: supported: True ; >03:49:46,653 DEBUG blivet: getFormat('ext4') returning Ext4FS instance >03:49:46,660 
DEBUG storage.ui: instantiating <class 'blivet.devicefactory.MDFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 512, ['sda', 'sdb', 'sdc', 'sdd'], {'encrypted': False, 'raid_level': 'raid1'} >03:49:46,668 DEBUG blivet: raw RAID 10 size == 768.0 >03:49:46,668 INFO blivet: Using 0MB superBlockSize >03:49:46,669 DEBUG blivet: non-existent RAID 10 size == 768.0 >03:49:46,671 DEBUG blivet: raw RAID 10 size == 768.0 >03:49:46,671 INFO blivet: Using 0MB superBlockSize >03:49:46,672 DEBUG blivet: non-existent RAID 10 size == 768.0 >03:49:46,677 DEBUG blivet: raw RAID 10 size == 768.0 >03:49:46,678 INFO blivet: Using 0MB superBlockSize >03:49:46,679 DEBUG blivet: non-existent RAID 10 size == 768.0 >03:49:46,683 DEBUG blivet: SwapSpace.__init__: >03:49:46,684 DEBUG blivet: getFormat('swap') returning SwapSpace instance >03:49:50,498 DEBUG blivet: raw RAID 10 size == 768.0 >03:49:50,500 INFO blivet: Using 0MB superBlockSize >03:49:50,502 DEBUG blivet: non-existent RAID 10 size == 768.0 >03:49:50,510 DEBUG blivet: raw RAID 10 size == 768.0 >03:49:50,510 INFO blivet: Using 0MB superBlockSize >03:49:50,511 DEBUG blivet: non-existent RAID 10 size == 768.0 >03:49:50,520 DEBUG blivet: SwapSpace.__init__: >03:49:50,520 DEBUG blivet: getFormat('swap') returning SwapSpace instance >03:49:50,527 DEBUG storage.ui: instantiating <class 'blivet.devicefactory.MDFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 768, ['sda', 'sdb', 'sdc', 'sdd'], {'encrypted': False, 'raid_level': 'raid10'} >03:49:50,539 DEBUG storage.ui: raw RAID 10 size == 768.0 >03:49:50,539 INFO storage.ui: Using 0MB superBlockSize >03:49:50,540 DEBUG storage.ui: non-existent RAID 10 size == 768.0 >03:49:50,541 DEBUG storage.ui: Blivet.factoryDevice: 1 ; 768 ; container_raid_level: None ; name: swap ; encrypted: False ; container_encrypted: False ; disks: [DiskDevice instance (0x7fae05319950) -- > name = sda status = True kids = 3 id = 1 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos 
disklabel > major = 8 minor = 0 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sda type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 0 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae127eacb0> > target size = 0 path = /dev/sda > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae053199d0>, DiskDevice instance (0x7fae05b116d0) -- > name = sdb status = True kids = 3 id = 14 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 16 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdb type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 768 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f674a70> > target size = 0 path = /dev/sdb > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05b11750>, DiskDevice instance (0x7fae05aeabd0) -- > name = sdc status = True kids = 3 id = 11 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 32 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdc type: 1 > sectorSize: 512 
physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 512 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f674170> > target size = 0 path = /dev/sdc > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05aeac50>, DiskDevice instance (0x7fae05aea190) -- > name = sdd status = True kids = 3 id = 8 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 48 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdd type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 256 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae27c59680> > target size = 0 path = /dev/sdd > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05aea210>] ; raid_level: raid10 ; label: ; container_name: None ; device: non-existent 768MB mdarray swap (29) with non-existent swap ; mountpoint: None ; fstype: swap ; container_size: 0 ; >03:49:50,544 DEBUG storage.ui: raw RAID 10 size == 768.0 >03:49:50,544 INFO storage.ui: Using 0MB superBlockSize >03:49:50,545 DEBUG storage.ui: non-existent RAID 10 size == 768.0 >03:49:50,546 DEBUG storage.ui: instantiating <class 'blivet.devicefactory.MDFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 768, ['sda', 'sdb', 'sdc', 'sdd'], {'container_raid_level': None, 'name': 'swap', 'encrypted': False, 'container_encrypted': False, 'raid_level': 'raid10', 'label': '', 'container_name': None, 'device': 
MDRaidArrayDevice instance (0x7fae04f96bd0) -- > name = swap status = False kids = 0 id = 29 > parents = ['non-existent 384MB partition sda2 (25) with non-existent mdmember', > 'non-existent 384MB partition sdb2 (26) with non-existent mdmember', > 'non-existent 384MB partition sdc2 (27) with non-existent mdmember', > 'non-existent 384MB partition sdd2 (28) with non-existent mdmember'] > uuid = None size = 768.0 > format = non-existent swap > major = 0 minor = 0 exists = False protected = False > sysfs path = partedDevice = None > target size = 768 path = /dev/md/swap > format args = None originalFormat = swap level = 10 spares = 0 > members = 4 > total devices = 4 metadata version = default, 'mountpoint': None, 'fstype': 'swap', 'container_size': 0} >03:49:50,548 DEBUG storage.ui: MDFactory.configure: parent_factory: None ; >03:49:50,548 DEBUG storage.ui: starting Blivet copy >03:49:50,593 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:49:50,595 DEBUG storage.ui: device sda1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04f984d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f25750> PedPartition: <_ped.Partition object at 0x7fae04f8ea70> >03:49:50,597 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda2 ; >03:49:50,598 DEBUG storage.ui: device sda2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04f984d0> fileSystem: None > number: 2 path: /dev/sda2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f258d0> PedPartition: <_ped.Partition object at 0x7fae04f8ead0> >03:49:50,600 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda3 ; >03:49:50,601 DEBUG storage.ui: device sda3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04f984d0> fileSystem: None > 
number: 3 path: /dev/sda3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f25a50> PedPartition: <_ped.Partition object at 0x7fae04f8ea10> >03:49:50,604 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ; >03:49:50,605 DEBUG storage.ui: device sdb1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04f3e8d0> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f25b90> PedPartition: <_ped.Partition object at 0x7fae04f8e7d0> >03:49:50,608 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb2 ; >03:49:50,609 DEBUG storage.ui: device sdb2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04f3e8d0> fileSystem: None > number: 2 path: /dev/sdb2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f25c90> PedPartition: <_ped.Partition object at 0x7fae04f8e110> >03:49:50,611 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb3 ; >03:49:50,612 DEBUG storage.ui: device sdb3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04f3e8d0> fileSystem: None > number: 3 path: /dev/sdb3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f25e10> PedPartition: <_ped.Partition object at 0x7fae04f8e170> >03:49:50,615 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ; >03:49:50,616 DEBUG storage.ui: device sdc1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04f001d0> fileSystem: None > number: 1 path: /dev/sdc1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f25f50> PedPartition: <_ped.Partition object at 0x7fae04f8eb30> >03:49:50,619 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc2 ; 
>03:49:50,620 DEBUG storage.ui: device sdc2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04f001d0> fileSystem: None > number: 2 path: /dev/sdc2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f1d090> PedPartition: <_ped.Partition object at 0x7fae04f8e590> >03:49:50,622 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc3 ; >03:49:50,623 DEBUG storage.ui: device sdc3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04f001d0> fileSystem: None > number: 3 path: /dev/sdc3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f1d210> PedPartition: <_ped.Partition object at 0x7fae04f8e470> >03:49:50,626 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ; >03:49:50,627 DEBUG storage.ui: device sdd1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04f00610> fileSystem: None > number: 1 path: /dev/sdd1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f1d350> PedPartition: <_ped.Partition object at 0x7fae04f8ebf0> >03:49:50,629 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd2 ; >03:49:50,630 DEBUG storage.ui: device sdd2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04f00610> fileSystem: None > number: 2 path: /dev/sdd2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f1d450> PedPartition: <_ped.Partition object at 0x7fae04f8ec50> >03:49:50,633 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd3 ; >03:49:50,633 DEBUG storage.ui: device sdd3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae04f00610> fileSystem: None > number: 3 path: /dev/sdd3 type: 0 > name: None active: True busy: False > geometry: 
<parted.geometry.Geometry object at 0x7fae04f1d5d0> PedPartition: <_ped.Partition object at 0x7fae04f8eb90> >03:49:50,634 DEBUG storage.ui: finished Blivet copy >03:49:50,635 INFO storage.ui: Using 0MB superBlockSize >03:49:50,635 DEBUG storage.ui: child factory class: <class 'blivet.devicefactory.PartitionSetFactory'> >03:49:50,640 DEBUG storage.ui: child factory args: [<blivet.Blivet object at 0x7fae05326d50>, 1536.0, [DiskDevice instance (0x7fae05319950) -- > name = sda status = True kids = 3 id = 1 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 0 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sda type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 0 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae127eacb0> > target size = 0 path = /dev/sda > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae053199d0>, DiskDevice instance (0x7fae05b116d0) -- > name = sdb status = True kids = 3 id = 14 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 16 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdb type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 768 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f674a70> > target size = 0 
path = /dev/sdb > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05b11750>, DiskDevice instance (0x7fae05aeabd0) -- > name = sdc status = True kids = 3 id = 11 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 32 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdc type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 512 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae0f674170> > target size = 0 path = /dev/sdc > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05aeac50>, DiskDevice instance (0x7fae05aea190) -- > name = sdd status = True kids = 3 id = 8 > parents = [] > uuid = None size = 12000.0 > format = non-existent msdos disklabel > major = 8 minor = 48 exists = True protected = False > sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd partedDevice = parted.Device instance -- > model: QEMU QEMU HARDDISK path: /dev/sdd type: 1 > sectorSize: 512 physicalSectorSize: 512 > length: 24576000 openCount: 0 readOnly: False > externalMode: False dirty: False bootDirty: False > host: 0 did: 256 busy: False > hardwareGeometry: (12000, 64, 32) biosGeometry: (1529, 255, 63) > PedDevice: <_ped.Device object at 0x7fae27c59680> > target size = 0 path = /dev/sdd > format args = [] originalFormat = disklabel removable = False partedDevice = <parted.device.Device object at 0x7fae05aea210>]] >03:49:50,642 DEBUG storage.ui: child factory kwargs: {'fstype': 'mdmember'} >03:49:50,644 DEBUG storage.ui: 
PartitionSetFactory.configure: parent_factory: <blivet.devicefactory.MDFactory object at 0x7fae04fa3b10> ; >03:49:50,645 DEBUG storage.ui: raw RAID 10 size == 768.0 >03:49:50,646 INFO storage.ui: Using 0MB superBlockSize >03:49:50,646 DEBUG storage.ui: non-existent RAID 10 size == 768.0 >03:49:50,647 DEBUG storage.ui: parent factory container: non-existent 768MB mdarray swap (29) with non-existent swap >03:49:50,647 DEBUG storage.ui: members: ['sda2', 'sdb2', 'sdc2', 'sdd2'] >03:49:50,648 DEBUG storage.ui: add_disks: [] >03:49:50,648 DEBUG storage.ui: remove_disks: [] >03:49:50,651 DEBUG storage.ui: MDRaidMember.__init__: >03:49:50,651 DEBUG storage.ui: getFormat('mdmember') returning MDRaidMember instance >03:49:50,652 INFO storage.ui: Using 0MB superBlockSize >03:49:50,652 DEBUG storage.ui: adding a SameSizeSet with size 1536 >03:49:50,655 DEBUG storage.ui: DiskDevice.setup: sda ; status: True ; controllable: True ; orig: False ; >03:49:50,657 DEBUG storage.ui: DiskDevice.setup: sdb ; status: True ; controllable: True ; orig: False ; >03:49:50,659 DEBUG storage.ui: DiskDevice.setup: sdc ; status: True ; controllable: True ; orig: False ; >03:49:50,662 DEBUG storage.ui: DiskDevice.setup: sdd ; status: True ; controllable: True ; orig: False ; >03:49:50,663 DEBUG storage.ui: removing all non-preexisting partitions ['sda1(id 31)', 'sda2(id 25)', 'sda3(id 37)', 'sdb1(id 32)', 'sdb2(id 26)', 'sdb3(id 38)', 'sdc1(id 33)', 'sdc2(id 27)', 'sdc3(id 39)', 'sdd1(id 34)', 'sdd2(id 28)', 'sdd3(id 40)'] from disk(s) ['sda', 'sdb', 'sdc', 'sdd'] >03:49:50,666 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:49:50,666 DEBUG storage.ui: device sda1 new partedPartition None >03:49:50,669 DEBUG storage.ui: PartitionDevice._setDisk: req6 ; new: None ; old: sda ; >03:49:50,671 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sda ; >03:49:50,674 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda2 ; >03:49:50,674 DEBUG storage.ui: device sda2 new 
partedPartition None >03:49:50,677 DEBUG storage.ui: PartitionDevice._setDisk: req1 ; new: None ; old: sda ; >03:49:50,679 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sda ; >03:49:50,682 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda3 ; >03:49:50,682 DEBUG storage.ui: device sda3 new partedPartition None >03:49:50,685 DEBUG storage.ui: PartitionDevice._setDisk: req11 ; new: None ; old: sda ; >03:49:50,688 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sda ; >03:49:50,690 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ; >03:49:50,691 DEBUG storage.ui: device sdb1 new partedPartition None >03:49:50,693 DEBUG storage.ui: PartitionDevice._setDisk: req7 ; new: None ; old: sdb ; >03:49:50,696 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sdb ; >03:49:50,698 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb2 ; >03:49:50,699 DEBUG storage.ui: device sdb2 new partedPartition None >03:49:50,701 DEBUG storage.ui: PartitionDevice._setDisk: req2 ; new: None ; old: sdb ; >03:49:50,704 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdb ; >03:49:50,706 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb3 ; >03:49:50,707 DEBUG storage.ui: device sdb3 new partedPartition None >03:49:50,709 DEBUG storage.ui: PartitionDevice._setDisk: req12 ; new: None ; old: sdb ; >03:49:50,711 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdb ; >03:49:50,714 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ; >03:49:50,715 DEBUG storage.ui: device sdc1 new partedPartition None >03:49:50,717 DEBUG storage.ui: PartitionDevice._setDisk: req8 ; new: None ; old: sdc ; >03:49:50,720 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sdc ; >03:49:50,722 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc2 ; >03:49:50,723 DEBUG storage.ui: device sdc2 new partedPartition None >03:49:50,725 DEBUG storage.ui: PartitionDevice._setDisk: req3 ; new: None ; old: sdc ; >03:49:50,728 DEBUG 
storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdc ; >03:49:50,730 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc3 ; >03:49:50,731 DEBUG storage.ui: device sdc3 new partedPartition None >03:49:50,733 DEBUG storage.ui: PartitionDevice._setDisk: req13 ; new: None ; old: sdc ; >03:49:50,735 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdc ; >03:49:50,738 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ; >03:49:50,738 DEBUG storage.ui: device sdd1 new partedPartition None >03:49:50,741 DEBUG storage.ui: PartitionDevice._setDisk: req9 ; new: None ; old: sdd ; >03:49:50,743 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sdd ; >03:49:50,745 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd2 ; >03:49:50,746 DEBUG storage.ui: device sdd2 new partedPartition None >03:49:50,748 DEBUG storage.ui: PartitionDevice._setDisk: req4 ; new: None ; old: sdd ; >03:49:50,751 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdd ; >03:49:50,753 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd3 ; >03:49:50,754 DEBUG storage.ui: device sdd3 new partedPartition None >03:49:50,756 DEBUG storage.ui: PartitionDevice._setDisk: req14 ; new: None ; old: sdd ; >03:49:50,759 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdd ; >03:49:50,760 DEBUG storage.ui: allocatePartitions: disks=['sda', 'sdb', 'sdc', 'sdd'] ; partitions=['req6(id 31)', 'req1(id 25)', 'req11(id 37)', 'req7(id 32)', 'req2(id 26)', 'req12(id 38)', 'req8(id 33)', 'req3(id 27)', 'req13(id 39)', 'req9(id 34)', 'req4(id 28)', 'req14(id 40)'] >03:49:50,761 DEBUG storage.ui: removing all non-preexisting partitions ['req11(id 37)', 'req12(id 38)', 'req13(id 39)', 'req14(id 40)', 'req6(id 31)', 'req7(id 32)', 'req8(id 33)', 'req9(id 34)', 'req1(id 25)', 'req2(id 26)', 'req3(id 27)', 'req4(id 28)'] from disk(s) ['sda', 'sdb', 'sdc', 'sdd'] >03:49:50,764 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:49:50,766 DEBUG storage.ui: 
DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:49:50,767 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:49:50,770 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:49:50,772 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:49:50,773 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:49:50,773 DEBUG storage.ui: allocating partition: req11 ; id: 37 ; disks: ['sda'] ; >boot: False ; primary: False ; size: 3004MB ; grow: False ; max_size: 3004 >03:49:50,774 DEBUG storage.ui: checking freespace on sda >03:49:50,775 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sda part_type=0 req_size=3004MB boot=False best=None grow=False >03:49:50,775 DEBUG storage.ui: current free range is 63-24575999 (11999MB) >03:49:50,776 DEBUG storage.ui: updating use_disk to sda, type: 0 >03:49:50,777 DEBUG storage.ui: new free: 63-24575999 / 11999MB >03:49:50,777 DEBUG storage.ui: new free allows for 0 sectors of growth >03:49:50,778 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:49:50,779 DEBUG storage.ui: created partition sda1 of 3004MB and added it to /dev/sda >03:49:50,781 DEBUG storage.ui: PartitionDevice._setPartedPartition: req11 ; >03:49:50,782 DEBUG storage.ui: device req11 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04ef7790> PedPartition: <_ped.Partition object at 0x7fae04fb3ad0> >03:49:50,785 DEBUG storage.ui: PartitionDevice._setDisk: sda1 ; new: sda ; old: None ; >03:49:50,787 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sda ; >03:49:50,790 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:49:50,791 DEBUG storage.ui: device sda1 new partedPartition parted.Partition instance -- > disk: 
<parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f00990> PedPartition: <_ped.Partition object at 0x7fae04fb3cb0> >03:49:50,793 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:49:50,796 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:49:50,796 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:49:50,799 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:49:50,801 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:49:50,802 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:49:50,802 DEBUG storage.ui: allocating partition: req12 ; id: 38 ; disks: ['sdb'] ; >boot: False ; primary: False ; size: 3004MB ; grow: False ; max_size: 3004 >03:49:50,803 DEBUG storage.ui: checking freespace on sdb >03:49:50,804 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdb part_type=0 req_size=3004MB boot=False best=None grow=False >03:49:50,805 DEBUG storage.ui: current free range is 63-24575999 (11999MB) >03:49:50,806 DEBUG storage.ui: updating use_disk to sdb, type: 0 >03:49:50,806 DEBUG storage.ui: new free: 63-24575999 / 11999MB >03:49:50,807 DEBUG storage.ui: new free allows for 0 sectors of growth >03:49:50,807 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:49:50,809 DEBUG storage.ui: created partition sdb1 of 3004MB and added it to /dev/sdb >03:49:50,811 DEBUG storage.ui: PartitionDevice._setPartedPartition: req12 ; >03:49:50,812 DEBUG storage.ui: device req12 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f00350> PedPartition: <_ped.Partition object at 
0x7fae04fb3ef0> >03:49:50,815 DEBUG storage.ui: PartitionDevice._setDisk: sdb1 ; new: sdb ; old: None ; >03:49:50,817 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdb ; >03:49:50,820 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ; >03:49:50,821 DEBUG storage.ui: device sdb1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04efa510> PedPartition: <_ped.Partition object at 0x7fae04fb3f50> >03:49:50,824 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:49:50,826 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:49:50,827 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:49:50,829 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:49:50,832 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:49:50,832 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:49:50,833 DEBUG storage.ui: allocating partition: req13 ; id: 39 ; disks: ['sdc'] ; >boot: False ; primary: False ; size: 3004MB ; grow: False ; max_size: 3004 >03:49:50,834 DEBUG storage.ui: checking freespace on sdc >03:49:50,835 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdc part_type=0 req_size=3004MB boot=False best=None grow=False >03:49:50,836 DEBUG storage.ui: current free range is 63-24575999 (11999MB) >03:49:50,836 DEBUG storage.ui: updating use_disk to sdc, type: 0 >03:49:50,837 DEBUG storage.ui: new free: 63-24575999 / 11999MB >03:49:50,838 DEBUG storage.ui: new free allows for 0 sectors of growth >03:49:50,838 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:49:50,839 DEBUG storage.ui: created partition sdc1 of 3004MB and added it to /dev/sdc >03:49:50,841 DEBUG storage.ui: 
PartitionDevice._setPartedPartition: req13 ; >03:49:50,842 DEBUG storage.ui: device req13 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 1 path: /dev/sdc1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fb2a10> PedPartition: <_ped.Partition object at 0x7fae04f8e0b0> >03:49:50,845 DEBUG storage.ui: PartitionDevice._setDisk: sdc1 ; new: sdc ; old: None ; >03:49:50,847 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdc ; >03:49:50,850 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ; >03:49:50,851 DEBUG storage.ui: device sdc1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 1 path: /dev/sdc1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fb2450> PedPartition: <_ped.Partition object at 0x7fae04f8e950> >03:49:50,854 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:49:50,856 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:49:50,857 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:49:50,859 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:49:50,862 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:49:50,862 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:49:50,863 DEBUG storage.ui: allocating partition: req14 ; id: 40 ; disks: ['sdd'] ; >boot: False ; primary: False ; size: 3004MB ; grow: False ; max_size: 3004 >03:49:50,864 DEBUG storage.ui: checking freespace on sdd >03:49:50,865 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdd part_type=0 req_size=3004MB boot=False best=None grow=False >03:49:50,866 DEBUG storage.ui: current free range is 63-24575999 (11999MB) >03:49:50,867 DEBUG storage.ui: updating 
use_disk to sdd, type: 0 >03:49:50,867 DEBUG storage.ui: new free: 63-24575999 / 11999MB >03:49:50,868 DEBUG storage.ui: new free allows for 0 sectors of growth >03:49:50,868 DEBUG storage.ui: adjusted start sector from 63 to 2048 >03:49:50,869 DEBUG storage.ui: created partition sdd1 of 3004MB and added it to /dev/sdd >03:49:50,872 DEBUG storage.ui: PartitionDevice._setPartedPartition: req14 ; >03:49:50,873 DEBUG storage.ui: device req14 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 1 path: /dev/sdd1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04efdd10> PedPartition: <_ped.Partition object at 0x7fae04f8e4d0> >03:49:50,876 DEBUG storage.ui: PartitionDevice._setDisk: sdd1 ; new: sdd ; old: None ; >03:49:50,878 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdd ; >03:49:50,882 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ; >03:49:50,883 DEBUG storage.ui: device sdd1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 1 path: /dev/sdd1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f00dd0> PedPartition: <_ped.Partition object at 0x7fae04f8e5f0> >03:49:50,885 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:49:50,888 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:49:50,889 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:49:50,891 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:49:50,893 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:49:50,894 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:49:50,895 DEBUG storage.ui: allocating partition: req6 ; id: 31 ; disks: ['sda'] ; >boot: False ; primary: 
False ; size: 512MB ; grow: False ; max_size: 512 >03:49:50,895 DEBUG storage.ui: checking freespace on sda >03:49:50,896 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sda part_type=0 req_size=512MB boot=False best=None grow=False >03:49:50,897 DEBUG storage.ui: current free range is 63-2047 (0MB) >03:49:50,898 DEBUG storage.ui: current free range is 6154240-24575999 (8995MB) >03:49:50,898 DEBUG storage.ui: updating use_disk to sda, type: 0 >03:49:50,899 DEBUG storage.ui: new free: 6154240-24575999 / 8995MB >03:49:50,900 DEBUG storage.ui: new free allows for 0 sectors of growth >03:49:50,901 DEBUG storage.ui: created partition sda2 of 512MB and added it to /dev/sda >03:49:50,903 DEBUG storage.ui: PartitionDevice._setPartedPartition: req6 ; >03:49:50,904 DEBUG storage.ui: device req6 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 2 path: /dev/sda2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f04690> PedPartition: <_ped.Partition object at 0x7fae04fb3ad0> >03:49:50,907 DEBUG storage.ui: PartitionDevice._setDisk: sda2 ; new: sda ; old: None ; >03:49:50,909 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sda ; >03:49:50,912 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda2 ; >03:49:50,913 DEBUG storage.ui: device sda2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 2 path: /dev/sda2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04ef7350> PedPartition: <_ped.Partition object at 0x7fae04f8e4d0> >03:49:50,916 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:49:50,918 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:49:50,919 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:49:50,921 DEBUG 
storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:49:50,923 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:49:50,924 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:49:50,925 DEBUG storage.ui: allocating partition: req7 ; id: 32 ; disks: ['sdb'] ; >boot: False ; primary: False ; size: 512MB ; grow: False ; max_size: 512 >03:49:50,925 DEBUG storage.ui: checking freespace on sdb >03:49:50,926 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdb part_type=0 req_size=512MB boot=False best=None grow=False >03:49:50,927 DEBUG storage.ui: current free range is 63-2047 (0MB) >03:49:50,928 DEBUG storage.ui: current free range is 6154240-24575999 (8995MB) >03:49:50,929 DEBUG storage.ui: updating use_disk to sdb, type: 0 >03:49:50,929 DEBUG storage.ui: new free: 6154240-24575999 / 8995MB >03:49:50,930 DEBUG storage.ui: new free allows for 0 sectors of growth >03:49:50,931 DEBUG storage.ui: created partition sdb2 of 512MB and added it to /dev/sdb >03:49:50,933 DEBUG storage.ui: PartitionDevice._setPartedPartition: req7 ; >03:49:50,935 DEBUG storage.ui: device req7 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 2 path: /dev/sdb2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f00d10> PedPartition: <_ped.Partition object at 0x7fae04f8e050> >03:49:50,937 DEBUG storage.ui: PartitionDevice._setDisk: sdb2 ; new: sdb ; old: None ; >03:49:50,939 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdb ; >03:49:50,943 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb2 ; >03:49:50,944 DEBUG storage.ui: device sdb2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 2 path: /dev/sdb2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 
0x7fae04fb2350> PedPartition: <_ped.Partition object at 0x7fae04f8e410> >03:49:50,946 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:49:50,949 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:49:50,950 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:49:50,952 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:49:50,955 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:49:50,955 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:49:50,956 DEBUG storage.ui: allocating partition: req8 ; id: 33 ; disks: ['sdc'] ; >boot: False ; primary: False ; size: 512MB ; grow: False ; max_size: 512 >03:49:50,956 DEBUG storage.ui: checking freespace on sdc >03:49:50,957 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdc part_type=0 req_size=512MB boot=False best=None grow=False >03:49:50,958 DEBUG storage.ui: current free range is 63-2047 (0MB) >03:49:50,959 DEBUG storage.ui: current free range is 6154240-24575999 (8995MB) >03:49:50,960 DEBUG storage.ui: updating use_disk to sdc, type: 0 >03:49:50,961 DEBUG storage.ui: new free: 6154240-24575999 / 8995MB >03:49:50,961 DEBUG storage.ui: new free allows for 0 sectors of growth >03:49:50,962 DEBUG storage.ui: created partition sdc2 of 512MB and added it to /dev/sdc >03:49:50,964 DEBUG storage.ui: PartitionDevice._setPartedPartition: req8 ; >03:49:50,966 DEBUG storage.ui: device req8 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 2 path: /dev/sdc2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fb29d0> PedPartition: <_ped.Partition object at 0x7fae04f8e230> >03:49:50,968 DEBUG storage.ui: PartitionDevice._setDisk: sdc2 ; new: sdc ; old: None ; >03:49:50,970 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdc ; 
>03:49:50,973 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc2 ; >03:49:50,974 DEBUG storage.ui: device sdc2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 2 path: /dev/sdc2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fb2c50> PedPartition: <_ped.Partition object at 0x7fae04f8e3b0> >03:49:50,977 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:49:50,980 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:49:50,980 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:49:50,983 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:49:50,986 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:49:50,986 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:49:50,987 DEBUG storage.ui: allocating partition: req9 ; id: 34 ; disks: ['sdd'] ; >boot: False ; primary: False ; size: 512MB ; grow: False ; max_size: 512 >03:49:50,988 DEBUG storage.ui: checking freespace on sdd >03:49:50,989 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdd part_type=0 req_size=512MB boot=False best=None grow=False >03:49:50,989 DEBUG storage.ui: current free range is 63-2047 (0MB) >03:49:50,990 DEBUG storage.ui: current free range is 6154240-24575999 (8995MB) >03:49:50,991 DEBUG storage.ui: updating use_disk to sdd, type: 0 >03:49:50,992 DEBUG storage.ui: new free: 6154240-24575999 / 8995MB >03:49:50,992 DEBUG storage.ui: new free allows for 0 sectors of growth >03:49:50,993 DEBUG storage.ui: created partition sdd2 of 512MB and added it to /dev/sdd >03:49:50,996 DEBUG storage.ui: PartitionDevice._setPartedPartition: req9 ; >03:49:50,997 DEBUG storage.ui: device req9 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > 
number: 2 path: /dev/sdd2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f00410> PedPartition: <_ped.Partition object at 0x7fae04fb3ad0> >03:49:50,999 DEBUG storage.ui: PartitionDevice._setDisk: sdd2 ; new: sdd ; old: None ; >03:49:51,002 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdd ; >03:49:51,004 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd2 ; >03:49:51,005 DEBUG storage.ui: device sdd2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 2 path: /dev/sdd2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f00ad0> PedPartition: <_ped.Partition object at 0x7fae04f8e050> >03:49:51,008 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:49:51,010 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:49:51,011 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:49:51,013 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:49:51,015 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:49:51,015 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:49:51,016 DEBUG storage.ui: allocating partition: req1 ; id: 25 ; disks: ['sda'] ; >boot: False ; primary: False ; size: 1MB ; grow: True ; max_size: 384 >03:49:51,016 DEBUG storage.ui: checking freespace on sda >03:49:51,017 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sda part_type=0 req_size=1MB boot=False best=None grow=True >03:49:51,017 DEBUG storage.ui: current free range is 63-2047 (0MB) >03:49:51,018 DEBUG storage.ui: current free range is 7202816-24575999 (8483MB) >03:49:51,018 DEBUG storage.ui: evaluating growth potential for new layout >03:49:51,019 DEBUG storage.ui: calculating growth for disk /dev/sdd >03:49:51,019 DEBUG storage.ui: adding 
request 40 to chunk 24575937 (63-24575999) on /dev/sdd >03:49:51,020 DEBUG storage.ui: adding request 34 to chunk 24575937 (63-24575999) on /dev/sdd >03:49:51,020 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999 >sectorSize = 512 > >03:49:51,020 DEBUG storage.ui: req: PartitionRequest instance -- >id = 40 name = sdd1 growable = False >base = 6152192 growth = 0 max_grow = 0 >done = True >03:49:51,021 DEBUG storage.ui: req: PartitionRequest instance -- >id = 34 name = sdd2 growable = False >base = 1048576 growth = 0 max_grow = 0 >done = True >03:49:51,021 DEBUG storage.ui: request 40 (sdd1) growth: 0 (0MB) size: 3004MB >03:49:51,021 DEBUG storage.ui: request 34 (sdd2) growth: 0 (0MB) size: 512MB >03:49:51,022 DEBUG storage.ui: disk /dev/sdd growth: 0 (0MB) >03:49:51,022 DEBUG storage.ui: calculating growth for disk /dev/sdb >03:49:51,022 DEBUG storage.ui: adding request 38 to chunk 24575937 (63-24575999) on /dev/sdb >03:49:51,023 DEBUG storage.ui: adding request 32 to chunk 24575937 (63-24575999) on /dev/sdb >03:49:51,023 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999 >sectorSize = 512 > >03:49:51,024 DEBUG storage.ui: req: PartitionRequest instance -- >id = 38 name = sdb1 growable = False >base = 6152192 growth = 0 max_grow = 0 >done = True >03:49:51,024 DEBUG storage.ui: req: PartitionRequest instance -- >id = 32 name = sdb2 growable = False >base = 1048576 growth = 0 max_grow = 0 >done = True >03:49:51,024 DEBUG storage.ui: request 38 (sdb1) growth: 0 (0MB) size: 3004MB >03:49:51,025 DEBUG storage.ui: request 32 (sdb2) growth: 0 (0MB) size: 512MB >03:49:51,025 DEBUG storage.ui: disk /dev/sdb growth: 0 (0MB) >03:49:51,025 DEBUG storage.ui: calculating growth for disk /dev/sdc >03:49:51,026 DEBUG storage.ui: adding request 39 to chunk 24575937 (63-24575999) on /dev/sdc >03:49:51,026 DEBUG storage.ui: adding request 33 to chunk 24575937 (63-24575999) on /dev/sdc >03:49:51,026 DEBUG storage.ui: 
Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999 >sectorSize = 512 > >03:49:51,027 DEBUG storage.ui: req: PartitionRequest instance -- >id = 39 name = sdc1 growable = False >base = 6152192 growth = 0 max_grow = 0 >done = True >03:49:51,027 DEBUG storage.ui: req: PartitionRequest instance -- >id = 33 name = sdc2 growable = False >base = 1048576 growth = 0 max_grow = 0 >done = True >03:49:51,027 DEBUG storage.ui: request 39 (sdc1) growth: 0 (0MB) size: 3004MB >03:49:51,028 DEBUG storage.ui: request 33 (sdc2) growth: 0 (0MB) size: 512MB >03:49:51,028 DEBUG storage.ui: disk /dev/sdc growth: 0 (0MB) >03:49:51,028 DEBUG storage.ui: calculating growth for disk /dev/sda >03:49:51,031 DEBUG storage.ui: PartitionDevice._setPartedPartition: req1 ; >03:49:51,031 DEBUG storage.ui: device req1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 3 path: /dev/sda3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f1d6d0> PedPartition: <_ped.Partition object at 0x7fae04f8e650> >03:49:51,033 DEBUG storage.ui: PartitionDevice._setDisk: sda3 ; new: sda ; old: None ; >03:49:51,036 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sda ; >03:49:51,037 DEBUG storage.ui: adding request 37 to chunk 24575937 (63-24575999) on /dev/sda >03:49:51,037 DEBUG storage.ui: adding request 31 to chunk 24575937 (63-24575999) on /dev/sda >03:49:51,037 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda >03:49:51,038 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > >03:49:51,038 DEBUG storage.ui: req: PartitionRequest instance -- >id = 37 name = sda1 growable = False >base = 6152192 growth = 0 max_grow = 0 >done = True >03:49:51,039 DEBUG storage.ui: req: PartitionRequest instance -- >id = 31 name = sda2 growable = False >base = 1048576 growth = 0 max_grow = 0 >done = True 
>03:49:51,039 DEBUG storage.ui: req: PartitionRequest instance --
>id = 25 name = sda3 growable = True
>base = 2048 growth = 0 max_grow = 784384
>done = False
>03:49:51,039 DEBUG storage.ui: 1 requests and 17373121 (8482MB) left in chunk
>03:49:51,040 DEBUG storage.ui: adding 17373121 (8482MB) to 25 (sda3)
>03:49:51,040 DEBUG storage.ui: taking back 16588737 (8099MB) from 25 (sda3)
>03:49:51,040 DEBUG storage.ui: new grow amount for request 25 (sda3) is 784384 units, or 383MB
>03:49:51,041 DEBUG storage.ui: request 37 (sda1) growth: 0 (0MB) size: 3004MB
>03:49:51,041 DEBUG storage.ui: request 31 (sda2) growth: 0 (0MB) size: 512MB
>03:49:51,041 DEBUG storage.ui: request 25 (sda3) growth: 784384 (383MB) size: 384MB
>03:49:51,042 DEBUG storage.ui: disk /dev/sda growth: 784384 (383MB)
>03:49:51,044 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda3 ;
>03:49:51,044 DEBUG storage.ui: device sda3 new partedPartition None
>03:49:51,047 DEBUG storage.ui: PartitionDevice._setDisk: req1 ; new: None ; old: sda ;
>03:49:51,049 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sda ;
>03:49:51,049 DEBUG storage.ui: total growth: 784384 sectors
>03:49:51,050 DEBUG storage.ui: updating use_disk to sda, type: 0
>03:49:51,050 DEBUG storage.ui: new free: 7202816-24575999 / 8483MB
>03:49:51,050 DEBUG storage.ui: new free allows for 784384 sectors of growth
>03:49:51,051 DEBUG storage.ui: created partition sda3 of 1MB and added it to /dev/sda
>03:49:51,053 DEBUG storage.ui: PartitionDevice._setPartedPartition: req1 ;
>03:49:51,054 DEBUG storage.ui: device req1 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None
> number: 3 path: /dev/sda3 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae04f1da10> PedPartition: <_ped.Partition object at 0x7fae04f8e770>
>03:49:51,056 DEBUG storage.ui: PartitionDevice._setDisk: sda3 ; new: sda ; old: None ;
>03:49:51,058 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sda ;
>03:49:51,061 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda3 ;
>03:49:51,061 DEBUG storage.ui: device sda3 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None
> number: 3 path: /dev/sda3 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae04f1dcd0> PedPartition: <_ped.Partition object at 0x7fae04f8e530>
>03:49:51,064 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:51,066 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:51,066 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:51,068 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:51,070 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:51,071 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:51,071 DEBUG storage.ui: allocating partition: req2 ; id: 26 ; disks: ['sdb'] ;
>boot: False ; primary: False ; size: 1MB ; grow: True ; max_size: 384
>03:49:51,071 DEBUG storage.ui: checking freespace on sdb
>03:49:51,072 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdb part_type=0 req_size=1MB boot=False best=None grow=True
>03:49:51,073 DEBUG storage.ui: current free range is 63-2047 (0MB)
>03:49:51,073 DEBUG storage.ui: current free range is 7202816-24575999 (8483MB)
>03:49:51,074 DEBUG storage.ui: evaluating growth potential for new layout
>03:49:51,074 DEBUG storage.ui: calculating growth for disk /dev/sdd
>03:49:51,075 DEBUG storage.ui: adding request 40 to chunk 24575937 (63-24575999) on /dev/sdd
>03:49:51,075 DEBUG storage.ui: adding request 34 to chunk 24575937 (63-24575999) on /dev/sdd
>03:49:51,075 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999
>sectorSize = 512
>
>03:49:51,076 DEBUG storage.ui: req: PartitionRequest instance --
>id = 40 name = sdd1 growable = False
>base = 6152192 growth = 0 max_grow = 0
>done = True
>03:49:51,076 DEBUG storage.ui: req: PartitionRequest instance --
>id = 34 name = sdd2 growable = False
>base = 1048576 growth = 0 max_grow = 0
>done = True
>03:49:51,077 DEBUG storage.ui: request 40 (sdd1) growth: 0 (0MB) size: 3004MB
>03:49:51,077 DEBUG storage.ui: request 34 (sdd2) growth: 0 (0MB) size: 512MB
>03:49:51,077 DEBUG storage.ui: disk /dev/sdd growth: 0 (0MB)
>03:49:51,077 DEBUG storage.ui: calculating growth for disk /dev/sdb
>03:49:51,080 DEBUG storage.ui: PartitionDevice._setPartedPartition: req2 ;
>03:49:51,081 DEBUG storage.ui: device req2 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None
> number: 3 path: /dev/sdb3 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae04f22150> PedPartition: <_ped.Partition object at 0x7fae04fb3ef0>
>03:49:51,083 DEBUG storage.ui: PartitionDevice._setDisk: sdb3 ; new: sdb ; old: None ;
>03:49:51,085 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdb ;
>03:49:51,086 DEBUG storage.ui: adding request 38 to chunk 24575937 (63-24575999) on /dev/sdb
>03:49:51,086 DEBUG storage.ui: adding request 32 to chunk 24575937 (63-24575999) on /dev/sdb
>03:49:51,087 DEBUG storage.ui: adding request 26 to chunk 24575937 (63-24575999) on /dev/sdb
>03:49:51,088 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999
>sectorSize = 512
>
>03:49:51,088 DEBUG storage.ui: req: PartitionRequest instance --
>id = 38 name = sdb1 growable = False
>base = 6152192 growth = 0 max_grow = 0
>done = True
>03:49:51,088 DEBUG storage.ui: req: PartitionRequest instance --
>id = 32 name = sdb2 growable = False
>base = 1048576 growth = 0 max_grow = 0
>done = True
>03:49:51,089 DEBUG storage.ui: req: PartitionRequest instance --
>id = 26 name = sdb3 growable = True
>base = 2048 growth = 0 max_grow = 784384
>done = False
>03:49:51,089 DEBUG storage.ui: 1 requests and 17373121 (8482MB) left in chunk
>03:49:51,090 DEBUG storage.ui: adding 17373121 (8482MB) to 26 (sdb3)
>03:49:51,090 DEBUG storage.ui: taking back 16588737 (8099MB) from 26 (sdb3)
>03:49:51,091 DEBUG storage.ui: new grow amount for request 26 (sdb3) is 784384 units, or 383MB
>03:49:51,091 DEBUG storage.ui: request 38 (sdb1) growth: 0 (0MB) size: 3004MB
>03:49:51,091 DEBUG storage.ui: request 32 (sdb2) growth: 0 (0MB) size: 512MB
>03:49:51,092 DEBUG storage.ui: request 26 (sdb3) growth: 784384 (383MB) size: 384MB
>03:49:51,092 DEBUG storage.ui: disk /dev/sdb growth: 784384 (383MB)
>03:49:51,092 DEBUG storage.ui: calculating growth for disk /dev/sdc
>03:49:51,093 DEBUG storage.ui: adding request 39 to chunk 24575937 (63-24575999) on /dev/sdc
>03:49:51,093 DEBUG storage.ui: adding request 33 to chunk 24575937 (63-24575999) on /dev/sdc
>03:49:51,093 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999
>sectorSize = 512
>
>03:49:51,094 DEBUG storage.ui: req: PartitionRequest instance --
>id = 39 name = sdc1 growable = False
>base = 6152192 growth = 0 max_grow = 0
>done = True
>03:49:51,094 DEBUG storage.ui: req: PartitionRequest instance --
>id = 33 name = sdc2 growable = False
>base = 1048576 growth = 0 max_grow = 0
>done = True
>03:49:51,095 DEBUG storage.ui: request 39 (sdc1) growth: 0 (0MB) size: 3004MB
>03:49:51,095 DEBUG storage.ui: request 33 (sdc2) growth: 0 (0MB) size: 512MB
>03:49:51,095 DEBUG storage.ui: disk /dev/sdc growth: 0 (0MB)
>03:49:51,096 DEBUG storage.ui: calculating growth for disk /dev/sda
>03:49:51,096 DEBUG storage.ui: adding request 37 to chunk 24575937 (63-24575999) on /dev/sda
>03:49:51,097 DEBUG storage.ui: adding request 31 to chunk 24575937 (63-24575999) on /dev/sda
>03:49:51,097 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda
>03:49:51,097 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999
>sectorSize = 512
>
>03:49:51,098 DEBUG storage.ui: req: PartitionRequest instance --
>id = 37 name = sda1 growable = False
>base = 6152192 growth = 0 max_grow = 0
>done = True
>03:49:51,098 DEBUG storage.ui: req: PartitionRequest instance --
>id = 31 name = sda2 growable = False
>base = 1048576 growth = 0 max_grow = 0
>done = True
>03:49:51,099 DEBUG storage.ui: req: PartitionRequest instance --
>id = 25 name = sda3 growable = True
>base = 2048 growth = 0 max_grow = 784384
>done = False
>03:49:51,099 DEBUG storage.ui: 1 requests and 17373121 (8482MB) left in chunk
>03:49:51,099 DEBUG storage.ui: adding 17373121 (8482MB) to 25 (sda3)
>03:49:51,100 DEBUG storage.ui: taking back 16588737 (8099MB) from 25 (sda3)
>03:49:51,100 DEBUG storage.ui: new grow amount for request 25 (sda3) is 784384 units, or 383MB
>03:49:51,100 DEBUG storage.ui: request 37 (sda1) growth: 0 (0MB) size: 3004MB
>03:49:51,101 DEBUG storage.ui: request 31 (sda2) growth: 0 (0MB) size: 512MB
>03:49:51,101 DEBUG storage.ui: request 25 (sda3) growth: 784384 (383MB) size: 384MB
>03:49:51,101 DEBUG storage.ui: disk /dev/sda growth: 784384 (383MB)
>03:49:51,104 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb3 ;
>03:49:51,104 DEBUG storage.ui: device sdb3 new partedPartition None
>03:49:51,106 DEBUG storage.ui: PartitionDevice._setDisk: req2 ; new: None ; old: sdb ;
>03:49:51,108 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sdb ;
>03:49:51,109 DEBUG storage.ui: total growth: 1568768 sectors
>03:49:51,109 DEBUG storage.ui: updating use_disk to sdb, type: 0
>03:49:51,110 DEBUG storage.ui: new free: 7202816-24575999 / 8483MB
>03:49:51,110 DEBUG storage.ui: new free allows for 1568768 sectors of growth
>03:49:51,111 DEBUG storage.ui: created partition sdb3 of 1MB and added it to /dev/sdb
>03:49:51,113 DEBUG storage.ui: PartitionDevice._setPartedPartition: req2 ;
>03:49:51,113 DEBUG storage.ui: device req2 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None
> number: 3 path: /dev/sdb3 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae04f1dd50> PedPartition: <_ped.Partition object at 0x7fae04fb3ad0>
>03:49:51,115 DEBUG storage.ui: PartitionDevice._setDisk: sdb3 ; new: sdb ; old: None ;
>03:49:51,118 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdb ;
>03:49:51,120 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb3 ;
>03:49:51,121 DEBUG storage.ui: device sdb3 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None
> number: 3 path: /dev/sdb3 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae04f1d6d0> PedPartition: <_ped.Partition object at 0x7fae04f8e350>
>03:49:51,123 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:51,126 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:51,126 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:51,128 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:51,130 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:51,131 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:51,131 DEBUG storage.ui: allocating partition: req3 ; id: 27 ; disks: ['sdc'] ;
>boot: False ; primary: False ; size: 1MB ; grow: True ; max_size: 384
>03:49:51,131 DEBUG storage.ui: checking freespace on sdc
>03:49:51,132 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdc part_type=0 req_size=1MB boot=False best=None grow=True
>03:49:51,133 DEBUG storage.ui: current free range is 63-2047 (0MB)
>03:49:51,133 DEBUG storage.ui: current free range is 7202816-24575999 (8483MB)
>03:49:51,134 DEBUG storage.ui: evaluating growth potential for new layout
>03:49:51,134 DEBUG storage.ui: calculating growth for disk /dev/sdd
>03:49:51,135 DEBUG storage.ui: adding request 40 to chunk 24575937 (63-24575999) on /dev/sdd
>03:49:51,135 DEBUG storage.ui: adding request 34 to chunk 24575937 (63-24575999) on /dev/sdd
>03:49:51,135 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999
>sectorSize = 512
>
>03:49:51,136 DEBUG storage.ui: req: PartitionRequest instance --
>id = 40 name = sdd1 growable = False
>base = 6152192 growth = 0 max_grow = 0
>done = True
>03:49:51,136 DEBUG storage.ui: req: PartitionRequest instance --
>id = 34 name = sdd2 growable = False
>base = 1048576 growth = 0 max_grow = 0
>done = True
>03:49:51,137 DEBUG storage.ui: request 40 (sdd1) growth: 0 (0MB) size: 3004MB
>03:49:51,137 DEBUG storage.ui: request 34 (sdd2) growth: 0 (0MB) size: 512MB
>03:49:51,137 DEBUG storage.ui: disk /dev/sdd growth: 0 (0MB)
>03:49:51,138 DEBUG storage.ui: calculating growth for disk /dev/sdb
>03:49:51,138 DEBUG storage.ui: adding request 38 to chunk 24575937 (63-24575999) on /dev/sdb
>03:49:51,139 DEBUG storage.ui: adding request 32 to chunk 24575937 (63-24575999) on /dev/sdb
>03:49:51,139 DEBUG storage.ui: adding request 26 to chunk 24575937 (63-24575999) on /dev/sdb
>03:49:51,139 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999
>sectorSize = 512
>
>03:49:51,140 DEBUG storage.ui: req: PartitionRequest instance --
>id = 38 name = sdb1 growable = False
>base = 6152192 growth = 0 max_grow = 0
>done = True
>03:49:51,140 DEBUG storage.ui: req: PartitionRequest instance --
>id = 32 name = sdb2 growable = False
>base = 1048576 growth = 0 max_grow = 0
>done = True
>03:49:51,141 DEBUG storage.ui: req: PartitionRequest instance --
>id = 26 name = sdb3 growable = True
>base = 2048 growth = 0 max_grow = 784384
>done = False
>03:49:51,141 DEBUG storage.ui: 1 requests and 17373121 (8482MB) left in chunk
>03:49:51,141 DEBUG storage.ui: adding 17373121 (8482MB) to 26 (sdb3)
>03:49:51,142 DEBUG storage.ui: taking back 16588737 (8099MB) from 26 (sdb3)
>03:49:51,142 DEBUG storage.ui: new grow amount for request 26 (sdb3) is 784384 units, or 383MB
>03:49:51,143 DEBUG storage.ui: request 38 (sdb1) growth: 0 (0MB) size: 3004MB
>03:49:51,143 DEBUG storage.ui: request 32 (sdb2) growth: 0 (0MB) size: 512MB
>03:49:51,143 DEBUG storage.ui: request 26 (sdb3) growth: 784384 (383MB) size: 384MB
>03:49:51,144 DEBUG storage.ui: disk /dev/sdb growth: 784384 (383MB)
>03:49:51,144 DEBUG storage.ui: calculating growth for disk /dev/sdc
>03:49:51,147 DEBUG storage.ui: PartitionDevice._setPartedPartition: req3 ;
>03:49:51,147 DEBUG storage.ui: device req3 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None
> number: 3 path: /dev/sdc3 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae04fb2bd0> PedPartition: <_ped.Partition object at 0x7fae04f8e0b0>
>03:49:51,150 DEBUG storage.ui: PartitionDevice._setDisk: sdc3 ; new: sdc ; old: None ;
>03:49:51,152 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdc ;
>03:49:51,153 DEBUG storage.ui: adding request 39 to chunk 24575937 (63-24575999) on /dev/sdc
>03:49:51,153 DEBUG storage.ui: adding request 33 to chunk 24575937 (63-24575999) on /dev/sdc
>03:49:51,154 DEBUG storage.ui: adding request 27 to chunk 24575937 (63-24575999) on /dev/sdc
>03:49:51,154 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999
>sectorSize = 512
>
>03:49:51,154 DEBUG storage.ui: req: PartitionRequest instance --
>id = 39 name = sdc1 growable = False
>base = 6152192 growth = 0 max_grow = 0
>done = True
>03:49:51,155 DEBUG storage.ui: req: PartitionRequest instance --
>id = 33 name = sdc2 growable = False
>base = 1048576 growth = 0 max_grow = 0
>done = True
>03:49:51,155 DEBUG storage.ui: req: PartitionRequest instance --
>id = 27 name = sdc3 growable = True
>base = 2048 growth = 0 max_grow = 784384
>done = False
>03:49:51,155 DEBUG storage.ui: 1 requests and 17373121 (8482MB) left in chunk
>03:49:51,156 DEBUG storage.ui: adding 17373121 (8482MB) to 27 (sdc3)
>03:49:51,156 DEBUG storage.ui: taking back 16588737 (8099MB) from 27 (sdc3)
>03:49:51,157 DEBUG storage.ui: new grow amount for request 27 (sdc3) is 784384 units, or 383MB
>03:49:51,157 DEBUG storage.ui: request 39 (sdc1) growth: 0 (0MB) size: 3004MB
>03:49:51,157 DEBUG storage.ui: request 33 (sdc2) growth: 0 (0MB) size: 512MB
>03:49:51,158 DEBUG storage.ui: request 27 (sdc3) growth: 784384 (383MB) size: 384MB
>03:49:51,158 DEBUG storage.ui: disk /dev/sdc growth: 784384 (383MB)
>03:49:51,158 DEBUG storage.ui: calculating growth for disk /dev/sda
>03:49:51,159 DEBUG storage.ui: adding request 37 to chunk 24575937 (63-24575999) on /dev/sda
>03:49:51,159 DEBUG storage.ui: adding request 31 to chunk 24575937 (63-24575999) on /dev/sda
>03:49:51,160 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda
>03:49:51,160 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999
>sectorSize = 512
>
>03:49:51,161 DEBUG storage.ui: req: PartitionRequest instance --
>id = 37 name = sda1 growable = False
>base = 6152192 growth = 0 max_grow = 0
>done = True
>03:49:51,161 DEBUG storage.ui: req: PartitionRequest instance --
>id = 31 name = sda2 growable = False
>base = 1048576 growth = 0 max_grow = 0
>done = True
>03:49:51,161 DEBUG storage.ui: req: PartitionRequest instance --
>id = 25 name = sda3 growable = True
>base = 2048 growth = 0 max_grow = 784384
>done = False
>03:49:51,162 DEBUG storage.ui: 1 requests and 17373121 (8482MB) left in chunk
>03:49:51,162 DEBUG storage.ui: adding 17373121 (8482MB) to 25 (sda3)
>03:49:51,163 DEBUG storage.ui: taking back 16588737 (8099MB) from 25 (sda3)
>03:49:51,163 DEBUG storage.ui: new grow amount for request 25 (sda3) is 784384 units, or 383MB
>03:49:51,163 DEBUG storage.ui: request 37 (sda1) growth: 0 (0MB) size: 3004MB
>03:49:51,164 DEBUG storage.ui: request 31 (sda2) growth: 0 (0MB) size: 512MB
>03:49:51,164 DEBUG storage.ui: request 25 (sda3) growth: 784384 (383MB) size: 384MB
>03:49:51,164 DEBUG storage.ui: disk /dev/sda growth: 784384 (383MB)
>03:49:51,167 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc3 ;
>03:49:51,167 DEBUG storage.ui: device sdc3 new partedPartition None
>03:49:51,169 DEBUG storage.ui: PartitionDevice._setDisk: req3 ; new: None ; old: sdc ;
>03:49:51,172 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sdc ;
>03:49:51,172 DEBUG storage.ui: total growth: 2353152 sectors
>03:49:51,172 DEBUG storage.ui: updating use_disk to sdc, type: 0
>03:49:51,173 DEBUG storage.ui: new free: 7202816-24575999 / 8483MB
>03:49:51,173 DEBUG storage.ui: new free allows for 2353152 sectors of growth
>03:49:51,174 DEBUG storage.ui: created partition sdc3 of 1MB and added it to /dev/sdc
>03:49:51,176 DEBUG storage.ui: PartitionDevice._setPartedPartition: req3 ;
>03:49:51,177 DEBUG storage.ui: device req3 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None
> number: 3 path: /dev/sdc3 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae04f22110> PedPartition: <_ped.Partition object at 0x7fae04fb3ef0>
>03:49:51,179 DEBUG storage.ui: PartitionDevice._setDisk: sdc3 ; new: sdc ; old: None ;
>03:49:51,182 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdc ;
>03:49:51,185 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc3 ;
>03:49:51,185 DEBUG storage.ui: device sdc3 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None
> number: 3 path: /dev/sdc3 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae04f223d0> PedPartition: <_ped.Partition object at 0x7fae04f8ed70>
>03:49:51,188 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
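The repeated "adding … / taking back … / new grow amount …" triplets above all follow one pattern: the single growable request in the chunk is first handed the chunk's entire remaining space, then clamped back to its `max_grow`. A hedged sketch of that arithmetic, written for this note and not taken from the actual anaconda/blivet implementation:

```python
def grow_request(leftover, max_grow, base=2048, sectors_per_mb=2048):
    """Hand a request all leftover sectors, then clamp to max_grow.

    Mirrors the "adding X / taking back Y / new grow amount Z" lines:
    the request is offered everything left in the chunk, and whatever
    exceeds its max_grow is taken back.  sectors_per_mb is the number
    of 512-byte sectors per binary MB.
    """
    growth = leftover                        # "adding 17373121 (8482MB)"
    taken_back = 0
    if growth > max_grow:
        taken_back = growth - max_grow       # "taking back 16588737 (8099MB)"
        growth = max_grow                    # "new grow amount ... 784384 units"
    size_mb = (base + growth) // sectors_per_mb  # reported "size: 384MB"
    return growth, taken_back, size_mb

# Numbers from the log: 17373121 sectors left, max_grow 784384
print(grow_request(leftover=17373121, max_grow=784384))
# -> (784384, 16588737, 384)
```

This also explains why all four disks converge on exactly the same growth of 784384 sectors (383 MB): each sdX3 request has the same base (2048) and the same `max_grow`, driven by the `max_size: 384` constraint in the allocation lines.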
>03:49:51,190 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:51,190 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:51,193 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:49:51,195 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:49:51,196 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:49:51,196 DEBUG storage.ui: allocating partition: req4 ; id: 28 ; disks: ['sdd'] ;
>boot: False ; primary: False ; size: 1MB ; grow: True ; max_size: 384
>03:49:51,196 DEBUG storage.ui: checking freespace on sdd
>03:49:51,197 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdd part_type=0 req_size=1MB boot=False best=None grow=True
>03:49:51,198 DEBUG storage.ui: current free range is 63-2047 (0MB)
>03:49:51,198 DEBUG storage.ui: current free range is 7202816-24575999 (8483MB)
>03:49:51,199 DEBUG storage.ui: evaluating growth potential for new layout
>03:49:51,199 DEBUG storage.ui: calculating growth for disk /dev/sdd
>03:49:51,202 DEBUG storage.ui: PartitionDevice._setPartedPartition: req4 ;
>03:49:51,202 DEBUG storage.ui: device req4 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None
> number: 3 path: /dev/sdd3 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae04f22850> PedPartition: <_ped.Partition object at 0x7fae04f8e290>
>03:49:51,205 DEBUG storage.ui: PartitionDevice._setDisk: sdd3 ; new: sdd ; old: None ;
>03:49:51,208 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdd ;
>03:49:51,208 DEBUG storage.ui: adding request 40 to chunk 24575937 (63-24575999) on /dev/sdd
>03:49:51,209 DEBUG storage.ui: adding request 34 to chunk 24575937 (63-24575999) on /dev/sdd
>03:49:51,209 DEBUG storage.ui: adding request 28 to chunk 24575937 (63-24575999) on /dev/sdd
>03:49:51,209 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999
>sectorSize = 512
>
>03:49:51,210 DEBUG storage.ui: req: PartitionRequest instance --
>id = 40 name = sdd1 growable = False
>base = 6152192 growth = 0 max_grow = 0
>done = True
>03:49:51,210 DEBUG storage.ui: req: PartitionRequest instance --
>id = 34 name = sdd2 growable = False
>base = 1048576 growth = 0 max_grow = 0
>done = True
>03:49:51,211 DEBUG storage.ui: req: PartitionRequest instance --
>id = 28 name = sdd3 growable = True
>base = 2048 growth = 0 max_grow = 784384
>done = False
>03:49:51,211 DEBUG storage.ui: 1 requests and 17373121 (8482MB) left in chunk
>03:49:51,211 DEBUG storage.ui: adding 17373121 (8482MB) to 28 (sdd3)
>03:49:51,212 DEBUG storage.ui: taking back 16588737 (8099MB) from 28 (sdd3)
>03:49:51,212 DEBUG storage.ui: new grow amount for request 28 (sdd3) is 784384 units, or 383MB
>03:49:51,212 DEBUG storage.ui: request 40 (sdd1) growth: 0 (0MB) size: 3004MB
>03:49:51,213 DEBUG storage.ui: request 34 (sdd2) growth: 0 (0MB) size: 512MB
>03:49:51,213 DEBUG storage.ui: request 28 (sdd3) growth: 784384 (383MB) size: 384MB
>03:49:51,213 DEBUG storage.ui: disk /dev/sdd growth: 784384 (383MB)
>03:49:51,214 DEBUG storage.ui: calculating growth for disk /dev/sdb
>03:49:51,215 DEBUG storage.ui: adding request 38 to chunk 24575937 (63-24575999) on /dev/sdb
>03:49:51,216 DEBUG storage.ui: adding request 32 to chunk 24575937 (63-24575999) on /dev/sdb
>03:49:51,216 DEBUG storage.ui: adding request 26 to chunk 24575937 (63-24575999) on /dev/sdb
>03:49:51,216 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999
>sectorSize = 512
>
>03:49:51,217 DEBUG storage.ui: req: PartitionRequest instance --
>id = 38 name = sdb1 growable = False
>base = 6152192 growth = 0 max_grow = 0
>done = True
>03:49:51,217 DEBUG storage.ui: req: PartitionRequest instance --
>id = 32 name = sdb2 growable = False
>base = 1048576 growth = 0 max_grow = 0
>done = True
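The `getBestFreeSpaceRegion` lines above show two candidate free ranges per disk, 63-2047 (too small for even the 1MB request) and 7202816-24575999 (8483MB), with the larger one being chosen. A simplified stand-in for that decision (the real function also weighs partition type, boot flags, and grow potential):

```python
def best_free_region(ranges, req_sectors):
    """Pick the largest free (start, end) sector range that fits the request.

    A simplified stand-in for the getBestFreeSpaceRegion decision traced
    in the log; not the actual anaconda/blivet implementation.
    """
    best = None
    for start, end in ranges:
        length = end - start + 1          # inclusive sector range
        if length >= req_sectors and (best is None or length > best[2]):
            best = (start, end, length)
    return best

# Free ranges reported for the disks above; a 1MB request (2048 sectors)
# rejects 63-2047 (only 1985 sectors) and lands in the 8483MB range.
print(best_free_region([(63, 2047), (7202816, 24575999)], 2048))
# -> (7202816, 24575999, 17373184)
```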
>03:49:51,218 DEBUG storage.ui: req: PartitionRequest instance --
>id = 26 name = sdb3 growable = True
>base = 2048 growth = 0 max_grow = 784384
>done = False
>03:49:51,218 DEBUG storage.ui: 1 requests and 17373121 (8482MB) left in chunk
>03:49:51,218 DEBUG storage.ui: adding 17373121 (8482MB) to 26 (sdb3)
>03:49:51,219 DEBUG storage.ui: taking back 16588737 (8099MB) from 26 (sdb3)
>03:49:51,219 DEBUG storage.ui: new grow amount for request 26 (sdb3) is 784384 units, or 383MB
>03:49:51,219 DEBUG storage.ui: request 38 (sdb1) growth: 0 (0MB) size: 3004MB
>03:49:51,220 DEBUG storage.ui: request 32 (sdb2) growth: 0 (0MB) size: 512MB
>03:49:51,220 DEBUG storage.ui: request 26 (sdb3) growth: 784384 (383MB) size: 384MB
>03:49:51,221 DEBUG storage.ui: disk /dev/sdb growth: 784384 (383MB)
>03:49:51,221 DEBUG storage.ui: calculating growth for disk /dev/sdc
>03:49:51,221 DEBUG storage.ui: adding request 39 to chunk 24575937 (63-24575999) on /dev/sdc
>03:49:51,222 DEBUG storage.ui: adding request 33 to chunk 24575937 (63-24575999) on /dev/sdc
>03:49:51,222 DEBUG storage.ui: adding request 27 to chunk 24575937 (63-24575999) on /dev/sdc
>03:49:51,223 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999
>sectorSize = 512
>
>03:49:51,223 DEBUG storage.ui: req: PartitionRequest instance --
>id = 39 name = sdc1 growable = False
>base = 6152192 growth = 0 max_grow = 0
>done = True
>03:49:51,223 DEBUG storage.ui: req: PartitionRequest instance --
>id = 33 name = sdc2 growable = False
>base = 1048576 growth = 0 max_grow = 0
>done = True
>03:49:51,224 DEBUG storage.ui: req: PartitionRequest instance --
>id = 27 name = sdc3 growable = True
>base = 2048 growth = 0 max_grow = 784384
>done = False
>03:49:51,224 DEBUG storage.ui: 1 requests and 17373121 (8482MB) left in chunk
>03:49:51,225 DEBUG storage.ui: adding 17373121 (8482MB) to 27 (sdc3)
>03:49:51,225 DEBUG storage.ui: taking back 16588737 (8099MB) from 27 (sdc3)
>03:49:51,226 DEBUG storage.ui: new grow amount for request 27 (sdc3) is 784384 units, or 383MB
>03:49:51,226 DEBUG storage.ui: request 39 (sdc1) growth: 0 (0MB) size: 3004MB
>03:49:51,226 DEBUG storage.ui: request 33 (sdc2) growth: 0 (0MB) size: 512MB
>03:49:51,227 DEBUG storage.ui: request 27 (sdc3) growth: 784384 (383MB) size: 384MB
>03:49:51,227 DEBUG storage.ui: disk /dev/sdc growth: 784384 (383MB)
>03:49:51,227 DEBUG storage.ui: calculating growth for disk /dev/sda
>03:49:51,228 DEBUG storage.ui: adding request 37 to chunk 24575937 (63-24575999) on /dev/sda
>03:49:51,228 DEBUG storage.ui: adding request 31 to chunk 24575937 (63-24575999) on /dev/sda
>03:49:51,229 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda
>03:49:51,229 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999
>sectorSize = 512
>
>03:49:51,230 DEBUG storage.ui: req: PartitionRequest instance --
>id = 37 name = sda1 growable = False
>base = 6152192 growth = 0 max_grow = 0
>done = True
>03:49:51,230 DEBUG storage.ui: req: PartitionRequest instance --
>id = 31 name = sda2 growable = False
>base = 1048576 growth = 0 max_grow = 0
>done = True
>03:49:51,230 DEBUG storage.ui: req: PartitionRequest instance --
>id = 25 name = sda3 growable = True
>base = 2048 growth = 0 max_grow = 784384
>done = False
>03:49:51,231 DEBUG storage.ui: 1 requests and 17373121 (8482MB) left in chunk
>03:49:51,231 DEBUG storage.ui: adding 17373121 (8482MB) to 25 (sda3)
>03:49:51,232 DEBUG storage.ui: taking back 16588737 (8099MB) from 25 (sda3)
>03:49:51,232 DEBUG storage.ui: new grow amount for request 25 (sda3) is 784384 units, or 383MB
>03:49:51,232 DEBUG storage.ui: request 37 (sda1) growth: 0 (0MB) size: 3004MB
>03:49:51,233 DEBUG storage.ui: request 31 (sda2) growth: 0 (0MB) size: 512MB
>03:49:51,233 DEBUG storage.ui: request 25 (sda3) growth: 784384 (383MB) size: 384MB
>03:49:51,233 DEBUG storage.ui: disk /dev/sda growth: 784384 (383MB)
>03:49:51,235 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd3 ;
>03:49:51,236 DEBUG storage.ui: device sdd3 new partedPartition None
>03:49:51,238 DEBUG storage.ui: PartitionDevice._setDisk: req4 ; new: None ; old: sdd ;
>03:49:51,240 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sdd ;
>03:49:51,241 DEBUG storage.ui: total growth: 3137536 sectors
>03:49:51,241 DEBUG storage.ui: updating use_disk to sdd, type: 0
>03:49:51,242 DEBUG storage.ui: new free: 7202816-24575999 / 8483MB
>03:49:51,242 DEBUG storage.ui: new free allows for 3137536 sectors of growth
>03:49:51,243 DEBUG storage.ui: created partition sdd3 of 1MB and added it to /dev/sdd
>03:49:51,245 DEBUG storage.ui: PartitionDevice._setPartedPartition: req4 ;
>03:49:51,246 DEBUG storage.ui: device req4 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None
> number: 3 path: /dev/sdd3 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae04f1df50> PedPartition: <_ped.Partition object at 0x7fae04fb3ad0>
>03:49:51,248 DEBUG storage.ui: PartitionDevice._setDisk: sdd3 ; new: sdd ; old: None ;
>03:49:51,251 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdd ;
>03:49:51,254 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd3 ;
>03:49:51,254 DEBUG storage.ui: device sdd3 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None
> number: 3 path: /dev/sdd3 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae04f04250> PedPartition: <_ped.Partition object at 0x7fae04f8e0b0>
>03:49:51,255 DEBUG storage.ui: growPartitions: disks=['sda', 'sdb', 'sdc', 'sdd'], partitions=['sda2(id 31)', 'sda3(id 25)', 'sda1(id 37)', 'sdb2(id 32)', 'sdb3(id 26)', 'sdb1(id 38)', 'sdc2(id 33)', 'sdc3(id 27)', 'sdc1(id 39)', 'sdd2(id 34)', 'sdd3(id 28)', 'sdd1(id 40)']
>03:49:51,255 DEBUG storage.ui: growable partitions are ['sda3', 'sdb3', 'sdc3', 'sdd3']
>03:49:51,256 DEBUG storage.ui: adding request 31 to chunk 24575937 (63-24575999) on /dev/sda
>03:49:51,257 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda
>03:49:51,257 DEBUG storage.ui: adding request 37 to chunk 24575937 (63-24575999) on /dev/sda
>03:49:51,257 DEBUG storage.ui: disk sda has 1 chunks
>03:49:51,258 DEBUG storage.ui: adding request 32 to chunk 24575937 (63-24575999) on /dev/sdb
>03:49:51,259 DEBUG storage.ui: adding request 26 to chunk 24575937 (63-24575999) on /dev/sdb
>03:49:51,259 DEBUG storage.ui: adding request 38 to chunk 24575937 (63-24575999) on /dev/sdb
>03:49:51,260 DEBUG storage.ui: disk sdb has 1 chunks
>03:49:51,260 DEBUG storage.ui: adding request 33 to chunk 24575937 (63-24575999) on /dev/sdc
>03:49:51,261 DEBUG storage.ui: adding request 27 to chunk 24575937 (63-24575999) on /dev/sdc
>03:49:51,261 DEBUG storage.ui: adding request 39 to chunk 24575937 (63-24575999) on /dev/sdc
>03:49:51,262 DEBUG storage.ui: disk sdc has 1 chunks
>03:49:51,262 DEBUG storage.ui: adding request 34 to chunk 24575937 (63-24575999) on /dev/sdd
>03:49:51,263 DEBUG storage.ui: adding request 28 to chunk 24575937 (63-24575999) on /dev/sdd
>03:49:51,263 DEBUG storage.ui: adding request 40 to chunk 24575937 (63-24575999) on /dev/sdd
>03:49:51,263 DEBUG storage.ui: disk sdd has 1 chunks
>03:49:51,264 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999
>sectorSize = 512
>
>03:49:51,264 DEBUG storage.ui: req: PartitionRequest instance --
>id = 37 name = sda1 growable = False
>base = 6152192 growth = 0 max_grow = 0
>done = True
>03:49:51,264 DEBUG storage.ui: req: PartitionRequest instance --
>id = 31 name = sda2 growable = False
>base = 1048576 growth = 0 max_grow = 0
>done = True
>03:49:51,265 DEBUG storage.ui: req: PartitionRequest instance --
>id = 25 name = sda3 growable = True
>base = 2048 growth = 0 max_grow = 784384
>done = False
>03:49:51,265 DEBUG storage.ui: 1 requests and 17373121 (8482MB) left in chunk
>03:49:51,266 DEBUG storage.ui: adding 17373121 (8482MB) to 25 (sda3)
>03:49:51,266 DEBUG storage.ui: taking back 16588737 (8099MB) from 25 (sda3)
>03:49:51,266 DEBUG storage.ui: new grow amount for request 25 (sda3) is 784384 units, or 383MB
>03:49:51,267 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999
>sectorSize = 512
>
>03:49:51,267 DEBUG storage.ui: req: PartitionRequest instance --
>id = 38 name = sdb1 growable = False
>base = 6152192 growth = 0 max_grow = 0
>done = True
>03:49:51,268 DEBUG storage.ui: req: PartitionRequest instance --
>id = 32 name = sdb2 growable = False
>base = 1048576 growth = 0 max_grow = 0
>done = True
>03:49:51,268 DEBUG storage.ui: req: PartitionRequest instance --
>id = 26 name = sdb3 growable = True
>base = 2048 growth = 0 max_grow = 784384
>done = False
>03:49:51,268 DEBUG storage.ui: 1 requests and 17373121 (8482MB) left in chunk
>03:49:51,269 DEBUG storage.ui: adding 17373121 (8482MB) to 26 (sdb3)
>03:49:51,269 DEBUG storage.ui: taking back 16588737 (8099MB) from 26 (sdb3)
>03:49:51,270 DEBUG storage.ui: new grow amount for request 26 (sdb3) is 784384 units, or 383MB
>03:49:51,270 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999
>sectorSize = 512
>
>03:49:51,270 DEBUG storage.ui: req: PartitionRequest instance --
>id = 39 name = sdc1 growable = False
>base = 6152192 growth = 0 max_grow = 0
>done = True
>03:49:51,271 DEBUG storage.ui: req: PartitionRequest instance --
>id = 33 name = sdc2 growable = False
>base = 1048576 growth = 0 max_grow = 0
>done = True
>03:49:51,271 DEBUG storage.ui: req: PartitionRequest instance --
>id = 27 name = sdc3 growable = True
>base = 2048 growth = 0 max_grow = 784384
>done = False
>03:49:51,272 DEBUG storage.ui: 1 requests and 17373121 (8482MB) left in chunk
>03:49:51,272 DEBUG storage.ui: adding 17373121 (8482MB) to 27 (sdc3)
>03:49:51,272 DEBUG storage.ui: taking back 16588737 (8099MB) from 27 (sdc3)
>03:49:51,273 DEBUG storage.ui: new grow amount for request 27 (sdc3) is 784384 units, or 383MB
>03:49:51,273 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999
>sectorSize = 512
>
>03:49:51,274 DEBUG storage.ui: req: PartitionRequest instance --
>id = 40 name = sdd1 growable = False
>base = 6152192 growth = 0 max_grow = 0
>done = True
>03:49:51,274 DEBUG storage.ui: req: PartitionRequest instance --
>id = 34 name = sdd2 growable = False
>base = 1048576 growth = 0 max_grow = 0
>done = True
>03:49:51,274 DEBUG storage.ui: req: PartitionRequest instance --
>id = 28 name = sdd3 growable = True
>base = 2048 growth = 0 max_grow = 784384
>done = False
>03:49:51,275 DEBUG storage.ui: 1 requests and 17373121 (8482MB) left in chunk
>03:49:51,275 DEBUG storage.ui: adding 17373121 (8482MB) to 28 (sdd3)
>03:49:51,276 DEBUG storage.ui: taking back 16588737 (8099MB) from 28 (sdd3)
>03:49:51,276 DEBUG storage.ui: new grow amount for request 28 (sdd3) is 784384 units, or 383MB
>03:49:51,276 DEBUG storage.ui: set: ['sda3', 'sdb3', 'sdc3', 'sdd3'] 384
>03:49:51,277 DEBUG storage.ui: min growth is 784384
>03:49:51,277 DEBUG storage.ui: max growth for PartitionRequest instance --
>id = 25 name = sda3 growable = True
>base = 2048 growth = 784384 max_grow = 784384
>done = True is 784384
>03:49:51,277 DEBUG storage.ui: max growth for PartitionRequest instance --
>id = 26 name = sdb3 growable = True
>base = 2048 growth = 784384 max_grow = 784384
>done = True is 784384
>03:49:51,278 DEBUG storage.ui: max growth for PartitionRequest instance --
>id = 27 name = sdc3 growable = True
>base = 2048 growth = 784384 max_grow = 784384
>done = True is 784384
>03:49:51,278 DEBUG storage.ui: max growth for PartitionRequest instance --
>id = 28 name = sdd3 growable = True
>base = 2048 growth = 784384 max_grow = 784384
>done = True is 784384
>03:49:51,279 DEBUG storage.ui: set: ['sda3', 'sdb3', 'sdc3', 'sdd3'] 384
>03:49:51,279 DEBUG storage.ui: min growth is 784384
>03:49:51,280 DEBUG storage.ui: max growth for PartitionRequest instance --
>id = 25 name = sda3 growable = True
>base = 2048 growth = 784384 max_grow = 784384
>done = True is 784384
>03:49:51,280 DEBUG storage.ui: max growth for PartitionRequest instance --
>id = 26 name = sdb3 growable = True
>base = 2048 growth = 784384 max_grow = 784384
>done = True is 784384
>03:49:51,280 DEBUG storage.ui: max growth for PartitionRequest instance --
>id = 27 name = sdc3 growable = True
>base = 2048 growth = 784384 max_grow = 784384
>done = True is 784384
>03:49:51,281 DEBUG storage.ui: max growth for PartitionRequest instance --
>id = 28 name = sdd3 growable = True
>base = 2048 growth = 784384 max_grow = 784384
>done = True is 784384
>03:49:51,281 DEBUG storage.ui: growing partitions on sda
>03:49:51,282 DEBUG storage.ui: partition sda1 (37): 0
>03:49:51,282 DEBUG storage.ui: new geometry for sda1: parted.Geometry instance --
> start: 2048 end: 6154239 length: 6152192
> device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae04fa5ed0>
>03:49:51,283 DEBUG storage.ui: partition sda2 (31): 0
>03:49:51,283 DEBUG storage.ui: new geometry for sda2: parted.Geometry instance --
> start: 6154240 end: 7202815 length: 1048576
> device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae04f22810>
>03:49:51,284 DEBUG storage.ui: partition sda3 (25): 0
>03:49:51,285 DEBUG storage.ui: new geometry for sda3: parted.Geometry instance --
> start: 7202816 end: 7989247 length: 786432
> device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae04f22590>
>03:49:51,285 DEBUG storage.ui: removing all non-preexisting partitions ['sda1(id 37)', 'sda2(id 31)', 'sda3(id 25)'] from disk(s) ['sda']
>03:49:51,288 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ;
>03:49:51,288 DEBUG storage.ui: device sda1 new partedPartition None
>03:49:51,290 DEBUG storage.ui: PartitionDevice._setDisk: req11 ; new: None ; old: sda ; >03:49:51,293 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sda ; >03:49:51,295 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda2 ; >03:49:51,295 DEBUG storage.ui: device sda2 new partedPartition None >03:49:51,298 DEBUG storage.ui: PartitionDevice._setDisk: req6 ; new: None ; old: sda ; >03:49:51,300 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sda ; >03:49:51,303 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda3 ; >03:49:51,304 DEBUG storage.ui: device sda3 new partedPartition None >03:49:51,306 DEBUG storage.ui: PartitionDevice._setDisk: req1 ; new: None ; old: sda ; >03:49:51,308 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sda ; >03:49:51,309 DEBUG storage.ui: back from removeNewPartitions >03:49:51,309 DEBUG storage.ui: extended: None >03:49:51,310 DEBUG storage.ui: setting req11 new geometry: parted.Geometry instance -- > start: 2048 end: 6154239 length: 6152192 > device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae04fa5ed0> >03:49:51,312 DEBUG storage.ui: PartitionDevice._setPartedPartition: req11 ; >03:49:51,313 DEBUG storage.ui: device req11 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f22c10> PedPartition: <_ped.Partition object at 0x7fae04f8e230> >03:49:51,315 DEBUG storage.ui: PartitionDevice._setDisk: sda1 ; new: sda ; old: None ; >03:49:51,318 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sda ; >03:49:51,320 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:49:51,321 DEBUG storage.ui: device sda1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: 
/dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f22e10> PedPartition: <_ped.Partition object at 0x7fae04f8e4d0> >03:49:51,322 DEBUG storage.ui: setting req6 new geometry: parted.Geometry instance -- > start: 6154240 end: 7202815 length: 1048576 > device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae04f22810> >03:49:51,324 DEBUG storage.ui: PartitionDevice._setPartedPartition: req6 ; >03:49:51,325 DEBUG storage.ui: device req6 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 2 path: /dev/sda2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f00990> PedPartition: <_ped.Partition object at 0x7fae04f8edd0> >03:49:51,327 DEBUG storage.ui: PartitionDevice._setDisk: sda2 ; new: sda ; old: None ; >03:49:51,330 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sda ; >03:49:51,333 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda2 ; >03:49:51,333 DEBUG storage.ui: device sda2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 2 path: /dev/sda2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f22f10> PedPartition: <_ped.Partition object at 0x7fae04f8e2f0> >03:49:51,334 DEBUG storage.ui: setting req1 new geometry: parted.Geometry instance -- > start: 7202816 end: 7989247 length: 786432 > device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae04f22590> >03:49:51,339 DEBUG storage.ui: PartitionDevice._setPartedPartition: req1 ; >03:49:51,340 DEBUG storage.ui: device req1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 3 path: /dev/sda3 type: 0 > name: None active: True busy: 
False > geometry: <parted.geometry.Geometry object at 0x7fae04f1d990> PedPartition: <_ped.Partition object at 0x7fae04f8ee30> >03:49:51,342 DEBUG storage.ui: PartitionDevice._setDisk: sda3 ; new: sda ; old: None ; >03:49:51,345 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sda ; >03:49:51,348 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda3 ; >03:49:51,348 DEBUG storage.ui: device sda3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 3 path: /dev/sda3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f00410> PedPartition: <_ped.Partition object at 0x7fae04f8e8f0> >03:49:51,349 DEBUG storage.ui: growing partitions on sdb >03:49:51,349 DEBUG storage.ui: partition sdb1 (38): 0 >03:49:51,350 DEBUG storage.ui: new geometry for sdb1: parted.Geometry instance -- > start: 2048 end: 6154239 length: 6152192 > device: <parted.device.Device object at 0x7fae05b09ad0> PedGeometry: <_ped.Geometry object at 0x7fae04fb2910> >03:49:51,350 DEBUG storage.ui: partition sdb2 (32): 0 >03:49:51,351 DEBUG storage.ui: new geometry for sdb2: parted.Geometry instance -- > start: 6154240 end: 7202815 length: 1048576 > device: <parted.device.Device object at 0x7fae05b09ad0> PedGeometry: <_ped.Geometry object at 0x7fae04fb28d0> >03:49:51,351 DEBUG storage.ui: partition sdb3 (26): 0 >03:49:51,352 DEBUG storage.ui: new geometry for sdb3: parted.Geometry instance -- > start: 7202816 end: 7989247 length: 786432 > device: <parted.device.Device object at 0x7fae05b09ad0> PedGeometry: <_ped.Geometry object at 0x7fae04fb2a10> >03:49:51,352 DEBUG storage.ui: removing all non-preexisting partitions ['sdb1(id 38)', 'sdb2(id 32)', 'sdb3(id 26)'] from disk(s) ['sdb'] >03:49:51,355 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ; >03:49:51,355 DEBUG storage.ui: device sdb1 new partedPartition None >03:49:51,357 DEBUG storage.ui: 
PartitionDevice._setDisk: req12 ; new: None ; old: sdb ; >03:49:51,360 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sdb ; >03:49:51,362 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb2 ; >03:49:51,362 DEBUG storage.ui: device sdb2 new partedPartition None >03:49:51,365 DEBUG storage.ui: PartitionDevice._setDisk: req7 ; new: None ; old: sdb ; >03:49:51,367 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdb ; >03:49:51,369 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb3 ; >03:49:51,370 DEBUG storage.ui: device sdb3 new partedPartition None >03:49:51,372 DEBUG storage.ui: PartitionDevice._setDisk: req2 ; new: None ; old: sdb ; >03:49:51,374 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdb ; >03:49:51,375 DEBUG storage.ui: back from removeNewPartitions >03:49:51,375 DEBUG storage.ui: extended: None >03:49:51,375 DEBUG storage.ui: setting req12 new geometry: parted.Geometry instance -- > start: 2048 end: 6154239 length: 6152192 > device: <parted.device.Device object at 0x7fae05b09ad0> PedGeometry: <_ped.Geometry object at 0x7fae04fb2910> >03:49:51,378 DEBUG storage.ui: PartitionDevice._setPartedPartition: req12 ; >03:49:51,379 DEBUG storage.ui: device req12 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f22f90> PedPartition: <_ped.Partition object at 0x7fae04fb3cb0> >03:49:51,381 DEBUG storage.ui: PartitionDevice._setDisk: sdb1 ; new: sdb ; old: None ; >03:49:51,383 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdb ; >03:49:51,386 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ; >03:49:51,387 DEBUG storage.ui: device sdb1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None 
active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f22790> PedPartition: <_ped.Partition object at 0x7fae04f8e530> >03:49:51,388 DEBUG storage.ui: setting req7 new geometry: parted.Geometry instance -- > start: 6154240 end: 7202815 length: 1048576 > device: <parted.device.Device object at 0x7fae05b09ad0> PedGeometry: <_ped.Geometry object at 0x7fae04fb28d0> >03:49:51,390 DEBUG storage.ui: PartitionDevice._setPartedPartition: req7 ; >03:49:51,391 DEBUG storage.ui: device req7 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 2 path: /dev/sdb2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04efa510> PedPartition: <_ped.Partition object at 0x7fae04f8e230> >03:49:51,393 DEBUG storage.ui: PartitionDevice._setDisk: sdb2 ; new: sdb ; old: None ; >03:49:51,396 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdb ; >03:49:51,398 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb2 ; >03:49:51,399 DEBUG storage.ui: device sdb2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 2 path: /dev/sdb2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f1e1d0> PedPartition: <_ped.Partition object at 0x7fae04f8e650> >03:49:51,400 DEBUG storage.ui: setting req2 new geometry: parted.Geometry instance -- > start: 7202816 end: 7989247 length: 786432 > device: <parted.device.Device object at 0x7fae05b09ad0> PedGeometry: <_ped.Geometry object at 0x7fae04fb2a10> >03:49:51,402 DEBUG storage.ui: PartitionDevice._setPartedPartition: req2 ; >03:49:51,403 DEBUG storage.ui: device req2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 3 path: /dev/sdb3 type: 0 > name: None active: True busy: False > geometry: 
<parted.geometry.Geometry object at 0x7fae04f22c90> PedPartition: <_ped.Partition object at 0x7fae04f8edd0> >03:49:51,406 DEBUG storage.ui: PartitionDevice._setDisk: sdb3 ; new: sdb ; old: None ; >03:49:51,408 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdb ; >03:49:51,411 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb3 ; >03:49:51,412 DEBUG storage.ui: device sdb3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 3 path: /dev/sdb3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f1e390> PedPartition: <_ped.Partition object at 0x7fae04f8ee90> >03:49:51,412 DEBUG storage.ui: growing partitions on sdc >03:49:51,413 DEBUG storage.ui: partition sdc1 (39): 0 >03:49:51,413 DEBUG storage.ui: new geometry for sdc1: parted.Geometry instance -- > start: 2048 end: 6154239 length: 6152192 > device: <parted.device.Device object at 0x7fae05aee390> PedGeometry: <_ped.Geometry object at 0x7fae04f22c50> >03:49:51,414 DEBUG storage.ui: partition sdc2 (33): 0 >03:49:51,414 DEBUG storage.ui: new geometry for sdc2: parted.Geometry instance -- > start: 6154240 end: 7202815 length: 1048576 > device: <parted.device.Device object at 0x7fae05aee390> PedGeometry: <_ped.Geometry object at 0x7fae04f20b50> >03:49:51,415 DEBUG storage.ui: partition sdc3 (27): 0 >03:49:51,415 DEBUG storage.ui: new geometry for sdc3: parted.Geometry instance -- > start: 7202816 end: 7989247 length: 786432 > device: <parted.device.Device object at 0x7fae05aee390> PedGeometry: <_ped.Geometry object at 0x7fae04f200d0> >03:49:51,416 DEBUG storage.ui: removing all non-preexisting partitions ['sdc1(id 39)', 'sdc2(id 33)', 'sdc3(id 27)'] from disk(s) ['sdc'] >03:49:51,418 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ; >03:49:51,418 DEBUG storage.ui: device sdc1 new partedPartition None >03:49:51,421 DEBUG storage.ui: PartitionDevice._setDisk: req13 ; new: 
None ; old: sdc ; >03:49:51,423 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sdc ; >03:49:51,426 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc2 ; >03:49:51,426 DEBUG storage.ui: device sdc2 new partedPartition None >03:49:51,429 DEBUG storage.ui: PartitionDevice._setDisk: req8 ; new: None ; old: sdc ; >03:49:51,431 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdc ; >03:49:51,433 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc3 ; >03:49:51,434 DEBUG storage.ui: device sdc3 new partedPartition None >03:49:51,436 DEBUG storage.ui: PartitionDevice._setDisk: req3 ; new: None ; old: sdc ; >03:49:51,438 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdc ; >03:49:51,439 DEBUG storage.ui: back from removeNewPartitions >03:49:51,439 DEBUG storage.ui: extended: None >03:49:51,439 DEBUG storage.ui: setting req13 new geometry: parted.Geometry instance -- > start: 2048 end: 6154239 length: 6152192 > device: <parted.device.Device object at 0x7fae05aee390> PedGeometry: <_ped.Geometry object at 0x7fae04f22c50> >03:49:51,442 DEBUG storage.ui: PartitionDevice._setPartedPartition: req13 ; >03:49:51,443 DEBUG storage.ui: device req13 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 1 path: /dev/sdc1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f1ef90> PedPartition: <_ped.Partition object at 0x7fae04fb3f50> >03:49:51,445 DEBUG storage.ui: PartitionDevice._setDisk: sdc1 ; new: sdc ; old: None ; >03:49:51,447 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdc ; >03:49:51,450 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ; >03:49:51,451 DEBUG storage.ui: device sdc1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 1 path: /dev/sdc1 type: 0 > name: None active: True busy: False > geometry: 
<parted.geometry.Geometry object at 0x7fae04f22c50> PedPartition: <_ped.Partition object at 0x7fae04f8e350> >03:49:51,451 DEBUG storage.ui: setting req8 new geometry: parted.Geometry instance -- > start: 6154240 end: 7202815 length: 1048576 > device: <parted.device.Device object at 0x7fae05aee390> PedGeometry: <_ped.Geometry object at 0x7fae04f20b50> >03:49:51,454 DEBUG storage.ui: PartitionDevice._setPartedPartition: req8 ; >03:49:51,455 DEBUG storage.ui: device req8 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 2 path: /dev/sdc2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f22550> PedPartition: <_ped.Partition object at 0x7fae04f8eef0> >03:49:51,457 DEBUG storage.ui: PartitionDevice._setDisk: sdc2 ; new: sdc ; old: None ; >03:49:51,459 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdc ; >03:49:51,462 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc2 ; >03:49:51,463 DEBUG storage.ui: device sdc2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 2 path: /dev/sdc2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f004d0> PedPartition: <_ped.Partition object at 0x7fae04f8ecb0> >03:49:51,463 DEBUG storage.ui: setting req3 new geometry: parted.Geometry instance -- > start: 7202816 end: 7989247 length: 786432 > device: <parted.device.Device object at 0x7fae05aee390> PedGeometry: <_ped.Geometry object at 0x7fae04f200d0> >03:49:51,466 DEBUG storage.ui: PartitionDevice._setPartedPartition: req3 ; >03:49:51,466 DEBUG storage.ui: device req3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 3 path: /dev/sdc3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f22150> 
PedPartition: <_ped.Partition object at 0x7fae04f8ef50> >03:49:51,469 DEBUG storage.ui: PartitionDevice._setDisk: sdc3 ; new: sdc ; old: None ; >03:49:51,471 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdc ; >03:49:51,473 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc3 ; >03:49:51,474 DEBUG storage.ui: device sdc3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 3 path: /dev/sdc3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fb2950> PedPartition: <_ped.Partition object at 0x7fae04f8e3b0> >03:49:51,475 DEBUG storage.ui: growing partitions on sdd >03:49:51,475 DEBUG storage.ui: partition sdd1 (40): 0 >03:49:51,476 DEBUG storage.ui: new geometry for sdd1: parted.Geometry instance -- > start: 2048 end: 6154239 length: 6152192 > device: <parted.device.Device object at 0x7fae05b09b90> PedGeometry: <_ped.Geometry object at 0x7fae04f222d0> >03:49:51,476 DEBUG storage.ui: partition sdd2 (34): 0 >03:49:51,477 DEBUG storage.ui: new geometry for sdd2: parted.Geometry instance -- > start: 6154240 end: 7202815 length: 1048576 > device: <parted.device.Device object at 0x7fae05b09b90> PedGeometry: <_ped.Geometry object at 0x7fae04f22c90> >03:49:51,477 DEBUG storage.ui: partition sdd3 (28): 0 >03:49:51,478 DEBUG storage.ui: new geometry for sdd3: parted.Geometry instance -- > start: 7202816 end: 7989247 length: 786432 > device: <parted.device.Device object at 0x7fae05b09b90> PedGeometry: <_ped.Geometry object at 0x7fae04fb2bd0> >03:49:51,478 DEBUG storage.ui: removing all non-preexisting partitions ['sdd1(id 40)', 'sdd2(id 34)', 'sdd3(id 28)'] from disk(s) ['sdd'] >03:49:51,481 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ; >03:49:51,481 DEBUG storage.ui: device sdd1 new partedPartition None >03:49:51,483 DEBUG storage.ui: PartitionDevice._setDisk: req14 ; new: None ; old: sdd ; >03:49:51,486 DEBUG storage.ui: 
DiskDevice.removeChild: kids: 3 ; name: sdd ; >03:49:51,488 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd2 ; >03:49:51,488 DEBUG storage.ui: device sdd2 new partedPartition None >03:49:51,491 DEBUG storage.ui: PartitionDevice._setDisk: req9 ; new: None ; old: sdd ; >03:49:51,493 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdd ; >03:49:51,495 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd3 ; >03:49:51,496 DEBUG storage.ui: device sdd3 new partedPartition None >03:49:51,498 DEBUG storage.ui: PartitionDevice._setDisk: req4 ; new: None ; old: sdd ; >03:49:51,500 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdd ; >03:49:51,501 DEBUG storage.ui: back from removeNewPartitions >03:49:51,501 DEBUG storage.ui: extended: None >03:49:51,501 DEBUG storage.ui: setting req14 new geometry: parted.Geometry instance -- > start: 2048 end: 6154239 length: 6152192 > device: <parted.device.Device object at 0x7fae05b09b90> PedGeometry: <_ped.Geometry object at 0x7fae04f222d0> >03:49:51,504 DEBUG storage.ui: PartitionDevice._setPartedPartition: req14 ; >03:49:51,505 DEBUG storage.ui: device req14 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 1 path: /dev/sdd1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f00250> PedPartition: <_ped.Partition object at 0x7fae04fb3cb0> >03:49:51,507 DEBUG storage.ui: PartitionDevice._setDisk: sdd1 ; new: sdd ; old: None ; >03:49:51,510 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdd ; >03:49:51,512 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ; >03:49:51,513 DEBUG storage.ui: device sdd1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 1 path: /dev/sdd1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f00b90> 
PedPartition: <_ped.Partition object at 0x7fae04fb3f50> >03:49:51,514 DEBUG storage.ui: setting req9 new geometry: parted.Geometry instance -- > start: 6154240 end: 7202815 length: 1048576 > device: <parted.device.Device object at 0x7fae05b09b90> PedGeometry: <_ped.Geometry object at 0x7fae04f22c90> >03:49:51,516 DEBUG storage.ui: PartitionDevice._setPartedPartition: req9 ; >03:49:51,517 DEBUG storage.ui: device req9 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 2 path: /dev/sdd2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f00dd0> PedPartition: <_ped.Partition object at 0x7fae04f8e230> >03:49:51,520 DEBUG storage.ui: PartitionDevice._setDisk: sdd2 ; new: sdd ; old: None ; >03:49:51,522 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdd ; >03:49:51,525 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd2 ; >03:49:51,525 DEBUG storage.ui: device sdd2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 2 path: /dev/sdd2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f1ef90> PedPartition: <_ped.Partition object at 0x7fae04f8ed10> >03:49:51,526 DEBUG storage.ui: setting req4 new geometry: parted.Geometry instance -- > start: 7202816 end: 7989247 length: 786432 > device: <parted.device.Device object at 0x7fae05b09b90> PedGeometry: <_ped.Geometry object at 0x7fae04fb2bd0> >03:49:51,529 DEBUG storage.ui: PartitionDevice._setPartedPartition: req4 ; >03:49:51,529 DEBUG storage.ui: device req4 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 3 path: /dev/sdd3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fb2ed0> PedPartition: <_ped.Partition object at 
0x7fae04f8e950> >03:49:51,532 DEBUG storage.ui: PartitionDevice._setDisk: sdd3 ; new: sdd ; old: None ; >03:49:51,534 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdd ; >03:49:51,537 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd3 ; >03:49:51,537 DEBUG storage.ui: device sdd3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 3 path: /dev/sdd3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04efac10> PedPartition: <_ped.Partition object at 0x7fae04f8eef0> >03:49:51,538 DEBUG storage.ui: fixing size of non-existent 3004MB partition sda1 (37) with non-existent mdmember at 3004.00 >03:49:51,539 DEBUG storage.ui: fixing size of non-existent 512MB partition sda2 (31) with non-existent mdmember at 512.00 >03:49:51,539 DEBUG storage.ui: fixing size of non-existent 384MB partition sda3 (25) with non-existent mdmember at 384.00 >03:49:51,540 DEBUG storage.ui: fixing size of non-existent 3004MB partition sdb1 (38) with non-existent mdmember at 3004.00 >03:49:51,541 DEBUG storage.ui: fixing size of non-existent 512MB partition sdb2 (32) with non-existent mdmember at 512.00 >03:49:51,541 DEBUG storage.ui: fixing size of non-existent 384MB partition sdb3 (26) with non-existent mdmember at 384.00 >03:49:51,542 DEBUG storage.ui: fixing size of non-existent 3004MB partition sdc1 (39) with non-existent mdmember at 3004.00 >03:49:51,543 DEBUG storage.ui: fixing size of non-existent 512MB partition sdc2 (33) with non-existent mdmember at 512.00 >03:49:51,543 DEBUG storage.ui: fixing size of non-existent 384MB partition sdc3 (27) with non-existent mdmember at 384.00 >03:49:51,544 DEBUG storage.ui: fixing size of non-existent 3004MB partition sdd1 (40) with non-existent mdmember at 3004.00 >03:49:51,545 DEBUG storage.ui: fixing size of non-existent 512MB partition sdd2 (34) with non-existent mdmember at 512.00 >03:49:51,545 DEBUG 
storage.ui: fixing size of non-existent 384MB partition sdd3 (28) with non-existent mdmember at 384.00 >03:49:51,547 DEBUG storage.ui: new member set: ['sda3', 'sdb3', 'sdc3', 'sdd3'] >03:49:51,548 DEBUG storage.ui: old member set: ['sda3', 'sdb3', 'sdc3', 'sdd3'] >03:49:51,549 DEBUG storage.ui: raw RAID 10 size == 768.0 >03:49:51,549 INFO storage.ui: Using 0MB superBlockSize >03:49:51,550 DEBUG storage.ui: non-existent RAID 10 size == 768.0 >03:49:51,551 DEBUG storage.ui: raw RAID 10 size == 768.0 >03:49:51,552 INFO storage.ui: Using 0MB superBlockSize >03:49:51,552 DEBUG storage.ui: non-existent RAID 10 size == 768.0 >03:49:51,555 DEBUG blivet: raw RAID 10 size == 768.0 >03:49:51,556 INFO blivet: Using 0MB superBlockSize >03:49:51,556 DEBUG blivet: non-existent RAID 10 size == 768.0 >03:49:51,559 DEBUG blivet: raw RAID 1 size == 512.0 >03:49:51,560 INFO blivet: Using 0MB superBlockSize >03:49:51,560 DEBUG blivet: non-existent RAID 1 size == 512.0 >03:49:51,563 DEBUG blivet: raw RAID 10 size == 6008.0 >03:49:51,563 INFO blivet: Using 4MB superBlockSize >03:49:51,564 DEBUG blivet: non-existent RAID 10 size == 6000.0 >03:49:51,566 DEBUG blivet: raw RAID 10 size == 768.0 >03:49:51,567 INFO blivet: Using 0MB superBlockSize >03:49:51,567 DEBUG blivet: non-existent RAID 10 size == 768.0 >03:49:51,582 DEBUG blivet: raw RAID 10 size == 768.0 >03:49:51,582 INFO blivet: Using 0MB superBlockSize >03:49:51,583 DEBUG blivet: non-existent RAID 10 size == 768.0 >03:49:51,586 DEBUG blivet: raw RAID 10 size == 768.0 >03:49:51,587 INFO blivet: Using 0MB superBlockSize >03:49:51,588 DEBUG blivet: non-existent RAID 10 size == 768.0 >03:49:51,597 DEBUG blivet: raw RAID 10 size == 6008.0 >03:49:51,597 INFO blivet: Using 4MB superBlockSize >03:49:51,598 DEBUG blivet: non-existent RAID 10 size == 6000.0 >03:49:51,599 DEBUG blivet: raw RAID 10 size == 6008.0 >03:49:51,600 INFO blivet: Using 4MB superBlockSize >03:49:51,600 DEBUG blivet: non-existent RAID 10 size == 6000.0 >03:49:51,604 
DEBUG blivet: raw RAID 10 size == 6008.0 >03:49:51,605 INFO blivet: Using 4MB superBlockSize >03:49:51,606 DEBUG blivet: non-existent RAID 10 size == 6000.0 >03:49:51,610 DEBUG blivet: Ext4FS.supported: supported: True ; >03:49:51,610 DEBUG blivet: getFormat('ext4') returning Ext4FS instance >03:50:38,535 DEBUG blivet: raw RAID 10 size == 6008.0 >03:50:38,538 INFO blivet: Using 4MB superBlockSize >03:50:38,540 DEBUG blivet: non-existent RAID 10 size == 6000.0 >03:50:38,545 DEBUG blivet: raw RAID 10 size == 6008.0 >03:50:38,546 INFO blivet: Using 4MB superBlockSize >03:50:38,547 DEBUG blivet: non-existent RAID 10 size == 6000.0 >03:50:38,556 DEBUG blivet: Ext4FS.supported: supported: True ; >03:50:38,557 DEBUG blivet: getFormat('ext4') returning Ext4FS instance >03:50:38,564 DEBUG storage.ui: instantiating <class 'blivet.devicefactory.MDFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 6000, ['sda', 'sdb', 'sdc', 'sdd'], {'encrypted': False, 'raid_level': 'raid10'} >03:50:38,571 DEBUG blivet: raw RAID 1 size == 512.0 >03:50:38,572 INFO blivet: Using 0MB superBlockSize >03:50:38,573 DEBUG blivet: non-existent RAID 1 size == 512.0 >03:50:38,575 DEBUG blivet: raw RAID 1 size == 512.0 >03:50:38,575 INFO blivet: Using 0MB superBlockSize >03:50:38,576 DEBUG blivet: non-existent RAID 1 size == 512.0 >03:50:38,581 DEBUG blivet: raw RAID 1 size == 512.0 >03:50:38,582 INFO blivet: Using 0MB superBlockSize >03:50:38,582 DEBUG blivet: non-existent RAID 1 size == 512.0 >03:50:44,045 DEBUG blivet: raw RAID 1 size == 512.0 >03:50:44,045 INFO blivet: Using 0MB superBlockSize >03:50:44,046 DEBUG blivet: non-existent RAID 1 size == 512.0 >03:50:44,054 DEBUG blivet: Ext4FS.supported: supported: True ; >03:50:44,055 DEBUG blivet: getFormat('ext4') returning Ext4FS instance >03:50:44,062 DEBUG storage.ui: instantiating <class 'blivet.devicefactory.MDFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 512, ['sda', 'sdb', 'sdc', 'sdd'], {'encrypted': False, 'raid_level': 'raid1'} 
>03:50:54,846 DEBUG blivet: raw RAID 1 size == 512.0
>03:50:54,848 INFO blivet: Using 0MB superBlockSize
>03:50:54,850 DEBUG blivet: non-existent RAID 1 size == 512.0
>03:50:54,858 DEBUG blivet: raw RAID 1 size == 512.0
>03:50:54,859 INFO blivet: Using 0MB superBlockSize
>03:50:54,859 DEBUG blivet: non-existent RAID 1 size == 512.0
>03:50:54,868 DEBUG blivet: Ext4FS.supported: supported: True ;
>03:50:54,869 DEBUG blivet: getFormat('ext4') returning Ext4FS instance
>03:50:54,876 DEBUG storage.ui: instantiating <class 'blivet.devicefactory.MDFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 512, ['sda', 'sdb', 'sdc', 'sdd'], {'encrypted': False, 'raid_level': 'raid1'}
>03:50:54,884 DEBUG blivet: raw RAID 10 size == 768.0
>03:50:54,884 INFO blivet: Using 0MB superBlockSize
>03:50:54,885 DEBUG blivet: non-existent RAID 10 size == 768.0
>03:50:54,887 DEBUG blivet: raw RAID 10 size == 768.0
>03:50:54,888 INFO blivet: Using 0MB superBlockSize
>03:50:54,888 DEBUG blivet: non-existent RAID 10 size == 768.0
>03:50:54,894 DEBUG blivet: raw RAID 10 size == 768.0
>03:50:54,894 INFO blivet: Using 0MB superBlockSize
>03:50:54,895 DEBUG blivet: non-existent RAID 10 size == 768.0
>03:50:54,900 DEBUG blivet: SwapSpace.__init__:
>03:50:54,900 DEBUG blivet: getFormat('swap') returning SwapSpace instance
>03:50:56,865 DEBUG blivet: raw RAID 10 size == 768.0
>03:50:56,867 INFO blivet: Using 0MB superBlockSize
>03:50:56,868 DEBUG blivet: non-existent RAID 10 size == 768.0
>03:50:56,874 DEBUG blivet: raw RAID 10 size == 768.0
>03:50:56,875 INFO blivet: Using 0MB superBlockSize
>03:50:56,875 DEBUG blivet: non-existent RAID 10 size == 768.0
>03:50:56,883 DEBUG blivet: SwapSpace.__init__:
>03:50:56,884 DEBUG blivet: getFormat('swap') returning SwapSpace instance
>03:50:56,890 DEBUG storage.ui: instantiating <class 'blivet.devicefactory.MDFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 768, ['sda', 'sdb', 'sdc', 'sdd'], {'encrypted': False, 'raid_level': 'raid10'}
>03:50:56,903 DEBUG storage.ui: raw RAID 10 size == 768.0
>03:50:56,904 INFO storage.ui: Using 0MB superBlockSize
>03:50:56,904 DEBUG storage.ui: non-existent RAID 10 size == 768.0
>03:50:56,905 DEBUG storage.ui: Blivet.factoryDevice: 1 ; 768 ; container_raid_level: None ; name: swap ; encrypted: False ; container_encrypted: False ; disks: [DiskDevice instance (0x7fae05319950) --
>  name = sda  status = True  kids = 3 id = 1
>  parents = []
>  uuid = None  size = 12000.0
>  format = non-existent msdos disklabel
>  major = 8  minor = 0  exists = True  protected = False
>  sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda  partedDevice = parted.Device instance --
>  model: QEMU QEMU HARDDISK  path: /dev/sda  type: 1
>  sectorSize: 512  physicalSectorSize: 512
>  length: 24576000  openCount: 0  readOnly: False
>  externalMode: False  dirty: False  bootDirty: False
>  host: 0  did: 0  busy: False
>  hardwareGeometry: (12000, 64, 32)  biosGeometry: (1529, 255, 63)
>  PedDevice: <_ped.Device object at 0x7fae127eacb0>
>  target size = 0  path = /dev/sda
>  format args = []  originalFormat = disklabel  removable = False  partedDevice = <parted.device.Device object at 0x7fae053199d0>, DiskDevice instance (0x7fae05b116d0) --
>  name = sdb  status = True  kids = 3 id = 14
>  parents = []
>  uuid = None  size = 12000.0
>  format = non-existent msdos disklabel
>  major = 8  minor = 16  exists = True  protected = False
>  sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb  partedDevice = parted.Device instance --
>  model: QEMU QEMU HARDDISK  path: /dev/sdb  type: 1
>  sectorSize: 512  physicalSectorSize: 512
>  length: 24576000  openCount: 0  readOnly: False
>  externalMode: False  dirty: False  bootDirty: False
>  host: 0  did: 768  busy: False
>  hardwareGeometry: (12000, 64, 32)  biosGeometry: (1529, 255, 63)
>  PedDevice: <_ped.Device object at 0x7fae0f674a70>
>  target size = 0  path = /dev/sdb
>  format args = []  originalFormat = disklabel  removable = False  partedDevice = <parted.device.Device object at 0x7fae05b11750>, DiskDevice instance (0x7fae05aeabd0) --
>  name = sdc  status = True  kids = 3 id = 11
>  parents = []
>  uuid = None  size = 12000.0
>  format = non-existent msdos disklabel
>  major = 8  minor = 32  exists = True  protected = False
>  sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc  partedDevice = parted.Device instance --
>  model: QEMU QEMU HARDDISK  path: /dev/sdc  type: 1
>  sectorSize: 512  physicalSectorSize: 512
>  length: 24576000  openCount: 0  readOnly: False
>  externalMode: False  dirty: False  bootDirty: False
>  host: 0  did: 512  busy: False
>  hardwareGeometry: (12000, 64, 32)  biosGeometry: (1529, 255, 63)
>  PedDevice: <_ped.Device object at 0x7fae0f674170>
>  target size = 0  path = /dev/sdc
>  format args = []  originalFormat = disklabel  removable = False  partedDevice = <parted.device.Device object at 0x7fae05aeac50>, DiskDevice instance (0x7fae05aea190) --
>  name = sdd  status = True  kids = 3 id = 8
>  parents = []
>  uuid = None  size = 12000.0
>  format = non-existent msdos disklabel
>  major = 8  minor = 48  exists = True  protected = False
>  sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd  partedDevice = parted.Device instance --
>  model: QEMU QEMU HARDDISK  path: /dev/sdd  type: 1
>  sectorSize: 512  physicalSectorSize: 512
>  length: 24576000  openCount: 0  readOnly: False
>  externalMode: False  dirty: False  bootDirty: False
>  host: 0  did: 256  busy: False
>  hardwareGeometry: (12000, 64, 32)  biosGeometry: (1529, 255, 63)
>  PedDevice: <_ped.Device object at 0x7fae27c59680>
>  target size = 0  path = /dev/sdd
>  format args = []  originalFormat = disklabel  removable = False  partedDevice = <parted.device.Device object at 0x7fae05aea210>] ; raid_level: raid10 ; label:  ; container_name: None ; device: non-existent 768MB mdarray swap (29) with non-existent swap ; mountpoint: None ; fstype: swap ; container_size: 0 ;
>03:50:56,909 DEBUG storage.ui: raw RAID 10 size == 768.0
>03:50:56,909 INFO storage.ui: Using 0MB superBlockSize
>03:50:56,910 DEBUG storage.ui: non-existent RAID 10 size == 768.0
>03:50:56,911 DEBUG storage.ui: instantiating <class 'blivet.devicefactory.MDFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 768, ['sda', 'sdb', 'sdc', 'sdd'], {'container_raid_level': None, 'name': 'swap', 'encrypted': False, 'container_encrypted': False, 'raid_level': 'raid10', 'label': '', 'container_name': None, 'device': MDRaidArrayDevice instance (0x7fae04f96bd0) --
>  name = swap  status = False  kids = 0 id = 29
>  parents = ['non-existent 384MB partition sda3 (25) with non-existent mdmember',
>             'non-existent 384MB partition sdb3 (26) with non-existent mdmember',
>             'non-existent 384MB partition sdc3 (27) with non-existent mdmember',
>             'non-existent 384MB partition sdd3 (28) with non-existent mdmember']
>  uuid = None  size = 768.0
>  format = non-existent swap
>  major = 0  minor = 0  exists = False  protected = False
>  sysfs path =   partedDevice = None
>  target size = 768  path = /dev/md/swap
>  format args = None  originalFormat = swap  level = 10  spares = 0
>  members = 4
>  total devices = 4  metadata version = default, 'mountpoint': None, 'fstype': 'swap', 'container_size': 0}
>03:50:56,913 DEBUG storage.ui: MDFactory.configure: parent_factory: None ;
>03:50:56,913 DEBUG storage.ui: starting Blivet copy
>03:50:56,958 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ;
>03:50:56,960 DEBUG storage.ui: device sda1 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae04f20e50>  fileSystem: None
>  number: 1  path: /dev/sda1  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae05a49350>  PedPartition: <_ped.Partition object at 0x7fae04f8ed70>
>03:50:56,962 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda2 ;
>03:50:56,963 DEBUG storage.ui: device sda2 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae04f20e50>  fileSystem: None
>  number: 2  path: /dev/sda2  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae05a494d0>  PedPartition: <_ped.Partition object at 0x7fae04f8e050>
>03:50:56,966 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda3 ;
>03:50:56,966 DEBUG storage.ui: device sda3 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae04f20e50>  fileSystem: None
>  number: 3  path: /dev/sda3  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae05a49650>  PedPartition: <_ped.Partition object at 0x7fae04f8efb0>
>03:50:56,969 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ;
>03:50:56,970 DEBUG storage.ui: device sdb1 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae04f1fd10>  fileSystem: None
>  number: 1  path: /dev/sdb1  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae05a49790>  PedPartition: <_ped.Partition object at 0x7fae04f8e5f0>
>03:50:56,972 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb2 ;
>03:50:56,973 DEBUG storage.ui: device sdb2 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae04f1fd10>  fileSystem: None
>  number: 2  path: /dev/sdb2  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae05a49890>  PedPartition: <_ped.Partition object at 0x7fae04f8e9b0>
>03:50:56,976 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb3 ;
>03:50:56,977 DEBUG storage.ui: device sdb3 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae04f1fd10>  fileSystem: None
>  number: 3  path: /dev/sdb3  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae05a49a10>  PedPartition: <_ped.Partition object at 0x7fae04f290b0>
>03:50:56,980 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ;
>03:50:56,981 DEBUG storage.ui: device sdc1 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae04fb2ad0>  fileSystem: None
>  number: 1  path: /dev/sdc1  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae05a49b50>  PedPartition: <_ped.Partition object at 0x7fae04f29170>
>03:50:56,983 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc2 ;
>03:50:56,985 DEBUG storage.ui: device sdc2 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae04fb2ad0>  fileSystem: None
>  number: 2  path: /dev/sdc2  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae05a49c50>  PedPartition: <_ped.Partition object at 0x7fae04f291d0>
>03:50:56,987 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc3 ;
>03:50:56,988 DEBUG storage.ui: device sdc3 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae04fb2ad0>  fileSystem: None
>  number: 3  path: /dev/sdc3  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae05a49dd0>  PedPartition: <_ped.Partition object at 0x7fae04f29110>
>03:50:56,991 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ;
>03:50:56,992 DEBUG storage.ui: device sdd1 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae04f1f090>  fileSystem: None
>  number: 1  path: /dev/sdd1  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae05a49f10>  PedPartition: <_ped.Partition object at 0x7fae04f29290>
>03:50:56,994 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd2 ;
>03:50:56,995 DEBUG storage.ui: device sdd2 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae04f1f090>  fileSystem: None
>  number: 2  path: /dev/sdd2  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae05a51050>  PedPartition: <_ped.Partition object at 0x7fae04f292f0>
>03:50:56,998 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd3 ;
>03:50:56,999 DEBUG storage.ui: device sdd3 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae04f1f090>  fileSystem: None
>  number: 3  path: /dev/sdd3  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae05a511d0>  PedPartition: <_ped.Partition object at 0x7fae04f29050>
>03:50:56,999 DEBUG storage.ui: finished Blivet copy
>03:50:57,000 INFO storage.ui: Using 0MB superBlockSize
>03:50:57,001 DEBUG storage.ui: child factory class: <class 'blivet.devicefactory.PartitionSetFactory'>
>03:50:57,006 DEBUG storage.ui: child factory args: [<blivet.Blivet object at 0x7fae05326d50>, 1536.0, [DiskDevice instance (0x7fae05319950) --
>  name = sda  status = True  kids = 3 id = 1
>  parents = []
>  uuid = None  size = 12000.0
>  format = non-existent msdos disklabel
>  major = 8  minor = 0  exists = True  protected = False
>  sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda  partedDevice = parted.Device instance --
>  model: QEMU QEMU HARDDISK  path: /dev/sda  type: 1
>  sectorSize: 512  physicalSectorSize: 512
>  length: 24576000  openCount: 0  readOnly: False
>  externalMode: False  dirty: False  bootDirty: False
>  host: 0  did: 0  busy: False
>  hardwareGeometry: (12000, 64, 32)  biosGeometry: (1529, 255, 63)
>  PedDevice: <_ped.Device object at 0x7fae127eacb0>
>  target size = 0  path = /dev/sda
>  format args = []  originalFormat = disklabel  removable = False  partedDevice = <parted.device.Device object at 0x7fae053199d0>, DiskDevice instance (0x7fae05b116d0) --
>  name = sdb  status = True  kids = 3 id = 14
>  parents = []
>  uuid = None  size = 12000.0
>  format = non-existent msdos disklabel
>  major = 8  minor = 16  exists = True  protected = False
>  sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb  partedDevice = parted.Device instance --
>  model: QEMU QEMU HARDDISK  path: /dev/sdb  type: 1
>  sectorSize: 512  physicalSectorSize: 512
>  length: 24576000  openCount: 0  readOnly: False
>  externalMode: False  dirty: False  bootDirty: False
>  host: 0  did: 768  busy: False
>  hardwareGeometry: (12000, 64, 32)  biosGeometry: (1529, 255, 63)
>  PedDevice: <_ped.Device object at 0x7fae0f674a70>
>  target size = 0  path = /dev/sdb
>  format args = []  originalFormat = disklabel  removable = False  partedDevice = <parted.device.Device object at 0x7fae05b11750>, DiskDevice instance (0x7fae05aeabd0) --
>  name = sdc  status = True  kids = 3 id = 11
>  parents = []
>  uuid = None  size = 12000.0
>  format = non-existent msdos disklabel
>  major = 8  minor = 32  exists = True  protected = False
>  sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc  partedDevice = parted.Device instance --
>  model: QEMU QEMU HARDDISK  path: /dev/sdc  type: 1
>  sectorSize: 512  physicalSectorSize: 512
>  length: 24576000  openCount: 0  readOnly: False
>  externalMode: False  dirty: False  bootDirty: False
>  host: 0  did: 512  busy: False
>  hardwareGeometry: (12000, 64, 32)  biosGeometry: (1529, 255, 63)
>  PedDevice: <_ped.Device object at 0x7fae0f674170>
>  target size = 0  path = /dev/sdc
>  format args = []  originalFormat = disklabel  removable = False  partedDevice = <parted.device.Device object at 0x7fae05aeac50>, DiskDevice instance (0x7fae05aea190) --
>  name = sdd  status = True  kids = 3 id = 8
>  parents = []
>  uuid = None  size = 12000.0
>  format = non-existent msdos disklabel
>  major = 8  minor = 48  exists = True  protected = False
>  sysfs path = /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd  partedDevice = parted.Device instance --
>  model: QEMU QEMU HARDDISK  path: /dev/sdd  type: 1
>  sectorSize: 512  physicalSectorSize: 512
>  length: 24576000  openCount: 0  readOnly: False
>  externalMode: False  dirty: False  bootDirty: False
>  host: 0  did: 256  busy: False
>  hardwareGeometry: (12000, 64, 32)  biosGeometry: (1529, 255, 63)
>  PedDevice: <_ped.Device object at 0x7fae27c59680>
>  target size = 0  path = /dev/sdd
>  format args = []  originalFormat = disklabel  removable = False  partedDevice = <parted.device.Device object at 0x7fae05aea210>]]
>03:50:57,007 DEBUG storage.ui: child factory kwargs: {'fstype': 'mdmember'}
>03:50:57,009 DEBUG storage.ui: PartitionSetFactory.configure: parent_factory: <blivet.devicefactory.MDFactory object at 0x7fae04f04bd0> ;
>03:50:57,010 DEBUG storage.ui: raw RAID 10 size == 768.0
>03:50:57,011 INFO storage.ui: Using 0MB superBlockSize
>03:50:57,011 DEBUG storage.ui: non-existent RAID 10 size == 768.0
>03:50:57,012 DEBUG storage.ui: parent factory container: non-existent 768MB mdarray swap (29) with non-existent swap
>03:50:57,013 DEBUG storage.ui: members: ['sda3', 'sdb3', 'sdc3', 'sdd3']
>03:50:57,013 DEBUG storage.ui: add_disks: []
>03:50:57,014 DEBUG storage.ui: remove_disks: []
>03:50:57,016 DEBUG storage.ui: MDRaidMember.__init__:
>03:50:57,016 DEBUG storage.ui: getFormat('mdmember') returning MDRaidMember instance
>03:50:57,017 INFO storage.ui: Using 0MB superBlockSize
>03:50:57,017 DEBUG storage.ui: adding a SameSizeSet with size 1536
>03:50:57,020 DEBUG storage.ui: DiskDevice.setup: sda ; status: True ; controllable: True ; orig: False ;
>03:50:57,022 DEBUG storage.ui: DiskDevice.setup: sdb ; status: True ; controllable: True ; orig: False ;
>03:50:57,024 DEBUG storage.ui: DiskDevice.setup: sdc ; status: True ; controllable: True ; orig: False ;
>03:50:57,027 DEBUG storage.ui: DiskDevice.setup: sdd ; status: True ; controllable: True ; orig: False ;
>03:50:57,028 DEBUG storage.ui: removing all non-preexisting partitions ['sda1(id 37)', 'sda2(id 31)', 'sda3(id 25)', 'sdb1(id 38)', 'sdb2(id 32)', 'sdb3(id 26)', 'sdc1(id 39)', 'sdc2(id 33)', 'sdc3(id 27)', 'sdd1(id 40)', 'sdd2(id 34)', 'sdd3(id 28)'] from disk(s) ['sda', 'sdb', 'sdc', 'sdd']
>03:50:57,030 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ;
>03:50:57,031 DEBUG storage.ui: device sda1 new partedPartition None
>03:50:57,033 DEBUG storage.ui: PartitionDevice._setDisk: req11 ; new: None ; old: sda ;
>03:50:57,036 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sda ;
>03:50:57,038 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda2 ;
>03:50:57,039 DEBUG storage.ui: device sda2 new partedPartition None
>03:50:57,041 DEBUG storage.ui: PartitionDevice._setDisk: req6 ; new: None ; old: sda ;
>03:50:57,044 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sda ;
>03:50:57,046 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda3 ;
>03:50:57,047 DEBUG storage.ui: device sda3 new partedPartition None
>03:50:57,050 DEBUG storage.ui: PartitionDevice._setDisk: req1 ; new: None ; old: sda ;
>03:50:57,052 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sda ;
>03:50:57,055 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ;
>03:50:57,055 DEBUG storage.ui: device sdb1 new partedPartition None
>03:50:57,058 DEBUG storage.ui: PartitionDevice._setDisk: req12 ; new: None ; old: sdb ;
>03:50:57,060 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sdb ;
>03:50:57,063 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb2 ;
>03:50:57,063 DEBUG storage.ui: device sdb2 new partedPartition None
>03:50:57,066 DEBUG storage.ui: PartitionDevice._setDisk: req7 ; new: None ; old: sdb ;
>03:50:57,068 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdb ;
>03:50:57,071 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb3 ;
>03:50:57,071 DEBUG storage.ui: device sdb3 new partedPartition None
>03:50:57,073 DEBUG storage.ui: PartitionDevice._setDisk: req2 ; new: None ; old: sdb ;
>03:50:57,076 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdb ;
>03:50:57,078 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ;
>03:50:57,079 DEBUG storage.ui: device sdc1 new partedPartition None
>03:50:57,081 DEBUG storage.ui: PartitionDevice._setDisk: req13 ; new: None ; old: sdc ;
>03:50:57,084 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sdc ;
>03:50:57,086 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc2 ;
>03:50:57,087 DEBUG storage.ui: device sdc2 new partedPartition None
>03:50:57,089 DEBUG storage.ui: PartitionDevice._setDisk: req8 ; new: None ; old: sdc ;
>03:50:57,092 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdc ;
>03:50:57,094 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc3 ;
>03:50:57,095 DEBUG storage.ui: device sdc3 new partedPartition None
>03:50:57,097 DEBUG storage.ui: PartitionDevice._setDisk: req3 ; new: None ; old: sdc ;
>03:50:57,100 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdc ;
>03:50:57,102 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ;
>03:50:57,102 DEBUG storage.ui: device sdd1 new partedPartition None
>03:50:57,105 DEBUG storage.ui: PartitionDevice._setDisk: req14 ; new: None ; old: sdd ;
>03:50:57,107 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sdd ;
>03:50:57,109 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd2 ;
>03:50:57,110 DEBUG storage.ui: device sdd2 new partedPartition None
>03:50:57,113 DEBUG storage.ui: PartitionDevice._setDisk: req9 ; new: None ; old: sdd ;
>03:50:57,115 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdd ;
>03:50:57,117 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd3 ;
>03:50:57,118 DEBUG storage.ui: device sdd3 new partedPartition None
>03:50:57,120 DEBUG storage.ui: PartitionDevice._setDisk: req4 ; new: None ; old: sdd ;
>03:50:57,123 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdd ;
>03:50:57,124 DEBUG storage.ui: allocatePartitions: disks=['sda', 'sdb', 'sdc', 'sdd'] ; partitions=['req11(id 37)', 'req6(id 31)', 'req1(id 25)', 'req12(id 38)', 'req7(id 32)', 'req2(id 26)', 'req13(id 39)', 'req8(id 33)', 'req3(id 27)', 'req14(id 40)', 'req9(id 34)', 'req4(id 28)']
>03:50:57,125 DEBUG storage.ui: removing all non-preexisting partitions ['req11(id 37)', 'req12(id 38)', 'req13(id 39)', 'req14(id 40)', 'req6(id 31)', 'req7(id 32)', 'req8(id 33)', 'req9(id 34)', 'req1(id 25)', 'req2(id 26)', 'req3(id 27)', 'req4(id 28)'] from disk(s) ['sda', 'sdb', 'sdc', 'sdd']
>03:50:57,127 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:50:57,130 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:50:57,131 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:50:57,133 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:50:57,136 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:50:57,136 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:50:57,137 DEBUG storage.ui: allocating partition: req11 ; id: 37 ; disks: ['sda'] ;
>boot: False ; primary: False ; size: 3004MB ; grow: False ; max_size: 3004
>03:50:57,137 DEBUG storage.ui: checking freespace on sda
>03:50:57,138 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sda part_type=0 req_size=3004MB boot=False best=None grow=False
>03:50:57,139 DEBUG storage.ui: current free range is 63-24575999 (11999MB)
>03:50:57,140 DEBUG storage.ui: updating use_disk to sda, type: 0
>03:50:57,140 DEBUG storage.ui: new free: 63-24575999 / 11999MB
>03:50:57,141 DEBUG storage.ui: new free allows for 0 sectors of growth
>03:50:57,142 DEBUG storage.ui: adjusted start sector from 63 to 2048
>03:50:57,143 DEBUG storage.ui: created partition sda1 of 3004MB and added it to /dev/sda
>03:50:57,145 DEBUG storage.ui: PartitionDevice._setPartedPartition: req11 ;
>03:50:57,146 DEBUG storage.ui: device req11 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05b095d0>  fileSystem: None
>  number: 1  path: /dev/sda1  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae04f00890>  PedPartition: <_ped.Partition object at 0x7fae04fb3ad0>
>03:50:57,148 DEBUG storage.ui: PartitionDevice._setDisk: sda1 ; new: sda ; old: None ;
>03:50:57,151 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sda ;
>03:50:57,154 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ;
>03:50:57,155 DEBUG storage.ui: device sda1 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05b095d0>  fileSystem: None
>  number: 1  path: /dev/sda1  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae04f00c50>  PedPartition: <_ped.Partition object at 0x7fae04fb3f50>
>03:50:57,158 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:50:57,160 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:50:57,161 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:50:57,163 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:50:57,166 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:50:57,166 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:50:57,167 DEBUG storage.ui: allocating partition: req12 ; id: 38 ; disks: ['sdb'] ;
>boot: False ; primary: False ; size: 3004MB ; grow: False ; max_size: 3004
>03:50:57,168 DEBUG storage.ui: checking freespace on sdb
>03:50:57,169 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdb part_type=0 req_size=3004MB boot=False best=None grow=False
>03:50:57,170 DEBUG storage.ui: current free range is 63-24575999 (11999MB)
>03:50:57,170 DEBUG storage.ui: updating use_disk to sdb, type: 0
>03:50:57,171 DEBUG storage.ui: new free: 63-24575999 / 11999MB
>03:50:57,171 DEBUG storage.ui: new free allows for 0 sectors of growth
>03:50:57,172 DEBUG storage.ui: adjusted start sector from 63 to 2048
>03:50:57,173 DEBUG storage.ui: created partition sdb1 of 3004MB and added it to /dev/sdb
>03:50:57,176 DEBUG storage.ui: PartitionDevice._setPartedPartition: req12 ;
>03:50:57,177 DEBUG storage.ui: device req12 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05aee350>  fileSystem: None
>  number: 1  path: /dev/sdb1  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae04f1eb90>  PedPartition: <_ped.Partition object at 0x7fae04f8e770>
>03:50:57,179 DEBUG storage.ui: PartitionDevice._setDisk: sdb1 ; new: sdb ; old: None ;
>03:50:57,182 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdb ;
>03:50:57,185 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ;
>03:50:57,186 DEBUG storage.ui: device sdb1 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05aee350>  fileSystem: None
>  number: 1  path: /dev/sdb1  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae04f00bd0>  PedPartition: <_ped.Partition object at 0x7fae04f8e410>
>03:50:57,189 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:50:57,191 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:50:57,192 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:50:57,194 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:50:57,197 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:50:57,197 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:50:57,198 DEBUG storage.ui: allocating partition: req13 ; id: 39 ; disks: ['sdc'] ;
>boot: False ; primary: False ; size: 3004MB ; grow: False ; max_size: 3004
>03:50:57,198 DEBUG storage.ui: checking freespace on sdc
>03:50:57,200 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdc part_type=0 req_size=3004MB boot=False best=None grow=False
>03:50:57,201 DEBUG storage.ui: current free range is 63-24575999 (11999MB)
>03:50:57,201 DEBUG storage.ui: updating use_disk to sdc, type: 0
>03:50:57,202 DEBUG storage.ui: new free: 63-24575999 / 11999MB
>03:50:57,202 DEBUG storage.ui: new free allows for 0 sectors of growth
>03:50:57,203 DEBUG storage.ui: adjusted start sector from 63 to 2048
>03:50:57,204 DEBUG storage.ui: created partition sdc1 of 3004MB and added it to /dev/sdc
>03:50:57,207 DEBUG storage.ui: PartitionDevice._setPartedPartition: req13 ;
>03:50:57,208 DEBUG storage.ui: device req13 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05aee6d0>  fileSystem: None
>  number: 1  path: /dev/sdc1  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae04ef7350>  PedPartition: <_ped.Partition object at 0x7fae04f8e1d0>
>03:50:57,210 DEBUG storage.ui: PartitionDevice._setDisk: sdc1 ; new: sdc ; old: None ;
>03:50:57,213 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdc ;
>03:50:57,216 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ;
>03:50:57,217 DEBUG storage.ui: device sdc1 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05aee6d0>  fileSystem: None
>  number: 1  path: /dev/sdc1  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae04fb27d0>  PedPartition: <_ped.Partition object at 0x7fae04f8edd0>
>03:50:57,219 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:50:57,222 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:50:57,223 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:50:57,225 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:50:57,228 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:50:57,228 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:50:57,229 DEBUG storage.ui: allocating partition: req14 ; id: 40 ; disks: ['sdd'] ;
>boot: False ; primary: False ; size: 3004MB ; grow: False ; max_size: 3004
>03:50:57,229 DEBUG storage.ui: checking freespace on sdd
>03:50:57,230 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdd part_type=0 req_size=3004MB boot=False best=None grow=False
>03:50:57,231 DEBUG storage.ui: current free range is 63-24575999 (11999MB)
>03:50:57,232 DEBUG storage.ui: updating use_disk to sdd, type: 0
>03:50:57,233 DEBUG storage.ui: new free: 63-24575999 / 11999MB
>03:50:57,233 DEBUG storage.ui: new free allows for 0 sectors of growth
>03:50:57,234 DEBUG storage.ui: adjusted start sector from 63 to 2048
>03:50:57,235 DEBUG storage.ui: created partition sdd1 of 3004MB and added it to /dev/sdd
>03:50:57,237 DEBUG storage.ui: PartitionDevice._setPartedPartition: req14 ;
>03:50:57,238 DEBUG storage.ui: device req14 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05b09ed0>  fileSystem: None
>  number: 1  path: /dev/sdd1  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae04f1e790>  PedPartition: <_ped.Partition object at 0x7fae04fb3ef0>
>03:50:57,241 DEBUG storage.ui: PartitionDevice._setDisk: sdd1 ; new: sdd ; old: None ;
>03:50:57,244 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdd ;
>03:50:57,247 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ;
>03:50:57,248 DEBUG storage.ui: device sdd1 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05b09ed0>  fileSystem: None
>  number: 1  path: /dev/sdd1  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae04ef7850>  PedPartition: <_ped.Partition object at 0x7fae04fb3ad0>
>03:50:57,251 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:50:57,253 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:50:57,254 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:50:57,256 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:50:57,259 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:50:57,259 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:50:57,260 DEBUG storage.ui: allocating partition: req6 ; id: 31 ; disks: ['sda'] ;
>boot: False ; primary: False ; size: 512MB ; grow: False ; max_size: 512
>03:50:57,261 DEBUG storage.ui: checking freespace on sda
>03:50:57,261 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sda part_type=0 req_size=512MB boot=False best=None grow=False
>03:50:57,262 DEBUG storage.ui: current free range is 63-2047 (0MB)
>03:50:57,263 DEBUG storage.ui: current free range is 6154240-24575999 (8995MB)
>03:50:57,264 DEBUG storage.ui: updating use_disk to sda, type: 0
>03:50:57,264 DEBUG storage.ui: new free: 6154240-24575999 / 8995MB
>03:50:57,265 DEBUG storage.ui: new free allows for 0 sectors of growth
>03:50:57,266 DEBUG storage.ui: created partition sda2 of 512MB and added it to /dev/sda
>03:50:57,268 DEBUG storage.ui: PartitionDevice._setPartedPartition: req6 ;
>03:50:57,269 DEBUG storage.ui: device req6 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05b095d0>  fileSystem: None
>  number: 2  path: /dev/sda2  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae04fb2910>  PedPartition: <_ped.Partition object at 0x7fae04f8e770>
>03:50:57,271 DEBUG storage.ui: PartitionDevice._setDisk: sda2 ; new: sda ; old: None ;
>03:50:57,274 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sda ;
>03:50:57,277 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda2 ;
>03:50:57,278 DEBUG storage.ui: device sda2 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05b095d0>  fileSystem: None
>  number: 2  path: /dev/sda2  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae04fb2890>  PedPartition: <_ped.Partition object at 0x7fae04f8e1d0>
>03:50:57,280 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:50:57,283 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:50:57,283 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:50:57,285 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:50:57,288 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:50:57,288 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:50:57,289 DEBUG storage.ui: allocating partition: req7 ; id: 32 ; disks: ['sdb'] ;
>boot: False ; primary: False ; size: 512MB ; grow: False ; max_size: 512
>03:50:57,289 DEBUG storage.ui: checking freespace on sdb
>03:50:57,290 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdb part_type=0 req_size=512MB boot=False best=None grow=False
>03:50:57,290 DEBUG storage.ui: current free range is 63-2047 (0MB)
>03:50:57,291 DEBUG storage.ui: current free range is 6154240-24575999 (8995MB)
>03:50:57,291 DEBUG storage.ui: updating use_disk to sdb, type: 0
>03:50:57,291 DEBUG storage.ui: new free: 6154240-24575999 / 8995MB
>03:50:57,292 DEBUG storage.ui: new free allows for 0 sectors of growth
>03:50:57,293 DEBUG storage.ui: created partition sdb2 of 512MB and added it to /dev/sdb
>03:50:57,295 DEBUG storage.ui: PartitionDevice._setPartedPartition: req7 ;
>03:50:57,295 DEBUG storage.ui: device req7 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05aee350>  fileSystem: None
>  number: 2  path: /dev/sdb2  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae04f1e810>  PedPartition: <_ped.Partition object at 0x7fae04f8e8f0>
>03:50:57,298 DEBUG storage.ui: PartitionDevice._setDisk: sdb2 ; new: sdb ; old: None ;
>03:50:57,300 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdb ;
>03:50:57,303 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb2 ;
>03:50:57,304 DEBUG storage.ui: device sdb2 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05aee350>  fileSystem: None
>  number: 2  path: /dev/sdb2  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae04f1eb90>  PedPartition: <_ped.Partition object at 0x7fae04f8e3b0>
>03:50:57,306 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:50:57,308 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:50:57,309 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:50:57,311 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:50:57,313 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:50:57,313 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:50:57,314 DEBUG storage.ui: allocating partition: req8 ; id: 33 ; disks: ['sdc'] ;
>boot: False ; primary: False ; size: 512MB ; grow: False ; max_size: 512
>03:50:57,314 DEBUG storage.ui: checking freespace on sdc
>03:50:57,315 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdc part_type=0 req_size=512MB boot=False best=None grow=False
>03:50:57,315 DEBUG storage.ui: current free range is 63-2047 (0MB)
>03:50:57,316 DEBUG storage.ui: current free range is 6154240-24575999 (8995MB)
>03:50:57,317 DEBUG storage.ui: updating use_disk to sdc, type: 0
>03:50:57,317 DEBUG storage.ui: new free: 6154240-24575999 / 8995MB
>03:50:57,317 DEBUG storage.ui: new free allows for 0 sectors of growth
>03:50:57,318 DEBUG storage.ui: created partition sdc2 of 512MB and added it to /dev/sdc
>03:50:57,320 DEBUG storage.ui: PartitionDevice._setPartedPartition: req8 ;
>03:50:57,321 DEBUG storage.ui: device req8 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05aee6d0>  fileSystem: None
>  number: 2  path: /dev/sdc2  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae05a49110>  PedPartition: <_ped.Partition object at 0x7fae04f8e290>
>03:50:57,324 DEBUG storage.ui: PartitionDevice._setDisk: sdc2 ; new: sdc ; old: None ;
>03:50:57,326 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdc ;
>03:50:57,329 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc2 ;
>03:50:57,330 DEBUG storage.ui: device sdc2 new partedPartition parted.Partition instance --
>  disk: <parted.disk.Disk object at 0x7fae05aee6d0>  fileSystem: None
>  number: 2  path: /dev/sdc2  type: 0
>  name: None  active: True  busy: False
>  geometry: <parted.geometry.Geometry object at 0x7fae05a51350>  PedPartition: <_ped.Partition object at 0x7fae04f8ed10>
>03:50:57,333 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:50:57,336 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:50:57,337 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:50:57,339 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ;
>03:50:57,341 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:50:57,341 DEBUG storage.ui: resolved 'sda' to 'sda' (disk)
>03:50:57,342 DEBUG storage.ui: allocating partition: req9 ; id: 34 ; disks: ['sdd'] ;
>boot: False ; primary: False ; size: 512MB ; grow: False ; max_size: 512
>03:50:57,342 DEBUG storage.ui: checking freespace on sdd
>03:50:57,343 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdd part_type=0 req_size=512MB boot=False best=None grow=False
>03:50:57,344 DEBUG storage.ui: current free range is 63-2047 (0MB)
>03:50:57,344 DEBUG storage.ui: current free range is 6154240-24575999 (8995MB)
>03:50:57,345 DEBUG storage.ui: updating use_disk to sdd, type: 0
>03:50:57,345 DEBUG storage.ui: new free:
6154240-24575999 / 8995MB >03:50:57,345 DEBUG storage.ui: new free allows for 0 sectors of growth >03:50:57,346 DEBUG storage.ui: created partition sdd2 of 512MB and added it to /dev/sdd >03:50:57,349 DEBUG storage.ui: PartitionDevice._setPartedPartition: req9 ; >03:50:57,350 DEBUG storage.ui: device req9 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 2 path: /dev/sdd2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f22c50> PedPartition: <_ped.Partition object at 0x7fae04f8e770> >03:50:57,352 DEBUG storage.ui: PartitionDevice._setDisk: sdd2 ; new: sdd ; old: None ; >03:50:57,355 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdd ; >03:50:57,359 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd2 ; >03:50:57,360 DEBUG storage.ui: device sdd2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 2 path: /dev/sdd2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f1ef10> PedPartition: <_ped.Partition object at 0x7fae04f8e8f0> >03:50:57,362 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:50:57,365 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:50:57,365 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:50:57,367 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:50:57,370 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:50:57,370 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:50:57,370 DEBUG storage.ui: allocating partition: req1 ; id: 25 ; disks: ['sda'] ; >boot: False ; primary: False ; size: 1MB ; grow: True ; max_size: 384 >03:50:57,371 DEBUG storage.ui: checking freespace on sda >03:50:57,371 DEBUG storage.ui: 
getBestFreeSpaceRegion: disk=/dev/sda part_type=0 req_size=1MB boot=False best=None grow=True >03:50:57,372 DEBUG storage.ui: current free range is 63-2047 (0MB) >03:50:57,373 DEBUG storage.ui: current free range is 7202816-24575999 (8483MB) >03:50:57,373 DEBUG storage.ui: evaluating growth potential for new layout >03:50:57,373 DEBUG storage.ui: calculating growth for disk /dev/sdd >03:50:57,374 DEBUG storage.ui: adding request 40 to chunk 24575937 (63-24575999) on /dev/sdd >03:50:57,374 DEBUG storage.ui: adding request 34 to chunk 24575937 (63-24575999) on /dev/sdd >03:50:57,375 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999 >sectorSize = 512 > >03:50:57,375 DEBUG storage.ui: req: PartitionRequest instance -- >id = 40 name = sdd1 growable = False >base = 6152192 growth = 0 max_grow = 0 >done = True >03:50:57,375 DEBUG storage.ui: req: PartitionRequest instance -- >id = 34 name = sdd2 growable = False >base = 1048576 growth = 0 max_grow = 0 >done = True >03:50:57,376 DEBUG storage.ui: request 40 (sdd1) growth: 0 (0MB) size: 3004MB >03:50:57,376 DEBUG storage.ui: request 34 (sdd2) growth: 0 (0MB) size: 512MB >03:50:57,377 DEBUG storage.ui: disk /dev/sdd growth: 0 (0MB) >03:50:57,377 DEBUG storage.ui: calculating growth for disk /dev/sdb >03:50:57,377 DEBUG storage.ui: adding request 38 to chunk 24575937 (63-24575999) on /dev/sdb >03:50:57,378 DEBUG storage.ui: adding request 32 to chunk 24575937 (63-24575999) on /dev/sdb >03:50:57,378 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999 >sectorSize = 512 > >03:50:57,379 DEBUG storage.ui: req: PartitionRequest instance -- >id = 38 name = sdb1 growable = False >base = 6152192 growth = 0 max_grow = 0 >done = True >03:50:57,379 DEBUG storage.ui: req: PartitionRequest instance -- >id = 32 name = sdb2 growable = False >base = 1048576 growth = 0 max_grow = 0 >done = True >03:50:57,380 DEBUG storage.ui: request 38 (sdb1) growth: 0 (0MB) size: 3004MB 
>03:50:57,380 DEBUG storage.ui: request 32 (sdb2) growth: 0 (0MB) size: 512MB >03:50:57,380 DEBUG storage.ui: disk /dev/sdb growth: 0 (0MB) >03:50:57,381 DEBUG storage.ui: calculating growth for disk /dev/sdc >03:50:57,381 DEBUG storage.ui: adding request 39 to chunk 24575937 (63-24575999) on /dev/sdc >03:50:57,382 DEBUG storage.ui: adding request 33 to chunk 24575937 (63-24575999) on /dev/sdc >03:50:57,382 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999 >sectorSize = 512 > >03:50:57,383 DEBUG storage.ui: req: PartitionRequest instance -- >id = 39 name = sdc1 growable = False >base = 6152192 growth = 0 max_grow = 0 >done = True >03:50:57,383 DEBUG storage.ui: req: PartitionRequest instance -- >id = 33 name = sdc2 growable = False >base = 1048576 growth = 0 max_grow = 0 >done = True >03:50:57,383 DEBUG storage.ui: request 39 (sdc1) growth: 0 (0MB) size: 3004MB >03:50:57,384 DEBUG storage.ui: request 33 (sdc2) growth: 0 (0MB) size: 512MB >03:50:57,384 DEBUG storage.ui: disk /dev/sdc growth: 0 (0MB) >03:50:57,384 DEBUG storage.ui: calculating growth for disk /dev/sda >03:50:57,387 DEBUG storage.ui: PartitionDevice._setPartedPartition: req1 ; >03:50:57,388 DEBUG storage.ui: device req1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 3 path: /dev/sda3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae05a51510> PedPartition: <_ped.Partition object at 0x7fae04f8ecb0> >03:50:57,390 DEBUG storage.ui: PartitionDevice._setDisk: sda3 ; new: sda ; old: None ; >03:50:57,392 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sda ; >03:50:57,393 DEBUG storage.ui: adding request 37 to chunk 24575937 (63-24575999) on /dev/sda >03:50:57,394 DEBUG storage.ui: adding request 31 to chunk 24575937 (63-24575999) on /dev/sda >03:50:57,394 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda 
>03:50:57,394 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > >03:50:57,395 DEBUG storage.ui: req: PartitionRequest instance -- >id = 37 name = sda1 growable = False >base = 6152192 growth = 0 max_grow = 0 >done = True >03:50:57,395 DEBUG storage.ui: req: PartitionRequest instance -- >id = 31 name = sda2 growable = False >base = 1048576 growth = 0 max_grow = 0 >done = True >03:50:57,396 DEBUG storage.ui: req: PartitionRequest instance -- >id = 25 name = sda3 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:50:57,396 DEBUG storage.ui: 1 requests and 17373121 (8482MB) left in chunk >03:50:57,396 DEBUG storage.ui: adding 17373121 (8482MB) to 25 (sda3) >03:50:57,397 DEBUG storage.ui: taking back 16588737 (8099MB) from 25 (sda3) >03:50:57,397 DEBUG storage.ui: new grow amount for request 25 (sda3) is 784384 units, or 383MB >03:50:57,397 DEBUG storage.ui: request 37 (sda1) growth: 0 (0MB) size: 3004MB >03:50:57,398 DEBUG storage.ui: request 31 (sda2) growth: 0 (0MB) size: 512MB >03:50:57,398 DEBUG storage.ui: request 25 (sda3) growth: 784384 (383MB) size: 384MB >03:50:57,399 DEBUG storage.ui: disk /dev/sda growth: 784384 (383MB) >03:50:57,401 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda3 ; >03:50:57,401 DEBUG storage.ui: device sda3 new partedPartition None >03:50:57,404 DEBUG storage.ui: PartitionDevice._setDisk: req1 ; new: None ; old: sda ; >03:50:57,406 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sda ; >03:50:57,406 DEBUG storage.ui: total growth: 784384 sectors >03:50:57,407 DEBUG storage.ui: updating use_disk to sda, type: 0 >03:50:57,407 DEBUG storage.ui: new free: 7202816-24575999 / 8483MB >03:50:57,407 DEBUG storage.ui: new free allows for 784384 sectors of growth >03:50:57,408 DEBUG storage.ui: created partition sda3 of 1MB and added it to /dev/sda >03:50:57,411 DEBUG storage.ui: PartitionDevice._setPartedPartition: req1 ; >03:50:57,411 DEBUG 
storage.ui: device req1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 3 path: /dev/sda3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae05a51850> PedPartition: <_ped.Partition object at 0x7fae04f8e650> >03:50:57,414 DEBUG storage.ui: PartitionDevice._setDisk: sda3 ; new: sda ; old: None ; >03:50:57,416 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sda ; >03:50:57,419 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda3 ; >03:50:57,420 DEBUG storage.ui: device sda3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 3 path: /dev/sda3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae05a51b10> PedPartition: <_ped.Partition object at 0x7fae04f8e2f0> >03:50:57,422 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:50:57,425 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:50:57,425 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:50:57,427 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:50:57,429 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:50:57,430 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:50:57,430 DEBUG storage.ui: allocating partition: req2 ; id: 26 ; disks: ['sdb'] ; >boot: False ; primary: False ; size: 1MB ; grow: True ; max_size: 384 >03:50:57,430 DEBUG storage.ui: checking freespace on sdb >03:50:57,431 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdb part_type=0 req_size=1MB boot=False best=None grow=True >03:50:57,432 DEBUG storage.ui: current free range is 63-2047 (0MB) >03:50:57,432 DEBUG storage.ui: current free range is 7202816-24575999 (8483MB) >03:50:57,433 DEBUG storage.ui: evaluating 
growth potential for new layout >03:50:57,433 DEBUG storage.ui: calculating growth for disk /dev/sdd >03:50:57,434 DEBUG storage.ui: adding request 40 to chunk 24575937 (63-24575999) on /dev/sdd >03:50:57,434 DEBUG storage.ui: adding request 34 to chunk 24575937 (63-24575999) on /dev/sdd >03:50:57,435 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999 >sectorSize = 512 > >03:50:57,435 DEBUG storage.ui: req: PartitionRequest instance -- >id = 40 name = sdd1 growable = False >base = 6152192 growth = 0 max_grow = 0 >done = True >03:50:57,435 DEBUG storage.ui: req: PartitionRequest instance -- >id = 34 name = sdd2 growable = False >base = 1048576 growth = 0 max_grow = 0 >done = True >03:50:57,436 DEBUG storage.ui: request 40 (sdd1) growth: 0 (0MB) size: 3004MB >03:50:57,436 DEBUG storage.ui: request 34 (sdd2) growth: 0 (0MB) size: 512MB >03:50:57,436 DEBUG storage.ui: disk /dev/sdd growth: 0 (0MB) >03:50:57,437 DEBUG storage.ui: calculating growth for disk /dev/sdb >03:50:57,440 DEBUG storage.ui: PartitionDevice._setPartedPartition: req2 ; >03:50:57,440 DEBUG storage.ui: device req2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 3 path: /dev/sdb3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae05a51f50> PedPartition: <_ped.Partition object at 0x7fae04fb3ef0> >03:50:57,443 DEBUG storage.ui: PartitionDevice._setDisk: sdb3 ; new: sdb ; old: None ; >03:50:57,447 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdb ; >03:50:57,448 DEBUG storage.ui: adding request 38 to chunk 24575937 (63-24575999) on /dev/sdb >03:50:57,448 DEBUG storage.ui: adding request 32 to chunk 24575937 (63-24575999) on /dev/sdb >03:50:57,449 DEBUG storage.ui: adding request 26 to chunk 24575937 (63-24575999) on /dev/sdb >03:50:57,449 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999 >sectorSize = 
512 > >03:50:57,450 DEBUG storage.ui: req: PartitionRequest instance -- >id = 38 name = sdb1 growable = False >base = 6152192 growth = 0 max_grow = 0 >done = True >03:50:57,450 DEBUG storage.ui: req: PartitionRequest instance -- >id = 32 name = sdb2 growable = False >base = 1048576 growth = 0 max_grow = 0 >done = True >03:50:57,451 DEBUG storage.ui: req: PartitionRequest instance -- >id = 26 name = sdb3 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:50:57,451 DEBUG storage.ui: 1 requests and 17373121 (8482MB) left in chunk >03:50:57,451 DEBUG storage.ui: adding 17373121 (8482MB) to 26 (sdb3) >03:50:57,452 DEBUG storage.ui: taking back 16588737 (8099MB) from 26 (sdb3) >03:50:57,452 DEBUG storage.ui: new grow amount for request 26 (sdb3) is 784384 units, or 383MB >03:50:57,452 DEBUG storage.ui: request 38 (sdb1) growth: 0 (0MB) size: 3004MB >03:50:57,453 DEBUG storage.ui: request 32 (sdb2) growth: 0 (0MB) size: 512MB >03:50:57,453 DEBUG storage.ui: request 26 (sdb3) growth: 784384 (383MB) size: 384MB >03:50:57,453 DEBUG storage.ui: disk /dev/sdb growth: 784384 (383MB) >03:50:57,454 DEBUG storage.ui: calculating growth for disk /dev/sdc >03:50:57,455 DEBUG storage.ui: adding request 39 to chunk 24575937 (63-24575999) on /dev/sdc >03:50:57,455 DEBUG storage.ui: adding request 33 to chunk 24575937 (63-24575999) on /dev/sdc >03:50:57,455 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999 >sectorSize = 512 > >03:50:57,456 DEBUG storage.ui: req: PartitionRequest instance -- >id = 39 name = sdc1 growable = False >base = 6152192 growth = 0 max_grow = 0 >done = True >03:50:57,456 DEBUG storage.ui: req: PartitionRequest instance -- >id = 33 name = sdc2 growable = False >base = 1048576 growth = 0 max_grow = 0 >done = True >03:50:57,457 DEBUG storage.ui: request 39 (sdc1) growth: 0 (0MB) size: 3004MB >03:50:57,457 DEBUG storage.ui: request 33 (sdc2) growth: 0 (0MB) size: 512MB >03:50:57,457 DEBUG storage.ui: disk 
/dev/sdc growth: 0 (0MB) >03:50:57,458 DEBUG storage.ui: calculating growth for disk /dev/sda >03:50:57,458 DEBUG storage.ui: adding request 37 to chunk 24575937 (63-24575999) on /dev/sda >03:50:57,459 DEBUG storage.ui: adding request 31 to chunk 24575937 (63-24575999) on /dev/sda >03:50:57,459 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda >03:50:57,460 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > >03:50:57,460 DEBUG storage.ui: req: PartitionRequest instance -- >id = 37 name = sda1 growable = False >base = 6152192 growth = 0 max_grow = 0 >done = True >03:50:57,460 DEBUG storage.ui: req: PartitionRequest instance -- >id = 31 name = sda2 growable = False >base = 1048576 growth = 0 max_grow = 0 >done = True >03:50:57,461 DEBUG storage.ui: req: PartitionRequest instance -- >id = 25 name = sda3 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:50:57,461 DEBUG storage.ui: 1 requests and 17373121 (8482MB) left in chunk >03:50:57,461 DEBUG storage.ui: adding 17373121 (8482MB) to 25 (sda3) >03:50:57,462 DEBUG storage.ui: taking back 16588737 (8099MB) from 25 (sda3) >03:50:57,462 DEBUG storage.ui: new grow amount for request 25 (sda3) is 784384 units, or 383MB >03:50:57,463 DEBUG storage.ui: request 37 (sda1) growth: 0 (0MB) size: 3004MB >03:50:57,463 DEBUG storage.ui: request 31 (sda2) growth: 0 (0MB) size: 512MB >03:50:57,463 DEBUG storage.ui: request 25 (sda3) growth: 784384 (383MB) size: 384MB >03:50:57,464 DEBUG storage.ui: disk /dev/sda growth: 784384 (383MB) >03:50:57,466 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb3 ; >03:50:57,467 DEBUG storage.ui: device sdb3 new partedPartition None >03:50:57,469 DEBUG storage.ui: PartitionDevice._setDisk: req2 ; new: None ; old: sdb ; >03:50:57,471 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sdb ; >03:50:57,471 DEBUG storage.ui: total growth: 1568768 sectors >03:50:57,472 DEBUG 
storage.ui: updating use_disk to sdb, type: 0 >03:50:57,472 DEBUG storage.ui: new free: 7202816-24575999 / 8483MB >03:50:57,473 DEBUG storage.ui: new free allows for 1568768 sectors of growth >03:50:57,474 DEBUG storage.ui: created partition sdb3 of 1MB and added it to /dev/sdb >03:50:57,476 DEBUG storage.ui: PartitionDevice._setPartedPartition: req2 ; >03:50:57,477 DEBUG storage.ui: device req2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 3 path: /dev/sdb3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f224d0> PedPartition: <_ped.Partition object at 0x7fae04f8ecb0> >03:50:57,479 DEBUG storage.ui: PartitionDevice._setDisk: sdb3 ; new: sdb ; old: None ; >03:50:57,481 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdb ; >03:50:57,484 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb3 ; >03:50:57,485 DEBUG storage.ui: device sdb3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 3 path: /dev/sdb3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae05a48750> PedPartition: <_ped.Partition object at 0x7fae04f8e4d0> >03:50:57,487 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:50:57,490 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:50:57,490 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:50:57,493 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:50:57,495 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:50:57,495 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:50:57,496 DEBUG storage.ui: allocating partition: req3 ; id: 27 ; disks: ['sdc'] ; >boot: False ; primary: False ; size: 1MB ; grow: True ; max_size: 384 
>03:50:57,496 DEBUG storage.ui: checking freespace on sdc >03:50:57,497 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdc part_type=0 req_size=1MB boot=False best=None grow=True >03:50:57,498 DEBUG storage.ui: current free range is 63-2047 (0MB) >03:50:57,498 DEBUG storage.ui: current free range is 7202816-24575999 (8483MB) >03:50:57,499 DEBUG storage.ui: evaluating growth potential for new layout >03:50:57,499 DEBUG storage.ui: calculating growth for disk /dev/sdd >03:50:57,499 DEBUG storage.ui: adding request 40 to chunk 24575937 (63-24575999) on /dev/sdd >03:50:57,500 DEBUG storage.ui: adding request 34 to chunk 24575937 (63-24575999) on /dev/sdd >03:50:57,500 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999 >sectorSize = 512 > >03:50:57,501 DEBUG storage.ui: req: PartitionRequest instance -- >id = 40 name = sdd1 growable = False >base = 6152192 growth = 0 max_grow = 0 >done = True >03:50:57,501 DEBUG storage.ui: req: PartitionRequest instance -- >id = 34 name = sdd2 growable = False >base = 1048576 growth = 0 max_grow = 0 >done = True >03:50:57,501 DEBUG storage.ui: request 40 (sdd1) growth: 0 (0MB) size: 3004MB >03:50:57,502 DEBUG storage.ui: request 34 (sdd2) growth: 0 (0MB) size: 512MB >03:50:57,502 DEBUG storage.ui: disk /dev/sdd growth: 0 (0MB) >03:50:57,502 DEBUG storage.ui: calculating growth for disk /dev/sdb >03:50:57,503 DEBUG storage.ui: adding request 38 to chunk 24575937 (63-24575999) on /dev/sdb >03:50:57,504 DEBUG storage.ui: adding request 32 to chunk 24575937 (63-24575999) on /dev/sdb >03:50:57,504 DEBUG storage.ui: adding request 26 to chunk 24575937 (63-24575999) on /dev/sdb >03:50:57,504 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999 >sectorSize = 512 > >03:50:57,505 DEBUG storage.ui: req: PartitionRequest instance -- >id = 38 name = sdb1 growable = False >base = 6152192 growth = 0 max_grow = 0 >done = True >03:50:57,505 DEBUG storage.ui: req: PartitionRequest 
instance -- >id = 32 name = sdb2 growable = False >base = 1048576 growth = 0 max_grow = 0 >done = True >03:50:57,506 DEBUG storage.ui: req: PartitionRequest instance -- >id = 26 name = sdb3 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:50:57,506 DEBUG storage.ui: 1 requests and 17373121 (8482MB) left in chunk >03:50:57,507 DEBUG storage.ui: adding 17373121 (8482MB) to 26 (sdb3) >03:50:57,507 DEBUG storage.ui: taking back 16588737 (8099MB) from 26 (sdb3) >03:50:57,507 DEBUG storage.ui: new grow amount for request 26 (sdb3) is 784384 units, or 383MB >03:50:57,508 DEBUG storage.ui: request 38 (sdb1) growth: 0 (0MB) size: 3004MB >03:50:57,508 DEBUG storage.ui: request 32 (sdb2) growth: 0 (0MB) size: 512MB >03:50:57,508 DEBUG storage.ui: request 26 (sdb3) growth: 784384 (383MB) size: 384MB >03:50:57,509 DEBUG storage.ui: disk /dev/sdb growth: 784384 (383MB) >03:50:57,509 DEBUG storage.ui: calculating growth for disk /dev/sdc >03:50:57,512 DEBUG storage.ui: PartitionDevice._setPartedPartition: req3 ; >03:50:57,513 DEBUG storage.ui: device req3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 3 path: /dev/sdc3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae05a45190> PedPartition: <_ped.Partition object at 0x7fae04f8ef50> >03:50:57,515 DEBUG storage.ui: PartitionDevice._setDisk: sdc3 ; new: sdc ; old: None ; >03:50:57,518 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdc ; >03:50:57,518 DEBUG storage.ui: adding request 39 to chunk 24575937 (63-24575999) on /dev/sdc >03:50:57,519 DEBUG storage.ui: adding request 33 to chunk 24575937 (63-24575999) on /dev/sdc >03:50:57,519 DEBUG storage.ui: adding request 27 to chunk 24575937 (63-24575999) on /dev/sdc >03:50:57,520 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999 >sectorSize = 512 > >03:50:57,520 DEBUG storage.ui: req: 
PartitionRequest instance -- >id = 39 name = sdc1 growable = False >base = 6152192 growth = 0 max_grow = 0 >done = True >03:50:57,520 DEBUG storage.ui: req: PartitionRequest instance -- >id = 33 name = sdc2 growable = False >base = 1048576 growth = 0 max_grow = 0 >done = True >03:50:57,521 DEBUG storage.ui: req: PartitionRequest instance -- >id = 27 name = sdc3 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:50:57,521 DEBUG storage.ui: 1 requests and 17373121 (8482MB) left in chunk >03:50:57,522 DEBUG storage.ui: adding 17373121 (8482MB) to 27 (sdc3) >03:50:57,522 DEBUG storage.ui: taking back 16588737 (8099MB) from 27 (sdc3) >03:50:57,522 DEBUG storage.ui: new grow amount for request 27 (sdc3) is 784384 units, or 383MB >03:50:57,523 DEBUG storage.ui: request 39 (sdc1) growth: 0 (0MB) size: 3004MB >03:50:57,523 DEBUG storage.ui: request 33 (sdc2) growth: 0 (0MB) size: 512MB >03:50:57,524 DEBUG storage.ui: request 27 (sdc3) growth: 784384 (383MB) size: 384MB >03:50:57,524 DEBUG storage.ui: disk /dev/sdc growth: 784384 (383MB) >03:50:57,524 DEBUG storage.ui: calculating growth for disk /dev/sda >03:50:57,525 DEBUG storage.ui: adding request 37 to chunk 24575937 (63-24575999) on /dev/sda >03:50:57,525 DEBUG storage.ui: adding request 31 to chunk 24575937 (63-24575999) on /dev/sda >03:50:57,526 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda >03:50:57,526 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > >03:50:57,527 DEBUG storage.ui: req: PartitionRequest instance -- >id = 37 name = sda1 growable = False >base = 6152192 growth = 0 max_grow = 0 >done = True >03:50:57,527 DEBUG storage.ui: req: PartitionRequest instance -- >id = 31 name = sda2 growable = False >base = 1048576 growth = 0 max_grow = 0 >done = True >03:50:57,528 DEBUG storage.ui: req: PartitionRequest instance -- >id = 25 name = sda3 growable = True >base = 2048 growth = 0 max_grow = 784384 
>done = False >03:50:57,528 DEBUG storage.ui: 1 requests and 17373121 (8482MB) left in chunk >03:50:57,529 DEBUG storage.ui: adding 17373121 (8482MB) to 25 (sda3) >03:50:57,529 DEBUG storage.ui: taking back 16588737 (8099MB) from 25 (sda3) >03:50:57,529 DEBUG storage.ui: new grow amount for request 25 (sda3) is 784384 units, or 383MB >03:50:57,530 DEBUG storage.ui: request 37 (sda1) growth: 0 (0MB) size: 3004MB >03:50:57,530 DEBUG storage.ui: request 31 (sda2) growth: 0 (0MB) size: 512MB >03:50:57,530 DEBUG storage.ui: request 25 (sda3) growth: 784384 (383MB) size: 384MB >03:50:57,531 DEBUG storage.ui: disk /dev/sda growth: 784384 (383MB) >03:50:57,533 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc3 ; >03:50:57,533 DEBUG storage.ui: device sdc3 new partedPartition None >03:50:57,536 DEBUG storage.ui: PartitionDevice._setDisk: req3 ; new: None ; old: sdc ; >03:50:57,538 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sdc ; >03:50:57,538 DEBUG storage.ui: total growth: 2353152 sectors >03:50:57,539 DEBUG storage.ui: updating use_disk to sdc, type: 0 >03:50:57,539 DEBUG storage.ui: new free: 7202816-24575999 / 8483MB >03:50:57,539 DEBUG storage.ui: new free allows for 2353152 sectors of growth >03:50:57,540 DEBUG storage.ui: created partition sdc3 of 1MB and added it to /dev/sdc >03:50:57,543 DEBUG storage.ui: PartitionDevice._setPartedPartition: req3 ; >03:50:57,543 DEBUG storage.ui: device req3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 3 path: /dev/sdc3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae05a45050> PedPartition: <_ped.Partition object at 0x7fae04fb3ef0> >03:50:57,546 DEBUG storage.ui: PartitionDevice._setDisk: sdc3 ; new: sdc ; old: None ; >03:50:57,548 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdc ; >03:50:57,551 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc3 ; 
>03:50:57,551 DEBUG storage.ui: device sdc3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 3 path: /dev/sdc3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae05a45290> PedPartition: <_ped.Partition object at 0x7fae04f293b0> >03:50:57,554 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:50:57,556 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:50:57,556 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:50:57,558 DEBUG storage.ui: DeviceTree.getDeviceByName: name: sda ; >03:50:57,560 DEBUG storage.ui: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel >03:50:57,561 DEBUG storage.ui: resolved 'sda' to 'sda' (disk) >03:50:57,561 DEBUG storage.ui: allocating partition: req4 ; id: 28 ; disks: ['sdd'] ; >boot: False ; primary: False ; size: 1MB ; grow: True ; max_size: 384 >03:50:57,562 DEBUG storage.ui: checking freespace on sdd >03:50:57,562 DEBUG storage.ui: getBestFreeSpaceRegion: disk=/dev/sdd part_type=0 req_size=1MB boot=False best=None grow=True >03:50:57,563 DEBUG storage.ui: current free range is 63-2047 (0MB) >03:50:57,563 DEBUG storage.ui: current free range is 7202816-24575999 (8483MB) >03:50:57,564 DEBUG storage.ui: evaluating growth potential for new layout >03:50:57,564 DEBUG storage.ui: calculating growth for disk /dev/sdd >03:50:57,567 DEBUG storage.ui: PartitionDevice._setPartedPartition: req4 ; >03:50:57,568 DEBUG storage.ui: device req4 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 3 path: /dev/sdd3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f1ef90> PedPartition: <_ped.Partition object at 0x7fae04f8ee90> >03:50:57,570 DEBUG storage.ui: 
PartitionDevice._setDisk: sdd3 ; new: sdd ; old: None ; >03:50:57,573 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdd ; >03:50:57,573 DEBUG storage.ui: adding request 40 to chunk 24575937 (63-24575999) on /dev/sdd >03:50:57,574 DEBUG storage.ui: adding request 34 to chunk 24575937 (63-24575999) on /dev/sdd >03:50:57,574 DEBUG storage.ui: adding request 28 to chunk 24575937 (63-24575999) on /dev/sdd >03:50:57,574 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999 >sectorSize = 512 > >03:50:57,575 DEBUG storage.ui: req: PartitionRequest instance -- >id = 40 name = sdd1 growable = False >base = 6152192 growth = 0 max_grow = 0 >done = True >03:50:57,575 DEBUG storage.ui: req: PartitionRequest instance -- >id = 34 name = sdd2 growable = False >base = 1048576 growth = 0 max_grow = 0 >done = True >03:50:57,576 DEBUG storage.ui: req: PartitionRequest instance -- >id = 28 name = sdd3 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:50:57,576 DEBUG storage.ui: 1 requests and 17373121 (8482MB) left in chunk >03:50:57,576 DEBUG storage.ui: adding 17373121 (8482MB) to 28 (sdd3) >03:50:57,577 DEBUG storage.ui: taking back 16588737 (8099MB) from 28 (sdd3) >03:50:57,577 DEBUG storage.ui: new grow amount for request 28 (sdd3) is 784384 units, or 383MB >03:50:57,577 DEBUG storage.ui: request 40 (sdd1) growth: 0 (0MB) size: 3004MB >03:50:57,578 DEBUG storage.ui: request 34 (sdd2) growth: 0 (0MB) size: 512MB >03:50:57,578 DEBUG storage.ui: request 28 (sdd3) growth: 784384 (383MB) size: 384MB >03:50:57,578 DEBUG storage.ui: disk /dev/sdd growth: 784384 (383MB) >03:50:57,579 DEBUG storage.ui: calculating growth for disk /dev/sdb >03:50:57,579 DEBUG storage.ui: adding request 38 to chunk 24575937 (63-24575999) on /dev/sdb >03:50:57,580 DEBUG storage.ui: adding request 32 to chunk 24575937 (63-24575999) on /dev/sdb >03:50:57,580 DEBUG storage.ui: adding request 26 to chunk 24575937 (63-24575999) on /dev/sdb 
>03:50:57,580 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999 >sectorSize = 512 > >03:50:57,581 DEBUG storage.ui: req: PartitionRequest instance -- >id = 38 name = sdb1 growable = False >base = 6152192 growth = 0 max_grow = 0 >done = True >03:50:57,581 DEBUG storage.ui: req: PartitionRequest instance -- >id = 32 name = sdb2 growable = False >base = 1048576 growth = 0 max_grow = 0 >done = True >03:50:57,582 DEBUG storage.ui: req: PartitionRequest instance -- >id = 26 name = sdb3 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:50:57,582 DEBUG storage.ui: 1 requests and 17373121 (8482MB) left in chunk >03:50:57,582 DEBUG storage.ui: adding 17373121 (8482MB) to 26 (sdb3) >03:50:57,583 DEBUG storage.ui: taking back 16588737 (8099MB) from 26 (sdb3) >03:50:57,583 DEBUG storage.ui: new grow amount for request 26 (sdb3) is 784384 units, or 383MB >03:50:57,583 DEBUG storage.ui: request 38 (sdb1) growth: 0 (0MB) size: 3004MB >03:50:57,584 DEBUG storage.ui: request 32 (sdb2) growth: 0 (0MB) size: 512MB >03:50:57,584 DEBUG storage.ui: request 26 (sdb3) growth: 784384 (383MB) size: 384MB >03:50:57,584 DEBUG storage.ui: disk /dev/sdb growth: 784384 (383MB) >03:50:57,585 DEBUG storage.ui: calculating growth for disk /dev/sdc >03:50:57,585 DEBUG storage.ui: adding request 39 to chunk 24575937 (63-24575999) on /dev/sdc >03:50:57,586 DEBUG storage.ui: adding request 33 to chunk 24575937 (63-24575999) on /dev/sdc >03:50:57,586 DEBUG storage.ui: adding request 27 to chunk 24575937 (63-24575999) on /dev/sdc >03:50:57,586 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999 >sectorSize = 512 > >03:50:57,587 DEBUG storage.ui: req: PartitionRequest instance -- >id = 39 name = sdc1 growable = False >base = 6152192 growth = 0 max_grow = 0 >done = True >03:50:57,587 DEBUG storage.ui: req: PartitionRequest instance -- >id = 33 name = sdc2 growable = False >base = 1048576 growth = 0 max_grow = 0 
>done = True >03:50:57,587 DEBUG storage.ui: req: PartitionRequest instance -- >id = 27 name = sdc3 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:50:57,588 DEBUG storage.ui: 1 requests and 17373121 (8482MB) left in chunk >03:50:57,588 DEBUG storage.ui: adding 17373121 (8482MB) to 27 (sdc3) >03:50:57,588 DEBUG storage.ui: taking back 16588737 (8099MB) from 27 (sdc3) >03:50:57,589 DEBUG storage.ui: new grow amount for request 27 (sdc3) is 784384 units, or 383MB >03:50:57,589 DEBUG storage.ui: request 39 (sdc1) growth: 0 (0MB) size: 3004MB >03:50:57,589 DEBUG storage.ui: request 33 (sdc2) growth: 0 (0MB) size: 512MB >03:50:57,590 DEBUG storage.ui: request 27 (sdc3) growth: 784384 (383MB) size: 384MB >03:50:57,590 DEBUG storage.ui: disk /dev/sdc growth: 784384 (383MB) >03:50:57,590 DEBUG storage.ui: calculating growth for disk /dev/sda >03:50:57,591 DEBUG storage.ui: adding request 37 to chunk 24575937 (63-24575999) on /dev/sda >03:50:57,591 DEBUG storage.ui: adding request 31 to chunk 24575937 (63-24575999) on /dev/sda >03:50:57,592 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda >03:50:57,592 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > >03:50:57,592 DEBUG storage.ui: req: PartitionRequest instance -- >id = 37 name = sda1 growable = False >base = 6152192 growth = 0 max_grow = 0 >done = True >03:50:57,593 DEBUG storage.ui: req: PartitionRequest instance -- >id = 31 name = sda2 growable = False >base = 1048576 growth = 0 max_grow = 0 >done = True >03:50:57,593 DEBUG storage.ui: req: PartitionRequest instance -- >id = 25 name = sda3 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:50:57,594 DEBUG storage.ui: 1 requests and 17373121 (8482MB) left in chunk >03:50:57,594 DEBUG storage.ui: adding 17373121 (8482MB) to 25 (sda3) >03:50:57,595 DEBUG storage.ui: taking back 16588737 (8099MB) from 25 (sda3) >03:50:57,595 DEBUG 
storage.ui: new grow amount for request 25 (sda3) is 784384 units, or 383MB >03:50:57,595 DEBUG storage.ui: request 37 (sda1) growth: 0 (0MB) size: 3004MB >03:50:57,596 DEBUG storage.ui: request 31 (sda2) growth: 0 (0MB) size: 512MB >03:50:57,596 DEBUG storage.ui: request 25 (sda3) growth: 784384 (383MB) size: 384MB >03:50:57,596 DEBUG storage.ui: disk /dev/sda growth: 784384 (383MB) >03:50:57,598 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd3 ; >03:50:57,599 DEBUG storage.ui: device sdd3 new partedPartition None >03:50:57,601 DEBUG storage.ui: PartitionDevice._setDisk: req4 ; new: None ; old: sdd ; >03:50:57,603 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sdd ; >03:50:57,603 DEBUG storage.ui: total growth: 3137536 sectors >03:50:57,604 DEBUG storage.ui: updating use_disk to sdd, type: 0 >03:50:57,604 DEBUG storage.ui: new free: 7202816-24575999 / 8483MB >03:50:57,604 DEBUG storage.ui: new free allows for 3137536 sectors of growth >03:50:57,605 DEBUG storage.ui: created partition sdd3 of 1MB and added it to /dev/sdd >03:50:57,607 DEBUG storage.ui: PartitionDevice._setPartedPartition: req4 ; >03:50:57,608 DEBUG storage.ui: device req4 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 3 path: /dev/sdd3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f00a90> PedPartition: <_ped.Partition object at 0x7fae04f8ef50> >03:50:57,610 DEBUG storage.ui: PartitionDevice._setDisk: sdd3 ; new: sdd ; old: None ; >03:50:57,613 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdd ; >03:50:57,615 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd3 ; >03:50:57,616 DEBUG storage.ui: device sdd3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None > number: 3 path: /dev/sdd3 type: 0 > name: None active: True busy: False > geometry: 
<parted.geometry.Geometry object at 0x7fae04ef74d0> PedPartition: <_ped.Partition object at 0x7fae04f29470> >03:50:57,617 DEBUG storage.ui: growPartitions: disks=['sda', 'sdb', 'sdc', 'sdd'], partitions=['sda1(id 37)', 'sda2(id 31)', 'sda3(id 25)', 'sdb1(id 38)', 'sdb2(id 32)', 'sdb3(id 26)', 'sdc1(id 39)', 'sdc2(id 33)', 'sdc3(id 27)', 'sdd1(id 40)', 'sdd2(id 34)', 'sdd3(id 28)'] >03:50:57,617 DEBUG storage.ui: growable partitions are ['sda3', 'sdb3', 'sdc3', 'sdd3'] >03:50:57,618 DEBUG storage.ui: adding request 37 to chunk 24575937 (63-24575999) on /dev/sda >03:50:57,618 DEBUG storage.ui: adding request 31 to chunk 24575937 (63-24575999) on /dev/sda >03:50:57,619 DEBUG storage.ui: adding request 25 to chunk 24575937 (63-24575999) on /dev/sda >03:50:57,619 DEBUG storage.ui: disk sda has 1 chunks >03:50:57,620 DEBUG storage.ui: adding request 38 to chunk 24575937 (63-24575999) on /dev/sdb >03:50:57,620 DEBUG storage.ui: adding request 32 to chunk 24575937 (63-24575999) on /dev/sdb >03:50:57,620 DEBUG storage.ui: adding request 26 to chunk 24575937 (63-24575999) on /dev/sdb >03:50:57,622 DEBUG storage.ui: disk sdb has 1 chunks >03:50:57,624 DEBUG storage.ui: adding request 39 to chunk 24575937 (63-24575999) on /dev/sdc >03:50:57,625 DEBUG storage.ui: adding request 33 to chunk 24575937 (63-24575999) on /dev/sdc >03:50:57,626 DEBUG storage.ui: adding request 27 to chunk 24575937 (63-24575999) on /dev/sdc >03:50:57,627 DEBUG storage.ui: disk sdc has 1 chunks >03:50:57,629 DEBUG storage.ui: adding request 40 to chunk 24575937 (63-24575999) on /dev/sdd >03:50:57,630 DEBUG storage.ui: adding request 34 to chunk 24575937 (63-24575999) on /dev/sdd >03:50:57,630 DEBUG storage.ui: adding request 28 to chunk 24575937 (63-24575999) on /dev/sdd >03:50:57,631 DEBUG storage.ui: disk sdd has 1 chunks >03:50:57,631 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sda start = 63 end = 24575999 >sectorSize = 512 > >03:50:57,631 DEBUG storage.ui: req: PartitionRequest instance 
-- >id = 37 name = sda1 growable = False >base = 6152192 growth = 0 max_grow = 0 >done = True >03:50:57,632 DEBUG storage.ui: req: PartitionRequest instance -- >id = 31 name = sda2 growable = False >base = 1048576 growth = 0 max_grow = 0 >done = True >03:50:57,632 DEBUG storage.ui: req: PartitionRequest instance -- >id = 25 name = sda3 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:50:57,632 DEBUG storage.ui: 1 requests and 17373121 (8482MB) left in chunk >03:50:57,633 DEBUG storage.ui: adding 17373121 (8482MB) to 25 (sda3) >03:50:57,633 DEBUG storage.ui: taking back 16588737 (8099MB) from 25 (sda3) >03:50:57,634 DEBUG storage.ui: new grow amount for request 25 (sda3) is 784384 units, or 383MB >03:50:57,634 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdb start = 63 end = 24575999 >sectorSize = 512 > >03:50:57,634 DEBUG storage.ui: req: PartitionRequest instance -- >id = 38 name = sdb1 growable = False >base = 6152192 growth = 0 max_grow = 0 >done = True >03:50:57,635 DEBUG storage.ui: req: PartitionRequest instance -- >id = 32 name = sdb2 growable = False >base = 1048576 growth = 0 max_grow = 0 >done = True >03:50:57,635 DEBUG storage.ui: req: PartitionRequest instance -- >id = 26 name = sdb3 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:50:57,635 DEBUG storage.ui: 1 requests and 17373121 (8482MB) left in chunk >03:50:57,636 DEBUG storage.ui: adding 17373121 (8482MB) to 26 (sdb3) >03:50:57,636 DEBUG storage.ui: taking back 16588737 (8099MB) from 26 (sdb3) >03:50:57,637 DEBUG storage.ui: new grow amount for request 26 (sdb3) is 784384 units, or 383MB >03:50:57,637 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdc start = 63 end = 24575999 >sectorSize = 512 > >03:50:57,637 DEBUG storage.ui: req: PartitionRequest instance -- >id = 39 name = sdc1 growable = False >base = 6152192 growth = 0 max_grow = 0 >done = True >03:50:57,638 DEBUG storage.ui: req: PartitionRequest instance -- >id = 33 
name = sdc2 growable = False >base = 1048576 growth = 0 max_grow = 0 >done = True >03:50:57,638 DEBUG storage.ui: req: PartitionRequest instance -- >id = 27 name = sdc3 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:50:57,638 DEBUG storage.ui: 1 requests and 17373121 (8482MB) left in chunk >03:50:57,639 DEBUG storage.ui: adding 17373121 (8482MB) to 27 (sdc3) >03:50:57,639 DEBUG storage.ui: taking back 16588737 (8099MB) from 27 (sdc3) >03:50:57,639 DEBUG storage.ui: new grow amount for request 27 (sdc3) is 784384 units, or 383MB >03:50:57,640 DEBUG storage.ui: Chunk.growRequests: 24575937 on /dev/sdd start = 63 end = 24575999 >sectorSize = 512 > >03:50:57,640 DEBUG storage.ui: req: PartitionRequest instance -- >id = 40 name = sdd1 growable = False >base = 6152192 growth = 0 max_grow = 0 >done = True >03:50:57,641 DEBUG storage.ui: req: PartitionRequest instance -- >id = 34 name = sdd2 growable = False >base = 1048576 growth = 0 max_grow = 0 >done = True >03:50:57,641 DEBUG storage.ui: req: PartitionRequest instance -- >id = 28 name = sdd3 growable = True >base = 2048 growth = 0 max_grow = 784384 >done = False >03:50:57,641 DEBUG storage.ui: 1 requests and 17373121 (8482MB) left in chunk >03:50:57,642 DEBUG storage.ui: adding 17373121 (8482MB) to 28 (sdd3) >03:50:57,642 DEBUG storage.ui: taking back 16588737 (8099MB) from 28 (sdd3) >03:50:57,642 DEBUG storage.ui: new grow amount for request 28 (sdd3) is 784384 units, or 383MB >03:50:57,643 DEBUG storage.ui: set: ['sda3', 'sdb3', 'sdc3', 'sdd3'] 384 >03:50:57,643 DEBUG storage.ui: min growth is 784384 >03:50:57,643 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 25 name = sda3 growable = True >base = 2048 growth = 784384 max_grow = 784384 >done = True is 784384 >03:50:57,644 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 26 name = sdb3 growable = True >base = 2048 growth = 784384 max_grow = 784384 >done = True is 784384 >03:50:57,644 DEBUG 
storage.ui: max growth for PartitionRequest instance -- >id = 27 name = sdc3 growable = True >base = 2048 growth = 784384 max_grow = 784384 >done = True is 784384 >03:50:57,645 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 28 name = sdd3 growable = True >base = 2048 growth = 784384 max_grow = 784384 >done = True is 784384 >03:50:57,645 DEBUG storage.ui: set: ['sda3', 'sdb3', 'sdc3', 'sdd3'] 384 >03:50:57,645 DEBUG storage.ui: min growth is 784384 >03:50:57,646 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 25 name = sda3 growable = True >base = 2048 growth = 784384 max_grow = 784384 >done = True is 784384 >03:50:57,646 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 26 name = sdb3 growable = True >base = 2048 growth = 784384 max_grow = 784384 >done = True is 784384 >03:50:57,647 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 27 name = sdc3 growable = True >base = 2048 growth = 784384 max_grow = 784384 >done = True is 784384 >03:50:57,647 DEBUG storage.ui: max growth for PartitionRequest instance -- >id = 28 name = sdd3 growable = True >base = 2048 growth = 784384 max_grow = 784384 >done = True is 784384 >03:50:57,647 DEBUG storage.ui: growing partitions on sda >03:50:57,648 DEBUG storage.ui: partition sda1 (37): 0 >03:50:57,648 DEBUG storage.ui: new geometry for sda1: parted.Geometry instance -- > start: 2048 end: 6154239 length: 6152192 > device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae05a48190> >03:50:57,649 DEBUG storage.ui: partition sda2 (31): 0 >03:50:57,649 DEBUG storage.ui: new geometry for sda2: parted.Geometry instance -- > start: 6154240 end: 7202815 length: 1048576 > device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae05a48590> >03:50:57,650 DEBUG storage.ui: partition sda3 (25): 0 >03:50:57,650 DEBUG storage.ui: new geometry for sda3: parted.Geometry instance -- > start: 
7202816 end: 7989247 length: 786432 > device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae05a51b90> >03:50:57,651 DEBUG storage.ui: removing all non-preexisting partitions ['sda1(id 37)', 'sda2(id 31)', 'sda3(id 25)'] from disk(s) ['sda'] >03:50:57,653 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:50:57,653 DEBUG storage.ui: device sda1 new partedPartition None >03:50:57,655 DEBUG storage.ui: PartitionDevice._setDisk: req11 ; new: None ; old: sda ; >03:50:57,658 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sda ; >03:50:57,660 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda2 ; >03:50:57,660 DEBUG storage.ui: device sda2 new partedPartition None >03:50:57,662 DEBUG storage.ui: PartitionDevice._setDisk: req6 ; new: None ; old: sda ; >03:50:57,664 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sda ; >03:50:57,667 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda3 ; >03:50:57,667 DEBUG storage.ui: device sda3 new partedPartition None >03:50:57,669 DEBUG storage.ui: PartitionDevice._setDisk: req1 ; new: None ; old: sda ; >03:50:57,671 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sda ; >03:50:57,672 DEBUG storage.ui: back from removeNewPartitions >03:50:57,672 DEBUG storage.ui: extended: None >03:50:57,672 DEBUG storage.ui: setting req11 new geometry: parted.Geometry instance -- > start: 2048 end: 6154239 length: 6152192 > device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae05a48190> >03:50:57,675 DEBUG storage.ui: PartitionDevice._setPartedPartition: req11 ; >03:50:57,676 DEBUG storage.ui: device req11 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae05a51dd0> PedPartition: <_ped.Partition object at 
0x7fae04f8e290> >03:50:57,678 DEBUG storage.ui: PartitionDevice._setDisk: sda1 ; new: sda ; old: None ; >03:50:57,680 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sda ; >03:50:57,683 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda1 ; >03:50:57,683 DEBUG storage.ui: device sda1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 1 path: /dev/sda1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae05a51550> PedPartition: <_ped.Partition object at 0x7fae04fb3f50> >03:50:57,684 DEBUG storage.ui: setting req6 new geometry: parted.Geometry instance -- > start: 6154240 end: 7202815 length: 1048576 > device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae05a48590> >03:50:57,686 DEBUG storage.ui: PartitionDevice._setPartedPartition: req6 ; >03:50:57,687 DEBUG storage.ui: device req6 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 2 path: /dev/sda2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f00c50> PedPartition: <_ped.Partition object at 0x7fae04f294d0> >03:50:57,689 DEBUG storage.ui: PartitionDevice._setDisk: sda2 ; new: sda ; old: None ; >03:50:57,691 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sda ; >03:50:57,694 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda2 ; >03:50:57,695 DEBUG storage.ui: device sda2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 2 path: /dev/sda2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae05a51250> PedPartition: <_ped.Partition object at 0x7fae04f8e350> >03:50:57,695 DEBUG storage.ui: setting req1 new geometry: parted.Geometry instance -- > start: 7202816 end: 7989247 
length: 786432 > device: <parted.device.Device object at 0x7fae05b09390> PedGeometry: <_ped.Geometry object at 0x7fae05a51b90> >03:50:57,698 DEBUG storage.ui: PartitionDevice._setPartedPartition: req1 ; >03:50:57,698 DEBUG storage.ui: device req1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 3 path: /dev/sda3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae05a517d0> PedPartition: <_ped.Partition object at 0x7fae04f29530> >03:50:57,701 DEBUG storage.ui: PartitionDevice._setDisk: sda3 ; new: sda ; old: None ; >03:50:57,704 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sda ; >03:50:57,707 DEBUG storage.ui: PartitionDevice._setPartedPartition: sda3 ; >03:50:57,707 DEBUG storage.ui: device sda3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05b095d0> fileSystem: None > number: 3 path: /dev/sda3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f22650> PedPartition: <_ped.Partition object at 0x7fae04f295f0> >03:50:57,708 DEBUG storage.ui: growing partitions on sdb >03:50:57,708 DEBUG storage.ui: partition sdb1 (38): 0 >03:50:57,709 DEBUG storage.ui: new geometry for sdb1: parted.Geometry instance -- > start: 2048 end: 6154239 length: 6152192 > device: <parted.device.Device object at 0x7fae05b09ad0> PedGeometry: <_ped.Geometry object at 0x7fae05a48e90> >03:50:57,709 DEBUG storage.ui: partition sdb2 (32): 0 >03:50:57,710 DEBUG storage.ui: new geometry for sdb2: parted.Geometry instance -- > start: 6154240 end: 7202815 length: 1048576 > device: <parted.device.Device object at 0x7fae05b09ad0> PedGeometry: <_ped.Geometry object at 0x7fae05a48f10> >03:50:57,710 DEBUG storage.ui: partition sdb3 (26): 0 >03:50:57,711 DEBUG storage.ui: new geometry for sdb3: parted.Geometry instance -- > start: 7202816 end: 7989247 length: 786432 > device: 
<parted.device.Device object at 0x7fae05b09ad0> PedGeometry: <_ped.Geometry object at 0x7fae05a49090> >03:50:57,711 DEBUG storage.ui: removing all non-preexisting partitions ['sdb1(id 38)', 'sdb2(id 32)', 'sdb3(id 26)'] from disk(s) ['sdb'] >03:50:57,713 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ; >03:50:57,714 DEBUG storage.ui: device sdb1 new partedPartition None >03:50:57,716 DEBUG storage.ui: PartitionDevice._setDisk: req12 ; new: None ; old: sdb ; >03:50:57,718 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sdb ; >03:50:57,721 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb2 ; >03:50:57,721 DEBUG storage.ui: device sdb2 new partedPartition None >03:50:57,723 DEBUG storage.ui: PartitionDevice._setDisk: req7 ; new: None ; old: sdb ; >03:50:57,726 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdb ; >03:50:57,728 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb3 ; >03:50:57,728 DEBUG storage.ui: device sdb3 new partedPartition None >03:50:57,730 DEBUG storage.ui: PartitionDevice._setDisk: req2 ; new: None ; old: sdb ; >03:50:57,733 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdb ; >03:50:57,733 DEBUG storage.ui: back from removeNewPartitions >03:50:57,733 DEBUG storage.ui: extended: None >03:50:57,734 DEBUG storage.ui: setting req12 new geometry: parted.Geometry instance -- > start: 2048 end: 6154239 length: 6152192 > device: <parted.device.Device object at 0x7fae05b09ad0> PedGeometry: <_ped.Geometry object at 0x7fae05a48e90> >03:50:57,736 DEBUG storage.ui: PartitionDevice._setPartedPartition: req12 ; >03:50:57,737 DEBUG storage.ui: device req12 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae05a51fd0> PedPartition: <_ped.Partition object at 0x7fae04f8e290> >03:50:57,739 DEBUG storage.ui: 
PartitionDevice._setDisk: sdb1 ; new: sdb ; old: None ; >03:50:57,742 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdb ; >03:50:57,744 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb1 ; >03:50:57,745 DEBUG storage.ui: device sdb1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae05a51850> PedPartition: <_ped.Partition object at 0x7fae04f8e2f0> >03:50:57,746 DEBUG storage.ui: setting req7 new geometry: parted.Geometry instance -- > start: 6154240 end: 7202815 length: 1048576 > device: <parted.device.Device object at 0x7fae05b09ad0> PedGeometry: <_ped.Geometry object at 0x7fae05a48f10> >03:50:57,748 DEBUG storage.ui: PartitionDevice._setPartedPartition: req7 ; >03:50:57,749 DEBUG storage.ui: device req7 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 2 path: /dev/sdb2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f00bd0> PedPartition: <_ped.Partition object at 0x7fae04f8e1d0> >03:50:57,751 DEBUG storage.ui: PartitionDevice._setDisk: sdb2 ; new: sdb ; old: None ; >03:50:57,753 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdb ; >03:50:57,756 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb2 ; >03:50:57,757 DEBUG storage.ui: device sdb2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 2 path: /dev/sdb2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae05a51650> PedPartition: <_ped.Partition object at 0x7fae04f8e770> >03:50:57,757 DEBUG storage.ui: setting req2 new geometry: parted.Geometry instance -- > start: 7202816 end: 7989247 length: 786432 > device: <parted.device.Device 
object at 0x7fae05b09ad0> PedGeometry: <_ped.Geometry object at 0x7fae05a49090> >03:50:57,760 DEBUG storage.ui: PartitionDevice._setPartedPartition: req2 ; >03:50:57,760 DEBUG storage.ui: device req2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 3 path: /dev/sdb3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae05a49190> PedPartition: <_ped.Partition object at 0x7fae04f294d0> >03:50:57,763 DEBUG storage.ui: PartitionDevice._setDisk: sdb3 ; new: sdb ; old: None ; >03:50:57,765 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdb ; >03:50:57,767 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdb3 ; >03:50:57,768 DEBUG storage.ui: device sdb3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee350> fileSystem: None > number: 3 path: /dev/sdb3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae05a4d350> PedPartition: <_ped.Partition object at 0x7fae04f29530> >03:50:57,769 DEBUG storage.ui: growing partitions on sdc >03:50:57,769 DEBUG storage.ui: partition sdc1 (39): 0 >03:50:57,770 DEBUG storage.ui: new geometry for sdc1: parted.Geometry instance -- > start: 2048 end: 6154239 length: 6152192 > device: <parted.device.Device object at 0x7fae05aee390> PedGeometry: <_ped.Geometry object at 0x7fae04f12d10> >03:50:57,770 DEBUG storage.ui: partition sdc2 (33): 0 >03:50:57,770 DEBUG storage.ui: new geometry for sdc2: parted.Geometry instance -- > start: 6154240 end: 7202815 length: 1048576 > device: <parted.device.Device object at 0x7fae05aee390> PedGeometry: <_ped.Geometry object at 0x7fae05a51dd0> >03:50:57,771 DEBUG storage.ui: partition sdc3 (27): 0 >03:50:57,771 DEBUG storage.ui: new geometry for sdc3: parted.Geometry instance -- > start: 7202816 end: 7989247 length: 786432 > device: <parted.device.Device object at 0x7fae05aee390> 
PedGeometry: <_ped.Geometry object at 0x7fae05a4d3d0> >03:50:57,772 DEBUG storage.ui: removing all non-preexisting partitions ['sdc1(id 39)', 'sdc2(id 33)', 'sdc3(id 27)'] from disk(s) ['sdc'] >03:50:57,774 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ; >03:50:57,774 DEBUG storage.ui: device sdc1 new partedPartition None >03:50:57,777 DEBUG storage.ui: PartitionDevice._setDisk: req13 ; new: None ; old: sdc ; >03:50:57,779 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sdc ; >03:50:57,781 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc2 ; >03:50:57,782 DEBUG storage.ui: device sdc2 new partedPartition None >03:50:57,784 DEBUG storage.ui: PartitionDevice._setDisk: req8 ; new: None ; old: sdc ; >03:50:57,786 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdc ; >03:50:57,789 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc3 ; >03:50:57,789 DEBUG storage.ui: device sdc3 new partedPartition None >03:50:57,791 DEBUG storage.ui: PartitionDevice._setDisk: req3 ; new: None ; old: sdc ; >03:50:57,794 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdc ; >03:50:57,794 DEBUG storage.ui: back from removeNewPartitions >03:50:57,794 DEBUG storage.ui: extended: None >03:50:57,795 DEBUG storage.ui: setting req13 new geometry: parted.Geometry instance -- > start: 2048 end: 6154239 length: 6152192 > device: <parted.device.Device object at 0x7fae05aee390> PedGeometry: <_ped.Geometry object at 0x7fae04f12d10> >03:50:57,797 DEBUG storage.ui: PartitionDevice._setPartedPartition: req13 ; >03:50:57,798 DEBUG storage.ui: device req13 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 1 path: /dev/sdc1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f04310> PedPartition: <_ped.Partition object at 0x7fae04f8e3b0> >03:50:57,800 DEBUG storage.ui: PartitionDevice._setDisk: sdc1 ; new: sdc ; old: None 
; >03:50:57,803 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdc ; >03:50:57,805 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc1 ; >03:50:57,806 DEBUG storage.ui: device sdc1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 1 path: /dev/sdc1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae05a51790> PedPartition: <_ped.Partition object at 0x7fae04f8e4d0> >03:50:57,807 DEBUG storage.ui: setting req8 new geometry: parted.Geometry instance -- > start: 6154240 end: 7202815 length: 1048576 > device: <parted.device.Device object at 0x7fae05aee390> PedGeometry: <_ped.Geometry object at 0x7fae05a51dd0> >03:50:57,809 DEBUG storage.ui: PartitionDevice._setPartedPartition: req8 ; >03:50:57,810 DEBUG storage.ui: device req8 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 2 path: /dev/sdc2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f040d0> PedPartition: <_ped.Partition object at 0x7fae04f29650> >03:50:57,812 DEBUG storage.ui: PartitionDevice._setDisk: sdc2 ; new: sdc ; old: None ; >03:50:57,814 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdc ; >03:50:57,817 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc2 ; >03:50:57,818 DEBUG storage.ui: device sdc2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 2 path: /dev/sdc2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae05a51410> PedPartition: <_ped.Partition object at 0x7fae04f8e290> >03:50:57,818 DEBUG storage.ui: setting req3 new geometry: parted.Geometry instance -- > start: 7202816 end: 7989247 length: 786432 > device: <parted.device.Device object at 0x7fae05aee390> PedGeometry: <_ped.Geometry 
object at 0x7fae05a4d3d0> >03:50:57,821 DEBUG storage.ui: PartitionDevice._setPartedPartition: req3 ; >03:50:57,822 DEBUG storage.ui: device req3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 3 path: /dev/sdc3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae05a51710> PedPartition: <_ped.Partition object at 0x7fae04f296b0> >03:50:57,824 DEBUG storage.ui: PartitionDevice._setDisk: sdc3 ; new: sdc ; old: None ; >03:50:57,826 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdc ; >03:50:57,829 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdc3 ; >03:50:57,830 DEBUG storage.ui: device sdc3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee6d0> fileSystem: None > number: 3 path: /dev/sdc3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae05a48690> PedPartition: <_ped.Partition object at 0x7fae04f29410> >03:50:57,830 DEBUG storage.ui: growing partitions on sdd >03:50:57,831 DEBUG storage.ui: partition sdd1 (40): 0 >03:50:57,831 DEBUG storage.ui: new geometry for sdd1: parted.Geometry instance -- > start: 2048 end: 6154239 length: 6152192 > device: <parted.device.Device object at 0x7fae05b09b90> PedGeometry: <_ped.Geometry object at 0x7fae05a45350> >03:50:57,832 DEBUG storage.ui: partition sdd2 (34): 0 >03:50:57,832 DEBUG storage.ui: new geometry for sdd2: parted.Geometry instance -- > start: 6154240 end: 7202815 length: 1048576 > device: <parted.device.Device object at 0x7fae05b09b90> PedGeometry: <_ped.Geometry object at 0x7fae05a51dd0> >03:50:57,833 DEBUG storage.ui: partition sdd3 (28): 0 >03:50:57,833 DEBUG storage.ui: new geometry for sdd3: parted.Geometry instance -- > start: 7202816 end: 7989247 length: 786432 > device: <parted.device.Device object at 0x7fae05b09b90> PedGeometry: <_ped.Geometry object at 0x7fae05a48710> 
>03:50:57,834 DEBUG storage.ui: removing all non-preexisting partitions ['sdd1(id 40)', 'sdd2(id 34)', 'sdd3(id 28)'] from disk(s) ['sdd']
>03:50:57,836 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ;
>03:50:57,837 DEBUG storage.ui: device sdd1 new partedPartition None
>03:50:57,839 DEBUG storage.ui: PartitionDevice._setDisk: req14 ; new: None ; old: sdd ;
>03:50:57,841 DEBUG storage.ui: DiskDevice.removeChild: kids: 3 ; name: sdd ;
>03:50:57,843 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd2 ;
>03:50:57,844 DEBUG storage.ui: device sdd2 new partedPartition None
>03:50:57,846 DEBUG storage.ui: PartitionDevice._setDisk: req9 ; new: None ; old: sdd ;
>03:50:57,848 DEBUG storage.ui: DiskDevice.removeChild: kids: 2 ; name: sdd ;
>03:50:57,851 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd3 ;
>03:50:57,851 DEBUG storage.ui: device sdd3 new partedPartition None
>03:50:57,853 DEBUG storage.ui: PartitionDevice._setDisk: req4 ; new: None ; old: sdd ;
>03:50:57,856 DEBUG storage.ui: DiskDevice.removeChild: kids: 1 ; name: sdd ;
>03:50:57,856 DEBUG storage.ui: back from removeNewPartitions
>03:50:57,856 DEBUG storage.ui: extended: None
>03:50:57,857 DEBUG storage.ui: setting req14 new geometry: parted.Geometry instance --
> start: 2048 end: 6154239 length: 6152192
> device: <parted.device.Device object at 0x7fae05b09b90> PedGeometry: <_ped.Geometry object at 0x7fae05a45350>
>03:50:57,859 DEBUG storage.ui: PartitionDevice._setPartedPartition: req14 ;
>03:50:57,860 DEBUG storage.ui: device req14 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None
> number: 1 path: /dev/sdd1 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae05a45650> PedPartition: <_ped.Partition object at 0x7fae04f8ed10>
>03:50:57,863 DEBUG storage.ui: PartitionDevice._setDisk: sdd1 ; new: sdd ; old: None ;
>03:50:57,865 DEBUG storage.ui: DiskDevice.addChild: kids: 0 ; name: sdd ;
>03:50:57,868 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd1 ;
>03:50:57,868 DEBUG storage.ui: device sdd1 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None
> number: 1 path: /dev/sdd1 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae05a49090> PedPartition: <_ped.Partition object at 0x7fae04f8e3b0>
>03:50:57,869 DEBUG storage.ui: setting req9 new geometry: parted.Geometry instance --
> start: 6154240 end: 7202815 length: 1048576
> device: <parted.device.Device object at 0x7fae05b09b90> PedGeometry: <_ped.Geometry object at 0x7fae05a51dd0>
>03:50:57,871 DEBUG storage.ui: PartitionDevice._setPartedPartition: req9 ;
>03:50:57,872 DEBUG storage.ui: device req9 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None
> number: 2 path: /dev/sdd2 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae04ef7850> PedPartition: <_ped.Partition object at 0x7fae04f293b0>
>03:50:57,875 DEBUG storage.ui: PartitionDevice._setDisk: sdd2 ; new: sdd ; old: None ;
>03:50:57,877 DEBUG storage.ui: DiskDevice.addChild: kids: 1 ; name: sdd ;
>03:50:57,880 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd2 ;
>03:50:57,880 DEBUG storage.ui: device sdd2 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None
> number: 2 path: /dev/sdd2 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae04f22c10> PedPartition: <_ped.Partition object at 0x7fae04f8edd0>
>03:50:57,881 DEBUG storage.ui: setting req4 new geometry: parted.Geometry instance --
> start: 7202816 end: 7989247 length: 786432
> device: <parted.device.Device object at 0x7fae05b09b90> PedGeometry: <_ped.Geometry object at 0x7fae05a48710>
>03:50:57,883 DEBUG storage.ui: PartitionDevice._setPartedPartition: req4 ;
>03:50:57,884 DEBUG storage.ui: device req4 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None
> number: 3 path: /dev/sdd3 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae04ef7390> PedPartition: <_ped.Partition object at 0x7fae04f294d0>
>03:50:57,886 DEBUG storage.ui: PartitionDevice._setDisk: sdd3 ; new: sdd ; old: None ;
>03:50:57,889 DEBUG storage.ui: DiskDevice.addChild: kids: 2 ; name: sdd ;
>03:50:57,891 DEBUG storage.ui: PartitionDevice._setPartedPartition: sdd3 ;
>03:50:57,892 DEBUG storage.ui: device sdd3 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05b09ed0> fileSystem: None
> number: 3 path: /dev/sdd3 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae05a45150> PedPartition: <_ped.Partition object at 0x7fae04f29470>
>03:50:57,893 DEBUG storage.ui: fixing size of non-existent 3004MB partition sda1 (37) with non-existent mdmember at 3004.00
>03:50:57,893 DEBUG storage.ui: fixing size of non-existent 512MB partition sda2 (31) with non-existent mdmember at 512.00
>03:50:57,894 DEBUG storage.ui: fixing size of non-existent 384MB partition sda3 (25) with non-existent mdmember at 384.00
>03:50:57,894 DEBUG storage.ui: fixing size of non-existent 3004MB partition sdb1 (38) with non-existent mdmember at 3004.00
>03:50:57,895 DEBUG storage.ui: fixing size of non-existent 512MB partition sdb2 (32) with non-existent mdmember at 512.00
>03:50:57,896 DEBUG storage.ui: fixing size of non-existent 384MB partition sdb3 (26) with non-existent mdmember at 384.00
>03:50:57,896 DEBUG storage.ui: fixing size of non-existent 3004MB partition sdc1 (39) with non-existent mdmember at 3004.00
>03:50:57,897 DEBUG storage.ui: fixing size of non-existent 512MB partition sdc2 (33) with non-existent mdmember at 512.00
>03:50:57,898 DEBUG storage.ui: fixing size of non-existent 384MB partition sdc3 (27) with non-existent mdmember at 384.00
>03:50:57,898 DEBUG storage.ui: fixing size of non-existent 3004MB partition sdd1 (40) with non-existent mdmember at 3004.00
>03:50:57,899 DEBUG storage.ui: fixing size of non-existent 512MB partition sdd2 (34) with non-existent mdmember at 512.00
>03:50:57,900 DEBUG storage.ui: fixing size of non-existent 384MB partition sdd3 (28) with non-existent mdmember at 384.00
>03:50:57,901 DEBUG storage.ui: new member set: ['sda3', 'sdb3', 'sdc3', 'sdd3']
>03:50:57,902 DEBUG storage.ui: old member set: ['sda3', 'sdb3', 'sdc3', 'sdd3']
>03:50:57,903 DEBUG storage.ui: raw RAID 10 size == 768.0
>03:50:57,903 INFO storage.ui: Using 0MB superBlockSize
>03:50:57,904 DEBUG storage.ui: non-existent RAID 10 size == 768.0
>03:50:57,905 DEBUG storage.ui: raw RAID 10 size == 768.0
>03:50:57,905 INFO storage.ui: Using 0MB superBlockSize
>03:50:57,906 DEBUG storage.ui: non-existent RAID 10 size == 768.0
>03:50:57,909 DEBUG blivet: raw RAID 10 size == 768.0
>03:50:57,909 INFO blivet: Using 0MB superBlockSize
>03:50:57,910 DEBUG blivet: non-existent RAID 10 size == 768.0
>03:50:57,912 DEBUG blivet: raw RAID 1 size == 512.0
>03:50:57,913 INFO blivet: Using 0MB superBlockSize
>03:50:57,913 DEBUG blivet: non-existent RAID 1 size == 512.0
>03:50:57,916 DEBUG blivet: raw RAID 10 size == 6008.0
>03:50:57,916 INFO blivet: Using 4MB superBlockSize
>03:50:57,917 DEBUG blivet: non-existent RAID 10 size == 6000.0
>03:50:57,919 DEBUG blivet: raw RAID 10 size == 768.0
>03:50:57,920 INFO blivet: Using 0MB superBlockSize
>03:50:57,920 DEBUG blivet: non-existent RAID 10 size == 768.0
>03:50:57,934 DEBUG blivet: raw RAID 10 size == 768.0
>03:50:57,935 INFO blivet: Using 0MB superBlockSize
>03:50:57,936 DEBUG blivet: non-existent RAID 10 size == 768.0
>03:50:57,939 DEBUG blivet: raw RAID 10 size == 768.0
>03:50:57,940 INFO blivet: Using 0MB superBlockSize
>03:50:57,940 DEBUG blivet: non-existent RAID 10 size == 768.0
>03:50:57,949 DEBUG blivet: raw RAID 1 size == 512.0
>03:50:57,949 INFO blivet: Using 0MB superBlockSize
>03:50:57,950 DEBUG blivet: non-existent RAID 1 size == 512.0
>03:50:57,951 DEBUG blivet: raw RAID 1 size == 512.0
>03:50:57,952 INFO blivet: Using 0MB superBlockSize
>03:50:57,952 DEBUG blivet: non-existent RAID 1 size == 512.0
>03:50:57,956 DEBUG blivet: raw RAID 1 size == 512.0
>03:50:57,957 INFO blivet: Using 0MB superBlockSize
>03:50:57,957 DEBUG blivet: non-existent RAID 1 size == 512.0
>03:50:57,961 DEBUG blivet: Ext4FS.supported: supported: True ;
>03:50:57,961 DEBUG blivet: getFormat('ext4') returning Ext4FS instance
>03:50:57,970 DEBUG blivet: raw RAID 1 size == 512.0
>03:50:57,971 INFO blivet: Using 0MB superBlockSize
>03:50:57,971 DEBUG blivet: non-existent RAID 1 size == 512.0
>03:50:57,975 DEBUG blivet: raw RAID 1 size == 512.0
>03:50:57,975 INFO blivet: Using 0MB superBlockSize
>03:50:57,976 DEBUG blivet: non-existent RAID 1 size == 512.0
>03:50:57,983 DEBUG blivet: Ext4FS.supported: supported: True ;
>03:50:57,983 DEBUG blivet: getFormat('ext4') returning Ext4FS instance
>03:50:57,988 DEBUG storage.ui: instantiating <class 'blivet.devicefactory.MDFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 512, ['sda', 'sdb', 'sdc', 'sdd'], {'encrypted': False, 'raid_level': 'raid1'}
>03:50:57,993 DEBUG blivet: raw RAID 1 size == 512.0
>03:50:57,993 INFO blivet: Using 0MB superBlockSize
>03:50:57,993 DEBUG blivet: non-existent RAID 1 size == 512.0
>03:50:57,995 DEBUG blivet: raw RAID 1 size == 512.0
>03:50:57,995 INFO blivet: Using 0MB superBlockSize
>03:50:57,996 DEBUG blivet: non-existent RAID 1 size == 512.0
>03:50:58,000 DEBUG blivet: raw RAID 1 size == 512.0
>03:50:58,000 INFO blivet: Using 0MB superBlockSize
>03:50:58,001 DEBUG blivet: non-existent RAID 1 size == 512.0
>03:50:58,919 DEBUG blivet: raw RAID 1 size == 512.0
>03:50:58,920 INFO blivet: Using 0MB superBlockSize
>03:50:58,922 DEBUG blivet: non-existent RAID 1 size == 512.0
>03:50:58,930 DEBUG blivet: raw RAID 1 size == 512.0
>03:50:58,930 INFO blivet: Using 0MB superBlockSize
>03:50:58,931 DEBUG blivet: non-existent RAID 1 size == 512.0
>03:50:58,938 DEBUG blivet: Ext4FS.supported: supported: True ;
>03:50:58,939 DEBUG blivet: getFormat('ext4') returning Ext4FS instance
>03:50:58,944 DEBUG storage.ui: instantiating <class 'blivet.devicefactory.MDFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 512, ['sda', 'sdb', 'sdc', 'sdd'], {'encrypted': False, 'raid_level': 'raid1'}
>03:50:58,949 DEBUG blivet: raw RAID 10 size == 6008.0
>03:50:58,949 INFO blivet: Using 4MB superBlockSize
>03:50:58,950 DEBUG blivet: non-existent RAID 10 size == 6000.0
>03:50:58,951 DEBUG blivet: raw RAID 10 size == 6008.0
>03:50:58,952 INFO blivet: Using 4MB superBlockSize
>03:50:58,952 DEBUG blivet: non-existent RAID 10 size == 6000.0
>03:50:58,956 DEBUG blivet: raw RAID 10 size == 6008.0
>03:50:58,957 INFO blivet: Using 4MB superBlockSize
>03:50:58,957 DEBUG blivet: non-existent RAID 10 size == 6000.0
>03:51:00,880 DEBUG blivet: raw RAID 10 size == 6008.0
>03:51:00,881 INFO blivet: Using 4MB superBlockSize
>03:51:00,881 DEBUG blivet: non-existent RAID 10 size == 6000.0
>03:51:00,888 DEBUG blivet: Ext4FS.supported: supported: True ;
>03:51:00,889 DEBUG blivet: getFormat('ext4') returning Ext4FS instance
>03:51:00,894 DEBUG storage.ui: instantiating <class 'blivet.devicefactory.MDFactory'>: <blivet.Blivet object at 0x7fae05326d50>, 6000, ['sda', 'sdb', 'sdc', 'sdd'], {'encrypted': False, 'raid_level': 'raid10'}
>03:51:00,898 INFO blivet: removing obsolete action 58 (60)
>03:51:00,899 INFO blivet: removing obsolete action 59 (60)
>03:51:00,900 INFO blivet: removing obsolete action 60 (60)
>03:51:00,900 DEBUG blivet: action 59 already pruned
>03:51:00,900 DEBUG blivet: action 58 already pruned
>03:51:00,901 INFO blivet: removing obsolete action 45 (47)
>03:51:00,902 INFO blivet: removing obsolete action 46 (47)
>03:51:00,902 INFO blivet: removing obsolete action 47 (47)
>03:51:00,903 DEBUG blivet: action 46 already pruned
>03:51:00,903 DEBUG blivet: action 45 already pruned
>03:51:00,904 INFO blivet: removing obsolete action 32 (34)
>03:51:00,905 INFO blivet: removing obsolete action 33 (34)
>03:51:00,905 INFO blivet: removing obsolete action 34 (34)
>03:51:00,906 DEBUG blivet: action 33 already pruned
>03:51:00,906 DEBUG blivet: action 32 already pruned
>03:51:02,483 DEBUG blivet: DeviceTree.getDeviceByName: name: sda ;
>03:51:02,485 DEBUG blivet: DeviceTree.getDeviceByName returned existing 12000MB disk sda (1) with non-existent msdos disklabel
>03:51:02,486 DEBUG blivet: resolved 'sda' to 'sda' (disk)
>03:51:02,503 DEBUG blivet: raw RAID 10 size == 6008.0
>03:51:02,505 INFO blivet: Using 4MB superBlockSize
>03:51:02,506 DEBUG blivet: non-existent RAID 10 size == 6000.0
>03:51:02,508 DEBUG blivet: raw RAID 1 size == 512.0
>03:51:02,510 INFO blivet: Using 0MB superBlockSize
>03:51:02,511 DEBUG blivet: non-existent RAID 1 size == 512.0
>03:51:02,512 DEBUG blivet: raw RAID 1 size == 512.0
>03:51:02,514 INFO blivet: Using 0MB superBlockSize
>03:51:02,515 DEBUG blivet: non-existent RAID 1 size == 512.0
>03:51:02,516 DEBUG blivet: raw RAID 10 size == 6008.0
>03:51:02,518 INFO blivet: Using 4MB superBlockSize
>03:51:02,519 DEBUG blivet: non-existent RAID 10 size == 6000.0
>03:51:02,525 DEBUG blivet: raw RAID 1 size == 512.0
>03:51:02,527 INFO blivet: Using 0MB superBlockSize
>03:51:02,528 DEBUG blivet: non-existent RAID 1 size == 512.0
>03:51:02,585 DEBUG blivet: raw RAID 10 size == 6008.0
>03:51:02,585 INFO blivet: Using 4MB superBlockSize
>03:51:02,586 DEBUG blivet: non-existent RAID 10 size == 6000.0
>03:51:03,012 DEBUG blivet: raw RAID 10 size == 6008.0
>03:51:03,013 INFO blivet: Using 4MB superBlockSize
>03:51:03,014 DEBUG blivet: non-existent RAID 10 size == 6000.0
>03:51:03,021 DEBUG blivet: raw RAID 10 size == 6008.0
>03:51:03,021 INFO blivet: Using 4MB superBlockSize
>03:51:03,022 DEBUG blivet: non-existent RAID 10 size == 6000.0
>03:51:03,030 DEBUG blivet: raw RAID 10 size == 6008.0
>03:51:03,030 INFO blivet: Using 4MB superBlockSize
>03:51:03,031 DEBUG blivet: non-existent RAID 10 size == 6000.0
>03:51:04,519 DEBUG blivet: OpticalDevice.teardown: sr0 ; status: True ; controllable: True ;
>03:51:04,556 DEBUG blivet: LoopDevice.teardown: loop0 ; status: False ; controllable: False ;
>03:51:04,557 DEBUG blivet: LoopDevice.teardown: loop1 ; status: False ; controllable: False ;
>03:51:04,559 DEBUG blivet: MDRaidArrayDevice.teardown: swap ; status: False ; controllable: True ;
>03:51:04,560 DEBUG blivet: PartitionDevice.teardown: sda3 ; status: False ; controllable: True ;
>03:51:04,562 DEBUG blivet: PartitionDevice.teardown: sdb3 ; status: False ; controllable: True ;
>03:51:04,563 DEBUG blivet: PartitionDevice.teardown: sdc3 ; status: False ; controllable: True ;
>03:51:04,564 DEBUG blivet: PartitionDevice.teardown: sdd3 ; status: False ; controllable: True ;
>03:51:04,566 DEBUG blivet: MDRaidArrayDevice.teardown: boot ; status: False ; controllable: True ;
>03:51:04,569 DEBUG blivet: PartitionDevice.teardown: sda2 ; status: False ; controllable: True ;
>03:51:04,571 DEBUG blivet: PartitionDevice.teardown: sdb2 ; status: False ; controllable: True ;
>03:51:04,572 DEBUG blivet: PartitionDevice.teardown: sdc2 ; status: False ; controllable: True ;
>03:51:04,573 DEBUG blivet: PartitionDevice.teardown: sdd2 ; status: False ; controllable: True ;
>03:51:04,575 DEBUG blivet: MDRaidArrayDevice.teardown: root ; status: False ; controllable: True ;
>03:51:04,576 DEBUG blivet: PartitionDevice.teardown: sda1 ; status: False ; controllable: True ;
>03:51:04,578 DEBUG blivet: PartitionDevice.teardown: sdb1 ; status: False ; controllable: True ;
>03:51:04,579 DEBUG blivet: PartitionDevice.teardown: sdc1 ; status: False ; controllable: True ;
>03:51:04,580 DEBUG blivet: PartitionDevice.teardown: sdd1 ; status: False ; controllable: True ;
>03:51:04,581 INFO blivet: resetting parted disks...
>03:51:04,582 DEBUG blivet: DiskLabel.resetPartedDisk: device: /dev/sda ;
>03:51:04,584 DEBUG blivet: DiskLabel.resetPartedDisk: device: /dev/sda ;
>03:51:04,586 DEBUG blivet: DiskLabel.resetPartedDisk: device: /dev/sdd ;
>03:51:04,588 DEBUG blivet: DiskLabel.resetPartedDisk: device: /dev/sdd ;
>03:51:04,590 DEBUG blivet: DiskLabel.resetPartedDisk: device: /dev/sdc ;
>03:51:04,591 DEBUG blivet: DiskLabel.resetPartedDisk: device: /dev/sdc ;
>03:51:04,592 DEBUG blivet: DiskLabel.resetPartedDisk: device: /dev/sdb ;
>03:51:04,594 DEBUG blivet: DiskLabel.resetPartedDisk: device: /dev/sdb ;
>03:51:04,596 DEBUG blivet: PartitionDevice.preCommitFixup: sda3 ;
>03:51:04,597 DEBUG blivet: PartitionDevice.preCommitFixup: sdb3 ;
>03:51:04,598 DEBUG blivet: PartitionDevice.preCommitFixup: sdc3 ;
>03:51:04,599 DEBUG blivet: PartitionDevice.preCommitFixup: sdd3 ;
>03:51:04,601 DEBUG blivet: MDRaidArrayDevice.preCommitFixup: swap ; [None, '', '', '', '', '', '', '', None, '', '', None, '', '', '', '', '', '', '', '', '', '/boot', '', '', '', '', '/'] ;
>03:51:04,604 DEBUG blivet: raw RAID 10 size == 768.0
>03:51:04,605 INFO blivet: Using 0MB superBlockSize
>03:51:04,606 DEBUG blivet: non-existent RAID 10 size == 768.0
>03:51:04,607 DEBUG blivet: PartitionDevice.preCommitFixup: sda2 ;
>03:51:04,608 DEBUG blivet: PartitionDevice.preCommitFixup: sdb2 ;
>03:51:04,610 DEBUG blivet: PartitionDevice.preCommitFixup: sdc2 ;
>03:51:04,611 DEBUG blivet: PartitionDevice.preCommitFixup: sdd2 ;
>03:51:04,612 DEBUG blivet: MDRaidArrayDevice.preCommitFixup: boot ; [None, '', '', '', '', '', '', '', None, '', '', None, '', '', '', '', '', '', '', '', '', '/boot', '', '', '', '', '/'] ;
>03:51:04,613 DEBUG blivet: raw RAID 1 size == 512.0
>03:51:04,614 INFO blivet: Using 2.0MB superBlockSize
>03:51:04,614 DEBUG blivet: non-existent RAID 1 size == 510.0
>03:51:04,616 DEBUG blivet: PartitionDevice.preCommitFixup: sda1 ;
>03:51:04,617 DEBUG blivet: PartitionDevice.preCommitFixup: sdb1 ;
>03:51:04,618 DEBUG blivet: PartitionDevice.preCommitFixup: sdc1 ;
>03:51:04,621 DEBUG blivet: PartitionDevice.preCommitFixup: sdd1 ;
>03:51:04,623 DEBUG blivet: MDRaidArrayDevice.preCommitFixup: root ; [None, '', '', '', '', '', '', '', None, '', '', None, '', '', '', '', '', '', '', '', '', '/boot', '', '', '', '', '/'] ;
>03:51:04,624 DEBUG blivet: raw RAID 10 size == 6008.0
>03:51:04,624 INFO blivet: Using 4MB superBlockSize
>03:51:04,625 DEBUG blivet: non-existent RAID 10 size == 6000.0
>03:51:04,626 DEBUG blivet: MDRaidArrayDevice.preCommitFixup: dhcppc0:swap ; [None, '', '', '', '', '', '', '', None, '', '', None, '', '', '', '', '', '', '', '', '', '/boot', '', '', '', '', '/'] ;
>03:51:04,628 DEBUG blivet: raw RAID 1 size == 2041.0
>03:51:04,628 INFO blivet: Using 1MB superBlockSize
>03:51:04,629 DEBUG blivet: existing RAID 1 size == 2039.9375
>03:51:04,630 DEBUG blivet: PartitionDevice.preCommitFixup: sdb2 ;
>03:51:04,631 DEBUG blivet: sector-based lookup found partition sdb2
>03:51:04,632 DEBUG blivet: PartitionDevice._setPartedPartition: sdb2 ;
>03:51:04,633 DEBUG blivet: device sdb2 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05b11fd0> fileSystem: None
> number: 2 path: /dev/sdb2 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae05e3b750> PedPartition: <_ped.Partition object at 0x7fae04f29e90>
>03:51:04,634 DEBUG blivet: PartitionDevice.preCommitFixup: sdb1 ;
>03:51:04,635 DEBUG blivet: sector-based lookup found partition sdb1
>03:51:04,638 DEBUG blivet: PartitionDevice._setPartedPartition: sdb1 ;
>03:51:04,639 DEBUG blivet: device sdb1 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05b11fd0> fileSystem: None
> number: 1 path: /dev/sdb1 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae05cd6e50> PedPartition: <_ped.Partition object at 0x7fae04f29e30>
>03:51:04,641 DEBUG blivet: PartitionDevice.preCommitFixup: sdc2 ;
>03:51:04,641 DEBUG blivet: sector-based lookup found partition sdc2
>03:51:04,643 DEBUG blivet: PartitionDevice._setPartedPartition: sdc2 ;
>03:51:04,644 DEBUG blivet: device sdc2 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05b11510> fileSystem: None
> number: 2 path: /dev/sdc2 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae0dcdcfd0> PedPartition: <_ped.Partition object at 0x7fae04f29ef0>
>03:51:04,646 DEBUG blivet: PartitionDevice.preCommitFixup: sdc1 ;
>03:51:04,646 DEBUG blivet: sector-based lookup found partition sdc1
>03:51:04,647 DEBUG blivet: PartitionDevice._setPartedPartition: sdc1 ;
>03:51:04,648 DEBUG blivet: device sdc1 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05b11510> fileSystem: None
> number: 1 path: /dev/sdc1 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae053bdf50> PedPartition: <_ped.Partition object at 0x7fae04f29f50>
>03:51:04,650 DEBUG blivet: PartitionDevice.preCommitFixup: sdd2 ;
>03:51:04,650 DEBUG blivet: sector-based lookup found partition sdd2
>03:51:04,652 DEBUG blivet: PartitionDevice._setPartedPartition: sdd2 ;
>03:51:04,653 DEBUG blivet: device sdd2 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05aeaa10> fileSystem: None
> number: 2 path: /dev/sdd2 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae0dd391d0> PedPartition: <_ped.Partition object at 0x7fae04f29fb0>
>03:51:04,656 DEBUG blivet: PartitionDevice.preCommitFixup: sdd1 ;
>03:51:04,657 DEBUG blivet: sector-based lookup found partition sdd1
>03:51:04,658 DEBUG blivet: PartitionDevice._setPartedPartition: sdd1 ;
>03:51:04,659 DEBUG blivet: device sdd1 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05aeaa10> fileSystem: None
> number: 1 path: /dev/sdd1 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae0dd39190> PedPartition: <_ped.Partition object at 0x7fae05a3f050>
>03:51:04,660 DEBUG blivet: PartitionDevice.preCommitFixup: sda2 ;
>03:51:04,661 DEBUG blivet: sector-based lookup found partition sda2
>03:51:04,662 DEBUG blivet: PartitionDevice._setPartedPartition: sda2 ;
>03:51:04,663 DEBUG blivet: device sda2 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05319fd0> fileSystem: None
> number: 2 path: /dev/sda2 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae05b22850> PedPartition: <_ped.Partition object at 0x7fae05a3f0b0>
>03:51:04,665 DEBUG blivet: PartitionDevice.preCommitFixup: sda1 ;
>03:51:04,665 DEBUG blivet: sector-based lookup found partition sda1
>03:51:04,666 DEBUG blivet: PartitionDevice._setPartedPartition: sda1 ;
>03:51:04,667 DEBUG blivet: device sda1 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05319fd0> fileSystem: None
> number: 1 path: /dev/sda1 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae05e060d0> PedPartition: <_ped.Partition object at 0x7fae05a3f110>
>03:51:04,668 DEBUG blivet: action: [14] Destroy Format swap on mdarray dhcppc0:swap (id 3)
>03:51:04,668 DEBUG blivet: action: [15] Destroy Device mdarray dhcppc0:swap (id 3)
>03:51:04,669 DEBUG blivet: action: [28] Destroy Format mdmember on partition sdb1 (id 15)
>03:51:04,669 DEBUG blivet: action: [24] Destroy Format mdmember on partition sdc1 (id 12)
>03:51:04,669 DEBUG blivet: action: [20] Destroy Format mdmember on partition sdd1 (id 9)
>03:51:04,670 DEBUG blivet: action: [16] Destroy Format mdmember on partition sda1 (id 2)
>03:51:04,670 DEBUG blivet: action: [2] Destroy Format btrfs filesystem on btrfs subvolume root (id 7)
>03:51:04,671 DEBUG blivet: action: [3] Destroy Device btrfs subvolume root (id 7)
>03:51:04,671 DEBUG blivet: action: [0] Destroy Format btrfs filesystem on btrfs subvolume boot (id 6)
>03:51:04,672 DEBUG blivet: action: [1] Destroy Device btrfs subvolume boot (id 6)
>03:51:04,673 DEBUG blivet: action: [4] Destroy Format btrfs filesystem on btrfs volume btrfs.5 (id 5)
>03:51:04,674 DEBUG blivet: action: [5] Destroy Device btrfs volume btrfs.5 (id 5)
>03:51:04,674 DEBUG blivet: action: [12] Destroy Format btrfs filesystem on partition sdb2 (id 16)
>03:51:04,675 DEBUG blivet: action: [13] Destroy Device partition sdb2 (id 16)
>03:51:04,675 DEBUG blivet: action: [29] Destroy Device partition sdb1 (id 15)
>03:51:04,675 DEBUG blivet: action: [30] Destroy Format msdos disklabel on disk sdb (id 14)
>03:51:04,676 DEBUG blivet: action: [10] Destroy Format btrfs filesystem on partition sdc2 (id 13)
>03:51:04,676 DEBUG blivet: action: [11] Destroy Device partition sdc2 (id 13)
>03:51:04,676 DEBUG blivet: action: [25] Destroy Device partition sdc1 (id 12)
>03:51:04,677 DEBUG blivet: action: [26] Destroy Format msdos disklabel on disk sdc (id 11)
>03:51:04,677 DEBUG blivet: action: [8] Destroy Format btrfs filesystem on partition sdd2 (id 10)
>03:51:04,677 DEBUG blivet: action: [9] Destroy Device partition sdd2 (id 10)
>03:51:04,678 DEBUG blivet: action: [21] Destroy Device partition sdd1 (id 9)
>03:51:04,678 DEBUG blivet: action: [22] Destroy Format msdos disklabel on disk sdd (id 8)
>03:51:04,678 DEBUG blivet: action: [6] Destroy Format btrfs filesystem on partition sda2 (id 4)
>03:51:04,679 DEBUG blivet: action: [7] Destroy Device partition sda2 (id 4)
>03:51:04,679 DEBUG blivet: action: [17] Destroy Device partition sda1 (id 2)
>03:51:04,680 DEBUG blivet: action: [18] Destroy Format msdos disklabel on disk sda (id 1)
>03:51:04,680 DEBUG blivet: action: [31] Create Format msdos disklabel on disk sdb (id 14)
>03:51:04,680 DEBUG blivet: action: [63] Create Device partition sdb1 (id 38)
>03:51:04,681 DEBUG blivet: action: [64] Create Format mdmember on partition sdb1 (id 38)
>03:51:04,681 DEBUG blivet: action: [50] Create Device partition sdb2 (id 32)
>03:51:04,682 DEBUG blivet: action: [51] Create Format mdmember on partition sdb2 (id 32)
>03:51:04,682 DEBUG blivet: action: [37] Create Device partition sdb3 (id 26)
>03:51:04,682 DEBUG blivet: action: [38] Create Format mdmember on partition sdb3 (id 26)
>03:51:04,683 DEBUG blivet: action: [27] Create Format msdos disklabel on disk sdc (id 11)
>03:51:04,683 DEBUG blivet: action: [65] Create Device partition sdc1 (id 39)
>03:51:04,683 DEBUG blivet: action: [66] Create Format mdmember on partition sdc1 (id 39)
>03:51:04,684 DEBUG blivet: action: [52] Create Device partition sdc2 (id 33)
>03:51:04,684 DEBUG blivet: action: [53] Create Format mdmember on partition sdc2 (id 33)
>03:51:04,684 DEBUG blivet: action: [39] Create Device partition sdc3 (id 27)
>03:51:04,685 DEBUG blivet: action: [40] Create Format mdmember on partition sdc3 (id 27)
>03:51:04,685 DEBUG blivet: action: [23] Create Format msdos disklabel on disk sdd (id 8)
>03:51:04,685 DEBUG blivet: action: [67] Create Device partition sdd1 (id 40)
>03:51:04,686 DEBUG blivet: action: [68] Create Format mdmember on partition sdd1 (id 40)
>03:51:04,686 DEBUG blivet: action: [54] Create Device partition sdd2 (id 34)
>03:51:04,686 DEBUG blivet: action: [55] Create Format mdmember on partition sdd2 (id 34)
>03:51:04,687 DEBUG blivet: action: [41] Create Device partition sdd3 (id 28)
>03:51:04,687 DEBUG blivet: action: [42] Create Format mdmember on partition sdd3 (id 28)
>03:51:04,688 DEBUG blivet: action: [19] Create Format msdos disklabel on disk sda (id 1)
>03:51:04,688 DEBUG blivet: action: [61] Create Device partition sda1 (id 37)
>03:51:04,688 DEBUG blivet: action: [62] Create Format mdmember on partition sda1 (id 37)
>03:51:04,689 DEBUG blivet: action: [69] Create Device mdarray root (id 41)
>03:51:04,691 DEBUG blivet: action: [70] Create Format ext4 filesystem mounted at / on mdarray root (id 41)
>03:51:04,692 DEBUG blivet: action: [48] Create Device partition sda2 (id 31)
>03:51:04,692 DEBUG blivet: action: [49] Create Format mdmember on partition sda2 (id 31)
>03:51:04,693 DEBUG blivet: action: [56] Create Device mdarray boot (id 35)
>03:51:04,693 DEBUG blivet: action: [57] Create Format ext4 filesystem mounted at /boot on mdarray boot (id 35)
>03:51:04,694 DEBUG blivet: action: [35] Create Device partition sda3 (id 25)
>03:51:04,694 DEBUG blivet: action: [36] Create Format mdmember on partition sda3 (id 25)
>03:51:04,695 DEBUG blivet: action: [43] Create Device mdarray swap (id 29)
>03:51:04,695 DEBUG blivet: action: [44] Create Format swap on mdarray swap (id 29)
>03:51:04,695 INFO blivet: pruning action queue...
>03:51:04,699 INFO blivet: sorting actions...
>03:51:04,776 DEBUG blivet: action: [0] Destroy Format btrfs filesystem on btrfs subvolume boot (id 6)
>03:51:04,777 DEBUG blivet: action: [1] Destroy Device btrfs subvolume boot (id 6)
>03:51:04,777 DEBUG blivet: action: [2] Destroy Format btrfs filesystem on btrfs subvolume root (id 7)
>03:51:04,778 DEBUG blivet: action: [3] Destroy Device btrfs subvolume root (id 7)
>03:51:04,778 DEBUG blivet: action: [4] Destroy Format btrfs filesystem on btrfs volume btrfs.5 (id 5)
>03:51:04,778 DEBUG blivet: action: [5] Destroy Device btrfs volume btrfs.5 (id 5)
>03:51:04,779 DEBUG blivet: action: [6] Destroy Format btrfs filesystem on partition sda2 (id 4)
>03:51:04,779 DEBUG blivet: action: [7] Destroy Device partition sda2 (id 4)
>03:51:04,780 DEBUG blivet: action: [8] Destroy Format btrfs filesystem on partition sdd2 (id 10)
>03:51:04,780 DEBUG blivet: action: [9] Destroy Device partition sdd2 (id 10)
>03:51:04,780 DEBUG blivet: action: [10] Destroy Format btrfs filesystem on partition sdc2 (id 13)
>03:51:04,781 DEBUG blivet: action: [11] Destroy Device partition sdc2 (id 13)
>03:51:04,781 DEBUG blivet: action: [12] Destroy Format btrfs filesystem on partition sdb2 (id 16)
>03:51:04,781 DEBUG blivet: action: [13] Destroy Device partition sdb2 (id 16)
>03:51:04,782 DEBUG blivet: action: [14] Destroy Format swap on mdarray dhcppc0:swap (id 3)
>03:51:04,782 DEBUG blivet: action: [15] Destroy Device mdarray dhcppc0:swap (id 3)
>03:51:04,782 DEBUG blivet: action: [16] Destroy Format mdmember on partition sda1 (id 2)
>03:51:04,783 DEBUG blivet: action: [17] Destroy Device partition sda1 (id 2)
>03:51:04,783 DEBUG blivet: action: [18] Destroy Format msdos disklabel on disk sda (id 1)
>03:51:04,784 DEBUG blivet: action: [20] Destroy Format mdmember on partition sdd1 (id 9)
>03:51:04,784 DEBUG blivet: action: [21] Destroy Device partition sdd1 (id 9)
>03:51:04,784 DEBUG blivet: action: [22] Destroy Format msdos disklabel on disk sdd (id 8)
>03:51:04,785 DEBUG blivet: action: [24] Destroy Format mdmember on partition sdc1 (id 12)
>03:51:04,785 DEBUG blivet: action: [25] Destroy Device partition sdc1 (id 12)
>03:51:04,785 DEBUG blivet: action: [26] Destroy Format msdos disklabel on disk sdc (id 11)
>03:51:04,786 DEBUG blivet: action: [28] Destroy Format mdmember on partition sdb1 (id 15)
>03:51:04,786 DEBUG blivet: action: [29] Destroy Device partition sdb1 (id 15)
>03:51:04,787 DEBUG blivet: action: [30] Destroy Format msdos disklabel on disk sdb (id 14)
>03:51:04,787 DEBUG blivet: action: [19] Create Format msdos disklabel on disk sda (id 1)
>03:51:04,787 DEBUG blivet: action: [61] Create Device partition sda1 (id 37)
>03:51:04,788 DEBUG blivet: action: [48] Create Device partition sda2 (id 31)
>03:51:04,788 DEBUG blivet: action: [35] Create Device partition sda3 (id 25)
>03:51:04,788 DEBUG blivet: action: [36] Create Format mdmember on partition sda3 (id 25)
>03:51:04,789 DEBUG blivet: action: [49] Create Format mdmember on partition sda2 (id 31)
>03:51:04,789 DEBUG blivet: action: [62] Create Format mdmember on partition sda1 (id 37)
>03:51:04,789 DEBUG blivet: action: [23] Create Format msdos disklabel on disk sdd (id 8)
>03:51:04,790 DEBUG blivet: action: [67] Create Device partition sdd1 (id 40)
>03:51:04,790 DEBUG blivet: action: [54] Create Device partition sdd2 (id 34)
>03:51:04,790 DEBUG blivet: action: [41] Create Device partition sdd3 (id 28)
>03:51:04,791 DEBUG blivet: action: [42] Create Format mdmember on partition sdd3 (id 28)
>03:51:04,791 DEBUG blivet: action: [55] Create Format mdmember on partition sdd2 (id 34)
>03:51:04,793 DEBUG blivet: action: [68] Create Format mdmember on partition sdd1 (id 40)
>03:51:04,794 DEBUG blivet: action: [27] Create Format msdos disklabel on disk sdc (id 11)
>03:51:04,794 DEBUG blivet: action: [65] Create Device partition sdc1 (id 39)
>03:51:04,794 DEBUG blivet: action: [52] Create Device partition sdc2 (id 33)
>03:51:04,795 DEBUG blivet: action: [39] Create Device partition sdc3 (id 27)
>03:51:04,795 DEBUG blivet: action: [40] Create Format mdmember on partition sdc3 (id 27)
>03:51:04,796 DEBUG blivet: action: [53] Create Format mdmember on partition sdc2 (id 33)
>03:51:04,796 DEBUG blivet: action: [66] Create Format mdmember on partition sdc1 (id 39)
>03:51:04,796 DEBUG blivet: action: [31] Create Format msdos disklabel on disk sdb (id 14)
>03:51:04,797 DEBUG blivet: action: [63] Create Device partition sdb1 (id 38)
>03:51:04,797 DEBUG blivet: action: [50] Create Device partition sdb2 (id 32)
>03:51:04,797 DEBUG blivet: action: [37] Create Device partition sdb3 (id 26)
>03:51:04,798 DEBUG blivet: action: [38] Create Format mdmember on partition sdb3 (id 26)
>03:51:04,798 DEBUG blivet: action: [43] Create Device mdarray swap (id 29)
>03:51:04,798 DEBUG blivet: action: [44] Create Format swap on mdarray swap (id 29)
>03:51:04,799 DEBUG blivet: action: [51] Create Format mdmember on partition sdb2 (id 32)
>03:51:04,799 DEBUG blivet: action: [56] Create Device mdarray boot (id 35)
>03:51:04,799 DEBUG blivet: action: [57] Create Format ext4 filesystem mounted at /boot on mdarray boot (id 35)
>03:51:04,800 DEBUG blivet: action: [64] Create Format mdmember on partition sdb1 (id 38)
>03:51:04,801 DEBUG blivet: action: [69] Create Device mdarray root (id 41)
>03:51:04,801 DEBUG blivet: action: [70] Create Format ext4 filesystem mounted at / on mdarray root (id 41)
>03:51:04,801 INFO blivet: executing action: [0] Destroy Format btrfs filesystem on btrfs subvolume boot (id 6)
>03:51:04,803 DEBUG blivet: BTRFSSubVolumeDevice.setup: boot ; status: True ; controllable: True ; orig: True ;
>03:51:04,841 DEBUG blivet: BTRFSSubVolumeDevice.teardown: boot ; status: True ; controllable: True ;
>03:51:04,843 DEBUG blivet: DeviceFormat.teardown: device: /dev/sda2 ; status: False ; type: None ;
>03:51:04,904 INFO blivet: executing action: [1] Destroy Device btrfs subvolume boot (id 6)
>03:51:04,906 DEBUG blivet: BTRFSSubVolumeDevice.destroy: boot ; status: True ;
>03:51:04,910 DEBUG blivet: BTRFSSubVolumeDevice.teardown: boot ; status: True ; controllable: True ;
>03:51:04,914 DEBUG blivet: DeviceFormat.teardown: device: /dev/sda2 ; status: False ; type: None ;
>03:51:04,949 DEBUG blivet: BTRFSSubVolumeDevice.setupParents: kids: 0 ; name: boot ; orig: True ;
>03:51:04,951 DEBUG blivet: BTRFSVolumeDevice.setup: btrfs.5 ; status: True ; controllable: True ; orig: True ;
>03:51:04,953 DEBUG blivet: BTRFSSubVolumeDevice._destroy: boot ; status: True ;
>03:51:04,970 INFO blivet: failed to get default SELinux context for /tmp/btrfs-tmp.5yIXAwt: [Errno 2] No such file or directory
>03:51:04,973 INFO blivet: set SELinux context for mountpoint /tmp/btrfs-tmp.5yIXAwt to None
>03:51:05,009 INFO blivet: failed to get default SELinux context for /tmp/btrfs-tmp.5yIXAwt: [Errno 2] No such file or directory
>03:51:05,010 INFO blivet: set SELinux context for newly mounted filesystem root at /tmp/btrfs-tmp.5yIXAwt to None
>03:51:08,260 INFO blivet: executing action: [2] Destroy Format btrfs filesystem on btrfs subvolume root (id 7)
>03:51:08,262 DEBUG blivet: BTRFSSubVolumeDevice.setup: root ; status: True ; controllable: True ; orig: True ;
>03:51:08,294 DEBUG blivet: BTRFSSubVolumeDevice.teardown: root ; status: True ; controllable: True ;
>03:51:08,297 DEBUG blivet: DeviceFormat.teardown: device: /dev/sda2 ; status: False ; type: None ;
>03:51:08,360 INFO blivet: executing action: [3] Destroy Device btrfs subvolume root (id 7)
>03:51:08,362 DEBUG blivet: BTRFSSubVolumeDevice.destroy: root ; status: True ;
>03:51:08,364 DEBUG blivet: BTRFSSubVolumeDevice.teardown: root ; status: True ; controllable: True ;
>03:51:08,367 DEBUG blivet: DeviceFormat.teardown: device: /dev/sda2 ; status: False ; type: None ;
>03:51:08,399 DEBUG blivet: BTRFSSubVolumeDevice.setupParents: kids: 0 ; name: root ; orig: True ;
>03:51:08,401 DEBUG blivet: BTRFSVolumeDevice.setup: btrfs.5 ; status: True ; controllable: True ; orig: True ;
>03:51:08,403 DEBUG blivet: BTRFSSubVolumeDevice._destroy: root ; status: True ;
>03:51:08,404 INFO blivet: failed to get default SELinux context for /tmp/btrfs-tmp.5174F_f: [Errno 2] No such file or directory
>03:51:08,404 INFO blivet: set SELinux context for mountpoint /tmp/btrfs-tmp.5174F_f to None
>03:51:08,455 INFO blivet: failed to get default SELinux context for /tmp/btrfs-tmp.5174F_f: [Errno 2] No such file or directory
>03:51:08,456 INFO blivet: set SELinux context for newly mounted filesystem root at /tmp/btrfs-tmp.5174F_f to None
>03:51:21,417 INFO blivet: executing action: [4] Destroy Format btrfs filesystem on btrfs volume btrfs.5 (id 5)
>03:51:21,419 DEBUG blivet: BTRFSVolumeDevice.setup: btrfs.5 ; status: True ; controllable: True ; orig: True ;
>03:51:21,451 DEBUG blivet: BTRFSVolumeDevice.teardown: btrfs.5 ; status: True ; controllable: True ;
>03:51:21,457 DEBUG blivet: DeviceFormat.teardown: device: /dev/sda2 ; status: False ; type: None ;
>03:51:21,515 INFO blivet: executing action: [5] Destroy Device btrfs volume btrfs.5 (id 5)
>03:51:21,517 DEBUG blivet: BTRFSVolumeDevice.destroy: btrfs.5
; status: True ; >03:51:21,519 DEBUG blivet: BTRFSVolumeDevice.teardown: btrfs.5 ; status: True ; controllable: True ; >03:51:21,522 DEBUG blivet: DeviceFormat.teardown: device: /dev/sda2 ; status: False ; type: None ; >03:51:21,555 DEBUG blivet: BTRFSVolumeDevice.setupParents: kids: 0 ; name: btrfs.5 ; orig: True ; >03:51:21,557 DEBUG blivet: PartitionDevice.setup: sda2 ; status: True ; controllable: True ; orig: True ; >03:51:21,559 DEBUG blivet: BTRFS.setup: device: /dev/sda2 ; mountpoint: None ; type: btrfs ; >03:51:21,561 DEBUG blivet: PartitionDevice.setup: sdd2 ; status: True ; controllable: True ; orig: True ; >03:51:21,562 DEBUG blivet: BTRFS.setup: device: /dev/sdd2 ; mountpoint: None ; type: btrfs ; >03:51:21,566 DEBUG blivet: PartitionDevice.setup: sdc2 ; status: True ; controllable: True ; orig: True ; >03:51:21,568 DEBUG blivet: BTRFS.setup: device: /dev/sdc2 ; mountpoint: None ; type: btrfs ; >03:51:21,570 DEBUG blivet: PartitionDevice.setup: sdb2 ; status: True ; controllable: True ; orig: True ; >03:51:21,571 DEBUG blivet: BTRFS.setup: device: /dev/sdb2 ; mountpoint: None ; type: btrfs ; >03:51:21,573 DEBUG blivet: BTRFSVolumeDevice._destroy: btrfs.5 ; status: True ; >03:51:21,575 DEBUG blivet: PartitionDevice.setup: sda2 ; status: True ; controllable: True ; orig: True ; >03:51:21,576 DEBUG blivet: DeviceFormat.destroy: device: /dev/sda2 ; status: False ; type: None ; >03:51:21,685 DEBUG blivet: PartitionDevice.setup: sdd2 ; status: True ; controllable: True ; orig: True ; >03:51:21,690 DEBUG blivet: DeviceFormat.destroy: device: /dev/sdd2 ; status: False ; type: None ; >03:51:21,751 DEBUG blivet: PartitionDevice.setup: sdc2 ; status: True ; controllable: True ; orig: True ; >03:51:21,753 DEBUG blivet: DeviceFormat.destroy: device: /dev/sdc2 ; status: False ; type: None ; >03:51:21,844 DEBUG blivet: PartitionDevice.setup: sdb2 ; status: True ; controllable: True ; orig: True ; >03:51:21,846 DEBUG blivet: DeviceFormat.destroy: device: /dev/sdb2 ; 
status: False ; type: None ; >03:51:21,966 INFO blivet: executing action: [6] Destroy Format btrfs filesystem on partition sda2 (id 4) >03:51:21,968 DEBUG blivet: PartitionDevice.setup: sda2 ; status: True ; controllable: True ; orig: True ; >03:51:22,003 DEBUG blivet: PartitionDevice.teardown: sda2 ; status: True ; controllable: True ; >03:51:22,006 DEBUG blivet: DeviceFormat.teardown: device: /dev/sda2 ; status: False ; type: None ; >03:51:22,067 INFO blivet: executing action: [7] Destroy Device partition sda2 (id 4) >03:51:22,071 DEBUG blivet: PartitionDevice.destroy: sda2 ; status: True ; >03:51:22,073 DEBUG blivet: PartitionDevice.teardown: sda2 ; status: True ; controllable: True ; >03:51:22,076 DEBUG blivet: DeviceFormat.teardown: device: /dev/sda2 ; status: False ; type: None ; >03:51:22,111 DEBUG blivet: PartitionDevice.setupParents: kids: 0 ; name: sda2 ; orig: True ; >03:51:22,113 DEBUG blivet: DiskDevice.setup: sda ; status: True ; controllable: True ; orig: True ; >03:51:22,115 DEBUG blivet: DiskLabel.setup: device: /dev/sda ; status: False ; type: disklabel ; >03:51:22,118 DEBUG blivet: DiskLabel.setup: device: /dev/sda ; status: False ; type: disklabel ; >03:51:22,120 DEBUG blivet: PartitionDevice._destroy: sda2 ; status: True ; >03:51:22,124 DEBUG blivet: DiskLabel.commit: device: /dev/sda ; numparts: 1 ; >03:51:22,249 INFO blivet: executing action: [8] Destroy Format btrfs filesystem on partition sdd2 (id 10) >03:51:22,251 DEBUG blivet: PartitionDevice.setup: sdd2 ; status: True ; controllable: True ; orig: True ; >03:51:22,285 DEBUG blivet: PartitionDevice.teardown: sdd2 ; status: True ; controllable: True ; >03:51:22,290 DEBUG blivet: DeviceFormat.teardown: device: /dev/sdd2 ; status: False ; type: None ; >03:51:22,353 INFO blivet: executing action: [9] Destroy Device partition sdd2 (id 10) >03:51:22,355 DEBUG blivet: PartitionDevice.destroy: sdd2 ; status: True ; >03:51:22,359 DEBUG blivet: PartitionDevice.teardown: sdd2 ; status: True ; 
controllable: True ; >03:51:22,362 DEBUG blivet: DeviceFormat.teardown: device: /dev/sdd2 ; status: False ; type: None ; >03:51:22,397 DEBUG blivet: PartitionDevice.setupParents: kids: 0 ; name: sdd2 ; orig: True ; >03:51:22,399 DEBUG blivet: DiskDevice.setup: sdd ; status: True ; controllable: True ; orig: True ; >03:51:22,401 DEBUG blivet: DiskLabel.setup: device: /dev/sdd ; status: False ; type: disklabel ; >03:51:22,402 DEBUG blivet: DiskLabel.setup: device: /dev/sdd ; status: False ; type: disklabel ; >03:51:22,404 DEBUG blivet: PartitionDevice._destroy: sdd2 ; status: True ; >03:51:22,407 DEBUG blivet: DiskLabel.commit: device: /dev/sdd ; numparts: 1 ; >03:51:22,540 INFO blivet: executing action: [10] Destroy Format btrfs filesystem on partition sdc2 (id 13) >03:51:22,542 DEBUG blivet: PartitionDevice.setup: sdc2 ; status: True ; controllable: True ; orig: True ; >03:51:22,575 DEBUG blivet: PartitionDevice.teardown: sdc2 ; status: True ; controllable: True ; >03:51:22,578 DEBUG blivet: DeviceFormat.teardown: device: /dev/sdc2 ; status: False ; type: None ; >03:51:22,641 INFO blivet: executing action: [11] Destroy Device partition sdc2 (id 13) >03:51:22,645 DEBUG blivet: PartitionDevice.destroy: sdc2 ; status: True ; >03:51:22,647 DEBUG blivet: PartitionDevice.teardown: sdc2 ; status: True ; controllable: True ; >03:51:22,650 DEBUG blivet: DeviceFormat.teardown: device: /dev/sdc2 ; status: False ; type: None ; >03:51:22,683 DEBUG blivet: PartitionDevice.setupParents: kids: 0 ; name: sdc2 ; orig: True ; >03:51:22,685 DEBUG blivet: DiskDevice.setup: sdc ; status: True ; controllable: True ; orig: True ; >03:51:22,687 DEBUG blivet: DiskLabel.setup: device: /dev/sdc ; status: False ; type: disklabel ; >03:51:22,689 DEBUG blivet: DiskLabel.setup: device: /dev/sdc ; status: False ; type: disklabel ; >03:51:22,690 DEBUG blivet: PartitionDevice._destroy: sdc2 ; status: True ; >03:51:22,693 DEBUG blivet: DiskLabel.commit: device: /dev/sdc ; numparts: 1 ; >03:51:22,833 
INFO blivet: executing action: [12] Destroy Format btrfs filesystem on partition sdb2 (id 16) >03:51:22,835 DEBUG blivet: PartitionDevice.setup: sdb2 ; status: True ; controllable: True ; orig: True ; >03:51:22,867 DEBUG blivet: PartitionDevice.teardown: sdb2 ; status: True ; controllable: True ; >03:51:22,869 DEBUG blivet: DeviceFormat.teardown: device: /dev/sdb2 ; status: False ; type: None ; >03:51:22,933 INFO blivet: executing action: [13] Destroy Device partition sdb2 (id 16) >03:51:22,935 DEBUG blivet: PartitionDevice.destroy: sdb2 ; status: True ; >03:51:22,937 DEBUG blivet: PartitionDevice.teardown: sdb2 ; status: True ; controllable: True ; >03:51:22,940 DEBUG blivet: DeviceFormat.teardown: device: /dev/sdb2 ; status: False ; type: None ; >03:51:22,974 DEBUG blivet: PartitionDevice.setupParents: kids: 0 ; name: sdb2 ; orig: True ; >03:51:22,976 DEBUG blivet: DiskDevice.setup: sdb ; status: True ; controllable: True ; orig: True ; >03:51:22,978 DEBUG blivet: DiskLabel.setup: device: /dev/sdb ; status: False ; type: disklabel ; >03:51:22,980 DEBUG blivet: DiskLabel.setup: device: /dev/sdb ; status: False ; type: disklabel ; >03:51:22,981 DEBUG blivet: PartitionDevice._destroy: sdb2 ; status: True ; >03:51:22,984 DEBUG blivet: DiskLabel.commit: device: /dev/sdb ; numparts: 1 ; >03:51:23,131 INFO blivet: executing action: [14] Destroy Format swap on mdarray dhcppc0:swap (id 3) >03:51:23,133 DEBUG blivet: MDRaidArrayDevice.setup: dhcppc0:swap ; status: False ; controllable: True ; orig: True ; >03:51:23,136 DEBUG blivet: MDRaidArrayDevice.setupParents: kids: 0 ; name: dhcppc0:swap ; orig: True ; >03:51:23,138 DEBUG blivet: PartitionDevice.setup: sda1 ; status: True ; controllable: True ; orig: True ; >03:51:23,140 DEBUG blivet: MDRaidMember.setup: device: /dev/sda1 ; status: False ; type: mdmember ; >03:51:23,143 DEBUG blivet: PartitionDevice.setup: sdd1 ; status: True ; controllable: True ; orig: True ; >03:51:23,145 DEBUG blivet: MDRaidMember.setup: device: 
/dev/sdd1 ; status: False ; type: mdmember ; >03:51:23,147 DEBUG blivet: PartitionDevice.setup: sdc1 ; status: True ; controllable: True ; orig: True ; >03:51:23,148 DEBUG blivet: MDRaidMember.setup: device: /dev/sdc1 ; status: False ; type: mdmember ; >03:51:23,150 DEBUG blivet: PartitionDevice.setup: sdb1 ; status: True ; controllable: True ; orig: True ; >03:51:23,152 DEBUG blivet: MDRaidMember.setup: device: /dev/sdb1 ; status: False ; type: mdmember ; >03:51:23,153 DEBUG blivet: MDRaidArrayDevice._setup: dhcppc0:swap ; status: False ; controllable: True ; orig: True ; >03:51:23,155 DEBUG blivet: PartitionDevice.setup: sda1 ; status: True ; controllable: True ; orig: True ; >03:51:23,157 DEBUG blivet: PartitionDevice.setup: sdd1 ; status: True ; controllable: True ; orig: True ; >03:51:23,160 DEBUG blivet: PartitionDevice.setup: sdc1 ; status: True ; controllable: True ; orig: True ; >03:51:23,162 DEBUG blivet: PartitionDevice.setup: sdb1 ; status: True ; controllable: True ; orig: True ; >03:51:23,486 DEBUG blivet: MDRaidArrayDevice.updateSysfsPath: dhcppc0:swap ; status: True ; >03:51:23,488 DEBUG blivet: SwapSpace.destroy: device: /dev/md/dhcppc0:swap ; status: False ; type: swap ; >03:51:23,773 DEBUG blivet: MDRaidArrayDevice.teardown: dhcppc0:swap ; status: True ; controllable: True ; >03:51:23,775 DEBUG blivet: SwapSpace.teardown: device: /dev/md/dhcppc0:swap ; status: False ; type: swap ; >03:51:23,779 DEBUG blivet: DeviceFormat.teardown: device: /dev/md/dhcppc0:swap ; status: False ; type: None ; >03:51:24,075 INFO blivet: executing action: [15] Destroy Device mdarray dhcppc0:swap (id 3) >03:51:24,078 DEBUG blivet: MDRaidArrayDevice.destroy: dhcppc0:swap ; status: False ; >03:51:24,079 DEBUG blivet: MDRaidArrayDevice.teardown: dhcppc0:swap ; status: False ; controllable: True ; >03:51:24,112 INFO blivet: executing action: [16] Destroy Format mdmember on partition sda1 (id 2) >03:51:24,114 DEBUG blivet: PartitionDevice.setup: sda1 ; status: True ; 
controllable: True ; orig: True ; >03:51:24,233 DEBUG blivet: PartitionDevice.teardown: sda1 ; status: True ; controllable: True ; >03:51:24,235 DEBUG blivet: MDRaidMember.teardown: device: /dev/sda1 ; status: False ; type: mdmember ; >03:51:24,238 DEBUG blivet: DeviceFormat.teardown: device: /dev/sda1 ; status: False ; type: None ; >03:51:24,308 INFO blivet: executing action: [17] Destroy Device partition sda1 (id 2) >03:51:24,310 DEBUG blivet: PartitionDevice.destroy: sda1 ; status: True ; >03:51:24,312 DEBUG blivet: PartitionDevice.teardown: sda1 ; status: True ; controllable: True ; >03:51:24,314 DEBUG blivet: MDRaidMember.teardown: device: /dev/sda1 ; status: False ; type: mdmember ; >03:51:24,317 DEBUG blivet: DeviceFormat.teardown: device: /dev/sda1 ; status: False ; type: None ; >03:51:24,351 DEBUG blivet: PartitionDevice.setupParents: kids: 0 ; name: sda1 ; orig: True ; >03:51:24,355 DEBUG blivet: DiskDevice.setup: sda ; status: True ; controllable: True ; orig: True ; >03:51:24,357 DEBUG blivet: DiskLabel.setup: device: /dev/sda ; status: False ; type: disklabel ; >03:51:24,359 DEBUG blivet: DiskLabel.setup: device: /dev/sda ; status: False ; type: disklabel ; >03:51:24,360 DEBUG blivet: PartitionDevice._destroy: sda1 ; status: True ; >03:51:24,363 DEBUG blivet: DiskLabel.commit: device: /dev/sda ; numparts: 0 ; >03:51:24,467 INFO blivet: executing action: [18] Destroy Format msdos disklabel on disk sda (id 1) >03:51:24,469 DEBUG blivet: DiskDevice.setup: sda ; status: True ; controllable: True ; orig: True ; >03:51:24,471 DEBUG blivet: DiskLabel.destroy: device: /dev/sda ; status: False ; type: disklabel ; >03:51:24,561 DEBUG blivet: DiskDevice.teardown: sda ; status: True ; controllable: True ; >03:51:24,563 DEBUG blivet: DiskLabel.teardown: device: /dev/sda ; status: False ; type: disklabel ; >03:51:24,629 INFO blivet: executing action: [20] Destroy Format mdmember on partition sdd1 (id 9) >03:51:24,631 DEBUG blivet: PartitionDevice.setup: sdd1 ; 
status: True ; controllable: True ; orig: True ; >03:51:24,729 DEBUG blivet: PartitionDevice.teardown: sdd1 ; status: True ; controllable: True ; >03:51:24,731 DEBUG blivet: MDRaidMember.teardown: device: /dev/sdd1 ; status: False ; type: mdmember ; >03:51:24,734 DEBUG blivet: DeviceFormat.teardown: device: /dev/sdd1 ; status: False ; type: None ; >03:51:24,796 INFO blivet: executing action: [21] Destroy Device partition sdd1 (id 9) >03:51:24,800 DEBUG blivet: PartitionDevice.destroy: sdd1 ; status: True ; >03:51:24,802 DEBUG blivet: PartitionDevice.teardown: sdd1 ; status: True ; controllable: True ; >03:51:24,804 DEBUG blivet: MDRaidMember.teardown: device: /dev/sdd1 ; status: False ; type: mdmember ; >03:51:24,807 DEBUG blivet: DeviceFormat.teardown: device: /dev/sdd1 ; status: False ; type: None ; >03:51:24,839 DEBUG blivet: PartitionDevice.setupParents: kids: 0 ; name: sdd1 ; orig: True ; >03:51:24,842 DEBUG blivet: DiskDevice.setup: sdd ; status: True ; controllable: True ; orig: True ; >03:51:24,844 DEBUG blivet: DiskLabel.setup: device: /dev/sdd ; status: False ; type: disklabel ; >03:51:24,846 DEBUG blivet: DiskLabel.setup: device: /dev/sdd ; status: False ; type: disklabel ; >03:51:24,848 DEBUG blivet: PartitionDevice._destroy: sdd1 ; status: True ; >03:51:24,850 DEBUG blivet: DiskLabel.commit: device: /dev/sdd ; numparts: 0 ; >03:51:24,933 INFO blivet: executing action: [22] Destroy Format msdos disklabel on disk sdd (id 8) >03:51:24,937 DEBUG blivet: DiskDevice.setup: sdd ; status: True ; controllable: True ; orig: True ; >03:51:24,940 DEBUG blivet: DiskLabel.destroy: device: /dev/sdd ; status: False ; type: disklabel ; >03:51:25,088 DEBUG blivet: DiskDevice.teardown: sdd ; status: True ; controllable: True ; >03:51:25,091 DEBUG blivet: DiskLabel.teardown: device: /dev/sdd ; status: False ; type: disklabel ; >03:51:25,152 INFO blivet: executing action: [24] Destroy Format mdmember on partition sdc1 (id 12) >03:51:25,154 DEBUG blivet: 
PartitionDevice.setup: sdc1 ; status: True ; controllable: True ; orig: True ; >03:51:25,306 DEBUG blivet: PartitionDevice.teardown: sdc1 ; status: True ; controllable: True ; >03:51:25,308 DEBUG blivet: MDRaidMember.teardown: device: /dev/sdc1 ; status: False ; type: mdmember ; >03:51:25,310 DEBUG blivet: DeviceFormat.teardown: device: /dev/sdc1 ; status: False ; type: None ; >03:51:25,374 INFO blivet: executing action: [25] Destroy Device partition sdc1 (id 12) >03:51:25,376 DEBUG blivet: PartitionDevice.destroy: sdc1 ; status: True ; >03:51:25,378 DEBUG blivet: PartitionDevice.teardown: sdc1 ; status: True ; controllable: True ; >03:51:25,380 DEBUG blivet: MDRaidMember.teardown: device: /dev/sdc1 ; status: False ; type: mdmember ; >03:51:25,382 DEBUG blivet: DeviceFormat.teardown: device: /dev/sdc1 ; status: False ; type: None ; >03:51:25,416 DEBUG blivet: PartitionDevice.setupParents: kids: 0 ; name: sdc1 ; orig: True ; >03:51:25,419 DEBUG blivet: DiskDevice.setup: sdc ; status: True ; controllable: True ; orig: True ; >03:51:25,421 DEBUG blivet: DiskLabel.setup: device: /dev/sdc ; status: False ; type: disklabel ; >03:51:25,423 DEBUG blivet: DiskLabel.setup: device: /dev/sdc ; status: False ; type: disklabel ; >03:51:25,425 DEBUG blivet: PartitionDevice._destroy: sdc1 ; status: True ; >03:51:25,427 DEBUG blivet: DiskLabel.commit: device: /dev/sdc ; numparts: 0 ; >03:51:25,578 INFO blivet: executing action: [26] Destroy Format msdos disklabel on disk sdc (id 11) >03:51:25,580 DEBUG blivet: DiskDevice.setup: sdc ; status: True ; controllable: True ; orig: True ; >03:51:25,582 DEBUG blivet: DiskLabel.destroy: device: /dev/sdc ; status: False ; type: disklabel ; >03:51:25,830 DEBUG blivet: DiskDevice.teardown: sdc ; status: True ; controllable: True ; >03:51:25,833 DEBUG blivet: DiskLabel.teardown: device: /dev/sdc ; status: False ; type: disklabel ; >03:51:25,897 INFO blivet: executing action: [28] Destroy Format mdmember on partition sdb1 (id 15) >03:51:25,900 
DEBUG blivet: PartitionDevice.setup: sdb1 ; status: True ; controllable: True ; orig: True ; >03:51:26,058 DEBUG blivet: PartitionDevice.teardown: sdb1 ; status: True ; controllable: True ; >03:51:26,060 DEBUG blivet: MDRaidMember.teardown: device: /dev/sdb1 ; status: False ; type: mdmember ; >03:51:26,063 DEBUG blivet: DeviceFormat.teardown: device: /dev/sdb1 ; status: False ; type: None ; >03:51:26,122 INFO blivet: executing action: [29] Destroy Device partition sdb1 (id 15) >03:51:26,126 DEBUG blivet: PartitionDevice.destroy: sdb1 ; status: True ; >03:51:26,128 DEBUG blivet: PartitionDevice.teardown: sdb1 ; status: True ; controllable: True ; >03:51:26,130 DEBUG blivet: MDRaidMember.teardown: device: /dev/sdb1 ; status: False ; type: mdmember ; >03:51:26,133 DEBUG blivet: DeviceFormat.teardown: device: /dev/sdb1 ; status: False ; type: None ; >03:51:26,165 DEBUG blivet: PartitionDevice.setupParents: kids: 0 ; name: sdb1 ; orig: True ; >03:51:26,167 DEBUG blivet: DiskDevice.setup: sdb ; status: True ; controllable: True ; orig: True ; >03:51:26,169 DEBUG blivet: DiskLabel.setup: device: /dev/sdb ; status: False ; type: disklabel ; >03:51:26,173 DEBUG blivet: DiskLabel.setup: device: /dev/sdb ; status: False ; type: disklabel ; >03:51:26,174 DEBUG blivet: PartitionDevice._destroy: sdb1 ; status: True ; >03:51:26,176 DEBUG blivet: DiskLabel.commit: device: /dev/sdb ; numparts: 0 ; >03:51:26,273 INFO blivet: executing action: [30] Destroy Format msdos disklabel on disk sdb (id 14) >03:51:26,275 DEBUG blivet: DiskDevice.setup: sdb ; status: True ; controllable: True ; orig: True ; >03:51:26,277 DEBUG blivet: DiskLabel.destroy: device: /dev/sdb ; status: False ; type: disklabel ; >03:51:26,396 DEBUG blivet: DiskDevice.teardown: sdb ; status: True ; controllable: True ; >03:51:26,398 DEBUG blivet: DiskLabel.teardown: device: /dev/sdb ; status: False ; type: disklabel ; >03:51:26,461 INFO blivet: executing action: [19] Create Format msdos disklabel on disk sda (id 1) 
>03:51:26,466 DEBUG blivet: DiskDevice.setup: sda ; status: True ; controllable: True ; orig: False ;
>03:51:26,468 DEBUG blivet: DiskLabel.create: device: /dev/sda ; status: False ; type: disklabel ;
>03:51:26,470 DEBUG blivet: DiskLabel.create: device: /dev/sda ; status: False ; type: disklabel ;
>03:51:26,472 DEBUG blivet: DiskLabel.commit: device: /dev/sda ; numparts: 0 ;
>03:51:26,608 DEBUG blivet: DiskDevice.updateSysfsPath: sda ; status: True ;
>03:51:26,609 DEBUG blivet: sda sysfsPath set to /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda
>03:51:26,650 INFO blivet: executing action: [61] Create Device partition sda1 (id 37)
>03:51:26,652 DEBUG blivet: PartitionDevice.create: sda1 ; status: False ;
>03:51:26,655 DEBUG blivet: PartitionDevice.setupParents: kids: 1 ; name: sda1 ; orig: False ;
>03:51:26,659 DEBUG blivet: DiskDevice.setup: sda ; status: True ; controllable: True ; orig: False ;
>03:51:26,665 DEBUG blivet: DiskLabel.setup: device: /dev/sda ; status: False ; type: disklabel ;
>03:51:26,669 DEBUG blivet: DiskLabel.setup: device: /dev/sda ; status: False ; type: disklabel ;
>03:51:26,671 DEBUG blivet: PartitionDevice._create: sda1 ; status: False ;
>03:51:26,676 DEBUG blivet: DiskLabel.commit: device: /dev/sda ; numparts: 1 ;
>03:51:26,751 DEBUG blivet: post-commit partition path is /dev/sda1
>03:51:26,753 DEBUG blivet: PartitionDevice._setPartedPartition: sda1 ;
>03:51:26,754 DEBUG blivet: device sda1 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05b09990> fileSystem: None
> number: 1 path: /dev/sda1 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae05a45450> PedPartition: <_ped.Partition object at 0x7fae04f294d0>
>03:51:26,758 DEBUG blivet: DeviceFormat.destroy: device: /dev/sda1 ; status: False ; type: None ;
>03:51:26,798 DEBUG blivet: PartitionDevice.setup: sda1 ; status: True ; controllable: True ; orig: False ;
>03:51:26,801 DEBUG blivet: PartitionDevice.updateSysfsPath: sda1 ; status: True ;
>03:51:26,803 DEBUG blivet: PartitionDevice.updateSysfsPath: sda1 ; status: True ;
>03:51:26,805 DEBUG blivet: sda1 sysfsPath set to /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda/sda1
>03:51:26,870 INFO blivet: executing action: [48] Create Device partition sda2 (id 31)
>03:51:26,872 DEBUG blivet: PartitionDevice.create: sda2 ; status: False ;
>03:51:26,874 DEBUG blivet: PartitionDevice.setupParents: kids: 1 ; name: sda2 ; orig: False ;
>03:51:26,876 DEBUG blivet: DiskDevice.setup: sda ; status: True ; controllable: True ; orig: False ;
>03:51:26,878 DEBUG blivet: DiskLabel.setup: device: /dev/sda ; status: False ; type: disklabel ;
>03:51:26,882 DEBUG blivet: DiskLabel.setup: device: /dev/sda ; status: False ; type: disklabel ;
>03:51:26,884 DEBUG blivet: PartitionDevice._create: sda2 ; status: False ;
>03:51:26,887 DEBUG blivet: DiskLabel.commit: device: /dev/sda ; numparts: 2 ;
>03:51:26,969 DEBUG blivet: post-commit partition path is /dev/sda2
>03:51:26,971 DEBUG blivet: PartitionDevice._setPartedPartition: sda2 ;
>03:51:26,972 DEBUG blivet: device sda2 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05b09990> fileSystem: None
> number: 2 path: /dev/sda2 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae05af54d0> PedPartition: <_ped.Partition object at 0x7fae04fb3ef0>
>03:51:26,976 DEBUG blivet: DeviceFormat.destroy: device: /dev/sda2 ; status: False ; type: None ;
>03:51:27,033 DEBUG blivet: PartitionDevice.setup: sda2 ; status: True ; controllable: True ; orig: False ;
>03:51:27,034 DEBUG blivet: PartitionDevice.updateSysfsPath: sda2 ; status: True ;
>03:51:27,036 DEBUG blivet: PartitionDevice.updateSysfsPath: sda2 ; status: True ;
>03:51:27,037 DEBUG blivet: sda2 sysfsPath set to /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda/sda2
>03:51:27,097 INFO blivet: executing action: [35] Create Device partition sda3 (id 25)
>03:51:27,099 DEBUG blivet: PartitionDevice.create: sda3 ; status: False ;
>03:51:27,102 DEBUG blivet: PartitionDevice.setupParents: kids: 1 ; name: sda3 ; orig: False ;
>03:51:27,104 DEBUG blivet: DiskDevice.setup: sda ; status: True ; controllable: True ; orig: False ;
>03:51:27,106 DEBUG blivet: DiskLabel.setup: device: /dev/sda ; status: False ; type: disklabel ;
>03:51:27,108 DEBUG blivet: DiskLabel.setup: device: /dev/sda ; status: False ; type: disklabel ;
>03:51:27,111 DEBUG blivet: PartitionDevice._create: sda3 ; status: False ;
>03:51:27,114 DEBUG blivet: DiskLabel.commit: device: /dev/sda ; numparts: 3 ;
>03:51:27,207 DEBUG blivet: post-commit partition path is /dev/sda3
>03:51:27,209 DEBUG blivet: PartitionDevice._setPartedPartition: sda3 ;
>03:51:27,210 DEBUG blivet: device sda3 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05b09990> fileSystem: None
> number: 3 path: /dev/sda3 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae05e06ad0> PedPartition: <_ped.Partition object at 0x7fae04f293b0>
>03:51:27,214 DEBUG blivet: DeviceFormat.destroy: device: /dev/sda3 ; status: False ; type: None ;
>03:51:27,254 DEBUG blivet: PartitionDevice.setup: sda3 ; status: True ; controllable: True ; orig: False ;
>03:51:27,257 DEBUG blivet: PartitionDevice.updateSysfsPath: sda3 ; status: True ;
>03:51:27,259 DEBUG blivet: PartitionDevice.updateSysfsPath: sda3 ; status: True ;
>03:51:27,259 DEBUG blivet: sda3 sysfsPath set to /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda/sda3
>03:51:27,324 INFO blivet: executing action: [36] Create Format mdmember on partition sda3 (id 25)
>03:51:27,327 DEBUG blivet: PartitionDevice.setup: sda3 ; status: True ; controllable: True ; orig: False ;
>03:51:27,328 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda3 ; flag: 1 ;
>03:51:27,329 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda3 ; flag: 2 ;
>03:51:27,331 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda3 ; flag: 3 ;
>03:51:27,332 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda3 ; flag: 4 ;
>03:51:27,333 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda3 ; flag: 6 ;
>03:51:27,335 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda3 ; flag: 8 ;
>03:51:27,338 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda3 ; flag: 9 ;
>03:51:27,339 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda3 ; flag: 10 ;
>03:51:27,341 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda3 ; flag: 11 ;
>03:51:27,342 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda3 ; flag: 12 ;
>03:51:27,343 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda3 ; flag: 13 ;
>03:51:27,344 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda3 ; flag: 14 ;
>03:51:27,346 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda3 ; flag: 15 ;
>03:51:27,347 DEBUG blivet: PartitionDevice.setFlag: path: /dev/sda3 ; flag: 5 ;
>03:51:27,349 DEBUG blivet: DiskLabel.commitToDisk: device: /dev/sda ; numparts: 3 ;
>03:51:27,617 DEBUG blivet: MDRaidMember.create: device: /dev/sda3 ; status: False ; type: mdmember ;
>03:51:27,661 DEBUG blivet: PartitionDevice.updateSysfsPath: sda3 ; status: True ;
>03:51:27,663 DEBUG blivet: PartitionDevice.updateSysfsPath: sda3 ; status: True ;
>03:51:27,663 DEBUG blivet: sda3 sysfsPath set to /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda/sda3
>03:51:27,697 INFO blivet: executing action: [49] Create Format mdmember on partition sda2 (id 31)
>03:51:27,700 DEBUG blivet: PartitionDevice.setup: sda2 ; status: True ; controllable: True ; orig: False ;
>03:51:27,701 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda2 ; flag: 1 ;
>03:51:27,703 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda2 ; flag: 2 ;
>03:51:27,704 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda2 ; flag: 3 ;
>03:51:27,705 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda2 ; flag: 4 ;
>03:51:27,707 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda2 ; flag: 6 ;
>03:51:27,710 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda2 ; flag: 8 ;
>03:51:27,712 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda2 ; flag: 9 ;
>03:51:27,713 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda2 ; flag: 10 ;
>03:51:27,714 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda2 ; flag: 11 ;
>03:51:27,716 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda2 ; flag: 12 ;
>03:51:27,717 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda2 ; flag: 13 ;
>03:51:27,718 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda2 ; flag: 14 ;
>03:51:27,720 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda2 ; flag: 15 ;
>03:51:27,721 DEBUG blivet: PartitionDevice.setFlag: path: /dev/sda2 ; flag: 5 ;
>03:51:27,723 DEBUG blivet: DiskLabel.commitToDisk: device: /dev/sda ; numparts: 3 ;
>03:51:27,752 DEBUG blivet: MDRaidMember.create: device: /dev/sda2 ; status: False ; type: mdmember ;
>03:51:27,794 DEBUG blivet: PartitionDevice.updateSysfsPath: sda2 ; status: True ;
>03:51:27,796 DEBUG blivet: PartitionDevice.updateSysfsPath: sda2 ; status: True ;
>03:51:27,796 DEBUG blivet: sda2 sysfsPath set to /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda/sda2
>03:51:27,831 INFO blivet: executing action: [62] Create Format mdmember on partition sda1 (id 37)
>03:51:27,834 DEBUG blivet: PartitionDevice.setup: sda1 ; status: True ; controllable: True ; orig: False ;
>03:51:27,837 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda1 ; flag: 1 ;
>03:51:27,838 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda1 ; flag: 2 ;
>03:51:27,839 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda1 ; flag: 3 ;
>03:51:27,841 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda1 ; flag: 4 ;
>03:51:27,842 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda1 ; flag: 6 ;
>03:51:27,846 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda1 ; flag: 8 ;
>03:51:27,847 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda1 ; flag: 9 ;
>03:51:27,848 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda1 ; flag: 10 ;
>03:51:27,850 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda1 ; flag: 11 ;
>03:51:27,851 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda1 ; flag: 12 ;
>03:51:27,852 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda1 ; flag: 13 ;
>03:51:27,854 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda1 ; flag: 14 ;
>03:51:27,855 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sda1 ; flag: 15 ;
>03:51:27,856 DEBUG blivet: PartitionDevice.setFlag: path: /dev/sda1 ; flag: 5 ;
>03:51:27,859 DEBUG blivet: DiskLabel.commitToDisk: device: /dev/sda ; numparts: 3 ;
>03:51:27,882 DEBUG blivet: MDRaidMember.create: device: /dev/sda1 ; status: False ; type: mdmember ;
>03:51:27,937 DEBUG blivet: PartitionDevice.updateSysfsPath: sda1 ; status: True ;
>03:51:27,941 DEBUG blivet: PartitionDevice.updateSysfsPath: sda1 ; status: True ;
>03:51:27,943 DEBUG blivet: sda1 sysfsPath set to /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:0/block/sda/sda1
>03:51:27,978 INFO blivet: executing action: [23] Create Format msdos disklabel on disk sdd (id 8)
>03:51:27,981 DEBUG blivet: DiskDevice.setup: sdd ; status: True ; controllable: True ; orig: False ;
>03:51:27,982 DEBUG blivet: DiskLabel.create: device: /dev/sdd ; status: False ; type: disklabel ;
>03:51:27,983 DEBUG blivet: DiskLabel.create: device: /dev/sdd ; status: False ; type: disklabel ;
>03:51:27,985 DEBUG blivet: DiskLabel.commit: device: /dev/sdd ; numparts: 0 ;
>03:51:28,115 DEBUG blivet: DiskDevice.updateSysfsPath: sdd ; status: True ;
>03:51:28,116 DEBUG blivet: sdd sysfsPath set to /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd
>03:51:28,150 INFO blivet: executing action: [67] Create Device partition sdd1 (id 40)
>03:51:28,152 DEBUG blivet: PartitionDevice.create: sdd1 ; status: False ;
>03:51:28,154 DEBUG blivet: PartitionDevice.setupParents: kids: 1 ; name: sdd1 ; orig: False ;
>03:51:28,155 DEBUG blivet: DiskDevice.setup: sdd ; status: True ; controllable: True ; orig: False ;
>03:51:28,157 DEBUG blivet: DiskLabel.setup: device: /dev/sdd ; status: False ; type: disklabel ;
>03:51:28,158 DEBUG blivet: DiskLabel.setup: device: /dev/sdd ; status: False ; type: disklabel ;
>03:51:28,160 DEBUG blivet: PartitionDevice._create: sdd1 ; status: False ;
>03:51:28,162 DEBUG blivet: DiskLabel.commit: device: /dev/sdd ; numparts: 1 ;
>03:51:28,243 DEBUG blivet: post-commit partition path is /dev/sdd1
>03:51:28,245 DEBUG blivet: PartitionDevice._setPartedPartition: sdd1 ;
>03:51:28,246 DEBUG blivet: device sdd1 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05aee150> fileSystem: None
> number: 1 path: /dev/sdd1 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae04f22550> PedPartition: <_ped.Partition object at 0x7fae05a3f290>
>03:51:28,248 DEBUG blivet: DeviceFormat.destroy: device: /dev/sdd1 ; status: False ; type: None ;
>03:51:28,309 DEBUG blivet: PartitionDevice.setup: sdd1 ; status: True ; controllable: True ; orig: False ;
>03:51:28,311 DEBUG blivet: PartitionDevice.updateSysfsPath: sdd1 ; status: True ;
>03:51:28,314 DEBUG blivet: PartitionDevice.updateSysfsPath: sdd1 ; status: True ;
>03:51:28,315 DEBUG blivet: sdd1 sysfsPath set to /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd/sdd1
>03:51:28,375 INFO blivet: executing action: [54] Create Device partition sdd2 (id 34)
>03:51:28,377 DEBUG blivet: PartitionDevice.create: sdd2 ; status: False ;
>03:51:28,379 DEBUG blivet: PartitionDevice.setupParents: kids: 1 ; name: sdd2 ; orig: False ;
>03:51:28,380 DEBUG blivet: DiskDevice.setup: sdd ; status: True ; controllable: True ; orig: False ;
>03:51:28,382 DEBUG blivet: DiskLabel.setup: device: /dev/sdd ; status: False ; type: disklabel ;
>03:51:28,384 DEBUG blivet: DiskLabel.setup: device: /dev/sdd ; status: False ; type: disklabel ;
>03:51:28,387 DEBUG blivet: PartitionDevice._create: sdd2 ; status: False ;
>03:51:28,389 DEBUG blivet: DiskLabel.commit: device: /dev/sdd ; numparts: 2 ;
>03:51:28,531 DEBUG blivet: post-commit partition path is /dev/sdd2
>03:51:28,532 DEBUG blivet: PartitionDevice._setPartedPartition: sdd2 ;
>03:51:28,533 DEBUG blivet: device sdd2 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05aee150> fileSystem: None
> number: 2 path: /dev/sdd2 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae04fb2b90> PedPartition: <_ped.Partition object at 0x7fae05a3f2f0>
>03:51:28,535 DEBUG blivet: DeviceFormat.destroy: device: /dev/sdd2 ; status: False ; type: None ;
>03:51:28,601 DEBUG blivet: PartitionDevice.setup: sdd2 ; status: True ; controllable: True ; orig: False ;
>03:51:28,604 DEBUG blivet: PartitionDevice.updateSysfsPath: sdd2 ; status: True ;
>03:51:28,605 DEBUG blivet: PartitionDevice.updateSysfsPath: sdd2 ; status: True ;
>03:51:28,606 DEBUG blivet: sdd2 sysfsPath set to /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd/sdd2
>03:51:28,666 INFO blivet: executing action: [41] Create Device partition sdd3 (id 28)
>03:51:28,670 DEBUG blivet: PartitionDevice.create: sdd3 ; status: False ;
>03:51:28,672 DEBUG blivet: PartitionDevice.setupParents: kids: 1 ; name: sdd3 ; orig: False ;
>03:51:28,674 DEBUG blivet: DiskDevice.setup: sdd ; status: True ; controllable: True ; orig: False ;
>03:51:28,675 DEBUG blivet: DiskLabel.setup: device: /dev/sdd ; status: False ; type: disklabel ;
>03:51:28,677 DEBUG blivet: DiskLabel.setup: device: /dev/sdd ; status: False ; type: disklabel ;
>03:51:28,678 DEBUG blivet: PartitionDevice._create: sdd3 ; status: False ;
>03:51:28,681 DEBUG blivet: DiskLabel.commit: device: /dev/sdd ; numparts: 3 ;
>03:51:28,840 DEBUG blivet: post-commit partition path is /dev/sdd3
>03:51:28,842 DEBUG blivet: PartitionDevice._setPartedPartition: sdd3 ;
>03:51:28,843 DEBUG blivet: device sdd3 new partedPartition parted.Partition instance --
> disk: <parted.disk.Disk object at 0x7fae05aee150> fileSystem: None
> number: 3 path: /dev/sdd3 type: 0
> name: None active: True busy: False
> geometry: <parted.geometry.Geometry object at 0x7fae05a48110> PedPartition: <_ped.Partition object at 0x7fae04f8e3b0>
>03:51:28,844 DEBUG blivet: DeviceFormat.destroy: device: /dev/sdd3 ; status: False ; type: None ;
>03:51:28,895 DEBUG blivet: PartitionDevice.setup: sdd3 ; status: True ; controllable: True ; orig: False ;
>03:51:28,897 DEBUG blivet: PartitionDevice.updateSysfsPath: sdd3 ; status: True ;
>03:51:28,898 DEBUG blivet: PartitionDevice.updateSysfsPath: sdd3 ; status: True ;
>03:51:28,899 DEBUG blivet: sdd3 sysfsPath set to /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd/sdd3
>03:51:28,960 INFO blivet: executing action: [42] Create Format mdmember on partition sdd3 (id 28)
>03:51:28,963 DEBUG blivet: PartitionDevice.setup: sdd3 ; status: True ; controllable: True ; orig: False ;
>03:51:28,965 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd3 ; flag: 1 ;
>03:51:28,968 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd3 ; flag: 2 ;
>03:51:28,969 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd3 ; flag: 3 ;
>03:51:28,971 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd3 ; flag: 4 ;
>03:51:28,972 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd3 ; flag: 6 ;
>03:51:28,974 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd3 ; flag: 8 ;
>03:51:28,975 DEBUG blivet: PartitionDevice.unsetFlag: path: 
/dev/sdd3 ; flag: 9 ; >03:51:28,976 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd3 ; flag: 10 ; >03:51:28,978 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd3 ; flag: 11 ; >03:51:28,979 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd3 ; flag: 12 ; >03:51:28,980 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd3 ; flag: 13 ; >03:51:28,983 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd3 ; flag: 14 ; >03:51:28,985 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd3 ; flag: 15 ; >03:51:28,986 DEBUG blivet: PartitionDevice.setFlag: path: /dev/sdd3 ; flag: 5 ; >03:51:28,988 DEBUG blivet: DiskLabel.commitToDisk: device: /dev/sdd ; numparts: 3 ; >03:51:29,057 DEBUG blivet: MDRaidMember.create: device: /dev/sdd3 ; status: False ; type: mdmember ; >03:51:29,094 DEBUG blivet: PartitionDevice.updateSysfsPath: sdd3 ; status: True ; >03:51:29,095 DEBUG blivet: PartitionDevice.updateSysfsPath: sdd3 ; status: True ; >03:51:29,096 DEBUG blivet: sdd3 sysfsPath set to /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd/sdd3 >03:51:29,129 INFO blivet: executing action: [55] Create Format mdmember on partition sdd2 (id 34) >03:51:29,131 DEBUG blivet: PartitionDevice.setup: sdd2 ; status: True ; controllable: True ; orig: False ; >03:51:29,133 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd2 ; flag: 1 ; >03:51:29,134 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd2 ; flag: 2 ; >03:51:29,136 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd2 ; flag: 3 ; >03:51:29,137 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd2 ; flag: 4 ; >03:51:29,139 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd2 ; flag: 6 ; >03:51:29,140 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd2 ; flag: 8 ; >03:51:29,141 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd2 ; flag: 9 ; >03:51:29,145 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd2 ; flag: 10 ; >03:51:29,146 DEBUG blivet: 
PartitionDevice.unsetFlag: path: /dev/sdd2 ; flag: 11 ; >03:51:29,147 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd2 ; flag: 12 ; >03:51:29,150 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd2 ; flag: 13 ; >03:51:29,152 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd2 ; flag: 14 ; >03:51:29,153 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd2 ; flag: 15 ; >03:51:29,154 DEBUG blivet: PartitionDevice.setFlag: path: /dev/sdd2 ; flag: 5 ; >03:51:29,156 DEBUG blivet: DiskLabel.commitToDisk: device: /dev/sdd ; numparts: 3 ; >03:51:29,176 DEBUG blivet: MDRaidMember.create: device: /dev/sdd2 ; status: False ; type: mdmember ; >03:51:29,224 DEBUG blivet: PartitionDevice.updateSysfsPath: sdd2 ; status: True ; >03:51:29,225 DEBUG blivet: PartitionDevice.updateSysfsPath: sdd2 ; status: True ; >03:51:29,226 DEBUG blivet: sdd2 sysfsPath set to /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd/sdd2 >03:51:29,258 INFO blivet: executing action: [68] Create Format mdmember on partition sdd1 (id 40) >03:51:29,261 DEBUG blivet: PartitionDevice.setup: sdd1 ; status: True ; controllable: True ; orig: False ; >03:51:29,263 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd1 ; flag: 1 ; >03:51:29,264 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd1 ; flag: 2 ; >03:51:29,265 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd1 ; flag: 3 ; >03:51:29,267 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd1 ; flag: 4 ; >03:51:29,271 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd1 ; flag: 6 ; >03:51:29,273 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd1 ; flag: 8 ; >03:51:29,274 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd1 ; flag: 9 ; >03:51:29,276 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd1 ; flag: 10 ; >03:51:29,277 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd1 ; flag: 11 ; >03:51:29,278 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd1 ; flag: 12 ; 
>03:51:29,280 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd1 ; flag: 13 ; >03:51:29,281 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd1 ; flag: 14 ; >03:51:29,283 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdd1 ; flag: 15 ; >03:51:29,284 DEBUG blivet: PartitionDevice.setFlag: path: /dev/sdd1 ; flag: 5 ; >03:51:29,288 DEBUG blivet: DiskLabel.commitToDisk: device: /dev/sdd ; numparts: 3 ; >03:51:29,308 DEBUG blivet: MDRaidMember.create: device: /dev/sdd1 ; status: False ; type: mdmember ; >03:51:29,347 DEBUG blivet: PartitionDevice.updateSysfsPath: sdd1 ; status: True ; >03:51:29,349 DEBUG blivet: PartitionDevice.updateSysfsPath: sdd1 ; status: True ; >03:51:29,350 DEBUG blivet: sdd1 sysfsPath set to /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:1/block/sdd/sdd1 >03:51:29,384 INFO blivet: executing action: [27] Create Format msdos disklabel on disk sdc (id 11) >03:51:29,387 DEBUG blivet: DiskDevice.setup: sdc ; status: True ; controllable: True ; orig: False ; >03:51:29,388 DEBUG blivet: DiskLabel.create: device: /dev/sdc ; status: False ; type: disklabel ; >03:51:29,389 DEBUG blivet: DiskLabel.create: device: /dev/sdc ; status: False ; type: disklabel ; >03:51:29,391 DEBUG blivet: DiskLabel.commit: device: /dev/sdc ; numparts: 0 ; >03:51:29,546 DEBUG blivet: DiskDevice.updateSysfsPath: sdc ; status: True ; >03:51:29,547 DEBUG blivet: sdc sysfsPath set to /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc >03:51:29,579 INFO blivet: executing action: [65] Create Device partition sdc1 (id 39) >03:51:29,582 DEBUG blivet: PartitionDevice.create: sdc1 ; status: False ; >03:51:29,583 DEBUG blivet: PartitionDevice.setupParents: kids: 1 ; name: sdc1 ; orig: False ; >03:51:29,585 DEBUG blivet: DiskDevice.setup: sdc ; status: True ; controllable: True ; orig: False ; >03:51:29,586 DEBUG blivet: DiskLabel.setup: device: /dev/sdc ; status: False ; type: disklabel ; >03:51:29,588 DEBUG blivet: 
DiskLabel.setup: device: /dev/sdc ; status: False ; type: disklabel ; >03:51:29,590 DEBUG blivet: PartitionDevice._create: sdc1 ; status: False ; >03:51:29,592 DEBUG blivet: DiskLabel.commit: device: /dev/sdc ; numparts: 1 ; >03:51:29,657 DEBUG blivet: post-commit partition path is /dev/sdc1 >03:51:29,659 DEBUG blivet: PartitionDevice._setPartedPartition: sdc1 ; >03:51:29,659 DEBUG blivet: device sdc1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee950> fileSystem: None > number: 1 path: /dev/sdc1 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae05a51c50> PedPartition: <_ped.Partition object at 0x7fae05a3f170> >03:51:29,661 DEBUG blivet: DeviceFormat.destroy: device: /dev/sdc1 ; status: False ; type: None ; >03:51:29,716 DEBUG blivet: PartitionDevice.setup: sdc1 ; status: True ; controllable: True ; orig: False ; >03:51:29,718 DEBUG blivet: PartitionDevice.updateSysfsPath: sdc1 ; status: True ; >03:51:29,720 DEBUG blivet: PartitionDevice.updateSysfsPath: sdc1 ; status: True ; >03:51:29,720 DEBUG blivet: sdc1 sysfsPath set to /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc/sdc1 >03:51:29,781 INFO blivet: executing action: [52] Create Device partition sdc2 (id 33) >03:51:29,784 DEBUG blivet: PartitionDevice.create: sdc2 ; status: False ; >03:51:29,788 DEBUG blivet: PartitionDevice.setupParents: kids: 1 ; name: sdc2 ; orig: False ; >03:51:29,789 DEBUG blivet: DiskDevice.setup: sdc ; status: True ; controllable: True ; orig: False ; >03:51:29,791 DEBUG blivet: DiskLabel.setup: device: /dev/sdc ; status: False ; type: disklabel ; >03:51:29,792 DEBUG blivet: DiskLabel.setup: device: /dev/sdc ; status: False ; type: disklabel ; >03:51:29,793 DEBUG blivet: PartitionDevice._create: sdc2 ; status: False ; >03:51:29,796 DEBUG blivet: DiskLabel.commit: device: /dev/sdc ; numparts: 2 ; >03:51:29,879 DEBUG blivet: post-commit partition path is /dev/sdc2 
>03:51:29,881 DEBUG blivet: PartitionDevice._setPartedPartition: sdc2 ; >03:51:29,882 DEBUG blivet: device sdc2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee950> fileSystem: None > number: 2 path: /dev/sdc2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04f1ed10> PedPartition: <_ped.Partition object at 0x7fae05a3f350> >03:51:29,884 DEBUG blivet: DeviceFormat.destroy: device: /dev/sdc2 ; status: False ; type: None ; >03:51:29,948 DEBUG blivet: PartitionDevice.setup: sdc2 ; status: True ; controllable: True ; orig: False ; >03:51:29,950 DEBUG blivet: PartitionDevice.updateSysfsPath: sdc2 ; status: True ; >03:51:29,953 DEBUG blivet: PartitionDevice.updateSysfsPath: sdc2 ; status: True ; >03:51:29,953 DEBUG blivet: sdc2 sysfsPath set to /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc/sdc2 >03:51:30,019 INFO blivet: executing action: [39] Create Device partition sdc3 (id 27) >03:51:30,021 DEBUG blivet: PartitionDevice.create: sdc3 ; status: False ; >03:51:30,022 DEBUG blivet: PartitionDevice.setupParents: kids: 1 ; name: sdc3 ; orig: False ; >03:51:30,024 DEBUG blivet: DiskDevice.setup: sdc ; status: True ; controllable: True ; orig: False ; >03:51:30,026 DEBUG blivet: DiskLabel.setup: device: /dev/sdc ; status: False ; type: disklabel ; >03:51:30,027 DEBUG blivet: DiskLabel.setup: device: /dev/sdc ; status: False ; type: disklabel ; >03:51:30,029 DEBUG blivet: PartitionDevice._create: sdc3 ; status: False ; >03:51:30,033 DEBUG blivet: DiskLabel.commit: device: /dev/sdc ; numparts: 3 ; >03:51:30,183 DEBUG blivet: post-commit partition path is /dev/sdc3 >03:51:30,192 DEBUG blivet: PartitionDevice._setPartedPartition: sdc3 ; >03:51:30,194 DEBUG blivet: device sdc3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aee950> fileSystem: None > number: 3 path: /dev/sdc3 type: 0 > name: None active: True 
busy: False > geometry: <parted.geometry.Geometry object at 0x7fae05b7e690> PedPartition: <_ped.Partition object at 0x7fae04f8e4d0> >03:51:30,195 DEBUG blivet: DeviceFormat.destroy: device: /dev/sdc3 ; status: False ; type: None ; >03:51:30,257 DEBUG blivet: PartitionDevice.setup: sdc3 ; status: True ; controllable: True ; orig: False ; >03:51:30,260 DEBUG blivet: PartitionDevice.updateSysfsPath: sdc3 ; status: True ; >03:51:30,261 DEBUG blivet: PartitionDevice.updateSysfsPath: sdc3 ; status: True ; >03:51:30,263 DEBUG blivet: sdc3 sysfsPath set to /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc/sdc3 >03:51:30,324 INFO blivet: executing action: [40] Create Format mdmember on partition sdc3 (id 27) >03:51:30,326 DEBUG blivet: PartitionDevice.setup: sdc3 ; status: True ; controllable: True ; orig: False ; >03:51:30,328 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc3 ; flag: 1 ; >03:51:30,329 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc3 ; flag: 2 ; >03:51:30,331 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc3 ; flag: 3 ; >03:51:30,332 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc3 ; flag: 4 ; >03:51:30,335 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc3 ; flag: 6 ; >03:51:30,337 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc3 ; flag: 8 ; >03:51:30,338 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc3 ; flag: 9 ; >03:51:30,339 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc3 ; flag: 10 ; >03:51:30,341 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc3 ; flag: 11 ; >03:51:30,342 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc3 ; flag: 12 ; >03:51:30,343 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc3 ; flag: 13 ; >03:51:30,345 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc3 ; flag: 14 ; >03:51:30,346 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc3 ; flag: 15 ; >03:51:30,348 DEBUG blivet: PartitionDevice.setFlag: path: /dev/sdc3 ; 
flag: 5 ; >03:51:30,352 DEBUG blivet: DiskLabel.commitToDisk: device: /dev/sdc ; numparts: 3 ; >03:51:30,414 DEBUG blivet: MDRaidMember.create: device: /dev/sdc3 ; status: False ; type: mdmember ; >03:51:30,456 DEBUG blivet: PartitionDevice.updateSysfsPath: sdc3 ; status: True ; >03:51:30,457 DEBUG blivet: PartitionDevice.updateSysfsPath: sdc3 ; status: True ; >03:51:30,458 DEBUG blivet: sdc3 sysfsPath set to /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc/sdc3 >03:51:30,491 INFO blivet: executing action: [53] Create Format mdmember on partition sdc2 (id 33) >03:51:30,494 DEBUG blivet: PartitionDevice.setup: sdc2 ; status: True ; controllable: True ; orig: False ; >03:51:30,497 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc2 ; flag: 1 ; >03:51:30,498 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc2 ; flag: 2 ; >03:51:30,499 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc2 ; flag: 3 ; >03:51:30,501 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc2 ; flag: 4 ; >03:51:30,502 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc2 ; flag: 6 ; >03:51:30,503 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc2 ; flag: 8 ; >03:51:30,505 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc2 ; flag: 9 ; >03:51:30,508 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc2 ; flag: 10 ; >03:51:30,509 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc2 ; flag: 11 ; >03:51:30,511 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc2 ; flag: 12 ; >03:51:30,512 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc2 ; flag: 13 ; >03:51:30,513 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc2 ; flag: 14 ; >03:51:30,515 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc2 ; flag: 15 ; >03:51:30,516 DEBUG blivet: PartitionDevice.setFlag: path: /dev/sdc2 ; flag: 5 ; >03:51:30,518 DEBUG blivet: DiskLabel.commitToDisk: device: /dev/sdc ; numparts: 3 ; >03:51:30,539 DEBUG blivet: 
MDRaidMember.create: device: /dev/sdc2 ; status: False ; type: mdmember ; >03:51:30,580 DEBUG blivet: PartitionDevice.updateSysfsPath: sdc2 ; status: True ; >03:51:30,581 DEBUG blivet: PartitionDevice.updateSysfsPath: sdc2 ; status: True ; >03:51:30,582 DEBUG blivet: sdc2 sysfsPath set to /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc/sdc2 >03:51:30,615 INFO blivet: executing action: [66] Create Format mdmember on partition sdc1 (id 39) >03:51:30,618 DEBUG blivet: PartitionDevice.setup: sdc1 ; status: True ; controllable: True ; orig: False ; >03:51:30,619 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc1 ; flag: 1 ; >03:51:30,621 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc1 ; flag: 2 ; >03:51:30,622 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc1 ; flag: 3 ; >03:51:30,624 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc1 ; flag: 4 ; >03:51:30,625 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc1 ; flag: 6 ; >03:51:30,626 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc1 ; flag: 8 ; >03:51:30,630 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc1 ; flag: 9 ; >03:51:30,631 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc1 ; flag: 10 ; >03:51:30,633 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc1 ; flag: 11 ; >03:51:30,634 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc1 ; flag: 12 ; >03:51:30,635 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc1 ; flag: 13 ; >03:51:30,637 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc1 ; flag: 14 ; >03:51:30,638 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdc1 ; flag: 15 ; >03:51:30,639 DEBUG blivet: PartitionDevice.setFlag: path: /dev/sdc1 ; flag: 5 ; >03:51:30,641 DEBUG blivet: DiskLabel.commitToDisk: device: /dev/sdc ; numparts: 3 ; >03:51:30,731 DEBUG blivet: MDRaidMember.create: device: /dev/sdc1 ; status: False ; type: mdmember ; >03:51:30,787 DEBUG blivet: PartitionDevice.updateSysfsPath: sdc1 ; 
status: True ; >03:51:30,789 DEBUG blivet: PartitionDevice.updateSysfsPath: sdc1 ; status: True ; >03:51:30,791 DEBUG blivet: sdc1 sysfsPath set to /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:2/block/sdc/sdc1 >03:51:30,827 INFO blivet: executing action: [31] Create Format msdos disklabel on disk sdb (id 14) >03:51:30,830 DEBUG blivet: DiskDevice.setup: sdb ; status: True ; controllable: True ; orig: False ; >03:51:30,831 DEBUG blivet: DiskLabel.create: device: /dev/sdb ; status: False ; type: disklabel ; >03:51:30,832 DEBUG blivet: DiskLabel.create: device: /dev/sdb ; status: False ; type: disklabel ; >03:51:30,834 DEBUG blivet: DiskLabel.commit: device: /dev/sdb ; numparts: 0 ; >03:51:30,992 DEBUG blivet: DiskDevice.updateSysfsPath: sdb ; status: True ; >03:51:30,993 DEBUG blivet: sdb sysfsPath set to /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb >03:51:31,032 INFO blivet: executing action: [63] Create Device partition sdb1 (id 38) >03:51:31,034 DEBUG blivet: PartitionDevice.create: sdb1 ; status: False ; >03:51:31,036 DEBUG blivet: PartitionDevice.setupParents: kids: 1 ; name: sdb1 ; orig: False ; >03:51:31,037 DEBUG blivet: DiskDevice.setup: sdb ; status: True ; controllable: True ; orig: False ; >03:51:31,039 DEBUG blivet: DiskLabel.setup: device: /dev/sdb ; status: False ; type: disklabel ; >03:51:31,041 DEBUG blivet: DiskLabel.setup: device: /dev/sdb ; status: False ; type: disklabel ; >03:51:31,044 DEBUG blivet: PartitionDevice._create: sdb1 ; status: False ; >03:51:31,048 DEBUG blivet: DiskLabel.commit: device: /dev/sdb ; numparts: 1 ; >03:51:31,121 DEBUG blivet: post-commit partition path is /dev/sdb1 >03:51:31,124 DEBUG blivet: PartitionDevice._setPartedPartition: sdb1 ; >03:51:31,125 DEBUG blivet: device sdb1 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aeea90> fileSystem: None > number: 1 path: /dev/sdb1 type: 0 > name: None active: True busy: False > 
geometry: <parted.geometry.Geometry object at 0x7fae05a51ed0> PedPartition: <_ped.Partition object at 0x7fae05a3f530> >03:51:31,129 DEBUG blivet: DeviceFormat.destroy: device: /dev/sdb1 ; status: False ; type: None ; >03:51:31,174 DEBUG blivet: PartitionDevice.setup: sdb1 ; status: True ; controllable: True ; orig: False ; >03:51:31,176 DEBUG blivet: PartitionDevice.updateSysfsPath: sdb1 ; status: True ; >03:51:31,179 DEBUG blivet: PartitionDevice.updateSysfsPath: sdb1 ; status: True ; >03:51:31,180 DEBUG blivet: sdb1 sysfsPath set to /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb/sdb1 >03:51:31,261 INFO blivet: executing action: [50] Create Device partition sdb2 (id 32) >03:51:31,263 DEBUG blivet: PartitionDevice.create: sdb2 ; status: False ; >03:51:31,264 DEBUG blivet: PartitionDevice.setupParents: kids: 1 ; name: sdb2 ; orig: False ; >03:51:31,266 DEBUG blivet: DiskDevice.setup: sdb ; status: True ; controllable: True ; orig: False ; >03:51:31,267 DEBUG blivet: DiskLabel.setup: device: /dev/sdb ; status: False ; type: disklabel ; >03:51:31,269 DEBUG blivet: DiskLabel.setup: device: /dev/sdb ; status: False ; type: disklabel ; >03:51:31,270 DEBUG blivet: PartitionDevice._create: sdb2 ; status: False ; >03:51:31,273 DEBUG blivet: DiskLabel.commit: device: /dev/sdb ; numparts: 2 ; >03:51:31,390 DEBUG blivet: post-commit partition path is /dev/sdb2 >03:51:31,392 DEBUG blivet: PartitionDevice._setPartedPartition: sdb2 ; >03:51:31,394 DEBUG blivet: device sdb2 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aeea90> fileSystem: None > number: 2 path: /dev/sdb2 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae05a4da90> PedPartition: <_ped.Partition object at 0x7fae05a3f590> >03:51:31,395 DEBUG blivet: DeviceFormat.destroy: device: /dev/sdb2 ; status: False ; type: None ; >03:51:31,438 DEBUG blivet: PartitionDevice.setup: sdb2 ; status: True ; 
controllable: True ; orig: False ; >03:51:31,442 DEBUG blivet: PartitionDevice.updateSysfsPath: sdb2 ; status: True ; >03:51:31,444 DEBUG blivet: PartitionDevice.updateSysfsPath: sdb2 ; status: True ; >03:51:31,445 DEBUG blivet: sdb2 sysfsPath set to /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb/sdb2 >03:51:31,505 INFO blivet: executing action: [37] Create Device partition sdb3 (id 26) >03:51:31,507 DEBUG blivet: PartitionDevice.create: sdb3 ; status: False ; >03:51:31,512 DEBUG blivet: PartitionDevice.setupParents: kids: 1 ; name: sdb3 ; orig: False ; >03:51:31,514 DEBUG blivet: DiskDevice.setup: sdb ; status: True ; controllable: True ; orig: False ; >03:51:31,516 DEBUG blivet: DiskLabel.setup: device: /dev/sdb ; status: False ; type: disklabel ; >03:51:31,518 DEBUG blivet: DiskLabel.setup: device: /dev/sdb ; status: False ; type: disklabel ; >03:51:31,519 DEBUG blivet: PartitionDevice._create: sdb3 ; status: False ; >03:51:31,522 DEBUG blivet: DiskLabel.commit: device: /dev/sdb ; numparts: 3 ; >03:51:31,629 DEBUG blivet: post-commit partition path is /dev/sdb3 >03:51:31,631 DEBUG blivet: PartitionDevice._setPartedPartition: sdb3 ; >03:51:31,633 DEBUG blivet: device sdb3 new partedPartition parted.Partition instance -- > disk: <parted.disk.Disk object at 0x7fae05aeea90> fileSystem: None > number: 3 path: /dev/sdb3 type: 0 > name: None active: True busy: False > geometry: <parted.geometry.Geometry object at 0x7fae04fb2050> PedPartition: <_ped.Partition object at 0x7fae04f8e2f0> >03:51:31,637 DEBUG blivet: DeviceFormat.destroy: device: /dev/sdb3 ; status: False ; type: None ; >03:51:31,680 DEBUG blivet: PartitionDevice.setup: sdb3 ; status: True ; controllable: True ; orig: False ; >03:51:31,683 DEBUG blivet: PartitionDevice.updateSysfsPath: sdb3 ; status: True ; >03:51:31,689 DEBUG blivet: PartitionDevice.updateSysfsPath: sdb3 ; status: True ; >03:51:31,689 DEBUG blivet: sdb3 sysfsPath set to 
/devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb/sdb3 >03:51:31,751 INFO blivet: executing action: [38] Create Format mdmember on partition sdb3 (id 26) >03:51:31,757 DEBUG blivet: PartitionDevice.setup: sdb3 ; status: True ; controllable: True ; orig: False ; >03:51:31,759 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb3 ; flag: 1 ; >03:51:31,761 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb3 ; flag: 2 ; >03:51:31,762 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb3 ; flag: 3 ; >03:51:31,764 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb3 ; flag: 4 ; >03:51:31,765 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb3 ; flag: 6 ; >03:51:31,767 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb3 ; flag: 8 ; >03:51:31,769 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb3 ; flag: 9 ; >03:51:31,770 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb3 ; flag: 10 ; >03:51:31,774 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb3 ; flag: 11 ; >03:51:31,777 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb3 ; flag: 12 ; >03:51:31,779 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb3 ; flag: 13 ; >03:51:31,780 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb3 ; flag: 14 ; >03:51:31,782 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb3 ; flag: 15 ; >03:51:31,784 DEBUG blivet: PartitionDevice.setFlag: path: /dev/sdb3 ; flag: 5 ; >03:51:31,786 DEBUG blivet: DiskLabel.commitToDisk: device: /dev/sdb ; numparts: 3 ; >03:51:31,814 DEBUG blivet: MDRaidMember.create: device: /dev/sdb3 ; status: False ; type: mdmember ; >03:51:31,854 DEBUG blivet: PartitionDevice.updateSysfsPath: sdb3 ; status: True ; >03:51:31,856 DEBUG blivet: PartitionDevice.updateSysfsPath: sdb3 ; status: True ; >03:51:31,857 DEBUG blivet: sdb3 sysfsPath set to /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb/sdb3 >03:51:31,891 INFO blivet: executing action: [43] Create 
Device mdarray swap (id 29) >03:51:31,893 DEBUG blivet: MDRaidArrayDevice.create: swap ; status: False ; >03:51:31,894 DEBUG blivet: MDRaidArrayDevice.setupParents: kids: 0 ; name: swap ; orig: False ; >03:51:31,896 DEBUG blivet: PartitionDevice.setup: sda3 ; status: True ; controllable: True ; orig: False ; >03:51:31,898 DEBUG blivet: PartitionDevice.setup: sdb3 ; status: True ; controllable: True ; orig: False ; >03:51:31,902 DEBUG blivet: PartitionDevice.setup: sdc3 ; status: True ; controllable: True ; orig: False ; >03:51:31,904 DEBUG blivet: PartitionDevice.setup: sdd3 ; status: True ; controllable: True ; orig: False ; >03:51:31,906 DEBUG blivet: MDRaidArrayDevice._create: swap ; status: False ; >03:51:32,569 DEBUG blivet: MDRaidArrayDevice.setup: swap ; status: True ; controllable: True ; orig: False ; >03:51:32,571 DEBUG blivet: MDRaidArrayDevice.updateSysfsPath: swap ; status: True ; >03:51:32,638 INFO blivet: executing action: [44] Create Format swap on mdarray swap (id 29) >03:51:32,641 DEBUG blivet: MDRaidArrayDevice.setup: swap ; status: True ; controllable: True ; orig: False ; >03:51:32,643 DEBUG blivet: SwapSpace.create: device: /dev/md/swap ; status: None ; type: swap ; >03:51:32,645 DEBUG blivet: SwapSpace.create: device: /dev/md/swap ; status: None ; type: swap ; >03:51:32,853 DEBUG blivet: SwapSpace.notifyKernel: device: /dev/md/swap ; type: swap ; >03:51:32,856 DEBUG blivet: notifying kernel of 'change' event on device /sys/class/block/md127 >03:51:32,914 DEBUG blivet: MDRaidArrayDevice.updateSysfsPath: swap ; status: True ; >03:51:32,946 INFO blivet: executing action: [51] Create Format mdmember on partition sdb2 (id 32) >03:51:32,949 DEBUG blivet: PartitionDevice.setup: sdb2 ; status: True ; controllable: True ; orig: False ; >03:51:32,953 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb2 ; flag: 1 ; >03:51:32,954 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb2 ; flag: 2 ; >03:51:32,956 DEBUG blivet: 
PartitionDevice.unsetFlag: path: /dev/sdb2 ; flag: 3 ; >03:51:32,958 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb2 ; flag: 4 ; >03:51:32,959 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb2 ; flag: 6 ; >03:51:32,960 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb2 ; flag: 8 ; >03:51:32,962 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb2 ; flag: 9 ; >03:51:32,964 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb2 ; flag: 10 ; >03:51:32,965 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb2 ; flag: 11 ; >03:51:32,970 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb2 ; flag: 12 ; >03:51:32,977 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb2 ; flag: 13 ; >03:51:32,981 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb2 ; flag: 14 ; >03:51:32,986 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb2 ; flag: 15 ; >03:51:32,989 DEBUG blivet: PartitionDevice.setFlag: path: /dev/sdb2 ; flag: 5 ; >03:51:32,994 DEBUG blivet: DiskLabel.commitToDisk: device: /dev/sdb ; numparts: 3 ; >03:51:33,024 DEBUG blivet: MDRaidMember.create: device: /dev/sdb2 ; status: False ; type: mdmember ; >03:51:33,065 DEBUG blivet: PartitionDevice.updateSysfsPath: sdb2 ; status: True ; >03:51:33,067 DEBUG blivet: PartitionDevice.updateSysfsPath: sdb2 ; status: True ; >03:51:33,068 DEBUG blivet: sdb2 sysfsPath set to /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb/sdb2 >03:51:33,103 INFO blivet: executing action: [56] Create Device mdarray boot (id 35) >03:51:33,105 DEBUG blivet: MDRaidArrayDevice.create: boot ; status: False ; >03:51:33,107 DEBUG blivet: MDRaidArrayDevice.setupParents: kids: 0 ; name: boot ; orig: False ; >03:51:33,109 DEBUG blivet: PartitionDevice.setup: sda2 ; status: True ; controllable: True ; orig: False ; >03:51:33,110 DEBUG blivet: PartitionDevice.setup: sdb2 ; status: True ; controllable: True ; orig: False ; >03:51:33,116 DEBUG blivet: PartitionDevice.setup: sdc2 ; status: 
True ; controllable: True ; orig: False ; >03:51:33,118 DEBUG blivet: PartitionDevice.setup: sdd2 ; status: True ; controllable: True ; orig: False ; >03:51:33,120 DEBUG blivet: MDRaidArrayDevice._create: boot ; status: False ; >03:51:33,702 DEBUG blivet: MDRaidArrayDevice.setup: boot ; status: True ; controllable: True ; orig: False ; >03:51:33,704 DEBUG blivet: MDRaidArrayDevice.updateSysfsPath: boot ; status: True ; >03:51:33,773 INFO blivet: executing action: [57] Create Format ext4 filesystem mounted at /boot on mdarray boot (id 35) >03:51:33,776 DEBUG blivet: MDRaidArrayDevice.setup: boot ; status: True ; controllable: True ; orig: False ; >03:51:33,778 DEBUG blivet: Ext4FS.create: device: /dev/md/boot ; status: False ; type: ext4 ; >03:51:33,780 DEBUG blivet: Ext4FS.doFormat: device: /dev/md/boot ; mountpoint: /boot ; type: ext4 ; >03:51:37,328 DEBUG blivet: Ext4FS.notifyKernel: device: /dev/md/boot ; type: ext4 ; >03:51:37,329 DEBUG blivet: notifying kernel of 'change' event on device /sys/class/block/md126 >03:51:37,379 DEBUG blivet: MDRaidArrayDevice.updateSysfsPath: boot ; status: True ; >03:51:37,412 INFO blivet: executing action: [64] Create Format mdmember on partition sdb1 (id 38) >03:51:37,415 DEBUG blivet: PartitionDevice.setup: sdb1 ; status: True ; controllable: True ; orig: False ; >03:51:37,418 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb1 ; flag: 1 ; >03:51:37,420 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb1 ; flag: 2 ; >03:51:37,422 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb1 ; flag: 3 ; >03:51:37,423 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb1 ; flag: 4 ; >03:51:37,425 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb1 ; flag: 6 ; >03:51:37,427 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb1 ; flag: 8 ; >03:51:37,429 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb1 ; flag: 9 ; >03:51:37,430 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb1 ; flag: 10 ; 
>03:51:37,432 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb1 ; flag: 11 ; >03:51:37,435 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb1 ; flag: 12 ; >03:51:37,437 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb1 ; flag: 13 ; >03:51:37,439 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb1 ; flag: 14 ; >03:51:37,441 DEBUG blivet: PartitionDevice.unsetFlag: path: /dev/sdb1 ; flag: 15 ; >03:51:37,443 DEBUG blivet: PartitionDevice.setFlag: path: /dev/sdb1 ; flag: 5 ; >03:51:37,446 DEBUG blivet: DiskLabel.commitToDisk: device: /dev/sdb ; numparts: 3 ; >03:51:37,487 DEBUG blivet: MDRaidMember.create: device: /dev/sdb1 ; status: False ; type: mdmember ; >03:51:37,546 DEBUG blivet: PartitionDevice.updateSysfsPath: sdb1 ; status: True ; >03:51:37,548 DEBUG blivet: PartitionDevice.updateSysfsPath: sdb1 ; status: True ; >03:51:37,549 DEBUG blivet: sdb1 sysfsPath set to /devices/pci0000:00/0000:00:06.0/virtio2/host2/target2:0:0/2:0:0:3/block/sdb/sdb1 >03:51:37,591 INFO blivet: executing action: [69] Create Device mdarray root (id 41) >03:51:37,595 DEBUG blivet: MDRaidArrayDevice.create: root ; status: False ; >03:51:37,597 DEBUG blivet: MDRaidArrayDevice.setupParents: kids: 0 ; name: root ; orig: False ; >03:51:37,600 DEBUG blivet: PartitionDevice.setup: sda1 ; status: True ; controllable: True ; orig: False ; >03:51:37,603 DEBUG blivet: PartitionDevice.setup: sdb1 ; status: True ; controllable: True ; orig: False ; >03:51:37,606 DEBUG blivet: PartitionDevice.setup: sdc1 ; status: True ; controllable: True ; orig: False ; >03:51:37,609 DEBUG blivet: PartitionDevice.setup: sdd1 ; status: True ; controllable: True ; orig: False ; >03:51:37,616 DEBUG blivet: MDRaidArrayDevice._create: root ; status: False ; >03:51:38,526 DEBUG blivet: MDRaidArrayDevice.setup: root ; status: True ; controllable: True ; orig: False ; >03:51:38,529 DEBUG blivet: MDRaidArrayDevice.updateSysfsPath: root ; status: True ; >03:51:38,603 INFO blivet: executing action: [70] 
Create Format ext4 filesystem mounted at / on mdarray root (id 41) >03:51:38,606 DEBUG blivet: MDRaidArrayDevice.setup: root ; status: True ; controllable: True ; orig: False ; >03:51:38,610 DEBUG blivet: Ext4FS.create: device: /dev/md/root ; status: False ; type: ext4 ; >03:51:38,612 DEBUG blivet: Ext4FS.doFormat: device: /dev/md/root ; mountpoint: / ; type: ext4 ; >03:51:53,700 DEBUG blivet: Ext4FS.notifyKernel: device: /dev/md/root ; type: ext4 ; >03:51:53,700 DEBUG blivet: notifying kernel of 'change' event on device /sys/class/block/md125 >03:51:53,757 DEBUG blivet: MDRaidArrayDevice.updateSysfsPath: root ; status: True ; >03:51:53,798 INFO blivet: setting boot flag on sda2 >03:51:53,803 DEBUG blivet: PartitionDevice.setFlag: path: /dev/sda2 ; flag: 1 ; >03:51:53,806 DEBUG blivet: DiskDevice.setup: sda ; status: True ; controllable: True ; orig: False ; >03:51:53,811 DEBUG blivet: DiskLabel.commitToDisk: device: /dev/sda ; numparts: 3 ; >03:51:53,845 INFO blivet: setting boot flag on sdb2 >03:51:53,851 DEBUG blivet: PartitionDevice.setFlag: path: /dev/sdb2 ; flag: 1 ; >03:51:53,855 DEBUG blivet: DiskDevice.setup: sdb ; status: True ; controllable: True ; orig: False ; >03:51:53,858 DEBUG blivet: DiskLabel.commitToDisk: device: /dev/sdb ; numparts: 3 ; >03:51:53,885 INFO blivet: setting boot flag on sdc2 >03:51:53,895 DEBUG blivet: PartitionDevice.setFlag: path: /dev/sdc2 ; flag: 1 ; >03:51:53,900 DEBUG blivet: DiskDevice.setup: sdc ; status: True ; controllable: True ; orig: False ; >03:51:53,902 DEBUG blivet: DiskLabel.commitToDisk: device: /dev/sdc ; numparts: 3 ; >03:51:53,952 INFO blivet: setting boot flag on sdd2 >03:51:53,960 DEBUG blivet: PartitionDevice.setFlag: path: /dev/sdd2 ; flag: 1 ; >03:51:53,968 DEBUG blivet: DiskDevice.setup: sdd ; status: True ; controllable: True ; orig: False ; >03:51:53,978 DEBUG blivet: DiskLabel.commitToDisk: device: /dev/sdd ; numparts: 3 ; >03:51:54,035 DEBUG blivet: raw RAID 1 size == 512.0 >03:51:54,036 INFO blivet: 
Using 2.0MB superBlockSize >03:51:54,037 DEBUG blivet: looking up parted Device: /dev/md/boot >03:51:54,043 DEBUG blivet: existing RAID 1 size == 511.9375 >03:51:54,050 DEBUG blivet: Ext4FS.supported: supported: True ; >03:51:54,055 DEBUG blivet: Ext4FS.supported: supported: True ; >03:51:54,060 DEBUG blivet: Ext4FS.supported: supported: True ; >03:51:54,067 DEBUG blivet: raw RAID 10 size == 6008.0 >03:51:54,069 INFO blivet: Using 4MB superBlockSize >03:51:54,071 DEBUG blivet: looking up parted Device: /dev/md/root >03:51:54,077 DEBUG blivet: existing RAID 10 size == 6003.0 >03:51:54,084 DEBUG blivet: Ext4FS.supported: supported: True ; >03:51:54,109 DEBUG blivet: OpticalDevice.mediaPresent: sr0 ; status: True ; >03:51:54,113 DEBUG blivet: Iso9660FS.supported: supported: True ; >03:51:54,118 DEBUG blivet: raw RAID 10 size == 768.0 >03:51:54,118 INFO blivet: Using 0MB superBlockSize >03:51:54,119 DEBUG blivet: looking up parted Device: /dev/md/swap >03:51:54,123 DEBUG blivet: existing RAID 10 size == 767.0 >03:51:54,130 DEBUG blivet: MDRaidArrayDevice.setup: swap ; status: True ; controllable: True ; orig: False ; >03:51:54,137 DEBUG blivet: SwapSpace.setup: device: /dev/md/swap ; status: False ; type: swap ; >03:51:54,141 DEBUG blivet: SwapSpace.setup: device: /dev/md/swap ; status: False ; type: swap ; >03:51:54,179 DEBUG blivet: BindFS.supported: supported: False ; >03:51:54,180 DEBUG blivet: getFormat('bind') returning BindFS instance >03:51:54,182 DEBUG blivet: DirectoryDevice._setFormat: /dev ; current: None ; type: bind ; >03:51:54,184 DEBUG blivet: TmpFS.supported: supported: False ; >03:51:54,185 DEBUG blivet: getFormat('tmpfs') returning TmpFS instance >03:51:54,186 DEBUG blivet: NoDevice._setFormat: tmpfs ; current: None ; type: tmpfs ; >03:51:54,188 DEBUG blivet: DevPtsFS.supported: supported: False ; >03:51:54,189 DEBUG blivet: getFormat('devpts') returning DevPtsFS instance >03:51:54,191 DEBUG blivet: NoDevice._setFormat: devpts ; current: None ; type: 
devpts ; >03:51:54,194 DEBUG blivet: SysFS.supported: supported: False ; >03:51:54,195 DEBUG blivet: getFormat('sysfs') returning SysFS instance >03:51:54,197 DEBUG blivet: NoDevice._setFormat: sysfs ; current: None ; type: sysfs ; >03:51:54,198 DEBUG blivet: ProcFS.supported: supported: False ; >03:51:54,199 DEBUG blivet: getFormat('proc') returning ProcFS instance >03:51:54,201 DEBUG blivet: NoDevice._setFormat: proc ; current: None ; type: proc ; >03:51:54,203 DEBUG blivet: SELinuxFS.supported: supported: False ; >03:51:54,203 DEBUG blivet: getFormat('selinuxfs') returning SELinuxFS instance >03:51:54,205 DEBUG blivet: NoDevice._setFormat: selinuxfs ; current: None ; type: selinuxfs ; >03:51:54,207 DEBUG blivet: USBFS.supported: supported: False ; >03:51:54,208 DEBUG blivet: getFormat('usbfs') returning USBFS instance >03:51:54,212 DEBUG blivet: NoDevice._setFormat: usbfs ; current: None ; type: usbfs ; >03:51:54,214 DEBUG blivet: BindFS.supported: supported: False ; >03:51:54,215 DEBUG blivet: getFormat('bind') returning BindFS instance >03:51:54,216 DEBUG blivet: DirectoryDevice._setFormat: /run ; current: None ; type: bind ; >03:51:54,218 DEBUG blivet: MDRaidArrayDevice.setup: root ; status: True ; controllable: True ; orig: False ; >03:51:54,220 INFO blivet: set SELinux context for mountpoint / to system_u:object_r:root_t:s0 >03:51:54,340 INFO blivet: set SELinux context for newly mounted filesystem root at / to system_u:object_r:root_t:s0 >03:51:54,344 DEBUG blivet: MDRaidArrayDevice.setup: boot ; status: True ; controllable: True ; orig: False ; >03:51:54,347 INFO blivet: set SELinux context for mountpoint /boot to system_u:object_r:boot_t:s0 >03:51:54,505 INFO blivet: set SELinux context for newly mounted filesystem root at /boot to system_u:object_r:boot_t:s0 >03:51:54,507 DEBUG blivet: DirectoryDevice.setup: /dev ; status: True ; controllable: True ; orig: False ; >03:51:54,508 INFO blivet: set SELinux context for mountpoint /dev to 
system_u:object_r:device_t:s0 >03:51:54,536 INFO blivet: set SELinux context for newly mounted filesystem root at /dev to system_u:object_r:device_t:s0 >03:51:54,538 DEBUG blivet: NoDevice.setup: devpts ; status: False ; controllable: True ; orig: False ; >03:51:54,539 INFO blivet: set SELinux context for mountpoint /dev/pts to system_u:object_r:devpts_t:s0 >03:51:54,563 INFO blivet: set SELinux context for newly mounted filesystem root at /dev/pts to system_u:object_r:devpts_t:s0 >03:51:54,566 DEBUG blivet: NoDevice.setup: tmpfs ; status: False ; controllable: True ; orig: False ; >03:51:54,567 INFO blivet: set SELinux context for mountpoint /dev/shm to system_u:object_r:tmpfs_t:s0 >03:51:54,593 INFO blivet: set SELinux context for newly mounted filesystem root at /dev/shm to system_u:object_r:tmpfs_t:s0 >03:51:54,595 DEBUG blivet: NoDevice.setup: proc ; status: False ; controllable: True ; orig: False ; >03:51:54,596 INFO blivet: failed to get default SELinux context for /proc: [Errno 2] No such file or directory >03:51:54,597 INFO blivet: set SELinux context for mountpoint /proc to None >03:51:54,621 INFO blivet: failed to get default SELinux context for /proc: [Errno 2] No such file or directory >03:51:54,621 INFO blivet: set SELinux context for newly mounted filesystem root at /proc to None >03:51:54,685 DEBUG blivet: DirectoryDevice.setup: /run ; status: True ; controllable: True ; orig: False ; >03:51:54,687 INFO blivet: set SELinux context for mountpoint /run to system_u:object_r:var_run_t:s0 >03:51:54,711 INFO blivet: set SELinux context for newly mounted filesystem root at /run to system_u:object_r:var_run_t:s0 >03:51:54,715 DEBUG blivet: NoDevice.setup: sysfs ; status: False ; controllable: True ; orig: False ; >03:51:54,716 INFO blivet: set SELinux context for mountpoint /sys to system_u:object_r:sysfs_t:s0 >03:51:54,741 INFO blivet: set SELinux context for newly mounted filesystem root at /sys to system_u:object_r:sysfs_t:s0 >03:51:54,743 DEBUG blivet: 
NoDevice.setup: selinuxfs ; status: False ; controllable: True ; orig: False ; >03:51:54,743 INFO blivet: set SELinux context for mountpoint /sys/fs/selinux to system_u:object_r:sysfs_t:s0 >03:51:54,767 INFO blivet: failed to set SELinux context for /mnt/sysimage/sys/fs/selinux: [Errno 95] Operation not supported >03:51:54,767 INFO blivet: set SELinux context for newly mounted filesystem root at /sys/fs/selinux to None >03:51:54,774 INFO blivet: not writing out mpath configuration >03:51:54,782 DEBUG blivet: /dev/sr0 is mounted on /run/install/repo >03:51:59,722 DEBUG blivet: raw RAID 1 size == 2041.0 >03:51:59,722 INFO blivet: Using 1MB superBlockSize >03:51:59,723 DEBUG blivet: non-existent RAID 1 size == 2040.0 >03:51:59,743 DEBUG blivet: raw RAID 10 size == 768.0 >03:51:59,743 INFO blivet: Using 0MB superBlockSize >03:51:59,744 DEBUG blivet: existing RAID 10 size == 767.0 >03:51:59,745 DEBUG blivet: raw RAID 1 size == 512.0 >03:51:59,746 INFO blivet: Using 2.0MB superBlockSize >03:51:59,747 DEBUG blivet: existing RAID 1 size == 511.9375 >03:51:59,748 DEBUG blivet: raw RAID 10 size == 6008.0 >03:51:59,748 INFO blivet: Using 4MB superBlockSize >03:51:59,749 DEBUG blivet: existing RAID 10 size == 6003.0 > > >/tmp/ifcfg.log: >03:47:12,460 DEBUG ifcfg: content of files (network initialization): >03:47:12,778 DEBUG ifcfg: content of files (ifcfgs created): >03:47:12,779 DEBUG ifcfg: /etc/sysconfig/network-scripts/ifcfg-eth0: >TYPE=Ethernet >BOOTPROTO=dhcp >DEFROUTE=yes >IPV4_FAILURE_FATAL=no >IPV6INIT=yes >IPV6_AUTOCONF=yes >IPV6_DEFROUTE=yes >IPV6_PEERDNS=yes >IPV6_PEERROUTES=yes >IPV6_FAILURE_FATAL=no >NAME=eth0 >UUID=9d7b618a-d3c7-45c2-b9c4-d4a7773db55b >ONBOOT=no >HWADDR=52:54:00:A0:38:A0 >PEERDNS=yes >PEERROUTES=yes > >03:47:35,438 DEBUG ifcfg: loadIfcfFile /etc/sysconfig/network-scripts/ifcfg-eth0 > > >/proc/cmdline: >initrd=initrd.img inst.stage2=hd:LABEL=Fedora\x2019-Beta-TC4\x20x86_64 quiet BOOT_IMAGE=vmlinuz > > >/tmp/syslog: >03:47:01,978 INFO rsyslogd: 
[origin software="rsyslogd" swVersion="7.2.6" x-pid="650" x-info="http://www.rsyslog.com"] start >03:47:01,979 INFO kernel:[ 0.000000] Initializing cgroup subsys cpuset >03:47:01,979 INFO kernel:[ 0.000000] Initializing cgroup subsys cpu >03:47:01,979 NOTICE kernel:[ 0.000000] Linux version 3.9.0-301.fc19.x86_64 (mockbuild@bkernel02) (gcc version 4.8.0 20130412 (Red Hat 4.8.0-2) (GCC) ) #1 SMP Mon Apr 29 13:44:05 UTC 2013 >03:47:01,979 INFO kernel:[ 0.000000] Command line: initrd=initrd.img inst.stage2=hd:LABEL=Fedora\x2019-Beta-TC4\x20x86_64 quiet BOOT_IMAGE=vmlinuz >03:47:01,979 INFO kernel:[ 0.000000] e820: BIOS-provided physical RAM map: >03:47:01,979 INFO kernel:[ 0.000000] BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable >03:47:01,979 INFO kernel:[ 0.000000] BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved >03:47:01,979 INFO kernel:[ 0.000000] BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved >03:47:01,979 INFO kernel:[ 0.000000] BIOS-e820: [mem 0x0000000000100000-0x000000007fffdfff] usable >03:47:01,979 INFO kernel:[ 0.000000] BIOS-e820: [mem 0x000000007fffe000-0x000000007fffffff] reserved >03:47:01,979 INFO kernel:[ 0.000000] BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved >03:47:01,979 INFO kernel:[ 0.000000] BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved >03:47:01,979 INFO kernel:[ 0.000000] NX (Execute Disable) protection: active >03:47:01,979 INFO kernel:[ 0.000000] SMBIOS 2.4 present. 
>03:47:01,979 DEBUG kernel:[ 0.000000] DMI: Bochs Bochs, BIOS Bochs 01/01/2011 >03:47:01,979 INFO kernel:[ 0.000000] Hypervisor detected: KVM >03:47:01,979 DEBUG kernel:[ 0.000000] e820: update [mem 0x00000000-0x00000fff] usable ==> reserved >03:47:01,979 DEBUG kernel:[ 0.000000] e820: remove [mem 0x000a0000-0x000fffff] usable >03:47:01,979 INFO kernel:[ 0.000000] No AGP bridge found >03:47:01,979 INFO kernel:[ 0.000000] e820: last_pfn = 0x7fffe max_arch_pfn = 0x400000000 >03:47:01,979 DEBUG kernel:[ 0.000000] MTRR default type: write-back >03:47:01,979 DEBUG kernel:[ 0.000000] MTRR fixed ranges enabled: >03:47:01,979 DEBUG kernel:[ 0.000000] 00000-9FFFF write-back >03:47:01,979 DEBUG kernel:[ 0.000000] A0000-BFFFF uncachable >03:47:01,979 DEBUG kernel:[ 0.000000] C0000-FFFFF write-protect >03:47:01,979 DEBUG kernel:[ 0.000000] MTRR variable ranges enabled: >03:47:01,979 DEBUG kernel:[ 0.000000] 0 base 00E0000000 mask FFE0000000 uncachable >03:47:01,979 DEBUG kernel:[ 0.000000] 1 disabled >03:47:01,979 DEBUG kernel:[ 0.000000] 2 disabled >03:47:01,979 DEBUG kernel:[ 0.000000] 3 disabled >03:47:01,979 DEBUG kernel:[ 0.000000] 4 disabled >03:47:01,979 DEBUG kernel:[ 0.000000] 5 disabled >03:47:01,979 DEBUG kernel:[ 0.000000] 6 disabled >03:47:01,979 DEBUG kernel:[ 0.000000] 7 disabled >03:47:01,979 INFO kernel:[ 0.000000] x86 PAT enabled: cpu 0, old 0x70406, new 0x7010600070106 >03:47:01,979 INFO kernel:[ 0.000000] found SMP MP-table at [mem 0x000fdad0-0x000fdadf] mapped at [ffff8800000fdad0] >03:47:01,979 DEBUG kernel:[ 0.000000] Base memory trampoline at [ffff880000099000] 99000 size 24576 >03:47:01,979 INFO kernel:[ 0.000000] init_memory_mapping: [mem 0x00000000-0x000fffff] >03:47:01,979 DEBUG kernel:[ 0.000000] [mem 0x00000000-0x000fffff] page 4k >03:47:01,979 DEBUG kernel:[ 0.000000] BRK [0x01fc4000, 0x01fc4fff] PGTABLE >03:47:01,979 DEBUG kernel:[ 0.000000] BRK [0x01fc5000, 0x01fc5fff] PGTABLE >03:47:01,979 DEBUG kernel:[ 0.000000] BRK [0x01fc6000, 0x01fc6fff] 
PGTABLE >03:47:01,979 INFO kernel:[ 0.000000] init_memory_mapping: [mem 0x7de00000-0x7dffffff] >03:47:01,979 DEBUG kernel:[ 0.000000] [mem 0x7de00000-0x7dffffff] page 2M >03:47:01,979 DEBUG kernel:[ 0.000000] BRK [0x01fc7000, 0x01fc7fff] PGTABLE >03:47:01,979 INFO kernel:[ 0.000000] init_memory_mapping: [mem 0x7c000000-0x7ddfffff] >03:47:01,979 DEBUG kernel:[ 0.000000] [mem 0x7c000000-0x7ddfffff] page 2M >03:47:01,979 INFO kernel:[ 0.000000] init_memory_mapping: [mem 0x00100000-0x7bffffff] >03:47:01,979 DEBUG kernel:[ 0.000000] [mem 0x00100000-0x001fffff] page 4k >03:47:01,979 DEBUG kernel:[ 0.000000] [mem 0x00200000-0x7bffffff] page 2M >03:47:01,979 INFO kernel:[ 0.000000] init_memory_mapping: [mem 0x7e000000-0x7fffdfff] >03:47:01,979 DEBUG kernel:[ 0.000000] [mem 0x7e000000-0x7fdfffff] page 2M >03:47:01,979 DEBUG kernel:[ 0.000000] [mem 0x7fe00000-0x7fffdfff] page 4k >03:47:01,979 DEBUG kernel:[ 0.000000] BRK [0x01fc8000, 0x01fc8fff] PGTABLE >03:47:01,979 INFO kernel:[ 0.000000] RAMDISK: [mem 0x7e0d6000-0x7ffdcfff] >03:47:01,979 WARNING kernel:[ 0.000000] ACPI: RSDP 00000000000fd970 00014 (v00 BOCHS ) >03:47:01,979 WARNING kernel:[ 0.000000] ACPI: RSDT 000000007fffe440 00038 (v01 BOCHS BXPCRSDT 00000001 BXPC 00000001) >03:47:01,979 WARNING kernel:[ 0.000000] ACPI: FACP 000000007fffff80 00074 (v01 BOCHS BXPCFACP 00000001 BXPC 00000001) >03:47:01,979 WARNING kernel:[ 0.000000] ACPI: DSDT 000000007fffe480 0124A (v01 BXPC BXDSDT 00000001 INTL 20100528) >03:47:01,979 WARNING kernel:[ 0.000000] ACPI: FACS 000000007fffff40 00040 >03:47:01,979 WARNING kernel:[ 0.000000] ACPI: SSDT 000000007ffffe90 000AF (v01 BOCHS BXPCSSDT 00000001 BXPC 00000001) >03:47:01,979 WARNING kernel:[ 0.000000] ACPI: APIC 000000007ffffd70 00078 (v01 BOCHS BXPCAPIC 00000001 BXPC 00000001) >03:47:01,979 WARNING kernel:[ 0.000000] ACPI: HPET 000000007ffffd30 00038 (v01 BOCHS BXPCHPET 00000001 BXPC 00000001) >03:47:01,979 WARNING kernel:[ 0.000000] ACPI: SSDT 000000007ffff6d0 00654 (v01 BXPC 
BXSSDTPC 00000001 INTL 20100528) >03:47:01,979 DEBUG kernel:[ 0.000000] ACPI: Local APIC address 0xfee00000 >03:47:01,979 INFO kernel:[ 0.000000] No NUMA configuration found >03:47:01,979 INFO kernel:[ 0.000000] Faking a node at [mem 0x0000000000000000-0x000000007fffdfff] >03:47:01,979 INFO kernel:[ 0.000000] Initmem setup node 0 [mem 0x00000000-0x7fffdfff] >03:47:01,979 INFO kernel:[ 0.000000] NODE_DATA [mem 0x7ffea000-0x7fffdfff] >03:47:01,979 INFO kernel:[ 0.000000] kvm-clock: Using msrs 4b564d01 and 4b564d00 >03:47:01,979 INFO kernel:[ 0.000000] kvm-clock: cpu 0, msr 0:7ffe8001, boot clock >03:47:01,979 DEBUG kernel:[ 0.000000] [ffffea0000000000-ffffea0001ffffff] PMD -> [ffff88007b800000-ffff88007d7fffff] on node 0 >03:47:01,979 WARNING kernel:[ 0.000000] Zone ranges: >03:47:01,979 WARNING kernel:[ 0.000000] DMA [mem 0x00001000-0x00ffffff] >03:47:01,979 WARNING kernel:[ 0.000000] DMA32 [mem 0x01000000-0xffffffff] >03:47:01,979 WARNING kernel:[ 0.000000] Normal empty >03:47:01,979 WARNING kernel:[ 0.000000] Movable zone start for each node >03:47:01,979 WARNING kernel:[ 0.000000] Early memory node ranges >03:47:01,979 WARNING kernel:[ 0.000000] node 0: [mem 0x00001000-0x0009efff] >03:47:01,979 WARNING kernel:[ 0.000000] node 0: [mem 0x00100000-0x7fffdfff] >03:47:01,979 DEBUG kernel:[ 0.000000] On node 0 totalpages: 524188 >03:47:01,979 DEBUG kernel:[ 0.000000] DMA zone: 64 pages used for memmap >03:47:01,979 DEBUG kernel:[ 0.000000] DMA zone: 21 pages reserved >03:47:01,979 DEBUG kernel:[ 0.000000] DMA zone: 3998 pages, LIFO batch:0 >03:47:01,979 DEBUG kernel:[ 0.000000] DMA32 zone: 8128 pages used for memmap >03:47:01,979 DEBUG kernel:[ 0.000000] DMA32 zone: 520190 pages, LIFO batch:31 >03:47:01,979 INFO kernel:[ 0.000000] ACPI: PM-Timer IO Port: 0xb008 >03:47:01,979 DEBUG kernel:[ 0.000000] ACPI: Local APIC address 0xfee00000 >03:47:01,979 INFO kernel:[ 0.000000] ACPI: LAPIC (acpi_id[0x00] lapic_id[0x00] enabled) >03:47:01,979 INFO kernel:[ 0.000000] ACPI: 
LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1]) >03:47:01,979 INFO kernel:[ 0.000000] ACPI: IOAPIC (id[0x00] address[0xfec00000] gsi_base[0]) >03:47:01,979 INFO kernel:[ 0.000000] IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23 >03:47:01,979 INFO kernel:[ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl) >03:47:01,979 INFO kernel:[ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level) >03:47:01,979 INFO kernel:[ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level) >03:47:01,979 INFO kernel:[ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level) >03:47:01,979 INFO kernel:[ 0.000000] ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level) >03:47:01,979 DEBUG kernel:[ 0.000000] ACPI: IRQ0 used by override. >03:47:01,979 DEBUG kernel:[ 0.000000] ACPI: IRQ2 used by override. >03:47:01,979 DEBUG kernel:[ 0.000000] ACPI: IRQ5 used by override. >03:47:01,979 DEBUG kernel:[ 0.000000] ACPI: IRQ9 used by override. >03:47:01,979 DEBUG kernel:[ 0.000000] ACPI: IRQ10 used by override. >03:47:01,979 DEBUG kernel:[ 0.000000] ACPI: IRQ11 used by override. 
>03:47:01,979 INFO kernel:[ 0.000000] Using ACPI (MADT) for SMP configuration information >03:47:01,979 INFO kernel:[ 0.000000] ACPI: HPET id: 0x8086a201 base: 0xfed00000 >03:47:01,979 INFO kernel:[ 0.000000] smpboot: Allowing 1 CPUs, 0 hotplug CPUs >03:47:01,979 DEBUG kernel:[ 0.000000] nr_irqs_gsi: 40 >03:47:01,979 INFO kernel:[ 0.000000] PM: Registered nosave memory: 000000000009f000 - 00000000000a0000 >03:47:01,979 INFO kernel:[ 0.000000] PM: Registered nosave memory: 00000000000a0000 - 00000000000f0000 >03:47:01,979 INFO kernel:[ 0.000000] PM: Registered nosave memory: 00000000000f0000 - 0000000000100000 >03:47:01,979 INFO kernel:[ 0.000000] e820: [mem 0x80000000-0xfeffbfff] available for PCI devices >03:47:01,979 INFO kernel:[ 0.000000] Booting paravirtualized kernel on KVM >03:47:01,979 INFO kernel:[ 0.000000] setup_percpu: NR_CPUS:128 nr_cpumask_bits:128 nr_cpu_ids:1 nr_node_ids:1 >03:47:01,979 INFO kernel:[ 0.000000] PERCPU: Embedded 28 pages/cpu @ffff88007de00000 s85120 r8192 d21376 u2097152 >03:47:01,979 DEBUG kernel:[ 0.000000] pcpu-alloc: s85120 r8192 d21376 u2097152 alloc=1*2097152 >03:47:01,979 DEBUG kernel:[ 0.000000] pcpu-alloc: [0] 0 >03:47:01,979 INFO kernel:[ 0.000000] kvm-clock: cpu 0, msr 0:7ffe8001, primary cpu clock >03:47:01,979 INFO kernel:[ 0.000000] KVM setup async PF for cpu 0 >03:47:01,979 INFO kernel:[ 0.000000] kvm-stealtime: cpu 0, msr 7de0de80 >03:47:01,979 WARNING kernel:[ 0.000000] Built 1 zonelists in Node order, mobility grouping on. Total pages: 515975 >03:47:01,979 WARNING kernel:[ 0.000000] Policy zone: DMA32 >03:47:01,979 NOTICE kernel:[ 0.000000] Kernel command line: initrd=initrd.img inst.stage2=hd:LABEL=Fedora\x2019-Beta-TC4\x20x86_64 quiet BOOT_IMAGE=vmlinuz >03:47:01,979 INFO kernel:[ 0.000000] PID hash table entries: 4096 (order: 3, 32768 bytes) >03:47:01,979 NOTICE kernel:[ 0.000000] __ex_table already sorted, skipping sort >03:47:01,979 INFO kernel:[ 0.000000] Checking aperture... 
>03:47:01,979 INFO kernel:[ 0.000000] No AGP bridge found >03:47:01,979 INFO kernel:[ 0.000000] Memory: 2015664k/2097144k available (6469k kernel code, 392k absent, 81088k reserved, 6769k data, 1352k init) >03:47:01,979 INFO kernel:[ 0.000000] SLUB: Genslabs=15, HWalign=64, Order=0-3, MinObjects=0, CPUs=1, Nodes=1 >03:47:01,979 INFO kernel:[ 0.000000] Hierarchical RCU implementation. >03:47:01,979 INFO kernel:[ 0.000000] RCU restricting CPUs from NR_CPUS=128 to nr_cpu_ids=1. >03:47:01,979 INFO kernel:[ 0.000000] NR_IRQS:8448 nr_irqs:256 16 >03:47:01,979 INFO kernel:[ 0.000000] Console: colour VGA+ 80x25 >03:47:01,979 INFO kernel:[ 0.000000] console [tty0] enabled >03:47:01,979 INFO kernel:[ 0.000000] allocated 8388608 bytes of page_cgroup >03:47:01,979 INFO kernel:[ 0.000000] please try 'cgroup_disable=memory' option if you don't want memory cgroups >03:47:01,979 DEBUG kernel:[ 0.000000] hpet clockevent registered >03:47:01,979 INFO kernel:[ 0.000000] tsc: Detected 3311.132 MHz processor >03:47:01,979 INFO kernel:[ 0.002000] Calibrating delay loop (skipped) preset value.. 6622.26 BogoMIPS (lpj=3311132) >03:47:01,979 INFO kernel:[ 0.002000] pid_max: default: 32768 minimum: 301 >03:47:01,979 INFO kernel:[ 0.002000] Security Framework initialized >03:47:01,979 INFO kernel:[ 0.002000] SELinux: Initializing. 
>03:47:01,979 DEBUG kernel:[ 0.002000] SELinux: Starting in permissive mode >03:47:01,979 INFO kernel:[ 0.002000] Dentry cache hash table entries: 262144 (order: 9, 2097152 bytes) >03:47:01,979 INFO kernel:[ 0.003716] Inode-cache hash table entries: 131072 (order: 8, 1048576 bytes) >03:47:01,979 INFO kernel:[ 0.004222] Mount-cache hash table entries: 256 >03:47:01,979 INFO kernel:[ 0.005287] Initializing cgroup subsys cpuacct >03:47:01,979 INFO kernel:[ 0.005289] Initializing cgroup subsys memory >03:47:01,979 INFO kernel:[ 0.005295] Initializing cgroup subsys devices >03:47:01,979 INFO kernel:[ 0.005297] Initializing cgroup subsys freezer >03:47:01,979 INFO kernel:[ 0.005298] Initializing cgroup subsys net_cls >03:47:01,979 INFO kernel:[ 0.005300] Initializing cgroup subsys blkio >03:47:01,979 INFO kernel:[ 0.005302] Initializing cgroup subsys perf_event >03:47:01,979 INFO kernel:[ 0.005412] mce: CPU supports 10 MCE banks >03:47:01,979 INFO kernel:[ 0.005463] Last level iTLB entries: 4KB 0, 2MB 0, 4MB 0 >03:47:01,979 INFO kernel:[ 0.005463] Last level dTLB entries: 4KB 0, 2MB 0, 4MB 0 >03:47:01,979 INFO kernel:[ 0.005463] tlb_flushall_shift: -1 >03:47:01,979 INFO kernel:[ 0.024476] Freeing SMP alternatives: 24k freed >03:47:01,979 INFO kernel:[ 0.028382] ACPI: Core revision 20130117 >03:47:01,979 WARNING kernel:[ 0.029113] ACPI: All ACPI Tables successfully acquired >03:47:01,979 INFO kernel:[ 0.029148] ftrace: allocating 24401 entries in 96 pages >03:47:01,979 INFO kernel:[ 0.038336] ..TIMER: vector=0x30 apic1=0 pin1=2 apic2=-1 pin2=-1 >03:47:01,979 INFO kernel:[ 0.038339] smpboot: CPU0: AMD QEMU Virtual CPU version 1.0.1 (fam: 06, model: 02, stepping: 03) >03:47:01,979 INFO kernel:[ 0.039000] Performance Events: Broken PMU hardware detected, using software events only. 
>03:47:01,979 ERR kernel:[ 0.039000] Failed to access perfctr msr (MSR c0010001 is ffffffffffffffff) >03:47:01,979 INFO kernel:[ 0.039911] Brought up 1 CPUs >03:47:01,979 INFO kernel:[ 0.039914] smpboot: Total of 1 processors activated (6622.26 BogoMIPS) >03:47:01,979 WARNING kernel:[ 0.040196] NMI watchdog: disabled (cpu0): hardware events not enabled >03:47:01,979 INFO kernel:[ 0.040541] devtmpfs: initialized >03:47:01,979 INFO kernel:[ 0.041470] atomic64 test passed for x86-64 platform with CX8 and with SSE >03:47:01,979 INFO kernel:[ 0.041555] RTC time: 3:46:49, date: 05/11/13 >03:47:01,979 INFO kernel:[ 0.041611] NET: Registered protocol family 16 >03:47:01,979 INFO kernel:[ 0.041844] ACPI: bus type PCI registered >03:47:01,979 INFO kernel:[ 0.041994] PCI: Using configuration type 1 for base access >03:47:01,979 INFO kernel:[ 0.043034] bio: create slab <bio-0> at 0 >03:47:01,979 INFO kernel:[ 0.043200] ACPI: Added _OSI(Module Device) >03:47:01,979 INFO kernel:[ 0.043202] ACPI: Added _OSI(Processor Device) >03:47:01,979 INFO kernel:[ 0.043203] ACPI: Added _OSI(3.0 _SCP Extensions) >03:47:01,979 INFO kernel:[ 0.043205] ACPI: Added _OSI(Processor Aggregator Device) >03:47:01,979 DEBUG kernel:[ 0.043780] ACPI: EC: Look up EC in DSDT >03:47:01,979 INFO kernel:[ 0.045457] ACPI: Interpreter enabled >03:47:01,979 WARNING kernel:[ 0.045461] ACPI Exception: AE_NOT_FOUND, While evaluating Sleep State [\_S1_] (20130117/hwxface-568) >03:47:01,979 WARNING kernel:[ 0.045465] ACPI Exception: AE_NOT_FOUND, While evaluating Sleep State [\_S2_] (20130117/hwxface-568) >03:47:01,979 WARNING kernel:[ 0.045468] ACPI Exception: AE_NOT_FOUND, While evaluating Sleep State [\_S3_] (20130117/hwxface-568) >03:47:01,979 WARNING kernel:[ 0.045471] ACPI Exception: AE_NOT_FOUND, While evaluating Sleep State [\_S4_] (20130117/hwxface-568) >03:47:01,979 INFO kernel:[ 0.045475] ACPI: (supports S0 S5) >03:47:01,979 INFO kernel:[ 0.045477] ACPI: Using IOAPIC for interrupt routing >03:47:01,979 
INFO kernel:[ 0.045488] PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug >03:47:01,979 INFO kernel:[ 0.048712] ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) >03:47:01,979 WARNING kernel:[ 0.048833] acpi PNP0A03:00: fail to add MMCONFIG information, can't access extended PCI configuration space under this bridge. >03:47:01,979 INFO kernel:[ 0.048868] PCI host bridge to bus 0000:00 >03:47:01,979 INFO kernel:[ 0.048870] pci_bus 0000:00: root bus resource [bus 00-ff] >03:47:01,979 INFO kernel:[ 0.048872] pci_bus 0000:00: root bus resource [io 0x0000-0x0cf7] >03:47:01,979 INFO kernel:[ 0.048874] pci_bus 0000:00: root bus resource [io 0x0d00-0xffff] >03:47:01,979 INFO kernel:[ 0.048876] pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff] >03:47:01,979 INFO kernel:[ 0.048878] pci_bus 0000:00: root bus resource [mem 0x80000000-0xfebfffff] >03:47:01,979 DEBUG kernel:[ 0.048923] pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 >03:47:01,979 DEBUG kernel:[ 0.049334] pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 >03:47:01,979 DEBUG kernel:[ 0.049890] pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 >03:47:01,979 DEBUG kernel:[ 0.050414] pci 0000:00:01.1: reg 20: [io 0xc0e0-0xc0ef] >03:47:01,979 DEBUG kernel:[ 0.050811] pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 >03:47:01,979 DEBUG kernel:[ 0.051324] pci 0000:00:01.2: reg 20: [io 0xc040-0xc05f] >03:47:01,980 DEBUG kernel:[ 0.051692] pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 >03:47:01,980 INFO kernel:[ 0.052170] pci 0000:00:01.3: quirk: [io 0xb000-0xb03f] claimed by PIIX4 ACPI >03:47:01,980 INFO kernel:[ 0.052182] pci 0000:00:01.3: quirk: [io 0xb100-0xb10f] claimed by PIIX4 SMB >03:47:01,980 DEBUG kernel:[ 0.052392] pci 0000:00:02.0: [1b36:0100] type 00 class 0x030000 >03:47:01,980 DEBUG kernel:[ 0.054764] pci 0000:00:02.0: reg 10: [mem 0xf4000000-0xf7ffffff] >03:47:01,980 DEBUG kernel:[ 0.056055] pci 0000:00:02.0: reg 14: [mem 
0xf8000000-0xfbffffff] >03:47:01,980 DEBUG kernel:[ 0.058061] pci 0000:00:02.0: reg 18: [mem 0xfc024000-0xfc025fff] >03:47:01,980 DEBUG kernel:[ 0.059077] pci 0000:00:02.0: reg 1c: [io 0xc060-0xc07f] >03:47:01,980 DEBUG kernel:[ 0.064057] pci 0000:00:02.0: reg 30: [mem 0xfc000000-0xfc00ffff pref] >03:47:01,980 DEBUG kernel:[ 0.064563] pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 >03:47:01,980 DEBUG kernel:[ 0.065003] pci 0000:00:03.0: reg 10: [io 0xc080-0xc09f] >03:47:01,980 DEBUG kernel:[ 0.065154] pci 0000:00:03.0: reg 14: [mem 0xfc026000-0xfc026fff] >03:47:01,980 DEBUG kernel:[ 0.065798] pci 0000:00:03.0: reg 30: [mem 0xfc010000-0xfc01ffff pref] >03:47:01,980 DEBUG kernel:[ 0.066255] pci 0000:00:04.0: [8086:2668] type 00 class 0x040300 >03:47:01,980 DEBUG kernel:[ 0.066413] pci 0000:00:04.0: reg 10: [mem 0xfc020000-0xfc023fff] >03:47:01,980 DEBUG kernel:[ 0.067391] pci 0000:00:05.0: [1af4:1003] type 00 class 0x078000 >03:47:01,980 DEBUG kernel:[ 0.067583] pci 0000:00:05.0: reg 10: [io 0xc0a0-0xc0bf] >03:47:01,980 DEBUG kernel:[ 0.067735] pci 0000:00:05.0: reg 14: [mem 0xfc027000-0xfc027fff] >03:47:01,980 DEBUG kernel:[ 0.068708] pci 0000:00:06.0: [1af4:1004] type 00 class 0x010000 >03:47:01,980 DEBUG kernel:[ 0.068896] pci 0000:00:06.0: reg 10: [io 0xc000-0xc03f] >03:47:01,980 DEBUG kernel:[ 0.069061] pci 0000:00:06.0: reg 14: [mem 0xfc028000-0xfc028fff] >03:47:01,980 DEBUG kernel:[ 0.070053] pci 0000:00:07.0: [1af4:1002] type 00 class 0x050000 >03:47:01,980 DEBUG kernel:[ 0.070184] pci 0000:00:07.0: reg 10: [io 0xc0c0-0xc0df] >03:47:01,980 INFO kernel:[ 0.071290] acpi PNP0A03:00: ACPI _OSC support notification failed, disabling PCIe ASPM >03:47:01,980 INFO kernel:[ 0.071295] acpi PNP0A03:00: Unable to request _OSC control (_OSC support mask: 0x08) >03:47:01,980 INFO kernel:[ 0.072065] ACPI: PCI Interrupt Link [LNKA] (IRQs 5 *10 11) >03:47:01,980 INFO kernel:[ 0.072183] ACPI: PCI Interrupt Link [LNKB] (IRQs 5 *10 11) >03:47:01,980 INFO kernel:[ 0.072287] 
ACPI: PCI Interrupt Link [LNKC] (IRQs 5 10 *11) >03:47:01,980 INFO kernel:[ 0.072390] ACPI: PCI Interrupt Link [LNKD] (IRQs 5 10 *11) >03:47:01,980 INFO kernel:[ 0.072488] ACPI: PCI Interrupt Link [LNKS] (IRQs 9) *0 >03:47:01,980 WARNING kernel:[ 0.072861] ACPI: Enabled 16 GPEs in block 00 to 0F >03:47:01,980 DEBUG kernel:[ 0.072871] acpi root: \_SB_.PCI0 notify handler is installed >03:47:01,980 DEBUG kernel:[ 0.072892] Found 1 acpi root devices >03:47:01,980 INFO kernel:[ 0.073340] ACPI: No dock devices found. >03:47:01,980 INFO kernel:[ 0.073563] vgaarb: device added: PCI:0000:00:02.0,decodes=io+mem,owns=io+mem,locks=none >03:47:01,980 INFO kernel:[ 0.073565] vgaarb: loaded >03:47:01,980 INFO kernel:[ 0.073566] vgaarb: bridge control possible 0000:00:02.0 >03:47:01,980 NOTICE kernel:[ 0.073753] SCSI subsystem initialized >03:47:01,980 INFO kernel:[ 0.073761] ACPI: bus type ATA registered >03:47:01,980 DEBUG kernel:[ 0.073916] libata version 3.00 loaded. >03:47:01,980 INFO kernel:[ 0.074010] ACPI: bus type USB registered >03:47:01,980 INFO kernel:[ 0.074044] usbcore: registered new interface driver usbfs >03:47:01,980 INFO kernel:[ 0.074057] usbcore: registered new interface driver hub >03:47:01,980 INFO kernel:[ 0.074101] usbcore: registered new device driver usb >03:47:01,980 INFO kernel:[ 0.074206] PCI: Using ACPI for IRQ routing >03:47:01,980 DEBUG kernel:[ 0.074213] PCI: pci_cache_line_size set to 64 bytes >03:47:01,982 INFO systemd: systemd 203 running in system mode. (+PAM +LIBWRAP +AUDIT +SELINUX +IMA +SYSVINIT +LIBCRYPTSETUP +GCRYPT +ACL +XZ) >03:47:01,982 INFO systemd: Detected virtualization 'kvm'. >03:47:01,982 INFO systemd: Initializing machine ID from KVM UUID. >03:47:01,982 ERR systemd: /usr/lib/systemd/system-generators/lvm2-activation-generator exited with exit status 127. >03:47:01,982 WARNING systemd: Cannot add dependency job for unit lvm2-monitor.service, ignoring: Unit dm-event.socket failed to load: No such file or directory. 
See system logs and 'systemctl status dm-event.socket' for details. >03:47:01,982 INFO systemd: Started Remount Root and Kernel File Systems. >03:47:01,982 INFO systemd: Starting Configure read-only root support... >03:47:01,982 INFO systemd: Started Import network configuration from initramfs. >03:47:01,982 INFO systemd: Started Apply Kernel Variables. >03:47:01,982 INFO systemd: Mounted POSIX Message Queue File System. >03:47:01,982 INFO systemd: Mounted Temporary Directory. >03:47:01,983 DEBUG kernel:[ 0.074445] e820: reserve RAM buffer [mem 0x0009fc00-0x0009ffff] >03:47:01,983 DEBUG kernel:[ 0.074452] e820: reserve RAM buffer [mem 0x7fffe000-0x7fffffff] >03:47:01,983 INFO kernel:[ 0.074671] NetLabel: Initializing >03:47:01,983 INFO kernel:[ 0.074676] NetLabel: domain hash size = 128 >03:47:01,983 INFO kernel:[ 0.074677] NetLabel: protocols = UNLABELED CIPSOv4 >03:47:01,983 INFO kernel:[ 0.074721] NetLabel: unlabeled traffic allowed by default >03:47:01,983 INFO kernel:[ 0.074818] HPET: 3 timers in total, 0 timers will be used for per-cpu timer >03:47:01,983 INFO kernel:[ 0.074839] hpet0: at MMIO 0xfed00000, IRQs 2, 8, 0 >03:47:01,983 INFO kernel:[ 0.074842] hpet0: 3 comparators, 64-bit 100.000000 MHz counter >03:47:01,983 INFO kernel:[ 0.077020] Switching to clocksource kvm-clock >03:47:01,983 INFO kernel:[ 0.089246] pnp: PnP ACPI init >03:47:01,983 INFO kernel:[ 0.089258] ACPI: bus type PNP registered >03:47:01,983 DEBUG kernel:[ 0.089341] pnp 00:00: Plug and Play ACPI device, IDs PNP0b00 (active) >03:47:01,983 DEBUG kernel:[ 0.089402] pnp 00:01: Plug and Play ACPI device, IDs PNP0303 (active) >03:47:01,983 DEBUG kernel:[ 0.089463] pnp 00:02: Plug and Play ACPI device, IDs PNP0f13 (active) >03:47:01,983 DEBUG kernel:[ 0.089496] pnp 00:03: [dma 2] >03:47:01,983 DEBUG kernel:[ 0.089516] pnp 00:03: Plug and Play ACPI device, IDs PNP0700 (active) >03:47:01,983 DEBUG kernel:[ 0.089649] pnp 00:04: Plug and Play ACPI device, IDs PNP0501 (active) >03:47:01,983 DEBUG 
kernel:[ 0.089791] pnp 00:05: Plug and Play ACPI device, IDs PNP0103 (active) >03:47:01,983 INFO kernel:[ 0.089925] pnp: PnP ACPI: found 6 devices >03:47:01,983 INFO kernel:[ 0.089926] ACPI: bus type PNP unregistered >03:47:01,983 DEBUG kernel:[ 0.096386] pci_bus 0000:00: resource 4 [io 0x0000-0x0cf7] >03:47:01,983 DEBUG kernel:[ 0.096389] pci_bus 0000:00: resource 5 [io 0x0d00-0xffff] >03:47:01,983 DEBUG kernel:[ 0.096391] pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff] >03:47:01,983 DEBUG kernel:[ 0.096392] pci_bus 0000:00: resource 7 [mem 0x80000000-0xfebfffff] >03:47:01,983 INFO kernel:[ 0.096451] NET: Registered protocol family 2 >03:47:01,983 INFO kernel:[ 0.096697] TCP established hash table entries: 16384 (order: 6, 262144 bytes) >03:47:01,983 INFO kernel:[ 0.097002] TCP bind hash table entries: 16384 (order: 6, 262144 bytes) >03:47:01,983 INFO kernel:[ 0.097323] TCP: Hash tables configured (established 16384 bind 16384) >03:47:01,983 INFO kernel:[ 0.097494] TCP: reno registered >03:47:01,983 INFO kernel:[ 0.097499] UDP hash table entries: 1024 (order: 3, 32768 bytes) >03:47:01,983 INFO kernel:[ 0.097539] UDP-Lite hash table entries: 1024 (order: 3, 32768 bytes) >03:47:01,983 INFO kernel:[ 0.097666] NET: Registered protocol family 1 >03:47:01,983 INFO kernel:[ 0.097690] pci 0000:00:00.0: Limiting direct PCI/PCI transfers >03:47:01,983 INFO kernel:[ 0.097711] pci 0000:00:01.0: PIIX3: Enabling Passive Release >03:47:01,983 INFO kernel:[ 0.097733] pci 0000:00:01.0: Activating ISA DMA hang workarounds >03:47:01,983 WARNING kernel:[ 0.098063] ACPI: PCI Interrupt Link [LNKD] enabled at IRQ 11 >03:47:01,983 DEBUG kernel:[ 0.098308] pci 0000:00:02.0: Boot video device >03:47:01,983 DEBUG kernel:[ 0.098370] PCI: CLS 0 bytes, default 64 >03:47:01,983 INFO kernel:[ 0.098435] Unpacking initramfs... 
>03:47:01,983 INFO kernel:[ 3.598794] Freeing initrd memory: 31772k freed >03:47:01,983 NOTICE kernel:[ 3.608537] Initialise system trusted keyring >03:47:01,983 INFO kernel:[ 3.608580] audit: initializing netlink socket (disabled) >03:47:01,983 NOTICE kernel:[ 3.608595] type=2000 audit(1368244013.608:1): initialized >03:47:01,983 INFO kernel:[ 3.629261] HugeTLB registered 2 MB page size, pre-allocated 0 pages >03:47:01,983 NOTICE kernel:[ 3.630636] VFS: Disk quotas dquot_6.5.2 >03:47:01,983 WARNING kernel:[ 3.630680] Dquot-cache hash table entries: 512 (order 0, 4096 bytes) >03:47:01,983 INFO kernel:[ 3.631058] msgmni has been set to 3998 >03:47:01,983 DEBUG kernel:[ 3.631112] SELinux: Registering netfilter hooks >03:47:01,983 INFO kernel:[ 3.631635] alg: No test for stdrng (krng) >03:47:01,983 INFO kernel:[ 3.631639] NET: Registered protocol family 38 >03:47:01,983 NOTICE kernel:[ 3.631642] Key type asymmetric registered >03:47:01,983 NOTICE kernel:[ 3.631646] Asymmetric key parser 'x509' registered >03:47:01,983 NOTICE kernel:[ 3.631647] Asymmetric key parser 'pefile' registered >03:47:01,983 INFO kernel:[ 3.631675] Block layer SCSI generic (bsg) driver version 0.4 loaded (major 252) >03:47:01,983 INFO kernel:[ 3.631700] io scheduler noop registered >03:47:01,983 INFO kernel:[ 3.631704] io scheduler deadline registered >03:47:01,983 INFO kernel:[ 3.631710] io scheduler cfq registered (default) >03:47:01,983 INFO kernel:[ 3.631798] pci_hotplug: PCI Hot Plug PCI Core version: 0.5 >03:47:01,983 INFO kernel:[ 3.631813] pciehp: PCI Express Hot Plug Controller Driver version: 0.4 >03:47:01,983 INFO kernel:[ 3.631815] acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 >03:47:01,983 INFO kernel:[ 3.631859] acpiphp: Slot [3] registered >03:47:01,983 INFO kernel:[ 3.631871] acpiphp: Slot [4] registered >03:47:01,983 INFO kernel:[ 3.631883] acpiphp: Slot [5] registered >03:47:01,983 INFO kernel:[ 3.631896] acpiphp: Slot [6] registered >03:47:01,983 INFO kernel:[ 
3.631906] acpiphp: Slot [7] registered >03:47:01,983 INFO kernel:[ 3.631916] acpiphp: Slot [8] registered >03:47:01,983 INFO kernel:[ 3.631926] acpiphp: Slot [9] registered >03:47:01,983 INFO kernel:[ 3.631937] acpiphp: Slot [10] registered >03:47:01,983 INFO kernel:[ 3.631947] acpiphp: Slot [11] registered >03:47:01,983 INFO kernel:[ 3.631959] acpiphp: Slot [12] registered >03:47:01,983 INFO kernel:[ 3.631969] acpiphp: Slot [13] registered >03:47:01,983 INFO kernel:[ 3.631980] acpiphp: Slot [14] registered >03:47:01,983 INFO kernel:[ 3.631991] acpiphp: Slot [15] registered >03:47:01,983 INFO kernel:[ 3.632001] acpiphp: Slot [16] registered >03:47:01,983 INFO kernel:[ 3.632027] acpiphp: Slot [17] registered >03:47:01,983 INFO kernel:[ 3.632038] acpiphp: Slot [18] registered >03:47:01,983 INFO kernel:[ 3.632048] acpiphp: Slot [19] registered >03:47:01,983 INFO kernel:[ 3.632060] acpiphp: Slot [20] registered >03:47:01,983 INFO kernel:[ 3.632071] acpiphp: Slot [21] registered >03:47:01,983 INFO kernel:[ 3.632081] acpiphp: Slot [22] registered >03:47:01,983 INFO kernel:[ 3.632091] acpiphp: Slot [23] registered >03:47:01,983 INFO kernel:[ 3.632101] acpiphp: Slot [24] registered >03:47:01,983 INFO kernel:[ 3.632111] acpiphp: Slot [25] registered >03:47:01,983 INFO kernel:[ 3.632122] acpiphp: Slot [26] registered >03:47:01,983 INFO kernel:[ 3.632133] acpiphp: Slot [27] registered >03:47:01,983 INFO kernel:[ 3.632144] acpiphp: Slot [28] registered >03:47:01,983 INFO kernel:[ 3.632154] acpiphp: Slot [29] registered >03:47:01,983 INFO kernel:[ 3.632164] acpiphp: Slot [30] registered >03:47:01,983 INFO kernel:[ 3.632174] acpiphp: Slot [31] registered >03:47:01,983 INFO kernel:[ 3.632357] input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0 >03:47:01,983 INFO kernel:[ 3.632361] ACPI: Power Button [PWRF] >03:47:01,983 INFO kernel:[ 3.633380] GHES: HEST is not enabled! 
>03:47:01,983 WARNING kernel:[ 3.633736] ACPI: PCI Interrupt Link [LNKC] enabled at IRQ 10 >03:47:01,983 DEBUG kernel:[ 3.633791] virtio-pci 0000:00:03.0: setting latency timer to 64 >03:47:01,983 WARNING kernel:[ 3.634165] ACPI: PCI Interrupt Link [LNKA] enabled at IRQ 10 >03:47:01,983 DEBUG kernel:[ 3.634194] virtio-pci 0000:00:05.0: setting latency timer to 64 >03:47:01,983 WARNING kernel:[ 3.634511] ACPI: PCI Interrupt Link [LNKB] enabled at IRQ 11 >03:47:01,983 DEBUG kernel:[ 3.634539] virtio-pci 0000:00:06.0: setting latency timer to 64 >03:47:01,983 DEBUG kernel:[ 3.634804] virtio-pci 0000:00:07.0: setting latency timer to 64 >03:47:01,983 INFO kernel:[ 3.634894] Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled >03:47:01,983 INFO kernel:[ 3.657328] 00:04: ttyS0 at I/O 0x3f8 (irq = 4) is a 16550A >03:47:01,983 DEBUG kernel:[ 3.657906] virtio-pci 0000:00:05.0: irq 40 for MSI/MSI-X >03:47:01,983 DEBUG kernel:[ 3.657929] virtio-pci 0000:00:05.0: irq 41 for MSI/MSI-X >03:47:01,983 INFO kernel:[ 3.687784] Non-volatile memory driver v1.3 >03:47:01,983 INFO kernel:[ 3.687792] Linux agpgart interface v0.103 >03:47:01,983 INFO kernel:[ 3.690427] loop: module loaded >03:47:01,983 DEBUG kernel:[ 3.690634] ata_piix 0000:00:01.1: version 2.13 >03:47:01,983 DEBUG kernel:[ 3.690963] ata_piix 0000:00:01.1: setting latency timer to 64 >03:47:01,983 INFO kernel:[ 3.692084] scsi0 : ata_piix >03:47:01,983 INFO kernel:[ 3.692164] scsi1 : ata_piix >03:47:01,983 INFO kernel:[ 3.692197] ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc0e0 irq 14 >03:47:01,983 INFO kernel:[ 3.692199] ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc0e8 irq 15 >03:47:01,983 INFO kernel:[ 3.692775] libphy: Fixed MDIO Bus: probed >03:47:01,983 INFO kernel:[ 3.692865] ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver >03:47:01,983 INFO kernel:[ 3.692867] ehci-pci: EHCI PCI platform driver >03:47:01,983 INFO kernel:[ 3.692875] ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver 
>03:47:01,983 INFO kernel:[ 3.692883] uhci_hcd: USB Universal Host Controller Interface driver >03:47:01,983 DEBUG kernel:[ 3.693076] uhci_hcd 0000:00:01.2: setting latency timer to 64 >03:47:01,983 INFO kernel:[ 3.693090] uhci_hcd 0000:00:01.2: UHCI Host Controller >03:47:01,983 INFO kernel:[ 3.693137] uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1 >03:47:01,983 INFO kernel:[ 3.693254] uhci_hcd 0000:00:01.2: irq 11, io base 0x0000c040 >03:47:01,983 INFO kernel:[ 3.693342] usb usb1: New USB device found, idVendor=1d6b, idProduct=0001 >03:47:01,983 INFO kernel:[ 3.693344] usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1 >03:47:01,983 INFO kernel:[ 3.693345] usb usb1: Product: UHCI Host Controller >03:47:01,983 INFO kernel:[ 3.693347] usb usb1: Manufacturer: Linux 3.9.0-301.fc19.x86_64 uhci_hcd >03:47:01,983 INFO kernel:[ 3.693348] usb usb1: SerialNumber: 0000:00:01.2 >03:47:01,983 INFO kernel:[ 3.693434] hub 1-0:1.0: USB hub found >03:47:01,983 INFO kernel:[ 3.693438] hub 1-0:1.0: 2 ports detected >03:47:01,983 INFO kernel:[ 3.693675] usbcore: registered new interface driver usbserial >03:47:01,983 INFO kernel:[ 3.693687] usbcore: registered new interface driver usbserial_generic >03:47:01,983 INFO kernel:[ 3.693696] usbserial: USB Serial support registered for generic >03:47:01,983 INFO kernel:[ 3.693725] i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 >03:47:01,983 INFO kernel:[ 3.694578] serio: i8042 KBD port at 0x60,0x64 irq 1 >03:47:01,983 INFO kernel:[ 3.694583] serio: i8042 AUX port at 0x60,0x64 irq 12 >03:47:01,983 INFO kernel:[ 3.694669] mousedev: PS/2 mouse device common for all mice >03:47:01,983 INFO kernel:[ 3.695101] input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1 >03:47:01,983 INFO kernel:[ 3.697949] rtc_cmos 00:00: RTC can wake from S4 >03:47:01,983 INFO kernel:[ 3.698572] rtc_cmos 00:00: rtc core: registered rtc_cmos as rtc0 >03:47:01,983 INFO 
kernel:[ 3.698777] rtc_cmos 00:00: alarms up to one day, 114 bytes nvram, hpet irqs >03:47:01,983 INFO kernel:[ 3.698856] device-mapper: uevent: version 1.0.3 >03:47:01,983 INFO kernel:[ 3.698931] device-mapper: ioctl: 4.24.0-ioctl (2013-01-15) initialised: dm-devel@redhat.com >03:47:01,983 INFO kernel:[ 3.698973] cpuidle: using governor ladder >03:47:01,983 INFO kernel:[ 3.698975] cpuidle: using governor menu >03:47:01,983 INFO kernel:[ 3.699056] EFI Variables Facility v0.08 2004-May-17 >03:47:01,983 INFO kernel:[ 3.699101] hidraw: raw HID events driver (C) Jiri Kosina >03:47:01,983 INFO kernel:[ 3.699215] usbcore: registered new interface driver usbhid >03:47:01,983 INFO kernel:[ 3.699216] usbhid: USB HID core driver >03:47:01,983 INFO kernel:[ 3.699237] drop_monitor: Initializing network drop monitor service >03:47:01,983 INFO kernel:[ 3.699338] ip_tables: (C) 2000-2006 Netfilter Core Team >03:47:01,983 INFO kernel:[ 3.699370] TCP: cubic registered >03:47:01,983 INFO kernel:[ 3.699372] Initializing XFRM netlink socket >03:47:01,983 INFO kernel:[ 3.699490] NET: Registered protocol family 10 >03:47:01,983 INFO kernel:[ 3.699726] mip6: Mobile IPv6 >03:47:01,983 INFO kernel:[ 3.699728] NET: Registered protocol family 17 >03:47:01,983 DEBUG kernel:[ 3.699957] PM: Hibernation image not present or could not be loaded. >03:47:01,983 NOTICE kernel:[ 3.699958] Loading compiled-in X.509 certificates >03:47:01,983 NOTICE kernel:[ 3.700858] Loaded X.509 cert 'Fedora kernel signing key: 7e0d407f0cb0c86de526405fc0072f9ee90c6111' >03:47:01,983 INFO kernel:[ 3.700880] registered taskstats version 1 >03:47:01,983 INFO kernel:[ 3.701412] Magic number: 13:370:765 >03:47:01,983 DEBUG kernel:[ 3.845060] ata2.01: NODEV after polling detection >03:47:01,983 INFO kernel:[ 3.845826] ata2.00: ATAPI: QEMU DVD-ROM, 1.0.1, max UDMA/100 >03:47:01,983 INFO kernel:[ 3.847064] ata2.00: configured for MWDMA2 >03:47:01,983 NOTICE kernel:[ 3.848626] scsi 1:0:0:0: CD-ROM QEMU QEMU DVD-ROM 1.0. 
PQ: 0 ANSI: 5 >03:47:01,983 WARNING kernel:[ 3.850211] sr0: scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray >03:47:01,983 INFO kernel:[ 3.850233] cdrom: Uniform CD-ROM driver Revision: 3.20 >03:47:01,983 DEBUG kernel:[ 3.850460] sr 1:0:0:0: Attached scsi CD-ROM sr0 >03:47:01,983 NOTICE kernel:[ 3.850590] sr 1:0:0:0: Attached scsi generic sg0 type 5 >03:47:01,983 INFO kernel:[ 3.852857] Freeing unused kernel memory: 1352k freed >03:47:01,983 INFO kernel:[ 3.855061] Write protecting the kernel read-only data: 12288k >03:47:01,983 INFO kernel:[ 3.863494] Freeing unused kernel memory: 1712k freed >03:47:01,983 INFO kernel:[ 3.868385] Freeing unused kernel memory: 1332k freed >03:47:01,983 INFO kernel:[ 3.946942] BIOS EDD facility v0.16 2004-Jun-25, 1 devices found >03:47:01,983 INFO kernel:[ 3.995033] usb 1-1: new full-speed USB device number 2 using uhci_hcd >03:47:01,983 INFO kernel:[ 4.079235] squashfs: version 4.0 (2009/01/31) Phillip Lougher >03:47:01,983 INFO kernel:[ 4.082553] Loading iSCSI transport class v2.0-870. >03:47:01,983 NOTICE kernel:[ 4.087197] iscsi: registered transport (tcp) >03:47:01,983 INFO kernel:[ 4.089570] alua: device handler registered >03:47:01,983 INFO kernel:[ 4.091938] emc: device handler registered >03:47:01,983 INFO kernel:[ 4.094244] hp_sw: device handler registered >03:47:01,983 INFO kernel:[ 4.096544] rdac: device handler registered >03:47:01,983 INFO kernel:[ 4.113045] FDC 0 is a S82078B >03:47:01,983 INFO kernel:[ 4.123446] No iBFT detected. 
>03:47:01,983 INFO kernel:[ 4.125786] md: raid0 personality registered for level 0 >03:47:01,983 INFO kernel:[ 4.128732] md: raid1 personality registered for level 1 >03:47:01,983 INFO kernel:[ 4.131094] async_tx: api initialized (async) >03:47:01,983 INFO kernel:[ 4.132330] xor: measuring software checksum speed >03:47:01,983 INFO kernel:[ 4.142013] prefetch64-sse: 8808.000 MB/sec >03:47:01,983 INFO kernel:[ 4.152018] generic_sse: 9196.000 MB/sec >03:47:01,983 INFO kernel:[ 4.152020] xor: using function: generic_sse (9196.000 MB/sec) >03:47:01,983 WARNING kernel:[ 4.172022] raid6: sse2x1 4167 MB/s >03:47:01,983 WARNING kernel:[ 4.189021] raid6: sse2x2 8449 MB/s >03:47:01,983 WARNING kernel:[ 4.206017] raid6: sse2x4 10949 MB/s >03:47:01,983 WARNING kernel:[ 4.206019] raid6: using algorithm sse2x4 (10949 MB/s) >03:47:01,983 WARNING kernel:[ 4.206021] raid6: using intx1 recovery algorithm >03:47:01,983 INFO kernel:[ 4.212902] md: raid6 personality registered for level 6 >03:47:01,983 INFO kernel:[ 4.212904] md: raid5 personality registered for level 5 >03:47:01,983 INFO kernel:[ 4.212906] md: raid4 personality registered for level 4 >03:47:01,983 INFO kernel:[ 4.218920] md: raid10 personality registered for level 10 >03:47:01,983 INFO kernel:[ 4.221249] md: linear personality registered for level -1 >03:47:01,983 INFO kernel:[ 4.228787] device-mapper: multipath: version 1.5.1 loaded >03:47:01,983 INFO kernel:[ 4.231264] device-mapper: multipath round-robin: version 1.0.0 loaded >03:47:01,983 INFO kernel:[ 4.252528] RPC: Registered named UNIX socket transport module. >03:47:01,983 INFO kernel:[ 4.252531] RPC: Registered udp transport module. >03:47:01,983 INFO kernel:[ 4.252532] RPC: Registered tcp transport module. >03:47:01,983 INFO kernel:[ 4.252533] RPC: Registered tcp NFSv4.1 backchannel transport module. 
>03:47:01,983 INFO kernel:[ 4.282566] usb 1-1: New USB device found, idVendor=0627, idProduct=0001 >03:47:01,983 INFO kernel:[ 4.282569] usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=5 >03:47:01,983 INFO kernel:[ 4.282572] usb 1-1: Product: QEMU USB Tablet >03:47:01,983 INFO kernel:[ 4.282574] usb 1-1: Manufacturer: QEMU 1.0.1 >03:47:01,983 INFO kernel:[ 4.282575] usb 1-1: SerialNumber: 42 >03:47:01,983 INFO kernel:[ 4.317517] input: QEMU 1.0.1 QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/input/input2 >03:47:01,983 INFO kernel:[ 4.317629] hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Pointer [QEMU 1.0.1 QEMU USB Tablet] on usb-0000:00:01.2-1/input0 >03:47:01,983 DEBUG kernel:[ 4.430861] virtio-pci 0000:00:03.0: irq 42 for MSI/MSI-X >03:47:01,983 DEBUG kernel:[ 4.430886] virtio-pci 0000:00:03.0: irq 43 for MSI/MSI-X >03:47:01,983 DEBUG kernel:[ 4.430914] virtio-pci 0000:00:03.0: irq 44 for MSI/MSI-X >03:47:01,983 DEBUG kernel:[ 4.439277] virtio-pci 0000:00:06.0: irq 45 for MSI/MSI-X >03:47:01,983 DEBUG kernel:[ 4.439303] virtio-pci 0000:00:06.0: irq 46 for MSI/MSI-X >03:47:01,983 INFO kernel:[ 4.473007] [drm] Initialized drm 1.1.0 20060810 >03:47:01,983 INFO kernel:[ 4.519118] scsi2 : Virtio SCSI HBA >03:47:01,983 NOTICE kernel:[ 4.522218] scsi 2:0:0:0: Direct-Access QEMU QEMU HARDDISK 1.0. PQ: 0 ANSI: 5 >03:47:01,983 NOTICE kernel:[ 4.522291] scsi 2:0:0:3: Direct-Access QEMU QEMU HARDDISK 1.0. PQ: 0 ANSI: 5 >03:47:01,983 NOTICE kernel:[ 4.522342] scsi 2:0:0:2: Direct-Access QEMU QEMU HARDDISK 1.0. PQ: 0 ANSI: 5 >03:47:01,983 NOTICE kernel:[ 4.522415] scsi 2:0:0:1: Direct-Access QEMU QEMU HARDDISK 1.0. 
PQ: 0 ANSI: 5 >03:47:01,983 INFO kernel:[ 4.529578] input: ImExPS/2 Generic Explorer Mouse as /devices/platform/i8042/serio1/input/input3 >03:47:01,983 ERR kernel:[ 4.526783] [drm:qxl_pci_probe] *ERROR* qxl too old, doesn't support client_monitors_config, use xf86-video-qxl in user mode >03:47:01,983 WARNING kernel:[ 4.532471] qxl: probe of 0000:00:02.0 failed with error -22 >03:47:01,983 NOTICE kernel:[ 4.553934] sd 2:0:0:0: [sda] 24576000 512-byte logical blocks: (12.5 GB/11.7 GiB) >03:47:01,983 NOTICE kernel:[ 4.554032] sd 2:0:0:0: [sda] Write Protect is off >03:47:01,983 DEBUG kernel:[ 4.554035] sd 2:0:0:0: [sda] Mode Sense: 63 00 00 08 >03:47:01,983 NOTICE kernel:[ 4.554073] sd 2:0:0:0: [sda] Write cache: disabled, read cache: enabled, doesn't support DPO or FUA >03:47:01,983 INFO kernel:[ 4.554562] sda: sda1 sda2 >03:47:01,983 NOTICE kernel:[ 4.554930] sd 2:0:0:0: [sda] Attached SCSI disk >03:47:01,983 NOTICE kernel:[ 4.554996] sd 2:0:0:0: Attached scsi generic sg1 type 0 >03:47:01,983 NOTICE kernel:[ 4.555926] sd 2:0:0:3: [sdb] 24576000 512-byte logical blocks: (12.5 GB/11.7 GiB) >03:47:01,983 NOTICE kernel:[ 4.556096] sd 2:0:0:3: [sdb] Write Protect is off >03:47:01,983 DEBUG kernel:[ 4.556099] sd 2:0:0:3: [sdb] Mode Sense: 63 00 00 08 >03:47:01,983 NOTICE kernel:[ 4.556135] sd 2:0:0:3: [sdb] Write cache: disabled, read cache: enabled, doesn't support DPO or FUA >03:47:01,983 NOTICE kernel:[ 4.556522] sd 2:0:0:3: Attached scsi generic sg2 type 0 >03:47:01,983 NOTICE kernel:[ 4.556774] sd 2:0:0:2: [sdc] 24576000 512-byte logical blocks: (12.5 GB/11.7 GiB) >03:47:01,983 NOTICE kernel:[ 4.556989] sd 2:0:0:2: [sdc] Write Protect is off >03:47:01,983 DEBUG kernel:[ 4.556999] sd 2:0:0:2: [sdc] Mode Sense: 63 00 00 08 >03:47:01,983 NOTICE kernel:[ 4.557176] sd 2:0:0:2: [sdc] Write cache: disabled, read cache: enabled, doesn't support DPO or FUA >03:47:01,983 INFO kernel:[ 4.557946] sdb: sdb1 sdb2 >03:47:01,983 INFO kernel:[ 4.558492] sdc: sdc1 sdc2 >03:47:01,983 
NOTICE kernel:[ 4.559856] sd 2:0:0:3: [sdb] Attached SCSI disk >03:47:01,983 NOTICE kernel:[ 4.559872] sd 2:0:0:2: [sdc] Attached SCSI disk >03:47:01,983 NOTICE kernel:[ 4.560001] sd 2:0:0:2: Attached scsi generic sg3 type 0 >03:47:01,983 NOTICE kernel:[ 4.560459] sd 2:0:0:1: [sdd] 24576000 512-byte logical blocks: (12.5 GB/11.7 GiB) >03:47:01,983 NOTICE kernel:[ 4.560715] sd 2:0:0:1: [sdd] Write Protect is off >03:47:01,983 DEBUG kernel:[ 4.560721] sd 2:0:0:1: [sdd] Mode Sense: 63 00 00 08 >03:47:01,983 NOTICE kernel:[ 4.560818] sd 2:0:0:1: [sdd] Write cache: disabled, read cache: enabled, doesn't support DPO or FUA >03:47:01,983 INFO kernel:[ 4.561765] sdd: sdd1 sdd2 >03:47:01,983 NOTICE kernel:[ 4.562648] sd 2:0:0:1: Attached scsi generic sg4 type 0 >03:47:01,983 NOTICE kernel:[ 4.562691] sd 2:0:0:1: [sdd] Attached SCSI disk >03:47:01,983 INFO kernel:[ 4.610042] tsc: Refined TSC clocksource calibration: 3311.135 MHz >03:47:01,983 INFO kernel:[ 5.124076] Btrfs loaded >03:47:01,983 INFO kernel:[ 5.124433] device label fedora_dhcppc0 devid 1 transid 122 /dev/sda2 >03:47:01,983 INFO kernel:[ 5.271876] device label fedora_dhcppc0 devid 2 transid 122 /dev/sdb2 >03:47:01,983 INFO kernel:[ 5.275414] device label fedora_dhcppc0 devid 4 transid 122 /dev/sdd2 >03:47:01,983 INFO kernel:[ 5.366981] device label fedora_dhcppc0 devid 3 transid 122 /dev/sdc2 >03:47:01,983 DEBUG kernel:[ 5.619430] ISO 9660 Extensions: Microsoft Joliet Level 3 >03:47:01,983 DEBUG kernel:[ 5.648504] ISO 9660 Extensions: RRIP_1991A >03:47:01,983 INFO kernel:[ 5.762625] device label fedora_dhcppc0 devid 2 transid 122 /dev/sdb2 >03:47:01,983 INFO kernel:[ 5.784913] device label fedora_dhcppc0 devid 1 transid 122 /dev/sda2 >03:47:01,983 INFO kernel:[ 5.787832] device label fedora_dhcppc0 devid 4 transid 122 /dev/sdd2 >03:47:01,983 INFO kernel:[ 5.792672] device label fedora_dhcppc0 devid 3 transid 122 /dev/sdc2 >03:47:01,983 INFO kernel:[ 5.944157] bio: create slab <bio-1> at 1 >03:47:01,983 INFO 
kernel:[ 6.055787] EXT4-fs (dm-0): mounted filesystem with ordered data mode. Opts: (null) >03:47:01,983 DEBUG kernel:[ 7.368227] SELinux: 2048 avtab hash slots, 96983 rules. >03:47:01,983 DEBUG kernel:[ 7.387118] SELinux: 2048 avtab hash slots, 96983 rules. >03:47:01,983 DEBUG kernel:[ 7.650648] SELinux: 8 users, 82 roles, 4432 types, 250 bools, 1 sens, 1024 cats >03:47:01,983 DEBUG kernel:[ 7.650653] SELinux: 83 classes, 96983 rules >03:47:01,983 DEBUG kernel:[ 7.656393] SELinux: Completing initialization. >03:47:01,983 DEBUG kernel:[ 7.656396] SELinux: Setting up existing superblocks. >03:47:01,983 DEBUG kernel:[ 7.656402] SELinux: initialized (dev sysfs, type sysfs), uses genfs_contexts >03:47:01,983 DEBUG kernel:[ 7.656407] SELinux: initialized (dev rootfs, type rootfs), uses genfs_contexts >03:47:01,983 DEBUG kernel:[ 7.656418] SELinux: initialized (dev bdev, type bdev), uses genfs_contexts >03:47:01,983 DEBUG kernel:[ 7.656422] SELinux: initialized (dev proc, type proc), uses genfs_contexts >03:47:01,983 DEBUG kernel:[ 7.656430] SELinux: initialized (dev tmpfs, type tmpfs), uses transition SIDs >03:47:01,983 DEBUG kernel:[ 7.656442] SELinux: initialized (dev devtmpfs, type devtmpfs), uses transition SIDs >03:47:01,983 DEBUG kernel:[ 7.657390] SELinux: initialized (dev sockfs, type sockfs), uses task SIDs >03:47:01,983 DEBUG kernel:[ 7.657395] SELinux: initialized (dev debugfs, type debugfs), uses genfs_contexts >03:47:01,983 DEBUG kernel:[ 7.658028] SELinux: initialized (dev pipefs, type pipefs), uses task SIDs >03:47:01,983 DEBUG kernel:[ 7.658033] SELinux: initialized (dev anon_inodefs, type anon_inodefs), uses genfs_contexts >03:47:01,983 DEBUG kernel:[ 7.658037] SELinux: initialized (dev devpts, type devpts), uses transition SIDs >03:47:01,983 DEBUG kernel:[ 7.658052] SELinux: initialized (dev hugetlbfs, type hugetlbfs), uses transition SIDs >03:47:01,983 DEBUG kernel:[ 7.658059] SELinux: initialized (dev mqueue, type mqueue), uses transition SIDs 
>03:47:01,983 DEBUG kernel:[ 7.658065] SELinux: initialized (dev selinuxfs, type selinuxfs), uses genfs_contexts >03:47:01,983 DEBUG kernel:[ 7.658075] SELinux: initialized (dev sysfs, type sysfs), uses genfs_contexts >03:47:01,983 DEBUG kernel:[ 7.658295] SELinux: initialized (dev securityfs, type securityfs), uses genfs_contexts >03:47:01,983 DEBUG kernel:[ 7.658297] SELinux: initialized (dev tmpfs, type tmpfs), uses transition SIDs >03:47:01,983 DEBUG kernel:[ 7.658302] SELinux: initialized (dev tmpfs, type tmpfs), uses transition SIDs >03:47:01,983 DEBUG kernel:[ 7.658437] SELinux: initialized (dev tmpfs, type tmpfs), uses transition SIDs >03:47:01,983 DEBUG kernel:[ 7.658474] SELinux: initialized (dev cgroup, type cgroup), uses genfs_contexts >03:47:01,983 DEBUG kernel:[ 7.658482] SELinux: initialized (dev pstore, type pstore), uses genfs_contexts >03:47:01,983 DEBUG kernel:[ 7.658484] SELinux: initialized (dev cgroup, type cgroup), uses genfs_contexts >03:47:01,983 DEBUG kernel:[ 7.658489] SELinux: initialized (dev cgroup, type cgroup), uses genfs_contexts >03:47:01,983 DEBUG kernel:[ 7.658510] SELinux: initialized (dev cgroup, type cgroup), uses genfs_contexts >03:47:01,983 DEBUG kernel:[ 7.658516] SELinux: initialized (dev cgroup, type cgroup), uses genfs_contexts >03:47:01,983 DEBUG kernel:[ 7.658519] SELinux: initialized (dev cgroup, type cgroup), uses genfs_contexts >03:47:01,983 DEBUG kernel:[ 7.658522] SELinux: initialized (dev cgroup, type cgroup), uses genfs_contexts >03:47:01,983 DEBUG kernel:[ 7.658525] SELinux: initialized (dev cgroup, type cgroup), uses genfs_contexts >03:47:01,983 DEBUG kernel:[ 7.658531] SELinux: initialized (dev cgroup, type cgroup), uses genfs_contexts >03:47:01,983 DEBUG kernel:[ 7.658537] SELinux: initialized (dev rpc_pipefs, type rpc_pipefs), uses genfs_contexts >03:47:01,983 DEBUG kernel:[ 7.658542] SELinux: initialized (dev sr0, type iso9660), uses genfs_contexts >03:47:01,983 DEBUG kernel:[ 7.658561] SELinux: 
initialized (dev loop0, type squashfs), uses xattr >03:47:01,983 DEBUG kernel:[ 7.658582] SELinux: initialized (dev dm-0, type ext4), uses xattr >03:47:01,983 NOTICE kernel:[ 7.660460] type=1403 audit(1368244017.660:2): policy loaded auid=4294967295 ses=4294967295 >03:47:01,983 DEBUG kernel:[ 8.560703] SELinux: initialized (dev autofs, type autofs), uses genfs_contexts >03:47:01,983 DEBUG kernel:[ 8.956379] SELinux: initialized (dev tmpfs, type tmpfs), uses transition SIDs >03:47:01,983 DEBUG kernel:[ 9.114336] SELinux: initialized (dev hugetlbfs, type hugetlbfs), uses transition SIDs >03:47:01,983 DEBUG kernel:[ 9.228709] SELinux: initialized (dev configfs, type configfs), uses genfs_contexts >03:47:01,983 INFO kernel:[ 9.679442] piix4_smbus 0000:00:01.3: SMBus Host Controller at 0xb100, revision 0 >03:47:01,983 WARNING kernel:[ 9.878625] microcode: AMD CPU family 0x6 not supported >03:47:01,983 INFO kernel:[ 10.156493] device label fedora_dhcppc0 devid 3 transid 122 /dev/sdc2 >03:47:01,983 INFO kernel:[ 10.159484] device label fedora_dhcppc0 devid 4 transid 122 /dev/sdd2 >03:47:01,983 INFO kernel:[ 10.212233] device label fedora_dhcppc0 devid 2 transid 122 /dev/sdb2 >03:47:01,983 INFO kernel:[ 10.215039] device label fedora_dhcppc0 devid 1 transid 122 /dev/sda2 >03:47:01,983 INFO kernel:[ 10.222762] md: bind<sda1> >03:47:01,983 NOTICE kernel:[ 10.226719] type=1400 audit(1368244020.225:3): avc: denied { read } for pid=623 comm="mdadm" name="md127" dev="devtmpfs" ino=14272 scontext=system_u:system_r:mdadm_t:s0-s0:c0.c1023 tcontext=system_u:object_r:device_t:s0 tclass=blk_file >03:47:01,983 NOTICE kernel:[ 10.226741] type=1400 audit(1368244020.225:4): avc: denied { open } for pid=623 comm="mdadm" path="/dev/md127" dev="devtmpfs" ino=14272 scontext=system_u:system_r:mdadm_t:s0-s0:c0.c1023 tcontext=system_u:object_r:device_t:s0 tclass=blk_file >03:47:01,983 NOTICE kernel:[ 10.226766] type=1400 audit(1368244020.225:5): avc: denied { ioctl } for pid=623 comm="mdadm" 
path="/dev/md127" dev="devtmpfs" ino=14272 scontext=system_u:system_r:mdadm_t:s0-s0:c0.c1023 tcontext=system_u:object_r:device_t:s0 tclass=blk_file >03:47:01,984 INFO kernel:[ 10.228351] md: bind<sdb1> >03:47:01,984 INFO kernel:[ 10.231708] md: bind<sdc1> >03:47:01,984 INFO kernel:[ 10.235346] md: bind<sdd1> >03:47:01,984 INFO kernel:[ 10.247538] md/raid1:md127: active with 4 out of 4 mirrors >03:47:01,984 INFO kernel:[ 10.247554] md127: detected capacity change from 0 to 2139029504 >03:47:01,984 INFO kernel:[ 10.261765] md127: unknown partition table >03:47:01,984 NOTICE kernel:[ 11.029445] type=1400 audit(1368244021.028:6): avc: denied { create } for pid=642 comm="systemd-tmpfile" name="tmp" scontext=system_u:system_r:systemd_tmpfiles_t:s0 tcontext=system_u:object_r:var_t:s0 tclass=dir >03:47:01,984 NOTICE kernel:[ 11.029513] type=1400 audit(1368244021.028:7): avc: denied { setattr } for pid=642 comm="systemd-tmpfile" name="tmp" dev="dm-0" ino=20874 scontext=system_u:system_r:systemd_tmpfiles_t:s0 tcontext=system_u:object_r:var_t:s0 tclass=dir >03:47:01,984 NOTICE kernel:[ 11.029551] type=1400 audit(1368244021.028:8): avc: denied { relabelfrom } for pid=642 comm="systemd-tmpfile" name="tmp" dev="dm-0" ino=20874 scontext=system_u:system_r:systemd_tmpfiles_t:s0 tcontext=system_u:object_r:var_t:s0 tclass=dir >03:47:02,000 NOTICE kernel:[ 11.998088] type=1400 audit(1368244021.997:9): avc: denied { execute } for pid=665 comm="bash" name="hostname" dev="dm-0" ino=51617 scontext=system_u:system_r:getty_t:s0 tcontext=unconfined_u:object_r:hostname_exec_t:s0 tclass=file >03:47:02,000 NOTICE kernel:[ 11.998096] type=1400 audit(1368244021.997:10): avc: denied { read open } for pid=665 comm="bash" path="/usr/bin/hostname" dev="dm-0" ino=51617 scontext=system_u:system_r:getty_t:s0 tcontext=unconfined_u:object_r:hostname_exec_t:s0 tclass=file >03:47:02,000 NOTICE kernel:[ 11.998113] type=1400 audit(1368244021.997:11): avc: denied { execute_no_trans } for pid=665 comm="bash" 
path="/usr/bin/hostname" dev="dm-0" ino=51617 scontext=system_u:system_r:getty_t:s0 tcontext=unconfined_u:object_r:hostname_exec_t:s0 tclass=file >03:47:02,044 NOTICE kernel:[ 12.044251] type=1400 audit(1368244022.043:12): avc: denied { read } for pid=656 comm="bash" name=".profile" dev="dm-0" ino=33906 scontext=system_u:system_r:getty_t:s0 tcontext=unconfined_u:object_r:admin_home_t:s0 tclass=file >03:47:03,906 WARNING systemd: Cannot add dependency job for unit lvm2-monitor.service, ignoring: Unit dm-event.socket failed to load: No such file or directory. See system logs and 'systemctl status dm-event.socket' for details. >03:47:03,906 INFO systemd: Starting D-Bus System Message Bus... >03:47:03,906 INFO systemd: Started D-Bus System Message Bus. >03:47:04,091 INFO systemd: Started firewalld - dynamic firewall daemon. >03:47:04,091 INFO systemd: Starting Network Manager... >03:47:04,445 ERR firewalld: 2013-05-11 03:47:04 ERROR: ebtables not usable, disabling ethernet bridge firewall. >03:47:04,450 CRIT firewalld: 2013-05-11 03:47:04 FATAL ERROR: No IPv4 and IPv6 firewall. >03:47:04,451 ERR firewalld: 2013-05-11 03:47:04 ERROR: Raising SystemExit in run_server >03:47:04,453 INFO NetworkManager: <info> NetworkManager (version 0.9.8.1-1.git20130327.fc19) is starting... >03:47:04,454 INFO NetworkManager: <info> Read config file /etc/NetworkManager/NetworkManager.conf >03:47:04,454 INFO NetworkManager: <info> WEXT support is enabled >03:47:04,486 INFO dbus-daemon: dbus[666]: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' >03:47:04,504 NOTICE dbus: [system] Activating via systemd: service name='org.freedesktop.PolicyKit1' unit='polkit.service' >03:47:04,504 WARNING systemd: Cannot add dependency job for unit lvm2-monitor.service, ignoring: Unit dm-event.socket failed to load: No such file or directory. See system logs and 'systemctl status dm-event.socket' for details. 
>03:47:04,504 INFO systemd: Starting Authorization Manager... >03:47:04,515 INFO polkitd: Started polkitd version 0.110 >03:47:04,534 INFO dbus-daemon: dbus[666]: [system] Successfully activated service 'org.freedesktop.PolicyKit1' >03:47:04,534 NOTICE dbus: [system] Successfully activated service 'org.freedesktop.PolicyKit1' >03:47:04,536 INFO systemd: Started Authorization Manager. >03:47:04,597 WARNING NetworkManager: ifcfg-rh: Could not get hostname: failed to read /etc/sysconfig/network >03:47:04,597 NOTICE NetworkManager: ifcfg-rh: Acquired D-Bus service com.redhat.ifcfgrh1 >03:47:04,598 INFO NetworkManager: <info> Loaded plugin ifcfg-rh: (c) 2007 - 2010 Red Hat, Inc. To report bugs please use the NetworkManager mailing list. >03:47:04,598 INFO NetworkManager: <info> Loaded plugin keyfile: (c) 2007 - 2010 Red Hat, Inc. To report bugs please use the NetworkManager mailing list. >03:47:04,600 NOTICE NetworkManager: ifcfg-rh: parsing /etc/sysconfig/network-scripts/ifcfg-lo ... >03:47:04,612 INFO dbus-daemon: dbus[666]: [system] Activating via systemd: service name='org.freedesktop.login1' unit='dbus-org.freedesktop.login1.service' >03:47:04,612 NOTICE dbus: [system] Activating via systemd: service name='org.freedesktop.login1' unit='dbus-org.freedesktop.login1.service' >03:47:04,614 WARNING systemd: Cannot add dependency job for unit lvm2-monitor.service, ignoring: Unit dm-event.socket failed to load: No such file or directory. See system logs and 'systemctl status dm-event.socket' for details. >03:47:04,614 INFO systemd: Starting Login Service... >03:47:04,620 INFO dbus-daemon: dbus[666]: [system] Successfully activated service 'org.freedesktop.login1' >03:47:04,620 NOTICE dbus: [system] Successfully activated service 'org.freedesktop.login1' >03:47:04,620 INFO systemd: Started Login Service. >03:47:04,622 INFO systemd-logind: Watching system buttons on /dev/input/event0 (Power Button) >03:47:04,626 INFO systemd-logind: New seat seat0. 
>03:47:04,629 INFO NetworkManager: <info> monitoring kernel firmware directory '/lib/firmware'. >03:47:04,636 INFO systemd: Started Network Manager. >03:47:04,636 INFO systemd: Starting Anaconda System Services. >03:47:04,639 INFO systemd: Reached target Anaconda System Services. >03:47:04,644 INFO systemd: Starting Anaconda... >03:47:04,644 INFO systemd: Starting Network Manager Wait Online... >03:47:04,645 INFO NetworkManager: <info> WiFi enabled by radio killswitch; enabled by state file >03:47:04,646 INFO NetworkManager: <info> WWAN enabled by radio killswitch; enabled by state file >03:47:04,647 INFO NetworkManager: <info> WiMAX enabled by radio killswitch; enabled by state file >03:47:04,648 INFO NetworkManager: <info> Networking is enabled by state file >03:47:04,650 WARNING NetworkManager: <warn> failed to allocate link cache: (-10) Operation not supported >03:47:04,651 INFO NetworkManager: <info> (eth0): carrier is OFF >03:47:04,653 INFO NetworkManager: <info> (eth0): new Ethernet device (driver: 'virtio_net' ifindex: 2) >03:47:04,653 INFO NetworkManager: <info> (eth0): exported as /org/freedesktop/NetworkManager/Devices/0 >03:47:04,653 INFO NetworkManager: <info> (eth0): device state change: unmanaged -> unavailable (reason 'managed') [10 20 2] >03:47:04,653 INFO NetworkManager: <info> (eth0): bringing up device. >03:47:04,654 INFO NetworkManager: <info> (eth0): carrier now ON (device state 20) >03:47:04,654 INFO NetworkManager: <info> (eth0): preparing device. >03:47:04,655 INFO NetworkManager: <info> (eth0): deactivating device (reason 'managed') [2] >03:47:04,655 INFO NetworkManager: <info> Added default wired connection 'Wired connection 1' for /sys/devices/pci0000:00/0000:00:03.0/virtio0/net/eth0 >03:47:04,686 WARNING NetworkManager: <warn> /sys/devices/virtual/net/lo: couldn't determine device driver; ignoring... >03:47:04,687 WARNING NetworkManager: <warn> /sys/devices/virtual/net/lo: couldn't determine device driver; ignoring... 
>03:47:04,695 INFO NetworkManager: <info> (eth0): device state change: unavailable -> disconnected (reason 'none') [20 30 0] >03:47:04,695 INFO NetworkManager: <info> Auto-activating connection 'Wired connection 1'. >03:47:04,699 INFO NetworkManager: <info> Activation (eth0) starting connection 'Wired connection 1' >03:47:04,700 INFO NetworkManager: <info> (eth0): device state change: disconnected -> prepare (reason 'none') [30 40 0] >03:47:04,700 INFO NetworkManager: <info> Activation (eth0) Stage 1 of 5 (Device Prepare) scheduled... >03:47:04,703 INFO NetworkManager: <info> Activation (eth0) Stage 1 of 5 (Device Prepare) started... >03:47:04,704 INFO NetworkManager: <info> Activation (eth0) Stage 2 of 5 (Device Configure) scheduled... >03:47:04,704 INFO NetworkManager: <info> Activation (eth0) Stage 1 of 5 (Device Prepare) complete. >03:47:04,705 INFO NetworkManager: <info> Activation (eth0) Stage 2 of 5 (Device Configure) starting... >03:47:04,708 INFO NetworkManager: <info> (eth0): device state change: prepare -> config (reason 'none') [40 50 0] >03:47:04,719 INFO NetworkManager: <info> Activation (eth0) Stage 2 of 5 (Device Configure) successful. >03:47:04,720 INFO NetworkManager: <info> Activation (eth0) Stage 3 of 5 (IP Configure Start) scheduled. >03:47:04,721 INFO NetworkManager: <info> Activation (eth0) Stage 2 of 5 (Device Configure) complete. >03:47:04,722 INFO NetworkManager: <info> Activation (eth0) Stage 3 of 5 (IP Configure Start) started... >03:47:04,725 INFO NetworkManager: <info> (eth0): device state change: config -> ip-config (reason 'none') [50 70 0] >03:47:04,725 INFO NetworkManager: <info> Activation (eth0) Beginning DHCPv4 transaction (timeout in 45 seconds) >03:47:04,743 INFO systemd: Started Anaconda. >03:47:04,754 INFO systemd: Starting Anaconda Text Console... >03:47:04,768 INFO NetworkManager: <info> dhclient started with pid 688 >03:47:04,770 INFO NetworkManager: <info> Activation (eth0) Beginning IP6 addrconf. 
>03:47:04,773 INFO NetworkManager: <info> Activation (eth0) Stage 3 of 5 (IP Configure Start) complete. >03:47:04,828 INFO systemd: Started Anaconda Text Console. >03:47:05,205 INFO dhclient: Internet Systems Consortium DHCP Client 4.2.5 >03:47:05,205 INFO dhclient: Copyright 2004-2013 Internet Systems Consortium. >03:47:05,205 INFO dhclient: All rights reserved. >03:47:05,206 INFO dhclient: For info, please visit https://www.isc.org/software/dhcp/ >03:47:05,206 INFO dhclient: >03:47:05,220 INFO NetworkManager: <info> (eth0): DHCPv4 state changed nbi -> preinit >03:47:05,222 INFO dhclient: Listening on LPF/eth0/52:54:00:a0:38:a0 >03:47:05,222 INFO dhclient: Sending on LPF/eth0/52:54:00:a0:38:a0 >03:47:05,223 INFO dhclient: Sending on Socket/fallback >03:47:05,223 INFO dhclient: DHCPDISCOVER on eth0 to 255.255.255.255 port 67 interval 8 (xid=0x683b6ee6) >03:47:05,256 INFO dhclient: DHCPREQUEST on eth0 to 255.255.255.255 port 67 (xid=0x683b6ee6) >03:47:05,257 INFO dhclient: DHCPOFFER from 192.168.100.1 >03:47:05,315 INFO dhclient: DHCPACK from 192.168.100.1 (xid=0x683b6ee6) >03:47:05,327 INFO NetworkManager: <info> (eth0): DHCPv4 state changed preinit -> bound >03:47:05,327 INFO NetworkManager: <info> address 192.168.100.106 >03:47:05,327 INFO NetworkManager: <info> prefix 24 (255.255.255.0) >03:47:05,328 INFO NetworkManager: <info> gateway 192.168.100.1 >03:47:05,328 INFO NetworkManager: <info> nameserver '192.168.100.1' >03:47:05,328 INFO NetworkManager: <info> nameserver '198.41.0.4' >03:47:05,328 INFO NetworkManager: <info> Activation (eth0) Stage 5 of 5 (IPv4 Configure Commit) scheduled... >03:47:05,330 INFO NetworkManager: <info> Activation (eth0) Stage 5 of 5 (IPv4 Commit) started... >03:47:05,331 INFO dhclient: bound to 192.168.100.106 -- renewal in 117289 seconds. 
>03:47:06,332 INFO NetworkManager: <info> (eth0): device state change: ip-config -> secondaries (reason 'none') [70 90 0] >03:47:06,333 INFO NetworkManager: <info> Activation (eth0) Stage 5 of 5 (IPv4 Commit) complete. >03:47:06,333 INFO NetworkManager: <info> (eth0): device state change: secondaries -> activated (reason 'none') [90 100 0] >03:47:06,349 INFO systemd: Started Network Manager Wait Online. >03:47:06,349 INFO systemd: Starting Network. >03:47:06,356 INFO systemd: Reached target Network. >03:47:06,356 INFO systemd: Starting Login and scanning of iSCSI devices... >03:47:06,357 INFO NetworkManager: <info> Policy set 'Wired connection 1' (eth0) as default for IPv4 routing and DNS. >03:47:06,403 INFO NetworkManager: <info> Activation (eth0) successful, device activated. >03:47:06,407 INFO dbus-daemon: dbus[666]: [system] Activating service name='org.freedesktop.nm_dispatcher' (using servicehelper) >03:47:06,407 NOTICE dbus: [system] Activating service name='org.freedesktop.nm_dispatcher' (using servicehelper) >03:47:06,432 INFO iscsiadm: iscsiadm: No records found >03:47:06,434 NOTICE systemd: iscsi.service: main process exited, code=exited, status=21/n/a >03:47:06,434 ERR systemd: Failed to start Login and scanning of iSCSI devices. >03:47:06,435 INFO systemd: Startup finished in 3.871s (kernel) + 3.018s (initrd) + 9.545s (userspace) = 16.435s. >03:47:06,435 NOTICE systemd: Unit iscsi.service entered failed state. 
>03:47:06,448 INFO dbus-daemon: dbus[666]: [system] Successfully activated service 'org.freedesktop.nm_dispatcher' >03:47:06,448 NOTICE dbus: [system] Successfully activated service 'org.freedesktop.nm_dispatcher' >03:47:07,232 WARNING kernel:[ 17.232648] audit_printk_skb: 27 callbacks suppressed >03:47:07,232 NOTICE kernel:[ 17.232652] type=1400 audit(1368244027.231:22): avc: denied { read write } for pid=732 comm="auditd" path="/dev/mapper/control" dev="devtmpfs" ino=8112 scontext=system_u:system_r:auditd_t:s0 tcontext=system_u:object_r:lvm_control_t:s0 tclass=chr_file >03:47:07,451 INFO kernel:[ 17.451747] device label fedora_dhcppc0 devid 2 transid 122 /dev/sdb2 >03:47:07,454 INFO kernel:[ 17.454299] device label fedora_dhcppc0 devid 4 transid 122 /dev/sdd2 >03:47:07,455 INFO kernel:[ 17.455765] device label fedora_dhcppc0 devid 3 transid 122 /dev/sdc2 >03:47:07,468 INFO kernel:[ 17.468605] device label fedora_dhcppc0 devid 1 transid 122 /dev/sda2 >03:47:11,335 NOTICE kernel:[ 21.335640] type=1400 audit(1368244031.334:23): avc: denied { read write } for pid=770 comm="dbus-daemon" path="/dev/pts/0" dev="devpts" ino=3 scontext=system_u:system_r:system_dbusd_t:s0-s0:c0.c1023 tcontext=system_u:object_r:devpts_t:s0 tclass=chr_file >03:47:11,990 NOTICE kernel:[ 21.990238] type=1400 audit(1368244031.989:24): avc: denied { open } for pid=662 comm="in:imfile" path="/tmp/X.log" dev="tmpfs" ino=16658 scontext=system_u:system_r:syslogd_t:s0 tcontext=system_u:object_r:xdm_tmp_t:s0 tclass=file >03:47:12,591 NOTICE NetworkManager: ifcfg-rh: read connection 'eth0' >03:47:12,596 INFO NetworkManager: <info> (eth0): device state change: activated -> disconnected (reason 'connection-removed') [100 30 38] >03:47:12,596 INFO NetworkManager: <info> (eth0): deactivating device (reason 'connection-removed') [38] >03:47:12,598 INFO NetworkManager: <info> (eth0): canceled DHCP transaction, DHCP client pid 688 >03:47:12,603 INFO NetworkManager: <info> Setting system hostname to 
'localhost.localdomain' (no default device) >03:47:12,609 INFO NetworkManager: <info> Saved default wired connection 'eth0' to persistent storage >03:47:12,610 INFO NetworkManager: <info> Auto-activating connection 'eth0'. >03:47:12,611 INFO NetworkManager: <info> Activation (eth0) starting connection 'eth0' >03:47:12,611 INFO NetworkManager: <info> (eth0): device state change: disconnected -> prepare (reason 'none') [30 40 0] >03:47:12,611 INFO NetworkManager: <info> Activation (eth0) Stage 1 of 5 (Device Prepare) scheduled... >03:47:12,612 INFO NetworkManager: <info> Activation (eth0) Stage 1 of 5 (Device Prepare) started... >03:47:12,612 INFO NetworkManager: <info> Activation (eth0) Stage 2 of 5 (Device Configure) scheduled... >03:47:12,613 INFO NetworkManager: <info> Activation (eth0) Stage 1 of 5 (Device Prepare) complete. >03:47:12,614 INFO NetworkManager: <info> Activation (eth0) Stage 2 of 5 (Device Configure) starting... >03:47:12,615 INFO NetworkManager: <info> (eth0): device state change: prepare -> config (reason 'none') [40 50 0] >03:47:12,615 INFO NetworkManager: <info> Activation (eth0) Stage 2 of 5 (Device Configure) successful. >03:47:12,616 INFO NetworkManager: <info> Activation (eth0) Stage 3 of 5 (IP Configure Start) scheduled. >03:47:12,617 INFO NetworkManager: <info> Activation (eth0) Stage 2 of 5 (Device Configure) complete. >03:47:12,620 INFO NetworkManager: <info> Activation (eth0) Stage 3 of 5 (IP Configure Start) started... >03:47:12,620 INFO NetworkManager: <info> (eth0): device state change: config -> ip-config (reason 'none') [50 70 0] >03:47:12,620 INFO NetworkManager: <info> Activation (eth0) Beginning DHCPv4 transaction (timeout in 45 seconds) >03:47:12,624 INFO NetworkManager: <info> dhclient started with pid 789 >03:47:12,624 INFO NetworkManager: <info> Activation (eth0) Beginning IP6 addrconf. >03:47:12,625 INFO NetworkManager: <info> Activation (eth0) Stage 3 of 5 (IP Configure Start) complete. 
>03:47:12,719 INFO dhclient: Internet Systems Consortium DHCP Client 4.2.5 >03:47:12,719 INFO dhclient: Copyright 2004-2013 Internet Systems Consortium. >03:47:12,719 INFO dhclient: All rights reserved. >03:47:12,719 INFO dhclient: For info, please visit https://www.isc.org/software/dhcp/ >03:47:12,720 INFO dhclient: >03:47:12,734 INFO dhclient: Listening on LPF/eth0/52:54:00:a0:38:a0 >03:47:12,734 INFO dhclient: Sending on LPF/eth0/52:54:00:a0:38:a0 >03:47:12,735 INFO dhclient: Sending on Socket/fallback >03:47:12,735 INFO dhclient: DHCPREQUEST on eth0 to 255.255.255.255 port 67 (xid=0x27ab47e) >03:47:12,736 INFO NetworkManager: <info> (eth0): DHCPv4 state changed nbi -> preinit >03:47:12,770 INFO dhclient: DHCPACK from 192.168.100.1 (xid=0x27ab47e) >03:47:12,786 NOTICE NetworkManager: ifcfg-rh: updating /etc/sysconfig/network-scripts/ifcfg-eth0 >03:47:12,807 INFO dhclient: bound to 192.168.100.106 -- renewal in 101739 seconds. >03:47:12,808 INFO NetworkManager: <info> (eth0): DHCPv4 state changed preinit -> reboot >03:47:12,808 INFO NetworkManager: <info> address 192.168.100.106 >03:47:12,808 INFO NetworkManager: <info> prefix 24 (255.255.255.0) >03:47:12,809 INFO NetworkManager: <info> gateway 192.168.100.1 >03:47:12,809 INFO NetworkManager: <info> nameserver '192.168.100.1' >03:47:12,809 INFO NetworkManager: <info> nameserver '198.41.0.4' >03:47:12,809 INFO NetworkManager: <info> Activation (eth0) Stage 5 of 5 (IPv4 Configure Commit) scheduled... >03:47:12,810 INFO NetworkManager: <info> Activation (eth0) Stage 5 of 5 (IPv4 Commit) started... 
>03:47:13,117 NOTICE kernel:[ 23.117158] type=1400 audit(1368244033.116:25): avc: denied { read write } for pid=825 comm="mdadm" path="/dev/mapper/control" dev="devtmpfs" ino=8112 scontext=system_u:system_r:mdadm_t:s0 tcontext=system_u:object_r:lvm_control_t:s0 tclass=chr_file >03:47:13,407 INFO kernel:[ 23.407290] device label fedora_dhcppc0 devid 1 transid 122 /dev/sda2 >03:47:13,455 INFO kernel:[ 23.455465] device label fedora_dhcppc0 devid 1 transid 122 /dev/sda2 >03:47:13,458 INFO kernel:[ 23.458191] btrfs: disk space caching is enabled >03:47:13,482 DEBUG kernel:[ 23.482867] SELinux: initialized (dev sdc2, type btrfs), uses xattr >03:47:13,813 INFO NetworkManager: <info> (eth0): device state change: ip-config -> secondaries (reason 'none') [70 90 0] >03:47:13,816 INFO NetworkManager: <info> Activation (eth0) Stage 5 of 5 (IPv4 Commit) complete. >03:47:13,830 INFO NetworkManager: <info> (eth0): device state change: secondaries -> activated (reason 'none') [90 100 0] >03:47:13,831 INFO NetworkManager: <info> Policy set 'eth0' (eth0) as default for IPv4 routing and DNS. >03:47:13,832 INFO NetworkManager: <info> Activation (eth0) successful, device activated. >03:47:15,664 WARNING systemd: Cannot add dependency job for unit lvm2-monitor.service, ignoring: Unit dm-event.socket failed to load: No such file or directory. See system logs and 'systemctl status dm-event.socket' for details. >03:47:15,664 INFO systemd: Starting NTP client/server... >03:47:15,747 INFO chrony-helper: tr: write error: Broken pipe >03:47:15,747 INFO chrony-helper: tr: write error >03:47:15,798 INFO chronyd: chronyd version 1.27 starting >03:47:15,799 INFO chronyd: Linux kernel major=3 minor=9 patch=0 >03:47:15,799 INFO chronyd: hz=100 shift_hz=7 freq_scale=1.00000000 nominal_tick=10000 slew_delta_tick=833 max_tick_bias=1000 shift_pll=2 >03:47:15,811 INFO systemd: Started NTP client/server. >03:47:15,811 INFO systemd: Starting Wait for chrony to synchronize system clock... 
>03:47:16,538 INFO kernel:[ 26.538855] device label fedora_dhcppc0 devid 4 transid 124 /dev/sdd2 >03:47:16,552 INFO kernel:[ 26.552629] device label fedora_dhcppc0 devid 4 transid 124 /dev/sdd2 >03:47:16,901 NOTICE kernel:[ 26.901828] type=1400 audit(1368244036.899:26): avc: denied { read write } for pid=909 comm="mdadm" path="/dev/mapper/control" dev="devtmpfs" ino=8112 scontext=system_u:system_r:mdadm_t:s0 tcontext=system_u:object_r:lvm_control_t:s0 tclass=chr_file >03:47:17,041 INFO kernel:[ 27.041457] device label fedora_dhcppc0 devid 3 transid 124 /dev/sdc2 >03:47:17,046 INFO kernel:[ 27.046095] device label fedora_dhcppc0 devid 3 transid 124 /dev/sdc2 >03:47:17,788 INFO kernel:[ 27.788220] device label fedora_dhcppc0 devid 2 transid 124 /dev/sdb2 >03:47:17,795 INFO kernel:[ 27.795899] device label fedora_dhcppc0 devid 2 transid 124 /dev/sdb2 >03:47:18,530 WARNING kernel:[ 28.530197] md: md127 still in use. >03:47:18,730 INFO kernel:[ 28.730373] md127: detected capacity change from 2139029504 to 0 >03:47:18,730 INFO kernel:[ 28.730470] md: md127 stopped. 
>03:47:18,730 INFO kernel:[ 28.730476] md: unbind<sdd1> >03:47:18,730 INFO kernel:[ 28.730479] md: export_rdev(sdd1) >03:47:18,730 INFO kernel:[ 28.730672] md: unbind<sdc1> >03:47:18,730 INFO kernel:[ 28.730676] md: export_rdev(sdc1) >03:47:18,730 INFO kernel:[ 28.730797] md: unbind<sdb1> >03:47:18,730 INFO kernel:[ 28.730800] md: export_rdev(sdb1) >03:47:18,730 INFO kernel:[ 28.730916] md: unbind<sda1> >03:47:18,730 INFO kernel:[ 28.730919] md: export_rdev(sda1) >03:47:19,989 INFO kernel:[ 29.989841] device label fedora_dhcppc0 devid 1 transid 124 /dev/sda2 >03:47:19,991 INFO kernel:[ 29.991849] btrfs: disk space caching is enabled >03:47:20,079 DEBUG kernel:[ 30.079832] SELinux: initialized (dev sdc2, type btrfs), uses xattr >03:47:20,574 INFO kernel:[ 30.574837] device label fedora_dhcppc0 devid 1 transid 124 /dev/sda2 >03:47:20,576 INFO kernel:[ 30.576135] btrfs: disk space caching is enabled >03:47:20,650 DEBUG kernel:[ 30.650191] SELinux: initialized (dev sdc2, type btrfs), uses xattr >03:47:22,700 INFO chronyd: Selected source 200.58.118.148 >03:47:22,700 WARNING chronyd: System clock wrong by 45.375270 seconds, adjustment started >03:47:33,001 INFO NetworkManager: <info> (eth0): IP6 addrconf timed out or failed. >03:47:33,002 INFO NetworkManager: <info> Activation (eth0) Stage 4 of 5 (IPv6 Configure Timeout) scheduled... >03:47:33,002 INFO NetworkManager: <info> Activation (eth0) Stage 4 of 5 (IPv6 Configure Timeout) started... >03:47:33,002 INFO NetworkManager: <info> Activation (eth0) Stage 4 of 5 (IPv6 Configure Timeout) complete. 
>03:47:36,025 NOTICE kernel:[ 46.025514] type=1400 audit(1368244056.023:27): avc: denied { read write } for pid=1036 comm="ntpdate" path="/dev/mapper/control" dev="devtmpfs" ino=8112 scontext=system_u:system_r:ntpd_t:s0 tcontext=system_u:object_r:lvm_control_t:s0 tclass=chr_file >03:47:36,025 NOTICE kernel:[ 46.025959] type=1400 audit(1368244056.023:28): avc: denied { read } for pid=1036 comm="ntpdate" path="/proc/685/mounts" dev="proc" ino=16173 scontext=system_u:system_r:ntpd_t:s0 tcontext=system_u:system_r:initrc_t:s0 tclass=file >03:47:36,396 INFO kernel:[ 46.396273] SGI XFS with ACLs, security attributes, large block/inode numbers, no debug enabled >03:51:04,286 NOTICE kernel:[ 254.285729] type=1400 audit(1368244264.283:29): avc: denied { read write } for pid=1068 comm="hwclock" path="/dev/pts/0" dev="devpts" ino=3 scontext=system_u:system_r:hwclock_t:s0 tcontext=system_u:object_r:devpts_t:s0 tclass=chr_file >03:51:04,286 NOTICE kernel:[ 254.286495] type=1400 audit(1368244264.284:30): avc: denied { read write } for pid=1068 comm="hwclock" path="/dev/mapper/control" dev="devtmpfs" ino=8112 scontext=system_u:system_r:hwclock_t:s0 tcontext=system_u:object_r:lvm_control_t:s0 tclass=chr_file >03:51:04,286 NOTICE kernel:[ 254.286927] type=1400 audit(1368244264.284:31): avc: denied { read } for pid=1068 comm="hwclock" path="/proc/685/mounts" dev="proc" ino=16173 scontext=system_u:system_r:hwclock_t:s0 tcontext=system_u:system_r:initrc_t:s0 tclass=file >03:51:04,502 NOTICE kernel:[ 254.502108] type=1400 audit(1368244264.500:32): avc: denied { ioctl } for pid=1068 comm="hwclock" path="/dev/pts/0" dev="devpts" ino=3 scontext=system_u:system_r:hwclock_t:s0 tcontext=system_u:object_r:devpts_t:s0 tclass=chr_file >03:51:04,502 NOTICE kernel:[ 254.502121] type=1400 audit(1368244264.500:33): avc: denied { getattr } for pid=1068 comm="hwclock" path="/dev/pts/0" dev="devpts" ino=3 scontext=system_u:system_r:hwclock_t:s0 tcontext=system_u:object_r:devpts_t:s0 tclass=chr_file 
>03:51:04,996 INFO kernel:[ 254.996889] device label fedora_dhcppc0 devid 1 transid 124 /dev/sda2 >03:51:05,002 INFO kernel:[ 255.002716] btrfs: disk space caching is enabled >03:51:05,005 DEBUG kernel:[ 255.005893] SELinux: initialized (dev sdc2, type btrfs), uses xattr >03:51:08,426 INFO kernel:[ 258.426618] device label fedora_dhcppc0 devid 1 transid 127 /dev/sda2 >03:51:08,428 INFO kernel:[ 258.428280] btrfs: disk space caching is enabled >03:51:08,451 DEBUG kernel:[ 258.450740] SELinux: initialized (dev sdc2, type btrfs), uses xattr >03:51:23,216 INFO kernel:[ 273.216283] md: md127 stopped. >03:51:23,222 INFO kernel:[ 273.222203] md: bind<sdb1> >03:51:23,222 INFO kernel:[ 273.222415] md: bind<sdc1> >03:51:23,222 INFO kernel:[ 273.222637] md: bind<sdd1> >03:51:23,222 INFO kernel:[ 273.222833] md: bind<sda1> >03:51:23,236 INFO kernel:[ 273.236396] md/raid1:md127: active with 4 out of 4 mirrors >03:51:23,236 INFO kernel:[ 273.236416] md127: detected capacity change from 0 to 2139029504 >03:51:23,242 INFO kernel:[ 273.242393] md127: unknown partition table >03:51:23,835 WARNING kernel:[ 273.835382] md: md127 still in use. >03:51:24,035 INFO kernel:[ 274.035722] md127: detected capacity change from 2139029504 to 0 >03:51:24,035 INFO kernel:[ 274.035960] md: md127 stopped. 
>03:51:24,035 INFO kernel:[ 274.035972] md: unbind<sda1> >03:51:24,035 INFO kernel:[ 274.035981] md: export_rdev(sda1) >03:51:24,036 INFO kernel:[ 274.036778] md: unbind<sdd1> >03:51:24,036 INFO kernel:[ 274.036789] md: export_rdev(sdd1) >03:51:24,036 INFO kernel:[ 274.036891] md: unbind<sdc1> >03:51:24,036 INFO kernel:[ 274.036899] md: export_rdev(sdc1) >03:51:24,036 INFO kernel:[ 274.036990] md: unbind<sdb1> >03:51:24,036 INFO kernel:[ 274.036998] md: export_rdev(sdb1) >03:51:31,952 NOTICE kernel:[ 281.952191] type=1400 audit(1368244291.951:34): avc: denied { read } for pid=1345 comm="mdadm" name="urandom" dev="devtmpfs" ino=4650 scontext=system_u:system_r:mdadm_t:s0 tcontext=system_u:object_r:urandom_device_t:s0 tclass=chr_file >03:51:31,952 NOTICE kernel:[ 281.952201] type=1400 audit(1368244291.951:35): avc: denied { open } for pid=1345 comm="mdadm" path="/dev/urandom" dev="devtmpfs" ino=4650 scontext=system_u:system_r:mdadm_t:s0 tcontext=system_u:object_r:urandom_device_t:s0 tclass=chr_file >03:51:32,351 INFO kernel:[ 282.352001] md: bind<sda3> >03:51:32,354 INFO kernel:[ 282.353167] md: bind<sdb3> >03:51:32,354 INFO kernel:[ 282.353411] md: bind<sdc3> >03:51:32,354 INFO kernel:[ 282.353723] md: bind<sdd3> >03:51:32,360 NOTICE kernel:[ 282.360155] md/raid10:md127: not clean -- starting background reconstruction >03:51:32,360 INFO kernel:[ 282.360161] md/raid10:md127: active with 4 out of 4 devices >03:51:32,360 INFO kernel:[ 282.360196] md127: detected capacity change from 0 to 804257792 >03:51:32,363 INFO kernel:[ 282.363588] md: resync of RAID array md127 >03:51:32,363 INFO kernel:[ 282.363591] md: minimum _guaranteed_ speed: 1000 KB/sec/disk. >03:51:32,363 INFO kernel:[ 282.363593] md: using maximum available idle IO bandwidth (but not more than 200000 KB/sec) for resync. >03:51:32,363 INFO kernel:[ 282.363596] md: using 128k window, over a total of 785408k. 
>03:51:32,364 INFO kernel:[ 282.364634] md127: unknown partition table >03:51:33,141 NOTICE kernel:[ 283.141344] type=1400 audit(1368244293.139:36): avc: denied { read write } for pid=1367 comm="mdadm" path="/dev/mapper/control" dev="devtmpfs" ino=8112 scontext=system_u:system_r:mdadm_t:s0 tcontext=system_u:object_r:lvm_control_t:s0 tclass=chr_file >03:51:33,491 INFO kernel:[ 283.491894] md: bind<sda2> >03:51:33,492 INFO kernel:[ 283.492932] md: bind<sdb2> >03:51:33,494 INFO kernel:[ 283.493486] md: bind<sdc2> >03:51:33,494 INFO kernel:[ 283.493909] md: bind<sdd2> >03:51:33,499 NOTICE kernel:[ 283.499182] md/raid1:md126: not clean -- starting background reconstruction >03:51:33,499 INFO kernel:[ 283.499185] md/raid1:md126: active with 4 out of 4 mirrors >03:51:33,499 INFO kernel:[ 283.499202] md126: detected capacity change from 0 to 536805376 >03:51:33,501 INFO kernel:[ 283.501248] md126: unknown partition table >03:51:33,503 INFO kernel:[ 283.503625] md: delaying resync of md126 until md127 has finished (they share one or more physical units) >03:51:38,259 INFO kernel:[ 288.259599] md: bind<sda1> >03:51:38,259 INFO kernel:[ 288.259935] md: bind<sdb1> >03:51:38,260 INFO kernel:[ 288.260597] md: bind<sdc1> >03:51:38,260 INFO kernel:[ 288.260989] md: bind<sdd1> >03:51:38,262 NOTICE kernel:[ 288.262862] md/raid10:md125: not clean -- starting background reconstruction >03:51:38,262 INFO kernel:[ 288.262864] md/raid10:md125: active with 4 out of 4 devices >03:51:38,263 INFO kernel:[ 288.263331] created bitmap (1 pages) for device md125 >03:51:38,263 INFO kernel:[ 288.263576] md125: bitmap initialized from disk: read 1 pages, set 94 of 94 bits >03:51:38,319 INFO kernel:[ 288.319215] md125: detected capacity change from 0 to 6294601728 >03:51:38,322 INFO kernel:[ 288.322117] md: delaying resync of md125 until md126 has finished (they share one or more physical units) >03:51:38,322 INFO kernel:[ 288.322127] md: delaying resync of md126 until md127 has finished (they share 
one or more physical units) >03:51:38,338 INFO kernel:[ 288.338631] md125: unknown partition table >03:51:54,172 INFO kernel:[ 304.169830] Adding 785404k swap on /dev/md127. Priority:-1 extents:1 across:785404k >03:51:54,334 INFO kernel:[ 304.334444] EXT4-fs (md125): mounted filesystem with ordered data mode. Opts: (null) >03:51:54,334 DEBUG kernel:[ 304.334461] SELinux: initialized (dev md125, type ext4), uses xattr >03:51:54,503 INFO kernel:[ 304.502537] EXT4-fs (md126): mounted filesystem with ordered data mode. Opts: (null) >03:51:54,503 DEBUG kernel:[ 304.502556] SELinux: initialized (dev md126, type ext4), uses xattr >03:51:54,591 DEBUG kernel:[ 304.591177] SELinux: initialized (dev tmpfs, type tmpfs), uses transition SIDs >03:51:58,877 ALERT kernel:[ 308.876973] md/raid10:md125: Disk failure on sda1, disabling device. >03:51:58,877 ALERT kernel:[ 308.876973] md/raid10:md125: Operation continuing on 3 devices. >03:51:58,877 ALERT kernel:[ 308.876979] md/raid10:md125: Disk failure on sdb1, disabling device. >03:51:58,877 ALERT kernel:[ 308.876979] md/raid10:md125: Operation continuing on 2 devices. >03:51:59,055 ALERT kernel:[ 309.055829] md/raid10:md125: Disk failure on sdc1, disabling device. >03:51:59,055 ALERT kernel:[ 309.055829] md/raid10:md125: Operation continuing on 1 devices. >03:51:59,055 ALERT kernel:[ 309.055835] md/raid10:md125: Disk failure on sdd1, disabling device. >03:51:59,055 ALERT kernel:[ 309.055835] md/raid10:md125: Operation continuing on 0 devices. >03:51:59,055 ERR kernel:[ 309.055855] md125: WRITE SAME failed. Manually zeroing. >03:51:59,605 ERR kernel:[ 309.605888] Buffer I/O error on device md125, logical block 557056 >03:51:59,605 WARNING kernel:[ 309.605892] lost page write due to I/O error on md125 >03:51:59,605 ERR kernel:[ 309.605895] JBD2: Error -5 detected when updating journal superblock for md125-8. >03:51:59,605 ERR kernel:[ 309.605991] Aborting journal on device md125-8. 
>03:51:59,605 ERR kernel:[ 309.605994] Buffer I/O error on device md125, logical block 557056 >03:51:59,605 WARNING kernel:[ 309.605996] lost page write due to I/O error on md125 >03:51:59,605 ERR kernel:[ 309.605998] JBD2: Error -5 detected when updating journal superblock for md125-8. >03:51:59,606 ERR kernel:[ 309.606527] Buffer I/O error on device md125, logical block 0 >03:51:59,606 WARNING kernel:[ 309.606529] lost page write due to I/O error on md125 >03:51:59,606 CRIT kernel:[ 309.606532] EXT4-fs error (device md125): __ext4_journal_start_sb:60: Detected aborted journal >03:51:59,606 CRIT kernel:[ 309.606535] EXT4-fs (md125): Remounting filesystem read-only >03:51:59,606 ERR kernel:[ 309.606537] EXT4-fs (md125): previous I/O error to superblock detected >03:51:59,606 ERR kernel:[ 309.606539] Buffer I/O error on device md125, logical block 0 >03:51:59,606 WARNING kernel:[ 309.606540] lost page write due to I/O error on md125 >03:52:00,318 INFO kernel:[ 310.314717] md: md127: resync done. >03:52:00,395 INFO kernel:[ 310.390497] md: resync of RAID array md126 >03:52:00,396 INFO kernel:[ 310.390501] md: minimum _guaranteed_ speed: 1000 KB/sec/disk. >03:52:00,396 INFO kernel:[ 310.390503] md: using maximum available idle IO bandwidth (but not more than 200000 KB/sec) for resync. >03:52:00,396 INFO kernel:[ 310.390507] md: using 128k window, over a total of 524224k. 
>03:52:00,396 DEBUG kernel:[ 310.392193] RAID10 conf printout: >03:52:00,396 DEBUG kernel:[ 310.392195] --- wd:0 rd:4 >03:52:00,396 DEBUG kernel:[ 310.392198] disk 0, wo:1, o:0, dev:sda1 >03:52:00,396 DEBUG kernel:[ 310.392200] disk 1, wo:1, o:0, dev:sdb1 >03:52:00,396 DEBUG kernel:[ 310.392201] disk 2, wo:1, o:0, dev:sdc1 >03:52:00,396 DEBUG kernel:[ 310.392203] disk 3, wo:1, o:0, dev:sdd1 >03:52:00,396 DEBUG kernel:[ 310.392204] RAID10 conf printout: >03:52:00,396 DEBUG kernel:[ 310.392205] --- wd:0 rd:4 >03:52:00,396 DEBUG kernel:[ 310.392206] disk 0, wo:1, o:0, dev:sda1 >03:52:00,396 DEBUG kernel:[ 310.392207] disk 1, wo:1, o:0, dev:sdb1 >03:52:00,396 DEBUG kernel:[ 310.392208] disk 2, wo:1, o:0, dev:sdc1 >03:52:00,396 DEBUG kernel:[ 310.392241] RAID10 conf printout: >03:52:00,396 DEBUG kernel:[ 310.392242] --- wd:0 rd:4 >03:52:00,396 DEBUG kernel:[ 310.392244] disk 0, wo:1, o:0, dev:sda1 >03:52:00,396 DEBUG kernel:[ 310.392245] disk 1, wo:1, o:0, dev:sdb1 >03:52:00,396 DEBUG kernel:[ 310.392246] disk 2, wo:1, o:0, dev:sdc1 >03:52:00,396 DEBUG kernel:[ 310.392247] RAID10 conf printout: >03:52:00,396 DEBUG kernel:[ 310.392247] --- wd:0 rd:4 >03:52:00,396 DEBUG kernel:[ 310.392248] disk 0, wo:1, o:0, dev:sda1 >03:52:00,396 DEBUG kernel:[ 310.392249] disk 1, wo:1, o:0, dev:sdb1 >03:52:00,396 DEBUG kernel:[ 310.392251] RAID10 conf printout: >03:52:00,396 DEBUG kernel:[ 310.392252] --- wd:0 rd:4 >03:52:00,396 DEBUG kernel:[ 310.392253] disk 0, wo:1, o:0, dev:sda1 >03:52:00,396 DEBUG kernel:[ 310.392254] disk 1, wo:1, o:0, dev:sdb1 >03:52:00,396 DEBUG kernel:[ 310.392263] RAID10 conf printout: >03:52:00,396 DEBUG kernel:[ 310.392264] --- wd:0 rd:4 >03:52:00,396 DEBUG kernel:[ 310.392265] disk 0, wo:1, o:0, dev:sda1 >03:52:00,396 DEBUG kernel:[ 310.392267] RAID10 conf printout: >03:52:00,396 DEBUG kernel:[ 310.392268] --- wd:0 rd:4 >03:52:00,396 DEBUG kernel:[ 310.392269] disk 0, wo:1, o:0, dev:sda1 >03:52:00,396 DEBUG kernel:[ 310.392270] RAID10 conf printout: 
>03:52:00,396 DEBUG kernel:[ 310.392271] --- wd:0 rd:4 >03:52:00,396 INFO kernel:[ 310.395955] md: delaying resync of md125 until md126 has finished (they share one or more physical units)