
Bug 1694589

Summary: [v2v] VMware to RHV: 6 out of 20 VMs migration failed
Product: Red Hat Enterprise Linux 7
Reporter: Ilanit Stein <istein>
Component: libguestfs
Assignee: Richard W.M. Jones <rjones>
Status: CLOSED DUPLICATE
QA Contact: Virtualization Bugs <virt-bugs>
Severity: high
Docs Contact:
Priority: unspecified
Version: 7.6
CC: fdupont, juzhou, mxie, mzhan, ptoscano, tgolembi, tzheng, xiaodwan, zili
Target Milestone: rc
Target Release: ---
Hardware: Unspecified
OS: Unspecified
Whiteboard: V2V
Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Story Points: ---
Clone Of:
Environment:
Last Closed: 2019-04-08 08:34:11 UTC
Type: Bug
Regression: ---
Mount Type: ---
Documentation: ---
CRM:
Verified Versions:
Category: ---
oVirt Team: ---
RHEL 7.3 requirements from Atomic Host:
Cloudforms Team: ---
Target Upstream Version:
Embargoed:
Attachments:
virt-v2v.log for one of the VMs failing on migration. (Flags: none)

Description Ilanit Stein 2019-04-01 08:14:46 UTC
Description of problem:
For a v2v migration of 20 VMs, each with a 100 GB disk
(10 VMs residing on one VMware host, 10 on the other),
6 out of 20 VMs failed ~5 minutes after the migration plan was executed in CFME.


Error in the virt-v2v log:
[   0.4] Opening the source -i vmx ssh://root.69.89/vmfs/volumes/rhv-v2v-performance-testing/file:///%5B%5D%20/vmfs/volumes/5bfe7583-fe7f9906-f066-b8ca3a638940/v2v_migration_vm_10/v2v_migration_vm_10.vmx
scp 'root'@'10.12.69.89':''\''/vmfs/volumes/rhv-v2v-performance-testing/file:///[] /vmfs/volumes/5bfe7583-fe7f9906-f066-b8ca3a638940/v2v_migration_vm_10/v2v_migration_vm_10.vmx'\''' '/var/tmp/vmx.9oyPDt/source.vmx'
scp: /vmfs/volumes/rhv-v2v-performance-testing/file:///[] /vmfs/volumes/5bfe7583-fe7f9906-f066-b8ca3a638940/v2v_migration_vm_10/v2v_migration_vm_10.vmx: No such file or directory
virt-v2v: error: could not copy the VMX file from the remote server, see 
earlier error messages

The other 14 VMs migrated successfully.
Some of those 14 VMs resided on one VMware host, and some on the other.
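For context on what went wrong above: virt-v2v's `-i vmx -it ssh` mode takes an `ssh://user@host/path/to/guest.vmx` URL pointing at the VMX file on the ESXi host, but the URL in the failing log has a `file:///[] ` fragment spliced into the path. The following sketch (the helper name and validation checks are illustrative, not part of virt-v2v or CFME) shows the expected URL shape and a guard that would have caught the malformed input:

```python
def build_vmx_ssh_url(host: str, vmx_path: str, user: str = "root") -> str:
    """Build the ssh:// URL that virt-v2v's '-i vmx -it ssh' mode expects.

    vmx_path must be an absolute filesystem path on the ESXi host, e.g.
    /vmfs/volumes/<datastore>/<vm>/<vm>.vmx -- not a datastore reference
    like '[] /vmfs/...' and not a nested file:// URL, both of which appear
    in the failing log above.  (Illustrative helper, not virt-v2v code.)
    """
    if "file://" in vmx_path or "[]" in vmx_path:
        raise ValueError(f"malformed VMX path: {vmx_path!r}")
    if not vmx_path.startswith("/vmfs/"):
        raise ValueError(f"expected an absolute /vmfs/... path, got {vmx_path!r}")
    return f"ssh://{user}@{host}{vmx_path}"

# A well-formed URL, matching the shape used by the successful migrations:
good = build_vmx_ssh_url(
    "10.12.69.89",
    "/vmfs/volumes/5bfe7583-fe7f9906-f066-b8ca3a638940/"
    "v2v_migration_vm_10/v2v_migration_vm_10.vmx",
)
```

With a guard like this in the caller, the bad path would be rejected before virt-v2v ever attempts the scp copy that fails above.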


Version-Release number of selected component (if applicable):
cfme-5.10.2.2
vdsm-4.30.11-1.el7ev.x86_64
rhv-4.3.2.1-0.1.el7
nbdkit-1.2.6-1.1.lp.el7ev.x86_64
virt-v2v-1.38.2-12.29.lp.el7ev.x86_64
libguestfs-1.38.2-12.29.lp.el7ev.x86_64
VMware-vix-disklib-6.7.1-10362358

How reproducible:
Tried it twice on the same environment, and it failed the same way both times (6 out of 20 VMs failed).

Comment 2 Pino Toscano 2019-04-01 08:30:01 UTC
(In reply to Ilanit Stein from comment #0)
> Description of problem:
> For a v2v migration of 20 VMs of 100GB disk,
> (10 VMs reside on one VMware host, 10 VMs on another VMware host)  
> 6 out of 20 VMs failed ~5 minutes after migration plan execution in CFME.
> 
> 
> Error in the virt-v2v log:
> [   0.4] Opening the source -i vmx
> ssh://root.69.89/vmfs/volumes/rhv-v2v-performance-testing/file:///
> %5B%5D%20/vmfs/volumes/5bfe7583-fe7f9906-f066-b8ca3a638940/
> v2v_migration_vm_10/v2v_migration_vm_10.vmx
> scp
> 'root'@'10.12.69.89':''\''/vmfs/volumes/rhv-v2v-performance-testing/file:///
> []
> /vmfs/volumes/5bfe7583-fe7f9906-f066-b8ca3a638940/v2v_migration_vm_10/
> v2v_migration_vm_10.vmx'\''' '/var/tmp/vmx.9oyPDt/source.vmx'
> scp: /vmfs/volumes/rhv-v2v-performance-testing/file:///[]
> /vmfs/volumes/5bfe7583-fe7f9906-f066-b8ca3a638940/v2v_migration_vm_10/
> v2v_migration_vm_10.vmx: No such file or directory

It looks like the URL passed to -it ssh was completely bogus.
What was the exact command line used to invoke virt-v2v?

Comment 3 Ilanit Stein 2019-04-01 14:10:27 UTC
Created attachment 1550566 [details]
virt-v2v.log for one of the VMs failing on migration.

Comment 4 Pino Toscano 2019-04-01 14:27:24 UTC
(In reply to Ilanit Stein from comment #3)
> Created attachment 1550566 [details]
> virt-v2v.log for one of the VMs failing on migration.

Unfortunately this log does not provide more details than what was written in comment 0.

Comment 5 Tomáš Golembiovský 2019-04-02 09:39:58 UTC
Fabien, this looks like a duplicate of bug 1624589 to me. Can you confirm?

Comment 6 Fabien Dupont 2019-04-02 09:47:17 UTC
Yes. I confirm that they look identical.
https://bugzilla.redhat.com/show_bug.cgi?id=1624589 was closed because we couldn't reproduce it.

As Pino says, the URL looks weird. In our lab, we have 119 VMware VMs and none of them contains 'file://' in their location.

Ilanit, could you provide access to the CloudForms appliance, please?

Comment 7 Ilanit Stein 2019-04-02 10:45:18 UTC
Fabien,

I'm afraid all our hardware is currently under a lab shutdown.
When I have an environment in which to reproduce it, I will do so.

Forgot to mention that this bug is for a VMware:iSCSI -> RHV:FC migration.

Comment 8 Ilanit Stein 2019-04-04 10:41:55 UTC
CFME/RHV hosts logs can be found here:
https://drive.google.com/drive/u/0/folders/11ADeirgAefu97W8_4qpj3cZEdpWmQUm8

2019-03-31_10-46-39_cloudforms_v2v_test-cfme.tar.gz
2019-03-31_14-46-58_b02-h21-r620_v2v_test-conv_host.tar.gz
2019-03-31_14-47-14_b02-h23-r620_v2v_test-conv_host.tar.gz

Comment 9 Pino Toscano 2019-04-04 12:10:38 UTC
(In reply to Ilanit Stein from comment #8)
> CFME/RHV hosts logs can be found here:
> https://drive.google.com/drive/u/0/folders/11ADeirgAefu97W8_4qpj3cZEdpWmQUm8
> 
> 2019-03-31_10-46-39_cloudforms_v2v_test-cfme.tar.gz
> 2019-03-31_14-46-58_b02-h21-r620_v2v_test-conv_host.tar.gz
> 2019-03-31_14-47-14_b02-h23-r620_v2v_test-conv_host.tar.gz

I cannot access these files without requesting access.

Comment 10 Ilanit Stein 2019-04-08 08:03:12 UTC
Pino,

I changed the files' permissions so that everyone at Red Hat can access them.

Please see if you can access them now.

Comment 11 Pino Toscano 2019-04-08 08:34:11 UTC
(In reply to Ilanit Stein from comment #8)
> CFME/RHV hosts logs can be found here:
> https://drive.google.com/drive/u/0/folders/11ADeirgAefu97W8_4qpj3cZEdpWmQUm8
> 
> 2019-03-31_10-46-39_cloudforms_v2v_test-cfme.tar.gz

In this archive, I can see in files/evm.log various lines like:

40377:[----] W, [2019-03-31T06:54:04.543363 #29793:1128f58]  WARN -- : MIQ(ManageIQ::Providers::Vmware::InfraManager::RefreshParser.vm_inv_to_hashes) Warning: [path, '[] /vmfs/volumes/5bfe7583-fe7f9906-f066-b8ca3a638940/v2v_migration_vm_03/v2v_migration_vm_03.vmx', is malformed]

So the URL is already malformed in CFME, and from there it is sent (invalid) to virt-v2v-wrapper, and then to virt-v2v (which rightfully fails).
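The evm.log warning fires because the datastore-style path `'[] /vmfs/...'` has an empty datastore name between the brackets. VMware datastore paths normally take the form `[datastore] relative/path/to/vm.vmx`. A minimal parser sketch (illustrative only, not CFME/ManageIQ code) that rejects the empty-bracket case:

```python
import re

# "[datastore] relative/path" -- the datastore name must be non-empty,
# which is exactly what the malformed '[] /vmfs/...' path violates.
DATASTORE_PATH_RE = re.compile(r"^\[(?P<datastore>[^\]]+)\] (?P<relpath>\S.*)$")

def parse_datastore_path(path: str):
    """Return (datastore, relative_path), or raise ValueError if malformed."""
    m = DATASTORE_PATH_RE.match(path)
    if m is None:
        raise ValueError(f"path {path!r} is malformed")
    return m.group("datastore"), m.group("relpath")
```

A check of this shape would flag the bad path at refresh time in CFME, rather than letting it propagate through virt-v2v-wrapper to virt-v2v.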

I just reopened bug 1624589, and will mark this bug as duplicate of that.

*** This bug has been marked as a duplicate of bug 1624589 ***