Note: This bug is displayed in read-only format because the product is no longer active in Red Hat Bugzilla.

Bug 2012799

Summary: Status of cold migration plan from RHV provider remains "Unknown" on "Migration plans" page
Product: Migration Toolkit for Virtualization
Reporter: kpunwatk
Component: General
Assignee: Sam Lucidi <slucidi>
Status: CLOSED ERRATA
QA Contact: kpunwatk
Severity: high
Docs Contact: Avital Pinnick <apinnick>
Priority: high
Version: 2.2.0
CC: amastbau, apinnick, fdupont, istein
Target Milestone: ---
Keywords: AutomationBlocker, Regression
Target Release: 2.2.0
Hardware: Unspecified
OS: Unspecified
Whiteboard:
Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Story Points: ---
Clone Of:
Environment:
Last Closed: 2021-12-09 19:20:58 UTC
Type: Bug
Regression: ---
Mount Type: ---
Documentation: ---
CRM:
Verified Versions: 
Category: ---
oVirt Team: ---
RHEL 7.3 requirements from Atomic Host:
Cloudforms Team: ---
Target Upstream Version:
Embargoed:
Attachments:
- Screenshot of Plan status in UI
- Controller-log
- Inventory-log

Description kpunwatk 2021-10-11 10:41:17 UTC
Created attachment 1831820 [details]
Screenshot of Plan status in UI

Description of problem:

During cold migration from an RHV source provider, the plan status in the UI remains "Unknown". The plan first moves to the validation stage and then gets stuck in the "Unknown" status.
(Please see the attached screenshot and logs.)
 
 
Version-Release number of selected component (if applicable):
MTV 2.2.0-39 / iib: 121326
CNV 4.9.0-232 / iib: 117863
 
How reproducible:
100%
 
 
Additional info:
Cold migration from a VMware source provider completed successfully.

This bug does not occur in MTV 2.2.0-20 / CNV 4.9.0-232.

Inventory log:
[GIN] 2021/10/11 - 10:35:10 | 200 |    1.235249ms |      10.128.2.1 | GET      "/providers/ovirt/6b0e613e-3ceb-42fc-84f9-d87178d2f912/vms/6751eb41-e286-4e6d-9b58-2feecd1ec735"

Controller Log:
{"level":"info","ts":1633946236.1344683,"logger":"plan|4s5ct","msg":"Reconcile failed.","plan":"openshift-mtv/plan6","vm":" id:6751eb41-e286-4e6d-9b58-2feecd1ec735 name:'v2v-migration-rhel8-thick-eager' ","error":"VM not found in inventory. caused by: 'Resource &ovirt.Workload{SelfLink:\"\", xVM:ovirt.xVM{VM:ovirt.VM{VM1:ovirt.VM1{VM0:ovirt.Resource{ID:\"\", Revision:0, Path:\"\", Name:\"\", Description:\"\", SelfLink:\"\"}, Cluster:\"\", Host:\"\", RevisionValidated:0, NICs:[]ovirt.NIC(nil), DiskAttachments:[]ovirt.DiskAttachment(nil), Concerns:[]base.Concern(nil)}, PolicyVersion:0, GuestName:\"\", CpuSockets:0, CpuCores:0, CpuThreads:0, CpuShares:0, CpuAffinity:[]ovirt.CpuPinning(nil), Memory:0, BalloonedMemory:false, IOThreads:0, BIOS:\"\", Display:\"\", HasIllegalImages:false, NumaNodeAffinity:[]string(nil), LeaseStorageDomain:\"\", StorageErrorResumeBehaviour:\"\", HaEnabled:false, UsbEnabled:false, BootMenuEnabled:false, PlacementPolicyAffinity:\"\", Timezone:\"\", Status:\"\", Stateless:\"\", SerialNumber:\"\", HostDevices:[]ovirt.HostDevice(nil), CDROMs:[]ovirt.CDROM(nil), WatchDogs:[]ovirt.WatchDog(nil), Properties:[]ovirt.Property(nil), Snapshots:[]ovirt.Snapshot(nil)}, DiskAttachments:[]ovirt.xDiskAttachment(nil), NICs:[]ovirt.xNIC(nil)}, Host:(*ovirt.Host)(nil), Cluster:ovirt.Cluster{Resource:ovirt.Resource{ID:\"\", Revision:0, Path:\"\", Name:\"\", Description:\"\", SelfLink:\"\"}, DataCenter:\"\", HaReservation:false, KsmEnabled:false, BiosType:\"\"}, DataCenter:ovirt.DataCenter{Resource:ovirt.Resource{ID:\"\", Revision:0, Path:\"\", Name:\"\", Description:\"\", SelfLink:\"\"}}} cannot be 
resolved.'","stacktrace":"\ngithub.com/konveyor/forklift-controller/pkg/controller/provider/web/ovirt.(*Finder).ByRef()\n\t/remote-source/app/pkg/controller/provider/web/ovirt/client.go:257\ngithub.com/konveyor/forklift-controller/pkg/controller/provider/web.(*ProviderClient).Find()\n\t/remote-source/app/pkg/controller/provider/web/client.go:153\ngithub.com/konveyor/forklift-controller/pkg/controller/plan/adapter/ovirt.(*Validator).NetworksMapped()\n\t/remote-source/app/pkg/controller/plan/adapter/ovirt/validator.go:32\ngithub.com/konveyor/forklift-controller/pkg/controller/plan.(*Reconciler).validateVM()\n\t/remote-source/app/pkg/controller/plan/validation.go:370\ngithub.com/konveyor/forklift-controller/pkg/controller/plan.(*Reconciler).validate()\n\t/remote-source/app/pkg/controller/plan/validation.go:116\ngithub.com/konveyor/forklift-controller/pkg/controller/plan.Reconciler.Reconcile()\n\t/remote-source/app/pkg/controller/plan/controller.go:217\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler()\n\t/remote-source/deps/gomod/pkg/mod/sigs.k8s.io/controller-runtime.4/pkg/internal/controller/controller.go:244\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem()\n\t/remote-source/deps/gomod/pkg/mod/sigs.k8s.io/controller-runtime.4/pkg/internal/controller/controller.go:218\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).worker()\n\t/remote-source/deps/gomod/pkg/mod/sigs.k8s.io/controller-runtime.4/pkg/internal/controller/controller.go:197\nk8s.io/apimachinery/pkg/util/wait.BackoffUntil.func1()\n\t/remote-source/deps/gomod/pkg/mod/k8s.io/apimachinery.3/pkg/util/wait/wait.go:155\nk8s.io/apimachinery/pkg/util/wait.BackoffUntil()\n\t/remote-source/deps/gomod/pkg/mod/k8s.io/apimachinery.3/pkg/util/wait/wait.go:156\nk8s.io/apimachinery/pkg/util/wait.JitterUntil()\n\t/remote-source/deps/gomod/pkg/mod/k8s.io/apimachinery.3/pkg/util/wait/wait.go:133\nk8s.io/apimachinery/pkg/util/w
ait.Until()\n\t/remote-source/deps/gomod/pkg/mod/k8s.io/apimachinery.3/pkg/util/wait/wait.go:90\nruntime.goexit()\n\t/usr/lib/golang/src/runtime/asm_amd64.s:1371"}
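In the controller log above, every field of the logged ovirt.Workload is zero-valued (ID:"", Name:"", and so on), which is what "cannot be resolved" indicates: the inventory lookup never populated the struct before validation used it. A minimal sketch of that kind of zero-ID guard (the Resource type and resolved helper here are hypothetical simplifications, not the actual forklift-controller code):

```go
package main

import "fmt"

// Resource is a hypothetical simplification of the ovirt.Resource
// fields that appear empty in the controller log above.
type Resource struct {
	ID   string
	Name string
}

// resolved reports whether an inventory lookup actually populated
// the resource; an empty ID (as seen in the log) means it did not.
func resolved(r Resource) bool {
	return r.ID != ""
}

func main() {
	empty := Resource{} // what the controller logged: ID:"", Name:""
	fmt.Println(resolved(empty))
	fmt.Println(resolved(Resource{ID: "6751eb41-e286-4e6d-9b58-2feecd1ec735"}))
}
```

A guard like this lets the reconciler report a clear "VM not found in inventory" condition instead of validating an empty workload.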

Comment 1 kpunwatk 2021-10-11 10:44:45 UTC
Created attachment 1831822 [details]
Controller-log

Comment 2 kpunwatk 2021-10-11 11:09:35 UTC
Created attachment 1831842 [details]
Inventory-log

Comment 6 Ilanit Stein 2021-10-13 08:08:01 UTC
Moving to ON_QA since this is already fixed in MTV 2.2.0-43.
The fix may have been introduced in an earlier MTV 2.2 build.

Comment 9 errata-xmlrpc 2021-12-09 19:20:58 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (MTV 2.2.0 Images), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHEA-2021:5066