Bug 1858595
Summary: | [v2v] [RFE] VM import RHV to CNV The import should not fail when there is no space left by the ceph provider | |
---|---|---|---
Product: | Container Native Virtualization (CNV) | Reporter: | Amos Mastbaum <amastbau>
Component: | V2V | Assignee: | Ondra Machacek <omachace>
Status: | CLOSED DEFERRED | QA Contact: | Amos Mastbaum <amastbau>
Severity: | medium | Docs Contact: |
Priority: | unspecified | |
Version: | 2.4.0 | CC: | cnv-qe-bugs, dagur, istein, mguetta, nachandr, omachace, pkliczew
Target Milestone: | --- | Keywords: | RFE
Target Release: | 2.5.1 | |
Hardware: | Unspecified | |
OS: | Unspecified | |
Whiteboard: | | |
Fixed In Version: | v2.5.0-18 | Doc Type: | If docs needed, set a value
Doc Text: | | Story Points: | ---
Clone Of: | | Environment: |
Last Closed: | 2020-11-02 10:53:17 UTC | Type: | Bug
Regression: | --- | Mount Type: | ---
Documentation: | --- | CRM: |
Verified Versions: | | Category: | ---
oVirt Team: | --- | RHEL 7.3 requirements from Atomic Host: |
Cloudforms Team: | --- | Target Upstream Version: |
Embargoed: | | |
Bug Depends On: | | |
Bug Blocks: | 1893529 | |
Description by Amos Mastbaum, 2020-07-19 11:34:04 UTC:
Actually, this is by design: if the import fails for some reason three times, we stop the import so that we do not overload the network/storage for no reason. I think we can add an option to disable this feature. We must monitor events and report the errors of the CDI importer pod to the VM Import logs and CR.

*** Bug 1721504 has been marked as a duplicate of this bug. ***

@Piotr Was anything changed that needs to be verified here?

Yes, in this situation we fire an event to let the user know.

VM import of a VM with a 100GB disk to target NFS storage gets this message: "Importing (RHV) rhel8-import is being imported. Pending: DataVolume rhel8-import-03072434-e45b-430c-8860-ff50b0c71a2c is pending to bound". Is this related to this fix, or do we need to verify this in another way? (VM import of a VM with a 100GB disk to Ceph-RBD/Block passed.)

Ilanit, a pending DV is expected. You should see a fired alert letting the user know about the situation.

Piotr, we tried to reproduce the "DataValueCreationgFailed" failure mentioned in the bug description. VM import of a 100GB VM to Ceph-RBD storage (of 50GB) ends up with the CDI importer reporting 47% for some time; then it fails and begins a "New phase" that shows 0% progress. The VM import progress bar remains at 39% the whole time, and it seems the error coming from the CDI importer is not propagated. This way we cannot see the event that was added within this bug fix. Part of the CDI importer log that we managed to capture:

```
I1029 16:00:16.785152 1 importer.go:52] Starting importer
I1029 16:00:16.786785 1 importer.go:116] begin import process
I1029 16:00:18.495650 1 http-datasource.go:219] Attempting to get certs from /certs/ca.pem
I1029 16:00:18.546153 1 data-processor.go:302] Calculating available size
I1029 16:00:18.547157 1 data-processor.go:310] Checking out block volume size.
I1029 16:00:18.547170 1 data-processor.go:322] Request image size not empty.
I1029 16:00:18.547199 1 data-processor.go:327] Target size 100Gi.
```
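Before the captured log continues: the bounded-retry design described at the start of this discussion (stop after three failed attempts rather than hammering the network and storage) can be sketched as a small piece of controller logic. This is an illustrative Python sketch under stated assumptions, not the operator's actual Go code; the names `run_with_retries` and `run_import_attempt` are hypothetical.

```python
# Illustrative sketch of a bounded-retry import loop, hypothetical names.
# The real vm-import-operator logic is in Go; this only shows the idea.

def run_with_retries(run_import_attempt, max_retries=3):
    """Run import attempts until one succeeds or max_retries have failed.

    run_import_attempt: callable returning True on success, False on failure.
    Returns (succeeded, attempts_used).
    """
    for attempt in range(1, max_retries + 1):
        if run_import_attempt():
            return True, attempt
    # Stop after max_retries failures instead of overloading network/storage.
    return False, max_retries
```

The "option to disable this feature" proposed above would amount to making `max_retries` configurable (or unbounded) on the import CR.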
```
I1029 16:00:18.547320 1 data-processor.go:224] New phase: TransferDataFile
I1029 16:00:18.548337 1 util.go:161] Writing data...
I1029 16:00:19.547661 1 prometheus.go:69] 0.00
I1029 16:00:20.547888 1 prometheus.go:69] 0.00
[... progress reported once per second: stays at 0.00 until 16:02:08, then at 0.01 until 16:02:57 ...]
I1029 16:02:57.603779 1 prometheus.go:69] 0.01
E1029 16:02:58.540140 1 util.go:163] Unable to write file from dataReader: unexpected EOF
E1029 16:02:58.540307 1 data-processor.go:221] unexpected EOF
unable to write to file
kubevirt.io/containerized-data-importer/pkg/util.StreamDataToFile
	/go/src/kubevirt.io/containerized-data-importer/pkg/util/util.go:165
kubevirt.io/containerized-data-importer/pkg/importer.(*ImageioDataSource).TransferFile
	/go/src/kubevirt.io/containerized-data-importer/pkg/importer/imageio-datasource.go:115
kubevirt.io/containerized-data-importer/pkg/importer.(*DataProcessor).ProcessDataWithPause
	/go/src/kubevirt.io/containerized-data-importer/pkg/importer/data-processor.go:191
kubevirt.io/containerized-data-importer/pkg/importer.(*DataProcessor).ProcessData
	/go/src/kubevirt.io/containerized-data-importer/pkg/importer/data-processor.go:153
main.main
	/go/src/kubevirt.io/containerized-data-importer/cmd/cdi-importer/importer.go:171
runtime.main
	/usr/lib/golang/src/runtime/proc.go:203
runtime.goexit
	/usr/lib/golang/src/runtime/asm_amd64.s:1357
Unable to transfer source data to target file
kubevirt.io/containerized-data-importer/pkg/importer.(*DataProcessor).ProcessDataWithPause
	/go/src/kubevirt.io/containerized-data-importer/pkg/importer/data-processor.go:193
kubevirt.io/containerized-data-importer/pkg/importer.(*DataProcessor).ProcessData
	/go/src/kubevirt.io/containerized-data-importer/pkg/importer/data-processor.go:153
main.main
	/go/src/kubevirt.io/containerized-data-importer/cmd/cdi-importer/importer.go:171
runtime.main
	/usr/lib/golang/src/runtime/proc.go:203
runtime.goexit
	/usr/lib/golang/src/runtime/asm_amd64.s:1357
E1029 16:02:58.540491 1 importer.go:173] unexpected EOF
unable to write to file
[... the same "unable to write to file" and "Unable to transfer source data to target file" stack traces repeat ...]
```

Is this another bug, i.e. that the error is not propagated? And is there another way you would suggest to verify this bug? Adding that after the CDI importer pod gets into a CrashLoopBackOff state, it goes back to the "Running" status.

Is this information available on the DV? Was this event (https://github.com/kubevirt/vm-import-operator/pull/367/files#diff-88694056f107609ecae52379508b2296457f11ddeb1355ab316a87b9ae21ae91R410) fired?

This event is not fired. What happens eventually is that after a couple of "New phase" attempts of the CDI importer pod, it turns into a "Terminating" state. The VM import was removed automatically.

1. It seems that the CDI importer behavior has changed since this bug was reported.
2. I'll file a separate bug for the current behavior.

Please create a new bug if you see any undesired behaviour. Would you mind providing a list of recent events? The condition created checks the container exit code, so if the CDI behaviour has changed, the condition could be not met.

Closing this bug since it cannot be verified. The event added in this bug is not reached.
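The closing comments note that the condition added by the fix checks the importer container's exit code. A minimal sketch of that kind of check, operating on the container status a controller would read from the pod. This is an illustrative Python sketch, not the vm-import-operator's Go code; the field names mirror the Kubernetes `ContainerStateTerminated` API, but `should_fire_failure_event` and the decision logic are hypothetical.

```python
# Illustrative sketch: decide whether to fire a user-facing "import failing"
# event based on a container's terminated state, similar in spirit to the
# exit-code condition discussed above. Function name is hypothetical.

def should_fire_failure_event(container_status):
    """container_status: dict shaped like a Pod's containerStatuses entry.

    Returns True when the importer container terminated with a non-zero
    exit code, which is when the event should be emitted.
    """
    terminated = container_status.get("state", {}).get("terminated")
    if terminated is None:
        return False  # still running or waiting: nothing to report yet
    return terminated.get("exitCode", 0) != 0
```

This also illustrates why the event never fired during verification: if CDI's behaviour changes so that the pod restarts and shows as Running again instead of staying terminated, a condition like this is never met.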
The importer error that should have triggered it did not occur.
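For future verification attempts, the stall visible in the importer log above (progress stuck near 0.00 for minutes before the unexpected EOF) can be detected mechanically. The following is an illustrative Python helper; only the `prometheus.go` log line format is taken from the capture above, and the helper names and the stall heuristic are assumptions.

```python
import re

# Matches CDI importer progress lines such as:
#   I1029 16:00:19.547661 1 prometheus.go:69] 0.00
PROGRESS_RE = re.compile(r"prometheus\.go:\d+\]\s+([0-9]+\.[0-9]+)\s*$")

def progress_values(log_lines):
    """Extract the reported progress percentages from importer log lines."""
    values = []
    for line in log_lines:
        m = PROGRESS_RE.search(line)
        if m:
            values.append(float(m.group(1)))
    return values

def is_stalled(values, window=30, epsilon=0.01):
    """Heuristic: True if the last `window` samples moved less than epsilon.

    With one sample per second (as in the log above), window=30 means
    roughly 30 seconds without measurable progress.
    """
    if len(values) < window:
        return False
    recent = values[-window:]
    return max(recent) - min(recent) < epsilon
```

Feeding the captured log through helpers like these would have flagged the transfer as stalled well before the `unexpected EOF` error appeared.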