Bug 2165793 - DataImportCron DV keeps re-importing when the ImageStream is updated with a new digest
Summary: DataImportCron DV keeps re-importing when the ImageStream is updated with a new digest
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Container Native Virtualization (CNV)
Classification: Red Hat
Component: Storage
Version: 4.12.1
Hardware: Unspecified
OS: Unspecified
Target Milestone: ---
Target Release: 4.12.1
Assignee: Arnon Gilboa
QA Contact: Yan Du
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2023-01-31 04:25 UTC by Yan Du
Modified: 2023-02-28 20:06 UTC (History)

Fixed In Version: v4.12.1-22
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2023-02-28 20:06:27 UTC
Target Upstream Version:
Embargoed:




Links
System ID  Status  Summary  Last Updated
Github kubevirt containerized-data-importer pull 2566  Merged  Fix DataImportCron PVC timestamping  2023-02-01 13:03:16 UTC
Github kubevirt containerized-data-importer pull 2567  Merged  [release-v1.55] Fix DataImportCron PVC timestamping  2023-02-02 07:47:28 UTC
Red Hat Issue Tracker CNV-24858  None  None  2023-01-31 04:28:12 UTC
Red Hat Product Errata RHEA-2023:1023  None  None  2023-02-28 20:06:33 UTC

Description Yan Du 2023-01-31 04:25:35 UTC
Description of problem:
The DataImportCron DV keeps re-importing when the ImageStream is updated with a new digest.
After the import finished, the PVC was deleted, which triggered a new import.

Version-Release number of selected component (if applicable):
CNV 4.12.1-3

How reproducible:
Always

Steps to Reproduce:
1. Create an ImageStream
$ oc import-image rhel8-image-stream  --from=registry.redhat.io/rhel8/rhel-guest-image --scheduled --confirm
2. Create a DataImportCron (YAML as in Additional info below)
3. Update the ImageStream with a new digest (a verification command is shown after these steps)
$ oc import-image rhel8-image-stream:8.4.0-423 --from=registry.redhat.io/rhel8/rhel-guest-image@sha256:947541648d7f12fd56d2224d55ce708d369f76ffeb4938c8846b287197f30970 --scheduled --confirm
4. Observe dv and pvc
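
After step 3, one way to confirm the ImageStream actually picked up the new digest is to read its status tags; this uses the standard ImageStream status fields and the object name from the commands above:

$ oc get imagestream rhel8-image-stream -o jsonpath='{.status.tags[*].items[0].image}'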

Actual results:
$ oc get dv -w
---------------8<-------------------
rhel8-947541648d7f   ImportInProgress   99.24%                2m29s
rhel8-947541648d7f   ImportInProgress   99.24%                2m32s
rhel8-947541648d7f   ImportInProgress   99.24%                2m32s
rhel8-947541648d7f   ImportInProgress   99.24%                2m33s
rhel8-947541648d7f   Succeeded          100.0%                2m35s
rhel8-947541648d7f   Succeeded          100.0%                2m36s
rhel8-947541648d7f                                            0s
rhel8-947541648d7f   Pending            N/A                   0s
rhel8-947541648d7f   PVCBound           N/A                   0s
rhel8-947541648d7f   ImportScheduled    N/A                   0s
rhel8-947541648d7f   ImportScheduled    N/A                   0s
rhel8-947541648d7f   ImportInProgress   N/A                   12s
rhel8-947541648d7f   ImportInProgress   0.00%                 14s
rhel8-947541648d7f   ImportInProgress   2.00%                 16s
rhel8-947541648d7f   ImportInProgress   4.00%                 18s
rhel8-947541648d7f   ImportInProgress   6.02%                 20s
---------------8<-------------------

$ oc get pvc -w 
---------------8<-------------------
rhel8-947541648d7f   Bound    pvc-29488903-52d0-4675-8c5d-5c04931d3fe3   10Gi       RWX            ocs-storagecluster-ceph-rbd   2m35s
rhel8-947541648d7f   Bound    pvc-29488903-52d0-4675-8c5d-5c04931d3fe3   10Gi       RWX            ocs-storagecluster-ceph-rbd   2m35s
rhel8-e004563ff8cf   Terminating   pvc-dd73d8cf-e42f-4a26-a2f1-088da424fd2e   10Gi       RWX            ocs-storagecluster-ceph-rbd   6m1s
rhel8-947541648d7f   Terminating   pvc-29488903-52d0-4675-8c5d-5c04931d3fe3   10Gi       RWX            ocs-storagecluster-ceph-rbd   2m36s
rhel8-e004563ff8cf   Terminating   pvc-dd73d8cf-e42f-4a26-a2f1-088da424fd2e   10Gi       RWX            ocs-storagecluster-ceph-rbd   6m1s
rhel8-947541648d7f   Terminating   pvc-29488903-52d0-4675-8c5d-5c04931d3fe3   10Gi       RWX            ocs-storagecluster-ceph-rbd   2m36s
rhel8-947541648d7f   Pending                                                                            ocs-storagecluster-ceph-rbd   0s
rhel8-947541648d7f   Pending                                                                            ocs-storagecluster-ceph-rbd   0s
rhel8-947541648d7f   Pending       pvc-0e9a3ca6-4123-4af8-a06c-61c94f91a5cf   0                         ocs-storagecluster-ceph-rbd   0s
rhel8-947541648d7f   Bound         pvc-0e9a3ca6-4123-4af8-a06c-61c94f91a5cf   10Gi       RWX            ocs-storagecluster-ceph-rbd   0s
rhel8-947541648d7f   Bound         pvc-0e9a3ca6-4123-4af8-a06c-61c94f91a5cf   10Gi       RWX            ocs-storagecluster-ceph-rbd   0s
---------------8<-------------------



Expected results:
The import should run only once and succeed without any error.

Additional info:

---
apiVersion: cdi.kubevirt.io/v1beta1
kind: DataImportCron
metadata:
  name: rhel8-image-import-cron
spec:
  template:
    spec:
      source:
        registry:
          pullMethod: node
          imageStream: rhel8-image-stream
      storage:
        resources:
          requests:
            storage: 10Gi
        storageClassName: ocs-storagecluster-ceph-rbd
  schedule: "* * * * *"
  garbageCollect: Outdated
  managedDataSource: rhel8
  importsToKeep: 1
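
For context, managedDataSource (rhel8 above) names the DataSource that the DataImportCron keeps pointed at the latest successfully imported PVC. A quick way to check which PVC the DataSource currently references, assuming the standard CDI DataSource layout and the object names from the spec above:

$ oc get datasource rhel8 -o jsonpath='{.spec.source.pvc.name}'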

Comment 1 Yan Du 2023-02-03 08:27:28 UTC
Tested on CNV v4.12.1-22; the issue has been fixed.

$ oc get pvc 
NAME                 STATUS   VOLUME                                     CAPACITY   ACCESS MODES   STORAGECLASS                  AGE
rhel8-947541648d7f   Bound    pvc-441d6da1-730c-42ad-9400-b1551c993d19   10Gi       RWX            ocs-storagecluster-ceph-rbd   5m17s
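
Another way to confirm the cron considers the import complete is to check the DataImportCron's UpToDate condition (assuming the standard CDI DataImportCron status conditions and the cron name from the YAML above):

$ oc get dataimportcron rhel8-image-import-cron -o jsonpath='{.status.conditions[?(@.type=="UpToDate")].status}'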

Comment 7 errata-xmlrpc 2023-02-28 20:06:27 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (OpenShift Virtualization 4.12.1 Images), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHEA-2023:1023

