Bug 2092470 - [IBM Z] [DR] - FinalizerAddFailed Error While Creating Disaster Recovery Policy on Hub cluster
Keywords:
Status: CLOSED NOTABUG
Alias: None
Product: Red Hat OpenShift Data Foundation
Classification: Red Hat Storage
Component: odf-dr
Version: 4.10
Hardware: s390x
OS: Linux
Priority: unspecified
Severity: medium
Target Milestone: ---
Target Release: ---
Assignee: Shyamsundar
QA Contact: krishnaram Karthick
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2022-06-01 16:02 UTC by Abdul Kandathil (IBM)
Modified: 2023-08-09 17:00 UTC
CC List: 5 users

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2022-06-02 13:36:54 UTC
Embargoed:


Attachments
ramen-hub-operator-678b975bb8-m4wfv.log (83.86 KB, text/plain)
2022-06-01 16:02 UTC, Abdul Kandathil (IBM)

Description Abdul Kandathil (IBM) 2022-06-01 16:02:07 UTC
Created attachment 1885816
ramen-hub-operator-678b975bb8-m4wfv.log

Description of problem (please be as detailed as possible and provide log
snippets):
A FinalizerAddFailed error occurs while creating a Disaster Recovery Policy on the hub cluster, following the instructions in the documentation: https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.10/html/configuring_openshift_data_foundation_for_regional-dr_with_advanced_cluster_management/creating-disaster-recovery-policy-on-hub-cluster_rhodf


Error message:
DRPolicy.ramendr.openshift.io "odr-policy-5m" is invalid: spec.drClusterSet: Required value

Reason: FinalizerAddFailed

YAML file used to create the DRPolicy:

apiVersion: ramendr.openshift.io/v1alpha1
kind: DRPolicy
metadata:
  name: odr-policy-5m
spec:
  drClusterSet:
    - name: ocsm1301015
      region: east
      s3ProfileName: s3profile-ocsm1301015-ocs-storagecluster
    - name: ocsm4204001
      region: west
      s3ProfileName: s3profile-ocsm4204001-ocs-storagecluster
  schedulingInterval: 5m
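
For reference, a minimal sketch of applying the above YAML from the CLI and checking the resulting condition reason, assuming the file is saved as odr-policy-5m.yaml (hypothetical filename) and using the openshift-dr-system namespace referenced elsewhere in this report:

# Apply the DRPolicy manifest shown above (filename is an assumption)
oc apply -f odr-policy-5m.yaml
# Check the condition reason reported on the DRPolicy
oc get drpolicy odr-policy-5m -n openshift-dr-system -o jsonpath='{.status.conditions[].reason}{"\n"}'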



Version of all relevant components (if applicable):
ODF 4.10
OpenShift DR Hub Operator: v4.11.0-69
ODF Multicluster Orchestrator: v4.11.0-80
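
A quick way to cross-check the operator versions actually installed on the hub is to list the ClusterServiceVersions; the grep pattern below is only illustrative and may need adjusting to the real CSV names in a given install:

# List installed operator CSVs across namespaces and filter for the DR components
oc get csv -A | grep -Ei 'odr|odf|orchestrator'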



Does this issue impact your ability to continue to work with the product
(please explain in detail what is the user impact)?


Is there any workaround available to the best of your knowledge?


Rate from 1 - 5 the complexity of the scenario you performed that caused this
bug (1 - very simple, 5 - very complex)?


Is this issue reproducible?
yes

Can this issue be reproduced from the UI?
yes

If this is a regression, please provide more details to justify this:


Steps to Reproduce:
1. Deploy the DR cluster
2. Deploy the OpenShift DR Hub Operator
3. Create a DRPolicy following the instructions: https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.10/html/configuring_openshift_data_foundation_for_regional-dr_with_advanced_cluster_management/creating-disaster-recovery-policy-on-hub-cluster_rhodf


Actual results:
FinalizerAddFailed

Expected results:
Succeeded

Additional info:
The ramen hub operator log is attached.
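
For anyone collecting the same log, a minimal sketch, assuming the hub operator runs as a ramen-hub-operator deployment in the openshift-dr-system namespace (both names inferred from the attachment and from commands used later in this report):

# Capture the hub operator log and look for the failing finalizer update
oc logs -n openshift-dr-system deployment/ramen-hub-operator --all-containers > ramen-hub-operator.log
grep -i FinalizerAddFailed ramen-hub-operator.log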

Comment 2 Benamar Mekhissi 2022-06-01 16:20:23 UTC
There was a mismatch between versions. A 4.10 DRPolicy is being applied while its CRD is already at 4.11. To fix this, both versions must match.
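
One way to confirm such a mismatch is to check which API versions and required spec fields the installed DRPolicy CRD actually serves; the CRD name below is an assumption based on the apiVersion used in the YAML above:

# Show the API versions served by the installed DRPolicy CRD
oc get crd drpolicies.ramendr.openshift.io -o jsonpath='{.spec.versions[*].name}{"\n"}'
# Show which spec fields the first served version requires (compare with the YAML above)
oc get crd drpolicies.ramendr.openshift.io -o jsonpath='{.spec.versions[0].schema.openAPIV3Schema.properties.spec.required}{"\n"}'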

Comment 3 Abdul Kandathil (IBM) 2022-06-02 13:36:54 UTC
This BZ is no longer reproducible after making all the versions identical (ODF: 4.10.2, DR Hub Operator: 4.10.3-5, Multicluster Orchestrator: 4.10.3-5).

[root@m1312001 ~]# oc get drpolicy odr-policy-5m -n openshift-dr-system -o jsonpath='{.status.conditions[].reason}{"\n"}'
Succeeded
[root@m1312001 ~]#

