Bug 1965969
| Field | Value |
|---|---|
| Summary | [aws] the public hosted zone ID is not correct in the destroy log when destroying a cluster that uses a BYO private hosted zone |
| Product | OpenShift Container Platform |
| Component | Installer |
| Installer sub component | openshift-installer |
| Reporter | Yunfei Jiang <yunjiang> |
| Assignee | Nobody <nobody> |
| QA Contact | Yunfei Jiang <yunjiang> |
| Status | CLOSED ERRATA |
| Severity | low |
| Priority | low |
| Version | 4.8 |
| Target Milestone | --- |
| Target Release | 4.11.0 |
| Hardware | Unspecified |
| OS | Unspecified |
| Doc Type | Bug Fix |
| Story Points | --- |
| Cloned to | 2051333 (view as bug list) |
| Bug Blocks | 2051333 |
| Type | Bug |
| Last Closed | 2022-08-10 10:36:25 UTC |

Doc Text:

- Cause: The destroyer incorrectly reported the ID of the cluster's private Route 53 hosted zone when deleting DNS records from the hosted zone of the base domain.
- Consequence: The wrong hosted zone ID appeared in the destroyer's log.
- Fix: Use the proper hosted zone ID in the log.
- Result: The destroyer's log shows the correct hosted zone ID when destroying the DNS records in the base domain's hosted zone.
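The behavior described in Doc Text can be sketched roughly as follows (hypothetical types and names for illustration only; the installer's actual destroyer code differs):

```go
package main

import "fmt"

// zoneRecord pairs a DNS record set with the hosted zone it is being
// deleted from (illustrative only; not the installer's types).
type zoneRecord struct {
	ZoneID string // the zone the record set actually lives in
	Name   string
	Type   string
}

// deletionLog renders a destroyer-style log line. Before the fix the id
// field was always taken from the cluster's private hosted zone; the fix
// takes it from the zone the record set is actually deleted from.
func deletionLog(rec zoneRecord) string {
	return fmt.Sprintf("msg=Deleted id=%s record set=%s %s", rec.ZoneID, rec.Type, rec.Name)
}

func main() {
	// A record deleted from the public base-domain zone should report
	// that zone's ID, not the private zone's.
	pub := zoneRecord{ZoneID: "Z3B3KOVA3TRCWP", Name: "api.example.devcluster.openshift.com.", Type: "A"}
	fmt.Println(deletionLog(pub))
}
```

The point is only that the id field must come from the zone being mutated, not from a fixed reference to the cluster's private zone.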
Description (Yunfei Jiang, 2021-05-31 09:27:30 UTC)
Will address this in a future sprint. Needs to be prioritized. Will review again for a future sprint.

---

Verification FAILED. OCP version: 4.10.0-0.nightly-2021-11-15-034648

Got some confusing info in the log, e.g.:

```
level=debug msg=Deleted arn=arn:aws:route53:::hostedzone/Z0839996PWILML3MAMMI id=/hostedzone/Z3B3KOVA3TRCWP recordset=api.yunjiang-969a.qe.devcluster.openshift.com. (A)
```

Does the api record come from the public or the private hosted zone?

config:

```
<--snip-->
baseDomain: qe.devcluster.openshift.com    <-- zone id Z3B3KOVA3TRCWP
platform:
  aws:
    hostedZone: Z0839996PWILML3MAMMI
<--snip-->
```

destroy log:

```
level=debug msg=listing AWS hosted zones "yunjiang-969a.qe.devcluster.openshift.com." (page 0) arn=arn:aws:route53:::hostedzone/Z0839996PWILML3MAMMI
level=debug msg=listing AWS hosted zones "qe.devcluster.openshift.com." (page 0) arn=arn:aws:route53:::hostedzone/Z0839996PWILML3MAMMI
level=debug msg=Deleted from public zone arn=arn:aws:route53:::hostedzone/Z0839996PWILML3MAMMI id=/hostedzone/Z3B3KOVA3TRCWP recordset=api-int.yunjiang-969a.qe.devcluster.openshift.com. (A)
level=info msg=Deleted arn=arn:aws:route53:::hostedzone/Z0839996PWILML3MAMMI id=/hostedzone/Z3B3KOVA3TRCWP record set=A api-int.yunjiang-969a.qe.devcluster.openshift.com.
level=debug msg=Deleted arn=arn:aws:route53:::hostedzone/Z0839996PWILML3MAMMI id=/hostedzone/Z3B3KOVA3TRCWP recordset=api-int.yunjiang-969a.qe.devcluster.openshift.com. (A)
level=info msg=Deleted arn=arn:aws:route53:::hostedzone/Z0839996PWILML3MAMMI id=/hostedzone/Z3B3KOVA3TRCWP record set=A api.yunjiang-969a.qe.devcluster.openshift.com.
level=debug msg=Deleted from public zone arn=arn:aws:route53:::hostedzone/Z0839996PWILML3MAMMI id=/hostedzone/Z3B3KOVA3TRCWP recordset=api.yunjiang-969a.qe.devcluster.openshift.com. (A)
level=info msg=Deleted arn=arn:aws:route53:::hostedzone/Z0839996PWILML3MAMMI id=/hostedzone/Z3B3KOVA3TRCWP record set=A api.yunjiang-969a.qe.devcluster.openshift.com.
level=debug msg=Deleted arn=arn:aws:route53:::hostedzone/Z0839996PWILML3MAMMI id=/hostedzone/Z3B3KOVA3TRCWP recordset=api.yunjiang-969a.qe.devcluster.openshift.com. (A)
level=info msg=Deleted arn=arn:aws:route53:::hostedzone/Z0839996PWILML3MAMMI id=/hostedzone/Z3B3KOVA3TRCWP record set=A \052.apps.yunjiang-969a.qe.devcluster.openshift.com.
level=debug msg=Deleted from public zone arn=arn:aws:route53:::hostedzone/Z0839996PWILML3MAMMI id=/hostedzone/Z3B3KOVA3TRCWP recordset=\052.apps.yunjiang-969a.qe.devcluster.openshift.com. (A)
level=info msg=Deleted arn=arn:aws:route53:::hostedzone/Z0839996PWILML3MAMMI id=/hostedzone/Z3B3KOVA3TRCWP record set=A \052.apps.yunjiang-969a.qe.devcluster.openshift.com.
level=debug msg=Deleted arn=arn:aws:route53:::hostedzone/Z0839996PWILML3MAMMI id=/hostedzone/Z3B3KOVA3TRCWP recordset=\052.apps.yunjiang-969a.qe.devcluster.openshift.com. (A)
level=info msg=Cleaned record sets from hosted zone arn=arn:aws:route53:::hostedzone/Z0839996PWILML3MAMMI id=/hostedzone/Z3B3KOVA3TRCWP
level=info msg=Removed tag kubernetes.io/cluster/yunjiang-969a-lpj6h: shared arn=arn:aws:route53:::hostedzone/Z0839996PWILML3MAMMI
```

---

Verification FAILED. OCP Version: 4.10.0-0.nightly-2022-01-10-014106

> grep hostedzone .openshift_install.log

```
time="2022-01-10T05:09:15-05:00" level=debug msg="No cluster domain specified in metadata; cannot clean the shared hosted zone" arn="arn:aws:route53:::hostedzone/Z017831332AP9QNK19P4I" id=Z017831332AP9QNK19P4I
time="2022-01-10T05:09:16-05:00" level=info msg="Removed tag kubernetes.io/cluster/yunjiang-bz969a-b24kb: shared" arn="arn:aws:route53:::hostedzone/Z017831332AP9QNK19P4I"
```

Issues:
1. Records in the public zone were not deleted.
2. Records in the BYO private zone were not deleted.

---

Hello Staebler, I noticed the target release has been set to None now, but new issues (comment 10) have been introduced by https://github.com/openshift/installer/pull/5494; this needs to be resolved in 4.10.
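When the cluster shares the base domain's hosted zone, the destroyer must delete only the record sets that belong to the cluster's own subdomain. A minimal sketch of that ownership check (`ownedByCluster` is a hypothetical helper for illustration, not the installer's code):

```go
package main

import (
	"fmt"
	"strings"
)

// ownedByCluster reports whether a record set in the shared base-domain
// hosted zone belongs to the cluster being destroyed, by checking that
// the record name sits at or under the cluster's domain.
func ownedByCluster(recordName, clusterDomain string) bool {
	return recordName == clusterDomain ||
		strings.HasSuffix(recordName, "."+clusterDomain)
}

func main() {
	cluster := "yunjiang-969a.qe.devcluster.openshift.com."
	// Cluster record in the shared zone: safe to delete.
	fmt.Println(ownedByCluster("api.yunjiang-969a.qe.devcluster.openshift.com.", cluster)) // true
	// Unrelated record under the base domain: must be left alone.
	fmt.Println(ownedByCluster("other.qe.devcluster.openshift.com.", cluster)) // false
}
```

The "No cluster domain specified in metadata" message in the log above suggests this kind of filter cannot run at all when the cluster domain is missing from the destroy metadata.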
(In reply to Yunfei Jiang from comment #12)
> Hello Staebler, I noticed the target release has been set to None now, but new issues (comment 10) have been introduced by https://github.com/openshift/installer/pull/5494, this needs to be resolved in 4.10.

Please file a separate BZ for the new issue.

---

Moving this back to ON_QA, but testing of this is blocked on https://bugzilla.redhat.com/show_bug.cgi?id=2051333.

---

Verification failed. OCP version: 4.11.0-0.nightly-2022-02-27-122819

install-config:

```
platform:
  aws:
    region: us-east-2
    subnets:
    - subnet-0cce8bda6928d94e5
    - subnet-097e6bcbbe614f53b
    - subnet-02fe26fcb7a49f818
    - subnet-0c7522cad1fc1a938
    hostedZone: Z0273356ZY68YOEOT1OY
publish: External
baseDomain: qe.devcluster.openshift.com
```

Error messages in .openshift_install.log (the arn carries the private zone ID Z0273356ZY68YOEOT1OY, while the id field shows the public zone ID Z3B3KOVA3TRCWP):

```
INFO Deleted arn=arn:aws:route53:::hostedzone/Z0273356ZY68YOEOT1OY id=/hostedzone/Z3B3KOVA3TRCWP record set=A api.yunjiang-bz969.qe.devcluster.openshift.com.
INFO Deleted arn=arn:aws:route53:::hostedzone/Z0273356ZY68YOEOT1OY id=/hostedzone/Z3B3KOVA3TRCWP record set=A \052.apps.yunjiang-bz969.qe.devcluster.openshift.com.
```

.openshift_install.log:

```
<--SNIP-->
INFO Deleted arn=arn:aws:route53:::hostedzone/Z0273356ZY68YOEOT1OY id=Z0273356ZY68YOEOT1OY record set=A api-int.yunjiang-bz969.qe.devcluster.openshift.com.
INFO Deleted arn=arn:aws:route53:::hostedzone/Z0273356ZY68YOEOT1OY id=/hostedzone/Z3B3KOVA3TRCWP record set=A api.yunjiang-bz969.qe.devcluster.openshift.com.
INFO Deleted arn=arn:aws:route53:::hostedzone/Z0273356ZY68YOEOT1OY id=Z0273356ZY68YOEOT1OY record set=A api.yunjiang-bz969.qe.devcluster.openshift.com.
INFO Deleted arn=arn:aws:route53:::hostedzone/Z0273356ZY68YOEOT1OY id=/hostedzone/Z3B3KOVA3TRCWP record set=A \052.apps.yunjiang-bz969.qe.devcluster.openshift.com.
INFO Deleted arn=arn:aws:route53:::hostedzone/Z0273356ZY68YOEOT1OY id=Z0273356ZY68YOEOT1OY record set=A \052.apps.yunjiang-bz969.qe.devcluster.openshift.com.
INFO Cleaned record sets from hosted zone arn=arn:aws:route53:::hostedzone/Z0273356ZY68YOEOT1OY id=Z0273356ZY68YOEOT1OY
INFO Removed tag kubernetes.io/cluster/yunjiang-bz969-prnrb: shared arn=arn:aws:route53:::hostedzone/Z0273356ZY68YOEOT1OY
INFO Time elapsed: 4m9s
```

---

The arn field is always going to be for the private zone. The arn is the resource that the destroyer is responding to. The relevant part of the log message is the id field.

---

Per comment 16, the result in comment 15 is as expected.

---

Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA. For information on the advisory (Important: OpenShift Container Platform 4.11.0 bug fix and security update), and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHSA-2022:5069
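A side note on the id formats in the logs above: the id field sometimes appears as /hostedzone/Z… and sometimes as a bare zone ID, because Route 53 API responses return the zone identifier in both forms. Tooling that compares zone IDs typically normalizes them first; a hypothetical helper (not the installer's code):

```go
package main

import (
	"fmt"
	"strings"
)

// normalizeZoneID strips Route 53's "/hostedzone/" prefix so zone IDs
// taken from different API responses compare consistently.
// (Hypothetical helper for illustration; not the installer's code.)
func normalizeZoneID(id string) string {
	return strings.TrimPrefix(id, "/hostedzone/")
}

func main() {
	fmt.Println(normalizeZoneID("/hostedzone/Z3B3KOVA3TRCWP")) // Z3B3KOVA3TRCWP
	fmt.Println(normalizeZoneID("Z0273356ZY68YOEOT1OY"))       // already bare: unchanged
}
```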