Bug 2181648 - [GSS][doc test day] many Links are broken/incorrect redirection in "Deploying and managing OpenShift Data Foundation using Google Cloud" guide
Summary: [GSS][doc test day] many Links are broken/incorrect redirection in "Deploying...
Keywords:
Status: VERIFIED
Alias: None
Product: Red Hat OpenShift Data Foundation
Classification: Red Hat Storage
Component: documentation
Version: 4.12
Hardware: All
OS: All
Priority: unspecified
Severity: medium
Target Milestone: ---
Target Release: ODF 4.12.2
Assignee: Kusuma
QA Contact: Disha Walvekar
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2023-03-24 20:33 UTC by Geo Jose
Modified: 2023-08-09 16:43 UTC
CC List: 3 users

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed:
Embargoed:



Description Geo Jose 2023-03-24 20:33:48 UTC
Describe the issue:
 Incorrect hyperlinks in the "Deploying and managing OpenShift Data Foundation using Google Cloud" guide.

Describe the task you were trying to accomplish:
 Multiple links are broken or redirect incorrectly, and they need to be fixed.

Suggestions for improvement:
See the "Additional information" section in Comment 2.

Document URL:
https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html/deploying_and_managing_openshift_data_foundation_using_google_cloud/index

Chapter/Section Number and Title:
See the "Additional information" section in Comment 2.
 
Product Version:
  RHODF 4.12

Environment Details:
  NA
  
Any other versions of this document that also need this update:
 NA.

Comment 2 Geo Jose 2023-03-24 20:34:33 UTC
Additional information

Please find the details of the broken links below. Each entry uses the following format (a small link-check sketch is included after the format block):
~~~
SL (serial number)
 Chapter/Section
 The statement that contains the hyperlink.
 Current redirection
 Suggested redirection
~~~
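For whoever re-verifies these, the snippet below is a minimal Python sketch (assuming Python 3 and network access to access.redhat.com) for spot-checking that a suggested redirection loads and that its #anchor is present on the page. The check_link helper, the User-Agent string, and the id="..." substring heuristic are illustrative assumptions only, not how these links were originally tested.
~~~
# Minimal link-check sketch: fetch the page and look for the anchor's id attribute.
import urllib.request

def check_link(url):
    # Split the URL into the page and the optional #anchor fragment.
    base, _, anchor = url.partition("#")
    req = urllib.request.Request(base, headers={"User-Agent": "link-check/0.1"})
    with urllib.request.urlopen(req, timeout=30) as resp:
        status = resp.status
        html = resp.read().decode("utf-8", errors="replace")
    # Crude heuristic: the anchor should appear as an id="..." attribute in the HTML.
    anchor_ok = (not anchor) or (f'id="{anchor}"' in html)
    return status, anchor_ok

# Example: the suggested redirection from item 1 below.
status, anchor_ok = check_link(
    "https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/"
    "4.12/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/"
    "index#preparing_to_deploy_openshift_data_foundation"
)
print(f"HTTP {status}, anchor present: {anchor_ok}")
~~~
If a suggested redirection is valid, this should report an HTTP 200 and "anchor present: True"; a False result means the anchor ID could not be found on the page and the link needs another look.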

1)
Preface

 To deploy OpenShift Data Foundation in internal mode, start with the requirements in "Preparing to deploy OpenShift Data Foundation" chapter and follow the appropriate deployment process based on your requirement:
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/index
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/index#preparing_to_deploy_openshift_data_foundation

 Deploy OpenShift Data Foundation on Google Cloud
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/index
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/index#deploying_openshift_data_foundation_on_google_cloud

 Deploy standalone Multicloud Object Gateway component
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/index
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/index#deploy-standalone-multicloud-object-gateway


2) 
Chapter 1. Preparing to deploy OpenShift Data Foundation:
 When the Token authentication method is selected for encryption then refer to "Enabling cluster-wide encryption with the Token authentication using KMS". 
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/index
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/index#enabling-cluster-wide-encryprtion-with-the-token-authentication-using-kms_gcp

3)
Chapter 2. Deploying OpenShift Data Foundation on Google Cloud:
 For more information, see "Deploy standalone Multicloud Object Gateway".
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/index
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/index#deploy-standalone-multicloud-object-gateway

 Ensure that you have addressed the requirements in "Preparing to deploy OpenShift Data Foundation" chapter before proceeding with the below steps for deploying using dynamic storage devices:
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/index
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/index#preparing_to_deploy_openshift_data_foundation

 Create the OpenShift Data Foundation Cluster. 
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/index
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/index#creating-an-openshift-data-foundation-service_gcp

4) 
 2.4. Creating an OpenShift Data Foundation cluster
 To verify that all components for OpenShift Data Foundation are successfully installed, see "Verifying your OpenShift Data Foundation deployment".
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/index
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/index#verifying_openshift_data_foundation_deployment

5)
 2.5.2. Verifying the OpenShift Data Foundation cluster is healthy
 For more information on the health of the OpenShift Data Foundation cluster using the Block and File dashboard, see "Monitoring OpenShift Data Foundation". 
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/monitoring_openshift_data_foundation/index
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/monitoring_openshift_data_foundation/index#block_and_file_dashboard_indicators

6) 
 2.5.3. Verifying the Multicloud Object Gateway is healthy
 For more information on the health of the OpenShift Data Foundation cluster using the object service dashboard, see "Monitoring OpenShift Data Foundation". 
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/monitoring_openshift_data_foundation/index
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/monitoring_openshift_data_foundation/index#object_dashboard_indicators
 
7) 
 5.2. Creating a storage class for persistent volume encryption
 Using vaulttokens: Ensure to configure access as described in "Configuring access to KMS using vaulttokens"
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/managing_and_allocating_storage_resources/index
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/managing_and_allocating_storage_resources/index#access-configuration-for-key-management-system_rhodf
 
 Using vaulttenantsa (Technology Preview): Ensure to configure access as described in "Configuring access to KMS using vaulttenantsa"
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/managing_and_allocating_storage_resources/index
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/managing_and_allocating_storage_resources/index#configuring-access-to-kms-using-vaulttenantsa_rhodf
 
 The storage class can be used to create encrypted persistent volumes. For more information, see "managing persistent volume claims".
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/managing_and_allocating_storage_resources/index
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/managing_and_allocating_storage_resources/index#storage-class-for-persistent-volume-encryption_rhodf       (unsure about this link, so please double-check)

8)
 Chapter 6. Configure storage for OpenShift Container Platform services
 Red Hat recommends configuring shorter curation and retention intervals for these services. See "Configuring the Curator schedule"
 https://access.redhat.com/documentation/en-us/openshift_container_platform/4.9/html-single/logging/index
 https://access.redhat.com/documentation/en-us/openshift_container_platform/4.9/html-single/logging/index#cluster-logging-elasticsearch-retention_cluster-logging-store   (unsure about this link, so please confirm)

 and the Modifying retention time for Prometheus metrics data sub section of "Configuring persistent storage" in the OpenShift Container Platform documentation for details. 
 https://access.redhat.com/documentation/en-us/openshift_container_platform/4.9/html-single/monitoring/index
 https://access.redhat.com/documentation/en-us/openshift_container_platform/4.9/html-single/monitoring/index#modifying-retention-time-for-prometheus-metrics-data_configuring-the-monitoring-stack

9) 
 6.2. Configuring monitoring to use OpenShift Data Foundation
 Red Hat recommends configuring a short retention interval for this service. See the "Modifying retention time for Prometheus metrics data" of Monitoring guide in the OpenShift Container Platform documentation for details. 
 https://access.redhat.com/documentation/en-us/openshift_container_platform/4.12/html-single/monitoring/index
 https://access.redhat.com/documentation/en-us/openshift_container_platform/4.9/html-single/monitoring/index#modifying-retention-time-for-prometheus-metrics-data_configuring-the-monitoring-stack

10) 
 6.3. Cluster logging for OpenShift Data Foundation
 Red Hat recommends configuring shorter curation and retention intervals for these services. See "Cluster logging curator" in the OpenShift Container Platform documentation for details. 
 https://access.redhat.com/documentation/en-us/openshift_container_platform/4.9/html-single/logging/index
 Unable to find "Cluster logging curator" section.

11)
 6.3.1. Configuring persistent storage
 For information about Elasticsearch replication policies, see Elasticsearch replication policy in About deploying and configuring cluster logging. 
 https://access.redhat.com/documentation/en-us/openshift_container_platform/4.12/html-single/logging/index
 Unable to find "Elasticsearch replication policy" section

12)
 6.3.2. Configuring cluster logging to use OpenShift Data Foundation
 For more details, see Curation of Elasticsearch Data. 
 https://access.redhat.com/documentation/en-us/openshift_container_platform/4.12/html-single/logging/index
 Unable to find "Curation of Elasticsearch Data"section.

13)
 9.1. Requirements for scaling storage nodes
 "Platform requirements"
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/planning_your_deployment/index
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/planning_your_deployment/index#platform-requirements_rhodf

 "Dynamic storage devices"
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/planning_your_deployment/index
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/planning_your_deployment/index#dynamic_storage_devices

"Capacity planning"
https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/planning_your_deployment/index
https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/planning_your_deployment/index#capacity_planning

14) 
 9.3.2. Scaling up storage capacity
 After you add a new node to OpenShift Data Foundation, you must scale up the storage capacity as described in "Scaling up storage by adding capacity". 
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/index
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/index#scaling_out_storage_capacity_by_adding_new_nodes

15) 
 10.2. Accessing the Multicloud Object Gateway with your applications
 Alternatively, you can install the MCG package from the OpenShift Data Foundation RPMs found at "Download RedHat OpenShift Data Foundation page".
 https://access.redhat.com/downloads/content/547/ver=4/rhel---8/4/x86_64/packages
 Unable to find rpms.

16) 
 10.4.1. Creating a new backing store
 For more information on creating an OCP secret, see the section "Creating the secret" in the Openshift Container Platform documentation. 
 https://access.redhat.com/documentation/en-us/openshift_container_platform/4.12/html-single/authentication_and_authorization/index
 https://access.redhat.com/documentation/en-us/openshift_container_platform/4.12/html-single/authentication_and_authorization/index#identity-provider-creating-secret-tls_configuring-keystone-identity-provider

17)
 10.4.2.1. Creating an AWS-backed backingstore
 Alternatively, you can install the MCG package from the OpenShift Data Foundation RPMs found here https://access.redhat.com/downloads/content/547/ver=4/rhel---8/4/x86_64/packages 
  1) Unable to find rpms. 
  2) Create hyperlink

18) 
 10.4.2.2. Creating an IBM COS-backed backingstore
 Alternatively, you can install the MCG package from the OpenShift Data Foundation RPMs found here https://access.redhat.com/downloads/content/547/ver=4/rhel---8/4/x86_64/packages 
  1) Unable to find rpms. 
  2) Create hyperlink

19)
 10.4.2.3. Creating an Azure-backed backingstore
 Alternatively, you can install the MCG package from the OpenShift Data Foundation RPMs found here https://access.redhat.com/downloads/content/547/ver=4/rhel---8/4/x86_64/packages 
  1) Unable to find rpms. 
  2) Create hyperlink

20) 
 10.4.2.4. Creating a GCP-backed backingstore
 Alternatively, you can install the MCG package from the OpenShift Data Foundation RPMs found here https://access.redhat.com/downloads/content/547/ver=4/rhel---8/4/x86_64/packages 
  1) Unable to find rpms. 
  2) Create hyperlink
  
21)
 10.4.2.5. Creating a local Persistent Volume-backed backingstore
 Alternatively, you can install the MCG package from the OpenShift Data Foundation RPMs found here https://access.redhat.com/downloads/content/547/ver=4/rhel---8/4/x86_64/packages 
  1) Unable to find rpms. 
  2) Create hyperlink   
  
22)  
 10.4.3. Creating an s3 compatible Multicloud Object Gateway backingstore
  v. To get the <RGW endpoint>, see "Accessing the RADOS Object Gateway S3 endpoint". 
  https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/managing_hybrid_and_multicloud_resources/index
  https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/managing_hybrid_and_multicloud_resources/index#Accessing-the-RADOS-Object-Gateway-S3-endpoint_rhodf
  
  iii. To get the <RGW endpoint>, see "Accessing the RADOS Object Gateway S3 endpoint".
  https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/managing_hybrid_and_multicloud_resources/index
    https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/managing_hybrid_and_multicloud_resources/index#Accessing-the-RADOS-Object-Gateway-S3-endpoint_rhodf
    
23)
 10.4.5. Creating a new bucket class
 Select at least one Backing Store resource from the available list if you have selected Tier 1 - Policy Type as Spread and click Next. Alternatively, you can also "create a new backing store". 
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/managing_hybrid_and_multicloud_resources/index
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/managing_hybrid_and_multicloud_resources/index#creating-a-new-backing-store_rhodf
 
24)
10.5. Managing namespace buckets
https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.9/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/index#amazon-s3-api-endpoints-for-objects-in-namespace-buckets_gcp
https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.9/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/index#amazon-s3-api-endpoints-for-objects-in-namespace-buckets_rhodf

25)
 10.5.2. Adding a namespace bucket using the Multicloud Object Gateway CLI and YAML
 For more information about namespace buckets, see "Managing namespace buckets". 
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.9/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/index
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.9/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/index#Managing-namespace-buckets_rhodf
 
 "Adding an AWS S3 namespace bucket using YAML" 
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.9/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/index
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.9/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/index#adding-an-aws-s3-namespace-bucket-using-yaml_rhodf

 "Adding an IBM COS namespace bucket using YAML"
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.9/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/index
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.9/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/index#adding-an-ibm-cos-s3-namespace-bucket-using-yaml_rhodf

 "Adding an AWS S3 namespace bucket using the Multicloud Object Gateway CLI"
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.9/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/index
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.9/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/index#adding-an-aws-s3-namespace-bucket-using-the-multicloud-object-gateway-cli_rhodf

 "Adding an IBM COS namespace bucket using the Multicloud Object Gateway CLI"
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.9/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/index
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.9/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/index#adding-an-ibm-cos-s3-namespace-bucket-using-the-multicloud-object-gateway-cli_rhodf

26)
 10.5.2.1. Adding an AWS S3 namespace bucket using YAML
 For information, see Chapter 2, "Accessing the Multicloud Object Gateway with your applications". 
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/managing_hybrid_and_multicloud_resources/index
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/managing_hybrid_and_multicloud_resources/index#accessing-the-multicloud-object-gateway-with-your-applications_rhodf

27) 
 10.5.2.2. Adding an IBM COS namespace bucket using YAML
 Access to the Multicloud Object Gateway (MCG), see Chapter 2, "Accessing the Multicloud Object Gateway with your applications". 
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/managing_hybrid_and_multicloud_resources/index
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/managing_hybrid_and_multicloud_resources/index#accessing-the-multicloud-object-gateway-with-your-applications_rhodf
  
28)  
 10.5.2.3. Adding an AWS S3 namespace bucket using the Multicloud Object Gateway CLI
 Access to the Multicloud Object Gateway (MCG), see Chapter 2, "Accessing the Multicloud Object Gateway with your applications". 
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/managing_hybrid_and_multicloud_resources/index
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/managing_hybrid_and_multicloud_resources/index#accessing-the-multicloud-object-gateway-with-your-applications_rhodf
   
 Alternatively, you can install the MCG package from the OpenShift Data Foundation RPMs found here https://access.redhat.com/downloads/content/547/ver=4/rhel---8/4/x86_64/package. 
      1) Unable to find rpms. 
      2) Create hyperlink   
      
29)      
 10.5.2.4. Adding an IBM COS namespace bucket using the Multicloud Object Gateway CLI
 Access to the Multicloud Object Gateway (MCG), see Chapter 2, "Accessing the Multicloud Object Gateway with your applications". 
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/managing_hybrid_and_multicloud_resources/index
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/managing_hybrid_and_multicloud_resources/index#accessing-the-multicloud-object-gateway-with-your-applications_rhodf
   

 Alternatively, you can install the MCG package from the OpenShift Data Foundation RPMs found here https://access.redhat.com/downloads/content/547/ver=4/rhel---8/4/x86_64/package. 
      1) Unable to find rpms. 
      2) Create hyperlink 

30) 
 10.5.3. Adding a namespace bucket using the OpenShift Container Platform user interface
 For information about namespace buckets, see "Managing namespace buckets".
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/managing_hybrid_and_multicloud_resources/index
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/managing_hybrid_and_multicloud_resources/index#Managing-namespace-buckets_rhodf
 
31) 
 10.7.3. Creating a user in the Multicloud Object Gateway
 Alternatively, you can install the MCG package from the OpenShift Data Foundation RPMs found at "Download RedHat OpenShift Data Foundation page". 
  1) Unable to find rpms

32)  
10.8.1. Dynamic Object Bucket Claim
 See "Accessing the Multicloud Object Gateway with your applications" for more information. 
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html/managing_hybrid_and_multicloud_resources/object-bucket-claim#dynamic-object-bucket-claim_rhodf
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html/managing_hybrid_and_multicloud_resources/accessing-the-multicloud-object-gateway-with-your-applications_rhodf

33)
 10.9. Caching policy for object buckets
 "AWS S3"
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.9/html-single/managing_hybrid_and_multicloud_resources/index
 Incorrect link or incorrect content.

"IBM COS"
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.9/html-single/managing_hybrid_and_multicloud_resources/index
 Incorrect link or incorrect content.

34) 
 10.9.1. Creating an AWS cache bucket
 Alternatively, you can install the MCG package from the OpenShift Data Foundation RPMs found here https://access.redhat.com/downloads/content/547/ver=4/rhel---8/4/x86_64/packages 
  1) Unable to find rpms. 
  2) Create hyperlink   

35)  
 10.9.2. Creating an IBM COS cache bucket
 Alternatively, you can install the MCG package from the OpenShift Data Foundation RPMs found here https://access.redhat.com/downloads/content/547/ver=4/rhel---8/4/x86_64/packages 
  1) Unable to find rpms. 
  2) Create hyperlink   
 
36)  
 15.1. Replacing operational or failed storage devices on Google Cloud installer-provisioned infrastructure
 "Replacing operational nodes on Google Cloud installer-provisioned infrastructure"
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/index
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/index#replacing-operational-nodes-on-google-cloud-installer-provisioned-infrastructure_rhodf

 "Replacing failed nodes on Google Cloud installer-provisioned infrastructures."
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/index
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/deploying_and_managing_openshift_data_foundation_using_google_cloud/index#replacing-failed-nodes-on-google-cloud-installer-provisioned-infrastructures_rhodf

37)
 16.1. Overview of the OpenShift Data Foundation update process
 To prepare a disconnected environment for updates, see "Operators guide to using Operator Lifecycle Manager on restricted networks" to be able to update OpenShift Data Foundation as well as Local Storage Operator when in use. 
 https://access.redhat.com/documentation/en-us/openshift_container_platform/4.12/html-single/operators/index
 https://access.redhat.com/documentation/en-us/openshift_container_platform/4.12/html-single/operators/index#olm-restricted-networks

 For updating between minor releases, see "Updating Red Hat OpenShift Data Foundation 4.11 to 4.12". 
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/updating_openshift_data_foundation/index
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/updating_openshift_data_foundation/index#updating-ocs-to-odf_rhodf

 For updating between z-stream releases, see "Updating Red Hat OpenShift Data Foundation 4.12.x to 4.12.y". 
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/updating_openshift_data_foundation/index
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/updating_openshift_data_foundation/index#updating-zstream-odf_rhodf
 
 For updating external mode deployments, you must also perform the steps from section "Updating the Red Hat OpenShift Data Foundation external secret".
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/updating_openshift_data_foundation/index
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/updating_openshift_data_foundation/index#updating-the-openshift-data-foundation-external-secret_rhodf

38) 
 16.2. Updating Red Hat OpenShift Data Foundation 4.11 to 4.12
 After updating external mode deployments, you must also update the external secret. For instructions, see "Updating the OpenShift Data Foundation external secret". 
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/updating_openshift_data_foundation/index
 https://access.redhat.com/documentation/en-us/red_hat_openshift_data_foundation/4.12/html-single/updating_openshift_data_foundation/index#updating-the-openshift-data-foundation-external-secret_rhodf

Regards,
Geo Jose

