Red Hat Bugzilla – Attachment 1871709 Details for Bug 2073920
rook osd prepare failed with this error - failed to set kek as an environment variable: key encryption key is empty
Description: list of pods, storageclass and storagecluster yaml
Filename:    logs.txt
MIME Type:   text/plain
Creator:     Tal Yichye
Created:     2022-04-11 06:40:03 UTC
Size:        10.77 KB
storagecluster yaml:

apiVersion: v1
items:
- apiVersion: ocs.openshift.io/v1
  kind: StorageCluster
  metadata:
    annotations:
      uninstall.ocs.openshift.io/cleanup-policy: delete
      uninstall.ocs.openshift.io/mode: graceful
    creationTimestamp: "2022-04-06T11:31:01Z"
    finalizers:
    - storagecluster.ocs.openshift.io
    generation: 2
    name: ocs-storagecluster
    namespace: openshift-storage
    ownerReferences:
    - apiVersion: odf.openshift.io/v1alpha1
      kind: StorageSystem
      name: ocs-storagecluster-storagesystem
      uid: d285fa96-f148-4c69-9420-8110baa7dcbe
    resourceVersion: "401797462"
    uid: 958efe45-2a88-4c3a-8ed9-164f8e5a87f6
  spec:
    arbiter: {}
    encryption:
      kms: {}
    externalStorage: {}
    managedResources:
      cephBlockPools: {}
      cephCluster: {}
      cephConfig: {}
      cephDashboard: {}
      cephFilesystems: {}
      cephObjectStoreUsers: {}
      cephObjectStores: {}
    mirroring: {}
    nodeTopologies: {}
    storageDeviceSets:
    - config: {}
      count: 1
      dataPVCTemplate:
        metadata: {}
        spec:
          accessModes:
          - ReadWriteOnce
          resources:
            requests:
              storage: 512Gi
          storageClassName: ibm-odf-test
          volumeMode: Block
        status: {}
      name: ocs-deviceset-ibm-odf-test
      placement: {}
      portable: true
      preparePlacement: {}
      replica: 3
      resources: {}
    version: 4.10.0
  status:
    conditions:
    - lastHeartbeatTime: "2022-04-11T05:32:39Z"
      lastTransitionTime: "2022-04-06T11:31:01Z"
      message: 'Error while reconciling: some StorageClasses [ocs-storagecluster-ceph-rbd]
        were skipped while waiting for pre-requisites to be met'
      reason: ReconcileFailed
      status: "False"
      type: ReconcileComplete
    - lastHeartbeatTime: "2022-04-06T11:31:01Z"
      lastTransitionTime: "2022-04-06T11:31:01Z"
      message: Initializing StorageCluster
      reason: Init
      status: "False"
      type: Available
    - lastHeartbeatTime: "2022-04-06T11:31:01Z"
      lastTransitionTime: "2022-04-06T11:31:01Z"
      message: Initializing StorageCluster
      reason: Init
      status: "True"
      type: Progressing
    - lastHeartbeatTime: "2022-04-06T11:31:01Z"
      lastTransitionTime: "2022-04-06T11:31:01Z"
      message: Initializing StorageCluster
      reason: Init
      status: "False"
      type: Degraded
    - lastHeartbeatTime: "2022-04-06T11:31:01Z"
      lastTransitionTime: "2022-04-06T11:31:01Z"
      message: Initializing StorageCluster
      reason: Init
      status: Unknown
      type: Upgradeable
    externalStorage:
      grantedCapacity: "0"
    failureDomain: rack
    failureDomainKey: topology.rook.io/rack
    failureDomainValues:
    - rack0
    - rack1
    - rack2
    images:
      ceph:
        actualImage: quay.io/rhceph-dev/rhceph@sha256:82acc1ae5b6ee7f4c9100dbc803054b8edd7b77c64461966ecba621b3380f14b
        desiredImage: quay.io/rhceph-dev/rhceph@sha256:82acc1ae5b6ee7f4c9100dbc803054b8edd7b77c64461966ecba621b3380f14b
      noobaaCore:
        desiredImage: quay.io/rhceph-dev/odf4-mcg-core-rhel8@sha256:805be8933e81fa2b037a843f9bf521f46ce25b828b28854c1d377dedc23f7f08
      noobaaDB:
        desiredImage: quay.io/rhceph-dev/rhel8-postgresql-12@sha256:be7212e938d1ef314a75aca070c28b6433cd0346704d0d3523c8ef403ff0c69e
    nodeTopologies:
      labels:
        kubernetes.io/hostname:
        - cluster4-68nrb-worker-pwsm6
        - cluster4-68nrb-worker-q6fmj
        - cluster4-68nrb-worker-vq2tb
        topology.rook.io/rack:
        - rack0
        - rack1
        - rack2
    phase: Error
    relatedObjects:
    - apiVersion: ceph.rook.io/v1
      kind: CephCluster
      name: ocs-storagecluster-cephcluster
      namespace: openshift-storage
      resourceVersion: "401797458"
      uid: 6204015a-6717-4b78-86d5-87486922a4ef
kind: List
metadata:
  resourceVersion: ""
  selfLink: ""

-----------------------------------------------------------------------------

oc get storageclass

NAME                          PROVISIONER                             RECLAIMPOLICY   VOLUMEBINDINGMODE   ALLOWVOLUMEEXPANSION   AGE
ibm-odf-test                  block.csi.ibm.com                       Delete          Immediate           false                  4d18h
ocs-storagecluster-ceph-rgw   openshift-storage.ceph.rook.io/bucket   Delete          Immediate           false                  4d18h
ocs-storagecluster-cephfs     openshift-storage.cephfs.csi.ceph.com   Delete          Immediate           true                   4d18h
thin (default)                kubernetes.io/vsphere-volume            Delete          Immediate           false                  415d

-----------------------------------------------------------------------------

oc get pods:

NAME                                                              READY   STATUS             RESTARTS         AGE
cluster4-68nrb-worker-pwsm6-debug                                 1/1     Running            0                6s
cluster4-68nrb-worker-q6fmj-debug                                 1/1     Running            0                6s
cluster4-68nrb-worker-vq2tb-debug                                 1/1     Running            0                6s
csi-addons-controller-manager-7d4b5fc974-xj878                    2/2     Running            0                4d18h
csi-cephfsplugin-btqj4                                            3/3     Running            0                4d18h
csi-cephfsplugin-dzzcc                                            3/3     Running            0                4d18h
csi-cephfsplugin-g9p2s                                            3/3     Running            0                4d18h
csi-cephfsplugin-mvnrk                                            3/3     Running            0                4d18h
csi-cephfsplugin-nzltr                                            3/3     Running            0                4d18h
csi-cephfsplugin-provisioner-6f56bddcf4-dstrh                     6/6     Running            0                4d18h
csi-cephfsplugin-provisioner-6f56bddcf4-q6c5v                     6/6     Running            0                4d18h
csi-cephfsplugin-xvhg7                                            3/3     Running            0                4d18h
csi-rbdplugin-4bcvz                                               4/4     Running            0                4d18h
csi-rbdplugin-bqs28                                               4/4     Running            0                4d18h
csi-rbdplugin-khqcj                                               4/4     Running            0                4d18h
csi-rbdplugin-n9wz7                                               4/4     Running            0                4d18h
csi-rbdplugin-provisioner-854fc798b5-662x2                        7/7     Running            0                4d18h
csi-rbdplugin-provisioner-854fc798b5-zr5cz                        7/7     Running            0                4d18h
csi-rbdplugin-trf7f                                               4/4     Running            0                4d18h
csi-rbdplugin-wgz9w                                               4/4     Running            0                4d18h
ibm-block-csi-controller-0                                        7/7     Running            0                4d18h
ibm-block-csi-node-4dzdc                                          3/3     Running            0                4d18h
ibm-block-csi-node-d4tbj                                          3/3     Running            0                4d18h
ibm-block-csi-node-fh7xh                                          3/3     Running            0                4d18h
ibm-block-csi-node-kn8l7                                          3/3     Running            0                4d18h
ibm-block-csi-node-llsw6                                          3/3     Running            0                4d18h
ibm-block-csi-node-xxckc                                          3/3     Running            0                4d18h
ibm-block-csi-operator-676cf5b87b-74n2z                           1/1     Running            0                4d18h
ibm-flashsystem-storage-ibm-odf-test-9f578dd65-6lhqm              1/1     Running            0                4d18h
ibm-odf-console-bbb8896d9-t9fdb                                   1/1     Running            0                4d18h
ibm-storage-odf-operator-5947f89458-4rhvq                         2/2     Running            0                4d18h
noobaa-operator-768d97c86c-x76cq                                  1/1     Running            0                4d18h
ocs-metrics-exporter-5b54fdc4c4-nxl7p                             1/1     Running            0                4d18h
ocs-operator-757d5cdd76-f7bqp                                     1/1     Running            0                4d18h
odf-console-6b455944fc-tbfwb                                      1/1     Running            0                4d18h
odf-operator-controller-manager-b845564b4-9g27n                   2/2     Running            0                4d18h
rook-ceph-crashcollector-cluster4-68nrb-worker-pwsm6-86648nsrgx   1/1     Running            0                4d18h
rook-ceph-crashcollector-cluster4-68nrb-worker-q6fmj-85d7btg2r7   1/1     Running            0                4d18h
rook-ceph-crashcollector-cluster4-68nrb-worker-vq2tb-847b7c4xpw   1/1     Running            0                4d17h
rook-ceph-mds-ocs-storagecluster-cephfilesystem-a-747bd7d5r79rb   2/2     Running            0                4d18h
rook-ceph-mds-ocs-storagecluster-cephfilesystem-b-79459d77pqt6g   2/2     Running            0                4d17h
rook-ceph-mgr-a-67c9d6d8bb-vvsrl                                  2/2     Running            0                4d18h
rook-ceph-mon-a-54546fcd74-xv56d                                  2/2     Running            0                4d18h
rook-ceph-mon-b-845bd5bc5d-rp5gb                                  2/2     Running            0                4d17h
rook-ceph-mon-c-b49cd7d56-2gmgt                                   2/2     Running            0                4d18h
rook-ceph-operator-5cbccdd5b-5tmxc                                1/1     Running            0                4d17h
rook-ceph-osd-prepare-fbaff53b5e0ba80173198d7e8b474657-qkqc4      0/1     CrashLoopBackOff   5 (2m38s ago)    6m2s
rook-ceph-rgw-ocs-storagecluster-cephobjectstore-a-5bd4d78jskmc   1/2     Running            1225 (24s ago)   4d18h
rook-ceph-tools-5974886644-dtrml                                  1/1     Running            0                4d17h