Description of problem:
The Local Storage Operator does not work with the "All namespaces on the cluster" installation mode.

Version-Release number of selected component (if applicable):
quay.io/openshift-release-dev/ocp-v4.0-art-dev:v4.2.0-201908181300-ose-local-storage-operator

How reproducible:
100%

Steps to Reproduce:
1. Uploaded the local storage operator to OperatorHub as a custom operator, using the manifests in https://github.com/openshift/local-storage-operator/tree/release-4.2/manifests
2. Created a project named local-storage.
3. Installed the local storage operator with the "All namespaces on the cluster" installation mode.
4. Created a LocalVolume instance in the local-storage project:

$ oc get localvolume local-disks -n local-storage -oyaml
apiVersion: local.storage.openshift.io/v1
kind: LocalVolume
metadata:
  creationTimestamp: "2019-08-20T05:31:54Z"
  generation: 1
  name: local-disks
  namespace: local-storage
  resourceVersion: "526669"
  selfLink: /apis/local.storage.openshift.io/v1/namespaces/local-storage/localvolumes/local-disks
  uid: d199ca9c-c30b-11e9-af54-0251f29b9002
spec:
  nodeSelector:
    nodeSelectorTerms:
    - matchExpressions:
      - key: kubernetes.io/hostname
        operator: In
        values:
        - ip-10-0-129-132
  storageClassDevices:
  - devicePaths:
    - /dev/vde
    - /dev/vdf
    fsType: xfs
    storageClassName: local-sc
    volumeMode: Filesystem

5. Check the deployment of the local storage controller manager.

Actual results:
The storage controller manager is not deployed.

Expected results:
Either the storage controller manager is deployed successfully, or the operator does not offer the "All namespaces on the cluster" installation mode.

Additional info:
If the LocalVolume instance is created in the openshift-operators project instead, the local storage controller manager is deployed successfully:

$ oc get pod -n openshift-operators
NAME                                      READY   STATUS    RESTARTS   AGE
local-disks-local-diskmaker-wn98w         1/1     Running   0          14m
local-disks-local-provisioner-8tr58       1/1     Running   0          14m
local-storage-operator-844986dc88-94slr   1/1     Running   0          106m
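For context, this behavior is consistent with how OLM implements the "All namespaces on the cluster" mode: the operator is deployed into the openshift-operators project under an OperatorGroup that has no targetNamespaces, so an operator that only reconciles CRs in its own namespace will never see a LocalVolume created in local-storage. A minimal sketch of such a global OperatorGroup follows; the name global-operators is the usual OLM default and is an assumption here, not taken from this report.

apiVersion: operators.coreos.com/v1
kind: OperatorGroup
metadata:
  name: global-operators          # assumed OLM default name, not from this report
  namespace: openshift-operators
spec: {}                          # no targetNamespaces => operator targets all namespaces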
Created attachment 1610664 [details]
The web console used to install the local storage operator.
Failed to verify.

1. Installed the local storage operator in the local-storage project.
2. After creating a LocalVolume instance in the local-storage project, no local storage provisioner runs.

$ oc get pod -n local-storage
NAME                                     READY   STATUS    RESTARTS   AGE
local-storage-operator-6b6785b4d-mc89m   1/1     Running   0          7m35s

$ oc get localvolume -n local-storage -oyaml
apiVersion: v1
items:
- apiVersion: local.storage.openshift.io/v1
  kind: LocalVolume
  metadata:
    creationTimestamp: "2019-09-02T05:44:48Z"
    finalizers:
    - storage.openshift.com/local-volume-protection
    generation: 2
    name: example
    namespace: local-storage
    resourceVersion: "65869"
    selfLink: /apis/local.storage.openshift.io/v1/namespaces/local-storage/localvolumes/example
    uid: c69ba252-cd44-11e9-a3ad-42010a000003
  spec:
    logLevel: Normal
    managementState: Managed
    storageClassDevices:
    - devicePaths:
      - /dev/loop1
      - /dev/loop2
      fsType: ext4
      storageClassName: local-sc
      volumeMode: Filesystem
  status:
    conditions:
    - lastTransitionTime: "2019-09-02T05:48:39Z"
      message: |-
        error syncing local storage: error applying pv cluster role binding local-storage-provisioner-pv-binding: clusterrolebindings.rbac.authorization.k8s.io "local-storage-provisioner-pv-binding" is forbidden: user "system:serviceaccount:local-storage:local-storage-operator" (groups=["system:serviceaccounts" "system:serviceaccounts:local-storage" "system:authenticated"]) is attempting to grant RBAC permissions not currently held:
        {APIGroups:[""], Resources:["events"], Verbs:["watch" "create" "patch" "update"]}
        {APIGroups:[""], Resources:["persistentvolumeclaims"], Verbs:["get" "list" "update" "watch"]}
      status: "False"
      type: Available
    readyReplicas: 0
kind: List
metadata:
  resourceVersion: ""
  selfLink: ""

$ oc logs local-storage-operator-6b6785b4d-mc89m -n local-storage
I0902 05:40:26.050474       1 main.go:18] Go Version: go1.11.6
I0902 05:40:26.050575       1 main.go:19] Go OS/Arch: linux/amd64
I0902 05:40:26.050588       1 main.go:20] operator-sdk Version: 0.0.7
time="2019-09-02T05:40:26Z" level=info msg="Metrics service local-storage-operator created"
I0902 05:40:26.251922       1 main.go:36] Watching local.storage.openshift.io/v1, LocalVolume
I0902 05:40:26.251931       1 main.go:41] Watching local.storage.openshift.io/v1, LocalVolume, local-storage, 180000000000
I0902 05:44:48.517712       1 api_updater.go:75] Updating localvolume local-storage/example
E0902 05:44:48.637661       1 controller.go:135] error applying pv cluster role binding local-storage-provisioner-pv-binding: clusterrolebindings.rbac.authorization.k8s.io "local-storage-provisioner-pv-binding" is forbidden: user "system:serviceaccount:local-storage:local-storage-operator" (groups=["system:serviceaccounts" "system:serviceaccounts:local-storage" "system:authenticated"]) is attempting to grant RBAC permissions not currently held:
{APIGroups:[""], Resources:["events"], Verbs:["watch" "create" "patch" "update"]}
{APIGroups:[""], Resources:["persistentvolumeclaims"], Verbs:["get" "list" "update" "watch"]}
time="2019-09-02T05:44:48Z" level=error msg="error syncing key (local-storage/example): error applying pv cluster role binding local-storage-provisioner-pv-binding: clusterrolebindings.rbac.authorization.k8s.io \"local-storage-provisioner-pv-binding\" is forbidden: user \"system:serviceaccount:local-storage:local-storage-operator\" (groups=[\"system:serviceaccounts\" \"system:serviceaccounts:local-storage\" \"system:authenticated\"]) is attempting to grant RBAC permissions not currently held:\n{APIGroups:[\"\"], Resources:[\"events\"], Verbs:[\"watch\" \"create\" \"patch\" \"update\"]}\n{APIGroups:[\"\"], Resources:[\"persistentvolumeclaims\"], Verbs:[\"get\" \"list\" \"update\" \"watch\"]}"
E0902 05:44:48.752440       1 controller.go:135] error applying pv cluster role binding local-storage-provisioner-pv-binding: clusterrolebindings.rbac.authorization.k8s.io "local-storage-provisioner-pv-binding" is forbidden: user "system:serviceaccount:local-storage:local-storage-operator" (groups=["system:serviceaccounts" "system:serviceaccounts:local-storage" "system:authenticated"]) is attempting to grant RBAC permissions not currently held:
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA.

For information on the advisory, and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2019:2922