Bug 1743482 - local storage operator cannot support "All namespaces on the cluster" installation mode
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: OpenShift Container Platform
Classification: Red Hat
Component: Storage
Version: 4.2.0
Hardware: Unspecified
OS: Unspecified
Priority: high
Severity: high
Target Milestone: ---
Target Release: 4.2.0
Assignee: Hemant Kumar
QA Contact: Qin Ping
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2019-08-20 05:56 UTC by Qin Ping
Modified: 2019-10-16 06:37 UTC (History)
3 users

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2019-10-16 06:36:35 UTC
Target Upstream Version:
Embargoed:


Attachments
The web console to install the local storage operator. (110.79 KB, image/png)
2019-09-02 05:38 UTC, Qin Ping


Links
System ID / Status / Summary / Last Updated
Github openshift local-storage-operator pull 40, closed: Bug 1743482: Fix install modes (2020-07-17 05:21:20 UTC)
Red Hat Product Errata RHBA-2019:2922 (2019-10-16 06:37:22 UTC)

Description Qin Ping 2019-08-20 05:56:10 UTC
Description of problem:
The local storage operator cannot support the "All namespaces on the cluster" installation mode.

Version-Release number of selected component (if applicable):
quay.io/openshift-release-dev/ocp-v4.0-art-dev:v4.2.0-201908181300-ose-local-storage-operator

How reproducible:
100%

Steps to Reproduce:
1. Uploaded the local storage operator to OperatorHub as a custom operator, using the manifests in https://github.com/openshift/local-storage-operator/tree/release-4.2/manifests
2. Created a project named local-storage
3. Installed the local storage operator with the "All namespaces on the cluster" installation mode
4. Created a LocalVolume instance in the local-storage project
$ oc get localvolume local-disks -n local-storage -oyaml
apiVersion: local.storage.openshift.io/v1
kind: LocalVolume
metadata:
  creationTimestamp: "2019-08-20T05:31:54Z"
  generation: 1
  name: local-disks
  namespace: local-storage
  resourceVersion: "526669"
  selfLink: /apis/local.storage.openshift.io/v1/namespaces/local-storage/localvolumes/local-disks
  uid: d199ca9c-c30b-11e9-af54-0251f29b9002
spec:
  nodeSelector:
    nodeSelectorTerms:
    - matchExpressions:
      - key: kubernetes.io/hostname
        operator: In
        values:
        - ip-10-0-129-132
  storageClassDevices:
  - devicePaths:
    - /dev/vde
    - /dev/vdf
    fsType: xfs
    storageClassName: local-sc
    volumeMode: Filesystem
5. Check the deployment of local storage controller manager

Actual results:
The storage controller manager is not deployed.

Expected results:
Either the storage controller manager is deployed successfully, or the operator should not offer the "All namespaces on the cluster" installation mode.

Additional info:
If the LocalVolume instance is created in the openshift-operators project instead, the local storage controller manager is deployed successfully.
$ oc get pod -n openshift-operators
NAME                                      READY   STATUS    RESTARTS   AGE
local-disks-local-diskmaker-wn98w         1/1     Running   0          14m
local-disks-local-provisioner-8tr58       1/1     Running   0          14m
local-storage-operator-844986dc88-94slr   1/1     Running   0          106m
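For reference, an equivalent LocalVolume that targets the openshift-operators project (only the namespace differs; all other fields are copied from the instance shown in step 4) would look like this sketch:

```yaml
# Sketch only: identical to the LocalVolume from step 4, except for the
# namespace. Node name and device paths are the ones from this report.
apiVersion: local.storage.openshift.io/v1
kind: LocalVolume
metadata:
  name: local-disks
  namespace: openshift-operators   # the namespace the operator itself runs in
spec:
  nodeSelector:
    nodeSelectorTerms:
    - matchExpressions:
      - key: kubernetes.io/hostname
        operator: In
        values:
        - ip-10-0-129-132
  storageClassDevices:
  - devicePaths:
    - /dev/vde
    - /dev/vdf
    fsType: xfs
    storageClassName: local-sc
    volumeMode: Filesystem
```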

Comment 5 Qin Ping 2019-09-02 05:38:51 UTC
Created attachment 1610664 [details]
The web console to install the local storage operator.

Comment 6 Qin Ping 2019-09-02 05:50:41 UTC
Failed to verify.


1. Installed the local storage operator in the local-storage project.
2. After creating a LocalVolume instance in the local-storage project, no local storage provisioner pods are running.

$ oc get pod -n local-storage
NAME                                     READY   STATUS    RESTARTS   AGE
local-storage-operator-6b6785b4d-mc89m   1/1     Running   0          7m35s

$ oc get localvolume -n local-storage -oyaml
apiVersion: v1
items:
- apiVersion: local.storage.openshift.io/v1
  kind: LocalVolume
  metadata:
    creationTimestamp: "2019-09-02T05:44:48Z"
    finalizers:
    - storage.openshift.com/local-volume-protection
    generation: 2
    name: example
    namespace: local-storage
    resourceVersion: "65869"
    selfLink: /apis/local.storage.openshift.io/v1/namespaces/local-storage/localvolumes/example
    uid: c69ba252-cd44-11e9-a3ad-42010a000003
  spec:
    logLevel: Normal
    managementState: Managed
    storageClassDevices:
    - devicePaths:
      - /dev/loop1
      - /dev/loop2
      fsType: ext4
      storageClassName: local-sc
      volumeMode: Filesystem
  status:
    conditions:
    - lastTransitionTime: "2019-09-02T05:48:39Z"
      message: |-
        error syncing local storage: error applying pv cluster role binding local-storage-provisioner-pv-binding: clusterrolebindings.rbac.authorization.k8s.io "local-storage-provisioner-pv-binding" is forbidden: user "system:serviceaccount:local-storage:local-storage-operator" (groups=["system:serviceaccounts" "system:serviceaccounts:local-storage" "system:authenticated"]) is attempting to grant RBAC permissions not currently held:
        {APIGroups:[""], Resources:["events"], Verbs:["watch" "create" "patch" "update"]}
        {APIGroups:[""], Resources:["persistentvolumeclaims"], Verbs:["get" "list" "update" "watch"]}
      status: "False"
      type: Available
    readyReplicas: 0
kind: List
metadata:
  resourceVersion: ""
  selfLink: ""

$ oc logs local-storage-operator-6b6785b4d-mc89m -n local-storage
I0902 05:40:26.050474       1 main.go:18] Go Version: go1.11.6
I0902 05:40:26.050575       1 main.go:19] Go OS/Arch: linux/amd64
I0902 05:40:26.050588       1 main.go:20] operator-sdk Version: 0.0.7
time="2019-09-02T05:40:26Z" level=info msg="Metrics service local-storage-operator created"
I0902 05:40:26.251922       1 main.go:36] Watching local.storage.openshift.io/v1, LocalVolume
I0902 05:40:26.251931       1 main.go:41] Watching local.storage.openshift.io/v1, LocalVolume, local-storage, 180000000000
I0902 05:44:48.517712       1 api_updater.go:75] Updating localvolume local-storage/example
E0902 05:44:48.637661       1 controller.go:135] error applying pv cluster role binding local-storage-provisioner-pv-binding: clusterrolebindings.rbac.authorization.k8s.io "local-storage-provisioner-pv-binding" is forbidden: user "system:serviceaccount:local-storage:local-storage-operator" (groups=["system:serviceaccounts" "system:serviceaccounts:local-storage" "system:authenticated"]) is attempting to grant RBAC permissions not currently held:
{APIGroups:[""], Resources:["events"], Verbs:["watch" "create" "patch" "update"]}
{APIGroups:[""], Resources:["persistentvolumeclaims"], Verbs:["get" "list" "update" "watch"]}
time="2019-09-02T05:44:48Z" level=error msg="error syncing key (local-storage/example): error applying pv cluster role binding local-storage-provisioner-pv-binding: clusterrolebindings.rbac.authorization.k8s.io \"local-storage-provisioner-pv-binding\" is forbidden: user \"system:serviceaccount:local-storage:local-storage-operator\" (groups=[\"system:serviceaccounts\" \"system:serviceaccounts:local-storage\" \"system:authenticated\"]) is attempting to grant RBAC permissions not currently held:\n{APIGroups:[\"\"], Resources:[\"events\"], Verbs:[\"watch\" \"create\" \"patch\" \"update\"]}\n{APIGroups:[\"\"], Resources:[\"persistentvolumeclaims\"], Verbs:[\"get\" \"list\" \"update\" \"watch\"]}"
E0902 05:44:48.752440       1 controller.go:135] error applying pv cluster role binding local-storage-provisioner-pv-binding: clusterrolebindings.rbac.authorization.k8s.io "local-storage-provisioner-pv-binding" is forbidden: user "system:serviceaccount:local-storage:local-storage-operator" (groups=["system:serviceaccounts" "system:serviceaccounts:local-storage" "system:authenticated"]) is attempting to grant RBAC permissions not currently held:
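The failure above is Kubernetes RBAC escalation prevention: a service account cannot create a ClusterRoleBinding that grants permissions it does not itself hold. As a rough cluster-admin workaround sketch (the ClusterRole/binding names below are illustrative; the rules are copied verbatim from the error message, and the service account name and namespace come from the `system:serviceaccount:local-storage:local-storage-operator` identity in the log), the missing rules could be granted to the operator's service account directly:

```yaml
# Illustrative only: grant the operator SA the two rule sets the error
# reports as "not currently held", so it can create the pv binding itself.
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRole
metadata:
  name: local-storage-operator-missing-rules   # assumed name
rules:
- apiGroups: [""]
  resources: ["events"]
  verbs: ["watch", "create", "patch", "update"]
- apiGroups: [""]
  resources: ["persistentvolumeclaims"]
  verbs: ["get", "list", "update", "watch"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRoleBinding
metadata:
  name: local-storage-operator-missing-rules   # assumed name
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: ClusterRole
  name: local-storage-operator-missing-rules
subjects:
- kind: ServiceAccount
  name: local-storage-operator
  namespace: local-storage
```

The proper fix, per the linked pull request, is to correct the install modes and the permissions shipped in the operator's own manifests rather than patching RBAC by hand.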

Comment 11 errata-xmlrpc 2019-10-16 06:36:35 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2019:2922

