Bug 1878038

Summary: cannot install operator with index image because of error "GLIBC_2.28 not found (required by opm)"
Product: OpenShift Container Platform
Reporter: kuiwang
Component: OLM
Assignee: Evan Cordell <ecordell>
OLM sub component: OLM
QA Contact: kuiwang
Status: CLOSED DUPLICATE
Docs Contact:
Severity: high
Priority: unspecified
CC: bandrade, jiazha, krizza, scolange, tbuskey, yhui
Version: 4.6
Target Milestone: ---
Target Release: 4.6.0
Hardware: Unspecified
OS: Unspecified
Whiteboard:
Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Story Points: ---
Clone Of:
Environment:
Last Closed: 2020-09-11 12:49:26 UTC
Type: Bug
Regression: ---
Mount Type: ---
Documentation: ---
CRM:
Verified Versions:
Category: ---
oVirt Team: ---
RHEL 7.3 requirements from Atomic Host:
Cloudforms Team: ---
Target Upstream Version:
Embargoed:

Description kuiwang 2020-09-11 07:00:15 UTC
Description of problem:
I built an index image with the current latest opm and then tried to install an operator from it, but the install fails with "opm: /lib64/libc.so.6: version `GLIBC_2.28' not found (required by opm)".
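
The message points to a binary/runtime mismatch: the opm binary was linked against glibc 2.28 (RHEL 8) while the image that runs it ships an older glibc. A quick way to confirm this (a sketch, not captured in the run below; <extract-image> stands for the extract container image shown in the pod yaml further down, and it assumes opm sits at /bin/opm in that image and binutils is installed locally):

  # pull the opm binary out of the image used by the failing extract container
  oc image extract <extract-image> --file=/bin/opm
  # list the glibc symbol versions the extracted binary requires
  objdump -T opm | grep -o 'GLIBC_[0-9.]*' | sort -Vu
  # print the glibc version the same image actually provides (glibc's libc.so.6 is executable)
  docker run --rm --entrypoint /lib64/libc.so.6 <extract-image>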


Version-Release number of selected component (if applicable):
[root@preserve-olm-env operator-registry]# bin/opm version
Version: version.Version{OpmVersion:"v1.13.8-14-gcccf27a", GitCommit:"cccf27a", BuildDate:"2020-09-11T05:24:15Z", GoOs:"linux", GoArch:"amd64"}

[root@preserve-olm-env OCP-25880]# oc get pod -n openshift-operator-lifecycle-manager
NAME                                READY   STATUS    RESTARTS   AGE
catalog-operator-7bcdf58995-dt4tk   1/1     Running   1          15m
olm-operator-6c67f6b8cc-zglcj       1/1     Running   0          15m
packageserver-68b8ddf76f-8b8qd      1/1     Running   0          14m
packageserver-68b8ddf76f-f99d4      1/1     Running   0          14m
[root@preserve-olm-env OCP-25880]# oc exec catalog-operator-7bcdf58995-dt4tk -n openshift-operator-lifecycle-manager -- olm --version
OLM version: 0.16.1
git commit: 18dd8b36f2ca9d4778cedb5b4ec166c0ff886ece

How reproducible:
always

Steps to Reproduce:

You can also refer to http://pastebin.test.redhat.com/901420 (available for one month).
---
@@@check the opm version (currently the latest)
[root@preserve-olm-env operator-registry]# git log -n 1
commit cccf27a2efe982269cdeec69a7b4643c2824a614
Merge: e92049a f1a3c75
Author: OpenShift Merge Robot <openshift-merge-robot.github.com>
Date:   Wed Sep 9 20:36:14 2020 -0400

    Merge pull request #437 from ecordell/fix-workdir
    
    Bug 1877603: add workdir permissions back
[root@preserve-olm-env operator-registry]# make clean
[root@preserve-olm-env operator-registry]# make build
GOFLAGS="-mod=vendor" go build  -tags "json1" -o bin/appregistry-server ./cmd/appregistry-server
GOFLAGS="-mod=vendor" go build  -tags "json1" -o bin/configmap-server ./cmd/configmap-server
GOFLAGS="-mod=vendor" go build  -tags "json1" -o bin/initializer ./cmd/initializer
GOFLAGS="-mod=vendor" go build  -tags "json1" -o bin/registry-server ./cmd/registry-server
GOFLAGS="-mod=vendor" go build -ldflags "-X 'github.com/operator-framework/operator-registry/cmd/opm/version.gitCommit=cccf27a' -X 'github.com/operator-framework/operator-registry/cmd/opm/version.opmVersion=v1.13.8-14-gcccf27a' -X 'github.com/operator-framework/operator-registry/cmd/opm/version.buildDate=2020-09-11T05:24:15Z'"  -tags "json1" -o bin/opm ./cmd/opm
[root@preserve-olm-env operator-registry]# bin/opm version
Version: version.Version{OpmVersion:"v1.13.8-14-gcccf27a", GitCommit:"cccf27a", BuildDate:"2020-09-11T05:24:15Z", GoOs:"linux", GoArch:"amd64"}

@@@prepare index image
[root@preserve-olm-env operator-registry]# rm -fr bundle.Dockerfile manifests/portworx-essentials/metadata/
[root@preserve-olm-env operator-registry]# ./bin/opm alpha bundle build --directory /root/kuiwang/operator-registry/manifests/portworx-essentials/1.3.4 --tag quay.io/kuiwang/portworx-essentials:v1.3.4 --package portworx-essentials --channels stable --default stable
INFO[0000] Building annotations.yaml                    
INFO[0000] Writing annotations.yaml in /root/kuiwang/operator-registry/manifests/portworx-essentials/metadata 
INFO[0000] Building Dockerfile                          
INFO[0000] Writing bundle.Dockerfile in /root/kuiwang/operator-registry 
INFO[0000] Building bundle image                        
Sending build context to Docker daemon  103.8MB
Step 1/9 : FROM scratch
 ---> 
Step 2/9 : LABEL operators.operatorframework.io.bundle.mediatype.v1=registry+v1
 ---> Running in f0ca8647c698
Removing intermediate container f0ca8647c698
 ---> e6fb7095f3cd
Step 3/9 : LABEL operators.operatorframework.io.bundle.manifests.v1=manifests/
 ---> Running in 00e01f9a116c
Removing intermediate container 00e01f9a116c
 ---> 5eb3a6fbfd5e
Step 4/9 : LABEL operators.operatorframework.io.bundle.metadata.v1=metadata/
 ---> Running in 60f05a41253c
Removing intermediate container 60f05a41253c
 ---> 87fbf571174a
Step 5/9 : LABEL operators.operatorframework.io.bundle.package.v1=portworx-essentials
 ---> Running in b0688d70b77d
Removing intermediate container b0688d70b77d
 ---> 508bbae03080
Step 6/9 : LABEL operators.operatorframework.io.bundle.channels.v1=stable
 ---> Running in e4be52d9ef4c
Removing intermediate container e4be52d9ef4c
 ---> 6405b542765e
Step 7/9 : LABEL operators.operatorframework.io.bundle.channel.default.v1=stable
 ---> Running in be8dbae3c889
Removing intermediate container be8dbae3c889
 ---> 0eee245b3726
Step 8/9 : COPY manifests/portworx-essentials/1.3.4 /manifests/
 ---> fd44a651ec8c
Step 9/9 : COPY manifests/portworx-essentials/metadata /metadata/
 ---> 6d7f26869ba1
Successfully built 6d7f26869ba1
Successfully tagged quay.io/kuiwang/portworx-essentials:v1.3.4
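
(Optional sanity check, not part of the original run.) Before pushing, the bundle image can be checked with opm's validator; in this opm version the subcommand looks roughly like:

  # validate bundle format and manifests of the freshly built image
  ./bin/opm alpha bundle validate --tag quay.io/kuiwang/portworx-essentials:v1.3.4 --image-builder docker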
[root@preserve-olm-env operator-registry]# docker push quay.io/kuiwang/portworx-essentials:v1.3.4
The push refers to repository [quay.io/kuiwang/portworx-essentials]
5a0b9be3447a: Pushed 
27d2e5467f5d: Pushed 
v1.3.4: digest: sha256:30dad01006718439ce85bf76171413cf32653c54c119d1c3ef646b0b12b3ce32 size: 733
[root@preserve-olm-env operator-registry]# ./bin/opm index add --bundles quay.io/kuiwang/portworx-essentials:v1.3.4 --tag quay.io/kuiwang/olm-isv:v1 -c docker
INFO[0000] building the index                            bundles="[quay.io/kuiwang/portworx-essentials:v1.3.4]"
INFO[0000] running /usr/bin/docker pull quay.io/kuiwang/portworx-essentials:v1.3.4  bundles="[quay.io/kuiwang/portworx-essentials:v1.3.4]"
INFO[0000] running docker create                         bundles="[quay.io/kuiwang/portworx-essentials:v1.3.4]"
INFO[0000] running docker cp                             bundles="[quay.io/kuiwang/portworx-essentials:v1.3.4]"
INFO[0000] running docker rm                             bundles="[quay.io/kuiwang/portworx-essentials:v1.3.4]"
INFO[0000] Could not find optional dependencies file     dir=bundle_tmp077337778 file=bundle_tmp077337778/metadata load=annotations
INFO[0000] found csv, loading bundle                     dir=bundle_tmp077337778 file=bundle_tmp077337778/manifests load=bundle
INFO[0000] loading bundle file                           dir=bundle_tmp077337778/manifests file=core_v1alpha1_storagecluster_crd.yaml load=bundle
INFO[0000] loading bundle file                           dir=bundle_tmp077337778/manifests file=core_v1alpha1_storagenode_crd.yaml load=bundle
INFO[0000] loading bundle file                           dir=bundle_tmp077337778/manifests file=portworxoperator.v1.3.4.clusterserviceversion.yaml load=bundle
INFO[0001] Generating dockerfile                         bundles="[quay.io/kuiwang/portworx-essentials:v1.3.4]"
INFO[0001] writing dockerfile: index.Dockerfile391095168  bundles="[quay.io/kuiwang/portworx-essentials:v1.3.4]"
INFO[0001] running docker build                          bundles="[quay.io/kuiwang/portworx-essentials:v1.3.4]"
INFO[0001] [docker build -f index.Dockerfile391095168 -t quay.io/kuiwang/olm-isv:v1 .]  bundles="[quay.io/kuiwang/portworx-essentials:v1.3.4]"
[root@preserve-olm-env operator-registry]# docker push quay.io/kuiwang/olm-isv:v1
The push refers to repository [quay.io/kuiwang/olm-isv]
0b54fe64116b: Pushed 
912e10e7963c: Layer already exists 
98a5965029a0: Layer already exists 
4150c4f2e6df: Layer already exists 
50644c29ef5a: Layer already exists 
v1: digest: sha256:890249c16655b1461dcac2799cdd0dbea32d778ab02b7776b72679ef25efdfce size: 1371
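
(Optional check, not part of the original run.) The index image can also be served locally and queried over gRPC to confirm the package made it in, assuming grpcurl is installed:

  # serve the index on the default registry port
  docker run -d --rm -p 50051:50051 quay.io/kuiwang/olm-isv:v1
  # list packages served by the index
  grpcurl -plaintext localhost:50051 api.Registry/ListPackages
  # show channels/CSVs for the package that was just added
  grpcurl -plaintext -d '{"name":"portworx-essentials"}' localhost:50051 api.Registry/GetPackage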

@@@apply the OperatorGroup, CatalogSource and Subscription
[root@preserve-olm-env OCP-25880]# cat og-single.yaml 
kind: OperatorGroup
apiVersion: operators.coreos.com/v1
metadata:
  name: og-single
  namespace: default
spec:
  targetNamespaces:
  - default

[root@preserve-olm-env OCP-25880]# oc apply -f og-single.yaml 
operatorgroup.operators.coreos.com/og-single created
[root@preserve-olm-env OCP-25880]# cat catsrcv1.yaml 
apiVersion: operators.coreos.com/v1alpha1
kind: CatalogSource
metadata:
  name: olm-isv-v1-catalog
  namespace: default
spec:
  displayName: OLM isv v1 Operator Catalog
  image: quay.io/kuiwang/olm-isv:v1
  icon:
    base64data: ""
    mediatype: ""
  publisher: QE
  sourceType: grpc
  updateStrategy:
    registryPoll:
      interval: 10m0s
  priority: 1


[root@preserve-olm-env OCP-25880]# oc apply -f catsrcv1.yaml 
catalogsource.operators.coreos.com/olm-isv-v1-catalog created
[root@preserve-olm-env OCP-25880]# cat sub-portworx-v1.yaml 
apiVersion: operators.coreos.com/v1alpha1
kind: Subscription
metadata:
  name: portworx-essentials
  namespace: default
spec:
  channel: stable
  installPlanApproval: Automatic
  name: portworx-essentials
  source: olm-isv-v1-catalog
  sourceNamespace: default
  startingCSV: ""

[root@preserve-olm-env OCP-25880]# oc apply -f sub-portworx-v1.yaml 
subscription.operators.coreos.com/portworx-essentials created
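
Typical commands to follow the install from here (illustrative; their output was not captured in the original run):

  # watch resolution and install progress
  oc get subscription,installplan,csv -n default
  # the catalog operator unpacks the bundle via a Job; its pod is the one crashing below
  oc get jobs -n default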

[root@preserve-olm-env OCP-25880]# oc get pod
NAME                                                              READY   STATUS             RESTARTS   AGE
be8f4d684883b7f1246d763765a3481a193debe9c4ea009237fffb7386hl5jd   0/1     CrashLoopBackOff   1          14s
olm-isv-v1-catalog-4hdpn                                          1/1     Running            0          38s
[root@preserve-olm-env OCP-25880]# oc logs be8f4d684883b7f1246d763765a3481a193debe9c4ea009237fffb7386hl5jd
@@@opm: /lib64/libc.so.6: version `GLIBC_2.28' not found (required by opm)
[root@preserve-olm-env OCP-25880]# oc get pod be8f4d684883b7f1246d763765a3481a193debe9c4ea009237fffb7386hl5jd -o yaml
apiVersion: v1
kind: Pod
metadata:
  annotations:
...
  name: be8f4d684883b7f1246d763765a3481a193debe9c4ea009237fffb7386hl5jd
  namespace: default
  ownerReferences:
  - apiVersion: batch/v1
    blockOwnerDeletion: true
    controller: true
    kind: Job
    name: be8f4d684883b7f1246d763765a3481a193debe9c4ea009237fffb7386568e2
    uid: c83ebe6e-26eb-40ac-9147-2df0db6586f5
  resourceVersion: "346796"
  selfLink: /api/v1/namespaces/default/pods/be8f4d684883b7f1246d763765a3481a193debe9c4ea009237fffb7386hl5jd
  uid: 8aa6b9d4-f2b7-4b23-9e3e-66803f59ce97
spec:
  containers:
  - command:
    - opm
    - alpha
    - bundle
    - extract
    - -m
    - /bundle/
    - -n
    - default
    - -c
    - be8f4d684883b7f1246d763765a3481a193debe9c4ea009237fffb7386568e2
    env:
    - name: CONTAINER_IMAGE
      value: quay.io/kuiwang/portworx-essentials:v1.3.4
    image: quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:232e321e42279d5660b64bb7010b239948f7f59e6b3309c86102392724b8f3d6
    imagePullPolicy: IfNotPresent
    name: extract
    resources: {}
    terminationMessagePath: /dev/termination-log
    terminationMessagePolicy: File
    volumeMounts:
    - mountPath: /bundle
      name: bundle
    - mountPath: /var/run/secrets/kubernetes.io/serviceaccount
      name: default-token-9f4xw
      readOnly: true
  dnsPolicy: ClusterFirst
  enableServiceLinks: true
  imagePullSecrets:
  - name: default-dockercfg-j6hr9
  initContainers:
  - command:
    - /bin/cp
    - -Rv
    - /bin/cpb
    - /util/cpb
    image: quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c227645623ec8e3fa78dd4640c3c237ffccb7a2b6bffcd7f90752140648c3128
    imagePullPolicy: IfNotPresent
    name: util
    resources: {}
    terminationMessagePath: /dev/termination-log
    terminationMessagePolicy: File
    volumeMounts:
    - mountPath: /util
      name: util
    - mountPath: /var/run/secrets/kubernetes.io/serviceaccount
      name: default-token-9f4xw
      readOnly: true
  - command:
    - /util/cpb
    - /bundle
    image: quay.io/kuiwang/portworx-essentials:v1.3.4
    imagePullPolicy: Always
    name: pull
    resources: {}
    terminationMessagePath: /dev/termination-log
    terminationMessagePolicy: File
    volumeMounts:
    - mountPath: /bundle
      name: bundle
    - mountPath: /util
      name: util
    - mountPath: /var/run/secrets/kubernetes.io/serviceaccount
      name: default-token-9f4xw
      readOnly: true
  nodeName: ip-10-0-143-213.us-east-2.compute.internal
  preemptionPolicy: PreemptLowerPriority
  priority: 0
  restartPolicy: OnFailure
  schedulerName: default-scheduler
  securityContext: {}
  serviceAccount: default
  serviceAccountName: default
  terminationGracePeriodSeconds: 30
  tolerations:
  - effect: NoExecute
    key: node.kubernetes.io/not-ready
    operator: Exists
    tolerationSeconds: 300
  - effect: NoExecute
    key: node.kubernetes.io/unreachable
    operator: Exists
    tolerationSeconds: 300
  volumes:
  - emptyDir: {}
    name: bundle
  - emptyDir: {}
    name: util
  - name: default-token-9f4xw
    secret:
      defaultMode: 420
      secretName: default-token-9f4xw
status:
  conditions:
  - lastProbeTime: null
    lastTransitionTime: "2020-09-11T05:36:30Z"
    status: "True"
    type: Initialized
  - lastProbeTime: null
    lastTransitionTime: "2020-09-11T05:36:25Z"
    message: 'containers with unready status: [extract]'
    reason: ContainersNotReady
    status: "False"
    type: Ready
  - lastProbeTime: null
    lastTransitionTime: "2020-09-11T05:36:25Z"
    message: 'containers with unready status: [extract]'
    reason: ContainersNotReady
    status: "False"
    type: ContainersReady
  - lastProbeTime: null
    lastTransitionTime: "2020-09-11T05:36:24Z"
    status: "True"
    type: PodScheduled
  containerStatuses:
  - containerID: cri-o://9f0bcc0164327fb5c6fcf429eedddb2f09e5e29fb85d99b55ec6de2b617397d3
    image: quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:232e321e42279d5660b64bb7010b239948f7f59e6b3309c86102392724b8f3d6
    imageID: quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:232e321e42279d5660b64bb7010b239948f7f59e6b3309c86102392724b8f3d6
    lastState:
      terminated:
        containerID: cri-o://9f0bcc0164327fb5c6fcf429eedddb2f09e5e29fb85d99b55ec6de2b617397d3
        exitCode: 1
        finishedAt: "2020-09-11T05:37:12Z"
        reason: Error
        startedAt: "2020-09-11T05:37:12Z"
    name: extract
    ready: false
    restartCount: 3
    started: false
    state:
      waiting:
        message: back-off 40s restarting failed container=extract pod=be8f4d684883b7f1246d763765a3481a193debe9c4ea009237fffb7386hl5jd_default(8aa6b9d4-f2b7-4b23-9e3e-66803f59ce97)
        reason: CrashLoopBackOff
  hostIP: 10.0.143.213
  initContainerStatuses:
  - containerID: cri-o://ea05f8b9bbd5fa059b52957719626923498c5131b8555a075070b23086a6c956
    image: quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c227645623ec8e3fa78dd4640c3c237ffccb7a2b6bffcd7f90752140648c3128
    imageID: quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c227645623ec8e3fa78dd4640c3c237ffccb7a2b6bffcd7f90752140648c3128
    lastState: {}
    name: util
    ready: true
    restartCount: 0
    state:
      terminated:
        containerID: cri-o://ea05f8b9bbd5fa059b52957719626923498c5131b8555a075070b23086a6c956
        exitCode: 0
        finishedAt: "2020-09-11T05:36:27Z"
        reason: Completed
        startedAt: "2020-09-11T05:36:27Z"
  - containerID: cri-o://027aaafda3b45dd836cc9a7033edd366e1860a83ce443fbb1eee25c50b829f53
    image: quay.io/kuiwang/portworx-essentials:v1.3.4
    imageID: quay.io/kuiwang/portworx-essentials@sha256:30dad01006718439ce85bf76171413cf32653c54c119d1c3ef646b0b12b3ce32
    lastState: {}
    name: pull
    ready: true
    restartCount: 0
    state:
      terminated:
        containerID: cri-o://027aaafda3b45dd836cc9a7033edd366e1860a83ce443fbb1eee25c50b829f53
        exitCode: 0
        finishedAt: "2020-09-11T05:36:30Z"
        reason: Completed
        startedAt: "2020-09-11T05:36:29Z"
  phase: Running
  podIP: 10.129.2.135
  podIPs:
  - ip: 10.129.2.135
  qosClass: BestEffort
  startTime: "2020-09-11T05:36:25Z"
---
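
For context: the crashing pod is owned by a Job (see ownerReferences above), which OLM's catalog operator creates to unpack the bundle by running "opm alpha bundle extract" from the release-payload opm image, so the failing binary is the one shipped in that image, not the locally built opm. Illustrative commands to inspect that Job (not from the original run):

  # dump the bundle unpack Job created by the catalog operator
  oc get job be8f4d684883b7f1246d763765a3481a193debe9c4ea009237fffb7386568e2 -n default -o yaml
  # logs of the extract container, which is where the GLIBC error surfaces
  oc logs job/be8f4d684883b7f1246d763765a3481a193debe9c4ea009237fffb7386568e2 -n default -c extract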

Actual results:
cannot install the operator

Expected results:
the operator can be installed successfully

Additional info:

Comment 2 Evan Cordell 2020-09-11 12:49:26 UTC

*** This bug has been marked as a duplicate of bug 1878023 ***