Bug 1869441 - operator installation fails if there is a skipped version in the index image for multiple operator versions [NEEDINFO]
Keywords:
Status: VERIFIED
Alias: None
Product: OpenShift Container Platform
Classification: Red Hat
Component: OLM
Version: 4.6
Hardware: Unspecified
OS: Unspecified
Priority: high
Severity: high
Target Milestone: ---
Target Release: 4.6.0
Assignee: Vu Dinh
QA Contact: kuiwang
URL:
Whiteboard:
: 1867254 1870388 1874053
Depends On:
Blocks:
 
Reported: 2020-08-18 02:33 UTC by kuiwang
Modified: 2020-09-18 01:10 UTC (History)
9 users

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed:
Target Upstream Version:
kuiwang: needinfo? (ecordell)


Attachments


Links
System ID Priority Status Summary Last Updated
Github operator-framework operator-lifecycle-manager pull 1735 None closed Bug 1869441: Add skips information to Operator representation 2020-09-14 01:20:22 UTC

Description kuiwang 2020-08-18 02:33:21 UTC
Description of problem:
When building an index image from bundle images, if one of the bundles declares a skipped version, installation of the operator fails with "found more than one head for channel".

Note: this is a different issue from https://bugzilla.redhat.com/show_bug.cgi?id=1867254
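For context, "skip version" here refers to the `skips` field in a bundle's ClusterServiceVersion, which tells OLM that a release may be bypassed in the upgrade path. A minimal sketch of such a CSV (the exact version numbers and the skip relationship are hypothetical, for illustration only):

```yaml
# Hypothetical CSV snippet: v0.3.0 upgrades directly from v0.1.0,
# marking v0.2.0 as skippable in the update graph.
apiVersion: operators.coreos.com/v1alpha1
kind: ClusterServiceVersion
metadata:
  name: atlasmap-operator.v0.3.0
spec:
  version: 0.3.0
  replaces: atlasmap-operator.v0.1.0
  skips:
  - atlasmap-operator.v0.2.0
```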


Version-Release number of selected component (if applicable):
[root@preserve-olm-env operator-registry]# bin/opm version
Version: version.Version{OpmVersion:"v1.13.7", GitCommit:"018a040", BuildDate:"2020-08-13T03:00:21Z", GoOs:"linux", GoArch:"amd64"}

How reproducible:
always

Steps to Reproduce:
opm version
[root@preserve-olm-env operator-registry]# bin/opm version
Version: version.Version{OpmVersion:"v1.13.7", GitCommit:"018a040", BuildDate:"2020-08-13T03:00:21Z", GoOs:"linux", GoArch:"amd64"}
 
Get the atlasmap-operator package manifests from the community-operators repo
[root@preserve-olm-env operator-registry]# cp -fr /root/kuiwang/community-operators/community-operators/atlasmap-operator manifests/
 
Build bundle image quay.io/olmqe/atlasmap-operator:v0.1.0-23473 and push it to quay.io
[root@preserve-olm-env operator-registry]# ./bin/opm alpha bundle build --directory /root/kuiwang/operator-registry/manifests/atlasmap-operator/0.1.0 --tag quay.io/olmqe/atlasmap-operator:v0.1.0-23473 --package atlasmap-operator --channels alpha --default alpha
INFO[0000] Building annotations.yaml                    
INFO[0000] Writing annotations.yaml in /root/kuiwang/operator-registry/manifests/atlasmap-operator/metadata
INFO[0000] Building Dockerfile                          
INFO[0000] Writing bundle.Dockerfile in /root/kuiwang/operator-registry
INFO[0000] Building bundle image                        
Sending build context to Docker daemon  102.9MB
Step 1/9 : FROM scratch
 --->
Step 2/9 : LABEL operators.operatorframework.io.bundle.mediatype.v1=registry+v1
 ---> Running in 3fe7f29303f4
Removing intermediate container 3fe7f29303f4
 ---> fd63ad4aaf7b
Step 3/9 : LABEL operators.operatorframework.io.bundle.manifests.v1=manifests/
 ---> Running in aec5205ea8b2
Removing intermediate container aec5205ea8b2
 ---> cc82b6176bd7
Step 4/9 : LABEL operators.operatorframework.io.bundle.metadata.v1=metadata/
 ---> Running in 1c9fe67f4536
Removing intermediate container 1c9fe67f4536
 ---> 2f20b94923f2
Step 5/9 : LABEL operators.operatorframework.io.bundle.package.v1=atlasmap-operator
 ---> Running in a572ff4525cc
Removing intermediate container a572ff4525cc
 ---> f1384a34b68d
Step 6/9 : LABEL operators.operatorframework.io.bundle.channels.v1=alpha
 ---> Running in 84bc0034a15c
Removing intermediate container 84bc0034a15c
 ---> 4d327d5031fe
Step 7/9 : LABEL operators.operatorframework.io.bundle.channel.default.v1=alpha
 ---> Running in b0bc80f1a842
Removing intermediate container b0bc80f1a842
 ---> 53f3dfe06461
Step 8/9 : COPY manifests/atlasmap-operator/0.1.0 /manifests/
 ---> cd70a7474534
Step 9/9 : COPY manifests/atlasmap-operator/metadata /metadata/
 ---> 7bb39506d8ee
Successfully built 7bb39506d8ee
Successfully tagged quay.io/olmqe/atlasmap-operator:v0.1.0-23473
[root@preserve-olm-env operator-registry]# docker logout quay.io
Removing login credentials for quay.io
[root@preserve-olm-env operator-registry]# docker login quay.io
Username: kuiwang
Password:
WARNING! Your password will be stored unencrypted in /root/.docker/config.json.
Configure a credential helper to remove this warning. See
https://docs.docker.com/engine/reference/commandline/login/#credentials-store
 
Login Succeeded
[root@preserve-olm-env operator-registry]# docker push quay.io/olmqe/atlasmap-operator:v0.1.0-23473
The push refers to repository [quay.io/olmqe/atlasmap-operator]
0b02adb0b0ad: Pushed
311fb1d2bbd1: Pushed
v0.1.0-23473: digest: sha256:f57cdc6b2669afd5b1729f96a81e1ef48346346b18e4e74c1662818538e12537 size: 732
[root@preserve-olm-env operator-registry]# rm -fr bundle.Dockerfile
[root@preserve-olm-env operator-registry]# cd manifests/atlasmap-operator/
[root@preserve-olm-env atlasmap-operator]# ls
0.1.0  0.2.0  0.3.0  atlasmap-operator.package.yaml  metadata
[root@preserve-olm-env atlasmap-operator]# rm -fr metadata/
[root@preserve-olm-env atlasmap-operator]# cd -
/root/kuiwang/operator-registry
 
 
Build bundle image quay.io/olmqe/atlasmap-operator:v0.2.0-23473 and push it to quay.io
[root@preserve-olm-env operator-registry]# ./bin/opm alpha bundle build --directory /root/kuiwang/operator-registry/manifests/atlasmap-operator/0.2.0 --tag quay.io/olmqe/atlasmap-operator:v0.2.0-23473 --package atlasmap-operator --channels alpha --default alpha
INFO[0000] Building annotations.yaml                    
INFO[0000] Writing annotations.yaml in /root/kuiwang/operator-registry/manifests/atlasmap-operator/metadata
INFO[0000] Building Dockerfile                          
INFO[0000] Writing bundle.Dockerfile in /root/kuiwang/operator-registry
INFO[0000] Building bundle image                        
Sending build context to Docker daemon  102.9MB
Step 1/9 : FROM scratch
 --->
Step 2/9 : LABEL operators.operatorframework.io.bundle.mediatype.v1=registry+v1
 ---> Using cache
 ---> fd63ad4aaf7b
Step 3/9 : LABEL operators.operatorframework.io.bundle.manifests.v1=manifests/
 ---> Using cache
 ---> cc82b6176bd7
Step 4/9 : LABEL operators.operatorframework.io.bundle.metadata.v1=metadata/
 ---> Using cache
 ---> 2f20b94923f2
Step 5/9 : LABEL operators.operatorframework.io.bundle.package.v1=atlasmap-operator
 ---> Using cache
 ---> f1384a34b68d
Step 6/9 : LABEL operators.operatorframework.io.bundle.channels.v1=alpha
 ---> Using cache
 ---> 4d327d5031fe
Step 7/9 : LABEL operators.operatorframework.io.bundle.channel.default.v1=alpha
 ---> Using cache
 ---> 53f3dfe06461
Step 8/9 : COPY manifests/atlasmap-operator/0.2.0 /manifests/
 ---> 23352af01042
Step 9/9 : COPY manifests/atlasmap-operator/metadata /metadata/
 ---> d738a15b0ac2
Successfully built d738a15b0ac2
Successfully tagged quay.io/olmqe/atlasmap-operator:v0.2.0-23473
[root@preserve-olm-env operator-registry]# docker push quay.io/olmqe/atlasmap-operator:v0.2.0-23473
The push refers to repository [quay.io/olmqe/atlasmap-operator]
58a78dd5345f: Pushed
2d326b926751: Pushed
v0.2.0-23473: digest: sha256:b170594fe57504620f7e4f66d85bbcd4cf60c5f8db57ea655daedd44c6560604 size: 732
[root@preserve-olm-env operator-registry]#
[root@preserve-olm-env operator-registry]#
[root@preserve-olm-env operator-registry]#
[root@preserve-olm-env operator-registry]#
[root@preserve-olm-env operator-registry]# rm -fr bundle.Dockerfile  manifests/atlasmap-operator/metadata
 
Build bundle image quay.io/olmqe/atlasmap-operator:v0.3.0-23473 and push it to quay.io
[root@preserve-olm-env operator-registry]# ./bin/opm alpha bundle build --directory /root/kuiwang/operator-registry/manifests/atlasmap-operator/0.3.0 --tag quay.io/olmqe/atlasmap-operator:v0.3.0-23473 --package atlasmap-operator --channels alpha --default alpha
INFO[0000] Building annotations.yaml                    
INFO[0000] Writing annotations.yaml in /root/kuiwang/operator-registry/manifests/atlasmap-operator/metadata
INFO[0000] Building Dockerfile                          
INFO[0000] Writing bundle.Dockerfile in /root/kuiwang/operator-registry
INFO[0000] Building bundle image                        
Sending build context to Docker daemon  102.9MB
Step 1/9 : FROM scratch
 --->
Step 2/9 : LABEL operators.operatorframework.io.bundle.mediatype.v1=registry+v1
 ---> Using cache
 ---> fd63ad4aaf7b
Step 3/9 : LABEL operators.operatorframework.io.bundle.manifests.v1=manifests/
 ---> Using cache
 ---> cc82b6176bd7
Step 4/9 : LABEL operators.operatorframework.io.bundle.metadata.v1=metadata/
 ---> Using cache
 ---> 2f20b94923f2
Step 5/9 : LABEL operators.operatorframework.io.bundle.package.v1=atlasmap-operator
 ---> Using cache
 ---> f1384a34b68d
Step 6/9 : LABEL operators.operatorframework.io.bundle.channels.v1=alpha
 ---> Using cache
 ---> 4d327d5031fe
Step 7/9 : LABEL operators.operatorframework.io.bundle.channel.default.v1=alpha
 ---> Using cache
 ---> 53f3dfe06461
Step 8/9 : COPY manifests/atlasmap-operator/0.3.0 /manifests/
 ---> 267acad38d83
Step 9/9 : COPY manifests/atlasmap-operator/metadata /metadata/
 ---> 72c3529b4ae8
Successfully built 72c3529b4ae8
Successfully tagged quay.io/olmqe/atlasmap-operator:v0.3.0-23473
[root@preserve-olm-env operator-registry]# docker push quay.io/olmqe/atlasmap-operator:v0.3.0-23473
The push refers to repository [quay.io/olmqe/atlasmap-operator]
b39479778b49: Pushed
d86e14465617: Pushed
v0.3.0-23473: digest: sha256:6473074c12c06b1e07ee9b8691e67759bc1ce10eda5ec5cfb77626224fb5a12b size: 733
[root@preserve-olm-env operator-registry]# rm -fr bundle.Dockerfile  manifests/atlasmap-operator/metadata
 
Build index image quay.io/olmqe/atlasmap-operator-index:23473.1 and push it to quay.io. The base image is quay.io/operator-framework/upstream-opm-builder:v1.13.7.
[root@preserve-olm-env operator-registry]# ./bin/opm index add --bundles quay.io/olmqe/atlasmap-operator:v0.1.0-23473 --tag quay.io/olmqe/atlasmap-operator-index:23473.1 --binary-image quay.io/operator-framework/upstream-opm-builder:v1.13.7 -c docker
INFO[0000] building the index                            bundles="[quay.io/olmqe/atlasmap-operator:v0.1.0-23473]"
INFO[0000] running /usr/bin/docker pull quay.io/olmqe/atlasmap-operator:v0.1.0-23473  bundles="[quay.io/olmqe/atlasmap-operator:v0.1.0-23473]"
INFO[0000] running docker create                         bundles="[quay.io/olmqe/atlasmap-operator:v0.1.0-23473]"
INFO[0000] running docker cp                             bundles="[quay.io/olmqe/atlasmap-operator:v0.1.0-23473]"
INFO[0000] running docker rm                             bundles="[quay.io/olmqe/atlasmap-operator:v0.1.0-23473]"
INFO[0000] Could not find optional dependencies file     dir=bundle_tmp434866671 file=bundle_tmp434866671/metadata load=annotations
INFO[0000] found csv, loading bundle                     dir=bundle_tmp434866671 file=bundle_tmp434866671/manifests load=bundle
INFO[0000] loading bundle file                           dir=bundle_tmp434866671/manifests file=atlasmap-operator.v0.1.0.clusterserviceversion.yaml load=bundle
INFO[0000] loading bundle file                           dir=bundle_tmp434866671/manifests file=atlasmaps.atlasmap.io.crd.yaml load=bundle
INFO[0001] Generating dockerfile                         bundles="[quay.io/olmqe/atlasmap-operator:v0.1.0-23473]"
INFO[0001] writing dockerfile: index.Dockerfile582248437  bundles="[quay.io/olmqe/atlasmap-operator:v0.1.0-23473]"
INFO[0001] running docker build                          bundles="[quay.io/olmqe/atlasmap-operator:v0.1.0-23473]"
INFO[0001] [docker build -f index.Dockerfile582248437 -t quay.io/olmqe/atlasmap-operator-index:23473.1 .]  bundles="[quay.io/olmqe/atlasmap-operator:v0.1.0-23473]"
[root@preserve-olm-env operator-registry]# docker push quay.io/olmqe/atlasmap-operator-index:23473.1
The push refers to repository [quay.io/olmqe/atlasmap-operator-index]
444a365a085c: Pushed
ba840fdc36a3: Mounted from operator-framework/upstream-opm-builder
cd87e5dc2bef: Mounted from operator-framework/upstream-opm-builder
4150c4f2e6df: Mounted from operator-framework/upstream-opm-builder
50644c29ef5a: Mounted from operator-framework/upstream-opm-builder
23473.1: digest: sha256:b53d34722eb192421260b38679f44919a4823f995ef49b6c53a79e23761dd93a size: 1371
 
Build index image quay.io/olmqe/atlasmap-operator-index:23473.2 and push it to quay.io. The base image is quay.io/operator-framework/upstream-opm-builder:v1.13.7.
[root@preserve-olm-env operator-registry]# ./bin/opm index add --bundles quay.io/olmqe/atlasmap-operator:v0.2.0-23473 --from-index quay.io/olmqe/atlasmap-operator-index:23473.1 --tag quay.io/olmqe/atlasmap-operator-index:23473.2 --binary-image quay.io/operator-framework/upstream-opm-builder:v1.13.7 -c docker
INFO[0000] building the index                            bundles="[quay.io/olmqe/atlasmap-operator:v0.2.0-23473]"
INFO[0000] Pulling previous image quay.io/olmqe/atlasmap-operator-index:23473.1 to get metadata  bundles="[quay.io/olmqe/atlasmap-operator:v0.2.0-23473]"
INFO[0000] running /usr/bin/docker pull quay.io/olmqe/atlasmap-operator-index:23473.1  bundles="[quay.io/olmqe/atlasmap-operator:v0.2.0-23473]"
INFO[0000] running /usr/bin/docker pull quay.io/olmqe/atlasmap-operator-index:23473.1  bundles="[quay.io/olmqe/atlasmap-operator:v0.2.0-23473]"
INFO[0001] Getting label data from previous image        bundles="[quay.io/olmqe/atlasmap-operator:v0.2.0-23473]"
INFO[0001] running docker inspect                        bundles="[quay.io/olmqe/atlasmap-operator:v0.2.0-23473]"
INFO[0001] running docker create                         bundles="[quay.io/olmqe/atlasmap-operator:v0.2.0-23473]"
INFO[0001] running docker cp                             bundles="[quay.io/olmqe/atlasmap-operator:v0.2.0-23473]"
INFO[0001] running docker rm                             bundles="[quay.io/olmqe/atlasmap-operator:v0.2.0-23473]"
INFO[0001] running /usr/bin/docker pull quay.io/olmqe/atlasmap-operator:v0.2.0-23473  bundles="[quay.io/olmqe/atlasmap-operator:v0.2.0-23473]"
INFO[0002] running docker create                         bundles="[quay.io/olmqe/atlasmap-operator:v0.2.0-23473]"
INFO[0002] running docker cp                             bundles="[quay.io/olmqe/atlasmap-operator:v0.2.0-23473]"
INFO[0002] running docker rm                             bundles="[quay.io/olmqe/atlasmap-operator:v0.2.0-23473]"
INFO[0002] Could not find optional dependencies file     dir=bundle_tmp237143957 file=bundle_tmp237143957/metadata load=annotations
INFO[0002] found csv, loading bundle                     dir=bundle_tmp237143957 file=bundle_tmp237143957/manifests load=bundle
INFO[0002] loading bundle file                           dir=bundle_tmp237143957/manifests file=atlasmap-operator.v0.2.0.clusterserviceversion.yaml load=bundle
INFO[0002] loading bundle file                           dir=bundle_tmp237143957/manifests file=atlasmap.io_atlasmaps_crd.yaml load=bundle
INFO[0002] Generating dockerfile                         bundles="[quay.io/olmqe/atlasmap-operator:v0.2.0-23473]"
INFO[0002] writing dockerfile: index.Dockerfile919892523  bundles="[quay.io/olmqe/atlasmap-operator:v0.2.0-23473]"
INFO[0002] running docker build                          bundles="[quay.io/olmqe/atlasmap-operator:v0.2.0-23473]"
INFO[0002] [docker build -f index.Dockerfile919892523 -t quay.io/olmqe/atlasmap-operator-index:23473.2 .]  bundles="[quay.io/olmqe/atlasmap-operator:v0.2.0-23473]"
[root@preserve-olm-env operator-registry]# docker push quay.io/olmqe/atlasmap-operator-index:23473.2
The push refers to repository [quay.io/olmqe/atlasmap-operator-index]
83ae8ca0a7d6: Pushed
ba840fdc36a3: Layer already exists
cd87e5dc2bef: Layer already exists
4150c4f2e6df: Layer already exists
50644c29ef5a: Layer already exists
23473.2: digest: sha256:19ff1332c4e673b58aa79a6efcbab3081c7bc02e5f305522364ffc9e18077b58 size: 1371
 
Build index image quay.io/olmqe/atlasmap-operator-index:23473 and push it to quay.io. The base image is quay.io/operator-framework/upstream-opm-builder:v1.13.7.
[root@preserve-olm-env operator-registry]# ./bin/opm index add --bundles quay.io/olmqe/atlasmap-operator:v0.3.0-23473 --from-index quay.io/olmqe/atlasmap-operator-index:23473.2 --tag quay.io/olmqe/atlasmap-operator-index:23473 --binary-image quay.io/operator-framework/upstream-opm-builder:v1.13.7 -c docker
INFO[0000] building the index                            bundles="[quay.io/olmqe/atlasmap-operator:v0.3.0-23473]"
INFO[0000] Pulling previous image quay.io/olmqe/atlasmap-operator-index:23473.2 to get metadata  bundles="[quay.io/olmqe/atlasmap-operator:v0.3.0-23473]"
INFO[0000] running /usr/bin/docker pull quay.io/olmqe/atlasmap-operator-index:23473.2  bundles="[quay.io/olmqe/atlasmap-operator:v0.3.0-23473]"
INFO[0000] running /usr/bin/docker pull quay.io/olmqe/atlasmap-operator-index:23473.2  bundles="[quay.io/olmqe/atlasmap-operator:v0.3.0-23473]"
INFO[0001] Getting label data from previous image        bundles="[quay.io/olmqe/atlasmap-operator:v0.3.0-23473]"
INFO[0001] running docker inspect                        bundles="[quay.io/olmqe/atlasmap-operator:v0.3.0-23473]"
INFO[0001] running docker create                         bundles="[quay.io/olmqe/atlasmap-operator:v0.3.0-23473]"
INFO[0001] running docker cp                             bundles="[quay.io/olmqe/atlasmap-operator:v0.3.0-23473]"
INFO[0001] running docker rm                             bundles="[quay.io/olmqe/atlasmap-operator:v0.3.0-23473]"
INFO[0001] running /usr/bin/docker pull quay.io/olmqe/atlasmap-operator:v0.3.0-23473  bundles="[quay.io/olmqe/atlasmap-operator:v0.3.0-23473]"
INFO[0002] running docker create                         bundles="[quay.io/olmqe/atlasmap-operator:v0.3.0-23473]"
INFO[0002] running docker cp                             bundles="[quay.io/olmqe/atlasmap-operator:v0.3.0-23473]"
INFO[0002] running docker rm                             bundles="[quay.io/olmqe/atlasmap-operator:v0.3.0-23473]"
INFO[0002] Could not find optional dependencies file     dir=bundle_tmp968987558 file=bundle_tmp968987558/metadata load=annotations
INFO[0002] found csv, loading bundle                     dir=bundle_tmp968987558 file=bundle_tmp968987558/manifests load=bundle
INFO[0002] loading bundle file                           dir=bundle_tmp968987558/manifests file=atlasmap-operator.v0.3.0.clusterserviceversion.yaml load=bundle
INFO[0002] loading bundle file                           dir=bundle_tmp968987558/manifests file=atlasmaps.atlasmap.io.crd.yaml load=bundle
INFO[0002] Generating dockerfile                         bundles="[quay.io/olmqe/atlasmap-operator:v0.3.0-23473]"
INFO[0002] writing dockerfile: index.Dockerfile210952532  bundles="[quay.io/olmqe/atlasmap-operator:v0.3.0-23473]"
INFO[0002] running docker build                          bundles="[quay.io/olmqe/atlasmap-operator:v0.3.0-23473]"
INFO[0002] [docker build -f index.Dockerfile210952532 -t quay.io/olmqe/atlasmap-operator-index:23473 .]  bundles="[quay.io/olmqe/atlasmap-operator:v0.3.0-23473]"
[root@preserve-olm-env operator-registry]# docker push quay.io/olmqe/atlasmap-operator-index:23473
The push refers to repository [quay.io/olmqe/atlasmap-operator-index]
83dfe662439e: Pushed
ba840fdc36a3: Layer already exists
cd87e5dc2bef: Layer already exists
4150c4f2e6df: Layer already exists
50644c29ef5a: Layer already exists
23473: digest: sha256:422286ee65b42c6bc4de2511e0a668dcff7a1dc0de051e68825e79f336c744ec size: 1371
 
 
Try to install the atlasmap operator with index image quay.io/olmqe/atlasmap-operator-index:23473

Check the cluster and OLM versions
[root@preserve-olm-env operator-registry]# oc get clusterversion
NAME      VERSION                             AVAILABLE   PROGRESSING   SINCE   STATUS
version   4.6.0-0.nightly-2020-08-13-091737   True        False         8h      Cluster version is 4.6.0-0.nightly-2020-08-13-091737
[root@preserve-olm-env operator-registry]# oc exec catalog-operator-7bfb5f6576-bsbxv -n openshift-operator-lifecycle-manager -- olm --version
OLM version: 0.16.0
git commit: 1fdd347ab723bf6aec30c79dfb217bcbf21a13e9
 
Create the OperatorGroup
[root@preserve-olm-env OCP-23473]# cat og-single.yaml
kind: OperatorGroup
apiVersion: operators.coreos.com/v1
metadata:
  name: og-single
  namespace: default
spec:
  targetNamespaces:
  - default
[root@preserve-olm-env OCP-23473]# oc apply -f og-single.yaml
operatorgroup.operators.coreos.com/og-single created
 
Create the CatalogSource pointing at the index image
[root@preserve-olm-env OCP-23473]# cat catsrc.yaml
apiVersion: operators.coreos.com/v1alpha1
kind: CatalogSource
metadata:
  name: atlasmap-catalog
  namespace: default
spec:
  displayName: Atlasmap Operator Catalog
  image: quay.io/olmqe/atlasmap-operator-index:23473
  publisher: QE
  sourceType: grpc
 
[root@preserve-olm-env OCP-23473]# oc apply -f catsrc.yaml
catalogsource.operators.coreos.com/atlasmap-catalog created
[root@preserve-olm-env OCP-23473]# oc get catsrc atlasmap-catalog -o yaml
apiVersion: operators.coreos.com/v1alpha1
kind: CatalogSource
metadata:
  annotations:
    kubectl.kubernetes.io/last-applied-configuration: |
      {"apiVersion":"operators.coreos.com/v1alpha1","kind":"CatalogSource","metadata":{"annotations":{},"name":"atlasmap-catalog","namespace":"default"},"spec":{"displayName":"Atlasmap Operator Catalog","image":"quay.io/olmqe/atlasmap-operator-index:23473","publisher":"QE","sourceType":"grpc"}}
  creationTimestamp: "2020-08-14T07:16:49Z"
  generation: 1
  managedFields:
  - apiVersion: operators.coreos.com/v1alpha1
    fieldsType: FieldsV1
    fieldsV1:
      f:metadata:
        f:annotations:
          .: {}
          f:kubectl.kubernetes.io/last-applied-configuration: {}
      f:spec:
        .: {}
        f:displayName: {}
        f:image: {}
        f:publisher: {}
        f:sourceType: {}
    manager: oc
    operation: Update
    time: "2020-08-14T07:16:49Z"
  - apiVersion: operators.coreos.com/v1alpha1
    fieldsType: FieldsV1
    fieldsV1:
      f:spec:
        f:icon:
          .: {}
          f:base64data: {}
          f:mediatype: {}
      f:status:
        .: {}
        f:connectionState:
          .: {}
          f:address: {}
          f:lastConnect: {}
          f:lastObservedState: {}
        f:registryService:
          .: {}
          f:createdAt: {}
          f:port: {}
          f:protocol: {}
          f:serviceName: {}
          f:serviceNamespace: {}
    manager: catalog
    operation: Update
    time: "2020-08-14T07:17:08Z"
  name: atlasmap-catalog
  namespace: default
  resourceVersion: "493643"
  selfLink: /apis/operators.coreos.com/v1alpha1/namespaces/default/catalogsources/atlasmap-catalog
  uid: eb9c6a4a-e54c-4e76-84e6-f62206ce5f6c
spec:
  displayName: Atlasmap Operator Catalog
  image: quay.io/olmqe/atlasmap-operator-index:23473
  publisher: QE
  sourceType: grpc
status:
  connectionState:
    address: atlasmap-catalog.default.svc:50051
    lastConnect: "2020-08-14T07:17:08Z"
    lastObservedState: READY
  registryService:
    createdAt: "2020-08-14T07:16:49Z"
    port: "50051"
    protocol: grpc
    serviceName: atlasmap-catalog
    serviceNamespace: default
 
Create the Subscription
[root@preserve-olm-env OCP-23473]# cat sub.yaml
apiVersion: operators.coreos.com/v1alpha1
kind: Subscription
metadata:
  name: atlasmap-operator
  namespace: default
spec:
  channel: alpha
  installPlanApproval: Manual
  name: atlasmap-operator
  source: atlasmap-catalog
  sourceNamespace: default
  startingCSV: atlasmap-operator.v0.1.0
[root@preserve-olm-env OCP-23473]# oc apply -f sub.yaml
subscription.operators.coreos.com/atlasmap-operator created
[root@preserve-olm-env OCP-23473]# oc get sub atlasmap-operator -o yaml
apiVersion: operators.coreos.com/v1alpha1
kind: Subscription
metadata:
  annotations:
    kubectl.kubernetes.io/last-applied-configuration: |
      {"apiVersion":"operators.coreos.com/v1alpha1","kind":"Subscription","metadata":{"annotations":{},"name":"atlasmap-operator","namespace":"default"},"spec":{"channel":"alpha","installPlanApproval":"Manual","name":"atlasmap-operator","source":"atlasmap-catalog","sourceNamespace":"default","startingCSV":"atlasmap-operator.v0.1.0"}}
  creationTimestamp: "2020-08-14T07:18:42Z"
  generation: 1
  labels:
    operators.coreos.com/atlasmap-operator.default: ""
  managedFields:
  - apiVersion: operators.coreos.com/v1alpha1
    fieldsType: FieldsV1
    fieldsV1:
      f:spec:
        f:config:
          .: {}
          f:resources: {}
      f:status:
        .: {}
        f:catalogHealth: {}
        f:conditions: {}
        f:lastUpdated: {}
    manager: catalog
    operation: Update
    time: "2020-08-14T07:18:42Z"
  - apiVersion: operators.coreos.com/v1alpha1
    fieldsType: FieldsV1
    fieldsV1:
      f:metadata:
        f:annotations:
          .: {}
          f:kubectl.kubernetes.io/last-applied-configuration: {}
      f:spec:
        .: {}
        f:channel: {}
        f:installPlanApproval: {}
        f:name: {}
        f:source: {}
        f:sourceNamespace: {}
        f:startingCSV: {}
    manager: oc
    operation: Update
    time: "2020-08-14T07:18:42Z"
  - apiVersion: operators.coreos.com/v1alpha1
    fieldsType: FieldsV1
    fieldsV1:
      f:metadata:
        f:labels:
          .: {}
          f:operators.coreos.com/atlasmap-operator.default: {}
    manager: olm
    operation: Update
    time: "2020-08-14T07:18:42Z"
  name: atlasmap-operator
  namespace: default
  resourceVersion: "495156"
  selfLink: /apis/operators.coreos.com/v1alpha1/namespaces/default/subscriptions/atlasmap-operator
  uid: 85669021-7cb3-402e-83b2-5c6b10f867ea
spec:
  channel: alpha
  installPlanApproval: Manual
  name: atlasmap-operator
  source: atlasmap-catalog
  sourceNamespace: default
  startingCSV: atlasmap-operator.v0.1.0
status:
  catalogHealth:
  - catalogSourceRef:
      apiVersion: operators.coreos.com/v1alpha1
      kind: CatalogSource
      name: atlasmap-catalog
      namespace: default
      resourceVersion: "493643"
      uid: eb9c6a4a-e54c-4e76-84e6-f62206ce5f6c
    healthy: true
    lastUpdated: "2020-08-14T07:18:42Z"
  - catalogSourceRef:
      apiVersion: operators.coreos.com/v1alpha1
      kind: CatalogSource
      name: certified-operators
      namespace: openshift-marketplace
      resourceVersion: "479629"
      uid: cc8f16a0-ecf2-4ef1-ac19-0b2da74f331a
    healthy: true
    lastUpdated: "2020-08-14T07:18:42Z"
  - catalogSourceRef:
      apiVersion: operators.coreos.com/v1alpha1
      kind: CatalogSource
      name: community-operators
      namespace: openshift-marketplace
      resourceVersion: "479635"
      uid: 726b0e15-adc6-46c6-bc1b-1bd7df357c6d
    healthy: true
    lastUpdated: "2020-08-14T07:18:42Z"
  - catalogSourceRef:
      apiVersion: operators.coreos.com/v1alpha1
      kind: CatalogSource
      name: qe-app-registry
      namespace: openshift-marketplace
      resourceVersion: "482389"
      uid: 31d851d0-b442-4931-b1f1-a124016304f8
    healthy: true
    lastUpdated: "2020-08-14T07:18:42Z"
  - catalogSourceRef:
      apiVersion: operators.coreos.com/v1alpha1
      kind: CatalogSource
      name: redhat-marketplace
      namespace: openshift-marketplace
      resourceVersion: "479602"
      uid: 2463e651-0b9a-42db-bc4b-69b7756274f5
    healthy: true
    lastUpdated: "2020-08-14T07:18:42Z"
  - catalogSourceRef:
      apiVersion: operators.coreos.com/v1alpha1
      kind: CatalogSource
      name: redhat-operators
      namespace: openshift-marketplace
      resourceVersion: "479641"
      uid: 60c76bba-82f6-4cd6-8c45-f94a480a3298
    healthy: true
    lastUpdated: "2020-08-14T07:18:42Z"
  conditions:
  - lastTransitionTime: "2020-08-14T07:18:42Z"
    message: all available catalogsources are healthy
    reason: AllCatalogSourcesHealthy
    status: "False"
    type: CatalogSourcesUnhealthy
  lastUpdated: "2020-08-14T07:18:42Z"
 
But no InstallPlan is generated and no CSV is created. The catalog-operator error log is the following:
time="2020-08-14T07:16:49Z" level=info msg="state.Key.Namespace=default state.Key.Name=atlasmap-catalog state.State=CONNECTING"
time="2020-08-14T07:16:52Z" level=info msg="state.Key.Namespace=default state.Key.Name=atlasmap-catalog state.State=TRANSIENT_FAILURE"
time="2020-08-14T07:16:53Z" level=info msg="state.Key.Namespace=default state.Key.Name=atlasmap-catalog state.State=CONNECTING"
time="2020-08-14T07:17:08Z" level=info msg="state.Key.Namespace=default state.Key.Name=atlasmap-catalog state.State=READY"
time="2020-08-14T07:18:42Z" level=info msg=syncing event=update reconciling="*v1alpha1.Subscription" selflink=/apis/operators.coreos.com/v1alpha1/namespaces/default/subscriptions/atlasmap-operator
time="2020-08-14T07:18:42Z" level=warning msg="an error was encountered during reconciliation" error="Operation cannot be fulfilled on subscriptions.operators.coreos.com \"atlasmap-operator\": the object has been modified; please apply your changes to the latest version and try again" event=update reconciling="*v1alpha1.Subscription" selflink=/apis/operators.coreos.com/v1alpha1/namespaces/default/subscriptions/atlasmap-operator
E0814 07:18:42.834357       1 queueinformer_operator.go:290] sync {"update" "default/atlasmap-operator"} failed: Operation cannot be fulfilled on subscriptions.operators.coreos.com "atlasmap-operator": the object has been modified; please apply your changes to the latest version and try again
time="2020-08-14T07:18:42Z" level=info msg=syncing event=update reconciling="*v1alpha1.Subscription" selflink=/apis/operators.coreos.com/v1alpha1/namespaces/default/subscriptions/atlasmap-operator
time="2020-08-14T07:18:42Z" level=info msg=syncing event=update reconciling="*v1alpha1.Subscription" selflink=/apis/operators.coreos.com/v1alpha1/namespaces/default/subscriptions/atlasmap-operator
time="2020-08-14T07:18:42Z" level=info msg=syncing event=update reconciling="*v1alpha1.Subscription" selflink=/apis/operators.coreos.com/v1alpha1/namespaces/default/subscriptions/atlasmap-operator
E0814 07:18:42.861326       1 queueinformer_operator.go:290] sync "default" failed: found more than one head for channel
I0814 07:18:42.861504       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
E0814 07:18:45.625928       1 queueinformer_operator.go:290] sync "default" failed: found more than one head for channel
I0814 07:18:45.626203       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
E0814 07:18:45.636541       1 queueinformer_operator.go:290] sync "default" failed: found more than one head for channel
I0814 07:18:45.636702       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
E0814 07:18:45.645890       1 queueinformer_operator.go:290] sync "default" failed: found more than one head for channel
I0814 07:18:45.646080       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
E0814 07:18:45.666793       1 queueinformer_operator.go:290] sync "default" failed: found more than one head for channel
I0814 07:18:45.666992       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
E0814 07:18:45.757946       1 queueinformer_operator.go:290] sync "default" failed: found more than one head for channel
I0814 07:18:45.758083       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
E0814 07:18:45.928286       1 queueinformer_operator.go:290] sync "default" failed: found more than one head for channel
I0814 07:18:45.928379       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
E0814 07:18:46.262541       1 queueinformer_operator.go:290] sync "default" failed: found more than one head for channel
I0814 07:18:46.262604       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
I0814 07:18:46.913280       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
time="2020-08-14T07:19:47Z" level=info msg="ImageID registry.redhat.io/redhat/redhat-operator-index@sha256:9a516c485c4bcfbb4607dbf0373bc41254237e0505c06382cce39bc7d5b4abd1" CatalogSource=redhat-operators-wh6f6
time="2020-08-14T07:19:47Z" level=info msg="Update Pod ImageID registry.redhat.io/redhat/redhat-operator-index@sha256:9a516c485c4bcfbb4607dbf0373bc41254237e0505c06382cce39bc7d5b4abd1" CatalogSource=redhat-operators-wh6f6
time="2020-08-14T07:19:47Z" level=info msg="ImageID registry.redhat.io/redhat/redhat-operator-index@sha256:9a516c485c4bcfbb4607dbf0373bc41254237e0505c06382cce39bc7d5b4abd1" CatalogSource=redhat-operators-zz97p
time="2020-08-14T07:19:47Z" level=info msg="Serving Pod ImageID registry.redhat.io/redhat/redhat-operator-index@sha256:9a516c485c4bcfbb4607dbf0373bc41254237e0505c06382cce39bc7d5b4abd1" CatalogSource=redhat-operators-zz97p
time="2020-08-14T07:19:47Z" level=info msg="no image update for catalogsource pod" CatalogSource=redhat-operators
time="2020-08-14T07:19:47Z" level=info msg="creating new catalog source update pod" CatalogSource=redhat-operators
time="2020-08-14T07:19:47Z" level=info msg="state.Key.Namespace=openshift-marketplace state.Key.Name=redhat-operators state.State=SHUTDOWN"
time="2020-08-14T07:19:47Z" level=info msg="state.Key.Namespace=openshift-marketplace state.Key.Name=redhat-operators state.State=CONNECTING"
time="2020-08-14T07:19:47Z" level=info msg="ImageID registry.redhat.io/redhat/certified-operator-index@sha256:39b4589658b766142baef3bd20d21114d8d0571e50c5c38d1a13f248c41c1e68" CatalogSource=certified-operators-q4gc2
time="2020-08-14T07:19:47Z" level=info msg="Update Pod ImageID registry.redhat.io/redhat/certified-operator-index@sha256:39b4589658b766142baef3bd20d21114d8d0571e50c5c38d1a13f248c41c1e68" CatalogSource=certified-operators-q4gc2
time="2020-08-14T07:19:47Z" level=info msg="ImageID registry.redhat.io/redhat/certified-operator-index@sha256:39b4589658b766142baef3bd20d21114d8d0571e50c5c38d1a13f248c41c1e68" CatalogSource=certified-operators-d75n9
time="2020-08-14T07:19:47Z" level=info msg="Serving Pod ImageID registry.redhat.io/redhat/certified-operator-index@sha256:39b4589658b766142baef3bd20d21114d8d0571e50c5c38d1a13f248c41c1e68" CatalogSource=certified-operators-d75n9
time="2020-08-14T07:19:47Z" level=info msg="no image update for catalogsource pod" CatalogSource=certified-operators
time="2020-08-14T07:19:47Z" level=info msg="creating new catalog source update pod" CatalogSource=certified-operators
time="2020-08-14T07:19:47Z" level=info msg="ImageID registry.redhat.io/redhat/certified-operator-index@sha256:39b4589658b766142baef3bd20d21114d8d0571e50c5c38d1a13f248c41c1e68" CatalogSource=redhat-marketplace-rkcmm
time="2020-08-14T07:19:47Z" level=info msg="Update Pod ImageID registry.redhat.io/redhat/certified-operator-index@sha256:39b4589658b766142baef3bd20d21114d8d0571e50c5c38d1a13f248c41c1e68" CatalogSource=redhat-marketplace-rkcmm
time="2020-08-14T07:19:47Z" level=info msg="ImageID registry.redhat.io/redhat/certified-operator-index@sha256:39b4589658b766142baef3bd20d21114d8d0571e50c5c38d1a13f248c41c1e68" CatalogSource=redhat-marketplace-nwsjr
time="2020-08-14T07:19:47Z" level=info msg="Serving Pod ImageID registry.redhat.io/redhat/certified-operator-index@sha256:39b4589658b766142baef3bd20d21114d8d0571e50c5c38d1a13f248c41c1e68" CatalogSource=redhat-marketplace-nwsjr
time="2020-08-14T07:19:47Z" level=info msg="no image update for catalogsource pod" CatalogSource=redhat-marketplace
time="2020-08-14T07:19:47Z" level=info msg="creating new catalog source update pod" CatalogSource=redhat-marketplace
time="2020-08-14T07:19:47Z" level=info msg="state.Key.Namespace=openshift-marketplace state.Key.Name=redhat-operators state.State=READY"
time="2020-08-14T07:19:47Z" level=info msg="state.Key.Namespace=openshift-marketplace state.Key.Name=certified-operators state.State=SHUTDOWN"
time="2020-08-14T07:19:47Z" level=info msg="state.Key.Namespace=openshift-marketplace state.Key.Name=certified-operators state.State=CONNECTING"
time="2020-08-14T07:19:47Z" level=info msg="state.Key.Namespace=openshift-marketplace state.Key.Name=certified-operators state.State=READY"
time="2020-08-14T07:19:47Z" level=info msg="state.Key.Namespace=openshift-marketplace state.Key.Name=redhat-marketplace state.State=CONNECTING"
time="2020-08-14T07:19:47Z" level=info msg="state.Key.Namespace=openshift-marketplace state.Key.Name=redhat-marketplace state.State=READY"
time="2020-08-14T07:19:47Z" level=info msg="ImageID quay.io/openshift-community-operators/catalog@sha256:97e40b5cfbdb50a6b9e78830e250579f224f222245f3150d0848ede3120886f4" CatalogSource=community-operators-cn7p7
time="2020-08-14T07:19:47Z" level=info msg="Update Pod ImageID quay.io/openshift-community-operators/catalog@sha256:97e40b5cfbdb50a6b9e78830e250579f224f222245f3150d0848ede3120886f4" CatalogSource=community-operators-cn7p7
time="2020-08-14T07:19:47Z" level=info msg="ImageID quay.io/openshift-community-operators/catalog@sha256:97e40b5cfbdb50a6b9e78830e250579f224f222245f3150d0848ede3120886f4" CatalogSource=community-operators-s2d5l
time="2020-08-14T07:19:47Z" level=info msg="Serving Pod ImageID quay.io/openshift-community-operators/catalog@sha256:97e40b5cfbdb50a6b9e78830e250579f224f222245f3150d0848ede3120886f4" CatalogSource=community-operators-s2d5l
time="2020-08-14T07:19:47Z" level=info msg="no image update for catalogsource pod" CatalogSource=community-operators
time="2020-08-14T07:19:47Z" level=info msg="creating new catalog source update pod" CatalogSource=community-operators
time="2020-08-14T07:19:47Z" level=info msg="state.Key.Namespace=openshift-marketplace state.Key.Name=community-operators state.State=SHUTDOWN"
time="2020-08-14T07:19:47Z" level=info msg="state.Key.Namespace=openshift-marketplace state.Key.Name=community-operators state.State=CONNECTING"
time="2020-08-14T07:19:47Z" level=info msg="state.Key.Namespace=openshift-marketplace state.Key.Name=community-operators state.State=READY"
time="2020-08-14T07:19:48Z" level=info msg=syncing event=update reconciling="*v1alpha1.Subscription" selflink=/apis/operators.coreos.com/v1alpha1/namespaces/default/subscriptions/atlasmap-operator
E0814 07:19:49.872362       1 queueinformer_operator.go:290] sync "default" failed: found more than one head for channel
I0814 07:19:49.872551       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
E0814 07:19:55.038086       1 queueinformer_operator.go:290] sync "default" failed: found more than one head for channel
I0814 07:19:55.038468       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
E0814 07:19:55.057288       1 queueinformer_operator.go:290] sync "default" failed: found more than one head for channel
I0814 07:19:55.057383       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
E0814 07:19:55.086678       1 queueinformer_operator.go:290] sync "default" failed: found more than one head for channel
I0814 07:19:55.087008       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
E0814 07:19:55.136012       1 queueinformer_operator.go:290] sync "default" failed: found more than one head for channel
I0814 07:19:55.136091       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
E0814 07:19:55.226806       1 queueinformer_operator.go:290] sync "default" failed: found more than one head for channel
I0814 07:19:55.226905       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
E0814 07:19:55.395951       1 queueinformer_operator.go:290] sync "default" failed: found more than one head for channel
I0814 07:19:55.396114       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
E0814 07:19:55.726663       1 queueinformer_operator.go:290] sync "default" failed: found more than one head for channel
I0814 07:19:55.726782       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
I0814 07:19:56.377820       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
time="2020-08-14T07:20:35Z" level=info msg=syncing event=update reconciling="*v1alpha1.Subscription" selflink=/apis/operators.coreos.com/v1alpha1/namespaces/default/subscriptions/atlasmap-operator
time="2020-08-14T07:20:35Z" level=info msg="ImageID quay.io/openshift-qe-optional-operators/ocp4-index@sha256:9181eb700005b239e0967927c75290d188632d2f98fd029ef160708185c17682" CatalogSource=qe-app-registry-r8g4n
time="2020-08-14T07:20:35Z" level=info msg="Update Pod ImageID quay.io/openshift-qe-optional-operators/ocp4-index@sha256:9181eb700005b239e0967927c75290d188632d2f98fd029ef160708185c17682" CatalogSource=qe-app-registry-r8g4n
time="2020-08-14T07:20:35Z" level=info msg="ImageID quay.io/openshift-qe-optional-operators/ocp4-index@sha256:9181eb700005b239e0967927c75290d188632d2f98fd029ef160708185c17682" CatalogSource=qe-app-registry-vg4tw
time="2020-08-14T07:20:35Z" level=info msg="Serving Pod ImageID quay.io/openshift-qe-optional-operators/ocp4-index@sha256:9181eb700005b239e0967927c75290d188632d2f98fd029ef160708185c17682" CatalogSource=qe-app-registry-vg4tw
time="2020-08-14T07:20:35Z" level=info msg="no image update for catalogsource pod" CatalogSource=qe-app-registry
time="2020-08-14T07:20:35Z" level=info msg="creating new catalog source update pod" CatalogSource=qe-app-registry
time="2020-08-14T07:20:35Z" level=info msg=syncing event=update reconciling="*v1alpha1.Subscription" selflink=/apis/operators.coreos.com/v1alpha1/namespaces/default/subscriptions/atlasmap-operator
E0814 07:20:35.434873       1 queueinformer_operator.go:290] sync "default" failed: found more than one head for channel
I0814 07:20:35.435041       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
time="2020-08-14T07:20:35Z" level=info msg="state.Key.Namespace=openshift-marketplace state.Key.Name=qe-app-registry state.State=CONNECTING"
time="2020-08-14T07:20:35Z" level=info msg="state.Key.Namespace=openshift-marketplace state.Key.Name=qe-app-registry state.State=READY"
time="2020-08-14T07:20:35Z" level=info msg="state.Key.Namespace=openshift-marketplace state.Key.Name=qe-app-registry state.State=READY"
E0814 07:20:35.447265       1 queueinformer_operator.go:290] sync "default" failed: found more than one head for channel
I0814 07:20:35.447516       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
time="2020-08-14T07:20:35Z" level=warning msg="pod status unknown" CatalogSource=qe-app-registry-4ng76
time="2020-08-14T07:20:35Z" level=info msg="ImageID " CatalogSource=qe-app-registry-4ng76
time="2020-08-14T07:20:35Z" level=info msg="Update Pod ImageID " CatalogSource=qe-app-registry-4ng76
time="2020-08-14T07:20:35Z" level=info msg="ImageID quay.io/openshift-qe-optional-operators/ocp4-index@sha256:9181eb700005b239e0967927c75290d188632d2f98fd029ef160708185c17682" CatalogSource=qe-app-registry-vg4tw
time="2020-08-14T07:20:35Z" level=info msg="Serving Pod ImageID quay.io/openshift-qe-optional-operators/ocp4-index@sha256:9181eb700005b239e0967927c75290d188632d2f98fd029ef160708185c17682" CatalogSource=qe-app-registry-vg4tw
time="2020-08-14T07:20:35Z" level=info msg="detect image update for catalogsource pod" CatalogSource=qe-app-registry
E0814 07:20:35.505300       1 queueinformer_operator.go:290] sync "default" failed: found more than one head for channel
I0814 07:20:35.505396       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
E0814 07:20:35.588216       1 queueinformer_operator.go:290] sync "default" failed: found more than one head for channel
I0814 07:20:35.588322       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
E0814 07:20:36.184775       1 queueinformer_operator.go:290] sync {"update" "openshift-marketplace/qe-app-registry"} failed: couldn't ensure registry server - error ensuring updated catalog source pod: : error creating new pod: : Operation cannot be fulfilled on pods "qe-app-registry-4ng76": the object has been modified; please apply your changes to the latest version and try again
E0814 07:20:36.381097       1 queueinformer_operator.go:290] sync "default" failed: found more than one head for channel
I0814 07:20:36.381266       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
time="2020-08-14T07:20:36Z" level=info msg="ImageID " CatalogSource=qe-app-registry-4ng76
time="2020-08-14T07:20:36Z" level=info msg="Update Pod ImageID " CatalogSource=qe-app-registry-4ng76
time="2020-08-14T07:20:36Z" level=info msg="ImageID quay.io/openshift-qe-optional-operators/ocp4-index@sha256:9181eb700005b239e0967927c75290d188632d2f98fd029ef160708185c17682" CatalogSource=qe-app-registry-vg4tw
time="2020-08-14T07:20:36Z" level=info msg="Serving Pod ImageID quay.io/openshift-qe-optional-operators/ocp4-index@sha256:9181eb700005b239e0967927c75290d188632d2f98fd029ef160708185c17682" CatalogSource=qe-app-registry-vg4tw
time="2020-08-14T07:20:36Z" level=info msg="detect image update for catalogsource pod" CatalogSource=qe-app-registry
time="2020-08-14T07:20:37Z" level=info msg="state.Key.Namespace=openshift-marketplace state.Key.Name=qe-app-registry state.State=CONNECTING"
time="2020-08-14T07:20:37Z" level=info msg="state.Key.Namespace=openshift-marketplace state.Key.Name=qe-app-registry state.State=READY"
time="2020-08-14T07:20:37Z" level=info msg="state.Key.Namespace=openshift-marketplace state.Key.Name=qe-app-registry state.State=CONNECTING"
time="2020-08-14T07:20:37Z" level=info msg="state.Key.Namespace=openshift-marketplace state.Key.Name=qe-app-registry state.State=TRANSIENT_FAILURE"
E0814 07:20:37.181300       1 queueinformer_operator.go:290] sync "default" failed: found more than one head for channel
time="2020-08-14T07:20:37Z" level=error msg="failed to list bundles: rpc error: code = Unavailable desc = connection error: desc = \"transport: Error while dialing dial tcp 172.30.176.176:50051: connect: connection refused\"" catalog="{qe-app-registry openshift-marketplace}"
I0814 07:20:37.181805       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
time="2020-08-14T07:20:37Z" level=info msg="no image update for catalogsource pod" CatalogSource=qe-app-registry
time="2020-08-14T07:20:37Z" level=info msg="creating new catalog source update pod" CatalogSource=qe-app-registry
time="2020-08-14T07:20:38Z" level=info msg="state.Key.Namespace=openshift-marketplace state.Key.Name=qe-app-registry state.State=SHUTDOWN"
time="2020-08-14T07:20:38Z" level=info msg="state.Key.Namespace=openshift-marketplace state.Key.Name=qe-app-registry state.State=CONNECTING"
E0814 07:20:38.181690       1 queueinformer_operator.go:290] sync "default" failed: found more than one head for channel
I0814 07:20:38.181845       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
time="2020-08-14T07:20:38Z" level=info msg="ImageID " CatalogSource=qe-app-registry-rbbv2
time="2020-08-14T07:20:38Z" level=info msg="Update Pod ImageID " CatalogSource=qe-app-registry-rbbv2
time="2020-08-14T07:20:38Z" level=info msg="ImageID " CatalogSource=qe-app-registry-4ng76
time="2020-08-14T07:20:38Z" level=info msg="Serving Pod ImageID " CatalogSource=qe-app-registry-4ng76
time="2020-08-14T07:20:38Z" level=info msg="no image update for catalogsource pod" CatalogSource=qe-app-registry
time="2020-08-14T07:20:38Z" level=info msg="creating new catalog source update pod" CatalogSource=qe-app-registry
time="2020-08-14T07:20:38Z" level=error msg="failed to list bundles: rpc error: code = Canceled desc = grpc: the client connection is closing" catalog="{qe-app-registry openshift-marketplace}"
time="2020-08-14T07:20:38Z" level=info msg="state.Key.Namespace=openshift-marketplace state.Key.Name=qe-app-registry state.State=CONNECTING"
E0814 07:20:38.980400       1 queueinformer_operator.go:290] sync "default" failed: found more than one head for channel
I0814 07:20:38.980551       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
time="2020-08-14T07:20:45Z" level=info msg="state.Key.Namespace=openshift-marketplace state.Key.Name=qe-app-registry state.State=READY"
I0814 07:20:46.013238       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
E0814 07:20:46.032085       1 queueinformer_operator.go:290] sync "default" failed: found more than one head for channel
I0814 07:20:46.032667       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
E0814 07:20:46.124141       1 queueinformer_operator.go:290] sync "default" failed: found more than one head for channel
I0814 07:20:46.124263       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
E0814 07:20:46.143642       1 queueinformer_operator.go:290] sync "default" failed: found more than one head for channel
I0814 07:20:46.143766       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
E0814 07:20:46.217853       1 queueinformer_operator.go:290] sync "default" failed: found more than one head for channel
I0814 07:20:46.218012       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
E0814 07:20:46.617801       1 queueinformer_operator.go:290] sync "default" failed: found more than one head for channel
I0814 07:20:46.617919       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
E0814 07:20:47.019523       1 queueinformer_operator.go:290] sync "default" failed: found more than one head for channel
I0814 07:20:47.019879       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
E0814 07:20:47.417778       1 queueinformer_operator.go:290] sync "default" failed: found more than one head for channel
I0814 07:20:47.417876       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
E0814 07:20:47.827378       1 queueinformer_operator.go:290] sync "default" failed: found more than one head for channel
I0814 07:20:47.827540       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
I0814 07:20:48.478071       1 event.go:278] Event(v1.ObjectReference{Kind:"Namespace", Namespace:"", Name:"default", UID:"25b187c6-4964-416e-87ef-247072500f70", APIVersion:"v1", ResourceVersion:"489359", FieldPath:""}): type: 'Warning' reason: 'ResolutionFailed' found more than one head for channel
time="2020-08-14T07:20:49Z" level=info msg="Adding related objects for operator-lifecycle-manager-catalog"


Actual results:
The operator installation fails; OLM resolution repeatedly reports "found more than one head for channel".

Expected results:
The operator installation succeeds.

Additional info:
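For context, the failure comes from channel-head resolution: a channel head is a bundle that no other bundle in the channel supersedes. If the resolver only considers `replaces` and ignores `skips` (the behavior before the linked fix, which adds skips information to the Operator representation), a skipped bundle still looks like a head and the channel appears to have two heads. The following is a minimal sketch of that logic, not OLM's actual code; the bundle names mirror the atlasmap operator used in this bug, but the v0.2.0 skip entry is illustrative:

```python
def channel_heads(bundles, include_skips=True):
    """Return names of bundles not superseded by any other bundle in the channel.

    Each bundle is a dict with 'name', optional 'replaces', optional 'skips'.
    A bundle is superseded if another bundle replaces it, or (when skips are
    honored) lists it in its skips.
    """
    superseded = set()
    for b in bundles:
        if b.get("replaces"):
            superseded.add(b["replaces"])
        if include_skips:
            superseded.update(b.get("skips", []))
    return [b["name"] for b in bundles if b["name"] not in superseded]


channel = [
    {"name": "atlasmap-operator.v0.1.0"},
    {"name": "atlasmap-operator.v0.2.0", "replaces": "atlasmap-operator.v0.1.0"},
    {"name": "atlasmap-operator.v0.3.0", "replaces": "atlasmap-operator.v0.1.0",
     "skips": ["atlasmap-operator.v0.2.0"]},
]

# Ignoring skips (buggy behavior): v0.2.0 and v0.3.0 both look like heads,
# so resolution fails with "more than one head for channel".
print(channel_heads(channel, include_skips=False))

# Honoring skips (fixed behavior): only v0.3.0 remains as the channel head.
print(channel_heads(channel, include_skips=True))
```

This also matches the verified upgrade path below, where v0.1.0 upgrades directly to v0.3.0 without installing v0.2.0.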

Comment 1 kuiwang 2020-08-18 02:36:47 UTC
Hi,
 after this is fixed, should all index images be rebuilt with the fixed opm? Thanks

Comment 2 Vu Dinh 2020-08-20 17:23:20 UTC
*** Bug 1867254 has been marked as a duplicate of this bug. ***

Comment 3 Vu Dinh 2020-08-20 17:25:34 UTC
*** Bug 1870388 has been marked as a duplicate of this bug. ***

Comment 4 Evan Cordell 2020-09-11 12:48:18 UTC
*** Bug 1874053 has been marked as a duplicate of this bug. ***

Comment 6 kuiwang 2020-09-14 06:15:13 UTC
Verification of this bug is blocked by https://bugzilla.redhat.com/show_bug.cgi?id=1878023.
Once that bug is verified, I will verify this one.

Comment 7 Evan Cordell 2020-09-16 16:58:07 UTC
*** Bug 1875247 has been marked as a duplicate of this bug. ***

Comment 8 kuiwang 2020-09-18 01:10:25 UTC
Verified on a 4.6 cluster. LGTM.

--
kuiwang@Kuis-MacBook-Pro 1869441 % oc get clusterversion
NAME      VERSION                             AVAILABLE   PROGRESSING   SINCE   STATUS
version   4.6.0-0.nightly-2020-09-17-050000   True        False         76m     Cluster version is 4.6.0-0.nightly-2020-09-17-050000
kuiwang@Kuis-MacBook-Pro 1869441 % oc get pod -n openshift-operator-lifecycle-manager
NAME                                READY   STATUS    RESTARTS   AGE
catalog-operator-7b76596b48-mg47w   1/1     Running   0          72m
olm-operator-6ff98c98c6-7cscj       1/1     Running   0          72m
packageserver-79bc57c484-47gk7      1/1     Running   0          70m
packageserver-79bc57c484-8mf74      1/1     Running   0          72m
kuiwang@Kuis-MacBook-Pro 1869441 % oc exec catalog-operator-7b76596b48-mg47w -n openshift-operator-lifecycle-manager -- olm --version
OLM version: 0.16.1
git commit: 78bb41b4613cafca7264a3108d1f1fa4b66024c8
kuiwang@Kuis-MacBook-Pro 1869441 % vi og-single.yaml
kuiwang@Kuis-MacBook-Pro 1869441 % cat og-single.yaml 
kind: OperatorGroup
apiVersion: operators.coreos.com/v1
metadata:
  name: og-single
  namespace: default
spec:
  targetNamespaces:
  - default


kuiwang@Kuis-MacBook-Pro 1869441 % oc apply -f og-single.yaml 
operatorgroup.operators.coreos.com/og-single created
kuiwang@Kuis-MacBook-Pro 1869441 % vi catsrc.yaml
kuiwang@Kuis-MacBook-Pro 1869441 % cat catsrc.yaml 
apiVersion: operators.coreos.com/v1alpha1
kind: CatalogSource
metadata:
  name: atlasmap-catalog
  namespace: default
spec:
  displayName: Atlasmap Operator Catalog
  image: quay.io/olmqe/atlasmap-operator-index:23473
  publisher: QE
  sourceType: grpc

kuiwang@Kuis-MacBook-Pro 1869441 % oc apply -f catsrc.yaml 
catalogsource.operators.coreos.com/atlasmap-catalog created
kuiwang@Kuis-MacBook-Pro 1869441 % oc get catsrc atlasmap-catalog -o yaml
apiVersion: operators.coreos.com/v1alpha1
kind: CatalogSource
metadata:
...
status:
  connectionState:
    address: atlasmap-catalog.default.svc:50051
    lastConnect: "2020-09-18T01:00:21Z"
    lastObservedState: READY
  registryService:
    createdAt: "2020-09-18T01:00:02Z"
    port: "50051"
    protocol: grpc
    serviceName: atlasmap-catalog
    serviceNamespace: default

kuiwang@Kuis-MacBook-Pro 1869441 % vi sub.yaml
kuiwang@Kuis-MacBook-Pro 1869441 % cat sub.yaml 
apiVersion: operators.coreos.com/v1alpha1
kind: Subscription
metadata:
  name: atlasmap-operator
  namespace: default
spec:
  channel: alpha
  installPlanApproval: Manual
  name: atlasmap-operator
  source: atlasmap-catalog
  sourceNamespace: default
  startingCSV: atlasmap-operator.v0.1.0

kuiwang@Kuis-MacBook-Pro 1869441 % oc apply -f sub.yaml 
subscription.operators.coreos.com/atlasmap-operator created
kuiwang@Kuis-MacBook-Pro 1869441 % oc get sub atlasmap-operator -o yaml
apiVersion: operators.coreos.com/v1alpha1
kind: Subscription
metadata:
...
  currentCSV: atlasmap-operator.v0.1.0
  installPlanGeneration: 1
  installPlanRef:
    apiVersion: operators.coreos.com/v1alpha1
    kind: InstallPlan
    name: install-kfk5r
    namespace: default
    resourceVersion: "108691"
    uid: f4fd9f3c-85cf-43bc-9db1-e3b8a7a30d01
  installplan:
    apiVersion: operators.coreos.com/v1alpha1
    kind: InstallPlan
    name: install-kfk5r
    uuid: f4fd9f3c-85cf-43bc-9db1-e3b8a7a30d01
  lastUpdated: "2020-09-18T01:01:32Z"
  state: UpgradePending

kuiwang@Kuis-MacBook-Pro 1869441 % oc get ip
NAME            CSV                        APPROVAL   APPROVED
install-kfk5r   atlasmap-operator.v0.1.0   Manual     false
kuiwang@Kuis-MacBook-Pro 1869441 % oc edit ip install-kfk5r
installplan.operators.coreos.com/install-kfk5r edited
kuiwang@Kuis-MacBook-Pro 1869441 % oc get ip
NAME            CSV                        APPROVAL   APPROVED
install-kfk5r   atlasmap-operator.v0.1.0   Manual     true
install-xtth5   atlasmap-operator.v0.3.0   Manual     false
kuiwang@Kuis-MacBook-Pro 1869441 % oc get csv
NAME                       DISPLAY             VERSION   REPLACES   PHASE
atlasmap-operator.v0.1.0   AtlasMap Operator   0.1.0                Succeeded
kuiwang@Kuis-MacBook-Pro 1869441 % oc edit ip install-xtth5
installplan.operators.coreos.com/install-xtth5 edited
kuiwang@Kuis-MacBook-Pro 1869441 % oc get ip
NAME            CSV                        APPROVAL   APPROVED
install-kfk5r   atlasmap-operator.v0.1.0   Manual     true
install-xtth5   atlasmap-operator.v0.3.0   Manual     true
kuiwang@Kuis-MacBook-Pro 1869441 % oc get csv
NAME                       DISPLAY             VERSION   REPLACES                   PHASE
atlasmap-operator.v0.1.0   AtlasMap Operator   0.1.0                                Replacing
atlasmap-operator.v0.3.0   AtlasMap Operator   0.3.0     atlasmap-operator.v0.1.0   Installing
kuiwang@Kuis-MacBook-Pro 1869441 % oc get csv
NAME                       DISPLAY             VERSION   REPLACES                   PHASE
atlasmap-operator.v0.3.0   AtlasMap Operator   0.3.0     atlasmap-operator.v0.1.0   Succeeded

--

