Bug 1737577 - Cannot install app migration components in OCP 3.11
Summary: Cannot install app migration components in OCP 3.11
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: OpenShift Container Platform
Classification: Red Hat
Component: Migration Tooling
Version: 3.11.0
Hardware: Unspecified
OS: Unspecified
Priority: unspecified
Severity: high
Target Milestone: ---
Target Release: 4.2.0
Assignee: Derek Whatley
QA Contact: Sergio
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2019-08-05 17:09 UTC by Sergio
Modified: 2019-10-16 06:35 UTC
CC List: 5 users

Fixed In Version:
Doc Type: No Doc Update
Doc Text:
Clone Of:
Environment:
Last Closed: 2019-10-16 06:34:48 UTC
Target Upstream Version:
Embargoed:




Links
System                   ID              Private  Priority  Status  Summary  Last Updated
Red Hat Product Errata   RHBA-2019:2922  0        None      None    None     2019-10-16 06:35:05 UTC

Description Sergio 2019-08-05 17:09:44 UTC
Description of problem:

Installation of the application migration operator and controller fails on the OCP 3.11 source cluster.

Version-Release number of selected component (if applicable):

$ oc version
oc v3.11.126
kubernetes v1.11.0+d4cacc0
features: Basic-Auth GSSAPI Kerberos SPNEGO

Server https://
openshift v3.11.104
kubernetes v1.11.0+d4cacc0


https://github.com/fusor/mig-operator
commit-id: 75321b8757d3d997315b93bfb284131d8acb738c

Quay operator image manifest:
https://quay.io/repository/ocpmigrate/mig-operator/manifest/sha256:6cd4fb75ce7e79f668de3281c7ebe1069f1d32f8f17902ead4bacc3cd0a74e5f

$ oc describe pod migration-operator-6d47cc9948-qrrdf   | grep image
  Normal  Pulling    39m   kubelet, node1.sregidor-ocp3.internal  pulling image "quay.io/ocpmigrate/mig-operator:latest"


How reproducible:

Steps to Reproduce:

1. clone https://github.com/fusor/mig-operator

2. oc create -f operator.yml

3. Since this is the OCP 3 source cluster, edit controller.yml so that:
migration_controller: false
migration_ui: false
(see the example CR sketch after these steps)

4. oc create -f controller.yml
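
For reference, a minimal sketch of the edited MigrationController CR in controller.yml with the two flags from step 3 disabled. Only the field names from step 3 and the name/namespace seen in the operator log are taken from this report; everything else (including migration_velero) is an assumption and may differ from the actual file in fusor/mig-operator:

apiVersion: migration.openshift.io/v1alpha1
kind: MigrationController
metadata:
  name: migration-controller
  namespace: mig
spec:
  # Assumed default: keep velero/restic running on the source cluster
  migration_velero: true
  # Flags from step 3: the controller and UI run only on the target cluster
  migration_controller: false
  migration_ui: false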


Actual results:

1. The service account named "mig" is not created.

$ oc get pods
NAME                                  READY     STATUS    RESTARTS   AGE
migration-operator-6d47cc9948-pw2p7   2/2       Running   0          1m
restic-6c2f9                          1/1       Running   0          23s
restic-8jlmh                          1/1       Running   0          23s
restic-b79b4                          1/1       Running   0          23s
restic-ctbgt                          1/1       Running   0          23s
restic-wskh2                          1/1       Running   0          23s
velero-7559946c5c-hqvh8               1/1       Running   0          23s

$ oc get sa
NAME                 SECRETS   AGE
builder              2         1m
default              2         1m
deployer             2         1m
migration-operator   2         1m
velero               2         29s



2. There is an error in the operator logs:
$ oc logs migration-operator-6d47cc9948-pw2p7 -c operator
.
.
.
TASK [migrationcontroller : Set up migration CRDs] *****************************
task path: /opt/ansible/roles/migrationcontroller/tasks/main.yml:86
ok: [localhost] => (item=cluster-registry-crd.yaml) => {"changed": false, "item": "cluster-registry-crd.yaml", "method": "delete", "result": {}}
ok: [localhost] => (item=migration_v1alpha1_migcluster.yaml) => {"changed": false, "item": "migration_v1alpha1_migcluster.yaml", "method": "delete", "result": {}}
ok: [localhost] => (item=migration_v1alpha1_migmigration.yaml) => {"changed": false, "item": "migration_v1alpha1_migmigration.yaml", "method": "delete", "result": {}}
ok: [localhost] => (item=migration_v1alpha1_migplan.yaml) => {"changed": false, "item": "migration_v1alpha1_migplan.yaml", "method": "delete", "result": {}}
ok: [localhost] => (item=migration_v1alpha1_migstorage.yaml) => {"changed": false, "item": "migration_v1alpha1_migstorage.yaml", "method": "delete", "result": {}}

TASK [migrationcontroller : Set up mig controller] *****************************
task path: /opt/ansible/roles/migrationcontroller/tasks/main.yml:97
fatal: [localhost]: FAILED! => {"changed": false, "msg": "Failed to find exact match for migration.openshift.io/v1alpha1.MigCluster by [kind, name, singularName, shortNames]"}

PLAY RECAP *********************************************************************
localhost                  : ok=8    changed=0    unreachable=0    failed=1

Operator log fields: job=1837425794803595142, name=migration-controller, namespace=mig, error="exit status 2"
Stacktrace:
  github.com/go-logr/zapr.(*zapLogger).Error
      pkg/mod/github.com/go-logr/zapr.1/zapr.go:128
  github.com/operator-framework/operator-sdk/pkg/ansible/runner.(*runner).Run.func1
      src/github.com/operator-framework/operator-sdk/pkg/ansible/runner/runner.go:190
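
The "Failed to find exact match for migration.openshift.io/v1alpha1.MigCluster" message suggests the MigCluster CRD was not registered (or not yet discoverable) when the "Set up mig controller" task tried to create the resource. A quick way to check that on the cluster (the specific CRD name is assumed from the usual <plural>.<group> convention):

$ oc get crd | grep migration.openshift.io
$ oc get crd migclusters.migration.openshift.io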



Expected results:

1. No failures in the operator's logs.
2. The "mig" service account is created.

Additional info:

Comment 1 Derek Whatley 2019-08-05 20:02:14 UTC
@Sergio, I've just merged a PR aimed at fixing this: https://github.com/fusor/mig-operator/pull/31

Can you test again with the latest mig-operator image? It should be available on latest/master: https://quay.io/repository/ocpmigrate/mig-operator?tab=tags

Corresponding Quay autobuild: https://quay.io/repository/ocpmigrate/mig-operator/build/cba457a3-3d79-4d9c-9540-a3ce424e7c7a
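
For reference, since the image is pulled by the :latest tag, one way to pick up the rebuilt image for the retest is to cycle the operator deployment. This is only a sketch: the deployment name and namespace are inferred from the pod names and service accounts shown above, and it assumes imagePullPolicy: Always so that a fresh pod re-pulls :latest.

$ oc -n mig scale deployment migration-operator --replicas=0
$ oc -n mig scale deployment migration-operator --replicas=1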

Comment 3 Sergio 2019-08-06 07:30:37 UTC
@Derek Whatley, I have checked it. It installed properly: the "mig" service account was created, and I could add the source cluster without any problem.

$ oc get sa
NAME                 SECRETS   AGE
builder              2         1m
default              2         1m
deployer             2         1m
mig                  2         12s
migration-operator   2         1m
velero               2         20s


Thank you very much.

Comment 5 errata-xmlrpc 2019-10-16 06:34:48 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2019:2922

