Bug 1918124 - Failed to create migrationcontroller instance because failed to get ansible-runner stdout
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Migration Toolkit for Containers
Classification: Red Hat
Component: Operator
Version: 1.4.0
Hardware: Unspecified
OS: Unspecified
Priority: urgent
Severity: urgent
Target Milestone: ---
Target Release: 1.4.0
Assignee: Rayford Johnson
QA Contact: Xin jiang
Docs Contact: Avital Pinnick
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2021-01-20 06:37 UTC by Xin jiang
Modified: 2021-02-11 12:55 UTC
CC: 5 users

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2021-02-11 12:55:27 UTC
Target Upstream Version:
Embargoed:


Attachments


Links
System ID | Private | Priority | Status | Summary | Last Updated
Red Hat Bugzilla 1918781 | 0 | unspecified | CLOSED | Failed to get ansible-runner stdout | 2021-02-22 00:41:40 UTC
Red Hat Product Errata RHBA-2020:5329 | 0 | None | None | None | 2021-02-11 12:55:45 UTC

Internal Links: 1918781

Description Xin jiang 2021-01-20 06:37:21 UTC
Description of problem:
With the latest stage build of MTC 1.4, installation fails when the migrationcontroller instance is created. The error message is "Failed to get ansible-runner stdout".

Version-Release number of selected component (if applicable):
MTC 1.4.0

Operator image:
registry.redhat.io/rhmtc/openshift-migration-rhel7-operator@sha256:d5126b848d3ea5595470ed49dd03fb7fd1a78349646865cbf44c12cee79bed65

How reproducible:
Always

Steps to Reproduce:
When the operator pod is ready, create a migrationcontroller instance.
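For reference, a minimal MigrationController manifest for this step might look like the following. The group/version/kind and namespace match the operator's watch shown in the logs; the spec fields are illustrative assumptions based on typical MTC manifests, not values taken from this report:

```yaml
# Sketch only: apiVersion/kind/metadata match this report; the spec fields
# (migration_controller, migration_ui, migration_velero) are assumptions
# and may differ in your MTC release.
apiVersion: migration.openshift.io/v1alpha1
kind: MigrationController
metadata:
  name: migration-controller
  namespace: openshift-migration
spec:
  migration_controller: true
  migration_ui: true
  migration_velero: true
```

Applying a manifest like this with `oc apply -f -` and then checking `oc get pod -n openshift-migration` reproduces the failure.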

Actual results:
The migrationcontroller instance "migration-controller" showed an error message and did not create the velero/restic/controller pods.

$ oc get migrationcontroller -n openshift-migration migration-controller -o yaml
......
status:
  conditions:
  - lastTransitionTime: "2021-01-20T04:26:49Z"
    message: Failed to get ansible-runner stdout
    reason: Failed
    status: "True"
    type: Failure

$ oc get pod -n openshift-migration
NAME                                  READY   STATUS    RESTARTS   AGE
migration-operator-865695f59b-5w25c   1/1     Running   0          122m

Expected results:
The migrationcontroller instance should create the expected pods successfully, for instance the velero/restic/controller pods.

Additional info:
Logs from the operator pod:
$ oc logs migration-operator-865695f59b-5w25c -n openshift-migration
{"level":"info","ts":1611116795.7176538,"logger":"cmd","msg":"Go Version: go1.13.15"}
{"level":"info","ts":1611116795.7176766,"logger":"cmd","msg":"Go OS/Arch: linux/amd64"}
{"level":"info","ts":1611116795.71768,"logger":"cmd","msg":"Version of operator-sdk: v0.16.0"}
{"level":"info","ts":1611116795.7180014,"logger":"cmd","msg":"Watching single namespace.","Namespace":"openshift-migration"}
{"level":"info","ts":1611116798.4735296,"logger":"controller-runtime.metrics","msg":"metrics server is starting to listen","addr":"0.0.0.0:8383"}
{"level":"info","ts":1611116798.474781,"logger":"watches","msg":"Environment variable not set; using default value","envVar":"WORKER_MIGRATIONCONTROLLER_MIGRATION_OPENSHIFT_IO","default":1}
{"level":"info","ts":1611116798.4748175,"logger":"watches","msg":"Environment variable not set; using default value","envVar":"ANSIBLE_VERBOSITY_MIGRATIONCONTROLLER_MIGRATION_OPENSHIFT_IO","default":2}
{"level":"info","ts":1611116798.4748678,"logger":"cmd","msg":"Environment variable not set; using default value","Namespace":"openshift-migration","envVar":"ANSIBLE_DEBUG_LOGS","ANSIBLE_DEBUG_LOGS":false}
{"level":"info","ts":1611116798.4748774,"logger":"ansible-controller","msg":"Watching resource","Options.Group":"migration.openshift.io","Options.Version":"v1alpha1","Options.Kind":"MigrationController"}
{"level":"info","ts":1611116798.475004,"logger":"leader","msg":"Trying to become the leader."}
{"level":"info","ts":1611116801.2512538,"logger":"leader","msg":"No pre-existing lock was found."}
{"level":"info","ts":1611116801.2613206,"logger":"leader","msg":"Became the leader."}
{"level":"info","ts":1611116806.7989035,"logger":"metrics","msg":"Metrics Service object created","Service.Name":"migration-operator-metrics","Service.Namespace":"openshift-migration"}
{"level":"info","ts":1611116809.5811715,"logger":"proxy","msg":"Starting to serve","Address":"127.0.0.1:8888"}
{"level":"info","ts":1611116809.5813034,"logger":"controller-runtime.manager","msg":"starting metrics server","path":"/metrics"}
{"level":"info","ts":1611116809.5814123,"logger":"controller-runtime.controller","msg":"Starting EventSource","controller":"migrationcontroller-controller","source":"kind source: migration.openshift.io/v1alpha1, Kind=MigrationController"}
{"level":"info","ts":1611116809.6817353,"logger":"controller-runtime.controller","msg":"Starting Controller","controller":"migrationcontroller-controller"}
{"level":"info","ts":1611116809.6817756,"logger":"controller-runtime.controller","msg":"Starting workers","controller":"migrationcontroller-controller","worker count":1}
{"level":"error","ts":1611116809.973457,"logger":"runner","msg":"Traceback (most recent call last):\n  File \"/usr/lib/python2.7/site-packages/ansible_runner/__main__.py\", line 329, in main\n    res = run(**run_options)\n  File \"/usr/lib/python2.7/site-packages/ansible_runner/interface.py\", line 162, in run\n    r.run()\n  File \"/usr/lib/python2.7/site-packages/ansible_runner/runner.py\", line 93, in run\n    self.status_callback('starting')\n  File \"/usr/lib/python2.7/site-packages/ansible_runner/runner.py\", line 84, in status_callback\n    ansible_runner.plugins[plugin].status_handler(self.config, status_data)\n  File \"/usr/lib/python2.7/site-packages/ansible_runner_http/events.py\", line 38, in status_handler\n    urlpath=plugin_config['runner_path'])\n  File \"/usr/lib/python2.7/site-packages/ansible_runner_http/events.py\", line 18, in send_request\n    return session.post(url_actual, headers=headers, json=(data))\n  File \"/usr/lib/python2.7/site-packages/requests/sessions.py\", line 529, in post\n    return self.request('POST', url, data=data, json=json, **kwargs)\n  File \"/usr/lib/python2.7/site-packages/requests/sessions.py\", line 472, in request\n    prep = self.prepare_request(req)\n  File \"/usr/lib/python2.7/site-packages/requests/sessions.py\", line 403, in prepare_request\n    hooks=merge_hooks(request.hooks, self.hooks),\n  File \"/usr/lib/python2.7/site-packages/requests/models.py\", line 304, in prepare\n    self.prepare_url(url, params)\n  File \"/usr/lib/python2.7/site-packages/requests/models.py\", line 355, in prepare_url\n    scheme, auth, host, port, path, query, fragment = parse_url(url)\n  File \"/usr/lib/python2.7/site-packages/urllib3/util/url.py\", line 407, in parse_url\n    ensure_func = six.ensure_text\nAttributeError: 'module' object has no attribute 'ensure_text'\n","job":"6129484611666145821","name":"migration-controller","namespace":"openshift-migration","error":"exit status 
1","stacktrace":"github.com/go-logr/zapr.(*zapLogger).Error\n\tsrc/github.com/operator-framework/operator-sdk/vendor/github.com/go-logr/zapr/zapr.go:128\ngithub.com/operator-framework/operator-sdk/pkg/ansible/runner.(*runner).Run.func1\n\tsrc/github.com/operator-framework/operator-sdk/pkg/ansible/runner/runner.go:239"}
{"level":"error","ts":1611116809.9860802,"logger":"reconciler","msg":"Failed to get ansible-runner stdout","job":"6129484611666145821","name":"migration-controller","namespace":"openshift-migration","error":"open /tmp/ansible-operator/runner/migration.openshift.io/v1alpha1/MigrationController/openshift-migration/migration-controller/artifacts/6129484611666145821/stdout: no such file or directory","stacktrace":"github.com/go-logr/zapr.(*zapLogger).Error\n\tsrc/github.com/operator-framework/operator-sdk/vendor/github.com/go-logr/zapr/zapr.go:128\ngithub.com/operator-framework/operator-sdk/pkg/ansible/controller.(*AnsibleOperatorReconciler).Reconcile\n\tsrc/github.com/operator-framework/operator-sdk/pkg/ansible/controller/reconcile.go:208\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler\n\tsrc/github.com/operator-framework/operator-sdk/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:256\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\tsrc/github.com/operator-framework/operator-sdk/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:232\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).worker\n\tsrc/github.com/operator-framework/operator-sdk/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:211\nk8s.io/apimachinery/pkg/util/wait.JitterUntil.func1\n\tsrc/github.com/operator-framework/operator-sdk/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:152\nk8s.io/apimachinery/pkg/util/wait.JitterUntil\n\tsrc/github.com/operator-framework/operator-sdk/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:153\nk8s.io/apimachinery/pkg/util/wait.Until\n\tsrc/github.com/operator-framework/operator-sdk/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:88"}
{"level":"error","ts":1611116809.9862165,"logger":"controller-runtime.controller","msg":"Reconciler error","controller":"migrationcontroller-controller","request":"openshift-migration/migration-controller","error":"open /tmp/ansible-operator/runner/migration.openshift.io/v1alpha1/MigrationController/openshift-migration/migration-controller/artifacts/6129484611666145821/stdout: no such file or directory","stacktrace":"github.com/go-logr/zapr.(*zapLogger).Error\n\tsrc/github.com/operator-framework/operator-sdk/vendor/github.com/go-logr/zapr/zapr.go:128\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler\n\tsrc/github.com/operator-framework/operator-sdk/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:258\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\tsrc/github.com/operator-framework/operator-sdk/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:232\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).worker\n\tsrc/github.com/operator-framework/operator-sdk/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:211\nk8s.io/apimachinery/pkg/util/wait.JitterUntil.func1\n\tsrc/github.com/operator-framework/operator-sdk/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:152\nk8s.io/apimachinery/pkg/util/wait.JitterUntil\n\tsrc/github.com/operator-framework/operator-sdk/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:153\nk8s.io/apimachinery/pkg/util/wait.Until\n\tsrc/github.com/operator-framework/operator-sdk/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:88"}
......
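The root cause is visible in the traceback above: urllib3's parse_url() references six.ensure_text, which only exists in six >= 1.12.0, so the AttributeError suggests the operator image bundled an older six. ansible-runner then exits before writing its stdout artifact, which produces the "Failed to get ansible-runner stdout" condition. A minimal sketch of that failure mode (a stub module stands in for the outdated six; this is illustrative, not the MTC or urllib3 code):

```python
import types

# A bare module object stands in for a six release older than 1.12.0,
# which lacks ensure_text().
old_six = types.ModuleType("six")

def parse_url_like(url):
    # Roughly what urllib3/util/url.py does at the line in the traceback:
    ensure_func = old_six.ensure_text  # raises AttributeError on old six
    return ensure_func(url)

try:
    parse_url_like("http://localhost/events")
except AttributeError as exc:
    # On Python 2 this prints the same wording as the operator log:
    # 'module' object has no attribute 'ensure_text'
    print("parse_url failed:", exc)
```

Upgrading the bundled six to 1.12.0 or later (where ensure_text exists) avoids this code path failing.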

Comment 9 Xin jiang 2021-01-25 03:15:19 UTC
Verified.

Images:
"registry.redhat.io/rhmtc/openshift-migration-controller-rhel8@sha256:cdf1bd56e353f076693cb7373c0a876be8984593d664ee0d7e1aeae7a3c54c1f",
"registry.redhat.io/rhmtc/openshift-migration-log-reader-rhel8@sha256:6dbd4c4aa27dcaede49f68159b9923840732d67bfb4f14e4107e8ff28f56defa",
"registry.redhat.io/rhmtc/openshift-migration-rhel7-operator@sha256:79f524931e7188bfbfddf1e3d23f491b627d691ef7849a42432c7aec2d5f8a54",
"registry.redhat.io/rhmtc/openshift-migration-ui-rhel8@sha256:8d460632dd50529aa0054b14c95e7a44dd6478ad3116ef5a27a4b904fe4360d7",
"registry.redhat.io/rhmtc/openshift-migration-velero-plugin-for-aws-rhel8@sha256:7c8d143d1ba9605e33e33392dece4a06607ddbdaccfeece36259b7d4fbbeff96",
"registry.redhat.io/rhmtc/openshift-migration-velero-plugin-for-gcp-rhel8@sha256:4ef0b71cf9d464d39086c024f26df7579279877afbab31935c4fb00ca7c883b9",
"registry.redhat.io/rhmtc/openshift-migration-velero-plugin-for-microsoft-azure-rhel8@sha256:bc8beadfeaac4ca72c9aa185176a097501781ad35064f1785183c34e577505f4",
"registry.redhat.io/rhmtc/openshift-migration-velero-rhel8@sha256:60b3aa7f53afccbbba9630904b10ba3257921769e7b142fd9ceb3df2c5016302",
"registry.redhat.io/rhmtc/openshift-velero-plugin-rhel8@sha256:c0f0a0698ae9b1ac8ff4279653f688f5cfbf80615efce50e9a03a194a02ede2a"

Comment 11 errata-xmlrpc 2021-02-11 12:55:27 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (Migration Toolkit for Containers (MTC) tool image release advisory 1.4.0), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2020:5329

