Bug 1918987 - Failed to get ansible-runner stdout
Summary: Failed to get ansible-runner stdout
Keywords:
Status: CLOSED NOTABUG
Alias: None
Product: OpenShift Container Platform
Classification: Red Hat
Component: Operator SDK
Version: 4.5
Hardware: Unspecified
OS: Unspecified
Priority: unspecified
Severity: unspecified
Target Milestone: ---
Target Release: 4.6.0
Assignee: Joe Lanford
QA Contact: Fan Jia
URL:
Whiteboard:
Depends On:
Blocks: 1918781 1918988
 
Reported: 2021-01-21 20:44 UTC by Jesus M. Rodriguez
Modified: 2021-01-21 20:48 UTC

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2021-01-21 20:45:02 UTC
Target Upstream Version:
Embargoed:



Description Jesus M. Rodriguez 2021-01-21 20:44:22 UTC
This bug was initially created as a copy of Bug #1918781

I am copying this bug because: I need to appease the bug bot validation for z-streams.



Description of problem:
With the latest stage build of MTC 1.4, installation fails when creating the migrationcontroller instance; the error message is "Failed to get ansible-runner stdout".

Version-Release number of selected component (if applicable):
MTC 1.4.0

Operator image:
registry.redhat.io/rhmtc/openshift-migration-rhel7-operator@sha256:d5126b848d3ea5595470ed49dd03fb7fd1a78349646865cbf44c12cee79bed65

How reproducible:
Always

Steps to Reproduce:
When the operator pod is ready, try to create a migrationcontroller instance.

Actual results:
The migrationcontroller instance "migration-controller" shows a Failure condition and does not create the velero/restic/controller pods.

$ oc get migrationcontroller -n openshift-migration migration-controller -o yaml
......
status:
  conditions:
  - lastTransitionTime: "2021-01-20T04:26:49Z"
    message: Failed to get ansible-runner stdout
    reason: Failed
    status: "True"
    type: Failure
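The failure message can also be pulled directly with a JSONPath query against the status shown above (a sketch; the resource and condition names are taken from the output in this report):

```shell
oc get migrationcontroller migration-controller -n openshift-migration \
  -o jsonpath='{.status.conditions[?(@.type=="Failure")].message}'
```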

$ oc get pod -n openshift-migration
NAME                                  READY   STATUS    RESTARTS   AGE
migration-operator-865695f59b-5w25c   1/1     Running   0          122m

Expected results:
The migrationcontroller instance should create the expected pods (velero/restic/controller) successfully.

Additional info:
Logs from the operator pod:
$ oc logs migration-operator-865695f59b-5w25c -n openshift-migration
{"level":"info","ts":1611116795.7176538,"logger":"cmd","msg":"Go Version: go1.13.15"}
{"level":"info","ts":1611116795.7176766,"logger":"cmd","msg":"Go OS/Arch: linux/amd64"}
{"level":"info","ts":1611116795.71768,"logger":"cmd","msg":"Version of operator-sdk: v0.16.0"}
{"level":"info","ts":1611116795.7180014,"logger":"cmd","msg":"Watching single namespace.","Namespace":"openshift-migration"}
{"level":"info","ts":1611116798.4735296,"logger":"controller-runtime.metrics","msg":"metrics server is starting to listen","addr":"0.0.0.0:8383"}
{"level":"info","ts":1611116798.474781,"logger":"watches","msg":"Environment variable not set; using default value","envVar":"WORKER_MIGRATIONCONTROLLER_MIGRATION_OPENSHIFT_IO","default":1}
{"level":"info","ts":1611116798.4748175,"logger":"watches","msg":"Environment variable not set; using default value","envVar":"ANSIBLE_VERBOSITY_MIGRATIONCONTROLLER_MIGRATION_OPENSHIFT_IO","default":2}
{"level":"info","ts":1611116798.4748678,"logger":"cmd","msg":"Environment variable not set; using default value","Namespace":"openshift-migration","envVar":"ANSIBLE_DEBUG_LOGS","ANSIBLE_DEBUG_LOGS":false}
{"level":"info","ts":1611116798.4748774,"logger":"ansible-controller","msg":"Watching resource","Options.Group":"migration.openshift.io","Options.Version":"v1alpha1","Options.Kind":"MigrationController"}
{"level":"info","ts":1611116798.475004,"logger":"leader","msg":"Trying to become the leader."}
{"level":"info","ts":1611116801.2512538,"logger":"leader","msg":"No pre-existing lock was found."}
{"level":"info","ts":1611116801.2613206,"logger":"leader","msg":"Became the leader."}
{"level":"info","ts":1611116806.7989035,"logger":"metrics","msg":"Metrics Service object created","Service.Name":"migration-operator-metrics","Service.Namespace":"openshift-migration"}
{"level":"info","ts":1611116809.5811715,"logger":"proxy","msg":"Starting to serve","Address":"127.0.0.1:8888"}
{"level":"info","ts":1611116809.5813034,"logger":"controller-runtime.manager","msg":"starting metrics server","path":"/metrics"}
{"level":"info","ts":1611116809.5814123,"logger":"controller-runtime.controller","msg":"Starting EventSource","controller":"migrationcontroller-controller","source":"kind source: migration.openshift.io/v1alpha1, Kind=MigrationController"}
{"level":"info","ts":1611116809.6817353,"logger":"controller-runtime.controller","msg":"Starting Controller","controller":"migrationcontroller-controller"}
{"level":"info","ts":1611116809.6817756,"logger":"controller-runtime.controller","msg":"Starting workers","controller":"migrationcontroller-controller","worker count":1}
{"level":"error","ts":1611116809.973457,"logger":"runner","msg":"Traceback (most recent call last):\n  File \"/usr/lib/python2.7/site-packages/ansible_runner/__main__.py\", line 329, in main\n    res = run(**run_options)\n  File \"/usr/lib/python2.7/site-packages/ansible_runner/interface.py\", line 162, in run\n    r.run()\n  File \"/usr/lib/python2.7/site-packages/ansible_runner/runner.py\", line 93, in run\n    self.status_callback('starting')\n  File \"/usr/lib/python2.7/site-packages/ansible_runner/runner.py\", line 84, in status_callback\n    ansible_runner.plugins[plugin].status_handler(self.config, status_data)\n  File \"/usr/lib/python2.7/site-packages/ansible_runner_http/events.py\", line 38, in status_handler\n    urlpath=plugin_config['runner_path'])\n  File \"/usr/lib/python2.7/site-packages/ansible_runner_http/events.py\", line 18, in send_request\n    return session.post(url_actual, headers=headers, json=(data))\n  File \"/usr/lib/python2.7/site-packages/requests/sessions.py\", line 529, in post\n    return self.request('POST', url, data=data, json=json, **kwargs)\n  File \"/usr/lib/python2.7/site-packages/requests/sessions.py\", line 472, in request\n    prep = self.prepare_request(req)\n  File \"/usr/lib/python2.7/site-packages/requests/sessions.py\", line 403, in prepare_request\n    hooks=merge_hooks(request.hooks, self.hooks),\n  File \"/usr/lib/python2.7/site-packages/requests/models.py\", line 304, in prepare\n    self.prepare_url(url, params)\n  File \"/usr/lib/python2.7/site-packages/requests/models.py\", line 355, in prepare_url\n    scheme, auth, host, port, path, query, fragment = parse_url(url)\n  File \"/usr/lib/python2.7/site-packages/urllib3/util/url.py\", line 407, in parse_url\n    ensure_func = six.ensure_text\nAttributeError: 'module' object has no attribute 'ensure_text'\n","job":"6129484611666145821","name":"migration-controller","namespace":"openshift-migration","error":"exit status 
1","stacktrace":"github.com/go-logr/zapr.(*zapLogger).Error\n\tsrc/github.com/operator-framework/operator-sdk/vendor/github.com/go-logr/zapr/zapr.go:128\ngithub.com/operator-framework/operator-sdk/pkg/ansible/runner.(*runner).Run.func1\n\tsrc/github.com/operator-framework/operator-sdk/pkg/ansible/runner/runner.go:239"}
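The traceback bottoms out in urllib3's parse_url(), which calls six.ensure_text; that attribute only exists in six >= 1.12, so an older six module on the Python 2.7 path raises the AttributeError. A minimal re-implementation of what ensure_text does, shown only to clarify what the missing attribute would have provided (this is a sketch, not the vendored six code):

```python
def ensure_text(s, encoding="utf-8", errors="strict"):
    """Coerce bytes or text to text, as six.ensure_text (>= 1.12) does.

    On a Python 2.7 image with an older six, urllib3's parse_url()
    references six.ensure_text and fails with:
    AttributeError: 'module' object has no attribute 'ensure_text'
    """
    if isinstance(s, bytes):
        return s.decode(encoding, errors)
    if isinstance(s, str):
        return s
    raise TypeError("not expecting type '%s'" % type(s))
```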
{"level":"error","ts":1611116809.9860802,"logger":"reconciler","msg":"Failed to get ansible-runner stdout","job":"6129484611666145821","name":"migration-controller","namespace":"openshift-migration","error":"open /tmp/ansible-operator/runner/migration.openshift.io/v1alpha1/MigrationController/openshift-migration/migration-controller/artifacts/6129484611666145821/stdout: no such file or directory","stacktrace":"github.com/go-logr/zapr.(*zapLogger).Error\n\tsrc/github.com/operator-framework/operator-sdk/vendor/github.com/go-logr/zapr/zapr.go:128\ngithub.com/operator-framework/operator-sdk/pkg/ansible/controller.(*AnsibleOperatorReconciler).Reconcile\n\tsrc/github.com/operator-framework/operator-sdk/pkg/ansible/controller/reconcile.go:208\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler\n\tsrc/github.com/operator-framework/operator-sdk/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:256\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\tsrc/github.com/operator-framework/operator-sdk/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:232\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).worker\n\tsrc/github.com/operator-framework/operator-sdk/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:211\nk8s.io/apimachinery/pkg/util/wait.JitterUntil.func1\n\tsrc/github.com/operator-framework/operator-sdk/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:152\nk8s.io/apimachinery/pkg/util/wait.JitterUntil\n\tsrc/github.com/operator-framework/operator-sdk/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:153\nk8s.io/apimachinery/pkg/util/wait.Until\n\tsrc/github.com/operator-framework/operator-sdk/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:88"}
{"level":"error","ts":1611116809.9862165,"logger":"controller-runtime.controller","msg":"Reconciler error","controller":"migrationcontroller-controller","request":"openshift-migration/migration-controller","error":"open /tmp/ansible-operator/runner/migration.openshift.io/v1alpha1/MigrationController/openshift-migration/migration-controller/artifacts/6129484611666145821/stdout: no such file or directory","stacktrace":"github.com/go-logr/zapr.(*zapLogger).Error\n\tsrc/github.com/operator-framework/operator-sdk/vendor/github.com/go-logr/zapr/zapr.go:128\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).reconcileHandler\n\tsrc/github.com/operator-framework/operator-sdk/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:258\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).processNextWorkItem\n\tsrc/github.com/operator-framework/operator-sdk/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:232\nsigs.k8s.io/controller-runtime/pkg/internal/controller.(*Controller).worker\n\tsrc/github.com/operator-framework/operator-sdk/vendor/sigs.k8s.io/controller-runtime/pkg/internal/controller/controller.go:211\nk8s.io/apimachinery/pkg/util/wait.JitterUntil.func1\n\tsrc/github.com/operator-framework/operator-sdk/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:152\nk8s.io/apimachinery/pkg/util/wait.JitterUntil\n\tsrc/github.com/operator-framework/operator-sdk/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:153\nk8s.io/apimachinery/pkg/util/wait.Until\n\tsrc/github.com/operator-framework/operator-sdk/vendor/k8s.io/apimachinery/pkg/util/wait/wait.go:88"}
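The two reconciler errors are downstream of the runner crash: ansible-runner exits before writing its stdout artifact, so the operator's subsequent read of `<artifact_dir>/stdout` fails with ENOENT, which surfaces as "Failed to get ansible-runner stdout". A sketch of that failure chain (the path layout is assumed from the log lines above; `get_runner_stdout` is a hypothetical helper, not the Operator SDK API):

```python
import os
import tempfile

# Stand-in for the runner artifact directory, e.g.
# /tmp/ansible-operator/runner/.../artifacts/<job-id>/
artifact_dir = tempfile.mkdtemp()
stdout_path = os.path.join(artifact_dir, "stdout")


def get_runner_stdout(path):
    """Return (contents, error); error is set when the runner crashed
    before writing the stdout artifact."""
    if not os.path.exists(path):
        return None, "open %s: no such file or directory" % path
    with open(path) as f:
        return f.read(), None


out, err = get_runner_stdout(stdout_path)
# The runner never wrote the artifact, so the read fails.
```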
......


http://pastebin.test.redhat.com/933018

