Bug 1803122 - [4.3] cannot interact with toolbox container after first execution
Summary: [4.3] cannot interact with toolbox container after first execution
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: OpenShift Container Platform
Classification: Red Hat
Component: RHCOS
Version: 4.3.0
Hardware: Unspecified
OS: Unspecified
Priority: unspecified
Severity: unspecified
Target Milestone: ---
Target Release: 4.3.z
Assignee: Micah Abbott
QA Contact: Michael Nguyen
URL:
Whiteboard:
Depends On: 1803112
Blocks: 1186913 1803128
 
Reported: 2020-02-14 14:26 UTC by Micah Abbott
Modified: 2020-03-24 14:34 UTC
CC: 7 users

Fixed In Version: toolbox-0.0.7-1.rhaos4.3.el8
Doc Type: If docs needed, set a value
Doc Text:
Clone Of: 1803112
Clones: 1803128
Environment:
Last Closed: 2020-03-24 14:33:37 UTC
Target Upstream Version:
Embargoed:


Attachments


Links

- GitHub coreos/toolbox pull 62 (closed): "rhcos-toolbox: use interactive flag for exec" (last updated 2021-01-18 03:08:23 UTC)
- Red Hat Knowledge Base Solution 4919141 (last updated 2020-03-20 21:18:37 UTC)
- Red Hat Product Errata RHBA-2020:0858 (last updated 2020-03-24 14:34:03 UTC)

Description Micah Abbott 2020-02-14 14:26:04 UTC
+++ This bug was initially created as a clone of Bug #1803112 +++

Users who invoke the toolbox container again after its first execution encounter a stuck session where no input is possible.

This has already been fixed upstream:

https://github.com/coreos/toolbox/pull/62
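
The upstream PR title points at the root cause: when toolbox reattaches to an already-created container, it runs `podman exec` without the interactive/TTY flags, so stdin is never attached and the session appears hung. A minimal before/after sketch (illustrative commands only, not the exact upstream patch; the container name and shell path are assumptions):

  # Before: reattach without -i/-t; stdin is not wired up, so the shell looks stuck
  podman exec toolbox- /bin/sh

  # After (per the PR title, "use interactive flag for exec"):
  # -i keeps stdin open, -t allocates a pseudo-terminal
  podman exec -it toolbox- /bin/sh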

Comment 1 Micah Abbott 2020-03-12 13:23:39 UTC
The fixed package landed in RHCOS 43.81.202003111353.0; all subsequent builds will include the fix.
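
One quick way to confirm that a node carries the fixed package without opening an interactive session (a sketch; <node-name> is a placeholder for one of your nodes):

  $ oc debug node/<node-name> -- chroot /host rpm -q toolbox
  # Expect toolbox-0.0.7-1.rhaos4.3.el8.noarch or later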

Comment 4 Michael Nguyen 2020-03-13 18:16:12 UTC
Verified on 4.3.0-0.nightly-2020-03-13-103840, which includes RHCOS 43.81.202003111633.0.

$ oc get node
NAME                           STATUS   ROLES    AGE     VERSION
ip-10-0-132-161.ec2.internal   Ready    worker   9m42s   v1.16.2
ip-10-0-132-86.ec2.internal    Ready    worker   10m     v1.16.2
ip-10-0-135-21.ec2.internal    Ready    master   18m     v1.16.2
ip-10-0-135-88.ec2.internal    Ready    master   18m     v1.16.2
ip-10-0-144-81.ec2.internal    Ready    master   18m     v1.16.2
ip-10-0-152-86.ec2.internal    Ready    worker   10m     v1.16.2

$ oc debug node/ip-10-0-132-161.ec2.internal
Starting pod/ip-10-0-132-161ec2internal-debug ...
To use host binaries, run `chroot /host`


If you don't see a command prompt, try pressing enter.
sh-4.2# chroot /host
sh-4.4# sosreport
sh: sosreport: command not found
sh-4.4# rpm -q toolbox
toolbox-0.0.7-1.rhaos4.3.el8.noarch
sh-4.4# rpm-ostree status
State: idle
AutomaticUpdates: disabled
Deployments:
* pivot://quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:de35f4db968ba15b9868e2c3851b445c202dd79fa141db92e225ed2bc53b599a
              CustomOrigin: Managed by machine-config-operator
                   Version: 43.81.202003111633.0 (2020-03-11T16:38:41Z)

  ostree://23527ffc123c6e2bedf3479ff7e96f38d92cec88d5a7951fa56e9d0ec75ddd77
                   Version: 43.81.202001142154.0 (2020-01-14T21:59:51Z)
sh-4.4# toolbox
Trying to pull registry.redhat.io/rhel8/support-tools...
Getting image source signatures
Copying blob eae5d284042d done
Copying blob 0a4a43613721 done
Copying blob ff6f434a470a done
Copying config 53d1e01dae done
Writing manifest to image destination
Storing signatures
53d1e01dae0c44c45f36e72d2d1f0fa91069c147bbd9d2971335ecf2ca93b446
Spawning a container 'toolbox-' with image 'registry.redhat.io/rhel8/support-tools'
Detected RUN label in the container image. Using that as the default...
command: podman run -it --name toolbox- --privileged --ipc=host --net=host --pid=host -e HOST=/host -e NAME=toolbox- -e IMAGE=registry.redhat.io/rhel8/support-tools:latest -v /run:/run -v /var/log:/var/log -v /etc/machine-id:/etc/machine-id -v /etc/localtime:/etc/localtime -v /:/host registry.redhat.io/rhel8/support-tools:latest
[root@ip-10-0-132-161 /]# sosreport

sosreport (version 3.7)

This command will collect diagnostic and configuration information from
this Red Hat Enterprise Linux system and installed applications.

An archive containing the collected information will be generated in
/host/var/tmp/sos.p3wvnj7t and may be provided to a Red Hat support
representative.

Any information provided to Red Hat will be treated in accordance with
the published support policies at:

  https://access.redhat.com/support/

The generated archive may contain data considered sensitive and its
content should be reviewed by the originating organization before being
passed to any third party.

No changes will be made to system configuration.

Press ENTER to continue, or CTRL-C to quit.
^CExiting on user cancel
[root@ip-10-0-132-161 /]# exit
exit
sh-4.4# toolbox
Container 'toolbox-' already exists. Trying to start...
(To remove the container and start with a fresh toolbox, run: sudo podman rm 'toolbox-')
toolbox-
Container started successfully. To exit, type 'exit'.
sh-4.2# exit
exit
sh-4.4# toolbox
Container 'toolbox-' already exists. Trying to start...
(To remove the container and start with a fresh toolbox, run: sudo podman rm 'toolbox-')
toolbox-
Container started successfully. To exit, type 'exit'.
sh-4.2# sosreport

sosreport (version 3.7)

This command will collect diagnostic and configuration information from
this Red Hat Enterprise Linux system and installed applications.

An archive containing the collected information will be generated in
/host/var/tmp/sos.K8V9ht and may be provided to a Red Hat support
representative.

Any information provided to Red Hat will be treated in accordance with
the published support policies at:

  https://access.redhat.com/support/

The generated archive may contain data considered sensitive and its
content should be reviewed by the originating organization before being
passed to any third party.

No changes will be made to system configuration.

Press ENTER to continue, or CTRL-C to quit.
^CExiting on user cancel
sh-4.2# exit
exit
sh-4.4# toolbox
Container 'toolbox-' already exists. Trying to start...
(To remove the container and start with a fresh toolbox, run: sudo podman rm 'toolbox-')
toolbox-
Container started successfully. To exit, type 'exit'.
sh-4.2# exit
exit
sh-4.4# exit
exit
sh-4.2# exit
exit

Removing debug pod ...
$ oc get clusterversion
NAME      VERSION                             AVAILABLE   PROGRESSING   SINCE   STATUS
version   4.3.0-0.nightly-2020-03-13-103840   True        False         5m45s   Cluster version is 4.3.0-0.nightly-2020-03-13-103840

Comment 6 errata-xmlrpc 2020-03-24 14:33:37 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2020:0858

