Bug 1811514 - [container-tools:rhel8] Failed to start existing container
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Enterprise Linux 8
Classification: Red Hat
Component: toolbox
Version: 8.2
Hardware: x86_64
OS: Linux
Severity: high
Priority: high
Target Milestone: rc
Target Release: 8.2
Assignee: Jindrich Novy
QA Contact: Martin Jenner
URL:
Whiteboard:
Depends On:
Blocks: 1734579 1815460 1816287
 
Reported: 2020-03-09 06:03 UTC by Alex Jia
Modified: 2021-09-03 15:16 UTC
CC: 5 users

Fixed In Version: toolbox-0.0.7-1.el8
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Clones: 1815460 1816287
Environment:
Last Closed: 2020-04-28 15:53:37 UTC
Type: Bug
Target Upstream Version:
Embargoed:




Links
System ID Private Priority Status Summary Last Updated
Github coreos toolbox pull 62 0 None closed rhcos-toolbox: use interactive flag for exec 2021-01-18 03:15:27 UTC
Red Hat Bugzilla 1803112 0 low CLOSED cannot interact with toolbox container after first execution 2021-02-22 00:41:40 UTC
Red Hat Issue Tracker RHELPLAN-39559 0 None None None 2021-09-03 15:16:36 UTC
Red Hat Knowledge Base (Solution) 4919141 0 None None None 2020-03-20 21:17:55 UTC
Red Hat Product Errata RHSA-2020:1650 0 None None None 2020-04-28 15:53:51 UTC

Description Alex Jia 2020-03-09 06:03:41 UTC
Description of problem:
Starting an existing container leaves a dead terminal: the prompt appears, but no commands can be entered in the terminal window.

Version-Release number of selected component (if applicable):

[root@hpe-dl380pgen8-02-vm-5 ~]# cat /etc/redhat-release 
Red Hat Enterprise Linux release 8.2 Beta (Ootpa)

[root@hpe-dl380pgen8-02-vm-5 ~]# rpm -q toolbox podman buildah runc skopeo kernel
toolbox-0.0.4-1.module+el8.2.0+4897+4dbe4c2c.x86_64
podman-1.6.4-6.module+el8.2.0+5825+4c03a97d.x86_64
buildah-1.11.6-7.module+el8.2.0+5856+b8046c6d.x86_64
runc-1.0.0-65.rc10.module+el8.2.0+5762+aaee29fb.x86_64
skopeo-0.1.40-9.module+el8.2.0+5762+aaee29fb.x86_64
kernel-4.18.0-185.el8.x86_64

How reproducible:
always

Steps to Reproduce:
1. toolbox then exit
2. toolbox again
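
The two steps above amount to the following, run as root (a sketch; it assumes pull access to registry.redhat.io/rhel8/support-tools):

```shell
toolbox   # first run: pulls support-tools, spawns 'toolbox-root'; type 'exit' to leave
toolbox   # second run: restarts the existing 'toolbox-root'; input hangs here
```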

Actual results:

[root@hpe-dl380pgen8-02-vm-5 ~]# toolbox
Trying to pull registry.redhat.io/rhel8/support-tools...
Getting image source signatures
Copying blob 0a4a43613721 done
Copying blob ff6f434a470a done
Copying blob eae5d284042d done
Copying config 53d1e01dae done
Writing manifest to image destination
Storing signatures
53d1e01dae0c44c45f36e72d2d1f0fa91069c147bbd9d2971335ecf2ca93b446
Spawning a container 'toolbox-root' with image 'registry.redhat.io/rhel8/support-tools'
Detected RUN label in the container image. Using that as the default...
command: podman run -it --name toolbox-root --privileged --ipc=host --net=host --pid=host -e HOST=/host -e NAME=toolbox-root -e IMAGE=registry.redhat.io/rhel8/support-tools:latest -v /run:/run -v /var/log:/var/log -v /etc/machine-id:/etc/machine-id -v /etc/localtime:/etc/localtime -v /:/host registry.redhat.io/rhel8/support-tools:latest
[root@hpe-dl380pgen8-02-vm-5 /]# pwd
/
[root@hpe-dl380pgen8-02-vm-5 /]# ls
bin  boot  dev	etc  home  host  lib  lib64  lost+found  media	mnt  opt  proc	root  run  sbin  srv  sys  tmp	usr  var
[root@hpe-dl380pgen8-02-vm-5 /]# exit
exit
[root@hpe-dl380pgen8-02-vm-5 ~]# echo $?
0
[root@hpe-dl380pgen8-02-vm-5 ~]# podman images
REPOSITORY                               TAG      IMAGE ID       CREATED       SIZE
registry.redhat.io/rhel8/support-tools   latest   53d1e01dae0c   5 weeks ago   271 MB
[root@hpe-dl380pgen8-02-vm-5 ~]# podman ps -a
CONTAINER ID  IMAGE                                          COMMAND        CREATED         STATUS                     PORTS  NAMES
a92e0e3814e1  registry.redhat.io/rhel8/support-tools:latest  /usr/bin/bash  39 seconds ago  Exited (0) 27 seconds ago         toolbox-root
[root@hpe-dl380pgen8-02-vm-5 ~]# toolbox
Container 'toolbox-root' already exists. Trying to start...
(To remove the container and start with a fresh toolbox, run: sudo podman rm 'toolbox-root')
toolbox-root
Container started successfully. To exit, type 'exit'.
sh-4.4# 

NOTE: no commands can be entered; the terminal window is unresponsive.
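
The hang is consistent with the existing container being restarted without stdin attached; the linked upstream pull request ("rhcos-toolbox: use interactive flag for exec") points the same way. A hedged recovery sketch, assuming the container name 'toolbox-root' from the transcript above:

```shell
# Either start the stale container with stdin attached explicitly...
sudo podman start --attach --interactive toolbox-root

# ...or remove it so toolbox recreates it fresh
# (the workaround Micah gives in comment 4):
sudo podman rm toolbox-root
```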


Expected results:


Additional info:

(gdb) bt
#0  0x00007efeec92af4b in __GI___waitpid (pid=pid@entry=-1, stat_loc=stat_loc@entry=0x7ffe74227b40, options=options@entry=0) at ../sysdeps/unix/sysv/linux/waitpid.c:30
#1  0x0000555e62a01649 in waitchld (block=block@entry=1, wpid=24715) at jobs.c:3475
#2  0x0000555e62a02cd3 in wait_for (pid=24715) at jobs.c:2718
#3  0x0000555e629f16f2 in execute_command_internal (command=0x555e641f8bb0, asynchronous=0, pipe_in=-1, pipe_out=-1, fds_to_close=<optimized out>) at execute_cmd.c:865
#4  0x0000555e629efab5 in execute_command_internal (command=0x555e641f5510, asynchronous=0, pipe_in=-1, pipe_out=-1, fds_to_close=0x555e641ef450) at execute_cmd.c:980
#5  0x0000555e629f2a79 in execute_function (var=var@entry=0x555e641ea860, flags=flags@entry=0, fds_to_close=fds_to_close@entry=0x555e641ef450, async=async@entry=0, subshell=subshell@entry=0, words=<optimized out>) at execute_cmd.c:4810
#6  0x0000555e629ef124 in execute_builtin_or_function (flags=<optimized out>, fds_to_close=0x555e641ef450, redirects=<optimized out>, var=0x555e641ea860, builtin=0x0, words=0x555e641f60f0) at execute_cmd.c:5045
#7  execute_simple_command (simple_command=<optimized out>, pipe_in=<optimized out>, pipe_in@entry=-1, pipe_out=pipe_out@entry=-1, async=async@entry=0, fds_to_close=fds_to_close@entry=0x555e641ef450) at execute_cmd.c:4345
#8  0x0000555e629f01a6 in execute_command_internal (command=0x555e641f50c0, asynchronous=0, pipe_in=-1, pipe_out=-1, fds_to_close=0x555e641ef450) at execute_cmd.c:819
#9  0x0000555e629f0ddc in execute_connection (fds_to_close=0x555e641ef450, pipe_out=-1, pipe_in=-1, asynchronous=0, command=0x555e641f1570) at execute_cmd.c:2615
#10 execute_command_internal (command=0x555e641f1570, asynchronous=0, pipe_in=-1, pipe_out=-1, fds_to_close=0x555e641ef450) at execute_cmd.c:988
#11 0x0000555e629efab5 in execute_command_internal (command=0x555e641f1380, asynchronous=0, pipe_in=-1, pipe_out=-1, fds_to_close=0x555e641ef450) at execute_cmd.c:980
#12 0x0000555e629f2a79 in execute_function (var=var@entry=0x555e641e5920, flags=flags@entry=0, fds_to_close=fds_to_close@entry=0x555e641ef450, async=async@entry=0, subshell=subshell@entry=0, words=<optimized out>) at execute_cmd.c:4810
#13 0x0000555e629ef124 in execute_builtin_or_function (flags=<optimized out>, fds_to_close=0x555e641ef450, redirects=<optimized out>, var=0x555e641e5920, builtin=0x0, words=0x555e641f1a60) at execute_cmd.c:5045
#14 execute_simple_command (simple_command=<optimized out>, pipe_in=<optimized out>, pipe_in@entry=-1, pipe_out=pipe_out@entry=-1, async=async@entry=0, fds_to_close=fds_to_close@entry=0x555e641ef450) at execute_cmd.c:4345
#15 0x0000555e629f01a6 in execute_command_internal (command=0x555e641f0cc0, asynchronous=0, pipe_in=-1, pipe_out=-1, fds_to_close=0x555e641ef450) at execute_cmd.c:819
#16 0x0000555e629f0ddc in execute_connection (fds_to_close=0x555e641ef450, pipe_out=-1, pipe_in=-1, asynchronous=0, command=0x555e641ef7f0) at execute_cmd.c:2615
#17 execute_command_internal (command=0x555e641ef7f0, asynchronous=0, pipe_in=-1, pipe_out=-1, fds_to_close=0x555e641ef450) at execute_cmd.c:988
#18 0x0000555e629f1a56 in execute_command (command=0x555e641ef7f0) at execute_cmd.c:409
#19 0x0000555e629f0da9 in execute_connection (fds_to_close=0x555e641f0a40, pipe_out=-1, pipe_in=-1, asynchronous=0, command=0x555e641ef770) at execute_cmd.c:2613
#20 execute_command_internal (command=0x555e641ef770, asynchronous=0, pipe_in=-1, pipe_out=-1, fds_to_close=0x555e641f0a40) at execute_cmd.c:988
#21 0x0000555e629efab5 in execute_command_internal (command=0x555e641ef700, asynchronous=0, pipe_in=-1, pipe_out=-1, fds_to_close=0x555e641f0a40) at execute_cmd.c:980
#22 0x0000555e629f2a79 in execute_function (var=var@entry=0x555e641edb70, flags=flags@entry=0, fds_to_close=fds_to_close@entry=0x555e641f0a40, async=async@entry=0, subshell=subshell@entry=0, words=<optimized out>) at execute_cmd.c:4810
#23 0x0000555e629ef124 in execute_builtin_or_function (flags=<optimized out>, fds_to_close=0x555e641f0a40, redirects=<optimized out>, var=0x555e641edb70, builtin=0x0, words=0x555e641e8cd0) at execute_cmd.c:5045
#24 execute_simple_command (simple_command=<optimized out>, pipe_in=<optimized out>, pipe_in@entry=-1, pipe_out=pipe_out@entry=-1, async=async@entry=0, fds_to_close=fds_to_close@entry=0x555e641f0a40) at execute_cmd.c:4345
#25 0x0000555e629f01a6 in execute_command_internal (command=0x555e641ee560, asynchronous=0, pipe_in=-1, pipe_out=-1, fds_to_close=0x555e641f0a40) at execute_cmd.c:819
#26 0x0000555e629f1a56 in execute_command (command=0x555e641ee560) at execute_cmd.c:409
#27 0x0000555e629d93d9 in reader_loop () at eval.c:181
#28 0x0000555e629d7b9b in main (argc=2, argv=0x7ffe74228bd8, env=0x7ffe74228bf0) at shell.c:802
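
For context, a backtrace like the one above can be captured by attaching gdb to the hung shell from a second terminal; a sketch (the pgrep selection is illustrative and assumes the stuck process is the sh-4.4 bash inside the container; symbolized frames need bash debuginfo installed):

```shell
pid=$(pgrep -n bash)          # illustrative: pick the stuck shell's PID
gdb -p "$pid" -batch -ex bt   # attach, print the backtrace, detach
```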

Comment 4 Micah Abbott 2020-03-20 19:33:35 UTC
The workaround for this BZ is to use `podman rm` to remove the existing container.

```
$ toolbox
Trying to pull registry.redhat.io/rhel8/support-tools...
  unable to retrieve auth token: invalid username/password: unauthorized: Please login to the Red Hat Registry using your Customer Portal credentials. Further instructions can be found here: https://access.redhat.com/RegistryAuthentication
Error: error pulling image "registry.redhat.io/rhel8/support-tools": unable to pull registry.redhat.io/rhel8/support-tools: unable to pull image: Error initializing source docker://registry.redhat.io/rhel8/support-tools:latest: unable to retrieve auth token: invalid username/password
Would you like to manually authenticate to registry: 'registry.redhat.io' and try again? [y/N] y
Username: miabbott
Password: 
Login Succeeded!
Trying to pull registry.redhat.io/rhel8/support-tools...
Getting image source signatures
Copying blob eae5d284042d done  
Copying blob 0a4a43613721 done  
Copying blob ff6f434a470a done  
Copying config 53d1e01dae done  
Writing manifest to image destination
Storing signatures
53d1e01dae0c44c45f36e72d2d1f0fa91069c147bbd9d2971335ecf2ca93b446
Spawning a container 'toolbox-core' with image 'registry.redhat.io/rhel8/support-tools'
Detected RUN label in the container image. Using that as the default...
command: podman run -it --name toolbox-core --privileged --ipc=host --net=host --pid=host -e HOST=/host -e NAME=toolbox-core -e IMAGE=registry.redhat.io/rhel8/support-tools:latest -v /run:/run -v /var/log:/var/log -v /etc/machine-id:/etc/machine-id -v /etc/localtime:/etc/localtime -t
[root@ibm-p8-kvm-03-guest-02 /]# exit
exit

$ sudo podman ps -a
CONTAINER ID  IMAGE                                          COMMAND        CREATED         STATUS                    PORTS  NAMES
61b2039091a8  registry.redhat.io/rhel8/support-tools:latest  /usr/bin/bash  46 seconds ago  Exited (0) 6 seconds ago         toolbox-core

$ sudo podman rm toolbox-core
61b2039091a8e3c9de231b8e582b341169529a5eef54da2cb62228af8bf7e064

$ toolbox
Spawning a container 'toolbox-core' with image 'registry.redhat.io/rhel8/support-tools'
Detected RUN label in the container image. Using that as the default...
command: podman run -it --name toolbox-core --privileged --ipc=host --net=host --pid=host -e HOST=/host -e NAME=toolbox-core -e IMAGE=registry.redhat.io/rhel8/support-tools:latest -v /run:/run -v /var/log:/var/log -v /etc/machine-id:/etc/machine-id -v /etc/localtime:/etc/localtime -t
[root@ibm-p8-kvm-03-guest-02 /]# exit
exit
$
```

Comment 8 Alex Jia 2020-03-24 07:58:36 UTC
Verified in toolbox-0.0.7-1.module+el8.2.0+6096+9c3f08f3.noarch.


[root@hpe-dl380pgen8-02-vm-14 ~]# podman ps -a
CONTAINER ID  IMAGE                                          COMMAND        CREATED         STATUS                    PORTS  NAMES
a7e55f683190  registry.redhat.io/rhel8/support-tools:latest  /usr/bin/bash  27 seconds ago  Exited (0) 5 seconds ago         toolbox-root

[root@hpe-dl380pgen8-02-vm-14 ~]# toolbox
Container 'toolbox-root' already exists. Trying to start...
(To remove the container and start with a fresh toolbox, run: sudo podman rm 'toolbox-root')
toolbox-root
Container started successfully. To exit, type 'exit'.
sh-4.4# ls
bin   dev  home  lib	lost+found  mnt  proc  run   srv  tmp  var
boot  etc  host  lib64	media	    opt  root  sbin  sys  usr
sh-4.4# pwd
/
sh-4.4# exit
exit

[root@hpe-dl380pgen8-02-vm-14 ~]# echo $?
0

Comment 10 errata-xmlrpc 2020-04-28 15:53:37 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHSA-2020:1650

