Bug 1815460 - [container-tools:1.0] Failed to start existing container
Summary: [container-tools:1.0] Failed to start existing container
Keywords:
Status: CLOSED WONTFIX
Alias: None
Product: Red Hat Enterprise Linux 8
Classification: Red Hat
Component: toolbox
Version: 8.1
Hardware: x86_64
OS: Linux
Priority: high
Severity: high
Target Milestone: rc
Target Release: 8.1
Assignee: Yu Qi Zhang
QA Contact: Martin Jenner
URL:
Whiteboard:
Depends On: 1811514 1816287
Blocks:
 
Reported: 2020-03-20 10:39 UTC by Alex Jia
Modified: 2020-03-24 17:30 UTC
CC List: 4 users

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of: 1811514
Environment:
Last Closed: 2020-03-24 17:30:53 UTC
Type: Bug
Target Upstream Version:
Embargoed:




Links
Red Hat Bugzilla 1803112 (priority: low, status: CLOSED): cannot interact with toolbox container after first execution (last updated 2021-02-22 00:41:40 UTC)

Description Alex Jia 2020-03-20 10:39:49 UTC
The same issue occurs with toolbox-0.0.4-1.module+el8.1.1+4407+ac444e5d.x86_64.

+++ This bug was initially created as a clone of Bug #1811514 +++

Description of problem:
The existing container fails to start, and no commands can be entered in the terminal window.

Version-Release number of selected component (if applicable):

[root@hpe-dl380pgen8-02-vm-5 ~]# cat /etc/redhat-release 
Red Hat Enterprise Linux release 8.2 Beta (Ootpa)

[root@hpe-dl380pgen8-02-vm-5 ~]# rpm -q toolbox podman buildah runc skopeo kernel
toolbox-0.0.4-1.module+el8.2.0+4897+4dbe4c2c.x86_64
podman-1.6.4-6.module+el8.2.0+5825+4c03a97d.x86_64
buildah-1.11.6-7.module+el8.2.0+5856+b8046c6d.x86_64
runc-1.0.0-65.rc10.module+el8.2.0+5762+aaee29fb.x86_64
skopeo-0.1.40-9.module+el8.2.0+5762+aaee29fb.x86_64
kernel-4.18.0-185.el8.x86_64

How reproducible:
always

Steps to Reproduce:
1. toolbox then exit
2. toolbox again

Actual results:

[root@hpe-dl380pgen8-02-vm-5 ~]# toolbox
Trying to pull registry.redhat.io/rhel8/support-tools...
Getting image source signatures
Copying blob 0a4a43613721 done
Copying blob ff6f434a470a done
Copying blob eae5d284042d done
Copying config 53d1e01dae done
Writing manifest to image destination
Storing signatures
53d1e01dae0c44c45f36e72d2d1f0fa91069c147bbd9d2971335ecf2ca93b446
Spawning a container 'toolbox-root' with image 'registry.redhat.io/rhel8/support-tools'
Detected RUN label in the container image. Using that as the default...
command: podman run -it --name toolbox-root --privileged --ipc=host --net=host --pid=host -e HOST=/host -e NAME=toolbox-root -e IMAGE=registry.redhat.io/rhel8/support-tools:latest -v /run:/run -v /var/log:/var/log -v /etc/machine-id:/etc/machine-id -v /etc/localtime:/etc/localtime -v /:/host registry.redhat.io/rhel8/support-tools:latest
[root@hpe-dl380pgen8-02-vm-5 /]# pwd
/
[root@hpe-dl380pgen8-02-vm-5 /]# ls
bin  boot  dev	etc  home  host  lib  lib64  lost+found  media	mnt  opt  proc	root  run  sbin  srv  sys  tmp	usr  var
[root@hpe-dl380pgen8-02-vm-5 /]# exit
exit
[root@hpe-dl380pgen8-02-vm-5 ~]# echo $?
0
[root@hpe-dl380pgen8-02-vm-5 ~]# podman images
REPOSITORY                               TAG      IMAGE ID       CREATED       SIZE
registry.redhat.io/rhel8/support-tools   latest   53d1e01dae0c   5 weeks ago   271 MB
[root@hpe-dl380pgen8-02-vm-5 ~]# podman ps -a
CONTAINER ID  IMAGE                                          COMMAND        CREATED         STATUS                     PORTS  NAMES
a92e0e3814e1  registry.redhat.io/rhel8/support-tools:latest  /usr/bin/bash  39 seconds ago  Exited (0) 27 seconds ago         toolbox-root
[root@hpe-dl380pgen8-02-vm-5 ~]# toolbox
Container 'toolbox-root' already exists. Trying to start...
(To remove the container and start with a fresh toolbox, run: sudo podman rm 'toolbox-root')
toolbox-root
Container started successfully. To exit, type 'exit'.
sh-4.4# 

NOTE: no command input is accepted; the terminal window is unresponsive.


Expected results:


Additional info:

(gdb) bt
#0  0x00007efeec92af4b in __GI___waitpid (pid=pid@entry=-1, stat_loc=stat_loc@entry=0x7ffe74227b40, options=options@entry=0) at ../sysdeps/unix/sysv/linux/waitpid.c:30
#1  0x0000555e62a01649 in waitchld (block=block@entry=1, wpid=24715) at jobs.c:3475
#2  0x0000555e62a02cd3 in wait_for (pid=24715) at jobs.c:2718
#3  0x0000555e629f16f2 in execute_command_internal (command=0x555e641f8bb0, asynchronous=0, pipe_in=-1, pipe_out=-1, fds_to_close=<optimized out>) at execute_cmd.c:865
#4  0x0000555e629efab5 in execute_command_internal (command=0x555e641f5510, asynchronous=0, pipe_in=-1, pipe_out=-1, fds_to_close=0x555e641ef450) at execute_cmd.c:980
#5  0x0000555e629f2a79 in execute_function (var=var@entry=0x555e641ea860, flags=flags@entry=0, fds_to_close=fds_to_close@entry=0x555e641ef450, async=async@entry=0, subshell=subshell@entry=0, words=<optimized out>) at execute_cmd.c:4810
#6  0x0000555e629ef124 in execute_builtin_or_function (flags=<optimized out>, fds_to_close=0x555e641ef450, redirects=<optimized out>, var=0x555e641ea860, builtin=0x0, words=0x555e641f60f0) at execute_cmd.c:5045
#7  execute_simple_command (simple_command=<optimized out>, pipe_in=<optimized out>, pipe_in@entry=-1, pipe_out=pipe_out@entry=-1, async=async@entry=0, fds_to_close=fds_to_close@entry=0x555e641ef450) at execute_cmd.c:4345
#8  0x0000555e629f01a6 in execute_command_internal (command=0x555e641f50c0, asynchronous=0, pipe_in=-1, pipe_out=-1, fds_to_close=0x555e641ef450) at execute_cmd.c:819
#9  0x0000555e629f0ddc in execute_connection (fds_to_close=0x555e641ef450, pipe_out=-1, pipe_in=-1, asynchronous=0, command=0x555e641f1570) at execute_cmd.c:2615
#10 execute_command_internal (command=0x555e641f1570, asynchronous=0, pipe_in=-1, pipe_out=-1, fds_to_close=0x555e641ef450) at execute_cmd.c:988
#11 0x0000555e629efab5 in execute_command_internal (command=0x555e641f1380, asynchronous=0, pipe_in=-1, pipe_out=-1, fds_to_close=0x555e641ef450) at execute_cmd.c:980
#12 0x0000555e629f2a79 in execute_function (var=var@entry=0x555e641e5920, flags=flags@entry=0, fds_to_close=fds_to_close@entry=0x555e641ef450, async=async@entry=0, subshell=subshell@entry=0, words=<optimized out>) at execute_cmd.c:4810
#13 0x0000555e629ef124 in execute_builtin_or_function (flags=<optimized out>, fds_to_close=0x555e641ef450, redirects=<optimized out>, var=0x555e641e5920, builtin=0x0, words=0x555e641f1a60) at execute_cmd.c:5045
#14 execute_simple_command (simple_command=<optimized out>, pipe_in=<optimized out>, pipe_in@entry=-1, pipe_out=pipe_out@entry=-1, async=async@entry=0, fds_to_close=fds_to_close@entry=0x555e641ef450) at execute_cmd.c:4345
#15 0x0000555e629f01a6 in execute_command_internal (command=0x555e641f0cc0, asynchronous=0, pipe_in=-1, pipe_out=-1, fds_to_close=0x555e641ef450) at execute_cmd.c:819
#16 0x0000555e629f0ddc in execute_connection (fds_to_close=0x555e641ef450, pipe_out=-1, pipe_in=-1, asynchronous=0, command=0x555e641ef7f0) at execute_cmd.c:2615
#17 execute_command_internal (command=0x555e641ef7f0, asynchronous=0, pipe_in=-1, pipe_out=-1, fds_to_close=0x555e641ef450) at execute_cmd.c:988
#18 0x0000555e629f1a56 in execute_command (command=0x555e641ef7f0) at execute_cmd.c:409
#19 0x0000555e629f0da9 in execute_connection (fds_to_close=0x555e641f0a40, pipe_out=-1, pipe_in=-1, asynchronous=0, command=0x555e641ef770) at execute_cmd.c:2613
#20 execute_command_internal (command=0x555e641ef770, asynchronous=0, pipe_in=-1, pipe_out=-1, fds_to_close=0x555e641f0a40) at execute_cmd.c:988
#21 0x0000555e629efab5 in execute_command_internal (command=0x555e641ef700, asynchronous=0, pipe_in=-1, pipe_out=-1, fds_to_close=0x555e641f0a40) at execute_cmd.c:980
#22 0x0000555e629f2a79 in execute_function (var=var@entry=0x555e641edb70, flags=flags@entry=0, fds_to_close=fds_to_close@entry=0x555e641f0a40, async=async@entry=0, subshell=subshell@entry=0, words=<optimized out>) at execute_cmd.c:4810
#23 0x0000555e629ef124 in execute_builtin_or_function (flags=<optimized out>, fds_to_close=0x555e641f0a40, redirects=<optimized out>, var=0x555e641edb70, builtin=0x0, words=0x555e641e8cd0) at execute_cmd.c:5045
#24 execute_simple_command (simple_command=<optimized out>, pipe_in=<optimized out>, pipe_in@entry=-1, pipe_out=pipe_out@entry=-1, async=async@entry=0, fds_to_close=fds_to_close@entry=0x555e641f0a40) at execute_cmd.c:4345
#25 0x0000555e629f01a6 in execute_command_internal (command=0x555e641ee560, asynchronous=0, pipe_in=-1, pipe_out=-1, fds_to_close=0x555e641f0a40) at execute_cmd.c:819
#26 0x0000555e629f1a56 in execute_command (command=0x555e641ee560) at execute_cmd.c:409
#27 0x0000555e629d93d9 in reader_loop () at eval.c:181
#28 0x0000555e629d7b9b in main (argc=2, argv=0x7ffe74228bd8, env=0x7ffe74228bf0) at shell.c:802
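
The backtrace above shows bash parked in waitpid() (frame #0), reached through its waitchld() job-control path, waiting for the foreground child to exit. The same blocking mechanism can be illustrated with bash's `wait` builtin; this sketch only demonstrates the mechanism the trace shows, not the toolbox bug itself:

```shell
#!/bin/bash
# bash's `wait` drives the same waitchld()/waitpid() path seen in the
# backtrace: the shell sleeps in waitpid() until the child changes state.
sleep 1 &               # stand-in for the attached container session
child=$!
wait "$child"           # blocks here until the child exits
echo "child reaped, status=$?"
```

If the attached container session never exits (or never delivers input/output), bash remains blocked in that waitpid() indefinitely, which is consistent with the dead terminal described above.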

Comment 1 Alex Jia 2020-03-20 10:47:40 UTC
The hang also occurs with toolbox-0.0.4-1.module+el8.1.0+4081+b29780af.x86_64.

Comment 4 Micah Abbott 2020-03-24 17:30:22 UTC
Per the discussion around this BZ, it will not be fixed as part of 8.1.z.

Users are encouraged to use updated versions of the `container-tools:rhel8` or `container-tools:2.0` modules.
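Switching module streams is done with dnf's module subcommands; a sketch of moving from container-tools:1.0 to the rolling rhel8 stream (run as root; substitute 2.0 for rhel8 if you want that stream instead):

```shell
# Clear the current stream selection, then enable and install
# the replacement stream. Requires root on a RHEL 8 host.
dnf -y module reset container-tools
dnf -y module enable container-tools:rhel8
dnf -y module install container-tools:rhel8
```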

