Bug 2177611 - [Regression] podman-4.4.0-1.el9 breaks checkpoint/restore
Summary: [Regression] podman-4.4.0-1.el9 breaks checkpoint/restore
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Enterprise Linux 9
Classification: Red Hat
Component: podman
Version: 9.3
Hardware: Unspecified
OS: Unspecified
Priority: unspecified
Severity: high
Target Milestone: rc
Target Release: ---
Assignee: Jindrich Novy
QA Contact: Alex Jia
URL:
Whiteboard:
Depends On:
Blocks: 2179449 2179450
 
Reported: 2023-03-13 06:59 UTC by Chao Ye
Modified: 2023-11-07 09:54 UTC
CC List: 12 users

Fixed In Version: podman-4.5.1-2.el9
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Clones: 2179449 2179450
Environment:
Last Closed: 2023-11-07 08:33:59 UTC
Type: Bug
Target Upstream Version:
Embargoed:


Attachments: none


Links
System ID Private Priority Status Summary Last Updated
Github containers podman pull 17755 0 None Merged Use append() to add elements to a slice (restore) 2023-03-20 09:26:53 UTC
Red Hat Issue Tracker RHELPLAN-151529 0 None None None 2023-03-13 07:00:46 UTC
Red Hat Product Errata RHSA-2023:6474 0 None None None 2023-11-07 08:35:28 UTC

Description Chao Ye 2023-03-13 06:59:05 UTC
Description of problem:
`podman container restore` aborts with a Go runtime panic ("index out of range [0] with length 0") on podman-4.4.0-1.el9. The same checkpoint/restore sequence works on podman-4.2.0-11.el9_1.

Working run with podman-4.2.0-11.el9_1
=========================================================================
[root@hp-dl360g9-23-vm-02 ~]# uname -a
Linux hp-dl360g9-23-vm-02.rhts.eng.pek2.redhat.com 5.14.0-284.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Feb 27 20:08:54 EST 2023 x86_64 x86_64 x86_64 GNU/Linux
[root@hp-dl360g9-23-vm-02 ~]# rpm -q kernel podman criu
kernel-5.14.0-284.el9.x86_64
podman-4.2.0-11.el9_1.x86_64
criu-3.17-4.el9.x86_64
[root@hp-dl360g9-23-vm-02 ~]# podman run -d ubi8 sleep 300
db92a7c0adf2dd730ed629ca616daeead8ca16c56779ce64d09be3a67eb38b32
[root@hp-dl360g9-23-vm-02 ~]# podman container ps -a
CONTAINER ID  IMAGE                                   COMMAND     CREATED        STATUS            PORTS       NAMES
db92a7c0adf2  registry.access.redhat.com/ubi8:latest  sleep 300   6 seconds ago  Up 6 seconds ago              gallant_mahavira
[root@hp-dl360g9-23-vm-02 ~]# podman container checkpoint -l
db92a7c0adf2dd730ed629ca616daeead8ca16c56779ce64d09be3a67eb38b32
[root@hp-dl360g9-23-vm-02 ~]# podman container ps -a
CONTAINER ID  IMAGE                                   COMMAND     CREATED         STATUS                    PORTS       NAMES
db92a7c0adf2  registry.access.redhat.com/ubi8:latest  sleep 300   12 seconds ago  Exited (0) 2 seconds ago              gallant_mahavira
[root@hp-dl360g9-23-vm-02 ~]# podman container restore -l
db92a7c0adf2dd730ed629ca616daeead8ca16c56779ce64d09be3a67eb38b32
[root@hp-dl360g9-23-vm-02 ~]# podman container ps -a
CONTAINER ID  IMAGE                                   COMMAND     CREATED         STATUS             PORTS       NAMES
db92a7c0adf2  registry.access.redhat.com/ubi8:latest  sleep 300   19 seconds ago  Up 19 seconds ago              gallant_mahavira



Failing run with podman-4.4.0-1.el9
=========================================================================
[root@hp-dl360g9-23-vm-02 ~]# uname -a
Linux hp-dl360g9-23-vm-02.rhts.eng.pek2.redhat.com 5.14.0-284.el9.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Feb 27 20:08:54 EST 2023 x86_64 x86_64 x86_64 GNU/Linux
[root@hp-dl360g9-23-vm-02 ~]# rpm -q kernel podman criu
kernel-5.14.0-284.el9.x86_64
podman-4.4.0-1.el9.x86_64
criu-3.17-4.el9.x86_64
[root@hp-dl360g9-23-vm-02 ~]# podman run -d ubi8 sleep 300
7348f15d7e9dddf47ccfbabdfe0da6f8dcf28b865a17cb2d590b52317602cdfe
[root@hp-dl360g9-23-vm-02 ~]# podman container checkpoint -l
7348f15d7e9dddf47ccfbabdfe0da6f8dcf28b865a17cb2d590b52317602cdfe
[root@hp-dl360g9-23-vm-02 ~]# podman container restore -l
panic: runtime error: index out of range [0] with length 0

goroutine 1 [running]:
panic({0x55ca67ca90e0, 0xc0003ef3f8})
        /usr/lib/golang/src/runtime/panic.go:987 +0x3ba fp=0xc00069f670 sp=0xc00069f5b0 pc=0x55ca664d279a
runtime.goPanicIndex(0x0, 0x0)
        /usr/lib/golang/src/runtime/panic.go:113 +0x7f fp=0xc00069f6b0 sp=0xc00069f670 pc=0x55ca664d06ff
github.com/containers/podman/pkg/domain/infra/abi.(*ContainerEngine).ContainerRestore(0xc000014ed8, {0x55ca67d92c90, 0xc0000420c0}, {0x55ca68ac2880, 0x0, 0x0}, {0x0, 0x0, 0x0, 0x0, ...})
        /builddir/build/BUILD/podman-3443f453e28169a88848f90a7ce3137fc4a4bebf/_build/src/github.com/containers/podman/pkg/domain/infra/abi/containers.go:676 +0x39c fp=0xc00069fac8 sp=0xc00069f6b0 pc=0x55ca6740347c
github.com/containers/podman/cmd/podman/containers.restore(0x55ca6898a7c0?, {0xc00074c980, 0x0, 0x1?})
        /builddir/build/BUILD/podman-3443f453e28169a88848f90a7ce3137fc4a4bebf/_build/src/github.com/containers/podman/cmd/podman/containers/restore.go:171 +0x4ef fp=0xc00069fcd0 sp=0xc00069fac8 pc=0x55ca6754ef2f
github.com/containers/podman/vendor/github.com/spf13/cobra.(*Command).execute(0x55ca6898a7c0, {0xc0000400b0, 0x1, 0x1})
        /builddir/build/BUILD/podman-3443f453e28169a88848f90a7ce3137fc4a4bebf/_build/src/github.com/containers/podman/vendor/github.com/spf13/cobra/command.go:916 +0x862 fp=0xc00069fe08 sp=0xc00069fcd0 pc=0x55ca66a49ac2
github.com/containers/podman/vendor/github.com/spf13/cobra.(*Command).ExecuteC(0x55ca689a9640)
        /builddir/build/BUILD/podman-3443f453e28169a88848f90a7ce3137fc4a4bebf/_build/src/github.com/containers/podman/vendor/github.com/spf13/cobra/command.go:1044 +0x3bd fp=0xc00069fec0 sp=0xc00069fe08 pc=0x55ca66a4a33d
github.com/containers/podman/vendor/github.com/spf13/cobra.(*Command).Execute(...)
        /builddir/build/BUILD/podman-3443f453e28169a88848f90a7ce3137fc4a4bebf/_build/src/github.com/containers/podman/vendor/github.com/spf13/cobra/command.go:968
github.com/containers/podman/vendor/github.com/spf13/cobra.(*Command).ExecuteContext(...)
        /builddir/build/BUILD/podman-3443f453e28169a88848f90a7ce3137fc4a4bebf/_build/src/github.com/containers/podman/vendor/github.com/spf13/cobra/command.go:961
main.Execute()
        /builddir/build/BUILD/podman-3443f453e28169a88848f90a7ce3137fc4a4bebf/_build/src/github.com/containers/podman/cmd/podman/root.go:107 +0xcc fp=0xc00069ff50 sp=0xc00069fec0 pc=0x55ca6766366c
main.main()
        /builddir/build/BUILD/podman-3443f453e28169a88848f90a7ce3137fc4a4bebf/_build/src/github.com/containers/podman/cmd/podman/main.go:40 +0x7c fp=0xc00069ff80 sp=0xc00069ff50 pc=0x55ca67662abc
runtime.main()
        /usr/lib/golang/src/runtime/proc.go:250 +0x213 fp=0xc00069ffe0 sp=0xc00069ff80 pc=0x55ca664d54f3
runtime.goexit()
        /usr/lib/golang/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc00069ffe8 sp=0xc00069ffe0 pc=0x55ca66507b01

goroutine 2 [force gc (idle)]:
runtime.gopark(0x0?, 0x0?, 0x0?, 0x0?, 0x0?)
        /usr/lib/golang/src/runtime/proc.go:363 +0xd6 fp=0xc000066fb0 sp=0xc000066f90 pc=0x55ca664d58b6
runtime.goparkunlock(...)
        /usr/lib/golang/src/runtime/proc.go:369
runtime.forcegchelper()
        /usr/lib/golang/src/runtime/proc.go:302 +0xad fp=0xc000066fe0 sp=0xc000066fb0 pc=0x55ca664d574d
runtime.goexit()
        /usr/lib/golang/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc000066fe8 sp=0xc000066fe0 pc=0x55ca66507b01
created by runtime.init.7
        /usr/lib/golang/src/runtime/proc.go:290 +0x25

goroutine 3 [runnable]:
runtime.Gosched(...)
        /usr/lib/golang/src/runtime/proc.go:318
runtime.bgsweep(0x0?)
        /usr/lib/golang/src/runtime/mgcsweep.go:283 +0xfc fp=0xc0000677c8 sp=0xc000067790 pc=0x55ca664c01dc
runtime.gcenable.func1()
        /usr/lib/golang/src/runtime/mgc.go:178 +0x26 fp=0xc0000677e0 sp=0xc0000677c8 pc=0x55ca664b4e06
runtime.goexit()
        /usr/lib/golang/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc0000677e8 sp=0xc0000677e0 pc=0x55ca66507b01
created by runtime.gcenable
        /usr/lib/golang/src/runtime/mgc.go:178 +0x6b

goroutine 4 [runnable]:
runtime.gopark(0xc000088000?, 0x55ca6797d820?, 0x0?, 0x0?, 0x0?)
        /usr/lib/golang/src/runtime/proc.go:363 +0xd6 fp=0xc000067f70 sp=0xc000067f50 pc=0x55ca664d58b6
runtime.goparkunlock(...)
        /usr/lib/golang/src/runtime/proc.go:369
runtime.(*scavengerState).park(0x55ca68a8b300)
        /usr/lib/golang/src/runtime/mgcscavenge.go:389 +0x53 fp=0xc000067fa0 sp=0xc000067f70 pc=0x55ca664be213
runtime.bgscavenge(0x0?)
        /usr/lib/golang/src/runtime/mgcscavenge.go:622 +0x65 fp=0xc000067fc8 sp=0xc000067fa0 pc=0x55ca664be805
runtime.gcenable.func2()
        /usr/lib/golang/src/runtime/mgc.go:179 +0x26 fp=0xc000067fe0 sp=0xc000067fc8 pc=0x55ca664b4da6
runtime.goexit()
        /usr/lib/golang/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc000067fe8 sp=0xc000067fe0 pc=0x55ca66507b01
created by runtime.gcenable
        /usr/lib/golang/src/runtime/mgc.go:179 +0xaa

goroutine 5 [finalizer wait]:
runtime.gopark(0x55ca68a8d320?, 0xc000007860?, 0x0?, 0x0?, 0xc000066770?)
        /usr/lib/golang/src/runtime/proc.go:363 +0xd6 fp=0xc000066628 sp=0xc000066608 pc=0x55ca664d58b6
runtime.goparkunlock(...)
        /usr/lib/golang/src/runtime/proc.go:369
runtime.runfinq()
        /usr/lib/golang/src/runtime/mfinal.go:180 +0x10f fp=0xc0000667e0 sp=0xc000066628 pc=0x55ca664b3e8f
runtime.goexit()
        /usr/lib/golang/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc0000667e8 sp=0xc0000667e0 pc=0x55ca66507b01
created by runtime.createfing
        /usr/lib/golang/src/runtime/mfinal.go:157 +0x45

goroutine 6 [GC worker (idle)]:
runtime.gopark(0x44c1a0045c6?, 0x0?, 0x0?, 0x0?, 0x0?)
        /usr/lib/golang/src/runtime/proc.go:363 +0xd6 fp=0xc000068750 sp=0xc000068730 pc=0x55ca664d58b6
runtime.gcBgMarkWorker()
        /usr/lib/golang/src/runtime/mgc.go:1235 +0xf1 fp=0xc0000687e0 sp=0xc000068750 pc=0x55ca664b6f71
runtime.goexit()
        /usr/lib/golang/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc0000687e8 sp=0xc0000687e0 pc=0x55ca66507b01
created by runtime.gcBgMarkStartWorkers
        /usr/lib/golang/src/runtime/mgc.go:1159 +0x25

goroutine 7 [runnable, locked to thread]:
runtime.gopark(0x1?, 0x2?, 0x0?, 0x0?, 0x0?)
        /usr/lib/golang/src/runtime/proc.go:363 +0xd6 fp=0xc000063ea0 sp=0xc000063e80 pc=0x55ca664d58b6
runtime.chansend(0xc000050240, 0xc000063f8f, 0x1, 0x2?)
        /usr/lib/golang/src/runtime/chan.go:259 +0x42c fp=0xc000063f28 sp=0xc000063ea0 pc=0x55ca664a0d8c
runtime.chansend1(0xc000000002?, 0xc000063f98?)
        /usr/lib/golang/src/runtime/chan.go:145 +0x1d fp=0xc000063f58 sp=0xc000063f28 pc=0x55ca664a093d
runtime.ensureSigM.func1()
        /usr/lib/golang/src/runtime/signal_unix.go:1002 +0x15e fp=0xc000063fe0 sp=0xc000063f58 pc=0x55ca664e9dfe
runtime.goexit()
        /usr/lib/golang/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc000063fe8 sp=0xc000063fe0 pc=0x55ca66507b01
created by runtime.ensureSigM
        /usr/lib/golang/src/runtime/signal_unix.go:974 +0xbd

goroutine 8 [syscall]:
runtime.notetsleepg(0x0?, 0x0?)
        /usr/lib/golang/src/runtime/lock_futex.go:236 +0x34 fp=0xc0000647a0 sp=0xc000064768 pc=0x55ca664a7174
os/signal.signal_recv()
        /usr/lib/golang/src/runtime/sigqueue.go:152 +0x2f fp=0xc0000647c0 sp=0xc0000647a0 pc=0x55ca66503faf
os/signal.loop()
        /usr/lib/golang/src/os/signal/signal_unix.go:23 +0x19 fp=0xc0000647e0 sp=0xc0000647c0 pc=0x55ca66866699
runtime.goexit()
        /usr/lib/golang/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc0000647e8 sp=0xc0000647e0 pc=0x55ca66507b01
created by os/signal.Notify.func1.1
        /usr/lib/golang/src/os/signal/signal.go:151 +0x2a

goroutine 9 [runnable]:
github.com/containers/podman/libpod/shutdown.Start.func1()
        /builddir/build/BUILD/podman-3443f453e28169a88848f90a7ce3137fc4a4bebf/_build/src/github.com/containers/podman/libpod/shutdown/handler.go:46 fp=0xc000064fe0 sp=0xc000064fd8 pc=0x55ca67226cc0
runtime.goexit()
        /usr/lib/golang/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc000064fe8 sp=0xc000064fe0 pc=0x55ca66507b01
created by github.com/containers/podman/libpod/shutdown.Start
        /builddir/build/BUILD/podman-3443f453e28169a88848f90a7ce3137fc4a4bebf/_build/src/github.com/containers/podman/libpod/shutdown/handler.go:46 +0xe7

goroutine 10 [runnable]:
github.com/containers/podman/libpod.(*Runtime).libimageEvents.func2()
        /builddir/build/BUILD/podman-3443f453e28169a88848f90a7ce3137fc4a4bebf/_build/src/github.com/containers/podman/libpod/runtime.go:722 fp=0xc0000657e0 sp=0xc0000657d8 pc=0x55ca67375dc0
runtime.goexit()
        /usr/lib/golang/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc0000657e8 sp=0xc0000657e0 pc=0x55ca66507b01
created by github.com/containers/podman/libpod.(*Runtime).libimageEvents
        /builddir/build/BUILD/podman-3443f453e28169a88848f90a7ce3137fc4a4bebf/_build/src/github.com/containers/podman/libpod/runtime.go:722 +0x10d

goroutine 11 [runnable]:
github.com/containers/podman/libpod.(*Runtime).startWorker.func1()
        /builddir/build/BUILD/podman-3443f453e28169a88848f90a7ce3137fc4a4bebf/_build/src/github.com/containers/podman/libpod/runtime_worker.go:5 fp=0xc000065fe0 sp=0xc000065fd8 pc=0x55ca67391000
runtime.goexit()
        /usr/lib/golang/src/runtime/asm_amd64.s:1594 +0x1 fp=0xc000065fe8 sp=0xc000065fe0 pc=0x55ca66507b01
created by github.com/containers/podman/libpod.(*Runtime).startWorker
        /builddir/build/BUILD/podman-3443f453e28169a88848f90a7ce3137fc4a4bebf/_build/src/github.com/containers/podman/libpod/runtime_worker.go:5 +0x96
Aborted (core dumped)
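The upstream fix linked above ("Use append() to add elements to a slice (restore)") points at the usual cause of this panic class: writing to index 0 of a slice that was created with length 0. A minimal Go sketch of the failing pattern and the append-based fix; this is illustrative only, not podman's actual restore code, and the names restoreIDs and buggyFirst are hypothetical:

```go
package main

import "fmt"

// restoreIDs mirrors the fixed pattern: grow the result with append
// instead of assigning into a zero-length slice.
func restoreIDs(ids []string) []string {
	out := make([]string, 0, len(ids))
	for _, id := range ids {
		out = append(out, id) // append allocates room as needed
	}
	return out
}

// buggyFirst shows the failing pattern: out has length 0, so out[0]
// panics with "index out of range [0] with length 0", matching the
// traceback above. The panic is recovered and returned as an error.
func buggyFirst(ids []string) (id string, err error) {
	defer func() {
		if r := recover(); r != nil {
			err = fmt.Errorf("panic: %v", r)
		}
	}()
	out := make([]string, 0)
	out[0] = ids[0] // index out of range: length is 0
	return out[0], nil
}

func main() {
	fmt.Println(restoreIDs([]string{"7348f15d"}))
	if _, err := buggyFirst([]string{"7348f15d"}); err != nil {
		fmt.Println("buggy pattern:", err)
	}
}
```

In podman's case the panic was not recovered, so the whole CLI process aborted and dumped core instead of reporting a restore error.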


Version-Release number of selected component (if applicable):
podman-4.4.0-1.el9

How reproducible:
100%

Steps to Reproduce:
1. podman run -d ubi8 sleep 300
2. podman container checkpoint -l
3. podman container restore -l

Actual results:
`podman container restore -l` panics with "index out of range [0] with length 0" and the process aborts (core dumped); the container is not restored.

Expected results:
The container is restored from the checkpoint and returns to the "Up" state, as it does with podman-4.2.0-11.el9_1.

Additional info:

Comment 11 Alex Jia 2023-04-02 07:38:15 UTC
This bug has been verified on podman-4.4.1-6.el9.x86_64.

[root@kvm-03-guest15 ~]# cat /etc/redhat-release 
Red Hat Enterprise Linux release 9.3 Beta (Plow)

[root@kvm-03-guest15 ~]# rpm -q podman crun criu kernel
podman-4.4.1-6.el9.x86_64
crun-1.8.3-1.el9.x86_64
criu-3.17-4.el9.x86_64
kernel-5.14.0-289.el9.x86_64

[root@kvm-03-guest15 ~]# podman run -td ubi8 sleep 180
Resolved "ubi8" as an alias (/etc/containers/registries.conf.d/001-rhel-shortnames.conf)
Trying to pull registry.access.redhat.com/ubi8:latest...
Getting image source signatures
Checking if image destination supports signatures
Copying blob c4877503c8d2 done  
Copying config 36660eab1e done  
Writing manifest to image destination
Storing signatures
19307454b25efe17de25fb0a336333577b8d4949431fd7b7f5806ddcdca46917
[root@kvm-03-guest15 ~]# podman ps
CONTAINER ID  IMAGE                                   COMMAND     CREATED        STATUS        PORTS       NAMES
19307454b25e  registry.access.redhat.com/ubi8:latest  sleep 180   3 seconds ago  Up 3 seconds              festive_zhukovsky
[root@kvm-03-guest15 ~]# podman container checkpoint -l
19307454b25efe17de25fb0a336333577b8d4949431fd7b7f5806ddcdca46917
[root@kvm-03-guest15 ~]# podman ps -a
CONTAINER ID  IMAGE                                   COMMAND     CREATED         STATUS                    PORTS       NAMES
19307454b25e  registry.access.redhat.com/ubi8:latest  sleep 180   39 seconds ago  Exited (0) 5 seconds ago              festive_zhukovsky
[root@kvm-03-guest15 ~]# podman container inspect --format '{{.State.Status}}:{{.State.Running}}:{{.State.Paused}}:{{.State.Checkpointed}}' festive_zhukovsky
exited:false:false:true
[root@kvm-03-guest15 ~]# podman container restore -l
19307454b25efe17de25fb0a336333577b8d4949431fd7b7f5806ddcdca46917
[root@kvm-03-guest15 ~]# podman ps
CONTAINER ID  IMAGE                                   COMMAND     CREATED             STATUS             PORTS       NAMES
19307454b25e  registry.access.redhat.com/ubi8:latest  sleep 180   About a minute ago  Up About a minute              festive_zhukovsky
[root@kvm-03-guest15 ~]# podman container inspect --format '{{.State.Status}}:{{.State.Running}}:{{.State.Paused}}:{{.State.Checkpointed}}' festive_zhukovsky
running:true:false:false

Comment 14 Alex Jia 2023-06-25 03:24:30 UTC
This bug has been verified on both podman-4.5.1-2.el9 and podman-4.5.1-4.el9.

Comment 16 errata-xmlrpc 2023-11-07 08:33:59 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (Moderate: podman security, bug fix, and enhancement update), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHSA-2023:6474

