Description of problem:
The NFS server is crashing and moving to the error state. We are using CephFS as the backend storage.

Steps followed:
We encountered this while running our automation script. It fails while running:

for n in {1..20}; do dd if=/dev/urandom of=/mnt/nfs_VA6JX/volumes/_nogroup/subvolume2/6c7638e9-bb94-49a2-a832-703807b264e1/file$(printf %03d $n) bs=500k count=1000; done

NFS logs :
Automation script logs : http://magna002.ceph.redhat.com/cephci-jenkins/cephci-run-S0SH7I/cephfs_nfs_snapshot_clone_operations_0.log
NFS server logs : http://magna002.ceph.redhat.com/ceph-qe-logs/amar/BZ_NFS_logs.txt

When first checked, the NFS container had just restarted (up 2 seconds):

[root@ceph-amk-test-o56vqd-node6 edf01e48-21a9-11ee-becc-fa163e6d5609]# podman ps
CONTAINER ID  IMAGE  COMMAND  CREATED  STATUS  PORTS  NAMES
41a4a13349a8  registry-proxy.engineering.redhat.com/rh-osbs/rhceph@sha256:cf8710ef94bf3dcb65b998f90ce0c0ecf80bee8541fe6034f95d252500046cfd  -n osd.2 -f --set...  About an hour ago  Up About an hour  ceph-edf01e48-21a9-11ee-becc-fa163e6d5609-osd-2
8c8a2ae00d8d  registry-proxy.engineering.redhat.com/rh-osbs/rhceph@sha256:cf8710ef94bf3dcb65b998f90ce0c0ecf80bee8541fe6034f95d252500046cfd  -n osd.8 -f --set...  About an hour ago  Up About an hour  ceph-edf01e48-21a9-11ee-becc-fa163e6d5609-osd-8
2e4f25e306fb  registry-proxy.engineering.redhat.com/rh-osbs/rhceph@sha256:cf8710ef94bf3dcb65b998f90ce0c0ecf80bee8541fe6034f95d252500046cfd  -n osd.11 -f --se...  About an hour ago  Up About an hour  ceph-edf01e48-21a9-11ee-becc-fa163e6d5609-osd-11
ea356097720c  registry-proxy.engineering.redhat.com/rh-osbs/rhceph@sha256:cf8710ef94bf3dcb65b998f90ce0c0ecf80bee8541fe6034f95d252500046cfd  -n osd.5 -f --set...  About an hour ago  Up About an hour  ceph-edf01e48-21a9-11ee-becc-fa163e6d5609-osd-5
e4bc52d0a37d  registry-proxy.engineering.redhat.com/rh-osbs/rhceph@sha256:cf8710ef94bf3dcb65b998f90ce0c0ecf80bee8541fe6034f95d252500046cfd  -n mds.cephfs.cep...  59 minutes ago  Up 59 minutes  ceph-edf01e48-21a9-11ee-becc-fa163e6d5609-mds-cephfs-ceph-amk-test-o56vqd-node6-ibqcai
272e4942bc8d  registry-proxy.engineering.redhat.com/rh-osbs/rhceph@sha256:cf8710ef94bf3dcb65b998f90ce0c0ecf80bee8541fe6034f95d252500046cfd  -F -L STDERR -N N...  2 seconds ago  Up 2 seconds  ceph-edf01e48-21a9-11ee-becc-fa163e6d5609-nfs-cephfs-nfs-0-0-ceph-amk-test-o56vqd-node6-uppiec

podman logs for the NFS container printed nothing, and by the next podman ps the container was gone:

[root@ceph-amk-test-o56vqd-node6 edf01e48-21a9-11ee-becc-fa163e6d5609]# podman logs 272e4942bc8d
[root@ceph-amk-test-o56vqd-node6 edf01e48-21a9-11ee-becc-fa163e6d5609]# podman ps
CONTAINER ID  IMAGE  COMMAND  CREATED  STATUS  PORTS  NAMES
41a4a13349a8  registry-proxy.engineering.redhat.com/rh-osbs/rhceph@sha256:cf8710ef94bf3dcb65b998f90ce0c0ecf80bee8541fe6034f95d252500046cfd  -n osd.2 -f --set...  About an hour ago  Up About an hour  ceph-edf01e48-21a9-11ee-becc-fa163e6d5609-osd-2
8c8a2ae00d8d  registry-proxy.engineering.redhat.com/rh-osbs/rhceph@sha256:cf8710ef94bf3dcb65b998f90ce0c0ecf80bee8541fe6034f95d252500046cfd  -n osd.8 -f --set...  About an hour ago  Up About an hour  ceph-edf01e48-21a9-11ee-becc-fa163e6d5609-osd-8
2e4f25e306fb  registry-proxy.engineering.redhat.com/rh-osbs/rhceph@sha256:cf8710ef94bf3dcb65b998f90ce0c0ecf80bee8541fe6034f95d252500046cfd  -n osd.11 -f --se...  About an hour ago  Up About an hour  ceph-edf01e48-21a9-11ee-becc-fa163e6d5609-osd-11
ea356097720c  registry-proxy.engineering.redhat.com/rh-osbs/rhceph@sha256:cf8710ef94bf3dcb65b998f90ce0c0ecf80bee8541fe6034f95d252500046cfd  -n osd.5 -f --set...  About an hour ago  Up About an hour  ceph-edf01e48-21a9-11ee-becc-fa163e6d5609-osd-5
e4bc52d0a37d  registry-proxy.engineering.redhat.com/rh-osbs/rhceph@sha256:cf8710ef94bf3dcb65b998f90ce0c0ecf80bee8541fe6034f95d252500046cfd  -n mds.cephfs.cep...  About an hour ago  Up About an hour  ceph-edf01e48-21a9-11ee-becc-fa163e6d5609-mds-cephfs-ceph-amk-test-o56vqd-node6-ibqcai

From the admin node, the orchestrator initially still showed the NFS daemon running (Ganesha version 5.1, restarted 9m ago):

[root@ceph-amk-test-o56vqd-node9 ~]# ceph orch ps
NAME  HOST  PORTS  STATUS  REFRESHED  AGE  MEM USE  MEM LIM  VERSION  IMAGE ID  CONTAINER ID
mds.cephfs.ceph-amk-test-o56vqd-node3.bjjbwo  ceph-amk-test-o56vqd-node3  running (63m)  3m ago  18h  17.4M  -  17.2.6-96.el9cp  85cb9476225e  870a32dca594
mds.cephfs.ceph-amk-test-o56vqd-node4.xcfydx  ceph-amk-test-o56vqd-node4  running (63m)  3m ago  18h  65.8M  -  17.2.6-96.el9cp  85cb9476225e  618abd182541
mds.cephfs.ceph-amk-test-o56vqd-node5.asejol  ceph-amk-test-o56vqd-node5  running (63m)  3m ago  18h  89.8M  -  17.2.6-96.el9cp  85cb9476225e  bc88ffc160c2
mds.cephfs.ceph-amk-test-o56vqd-node6.ibqcai  ceph-amk-test-o56vqd-node6  running (63m)  9m ago  18h  69.7M  -  17.2.6-96.el9cp  85cb9476225e  e4bc52d0a37d
mds.cephfs.ceph-amk-test-o56vqd-node7.mumwju  ceph-amk-test-o56vqd-node7  running (63m)  6m ago  18h  70.3M  -  17.2.6-96.el9cp  85cb9476225e  b6251c0a6620
mgr.ceph-amk-test-o56vqd-node1-installer.hcqtsu  ceph-amk-test-o56vqd-node1-installer  *:9283  running (18h)  6m ago  18h  408M  -  17.2.6-96.el9cp  85cb9476225e  c9e2cb37570e
mgr.ceph-amk-test-o56vqd-node2.bldjxz  ceph-amk-test-o56vqd-node2  *:8443  running (18h)  6m ago  18h  500M  -  17.2.6-96.el9cp  85cb9476225e  41358a1086f1
mon.ceph-amk-test-o56vqd-node1-installer  ceph-amk-test-o56vqd-node1-installer  running (63m)  6m ago  18h  109M  2048M  17.2.6-96.el9cp  85cb9476225e  f458e2e97bed
mon.ceph-amk-test-o56vqd-node2  ceph-amk-test-o56vqd-node2  running (63m)  6m ago  18h  100M  2048M  17.2.6-96.el9cp  85cb9476225e  c2c25a66ac31
mon.ceph-amk-test-o56vqd-node3  ceph-amk-test-o56vqd-node3  running (63m)  3m ago  18h  104M  2048M  17.2.6-96.el9cp  85cb9476225e  8580c3977a8e
nfs.cephfs-nfs.0.0.ceph-amk-test-o56vqd-node6.uppiec  ceph-amk-test-o56vqd-node6  *:2049  running (9m)  9m ago  72m  15.0M  -  5.1  85cb9476225e  53c4ca2ce000
osd.0  ceph-amk-test-o56vqd-node5  running (70m)  3m ago  18h  492M  4096M  17.2.6-96.el9cp  85cb9476225e  875d3b758333
osd.1  ceph-amk-test-o56vqd-node4  running (72m)  3m ago  18h  470M  4096M  17.2.6-96.el9cp  85cb9476225e  86984e2016f1
osd.2  ceph-amk-test-o56vqd-node6  running (67m)  9m ago  18h  446M  4096M  17.2.6-96.el9cp  85cb9476225e  41a4a13349a8
osd.3  ceph-amk-test-o56vqd-node5  running (69m)  3m ago  18h  459M  4096M  17.2.6-96.el9cp  85cb9476225e  f330e2de7e64
osd.4  ceph-amk-test-o56vqd-node4  running (71m)  3m ago  18h  524M  4096M  17.2.6-96.el9cp  85cb9476225e  946f3fdee22e
osd.5  ceph-amk-test-o56vqd-node6  running (67m)  9m ago  18h  516M  4096M  17.2.6-96.el9cp  85cb9476225e  ea356097720c
osd.6  ceph-amk-test-o56vqd-node5  running (69m)  3m ago  18h  499M  4096M  17.2.6-96.el9cp  85cb9476225e  e98cc06ac9c3
osd.7  ceph-amk-test-o56vqd-node4  running (71m)  3m ago  18h  480M  4096M  17.2.6-96.el9cp  85cb9476225e  01031abcea23
osd.8  ceph-amk-test-o56vqd-node6  running (67m)  9m ago  18h  421M  4096M  17.2.6-96.el9cp  85cb9476225e  8c8a2ae00d8d
osd.9  ceph-amk-test-o56vqd-node5  running (69m)  3m ago  18h  415M  4096M  17.2.6-96.el9cp  85cb9476225e  eb569f668458
osd.10  ceph-amk-test-o56vqd-node4  running (72m)  3m ago  18h  446M  4096M  17.2.6-96.el9cp  85cb9476225e  19513209737c
osd.11  ceph-amk-test-o56vqd-node6  running (67m)  9m ago  18h  385M  4096M  17.2.6-96.el9cp  85cb9476225e  2e4f25e306fb

Shortly afterwards the NFS daemon had moved to the error state:

[root@ceph-amk-test-o56vqd-node9 ~]# ceph orch ps
NAME  HOST  PORTS  STATUS  REFRESHED  AGE  MEM USE  MEM LIM  VERSION  IMAGE ID  CONTAINER ID
mds.cephfs.ceph-amk-test-o56vqd-node3.bjjbwo  ceph-amk-test-o56vqd-node3  running (65m)  4m ago  18h  17.4M  -  17.2.6-96.el9cp  85cb9476225e  870a32dca594
mds.cephfs.ceph-amk-test-o56vqd-node4.xcfydx  ceph-amk-test-o56vqd-node4  running (64m)  4m ago  18h  65.8M  -  17.2.6-96.el9cp  85cb9476225e  618abd182541
mds.cephfs.ceph-amk-test-o56vqd-node5.asejol  ceph-amk-test-o56vqd-node5  running (64m)  4m ago  18h  89.8M  -  17.2.6-96.el9cp  85cb9476225e  bc88ffc160c2
mds.cephfs.ceph-amk-test-o56vqd-node6.ibqcai  ceph-amk-test-o56vqd-node6  running (64m)  15s ago  18h  75.3M  -  17.2.6-96.el9cp  85cb9476225e  e4bc52d0a37d
mds.cephfs.ceph-amk-test-o56vqd-node7.mumwju  ceph-amk-test-o56vqd-node7  running (64m)  7m ago  18h  70.3M  -  17.2.6-96.el9cp  85cb9476225e  b6251c0a6620
mgr.ceph-amk-test-o56vqd-node1-installer.hcqtsu  ceph-amk-test-o56vqd-node1-installer  *:9283  running (18h)  7m ago  18h  408M  -  17.2.6-96.el9cp  85cb9476225e  c9e2cb37570e
mgr.ceph-amk-test-o56vqd-node2.bldjxz  ceph-amk-test-o56vqd-node2  *:8443  running (18h)  7m ago  18h  500M  -  17.2.6-96.el9cp  85cb9476225e  41358a1086f1
mon.ceph-amk-test-o56vqd-node1-installer  ceph-amk-test-o56vqd-node1-installer  running (65m)  7m ago  18h  109M  2048M  17.2.6-96.el9cp  85cb9476225e  f458e2e97bed
mon.ceph-amk-test-o56vqd-node2  ceph-amk-test-o56vqd-node2  running (65m)  7m ago  18h  100M  2048M  17.2.6-96.el9cp  85cb9476225e  c2c25a66ac31
mon.ceph-amk-test-o56vqd-node3  ceph-amk-test-o56vqd-node3  running (65m)  4m ago  18h  104M  2048M  17.2.6-96.el9cp  85cb9476225e  8580c3977a8e
nfs.cephfs-nfs.0.0.ceph-amk-test-o56vqd-node6.uppiec  ceph-amk-test-o56vqd-node6  *:2049  error  15s ago  74m  -  -  <unknown>  <unknown>  <unknown>
osd.0  ceph-amk-test-o56vqd-node5  running (71m)  4m ago  18h  492M  4096M  17.2.6-96.el9cp  85cb9476225e  875d3b758333
osd.1  ceph-amk-test-o56vqd-node4  running (73m)  4m ago  18h  470M  4096M  17.2.6-96.el9cp  85cb9476225e  86984e2016f1
osd.2  ceph-amk-test-o56vqd-node6  running (69m)  15s ago  18h  486M  4096M  17.2.6-96.el9cp  85cb9476225e  41a4a13349a8
osd.3  ceph-amk-test-o56vqd-node5  running (70m)  4m ago  18h  459M  4096M  17.2.6-96.el9cp  85cb9476225e  f330e2de7e64
osd.4  ceph-amk-test-o56vqd-node4  running (72m)  4m ago  18h  524M  4096M  17.2.6-96.el9cp  85cb9476225e  946f3fdee22e
osd.5  ceph-amk-test-o56vqd-node6  running (68m)  15s ago  18h  540M  4096M  17.2.6-96.el9cp  85cb9476225e  ea356097720c
osd.6  ceph-amk-test-o56vqd-node5  running (70m)  4m ago  18h  499M  4096M  17.2.6-96.el9cp  85cb9476225e  e98cc06ac9c3
osd.7  ceph-amk-test-o56vqd-node4  running (73m)  4m ago  18h  480M  4096M  17.2.6-96.el9cp  85cb9476225e  01031abcea23
osd.8  ceph-amk-test-o56vqd-node6  running (68m)  15s ago  18h  436M  4096M  17.2.6-96.el9cp  85cb9476225e  8c8a2ae00d8d
osd.9  ceph-amk-test-o56vqd-node5  running (71m)  4m ago  18h  415M  4096M  17.2.6-96.el9cp  85cb9476225e  eb569f668458
osd.10  ceph-amk-test-o56vqd-node4  running (73m)  4m ago  18h  446M  4096M  17.2.6-96.el9cp  85cb9476225e  19513209737c
osd.11  ceph-amk-test-o56vqd-node6  running (68m)  15s ago  18h  406M  4096M  17.2.6-96.el9cp  85cb9476225e  2e4f25e306fb
[root@ceph-amk-test-o56vqd-node9 ~]#

Version-Release number of selected component (if applicable):

How reproducible:
1/1

Steps to Reproduce:
1.
2.
3.

Actual results:

Expected results:

Additional info:
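For reruns, the failing write loop from the description can be wrapped as a standalone helper. The function name and parameterization below are illustrative; only the dd invocation itself comes from this report:

```shell
# write_files DIR NFILES COUNT: replicate the automation write loop that
# precedes the Ganesha crash -- NFILES files of COUNT x 500 KB random data.
write_files() {
    local dir=$1 nfiles=$2 count=$3
    mkdir -p "$dir"
    for n in $(seq 1 "$nfiles"); do
        dd if=/dev/urandom of="$dir/file$(printf %03d "$n")" \
           bs=500k count="$count" status=none
    done
}

# The failing run was equivalent to:
# write_files /mnt/nfs_VA6JX/volumes/_nogroup/subvolume2/6c7638e9-bb94-49a2-a832-703807b264e1 20 1000
```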
Missed the window for 6.1 z1. Retargeting to 6.1 z2.
What version of Ganesha is running? Do you have a stack backtrace from the crash?
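Until a core or backtrace turns up, the linked NFS server log can be scanned for the fatal event. A minimal helper sketch; the severity keywords below are assumptions about typical Ganesha log lines, not its exact format:

```shell
# scan_ganesha_log FILE: print line-numbered matches for common fatal-looking
# severities (keyword list is an assumption, extend as needed).
scan_ganesha_log() {
    grep -nE 'CRIT|FATAL|Assert|SIGSEGV|SIGABRT' "$1"
}

# Usage: scan_ganesha_log BZ_NFS_logs.txt
```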
I'm going to assume it's the known crash fixed in V5.2. We should have the latest version (V5.4) available soon.
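If this is the pre-5.2 crash, the version check reduces to a simple comparison against the Ganesha version the orchestrator reports (5.1 in the `ceph orch ps` output above). A small illustrative helper, assuming GNU sort's version ordering:

```shell
# needs_update RUNNING FIXED: succeed if RUNNING sorts strictly before FIXED
# in version order (GNU `sort -V`). Function name is illustrative.
needs_update() {
    [ "$(printf '%s\n' "$1" "$2" | sort -V | head -n1)" = "$1" ] && [ "$1" != "$2" ]
}

# e.g.: needs_update 5.1 5.2 && echo "running Ganesha predates the V5.2 fix"
```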