Description of problem:
========================
Even though the brick daemon is not running, the `gluster vol status` command still shows its PID.

Version-Release number of selected component (if applicable):
==================================

How reproducible:

Steps to Reproduce:
=======================
1. Create a volume with two bricks on two different nodes, then kill the brick process on one of the nodes.
2. Observe that `gluster vol status` still shows a PID even though no brick daemon is running.

Actual results:
===============
The status output shows a PID for the brick whose daemon was killed (see Additional info).

Expected results:
===============
Status should show N/A in the PID column for the down brick.

Additional info:
====================
[root@rhs-client38 ~]# gluster vol status vol1
Status of volume: vol1
Gluster process                              TCP Port  RDMA Port  Online  Pid
------------------------------------------------------------------------------
Brick 10.70.33.229:/rajesh2/brick4           49185     0          Y       4291
Brick 10.70.33.235:/rajesh2/brick4           N/A       N/A        N       17125
NFS Server on localhost                      2049      0          Y       17145
Bitrot Daemon on localhost                   N/A       N/A        Y       17150
Scrubber Daemon on localhost                 N/A       N/A        Y       17162
NFS Server on 10.70.33.229                   2049      0          Y       13601
Bitrot Daemon on 10.70.33.229                N/A       N/A        Y       13609
Scrubber Daemon on 10.70.33.229              N/A       N/A        Y       13621

Task Status of Volume vol1
------------------------------------------------------------------------------
There are no active volume tasks

[root@rhs-client38 ~]# ps -aux | grep 17125
Warning: bad syntax, perhaps a bogus '-'? See /usr/share/doc/procps-3.2.8/FAQ
root     17245  0.0  0.0 103252   864 pts/0    S+   02:10   0:00 grep 17125
[root@rhs-client38 ~]#
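For context on the expected fix: a status reporter can verify that a stored PID is actually alive before printing it. A minimal sketch of such a liveness check (an illustration only, not the actual glusterd source; the function name `pid_is_running` is hypothetical) uses `kill(pid, 0)`, which performs permission and existence checking without delivering a signal:

```python
import errno
import os

def pid_is_running(pid):
    """Return True if a process with this PID currently exists.

    Signal 0 sends no signal; the kernel only checks whether the
    target process exists and whether we may signal it.
    """
    try:
        os.kill(pid, 0)
    except OSError as e:
        if e.errno == errno.ESRCH:
            return False          # no such process: show N/A, not the stale PID
        if e.errno == errno.EPERM:
            return True           # process exists but belongs to another user
        raise
    return True

# The current process is always running.
print(pid_is_running(os.getpid()))
```

With a check like this, a brick whose process was killed (e.g. PID 17125 above) would be reported with N/A instead of the stale PID recorded at brick start.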
Upstream patch http://review.gluster.org/#/c/10877/ is available. Once it is merged, I will clone it to RHGS.
Tested with glusterfs-api-3.7.1-1; vol status no longer shows the PID of a non-running brick, so marking this bug as verified.
Shouldn't this be marked as Verified?
Hi Gaurav,

The doc text is updated. Please review it, share your technical review comments, and sign off if it looks OK.

Regards,
Bhavana
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA. For information on the advisory, and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report. https://rhn.redhat.com/errata/RHSA-2015-1495.html