Bug 763451 (GLUSTER-1719) - volume stop reports unsuccessful while it is successful
Summary: volume stop reports unsuccessful while it is successful
Keywords:
Status: CLOSED WORKSFORME
Alias: GLUSTER-1719
Product: GlusterFS
Classification: Community
Component: glusterd
Version: 3.1-alpha
Hardware: All
OS: Linux
Priority: low
Severity: medium
Target Milestone: ---
Assignee: Pranith Kumar K
QA Contact:
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2010-09-28 06:14 UTC by Lakshmipathi G
Modified: 2015-12-01 16:45 UTC
CC: 2 users

Fixed In Version:
Doc Type: Bug Fix
Doc Text:
Clone Of:
Environment:
Last Closed:
Regression: RTP
Mount Type: nfs
Documentation: ---
CRM:
Verified Versions:



Description Lakshmipathi G 2010-09-28 06:14:15 UTC
Started four volumes with 8 bricks each. While stopping these volumes, the CLI reported the stop as unsuccessful, but volume info shows the status as "Stopped" and all brick processes on the bricks were killed.

---------
#gluster volume info

Volume Name: HT8
Type: Distribute
Status: Started
Number of Bricks: 8
Transport-type: tcp
Bricks:
Brick1: 10.240.94.228:/mnt/a2
Brick2: 10.212.70.131:/mnt/a2
Brick3: 10.202.57.169:/mnt/a2
Brick4: 10.212.117.143:/mnt/a2
Brick5: 10.202.54.53:/mnt/a2
Brick6: 10.243.113.224:/mnt/a2
Brick7: 10.245.209.205:/mnt/a2
Brick8: 10.245.210.193:/mnt/a2

10.192.141.187#showmount -e localhost
Export list for localhost:
/HT8 *


10.192.141.187#gluster volume stop HT8 
Stopping volume will make its data inaccessible. Do you want to Continue? (y/n) y
Stopping volume HT8 has been unsuccessful


10.192.141.187#gluster volume info

Volume Name: HT8
Type: Distribute
Status: Stopped
Number of Bricks: 8
Transport-type: tcp
Bricks:
Brick1: 10.240.94.228:/mnt/a2
Brick2: 10.212.70.131:/mnt/a2
Brick3: 10.202.57.169:/mnt/a2
Brick4: 10.212.117.143:/mnt/a2
Brick5: 10.202.54.53:/mnt/a2
Brick6: 10.243.113.224:/mnt/a2
Brick7: 10.245.209.205:/mnt/a2
Brick8: 10.245.210.193:/mnt/a2
----------
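
A minimal reproduction sketch of the steps above, assuming the peers are already probed and /mnt/a2 exists on every brick node; the volume name and brick list are taken from the transcript, while the pgrep check is only an assumed way of confirming that the brick processes were killed:

---------
# assumes: peers already probed, /mnt/a2 present on each brick node
VOLUME=HT8
BRICKS="10.240.94.228:/mnt/a2 10.212.70.131:/mnt/a2 10.202.57.169:/mnt/a2 \
        10.212.117.143:/mnt/a2 10.202.54.53:/mnt/a2 10.243.113.224:/mnt/a2 \
        10.245.209.205:/mnt/a2 10.245.210.193:/mnt/a2"

gluster volume create $VOLUME transport tcp $BRICKS
gluster volume start $VOLUME

# the stop asks for confirmation and, in this report, printed "unsuccessful"
gluster volume stop $VOLUME

# yet the volume shows Stopped and the brick processes are gone
gluster volume info $VOLUME
pgrep -f glusterfsd        # run on each brick node; no output = brick killed
----------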

Comment 1 Pranith Kumar K 2010-10-05 08:26:21 UTC
Haven't seen this issue with the latest QA releases. Will reopen this if we see it again.

