Bug 1339054
| Summary: | Need to improve remove-brick failure message when the brick process is down. | | |
|---|---|---|---|
| Product: | [Red Hat Storage] Red Hat Gluster Storage | Reporter: | Byreddy <bsrirama> |
| Component: | glusterd | Assignee: | Atin Mukherjee <amukherj> |
| Status: | CLOSED ERRATA | QA Contact: | Rajesh Madaka <rmadaka> |
| Severity: | high | Docs Contact: | |
| Priority: | unspecified | | |
| Version: | rhgs-3.1 | CC: | amukherj, nchilaka, rhinduja, rhs-bugs, rmadaka, sheggodu, storage-qa-internal, vbellur |
| Target Milestone: | --- | Keywords: | ZStream |
| Target Release: | RHGS 3.4.0 | | |
| Hardware: | x86_64 | | |
| OS: | Linux | | |
| Whiteboard: | rebase | | |
| Fixed In Version: | glusterfs-3.12.2-1 | Doc Type: | If docs needed, set a value |
| Doc Text: | | Story Points: | --- |
| Clone Of: | | | |
| | 1422624 (view as bug list) | Environment: | |
| Last Closed: | 2018-09-04 06:29:40 UTC | Type: | Bug |
| Regression: | --- | Mount Type: | --- |
| Documentation: | --- | CRM: | |
| Verified Versions: | | Category: | --- |
| oVirt Team: | --- | RHEL 7.3 requirements from Atomic Host: | |
| Cloudforms Team: | --- | Target Upstream Version: | |
| Embargoed: | | | |
| Bug Depends On: | | | |
| Bug Blocks: | 1422624, 1423406, 1438325, 1503134 | | |
Description (Byreddy, 2016-05-24 04:35:27 UTC)
Upstream patch: https://review.gluster.org/16630

Verified this bug for a distributed volume and a distributed-replicate volume on a 3-node cluster.

Verified scenario (a CLI sketch of these steps follows at the end of this report):
-> Created a distributed volume with one brick from each node of the 3-node cluster.
-> Killed one of the volume's brick processes.
-> Tried to remove the offline brick using the "remove-brick" command.
-> The command returned the proper error message:
   "volume remove-brick start: failed: Found stopped brick 10.70.35.216:/bricks/brick0/rep3-2. Use force option to remove the offline brick"
-> Repeated the same steps for the distributed-replicate volume and got the proper error message there as well.

Verified version: glusterfs-3.12.2-4

Moving this bug to the verified state.

Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA. For information on the advisory, and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHSA-2018:2607
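
For reference, a minimal sketch of the verification flow described in the comments above, assuming a 3-node cluster. The volume name (distvol), hostnames (node1..node3), and brick paths are hypothetical placeholders, not values taken from the bug report.

```sh
# Hypothetical reproduction of the verified scenario. Volume name,
# hostnames, and brick paths are illustrative placeholders.

# 1. Create and start a plain distributed volume with one brick per node.
gluster volume create distvol \
    node1:/bricks/brick0/distvol \
    node2:/bricks/brick0/distvol \
    node3:/bricks/brick0/distvol
gluster volume start distvol

# 2. On node2, kill the brick process. Each brick is served by a
#    glusterfsd process whose command line contains the brick path.
pkill -f '/bricks/brick0/distvol'

# 3. Try to remove the now-offline brick. With the fix, glusterd
#    rejects the request with an explicit message instead of a
#    generic failure, along the lines of the one quoted above:
#    "volume remove-brick start: failed: Found stopped brick
#     node2:/bricks/brick0/distvol. Use force option to remove
#     the offline brick"
gluster volume remove-brick distvol node2:/bricks/brick0/distvol start

# 4. If the offline brick really must be removed, force the removal
#    (this skips data migration from the brick).
gluster volume remove-brick distvol node2:/bricks/brick0/distvol force
```

The same flow applies to the distributed-replicate case, with the usual caveat that bricks of a replicate volume are removed in multiples of the replica count.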