Description of problem:
Unable to remove a replicate brick pair from a distributed-replicate volume with replica count 2. The following error is returned to the user:

<fault>
  <reason>Operation Failed</reason>
  <detail>[Cannot start removing Gluster Volume. Replica count cannot be reduced by more than one.]</detail>
</fault>

Version-Release number of selected component (if applicable):
rhsc-cb8

How reproducible:
100%

Steps to Reproduce:
1. Set up a distributed-replicate volume with replica count 2 and 6 bricks.
2. Migrate a replicate pair of bricks using the REST API (POST /api/cluster/:id/glustervolume/:id/bricks/migrate).

Actual results:
Received an HTTP 409 response:

<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
<fault>
  <reason>Operation Failed</reason>
  <detail>[Cannot start removing Gluster Volume. Replica count cannot be reduced by more than one.]</detail>
</fault>

Expected results:
HTTP 200 OK; remove-brick is started on Gluster and the job ID is returned to the user.

Additional info:
This is also an issue with the collection brick delete. For example, a collection DELETE on a replicate pair yields the same error message:

DELETE /api/clusters/7b045a50-e930-4a9b-8f52-9f8d4617d6e5/glustervolumes/ab8a3ca6-9593-4a59-8209-2ab00f91d8ff/bricks
>> Authorization: Basic YWRtaW5AaW50ZXJuYWw6cmVkaGF0
>> <?xml version="1.0" encoding="UTF-8" standalone="yes"?>
>> <bricks>
>>   <brick id="3e08d9a5-2a32-4348-a62a-4dcc50ba2db4"/>
>>   <brick id="b5925551-2ece-440e-b6fb-b2bf695a7fb6"/>
>> </bricks>

Response:
<?xml version="1.0" encoding="UTF-8" standalone="yes"?><fault><reason>Operation Failed</reason><detail>[Cannot remove Gluster Brick. Replica count cannot be reduced by more than one.]</detail></fault>
This issue is fixed with CB10.
If you are reducing the replica count by removing one brick from each sub-volume, you can't call migrate, because no data migration is required in this case. You have to use the normal delete option instead.
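The delete request body from the trace above can be built as follows. This is a minimal sketch of payload construction only, not a verified RHSC client; the brick IDs are the placeholders from this report, and the surrounding DELETE call (endpoint, auth) would follow the trace shown earlier:

```python
import xml.etree.ElementTree as ET

def build_bricks_payload(brick_ids):
    """Build the <bricks> XML body listing one brick per sub-volume to remove."""
    root = ET.Element("bricks")
    for brick_id in brick_ids:
        # Each <brick id="..."/> names one brick of the replica pair being dropped.
        ET.SubElement(root, "brick", id=brick_id)
    return ET.tostring(root, encoding="unicode")

payload = build_bricks_payload([
    "3e08d9a5-2a32-4348-a62a-4dcc50ba2db4",  # brick IDs taken from the trace above
    "b5925551-2ece-440e-b6fb-b2bf695a7fb6",
])
print(payload)
```

Sending this body with a plain DELETE (rather than the migrate action) is what reduces the replica count by one.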
Verified in rhsc-cb10.
New feature hence no doc text required
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA. For information on the advisory, and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report. http://rhn.redhat.com/errata/RHEA-2014-0208.html