| Field | Value |
|---|---|
| Summary | Display a warning if user tries to create multiple bricks on the same server during replace-brick. |
| Product | [oVirt] ovirt-engine |
| Component | BLL.Gluster |
| Version | 4.0.2.6 |
| Status | CLOSED CURRENTRELEASE |
| Severity | low |
| Priority | medium |
| Reporter | RamaKasturi <knarra> |
| Assignee | Gobinda Das <godas> |
| QA Contact | SATHEESARAN <sasundar> |
| Docs Contact | |
| CC | bugs, godas, lveyde, sabose, sasundar |
| Target Milestone | ovirt-4.2.0 |
| Target Release | 4.1.9 |
| Fixed In Version | ovirt-engine-4.1.9 |
| Flags | rule-engine: ovirt-4.2+, rule-engine: planning_ack+, rule-engine: devel_ack+, sasundar: testing_ack+ |
| Hardware | Unspecified |
| OS | Unspecified |
| Whiteboard | |
| Doc Type | If docs needed, set a value |
| Doc Text | |
| Story Points | --- |
| Clone Of | |
| Environment | |
| Last Closed | 2018-05-10 06:31:19 UTC |
| Type | Bug |
| Regression | --- |
| Mount Type | --- |
| Documentation | --- |
| CRM | |
| Verified Versions | |
| Category | --- |
| oVirt Team | Gluster |
| RHEL 7.3 requirements from Atomic Host | |
| Cloudforms Team | --- |
| Target Upstream Version | |
| Attachments | screenshot for replace brick (attachment 1359570) |
Description (RamaKasturi, 2016-08-17 09:13:41 UTC)

Sahina, this is not marked as a blocker; please either block 4.1.7 or push to 4.1.8.

Hi Gobinda,
I tried to replace a brick in a replicate volume with a brick from the same server, and I do not see the UI displaying any warning for this. Below are the versions of RHV-M and vdsm I am using. Can you please confirm?
Attaching the screenshot for the same.
RHV-M : Red Hat Virtualization Manager Version: 4.1.8.1-0.1.el7
vdsm : vdsm-4.19.40-1.el7ev.x86_64
Created attachment 1359570: screenshot for replace brick
Hi Kasturi, the changes went into the master branch, where I wrote them using lambda expressions and the Stream API, which are supported by JDK 8. The 4.1 backport branch does not support JDK 8, so I rewrote the backported code as plain pre-JDK-8 code. But I am surprised that the working code is still only in my local tree; I really don't know how it did not get pushed. I am really sorry for that. Please fail QA and we can retarget it to 4.1.9.

Cool, thanks Gobinda for confirming.

Based on comment 2 and comment 4, marking this as FailedQA. Retargeting to 4.1.9.
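For readers following the JDK point above, here is a minimal sketch of the kind of rewrite being described: the duplicate-server check written once with the JDK 8 Stream API (as on master) and once as plain pre-JDK-8 collection code (as required on the 4.1 branch). The `Brick` type and the method names are hypothetical illustrations, not the actual ovirt-engine patch.

```java
import java.util.Arrays;
import java.util.HashSet;
import java.util.List;
import java.util.Set;

public class ReplaceBrickCheck {

    // Minimal stand-in for the engine's brick entity; the real class and
    // field names in ovirt-engine differ (this type is hypothetical).
    static class Brick {
        final String serverId; // host the brick lives on
        final String brickDir; // brick directory on that host

        Brick(String serverId, String brickDir) {
            this.serverId = serverId;
            this.brickDir = brickDir;
        }
    }

    // JDK 8 style, as described for the master branch: Stream API + lambdas.
    static boolean multipleBricksOnSameServer(List<Brick> bricks) {
        return bricks.stream()
                .map(b -> b.serverId)
                .distinct()
                .count() < bricks.size();
    }

    // Pre-JDK-8 rewrite of the same check, as needed on the 4.1 branch.
    static boolean multipleBricksOnSameServerLegacy(List<Brick> bricks) {
        Set<String> seen = new HashSet<String>();
        for (Brick b : bricks) {
            if (!seen.add(b.serverId)) {
                return true; // a second brick landed on an already-seen server
            }
        }
        return false;
    }

    public static void main(String[] args) {
        // Layout after replacing a brick with one from host1: host1 now holds
        // two bricks of the same volume, so both variants return true.
        List<Brick> afterReplace = Arrays.asList(
                new Brick("host1", "/gluster/brick1"),
                new Brick("host1", "/gluster/brick2"),
                new Brick("host3", "/gluster/brick3"));
        System.out.println(multipleBricksOnSameServer(afterReplace));       // true
        System.out.println(multipleBricksOnSameServerLegacy(afterReplace)); // true
    }
}
```

Note that the loop variant even short-circuits on the first duplicate, so rewriting the check as "normal code" loses nothing but the JDK 8 syntax.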
Hello Gobinda,
I still see that no warning is thrown while replacing bricks from the same server. Am I missing something here?
Below is the RHV version I have: Red Hat Virtualization Manager Version: 4.1.9.1-0.1.el7
[root@hostedenginesm2 ~]# rpm -qa | grep ovirt-engine
ovirt-engine-dashboard-1.1.8-1.el7ev.noarch
ovirt-engine-setup-plugin-ovirt-engine-common-4.1.9.1-0.1.el7.noarch
ovirt-engine-websocket-proxy-4.1.9.1-0.1.el7.noarch
ovirt-engine-userportal-4.1.9.1-0.1.el7.noarch
ovirt-engine-sdk-python-3.6.9.1-1.el7ev.noarch
ovirt-engine-setup-plugin-vmconsole-proxy-helper-4.1.9.1-0.1.el7.noarch
ovirt-engine-4.1.9.1-0.1.el7.noarch
ovirt-engine-extension-aaa-ldap-setup-1.3.6-1.el7ev.noarch
ovirt-engine-lib-4.1.9.1-0.1.el7.noarch
ovirt-engine-setup-plugin-ovirt-engine-4.1.9.1-0.1.el7.noarch
ovirt-engine-setup-plugin-websocket-proxy-4.1.9.1-0.1.el7.noarch
ovirt-engine-vmconsole-proxy-helper-4.1.9.1-0.1.el7.noarch
ovirt-engine-restapi-4.1.9.1-0.1.el7.noarch
ovirt-engine-backend-4.1.9.1-0.1.el7.noarch
ovirt-engine-sdk-java-3.6.8.0-1.el7ev.noarch
python-ovirt-engine-sdk4-4.1.5-1.el7ev.x86_64
ovirt-engine-dwh-setup-4.1.9-1.el7ev.noarch
ovirt-engine-setup-4.1.9.1-0.1.el7.noarch
ovirt-engine-extensions-api-impl-4.1.9.1-0.1.el7.noarch
ovirt-engine-webadmin-portal-4.1.9.1-0.1.el7.noarch
ovirt-engine-dwh-4.1.9-1.el7ev.noarch
java-ovirt-engine-sdk4-4.1.3-1.el7ev.noarch
ovirt-engine-cli-3.6.8.1-1.el7ev.noarch
ovirt-engine-tools-backup-4.1.9.1-0.1.el7.noarch
ovirt-engine-tools-4.1.9.1-0.1.el7.noarch
ovirt-engine-extension-aaa-ldap-1.3.6-1.el7ev.noarch
ovirt-engine-setup-base-4.1.9.1-0.1.el7.noarch
ovirt-engine-metrics-1.0.5-1.el7ev.noarch
ovirt-engine-dbscripts-4.1.9.1-0.1.el7.noarch
ovirt-engine-extension-aaa-jdbc-1.1.6-1.el7ev.noarch
Thanks
kasturi
(In reply to RamaKasturi from comment #7)
> Hello Gobinda,
> I still see that no warning is thrown while replacing bricks from the same server. Am I missing something here?
> Below is the RHV version I have: Red Hat Virtualization Manager Version: 4.1.9.1-0.1.el7

Gobinda, based on comment 7 it looks like this issue is **not** fixed correctly in RHV 4.1.9. Moving this bug back to ASSIGNED.

Retargeting this to RHV 4.2, as it is not a high-priority bug.

I wonder how it is not working? I verified it multiple times locally before pushing the patch. Anyway, I will cross-check.

Tested with RHV 4.2.3 and gluster-3.12:
1. When the brick is replaced with a brick from the same host, the proper warning message is thrown: "The setup is not optimal since multiple bricks are on the same server. Do you want to continue?"

This bugzilla is included in the oVirt 4.2.0 release, published on Dec 20th 2017. Since the problem described in this bug report should be resolved in that release, it has been closed with a resolution of CURRENT RELEASE. If the solution does not work for you, please open a new bug report.
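To tie the verified behavior back to the check sketched earlier: conceptually, the UI only needs the duplicate-server predicate to decide whether to interpose the confirmation dialog. A minimal, self-contained illustration follows; the dialog text is the one quoted in the verification comment above, while the host names and flow are hypothetical.

```java
import java.util.Arrays;
import java.util.List;

public class ReplaceBrickWarningDemo {
    public static void main(String[] args) {
        // Servers hosting the volume's bricks as they would look after the
        // proposed replace-brick: host1 appears twice (hypothetical hosts).
        List<String> brickServers = Arrays.asList("host1", "host1", "host3");

        // Same duplicate-server predicate as in the earlier sketch.
        boolean duplicates =
                brickServers.stream().distinct().count() < brickServers.size();

        if (duplicates) {
            // webadmin interposes the confirmation verified in RHV 4.2.3:
            System.out.println("The setup is not optimal since multiple bricks"
                    + " are on the same server. Do you want to continue?");
        }
    }
}
```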