Description of problem:
There is currently no warning message displayed if a user ends up placing multiple bricks on the same server during replace-brick.

Version-Release number of selected component (if applicable):
ovirt-engine-4.0.2.6-0.1.el7ev.noarch

How reproducible:
Always

Steps to Reproduce:
1. Create a replica 3 volume, say s1:/b1, s2:/b2, s3:/b3.
2. Click on replace-brick, select s1 from the Host list, and select a brick from Brick Directory.
3. Click OK.

Actual results:
The brick gets replaced with no warning shown.

Expected results:
Replacing the brick should ask for a confirmation along the lines of: "The setup is not optimal since multiple bricks are on the same server. Do you want to continue?"

Additional info:
An illustrative sketch of the check the expected confirmation implies is included below.
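To make the expected behaviour concrete, here is a minimal, hypothetical sketch of that check, assuming the engine only needs to know whether the host chosen for the new brick already hosts a brick of the volume. The class and method names (Brick, getServerName, needsSameServerWarning) are made up for illustration and are not the actual ovirt-engine code.

import java.util.Arrays;
import java.util.List;

// Hypothetical sketch only; not the actual ovirt-engine code.
public class ReplaceBrickWarningSketch {

    static class Brick {
        private final String serverName;
        private final String brickDir;

        Brick(String serverName, String brickDir) {
            this.serverName = serverName;
            this.brickDir = brickDir;
        }

        String getServerName() {
            return serverName;
        }
    }

    // True when the host chosen for the new brick already hosts a brick of this volume.
    static boolean needsSameServerWarning(List<Brick> existingBricks, String newBrickHost) {
        for (Brick brick : existingBricks) {
            if (brick.getServerName().equals(newBrickHost)) {
                return true;
            }
        }
        return false;
    }

    public static void main(String[] args) {
        // The replica 3 volume from the steps above: s1:/b1, s2:/b2, s3:/b3.
        List<Brick> bricks = Arrays.asList(
                new Brick("s1", "/b1"),
                new Brick("s2", "/b2"),
                new Brick("s3", "/b3"));

        // Replacing a brick with a new brick on s1 should trigger the confirmation.
        if (needsSameServerWarning(bricks, "s1")) {
            System.out.println("The setup is not optimal since multiple bricks are "
                    + "on the same server. Do you want to continue?");
        }
    }
}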
Sahina, this is not marked as a blocker, please either block 4.1.7 or push to 4.1.8.
Hi Gobinda,

I tried to replace a brick in a replicate volume with a brick on the same server, and I do not see the UI displaying any warning. Below are the versions of RHV-M and vdsm I am using. Can you please confirm? Attaching a screenshot of the same.

RHV-M: Red Hat Virtualization Manager Version: 4.1.8.1-0.1.el7
vdsm: vdsm-4.19.40-1.el7ev.x86_64
Created attachment 1359570 [details] Attaching screenshot for replace brick
Hi Kasturi,

The change went into the master branch, where I used lambda expressions and the Stream API, which require JDK 8. The 4.1 backport branch does not support JDK 8, so I rewrote the backported code without them. I am surprised that the working code is still only in my local repository; I really don't know why it did not get pushed. I am really sorry for that. Please fail QA and we can retarget this to 4.1.9.
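For context, a minimal sketch of the kind of rewrite described above, assuming the check is simply whether the selected host already appears among the volume's brick servers (as in the sketch under the bug description). The variable names below are placeholders, not the actual ovirt-engine patch; the JDK 8 form uses a lambda with the Stream API, and the JDK 7-compatible form for the 4.1 branch uses a plain loop.

import java.util.Arrays;
import java.util.List;

// Placeholder sketch of the JDK 8 vs. JDK 7 forms of the same check;
// not the actual ovirt-engine patch.
public class BackportSketch {
    public static void main(String[] args) {
        List<String> brickServers = Arrays.asList("s1", "s2", "s3");
        String selectedHost = "s1";

        // Master-branch style: lambda + Stream API (needs JDK 8).
        boolean duplicateHostStream =
                brickServers.stream().anyMatch(s -> s.equals(selectedHost));

        // 4.1-backport style: plain loop, works on JDK 7.
        boolean duplicateHostLoop = false;
        for (String server : brickServers) {
            if (server.equals(selectedHost)) {
                duplicateHostLoop = true;
                break;
            }
        }

        // Both evaluate to true for this example.
        System.out.println(duplicateHostStream + " " + duplicateHostLoop);
    }
}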
Cool, thanks Gobinda for confirming. Based on comment 2 and comment 4, marking this as FailedQA.
Retargeting to 4.1.9
Hello Gobinda,

I still see that there is no warning thrown while replacing bricks from the same server. Am I missing something here?

Below is the RHV version I have: Red Hat Virtualization Manager Version: 4.1.9.1-0.1.el7

[root@hostedenginesm2 ~]# rpm -qa | grep ovirt-engine
ovirt-engine-dashboard-1.1.8-1.el7ev.noarch
ovirt-engine-setup-plugin-ovirt-engine-common-4.1.9.1-0.1.el7.noarch
ovirt-engine-websocket-proxy-4.1.9.1-0.1.el7.noarch
ovirt-engine-userportal-4.1.9.1-0.1.el7.noarch
ovirt-engine-sdk-python-3.6.9.1-1.el7ev.noarch
ovirt-engine-setup-plugin-vmconsole-proxy-helper-4.1.9.1-0.1.el7.noarch
ovirt-engine-4.1.9.1-0.1.el7.noarch
ovirt-engine-extension-aaa-ldap-setup-1.3.6-1.el7ev.noarch
ovirt-engine-lib-4.1.9.1-0.1.el7.noarch
ovirt-engine-setup-plugin-ovirt-engine-4.1.9.1-0.1.el7.noarch
ovirt-engine-setup-plugin-websocket-proxy-4.1.9.1-0.1.el7.noarch
ovirt-engine-vmconsole-proxy-helper-4.1.9.1-0.1.el7.noarch
ovirt-engine-restapi-4.1.9.1-0.1.el7.noarch
ovirt-engine-backend-4.1.9.1-0.1.el7.noarch
ovirt-engine-sdk-java-3.6.8.0-1.el7ev.noarch
python-ovirt-engine-sdk4-4.1.5-1.el7ev.x86_64
ovirt-engine-dwh-setup-4.1.9-1.el7ev.noarch
ovirt-engine-setup-4.1.9.1-0.1.el7.noarch
ovirt-engine-extensions-api-impl-4.1.9.1-0.1.el7.noarch
ovirt-engine-webadmin-portal-4.1.9.1-0.1.el7.noarch
ovirt-engine-dwh-4.1.9-1.el7ev.noarch
java-ovirt-engine-sdk4-4.1.3-1.el7ev.noarch
ovirt-engine-cli-3.6.8.1-1.el7ev.noarch
ovirt-engine-tools-backup-4.1.9.1-0.1.el7.noarch
ovirt-engine-tools-4.1.9.1-0.1.el7.noarch
ovirt-engine-extension-aaa-ldap-1.3.6-1.el7ev.noarch
ovirt-engine-setup-base-4.1.9.1-0.1.el7.noarch
ovirt-engine-metrics-1.0.5-1.el7ev.noarch
ovirt-engine-dbscripts-4.1.9.1-0.1.el7.noarch
ovirt-engine-extension-aaa-jdbc-1.1.6-1.el7ev.noarch

Thanks,
kasturi
(In reply to RamaKasturi from comment #7)
> I still see that there is no warning thrown while replacing bricks from the
> same server. Am I missing something here?
>
> Below is the RHV version I have: Red Hat Virtualization Manager Version:
> 4.1.9.1-0.1.el7

Gobinda,

Based on comment 7, it looks like this issue is **not** fixed correctly in RHV 4.1.9. Moving this bug back to ASSIGNED.
Retargeting this to RHV 4.2 as it's not a high-priority bug.
I wonder why it's not working? I verified it multiple times locally before pushing the patch. Anyway, I will cross-check.
Tested with RHV 4.2.3 and gluster-3.12:
1. When the brick is replaced with a brick from the same host, the proper warning message is displayed: "The setup is not optimal since multiple bricks are on the same server. Do you want to continue?"
This Bugzilla is included in the oVirt 4.2.0 release, published on Dec 20th 2017. Since the problem described in this bug report should be resolved in that release, it has been closed with a resolution of CURRENT RELEASE. If the solution does not work for you, please open a new bug report.