Description of problem:
Until now, if a customer needed to add a custom fence agent that is not listed in oVirt, they had to tweak the relevant oVirt configuration values to do so. Besides its complexity, the main problem with this method is that it does not survive product upgrades: the custom settings are lost after upgrading oVirt. oVirt should provide a user-friendly way to add custom fence agents and preserve those changes across oVirt upgrades.

Version-Release number of selected component (if applicable):

How reproducible:

Steps to Reproduce:
1.
2.
3.

Actual results:

Expected results:

Additional info:
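For illustration, a minimal sketch of how a custom agent might be registered on the engine side with engine-config, following the approach exercised in the test comment below. Only the CustomVdsFenceOptionMapping key appears in this report; the CustomVdsFenceType key name and the exact syntax are assumptions.

  # Assumed key: declare the new agent type so the engine offers it
  engine-config -s CustomVdsFenceType="test_fence_agent"

  # Key confirmed in the test comment: option mapping for the new type,
  # here an empty mapping (same as ipmilan in test case 1 below)
  engine-config -s CustomVdsFenceOptionMapping="test_fence_agent:"

  # Restart the engine so the new configuration values are picked up
  service ovirt-engine restart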
Basic functionality is working well as per: https://tcms.engineering.redhat.com/run/164181/

Issues: after adding a fence agent type named "test_fence_agent" and copying the ipmilan fence script on the proxy host to a new script "fence_test_fence_agent", I tried several configurations for the CustomVdsFenceOptionMapping parameter:

1. CustomVdsFenceOptionMapping="test_fence_Agent:" (same as the ipmilan mapping) - restarted the engine - the mapping in the UI is similar to ipmilan - the test works. In the proxy host vdsm.log:
Thread-45::DEBUG::2014-08-12 12:29:08,838::API::1153::vds::(fenceNode) fenceNode(addr=rose07-mgmt.qa.lab.tlv.redhat.com,port=,agent=test_fence_agent,user=root,passwd=XXXX,action=status,secure=,options=)

2. CustomVdsFenceOptionMapping="test_fence_Agent:port=invalid" - restarted the engine - the mapping in the UI has an SSH port field with no value - the test works. Same message in vdsm.log (port=,agent=...); port did not get the invalid value.

3. CustomVdsFenceOptionMapping="test_fence_Agent:port=ipport" - restarted the engine - the mapping in the UI has an SSH port field with no value - the test works. Same message in vdsm.log (port=,agent=...); port did not get any value.

4. CustomVdsFenceOptionMapping="test_fence_Agent:blabla=invalid" - restarted the engine - the mapping in the UI is similar to ipmilan - the test works. Same vdsm message.

5. CustomVdsFenceOptionMapping="different_fence_Agent:" - restarted the engine - the mapping has all the different fields (as the default apc) - the test fails with the message: 'Power Management test failed for Host monique-vds01.tlv.redhat.com. There is no other host in the data center that can be used to test the power management settings.'

Basically it seems that defining an agent of type name X, providing a script fence_X, and configuring any sort of mapping is enough for the agent to work. Is this the expected functionality? When do the 'CustomVdsFenceOptionMapping' values make a difference?
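For completeness, a hedged sketch of the setup used in the tests above, including the proxy-host side; the /usr/sbin path is an assumption based on where the fence-agents package normally installs its scripts:

  # On the proxy host: the proxy resolves agent type X to a script named
  # fence_X, so the ipmilan script was copied under a new name
  # (the path is an assumption)
  cp /usr/sbin/fence_ipmilan /usr/sbin/fence_test_fence_agent

  # On the engine: the mapping value from test case 3, using the
  # <agent_type>:<engine_option>=<agent_option> syntax seen in this report;
  # per the test, the renamed option did not show up in the vdsm fenceNode call
  engine-config -s CustomVdsFenceOptionMapping="test_fence_Agent:port=ipport"
  service ovirt-engine restart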
Please see:
https://bugzilla.redhat.com/show_bug.cgi?id=1129596#c1
https://bugzilla.redhat.com/show_bug.cgi?id=1129596#c2
Verified with ovirt-engine-3.5.0-0.0.master.20140821064931.gitb794d66.el6.noarch according to TCMS plan 14443.
oVirt 3.5 has been released and should include the fix for this issue.