Bug 1764289 - Document details how each fence agent can be configured in RESTAPI
Summary: Document details how each fence agent can be configured in RESTAPI
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Enterprise Virtualization Manager
Classification: Red Hat
Component: ovirt-engine
Version: unspecified
Hardware: x86_64
OS: Linux
Priority: low
Severity: medium
Target Milestone: ovirt-4.4.0
Target Release: ---
Assignee: Eli Mesika
QA Contact: rhev-docs@redhat.com
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2019-10-22 16:10 UTC by Cyril Lopez
Modified: 2021-03-18 08:22 UTC
CC List: 6 users

Fixed In Version: rhv-4.4.0-28
Doc Type: No Doc Update
Doc Text:
Clone Of:
Environment:
Last Closed: 2020-08-04 13:20:56 UTC
oVirt Team: Infra
Target Upstream Version:
Embargoed:
Flags: emarcus: needinfo-


Attachments


Links
System ID Private Priority Status Summary Last Updated
Red Hat Product Errata RHSA-2020:3247 0 None None None 2020-08-04 13:21:18 UTC
oVirt gerrit 105092 0 master MERGED api-model: add documentation for agents params 2020-12-14 17:30:17 UTC

Description Cyril Lopez 2019-10-22 16:10:26 UTC
Description of problem:
I set up power management and faced two issues: it is not possible to set a specific port through the web UI, and even when I configure it with Ansible, it does not work.

Version-Release number of selected component (if applicable):
rhvm-4.3.6.7-0.1.el7.noarch

How reproducible:
Install and set up fencing with the following Ansible playbook:
---
- hosts: all
  vars:
    host: rhvm.rhv
    ca_file: apache-ca.pem
    kerberos: no
    username: admin@internal
    password: Mysuperpass
  tasks:
  - ovirt_host_pm:
      auth:
        hostname: "{{ host }}"
        ca_file: "{{ ca_file }}"
        kerberos: "{{ kerberos }}"
        username: "{{ username }}"
        password: "{{ password }}"
      name: rhv2.rhv
      order: 1
      address: 192.168.200.1
      username: admin
      password: mysuperpass
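      # 624 is a deliberately non-default IPMI port (default is 623); see comments below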
      port: 624
      type: ipmilan
...

Actual results:
2019-10-22 17:37:52,310+02 INFO  [org.ovirt.engine.core.dal.dbbroker.auditloghandling.AuditLogDirector] (EE-ManagedThreadFactory-engine-Thread-1) [110f989d] EVENT_ID: FENCE_OPERATION_USING_AGENT_AND_PROXY_STARTED(9,020), Executing power management status on Host rhv2.rhv using Proxy Host rhv3.rhv and Fence Agent ipmilan:192.168.200.1.
2019-10-22 17:37:52,312+02 INFO  [org.ovirt.engine.core.vdsbroker.vdsbroker.FenceVdsVDSCommand] (EE-ManagedThreadFactory-engine-Thread-1) [110f989d] START, FenceVdsVDSCommand(HostName = rhv3.rhv, FenceVdsVDSCommandParameters:{hostId='a16f8860-6ccd-458b-af2f-246b0c9c5a01', targetVdsId='190b3305-0274-4ba8-83ec-22276cf81f0e', action='STATUS', agent='FenceAgent:{id='703883ee-c82d-4002-ad25-62da751389cf', hostId='190b3305-0274-4ba8-83ec-22276cf81f0e', order='1', type='ipmilan', ip='192.168.200.1', port='624', user='admin', password='***', encryptOptions='false', options='port=624'}', policy='null'}), log id: 3ea236bf
2019-10-22 17:38:24,839+02 INFO  [org.ovirt.engine.core.vdsbroker.vdsbroker.FenceVdsVDSCommand] (EE-ManagedThreadFactory-engine-Thread-1) [5620c463] START, FenceVdsVDSCommand(HostName = rhv1.rhv, FenceVdsVDSCommandParameters:{hostId='77a4e099-362d-435e-8c80-dd9bdc2cec82', targetVdsId='190b3305-0274-4ba8-83ec-22276cf81f0e', action='STOP', agent='FenceAgent:{id='703883ee-c82d-4002-ad25-62da751389cf', hostId='190b3305-0274-4ba8-83ec-22276cf81f0e', order='1', type='ipmilan', ip='192.168.200.1', port='624', user='admin', password='***', encryptOptions='false', options='port=624'}', policy='[]'}), log id: 6f92151c

52:54:00:a9:a3:f5 > 52:54:00:26:80:7f, ethertype IPv4 (0x0800), length 65: (tos 0x0, ttl 64, id 42700, offset 0, flags [DF], proto UDP (17), length 51)
    192.168.200.252.43462 > 192.168.200.1.623: UDP, length 23
52:54:00:a9:a3:f5 > 52:54:00:26:80:7f, ethertype IPv4 (0x0800), length 65: (tos 0x0, ttl 64, id 44359, offset 0, flags [DF], proto UDP (17), length 51)
    192.168.200.252.43462 > 192.168.200.1.623: UDP, length 23


Expected results:
We expect the IPMI connection on port 624, but the capture above shows the traffic still going to the default port 623.

Comment 1 Eli Mesika 2019-11-11 22:56:53 UTC
From the help of the ipmilan script [1], it seems that you should put 'ipport=624' in the UI Options field.

Please try that and see if it resolves your issue.


[1] 
/usr/sbin/fence_ipmilan --help
Usage:
	fence_ipmilan [options]
Options:
   ....
   ....

   -u, --ipport=[port]            TCP/UDP port to use (default 623)
   ....
   ....
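
As a quick sanity check outside the engine, you can also invoke the fence agent directly with the non-default port (a sketch only; adjust the address and credentials to your BMC):

    fence_ipmilan --ip=192.168.200.1 --username=admin --password=mysuperpass \
        --ipport=624 --lanplus --action=status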

Comment 2 Cyril Lopez 2019-11-12 09:31:12 UTC
Hello Eli,

Thanks for this, I added lanplus=true and now it works.

So does that mean there is a bug in the Ansible module, because the port argument is not set as expected?

Regards
Cyril

Comment 3 Eli Mesika 2019-11-12 09:42:25 UTC
(In reply to Cyril Lopez from comment #2)
> Hello Eli,
> 
> Thanks for this, I added lanplus=true and now it works.
> 
> So does that mean there is a bug in the Ansible module, because the port
> argument is not set as expected?

In the case of the ipmi agent, the port argument is not mandatory, so the only way to add it is to set it in the options (see the sketch after this comment).

> 
> Regards
> Cyril
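
For the record, the ovirt_host_pm module exposes an options parameter for exactly this kind of non-mandatory agent setting, so a working task would look roughly like this (a sketch based on the reporter's playbook, not verified here):

  - ovirt_host_pm:
      auth:
        hostname: "{{ host }}"
        ca_file: "{{ ca_file }}"
        username: "{{ username }}"
        password: "{{ password }}"
      name: rhv2.rhv
      order: 1
      address: 192.168.200.1
      username: admin
      password: mysuperpass
      type: ipmilan
      options:        # extra fence agent options, passed through to the agent
        ipport: 624   # the network port fence_ipmilan actually honors
        lanplus: 1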

Comment 4 Martin Perina 2019-11-13 13:23:41 UTC
We will try to document the specific details of how each fence agent can be configured in the REST API within the generated REST API documentation, and afterwards we will publish the same information in the Ansible documentation for the ovirt_host_pm module.
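
For context, adding an agent with such options through the REST API looks roughly like this (a sketch; the host ID and credentials are placeholders):

    curl -k -u admin@internal:password -X POST \
         -H 'Content-Type: application/xml' \
         'https://rhvm.rhv/ovirt-engine/api/hosts/<host-id>/fenceagents' \
         -d '<agent>
               <type>ipmilan</type>
               <order>1</order>
               <address>192.168.200.1</address>
               <username>admin</username>
               <password>mysuperpass</password>
               <options>
                 <option><name>ipport</name><value>624</value></option>
                 <option><name>lanplus</name><value>1</value></option>
               </options>
             </agent>'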

Comment 5 RHV bug bot 2019-12-13 13:15:41 UTC
WARN: Bug status (ON_QA) wasn't changed but the following should be fixed:

[Found non-acked flags: '{}', ]

For more info please contact: rhv-devops

Comment 6 RHV bug bot 2019-12-20 17:45:17 UTC
WARN: Bug status (ON_QA) wasn't changed but the following should be fixed:

[Found non-acked flags: '{}', ]

For more info please contact: rhv-devops

Comment 7 RHV bug bot 2020-01-08 14:49:22 UTC
WARN: Bug status (ON_QA) wasn't changed but the following should be fixed:

[Found non-acked flags: '{}', ]

For more info please contact: rhv-devops

Comment 8 RHV bug bot 2020-01-08 15:16:57 UTC
WARN: Bug status (ON_QA) wasn't changed but the following should be fixed:

[Found non-acked flags: '{}', ]

For more info please contact: rhv-devops

Comment 9 RHV bug bot 2020-01-24 19:51:11 UTC
WARN: Bug status (ON_QA) wasn't changed but the following should be fixed:

[Found non-acked flags: '{}', ]

For more info please contact: rhv-devops

Comment 18 errata-xmlrpc 2020-08-04 13:20:56 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (Important: RHV Manager (ovirt-engine) 4.4 security, bug fix, and enhancement update), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHSA-2020:3247

