Bug 1677269 - Need to add port 9283/tcp to /usr/share/cephmetrics-ansible/roles/ceph-node-exporter/tasks/configure_firewall.yml
Summary: Need to add port 9283/tcp to /usr/share/cephmetrics-ansible/roles/ceph-node-e...
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Ceph Storage
Classification: Red Hat Storage
Component: Ceph-Metrics
Version: 3.1
Hardware: x86_64
OS: Linux
Priority: medium
Severity: low
Target Milestone: rc
Target Release: 3.3
Assignee: Boris Ranto
QA Contact: Madhavi Kasturi
Docs Contact: Aron Gunn
URL:
Whiteboard:
Depends On:
Blocks: 1726135
 
Reported: 2019-02-14 12:28 UTC by Servesha
Modified: 2020-03-06 12:21 UTC (History)
11 users

Fixed In Version: cephmetrics-2.0.3-1.el7cp
Doc Type: Bug Fix
Doc Text:
.The TCP port for the Ceph exporter is opened during the Ansible deployment of the Ceph Dashboard
Previously, the TCP port for the Ceph exporter was not opened by the Ansible deployment scripts on all the nodes in the storage cluster. Opening TCP port 9283 had to be done manually on all nodes for the metrics to be available to the Ceph Dashboard. With this release, the TCP port is now opened by the Ansible deployment scripts for the Ceph Dashboard.
Clone Of:
Environment:
Last Closed: 2019-08-21 15:10:25 UTC
Embargoed:




Links
System ID Private Priority Status Summary Last Updated
Github ceph cephmetrics pull 235 0 'None' closed firewalld: added port 2020-03-16 02:49:28 UTC
Red Hat Bugzilla 1677269 0 medium CLOSED Need to add port 9283/tcp to /usr/share/cephmetrics-ansible/roles/ceph-node-exporter/tasks/configure_firewall.yml 2021-02-22 00:41:40 UTC
Red Hat Product Errata RHSA-2019:2538 0 None None None 2019-08-21 15:10:42 UTC

Internal Links: 1677269

Description Servesha 2019-02-14 12:28:19 UTC
Description of problem: 

* Ceph metrics dashboard receiving no data after storage node reboot


Version-Release number of selected component (if applicable): RHCS version 3.1


How reproducible: always


Steps to Reproduce:
1. Install ceph dashboard
2. Reboot all the storage nodes one by one
3. Observe dashboard for more than one day

Actual results:
 
* After rebooting all the storage nodes one by one, the dashboard shows no values.


Expected results:

* After rebooting all the storage nodes one by one, the dashboard should show the values.

Additional info: 

[root@ssd1 cephmetrics-ansible]# grep -R 9283 *
roles/ceph-mgr/tasks/configure_firewall.yml:    - 9283/tcp
roles/ceph-prometheus/templates/prometheus.yml:      - targets: ['{{ host }}:9283']
[root@ssd1 cephmetrics-ansible]# grep -R 9100 *
roles/ceph-node-exporter/tasks/configure_firewall.yml:    - 9100/tcp
roles/ceph-node-exporter/tests/test_node_exporter.py:        socket_spec = "tcp://0.0.0.0:9100"
roles/ceph-prometheus/templates/prometheus.yml:      - targets: ['{{ host }}:9100']
roles/ceph-prometheus/templates/prometheus.yml:      - targets: ['{{ host }}:9100']
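
Until the playbook is fixed, the port can be opened manually on each node. A sketch of the manual workaround using `firewall-cmd` (the commands assume the default active zone; adjust the zone to whatever the deployment uses):

```shell
# Open the ceph-mgr prometheus exporter port manually (run on each node).
# --permanent makes the rule survive reboots; --reload applies it now.
firewall-cmd --permanent --add-port=9283/tcp
firewall-cmd --reload

# Verify the port is listed in the active zone
firewall-cmd --list-ports
```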

-------
**Need to add 9283/tcp to /usr/share/cephmetrics-ansible/roles/ceph-node-exporter/tasks/configure_firewall.yml**


#  vi /usr/share/cephmetrics-ansible/roles/ceph-node-exporter/tasks/configure_firewall.yml 

- name: Open ports for node_exporter
  firewalld:
    port: "{{ item }}"
    zone: "{{ firewalld_zone }}"
    state: enabled
    immediate: true
    permanent: true
  with_items:
    - 9100/tcp
  when: "'enabled' in firewalld_status.stdout"

In the configure_firewall.yml file, only port 9100/tcp is opened. Port 9283/tcp needs to be added as well.
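
A sketch of the corrected task, extending the `with_items` list above with the extra port (assuming the surrounding `firewalld_zone` and `firewalld_status` variables are unchanged):

```yaml
- name: Open ports for node_exporter
  firewalld:
    port: "{{ item }}"
    zone: "{{ firewalld_zone }}"
    state: enabled
    immediate: true
    permanent: true
  with_items:
    - 9100/tcp
    - 9283/tcp
  when: "'enabled' in firewalld_status.stdout"
```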

Comment 1 Servesha 2019-04-19 06:22:05 UTC
Hello,

May I please know the status of this bug?

Best Regards,
Servesha

Comment 2 Servesha 2019-04-23 07:07:57 UTC
Hello,

I have opened a pull request with the required changes; please refer to the link below:

https://github.com/ceph/cephmetrics/pull/235/commits/73b77e352661d62a13e41ae9fd5799a2e6aaedec

Hopefully the changes will be merged soon.

Thank you !

Comment 3 Servesha 2019-04-26 07:28:08 UTC
Hello,

Could someone please review the patch and merge the pull request:

https://github.com/ceph/cephmetrics/pull/235/commits/73b77e352661d62a13e41ae9fd5799a2e6aaedec 

Regards,
Servesha

Comment 4 Servesha 2019-05-30 10:38:13 UTC
Hello,

May I know the progress on this bug?

Comment 5 Zack Cerza 2019-06-04 22:49:20 UTC
PR is merged - thank you Servesha!

Comment 15 errata-xmlrpc 2019-08-21 15:10:25 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHSA-2019:2538

