
Bug 1677269

Summary: Need to add port 9283/tcp to /usr/share/cephmetrics-ansible/roles/ceph-node-exporter/tasks/configure_firewall.yml
Product: [Red Hat Storage] Red Hat Ceph Storage
Reporter: Servesha <sdudhgao>
Component: Ceph-Metrics
Assignee: Boris Ranto <branto>
Status: CLOSED ERRATA
QA Contact: Madhavi Kasturi <mkasturi>
Severity: low
Docs Contact: Aron Gunn <agunn>
Priority: medium
Version: 3.1
CC: agunn, branto, ceph-eng-bugs, ceph-qe-bugs, gmeno, gsitlani, kdreyer, shzhou, tchandra, tserlin, zcerza
Target Milestone: rc
Target Release: 3.3
Hardware: x86_64
OS: Linux
Whiteboard:
Fixed In Version: cephmetrics-2.0.3-1.el7cp
Doc Type: Bug Fix
Doc Text:
.The TCP port for the Ceph exporter is opened during the Ansible deployment of the Ceph Dashboard

Previously, the TCP port for the Ceph exporter was not opened by the Ansible deployment scripts on all the nodes in the storage cluster. Opening TCP port 9283 had to be done manually on all nodes for the metrics to be available to the Ceph Dashboard. With this release, the TCP port is now opened by the Ansible deployment scripts for the Ceph Dashboard.
Story Points: ---
Clone Of:
Environment:
Last Closed: 2019-08-21 15:10:25 UTC
Type: Bug
Regression: ---
Mount Type: ---
Documentation: ---
CRM:
Verified Versions:
Category: ---
oVirt Team: ---
RHEL 7.3 requirements from Atomic Host:
Cloudforms Team: ---
Target Upstream Version:
Embargoed:
Bug Depends On:    
Bug Blocks: 1726135    

Description Servesha 2019-02-14 12:28:19 UTC
Description of problem: 

* Ceph metrics dashboard receiving no data after storage node reboot


Version-Release number of selected component (if applicable): RHCS version 3.1


How reproducible: always


Steps to Reproduce:
1. Install ceph dashboard
2. Reboot all the storage nodes one by one
3. Observe dashboard for more than one day

Actual results:
 
* After rebooting all storage nodes one by one, the dashboard shows no values.


Expected results:

* After rebooting all storage nodes one by one, the dashboard should show the values.

Additional info: 

[root@ssd1 cephmetrics-ansible]# grep -R 9283 *
roles/ceph-mgr/tasks/configure_firewall.yml:    - 9283/tcp
roles/ceph-prometheus/templates/prometheus.yml:      - targets: ['{{ host }}:9283']
[root@ssd1 cephmetrics-ansible]# grep -R 9100 *
roles/ceph-node-exporter/tasks/configure_firewall.yml:    - 9100/tcp
roles/ceph-node-exporter/tests/test_node_exporter.py:        socket_spec = "tcp://0.0.0.0:9100"
roles/ceph-prometheus/templates/prometheus.yml:      - targets: ['{{ host }}:9100']
roles/ceph-prometheus/templates/prometheus.yml:      - targets: ['{{ host }}:9100']

-------
**Need to add 9283/tcp to /usr/share/cephmetrics-ansible/roles/ceph-node-exporter/tasks/configure_firewall.yml**


#  vi /usr/share/cephmetrics-ansible/roles/ceph-node-exporter/tasks/configure_firewall.yml 

- name: Open ports for node_exporter
  firewalld:
    port: "{{ item }}"
    zone: "{{ firewalld_zone }}"
    state: enabled
    immediate: true
    permanent: true
  with_items:
    - 9100/tcp
  when: "'enabled' in firewalld_status.stdout"

In the configure_firewall.yml file, only port 9100/tcp is opened. Port 9283/tcp needs to be added as well.
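The fix amounts to adding 9283/tcp to the port list of the existing task. A minimal sketch of the corrected task, reusing the variables from the snippet above (the merged change may differ in detail):

```yaml
# Sketch of the proposed fix: open both the node_exporter port (9100/tcp)
# and the ceph-mgr Prometheus exporter port (9283/tcp) on every node.
- name: Open ports for node_exporter
  firewalld:
    port: "{{ item }}"
    zone: "{{ firewalld_zone }}"
    state: enabled
    immediate: true
    permanent: true
  with_items:
    - 9100/tcp
    - 9283/tcp
  when: "'enabled' in firewalld_status.stdout"
```

Because the task uses `immediate: true` and `permanent: true`, the port is opened in the running firewall and also persists across node reboots, which is what the reproduction scenario above requires.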

Comment 1 Servesha 2019-04-19 06:22:05 UTC
Hello,

May I know the status of this bug?

Best Regards,
Servesha

Comment 2 Servesha 2019-04-23 07:07:57 UTC
Hello,

I have added a pull request with the required changes; please refer to the link below:

https://github.com/ceph/cephmetrics/pull/235/commits/73b77e352661d62a13e41ae9fd5799a2e6aaedec

Hopefully the changes will be merged soon.

Thank you !

Comment 3 Servesha 2019-04-26 07:28:08 UTC
Hello,

Could someone please review the patch and merge the pull request:

https://github.com/ceph/cephmetrics/pull/235/commits/73b77e352661d62a13e41ae9fd5799a2e6aaedec 

Regards,
Servesha

Comment 4 Servesha 2019-05-30 10:38:13 UTC
Hello,

May I know the progress on this bug?

Comment 5 Zack Cerza 2019-06-04 22:49:20 UTC
PR is merged - thank you Servesha!

Comment 15 errata-xmlrpc 2019-08-21 15:10:25 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHSA-2019:2538