Description of problem:
* Ceph metrics dashboard receives no data after a storage node reboot

Version-Release number of selected component (if applicable):
RHCS version 3.1

How reproducible:
Always

Steps to Reproduce:
1. Install the Ceph metrics dashboard
2. Reboot all the storage nodes one by one
3. Observe the dashboard for more than one day

Actual results:
* After rebooting all storage nodes one by one, the dashboard shows no values.

Expected results:
* After rebooting all storage nodes one by one, the dashboard should show the values.

Additional info:

[root@ssd1 cephmetrics-ansible]# grep -R 9283 *
roles/ceph-mgr/tasks/configure_firewall.yml:    - 9283/tcp
roles/ceph-prometheus/templates/prometheus.yml:      - targets: ['{{ host }}:9283']
[root@ssd1 cephmetrics-ansible]# grep -R 9100 *
roles/ceph-node-exporter/tasks/configure_firewall.yml:    - 9100/tcp
roles/ceph-node-exporter/tests/test_node_exporter.py:    socket_spec = "tcp://0.0.0.0:9100"
roles/ceph-prometheus/templates/prometheus.yml:      - targets: ['{{ host }}:9100']
roles/ceph-prometheus/templates/prometheus.yml:      - targets: ['{{ host }}:9100']

-------

**Need to add 9283/tcp to /usr/share/cephmetrics-ansible/roles/ceph-node-exporter/tasks/configure_firewall.yml**

# vi /usr/share/cephmetrics-ansible/roles/ceph-node-exporter/tasks/configure_firewall.yml

- name: Open ports for node_exporter
  firewalld:
    port: "{{ item }}"
    zone: "{{ firewalld_zone }}"
    state: enabled
    immediate: true
    permanent: true
  with_items:
    - 9100/tcp
  when: "'enabled' in firewalld_status.stdout"

In configure_firewall.yml, only port 9100/tcp is opened. Port 9283/tcp needs to be added as well.
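For illustration, the suggested fix amounts to adding 9283/tcp to the task's port list. A sketch of the corrected task (reusing the `firewalld_zone` and `firewalld_status` variables already defined in the role):

```yaml
# Sketch of the proposed change to
# roles/ceph-node-exporter/tasks/configure_firewall.yml:
# open both the node_exporter port (9100) and the
# ceph-mgr prometheus exporter port (9283).
- name: Open ports for node_exporter
  firewalld:
    port: "{{ item }}"
    zone: "{{ firewalld_zone }}"
    state: enabled
    immediate: true
    permanent: true
  with_items:
    - 9100/tcp
    - 9283/tcp
  when: "'enabled' in firewalld_status.stdout"
```

With both ports permanently enabled in firewalld, the rules survive a node reboot, so Prometheus can keep scraping the targets listed in roles/ceph-prometheus/templates/prometheus.yml.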
Hello,

Can I please know the status of this bug?

Best Regards,
Servesha
Hello,

I have opened a pull request with the required changes; please refer to the link below:
https://github.com/ceph/cephmetrics/pull/235/commits/73b77e352661d62a13e41ae9fd5799a2e6aaedec

Hopefully the changes will be merged soon. Thank you!
Hello,

Could someone please review the patch and merge the pull request:
https://github.com/ceph/cephmetrics/pull/235/commits/73b77e352661d62a13e41ae9fd5799a2e6aaedec

Regards,
Servesha
Hello,

May I know the progress on this bug?
PR is merged - thank you Servesha!
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA.

For information on the advisory, and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHSA-2019:2538