Bug 1577846 - After latest environment update all ceph-disk@dev-sdXX.service are in failed state
Status: CLOSED ERRATA
Product: Red Hat Ceph Storage
Classification: Red Hat
Component: Ceph-Ansible
Version: 2.5
Hardware: All
OS: All
Priority: high
Severity: high
Target Milestone: z4
Target Release: 3.0
Assigned To: leseb
QA Contact: Yogev Rabl
Depends On:
Blocks: 1578730 1581579 1583767
Reported: 2018-05-14 05:32 EDT by Alex Stupnikov
Modified: 2018-07-11 14:12 EDT
CC List: 16 users

See Also:
Fixed In Version: RHEL: ceph-ansible-3.0.35-1.el7cp Ubuntu: ceph-ansible_3.0.35-2redhat1
Doc Type: Bug Fix
Doc Text:
.Update to the `ceph-disk` Unit Files
Previously, the transition to containerized Ceph left some `ceph-disk` unit files behind. The files were harmless, but appeared as failing, which could be distressing to the operator. With this update, executing the `switch-from-non-containerized-to-containerized-ceph-daemons.yml` playbook disables the `ceph-disk` unit files too.
Story Points: ---
Clone Of:
Cloned To: 1581579
Environment:
Last Closed: 2018-07-11 14:11:10 EDT
Type: Bug
Regression: ---
Mount Type: ---
Documentation: ---
CRM:
Verified Versions:
Category: ---
oVirt Team: ---
RHEL 7.3 requirements from Atomic Host:
Cloudforms Team: ---


Attachments
journalctl logs and ceph-osd status (4.38 KB, text/plain)
2018-05-14 05:32 EDT, Alex Stupnikov


External Trackers
Tracker ID                              Last Updated
Github ceph/ceph-ansible/pull/2595      2018-05-16 11:39 EDT
Red Hat Product Errata RHSA-2018:2177   2018-07-11 14:12 EDT

Description Alex Stupnikov 2018-05-14 05:32:38 EDT
Created attachment 1436089
journalctl logs and ceph-osd status

Description of problem:

The customer asked us to investigate an issue with their RHOSP 12 + Ceph environment: all ceph-disk@dev-sdXX.service units are in the failed state.

I tried to troubleshoot the issue for one specific Ceph disk, sdv (the picture is the same for the other disks). To keep this description short, please find an extract from the journalctl logs and the output of ``systemctl status --all`` in the attachment.
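For reference, an ad-hoc Ansible task along the following lines can collect the same data across all OSD nodes at once. This is just for illustration; the task and variable names are mine, not anything from the customer environment:

    # Illustrative triage tasks (all names are hypothetical): list the
    # failed ceph-disk@* unit instances on each node.
    - name: list failed ceph-disk unit instances
      command: systemctl list-units --failed --no-legend 'ceph-disk@*'
      register: failed_ceph_disk_units
      changed_when: false
      failed_when: false

    - name: print the failed instances
      debug:
        var: failed_ceph_disk_units.stdout_lines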

It looks like the following play from ceph-ansible v3.0.27 masked all ceph-osd@N.service units and broke the ceph-disk systemd units:

    - name: stop non-containerized ceph osd(s)
      systemd:
        name: "{{ item }}"
        state: stopped
        enabled: no
        masked: yes
      with_items: "{{ running_osds.stdout_lines | default([])}}"
      when: running_osds != []


I may be wrong about the cause above, but the customer is still struggling, so please find additional information about the customer's environment in comment #1.

The customer says that the Ceph environment itself is running fine, but is worried about the failed systemd units. Please feel free to adjust the severity if this problem is purely cosmetic.
Comment 4 leseb 2018-05-16 08:28:29 EDT
I don't think this is a real issue; we simply don't change the ceph-disk unit files. We probably should disable them too. However, this does not affect the ceph-osd@XXX.service units, and the cluster should be fine.
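Something along these lines in the switch playbook should cover it. This is only a sketch; the lookup command and the task names are illustrative, not the actual patch (see the linked pull request for the real change):

    # Sketch only: find and mask the leftover ceph-disk unit instances.
    # The lookup command and all names here are illustrative assumptions.
    - name: collect leftover ceph-disk unit instances
      shell: systemctl list-units --all --no-legend 'ceph-disk@*' | awk '{print $1}'
      register: ceph_disk_units
      changed_when: false

    - name: stop and mask leftover ceph-disk units
      systemd:
        name: "{{ item }}"
        state: stopped
        enabled: no
        masked: yes
      with_items: "{{ ceph_disk_units.stdout_lines | default([]) }}"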
Comment 8 Guillaume Abrioux 2018-05-22 07:40:04 EDT
fix will be in v3.1.0rc4 and v3.0.35
Comment 9 Alex Stupnikov 2018-05-22 07:45:04 EDT
Guillaume, will Red Hat ship those ceph-ansible versions with RHCS 2.5?

BR, Alex
Comment 14 Guillaume Abrioux 2018-05-29 09:23:30 EDT
fixed in v3.1.0rc4
Comment 26 leseb 2018-06-07 22:12:48 EDT
Edu, the fix is in 2.5z1 (see https://bugzilla.redhat.com/show_bug.cgi?id=1577846#c10). I'm not sure how I can assist further.
Please let me know.
Comment 28 leseb 2018-06-08 04:01:09 EDT
I'm not sure how I can help here; the only thing I can tell you is that the patch is present in v3.0.35 and above.

For questions about getting the fix out faster, or about the release date, please ask Ken.
Thanks
Comment 31 leseb 2018-06-12 02:28:57 EDT
Done.
Comment 32 Yogev Rabl 2018-06-26 08:06:28 EDT
Verified on rc9
Comment 34 errata-xmlrpc 2018-07-11 14:11:10 EDT
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHSA-2018:2177
