Bug 2229099 - ceph orch host drain <host_ip> cmd needs to report an error, as it is not processed.
Summary: ceph orch host drain <host_ip> cmd needs to report an error, as it is not processed.
Keywords:
Status: NEW
Alias: None
Product: Red Hat Ceph Storage
Classification: Red Hat Storage
Component: RADOS
Version: 6.1
Hardware: Unspecified
OS: Linux
Priority: unspecified
Severity: low
Target Milestone: ---
Target Release: 7.1
Assignee: Adam King
QA Contact: Pawan
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2023-08-04 07:23 UTC by sumr
Modified: 2023-08-11 04:28 UTC
CC List: 6 users

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed:
Embargoed:




Links
System                 ID           Private  Priority  Status  Summary  Last Updated
Red Hat Issue Tracker  RHCEPH-7159  0        None      None    None     2023-08-04 07:25:49 UTC

Description sumr 2023-08-04 07:23:31 UTC
Description of problem:

To remove a host from the Ceph cluster, ran 'ceph orch host drain <host_ip>'. The command should return an error stating that ADDR is not supported and that the HOSTNAME must be used. Instead, it exits without any error and prints output that misleadingly suggests the drain has started.
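
The empty table shown under 'Actual results' below is consistent with the drain path simply filtering the daemon list by the supplied host string: an IP address matches no daemon's hostname, so nothing is selected, yet the 'Scheduled ...' header is still printed. A minimal Python sketch of that presumed mechanism (illustrative only, not the actual cephadm code; the daemon list is a subset of the 'ceph orch ps' output under 'Additional info'):

# Minimal sketch of the presumed mechanism (illustrative only, not the actual
# cephadm code): the drain path filters the daemon list by the supplied host
# string; an IP address matches no daemon's hostname, so the table is empty,
# but the 'Scheduled ...' header is printed unconditionally.

daemons = [
    # (type, id, hostname) -- subset of the 'ceph orch ps' output below
    ('osd', '2', 'ceph-sumar-guxo8h-node3'),
    ('osd', '3', 'ceph-sumar-guxo8h-node3'),
    ('crash', 'ceph-sumar-guxo8h-node3', 'ceph-sumar-guxo8h-node3'),
]

def schedule_drain(host):
    matched = [(t, i) for t, i, h in daemons if h == host]
    print("Scheduled to remove the following daemons from host '%s'" % host)
    print('%-20s %-15s' % ('type', 'id'))
    print('-' * 20 + ' ' + '-' * 15)
    for t, i in matched:
        print('%-20s %-15s' % (t, i))

schedule_drain('10.0.210.182')              # header + empty table, as reported
schedule_drain('ceph-sumar-guxo8h-node3')   # header + that host's daemons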

Version-Release number of selected component (if applicable):
Container image build - ceph-6.1-rhel-9-containers-candidate-85059-20230715042303


How reproducible: Reproducible.


Steps to Reproduce:
1. Set up a 3-node Ceph cluster and add OSDs across the nodes
2. Perform a host drain using the host IP: 'ceph orch host drain 10.0.210.182'

Actual results:

[ceph: root@ceph-sumar-guxo8h-node1-installer /]# ceph orch host drain  10.0.210.182
Scheduled to remove the following daemons from host '10.0.210.182'
type                 id             
-------------------- ---------------
[ceph: root@ceph-sumar-guxo8h-node1-installer /]# 
[ceph: root@ceph-sumar-guxo8h-node1-installer /]# ceph orch osd rm status
No OSD remove/replace operations reported

Expected results:

When the host drain command is run with the hostname, it works as expected:

[ceph: root@ceph-sumar-guxo8h-node1-installer /]# ceph orch host drain ceph-sumar-guxo8h-node3
Scheduled to remove the following daemons from host 'ceph-sumar-guxo8h-node3'
type                 id             
-------------------- ---------------
ceph-exporter        ceph-sumar-guxo8h-node3
crash                ceph-sumar-guxo8h-node3
node-exporter        ceph-sumar-guxo8h-node3
osd                  2              
osd                  3              
mds                  cephfs_test.ceph-sumar-guxo8h-node3.ocpvex
mgr                  ceph-sumar-guxo8h-node3.huwmtu
mds                  cephfs_test.ceph-sumar-guxo8h-node3.pqbbsd
[ceph: root@ceph-sumar-guxo8h-node1-installer /]#

[ceph: root@ceph-sumar-guxo8h-node1-installer /]# ceph orch osd rm status
OSD  HOST                     STATE    PGS  REPLACE  FORCE  ZAP    DRAIN STARTED AT  
0    ceph-sumar-guxo8h-node2  started   59  False    False  False                    
1    ceph-sumar-guxo8h-node2  started   97  False    False  False                    
2    ceph-sumar-guxo8h-node3  started   68  False    False  False                    
3    ceph-sumar-guxo8h-node3  started   74  False    False  False  


>>> If host drain by IP_ADDR is not supported, the command should report an error on execution instead of exiting gracefully with the output below (a sketch of such a check follows the sample output):

Scheduled to remove the following daemons from host '10.0.210.182'
type                 id             
-------------------- ---------------
[ceph: root@ceph-sumar-guxo8h-node1-installer /]# 
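
A minimal sketch of the kind of up-front validation that would produce such an error, assuming the drain handler has access to the HOST-to-ADDR inventory shown under 'Additional info'. The names drain_host, inventory and the exact error text are illustrative placeholders, not the actual cephadm implementation:

# Minimal sketch of up-front validation (illustrative placeholders, not the
# actual cephadm implementation): reject anything that is not a registered
# hostname -- e.g. a bare IP address -- before scheduling any removals.

class OrchestratorError(Exception):
    """Raised when an orchestrator request cannot be processed."""

def drain_host(inventory, hostname):
    if hostname not in inventory:
        known = ', '.join(sorted(inventory))
        raise OrchestratorError(
            "host '%s' not found in inventory; ADDR is not supported, "
            "use one of the registered hostnames: %s" % (hostname, known))
    return "Scheduled to remove the following daemons from host '%s'" % hostname

inventory = {
    'ceph-sumar-guxo8h-node1-installer': '10.0.208.168',
    'ceph-sumar-guxo8h-node2': '10.0.209.2',
    'ceph-sumar-guxo8h-node3': '10.0.210.182',
}

try:
    drain_host(inventory, '10.0.210.182')   # the misuse reported above
except OrchestratorError as exc:
    print('Error: %s' % exc)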


Additional info:

[ceph: root@ceph-sumar-guxo8h-node1-installer /]# ceph orch host ls
HOST                               ADDR          LABELS                          STATUS  
ceph-sumar-guxo8h-node1-installer  10.0.208.168  _admin admin_installer mon_mgr          
ceph-sumar-guxo8h-node2            10.0.209.2    mon_mgr_osd                             
ceph-sumar-guxo8h-node3            10.0.210.182  osd  
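
The ADDR-to-HOST mapping above already contains everything needed to translate an address to the registered hostname before draining. A small, hypothetical helper (not part of any Ceph tooling) illustrating that lookup:

# Hypothetical helper (not part of any Ceph tooling): map an ADDR from
# 'ceph orch host ls' back to the registered HOST before running
# 'ceph orch host drain <hostname>'.

def hostname_for_addr(inventory, addr):
    for hostname, host_addr in inventory.items():
        if host_addr == addr:
            return hostname
    raise ValueError('no registered host has address %s' % addr)

inventory = {
    'ceph-sumar-guxo8h-node1-installer': '10.0.208.168',
    'ceph-sumar-guxo8h-node2': '10.0.209.2',
    'ceph-sumar-guxo8h-node3': '10.0.210.182',
}

print(hostname_for_addr(inventory, '10.0.210.182'))   # ceph-sumar-guxo8h-node3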

[ceph: root@ceph-sumar-guxo8h-node1-installer /]# ceph orch ps
NAME                                             HOST                               PORTS        STATUS         REFRESHED  AGE  MEM USE  MEM LIM  VERSION          IMAGE ID      CONTAINER ID  
alertmanager.ceph-sumar-guxo8h-node1-installer   ceph-sumar-guxo8h-node1-installer  *:9093,9094  running (3d)     44s ago   3d    25.3M        -  0.24.0           83cf53701c61  b86c81efa4b9  
ceph-exporter.ceph-sumar-guxo8h-node1-installer  ceph-sumar-guxo8h-node1-installer               running (3d)     44s ago   3d    16.4M        -  17.2.6-98.el9cp  584072ee7ce2  0a9118568065  
ceph-exporter.ceph-sumar-guxo8h-node2            ceph-sumar-guxo8h-node2                         running (3d)      9m ago   3d    17.6M        -  17.2.6-98.el9cp  584072ee7ce2  adb040121510  
ceph-exporter.ceph-sumar-guxo8h-node3            ceph-sumar-guxo8h-node3                         running (3d)      2m ago   3d    17.4M        -  17.2.6-98.el9cp  584072ee7ce2  42fba4d88d25  
crash.ceph-sumar-guxo8h-node1-installer          ceph-sumar-guxo8h-node1-installer               running (3d)     44s ago   3d    6891k        -  17.2.6-98.el9cp  584072ee7ce2  e841713589da  
crash.ceph-sumar-guxo8h-node2                    ceph-sumar-guxo8h-node2                         running (3d)      9m ago   3d    6891k        -  17.2.6-98.el9cp  584072ee7ce2  69da27293ea9  
crash.ceph-sumar-guxo8h-node3                    ceph-sumar-guxo8h-node3                         running (3d)      2m ago   3d    6891k        -  17.2.6-98.el9cp  584072ee7ce2  d30a12c3f399  
grafana.ceph-sumar-guxo8h-node1-installer        ceph-sumar-guxo8h-node1-installer  *:3000       running (3d)     44s ago   3d    89.7M        -  9.4.7            78fc9f90ce6a  9423da04eeb2  
mds.cephfs_test.ceph-sumar-guxo8h-node2.lihfjv   ceph-sumar-guxo8h-node2                         running (2d)      9m ago   2d    27.4M        -  17.2.6-98.el9cp  584072ee7ce2  8d600d8bb74f  
mds.cephfs_test.ceph-sumar-guxo8h-node3.ocpvex   ceph-sumar-guxo8h-node3                         running (2d)      2m ago   2d    25.9M        -  17.2.6-98.el9cp  584072ee7ce2  b5db31d21c37  
mgr.ceph-sumar-guxo8h-node1-installer.jonlmx     ceph-sumar-guxo8h-node1-installer  *:9283       running (3d)     44s ago   3d     612M        -  17.2.6-98.el9cp  584072ee7ce2  1aafbcb6a2b9  
mgr.ceph-sumar-guxo8h-node2.fxbqtp               ceph-sumar-guxo8h-node2            *:8443,9283  running (3d)      9m ago   3d     412M        -  17.2.6-98.el9cp  584072ee7ce2  8587a47aa359  
mon.ceph-sumar-guxo8h-node1-installer            ceph-sumar-guxo8h-node1-installer               running (3d)     44s ago   3d     463M    2048M  17.2.6-98.el9cp  584072ee7ce2  73416415f0fe  
mon.ceph-sumar-guxo8h-node2                      ceph-sumar-guxo8h-node2                         running (3d)      9m ago   3d     474M    2048M  17.2.6-98.el9cp  584072ee7ce2  5caf7c5a2707  
node-exporter.ceph-sumar-guxo8h-node1-installer  ceph-sumar-guxo8h-node1-installer  *:9100       running (3d)     44s ago   3d    19.4M        -  1.4.0            58105242d182  b17f18425a80  
node-exporter.ceph-sumar-guxo8h-node2            ceph-sumar-guxo8h-node2            *:9100       running (3d)      9m ago   3d    21.8M        -  1.4.0            58105242d182  873488da3e7c  
node-exporter.ceph-sumar-guxo8h-node3            ceph-sumar-guxo8h-node3            *:9100       running (3d)      2m ago   3d    19.0M        -  1.4.0            58105242d182  d07040282a69  
osd.0                                            ceph-sumar-guxo8h-node2                         running (18h)     9m ago   3d    70.0M    4096M  17.2.6-98.el9cp  584072ee7ce2  42faf02497df  
osd.1                                            ceph-sumar-guxo8h-node2                         running (3d)      9m ago   3d     102M    4096M  17.2.6-98.el9cp  584072ee7ce2  e25ae7d5a709  
osd.2                                            ceph-sumar-guxo8h-node3                         running (3d)      2m ago   3d    98.1M    4096M  17.2.6-98.el9cp  584072ee7ce2  7b52ef06c3ad  
osd.3                                            ceph-sumar-guxo8h-node3                         running (3d)      2m ago   3d     101M    4096M  17.2.6-98.el9cp  584072ee7ce2  81f27e0c0f02

