Bug 1761745 - [wmcb] ROLES label missing when wmcb bootstraps a Windows node
Summary: [wmcb] ROLES label missing when wmcb bootstraps a Windows node
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: OpenShift Container Platform
Classification: Red Hat
Component: Windows Containers
Version: 4.4
Hardware: Unspecified
OS: Unspecified
Priority: unspecified
Severity: medium
Target Milestone: ---
Target Release: 4.4.0
Assignee: ravig
QA Contact: gaoshang
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2019-10-15 09:17 UTC by gaoshang
Modified: 2020-05-13 21:52 UTC
CC List: 5 users

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Environment:
Last Closed: 2020-05-13 21:52:14 UTC
Target Upstream Version:


Attachments


Links
- GitHub: openshift/windows-machine-config-operator pull 87 ("Bug 1761745: [wmcb] Add worker label to the Windows Node", closed), last updated 2020-05-14 02:11:11 UTC
- Red Hat Product Errata: RHBA-2020:0581, last updated 2020-05-13 21:52:15 UTC

Description gaoshang 2019-10-15 09:17:28 UTC
Description of problem:
After bootstrapping a Windows node with wmcb, the ROLES label is <none>; it should be "worker".

Version-Release number of selected component (if applicable):


How reproducible:
always

Steps to Reproduce:
1. Create a Windows instance with wni and run the wsu Ansible playbook
2. Run wmcb.exe on the Windows instance
PS > .\wmcb.exe run --ignition-file .\worker.ign --kubelet-path .\kubelet.exe
{"level":"info","ts":1571004543.2873032,"logger":"wmcb","msg":"Bootstrapping completed successfully"}
3. Approve the CSRs
4. Check node status: the Windows node's ROLES is <none>, but it should be worker
# oc get nodes
NAME                                         STATUS   ROLES    AGE   VERSION
...
ip-10-0-165-234.us-east-2.compute.internal   Ready    worker   11h   v1.14.6+c07e432da
ip-10-0-30-236.us-east-2.compute.internal    Ready    <none>   35s   v1.14.0
5. Check node labels: compared with the Linux (RHCOS) node, the Windows node is missing the label "node-role.kubernetes.io/worker="
# oc get node --show-labels
NAME                                         STATUS   ROLES    AGE     VERSION             LABELS
ip-10-0-159-106.us-east-2.compute.internal   Ready    worker   38h     v1.14.6+c07e432da   beta.kubernetes.io/arch=amd64,beta.kubernetes.io/instance-type=m4.large,beta.kubernetes.io/os=linux,failure-domain.beta.kubernetes.io/region=us-east-2,failure-domain.beta.kubernetes.io/zone=us-east-2b,kubernetes.io/arch=amd64,kubernetes.io/hostname=ip-10-0-159-106,kubernetes.io/os=linux,node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhcos
ip-10-0-16-84.us-east-2.compute.internal     Ready    <none>   9m49s   v1.14.0             beta.kubernetes.io/arch=amd64,beta.kubernetes.io/instance-type=m4.large,beta.kubernetes.io/os=windows,failure-domain.beta.kubernetes.io/region=us-east-2,failure-domain.beta.kubernetes.io/zone=us-east-2b,kubernetes.io/arch=amd64,kubernetes.io/hostname=ec2amaz-9qi7k4d,kubernetes.io/os=windows
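The comparison in step 5 can be scripted. A minimal sketch, using sample lines abridged from the output above; on a live cluster the input would instead come from `oc get nodes --show-labels --no-headers`:

```shell
# Sample `oc get nodes --show-labels --no-headers` output, abridged from this
# bug report (labels shortened for readability).
nodes='ip-10-0-159-106.us-east-2.compute.internal Ready worker 38h v1.14.6+c07e432da kubernetes.io/os=linux,node-role.kubernetes.io/worker=
ip-10-0-16-84.us-east-2.compute.internal Ready <none> 9m49s v1.14.0 kubernetes.io/os=windows'

# The LABELS column is field 6; print the name of any node whose labels do
# not contain the worker role label.
missing=$(printf '%s\n' "$nodes" | awk '$6 !~ /node-role\.kubernetes\.io\/worker=/ { print $1 }')
echo "$missing"
```

This prints the Windows node's name, matching the report: it is the only node without the worker role label.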


Actual results:
The Windows node's ROLES column shows <none> and the node-role.kubernetes.io/worker= label is absent.

Expected results:
The Windows node carries the node-role.kubernetes.io/worker= label and its ROLES column shows worker.

Additional info:
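Until the bootstrap fix lands, the missing label can be applied by hand with a standard `oc label` invocation (a manual workaround sketch, not the wmcb fix itself, which adds the label during bootstrap; the node name is the one from the reproduction above):

```shell
# Manual workaround: apply the empty-valued worker role label to the node.
oc label node ip-10-0-16-84.us-east-2.compute.internal node-role.kubernetes.io/worker=
```

After this, `oc get nodes` shows the node's ROLES column as worker.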

Comment 4 gaoshang 2019-12-10 03:37:26 UTC
This bug has been verified and passed on 4.3.0-0.nightly-2019-12-08-215349; moving status to VERIFIED, thanks.

Version-Release number of selected component (if applicable):
# oc get clusterversion
NAME      VERSION                             AVAILABLE   PROGRESSING   SINCE   STATUS
version   4.3.0-0.nightly-2019-12-08-215349   True        False         22h     Cluster version is 4.3.0-0.nightly-2019-12-08-215349

Steps:
1. Install OCP 4.3 and configure the OVNKubernetes network
2. Install a Windows instance
3. Bootstrap the Windows node with the wsu Ansible playbook and check that the node labels are correct

ansible-playbook -i hosts windows-machine-config-operator/tools/ansible/tasks/wsu/main.yaml -v
oc get nodes
NAME                                         STATUS   ROLES    AGE     VERSION
sgao-devpreview-6lqtw-master-0               Ready    master   4h7m    v1.16.2
sgao-devpreview-6lqtw-master-1               Ready    master   4h7m    v1.16.2
sgao-devpreview-6lqtw-master-2               Ready    master   4h7m    v1.16.2
sgao-devpreview-6lqtw-worker-eastus1-h74dc   Ready    worker   3h57m   v1.16.2
sgao-devpreview-6lqtw-worker-eastus2-tfwlq   Ready    worker   3h57m   v1.16.2
sgao-devpreview-6lqtw-worker-eastus3-rgrg8   Ready    worker   3h56m   v1.16.2
winnode                                      Ready    worker   64s     v1.16.2

Comment 7 errata-xmlrpc 2020-05-13 21:52:14 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2020:0581

