Bug 1761745

Summary: [wmcb] ROLES label missing when wmcb bootstraps a Windows node
Product: OpenShift Container Platform
Component: Windows Containers
Version: 4.4
Target Release: 4.4.0
Reporter: gaoshang <sgao>
Assignee: ravig <rgudimet>
QA Contact: gaoshang <sgao>
CC: aos-bugs, aravindh, gmarkley, rgudimet, xtian
Status: CLOSED ERRATA
Severity: medium
Priority: unspecified
Hardware: Unspecified
OS: Unspecified
Type: Bug
Last Closed: 2020-05-13 21:52:14 UTC

Description gaoshang 2019-10-15 09:17:28 UTC
Description of problem:
After bootstrapping a Windows node with wmcb, the node's ROLES label is <none>; it should be "worker".

Version-Release number of selected component (if applicable):


How reproducible:
always

Steps to Reproduce:
1. Create a Windows instance with wni and run the WSU Ansible playbook
2. Run wmcb.exe on the Windows instance
PS > .\wmcb.exe run --ignition-file .\worker.ign --kubelet-path .\kubelet.exe
{"level":"info","ts":1571004543.2873032,"logger":"wmcb","msg":"Bootstrapping completed successfully"}
3. Approve the pending CSRs
4. Check node status: the Windows node's ROLES is <none>, but it should be worker
# oc get nodes
NAME                                         STATUS   ROLES    AGE   VERSION
...
ip-10-0-165-234.us-east-2.compute.internal   Ready    worker   11h   v1.14.6+c07e432da
ip-10-0-30-236.us-east-2.compute.internal    Ready    <none>   35s   v1.14.0
5. Check node labels: compared with a RHEL node, the Windows node is missing the label "node-role.kubernetes.io/worker="
# oc get node --show-labels
NAME                                         STATUS   ROLES    AGE     VERSION             LABELS
ip-10-0-159-106.us-east-2.compute.internal   Ready    worker   38h     v1.14.6+c07e432da   beta.kubernetes.io/arch=amd64,beta.kubernetes.io/instance-type=m4.large,beta.kubernetes.io/os=linux,failure-domain.beta.kubernetes.io/region=us-east-2,failure-domain.beta.kubernetes.io/zone=us-east-2b,kubernetes.io/arch=amd64,kubernetes.io/hostname=ip-10-0-159-106,kubernetes.io/os=linux,node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhcos
ip-10-0-16-84.us-east-2.compute.internal     Ready    <none>   9m49s   v1.14.0             beta.kubernetes.io/arch=amd64,beta.kubernetes.io/instance-type=m4.large,beta.kubernetes.io/os=windows,failure-domain.beta.kubernetes.io/region=us-east-2,failure-domain.beta.kubernetes.io/zone=us-east-2b,kubernetes.io/arch=amd64,kubernetes.io/hostname=ec2amaz-9qi7k4d,kubernetes.io/os=windows
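The gap between the two nodes' label sets in step 5 can be checked mechanically. Below is a minimal sketch (the helper names are hypothetical; the label strings are abbreviated from the `oc get node --show-labels` output above) that diffs a RHEL worker's labels against the Windows node's and reports the role label wmcb failed to apply:

```python
# Sketch: diff two nodes' label sets to find role labels missing on the
# Windows node. Helper names are hypothetical; label data is abbreviated
# from the `oc get node --show-labels` output above.

def parse_labels(label_str):
    """Turn 'k1=v1,k2=,k3=v3' (oc --show-labels format) into a dict."""
    labels = {}
    for pair in label_str.split(","):
        key, _, value = pair.partition("=")
        labels[key] = value
    return labels

def missing_role_labels(reference, node):
    """Role labels present on the reference node but absent on `node`."""
    return {k: v for k, v in reference.items()
            if k.startswith("node-role.kubernetes.io/") and k not in node}

rhel = parse_labels(
    "kubernetes.io/os=linux,node-role.kubernetes.io/worker=,"
    "node.openshift.io/os_id=rhcos"
)
windows = parse_labels("kubernetes.io/os=windows")

print(missing_role_labels(rhel, windows))
# {'node-role.kubernetes.io/worker': ''}
```

As a manual workaround until the fix lands, the label can be applied by hand with `oc label node <node-name> node-role.kubernetes.io/worker=`.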


Actual results:
The Windows node's ROLES column shows <none>, and the node is missing the "node-role.kubernetes.io/worker=" label.

Expected results:
The Windows node's ROLES column shows "worker", and the node carries the "node-role.kubernetes.io/worker=" label like the RHEL worker nodes.

Additional info:

Comment 4 gaoshang 2019-12-10 03:37:26 UTC
This bug has been verified and passed on 4.3.0-0.nightly-2019-12-08-215349; moving status to VERIFIED. Thanks.

Version-Release number of selected component (if applicable):
# oc get clusterversion
NAME      VERSION                             AVAILABLE   PROGRESSING   SINCE   STATUS
version   4.3.0-0.nightly-2019-12-08-215349   True        False         22h     Cluster version is 4.3.0-0.nightly-2019-12-08-215349

Steps:
1. Install OCP 4.3 and configure the OVNKubernetes network
2. Install a Windows instance
3. Bootstrap the Windows node with the WSU Ansible playbook and check that the Windows node's labels are correct.

ansible-playbook -i hosts windows-machine-config-operator/tools/ansible/tasks/wsu/main.yaml -v
oc get nodes
NAME                                         STATUS   ROLES    AGE     VERSION
sgao-devpreview-6lqtw-master-0               Ready    master   4h7m    v1.16.2
sgao-devpreview-6lqtw-master-1               Ready    master   4h7m    v1.16.2
sgao-devpreview-6lqtw-master-2               Ready    master   4h7m    v1.16.2
sgao-devpreview-6lqtw-worker-eastus1-h74dc   Ready    worker   3h57m   v1.16.2
sgao-devpreview-6lqtw-worker-eastus2-tfwlq   Ready    worker   3h57m   v1.16.2
sgao-devpreview-6lqtw-worker-eastus3-rgrg8   Ready    worker   3h56m   v1.16.2
winnode                                      Ready    worker   64s     v1.16.2
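The check in step 3 can be automated by scanning the `oc get nodes` table for nodes whose ROLES column is still "<none>". A minimal sketch (hypothetical helper name; the sample rows are abbreviated from the verification output above):

```python
# Sketch: flag nodes whose ROLES column is "<none>" in `oc get nodes`
# output. The helper name is hypothetical; sample rows are abbreviated
# from the verification output above.

def nodes_without_role(oc_get_nodes_output):
    """Return names of nodes whose ROLES column is '<none>'."""
    flagged = []
    for line in oc_get_nodes_output.strip().splitlines()[1:]:  # skip header
        name, status, roles = line.split()[:3]
        if roles == "<none>":
            flagged.append(name)
    return flagged

sample = """\
NAME STATUS ROLES AGE VERSION
sgao-devpreview-6lqtw-master-0 Ready master 4h7m v1.16.2
winnode Ready worker 64s v1.16.2
"""
print(nodes_without_role(sample))
# [] -- after the fix, no node is left unlabeled
```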

Comment 7 errata-xmlrpc 2020-05-13 21:52:14 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2020:0581