Bug 1868083

Summary: [upgrade] ovnkube-node rollout hung on 4.5.5 -> 4.6 upgrade with rhel workers
Product: OpenShift Container Platform
Reporter: Anurag saxena <anusaxen>
Component: Networking
Assignee: Alexander Constantinescu <aconstan>
Networking sub component: ovn-kubernetes
QA Contact: Anurag saxena <anusaxen>
Status: CLOSED DUPLICATE
Severity: medium
Priority: medium
CC: aconstan, bbennett
Version: 4.6
Target Milestone: ---
Target Release: 4.6.0
Hardware: Unspecified
OS: Unspecified
Last Closed: 2020-08-25 15:49:16 UTC
Type: Bug

Comment 1 Anurag saxena 2020-08-13 13:53:28 UTC
Seems to be a clone of https://bugzilla.redhat.com/show_bug.cgi?id=1868259

Comment 2 Alexander Constantinescu 2020-08-13 13:55:38 UTC
@Anurag

W.r.t. comment 1: I wanted to ask you exactly that, i.e. what is the difference between the two bugs? If they are clones, could you close this one?

/Alex

Comment 3 Anurag saxena 2020-08-13 14:18:45 UTC
@Alex yeah, the difference is the environment. This one is on RHEL worker nodes while the other is on RHCOS, but the root cause seems to be the same as per the logs.

Comment 4 Alexander Constantinescu 2020-08-13 16:28:32 UTC
@Anurag

Are you able to connect to the node hosting the ovnkube-node-np7jn pod and run:

journalctl -u ovs-configuration.service > journal-ovs-configuration.logs

Essentially, https://bugzilla.redhat.com/show_bug.cgi?id=1868259 is missing the system logs needed to figure out the problem, so if you can get those using this cluster, that would be great.

We need the journal for ovs-configuration.service specifically, but the cluster in https://bugzilla.redhat.com/show_bug.cgi?id=1868259#c4 is in very bad shape and does not help... I am hoping your cluster might be in a better one :)
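
For reference, a minimal sketch of one way to collect that journal without SSH access, assuming oc debug works against the node and that the ovnkube-node pods live in the openshift-ovn-kubernetes namespace (the namespace and the <node-name> placeholder are assumptions, not taken from this bug):

    # Find which node hosts the ovnkube-node-np7jn pod
    # (openshift-ovn-kubernetes is the assumed namespace)
    oc get pod ovnkube-node-np7jn -n openshift-ovn-kubernetes -o wide

    # Dump the unit's journal from a debug shell on that node;
    # replace <node-name> with the NODE column from the command above
    oc debug node/<node-name> -- chroot /host \
        journalctl -u ovs-configuration.service > journal-ovs-configuration.logs

The redirection runs in the local shell, so the journal contents end up in journal-ovs-configuration.logs on the machine running oc.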

/Alex

Comment 5 Anurag saxena 2020-08-17 15:57:58 UTC
Yeah, this is connected to BZ 1868259. Tim and I did some investigation last week.

Comment 6 Anurag saxena 2020-08-18 13:44:04 UTC
@Alex changing the target release as it's the same as BZ 1868259. Please correct me if I'm wrong :)

Comment 7 Alexander Constantinescu 2020-08-25 15:49:16 UTC
Closing this as it seems it was a dupe of 1868259, and that bug has been verified.

*** This bug has been marked as a duplicate of bug 1868259 ***