Bug 1358185 - Failed to launch slave pod when using custom slave image with Jenkins as S2I builder
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: OpenShift Container Platform
Classification: Red Hat
Component: Image
Version: 3.3.0
Hardware: Unspecified
OS: Unspecified
Priority: medium
Severity: medium
Target Milestone: ---
Assignee: Ben Parees
QA Contact: Wang Haoran
URL:
Whiteboard:
Depends On:
Blocks:
Reported: 2016-07-20 09:21 UTC by shiyang.wang
Modified: 2017-03-08 18:26 UTC (History)
5 users

Fixed In Version:
Doc Type: No Doc Update
Doc Text:
Clone Of:
Environment:
Last Closed: 2016-09-27 09:41:14 UTC
Target Upstream Version:


Attachments: None


Links
System ID Priority Status Summary Last Updated
Red Hat Product Errata RHBA-2016:1933 normal SHIPPED_LIVE Red Hat OpenShift Container Platform 3.3 Release Advisory 2016-09-27 13:24:36 UTC

Description shiyang.wang 2016-07-20 09:21:07 UTC
Description of problem:
Build a custom slave image and use Jenkins as the S2I builder. After triggering a job in the Jenkins web console, the slave pod fails to start with the error below:

Jul 19, 2016 11:27:50 PM hudson.remoting.jnlp.Main$CuiListener status
INFO: Connecting to 172.30.166.34:50000 (retrying:3)
java.net.NoRouteToHostException: No route to host
        at java.net.PlainSocketImpl.socketConnect(Native Method)
        at java.net.AbstractPlainSocketImpl.doConnect(AbstractPlainSocketImpl.java:350)
        at java.net.AbstractPlainSocketImpl.connectToAddress(AbstractPlainSocketImpl.java:206)
        at java.net.AbstractPlainSocketImpl.connect(AbstractPlainSocketImpl.java:188)
        at java.net.SocksSocketImpl.connect(SocksSocketImpl.java:392)
        at java.net.Socket.connect(Socket.java:589)
        at java.net.Socket.connect(Socket.java:538)
        at hudson.remoting.Engine.connect(Engine.java:383)
        at hudson.remoting.Engine.run(Engine.java:298)


Version-Release number of selected component (if applicable):
openshift3/jenkins-1-rhel7 (imageid 1e3f32968b5a)
openshift v3.3.0.6
kubernetes v1.3.0+57fb9ac

How reproducible:
always

Steps to Reproduce:
1. Create a new project and add the edit role.
2. Build a new custom slave image:
$ oc new-app  https://raw.githubusercontent.com/openshift/origin/master/examples/jenkins/master-slave/jenkins-slave-template.json  -p IMAGE_NAME=$registry/rhscl/ruby-22-rhel7:latest,IMAGE_STREAM_NAME=ruby22,SLAVE_REPO_URL=https://github.com/xiuwang/jenkins-slave-rhel7repo,SLAVE_REPO_CONTEXTDIR=
3. Create the Jenkins master pod, which includes the custom slave image label:
$ oc new-app  -f https://raw.githubusercontent.com/openshift/origin/master/examples/jenkins/master-slave/jenkins-master-template.json
4. Log in to the Jenkins web console, go to the ruby-hello-world-test project configuration page, and change it to use the ruby22 label.
5. Start a build in the Jenkins console.
6. Check the slave pod.
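The NoRouteToHostException in the slave log means the JNLP agent cannot reach the master on port 50000. A minimal diagnostic sketch for step 6, assuming the master template creates a service for the JNLP port (the service name "jenkins-jnlp" and the project name are assumptions, not taken from this report):

```shell
# Hypothetical diagnostic commands; service/pod names are assumptions.
# Check whether a service actually exposes the JNLP port (50000):
oc get svc jenkins-jnlp -o jsonpath='{.spec.ports[*].port}'

# Watch the slave pod status and inspect its log for connection retries:
oc get pods -l jenkins=slave
oc logs <slave-pod-name>
```

If no service exposes port 50000, the slave's repeated "Connecting to ...:50000 (retrying:N)" attempts will fail exactly as shown in the trace above.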

Actual results:
Build #1 is stuck:
(pending—All nodes of label ‘ruby22’ are offline)
and the error described above appears in the slave pod logs.

Expected results:

The slave pod should start without errors, and the build should complete successfully.


Additional info:

Comment 1 Ben Parees 2016-07-20 17:21:18 UTC
PR here: https://github.com/openshift/origin/pull/9956

Comment 2 shiyang.wang 2016-07-22 09:08:02 UTC
After testing, this is fixed on Origin; it will be verified once the image is pushed to the Brew registry.

Comment 3 Troy Dawson 2016-08-01 22:21:44 UTC
This has been merged and is in OSE v3.3.0.13 or newer.

Comment 4 shiyang.wang 2016-08-03 02:42:42 UTC
Tested with brew-pulp-docker01.web.prod.ext.phx2.redhat.com:8888/rhscl/ruby-22-rhel7:latest; the bug can no longer be reproduced. Moving this bug to VERIFIED.

Comment 6 errata-xmlrpc 2016-09-27 09:41:14 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHBA-2016:1933

