Bug 1622014
| Summary: | Starting a pipeline build from Jenkins UI or Jenkins triggers could result in duplicate build parameters. | ||
|---|---|---|---|
| Product: | OpenShift Container Platform | Reporter: | Yuxiang Zhu <yuxzhu> |
| Component: | ImageStreams | Assignee: | Gabe Montero <gmontero> |
| Status: | CLOSED ERRATA | QA Contact: | wewang <wewang> |
| Severity: | unspecified | Docs Contact: | |
| Priority: | unspecified | ||
| Version: | 3.11.0 | CC: | aos-bugs, bparees, gmontero, jokerman, mmccomas, wewang, wzheng |
| Target Milestone: | --- | ||
| Target Release: | 3.11.0 | ||
| Hardware: | All | ||
| OS: | All | ||
| Whiteboard: | |||
| Fixed In Version: | | Doc Type: | If docs needed, set a value |
| Doc Text: | | Story Points: | --- |
| Clone Of: | | Environment: | |
| Last Closed: | 2018-10-11 07:25:54 UTC | Type: | Bug |
| Regression: | --- | Mount Type: | --- |
| Documentation: | --- | CRM: | |
| Verified Versions: | | Category: | --- |
| oVirt Team: | --- | RHEL 7.3 requirements from Atomic Host: | |
| Cloudforms Team: | --- | Target Upstream Version: | |
| Embargoed: | |||
Description
Yuxiang Zhu
2018-08-24 07:28:31 UTC
PR https://github.com/openshift/jenkins-sync-plugin/pull/254 from @Yuxiang Zhu has merged. I've initiated v1.0.26 of the sync plugin with that change. PR https://github.com/openshift/jenkins/pull/685 and the distgit job https://buildvm.openshift.eng.bos.redhat.com:8443/job/devex/job/devex%252Fjenkins-plugins/62/ pull the sync plugin fix into the CentOS and RHEL images, respectively. I will move this to ON_QA when I see a brew-pulp image with the new sync plugin version.

Image brew-pulp-docker01.web.prod.ext.phx2.redhat.com:8888/openshift3/jenkins-2-rhel7:v3.11.0.20180828.200945 has v1.0.26 of the sync plugin. Moving to ON_QA.

Are you sure that's a pod log from the slave? I don't know why the slave would be running sync-plugin logic. Also, the error you got sounds like a permissions problem with your default service account. The sync plugin is expected to have edit permission for the namespace it operates against. Your pod is running with the "default" service account, and it appears it does not have permission to view ConfigMaps, which would normally be part of the edit permission (see the permission sketch after this comment). Can you provide your full recreate steps, as well as your slave pod template definition and the slave pod YAML?

To answer your original question: yes, registry.dev.redhat.io/openshift3/jenkins-2-rhel7:v3.11 has the v1.0.26 plugin. registry.dev.redhat.io uses the same storage as brew, so the images will be identical.

@Ben Parees, sorry, my misunderstanding. The bug has already been verified with:

registry.dev.xxx/openshift3/jenkins-2-rhel7 ea444244e521
registry.dev.xxx/openshift3/jenkins-slave-base-rhel7 126b00ef54fe

Thanks @Yuxiang Zhu
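As a minimal sketch of the permissions fix described above (not taken from this bug report): grant the edit role to the "default" service account in the project the agent pod runs in. The project name `bug` is assumed from the build console output below and may differ in your environment.

```sh
# Sketch only: give the "default" service account edit access so the sync plugin /
# agent pod can read ConfigMaps in its namespace. Replace "bug" with your project.
oc policy add-role-to-user edit -z default -n bug

# Optional sanity check: can the service account list ConfigMaps?
oc auth can-i list configmaps -n bug --as=system:serviceaccount:bug:default
```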
Steps:
1. Launch OpenShift with the latest Jenkins version.
2. Using the file from pull/254, replace docker.io/openshift/jenkins-slave-base-centos7:latest with registry.dev.xxx/openshift3/jenkins-slave-base-rhel7:latest.
3. Create the BuildConfig from the file above:
$ oc create -f XXX
4. Start a pipeline build from the Jenkins job.
5. Check the "This build requires parameters:" page, confirm the parameters are not duplicated, and start a build.
6. The build completes successfully (a command sketch for steps 2-3 and the build console output follow below).
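For illustration only, steps 2 and 3 might be run like this on the command line. The file name demo-pipeline-bc.yaml is a placeholder for the file from pull/254, and the project name bug is assumed from the console output below.

```sh
# Step 2: placeholder file name; substitute the actual file from pull/254.
sed -i 's#docker.io/openshift/jenkins-slave-base-centos7:latest#registry.dev.xxx/openshift3/jenkins-slave-base-rhel7:latest#' demo-pipeline-bc.yaml

# Step 3: create the pipeline BuildConfig in the project (assumed to be "bug").
oc create -f demo-pipeline-bc.yaml -n bug
```

The console output of the verification build follows.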
```
OpenShift Build bug/demo-pipeline-1
Started by user wewang
[Pipeline] podTemplate
[Pipeline] {
[Pipeline] node
Still waiting to schedule task
Jenkins doesn’t have label jenkins-slave-40249bxxxx86-c545a9db3b9a
Agent jenkins-slave-40249bbe-c6a9-4dc1-xxx5a9db3b9a-2g33z-sjbfx is provisioned from template Kubernetes Pod Template
Agent specification [Kubernetes Pod Template] (jenkins-slave-xxxc6a9-4dc1-8f86-c545a9db3b9a):
Running on jenkins-slave-40249bbe-c6a9-4dc1-8f86-c545a9db3b9a-2g33z-sjbfx in /home/jenkins/workspace/bug/bug-demo-pipeline
[Pipeline] {
[Pipeline] container
[Pipeline] {
[Pipeline] stage
[Pipeline] { (foo)
[Pipeline] echo
Hello, openshift
[Pipeline] }
[Pipeline] // stage
[Pipeline] }
[Pipeline] // container
[Pipeline] }
[Pipeline] // node
[Pipeline] }
[Pipeline] // podTemplate
[Pipeline] End of Pipeline
Finished: SUCCESS
```
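As an optional extra check, not part of the original verification steps: pipeline build parameters typically surface as env entries under the build's JenkinsPipelineStrategy, so the resulting OpenShift build object can also be inspected for repeated names. The build and project names below come from the console output above.

```sh
# Sketch only: print the env list of the pipeline build and look for duplicate names.
oc get build demo-pipeline-1 -n bug \
  -o jsonpath='{.spec.strategy.jenkinsPipelineStrategy.env}'
```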
Since the problem described in this bug report should be resolved in a recent advisory, it has been closed with a resolution of ERRATA. For information on the advisory, and where to find the updated files, follow the link below. If the solution does not work for you, open a new bug report. https://access.redhat.com/errata/RHBA-2018:2652