Bug 2075584 - improve clarity of build failure messages when using csi shared resources but tech preview is not enabled
Summary: improve clarity of build failure messages when using csi shared resources but...
Alias: None
Product: OpenShift Container Platform
Classification: Red Hat
Component: Build
Version: 4.11
Hardware: Unspecified
OS: Unspecified
Target Milestone: ---
: 4.11.0
Assignee: Corey Daley
QA Contact: Jitendar Singh
Depends On:
Reported: 2022-04-14 15:59 UTC by Gabe Montero
Modified: 2022-08-10 11:07 UTC
CC: 5 users

Fixed In Version:
Doc Type: If docs needed, set a value
Doc Text:
Clone Of:
Last Closed: 2022-08-10 11:07:06 UTC
Target Upstream Version:

Attachments

System ID Private Priority Status Summary Last Updated
Github openshift openshift-controller-manager pull 220 0 None open Bug 2075584: Bubble actual error message up to the build 2022-04-14 19:04:38 UTC
Github openshift origin pull 27028 0 None open Bug 2075584: Use substring match instead of exact match 2022-04-18 13:21:50 UTC
Red Hat Product Errata RHSA-2022:5069 0 None None None 2022-08-10 11:07:24 UTC

Description Gabe Montero 2022-04-14 15:59:06 UTC
Description of problem:

If a cluster has not been enabled for tech preview and one then tries to perform a build using Shared Resource CSI Driver based volumes, the 'CannotCreateBuildPod' failure reason carries only a generic message, with no details explaining the problem.

One currently has to mine the OCM controller logs to find:

E0414 14:44:10.736074       1 build_controller.go:1251] failed to create a build pod spec for build my-csi-app-namespace/my-csi-bc-1: csi volumes require the BuildCSIVolumes feature gate to be enabled

If the error string were added to the message at https://github.com/openshift/openshift-controller-manager/blob/19a7c51e45d3b09563c4042fbfaaf822ad081dbc/pkg/build/controller/build/build_controller.go#L1261, one could discover the mistake simply by dumping the build YAML.
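A minimal sketch of the suggested change, using hypothetical stand-in types (`BuildStatus`, `createBuildPodSpec`, `handleBuild` are illustrative names, not the controller's actual API): instead of setting only the generic reason, the controller would append the underlying error text to the build's status message so `oc describe build` surfaces it directly.

```go
package main

import (
	"errors"
	"fmt"
)

// BuildStatus is a hypothetical stand-in for the build's status fields.
type BuildStatus struct {
	Reason  string
	Message string
}

var errCSIFeatureGate = errors.New(
	"csi volumes require the BuildCSIVolumes feature gate to be enabled")

// createBuildPodSpec stands in for the real pod-spec construction, which
// fails when CSI volumes are requested but the feature gate is off.
func createBuildPodSpec(name string) error {
	return fmt.Errorf("failed to create a build pod spec for build %s: %w",
		name, errCSIFeatureGate)
}

// handleBuild shows the suggested fix: bubble the actual error text up
// into the status message rather than recording only a generic reason.
func handleBuild(name string) BuildStatus {
	if err := createBuildPodSpec(name); err != nil {
		return BuildStatus{
			Reason:  "CannotCreateBuildPodSpec",
			Message: fmt.Sprintf("Failed to create pod spec: %v", err),
		}
	}
	return BuildStatus{}
}

func main() {
	st := handleBuild("my-csi-app-namespace/my-csi-bc-1")
	fmt.Println(st.Reason)
	fmt.Println(st.Message)
}
```

With this, the feature-gate hint lands in the build status itself, which is what comment 4's `oc describe build` output later confirms.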

Version-Release number of selected component (if applicable):

4.11 and 4.10

How reproducible:


Steps to Reproduce:
1. Provision a 4.11/4.10 cluster without turning on tech preview. Do not use cluster-bot; provision via the installer with a personal pull secret that has Red Hat cloud credentials.
2. oc new-project my-csi-app-namespace
3. oc apply -f https://raw.githubusercontent.com/openshift/csi-driver-shared-resource/master/examples/build-with-rhel-entitlements/01-role-bc.yaml
4. oc apply -f https://raw.githubusercontent.com/openshift/csi-driver-shared-resource/master/examples/build-with-rhel-entitlements/01-rolebinding-bc.yaml
5. oc apply -f https://raw.githubusercontent.com/openshift/csi-driver-shared-resource/master/examples/build-with-rhel-entitlements/02-csi-share-bc.yaml
6. oc apply -f https://raw.githubusercontent.com/openshift/csi-driver-shared-resource/master/examples/build-with-rhel-entitlements/03-bc.yaml
7. oc start-build my-csi-bc

Actual results:

Expected results:

Additional info:

Comment 1 Gabe Montero 2022-04-14 16:02:21 UTC
Also, https://github.com/openshift/openshift-controller-manager/blob/19a7c51e45d3b09563c4042fbfaaf822ad081dbc/pkg/build/controller/build/build_controller.go#L1234 needs to be updated to handle the utilruntime.HandleError(err) call on line 1251.
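The companion origin PR listed above ("Use substring match instead of exact match") reflects a test-side consequence of this change: once the underlying error is appended to the status message, an exact string comparison in the e2e check no longer matches. A hedged sketch of the idea, with `buildStatusMatches` as a hypothetical helper rather than the actual origin test code:

```go
package main

import (
	"fmt"
	"strings"
)

// buildStatusMatches illustrates the test-side adjustment: with the
// controller now appending the wrapped error to the status message,
// the check should look for the expected text as a substring instead
// of comparing the whole message exactly.
func buildStatusMatches(statusMessage, want string) bool {
	return strings.Contains(statusMessage, want)
}

func main() {
	msg := "Failed to create pod spec: failed to create a build pod spec " +
		"for build my-csi-app-namespace/my-csi-bc-1: csi volumes require " +
		"the BuildCSIVolumes feature gate to be enabled"
	// the generic prefix still matches, even with the error text appended
	fmt.Println(buildStatusMatches(msg, "Failed to create pod spec"))
	fmt.Println(buildStatusMatches(msg, "BuildCSIVolumes feature gate"))
}
```

A substring match keeps the test stable if additional context is ever prepended or appended to the message.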

Comment 4 Priti Kumari 2022-04-22 05:46:59 UTC


$ oc new-project my-csi-app-namespace
Now using project "my-csi-app-namespace" on server "https://api.rhoms-4.11-042204.dev.openshiftappsvc.org:6443".

You can add applications to this project with the 'new-app' command. For example, try:

    oc new-app ruby~https://github.com/sclorg/ruby-ex.git

to build a new example application in Python. Or use kubectl to deploy a simple Kubernetes application:

    kubectl create deployment hello-node --image=gcr.io/hello-minikube-zero-install/hello-node

pkumari$ oc apply -f ./examples/build-with-rhel-entitlements/01-role-bc.yaml
role.rbac.authorization.k8s.io/shared-resource-my-share-bc created

pkumari$ oc apply -f ./examples/build-with-rhel-entitlements/01-rolebinding-bc.yaml 
rolebinding.rbac.authorization.k8s.io/shared-resource-my-share-bc created

pkumari$ oc apply -f ./examples/build-with-rhel-entitlements/02-csi-share-bc.yaml 
error: unable to recognize "./examples/build-with-rhel-entitlements/02-csi-share-bc.yaml": no matches for kind "SharedSecret" in version "sharedresource.openshift.io/v1alpha1"

pkumari$ oc apply -f ./examples/build-with-rhel-entitlements/03-bc.yaml 
buildconfig.build.openshift.io/my-csi-bc created

pkumari$ oc start-build my-csi-bc
build.build.openshift.io/my-csi-bc-1 started

pkumari$ oc get build
NAME          TYPE     FROM         STATUS                           STARTED   DURATION
my-csi-bc-1   Docker   Dockerfile   New (CannotCreateBuildPodSpec)             

pkumari$ oc describe build my-csi-bc-1
Name:           my-csi-bc-1
Namespace:      my-csi-app-namespace
Created:        About a minute ago
Labels:         buildconfig=my-csi-bc
Annotations:    openshift.io/build-config.name=my-csi-bc

Status:         New (Failed to create pod spec: failed to create a build pod spec for build my-csi-app-namespace/my-csi-bc-1: csi volumes require the BuildCSIVolumes feature gate to be enabled)
Duration:       waiting for 1m56s

Build Config:   my-csi-bc
Build Pod:      my-csi-bc-1-build

Strategy:       Docker
  FROM registry.redhat.io/ubi8/ubi:latest
  RUN ls -la /etc/pki/entitlement
  RUN rm /etc/rhsm-host
  RUN yum repolist --disablerepo=*
  RUN subscription-manager repos --enable rhocp-4.9-for-rhel-8-x86_64-rpms
  RUN yum -y update
  RUN yum install -y openshift-clients.x86_64
Empty Source:   no input source provided

Build trigger cause:    Manually triggered

Events: <none>

Comment 6 errata-xmlrpc 2022-08-10 11:07:06 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory (Important: OpenShift Container Platform 4.11.0 bug fix and security update), and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

