test:
[Feature:Marketplace] Marketplace resources with labels provider displayName [Top Level] [Feature:Marketplace] Marketplace resources with labels provider displayName [ocp-21728] create opsrc with labels [Serial]

is failing frequently in CI, see search results:
https://search.ci.openshift.org/?maxAge=168h&context=1&type=bug%2Bjunit&name=&maxMatches=5&maxBytes=20971520&groupBy=job&search=%5C%5BFeature%3AMarketplace%5C%5D+Marketplace+resources+with+labels+provider+displayName+%5C%5BTop+Level%5C%5D+%5C%5BFeature%3AMarketplace%5C%5D+Marketplace+resources+with+labels+provider+displayName+%5C%5Bocp-21728%5C%5D+create+opsrc+with+labels+%5C%5BSerial%5C%5D

Link to recent failure:
https://prow.ci.openshift.org/view/gs/origin-ci-test/logs/periodic-ci-openshift-release-master-ocp-4.4-e2e-vsphere-upi-serial/1316047329492471808

It looks like the test times out while getting the package list with the opsrctestlabel label:

catalogsourceconfig.operators.coreos.com/csctestlabel created
Oct 13 16:29:04.996: INFO: Running 'oc --namespace=e2e-test-marketplace-w25sh --kubeconfig=/var/run/secrets/ci.openshift.io/multi-stage/kubeconfig get catalogsourceconfig csctestlabel -o=jsonpath={.status.currentPhase.phase.message} -n openshift-marketplace'
Oct 13 16:29:05.186: INFO: Running 'oc --kubeconfig=/var/run/secrets/ci.openshift.io/multi-stage/kubeconfig get -n openshift-marketplace catalogsourceconfig csctestlabel -o=jsonpath={.spec.csDisplayName}'
Oct 13 16:29:05.317: INFO: Running 'oc --kubeconfig=/var/run/secrets/ci.openshift.io/multi-stage/kubeconfig get -n openshift-marketplace catalogsourceconfig csctestlabel -o=jsonpath={.spec.csPublisher}'
Oct 13 16:29:05.453: INFO: Running 'oc --kubeconfig=/var/run/secrets/ci.openshift.io/multi-stage/kubeconfig get -n openshift-operators catalogsource csctestlabel -o=jsonpath={.spec.displayName}'
Oct 13 16:29:05.602: INFO: Running 'oc --kubeconfig=/var/run/secrets/ci.openshift.io/multi-stage/kubeconfig get -n openshift-operators catalogsource csctestlabel -o=jsonpath={.spec.publisher}'
Oct 13 16:29:05.741: INFO: Running 'oc --kubeconfig=/var/run/secrets/ci.openshift.io/multi-stage/kubeconfig get -n openshift-marketplace operatorsource opsrctestlabel -o=jsonpath={.status.packages}'
Oct 13 16:29:10.880: INFO: Running 'oc --kubeconfig=/var/run/secrets/ci.openshift.io/multi-stage/kubeconfig get packagemanifests -lopsrc-provider=optestlabel -o=name -n openshift-marketplace'
Oct 13 16:29:15.880: INFO: Running 'oc --kubeconfig=/var/run/secrets/ci.openshift.io/multi-stage/kubeconfig get packagemanifests -lopsrc-provider=optestlabel -o=name -n openshift-marketplace'
Oct 13 16:29:20.880: INFO: Running 'oc --kubeconfig=/var/run/secrets/ci.openshift.io/multi-stage/kubeconfig get packagemanifests -lopsrc-provider=optestlabel -o=name -n openshift-marketplace'
Oct 13 16:29:25.880: INFO: Running 'oc --kubeconfig=/var/run/secrets/ci.openshift.io/multi-stage/kubeconfig get packagemanifests -lopsrc-provider=optestlabel -o=name -n openshift-marketplace'
Oct 13 16:29:30.880: INFO: Running 'oc --kubeconfig=/var/run/secrets/ci.openshift.io/multi-stage/kubeconfig get packagemanifests -lopsrc-provider=optestlabel -o=name -n openshift-marketplace'
Oct 13 16:29:35.880: INFO: Running 'oc --kubeconfig=/var/run/secrets/ci.openshift.io/multi-stage/kubeconfig get packagemanifests -lopsrc-provider=optestlabel -o=name -n openshift-marketplace'
Oct 13 16:29:40.880: INFO: Running 'oc --kubeconfig=/var/run/secrets/ci.openshift.io/multi-stage/kubeconfig get packagemanifests -lopsrc-provider=optestlabel -o=name -n openshift-marketplace'
Oct 13 16:29:45.880: INFO: Running 'oc --kubeconfig=/var/run/secrets/ci.openshift.io/multi-stage/kubeconfig get packagemanifests -lopsrc-provider=optestlabel -o=name -n openshift-marketplace'
Oct 13 16:29:50.880: INFO: Running 'oc --kubeconfig=/var/run/secrets/ci.openshift.io/multi-stage/kubeconfig get packagemanifests -lopsrc-provider=optestlabel -o=name -n openshift-marketplace'
Oct 13 16:29:55.880: INFO: Running 'oc --kubeconfig=/var/run/secrets/ci.openshift.io/multi-stage/kubeconfig get packagemanifests -lopsrc-provider=optestlabel -o=name -n openshift-marketplace'
Oct 13 16:30:00.880: INFO: Running 'oc --kubeconfig=/var/run/secrets/ci.openshift.io/multi-stage/kubeconfig get packagemanifests -lopsrc-provider=optestlabel -o=name -n openshift-marketplace'
Oct 13 16:30:05.880: INFO: Running 'oc --kubeconfig=/var/run/secrets/ci.openshift.io/multi-stage/kubeconfig get packagemanifests -lopsrc-provider=optestlabel -o=name -n openshift-marketplace'
Oct 13 16:30:06.015: INFO: Running 'oc --kubeconfig=/var/run/secrets/ci.openshift.io/multi-stage/kubeconfig get packagemanifests -lopsrc-provider=optestlabel -o=name -n openshift-marketplace'
[AfterEach] [Feature:Marketplace] Marketplace resources with labels provider displayName
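What the log above shows is a fixed-interval poll: the test reruns the same `oc get packagemanifests -lopsrc-provider=optestlabel -o=name` query every 5 seconds until the label selector matches something or the suite's timeout expires. A minimal POSIX shell sketch of that retry loop is below; the function name and the stand-in `echo`/`true` commands are hypothetical (the real test shells out to `oc` against the cluster), shown only to make the timeout behavior concrete.

```shell
# Poll a command every $2 seconds for up to $1 seconds, until it
# produces non-empty output. Returns 1 if the deadline passes with
# nothing matched (the "times out getting the package list" case).
poll_until_output() {
    deadline=$1; interval=$2; shift 2
    elapsed=0
    while [ "$elapsed" -lt "$deadline" ]; do
        out=$("$@")            # e.g. oc get packagemanifests -lopsrc-provider=optestlabel -o=name -n openshift-marketplace
        if [ -n "$out" ]; then
            echo "$out"
            return 0
        fi
        sleep "$interval"
        elapsed=$((elapsed + interval))
    done
    return 1
}

# Stand-in command that matches immediately (hypothetical resource name):
poll_until_output 60 5 echo "packagemanifest.packages.operators.coreos.com/test"
```

In the failing runs the selector never matches, so the loop exhausts the deadline and the suite's AfterEach teardown fires, which is why the log is nothing but repeated identical queries 5 seconds apart.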
This still needs to be investigated; marking the target release as 4.7.0 for now.
> Andrew, I'm going to clone this back to 4.4 now, but it seems like an issue in and of itself that sippy cannot infer existing bugs exist at higher release versions to accommodate the existing github-based backport process (open bugs one release at a time as we backport).

It's not that sippy cannot; it's that sippy deliberately does not. The fact that you have a bug targeted at 4.7 is no guarantee that you are aware the issue exists in 4.4 or intend to fix it there. The only thing that makes that intent clear is creating the clone and targeting it at that release. For that reason, sippy only acknowledges bugs targeted at the same release the issue is showing up in. I do not see that the 4.4 clone has been created, so I am adding a needinfo in lieu of an "action required" to create the clone, as this is still showing up as a high-frequency failure on 4.4 (the test is passing at only 56%).
It also looks like this bug is the main reason we have had no accepted 4.4-ci payloads in the last 7 days: https://openshift-release.apps.ci.l2s4.p1.openshiftapps.com/dashboards/overview#4.5.0-0.nightly — so I think this should be considered for raising severity/priority.
Correction: the link above should have been https://openshift-release.apps.ci.l2s4.p1.openshiftapps.com/dashboards/overview#4.4.0-0.ci
> The impetus for the comment was that there's a set of manual steps involved when targetRelease > version, like creating bugs on [version, targetRelease] to verify the issue in that range. Since this process should be the same across components, could it be automated?

I believe dptp does have tooling that creates the BZ clones for you: https://bugs.ci.openshift.org/clones?ID=1888220 (use the "Create clone" button).
Closing this as a duplicate of bug 1895019, which at the time had not been cloned from the parent 4.5.z bug. *** This bug has been marked as a duplicate of bug 1895019 ***