Description of problem:
https://build.gluster.org/job/devrpm-el7/10441/console

10:12:42 Wrote: /home/jenkins/root/workspace/devrpm-el7/extras/LinuxRPM/rpmbuild/SRPMS/glusterfs-4.2dev-0.240.git4657137.el7.src.rpm
10:12:42 mv rpmbuild/SRPMS/* .
10:12:44 INFO: mock.py version 1.4.11 starting (python version = 2.7.5)...
10:12:44 Start: init plugins
10:12:44 INFO: selinux disabled
10:12:44 Finish: init plugins
10:12:44 Start: run
10:12:44 INFO: Start(glusterfs-4.2dev-0.240.git4657137.el7.src.rpm) Config(epel-7-x86_64)
10:12:44 Start: clean chroot
10:12:44 ERROR: Exception(glusterfs-4.2dev-0.240.git4657137.el7.src.rpm) Config(epel-7-x86_64) 0 minutes 0 seconds

I am not sure why it reports an exception for the src.rpm and fails. Does anyone know?

Version-Release number of selected component (if applicable):

How reproducible:

Steps to Reproduce:
1.
2.
3.

Actual results:

Expected results:

Additional info:
The rest of the error message contains a few more details:

INFO: mock.py version 1.4.11 starting (python version = 2.7.5)...
Start: init plugins
INFO: selinux disabled
Finish: init plugins
Start: run
INFO: Start(glusterfs-4.2dev-0.240.git4657137.el7.src.rpm) Config(epel-7-x86_64)
Start: clean chroot
ERROR: Exception(glusterfs-4.2dev-0.240.git4657137.el7.src.rpm) Config(epel-7-x86_64) 0 minutes 0 seconds
INFO: Results and/or logs in: /home/jenkins/root/workspace/devrpm-el7/RPMS/el7/x86_64/
INFO: Cleaning up build root ('cleanup_on_failure=True')
Start: clean chroot
ERROR: Build root is locked by another process.
Build step 'Execute shell' marked build as failure
Archiving artifacts
Finished: FAILURE

The "ERROR: Build root is locked by another process." suggests that another mock process was building for epel-7-x86_64 on the same system, or that a mock process exited uncleanly and left pieces behind. To prevent this in the future, I recommend running mock with a --uniqueext=UNIQUEEXT option that is unique per jobname+jobid.
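A minimal sketch of what that could look like in the Jenkins "Execute shell" step, assuming the standard Jenkins JOB_NAME and BUILD_NUMBER environment variables are available (the defaults below are only for illustration outside Jenkins, and the exact mock arguments used by this job are an assumption):

```shell
# Build a per-job, per-build suffix from Jenkins environment variables.
JOB_NAME="${JOB_NAME:-devrpm-el7}"
BUILD_NUMBER="${BUILD_NUMBER:-10441}"
UNIQUEEXT="${JOB_NAME}-${BUILD_NUMBER}"

# --uniqueext suffixes the buildroot name (e.g. epel-7-x86_64-devrpm-el7-10441),
# so each run gets its own chroot and cannot trip over another run's lock.
# Shown via echo here; on the builder the command would be run directly.
echo mock -r epel-7-x86_64 --uniqueext="${UNIQUEEXT}" \
    --rebuild glusterfs-4.2dev-0.240.git4657137.el7.src.rpm
```

With a distinct suffix per job+build, a concurrent or stale epel-7-x86_64 buildroot no longer shares a lock with this run.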
Cleaned up the old folder and verified that there were no duplicate Jenkins connections, since duplicate connections are when I have seen this happen in the past.
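For the cleanup step, a stale buildroot left by an uncleanly exited mock run can also be removed with mock's own scrub option instead of deleting folders by hand; a sketch, assuming the epel-7-x86_64 config from the log above:

```shell
# --scrub=chroot removes the buildroot (and its lock) for this config;
# --scrub=all would additionally clear the caches.
# Shown via a variable and echo here; on the builder, run the command directly.
SCRUB_CMD="mock -r epel-7-x86_64 --scrub=chroot"
echo "$SCRUB_CMD"
```

Scrubbing through mock keeps its internal state consistent, which manual removal of the chroot directory may not.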