The customer hit a regression in 7.4.4-0.4.el5: shadows always try to create a spool directory per job, despite a configuration they have kept unchanged for some time.
This is possibly due to BZ549432.
It was also reported that:
"A side effect discovery of that would appear to be that if I did need to invoke the creation of a spool directory our multi-schedd config is scoped wrong. All shadows try to use the base SPOOL, they don't parent to the SPOOL of their respective parent schedd."
A fix is in the branch 7.4.3-BZ625205.
To verify, the expected behavior is as follows:
1. When submitting a parallel universe job, condor should create a job spool (aka "job sandbox") directory. This directory will be named something like $SPOOL/clusterX.procY.subprocZ and is necessary for the PU scripts to work. Parallel universe jobs will have the "JobRequiresSandbox" attribute set to TRUE automatically by condor_submit (this is visible via "condor_q -long").
2. When submitting a vanilla universe job with no extra attributes, such a directory should not be created.
3. When submitting a vanilla universe job that uses chirp (that is, one that has "+WantIOProxy=TRUE" in the submit file), such a directory should not be created.
4. When submitting a vanilla universe job with "+JobRequiresSandbox=TRUE" in the submit file, a job spool directory should be created.
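The vanilla universe cases above can be exercised with a minimal submit file; this is only a sketch (the executable and arguments are arbitrary examples, not from the report):

```
# Vanilla universe job that explicitly requests a job sandbox (case 4).
# Remove the +JobRequiresSandbox line to test case 2, or replace it with
# +WantIOProxy = TRUE to test case 3.
universe            = vanilla
executable          = /bin/sleep
arguments           = 60
+JobRequiresSandbox = TRUE
queue
```

After submitting, the attribute can be confirmed with "condor_q -long" on the job, and the presence or absence of the $SPOOL/clusterX.procY.subprocZ directory checked on the schedd host.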
Can you please post the patch?
Checked the behaviour of creating $SPOOL/clusterX.procY.subprocZ directories as described in Comment 2.
Tested with (version):
RHEL4 i386,x86_64 - passed
RHEL5 i386,x86_64 - passed
Technical note added. If any revisions are required, please edit the "Technical Notes" field
accordingly. All revisions will be proofread by the Engineering Content Services team.
With this update, condor creates a job spool directory when submitting a parallel universe job, or a vanilla universe job with "+JobRequiresSandbox=TRUE" in the submit file, but not when submitting a vanilla universe job without that attribute.
An advisory has been issued which should help the problem
described in this bug report. This report is therefore being
closed with a resolution of ERRATA. For more information
on the solution and/or where to find the updated files,
please follow the link below. You may reopen this bug report
if the solution does not work for you.