Bug 869455 - submitting job with <package/> results in database error: (OperationalError) (1048, "Column 'job_id' cannot be null")
Status: CLOSED CURRENTRELEASE
Alias: None
Product: Beaker
Classification: Community
Component: scheduler
Version: 0.9
Hardware: Unspecified
OS: Unspecified
Priority: medium
Severity: unspecified
Target Milestone: 0.11
Assignee: Dan Callaghan
QA Contact: Amit Saha
Whiteboard: Misc
 
Reported: 2012-10-24 00:35 UTC by Dan Callaghan
Modified: 2018-02-06 00:41 UTC
CC List: 6 users

Doc Type: Bug Fix
Last Closed: 2013-01-17 04:34:41 UTC




Links
System ID: Red Hat Bugzilla 896622

Internal Links: 896622

Description Dan Callaghan 2012-10-24 00:35:57 UTC
Description of problem:
When processing a submitted job, TaskPackage.lazy_create can trigger a flush while the job is not yet fully populated, resulting in a NOT NULL constraint violation.
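The failure mode can be sketched with a toy SQLAlchemy model. This is a hypothetical illustration, not Beaker's actual model.py: the classes are simplified stand-ins, and lazy_create here is reduced to a bare get-or-create. The point is that the SELECT inside the helper autoflushes every pending object in the session, including a parent row whose required foreign key has not been assigned yet.

```python
# Hypothetical sketch (not Beaker's real code): a get-or-create helper whose
# SELECT autoflushes a half-populated parent row, hitting a NOT NULL column.
from sqlalchemy import Column, Integer, String, create_engine
from sqlalchemy.exc import IntegrityError
from sqlalchemy.orm import Session, declarative_base

Base = declarative_base()

class RecipeSet(Base):
    __tablename__ = 'recipe_set'
    id = Column(Integer, primary_key=True)
    job_id = Column(Integer, nullable=False)  # NOT NULL, as in the Beaker schema

class TaskPackage(Base):
    __tablename__ = 'task_package'
    id = Column(Integer, primary_key=True)
    package = Column(String, unique=True)

def lazy_create(session, package):
    """Get-or-create. The query below autoflushes *all* pending objects."""
    existing = session.query(TaskPackage).filter_by(package=package).first()
    if existing is not None:
        return existing
    tp = TaskPackage(package=package)
    session.add(tp)
    return tp

engine = create_engine('sqlite://')
Base.metadata.create_all(engine)
session = Session(engine)

session.add(RecipeSet())  # job_id not assigned yet
flush_failed = False
try:
    # autoflush tries to INSERT the incomplete RecipeSet row first
    lazy_create(session, 'new-package-name')
except IntegrityError:
    session.rollback()
    flush_failed = True
print('flush failed:', flush_failed)
```

Running this raises IntegrityError from inside the query, which is the same shape of failure as the INSERT INTO recipe_set error above.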

Version-Release number of selected component (if applicable):
0.9

How reproducible:
always

Steps to Reproduce:
1. Submit a job containing a <package/> whose name has never been used in Beaker before
  
Actual results:
Failed to import job because of: (OperationalError) (1048, "Column 'job_id' cannot be null") 'INSERT INTO recipe_set (job_id, priority, queue_time, result, status, lab_controller_id, ttasks, ptasks, wtasks, ftasks, ktasks) VALUES (%s, %s, %s, %s, %s, %s, %s, %s, %s, %s, %s)' (None, 'Normal', datetime.datetime(2012, 10, 24, 0, 29, 37, 807458), 'New', 'New', None, 0, 0, 0, 0, 0)

Expected results:
Job is accepted, task_package row is created.

Additional info:
Reported by Jaroslav Kortus:
https://lists.fedorahosted.org/pipermail/beaker-devel/2012-October/000379.html

Comment 1 Dan Callaghan 2012-10-24 00:37:04 UTC
Workaround is to manually insert the offending task_package row:

INSERT INTO task_package (package) VALUES ('new-package-name');

Comment 2 Dan Callaghan 2012-10-24 00:47:57 UTC
Actually I think this affects any job which includes any <package/> element, which makes this quite a serious regression... still investigating.

Comment 3 Dan Callaghan 2012-10-24 01:49:10 UTC
This was a regression in 0.8.2 and affects any job containing a <package/> element.

Like all the other lazy_create bugs we have had, it is due to this change introducing nested transactions:
http://git.beaker-project.org/cgit/beaker/diff/Server/bkr/server/model.py?id=bb12ed97

The workaround in comment 1 is not valid either. The only workaround is to avoid using <package/>. One possibility is to use a <ks_append/> containing a second %packages section, although I think this breaks in very old Anaconda versions. Another option is to pass ks_meta="packages=onepackage:anotherpackage" inside <recipe/>.
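For reference, the two workarounds above look roughly like this in job XML. This is a sketch: the package names are placeholders, and the exact surrounding recipe content is omitted.

```xml
<!-- Workaround A: pass packages via ks_meta on the recipe -->
<recipe ks_meta="packages=onepackage:anotherpackage">

  <!-- Workaround B: append a second %packages section via ks_append
       (may break on very old Anaconda versions, per comment 3) -->
  <ks_appends>
    <ks_append><![CDATA[
%packages
onepackage
anotherpackage
%end
]]></ks_append>
  </ks_appends>

</recipe>
```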

Comment 4 Dan Callaghan 2012-10-24 06:47:18 UTC
On Gerrit: http://gerrit.beaker-project.org/1436

Comment 6 Amit Saha 2012-11-09 07:58:13 UTC
Verified to be fixed. Ran the test case against the release-0.10 branch and against the master branch. It fails in the latter. Also submitted a simple Job with additional <package> specification. The kickstart file was generated correctly.

Comment 7 Raymond Mancy 2012-11-22 06:44:01 UTC
This has now been released.

Comment 8 Dan Callaghan 2012-11-29 06:04:25 UTC
This bug still exists. My fix only handles the case of a single recipe set containing a single recipe, with no guest recipes.

I also just discovered the reason why nobody has hit this bug in our production Beaker instance. There is one very small but very important difference in the production schema vs our source code (and tests): recipe.recipe_set_id and recipe_set.job_id are NULLable in production. That will also need to be fixed: bug 881563.
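The schema tightening tracked in bug 881563 would look something like the following. This is a sketch only: MySQL syntax, with the integer column types assumed rather than taken from the real Beaker migration.

```sql
-- Sketch of the fix for bug 881563 (assumed column types, MySQL syntax):
-- make the foreign keys NOT NULL in production, matching source and tests.
ALTER TABLE recipe_set MODIFY job_id INT NOT NULL;
ALTER TABLE recipe MODIFY recipe_set_id INT NOT NULL;
```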

Comment 9 Dan Callaghan 2012-11-29 06:24:14 UTC
On Gerrit: http://gerrit.beaker-project.org/1525

Comment 12 Dan Callaghan 2013-01-17 04:34:41 UTC
Beaker 0.11.0 has been released.

