Red Hat Bugzilla – Full Text Bug Listing
|Summary:||xref in legalnotice can cause the "growing nodeset" error|
|Product:||[Community] Publican||Reporter:||Jaromir Hradilek <jhradile>|
|Component:||publican||Assignee:||PnT DevOps Devs <hss-ied-bugs>|
|Status:||CLOSED CURRENTRELEASE||QA Contact:||tools-bugs <tools-bugs>|
|Version:||2.3||CC:||ajaysajay, dlesage, jfearn, jhradile, mmcallis, publican-list, r.landmann|
|Fixed In Version:||Doc Type:||Bug Fix|
|Doc Text:||Story Points:||---|
|Last Closed:||2013-11-24 18:54:00 EST||Type:||---|
|oVirt Team:||---||RHEL 7.3 requirements from Atomic Host:|
Description Jaromir Hradilek 2010-07-26 10:08:54 EDT
Created attachment 434429 [details] A sample book to reproduce this error

Description of problem:
Under certain circumstances, building a book with a Git repository in its root directory causes Publican to produce the "growing nodeset" error message. As a result, it starts consuming all available computer memory and has to be terminated with a TERM signal (for example, by pressing Ctrl+C).

Version-Release number of selected component (if applicable):
publican-2.1-0.fc13

How reproducible:
Always.

Steps to Reproduce:
1. Extract the content of the attachment below: "tar xfz Vim_Survival_Handbook.tar.gz"
2. Change into the newly created directory: "cd Vim_Survival_Handbook"
3. Build the book with Publican: "publican build --langs=en-US --formats=html-desktop"

Actual results:
Beginning work on en-US
Starting html-desktop
Using XML::LibXSLT on /usr/share/publican/xsl/html-single.xsl
XPath error : Memory allocation failed : growing nodeset
growing nodeset
^
^CSegmentation fault (core dumped)

Expected results:
Finished html-desktop

Additional info:
For some reason, removing the Git repository data (that is, typing "rm -rf .git .gitignore" at a shell prompt) fixes the problem, and Publican is able to build the book again.
Comment 1 Jeff Fearn 2010-07-27 02:34:20 EDT
Hi, I don't think this has anything to do with the .git directory; deleting it has no effect on this error in my testing. I've run valgrind and there appear to be some memory leaks in LibXML; testing is ongoing.

valgrind --trace-children=yes -v --track-origins=yes --leak-check=full /usr/bin/publican build --langs=en-US --formats=html-desktop

Interestingly, if you run 'publican clean' you can build the book fine, but running the build a second time, without running clean, generates the same issue.
Comment 2 Jeff Fearn 2010-07-28 02:23:43 EDT
I think the real problem here is the xref in the legal notice. If you comment the xref out in Legal_Notice.xml, the book builds fine. If you change the xref in Legal_Notice.xml to link to the revision history, or to the main chapter, the same error occurs.
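For reference, a minimal sketch of the pattern comment 2 describes. The id in the linkend and the surrounding markup are hypothetical (not taken from the attached book); only the "xref inside a legalnotice" shape matters here.

```xml
<!-- Legal_Notice.xml (sketch): a legalnotice containing an xref,
     the pattern that triggered the "growing nodeset" error.
     The linkend "appe-Example-Revision_History" is a made-up id. -->
<legalnotice>
  <para>
    For details of changes to this document, see
    <xref linkend="appe-Example-Revision_History" />.
  </para>
</legalnotice>
```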
Comment 3 Jaromir Hradilek 2010-07-28 09:48:17 EDT
Hi Jeff, and thank you for the clarification: it helped me a lot to know that there is an easy workaround, so that I can actually continue writing without being distracted by occasional swapping. I am surprised nobody else has experienced this issue before though, as referencing an appendix from the legal notice is quite common in technical writing (see DocBook: The Definitive Guide for an example). Should I update the summary of this bug to make it easier to find?
Comment 5 Jeff Fearn 2011-02-17 23:35:10 EST
Just a note that I am still poking around this issue when I get some time to tinker. It turns out that you can have xrefs in the legal notice if you don't generate an index ... crazy stuff.
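A sketch of the interaction described above, with made-up file names and ids: the same book reportedly built fine once the index was no longer generated, so the trigger appears to be the combination of a legal-notice xref and an &lt;index/&gt; element, not the xref alone.

```xml
<!-- Users_Guide.xml (sketch; ids and hrefs are hypothetical).
     With <index/> present, an xref inside the legal notice
     triggered the error; with <index/> removed, the build passed. -->
<book>
  <xi:include href="Book_Info.xml"
              xmlns:xi="http://www.w3.org/2001/XInclude" />
  <xi:include href="Chapter.xml"
              xmlns:xi="http://www.w3.org/2001/XInclude" />
  <index/>  <!-- removing this element avoided the error -->
</book>
```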
Comment 7 ajaysajay 2013-10-07 04:25:43 EDT
I have been using publican (2.1-0.el6) on CentOS 6.4 (with 6 GB RAM) to build a DocBook document. The source document built fine earlier, but after xref elements were added to it, the build results in the following error:

===================================================
Beginning work on en-US
Starting pdf
Using XML::LibXSLT on /usr/share/publican/xsl/pdf.xsl
XPath error : Memory allocation failed : growing nodeset hit limit
growing nodeset hit limit
===================================================

Although I don't have any xref element in the legalnotice element in my document, there were two xref elements in the bookinfo element. Commenting out these xrefs solved the problem for now, and the document could be processed into HTML and PDF. But I do not think it is the appropriate solution.
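A sketch of the variant reported in comment 7; the title and linkend ids are invented for illustration. Here the xrefs sit in the bookinfo abstract rather than in a legalnotice, yet reportedly produced the same error:

```xml
<!-- Book_Info.xml (sketch; ids are hypothetical). Two xrefs inside
     the bookinfo abstract were enough to hit "growing nodeset hit
     limit" with publican-2.1 on CentOS 6.4. -->
<bookinfo>
  <title>Example Book</title>
  <abstract>
    <para>
      See <xref linkend="chap-Example-Overview" /> and
      <xref linkend="chap-Example-Installation" /> for details.
    </para>
  </abstract>
</bookinfo>
```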
Comment 8 Jeff Fearn 2013-11-05 01:14:09 EST
I've just taken a look at this again, to see if I can fix it for the next release, but I can no longer duplicate it.

Process:
1: get the Publican User Guide from git
2: add <xref linkend="pref-Users_Guide-Introduction" /> to the Book_Info.xml abstract
3: add <index/> to Users_Guide.xml
4: build

No problem.

1: copy the Legal Notice from the common brand into the PUG
2: change the link in Book_Info.xml to the local legal notice file
3: add <xref linkend="pref-Users_Guide-Introduction" /> to the local legal notice
4: keep <index/> in Users_Guide.xml
5: build

No problem.

I picked that link because it has indexterms, so the index is involved. If anyone has some source that currently triggers this, can you please link to it so I can test on something we know triggers it in older versions? My testing was done on the devel branch code.
Comment 9 Jaromir Hradilek 2013-11-13 11:10:58 EST
Hi Jeff, thank you for looking into this issue. I tried to test this with the latest development version of Publican (commit 007f85f), but any attempt to build it fails with the following error:

$ ./Build local
Cleaning up build files
Created META.yml and META.json
Creating Publican-v3.9.9
Can't copy('po/test.po', 'Publican-v3.9.9/po/test.po'): No such file or directory at /usr/share/perl5/vendor_perl/Module/Build/Base.pm line 5616.

Anyway, I can no longer reproduce the issue on my Fedora 19 machine with publican-3.2.1-0.fc19.noarch.
Comment 10 Jeff Fearn 2013-11-13 18:01:39 EST
Hi Jaromir, I've checked in a fix for the MANIFEST. I will extract penance from Rudi later on for messing up my git repo ;)
Comment 11 Jeff Fearn 2013-11-24 18:54:00 EST
Since we can't duplicate this anymore, I'm going to close it. There is no "CLOSED MAGICHAPPENED", but upgrading everything to the latest release does seem to bypass the issue.
Comment 12 Ruediger Landmann 2014-12-15 22:25:23 EST
Removing needinfo; I'm sure penance has been extracted