From Bugzilla Helper:
User-Agent: Mozilla/5.0 (X11; U; Linux x86_64; en-US; rv:1.4.3) Gecko/20040924

Description of problem:
CUPS is running on an application server servicing local queues only. All printers are remote, using socket://x.x.x.x:9100 to print. Intermittently, submitting a print job via lpr causes some random number of copies of the job to land in the queue; we've seen up to 25 copies so far, although 16 seems to be a popular number :-(. We haven't tried lp, as the application only understands BSD-style lpr right now. (BTW: lpq functionality in CUPS may as well not be there... it never seems to report anything useful!) I'm switching from socket:// to lpd:// later today to see if that makes any difference at all, but I really didn't want to use LPD - the nature of the remote print servers is such that direct "raw" streams seem to work better/faster.

Version-Release number of selected component (if applicable):
cups-1.1.17-13.3.16

How reproducible:
Sometimes

Steps to Reproduce:
1. Configure any brand of print server as a remote printer using socket://a.b.c.d:9100/, servicing a local queue
2. Print to that queue
3. Plant new trees

Actual Results:
Usually works fine; sometimes prints multiple copies (typically about 16)

Expected Results:
One copy, as requested.

Additional info:
This is a major problem - if I can't solve it soon, the AIX box will be booted back up and replace the RHEL "upgrade"!
Which particular brand (and model) of print server are you using? I would like to get you to capture the TCP traffic between the client and the print server. Are you using socket: or lpd: at the moment, and if lpd: has the problem recurred?
We are experiencing the same problem, even after updating to cups-1.1.17-13.3.10.
Mary: which problem are you seeing? This bug report is about socket://-type transports, i.e. "JetDirect"; bug #118707 is about lpd://-type transports, i.e. RFC 1179. Which one are you using?
Sorry, I didn't see the difference. We have Lexmark printers, so lpd://-type transports, i.e. RFC 1179. Should I be removed from the CC?
Sorry for the delay...

1. Switching from socket:// to lpd:// made no difference whatsoever.

2. Black Box LEP0000 print servers: single-port parallel print servers. I don't know who the OEM is or what embedded OS they use.

3. Briefly looking at the TCP traffic, the server does in fact send X copies of the print job to the print server. Certainly X jobs appear in the queue, the data stream goes over to the print server X times, and there are no errors apparent - i.e. the TCP stream closes normally every time.

4. Switching the spool command from "lpr -P" to "lp -d" seems to have fixed the problem 100% so far. I had even managed to duplicate the problem with "ls | lpr -Plj4050", but as long as I don't use /usr/bin/lpr anywhere, everything's been working fine!
Adam: please try these commands and let me know what they say:

type lpr
rpm -qf /usr/bin/lpr
rpm -V cups

Also, could you count exactly how many copies are printed? Also, have you set an authentication scheme in cupsd.conf, or changed it in any other way by hand? If so I'd like to see what changes have been made. Thanks.
Created attachment 106706 [details] cupsd.conf See attached cupsd.conf file
(erifi1:root)$ type lpr
lpr is /usr/bin/lpr
(erifi1:root)$ rpm -qf /usr/bin/lpr
cups-1.1.17-13.3.16
(erifi1:root)$ rpm -V cups
S.5....T c /etc/cups/cupsd.conf
S.5....T c /etc/cups/mime.convs
S.5....T c /etc/cups/mime.types
S.5....T c /etc/cups/printers.conf
Mary: are you also seeing a difference between using lp and using lpr?
The command being used is "lp -s -dcfrlex1". We can try changing it to lpr to see if it behaves the same. We are going to try this.
Mary: okay, that's different behaviour than Adam is seeing. Adam: still would like to know what output the commands in comment #7 give.
Tim, changing the lp command to use lpr seems to have corrected our problem, can you explain why?
No, I can't, as I still need more information. Mary, I'd like you to show me the output of *these* commands (not those in comment #7):

type lp
rpm -qf /usr/bin/lp

Adam: I'd still like to see the output from the commands in comment #7.
[root@fx root]# type lpr
lpr is /usr/bin/lpr
[root@fx root]# rpm -qf /usr/bin/lpr
cups-1.1.17-13.3.16
[root@fx root]# rpm -V cups
S.5....T c /etc/cups/cupsd.conf
S.5....T c /etc/cups/mime.types
S.5....T c /etc/cups/printers.conf
.......T c /etc/rc.d/init.d/cups
.......T /usr/bin/cancel.cups
.......T /usr/bin/lp.cups
.......T /usr/bin/lpq.cups
.......T /usr/bin/lpr.cups
.......T /usr/bin/lprm.cups
.......T /usr/bin/lpstat.cups
.......T /usr/lib64/cups/backend/ipp
.......T /usr/sbin/accept
.......T /usr/sbin/lpc.cups
.......T d /usr/share/man/fr/man1/lp.1.gz
.......T d /usr/share/man/fr/man8/accept.8.gz
.......T d /usr/share/man/fr/man8/enable.8.gz
.......T d /usr/share/man/man1/lp-cups.1.gz
.......T d /usr/share/man/man1/lpq-cups.1.gz
.......T d /usr/share/man/man1/lpr-cups.1.gz
.......T d /usr/share/man/man1/lprm-cups.1.gz
.......T d /usr/share/man/man1/lpstat-cups.1.gz
.......T d /usr/share/man/man8/accept.8.gz
.......T d /usr/share/man/man8/enable.8.gz
.......T d /usr/share/man/man8/lpc-cups.8.gz
[root@fx root]#
We just added a new printer through the Red Hat printer applet, and immediately got a report that two different users' print jobs were printing out multiple times. We've been told that 20 copies were generated, but that number hasn't been verified to be exact. In the past, it fluctuated between 2 and about 16 or so. Still using "lp" ... this is the first occurrence of this problem since I switched from lpr to lp on Nov. 12!
Okay, 20 copies is the number I was hoping to hear: cupsDoFileRequest() (which both lpr and lp call) has this loop:

    max_attempts = 20
    while (no success && max_attempts-- > 0)
    {
        send ipp request
    }

So it looks like that's the code responsible. The reason for the loop is to cope with authorization issues. However, what I don't understand is *why* the IPP request seems to fail as far as the client is concerned, even though in reality it succeeds. I really need to see the debug log from the cups server at the point where things go wrong, in addition to some other information about the event. Here is how to collect that:

1. Please adjust /etc/cups/cupsd.conf to have the line "LogLevel debug2" if you haven't already. Then run "/sbin/service cups reload" as root.

2. Wait until multiple copies are printed (or try to reproduce the problem by printing something). As I understand it, no-one can cause this problem to occur at will but instead it is intermittent -- is that right?

3. Attach the /var/log/cups/error_log to this bug report, and include this information:
   * what time (as precise as you can be) the copies were printed
   * which queue name they were printed to
   * how many copies ended up on paper
   * how many copies were expected

Thanks.
Another datapoint that might (or might not) prove useful: are either of you using the cups-lpd service?
Well, adding another printer through the applet was what triggered the bug again; specifically printing the test page, which uses lpr (not lp). I don't have cups-lpd running, as far as I know. All the print jobs originate from localhost. May or may not be relevant; I'm running x86_64, not i386, but I seem to have a significant mixture of i386/i686 and x86_64 packages installed. I don't think any of the CUPS rpms are duplicated or mis-architectured, but could thunking issues have anything to do with this?
Adam: to clarify paragraph 1, do you mean that adding a print queue always (or often) causes multiple test pages to be printed? Concerning x86_64, it is normal to have multiarch libraries installed, and I don't think that is having any bearing here.
Just a note: a customer site running RHEL3 with Update 4 applied (x86_64) is still having this problem. After reading through the exchanges on this thread just now, I got the SRC RPM and tried compiling an x86_64 version, which fails:

Processing files: cups-devel-1.1.17-13.3.18
Requires(rpmlib): rpmlib(CompressedFileNames) <= 3.0.4-1 rpmlib(PayloadFilesHavePrefix) <= 4.0-1
Requires: /bin/sh cups-libs = 1:1.1.17
Processing files: cups-libs-1.1.17-13.3.18
Provides: libcups.so.2()(64bit) libcupsimage.so.2()(64bit)
Requires(interp): /sbin/ldconfig /sbin/ldconfig
Requires(rpmlib): rpmlib(CompressedFileNames) <= 3.0.4-1 rpmlib(PayloadFilesHavePrefix) <= 4.0-1
Requires(post): /sbin/ldconfig
Requires(postun): /sbin/ldconfig
Requires: libc.so.6()(64bit) libc.so.6(GLIBC_2.2.5)(64bit) libc.so.6(GLIBC_2.3)(64bit) libcrypto.so.4()(64bit) libjpeg.so.62()(64bit) libm.so.6()(64bit) libm.so.6(GLIBC_2.2.5)(64bit) libpng12.so.0()(64bit) libssl.so.4()(64bit) libtiff.so.3()(64bit) libz.so.1()(64bit)
Processing files: cups-debuginfo-1.1.17-13.3.18
Provides: libcups.so.2.debug()(64bit) libcupsimage.so.2.debug()(64bit)
Requires(rpmlib): rpmlib(CompressedFileNames) <= 3.0.4-1 rpmlib(PayloadFilesHavePrefix) <= 4.0-1

RPM build errors:
    File not found: /var/tmp/cups-root/etc/pam.d/cups

# cat /etc/redhat-release
Red Hat Enterprise Linux WS release 3 (Taroon Update 4)

Anyway, an FYI that might help you out there...
I think what I'd really like to see is the captured TCP traffic between the lp or lpr client and the cupsd server for a job that is incorrectly submitted several times. You can capture this using ethereal or tcpdump -- it's the actual pcap capture file I'd like to see. Can anyone provide that? Can anyone cause this problem to happen on demand or is it entirely random?
I'm not sure if it's 100% on-demand; we're somewhat reluctant to test on a production system. But so far, every time we add a new printer using the GUI tools (i.e. redhat-config-printer, I think?) the problem crops up again. Rebooting the server seems to make it go away again. I doubt I would be able to get a pcap of the traffic, as it happens so sporadically now that I'd have to capture a ridiculous amount of data.
I have a trace:

1 copy requested 31/Mar/2005:18:38:11 by a user through Firefox and lp, with print command "lp -dgaston".
Total number of copies printed: 21 (1 + 20 extras).

client:
[tru@heilig ~]$ rpm -q cups
cups-1.1.17-13.3.27.x86_64

server:
[tru@nao ~]$ rpm -q cups
cups-1.1.17-13.3.24
Created attachment 112519 [details] client /var/cups/access_log
Created attachment 112520 [details] client /var/cups/error_log
Created attachment 112521 [details] client /etc/cups/cupsd.conf
Created attachment 112522 [details] client /var/cups/page_log
Created attachment 112523 [details] server access_log
Created attachment 112524 [details] server cupsd.conf
Created attachment 112525 [details] server error_log.O
Created attachment 112526 [details] server error_log
The client just called - and "recently", the problem has begun to reappear, fairly consistently. I'll be back on-site soon to diagnose/debug. I realized that getting packet captures will be rather difficult, since all the lp/lpr invocations are occurring ON the print server, so all the traffic to cupsd is happening either through pipes or via loopback. (I do not have another system with which I can test over the wire.) Aside from increasing the CUPS debug level, what can I capture that would be helpful?
Would they be able to install a CUPS package that has more logging instrumentation in the suspect area of code, if I built an RPM for it?
speaking for myself: yes
Sure, that's a possibility. Apparently it's annoying enough (40 MICR cheques wasted, for example) that they'll let me do pretty much anything to make it work properly.
I have put some packages here: ftp://people.redhat.com/twaugh/tmp/cups-rhel3/

This will make the suspect routine (cupsDoFileRequest) write debugging output to /tmp/cups/err_output, which you need to create (as root) like so:

mkdir /tmp/cups
touch /tmp/cups/err_output
chown lp /tmp/cups/err_output

If the file gets too large, it can safely be truncated like this at any time:

>/tmp/cups/err_output

Thanks very much for testing, and I hope to be able to make some sense out of the err_output file when the problem happens again. (I apologise for having to go this route, but I have been unable to reproduce this locally, and have never seen it first-hand.)
I got one tonight: the user was quickly sending several files to print to the same printer and, seeing the duplicates, stopped the printer by opening the paper tray. I have stopped the CUPS client (on the sending machine) and the CUPS server, which was running cupsd at "LogLevel debug2".

/tmp/cups/err_output:

[32432] cupsDoFileRequest(0xc1ebf0, 0xc211c0, '/', '(null)')
[32432] cupsDoFileRequest: setup... (max_attempts is 19, file_sent is 0)
[32432] cupsDoFileRequest: authstring=""
[32432] cupsDoFileRequest: post...
[32432] cupsDoFileRequest: ipp write...
[32432] cupsDoFileRequest: update...
[32432] cupsDoFileRequest: status = 200
[32432] cupsDoFileRequest: response...
[32432] cupsDoFileRequest(0xc3e210, 0xc3d820, '/', '(null)')
[32432] cupsDoFileRequest: setup... (max_attempts is 19, file_sent is 0)
[32432] cupsDoFileRequest: authstring=""
[32432] cupsDoFileRequest: post...
[32432] cupsDoFileRequest: ipp write...
[32432] cupsDoFileRequest: update...
[32432] cupsDoFileRequest: status = 200
[32432] cupsDoFileRequest: response...
[32432] cupsDoFileRequest(0xc45610, 0xc3bda0, '/', '(null)')
[32432] cupsDoFileRequest: setup... (max_attempts is 19, file_sent is 0)
[32432] cupsDoFileRequest: authstring=""
[32432] cupsDoFileRequest: post...
[32432] cupsDoFileRequest: ipp write...
[32432] cupsDoFileRequest: update...
[32432] cupsDoFileRequest: status = 200
[32432] cupsDoFileRequest: response...
[32432] cupsDoFileRequest(0xc4ceb0, 0xc472c0, '/', '(null)')
[32432] cupsDoFileRequest: setup... (max_attempts is 19, file_sent is 0)
[32432] cupsDoFileRequest: authstring=""
[32432] cupsDoFileRequest: post...
[32432] cupsDoFileRequest: ipp write...
[32432] cupsDoFileRequest: update...
[32432] cupsDoFileRequest: status = 200
[32432] cupsDoFileRequest: response...
[32432] cupsDoFileRequest(0xf32f90, 0x20208d0, '/', '(null)')
[32432] cupsDoFileRequest: setup... (max_attempts is 19, file_sent is 0)
[32432] cupsDoFileRequest: authstring=""
[32432] cupsDoFileRequest: post...
[32432] cupsDoFileRequest: ipp write...
[32432] cupsDoFileRequest: update...
[32432] cupsDoFileRequest: status = 200
[32432] cupsDoFileRequest: response...
[32432] cupsDoFileRequest(0xf32f90, 0x205ccd0, '/', '(null)')
[32432] cupsDoFileRequest: setup... (max_attempts is 19, file_sent is 0)
[32432] cupsDoFileRequest: authstring=""
[32432] cupsDoFileRequest: post...
[32432] cupsDoFileRequest: ipp write...
[32432] cupsDoFileRequest: update...
[32432] cupsDoFileRequest: status = 200
[32432] cupsDoFileRequest: response...
[32432] cupsDoFileRequest(0xf32f90, 0x12bb4f0, '/', '(null)')
[32432] cupsDoFileRequest: setup... (max_attempts is 19, file_sent is 0)
[32432] cupsDoFileRequest: authstring=""
[32432] cupsDoFileRequest: post...
[32432] cupsDoFileRequest: ipp write...
[32432] cupsDoFileRequest: update...
[32432] cupsDoFileRequest: status = 200
[32432] cupsDoFileRequest: response...

[root@nao _stuck_queue]# grep cups /var/log/rpmpkgs
cups-1.1.17-13.3.27.0.0.1.x86_64.rpm
cups-libs-1.1.17-13.3.27.0.0.1.x86_64.rpm

/var/spool/cups shows:
- 254 files (d00383-001...d00636-001)
- 218 times the same file (md5sum on d*)

I can provide /var/log/cups/* if needed.

Tru
Aha. So it looks like the problem is inside cups-lpd rather than in the cupsDoFileRequest() function from the CUPS library. Thank you very much for this trace! Is there anything relating to cups-lpd in /var/log/messages for that time?
It happened again today:

[32432] cupsDoFileRequest(0xc1ebf0, 0xc211c0, '/', '(null)')
[32432] cupsDoFileRequest: setup... (max_attempts is 19, file_sent is 0)
[32432] cupsDoFileRequest: authstring=""
[32432] cupsDoFileRequest: post...
[32432] cupsDoFileRequest: ipp write...
[32432] cupsDoFileRequest: update...
[32432] cupsDoFileRequest: status = 200
[32432] cupsDoFileRequest: response...
[32432] cupsDoFileRequest(0xc3e210, 0xc3d820, '/', '(null)')
[32432] cupsDoFileRequest: setup... (max_attempts is 19, file_sent is 0)
[32432] cupsDoFileRequest: authstring=""
[32432] cupsDoFileRequest: post...
[32432] cupsDoFileRequest: ipp write...
[32432] cupsDoFileRequest: update...
[32432] cupsDoFileRequest: status = 200
[32432] cupsDoFileRequest: response...
[32432] cupsDoFileRequest(0xc45610, 0xc3bda0, '/', '(null)')
[32432] cupsDoFileRequest: setup... (max_attempts is 19, file_sent is 0)
[32432] cupsDoFileRequest: authstring=""
[32432] cupsDoFileRequest: post...
[32432] cupsDoFileRequest: ipp write...
[32432] cupsDoFileRequest: update...
[32432] cupsDoFileRequest: status = 200
[32432] cupsDoFileRequest: response...
[32432] cupsDoFileRequest(0xc4ceb0, 0xc472c0, '/', '(null)')
[32432] cupsDoFileRequest: setup... (max_attempts is 19, file_sent is 0)
[32432] cupsDoFileRequest: authstring=""
[32432] cupsDoFileRequest: post...
[32432] cupsDoFileRequest: ipp write...
[32432] cupsDoFileRequest: update...
[32432] cupsDoFileRequest: status = 200
[32432] cupsDoFileRequest: response...
[32432] cupsDoFileRequest(0xf32f90, 0x20208d0, '/', '(null)')
[32432] cupsDoFileRequest: setup... (max_attempts is 19, file_sent is 0)
[32432] cupsDoFileRequest: authstring=""
[32432] cupsDoFileRequest: post...
[32432] cupsDoFileRequest: ipp write...
[32432] cupsDoFileRequest: update...
[32432] cupsDoFileRequest: status = 200
[32432] cupsDoFileRequest: response...
[32432] cupsDoFileRequest(0xf32f90, 0x205ccd0, '/', '(null)')
[32432] cupsDoFileRequest: setup... (max_attempts is 19, file_sent is 0)
[32432] cupsDoFileRequest: authstring=""
[32432] cupsDoFileRequest: post...
[32432] cupsDoFileRequest: ipp write...
[32432] cupsDoFileRequest: update...
[32432] cupsDoFileRequest: status = 200
[32432] cupsDoFileRequest: response...
[32432] cupsDoFileRequest(0xf32f90, 0x12bb4f0, '/', '(null)')
[32432] cupsDoFileRequest: setup... (max_attempts is 19, file_sent is 0)
[32432] cupsDoFileRequest: authstring=""
[32432] cupsDoFileRequest: post...
[32432] cupsDoFileRequest: ipp write...
[32432] cupsDoFileRequest: update...
[32432] cupsDoFileRequest: status = 200
[32432] cupsDoFileRequest: response...
[31728] cupsDoFileRequest(0x5198e0, 0x51beb0, '/', '(null)')
[31728] cupsDoFileRequest: setup... (max_attempts is 19, file_sent is 0)
[31728] cupsDoFileRequest: authstring=""
[31728] cupsDoFileRequest: post...
[31728] cupsDoFileRequest: ipp write...
[31728] cupsDoFileRequest: update...
[31728] cupsDoFileRequest: status = 200
[31728] cupsDoFileRequest: response...
[31728] cupsDoFileRequest(0x5198e0, 0x522b50, '/', '(null)')
[31728] cupsDoFileRequest: setup... (max_attempts is 19, file_sent is 0)
[31728] cupsDoFileRequest: authstring=""
[31728] cupsDoFileRequest: post...
[31728] cupsDoFileRequest: ipp write...
[31728] cupsDoFileRequest: update...
[31728] cupsDoFileRequest: status = 200
[31728] cupsDoFileRequest: response...
[31728] cups_connect("default", (nil), (nil))
[31728] connecting to localhost on port 631...
[31728] cupsDoFileRequest(0x5199d0, 0x522b50, '/', '(null)')
[31728] cupsDoFileRequest: setup... (max_attempts is 19, file_sent is 0)
[31728] cupsDoFileRequest: authstring=""
[31728] cupsDoFileRequest: post...
[31728] cupsDoFileRequest: ipp write...
[31728] cupsDoFileRequest: update...
[31728] cupsDoFileRequest: status = 200
[31728] cupsDoFileRequest: response...
[31728] cupsDoFileRequest(0x5172d0, 0x51eb10, '/', '(null)')
[31728] cupsDoFileRequest: setup... (max_attempts is 19, file_sent is 0)
[31728] cupsDoFileRequest: authstring=""
[31728] cupsDoFileRequest: post...
[31728] cupsDoFileRequest: ipp write...
[31728] cupsDoFileRequest: update...
[31728] cupsDoFileRequest: status = 200
[31728] cupsDoFileRequest: response...

Nothing in /var/log/messages :(

Tru
It is happening while printing several files in rapid succession, and it is always the last file that is printed multiple times.

Tru
The sort of thing I'm expecting to see in /var/log/messages is this:

Apr 22 11:11:50 eagle cups-lpd[5261]: Connection from cyberelk.elk (192.168.1.1)
Apr 22 11:11:50 eagle cups-lpd[5261]: Receive print job for bw750
Apr 22 11:11:51 eagle cups-lpd[5261]: Print file - job ID = 3
Apr 22 11:11:51 eagle cups-lpd[5261]: Closing connection

Can you make sure that /etc/syslog.conf contains a line like this?

# Log anything (except mail) of level info or higher.
*.info;mail.none;authpriv.none;cron.none    /var/log/messages

and that '*.info' is listed? I would really expect to see something there..
/etc/syslog.conf reads:

*.info;mail.none;authpriv.none;cron.none    /var/log/messages

Grepping 'cups' inside /var/log/messages only yields:

Apr 20 04:02:03 nao cups: cupsd shutdown succeeded
Apr 20 04:02:04 nao cups: cupsd startup succeeded
Apr 20 23:18:24 nao cups: cupsd shutdown succeeded
Apr 20 23:45:37 nao cups: cupsd startup succeeded

error_log has more info:

D [20/Apr/2005:21:28:41 +0200] StartJob(381, 0x5483c0)
D [20/Apr/2005:21:28:41 +0200] StartJob() id = 381, file = 0/1
D [20/Apr/2005:21:28:41 +0200] job-sheets=none,none
D [20/Apr/2005:21:28:41 +0200] banner_page = 0
D [20/Apr/2005:21:28:41 +0200] StartJob: argv = "gaston","381","coluzzi","AsEsCoLa.pdf","1","","/var/spool/cups/d00381-001"
D [20/Apr/2005:21:28:41 +0200] StartJob: envp[0]="PATH=/usr/lib64/cups/filter:/bin:/usr/bin"
D [20/Apr/2005:21:28:41 +0200] StartJob: envp[1]="SOFTWARE=CUPS/1.1"
D [20/Apr/2005:21:28:41 +0200] StartJob: envp[2]="USER=root"
D [20/Apr/2005:21:28:41 +0200] StartJob: envp[3]="CHARSET=iso-8859-1"
D [20/Apr/2005:21:28:41 +0200] StartJob: envp[4]="LANG=en_US"
D [20/Apr/2005:21:28:41 +0200] StartJob: envp[5]=""
D [20/Apr/2005:21:28:41 +0200] StartJob: envp[6]="PPD=/etc/cups/ppd/gaston.ppd"
D [20/Apr/2005:21:28:41 +0200] StartJob: envp[7]="CUPS_SERVERROOT=/etc/cups"
D [20/Apr/2005:21:28:41 +0200] StartJob: envp[8]="RIP_MAX_CACHE=8m"
D [20/Apr/2005:21:28:41 +0200] StartJob: envp[9]="TMPDIR=/var/spool/cups/tmp"
D [20/Apr/2005:21:28:41 +0200] StartJob: envp[10]="CONTENT_TYPE=application/pdf"
D [20/Apr/2005:21:28:41 +0200] StartJob: envp[11]="DEVICE_URI=lpd://gaston.bis.pasteur.fr/raw?reserve=yes"
D [20/Apr/2005:21:28:41 +0200] StartJob: envp[12]="PRINTER=gaston"
D [20/Apr/2005:21:28:41 +0200] StartJob: envp[13]="CUPS_DATADIR=/usr/share/cups"
D [20/Apr/2005:21:28:41 +0200] StartJob: envp[14]="CUPS_FONTPATH=/usr/share/cups/fonts"
D [20/Apr/2005:21:28:41 +0200] StartJob: envp[15]=""
D [20/Apr/2005:21:28:41 +0200] StartJob: envp[16]=""
D [20/Apr/2005:21:28:41 +0200] StartJob: envp[17]=""
D [20/Apr/2005:21:28:41 +0200] StartJob: envp[18]=""
d [20/Apr/2005:21:28:41 +0200] UpdateJob: Allocating status buffer...
D [20/Apr/2005:21:28:41 +0200] StartJob: statusfds = 3, 5
D [20/Apr/2005:21:28:41 +0200] StartJob: filterfds[1] = 6, -1
D [20/Apr/2005:21:28:41 +0200] StartJob: filter = "/usr/lib64/cups/filter/pdftops"
D [20/Apr/2005:21:28:41 +0200] StartJob: filterfds[0] = 7, 8
D [20/Apr/2005:21:28:41 +0200] start_process("/usr/lib64/cups/filter/pdftops", 0x7fbfec6890, 0x7fbfec5cf0, 6, 8, 5)
I [20/Apr/2005:21:28:41 +0200] Started filter /usr/lib64/cups/filter/pdftops (PID 22185) for job 381.
D [20/Apr/2005:21:28:41 +0200] StartJob: filter = "/usr/lib64/cups/filter/pstops"
D [20/Apr/2005:21:28:41 +0200] StartJob: filterfds[1] = 6, 9
D [20/Apr/2005:21:28:41 +0200] start_process("/usr/lib64/cups/filter/pstops", 0x7fbfec6890, 0x7fbfec5cf0, 7, 9, 5)
I [20/Apr/2005:21:28:41 +0200] Started filter /usr/lib64/cups/filter/pstops (PID 22186) for job 381.
D [20/Apr/2005:21:28:41 +0200] StartJob: filter = "/usr/lib64/cups/filter/cupsomatic"
D [20/Apr/2005:21:28:41 +0200] StartJob: filterfds[0] = 7, 8
D [20/Apr/2005:21:28:41 +0200] start_process("/usr/lib64/cups/filter/cupsomatic", 0x7fbfec6890, 0x7fbfec5cf0, 6, 8, 5)
I [20/Apr/2005:21:28:41 +0200] Started filter /usr/lib64/cups/filter/cupsomatic (PID 22187) for job 381.
D [20/Apr/2005:21:28:41 +0200] StartJob: backend = "/usr/lib64/cups/backend/lpd"
D [20/Apr/2005:21:28:41 +0200] StartJob: filterfds[1] = -1, 6
D [20/Apr/2005:21:28:41 +0200] start_process("/usr/lib64/cups/backend/lpd", 0x7fbfec6890, 0x7fbfec5cf0, 7, 6, 5)
I [20/Apr/2005:21:28:41 +0200] Started backend /usr/lib64/cups/backend/lpd (PID 22188) for job 381.
d [20/Apr/2005:21:28:41 +0200] StartJob: Adding fd 3 to InputSet...
d [20/Apr/2005:21:28:41 +0200] SendBrowseList: (183 bytes to ffffffff) 301e 4 ipp://nao.bis.pasteur.fr:631/printers/gaston "gaston, HP4100N, duplex black and white laser" "Created by redhat-config-printer 0.6.x" "HP LaserJet 4100, Foomatic + Postscript"
d [20/Apr/2005:21:28:41 +0200] PID 22176 exited with no errors.
D [20/Apr/2005:21:28:41 +0200] [Job 381] pdftops - opened PPD file "/etc/cups/ppd/gaston.ppd"...
D [20/Apr/2005:21:28:41 +0200] [Job 381] pdftops - level = 2, width = 595, length = 842
D [20/Apr/2005:21:28:41 +0200] [Job 381] Page = 595x842; 36,36 to 559,806
D [20/Apr/2005:21:28:41 +0200] [Job 381] 0 %%Creator: xpdf/pdftops 1.01
D [20/Apr/2005:21:28:41 +0200] [Job 381] 0 %%LanguageLevel: 2
D [20/Apr/2005:21:28:41 +0200] [Job 381] 0 %%DocumentSuppliedResources: (atend)
D [20/Apr/2005:21:28:41 +0200] [Job 381] 0 %%DocumentMedia: plain 595 842 0 () ()
D [20/Apr/2005:21:28:41 +0200] [Job 381] 0 %%Pages: 425
D [20/Apr/2005:21:28:41 +0200] [Job 381] 0 %%EndComments
D [20/Apr/2005:21:28:41 +0200] [Job 381] 0 %%BeginDefaults
D [20/Apr/2005:21:28:41 +0200] [Job 381] 0 %%PageMedia: plain
D [20/Apr/2005:21:28:41 +0200] [Job 381] 0 %%EndDefaults
D [20/Apr/2005:21:28:41 +0200] [Job 381] 0 %%BeginProlog
D [20/Apr/2005:21:28:41 +0200] [Job 381] 0 %%BeginResource: procset xpdf 1.01 0
D [20/Apr/2005:21:28:41 +0200] [Job 381] Cupsomatic backend version $Revision: 2.13 $ running...
D [20/Apr/2005:21:28:41 +0200] [Job 381] ppd=/etc/cups/ppd/gaston.ppd
D [20/Apr/2005:21:28:41 +0200] [Job 381] Seaerching job for option settings ...
D [20/Apr/2005:21:28:41 +0200] [Job 381] 0 %%EndResource
D [20/Apr/2005:21:28:41 +0200] [Job 381] 0 %%EndProlog
D [20/Apr/2005:21:28:41 +0200] [Job 381] 0 %%BeginSetup
D [20/Apr/2005:21:28:41 +0200] [Job 381] 0 %%BeginResource: font MathematicalPi-One-A
...
D [20/Apr/2005:21:28:41 +0200] [Job 381] 0 %%BeginPageSetup
D [20/Apr/2005:21:28:41 +0200] [Job 381] 0 %%PageOrientation: Portrait
D [20/Apr/2005:21:28:41 +0200] [Job 381] 0 %%EndPageSetup
D [20/Apr/2005:21:28:41 +0200] [Job 381] 0 %%PageTrailer
D [20/Apr/2005:21:28:41 +0200] [Job 381] 0 %%Trailer
D [20/Apr/2005:21:28:41 +0200] [Job 381] Saw Trailer!
D [20/Apr/2005:21:28:41 +0200] [Job 381] Saw EOF!
D [20/Apr/2005:21:28:41 +0200] [Job 381] tail process done writing data to *main::STDOUT
D [20/Apr/2005:21:28:41 +0200] [Job 381] KID4 finished
D [20/Apr/2005:21:28:41 +0200] [Job 381] KID3 finished
D [20/Apr/2005:21:28:41 +0200] [Job 381] Main process finished
I [20/Apr/2005:21:28:41 +0200] [Job 381] Attempting to connect to host gaston.bis.pasteur.fr for printer raw
I [20/Apr/2005:21:28:41 +0200] [Job 381] Connected from port 723...
D [20/Apr/2005:21:28:41 +0200] [Job 381] lpd_command 02 raw
D [20/Apr/2005:21:28:41 +0200] [Job 381] Sending command string (5 bytes)...
D [20/Apr/2005:21:28:41 +0200] [Job 381] Reading command status...
D [20/Apr/2005:21:28:41 +0200] [Job 381] lpd_command returning 0
D [20/Apr/2005:21:28:41 +0200] [Job 381] Control file is:
D [20/Apr/2005:21:28:41 +0200] [Job 381] Hnao.bis.pasteur.fr
D [20/Apr/2005:21:28:41 +0200] [Job 381] Pcoluzzi
D [20/Apr/2005:21:28:41 +0200] [Job 381] JAsEsCoLa_pdf
D [20/Apr/2005:21:28:41 +0200] [Job 381] ldfA188nao.bis.pasteur.fr
D [20/Apr/2005:21:28:41 +0200] [Job 381] UdfA188nao.bis.pasteur.fr
D [20/Apr/2005:21:28:41 +0200] [Job 381] NAsEsCoLa_pdf
D [20/Apr/2005:21:28:41 +0200] [Job 381] lpd_command 02 109 cfA188nao.bis.pasteur.fr
D [20/Apr/2005:21:28:41 +0200] [Job 381] Sending command string (30 bytes)...
D [20/Apr/2005:21:28:41 +0200] [Job 381] Reading command status...
D [20/Apr/2005:21:28:41 +0200] [Job 381] lpd_command returning 0
I [20/Apr/2005:21:28:41 +0200] [Job 381] Sending control file (109 bytes)
I [20/Apr/2005:21:28:41 +0200] [Job 381] Control file sent successfully
D [20/Apr/2005:21:28:41 +0200] [Job 381] lpd_command 03 327182 dfA188nao.bis.pasteur.fr
D [20/Apr/2005:21:28:41 +0200] [Job 381] Sending command string (33 bytes)...
D [20/Apr/2005:21:28:41 +0200] [Job 381] Reading command status...
D [20/Apr/2005:21:28:41 +0200] [Job 381] lpd_command returning 0
I [20/Apr/2005:21:28:41 +0200] [Job 381] Sending data file (327182 bytes)
I [20/Apr/2005:21:28:41 +0200] [Job 381] Spooling LPR job, 0% complete...
I [20/Apr/2005:21:28:41 +0200] [Job 381] Spooling LPR job, 2% complete...
I [20/Apr/2005:21:28:41 +0200] [Job 381] Spooling LPR job, 5% complete...
I [20/Apr/2005:21:28:41 +0200] [Job 381] Spooling LPR job, 7% complete...
I [20/Apr/2005:21:28:41 +0200] [Job 381] Spooling LPR job, 10% complete...
I [20/Apr/2005:21:28:42 +0200] [Job 381] Spooling LPR job, 12% complete...
I [20/Apr/2005:21:28:42 +0200] [Job 381] Spooling LPR job, 15% complete...
...
I [20/Apr/2005:21:28:50 +0200] [Job 381] Spooling LPR job, 95% complete...
I [20/Apr/2005:21:28:51 +0200] [Job 381] Spooling LPR job, 97% complete...
I [20/Apr/2005:21:29:21 +0200] [Job 381] Data file sent successfully
I [20/Apr/2005:21:29:21 +0200] [Job 381] Data file sent successfully
d [20/Apr/2005:21:29:21 +0200] PID 22188 exited with no errors.
D [20/Apr/2005:21:29:21 +0200] UpdateJob: job 381, file 0 is complete.
d [20/Apr/2005:21:29:21 +0200] UpdateJob: Removing fd 3 from InputSet...
D [20/Apr/2005:21:29:21 +0200] CancelJob: id = 381
D [20/Apr/2005:21:29:21 +0200] StopJob: id = 381, force = 0
D [20/Apr/2005:21:29:21 +0200] StopJob: printer state is 3
d [20/Apr/2005:21:29:21 +0200] StopJob: Freeing status buffer...
Having re-read this entire report, I think that this is not a cups-lpd problem at all (I don't remember why I ever thought it was). It does seem, however, that the problem lies somewhere inside the CUPS library, although cupsDoFileRequest() seems to be innocent. The next suspicious function is cupsPrintFiles().

My instructions on setting up /tmp/cups/err_output were written with cups-lpd in mind, and will not capture everything from running 'lp' as a normal user. So: please make /tmp/cups/err_output writeable by everyone, just for testing:

chmod a+w /tmp/cups/err_output
chmod a+x /tmp/cups

Then clear it out:

>/tmp/cups/err_output

and let me know what it says next time there are multiple copies printed.

Tru: you said in comment #24 that Firefox was used to print. Can you please verify that the print command (File->Print..., select printer, Properties, Print Command) is correct? What does it say? Also, what does "cat ~/.lpoptions" say for the user reporting multiple copies? What about "cat /etc/cups/lpoptions"? Just trying to eliminate the obvious things..
Firefox setup:

lpr ${MOZ_PRINTER_NAME:+'-P'}${MOZ_PRINTER_NAME}

so it defaults to "lpr" and the black and white printer (gaston).

There is no ~/.lpoptions or /etc/cups/lpoptions (nor on any client machine).

/etc/cups/lpoptions on the print server:

Dest color cpi=12 lpi=7 page-bottom=36 page-left=36 page-right=36 page-top=36 scaling=100 wrap=true
Dest color-ss cpi=12 lpi=7 page-bottom=36 page-left=36 page-right=36 page-top=36 scaling=100 wrap=true
Dest colour cpi=12 lpi=7 page-bottom=36 page-left=36 page-right=36 page-top=36 scaling=100 wrap=true
Dest colour-ss cpi=12 lpi=7 page-bottom=36 page-left=36 page-right=36 page-top=36 scaling=100 wrap=true
Dest demesmaecker cpi=12 lpi=7 page-bottom=36 page-left=36 page-right=36 page-top=36 scaling=100 wrap=true
Default gaston cpi=12 lpi=7 page-bottom=36 page-left=36 page-right=36 page-top=36 scaling=100 wrap=true
Dest gaston-ss cpi=12 lpi=7 page-bottom=36 page-left=36 page-right=36 page-top=36 scaling=100 wrap=true

I also had multiple prints through GIMP and the color printer (demesmaecker, alias color) with the print command "lp -s -dcolor -oraw", so I don't think it is lpr/lp (both from cups-1.1.17-13.3.27).

Tru
On the contrary, I think *both* lp and lpr are affected -- see comment #16. That may be because the actual code they use to talk to the IPP server comes from libcups. This:

lpr ${MOZ_PRINTER_NAME:+'-P'}${MOZ_PRINTER_NAME}

is correct for the Print Command. I wanted to check that there weren't any extra options, and there aren't. So it's looking like it really is something in the libcups functions that both lp and lpr use: cupsPrintFiles(), or something below that point in the stack. I hope /tmp/cups/err_output is now writeable by everyone, and that it captures something useful next time the multiple-copies bug appears. Thanks for helping with this.
lpr/lp fix: I was referring to comments #6 and #16 too; I thought it was happening more often with lpr than with lp. _live_ it just happened again: 2 prints with acroread (lpr -Pcolor). /tmp/cups/err_output:

[10363] cupsDoFileRequest(0x5159e0, 0x517fb0, '/', '(null)')
[10363] cupsDoFileRequest: setup... (max_attempts is 19, file_sent is 0)
[10363] cupsDoFileRequest: authstring=""
[10363] cupsDoFileRequest: post...
[10363] cupsDoFileRequest: ipp write...
[10363] cupsDoFileRequest: update...
[10363] cupsDoFileRequest: status = 200
[10363] cupsDoFileRequest: response...
[10363] cupsDoFileRequest(0x5159e0, 0x51fb30, '/', '(null)')
[10363] cupsDoFileRequest: setup... (max_attempts is 19, file_sent is 0)
[10363] cupsDoFileRequest: authstring=""
[10363] cupsDoFileRequest: post...
[10363] cupsDoFileRequest: ipp write...
[10363] cupsDoFileRequest: update...
[10363] cupsDoFileRequest: status = 200
[10363] cupsDoFileRequest: response...
[10363] cups_connect("default", (nil), (nil))
[10363] connecting to localhost on port 631...
[10363] cupsDoFileRequest(0x51bb50, 0x51fb30, '/', '(null)')
[10363] cupsDoFileRequest: setup... (max_attempts is 19, file_sent is 0)
[10363] cupsDoFileRequest: authstring=""
[10363] cupsDoFileRequest: post...
[10363] cupsDoFileRequest: ipp write...
[10363] cupsDoFileRequest: update...
[10363] cupsDoFileRequest: status = 200
[10363] cupsDoFileRequest: response...
[10363] cupsDoFileRequest(0x5133d0, 0x51e270, '/', '(null)')
[10363] cupsDoFileRequest: setup... (max_attempts is 19, file_sent is 0)
[10363] cupsDoFileRequest: authstring=""
[10363] cupsDoFileRequest: post...
[10363] cupsDoFileRequest: ipp write...
[10363] cupsDoFileRequest: update...
[10363] cupsDoFileRequest: status = 200
[10363] cupsDoFileRequest: response...
[10363] cupsDoFileRequest(0x5133d0, 0x51b940, '/', '(null)')
[10363] cupsDoFileRequest: setup... (max_attempts is 19, file_sent is 0)
[10363] cupsDoFileRequest: authstring=""
[10363] cupsDoFileRequest: post...
[10363] cupsDoFileRequest: ipp write...
[10363] cupsDoFileRequest: update...
[10363] cupsDoFileRequest: status = 200
[10363] cupsDoFileRequest: response...
[10364] cupsDoFileRequest(0x5198e0, 0x51beb0, '/', '(null)')
[10364] cupsDoFileRequest: setup... (max_attempts is 19, file_sent is 0)
[10364] cupsDoFileRequest: authstring=""
[10364] cupsDoFileRequest: post...
[10364] cupsDoFileRequest: ipp write...
[10364] cupsDoFileRequest: update...
[10364] cupsDoFileRequest: status = 200
[10364] cupsDoFileRequest: response...
[10364] cupsDoFileRequest(0x5198e0, 0x522b50, '/', '(null)')
[10364] cupsDoFileRequest: setup... (max_attempts is 19, file_sent is 0)
[10364] cupsDoFileRequest: authstring=""
[10364] cupsDoFileRequest: post...
[10364] cupsDoFileRequest: ipp write...
[10364] cupsDoFileRequest: update...
[10364] cupsDoFileRequest: status = 200
[10364] cupsDoFileRequest: response...
[10364] cups_connect("default", (nil), (nil))
[10364] connecting to localhost on port 631...
[10364] cupsDoFileRequest(0x5199d0, 0x522b50, '/', '(null)')
[10364] cupsDoFileRequest: setup... (max_attempts is 19, file_sent is 0)
[10364] cupsDoFileRequest: authstring=""
[10364] cupsDoFileRequest: post...
[10364] cupsDoFileRequest: ipp write...
[10364] cupsDoFileRequest: update...
[10364] cupsDoFileRequest: status = 200
[10364] cupsDoFileRequest: response...
[10364] cupsDoFileRequest(0x5172d0, 0x51eb10, '/', '(null)')
[10364] cupsDoFileRequest: setup... (max_attempts is 19, file_sent is 0)
[10364] cupsDoFileRequest: authstring=""
[10364] cupsDoFileRequest: post...
[10364] cupsDoFileRequest: ipp write...
[10364] cupsDoFileRequest: update...
[10364] cupsDoFileRequest: status = 200
[10364] cupsDoFileRequest: response...

All the printing is spooled through this server; nobody other than root/cupsd will write there... Do I have to install the instrumented client everywhere? Tru
Ah, I see -- yes, the instrumented clients (including libcups) need to be on each originating host. :-( This instrumented package was originally intended for debugging a problem seen on a system where all jobs originated locally. Will it be a problem to install this package on every client, and have /tmp/cups/err_output set up correctly on each one? I'm guessing it will... :-/ If I made a package that used syslog() (i.e. /var/log/messages) instead of writing directly to /tmp/cups/err_output, would that be easier to test with? I really wish I could reproduce this bug in-house.
Yes, a remote log would be better, but I'm not sure it would help. Just to clarify our setup: we have a single print server running the instrumented CUPS, and all our clients print through this server; there is no direct printing to the color or b/w printers. We just got hit again. Client: a 32-bit Xeon machine printing from acroread 7. The previous report was from 64-bit Opteron client machines. Tru
The problem really does seem to be occurring on the clients, not on the server. The log of the client's IPP daemon shows multiple normal (and separate) local connections from the job-submission clients (lp/lpr, presumably). In other words: the job is submitted to the local cupsd several times by the lp/lpr command, and the local cupsd then spools each copy to the cupsd running on the server. So we really need to understand what is going on in the lp/lpr command, and one way or another we'll need to instrument it. Let me know how I can help make that happen -- if you need a syslog-enabled version of the err_output patch, I can do that.
Actually, can I just double-check: are the clients running RHEL3, with cups-1.1.17-13.3.27?
They are now running:

cups-1.1.17-13.3.27.0.0.1.x86_64
cups-1.1.17-13.3.27.0.0.1.i386

Tru
That's great -- are they also running the 13.3.27.0.0.1 versions of cups-libs? The instrumented code is in the libcups library. Also, of course, they'll need /tmp/cups/err_output to exist and be writable.
Yes, and yes. Have a nice weekend.

[tru@sillage ~]$ ls /Bis/scratch/tru/cups/x86_64/
cups-1.1.17-13.3.27.0.0.1.x86_64.rpm
cups-debuginfo-1.1.17-13.3.27.0.0.1.x86_64.rpm
cups-devel-1.1.17-13.3.27.0.0.1.x86_64.rpm
cups-libs-1.1.17-13.3.27.0.0.1.x86_64.rpm
cups.sh
[tru@sillage ~]$ ls /Bis/scratch/tru/cups/i386/
cups-1.1.17-13.3.27.0.0.1.i386.rpm
cups-debuginfo-1.1.17-13.3.27.0.0.1.i386.rpm
cups-devel-1.1.17-13.3.27.0.0.1.i386.rpm
cups-libs-1.1.17-13.3.27.0.0.1.i386.rpm
cups.sh
[tru@sillage ~]$ more /Bis/scratch/tru/cups/i386/cups.sh
#!/bin/sh
sudo rpm -Fvh *.rpm
sudo mkdir /tmp/cups
sudo touch /tmp/cups/err_output
sudo chown lp /tmp/cups/err_output
sudo chmod 777 /tmp/cups/err_output /tmp/cups

Same thing on the x86_64 machines. Now we wait and see what we can catch... BTW, our color printer is jammed; only the b/w one will be available for some days. :( Tru
I have the same problem as reported in this thread. I installed the packages found at ftp://people.redhat.com/twaugh/tmp/cups-rhel3/ and have now reproduced the multiprint problem and collected data in /tmp/cups/err_output. I hope this helps to solve this bug. Olav
Created attachment 113821 [details] /tmp/cups/err_output
Thanks!
Okay, the debug log shows that httpUpdate() is returning HTTP_ERROR without setting http->error, and that can happen in three places. This condition doesn't occur only when spooling files; it seems to happen when doing things like 'lpstat' too. One place it might happen is in parsing the HTTP response code from the IPP server, and interestingly enough there has been a small fix to that section of code since 1.1.17 (a typecast was being used incorrectly). It might have some bearing on this problem, but then again it might not. Another circumstance in which this might happen is if the IPP server says something completely unexpected in the HTTP header -- i.e. not an HTTP response message or a field setting. The final circumstance that can cause this is if TLS (SSL) is being used, but I don't think it is in this case, is it? I have built a new test CUPS package (1.1.17-13.3.27.0.0.5) for running on the *clients* (not the server), which I hope will shed more light on what the problem might be. This package includes the small typecast fix in the HTTP response parsing code, so there is a possibility that it will fix the problem altogether. It also limits to 3 the number of unwanted multiple copies that will get sent -- i.e. how many times the print job will be resent while httpUpdate() returns HTTP_ERROR with http->error unset. Finally, it adds debugging instrumentation to the http.c module, which contains the httpUpdate() function -- again, this will be written to the /tmp/cups/err_output file. I hope it will be possible for you to try this package on a client machine and report back as soon as multiple copies are printed. (Of course, hopefully there won't be so many unwanted copies this time.)
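The resend cap described above can be sketched in shell terms (a sketch only; 'submit_with_retry' is a made-up name, and the real limit lives in libcups, not in a script):

```shell
# Retry a submission command, but give up after MAX_RETRIES attempts
# instead of resending the job indefinitely -- the same idea as the
# cap on duplicate copies in the instrumented package.
submit_with_retry() {
    MAX_RETRIES=3
    attempt=1
    while ! "$@"; do
        attempt=$((attempt + 1))
        if [ "$attempt" -gt "$MAX_RETRIES" ]; then
            echo "giving up after $MAX_RETRIES attempts" >&2
            return 1
        fi
    done
}
```

The point is that a transient, misreported error (HTTP_ERROR with http->error unset) can then no longer turn into dozens of copies; at worst a handful are sent before the loop gives up.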
The 1.1.17-13.3.27.0.0.5 RPM packages can be found in the same place, here: ftp://people.redhat.com/twaugh/tmp/cups-rhel3/ The 'cups' and 'cups-libs' packages both need to be installed on the client machine, and none of these modified packages need to be installed on the server.
I got unwanted copies (3x and 4x) this morning (~09:15/~09:30 local time) with the 0.0.5 versions, on our b/w printer (gaston). Printing was via Mozilla (lpr ${MOZ_PRINTER_NAME:+'-P'}${MOZ_PRINTER_NAME}). I guess they were jobs 2-8. error_log (x86_64 client):

I [02/May/2005:09:06:30 +0200] Job 1 queued on 'demesmaecker' by 'nathsav'.
I [02/May/2005:09:06:30 +0200] Started backend /usr/lib64/cups/backend/ipp (PID 12050) for job 1.
I [02/May/2005:09:07:19 +0200] Job 2 queued on 'gaston' by 'nathsav'.
I [02/May/2005:09:07:19 +0200] Started backend /usr/lib64/cups/backend/ipp (PID 12053) for job 2.
I [02/May/2005:09:21:10 +0200] Job 3 queued on 'gaston' by 'nathsav'.
I [02/May/2005:09:21:10 +0200] Started backend /usr/lib64/cups/backend/ipp (PID 12073) for job 3.
I [02/May/2005:09:21:10 +0200] Job 4 queued on 'gaston' by 'nathsav'.
I [02/May/2005:09:21:10 +0200] Started backend /usr/lib64/cups/backend/ipp (PID 12074) for job 4.
I [02/May/2005:09:21:10 +0200] Job 5 queued on 'gaston' by 'nathsav'.
I [02/May/2005:09:21:10 +0200] Started backend /usr/lib64/cups/backend/ipp (PID 12075) for job 5.
I [02/May/2005:09:31:48 +0200] Job 6 queued on 'gaston' by 'nathsav'.
I [02/May/2005:09:31:48 +0200] Started backend /usr/lib64/cups/backend/ipp (PID 12084) for job 6.
I [02/May/2005:09:31:48 +0200] Job 7 queued on 'gaston' by 'nathsav'.
I [02/May/2005:09:31:48 +0200] Started backend /usr/lib64/cups/backend/ipp (PID 12085) for job 7.
I [02/May/2005:09:31:49 +0200] Job 8 queued on 'gaston' by 'nathsav'.
I [02/May/2005:09:31:49 +0200] Started backend /usr/lib64/cups/backend/ipp (PID 12086) for job 8.
page_log (x86_64 client):

demesmaecker nathsav 1 [02/May/2005:09:06:30 +0200] 1 1
gaston nathsav 2 [02/May/2005:09:07:19 +0200] 1 1
gaston nathsav 3 [02/May/2005:09:21:11 +0200] 1 1
gaston nathsav 4 [02/May/2005:09:21:11 +0200] 1 1
gaston nathsav 5 [02/May/2005:09:21:11 +0200] 1 1
gaston nathsav 6 [02/May/2005:09:31:49 +0200] 1 1
gaston nathsav 7 [02/May/2005:09:31:49 +0200] 1 1
gaston nathsav 8 [02/May/2005:09:31:49 +0200] 1 1
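One quick way to spot this pattern in a client's error_log is to count how many jobs were queued at the same timestamp (a sketch; 'count_dup_jobs' is a made-up helper name, and the log format is the one shown above):

```shell
# Count 'Job N queued' entries per timestamp in a CUPS error_log;
# any count greater than 1 suggests duplicate submissions.
count_dup_jobs() {
    awk -F'[][]' '/Job [0-9]+ queued/ { n[$2]++ }
                  END { for (t in n) if (n[t] > 1) print n[t], t }' "$1"
}
```

For example, `count_dup_jobs /var/log/cups/error_log` would print one line per second in which more than one job was queued.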
Created attachment 113917 [details] /tmp/cups/err_output from x86_64 client
Great, this is exactly what I needed -- thanks. For some reason the server is sending gibberish for one of the fields, so I need to take a look to see why that might be happening.
I have put new packages here: ftp://people.redhat.com/twaugh/tmp/cups-rhel3/ In 1.1.17-13.3.27.0.0.6 the debugging messages are still in place, and will write to /tmp/cups/err_output, but I have also back-ported a change from a later version of CUPS which I hope will fix the problem. The CUPS client (lp or lpr) was not specifying a list of languages that it would accept replies in, and as a result the CUPS server (in this case the cupsd running on the same machine as lp/lpr, not the final destination server) was sending gibberish in the Content-Language header field. Please install these new packages on both the client machines and the server machine. With any luck, there will be no more unwanted multiple copies! We'll give it some time to be sure, and then it will be possible to make an official update for CUPS. Thanks for your continued help!
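For reference, the tell-tale symptom on the wire was a malformed Content-Language value in the local cupsd's HTTP reply. A rough sanity check for such a header value can be sketched in shell ('valid_language_tag' is a made-up name; well-formed language tags are just letters and hyphens, e.g. 'en' or 'en-us'):

```shell
# Succeed if $1 looks like a plausible language tag (letters and
# hyphens only, non-empty); fail for empty strings or garbage bytes.
valid_language_tag() {
    case "$1" in
        ""|*[!A-Za-z-]*) return 1 ;;
        *)               return 0 ;;
    esac
}
```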
By the way, to disable the debugging output it should be safe to remove the /tmp/cups directory (rm -rf /tmp/cups).
Has anyone using these test packages seen any unwanted multiple copies yet?
No unwanted multiple copies here (0.0.6 installed on May 4th), but last week was more or less a holiday week. Thx, Tru
No, I haven't seen any. The problem seems to be solved. Thanks. Olav
Great, looks like this is the correct fix then.
It's back after the latest security update for cups. :(

cups server:
[tru@nao ~]$ rpm -q cups cups-libs
cups-1.1.17-13.3.29.x86_64
cups-libs-1.1.17-13.3.29.x86_64

cups client:
[tru@diane ~]$ rpm -q cups cups-libs
cups-1.1.17-13.3.29.x86_64
cups-libs-1.1.17-13.3.29.x86_64
cups-libs-1.1.17-13.3.29.i386

error_log on the client (diane): 20 unwanted copies, jobs 5-24:

I [19/Jul/2005:16:04:31 +0200] Job 1 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:04:32 +0200] Started backend /usr/lib64/cups/backend/ipp (PID 15822) for job 1.
I [19/Jul/2005:16:07:57 +0200] Job 2 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:07:57 +0200] Started backend /usr/lib64/cups/backend/ipp (PID 15824) for job 2.
I [19/Jul/2005:16:08:10 +0200] Job 3 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:08:10 +0200] Started backend /usr/lib64/cups/backend/ipp (PID 15828) for job 3.
I [19/Jul/2005:16:10:47 +0200] Job 4 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:10:47 +0200] Started backend /usr/lib64/cups/backend/ipp (PID 15836) for job 4.
I [19/Jul/2005:16:27:42 +0200] Job 5 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:42 +0200] Started backend /usr/lib64/cups/backend/ipp (PID 15930) for job 5.
I [19/Jul/2005:16:27:42 +0200] Job 6 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:42 +0200] Started backend /usr/lib64/cups/backend/ipp (PID 15931) for job 6.
I [19/Jul/2005:16:27:42 +0200] Job 7 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:42 +0200] Started backend /usr/lib64/cups/backend/ipp (PID 15932) for job 7.
I [19/Jul/2005:16:27:42 +0200] Job 8 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:42 +0200] Started backend /usr/lib64/cups/backend/ipp (PID 15933) for job 8.
I [19/Jul/2005:16:27:42 +0200] Job 9 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:42 +0200] Started backend /usr/lib64/cups/backend/ipp (PID 15934) for job 9.
I [19/Jul/2005:16:27:42 +0200] Job 10 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:42 +0200] Started backend /usr/lib64/cups/backend/ipp (PID 15935) for job 10.
I [19/Jul/2005:16:27:42 +0200] Job 11 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:42 +0200] Started backend /usr/lib64/cups/backend/ipp (PID 15936) for job 11.
I [19/Jul/2005:16:27:42 +0200] Job 12 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:42 +0200] Started backend /usr/lib64/cups/backend/ipp (PID 15937) for job 12.
I [19/Jul/2005:16:27:42 +0200] Job 13 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:42 +0200] Started backend /usr/lib64/cups/backend/ipp (PID 15938) for job 13.
I [19/Jul/2005:16:27:42 +0200] Job 14 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:42 +0200] Started backend /usr/lib64/cups/backend/ipp (PID 15939) for job 14.
I [19/Jul/2005:16:27:42 +0200] Job 15 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:42 +0200] Started backend /usr/lib64/cups/backend/ipp (PID 15940) for job 15.
I [19/Jul/2005:16:27:42 +0200] Job 16 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:42 +0200] Started backend /usr/lib64/cups/backend/ipp (PID 15941) for job 16.
I [19/Jul/2005:16:27:42 +0200] Job 17 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:42 +0200] Started backend /usr/lib64/cups/backend/ipp (PID 15942) for job 17.
I [19/Jul/2005:16:27:42 +0200] Job 18 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:42 +0200] Started backend /usr/lib64/cups/backend/ipp (PID 15943) for job 18.
I [19/Jul/2005:16:27:42 +0200] Job 19 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:42 +0200] Started backend /usr/lib64/cups/backend/ipp (PID 15944) for job 19.
I [19/Jul/2005:16:27:42 +0200] Job 20 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:42 +0200] Started backend /usr/lib64/cups/backend/ipp (PID 15945) for job 20.
I [19/Jul/2005:16:27:42 +0200] Job 21 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:42 +0200] Started backend /usr/lib64/cups/backend/ipp (PID 15946) for job 21.
I [19/Jul/2005:16:27:42 +0200] Job 22 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:42 +0200] Started backend /usr/lib64/cups/backend/ipp (PID 15947) for job 22.
I [19/Jul/2005:16:27:42 +0200] Job 23 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:42 +0200] Started backend /usr/lib64/cups/backend/ipp (PID 15948) for job 23.
I [19/Jul/2005:16:27:42 +0200] Job 24 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:42 +0200] Started backend /usr/lib64/cups/backend/ipp (PID 15949) for job 24.

error_log on the server (nao, in debug):

I [19/Jul/2005:16:10:47 +0200] Job 38 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:10:47 +0200] Started filter /usr/lib64/cups/filter/pstops (PID 25644) for job 38.
I [19/Jul/2005:16:10:47 +0200] Started filter /usr/lib64/cups/filter/cupsomatic (PID 25645) for job 38.
I [19/Jul/2005:16:10:47 +0200] Started backend /usr/lib64/cups/backend/socket (PID 25646) for job 38.
W [19/Jul/2005:16:27:42 +0200] Possible DoS attack - more than 10 clients connecting from 157.99.90.17!
I [19/Jul/2005:16:27:42 +0200] Job 39 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:42 +0200] Started filter /usr/lib64/cups/filter/pstops (PID 26453) for job 39.
I [19/Jul/2005:16:27:42 +0200] Started filter /usr/lib64/cups/filter/cupsomatic (PID 26454) for job 39.
I [19/Jul/2005:16:27:42 +0200] Started backend /usr/lib64/cups/backend/socket (PID 26455) for job 39.
I [19/Jul/2005:16:27:43 +0200] Job 40 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:43 +0200] Job 41 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:43 +0200] Job 42 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:43 +0200] Job 43 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:43 +0200] Job 44 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:43 +0200] Job 45 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:43 +0200] Job 46 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:43 +0200] Job 47 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:43 +0200] Job 48 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:43 +0200] Job 49 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:43 +0200] Job 50 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:43 +0200] Job 51 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:43 +0200] Job 52 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:44 +0200] Started filter /usr/lib64/cups/filter/pstops (PID 26460) for job 40.
I [19/Jul/2005:16:27:44 +0200] Started filter /usr/lib64/cups/filter/cupsomatic (PID 26461) for job 40.
I [19/Jul/2005:16:27:44 +0200] Started backend /usr/lib64/cups/backend/socket (PID 26462) for job 40.
I [19/Jul/2005:16:27:47 +0200] Started filter /usr/lib64/cups/filter/pstops (PID 26467) for job 41.
I [19/Jul/2005:16:27:47 +0200] Started filter /usr/lib64/cups/filter/cupsomatic (PID 26468) for job 41.
I [19/Jul/2005:16:27:47 +0200] Started backend /usr/lib64/cups/backend/socket (PID 26469) for job 41.
I [19/Jul/2005:16:27:50 +0200] Started filter /usr/lib64/cups/filter/pstops (PID 26474) for job 42.
I [19/Jul/2005:16:27:50 +0200] Started filter /usr/lib64/cups/filter/cupsomatic (PID 26475) for job 42.
I [19/Jul/2005:16:27:50 +0200] Started backend /usr/lib64/cups/backend/socket (PID 26476) for job 42.
I [19/Jul/2005:16:27:53 +0200] Job 53 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:53 +0200] Job 54 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:53 +0200] Job 55 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:53 +0200] Job 56 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:53 +0200] Job 57 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:27:53 +0200] Started filter /usr/lib64/cups/filter/pstops (PID 26481) for job 43.
I [19/Jul/2005:16:27:53 +0200] Started filter /usr/lib64/cups/filter/cupsomatic (PID 26482) for job 43.
I [19/Jul/2005:16:27:53 +0200] Started backend /usr/lib64/cups/backend/socket (PID 26483) for job 43.
I [19/Jul/2005:16:28:01 +0200] Started filter /usr/lib64/cups/filter/pstops (PID 26488) for job 44.
I [19/Jul/2005:16:28:01 +0200] Started filter /usr/lib64/cups/filter/cupsomatic (PID 26489) for job 44.
I [19/Jul/2005:16:28:01 +0200] Started backend /usr/lib64/cups/backend/socket (PID 26490) for job 44.
I [19/Jul/2005:16:28:03 +0200] Job 58 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:28:03 +0200] Job 59 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:28:08 +0200] Started filter /usr/lib64/cups/filter/pstops (PID 26495) for job 45.
I [19/Jul/2005:16:28:08 +0200] Started filter /usr/lib64/cups/filter/cupsomatic (PID 26496) for job 45.
I [19/Jul/2005:16:28:08 +0200] Started backend /usr/lib64/cups/backend/socket (PID 26497) for job 45.
I [19/Jul/2005:16:28:15 +0200] Started filter /usr/lib64/cups/filter/pstops (PID 26502) for job 46.
I [19/Jul/2005:16:28:15 +0200] Started filter /usr/lib64/cups/filter/cupsomatic (PID 26503) for job 46.
I [19/Jul/2005:16:28:15 +0200] Started backend /usr/lib64/cups/backend/socket (PID 26504) for job 46.
I [19/Jul/2005:16:28:22 +0200] Started filter /usr/lib64/cups/filter/pstops (PID 26509) for job 47.
I [19/Jul/2005:16:28:22 +0200] Started filter /usr/lib64/cups/filter/cupsomatic (PID 26510) for job 47.
I [19/Jul/2005:16:28:22 +0200] Started backend /usr/lib64/cups/backend/socket (PID 26511) for job 47.
I [19/Jul/2005:16:28:24 +0200] Job 60 queued on 'lagaffe' by 'nathsav'.
I [19/Jul/2005:16:28:29 +0200] Started filter /usr/lib64/cups/filter/pstops (PID 26516) for job 48.
I [19/Jul/2005:16:28:29 +0200] Started filter /usr/lib64/cups/filter/cupsomatic (PID 26517) for job 48.
I [19/Jul/2005:16:28:29 +0200] Started backend /usr/lib64/cups/backend/socket (PID 26518) for job 48.
I [19/Jul/2005:16:28:36 +0200] Started filter /usr/lib64/cups/filter/pstops (PID 26523) for job 49.
I [19/Jul/2005:16:28:36 +0200] Started filter /usr/lib64/cups/filter/cupsomatic (PID 26524) for job 49.
I [19/Jul/2005:16:28:36 +0200] Started backend /usr/lib64/cups/backend/socket (PID 26525) for job 49.
I [19/Jul/2005:16:28:44 +0200] Started filter /usr/lib64/cups/filter/pstops (PID 26530) for job 50.
I [19/Jul/2005:16:28:44 +0200] Started filter /usr/lib64/cups/filter/cupsomatic (PID 26531) for job 50.
I [19/Jul/2005:16:28:44 +0200] Started backend /usr/lib64/cups/backend/socket (PID 26532) for job 50.
W [19/Jul/2005:16:28:44 +0200] Possible DoS attack - more than 10 clients connecting from 157.99.90.17!
I [19/Jul/2005:16:28:51 +0200] Started filter /usr/lib64/cups/filter/pstops (PID 26537) for job 51.
I [19/Jul/2005:16:28:51 +0200] Started filter /usr/lib64/cups/filter/cupsomatic (PID 26538) for job 51.
I [19/Jul/2005:16:28:51 +0200] Started backend /usr/lib64/cups/backend/socket (PID 26539) for job 51.
I [19/Jul/2005:16:28:58 +0200] Started filter /usr/lib64/cups/filter/pstops (PID 26544) for job 52.
I [19/Jul/2005:16:28:58 +0200] Started filter /usr/lib64/cups/filter/cupsomatic (PID 26545) for job 52.
I [19/Jul/2005:16:28:58 +0200] Started backend /usr/lib64/cups/backend/socket (PID 26546) for job 52.
I [19/Jul/2005:16:29:05 +0200] Started filter /usr/lib64/cups/filter/pstops (PID 26551) for job 53.
I [19/Jul/2005:16:29:05 +0200] Started filter /usr/lib64/cups/filter/cupsomatic (PID 26552) for job 53.
I [19/Jul/2005:16:29:05 +0200] Started backend /usr/lib64/cups/backend/socket (PID 26553) for job 53.
I [19/Jul/2005:16:29:12 +0200] Started filter /usr/lib64/cups/filter/pstops (PID 26558) for job 54.
I [19/Jul/2005:16:29:12 +0200] Started filter /usr/lib64/cups/filter/cupsomatic (PID 26559) for job 54.
I [19/Jul/2005:16:29:12 +0200] Started backend /usr/lib64/cups/backend/socket (PID 26560) for job 54.
I [19/Jul/2005:16:29:19 +0200] Started filter /usr/lib64/cups/filter/pstops (PID 26565) for job 55.
I [19/Jul/2005:16:29:19 +0200] Started filter /usr/lib64/cups/filter/cupsomatic (PID 26566) for job 55.
I [19/Jul/2005:16:29:19 +0200] Started backend /usr/lib64/cups/backend/socket (PID 26567) for job 55.
I [19/Jul/2005:16:29:26 +0200] Started filter /usr/lib64/cups/filter/pstops (PID 26572) for job 56.
I [19/Jul/2005:16:29:26 +0200] Started filter /usr/lib64/cups/filter/cupsomatic (PID 26573) for job 56.
I [19/Jul/2005:16:29:26 +0200] Started backend /usr/lib64/cups/backend/socket (PID 26574) for job 56.
I [19/Jul/2005:16:29:33 +0200] Started filter /usr/lib64/cups/filter/pstops (PID 26579) for job 57.
I [19/Jul/2005:16:29:33 +0200] Started filter /usr/lib64/cups/filter/cupsomatic (PID 26580) for job 57.
I [19/Jul/2005:16:29:33 +0200] Started backend /usr/lib64/cups/backend/socket (PID 26581) for job 57.
I [19/Jul/2005:16:29:47 +0200] Started filter /usr/lib64/cups/filter/pstops (PID 26586) for job 58.
I [19/Jul/2005:16:29:47 +0200] Started filter /usr/lib64/cups/filter/cupsomatic (PID 26587) for job 58.
I [19/Jul/2005:16:29:47 +0200] Started backend /usr/lib64/cups/backend/socket (PID 26588) for job 58.
I [19/Jul/2005:16:29:54 +0200] Started filter /usr/lib64/cups/filter/pstops (PID 26593) for job 59.
I [19/Jul/2005:16:29:54 +0200] Started filter /usr/lib64/cups/filter/cupsomatic (PID 26594) for job 59.
I [19/Jul/2005:16:29:54 +0200] Started backend /usr/lib64/cups/backend/socket (PID 26595) for job 59.
I [19/Jul/2005:16:30:01 +0200] Started filter /usr/lib64/cups/filter/pstops (PID 26603) for job 60.
I [19/Jul/2005:16:30:01 +0200] Started filter /usr/lib64/cups/filter/cupsomatic (PID 26604) for job 60.
I [19/Jul/2005:16:30:01 +0200] Started backend /usr/lib64/cups/backend/socket (PID 26605) for job 60.
I [19/Jul/2005:16:33:55 +0200] Job 61 queued on 'lagaffe' by 'habeck'.
I [19/Jul/2005:16:33:55 +0200] Started filter /usr/lib64/cups/filter/pstops (PID 26944) for job 61.
I [19/Jul/2005:16:33:55 +0200] Started filter /usr/lib64/cups/filter/cupsomatic (PID 26945) for job 61.
I [19/Jul/2005:16:33:55 +0200] Started backend /usr/lib64/cups/backend/socket (PID 26946) for job 61.
Yes, the fix is not included in the security update. We are targeting the next quarterly update release for including this fix.
I thought it was included:

[tru@sillage ~]$ rpm -q cups --changelog | grep -B 2 138123
* Wed May 04 2005 Tim Waugh <twaugh@redhat.com>
- Backport upstream patch to fix bug #138123 (unwanted multiple copies).
--
* Wed May 04 2005 Tim Waugh <twaugh@redhat.com>
- Backport upstream patch to fix bug #138123 (unwanted multiple copies).

Can you roll out a temporary fix in the meantime? Thanks, Tru
Yes, but see the 5th July changelog entry, in which that change is reverted for the security update. I have uploaded 1.1.17-13.3.27.0.0.8, which includes the security fix.
thanks a *lot* :P
The only issue is downgrading from 1.1.17-13.3.29 to 1.1.17-13.3.27.0.0.8... Should I use --force? Or can your 'Include fix for STR #700.' change be added directly to the 1.1.17-13.3.29 version?
1.1.17-13.3.29 already includes the fix for STR #700, and 1.1.17-13.3.27.0.0.8 includes the multiple copies fix as well as the STR #700 security fix. So you can use --oldpackage to downgrade to the test package I uploaded.
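The reason plain 'rpm -U' refuses this step is simple version ordering, which can be illustrated with GNU sort -V ('is_downgrade' is a made-up helper, and sort -V only approximates rpm's own comparison algorithm):

```shell
# Succeed if installing version $1 over installed version $2 would be
# a downgrade, i.e. $1 sorts strictly before $2 in version order.
is_downgrade() {
    [ "$1" != "$2" ] &&
    [ "$(printf '%s\n%s\n' "$1" "$2" | sort -V | head -n1)" = "$1" ]
}
```

Here 1.1.17-13.3.27.0.0.8 sorts before 1.1.17-13.3.29 (27 < 29 at the fifth component), which is why rpm needs --oldpackage to accept it.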
I am seeing this same error on a RHEL4 install. Is there a set of fixed packages that will install on RHEL4? I tried installing with --oldpackage, but there are still conflicts with dbus. Should I open a new bug for this, since it's a different platform?
Yes, please file a separate bug against Red Hat Enterprise Linux 4. Thanks.
Tim, a partner's L3 channel has reported this problem in a new Issue Tracker ticket (IT-79267). From this case history it looks promising that there is a fix for this issue in your test package cups-1.1.17-13.3.27.0.0.8. Please verify that the U6 beta package cups-1.1.17-13.3.30 has this fix included. If so, I can ask the customer to test the beta package to confirm that it fixes the problem for them.
It does.
An advisory has been issued which should help the problem described in this bug report. This report is therefore being closed with a resolution of ERRATA. For more information on the solution and/or where to find the updated files, please follow the link below. You may reopen this bug report if the solution does not work for you. http://rhn.redhat.com/errata/RHBA-2005-631.html