Bug 59876 - Rsh makes an error after >~290 repeats
Product: Red Hat Raw Hide
Classification: Retired
Component: rsh
Platform: i386 Linux
Severity: medium
Assigned To: Radek Vokal
QA Contact: Ben Levenson
Duplicates: 88644
Reported: 2002-02-14 07:38 EST by takujiro katayama
Modified: 2007-04-18 12:40 EDT
Last Closed: 2004-09-14 06:56:37 EDT
Doc Type: Bug Fix
Description takujiro katayama 2002-02-14 07:38:45 EST
From Bugzilla Helper:
User-Agent: Mozilla/4.0 (compatible; MSIE 6.0; Windows 98; Q312461)

Description of problem:
When I rsh a short script to a remote host repeatedly (more than around 290 
times), the following error occurs:

poll: protocol failure in circuit setup


rcmd: socket: All ports in use.

Once this error appears, no more rsh jobs can be started for a while.
Strangely, after some period of time, the situation clears up on its own.

Version-Release number of selected component (if applicable):

How reproducible:

Steps to Reproduce:
1. Just repeat "rsh <hostname> ls" or another command that takes a small amount
of time
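The reproduction step above can be scripted as a small loop (a sketch; the rsh invocation at the bottom is a placeholder for a real host running rshd that trusts this client):

```shell
# Repeat a command until it first fails, then print the iteration number.
run_until_failure() {
    i=0
    while :; do
        i=$((i + 1))
        if ! "$@" >/dev/null 2>&1; then
            echo "$i"
            return 0
        fi
    done
}

# Against a real host (placeholder <hostname>):
#   run_until_failure rsh <hostname> ls
# Per the report, this should print a number around 290.
```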

Actual Results:  An error occurs after around 290 rsh's:

poll: protocol failure in circuit setup


rcmd: socket: All ports in use

Expected Results:  No error occurs.

Additional info:
Comment 1 Phil Knirsch 2002-02-24 05:23:45 EST
Hmmm.. That sounds as if your box is running out of sockets on one end. Could
you verify that your rsh scripts don't take forever to complete? If that's the
case, it can easily happen that your user limit for open file descriptors runs
out. Try the same thing as root, or use the ulimit command to raise the maximum
number of open file descriptors.

Read ya, Phil
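For reference, the suggestion above can be tried like this (a sketch; `ulimit` is a shell built-in, and the hard limit caps how far the soft limit can be raised):

```shell
# Show the current soft and hard limits on open file descriptors.
ulimit -Sn
ulimit -Hn

# Raise the soft limit up to the hard limit for this shell session,
# then re-run the rsh loop under the new limit.
ulimit -Sn "$(ulimit -Hn)"
ulimit -Sn
```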
Comment 2 takujiro katayama 2002-03-08 04:11:37 EST
I tried to change the maximum number of open file descriptors using the limit
command, but I got the same error message as before. I think this error has
another cause.
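For what it's worth, the numbers in the report look more consistent with reserved-port exhaustion than with a file-descriptor limit: rcmd(), which rsh uses, binds its client sockets to the reserved range 512-1023 (about 511 ports), and each rsh normally takes a second reserved port for the stderr channel, so roughly 250-290 quick runs can use up the range. Closed connections linger in TIME_WAIT for a few minutes before the ports become reusable, which would also explain why the condition clears on its own. A sketch for checking this (assumes the Linux net-tools `netstat` column layout, with the local address in column 4 and the state in column 6):

```shell
# Count TIME_WAIT sockets whose local port falls in the reserved
# range (512-1023) that rcmd() draws from; feed it `netstat -tan` output.
count_reserved_timewait() {
    awk '$6 == "TIME_WAIT" {
             n = split($4, a, ":")   # local address:port is column 4
             p = a[n] + 0
             if (p >= 512 && p < 1024) c++
         }
         END { print c + 0 }'
}

# On a live system, while the error is occurring:
#   netstat -tan | count_reserved_timewait
```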
Comment 3 Phil Knirsch 2003-06-25 11:01:56 EDT
*** Bug 88644 has been marked as a duplicate of this bug. ***
Comment 4 Barry K. Nathan 2004-07-14 12:29:58 EDT
I've seen this bug before (on Red Hat 7.2). It turns out that this
xinetd erratum fixes it:
