Bug 59876 - Rsh makes an error after >~290 repeats
Status: CLOSED CURRENTRELEASE
Product: Red Hat Raw Hide
Classification: Retired
Component: rsh
Version: 1.0
Hardware: i386 Linux
Priority: medium
Severity: medium
Assigned To: Radek Vokal
QA Contact: Ben Levenson
Duplicates: 88644
Depends On:
Blocks:
Reported: 2002-02-14 07:38 EST by takujiro katayama
Modified: 2007-04-18 12:40 EDT
CC: 2 users

Doc Type: Bug Fix
Last Closed: 2004-09-14 06:56:37 EDT


Attachments: None
Description takujiro katayama 2002-02-14 07:38:45 EST
From Bugzilla Helper:
User-Agent: Mozilla/4.0 (compatible; MSIE 6.0; Windows 98; Q312461)

Description of problem:
When I rsh a short script to a remote host repeatedly (more than around 290 
times), the following error occurs:

poll: protocol failure in circuit setup

or

rcmd: socket: All ports in use.


Once this error appears, no more rsh jobs can be invoked for a while.
Strangely, after some period of time, the problem clears up on its own.



Version-Release number of selected component (if applicable):


How reproducible:
Always

Steps to Reproduce:
1. Just repeat "rsh <hostname> ls" or some other command that takes only a small
amount of time
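The reproduction step above can be sketched as a loop (a rough sketch; "remotehost" is a placeholder for a real rsh-enabled host, and the exact iteration count at which the failure appears may vary):

```shell
#!/bin/sh
# Run a short remote command repeatedly; on an affected system the
# "protocol failure" / "All ports in use" errors show up after
# roughly 290 iterations.
i=1
while [ "$i" -le 400 ]; do
    rsh remotehost ls > /dev/null || { echo "failed at iteration $i"; break; }
    i=$((i + 1))
done
```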


Actual Results:  An error occurs after around 290 rsh's:

poll: protocol failure in circuit setup

or

rcmd: socket: All ports in use


Expected Results:  No error occurs.


Additional info:
Comment 1 Phil Knirsch 2002-02-24 05:23:45 EST
Hmmm.. That sounds as if your box is running out of sockets on one end. Could
you verify that your rsh scripts don't take forever to complete? If that's the
case, it can easily happen that your user limit for open file descriptors runs
out. Try the same thing as root, or use the ulimit command to raise the maximum
number of open file descriptors.

Read ya, Phil
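One quick way to inspect the per-process open-file-descriptor limit Phil mentions is via Python's standard resource module (a sketch; the soft limit is what "ulimit -n" reports in a shell):

```python
import resource

# Soft and hard limits on open file descriptors for the current
# process; the soft limit is the one a normal user hits first.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)
print(f"soft limit: {soft}, hard limit: {hard}")
```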
Comment 2 takujiro katayama 2002-03-08 04:11:37 EST
I tried to change the maximum number of open file descriptors using the limit
command, but I got the same error message as before. I think this error has
some other cause.
Comment 3 Phil Knirsch 2003-06-25 11:01:56 EDT
*** Bug 88644 has been marked as a duplicate of this bug. ***
Comment 4 Barry K. Nathan 2004-07-14 12:29:58 EDT
I've seen this bug before (on Red Hat 7.2). It turns out that this
xinetd erratum fixes it:
https://rhn.redhat.com/errata/RHSA-2003-160.html
