Note: This bug is displayed in read-only format because the product is no longer active in Red Hat Bugzilla.

Bug 1085367

Summary: concurrent connections to horizon triggering 500 errors
Product: Red Hat OpenStack
Reporter: Kambiz Aghaiepour <kambiz>
Component: python-django-horizon
Assignee: RHOS Maint <rhos-maint>
Status: CLOSED INSUFFICIENT_DATA
QA Contact: Ami Jeain <ajeain>
Severity: medium
Priority: unspecified
Version: 4.0
CC: aortega, jpichon, kambiz, mrunge, mwagner, yeylon
Target Release: 5.0 (RHEL 7)
Hardware: Unspecified
OS: Unspecified
Doc Type: Bug Fix
Type: Bug
Last Closed: 2014-05-21 08:51:34 UTC

Attachments:
check-horizon-login.sh

Description Kambiz Aghaiepour 2014-04-08 13:02:49 UTC
Description of problem:

When testing concurrent connections to horizon, I appear to be triggering 500 errors in apache.  I have a vanilla installation of RHOS 4, with one controller, one neutron node, and a number of compute nodes.  I have tested this using jmeter as well as a custom shell-script wrapper around wget (attaching the custom script).

Version-Release number of selected component (if applicable):
python-django-horizon-2013.2.2-1.el6ost.noarch

How reproducible:
Run the attached custom script against a horizon interface; it illustrates the issue.

Steps to Reproduce:
1. ./check-horizon-login.sh <horizon host or IP> admin <admin pass> 1 30
   (a single process reuses its session cookie for 30 iterations; all requests return 200 OK)

2. ./check-horizon-login.sh <horizon host or IP> admin <admin pass> 2 15
   (two concurrent processes, 15 iterations each; this yields lots of 500 errors, which is not expected)

Expected results:
  all 200 codes.
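For readers without the attachment, here is a minimal sketch of what such a wrapper around wget could look like. This is hypothetical: the actual check-horizon-login.sh attached to this bug may differ, and the URL paths, form fields, and reporting below are assumptions.

```shell
#!/bin/sh
# Hypothetical sketch of a concurrent Horizon login checker.
# Usage: ./check-login.sh <host> <user> <pass> <processes> <iterations>

# One worker: log in once, then reuse the session cookie for N requests,
# printing the HTTP status of each response.
login_and_probe() {
    host=$1; user=$2; pass=$3; iter=$4
    base="http://$host/dashboard"
    jar=$(mktemp)
    # Fetch the login form to obtain a CSRF token and session cookie.
    token=$(wget -qO- --save-cookies "$jar" --keep-session-cookies \
        "$base/auth/login/" |
        sed -n 's/.*csrfmiddlewaretoken[^>]*value="\([^"]*\)".*/\1/p')
    # Authenticate with the saved cookie jar.
    wget -qO /dev/null --load-cookies "$jar" --save-cookies "$jar" \
        --keep-session-cookies --header "Referer: $base/auth/login/" \
        --post-data "username=$user&password=$pass&csrfmiddlewaretoken=$token" \
        "$base/auth/login/"
    i=0
    while [ "$i" -lt "$iter" ]; do
        # --server-response writes headers to stderr; keep the final status.
        wget -qO /dev/null --server-response --load-cookies "$jar" "$base/" 2>&1 |
            awk '/^ *HTTP\// { status = $1 " " $2 } END { print status }'
        i=$((i + 1))
    done
    rm -f "$jar"
}

# Fan out: run <processes> workers concurrently, <iterations> requests each.
main() {
    procs=$4; p=0
    while [ "$p" -lt "$procs" ]; do
        login_and_probe "$1" "$2" "$3" "$5" &
        p=$((p + 1))
    done
    wait
}

# main "$@"
```

The key point the real script exercises is the fan-out in main(): several processes hammer Horizon at the same time, each with its own authenticated session.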

Comment 2 Matthias Runge 2014-04-08 16:22:26 UTC
Would you mind sharing your script to reproduce?

Comment 3 Kambiz Aghaiepour 2014-04-08 19:04:07 UTC
Created attachment 884217 [details]
check-horizon-login.sh

Comment 4 Kambiz Aghaiepour 2014-04-08 19:04:44 UTC
Sorry, I thought I had attached it already.  It should appear now.

Comment 5 Alvaro Lopez Ortega 2014-04-08 19:12:33 UTC
Matthias, could you please take a look at this bug?

Comment 6 Matthias Runge 2014-04-09 12:20:23 UTC
I wonder what version of novaclient you're using. Please make sure it's at least python-novaclient-2.15.0-3.el6ost.

Comment 7 Matthias Runge 2014-04-09 12:34:23 UTC
I tried to reproduce the issue locally:
[mrunge@sofja ~]$ sh check-horizon-login.sh localhost demo demo 10 300
===== Results =====
HTTP/1.1 200 ===>  3000 / 3000
HTTP/1.1 302 ===>  3000 / 3000

Hmm. Not exactly the same results as yours.
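[Editorial note: the per-status tally in the summary above can be produced with a short awk pipeline over wget's --server-response header output. This is a sketch; the attached script's exact reporting may differ.]

```shell
# Count HTTP status codes from header lines such as "  HTTP/1.1 200 OK".
tally() {
    awk '/^ *HTTP\// { count[$1 " " $2]++ }
         END { for (s in count) printf "%s ===> %d\n", s, count[s] }'
}

# Example with canned input; prints one line per status seen,
# e.g. "HTTP/1.1 200 ===> 2" and "HTTP/1.1 302 ===> 1".
printf '  HTTP/1.1 302 Found\n  HTTP/1.1 200 OK\n  HTTP/1.1 200 OK\n' | tally
```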

And also: One can speed up the test by adding/changing the first line of 
/etc/httpd/conf.d/openstack-dashboard.conf to
WSGIDaemonProcess dashboard processes=4 threads=30
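[Editorial note: mod_wsgi sizes a daemon group's capacity as processes × threads, so these values allow up to 120 concurrent requests. A sketch of the directive in context; the stock file's other directives are omitted:]

```apache
# First line of /etc/httpd/conf.d/openstack-dashboard.conf
# 4 processes x 30 threads = up to 120 concurrent requests
WSGIDaemonProcess dashboard processes=4 threads=30
```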

Comment 8 Kambiz Aghaiepour 2014-04-10 15:36:51 UTC
we're using: python-novaclient-2.15.0-2.el6ost.noarch
from the 2014-02-28.3 package set.  The reason we didn't update to the latest is that the keystone commands changed, and our packstack modules pass an option to keystone that no longer appears to be supported.  I can look into updating the package manually.  Also, you shouldn't see any 302s if authentication is working.
I'll check the WSGI settings as well.

Comment 9 Matthias Runge 2014-04-11 06:40:10 UTC
Kambiz, please update to at least python-novaclient-2.15.0-3.el6ost

The earlier version is known to produce a lot of open files ("too many open files", to be exact, leading to all kinds of issues). Please check the number of open files on your horizon host. I wouldn't expect more than 100, or even 200, while running this script; in your case I'd expect a massively higher number.
The issue is that novaclient opens connections but does not close them; they only time out after a long while.
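[Editorial note: one quick way to count open files on the Horizon host is to inspect /proc for the Apache workers. A sketch; the process name and paths assume a standard RHEL install.]

```shell
# Print the number of open file descriptors for one process.
count_fds() {
    ls "/proc/$1/fd" 2>/dev/null | wc -l
}

# Sum open descriptors across all httpd workers.
total=0
for pid in $(pgrep httpd); do
    total=$((total + $(count_fds "$pid")))
done
echo "httpd open files: $total"
```

Run this while the reproducer script is going; with the leaky novaclient the total climbs steadily instead of staying in the low hundreds.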


You don't need to re-install when updating to that novaclient version. Currently I'm inclined to close this as NOTABUG, as the issue is fixed by the later novaclient.