Description of problem:
When trying to fetch files through a proxy over https connections, urlgrabber gives a traceback. I guess the problem is in the urllib/urllib2 module, so I am filing this against python.

Version-Release number of selected component (if applicable):
python-2.4.3-43.el5.x86_64
python-urlgrabber-3.1.0-6.el5.noarch
squid-2.6.STABLE21-6.el5.x86_64

Steps to Reproduce:
1. yum install -y mod_ssl squid
2. mkdir /var/www/html/grabber && touch /var/www/html/grabber/file.html
3. service httpd start
4. sed -i 's/allow localhost/allow all/' /etc/squid/squid.conf
5. service squid start
6. export https_proxy=http://`hostname`:3128
7. urlgrabber https://`hostname`/grabber/file.html file.html

Actual results:
Traceback (most recent call last):
  File "/usr/bin/urlgrabber", line 124, in ?
    main()
  File "/usr/bin/urlgrabber", line 120, in main
    filename = urlgrab(url,filename=file,**kwargs)
  File "/usr/lib/python2.4/site-packages/urlgrabber/grabber.py", line 602, in urlgrab
    return default_grabber.urlgrab(url, filename, **kwargs)
  File "/usr/lib/python2.4/site-packages/urlgrabber/grabber.py", line 936, in urlgrab
    return self._retry(opts, retryfunc, url, filename)
  File "/usr/lib/python2.4/site-packages/urlgrabber/grabber.py", line 854, in _retry
    r = apply(func, (opts,) + args, {})
  File "/usr/lib/python2.4/site-packages/urlgrabber/grabber.py", line 922, in retryfunc
    fo = URLGrabberFileObject(url, filename, opts)
  File "/usr/lib/python2.4/site-packages/urlgrabber/grabber.py", line 1010, in __init__
    self._do_open()
  File "/usr/lib/python2.4/site-packages/urlgrabber/grabber.py", line 1093, in _do_open
    fo, hdr = self._make_request(req, opener)
  File "/usr/lib/python2.4/site-packages/urlgrabber/grabber.py", line 1216, in _make_request
    raise new_e
urlgrabber.grabber.URLGrabError: [Errno 14] HTTP Error 503: Service Unavailable

Expected results:
File written to file.html.

Additional info:
It seems that urllib expects "https" as the scheme in the proxy specification, but I believe this is wrong.
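For reference, the proxy map that the standard library builds from the environment can be inspected directly. The sketch below uses Python 3's urllib.request (the successor to the urllib2 module blamed above) and a placeholder host, proxy.example.com, standing in for `hostname`; it shows that getproxies() keys the map by the *target* scheme, so whatever scheme the https_proxy value itself carries is passed through unchanged to the proxy handler:

```python
import os
import urllib.request

# Reproduce the reporter's environment: https_proxy points at a
# plain-HTTP squid listener (proxy.example.com is a placeholder).
os.environ["https_proxy"] = "http://proxy.example.com:3128"

proxies = urllib.request.getproxies()
# The dict is keyed by the scheme of the URL being fetched ("https"),
# while the value keeps the scheme from the environment variable.
print(proxies["https"])
```

The mismatch the report describes is between these two schemes: the key says "this proxy handles https requests", while the value's own scheme describes how to talk to the proxy itself.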
Unfortunately I was not able to find any specification of what the https_proxy URL should look like, but it seems "http" is the commonly used scheme. The following can be used as a workaround:

export https_proxy=https://`hostname`:3128

The http-scheme form works fine with wget.
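The workaround above can also be applied without touching the environment, by handing the proxy mapping to a ProxyHandler explicitly. This is a sketch, not the urlgrabber code path: proxy.example.com is a placeholder for `hostname`, and the "https://" scheme on the proxy URL mirrors the workaround, not a documented requirement:

```python
import urllib.request

# Explicit equivalent of `export https_proxy=https://hostname:3128`:
# map the "https" target scheme to the proxy, using the https scheme
# in the proxy URL as in the workaround.
handler = urllib.request.ProxyHandler(
    {"https": "https://proxy.example.com:3128"}
)
opener = urllib.request.build_opener(handler)
# opener.open("https://host/grabber/file.html") would now route the
# request through the configured proxy.
```

Passing the mapping explicitly also makes the behavior independent of whichever *_proxy variables happen to be set in the calling shell.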
This request was evaluated by Red Hat Product Management for inclusion in the current release of Red Hat Enterprise Linux. Because the affected component is not scheduled to be updated in the current release, Red Hat is unfortunately unable to address this request at this time. Red Hat invites you to ask your support representative to propose this request, if appropriate and relevant, in the next release of Red Hat Enterprise Linux.
This request was evaluated by Red Hat Product Management for inclusion in a Red Hat Enterprise Linux release. Product Management has requested further review of this request by Red Hat Engineering, for potential inclusion in a Red Hat Enterprise Linux release for currently deployed products. This request is not yet committed for inclusion in a release.
This request was not resolved in time for the current release. Red Hat invites you to ask your support representative to propose this request, if still desired, for consideration in the next release of Red Hat Enterprise Linux.
Development Management has reviewed and declined this request. You may appeal this decision by reopening this request.