Red Hat Bugzilla – Bug 487101
yum having trouble with https URLs when proxy specified on XO
Last modified: 2014-01-21 18:08:17 EST
User-Agent: Mozilla/5.0 (OS/2; U; Warp 4.5; en-US; rv:220.127.116.11) Gecko/20081212 PmWFx/18.104.22.168
When I type in 'yum --help', yum responds with "Loaded plugins: dellsysidplugin2, refresh-packagekit", then shows me the "help" output. When I type in 'yum install wget', yum again responds with the "Loaded plugins" line -- then hangs.
This is on an OLPC (XO-1), using file 20090223.img (installed via copy-nand).
[I think Yum does not have the information of where the rawhide repositories are.]
Steps to Reproduce:
1. Install rawhide-xo image to XO
2. Boot into Sugar, start Terminal session
3. Enter 'yum install wget'
Was unable to populate system with modules not included in initial system image.
Expected to fetch (and later execute) wget.
On the XO, yum works for me with 8.2.1 builds and with Joyride builds.
can you run:
yum -v repolist
and report the output?
I left the system hanging last night and went to bed. This morning I saw it had produced this additional output: "Error: Cannot retrieve repository metadata (repomd.xml) for repository: rawhide. Please verify its path and try again."
Just now ran 'yum -v repolist'. Got the output lines -
Not loading "blacklist" plugin, as it is disabled
Loading "dellsysidplugin2" plugin
Loading "refresh-packagekit" plugin
Not loading "whiteout" plugin, as it is disabled
Config time: 13.182
Yum Version: 3.2.21
and there it hangs. I expect that after a long long time it will show that error message again.
Can you post your .repo files and is this machine behind any sort of proxy or firewall?
Created attachment 333057 [details]
tar of /etc/yum.repos.d/
Can you reach this url from a web browser on the system:
This machine is behind a proxy. The following commands were entered via the command line before the 'yum install wget' was issued:
[I am at least able to use 'rsync' at this machine to fetch data from the internet - so I believe this machine is able to work through the proxy. And of course other OLPC machines on my LAN (running 8.2.1 or Joyride instead of rawhide) are able to use 'yum' through this same proxy setup.]
okay - comment out the line which says metalink in your rawhide repo file.
And uncomment the line which says mirrorlist
then tell me if things start working.
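For reference, that edit can be scripted. The sketch below demonstrates it on a throwaway copy; the sample repo contents and the file name are assumptions based on the stock F11-era layout (the real file is /etc/yum.repos.d/fedora-rawhide.repo), not copied from the attachment:

```shell
# Illustrative fedora-rawhide.repo fragment (contents assumed, not taken
# from the attached tarball).
REPO=fedora-rawhide.repo
printf '%s\n' \
  '[rawhide]' \
  'name=Fedora - Rawhide' \
  'metalink=https://mirrors.fedoraproject.org/metalink?repo=rawhide&arch=i386' \
  '#mirrorlist=http://mirrors.fedoraproject.org/mirrorlist?repo=rawhide&arch=i386' \
  > "$REPO"

# Comment out the metalink= line and uncomment the mirrorlist= line.
sed -i -e 's/^metalink=/#metalink=/' \
       -e 's/^#mirrorlist=/mirrorlist=/' "$REPO"
cat "$REPO"
```

After this, yum fetches the mirror list over plain http instead of requesting the metalink over https.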
(In reply to comment #5)
> Can you reach this url from a web browser on the system:
That's why I was trying to fetch 'wget' - with it, I might be able to install a working browser on the system. The browser built-in to rawhide-xo image 20090223 fails to launch (with an error message "ImportError: No module named xpcom") -- so I have no working browser on this system.
(In reply to comment #7)
> okay - comment out the line which says metalink in your rawhide repo file.
> And uncomment the line which says mirrorlist
> then tell me if things start working.
There are three sections in that rawhide repo file. I did what you said to the first section within the repo file. Now 'yum' worked, and I was able to fetch the wget package.
Is it possible your proxy doesn't allow/like https urls?
(In reply to comment #10)
> Is it possible your proxy doesn't allow/like https urls?
I have no idea. I do not recall noticing any problem with using https urls.
Just tried the URL you gave in comment #5. Basically, the browsers in my *other* OLPC XOs (their internet access is through that same proxy) are having difficulty with that URL, and eventually time out. With my Linux desktop machines (also connected through that same proxy), their browsers do see that "metalink" file (through the https URL) but don't know what to do with it - Firefox offers to store that file, while Opera shows me that file's content inside its own window.
Tried again with the URL 'https://bugzilla.redhat.com/show_bug.cgi?id=487101'. My OLPC XOs again eventually time out. My Linux desktop machines (also connected through that same proxy) show the webpage at that https URL.
[I suspect that it is the OLPC's networking software, rather than the proxy, that doesn't like https urls.]
Created attachment 333617 [details]
LAN packet trace of trying to use yum on rawhide-xo
This is on an OLPC (XO-1), using file 20090227.img (installed via copy-nand).
Tried 'yum' with the original rawhide repo file and looked at packets on my ethernet LAN. No packets were sent out by the XO. Then entered ' export https_proxy="https://192.168.1.1:8080" ' and tried 'yum' again. A trace of the resultant packets is attached (XO is 192.168.1.11, proxy is 192.168.1.1). It appears to me that the *proxy* tries to contact the XO on port 113 (authentication) and the XO rejects that attempt. [According to how I have my proxy configured, there is __NO__ user/password authentication requirement needed for contacting my proxy.]
I also tried using the XO in an establishment where a proxy was not required. There the XO ran 'yum' fine with the original rawhide repo file. So the problem is definitely the interaction of the rawhide-xo software with my proxy software (squid v3.0 stable13). But I do not know at which end the problem lies.
Googling for 'Yum +proxy +https' turns up "similar occurrences", such as https://bugzilla.redhat.com/show_bug.cgi?id=486324 and https://bugzilla.redhat.com/show_bug.cgi?id=484491.
Was able to bypass the problem by the method suggested in 484491 - change all https links in 'fedora-rawhide.repo' to http.
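That workaround amounts to one sed pass over the repo file. A sketch, demonstrated on a throwaway copy (the sample line and file name are assumptions; the real file is /etc/yum.repos.d/fedora-rawhide.repo):

```shell
# Rewrite every https:// repo URL to http:// (workaround from bug 484491).
REPO=fedora-rawhide.repo
echo 'mirrorlist=https://mirrors.fedoraproject.org/mirrorlist?repo=rawhide&arch=i386' > "$REPO"
sed -i 's|https://|http://|g' "$REPO"
cat "$REPO"
```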
(In reply to comment #5)
> Can you reach this url from a web browser on the system:
Retried this on XO with SoaS2-200904062216.img booted from a USB stick.
The native XO Browser is still not working for me. Tried the above URL with the Opera browser - was able to view the text contents sent by mirrors.fedoraproject.org . [So at least the *basic* functionality for https support is capable of working on an XO using rawhide.] Tried the above URL with the Firefox browser - it timed out trying to access that page.
I note that the configuration options for Opera allow explicit specification of the proxy to be used for https. The configuration options for Firefox do not include an explicit field for https setup. I have not been able to determine whether the problems that 'yum' and 'Firefox' have with https_over_a_proxy are due to an incorrect request being issued by those applications, or due to incorrect handling by the system of the request being issued by those applications.
Marking this report as triaged, even though it is probably a duplicate of Bug 484491 and it sounds like this bug is not XO-specific.
Package 'fedora-release-10.92' specified the 'https:' protocol only within the fedora-rawhide.repo file, and I was manually changing those to 'http:'. The new 'fedora-release-11' package specifies the 'https:' protocol within all the files in /etc/yum.repos.d .
It was suggested to me that, prior to issuing the CLI command 'yum', I should enter: 'export https_proxy=http://192.168.1.1:8080/' [note that this appears to override the protocol]. On my XO-1 system (running "installation image" 20090512), adding that 'export' appears to have allowed 'yum' to work correctly through my proxy, even with all those 'https:' protocols still being specified in the /etc/yum.repos files.
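In shell terms, the suggestion amounts to the following (the proxy address is the one from this report; adjust for your LAN). The http:// scheme on the proxy URL is not a mistake: the client is expected to speak plain HTTP to the proxy (a CONNECT request), which then tunnels the TLS traffic to the https repo URL:

```shell
# Point yum's https traffic at the proxy's plain-http endpoint.
export https_proxy=http://192.168.1.1:8080/
echo "$https_proxy"
# then, in the same shell: yum install wget
```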
This bug appears to have been reported against 'rawhide' during the Fedora 11 development cycle.
Changing version to '11'.
More information and reason for this action is here:
(In reply to comment #17)
> It was suggested to me that, prior to issuing the CLI command 'yum', I should
> enter: 'export https_proxy=http://192.168.1.1:8080/' [note that this appears to
> override the protocol]. On my XO-1 system (running "installation image"
> 20090512), adding that 'export' appears to have allowed 'yum' to work correctly
> through my proxy, even with all those 'https:' protocols still being specified
> in the /etc/yum.repos files.
I received a comment to the effect that I had said the Bug 487101 problem was fixed. That is NOT what I meant to say -- I was reporting having managed to get the XO software to recognize a proxy for https: -- but only __after__ making some sort of change with respect to the default ssh handler (or something like that - I forget what exactly I had done).
To verify that the Bug 487101 problem is not fixed, I booted a recent F11-on-XO1 build (distributed as http://dev.laptop.org/smparrish/xo-1/builds/os5.img) and made sure that the /etc/yum.repos.d/fedora.repo line for mirrors had 'https:' in it. I entered (what others had suggested) ' export https_proxy="https://192.168.1.1:8080/" ' on the command line and then ' yum upgrade '. Yum stalled -- after about half an hour the command exited with the message "Error: Cannot retrieve repository metadata (repomd.xml) for repository: fedora. Please verify its path and try again."
I then entered ' export https_proxy="http://192.168.1.1:8080/" ' on the command line and then ' yum upgrade '. This time the command exited quickly, but with that same error message.
Bottom line: I find the Bug 487101 problem to be NOT fixed.
can you tell me what ver of yum and of python-urlgrabber are in that image? since that's what ultimately will fix this problem, I believe.
(In reply to comment #20)
> can you tell me what ver of yum and of python-urlgrabber are in that image?
> since that's what ultimately will fix this problem, I believe.
As far as I know, in yum on the XO, 'https:' has *never* worked through a proxy. In my opinion it is not a question of the urlgrabber, nor does it matter what the version of yum is.
On F11-on-XO I am currently running the smparrish "os4" build of 20090805. It has python-urlgrabber 3.0.0-15.fc11.noarch, and yum 3.2.23-3.fc11.noarch . [The symptom of specifying 'https:' with a repo URL on F11 is that yum takes horribly long to time out, then simply terminates.]
Just to try things, I specified 'https:' within fedora.repo on F9-on-XO build 802. That build has python-urlgrabber 3.0.0-9.fc9.noarch and yum 3.2.19-3.1.olpc3.noarch. The result was:
> updates-newkey | 3.4 kB 00:00
> Could not retrieve mirrorlist https://mirrors.fedoraproject.org/mirrorlist?repo
> =fedora-9&arch=i386 error was
> [Errno 4] IOError: <urlopen error (110, 'Connection timed out')>
[In addition, I've tried things with a CentOS 3.5 system, which also has yum 3.2.19 -- that system is also unable to use 'https:' with a repo URL. As I indicated in comment #12, yum appears to be sending SSL requests *to* the proxy, instead of *through* the proxy.]
I meant to say the CentOS system was unable to use 'https:' with a repo URL, when trying to go through a proxy !
[None of this mattered much on CentOS, nor on earlier releases of Fedora -- these all used 'http:' for repo URLs, and worked fine that way through a proxy. It was only F11 that replaced the 'http:' protocol (in fedora repo URLs) with 'https:' -- and at least on the XO, yum is not handling the 'https:' protocol specification properly.]
This is why I was asking for the version of python-urlgrabber since that is what yum uses to setup the network communication.
'yum' doesn't handle https at all - it hands it all off to a library - python-urlgrabber.
If you'd answer the questions I asked it would make life a lot easier.
(In reply to comment #23)
> If you'd answer the questions I asked it would make life a lot easier.
I do not understand this remark. You asked:
>> can you tell me what ver of yum and of python-urlgrabber are in that image?
> On F11-on-XO I am currently running the smparrish "os4" build of 20090805.
> It has python-urlgrabber 3.0.0-15.fc11.noarch, and yum 3.2.23-3.fc11.noarch.
What more do you want ???
[The problem that caused me to write this report -- still shows up when I am running this current system (having the python-urlgrabber and yum as identified above). Please look at what I wrote before deciding that I had not answered you. I believe that neither the ver of python-urlgrabber nor the ver of yum has changed in F11-on-XO1 "images" in many many weeks - they ought to be the exact same versions that the official Fedora-11 repositories contain.]
fair enough. If nothing has changed in the image then there is no reason to think this problem will be fixed.
If you update to python-urlgrabber from rawhide and then test - that would help me a lot in fixing this problem.
Did testing with F11-on-XO1 build http://people.sugarlabs.org/~mtd/soas-x1/soasxo59.img . [As part of booting to the command line (in Sugar's 'Terminal' Activity), at home I always enter ' export http_proxy="http://192.168.1.1:8080/" '.] This build originally had yum 3.2.23-3.fc11 and python-urlgrabber 3.0.0-15.fc11 . [I had to first install package python-pycurl 7.19.0-1.fc11 before I could apply your updated python-urlgrabber.]
Enabled /etc/yum.repos.d/fedora-rawhide.repo. Did _only_ 'yum install python-urlgrabber' -- that fetched and installed python-urlgrabber 3.9.0-8.fc12. I did not try to pick up yum from rawhide. Disabled /etc/yum.repos.d/fedora-rawhide.repo .
Changed the mirrorlist= line in my /etc/yum.repos.d/fedora.repo from 'http:' (that's how I had modified all mirrorlist= lines when I did the build (image) install to my XO) to 'https:'. Entered 'yum upgrade' after entering ' export https_proxy="http://192.168.1.1:8080/" '. As far as I could tell, yum worked and accessed the (mirror) fedora.repo repository. Entered 'yum upgrade' after entering ' export https_proxy="https://192.168.1.1:8080/" '. As far as I could tell, yum worked and accessed the (mirror) fedora.repo repository. Entered 'yum upgrade' after entering ' export https_proxy=" " '. As far as I could tell, yum worked and accessed the (mirror) fedora.repo repository.
Anything else you want me to test ?
Same setup as above (with python-urlgrabber 3.9.0-8.fc12, and with 'https:' in the mirrorlist= entry in /etc/yum.repos.d/fedora.repo, and with https_proxy unset). Entered 'yum clean all'. Entered 'yum upgrade'. Yum told me "Cannot retrieve repository metadata (repomd.xml) for repository: fedora. Please verify its path and try again." and terminated. I entered ' export https_proxy="https://192.168.1.1:8080/" '. Entered 'yum upgrade'. Yum went ahead and fetched information from all repositories (including fedora) and did what I expected it to do (it worked).
okay. If I'm reading this correctly then it seems like the new python-urlgrabber with the correct proxy setting works. That's great news.