Red Hat Bugzilla – Bug 251218
rhnpush fails with Solaris 10 SPARC package or patch mpm push to "Sparc Solaris" channel
Last modified: 2009-09-10 15:40:28 EDT
Description of problem:
rhnpush fails with Solaris 10 SPARC package or patch mpm push to "Sparc Solaris"
Version-Release number of selected component (if applicable):
5.0.0 Satellite on 4AS server, fully patched.
The patch is downloaded from Sun, copied to the Solaris 10 SPARC server, and
solaris2mpm is run successfully. The resulting mpm is then pushed with:
rhnpush --server=<servername> --channel=solaris10
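The steps above can be sketched as a dry run. The patch ID and server name below are hypothetical, and the script echoes the commands rather than executing them, since solaris2mpm needs a Solaris host and rhnpush needs a live Satellite:

```shell
#!/bin/sh
# Hypothetical patch ID and Satellite hostname; adjust to your environment.
PATCH=119254-92
SERVER=satellite.example.com

# 1. On the Solaris 10 SPARC box: convert the Sun patch into an mpm package.
echo "solaris2mpm ${PATCH}.zip"

# 2. Push the resulting .mpm to the Solaris 10 channel on the Satellite.
echo "rhnpush --server=${SERVER} --channel=solaris10 ${PATCH}.mpm"
```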
Traceback (most recent call last):
File "/usr/share/rhn/server/apacheRequest.py", line 108, in call_function
response = apply(func, params)
File "/usr/share/rhn/server/handlers/app/packages.py", line 308, in
return self._channelPackageSubscription(authobj, info)
File "/usr/share/rhn/server/handlers/app/packages.py", line 356, in
File "/usr/share/rhn/server/importlib/importLib.py", line 592, in run
File "/usr/share/rhn/server/importlib/packageImport.py", line 73, in fix
File "/usr/share/rhn/server/importlib/packageImport.py", line 174, in
"Invalid channel architecture %s" % charch)
InvalidArchError: Invalid channel architecture 510
The package appears in the Satellite under "Packages in no channels" and shows
its arch as "sparc solaris" (or "sparc solaris patch", as relevant), but it
cannot be added to a "sparc solaris" channel.
Expected results:
rhnpush succeeds in pushing the package or patch mpm to the channel.
I asked/suggested on IRC to verify that the schema was in a sane state.
If this was done and everything looks fine, yet you are still getting the error,
I strongly recommend going through the support process for replication/debugging.
We tested multiple Solaris packages, patch clusters, and patches for Solaris 10
during the QA cycle of Satellite 5.0.
Flipping to needinfo reporter, pending an IT being associated to this bug by SEG.
Yes, the schema is sane, considering it's brand new, and we followed your
suggestions on IRC as well. I am no longer on site at the client. The problem is
that this is an eval, and it seems a bit shady to be told "here, open a ticket
and work through it."
Replication is easy; this is not a one-off "happened once" problem.
I will suggest to the client that they contact support with this issue and
reference this bugtracker, so that SEG can make the needed association.
I've just retried this on Satellite-5.3.0-RHEL5-re20090306.2 on i386 and rhnpush ran fine, with the following in the log:
10.34.34.139 - - [23/Mar/2009:14:17:51 +0100] "GET /PACKAGE-PUSH HTTP/1.1" 200 - "-" "-"
vmware139.englab.brq.redhat.com - - [23/Mar/2009:14:18:08 +0100] "POST /APP HTTP/1.1" 200 135 "-" "rhn.rpclib.py/$Revision: 136589 $"
vmware139.englab.brq.redhat.com - - [23/Mar/2009:14:18:12 +0100] "POST /APP HTTP/1.1" 200 175 "-" "rhn.rpclib.py/$Revision: 136589 $"
10.34.34.139 - - [23/Mar/2009:14:18:12 +0100] "POST /PACKAGE-PUSH HTTP/1.1" 200 6 "-" "rhnpush"
vmware139.englab.brq.redhat.com - - [23/Mar/2009:14:18:35 +0100] "POST /APP HTTP/1.1" 200 136 "-" "rhn.rpclib.py/$Revision: 136589 $"
vmware139.englab.brq.redhat.com - - [23/Mar/2009:14:18:36 +0100] "POST /APP HTTP/1.1" 200 99 "-" "rhn.rpclib.py/$Revision: 136589 $"
And the package is shown in that channel's package list.
Since there seems to have been no follow-up on this issue via SEG and an IT, I assume the issue did not appear again for the customer.
Moving to ON_QA without any code change.
Tested with several Solaris 10 packages pushed to multiple channels whose base architecture was "Sparc Solaris". All packages (apart from those I noted in BZ #495778) were uploaded successfully and appeared in the channel package list. This was tested on 530-re20090413.0; since I cannot reproduce the reported issue, I am moving this bug to VERIFIED.
RELEASE_PENDING from latest Stage build.
An advisory has been issued which should help the problem
described in this bug report. This report is therefore being
closed with a resolution of ERRATA. For more information
on the solution and/or where to find the updated files,
please follow the link below. You may reopen this bug report
if the solution does not work for you.