Bug 1031001 - Invalid pulp yum_importer.json can be written with missing --proxy-* arguments
Summary: Invalid pulp yum_importer.json can be written with missing --proxy-* arguments
Keywords:
Status: CLOSED ERRATA
Alias: None
Product: Red Hat Satellite
Classification: Red Hat
Component: Installation
Version: 6.0.2
Hardware: Unspecified
OS: Unspecified
Priority: unspecified
Severity: medium
Target Milestone: Unspecified
Assignee: Eric Helms
QA Contact: Elyézer Rezende
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2013-11-15 13:07 UTC by Dominic Cleal
Modified: 2017-02-23 21:18 UTC
CC List: 3 users

Fixed In Version:
Doc Type: Bug Fix
Doc Text:
Clone Of:
Environment:
Last Closed: 2015-08-12 05:07:52 UTC
Target Upstream Version:
Embargoed:


Attachments: none


Links
System | ID | Private | Priority | Status | Summary | Last Updated
Red Hat Bugzilla | 1127397 | 0 | unspecified | CLOSED | Running the installer a second time with no proxy values creates an invalid config | 2021-02-22 00:41:40 UTC
Red Hat Bugzilla | 1128296 | 0 | high | CLOSED | Content sync should not silently hang/fail due do to possibly malformed proxy url | 2021-02-22 00:41:40 UTC
Red Hat Bugzilla | 1147049 | 0 | unspecified | CLOSED | Passing empty values to katello-installer proxy settings creates invalid JSON | 2021-02-22 00:41:40 UTC
Red Hat Product Errata | RHSA-2015:1592 | 0 | normal | SHIPPED_LIVE | Important: Red Hat Satellite 6.1.1 on RHEL 6 | 2015-08-12 09:04:35 UTC

Internal Links: 1127397 1128296 1147049

Description Dominic Cleal 2013-11-15 13:07:28 UTC
Description of problem:
Omitting some of the --proxy-* arguments can result in a yum_importer.json pulp plugin config file that is not valid JSON, preventing pulp from starting up.

Version-Release number of selected component (if applicable):
MDP2

How reproducible:
Always

Steps to Reproduce:
1. Run katello-configure with --proxy-url, --proxy-port and --proxy-pass, but NOT --proxy-user (see the example command after this list)
2. Check pulp status
3. Check /var/log/pulp/pulp.log
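
Example command for step 1 (the proxy host, port and password are placeholder values; --proxy-user is deliberately left out):

# placeholder proxy values; --proxy-user intentionally omitted
katello-configure --proxy-url=http://proxy.example.com \
                  --proxy-port=3128 \
                  --proxy-pass=secret

# step 3: look for the startup failure in the pulp log
grep -i 'failed to start' /var/log/pulp/pulp.log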

Actual results:
2013-11-15 14:27:55,725 pulp.server.webservices.application:CRITICAL: The Pulp server failed to start due to the following reasons:
2013-11-15 14:27:55,726 pulp.server.webservices.application:ERROR:   One or more plugins failed to initialize. If a new type has been added, run pulp-manage-db to load the type into the database and restart the application. Error message: Expecting , delimiter: line 4 column 22 (char 98)
Traceback (most recent call last):
  File "/usr/lib/python2.6/site-packages/pulp/server/webservices/application.py", line 180, in wsgi_application
    _initialize_pulp()
  File "/usr/lib/python2.6/site-packages/pulp/server/webservices/application.py", line 118, in _initialize_pulp
    plugin_api.initialize()
  File "/usr/lib/python2.6/site-packages/pulp/plugins/loader/api.py", line 80, in initialize
    loading.load_plugins_from_entry_point(*entry_point)
  File "/usr/lib/python2.6/site-packages/pulp/plugins/loader/loading.py", line 108, in load_plugins_from_entry_point
    cls, cfg = entry_point.load()()
  File "/usr/lib/python2.6/site-packages/pulp_rpm/plugins/importers/yum/importer.py", line 41, in entry_point
    plugin_config = read_json_config(CONF_FILENAME)
  File "/usr/lib/python2.6/site-packages/pulp/common/config.py", line 670, in read_json_config
    config = json.load(f)
  File "/usr/lib64/python2.6/json/__init__.py", line 267, in load
    parse_constant=parse_constant, **kw)
  File "/usr/lib64/python2.6/json/__init__.py", line 307, in loads
    return _default_decoder.decode(s)
  File "/usr/lib64/python2.6/json/decoder.py", line 319, in decode
    obj, end = self.raw_decode(s, idx=_w(s, 0).end())
  File "/usr/lib64/python2.6/json/decoder.py", line 336, in raw_decode
    obj, end = self._scanner.iterscan(s, **kw).next()
  File "/usr/lib64/python2.6/json/scanner.py", line 55, in iterscan
    rval, next_pos = action(m, context)
  File "/usr/lib64/python2.6/json/decoder.py", line 193, in JSONObject
    raise ValueError(errmsg("Expecting , delimiter", s, end - 1))

Expected results:
pulp is running

Additional info:
The installer template doesn't check whether all of the --proxy-* parameters are present, so it will write invalid JSON:

https://github.com/Katello/katello-installer/blob/c524e2ae/modules/pulp/templates/etc/pulp/server/plugins.conf.d/yum_importer.json

Just quoting the values might be enough.
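
As a minimal sketch of why quoting helps (the key names, values and file paths below are made up for illustration and are not the real yum_importer.json contents): a value rendered unquoted and left empty makes the file unparseable, while a quoted value at worst becomes an empty string and the JSON stays valid.

# hypothetical render with --proxy-user omitted and the value written unquoted;
# this is not valid JSON
cat > /tmp/yum_importer.broken.json <<'EOF'
{
  "proxy_host": "proxy.example.com",
  "proxy_port": 3128,
  "proxy_user": ,
  "proxy_pass": "secret"
}
EOF
python -m json.tool /tmp/yum_importer.broken.json    # exits with a JSON parse error

# the same content with every templated value quoted; the missing parameter
# simply becomes an empty string and the file still parses
cat > /tmp/yum_importer.quoted.json <<'EOF'
{
  "proxy_host": "proxy.example.com",
  "proxy_port": "3128",
  "proxy_user": "",
  "proxy_pass": "secret"
}
EOF
python -m json.tool /tmp/yum_importer.quoted.json    # prints the parsed document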

Comment 2 Eric Helms 2014-11-04 21:02:48 UTC
This issue has been addressed upstream: https://bugzilla.redhat.com/show_bug.cgi?id=1127397

Comment 5 Elyézer Rezende 2015-02-13 17:20:02 UTC
Verified on Satellite-6.1.0-RHEL-6-20150210.0

Verification steps:

[root@amd-dinar-01 ~]# katello-installer --katello-proxy-url="http://proxy.example.com" --katello-proxy-port=8888 --katello-proxy-password=proxypass
Installing             Done                                               [100%] [.................................]
  Success!
  * Katello is running at https://amd-dinar-01.example.com
      Initial credentials are admin / changeme
  * Capsule is running at https://amd-dinar-01.example.com:9090
  * To install additional capsule on separate machine continue by running:"

      capsule-certs-generate --capsule-fqdn "$CAPSULE" --certs-tar "~/$CAPSULE-certs.tar"

  The full log is at /var/log/katello-installer/katello-installer.log

[root@amd-dinar-01 ~]# katello-service status
tomcat6 (pid 16004) is running...[  OK  ]
mongod (pid  5654) is running...
listening on 127.0.0.1:27017
connection test successful
elasticsearch (pid  4934) is running...
celery init v10.0.
Using config script: /etc/default/pulp_resource_manager
node resource_manager (pid 923) is running...
celery init v10.0.
Using config script: /etc/default/pulp_workers
node reserved_resource_worker-0 (pid 1598) is running...
node reserved_resource_worker-1 (pid 1474) is running...
node reserved_resource_worker-2 (pid 1358) is running...
node reserved_resource_worker-3 (pid 1744) is running...
node reserved_resource_worker-4 (pid 1789) is running...
node reserved_resource_worker-5 (pid 1241) is running...
node reserved_resource_worker-6 (pid 1531) is running...
node reserved_resource_worker-7 (pid 1427) is running...
node reserved_resource_worker-8 (pid 1194) is running...
node reserved_resource_worker-9 (pid 1216) is running...
node reserved_resource_worker-10 (pid 1892) is running...
node reserved_resource_worker-11 (pid 1668) is running...
node reserved_resource_worker-12 (pid 1381) is running...
node reserved_resource_worker-13 (pid 1767) is running...
node reserved_resource_worker-14 (pid 1146) is running...
node reserved_resource_worker-15 (pid 1287) is running...
node reserved_resource_worker-16 (pid 1450) is running...
node reserved_resource_worker-17 (pid 1500) is running...
node reserved_resource_worker-18 (pid 1404) is running...
node reserved_resource_worker-19 (pid 1310) is running...
node reserved_resource_worker-20 (pid 1573) is running...
node reserved_resource_worker-21 (pid 1724) is running...
node reserved_resource_worker-22 (pid 1694) is running...
node reserved_resource_worker-23 (pid 1644) is running...
node reserved_resource_worker-24 (pid 1170) is running...
node reserved_resource_worker-25 (pid 1838) is running...
node reserved_resource_worker-26 (pid 1869) is running...
node reserved_resource_worker-27 (pid 1263) is running...
node reserved_resource_worker-28 (pid 1621) is running...
node reserved_resource_worker-29 (pid 1551) is running...
node reserved_resource_worker-30 (pid 1820) is running...
node reserved_resource_worker-31 (pid 1333) is running...
celery init v10.0.
Using configuration: /etc/default/pulp_workers, /etc/default/pulp_celerybeat
pulp_celerybeat (pid 799) is running.
httpd (pid  2026) is running...
dynflow_executor is running.
dynflow_executor_monitor is running.

[root@amd-dinar-01 ~]# cat /var/log/messages | grep pulp
# omitted some messages
Feb 13 12:00:50 amd-dinar-01 pulp: pulp.server.webservices.application:INFO: *************************************************************
Feb 13 12:00:50 amd-dinar-01 pulp: pulp.server.webservices.application:INFO: The Pulp server has been successfully initialized
Feb 13 12:00:50 amd-dinar-01 pulp: pulp.server.webservices.application:INFO: *************************************************************
Feb 13 12:00:50 amd-dinar-01 pulp: gofer.transport.qpid.broker:INFO: {amd-dinar-01.example.com:5671} connected to AMQP

Comment 6 Bryan Kearney 2015-08-11 13:25:57 UTC
This bug is slated to be released with Satellite 6.1.

Comment 7 errata-xmlrpc 2015-08-12 05:07:52 UTC
Since the problem described in this bug report should be
resolved in a recent advisory, it has been closed with a
resolution of ERRATA.

For information on the advisory, and where to find the updated
files, follow the link below.

If the solution does not work for you, open a new bug report.

https://access.redhat.com/errata/RHSA-2015:1592

