Red Hat Bugzilla – Attachment 1978826 Details for Bug 2226145: python-ansible-runner: FTBFS in Fedora rawhide/f39
build.log (text/plain), 32.00 KB, created by Fedora Release Engineering on 2023-07-25 19:07:58 UTC
Description: build.log
Filename: build.log
MIME Type: text/plain
Creator: Fedora Release Engineering
Created: 2023-07-25 19:07:58 UTC
Size: 32.00 KB
>test_load_file_text_cache_hit
>[gw1] [ 96%] PASSED test/unit/test_loader.py::test_load_file_text_cache_hit
>test/unit/test_loader.py::test_load_file_json
>[gw1] [ 96%] PASSED test/unit/test_loader.py::test_load_file_json
>test/unit/test_loader.py::test_load_file_type_check
>[gw1] [ 96%] PASSED test/unit/test_loader.py::test_load_file_type_check
>test/unit/test_loader.py::test_get_contents_ok
>[gw1] [ 96%] PASSED test/unit/test_loader.py::test_get_contents_ok
>test/unit/test_loader.py::test_get_contents_invalid_path
>[gw1] [ 96%] PASSED test/unit/test_loader.py::test_get_contents_invalid_path
>test/unit/test_loader.py::test_get_contents_exception
>[gw1] [ 96%] PASSED test/unit/test_loader.py::test_get_contents_exception
>test/unit/test_runner.py::test_simple_spawn
>[gw1] [ 96%] PASSED test/unit/test_runner.py::test_simple_spawn
>test/unit/test_runner.py::test_error_code
>[gw1] [ 96%] PASSED test/unit/test_runner.py::test_error_code
>test/unit/test_runner.py::test_job_timeout
>[gw1] [ 96%] PASSED test/unit/test_runner.py::test_job_timeout
>test/unit/test_runner.py::test_cancel_callback
>[gw1] [ 97%] PASSED test/unit/test_runner.py::test_cancel_callback
>test/unit/test_runner.py::test_cancel_callback_error
>[gw1] [ 97%] PASSED test/unit/test_runner.py::test_cancel_callback_error
>test/unit/test_runner.py::test_verbose_event_created_time
>[gw1] [ 97%] PASSED test/unit/test_runner.py::test_verbose_event_created_time
>test/unit/test_runner.py::test_env_vars[abc123]
>[gw1] [ 97%] PASSED test/unit/test_runner.py::test_env_vars[abc123]
>test/unit/test_runner.py::test_env_vars[I\xf1t\xebrn\xe2ti\xf4n\xe0liz\xe6ti\xf8n]
>[gw1] [ 97%] PASSED test/unit/test_runner.py::test_env_vars[I\xf1t\xebrn\xe2ti\xf4n\xe0liz\xe6ti\xf8n]
>test/unit/test_runner.py::test_event_callback_data_check
>[gw1] [ 97%] PASSED test/unit/test_runner.py::test_event_callback_data_check
>test/unit/test_runner.py::test_event_callback_interface_has_ident
>[gw1] [ 97%] PASSED test/unit/test_runner.py::test_event_callback_interface_has_ident
>test/unit/test_runner.py::test_event_callback_interface_calls_event_handler_for_verbose_event
>[gw0] [ 97%] PASSED test/integration/test_events.py::test_basic_serializeable
>[gw1] [ 97%] PASSED test/unit/test_runner.py::test_event_callback_interface_calls_event_handler_for_verbose_event
>test/integration/test_events.py::test_event_omission
>test/unit/test_runner.py::test_status_callback_interface
>[gw1] [ 97%] PASSED test/unit/test_runner.py::test_status_callback_interface
>test/unit/test_runner.py::test_stdout_file_write[pexpect]
>[gw1] [ 97%] SKIPPED test/unit/test_runner.py::test_stdout_file_write[pexpect]
>test/unit/test_runner.py::test_stdout_file_write[subprocess]
>[gw1] [ 97%] PASSED test/unit/test_runner.py::test_stdout_file_write[subprocess]
>test/unit/test_runner.py::test_stdout_file_no_write[pexpect]
>[gw1] [ 97%] PASSED test/unit/test_runner.py::test_stdout_file_no_write[pexpect]
>test/unit/test_runner.py::test_stdout_file_no_write[subprocess]
>[gw1] [ 97%] PASSED test/unit/test_runner.py::test_stdout_file_no_write[subprocess]
>test/unit/test_runner.py::test_multiline_blank_write[pexpect]
>[gw1] [ 97%] PASSED test/unit/test_runner.py::test_multiline_blank_write[pexpect]
>test/unit/test_runner.py::test_multiline_blank_write[subprocess]
>[gw1] [ 97%] PASSED test/unit/test_runner.py::test_multiline_blank_write[subprocess]
>test/unit/test_runner.py::test_no_ResourceWarning_error[subprocess]
>[gw1] [ 97%] FAILED test/unit/test_runner.py::test_no_ResourceWarning_error[subprocess]
>test/unit/test_utils.py::test_artifact_permissions
>[gw1] [ 97%] PASSED test/unit/test_utils.py::test_artifact_permissions
>test/unit/__main__/main/test_worker.py::test_worker_delete
>[gw1] [ 97%] PASSED test/unit/__main__/main/test_worker.py::test_worker_delete
>[gw0] [ 97%] PASSED test/integration/test_events.py::test_event_omission
>test/integration/test_events.py::test_event_omission_except_failed
>[gw0] [ 97%] PASSED test/integration/test_events.py::test_event_omission_except_failed
>test/integration/test_events.py::test_runner_on_start
>[gw0] [ 98%] PASSED test/integration/test_events.py::test_runner_on_start
>test/integration/test_events.py::test_playbook_on_stats_summary_fields
>[gw0] [ 98%] PASSED test/integration/test_events.py::test_playbook_on_stats_summary_fields
>test/integration/test_events.py::test_include_role_events
>[gw0] [ 98%] PASSED test/integration/test_events.py::test_include_role_events
>test/integration/test_events.py::test_include_role_from_collection_events
>[gw0] [ 98%] PASSED test/integration/test_events.py::test_include_role_from_collection_events
>test/integration/test_interface.py::test_run
>[gw0] [ 98%] PASSED test/integration/test_interface.py::test_run
>test/integration/test_interface.py::test_run_playbook_data[playbook0]
>[gw0] [ 98%] PASSED test/integration/test_interface.py::test_run_playbook_data[playbook0]
>test/integration/test_interface.py::test_run_playbook_data[playbook1]
>[gw0] [ 98%] PASSED test/integration/test_interface.py::test_run_playbook_data[playbook1]
>test/integration/test_interface.py::test_run_async
>[gw0] [ 98%] PASSED test/integration/test_interface.py::test_run_async
>test/integration/test_interface.py::test_repeat_run_with_new_inventory
>[gw0] [ 98%] PASSED test/integration/test_interface.py::test_repeat_run_with_new_inventory
>test/integration/test_interface.py::test_env_accuracy
>[gw0] [ 98%] PASSED test/integration/test_interface.py::test_env_accuracy
>test/integration/test_interface.py::test_no_env_files
>[gw0] [ 98%] PASSED test/integration/test_interface.py::test_no_env_files
>test/integration/test_interface.py::test_env_accuracy_inside_container[docker]
>[gw0] [ 98%] SKIPPED test/integration/test_interface.py::test_env_accuracy_inside_container[docker]
>test/integration/test_interface.py::test_env_accuracy_inside_container[podman]
>[gw0] [ 98%] SKIPPED test/integration/test_interface.py::test_env_accuracy_inside_container[podman]
>test/integration/test_interface.py::test_multiple_inventories
>[gw0] [ 98%] PASSED test/integration/test_interface.py::test_multiple_inventories
>test/integration/test_interface.py::test_inventory_absolute_path
>[gw0] [ 98%] PASSED test/integration/test_interface.py::test_inventory_absolute_path
>test/integration/test_interface.py::test_run_command
>[gw0] [ 98%] PASSED test/integration/test_interface.py::test_run_command
>test/integration/test_interface.py::test_run_command_injection_error
>[gw0] [ 98%] PASSED test/integration/test_interface.py::test_run_command_injection_error
>test/integration/test_interface.py::test_run_command_injection_error_within_container[docker]
>[gw0] [ 98%] SKIPPED test/integration/test_interface.py::test_run_command_injection_error_within_container[docker]
>test/integration/test_interface.py::test_run_command_injection_error_within_container[podman]
>[gw0] [ 98%] SKIPPED test/integration/test_interface.py::test_run_command_injection_error_within_container[podman]
>test/integration/test_interface.py::test_run_ansible_command_within_container[docker]
>[gw0] [ 98%] SKIPPED test/integration/test_interface.py::test_run_ansible_command_within_container[docker]
>test/integration/test_interface.py::test_run_ansible_command_within_container[podman]
>[gw0] [ 98%] SKIPPED test/integration/test_interface.py::test_run_ansible_command_within_container[podman]
>test/integration/test_interface.py::test_run_script_within_container[docker]
>[gw0] [ 99%] SKIPPED test/integration/test_interface.py::test_run_script_within_container[docker]
>test/integration/test_interface.py::test_run_script_within_container[podman]
>[gw0] [ 99%] SKIPPED test/integration/test_interface.py::test_run_script_within_container[podman]
>test/integration/test_interface.py::test_run_command_async
>[gw0] [ 99%] PASSED test/integration/test_interface.py::test_run_command_async
>test/integration/test_interface.py::test_get_plugin_docs
>[gw0] [ 99%] PASSED test/integration/test_interface.py::test_get_plugin_docs
>test/integration/test_interface.py::test_get_plugin_docs_async
>[gw0] [ 99%] PASSED test/integration/test_interface.py::test_get_plugin_docs_async
>test/integration/test_interface.py::test_get_plugin_docs_within_container[docker]
>[gw0] [ 99%] SKIPPED test/integration/test_interface.py::test_get_plugin_docs_within_container[docker]
>test/integration/test_interface.py::test_get_plugin_docs_within_container[podman]
>[gw0] [ 99%] SKIPPED test/integration/test_interface.py::test_get_plugin_docs_within_container[podman]
>test/integration/test_interface.py::test_get_plugin_docs_list
>[gw0] [ 99%] PASSED test/integration/test_interface.py::test_get_plugin_docs_list
>test/integration/test_interface.py::test_get_plugin_docs_list_within_container[docker]
>[gw0] [ 99%] SKIPPED test/integration/test_interface.py::test_get_plugin_docs_list_within_container[docker]
>test/integration/test_interface.py::test_get_plugin_docs_list_within_container[podman]
>[gw0] [ 99%] SKIPPED test/integration/test_interface.py::test_get_plugin_docs_list_within_container[podman]
>test/integration/test_interface.py::test_ansible_config
>[gw0] [ 99%] PASSED test/integration/test_interface.py::test_ansible_config
>test/integration/test_interface.py::test_get_inventory
>[gw0] [ 99%] PASSED test/integration/test_interface.py::test_get_inventory
>test/integration/test_interface.py::test_get_inventory_within_container[docker]
>[gw0] [ 99%] SKIPPED test/integration/test_interface.py::test_get_inventory_within_container[docker]
>test/integration/test_interface.py::test_get_inventory_within_container[podman]
>[gw0] [ 99%] SKIPPED test/integration/test_interface.py::test_get_inventory_within_container[podman]
>test/integration/test_interface.py::test_run_role
>[gw0] [ 99%] PASSED test/integration/test_interface.py::test_run_role
>test/integration/test_interface.py::test_get_role_list
>[gw0] [ 99%] PASSED test/integration/test_interface.py::test_get_role_list
>test/integration/test_interface.py::test_get_role_list_within_container[docker]
>[gw0] [ 99%] SKIPPED test/integration/test_interface.py::test_get_role_list_within_container[docker]
>test/integration/test_interface.py::test_get_role_list_within_container[podman]
>[gw0] [ 99%] SKIPPED test/integration/test_interface.py::test_get_role_list_within_container[podman]
>test/integration/test_interface.py::test_get_role_argspec
>[gw0] [ 99%] PASSED test/integration/test_interface.py::test_get_role_argspec
>test/integration/test_interface.py::test_get_role_argspec_within_container[docker]
>[gw0] [ 99%] SKIPPED test/integration/test_interface.py::test_get_role_argspec_within_container[docker]
>test/integration/test_interface.py::test_get_role_argspec_within_container[podman]
>[gw0] [ 99%] SKIPPED test/integration/test_interface.py::test_get_role_argspec_within_container[podman]
>test/integration/test_main.py::test_help[None-expected0]
>[gw0] [100%] PASSED test/integration/test_main.py::test_help[None-expected0]
>=================================== FAILURES ===================================
>_____________________ test_dump_artifacts_inventory_object _____________________
>[gw2] linux -- Python 3.12.0 /usr/bin/python3
>mocker = <pytest_mock.plugin.MockerFixture object at 0xf4279c90>
> def test_dump_artifacts_inventory_object(mocker):
> mock_dump_artifact = mocker.patch('ansible_runner.utils.dump_artifact')
>
> inv = {'foo': 'bar'}
> inv_string = '{"foo": "bar"}'
> kwargs = {'private_data_dir': '/tmp', 'inventory': inv}
> dump_artifacts(kwargs)
>
>> assert mock_dump_artifact.called_once_with(inv_string, '/tmp/inventory', 'hosts.json')
>inv = {'foo': 'bar'}
>inv_string = '{"foo": "bar"}'
>kwargs = {'inventory': <MagicMock name='dump_artifact()' id='4105425712'>, 'private_data_dir': '/tmp'}
>mock_dump_artifact = <MagicMock name='dump_artifact' id='4096347688'>
>mocker = <pytest_mock.plugin.MockerFixture object at 0xf4279c90>
>test/unit/utils/test_dump_artifacts.py:147:
>_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
>self = <MagicMock name='dump_artifact' id='4096347688'>
>name = 'called_once_with'
> def __getattr__(self, name):
> if name in {'_mock_methods', '_mock_unsafe'}:
> raise AttributeError(name)
> elif self._mock_methods is not None:
> if name not in self._mock_methods or name in _all_magics:
> raise AttributeError("Mock object has no attribute %r" % name)
> elif _is_magic(name):
> raise AttributeError(name)
> if not self._mock_unsafe and (not self._mock_methods or name not in self._mock_methods):
> if name.startswith(('assert', 'assret', 'asert', 'aseert', 'assrt')) or name in _ATTRIB_DENY_LIST:
>> raise AttributeError(
> f"{name!r} is not a valid assertion. Use a spec "
> f"for the mock if {name!r} is meant to be an attribute.")
>E AttributeError: 'called_once_with' is not a valid assertion. Use a spec for the mock if 'called_once_with' is meant to be an attribute.. Did you mean: 'assert_called_once_with'?
>name = 'called_once_with'
>self = <MagicMock name='dump_artifact' id='4096347688'>
>/usr/lib/python3.12/unittest/mock.py:663: AttributeError
>__________________ test_no_ResourceWarning_error[subprocess] ___________________
>[gw1] linux -- Python 3.12.0 /usr/bin/python3
>rc = <ansible_runner.config.runner.RunnerConfig object at 0xf1b65ac8>
>runner_mode = 'subprocess'
> @pytest.mark.parametrize('runner_mode', ['subprocess'])
> @pytest.mark.filterwarnings("error")
> def test_no_ResourceWarning_error(rc, runner_mode):
> """
> Test that no ResourceWarning error is propogated up with warnings-as-errors enabled.
>
> Not properly closing stdout/stderr in Runner.run() will cause a ResourceWarning
> error that is only seen when we treat warnings as an error.
> """ > rc.command = ['echo', 'Hello World'] > rc.runner_mode = runner_mode > runner = Runner(config=rc) >> status, exitcode = runner.run() >rc = <ansible_runner.config.runner.RunnerConfig object at 0xf1b65ac8> >runner = <ansible_runner.runner.Runner object at 0xf1b658e8> >runner_mode = 'subprocess' >test/unit/test_runner.py:205: >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >ansible_runner/runner.py:273: in run > stdout_handle.write(stdout_response) > command = ['echo', 'Hello World'] > command_filename = '/tmp/pytest-of-mockbuild/pytest-0/popen-gw1/test_no_ResourceWarning_error_0/artifacts/af21d5ce-2871-4a08-acc9-c703020d7cfd/command' > cwd = '/tmp/pytest-of-mockbuild/pytest-0/popen-gw1/test_no_ResourceWarning_error_0' > env = {} > error_fd = -1 > f = <codecs.StreamReaderWriter object at 0xf1b656c0> > input_fd = None > job_events_path = '/tmp/pytest-of-mockbuild/pytest-0/popen-gw1/test_no_ResourceWarning_error_0/artifacts/af21d5ce-2871-4a08-acc9-c703020d7cfd/job_events' > kwargs = {'check': True, 'cwd': '/tmp/pytest-of-mockbuild/pytest-0/popen-gw1/test_no_ResourceWarning_error_0', 'env': {}, 'stderr': -1, ...} > output_fd = -1 > password_patterns = [] > password_values = [] > pexpect_env = {} > proc_out = CompletedProcess(args=['echo', 'Hello World'], returncode=0, stdout='Hello World\n', stderr='') > self = <ansible_runner.runner.Runner object at 0xf1b658e8> > stderr_filename = '/tmp/pytest-of-mockbuild/pytest-0/popen-gw1/test_no_ResourceWarning_error_0/artifacts/af21d5ce-2871-4a08-acc9-c703020d7cfd/stderr' > stderr_handle = <ansible_runner.utils.OutputEventFilter object at 0xf1b5a360> > stderr_response = '' > stdout_filename = '/tmp/pytest-of-mockbuild/pytest-0/popen-gw1/test_no_ResourceWarning_error_0/artifacts/af21d5ce-2871-4a08-acc9-c703020d7cfd/stdout' > stdout_handle = <ansible_runner.utils.OutputEventFilter object at 0xf1b5a288> > stdout_response = 'Hello World\n' > subprocess_timeout = None > suppress_ansible_output = True >ansible_runner/utils/__init__.py:354: in write > self._emit_event(line) > data = 'Hello World\n' > line = 'Hello World\n' > lines = ['Hello World\n'] > remainder = None > self = <ansible_runner.utils.OutputEventFilter object at 0xf1b5a288> > should_search = False >ansible_runner/utils/__init__.py:399: in _emit_event > self._event_callback(event_data) > buffered_stdout = 'Hello World\n' > event_data = {'counter': 1, 'end_line': 1, 'event': 'verbose', 'runner_ident': 'af21d5ce-2871-4a08-acc9-c703020d7cfd', ...} > n_lines = 1 > next_event_data = {} > self = <ansible_runner.utils.OutputEventFilter object at 0xf1b5a288> > stdout_chunk = 'Hello World\n' > stdout_chunks = ['Hello World\n'] >_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ >self = <ansible_runner.runner.Runner object at 0xf1b658e8> >event_data = {'counter': 1, 'end_line': 1, 'event': 'verbose', 'runner_ident': 'af21d5ce-2871-4a08-acc9-c703020d7cfd', ...} > def event_callback(self, event_data): > ''' > Invoked for every Ansible event to collect stdout with the event data and store it for > later use > ''' > self.last_stdout_update = time.time() > if 'uuid' in event_data: > filename = '{}-partial.json'.format(event_data['uuid']) > partial_filename = os.path.join(self.config.artifact_dir, > 'job_events', > filename) > full_filename = os.path.join(self.config.artifact_dir, > 'job_events', > '{}-{}.json'.format(event_data['counter'], > event_data['uuid'])) > try: > event_data.update(dict(runner_ident=str(self.config.ident))) > try: > with 
codecs.open(partial_filename, 'r', encoding='utf-8') as read_file: > partial_event_data = json.load(read_file) > event_data.update(partial_event_data) > if self.remove_partials: > os.remove(partial_filename) > except IOError as e: > msg = "Failed to open ansible stdout callback plugin partial data" \ > " file {} with error {}".format(partial_filename, str(e)) > debug(msg) > if self.config.check_job_event_data: > raise AnsibleRunnerException(msg) > > # prefer 'created' from partial data, but verbose events set time here > if 'created' not in event_data: >> event_data['created'] = datetime.datetime.utcnow().isoformat() >E DeprecationWarning: datetime.utcnow() is deprecated and scheduled for removal in a future version. Use timezone-aware objects to represent datetimes in UTC: datetime.now(datetime.UTC). >event_data = {'counter': 1, 'end_line': 1, 'event': 'verbose', 'runner_ident': 'af21d5ce-2871-4a08-acc9-c703020d7cfd', ...} >filename = 'b0987d03-32a4-499f-958a-94bb768d2cf1-partial.json' >full_filename = '/tmp/pytest-of-mockbuild/pytest-0/popen-gw1/test_no_ResourceWarning_error_0/artifacts/af21d5ce-2871-4a08-acc9-c703020d7cfd/job_events/1-b0987d03-32a4-499f-958a-94bb768d2cf1.json' >msg = "Failed to open ansible stdout callback plugin partial data file /tmp/pytest-of-mockbuild/pytest-0/popen-gw1/test_no_R...g_error_0/artifacts/af21d5ce-2871-4a08-acc9-c703020d7cfd/job_events/b0987d03-32a4-499f-958a-94bb768d2cf1-partial.json'" >partial_filename = '/tmp/pytest-of-mockbuild/pytest-0/popen-gw1/test_no_ResourceWarning_error_0/artifacts/af21d5ce-2871-4a08-acc9-c703020d7cfd/job_events/b0987d03-32a4-499f-958a-94bb768d2cf1-partial.json' >self = <ansible_runner.runner.Runner object at 0xf1b658e8> >ansible_runner/runner.py:83: DeprecationWarning >=============================== warnings summary =============================== >../../../../usr/lib/python3.12/site-packages/xdist/plugin.py:252 >../../../../usr/lib/python3.12/site-packages/xdist/plugin.py:252 >../../../../usr/lib/python3.12/site-packages/xdist/plugin.py:252 >../../../../usr/lib/python3.12/site-packages/xdist/plugin.py:252 >../../../../usr/lib/python3.12/site-packages/xdist/plugin.py:252 >../../../../usr/lib/python3.12/site-packages/xdist/plugin.py:252 > /usr/lib/python3.12/site-packages/xdist/plugin.py:252: DeprecationWarning: The --rsyncdir command line argument and rsyncdirs config variable are deprecated. > The rsync feature will be removed in pytest-xdist 4.0. > config.issue_config_time_warning(warning, 2) >test/integration/test_main.py: 9 warnings >test/integration/test_runner.py: 23 warnings >test/unit/test_runner.py: 10 warnings > /usr/lib/python3.12/pty.py:95: DeprecationWarning: This process (pid=563) is multi-threaded, use of forkpty() may lead to deadlocks in the child. > pid, fd = os.forkpty() >test/integration/test___main__.py: 2 warnings >test/integration/test_config.py: 3 warnings >test/integration/test_display_callback.py: 35 warnings >test/integration/test_events.py: 7 warnings >test/integration/test_interface.py: 13 warnings > /usr/lib/python3.12/pty.py:95: DeprecationWarning: This process (pid=560) is multi-threaded, use of forkpty() may lead to deadlocks in the child. > pid, fd = os.forkpty() >test/integration/test_main.py::test_role_start >test/integration/test_main.py::test_playbook_start > /usr/lib/python3.12/multiprocessing/popen_fork.py:66: DeprecationWarning: This process (pid=563) is multi-threaded, use of fork() may lead to deadlocks in the child. 
> self.pid = os.fork()
>test/integration/test_runner.py: 52662 warnings
>test/integration/test_display_callback.py: 3 warnings
>test/unit/test_runner.py: 23 warnings
>test/integration/test_interface.py: 4565 warnings
> /builddir/build/BUILD/ansible-runner-2.3.3/ansible_runner/runner.py:83: DeprecationWarning: datetime.utcnow() is deprecated and scheduled for removal in a future version. Use timezone-aware objects to represent datetimes in UTC: datetime.now(datetime.UTC).
> event_data['created'] = datetime.datetime.utcnow().isoformat()
>-- Docs: https://docs.pytest.org/en/stable/how-to/capture-warnings.html
>----------- coverage: platform linux, python 3.12.0-beta-4 -----------
>Name Stmts Miss Branch BrPart Cover
>-------------------------------------------------------------------------------------------
>ansible_runner/__main__.py 310 37 95 17 85%
>ansible_runner/cleanup.py 118 52 49 4 52%
>ansible_runner/config/_base.py 390 52 190 27 83%
>ansible_runner/config/ansible_cfg.py 30 1 12 1 95%
>ansible_runner/config/command.py 43 5 18 5 84%
>ansible_runner/config/doc.py 73 7 38 10 85%
>ansible_runner/config/inventory.py 49 1 30 1 97%
>ansible_runner/config/runner.py 242 28 116 21 85%
>ansible_runner/display_callback/callback/awx_display.py 442 134 208 37 64%
>ansible_runner/interface.py 190 24 40 10 80%
>ansible_runner/loader.py 61 2 20 2 95%
>ansible_runner/output.py 45 4 16 3 85%
>ansible_runner/runner.py 333 64 152 28 78%
>ansible_runner/streaming.py 248 198 96 0 17%
>ansible_runner/utils/__init__.py 296 19 127 15 91%
>ansible_runner/utils/base64io.py 121 43 39 10 58%
>ansible_runner/utils/capacity.py 36 12 12 0 67%
>ansible_runner/utils/importlib_compat.py 4 1 2 1 67%
>ansible_runner/utils/streaming.py 76 1 48 3 97%
>-------------------------------------------------------------------------------------------
>TOTAL 3123 685 1308 195 74%
>8 files skipped due to complete coverage.
>Coverage HTML written to dir test/coverage/reports/html
>Coverage XML written to file test/coverage/reports/coverage.xml
>============================= slowest 10 durations =============================
>13.32s call test/integration/test_runner.py::test_run_command_long_running_children
>13.15s call test/integration/test_runner.py::test_run_command_long_running
>5.55s call test/integration/test_display_callback.py::test_callback_plugin_no_log_filters[playbook4]
>5.52s call test/integration/test_display_callback.py::test_module_level_no_log[playbook0]
>5.15s call test/integration/test___main__.py::test_cmdline_playbook
>4.74s call test/integration/test_interface.py::test_run_playbook_data[playbook0]
>4.71s call test/integration/test_interface.py::test_run_playbook_data[playbook1]
>4.69s call test/integration/test_display_callback.py::test_callback_plugin_strips_task_environ_variables[playbook0]
>4.60s call test/integration/test_interface.py::test_repeat_run_with_new_inventory
>4.58s call test/integration/test_runner.py::test_run_command_ansible_rotate_artifacts
>=========================== short test summary info ============================
>SKIPPED [1] test/integration/test_transmit_worker_process.py:356: Ansible could not initialize the preferred locale: unsupported locale setting
>SKIPPED [1] test/integration/test_display_callback.py:166: can not resolve example.com in build system
>SKIPPED [1] test/integration/test_display_callback.py:202: ansible version lookup is blank in build
>SKIPPED [1] test/integration/containerized/test_cleanup_images.py:13: docker is not installed
>SKIPPED [1] test/integration/containerized/test_cleanup_images.py:13: podman is not installed
>SKIPPED [1] test/integration/containerized/test_cli_containerized.py:13: docker is not installed
>SKIPPED [1] test/integration/containerized/test_cli_containerized.py:13: podman is not installed
>SKIPPED [1] test/integration/containerized/test_cli_containerized.py:26: docker is not installed
>SKIPPED [1] test/integration/containerized/test_cli_containerized.py:26: podman is not installed
>SKIPPED [1] test/integration/containerized/test_cli_containerized.py:44: docker is not installed
>SKIPPED [1] test/integration/containerized/test_cli_containerized.py:44: podman is not installed
>SKIPPED [1] test/integration/containerized/test_cli_containerized.py:55: docker is not installed
>SKIPPED [1] test/integration/containerized/test_cli_containerized.py:55: podman is not installed
>SKIPPED [1] test/integration/containerized/test_container_management.py:52: docker is not installed
>SKIPPED [1] test/integration/containerized/test_container_management.py:52: podman is not installed
>SKIPPED [1] test/integration/containerized/test_container_management.py:78: docker is not installed
>SKIPPED [1] test/integration/containerized/test_container_management.py:78: podman is not installed
>SKIPPED [1] test/integration/containerized/test_container_management.py:97: docker is not installed
>SKIPPED [1] test/integration/containerized/test_container_management.py:97: podman is not installed
>SKIPPED [1] test/integration/containerized/test_container_management.py:146: docker is not installed
>SKIPPED [1] test/integration/containerized/test_container_management.py:146: podman is not installed
>SKIPPED [2] test/integration/test_events.py:7: docker is not installed
>SKIPPED [2] test/integration/test_events.py:7: podman is not installed
>SKIPPED [2] test/integration/test_events.py:53: docker is not installed
>SKIPPED [2] test/integration/test_events.py:53: podman is not installed
>SKIPPED [1] test/unit/test_runner.py:153: Writing to stdout can be flaky, probably due to some pexpect bug
>SKIPPED [1] test/integration/test_interface.py:139: docker is not installed
>SKIPPED [1] test/integration/test_interface.py:139: podman is not installed
>SKIPPED [1] test/integration/test_interface.py:235: docker is not installed
>SKIPPED [1] test/integration/test_interface.py:235: podman is not installed
>SKIPPED [1] test/integration/test_interface.py:249: docker is not installed
>SKIPPED [1] test/integration/test_interface.py:249: podman is not installed
>SKIPPED [1] test/integration/test_interface.py:270: docker is not installed
>SKIPPED [1] test/integration/test_interface.py:270: podman is not installed
>SKIPPED [1] test/integration/test_interface.py:330: docker is not installed
>SKIPPED [1] test/integration/test_interface.py:330: podman is not installed
>SKIPPED [1] test/integration/test_interface.py:356: docker is not installed
>SKIPPED [1] test/integration/test_interface.py:356: podman is not installed
>SKIPPED [1] test/integration/test_interface.py:395: docker is not installed
>SKIPPED [1] test/integration/test_interface.py:395: podman is not installed
>SKIPPED [1] test/integration/test_interface.py:452: docker is not installed
>SKIPPED [1] test/integration/test_interface.py:452: podman is not installed
>SKIPPED [1] test/integration/test_interface.py:511: docker is not installed
>SKIPPED [1] test/integration/test_interface.py:511: podman is not installed
>XPASS test/integration/test_runner.py::test_password_prompt Test is unstable
>FAILED test/unit/utils/test_dump_artifacts.py::test_dump_artifacts_inventory_object - AttributeError: 'called_once_with' is not a valid assertion. Use a spec for...
>FAILED test/unit/test_runner.py::test_no_ResourceWarning_error[subprocess] - DeprecationWarning: datetime.utcnow() is deprecated and scheduled for remov...
>= 2 failed, 2053 passed, 48 skipped, 1 xpassed, 57363 warnings in 201.02s (0:03:21) =
>error: Bad exit status from /var/tmp/rpm-tmp.dKKBPj (%check)
> Bad exit status from /var/tmp/rpm-tmp.dKKBPj (%check)
>RPM build errors:
>Child return code was: 1
>EXCEPTION: [Error('Command failed: \n # /usr/bin/systemd-nspawn -q -M d580ebc2460e437681ea31c765a16f79 -D /var/lib/mock/f39-build-44332137-5276202/root -a -u mockbuild --capability=cap_ipc_lock --bind=/tmp/mock-resolv.4dn1ua1j:/etc/resolv.conf --bind=/dev/btrfs-control --bind=/dev/mapper/control --bind=/dev/loop-control --bind=/dev/loop0 --bind=/dev/loop1 --bind=/dev/loop2 --bind=/dev/loop3 --bind=/dev/loop4 --bind=/dev/loop5 --bind=/dev/loop6 --bind=/dev/loop7 --bind=/dev/loop8 --bind=/dev/loop9 --bind=/dev/loop10 --bind=/dev/loop11 --console=pipe --setenv=TERM=vt100 --setenv=SHELL=/bin/bash --setenv=HOME=/builddir --setenv=HOSTNAME=mock --setenv=PATH=/usr/bin:/bin:/usr/sbin:/sbin --setenv=PROMPT_COMMAND=printf "\\033]0;<mock-chroot>\\007" --setenv=PS1=<mock-chroot> \\s-\\v\\$ --setenv=LANG=C.UTF-8 --resolv-conf=off bash --login -c /usr/bin/rpmbuild -ba --noprep --noclean --target noarch --nodeps /builddir/build/SPECS/python-ansible-runner.spec\n', 1)]
>Traceback (most recent call last):
> File "/usr/lib/python3.11/site-packages/mockbuild/trace_decorator.py", line 93, in trace
> result = func(*args, **kw)
> ^^^^^^^^^^^^^^^^^
> File "/usr/lib/python3.11/site-packages/mockbuild/util.py", line 597, in do_with_status
> raise exception.Error("Command failed: \n # %s\n%s" % (command, output), child.returncode)
>mockbuild.exception.Error: Command failed:
> # /usr/bin/systemd-nspawn -q -M d580ebc2460e437681ea31c765a16f79 -D /var/lib/mock/f39-build-44332137-5276202/root -a -u mockbuild --capability=cap_ipc_lock --bind=/tmp/mock-resolv.4dn1ua1j:/etc/resolv.conf --bind=/dev/btrfs-control --bind=/dev/mapper/control --bind=/dev/loop-control --bind=/dev/loop0 --bind=/dev/loop1 --bind=/dev/loop2 --bind=/dev/loop3 --bind=/dev/loop4 --bind=/dev/loop5 --bind=/dev/loop6 --bind=/dev/loop7 --bind=/dev/loop8 --bind=/dev/loop9 --bind=/dev/loop10 --bind=/dev/loop11 --console=pipe --setenv=TERM=vt100 --setenv=SHELL=/bin/bash --setenv=HOME=/builddir --setenv=HOSTNAME=mock --setenv=PATH=/usr/bin:/bin:/usr/sbin:/sbin --setenv=PROMPT_COMMAND=printf "\033]0;<mock-chroot>\007" --setenv=PS1=<mock-chroot> \s-\v\$ --setenv=LANG=C.UTF-8 --resolv-conf=off bash --login -c /usr/bin/rpmbuild -ba --noprep --noclean --target noarch --nodeps /builddir/build/SPECS/python-ansible-runner.spec
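
Editor's note on the first failure: test_dump_artifacts_inventory_object calls mock_dump_artifact.called_once_with(...), which has never been a real Mock assertion; Python 3.12's unittest.mock rejects such deny-listed names with AttributeError instead of silently returning a truthy MagicMock, which is why the test only fails on rawhide. A minimal sketch of the likely fix is below; the import path of dump_artifacts is assumed from the mocked target, the arguments are kept exactly as they appear in the log, and the real upstream patch may differ. Note that once the call becomes a real assertion it can also surface an argument mismatch that the old spelling never checked.

    # Sketch of the probable fix in test/unit/utils/test_dump_artifacts.py:
    # only the assertion name changes.
    from ansible_runner.utils import dump_artifacts  # import path assumed from the mocked target


    def test_dump_artifacts_inventory_object(mocker):
        mock_dump_artifact = mocker.patch('ansible_runner.utils.dump_artifact')

        inv = {'foo': 'bar'}
        inv_string = '{"foo": "bar"}'
        kwargs = {'private_data_dir': '/tmp', 'inventory': inv}
        dump_artifacts(kwargs)

        # was: assert mock_dump_artifact.called_once_with(...)
        # -- not an assertion method, raises AttributeError under Python 3.12's mock
        mock_dump_artifact.assert_called_once_with(inv_string, '/tmp/inventory', 'hosts.json')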
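
Editor's note on the second failure: test_no_ResourceWarning_error runs with @pytest.mark.filterwarnings("error"), so the DeprecationWarning emitted by datetime.datetime.utcnow() at ansible_runner/runner.py:83 is promoted to an exception. A hedged sketch of the replacement recommended by the warning text follows; utc_created_timestamp is a hypothetical helper used only to illustrate the call, and upstream may instead choose to strip the "+00:00" offset to preserve the old naive ISO format.

    import datetime


    def utc_created_timestamp() -> str:
        # Timezone-aware replacement for datetime.utcnow().isoformat().
        # datetime.timezone.utc also works on Pythons older than 3.11 (datetime.UTC is 3.11+).
        # Unlike the naive utcnow() form, the result ends in '+00:00'.
        return datetime.datetime.now(datetime.timezone.utc).isoformat()


    # Hypothetical use at the call site flagged in the log (runner.py:83):
    # event_data['created'] = utc_created_timestamp()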