python-llm fails to build with Python 3.15.0a8.

________ test_gpt5_verbosity_option_is_sent_to_openai_chat_completions _________

httpx_mock = <pytest_httpx._httpx_mock.HTTPXMock object at 0x7fa657cb3610>

    def test_gpt5_verbosity_option_is_sent_to_openai_chat_completions(httpx_mock):
        httpx_mock.add_response(
            method="POST",
            url="https://api.openai.com/v1/chat/completions",
            json={
                "model": "gpt-5",
                "usage": {},
                "choices": [{"message": {"content": "Verbose enough"}}],
            },
            headers={"Content-Type": "application/json"},
        )
        runner = CliRunner()
        result = runner.invoke(
            cli,
            [
                "-m",
                "gpt-5",
                "-o",
                "verbosity",
                "high",
                "--no-stream",
                "--key",
                "x",
                "Say hi",
            ],
            catch_exceptions=False,
        )
        assert result.exit_code == 0
        request_body = json.loads(httpx_mock.get_requests()[-1].content)
>       assert request_body["verbosity"] == "high"
E       AssertionError: assert 'VerbosityEnum.high' == 'high'
E
E       - high
E       + VerbosityEnum.high

https://docs.python.org/3.15/whatsnew/3.15.html

For the build logs, see:
https://copr-be.cloud.fedoraproject.org/results/@python/python3.15/fedora-rawhide-x86_64/10425222-python-llm/

For all our attempts to build python-llm with Python 3.15, see:
https://copr.fedorainfracloud.org/coprs/g/python/python3.15/package/python-llm/

Testing and mass rebuild of packages is happening in copr. You can follow these instructions to test locally in mock whether your package builds with Python 3.15:
https://copr.fedorainfracloud.org/coprs/g/python/python3.15/

Let us know here if you have any questions.

Python 3.15 is planned to be included in Fedora 45. To make that update smoother, we're building Fedora packages with all pre-releases of Python 3.15. A build failure prevents us from testing all dependent packages (transitive [Build]Requires), so if this package is required a lot, it's important for us to get it fixed soon. We'd appreciate help from the people who know this package best, but if you don't want to work on this now, let us know so we can try to work around it on our side.