Bug 2369480 (CVE-2025-48944) - CVE-2025-48944 vllm: vLLM Tool Schema denial of service
Summary: CVE-2025-48944 vllm: vLLM Tool Schema denial of service
Keywords:
Status: NEW
Alias: CVE-2025-48944
Product: Security Response
Classification: Other
Component: vulnerability
Version: unspecified
Hardware: All
OS: Linux
Priority: medium
Severity: medium
Target Milestone: ---
Assignee: Product Security DevOps Team
QA Contact:
URL:
Whiteboard:
Depends On:
Blocks:
 
Reported: 2025-05-30 19:01 UTC by OSIDB Bzimport
Modified: 2025-06-03 14:25 UTC
CC List: 5 users

Fixed In Version:
Clone Of:
Environment:
Last Closed:
Embargoed:



Description OSIDB Bzimport 2025-05-30 19:01:25 UTC
vLLM is an inference and serving engine for large language models (LLMs). In versions 0.8.0 up to but excluding 0.9.0, the vLLM backend serving the OpenAI-compatible /v1/chat/completions endpoint fails to validate unexpected or malformed input in the "pattern" and "type" fields of a tool schema when the tools functionality is invoked. Because these fields are compiled or parsed without prior validation, a single crafted request can crash the inference worker, which then remains down until it is restarted. Version 0.9.0 fixes the issue.
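For illustration, below is a minimal sketch in Python of the failure mode, assuming a guided-decoding path that compiles a tool schema's "pattern" field with the standard re module. The payload shape, function names, and mitigation here are assumptions for demonstration, not vLLM's actual internals:

import re
from typing import Optional

# Illustrative tool schema as a client might send it to
# /v1/chat/completions; the "pattern" value is not a valid
# regular expression.
tool_schema = {
    "type": "string",
    "pattern": "[invalid(regex",  # unbalanced bracket and paren
}

def compile_unvalidated(schema: dict) -> re.Pattern:
    # Sketch of the vulnerable behavior: the client-supplied
    # pattern is compiled as-is, so a malformed value raises
    # re.error and, if uncaught, takes down the worker that is
    # handling the request.
    return re.compile(schema["pattern"])

def compile_validated(schema: dict) -> Optional[re.Pattern]:
    # Sketch of the mitigation: check the field's type and
    # reject malformed input instead of letting the exception
    # propagate out of the request handler.
    pattern = schema.get("pattern")
    if not isinstance(pattern, str):
        return None
    try:
        return re.compile(pattern)
    except re.error:
        return None

if __name__ == "__main__":
    print(compile_validated(tool_schema))  # None: input rejected
    compile_unvalidated(tool_schema)       # raises re.error

In a real server, the validated path would map the rejection to an HTTP 400 response to the client rather than letting the exception propagate and kill the inference worker.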

