Bug 2427743 (CVE-2026-21869) - CVE-2026-21869 llama.cpp: Remote code execution via invalid n_discard parameter in server endpoints
Summary: CVE-2026-21869 llama.cpp: Remote code execution via invalid n_discard parameter in server endpoints
Keywords:
Status: NEW
Alias: CVE-2026-21869
Product: Security Response
Classification: Other
Component: vulnerability
Version: unspecified
Hardware: All
OS: Linux
Severity: high
Priority: high
Target Milestone: ---
Assignee: Product Security DevOps Team
QA Contact:
URL:
Whiteboard:
Depends On: 2427783 2427784
Blocks:
Reported: 2026-01-08 00:03 UTC by OSIDB Bzimport
Modified: 2026-01-08 05:22 UTC (History)
0 users

Fixed In Version:
Clone Of:
Environment:
Last Closed:
Embargoed:


Description OSIDB Bzimport 2026-01-08 00:03:22 UTC
llama.cpp is an LLM inference engine written in C/C++. In commit 55d4206c8 and prior, the n_discard parameter is parsed directly from JSON input in the llama.cpp server's completion endpoints without validation to ensure it is non-negative. When a negative value is supplied and the context fills up, llama_memory_seq_rm/add receives a reversed range and a negative offset, causing out-of-bounds memory writes in the token evaluation loop. This deterministic memory corruption can crash the process or enable remote code execution (RCE). There is no fix at the time of publication.

