Bug 2427743 (CVE-2026-21869)

Summary: CVE-2026-21869 llama.cpp: Remote code execution via invalid n_discard parameter in server endpoints
Product: [Other] Security Response
Reporter: OSIDB Bzimport <bzimport>
Component: vulnerability
Assignee: Product Security DevOps Team <prodsec-dev>
Status: NEW
QA Contact:
Severity: high
Docs Contact:
Priority: high
Version: unspecified
Keywords: Security
Target Milestone: ---
Target Release: ---
Hardware: All
OS: Linux
Whiteboard:
Fixed In Version:
Doc Type: ---
Doc Text:
A flaw was found in llama.cpp. A remote attacker can exploit an input validation vulnerability in the server's completion endpoints. By supplying a negative value for the `n_discard` parameter in JSON input, an attacker can cause out-of-bounds memory writes, leading to a process crash or to remote code execution (RCE) on the affected system.
Story Points: ---
Clone Of:
Environment:
Last Closed:
Type: ---
Regression: ---
Mount Type: ---
Documentation: ---
CRM:
Verified Versions:
Category: ---
oVirt Team: ---
RHEL 7.3 requirements from Atomic Host:
Cloudforms Team: ---
Target Upstream Version:
Embargoed:
Bug Depends On: 2427783, 2427784
Bug Blocks:

Description OSIDB Bzimport 2026-01-08 00:03:22 UTC
llama.cpp is a C/C++ library for inference of several LLM models. In commit 55d4206c8 and prior, the n_discard parameter is parsed directly from JSON input in the llama.cpp server's completion endpoints without validation to ensure it is non-negative. When a negative value is supplied and the context fills up, llama_memory_seq_rm/add receives a reversed range and a negative offset, causing out-of-bounds memory writes in the token evaluation loop. This deterministic memory corruption can crash the process or enable remote code execution (RCE). There is no fix at the time of publication.
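The failure mode can be sketched in a few lines. This is an illustrative model, not the actual llama.cpp code: the hypothetical `context_shift` helper mimics the context-shift step the advisory describes, where the server erases `n_discard` tokens after the first `n_keep` (analogous to `llama_memory_seq_rm`) and the tail shifts left by `n_discard` (analogous to `llama_memory_seq_add` with offset `-n_discard`). With a negative `n_discard`, the erase range `[n_keep, n_keep + n_discard)` is reversed and the shift offset flips sign, which in the real evaluation loop becomes an out-of-bounds write. The guard shown is the kind of validation the advisory says is missing.

```cpp
#include <vector>

// Sketch of the context-shift step (assumed shape, not llama.cpp's code).
// Returns false instead of shifting when the parameters are unsafe.
bool context_shift(std::vector<int> &tokens, int n_keep, int n_discard) {
    // Missing validation per the advisory: reject a non-positive n_discard
    // coming from untrusted JSON, and keep the range inside the context.
    if (n_discard <= 0 || n_keep < 0 ||
        n_keep + n_discard > (int) tokens.size()) {
        return false;
    }
    // Erase [n_keep, n_keep + n_discard) -- the llama_memory_seq_rm analogue.
    // std::vector::erase requires first <= last; a reversed range here is
    // undefined behavior, mirroring the corruption in the real code.
    tokens.erase(tokens.begin() + n_keep,
                 tokens.begin() + n_keep + n_discard);
    // The remaining tail is now shifted left by n_discard, the analogue of
    // llama_memory_seq_add with offset -n_discard.
    return true;
}
```

With `n_discard = -1` the unguarded version would call `erase` with `last` before `first` and later shift tokens past the end of the buffer; the guard converts that into a clean rejection.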