CVE-2025-62426


Date: November 20, 2025

vLLM is an inference and serving engine for large language models (LLMs). From version 0.5.5 up to (but not including) 0.11.1, the /v1/chat/completions and /tokenize endpoints accept a chat_template_kwargs request parameter that is used by the serving code before it is validated against the chat template. A crafted chat_template_kwargs value can tie up the API server's processing for long periods, delaying all other requests. This issue has been patched in version 0.11.1.
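For orientation, chat_template_kwargs is an optional field of the JSON request body on these endpoints. A minimal sketch of a request carrying it is shown below; the host, port, and model name are placeholder assumptions, and no exploit payload is included:

```python
import requests

# Sketch of the request shape only; "localhost:8000" and "example-model" are
# placeholder assumptions, not values from the advisory. On affected versions
# (>= 0.5.5, < 0.11.1) the contents of chat_template_kwargs reach chat-template
# rendering before validation, so operators who cannot upgrade immediately may
# choose to strip or reject this field at a reverse proxy.
response = requests.post(
    "http://localhost:8000/v1/chat/completions",
    json={
        "model": "example-model",
        "messages": [{"role": "user", "content": "Hello"}],
        "chat_template_kwargs": {},  # the parameter at issue in CVE-2025-62426
    },
    timeout=30,
)
print(response.status_code)
```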

Severity Score

6.5 (Medium)

Weakness Type (CWE)

CWE-770: Allocation of Resources Without Limits or Throttling

Top Fix


Upgrade Version

Upgrade vllm to version 0.11.1 (https://github.com/vllm-project/vllm.git, tag v0.11.1).
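A quick way to check whether an environment falls inside the affected range, sketched in Python (assumes the packaging library is installed):

```python
from importlib.metadata import PackageNotFoundError, version
from packaging.version import Version

FIRST_AFFECTED = Version("0.5.5")   # first affected release per the advisory
FIXED = Version("0.11.1")           # patched release

try:
    installed = Version(version("vllm"))
except PackageNotFoundError:
    print("vllm is not installed in this environment")
else:
    if FIRST_AFFECTED <= installed < FIXED:
        print(f"vllm {installed} is in the affected range; upgrade to {FIXED} or later")
    else:
        print(f"vllm {installed} is outside the affected range")
```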


CVSS v3.1

Base Score: 6.5 (Medium)
Attack Vector (AV): NETWORK
Attack Complexity (AC): LOW
Privileges Required (PR): LOW
User Interaction (UI): NONE
Scope (S): UNCHANGED
Confidentiality (C): NONE
Integrity (I): NONE
Availability (A): HIGH
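These metric values determine the base score via the published CVSS v3.1 equations; a small self-contained calculation (weights taken from the CVSS v3.1 specification) is sketched below:

```python
import math

# Metric weights for AV:N, AC:L, PR:L (Scope Unchanged), UI:N, C:N, I:N, A:H,
# as defined in the CVSS v3.1 specification.
AV, AC, PR, UI = 0.85, 0.77, 0.62, 0.85
C, I, A = 0.0, 0.0, 0.56

def roundup(x: float) -> float:
    """CVSS v3.1 round-up: smallest one-decimal value >= x."""
    return math.ceil(x * 10) / 10

iss = 1 - (1 - C) * (1 - I) * (1 - A)      # Impact Sub-Score
impact = 6.42 * iss                        # Scope Unchanged form
exploitability = 8.22 * AV * AC * PR * UI

base_score = 0.0 if impact <= 0 else roundup(min(impact + exploitability, 10))
print(base_score)  # 6.5
```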
