vLLM is an inference and serving engine for large language models (LLMs). In versions 0.8.0 up to but excluding 0.9.0, hitting the /v1/completions API with an invalid json_schema as a Guided Param kills the vllm server. This vulnerability is similar to GHSA-9hcf-v7m4-6m2j/CVE-2025-48943, but for regex instead of a JSON schema. Version 0.9.0 fixes the issue.
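The crash class described above can be illustrated with a hedged client-side sketch: before a guided-decoding request reaches an unpatched (<0.9.0) server, reject payloads whose schema is obviously malformed. The field names (`guided_json`, `/v1/completions`) follow vLLM's OpenAI-compatible API; the validation logic itself is a simplified assumption for illustration, not vLLM's actual fix.

```python
import json

# Minimal pre-flight check (an illustrative assumption, not vLLM's patch):
# reject schemas whose "type" is not a legal JSON Schema type before they
# reach an unpatched (<0.9.0) server, where they could crash the process.
VALID_TYPES = {"object", "array", "string", "number", "integer", "boolean", "null"}

def looks_like_valid_schema(schema) -> bool:
    """Very shallow sanity check on a JSON Schema candidate."""
    if not isinstance(schema, dict):
        return False
    t = schema.get("type")
    if t is not None and t not in VALID_TYPES:
        return False
    return True

# Payload shaped like a vLLM OpenAI-compatible /v1/completions request with
# guided decoding; "objekt" is an invalid type that would trigger the DoS.
payload = {
    "model": "some-model",          # hypothetical model name
    "prompt": "Give me JSON",
    "guided_json": {"type": "objekt"},
}

if not looks_like_valid_schema(payload["guided_json"]):
    print("refusing to send: invalid json_schema")
else:
    body = json.dumps(payload)      # would be POSTed to /v1/completions
```

A real client would want a full meta-schema validation (e.g. the `jsonschema` package's `check_schema`) rather than this shallow check; the point is only that validation belongs in front of an unpatched server.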
History
Tue, 24 Jun 2025 18:00:00 +0000
Type | Values Removed | Values Added |
---|---|---|
First Time appeared | | Vllm<br>Vllm vllm |
CPEs | | cpe:2.3:a:vllm:vllm:*:*:*:*:*:*:*:* |
Vendors & Products | | Vllm<br>Vllm vllm |
Sat, 31 May 2025 03:15:00 +0000
Type | Values Removed | Values Added |
---|---|---|
References | | |
Metrics | threat_severity | threat_severity |
Fri, 30 May 2025 21:15:00 +0000
Type | Values Removed | Values Added |
---|---|---|
Metrics | | ssvc |
Fri, 30 May 2025 18:45:00 +0000
Type | Values Removed | Values Added |
---|---|---|
Description | | vLLM is an inference and serving engine for large language models (LLMs). In versions 0.8.0 up to but excluding 0.9.0, hitting the /v1/completions API with an invalid json_schema as a Guided Param kills the vllm server. This vulnerability is similar to GHSA-9hcf-v7m4-6m2j/CVE-2025-48943, but for regex instead of a JSON schema. Version 0.9.0 fixes the issue. |
Title | | vLLM DOS: Remotely kill vllm over http with invalid JSON schema |
Weaknesses | | CWE-248 |
References | | |
Metrics | | cvssV3_1 |

Status: PUBLISHED
Assigner: GitHub_M
Published:
Updated: 2025-05-30T20:37:06.015Z
Reserved: 2025-05-28T18:49:07.581Z
Link: CVE-2025-48942

Updated: 2025-05-30T20:37:00.956Z

Status : Analyzed
Published: 2025-05-30T19:15:30.130
Modified: 2025-06-24T17:44:47.737
Link: CVE-2025-48942
