Co-authored-by: pacoxu <2010320+pacoxu@users.noreply.github.com>
docs: add vLLM development getting-started guide to vllm/ (Mar 3, 2026)
@noooop A copy has also been synced here.
Adds a new Chinese-language developer onboarding guide for vLLM contributors under vllm/vLLM-dev-guide.md, based on a community contribution by noooop.

Contents:
- uv venv setup, plus two install paths (Python-only with VLLM_USE_PRECOMPILED=1 vs. a full C++/CUDA build)
- launch.json configuration for a single-GPU vLLM server with debugpy
- --profiler-config flag, /start_profile and /stop_profile endpoints, perfetto.dev visualization
- pre-commit setup and manual invocation (pre-commit run -a)

Original prompt
This section details the original issue to resolve:
<issue_title>Getting started with vLLM development</issue_title>
<issue_description>Author: noooop
Hi everyone! If you're interested in the vLLM project, here is a contribution guide from a community member to help you get up and running smoothly.
vLLM welcomes help in any form!
Not sure where to start?
Check out the project's Job Board, which lists many beginner-friendly tasks and new-model-support tasks; just pick one that interests you!
🚀 Step 1: Set up your development environment
git clone https://github.com/vllm-project/vllm.git
cd vllm
We recommend managing the environment with uv, which is lightweight and fast. After installing uv, create an environment in one step:
uv venv --python 3.12 --seed
source .venv/bin/activate
Note: Python 3.12 is recommended; vLLM's main testing and compatibility work targets this version, which reduces mismatches between your local setup and the test environment.
There are two install paths. For Python-only development, reuse the precompiled kernels and skip the local C++/CUDA build:
VLLM_USE_PRECOMPILED=1 uv pip install -e .
Or do a full C++/CUDA build from source:
uv pip install -e .
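After installing, a quick Python-side sanity check can confirm you are on the recommended interpreter. The helper below is illustrative, not part of vLLM:

```python
import sys

def env_ok(expected=(3, 12)):
    """Return True when the active interpreter matches the Python line
    recommended above (3.12). Illustrative helper, not a vLLM utility."""
    return sys.version_info[:2] == tuple(expected)

# Inside the .venv created above, this should hold:
# assert env_ok(), sys.version
```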
🔧 Development and debugging tips
If you use VS Code, you can copy the launch.json configuration below to start a vLLM server under the debugger in one click.
{
    // Use IntelliSense to learn about possible attributes.
    // Hover to view descriptions of existing attributes.
    // For more information, visit: https://go.microsoft.com/fwlink/?linkid=830387
    "version": "0.2.0",
    "configurations": [
        {
            "name": "vllm server single",
            "type": "debugpy",
            "request": "launch",
            "module": "vllm.entrypoints.cli.main",
            "env": {
                "VLLM_LOGGING_LEVEL": "DEBUG",
                // "VLLM_USE_MODELSCOPE": "True",
                // "MODELSCOPE_DOWNLOAD_PARALLELS": "10",
            },
            "args": [
                "serve",
                "Qwen/Qwen3-0.6B",
                "--reasoning-parser",
                "qwen3",
                "--gpu-memory-utilization",
                "0.8",
                "--port",
                "8000",
                "--enforce-eager",
                "--max-model-len",
                "5120",
                "-tp",
                "1",
            ],
        },
    ]
}
To profile the server with the torch profiler, start it with a profiler config:
vllm serve Qwen/Qwen3-0.6B --profiler-config '{"profiler": "torch", "torch_profiler_dir": "./vllm_profile"}'
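If you prefer to assemble this command programmatically (for scripts or launchers), here is a small sketch. The helper name is hypothetical; only the "profiler" and "torch_profiler_dir" keys come from the example above:

```python
import json
import shlex

def build_serve_cmd(model, profiler_dir="./vllm_profile"):
    # Build the `vllm serve` invocation shown above; the --profiler-config
    # value is a JSON object, so serialize it with json.dumps.
    cfg = {"profiler": "torch", "torch_profiler_dir": profiler_dir}
    return ["vllm", "serve", model, "--profiler-config", json.dumps(cfg)]

print(shlex.join(build_serve_cmd("Qwen/Qwen3-0.6B")))
```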
First, call the /start_profile API to start profiling:
$ curl -X POST http://localhost:8000/start_profile
Then send a generation request:
curl -X POST http://localhost:8000/v1/chat/completions \
  -H "Content-Type: application/json" \
  -d '{
    "model": "Qwen/Qwen3-0.6B",
    "messages": [
      {
        "role": "user",
        "content": "9.11 and 9.8, which is greater?"
      }
    ]
  }'
Finally, call the /stop_profile API to stop profiling:
$ curl -X POST http://localhost:8000/stop_profile
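The same start, generate, stop sequence can be scripted with the Python standard library. This is a sketch that assumes the server above is running on localhost:8000; only the endpoint paths and the request body come from the guide:

```python
import json
from urllib import request

BASE = "http://localhost:8000"  # the server started above

def chat_payload(model, user_content):
    # Same body as the curl example above.
    return {
        "model": model,
        "messages": [{"role": "user", "content": user_content}],
    }

def post(path, body=None):
    # POST a JSON body (or an empty body) and return the raw response.
    data = json.dumps(body).encode() if body is not None else b""
    req = request.Request(BASE + path, data=data,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        return resp.read()

# Profiling session (uncomment with a live server):
# post("/start_profile")
# post("/v1/chat/completions",
#      chat_payload("Qwen/Qwen3-0.6B", "9.11 and 9.8, which is greater?"))
# post("/stop_profile")
```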
[Image: the generated trace from ./vllm_profile, visualized at perfetto.dev]
vLLM uses pre-commit to check code formatting automatically. Once installed, the checks run on every git commit.
uv pip install pre-commit
pre-commit install
You can also run all checks manually across the repository with pre-commit run -a.
📖 Writing docs and running tests
Run the test suite with pytest:
pytest tests/
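A new change usually comes with a test. As a sketch of the pytest style used under tests/ (the file, function, and logic here are purely illustrative, not real vLLM tests):

```python
# test_example.py (illustrative; real vLLM tests live under tests/)

def greedy_pick(scores):
    """Toy stand-in for the code under test: index of the highest score."""
    return max(range(len(scores)), key=scores.__getitem__)

def test_greedy_pick():
    # pytest discovers functions named test_* and runs their assertions.
    assert greedy_pick([0.1, 0.7, 0.2]) == 1
    assert greedy_pick([1.0]) == 0
```

Run just this file with pytest test_example.py -q.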
📬 Formal contribution workflow: submitting Issues and PRs
Submitting an Issue (bug report or feature request)
Once your code is ready, you can open a PR. To keep the process smooth, please follow these conventions:
Add a line Signed-off-by: Your Name <email> to every commit message.
Using git commit -s adds it automatically; you can also configure VS Code or PyCharm to add it for you.
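For illustration, the trailer that git commit -s appends can be recognized with a pattern like the one below. This checker is a sketch, not a vLLM or DCO tool:

```python
import re

# `git commit -s` appends a trailer of the form:
#   Signed-off-by: Your Name <you@example.com>
SIGNOFF = re.compile(r"^Signed-off-by: .+ <[^<>]+@[^<>]+>$", re.MULTILINE)

def has_signoff(message: str) -> bool:
    """Return True if the commit message carries a sign-off trailer."""
    return bool(SIGNOFF.search(message))

msg = "fix: handle empty prompt\n\nSigned-off-by: Jane Doe <jane@example.com>"
```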
Prefix the PR title with a clear tag so the type of change is easy to identify, for example:
If the change touches multiple areas, you can combine the relevant prefixes.
If the change is large (more than 500 lines of code, excluding data, config, and tests), please...