Launching the InternVL large model with vLLM

/workspace/.devcontainer/devcontainer.json

{
    "name": "vLLM Dev Container",
    "image": "vllm/vllm-openai:v0.11.0",
    "runArgs": ["--gpus", "all",
               "-p", "8000:8000",
               "-p", "8080:8080",
               "--restart=unless-stopped"],
    "mounts": [
        "source=/data/lbg/models,target=/data/lbg/models,type=bind",
        "source=/data/lbg/latex_fanyi/,target=/workspace,type=bind"
    ],
    "workspaceFolder": "/workspace",
    "postCreateCommand": "pip install --ignore-installed blinker==1.9.0 flask flask-cors requests && cd /workspace && (vllm serve /data/lbg/models/brandonbeiler_InternVL3_5-8B-FP8-Dynamic --quantization compressed-tensors --served-model-name internvl3_5-8b --trust-remote-code --max-model-len 2048 --tensor-parallel-size 1 &) && sleep 10 && python3 /workspace/server.py",
    "customizations": {
        "vscode": {
            "extensions": [
                "ms-python.python",
                "ms-python.vscode-pylance"
            ]
        }
    }
}
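Once the container is up, vLLM exposes an OpenAI-compatible API on port 8000 under the `--served-model-name` from the config above (`internvl3_5-8b`). A minimal request sketch using only the standard library (the endpoint path and payload shape follow the OpenAI chat-completions convention that vLLM implements; the actual POST is commented out since it needs a running server):

```python
import json
import urllib.request

def build_chat_request(model: str, prompt: str, max_tokens: int = 128) -> dict:
    """Build an OpenAI-compatible /v1/chat/completions payload."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

def send_chat_request(base_url: str, payload: dict) -> dict:
    """POST the payload to the vLLM OpenAI-compatible endpoint."""
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req, timeout=60) as resp:
        return json.load(resp)

payload = build_chat_request("internvl3_5-8b", "Hello")
# Uncomment once the server is up:
# print(send_chat_request("http://localhost:8000", payload))
```

Note that the model name must match `--served-model-name`, not the filesystem path the weights were loaded from.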

Restart verification:

# Manually stop the container and observe what happens
docker stop latex-ai-container

# Wait 15 seconds
sleep 15

# Check whether the container is running
docker ps | grep latex-ai-container
Expected result: the container stays stopped. Under the unless-stopped policy, Docker does not revive a container that was explicitly stopped; it only restarts the container after its main process crashes or after the Docker daemon (e.g. the host) restarts. To exercise the crash path, use `docker kill latex-ai-container` instead and the container should come back on its own.
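The documented behavior of Docker's restart policies can be captured in a small decision function. This is a simplified sketch of the semantics (not Docker's actual code), useful for reasoning about which events bring the container back:

```python
def container_restarts(policy: str, event: str, exit_code: int = 1) -> bool:
    """Simplified sketch of Docker's documented restart-policy behavior.

    event: "crash" (main process exited), "manual_stop" (docker stop),
           or "daemon_restart" (dockerd restarted / host rebooted).
    """
    if policy == "no":
        return False
    if event == "manual_stop":
        # No policy immediately revives an explicitly stopped container.
        return False
    if policy == "always":
        return True
    if policy == "unless-stopped":
        # Like `always`, except a container that was manually stopped
        # also stays stopped across daemon restarts.
        return True
    if policy == "on-failure":
        return event == "crash" and exit_code != 0
    raise ValueError(f"unknown policy: {policy}")

print(container_restarts("unless-stopped", "manual_stop"))      # stays down
print(container_restarts("unless-stopped", "crash", 137))       # restarts
print(container_restarts("unless-stopped", "daemon_restart"))   # restarts
```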

3. Test a system reboot (final verification)

# Reboot the whole machine
sudo reboot

# After the system comes back up, log in again and run:
docker ps | grep latex-ai-container
curl http://localhost:8080/health
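One weak spot in the `postCreateCommand` above is the fixed `sleep 10`: loading an 8B model usually takes longer, so `server.py` may start before vLLM is ready. A more robust approach is to poll until the server answers (vLLM's OpenAI server exposes a `/health` route; the URL below assumes the 8000 port mapping from the config). A minimal stdlib-only sketch:

```python
import time
import urllib.request
import urllib.error

def wait_for_server(url: str, attempts: int = 60, delay: float = 2.0) -> bool:
    """Poll an HTTP endpoint until it returns 200, or give up."""
    for _ in range(attempts):
        try:
            with urllib.request.urlopen(url, timeout=2) as resp:
                if resp.status == 200:
                    return True
        except (urllib.error.URLError, OSError):
            pass  # server not up yet; retry after a short delay
        time.sleep(delay)
    return False

# Example: block until vLLM is ready before starting the Flask app
# if wait_for_server("http://localhost:8000/health"):
#     start_flask_app()
```

Calling this at the top of `server.py` (or from a small wrapper script in place of the `sleep 10`) makes the startup sequence deterministic instead of timing-dependent.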
