Deploying the litellm proxy service

Created: 2025-06-03 18:26 | Author: 风波 | Views: 23 | Category: AI

0. Installation

pip install 'litellm[proxy]' -i https://mirrors.aliyun.com/pypi/simple/  --trusted-host mirrors.aliyun.com

1. Prepare the config.yaml file

The list of models the proxy should serve:

model_list:
  - model_name: gpt-3.5-turbo ### RECEIVED MODEL NAME ###
    litellm_params: # all params accepted by litellm.completion() - https://docs.litellm.ai/docs/completion/input
      model: azure/gpt-turbo-small-eu ### MODEL NAME sent to `litellm.completion()` ###
      api_base: https://my-endpoint-europe-berri-992.openai.azure.com/
      api_key: "os.environ/AZURE_API_KEY_EU" # does os.getenv("AZURE_API_KEY_EU")
      rpm: 6      # [OPTIONAL] Rate limit for this deployment: in requests per minute (rpm)
  - model_name: bedrock-claude-v1 
    litellm_params:
      model: bedrock/anthropic.claude-instant-v1

2. Start the service

litellm --config /path/to/config.yaml
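Once the proxy is up, any OpenAI-compatible client can talk to it. Below is a minimal sketch using only the Python standard library; it assumes the proxy is listening on the default port 4000 and that `gpt-3.5-turbo` is the `model_name` declared in config.yaml (the proxy routes it to `azure/gpt-turbo-small-eu` internally):

```python
import json
import urllib.request

# Assumed proxy address; litellm serves on port 4000 by default.
PROXY_URL = "http://0.0.0.0:4000/chat/completions"


def build_payload(model: str, content: str) -> bytes:
    """Build an OpenAI-style chat completion request body."""
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": content}],
    }).encode("utf-8")


def ask(model: str, content: str) -> dict:
    """POST a chat completion request to the proxy and return the parsed JSON."""
    req = urllib.request.Request(
        PROXY_URL,
        data=build_payload(model, content),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())


# Usage, with the proxy running:
#   reply = ask("gpt-3.5-turbo", "Hello!")
#   print(reply["choices"][0]["message"]["content"])
```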

Or run it with Docker:

docker run \
    -v $(pwd)/litellm_config.yaml:/app/config.yaml \
    -e AZURE_API_KEY=d6*********** \
    -e AZURE_API_BASE=https://openai-***********/ \
    -p 4000:4000 \
    ghcr.io/berriai/litellm:main-latest \
    --config /app/config.yaml --detailed_debug
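The same container can also be managed with Docker Compose. A sketch equivalent to the `docker run` command above, assuming `litellm_config.yaml` sits next to the compose file and the keys are supplied through the shell environment:

```yaml
# docker-compose.yaml (assumed file name)
services:
  litellm:
    image: ghcr.io/berriai/litellm:main-latest
    ports:
      - "4000:4000"
    volumes:
      - ./litellm_config.yaml:/app/config.yaml
    environment:
      AZURE_API_KEY: ${AZURE_API_KEY}
      AZURE_API_BASE: ${AZURE_API_BASE}
    command: ["--config", "/app/config.yaml", "--detailed_debug"]
```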