
Anthropic SDK

Pass-through endpoints for Anthropic - call provider-specific endpoints in their native format (no translation).

| Feature | Supported | Notes |
| --- | --- | --- |
| Cost Tracking | ✅ | All models on the /messages endpoint |
| Logging | ✅ | Works across all integrations |
| End-user Tracking | ✅ | Disable prometheus tracking via litellm.disable_end_user_cost_tracking_prometheus_only |
| Streaming | ✅ | |

Just replace https://api.anthropic.com with LITELLM_PROXY_BASE_URL/anthropic

Example Usage

curl --request POST \
--url http://0.0.0.0:4000/anthropic/v1/messages \
--header 'accept: application/json' \
--header 'content-type: application/json' \
--header "Authorization: bearer sk-anything" \
--data '{
"model": "claude-3-5-sonnet-20241022",
"max_tokens": 1024,
"messages": [
{"role": "user", "content": "Hello, world"}
]
}'
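
The same request through the Anthropic Python SDK only needs a different base_url. This is a minimal sketch, assuming pip install anthropic, the proxy running locally on port 4000, and a placeholder key:

from anthropic import Anthropic

# Point the SDK at the LiteLLM proxy instead of https://api.anthropic.com
client = Anthropic(
    base_url="http://0.0.0.0:4000/anthropic",
    api_key="sk-anything",  # or a LiteLLM virtual key if auth is enabled on the proxy
)

message = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello, world"}],
)
print(message.content)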

Supports ALL Anthropic Endpoints (including streaming).

See All Anthropic Endpoints
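
Because streaming is passed through unchanged, the SDK's streaming helper should also work against the proxy. A sketch, reusing the client configured above:

# Stream text from the proxy exactly as you would from api.anthropic.com
with client.messages.stream(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello, world"}],
) as stream:
    for text in stream.text_stream:
        print(text, end="", flush=True)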

Quick Start

Let's call the Anthropic /messages endpoint

  1. Add your Anthropic API Key to your environment
export ANTHROPIC_API_KEY=""
  2. Start LiteLLM Proxy
litellm

# RUNNING on http://0.0.0.0:4000
  3. Test it!

Let's call the Anthropic /messages endpoint

curl http://0.0.0.0:4000/anthropic/v1/messages \
--header "x-api-key: $LITELLM_API_KEY" \
--header "anthropic-version: 2023-06-01" \
--header "content-type: application/json" \
--data \
'{
"model": "claude-3-5-sonnet-20241022",
"max_tokens": 1024,
"messages": [
{"role": "user", "content": "Hello, world"}
]
}'

Examples

Anything after http://0.0.0.0:4000/anthropic is treated as a provider-specific route and handled accordingly.
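
For example, a request to /anthropic/v1/models on the proxy should be forwarded to Anthropic's /v1/models endpoint. A sketch with the SDK's model-listing helper (available in recent SDK versions), reusing the proxy-configured client from above:

# /anthropic/v1/models on the proxy maps to Anthropic's /v1/models
for model in client.models.list():
    print(model.id)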

Key Changes

| Original Endpoint | Replace With |
| --- | --- |
| https://api.anthropic.com | http://0.0.0.0:4000/anthropic (LITELLM_PROXY_BASE_URL="http://0.0.0.0:4000") |
| bearer $ANTHROPIC_API_KEY | bearer anything (use bearer LITELLM_VIRTUAL_KEY if Virtual Keys are set up on the proxy) |

Example 1: Messages endpoint

LiteLLM Proxy Call

curl --request POST \
--url http://0.0.0.0:4000/anthropic/v1/messages \
--header "x-api-key: $LITELLM_API_KEY" \
--header "anthropic-version: 2023-06-01" \
--header "content-type: application/json" \
--data '{
"model": "claude-3-5-sonnet-20241022",
"max_tokens": 1024,
"messages": [
{"role": "user", "content": "Hello, world"}
]
}'

Direct Anthropic API Call

curl https://api.anthropic.com/v1/messages \
--header "x-api-key: $ANTHROPIC_API_KEY" \
--header "anthropic-version: 2023-06-01" \
--header "content-type: application/json" \
--data \
'{
"model": "claude-3-5-sonnet-20241022",
"max_tokens": 1024,
"messages": [
{"role": "user", "content": "Hello, world"}
]
}'

Example 2: Token Counting API

LiteLLM Proxy Call

curl --request POST \
--url http://0.0.0.0:4000/anthropic/v1/messages/count_tokens \
--header "x-api-key: $LITELLM_API_KEY" \
--header "anthropic-version: 2023-06-01" \
--header "anthropic-beta: token-counting-2024-11-01" \
--header "content-type: application/json" \
--data \
'{
"model": "claude-3-5-sonnet-20241022",
"messages": [
{"role": "user", "content": "Hello, world"}
]
}'

Direct Anthropic API Call

curl https://api.anthropic.com/v1/messages/count_tokens \
--header "x-api-key: $ANTHROPIC_API_KEY" \
--header "anthropic-version: 2023-06-01" \
--header "anthropic-beta: token-counting-2024-11-01" \
--header "content-type: application/json" \
--data \
'{
"model": "claude-3-5-sonnet-20241022",
"messages": [
{"role": "user", "content": "Hello, world"}
]
}'
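
With the Anthropic Python SDK (recent versions expose token counting on the client), the same call through the proxy looks roughly like this sketch, reusing the proxy-configured client from above:

# Count input tokens via the proxy instead of api.anthropic.com
count = client.messages.count_tokens(
    model="claude-3-5-sonnet-20241022",
    messages=[{"role": "user", "content": "Hello, world"}],
)
print(count.input_tokens)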

Example 3: Batch Messages

LiteLLM Proxy Call

curl --request POST \
--url http://0.0.0.0:4000/anthropic/v1/messages/batches \
--header "x-api-key: $LITELLM_API_KEY" \
--header "anthropic-version: 2023-06-01" \
--header "anthropic-beta: message-batches-2024-09-24" \
--header "content-type: application/json" \
--data \
'{
"requests": [
{
"custom_id": "my-first-request",
"params": {
"model": "claude-3-5-sonnet-20241022",
"max_tokens": 1024,
"messages": [
{"role": "user", "content": "Hello, world"}
]
}
},
{
"custom_id": "my-second-request",
"params": {
"model": "claude-3-5-sonnet-20241022",
"max_tokens": 1024,
"messages": [
{"role": "user", "content": "Hi again, friend"}
]
}
}
]
}'

Direct Anthropic API Call

curl https://api.anthropic.com/v1/messages/batches \
--header "x-api-key: $ANTHROPIC_API_KEY" \
--header "anthropic-version: 2023-06-01" \
--header "anthropic-beta: message-batches-2024-09-24" \
--header "content-type: application/json" \
--data \
'{
"requests": [
{
"custom_id": "my-first-request",
"params": {
"model": "claude-3-5-sonnet-20241022",
"max_tokens": 1024,
"messages": [
{"role": "user", "content": "Hello, world"}
]
}
},
{
"custom_id": "my-second-request",
"params": {
"model": "claude-3-5-sonnet-20241022",
"max_tokens": 1024,
"messages": [
{"role": "user", "content": "Hi again, friend"}
]
}
}
]
}'
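
The equivalent batch submission with the Anthropic Python SDK (recent versions) is sketched below, again reusing the proxy-configured client:

# Submit a message batch via the proxy
batch = client.messages.batches.create(
    requests=[
        {
            "custom_id": "my-first-request",
            "params": {
                "model": "claude-3-5-sonnet-20241022",
                "max_tokens": 1024,
                "messages": [{"role": "user", "content": "Hello, world"}],
            },
        },
        {
            "custom_id": "my-second-request",
            "params": {
                "model": "claude-3-5-sonnet-20241022",
                "max_tokens": 1024,
                "messages": [{"role": "user", "content": "Hi again, friend"}],
            },
        },
    ]
)
print(batch.id, batch.processing_status)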

Advanced

Pre-requisites

Use this to avoid giving developers the raw Anthropic API key while still letting them use Anthropic endpoints.

Use with Virtual Keys

  1. Setup environment
export DATABASE_URL=""
export LITELLM_MASTER_KEY=""
export COHERE_API_KEY=""
litellm

# RUNNING on http://0.0.0.0:4000
  2. Generate virtual key
curl -X POST 'http://0.0.0.0:4000/key/generate' \
-H 'Authorization: Bearer sk-1234' \
-H 'Content-Type: application/json' \
-d '{}'

Expected Response

{
...
"key": "sk-1234ewknldferwedojwojw"
}
  3. Test it!
curl --request POST \
--url http://0.0.0.0:4000/anthropic/v1/messages \
--header 'accept: application/json' \
--header 'content-type: application/json' \
--header "Authorization: bearer sk-1234ewknldferwedojwojw" \
--data '{
"model": "claude-3-5-sonnet-20241022",
"max_tokens": 1024,
"messages": [
{"role": "user", "content": "Hello, world"}
]
}'
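
The virtual key can also be used as the api_key in the Anthropic Python SDK, assuming the proxy accepts it on the x-api-key header that the SDK sends. A sketch:

from anthropic import Anthropic

# Developers only ever see the virtual key, never the raw Anthropic key
client = Anthropic(
    base_url="http://0.0.0.0:4000/anthropic",
    api_key="sk-1234ewknldferwedojwojw",  # virtual key from /key/generate
)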

Send litellm_metadata (tags, end-user cost tracking)

The user field attributes the request's cost to an end-user/customer.

curl --request POST \
--url http://0.0.0.0:4000/anthropic/v1/messages \
--header 'accept: application/json' \
--header 'content-type: application/json' \
--header "Authorization: bearer sk-anything" \
--data '{
"model": "claude-3-5-sonnet-20241022",
"max_tokens": 1024,
"messages": [
{"role": "user", "content": "Hello, world"}
],
"litellm_metadata": {
"tags": ["test-tag-1", "test-tag-2"],
"user": "test-user" # track end-user/customer cost
}
}'
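
From the Anthropic Python SDK, the same metadata can be attached with the extra_body request option, which merges extra fields into the JSON body. A sketch; the tag and user values are illustrative:

# Attach litellm_metadata to an SDK request via extra_body
message = client.messages.create(
    model="claude-3-5-sonnet-20241022",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello, world"}],
    extra_body={
        "litellm_metadata": {
            "tags": ["test-tag-1", "test-tag-2"],
            "user": "test-user",  # track end-user/customer cost
        }
    },
)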