Deepseek

https://deepseek.com/

We support all Deepseek models; just set deepseek/ as the model prefix when sending completion requests.

API Key

# env variable
os.environ['DEEPSEEK_API_KEY']

Sample Usage

from litellm import completion
import os

os.environ['DEEPSEEK_API_KEY'] = ""
response = completion(
    model="deepseek/deepseek-chat",
    messages=[
        {"role": "user", "content": "hello from litellm"}
    ],
)
print(response)

Sample Usage - Streaming

from litellm import completion
import os

os.environ['DEEPSEEK_API_KEY'] = ""
response = completion(
    model="deepseek/deepseek-chat",
    messages=[
        {"role": "user", "content": "hello from litellm"}
    ],
    stream=True
)

for chunk in response:
    print(chunk)
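When streaming, each chunk carries only a fragment of the reply. A minimal sketch of joining those fragments into the full text, assuming the OpenAI-compatible chunk shape that litellm streams back (chunk.choices[0].delta.content, which may be None for some chunks):

```python
def collect_stream_text(chunks):
    """Join the text fragments of a streamed completion into one string."""
    parts = []
    for chunk in chunks:
        # delta.content holds the next text fragment, or None (e.g. the final chunk)
        delta = chunk.choices[0].delta.content
        if delta:
            parts.append(delta)
    return "".join(parts)
```

With the streaming example above, collect_stream_text(response) would yield the complete reply once the stream is consumed.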

Supported Models - ALL Deepseek Models Supported!

We support all Deepseek models; just set deepseek/ as the model prefix when sending completion requests.

Model Name | Function Call
deepseek-chat | completion(model="deepseek/deepseek-chat", messages)
deepseek-coder | completion(model="deepseek/deepseek-coder", messages)

Reasoning Models

Model Name | Function Call
deepseek-reasoner | completion(model="deepseek/deepseek-reasoner", messages)

from litellm import completion
import os

os.environ['DEEPSEEK_API_KEY'] = ""
resp = completion(
    model="deepseek/deepseek-reasoner",
    messages=[{"role": "user", "content": "Tell me a joke."}],
)

print(resp.choices[0].message.reasoning_content)
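deepseek-reasoner returns the chain of thought and the final answer in separate fields. A small sketch that pulls out both, assuming litellm's OpenAI-compatible response shape (message.reasoning_content for the reasoning, message.content for the answer):

```python
def split_reasoner_response(resp):
    """Return (reasoning, answer) from a deepseek-reasoner completion."""
    message = resp.choices[0].message
    return message.reasoning_content, message.content
```

With the example above, reasoning, answer = split_reasoner_response(resp) lets you log or display the two parts independently.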