
DeepInfra

https://deepinfra.com/

Tip

We support all DeepInfra models. Just prefix the model name with deepinfra/, i.e. set model=deepinfra/<any-model-on-deepinfra>, when sending a litellm request.

API Key

import os

# env variable
os.environ['DEEPINFRA_API_KEY']
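
Besides reading the key from the environment, litellm's completion() generally also accepts an api_key argument for per-request credentials. A minimal sketch, assuming that parameter (the key string below is a placeholder):

from litellm import completion

# pass the DeepInfra key directly instead of relying on the env variable
response = completion(
    model="deepinfra/meta-llama/Llama-2-70b-chat-hf",
    messages=[{"role": "user", "content": "hello from LiteLLM"}],
    api_key="your-deepinfra-api-key",  # placeholder value
)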

Sample Usage

from litellm import completion
import os

os.environ['DEEPINFRA_API_KEY'] = ""
response = completion(
    model="deepinfra/meta-llama/Llama-2-70b-chat-hf",
    messages=[{"role": "user", "content": "write code for saying hi from LiteLLM"}]
)
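
completion() returns an OpenAI-compatible response object, so the generated text can be read directly from the choices field. A minimal sketch (field names follow the OpenAI chat format that litellm exposes):

# print the full response object
print(response)
# print just the generated text
print(response.choices[0].message.content)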

Sample Usage - Streaming

from litellm import completion
import os

os.environ['DEEPINFRA_API_KEY'] = ""
response = completion(
    model="deepinfra/meta-llama/Llama-2-70b-chat-hf",
    messages=[{"role": "user", "content": "write code for saying hi from LiteLLM"}],
    stream=True
)

for chunk in response:
    print(chunk)
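
Each streamed chunk follows the OpenAI delta format, so the reply can be accumulated incrementally instead of printing raw chunks. A minimal sketch, assuming the standard chunk.choices[0].delta.content layout (content may be None on some chunks, hence the guard):

full_text = ""
for chunk in response:
    # each chunk carries an incremental piece of the reply; content can be None
    delta = chunk.choices[0].delta.content
    if delta:
        full_text += delta
print(full_text)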

Chat Models

Model Name | Function Call
meta-llama/Meta-Llama-3-8B-Instruct | completion(model="deepinfra/meta-llama/Meta-Llama-3-8B-Instruct", messages)
meta-llama/Meta-Llama-3-70B-Instruct | completion(model="deepinfra/meta-llama/Meta-Llama-3-70B-Instruct", messages)
meta-llama/Llama-2-70b-chat-hf | completion(model="deepinfra/meta-llama/Llama-2-70b-chat-hf", messages)
meta-llama/Llama-2-7b-chat-hf | completion(model="deepinfra/meta-llama/Llama-2-7b-chat-hf", messages)
meta-llama/Llama-2-13b-chat-hf | completion(model="deepinfra/meta-llama/Llama-2-13b-chat-hf", messages)
codellama/CodeLlama-34b-Instruct-hf | completion(model="deepinfra/codellama/CodeLlama-34b-Instruct-hf", messages)
mistralai/Mistral-7B-Instruct-v0.1 | completion(model="deepinfra/mistralai/Mistral-7B-Instruct-v0.1", messages)
jondurbin/airoboros-l2-70b-gpt4-1.4.1 | completion(model="deepinfra/jondurbin/airoboros-l2-70b-gpt4-1.4.1", messages)