# Galadriel
https://docs.galadriel.com/api-reference/chat-completion-API

LiteLLM supports all models on Galadriel.

## API Key
```python
import os

os.environ['GALADRIEL_API_KEY'] = "your-api-key"
```
## Sample Usage
```python
from litellm import completion
import os

os.environ['GALADRIEL_API_KEY'] = ""

response = completion(
    model="galadriel/llama3.1",
    messages=[
        {"role": "user", "content": "hello from litellm"}
    ],
)
print(response)
```
## Sample Usage - Streaming
```python
from litellm import completion
import os

os.environ['GALADRIEL_API_KEY'] = ""

response = completion(
    model="galadriel/llama3.1",
    messages=[
        {"role": "user", "content": "hello from litellm"}
    ],
    stream=True
)

for chunk in response:
    print(chunk)
```
## Supported Models

### Serverless Endpoints

We support all Galadriel AI models: just set `galadriel/` as the prefix when sending completion requests.

We support both full model names and simplified name matching. You can specify either the full model name or its simplified version, e.g. `llama3.1:70b`.
| Model Name | Simplified Name | Function Call |
|---|---|---|
| neuralmagic/Meta-Llama-3.1-8B-Instruct-FP8 | llama3.1 or llama3.1:8b | `completion(model="galadriel/llama3.1", messages)` |
| neuralmagic/Meta-Llama-3.1-70B-Instruct-quantized.w4a16 | llama3.1:70b | `completion(model="galadriel/llama3.1:70b", messages)` |
| neuralmagic/Meta-Llama-3.1-405B-Instruct-quantized.w4a16 | llama3.1:405b | `completion(model="galadriel/llama3.1:405b", messages)` |
| neuralmagic/Mistral-Nemo-Instruct-2407-quantized.w4a16 | mistral-nemo or mistral-nemo:12b | `completion(model="galadriel/mistral-nemo", messages)` |
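Both name forms in the table resolve to the same endpoint; the matching is done server-side by Galadriel. As an illustration only, the aliases can be sketched as a lookup, where the dictionary is reproduced from the table above and the `resolve`/`galadriel_model` helpers are hypothetical, not LiteLLM APIs:

```python
# Alias table reproduced from the docs; the real matching happens server-side.
SIMPLIFIED_TO_FULL = {
    "llama3.1": "neuralmagic/Meta-Llama-3.1-8B-Instruct-FP8",
    "llama3.1:8b": "neuralmagic/Meta-Llama-3.1-8B-Instruct-FP8",
    "llama3.1:70b": "neuralmagic/Meta-Llama-3.1-70B-Instruct-quantized.w4a16",
    "llama3.1:405b": "neuralmagic/Meta-Llama-3.1-405B-Instruct-quantized.w4a16",
    "mistral-nemo": "neuralmagic/Mistral-Nemo-Instruct-2407-quantized.w4a16",
    "mistral-nemo:12b": "neuralmagic/Mistral-Nemo-Instruct-2407-quantized.w4a16",
}

def resolve(name: str) -> str:
    """Return the full model name for a known alias; pass others through."""
    return SIMPLIFIED_TO_FULL.get(name, name)

def galadriel_model(name: str) -> str:
    """Build the LiteLLM model string by adding the 'galadriel/' prefix."""
    return f"galadriel/{name}"

print(galadriel_model("llama3.1:70b"))  # → galadriel/llama3.1:70b
print(resolve("llama3.1:70b"))          # → neuralmagic/Meta-Llama-3.1-70B-Instruct-quantized.w4a16
```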