Petals

Petals: https://github.com/bigscience-workshop/petals

Prerequisites

Ensure you have petals installed:

pip install git+https://github.com/bigscience-workshop/petals

Usage

Make sure to prefix every Petals LLM with petals/. This sets custom_llm_provider to petals.

from litellm import completion

response = completion(
    model="petals/petals-team/StableBeluga2",
    messages=[{"content": "Hello, how are you?", "role": "user"}]
)

print(response)
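
LiteLLM normalizes provider responses to the OpenAI response format. Assuming that holds here too, the generated text can be read from the first choice of the response above; a minimal sketch:

# Assumption: the response object mirrors the OpenAI schema,
# so the generated text lives on the first choice's message.
print(response.choices[0].message.content)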

Streaming Usage

response = completion(
    model="petals/petals-team/StableBeluga2",
    messages=[{"content": "Hello, how are you?", "role": "user"}],
    stream=True
)

print(response)
for chunk in response:
    print(chunk)
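
Each chunk should follow the OpenAI streaming format, with incremental text in choices[0].delta.content. Under that assumption, the full reply can be assembled like this self-contained sketch (the stream can only be consumed once, so it makes a fresh call):

from litellm import completion

# Hedged sketch: assumes chunks follow the OpenAI streaming schema,
# where incremental text arrives in choices[0].delta.content.
response = completion(
    model="petals/petals-team/StableBeluga2",
    messages=[{"content": "Hello, how are you?", "role": "user"}],
    stream=True,
)

full_text = ""
for chunk in response:
    delta = chunk.choices[0].delta.content
    if delta is not None:  # delta can be None on the final chunk
        full_text += delta
print(full_text)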

Model Details

Model Name                   Function Call
petals-team/StableBeluga2    completion('petals/petals-team/StableBeluga2', messages)
huggyllama/llama-65b         completion('petals/huggyllama/llama-65b', messages)