Rules
Use this to fail a request based on the input or output of an LLM API call.
import litellm
import os

# set env vars
os.environ["OPENAI_API_KEY"] = "your-api-key"
os.environ["OPENROUTER_API_KEY"] = "your-api-key"

def my_custom_rule(input):  # receives the model response
    if "i don't think i can answer" in input:  # trigger fallback if the model refuses to answer
        return False
    return True

litellm.post_call_rules = [my_custom_rule]  # have these be functions that can be called to fail a call

response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hey, how's it going?"}],
    fallbacks=["openrouter/gryphe/mythomax-l2-13b"],
)
Available Endpoints
- litellm.pre_call_rules = [] - A list of functions iterated over before the API call is made, each receiving the user input. Each function is expected to return either True (allow the call) or False (fail the call).
- litellm.post_call_rules = [] - A list of functions iterated over after the API call returns, each receiving the model response. Each function is expected to return either True (allow the call) or False (fail the call).
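Both lists can be populated at the same time. The sketch below is illustrative rather than taken from the docs (the rule names block_competitor_mentions and block_empty_response are hypothetical): one pre-call rule screens the prompt and one post-call rule screens the model response.

import litellm

def block_competitor_mentions(input):  # pre-call: receives the user input
    # reject the request before it reaches the LLM if the prompt mentions a banned term
    return "acme corp" not in input.lower()

def block_empty_response(input):  # post-call: receives the model response
    # fail the call if the model returned an empty or whitespace-only answer
    return len(input.strip()) > 0

litellm.pre_call_rules = [block_competitor_mentions]
litellm.post_call_rules = [block_empty_response]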
Expected Format of a Rule
def my_custom_rule(input: str) -> bool:  # receives the user input or the model response
    if "i don't think i can answer" in input:  # trigger fallback if the model refuses to answer
        return False
    return True
Inputs
- input: str - the user input (for pre-call rules) or the LLM response (for post-call rules).
Outputs
- bool - return True to allow the call, or False to fail it.
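Because a rule only ever sees a plain string, any predicate over that string works. A minimal sketch (a hypothetical rule, not part of the original docs) that fails a call whenever the text appears to contain an email address:

import re

EMAIL_PATTERN = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")

def no_email_addresses(input: str) -> bool:
    # return False to fail the call if the user input or the model response leaks an email address
    return EMAIL_PATTERN.search(input) is None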
Example Rules
Example 1: Fail if the user input is too long
import litellm
import os

# set env vars
os.environ["OPENAI_API_KEY"] = "your-api-key"

def my_custom_rule(input):  # receives the user input
    if len(input) > 10:  # fail the call if the input is too long
        return False
    return True

litellm.pre_call_rules = [my_custom_rule]  # have these be functions that can be called to fail a call

response = litellm.completion(model="gpt-3.5-turbo", messages=[{"role": "user", "content": "Hey, how's it going?"}])
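When a rule returns False, litellm surfaces the rejection as an exception rather than returning a response, so guarded calls are usually wrapped in a try/except. The exact exception class depends on your litellm version, so this sketch (an assumption, not part of the original example) catches a broad Exception:

try:
    response = litellm.completion(
        model="gpt-3.5-turbo",
        messages=[{"role": "user", "content": "A prompt that is definitely longer than ten characters"}],
    )
except Exception as e:  # the concrete exception type varies across litellm versions
    print(f"Request rejected by a rule: {e}")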
Example 2: Fall back to an uncensored model if the LLM refuses to answer
import litellm
import os

# set env vars
os.environ["OPENAI_API_KEY"] = "your-api-key"
os.environ["OPENROUTER_API_KEY"] = "your-api-key"

def my_custom_rule(input):  # receives the model response
    if "i don't think i can answer" in input:  # trigger fallback if the model refuses to answer
        return False
    return True

litellm.post_call_rules = [my_custom_rule]  # have these be functions that can be called to fail a call

response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hey, how's it going?"}],
    fallbacks=["openrouter/gryphe/mythomax-l2-13b"],
)