# Raw Request/Response Logging

## Logging

See the raw request/response LiteLLM sends in your logging provider (OTEL/Langfuse, etc.).

- SDK
- Proxy
```python
# pip install langfuse
import litellm
import os

# log raw request/response
litellm.log_raw_request_response = True

# from https://cloud.langfuse.com/
os.environ["LANGFUSE_PUBLIC_KEY"] = ""
os.environ["LANGFUSE_SECRET_KEY"] = ""
# Optional, defaults to https://cloud.langfuse.com
# os.environ["LANGFUSE_HOST"] = ""

# LLM API Keys
os.environ["OPENAI_API_KEY"] = ""

# set langfuse as a callback, litellm will send the data to langfuse
litellm.success_callback = ["langfuse"]

# openai call
response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "user", "content": "Hi 👋 - i'm openai"}
    ]
)
```
```yaml
litellm_settings:
  log_raw_request_response: True
```
Expected Log
## Return Raw Response Headers

Return the raw response headers from the LLM provider.

Currently only supported for openai.

- SDK
- Proxy
```python
import litellm
import os

litellm.return_response_headers = True

## set ENV variables
os.environ["OPENAI_API_KEY"] = "your-api-key"

response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[{"role": "user", "content": "Hello, how are you?"}]
)

print(response._hidden_params)
```
- Setup config.yaml
```yaml
model_list:
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: gpt-3.5-turbo
      api_key: os.environ/GROQ_API_KEY

litellm_settings:
  return_response_headers: true
```
- Test it!
```shell
curl -X POST 'http://0.0.0.0:4000/chat/completions' \
-H 'Content-Type: application/json' \
-H 'Authorization: Bearer sk-1234' \
-d '{
    "model": "gpt-3.5-turbo",
    "messages": [
        { "role": "system", "content": "Use your tools smartly"},
        { "role": "user", "content": "What time is it now? Use your tool"}
    ]
}'
```
Expected Response