
Langsmith - Logging LLM Input/Output

An all-in-one developer platform for every step of the application lifecycle: https://smith.langchain.com/

Tip

We want to learn how we can improve callbacks! Meet the LiteLLM founders or join our Discord

Prerequisites

pip install litellm

Quick Start

Use just 2 lines of code to instantly log responses across all providers with Langsmith.

litellm.callbacks = ["langsmith"]
import litellm
import os

os.environ["LANGSMITH_API_KEY"] = ""
os.environ["LANGSMITH_PROJECT"] = "" # defaults to litellm-completion
os.environ["LANGSMITH_DEFAULT_RUN_NAME"] = "" # defaults to LLMRun
# LLM API Keys
os.environ["OPENAI_API_KEY"] = ""

# set langsmith as a callback, litellm will send the data to langsmith
litellm.callbacks = ["langsmith"]

# openai call
response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "user", "content": "Hi 👋 - i'm openai"}
    ],
)

Advanced

Local Testing - Control Batch Size

Set the number of records Langsmith processes per batch; the default batch size is 512.

Set langsmith_batch_size=1 when testing locally so you can see logs land quickly.

import litellm
import os

os.environ["LANGSMITH_API_KEY"] = ""
# LLM API Keys
os.environ["OPENAI_API_KEY"] = ""

# set langsmith as a callback, litellm will send the data to langsmith
litellm.callbacks = ["langsmith"]
litellm.langsmith_batch_size = 1 # 👈 KEY CHANGE

response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "user", "content": "Hi 👋 - i'm openai"}
    ],
)
print(response)

Set Langsmith Fields

import litellm
import os

os.environ["LANGSMITH_API_KEY"] = ""
# LLM API Keys
os.environ["OPENAI_API_KEY"] = ""

# set langsmith as a callback, litellm will send the data to langsmith
litellm.success_callback = ["langsmith"]

response = litellm.completion(
    model="gpt-3.5-turbo",
    messages=[
        {"role": "user", "content": "Hi 👋 - i'm openai"}
    ],
    metadata={
        "run_name": "litellmRUN",                                   # langsmith run name
        "project_name": "litellm-completion",                       # langsmith project name
        "run_id": "497f6eca-6276-4993-bfeb-53cbbbba6f08",           # langsmith run id
        "parent_run_id": "f8faf8c1-9778-49a4-9004-628cdb0047e5",    # langsmith run parent run id
        "trace_id": "df570c03-5a03-4cea-8df0-c162d05127ac",         # langsmith run trace id
        "session_id": "1ffd059c-17ea-40a8-8aef-70fd0307db82",       # langsmith run session id
        "tags": ["model1", "prod-2"],                               # langsmith run tags
        "metadata": {                                               # langsmith run metadata
            "key1": "value1"
        },
        "dotted_order": "20240429T004912090000Z497f6eca-6276-4993-bfeb-53cbbbba6f08"
    }
)
print(response)
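The `run_id`, `trace_id`, and `dotted_order` values above follow recognizable shapes: the IDs are UUIDs, and `dotted_order` looks like a UTC timestamp concatenated with the run id. As a hedged sketch (the exact `dotted_order` format is an assumption inferred from the example value above, not confirmed by this page), fresh values could be generated like this:

```python
import uuid
from datetime import datetime, timezone

def make_run_fields():
    """Hypothetical helper: build run_id / trace_id / dotted_order
    values in the shapes shown in the metadata example above."""
    run_id = str(uuid.uuid4())
    trace_id = str(uuid.uuid4())
    # Assumed format: UTC timestamp (YYYYMMDDTHHMMSS + microseconds + "Z")
    # immediately followed by the run id, matching e.g.
    # 20240429T004912090000Z497f6eca-6276-4993-bfeb-53cbbbba6f08
    ts = datetime.now(timezone.utc).strftime("%Y%m%dT%H%M%S%fZ")
    return {
        "run_id": run_id,
        "trace_id": trace_id,
        "dotted_order": f"{ts}{run_id}",
    }

fields = make_run_fields()
print(fields["dotted_order"])
```

These values can then be passed straight into the `metadata` dict shown above; check the LangSmith documentation for the authoritative field formats.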

Make LiteLLM Proxy use a Custom LANGSMITH_BASE_URL

If you're using a custom LangSmith instance, set the LANGSMITH_BASE_URL environment variable to point to it. For example, you can use the following configuration to make LiteLLM Proxy log to a local LangSmith instance:

litellm_settings:
  success_callback: ["langsmith"]

environment_variables:
  LANGSMITH_BASE_URL: "https://:1984"
  LANGSMITH_PROJECT: "litellm-proxy"
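The fragment above omits the model list that a runnable proxy config also needs. A fuller sketch (the model entry and the localhost base URL are illustrative assumptions, not values from this page):

```yaml
model_list:
  - model_name: gpt-3.5-turbo
    litellm_params:
      model: gpt-3.5-turbo
      api_key: os.environ/OPENAI_API_KEY

litellm_settings:
  success_callback: ["langsmith"]

environment_variables:
  LANGSMITH_BASE_URL: "http://localhost:1984"  # assumed local instance URL
  LANGSMITH_PROJECT: "litellm-proxy"
```

Start the proxy with `litellm --config config.yaml`; requests routed through it are then logged to the configured LangSmith instance.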

Support & Talking to Founders