LLM tracking
Here we'll cover how to automatically track events from LLMs in Node.js. Events such as prompts, generations and tool calls can be captured with just a few lines of code.
Initialization
Trubrics must be initialized with your LLM client. This helps make the automatic tracking more accurate.
Foundation models
OpenAI must be imported alongside AzureOpenAI:
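A minimal sketch of the setup, assuming the official openai npm package (which exports both OpenAI and AzureOpenAI); the Trubrics package name and constructor options shown here are assumptions, not confirmed by this page:

```javascript
// Both clients come from the same "openai" package.
import { OpenAI, AzureOpenAI } from "openai";
// Assumed package name and constructor shape for the Trubrics SDK.
import { Trubrics } from "@trubrics/trubrics";

// Standard OpenAI client; AzureOpenAI would be configured analogously.
const openai = new OpenAI({ apiKey: process.env.OPENAI_API_KEY });

// Initialize Trubrics with your LLM client so automatic tracking
// can recognize the provider's response shapes.
const trubrics = new Trubrics({
  apiKey: process.env.TRUBRICS_API_KEY,
  client: openai, // assumption: the client is passed at initialization
});
```

Treat the Trubrics-specific lines as a sketch to adapt against the SDK's actual reference.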
Frameworks
Tracking
You can now start tracking prompts, generations and tool calls with the withProperties wrapper. The withProperties function wraps your LLM function and returns the same response object that your LLM function returns. It takes an optional properties dictionary, which lets you add context to your LLM events, such as user IDs and thread IDs.
const properties = {
  $user_id: "my-user",
  $thread_id: "1532ds-243kj-3538",
  custom_property: "custom_value",
};

const llmFunction = (messages) => openai.chat.completions.create({
  model: "gpt-4o-mini",
  messages,
});

const completion = await trubrics.withProperties(properties, () => llmFunction(messages));

// Properties can be null
const completionWithoutProperties = await trubrics.withProperties(null, () => llmFunction(messages));
Trubrics properties
Reserved Trubrics properties, such as $user_id and $thread_id, must be prefixed with a dollar sign. Custom properties take no prefix.
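The naming convention, illustrated with the same keys used in the tracking example above ($ is a legal identifier character in JavaScript, so no quoting is needed):

```javascript
const properties = {
  $user_id: "my-user",            // reserved Trubrics property: "$" prefix
  $thread_id: "1532ds-243kj-3538", // reserved Trubrics property: "$" prefix
  custom_property: "custom_value", // custom property: no prefix
};
```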