
How To Use Tracing

Instrument your LLM stack with Overmind tracing in Python or JavaScript.

Tracing helps you understand how agents behave in production and creates the data foundation for optimization and model improvement.

Python

Install:

```sh
pip install overmind openai
```

Initialize once at startup:

```python
from overmind import init

init(
    service_name="my-service",
    environment="production",
    providers=["openai"],
)
```

After init(), your provider calls are traced automatically.
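To illustrate, here is a minimal sketch of a traced call. It assumes `OPENAI_API_KEY` and `OVERMIND_API_KEY` are set in the environment, and the model name is illustrative — once `init()` has run, the provider call itself needs no tracing code:

```python
from openai import OpenAI
from overmind import init

# Initialize Overmind tracing before creating provider clients.
init(
    service_name="my-service",
    environment="production",
    providers=["openai"],
)

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# This call is captured as a trace automatically -- no extra tracing code.
response = client.chat.completions.create(
    model="gpt-4o-mini",
    messages=[{"role": "user", "content": "Summarize tracing in one sentence."}],
)
print(response.choices[0].message.content)
```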

JavaScript

Install:

```sh
npm install @overmind-lab/trace-sdk openai
```

Initialize tracing:

```ts
import { OpenAI } from "openai";
import { OvermindClient } from "@overmind-lab/trace-sdk";

const overmindClient = new OvermindClient({
  apiKey: process.env.OVERMIND_API_KEY!,
  appName: "my-app",
});

overmindClient.initTracing({
  enableBatching: true,
  enabledProviders: { openai: OpenAI },
});
```
Best practices:

  • Set service_name (Python) or appName (JavaScript) to a distinct, descriptive value per service so traces are easy to attribute.
  • Tag traces with user and workflow metadata so you can filter and compare runs.
  • Roll tracing out in staging first, then enable it in production.
  • Keep tracing enabled continuously; a longer trace history yields better improvement signals.
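The metadata point above can be made concrete with a small helper that merges per-request tags with service-wide defaults. This is a sketch: `build_trace_metadata` and the tag names are hypothetical, not part of the Overmind SDK — adapt them to whatever metadata hook your SDK version exposes:

```python
def build_trace_metadata(user_id, workflow, extra=None):
    """Merge per-request tags with service-wide defaults (hypothetical helper)."""
    metadata = {
        "service": "my-service",  # matches service_name passed to init()
        "user_id": user_id,       # who triggered this trace
        "workflow": workflow,     # which agent workflow produced it
    }
    if extra:
        metadata.update(extra)    # request-specific tags override defaults
    return metadata

tags = build_trace_metadata("user-123", "checkout-agent", {"region": "eu"})
print(tags["workflow"])  # checkout-agent
```

Attaching consistent tags like these is what makes traces filterable later, whether you pass them at init time or per request.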