# How To Use Tracing
Instrument your LLM stack with Overmind tracing in Python or JavaScript.
Tracing helps you understand how agents behave in production and creates the data foundation for optimization and model improvement.
## Python

Install:

```shell
pip install overmind openai
```

Initialize once at startup:

```python
from overmind import init

init(
    service_name="my-service",
    environment="production",
    providers=["openai"],
)
```

After `init()`, your provider calls are traced automatically.
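As a quick sanity check that tracing is wired up, you can send a single provider call right after initialization. The payload below is illustrative only; the client call is shown commented out because it needs a real OpenAI API key, and the model name is a placeholder assumption, not something mandated by the SDK.

```python
# Illustrative request payload for a traced chat completion.
# With Overmind initialized, the (commented) call below would be
# captured as a trace automatically -- no extra instrumentation code.
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Say hello."},
]

# Requires `openai` installed and OPENAI_API_KEY set:
# from openai import OpenAI
# client = OpenAI()
# response = client.chat.completions.create(
#     model="gpt-4o-mini",  # placeholder model name
#     messages=messages,
# )
```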
## JavaScript / TypeScript

Install:

```shell
npm install @overmind-lab/trace-sdk openai
```

Initialize tracing:

```typescript
import { OpenAI } from "openai";
import { OvermindClient } from "@overmind-lab/trace-sdk";

const overmindClient = new OvermindClient({
  apiKey: process.env.OVERMIND_API_KEY!,
  appName: "my-app",
});

overmindClient.initTracing({
  enableBatching: true,
  enabledProviders: { openai: OpenAI },
});
```

## Best practices
- Set `service_name`/`appName` clearly per service
- Tag traces with user and workflow metadata
- Run tracing in staging first, then roll out to production
- Keep tracing enabled continuously for better improvement signals
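The per-service naming and staging-first advice can be combined by deriving the `init()` arguments from environment variables, so each deployment identifies itself correctly. This is a minimal sketch: the variable names `SERVICE_NAME` and `DEPLOY_ENV` are assumptions for illustration, not part of the Overmind SDK.

```python
import os

def tracing_config() -> dict:
    """Build init() keyword arguments from the deployment environment.

    SERVICE_NAME and DEPLOY_ENV are hypothetical variable names chosen
    for this sketch. Defaulting the environment to "staging" means a
    misconfigured host never pollutes production traces.
    """
    return {
        "service_name": os.environ.get("SERVICE_NAME", "my-service"),
        "environment": os.environ.get("DEPLOY_ENV", "staging"),
        "providers": ["openai"],
    }

# Usage sketch: from overmind import init; init(**tracing_config())
config = tracing_config()
```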