# Integrations
Overmind works with popular LLM frameworks and providers. The SDK automatically instruments supported providers and frameworks.
## LLM Providers

### Python
| Provider | Status | `providers` value |
|---|---|---|
| OpenAI | Supported | "openai" |
| Anthropic | Supported | "anthropic" |
| Google Gemini | Supported | "google" |
| Agno | Supported | "agno" |
Call overmind_sdk.init() once at startup with the providers you use. Your existing LLM client code stays unchanged — no import swaps, no proxy routing.
```python
from overmind_sdk import init

init(service_name="my-service", providers=["openai", "anthropic"])
```

Pass an empty `providers` list (or omit it) to auto-detect all installed providers.
### JavaScript / TypeScript

| Provider | Status | `enabledProviders` key | Package |
|---|---|---|---|
| OpenAI | Supported | openai: OpenAI | openai |
| Anthropic | Supported | anthropic: Anthropic | @anthropic-ai/sdk |
| Google Gemini | Supported | googleGenAI: GoogleGenAI | @google/genai |
Pass the imported provider modules to `initTracing({ enabledProviders: { ... } })`. Your existing LLM code works unchanged.
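As a minimal sketch of that setup (the `initTracing` import path and the `overmind-sdk` package name are assumptions; check the SDK's install docs for the real ones), wiring up two providers looks like:

```typescript
import OpenAI from "openai";
import Anthropic from "@anthropic-ai/sdk";
// NOTE: this import path is an assumption, not confirmed by this page.
import { initTracing } from "overmind-sdk";

// Call once at startup, passing the imported provider modules
// under the keys from the table above.
initTracing({
  enabledProviders: {
    openai: OpenAI,
    anthropic: Anthropic,
  },
});

// Existing client code stays unchanged; subsequent calls are traced automatically.
const client = new OpenAI();
const response = await client.chat.completions.create({
  model: "gpt-4o-mini",
  messages: [{ role: "user", content: "Hello!" }],
});
```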
## Frameworks

| Framework | Status | Notes |
|---|---|---|
| Agno | Supported | Traces agent runs, tool calls, and reasoning steps |
| OpenAI Agents SDK | Supported | Traces agent tool calls and reasoning steps |
| LangChain | Coming soon | Protocol-level support for chains and agents |
| CrewAI | Coming soon | Multi-agent workflow tracing |
## Auto-Instrumentation

### Python

The Python SDK uses OpenTelemetry auto-instrumentation. Call `init()` once at startup, and every subsequent LLM call from your existing client is automatically traced.
- OpenAI — captures all chat completion, response, and embedding calls
- Anthropic — captures all message calls
- Google Gemini — captures all `generate_content` calls
- Agno — captures agent runs, tool calls, and model interactions
```python
from overmind_sdk import init

init(service_name="my-service")  # instruments all installed providers
```

### JavaScript / TypeScript

The JS SDK instruments LLM calls via `initTracing()`. Call it once before any LLM calls, and all subsequent provider calls are traced automatically.
- OpenAI — captures chat completions, completions, and image generation calls
- Anthropic — captures all message calls
- Google Gemini — captures all generateContent and streaming calls
## Sending Traces from Any Language

If you’re not using the Python or JavaScript SDK, you can send traces directly via the OTLP HTTP endpoint:
```
POST https://api.overmindlab.ai/api/v1/traces
Header: X-API-Token: ovr_your_token_here
Content-Type: application/x-protobuf
```

The endpoint accepts standard OpenTelemetry OTLP trace data. Any OpenTelemetry SDK (Go, Java, Node.js, etc.) can export to this endpoint.
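As a sketch, the standard OTLP exporter environment variables (defined by the OpenTelemetry specification and honored by most OTel SDKs) can point an SDK at this endpoint without any code changes; the token value below is a placeholder:

```shell
# Standard OpenTelemetry OTLP exporter settings
export OTEL_EXPORTER_OTLP_TRACES_ENDPOINT="https://api.overmindlab.ai/api/v1/traces"
export OTEL_EXPORTER_OTLP_TRACES_PROTOCOL="http/protobuf"
export OTEL_EXPORTER_OTLP_TRACES_HEADERS="X-API-Token=ovr_your_token_here"
```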
## What’s Next

We’re actively adding support for more providers and frameworks. If you have a specific integration request, reach out to support@overmindlab.ai.