Prefer a unified API? Check out the Vercel AI SDK integration for multi-provider support.

How It Works

Observ wraps your provider SDK clients with a transparent proxy that:
  1. Traces all LLM calls with detailed metrics
  2. Caches similar prompts when recall: true is enabled
  3. Preserves the exact same API - no code changes needed
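To make the proxy idea concrete, here is a minimal, self-contained sketch of how a transparent tracing wrapper can work. This is an illustration of the general technique, not Observ's actual implementation: method calls on the wrapped object are forwarded unchanged while timing metadata is recorded on the side.

```typescript
// Conceptual sketch of a transparent tracing proxy (assumption: Observ works
// along these lines; names here are illustrative only).
type Trace = { path: string; ms: number };

function withTracing<T extends object>(target: T, traces: Trace[], path = ""): T {
  return new Proxy(target, {
    get(obj, prop, receiver) {
      const value = Reflect.get(obj, prop, receiver);
      const fullPath = path ? `${path}.${String(prop)}` : String(prop);
      if (typeof value === "function") {
        // Forward the call untouched, but record how long it took.
        return (...args: unknown[]) => {
          const start = Date.now();
          const result = value.apply(obj, args);
          traces.push({ path: fullPath, ms: Date.now() - start });
          return result;
        };
      }
      if (value !== null && typeof value === "object") {
        // Wrap nested namespaces (e.g. client.messages) recursively.
        return withTracing(value as object, traces, fullPath) as typeof value;
      }
      return value;
    },
  });
}

// Toy client standing in for a provider SDK.
const client = {
  messages: {
    create(req: { model: string }) {
      return { model: req.model, content: "Hello!" };
    },
  },
};

const traces: Trace[] = [];
const wrapped = withTracing(client, traces);
const res = wrapped.messages.create({ model: "demo" });
// Same result as the unwrapped client, plus a trace entry for "messages.create".
```

Because the proxy only intercepts property access and forwards everything else, the wrapped client keeps the exact surface of the original SDK.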

Supported Providers

Anthropic

Claude Sonnet 4.5, Claude Opus
TypeScript and Python support

OpenAI

GPT-4, GPT-3.5, and all models
TypeScript and Python support

Mistral

Mistral Large, Medium, Small
TypeScript and Python support

Google

Gemini 1.5 Pro, Flash
Python only

xAI

Grok models
TypeScript and Python support

OpenRouter

Access to 100+ models
TypeScript and Python support

Quick Example

import Anthropic from "@anthropic-ai/sdk";
import Observ from "observ-sdk";

const ob = new Observ({
  apiKey: "your-observ-api-key",
  recall: true, // Enable semantic caching
});

const client = new Anthropic({ apiKey: "your-anthropic-key" });
const wrappedClient = ob.anthropic(client);

// Use it exactly as you would normally
const response = await wrappedClient.messages.create({
  model: "claude-sonnet-4-20250514",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Hello!" }],
});

Benefits

  • Drop-in replacement - Wrap your existing client and use it normally
  • Zero code changes - Same API, enhanced with observability
  • Automatic tracing - All calls tracked in your Observ dashboard
  • Semantic caching - Reduce costs with intelligent prompt caching
  • Session tracking - Group related calls with session IDs
  • Custom metadata - Attach debugging context to traces
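To illustrate what semantic caching means in practice, here is a small conceptual sketch. This is an assumption about the general idea behind `recall: true`, not Observ's actual algorithm or data structures: prompts are represented as embedding vectors, and a new prompt reuses a cached response when its cosine similarity to a previously seen prompt exceeds a threshold.

```typescript
// Conceptual sketch of a semantic cache (illustrative only; the class and
// threshold here are hypothetical, not part of the Observ SDK).
type Entry = { embedding: number[]; response: string };

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

class SemanticCache {
  private entries: Entry[] = [];
  constructor(private threshold = 0.95) {}

  // Return a cached response if any stored prompt is similar enough.
  get(embedding: number[]): string | undefined {
    for (const e of this.entries) {
      if (cosine(embedding, e.embedding) >= this.threshold) return e.response;
    }
    return undefined;
  }

  set(embedding: number[], response: string): void {
    this.entries.push({ embedding, response });
  }
}

// Toy embeddings; in practice these would come from an embedding model.
const cache = new SemanticCache(0.9);
cache.set([1, 0, 0.1], "Paris");
const hit = cache.get([0.98, 0.02, 0.12]); // near-duplicate prompt: cache hit
const miss = cache.get([0, 1, 0]);         // unrelated prompt: cache miss
```

The key design point is that lookup is by similarity rather than exact string match, which is what lets a rephrased prompt still hit the cache and avoid a paid model call.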

Next Steps

1. Get API Key

Create an account and get your API key from Settings → API Keys

2. Install SDK

Follow the installation guide for your language and provider

3. Wrap your client

Learn how to use the SDK in the usage guide