Prefer a unified API? Check out the Vercel AI SDK integration for multi-provider support.
How It Works
Observ wraps your provider SDK clients with a transparent proxy that:
- Traces all LLM calls with detailed metrics
- Caches similar prompts when `recall: true` is enabled
- Preserves the exact same API - no code changes needed
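To make the transparent-proxy idea concrete, here is a minimal, self-contained sketch in Python of how a wrapper can trace calls while preserving the wrapped client's API. The names here (`TracingProxy`, `FakeClient`) are illustrative stand-ins, not the Observ SDK's actual classes:

```python
import time

class TracingProxy:
    """Forward every attribute access to the wrapped client; time callables."""

    def __init__(self, client, traces):
        self._client = client
        self._traces = traces  # list collecting trace records

    def __getattr__(self, name):
        # Called only for attributes not found on the proxy itself,
        # so all client methods pass through here unchanged.
        attr = getattr(self._client, name)
        if not callable(attr):
            return attr

        def traced(*args, **kwargs):
            start = time.perf_counter()
            result = attr(*args, **kwargs)
            self._traces.append({
                "method": name,
                "latency_s": time.perf_counter() - start,
            })
            return result

        return traced

# Stand-in for a provider SDK client.
class FakeClient:
    def complete(self, prompt):
        return f"echo: {prompt}"

traces = []
client = TracingProxy(FakeClient(), traces)
print(client.complete("hello"))     # same call signature as the bare client
print(traces[0]["method"])          # the call was recorded as a trace
```

Because the proxy only intercepts attribute lookup, existing call sites keep working unmodified, which is what makes the wrapper a drop-in replacement.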
Supported Providers
Anthropic
Claude Sonnet 4.5, Claude Opus. TypeScript and Python support.
OpenAI
GPT-4, GPT-3.5, and all models. TypeScript and Python support.
Mistral
Mistral Large, Medium, Small. TypeScript and Python support.
Google
Gemini 1.5 Pro, Flash. Python only.
xAI
Grok models. TypeScript and Python support.
OpenRouter
Access to 100+ models. TypeScript and Python support.
Quick Example
Quick-start examples are available for TypeScript and Python.
Benefits
- Drop-in replacement - Wrap your existing client and use it normally
- Zero code changes - Same API, enhanced with observability
- Automatic tracing - All calls tracked in your Observ dashboard
- Semantic caching - Reduce costs with intelligent prompt caching
- Session tracking - Group related calls with session IDs
- Custom metadata - Attach debugging context to traces
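The semantic-caching benefit can be sketched as follows: a new prompt that is sufficiently similar to a previously seen one reuses the cached response instead of triggering a fresh LLM call. This is a hedged stdlib illustration only; real semantic caches typically compare embedding vectors, and `SemanticCache` and its threshold are assumptions, not the Observ SDK's API:

```python
import difflib

class SemanticCache:
    """Toy prompt cache using string similarity as a stand-in for embeddings."""

    def __init__(self, threshold=0.8):
        self.threshold = threshold
        self.entries = []  # list of (prompt, response) pairs

    def get(self, prompt):
        # Return the cached response for the first sufficiently similar prompt.
        for cached_prompt, response in self.entries:
            ratio = difflib.SequenceMatcher(None, prompt, cached_prompt).ratio()
            if ratio >= self.threshold:
                return response
        return None

    def put(self, prompt, response):
        self.entries.append((prompt, response))

cache = SemanticCache(threshold=0.8)
cache.put("What is the capital of France?", "Paris")
print(cache.get("What's the capital of France?"))  # near-duplicate -> cache hit
print(cache.get("Explain quantum computing"))      # unrelated prompt -> miss
```

The similarity threshold is the key tuning knob: too low and unrelated prompts get stale answers, too high and near-duplicates miss the cache and incur full cost.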
Next Steps
Get API Key
Create an account and get your API key from Settings → API Keys
Install SDK
Follow the installation guide for your language and provider
Wrap your client
Learn how to use the SDK in the usage guide
Installation
Install for TypeScript or Python
Usage Guide
Detailed usage with examples
Configuration
Configuration options reference
Vercel AI SDK
Alternative: Unified multi-provider API
