The Vercel AI SDK provides a standardized way to interact with multiple providers. Simply wrap your model with `observ.wrap()` and everything works automatically.

## Why Use Vercel AI SDK?
- **Provider Flexibility**: Switch between OpenAI, Anthropic, Google, Mistral, and 20+ other providers without changing your code.
- **Full Feature Support**: Streaming, structured outputs (`generateObject`), and tool calling all work automatically.
- **Automatic Caching**: Enable `recall: true` for intelligent semantic caching across all providers.
- **Minimal Setup**: Wrap your model with `observ.wrap()` and use it normally - that's it.

## Quick Example
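A minimal sketch of the pattern, assuming the standard `ai` and `@ai-sdk/openai` packages; the `observ` import path and the model name are assumptions for illustration:

```typescript
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
import { observ } from "observ"; // assumed import path for this SDK

// Wrap the model once; every call through it is traced automatically.
const model = observ.wrap(openai("gpt-4o"));

const { text } = await generateText({
  model,
  prompt: "Explain semantic caching in one sentence.",
});
console.log(text);
```

The wrapped model is a drop-in replacement: any function that accepts an AI SDK model accepts the wrapped one.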
## Supported Providers

The Vercel AI SDK supports 25+ providers. Here are some popular ones:

- OpenAI - GPT-4, GPT-3.5
- Anthropic - Claude Sonnet 4.5, Claude Opus
- Google - Gemini Pro, Gemini Flash
- Mistral - Mistral Large, Mistral Medium
- Cohere - Command R+
- And many more…
## Features

### Streaming

Streaming works automatically with full observability:
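A sketch using the AI SDK's `streamText`; as above, the `observ` import path and model name are assumptions:

```typescript
import { streamText } from "ai";
import { openai } from "@ai-sdk/openai";
import { observ } from "observ"; // assumed import path

const result = streamText({
  model: observ.wrap(openai("gpt-4o")),
  prompt: "Write a haiku about tracing.",
});

// Tokens stream to the caller as they arrive; the completed
// response is recorded in the trace once the stream finishes.
for await (const chunk of result.textStream) {
  process.stdout.write(chunk);
}
```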
### Structured Outputs

Generate typed objects with `generateObject`:
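A sketch with a Zod schema (the schema and prompt are illustrative; the `observ` import path is an assumption):

```typescript
import { generateObject } from "ai";
import { openai } from "@ai-sdk/openai";
import { z } from "zod";
import { observ } from "observ"; // assumed import path

const { object } = await generateObject({
  model: observ.wrap(openai("gpt-4o")),
  schema: z.object({
    name: z.string(),
    ingredients: z.array(z.string()),
  }),
  prompt: "Generate a simple pasta recipe.",
});

// `object` is validated against the schema and fully typed:
// { name: string; ingredients: string[] }
console.log(object.name);
```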
### Tool Calling

Use tools (function calling) with any compatible provider:
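A sketch using the AI SDK's `tool` helper (v4-style `parameters`; v5 renames this to `inputSchema`). The weather tool, its stubbed return value, and the `observ` import path are assumptions:

```typescript
import { generateText, tool } from "ai";
import { openai } from "@ai-sdk/openai";
import { z } from "zod";
import { observ } from "observ"; // assumed import path

const { text } = await generateText({
  model: observ.wrap(openai("gpt-4o")),
  tools: {
    weather: tool({
      description: "Get the current weather for a city",
      parameters: z.object({ city: z.string() }),
      // Hypothetical lookup; replace with a real API call.
      execute: async ({ city }) => ({ city, tempC: 21 }),
    }),
  },
  maxSteps: 2, // let the model call the tool, then answer
  prompt: "What is the weather in Paris?",
});
console.log(text);
```

Each tool invocation appears in the trace alongside the model calls that triggered it.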
## Next Steps

- **Installation**: Install the SDK and dependencies
- **Usage Guide**: Detailed usage with all providers
- **Sessions**: Track conversation threads
- **Metadata**: Add custom metadata to traces
