The Vercel AI SDK provides a standardized way to interact with multiple model providers. Wrap your model with observ.wrap() and use it exactly as you normally would; observability, and semantic caching when recall is enabled, work automatically.

Why Use Vercel AI SDK?

Provider Flexibility

Switch between OpenAI, Anthropic, Google, Mistral, and 20+ other providers without changing your code

Full Feature Support

Streaming, structured outputs (generateObject), and tool calling all work automatically

Automatic Caching

Enable recall: true for intelligent semantic caching across all providers

Minimal Setup

Wrap your model with observ.wrap() and use it normally - that’s it

Quick Example

import { Observ } from "observ-sdk";
import { openai } from "@ai-sdk/openai";
import { generateText } from "ai";

const observ = new Observ({
  apiKey: "your-observ-api-key",
  recall: true, // Enable semantic caching
});

// Wrap the model
const model = observ.wrap(openai("gpt-4"));

// Use it normally
const result = await generateText({
  model,
  prompt: "What is TypeScript?",
});

console.log(result.text);
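
Because recall is enabled above, a semantically similar follow-up request can be served from Observ's cache rather than hitting the provider again. A rough sketch of what that looks like, continuing the example (exact cache behavior depends on your Observ configuration):

// First call goes to the provider and is recorded by Observ
await generateText({ model, prompt: "What is TypeScript?" });

// A semantically similar prompt may be answered from the semantic cache
await generateText({ model, prompt: "Explain what TypeScript is." });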

Supported Providers

The Vercel AI SDK supports 25+ providers. Here are some popular ones:
  • OpenAI - GPT-4, GPT-3.5
  • Anthropic - Claude Sonnet 4.5, Claude Opus
  • Google - Gemini Pro, Gemini Flash
  • Mistral - Mistral Large, Mistral Medium
  • Cohere - Command R+
  • And many more…
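
Because observ.wrap() accepts any AI SDK model, switching providers is a one-line change. A minimal sketch using Anthropic, reusing the observ instance from the Quick Example (assumes the @ai-sdk/anthropic package is installed; the model ID is illustrative):

import { anthropic } from "@ai-sdk/anthropic";
import { generateText } from "ai";

// Same wrapper, different provider - the rest of the code is unchanged
const model = observ.wrap(anthropic("claude-sonnet-4-5"));

const result = await generateText({
  model,
  prompt: "What is TypeScript?",
});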

Features

Streaming

Streaming works automatically with full observability:
import { streamText } from "ai";

const stream = await streamText({
  model, // Your wrapped model
  prompt: "Write a detailed explanation of async/await",
});

for await (const chunk of stream.textStream) {
  process.stdout.write(chunk);
}
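
Once the stream has been consumed, the result returned by streamText also resolves the final text and token usage as promises, which is handy for logging:

// Both promises resolve after the stream finishes
const fullText = await stream.text;
const usage = await stream.usage;
console.log(fullText.length, usage.totalTokens);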

Structured Outputs

Generate typed objects with generateObject:
import { generateObject } from "ai";
import { z } from "zod";

const result = await generateObject({
  model,
  schema: z.object({
    name: z.string(),
    age: z.number(),
  }),
  prompt: "Generate a person's profile",
});
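
result.object is typed from the Zod schema, so the fields can be used directly:

// Typed as { name: string; age: number }
console.log(result.object.name, result.object.age);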

Tool Calling

Use tools (function calling) with any compatible provider:
import { generateText, tool } from "ai";
import { z } from "zod";

const result = await generateText({
  model,
  tools: {
    weather: tool({
      description: "Get weather information",
      parameters: z.object({
        location: z.string(),
      }),
      execute: async ({ location }) => ({
        temperature: 72,
        location,
      }),
    }),
  },
  prompt: "What's the weather in San Francisco?",
});
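
The calls the model made and the values your execute functions returned are available on the result object (field names shown here follow AI SDK 4.x):

// Inspect tool calls and their results
for (const call of result.toolCalls) {
  console.log(call.toolName, call.args);
}
for (const toolResult of result.toolResults) {
  console.log(toolResult.result);
}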

Next Steps