## 1. Get your API Key

Create an account and get your API key from **Settings → API Keys**.
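Rather than hardcoding the key in source, you can read it from an environment variable at startup. A minimal sketch, assuming the variable name `OBSERV_API_KEY` (a convention chosen for this example, not something the SDK mandates):

```typescript
// Read a required environment variable, failing fast if it is missing.
// The name OBSERV_API_KEY below is just a convention, not an SDK requirement.
function requireEnv(name: string): string {
  const value = process.env[name];
  if (!value) {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}

// Later: new Observ({ apiKey: requireEnv("OBSERV_API_KEY"), ... })
```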
## 2. Choose Your Integration
### Quick Example: Vercel AI SDK
**Install dependencies**

```shell
npm install observ-sdk ai @ai-sdk/openai
```
**Wrap your model**

```typescript
import { Observ } from "observ-sdk";
import { openai } from "@ai-sdk/openai";
import { generateText } from "ai";

const observ = new Observ({
  apiKey: "your-observ-api-key",
  recall: true, // Enable semantic caching
});

// Wrap the model
const model = observ.wrap(openai("gpt-4"));
```
**Use it normally**

```typescript
const result = await generateText({
  model,
  prompt: "What is TypeScript?",
});

console.log(result.text);
```
**View traces in the dashboard**

All your LLM calls are now automatically traced! View them in the Dashboard.
### Quick Example: Provider SDKs
**Install dependencies (Node.js)**

```shell
npm install observ-sdk @anthropic-ai/sdk
```
**Wrap your client**

```typescript
import Anthropic from "@anthropic-ai/sdk";
import { Observ } from "observ-sdk";

const ob = new Observ({
  apiKey: "your-observ-api-key",
  recall: true,
});

const client = new Anthropic({ apiKey: "your-anthropic-key" });
const wrappedClient = ob.anthropic(client);
```
**Use it normally**

```typescript
const response = await wrappedClient.messages.create({
  model: "claude-sonnet-4-20250514",
  max_tokens: 1024,
  messages: [{ role: "user", content: "Hello!" }],
});

console.log(response.content[0].text);
```
**Install dependencies (Python)**

```shell
pip install observ-sdk anthropic
```
**Wrap your client**

```python
import anthropic
from observ import Observ

ob = Observ(
    api_key="your-observ-api-key",
    recall=True,
)

client = anthropic.Anthropic(api_key="your-anthropic-key")
wrapped_client = ob.anthropic(client)
```
**Use it normally**

```python
response = wrapped_client.messages.create(
    model="claude-sonnet-4-20250514",
    max_tokens=1024,
    messages=[{"role": "user", "content": "Hello!"}],
)

print(response.content[0].text)
```
## What’s Next?

Set `recall: true` to enable semantic caching and reduce costs by up to 85% on similar prompts.
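The idea behind semantic caching is that prompts are matched by meaning rather than by exact string, so a rephrased prompt can still reuse a cached response. A toy sketch of the matching step (bag-of-words cosine similarity stands in for real embeddings here; this is purely illustrative, not Observ's actual implementation):

```typescript
// Toy semantic cache: a response is reused when a new prompt is
// "similar enough" to a previously seen one.
function embed(text: string): Map<string, number> {
  const counts = new Map<string, number>();
  for (const word of text.toLowerCase().match(/\w+/g) ?? []) {
    counts.set(word, (counts.get(word) ?? 0) + 1);
  }
  return counts;
}

function cosine(a: Map<string, number>, b: Map<string, number>): number {
  let dot = 0;
  for (const [word, count] of a) dot += count * (b.get(word) ?? 0);
  const norm = (v: Map<string, number>) =>
    Math.sqrt(Array.from(v.values()).reduce((s, c) => s + c * c, 0));
  return dot / (norm(a) * norm(b) || 1);
}

class SemanticCache {
  private entries: { vec: Map<string, number>; response: string }[] = [];
  constructor(private threshold = 0.8) {}

  // Return a cached response if any stored prompt is similar enough.
  get(prompt: string): string | undefined {
    const vec = embed(prompt);
    return this.entries.find((e) => cosine(vec, e.vec) >= this.threshold)
      ?.response;
  }

  set(prompt: string, response: string): void {
    this.entries.push({ vec: embed(prompt), response });
  }
}

const cache = new SemanticCache();
cache.set("What is TypeScript?", "A typed superset of JavaScript.");
cache.get("what is typescript"); // same meaning, different casing: cache hit
cache.get("Explain Rust lifetimes"); // unrelated prompt: miss (undefined)
```

With `recall: true`, this kind of matching happens transparently behind the wrapped client, which is where the savings on near-duplicate prompts come from.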