What are Sessions?
Sessions allow you to group related LLM calls under a single identifier. This is useful for:
- Multi-turn conversations - Track the entire conversation flow
- User workflows - Analyze how users interact with your AI features
- Debugging - Trace issues across multiple related calls
- Analytics - Understand conversation patterns and user behavior
Use Cases
- Chatbots - Group all messages in a conversation thread together
- Multi-step Workflows - Track related API calls in complex workflows
- A/B Testing - Compare conversation flows across different experiments
- User Journeys - Analyze how users navigate your AI features
Vercel AI SDK
Pass session IDs via providerOptions.observ.sessionId:
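A minimal sketch, assuming a recent Vercel AI SDK (`generateText` with `providerOptions`) and the OpenAI provider package; the model name and prompt are illustrative, and the `observ` key follows the option path above:

```ts
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

const result = await generateText({
  model: openai("gpt-4o"),
  prompt: "Hello! What can you help me with?",
  providerOptions: {
    // The observability layer reads the session ID from this key.
    observ: { sessionId: "support-chat-user-123" },
  },
});

console.log(result.text);
```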
Provider SDKs
Chain .withSessionId() (TypeScript) or .with_session_id() (Python) before your API call.
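A minimal sketch of the TypeScript chaining pattern. Here `client` stands in for your instrumented provider client; its construction is omitted, and the declared shape below is only an assumption for illustration:

```ts
// Placeholder shape for an instrumented Anthropic-style client;
// the real wrapper may differ.
declare const client: {
  withSessionId(id: string): {
    messages: { create(params: object): Promise<unknown> };
  };
};

// Chain the session ID onto the client before making the call.
const response = await client
  .withSessionId("support-chat-user-123")
  .messages.create({
    model: "claude-3-5-sonnet-latest",
    max_tokens: 1024,
    messages: [{ role: "user", content: "Hello!" }],
  });
```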
Multi-turn Conversation Example
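A sketch of a multi-turn flow in TypeScript, using the Vercel AI SDK path shown earlier (AI SDK v4-style imports and the `observ` key are assumptions); the important part is that every turn reuses the same session ID:

```ts
import { randomUUID } from "node:crypto";
import { generateText, type CoreMessage } from "ai";
import { openai } from "@ai-sdk/openai";

// One session ID for the whole conversation thread.
const sessionId = `support-${randomUUID()}`;
const history: CoreMessage[] = [];

async function ask(question: string): Promise<string> {
  history.push({ role: "user", content: question });

  // Each turn carries the same session ID, so all traces group into one session.
  const { text } = await generateText({
    model: openai("gpt-4o"),
    messages: history,
    providerOptions: { observ: { sessionId } },
  });

  history.push({ role: "assistant", content: text });
  return text;
}

await ask("My order hasn't arrived yet.");
await ask("Can you check the shipping status for me?");
```

Both turns then appear under the same session in the Dashboard.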
All Providers
Sessions work with all supported providers:
- Anthropic
- OpenAI
- Mistral
- xAI
- OpenRouter
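As a sketch of the cross-provider behavior (shown here with Vercel AI SDK provider packages, which are an assumption for this example), reusing one session ID groups calls to different providers into the same session:

```ts
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";
import { anthropic } from "@ai-sdk/anthropic";

const sessionId = "compare-providers-demo";

// Both calls share the session ID, so they land in the same session
// even though they go to different providers.
const fromAnthropic = await generateText({
  model: anthropic("claude-3-5-sonnet-latest"),
  prompt: "Summarize the benefits of session tracking in one sentence.",
  providerOptions: { observ: { sessionId } },
});

const fromOpenAI = await generateText({
  model: openai("gpt-4o"),
  prompt: "Summarize the benefits of session tracking in one sentence.",
  providerOptions: { observ: { sessionId } },
});
```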
Best Practices
Use consistent session IDs
Use the same session ID across all calls in a conversation thread. Generate unique IDs per conversation (e.g., UUIDs, user IDs + timestamp).
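A minimal sketch of generating one ID per conversation; the naming scheme and helper are illustrative:

```ts
import { randomUUID } from "node:crypto";

// Generate a fresh session ID when a new conversation starts,
// then reuse it for every call in that thread.
function newSessionId(userId: string): string {
  return `chat-${userId}-${randomUUID()}`;
}

const sessionId = newSessionId("user-123");
```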
Include context in session IDs
Make session IDs meaningful for easier debugging:
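For example (illustrative formats, not a required scheme):

```ts
// Encode who, what, and when so a session is easy to find later.
const supportThread = "support-user-123-2024-01-15"; // feature + user + date
const onboardingRun = "onboarding-org-acme-run-7";   // workflow + org + attempt
const experimentArm = "ab-test-prompt-v2-user-123";  // experiment + variant + user
```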
Clean up old sessions
Session IDs are just strings - they don’t consume resources. However, for privacy, you may want to rotate them periodically.
Viewing Sessions in Dashboard
All traces with the same session ID are grouped together in the Dashboard. You can:
- View the entire conversation flow
- See timing and cost for each call
- Filter by session ID
- Analyze conversation patterns
