Overview
The Aden TypeScript SDK provides real-time usage tracking, budget enforcement, and cost control for LLM applications. It automatically instruments OpenAI, Anthropic Claude, and Google Gemini API calls without modifying your application code.

GitHub Repository
View the source code and contribute.
Key Features
Multi-Provider Support
Works with OpenAI, Anthropic, and Google Gemini through a single integration.
Zero Code Changes
Automatic instrumentation - just call instrument() once at startup (see the sketch after this list).
Real-Time Cost Control
Budget limits, throttling, and automatic model degradation.
Comprehensive Metrics
Track tokens, latency, costs, tool calls, and more.
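A minimal startup sketch of what this looks like in practice. Only the instrument()-once-at-startup pattern comes from this page; the package name and import path below are illustrative assumptions, so check the repository for the exact entry point and configuration options.

```typescript
// Assumed package name and import path - confirm against the repository.
import { instrument } from "aden";
import OpenAI from "openai";

// Call once at application startup, before creating provider clients.
// Budget limits, throttling, and model degradation (see Key Features) are
// configured through the SDK; the exact options are documented in the repo.
instrument();

// From here on, standard provider usage is tracked automatically.
const openai = new OpenAI();
```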
Supported Providers
| Provider | SDK Package | Status |
|---|---|---|
| OpenAI | openai | Full support (Chat, Responses API, streaming) |
| Anthropic | @anthropic-ai/sdk | Full support (Messages API, streaming, tools) |
| Google Gemini | @google/generative-ai | Full support (generateContent, chat) |
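Once instrumentation is active, provider calls are written exactly as they would be without the SDK. As a sketch, here is a standard Anthropic Messages API call using the official @anthropic-ai/sdk; nothing in it is Aden-specific, and the only assumption is that instrument() already ran at startup.

```typescript
import Anthropic from "@anthropic-ai/sdk";

// Standard Anthropic Messages API usage - no Aden-specific code needed.
// Assumes instrument() was called once at process startup (see above).
const anthropic = new Anthropic(); // reads ANTHROPIC_API_KEY from the environment
const message = await anthropic.messages.create({
  model: "claude-3-5-sonnet-latest",
  max_tokens: 256,
  messages: [{ role: "user", content: "Give me one fun fact about TypeScript." }],
});
console.log(message.content);
```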
Framework Compatibility
The SDK works seamlessly with popular AI frameworks:

- Vercel AI SDK - Via fetch instrumentation (see the sketch after this list)
- LangChain - Instruments underlying LLM providers
- LlamaIndex - Works with instrumented providers
- Mastra - Full agent stack tracking support
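For the Vercel AI SDK case, here is a sketch of an ordinary generateText call. The ai and @ai-sdk/openai usage is standard; the assumption is that instrument() already ran at startup, so the underlying request is picked up by the fetch instrumentation described above.

```typescript
import { generateText } from "ai";
import { openai } from "@ai-sdk/openai";

// Standard Vercel AI SDK call - nothing Aden-specific. Assumes instrument()
// was called once at startup, so this request is tracked automatically.
const { text, usage } = await generateText({
  model: openai("gpt-4o"),
  prompt: "List three uses of a paperclip.",
});
console.log(text);
console.log(usage); // token usage as reported by the AI SDK itself
```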
What Gets Tracked
Every LLM API call is captured with:

| Metric | Description |
|---|---|
| input_tokens | Prompt/input tokens used |
| output_tokens | Completion/output tokens generated |
| cached_tokens | Tokens served from prompt cache |
| reasoning_tokens | Reasoning tokens (o1/o3 models) |
| latency_ms | Request duration in milliseconds |
| model | Model name (e.g., gpt-4o, claude-3-5-sonnet) |
| provider | Provider name (openai, anthropic, gemini) |
| tool_calls | Function/tool calls made |
| trace_id | OpenTelemetry-compatible trace ID |
| agent_stack | Named agent context for multi-agent systems |
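For reference, the metrics above could be represented with a TypeScript shape like the following. This is an illustrative sketch derived from the table, not the SDK's published type definition; field types marked as assumed may differ in the actual SDK.

```typescript
// Illustrative shape only, derived from the table above - not the SDK's
// published type. Field names match the documented metrics.
interface TrackedCall {
  input_tokens: number;     // prompt/input tokens used
  output_tokens: number;    // completion/output tokens generated
  cached_tokens: number;    // tokens served from the prompt cache
  reasoning_tokens: number; // reasoning tokens (o1/o3 models)
  latency_ms: number;       // request duration in milliseconds
  model: string;            // e.g. "gpt-4o", "claude-3-5-sonnet"
  provider: "openai" | "anthropic" | "gemini";
  tool_calls: number;       // function/tool calls made (count assumed here)
  trace_id: string;         // OpenTelemetry-compatible trace ID
  agent_stack: string[];    // named agent context (array type assumed here)
}
```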