Models and Providers
aisdk provides a unified API for working with multiple AI providers and models. Rather than tying you to a single LLM vendor, the SDK exposes an abstraction layer that lets you swap providers with minimal code changes while still respecting each provider's native capabilities.
Philosophy
The SDK is built on the principle of Native Capability Retention: while the interface is unified, provider-specific features (such as Anthropic's prompt caching or Gemini's search grounding) are not lost when you route through proxies or specialized creator functions.
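As a sketch of what capability retention looks like in practice (the constructor and argument names here are illustrative assumptions, not confirmed aisdk API):

```r
library(aisdk)

# Hypothetical sketch: a model obtained through the unified interface...
claude <- create_anthropic()
model  <- claude$language_model("claude-3-5-sonnet")

# ...can still receive provider-specific options, rather than having them
# flattened away by the abstraction layer. The `provider_options` name is
# an assumption for illustration:
# model$generate(prompt, provider_options = list(cache_control = "ephemeral"))
```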
Ecosystem Overview
We categorize providers into several groups. Explore the detailed documentation for each:
Major Native Providers
These providers have deep, native integrations within the SDK:
Aggregators and Proxies
Unified endpoints that route to multiple backends:
- AIHubMix: A specialized proxy with native compatibility wrappers for Claude and Gemini.
- OpenRouter: Access to open and closed models via a single API.
Specialized & Regional Providers
Optimized connectors for high-performance backends:
Custom Connections
- Custom Providers: Connect to internal gateways, local LLMs (Ollama), or any OpenAI-compatible API.
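For example, Ollama serves an OpenAI-compatible API at `http://localhost:11434/v1` by default, so a local model can plausibly be reached through the OpenAI creator. The `base_url` and `api_key` argument names below are assumptions about the aisdk constructor signature:

```r
library(aisdk)

# Hypothetical sketch: point the OpenAI-compatible client at a local
# Ollama server instead of api.openai.com.
ollama <- create_openai(
  base_url = "http://localhost:11434/v1",
  api_key  = "ollama"  # Ollama ignores the key, but clients typically require one
)
model <- ollama$language_model("llama3.1")
```

The same pattern applies to internal gateways: any endpoint that speaks the OpenAI wire format can be addressed by overriding the base URL.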
Environment Configuration
Before using a provider, make sure your API keys are set in your .Renviron file. The SDK automatically looks for these variables:
| Provider | Environment Variable |
|---|---|
| OpenAI | OPENAI_API_KEY |
| Anthropic | ANTHROPIC_API_KEY |
| Gemini | GEMINI_API_KEY |
| AIHubMix | AIHUBMIX_API_KEY |
| DeepSeek | DEEPSEEK_API_KEY |
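A typical .Renviron might look like the following (the key values shown are placeholders; restart your R session after editing so the variables are picked up):

```
# ~/.Renviron — one KEY=value pair per line, no quotes, no `export`
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=...
GEMINI_API_KEY=...
```

You can confirm a key is visible to R with `Sys.getenv("OPENAI_API_KEY")`.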
Provider Registry
If you work with multiple providers, the ProviderRegistry can manage them under a single interface:
registry <- get_default_registry()
registry$register("openai", create_openai())
registry$register("claude", create_anthropic())
# Access models via provider:model syntax
model <- registry$language_model("openai:gpt-4o")

Next Steps
- Learn about Structured Outputs.
- Explore the Agent System.
- Build tools with the Tool DSL.