Vercel
Vercel aggregates models from multiple providers with enhanced features like rate limiting and failover. Access 85 models through NotPixel’s model router.
Learn more in the Vercel AI SDK documentation.
Usage
```typescript
import Ads from 'notpixel';

const ads = new Ads({
  publisherId: 'pub_xxx',
  model: 'vercel/openai/gpt-5.2',
  input: 'How do I deploy a Next.js app?',
});

const response = await ads.offer();
console.log(response.text);
```

Configuration
.env

```bash
VERCEL_API_KEY=your-gateway-key

# Or use provider API keys directly
OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
```

Available Models
| Model |
|---|
| vercel/alibaba/qwen3-coder-plus |
| vercel/alibaba/qwen3-max |
| vercel/alibaba/qwen3-next-80b-a3b-instruct |
| vercel/alibaba/qwen3-next-80b-a3b-thinking |
| vercel/alibaba/qwen3-vl-instruct |
| vercel/alibaba/qwen3-vl-thinking |
| vercel/amazon/nova-lite |
| vercel/amazon/nova-micro |
| vercel/amazon/nova-pro |
| vercel/anthropic/claude-3-haiku |
| vercel/anthropic/claude-3.5-haiku |
| vercel/anthropic/claude-3.5-sonnet |
| vercel/anthropic/claude-3.7-sonnet |
| vercel/anthropic/claude-4-1-opus |
| vercel/anthropic/claude-4-opus |
| vercel/anthropic/claude-4-sonnet |
| vercel/anthropic/claude-4.5-sonnet |
| vercel/anthropic/claude-haiku-4.5 |
| vercel/anthropic/claude-opus-4.5 |
| vercel/deepseek/deepseek-r1 |
| vercel/deepseek/deepseek-r1-distill-llama-70b |
| vercel/deepseek/deepseek-v3.1-terminus |
| vercel/deepseek/deepseek-v3.2-exp |
| vercel/deepseek/deepseek-v3.2-exp-thinking |
| vercel/google/gemini-2.0-flash |
| vercel/google/gemini-2.0-flash-lite |
| vercel/google/gemini-2.5-flash |
| vercel/google/gemini-2.5-flash-lite |
| vercel/google/gemini-2.5-flash-lite-preview-09-2025 |
| vercel/google/gemini-2.5-flash-preview-09-2025 |
| vercel/google/gemini-2.5-pro |
| vercel/google/gemini-3-pro-preview |
| vercel/meta/llama-3.3-70b |
| vercel/meta/llama-4-maverick |
| vercel/meta/llama-4-scout |
| vercel/minimax/minimax-m2 |
| vercel/mistral/codestral |
| vercel/mistral/magistral-medium |
| vercel/mistral/magistral-small |
| vercel/mistral/ministral-3b |
| vercel/mistral/ministral-8b |
| vercel/mistral/mistral-large |
| vercel/mistral/mistral-small |
| vercel/mistral/mixtral-8x22b-instruct |
| vercel/mistral/pixtral-12b |
| vercel/mistral/pixtral-large |
| vercel/moonshotai/kimi-k2 |
| vercel/morph/morph-v3-fast |
| vercel/morph/morph-v3-large |
| vercel/openai/gpt-4-turbo |
| vercel/openai/gpt-4.1 |
| vercel/openai/gpt-4.1-mini |
| vercel/openai/gpt-4.1-nano |
| vercel/openai/gpt-5.2 |
| vercel/openai/gpt-5.2-mini |
| vercel/openai/gpt-5 |
| vercel/openai/gpt-5-codex |
| vercel/openai/gpt-5-mini |
| vercel/openai/gpt-5-nano |
| vercel/openai/gpt-oss-120b |
| vercel/openai/gpt-oss-20b |
| vercel/openai/o1 |
| vercel/openai/o3 |
| vercel/openai/o3-mini |
| vercel/openai/o4-mini |
| vercel/perplexity/sonar |
| vercel/perplexity/sonar-pro |
| vercel/perplexity/sonar-reasoning |
| vercel/perplexity/sonar-reasoning-pro |
| vercel/vercel/v0-1.0-md |
| vercel/vercel/v0-1.5-md |
| vercel/xai/grok-2 |
| vercel/xai/grok-2-vision |
| vercel/xai/grok-3 |
| vercel/xai/grok-3-fast |
| vercel/xai/grok-3-mini |
| vercel/xai/grok-3-mini-fast |
| vercel/xai/grok-4 |
| vercel/xai/grok-4-fast |
| vercel/xai/grok-4-fast-non-reasoning |
| vercel/xai/grok-code-fast-1 |
| vercel/zai/glm-4.5 |
| vercel/zai/glm-4.5-air |
| vercel/zai/glm-4.5v |
| vercel/zai/glm-4.6 |
Model Format
Use the format `vercel/{provider}/{model}`:

```typescript
// OpenAI via Vercel
model: 'vercel/openai/gpt-5.2'

// Anthropic via Vercel
model: 'vercel/anthropic/claude-3.5-sonnet'

// Google via Vercel
model: 'vercel/google/gemini-2.5-pro'

// Meta via Vercel
model: 'vercel/meta/llama-3.3-70b'
```

Why Use Vercel Gateway?
- Unified API: Single interface for multiple providers
- Rate limiting: Built-in rate limit management
- Failover: Automatic failover between providers
- Edge optimized: Low latency from Vercel’s edge network
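The `vercel/{provider}/{model}` format described above is easy to work with programmatically. The helper below is an illustrative sketch for splitting a routed model ID into its parts; it is not part of the notpixel SDK:

```typescript
// Split a routed model ID of the form "vercel/{provider}/{model}".
// Illustrative only; this helper is not part of the notpixel SDK.
interface RoutedModel {
  gateway: string;
  provider: string;
  model: string;
}

function parseModelId(id: string): RoutedModel {
  const [gateway, provider, ...rest] = id.split('/');
  if (gateway !== 'vercel' || !provider || rest.length === 0) {
    throw new Error(`Expected "vercel/{provider}/{model}", got "${id}"`);
  }
  return { gateway, provider, model: rest.join('/') };
}

// parseModelId('vercel/openai/gpt-5.2')
// → { gateway: 'vercel', provider: 'openai', model: 'gpt-5.2' }
```

Keeping the provider segment separate makes it straightforward to, for example, group usage metrics per provider.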
Environment Variable
| Variable | Description |
|---|---|
| VERCEL_API_KEY | Your Vercel gateway API key |
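Since requests fail at runtime without credentials, it can help to check the environment at startup. The guard below is a sketch; the variable names match the Configuration section above, but the function itself is not part of the SDK, which decides for itself which variables it reads:

```typescript
// Return true if at least one of the documented API keys is present.
// Sketch only; not part of the notpixel SDK.
function hasModelCredentials(
  env: Record<string, string | undefined>,
): boolean {
  return Boolean(
    env.VERCEL_API_KEY || env.OPENAI_API_KEY || env.ANTHROPIC_API_KEY,
  );
}

// Typical startup check:
// if (!hasModelCredentials(process.env)) {
//   throw new Error('No model API key configured');
// }
```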