Announcements
New models, providers, functionality, and ecosystem updates.
Find more discussion on Discord and X.
Introducing Presets: Manage LLM Configs from Your Dashboard!
Centralize your LLM logic, iterate faster, and clean up your code—Presets are now live on OpenRouter.
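As a quick illustration of the idea, a saved preset can be referenced directly in a request instead of repeating model and sampling settings in code. A minimal sketch, assuming the `@preset/<slug>` model reference described at launch (the preset slug below is hypothetical):

```typescript
// Minimal sketch: call a saved preset instead of hard-coding model + params.
// Assumption: presets are referenced as "@preset/<slug>" in the model field;
// confirm the exact syntax in the Presets documentation.
async function runPreset(prompt: string) {
  const res = await fetch("https://openrouter.ai/api/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "@preset/email-summarizer", // hypothetical preset slug
      messages: [{ role: "user", content: prompt }],
    }),
  });
  return (await res.json()).choices[0].message.content;
}

runPreset("Summarize this thread for me.").then(console.log);
```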
Dev & BYOK Updates: Uptime API + Smarter Key Management
Track model uptime via API and get more control over your BYOK setup—including usage limits and testable keys.
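To make the uptime idea concrete, here is a rough sketch that polls a status route for a model and logs the result. The URL path and response handling are assumptions made for illustration, not the documented API; see the announcement and API reference for the real endpoint and schema.

```typescript
// Hypothetical sketch: poll an uptime route for a model and log availability.
// Assumption: the `/models/<slug>/uptime` path below is illustrative only;
// check the OpenRouter API reference for the actual endpoint and fields.
async function checkModelUptime(modelSlug: string) {
  const res = await fetch(
    `https://openrouter.ai/api/v1/models/${modelSlug}/uptime`, // assumed path
    { headers: { Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}` } },
  );
  if (!res.ok) throw new Error(`Uptime request failed: ${res.status}`);
  console.log(`Recent uptime for ${modelSlug}:`, await res.json());
}

checkModelUptime("anthropic/claude-3.5-sonnet").catch(console.error);
```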
Simplifying Our Platform Fee
We’re rolling out a simpler, more transparent platform fee structure.
GIF Prompts, Omni Search, Tool Caching, and BYOK Flags
Faster workflows, smarter tooling, and smoother image support—GIFs, provider search, Anthropic tool-call caching, and BYOK confirmation are all now live.
New Features: Reasoning Streams, Crypto Invoices, End-User IDs & More
Stream reasoning summaries, protect your rate limits, pay with crypto, and lock down your keys—now all live on OpenRouter.
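Of these, end-user IDs map most directly to a request field: tagging each call with a stable, anonymized ID for the person behind it lets abuse or rate-limit pressure be attributed to one end user instead of your whole key. A minimal sketch, assuming the OpenAI-style `user` field (confirm the exact field name in the announcement):

```typescript
// Minimal sketch: tag a chat completion with an end-user ID.
// Assumption: the `user` field follows the OpenAI-compatible convention;
// verify the exact field name in the OpenRouter docs.
async function askForUser(endUserId: string, prompt: string) {
  const res = await fetch("https://openrouter.ai/api/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "openai/gpt-4o-mini",
      user: endUserId, // stable, anonymized ID for the end user
      messages: [{ role: "user", content: prompt }],
    }),
  });
  return (await res.json()).choices[0].message.content;
}

askForUser("customer_1234", "Hello!").then(console.log);
```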
Passkeys, DevEx Upgrades, and a New Guide for TypeScript Agents
Secure your account with passkeys, explore provider slugs, and build agents faster with fresh docs.
New Provider Drop: Cerebras Is Here
A provider built for speed and scale—from wafer to token. See what becomes possible when memory bottlenecks disappear.
Better Insights, Faster Metrics, and New Developer Power Tools
Deeper usage insights, sharper perf metrics, and new dev tools to speed up your workflow.
Privacy Clarity, New Providers, OAuth Upgrade, and Gemini Gets Parallel Tools
A handful of quality-of-life improvements for developers!
Universal PDF Support
OpenRouter now supports PDF processing for every model.
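In practice, this means a PDF can be attached to an ordinary chat completion as a file content part. A minimal sketch, assuming a `type: "file"` part with a `filename` and a base64 `data:application/pdf` URL (double-check the field names against the current API reference):

```typescript
import { readFileSync } from "node:fs";

// Minimal sketch: send a local PDF to a model via chat completions.
// Assumption: the `file` content-part shape (filename + base64 data URL)
// matches the documented PDF input format; verify against the current docs.
async function askAboutPdf(path: string, question: string) {
  const base64 = readFileSync(path).toString("base64");
  const res = await fetch("https://openrouter.ai/api/v1/chat/completions", {
    method: "POST",
    headers: {
      Authorization: `Bearer ${process.env.OPENROUTER_API_KEY}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      model: "google/gemini-2.0-flash-001",
      messages: [
        {
          role: "user",
          content: [
            { type: "text", text: question },
            {
              type: "file",
              file: {
                filename: "document.pdf",
                file_data: `data:application/pdf;base64,${base64}`,
              },
            },
          ],
        },
      ],
    }),
  });
  return (await res.json()).choices[0].message.content;
}

askAboutPdf("./report.pdf", "Summarize this PDF.").then(console.log);
```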