OpenRouter-compatible. Attested. 99.9% router availability.
Migrate From OpenRouter
Change base_url, keep OpenAI-compatible clients, and test the trust path.
1-line base_url migration
99.9% router availability
0 prompt/output logs
Base URL swap
Migrate in the time it takes to run one request.
Migration is simple. If you already call an OpenAI-compatible API (OpenAI, Anthropic, DeepSeek, OpenRouter, or anything similar), switching to us means changing one URL: a single line of code.
Start with trustedrouter/auto or a model from the catalog.
Before / after: base_url
from openai import OpenAI

client = OpenAI(
    base_url="https://api.quillrouter.com/v1",  # the only line that changes
    api_key="sk-tr-v1-...",
)
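A first request through the new client might look like this. This is a sketch: trustedrouter/auto is the catalog default mentioned above, and the completion_args helper exists only for illustration; the request shape is the standard OpenAI chat-completions format.

```python
def completion_args(prompt: str, model: str = "trustedrouter/auto") -> dict:
    """Build chat-completion kwargs; identical in shape to an OpenRouter call."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


def first_request() -> str:
    """Run one live completion (requires the openai package and a real key)."""
    from openai import OpenAI

    client = OpenAI(
        base_url="https://api.quillrouter.com/v1",
        api_key="sk-tr-v1-...",
    )
    resp = client.chat.completions.create(**completion_args("Say hello"))
    return resp.choices[0].message.content
```

Nothing else in your calling code changes: the same kwargs you pass today work against the new base_url.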
1. Swap URL
The wedge: a one-line replacement of your direct LLM call.
2. Verify trust
Check the source commit and attestation instructions.
3. Measure
Measure uptime, latency, and time to first token.
4. Route more
Move sensitive routes once the path checks out.
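Step 3 can be scripted. A minimal sketch of measuring time to first token: the stream argument is any iterable of response chunks, e.g. what client.chat.completions.create(..., stream=True) returns.

```python
import time


def time_to_first_token(stream) -> float:
    """Return seconds elapsed until the first chunk arrives from a stream."""
    start = time.perf_counter()
    for _chunk in stream:
        # First chunk received: stop the clock immediately.
        return time.perf_counter() - start
    raise RuntimeError("stream produced no chunks")
```

Run it with the same prompt against your current provider and against the new base_url to compare like for like.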
Trust
Verify before production.
Customers can verify that the code they're calling is exactly what they see in the open-source repo.
Verify trust
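To make the trust check concrete, here's a sketch of one way the comparison could work, assuming the attestation reports the commit hash of the deployed code. The helper below is illustrative, not the real verification API; follow the attestation instructions for the actual procedure.

```python
def commit_matches(attested_commit: str, repo_commit: str) -> bool:
    """Compare an attested commit hash against the open-source repo's HEAD.

    Hashes are compared case-insensitively, and a short hash is accepted
    as a prefix of the full one.
    """
    a = attested_commit.strip().lower()
    r = repo_commit.strip().lower()
    if not a or not r:
        return False
    # Allow either side to be the abbreviated form of the other.
    return r.startswith(a) if len(a) < len(r) else a.startswith(r)
```

You would obtain repo_commit with `git rev-parse HEAD` in a checkout of the open-source repo, and attested_commit from the attestation document.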
Credit
$100 design partner credit.
Migration credits are available by approval for qualified design partners. Credits are not granted automatically; approved credits are applied manually as a one-off.
Stack
Bring Python, JavaScript, agents, or BYOK.
There are no stack requirements, because all we do is implement a fully OpenAI-compatible API.
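Because the compatibility holds at the wire level, no particular SDK is required. A stdlib-only sketch of the same call (the payload builder is illustrative; the chat function is defined here but not run, since it needs a real key):

```python
import json
import urllib.request


def chat_payload(model: str, prompt: str) -> bytes:
    """Encode a standard OpenAI-style chat-completions request body."""
    return json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode("utf-8")


def chat(api_key: str, model: str, prompt: str) -> str:
    """POST to /v1/chat/completions and return the assistant's reply."""
    req = urllib.request.Request(
        "https://api.quillrouter.com/v1/chat/completions",
        data=chat_payload(model, prompt),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Anything that can send that request works: Python, JavaScript, an agent framework, or your own key management on top.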
Design partners
For teams spending more than $100/month on LLMs.
We are currently looking for design partners, and we offer a $100 credit to teams that switch part of their workflow over to us.
30 seconds after signup you'll have an API key. Drop it into your app and you'll see the difference: faster, cheaper, and with more uptime.
We move fast, and we're eager for design partners to bring us their hardest problems so we can solve them.