OpenRouter-compatible. Attested. 99.9% router availability.
OpenRouter-Compatible, But Verifiable
Change base_url, keep your models, get a verifiable non-logging prompt path.
1 line: base_url migration
99.9%: router availability
0: prompt/output logs
OpenRouter-compatible, but verifiable. Same OpenAI-shaped client, different trust model.
One API: all the LLMs, provably private.
99.9%: router availability above individual endpoints.
Open source: backend, config, bring-up, and UI.
Why switch
All the routing convenience. A prompt path you can verify.
TrustedRouter routes to any LLM through one client. Switch models freely, get higher uptime with fallbacks, and try much cheaper models to save money, all without trusting TrustedRouter's servers as an intermediary any more than you have to.
Many engineers use OpenRouter only for low-sensitivity data, because they know it isn't safe to send sensitive prompts through an opaque third-party router.
Drop-in route: OpenAI-compatible
from openai import OpenAI

# Point the standard OpenAI client at the router.
client = OpenAI(
    base_url="https://api.quillrouter.com/v1",
    api_key="sk-tr-v1-...",
)
model = "trustedrouter/auto"
Migrate
Change one URL.
Migration is one line of code: swap the base URL and everything else stays the same.
Open migration guide
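The one-line claim in concrete terms: only the client configuration changes between the two routers. OpenRouter's public base URL is shown for comparison; the key values are placeholders, and the API key naturally changes along with the URL.

```python
# Before: client configuration pointed at OpenRouter.
before = {
    "base_url": "https://openrouter.ai/api/v1",
    "api_key": "sk-or-v1-...",
}

# After: pointed at TrustedRouter. Model names and request code are untouched.
after = {
    "base_url": "https://api.quillrouter.com/v1",
    "api_key": "sk-tr-v1-...",
}

# The migration touches only these fields.
changed = sorted(k for k in before if before[k] != after[k])
print(changed)  # → ['api_key', 'base_url']
```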
Verify
Check the exact code path.
The trust page lists the commit hash of the exact code that is running, so you can audit the prompt path yourself.
Verify trust
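A sketch of what that check could look like in code, assuming the trust page exposes the running commit hash as JSON. The "commit" field name, the response shape, and the example hash are all illustrative assumptions, not a documented API.

```python
import json


def is_pinned_commit(trust_page_json: str, expected_commit: str) -> bool:
    """Return True if the commit hash the trust page reports matches the
    commit you audited. The "commit" field name is an assumption."""
    reported = json.loads(trust_page_json).get("commit", "")
    return reported == expected_commit


# Illustrative: compare the reported hash against one you verified yourself.
audited = "3f9c2ab"  # hypothetical hash of the source you reviewed
page = '{"commit": "3f9c2ab"}'
print(is_pinned_commit(page, audited))  # → True
```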
Route
Use more prompts safely.
Those same engineers have many more prompts they would love to route this way, the sensitive ones they currently keep away from third-party routers.
| | OpenRouter | TrustedRouter |
| API shape | OpenAI-compatible routing | OpenAI-compatible routing |
| Trust model | Opaque third-party router | Open source plus attestation |
| Prompt handling | You have to trust the router | We never log your prompt or the output |
| Best fit | Low-sensitivity routing | Privacy-sensitive agent and app traffic |
From the founder
OpenRouter convenience for traffic you could not send before.
The most common feedback I hear from engineers is that they don't use OpenRouter at all because of their security and privacy concerns, especially for sensitive prompts, or they use it only for low-sensitivity prompts.
Then you can see it working for yourself: measure the uptime, the latency, and the time to first token, and confirm it's fast.
Most importantly, it's all open source software: every part of the backend infrastructure, configuration, bring-up, and UI.