Orq.ai Spins Out Its AI Router as Standalone Product After Watching Customers Build — Then Abandon — Their Own

The Amsterdam-based platform is betting that routing will become the critical control layer for production AI. Its new standalone Router addresses cost explosion, vendor lock-in, and European sovereignty demands — without the 5 to 6 per cent markup most gateways charge.

When bunq — one of Europe’s leading fintech banks — built its own LLM routing infrastructure, it seemed like the pragmatic move. Control the stack, avoid vendor dependency, and optimise costs in-house. What followed was predictable to anyone who has watched infrastructure complexity compound: maintenance became expensive, observability remained patchy, and the internal team found itself managing plumbing instead of building features.

Eventually, bunq replaced the whole system with Orq.ai’s Router, a production-ready gateway that handles governance, scalability, and cost monitoring without requiring the bank to maintain custom infrastructure.

AI Router, Orq.ai

That pattern — build internally, hit a wall, switch to a managed solution — is what prompted Orq.ai to spin out its Router as a standalone product. Previously bundled within the company’s broader agent lifecycle platform, the Router is now available independently, giving engineering teams a dedicated entry point for managing, routing, and optimising requests across multiple large language models through a single gateway. The timing reflects a broader shift: as AI systems move into production, routing is no longer plumbing. It is the control layer.

Why Routing Became a Bottleneck

New large language models are entering the market at a pace that makes any single-model strategy untenable. To balance cost, performance, and reliability, most organisations now run multiple models in production — a setup that introduces operational complexity traditional application stacks were never designed to handle. For Orq.ai’s early customers in regulated industries like financial services, fintech, and healthcare, this complexity collides directly with non-negotiable requirements around reliability, control, and compliance.

“We built our own LLM routing infrastructure, but maintaining it became increasingly expensive and time-consuming, while still leaving gaps in observability and performance,” said Benjamin Kleppe, GenAI Lead at bunq. “We chose to work with Orq.ai to replace that internal setup with a production-ready AI Router that meets our governance, scalability, and cost-monitoring requirements.”

The problem bunq faced is structural, not specific to one company. As soon as AI systems expand beyond a single model, routing becomes a production bottleneck rather than an implementation detail. Requests need to be directed intelligently across providers, costs need to be tracked granularly, performance needs to be monitored continuously, and compliance constraints need to be enforced consistently. Doing all of that in-house is feasible — until it becomes the thing your team spends time on instead of the thing your product actually does.
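To make the in-house burden concrete, here is a minimal sketch of the kind of routing plumbing the article describes teams building themselves. This is purely illustrative — the provider names, pricing, and `Router` class are invented for the example and are not Orq.ai's actual API — but it shows the three jobs that compound in production: directing requests, falling back on failure, and tracking cost per provider.

```python
# Hypothetical illustration (not Orq.ai's API): a minimal multi-provider
# router that tries the cheapest healthy provider first, falls back on
# failure, and accumulates per-provider cost.

class Provider:
    def __init__(self, name, cost_per_1k_tokens, healthy=True):
        self.name = name
        self.cost_per_1k_tokens = cost_per_1k_tokens
        self.healthy = healthy

    def complete(self, prompt):
        if not self.healthy:
            raise RuntimeError(f"{self.name} unavailable")
        return f"[{self.name}] response to: {prompt}"

class Router:
    def __init__(self, providers):
        self.providers = providers
        self.cost_log = {}  # provider name -> accumulated cost

    def route(self, prompt, est_tokens=1000):
        # Cheapest-first routing with automatic fallback.
        for p in sorted(self.providers, key=lambda p: p.cost_per_1k_tokens):
            try:
                result = p.complete(prompt)
            except RuntimeError:
                continue  # provider down — try the next one
            cost = p.cost_per_1k_tokens * est_tokens / 1000
            self.cost_log[p.name] = self.cost_log.get(p.name, 0.0) + cost
            return result
        raise RuntimeError("all providers failed")

router = Router([
    Provider("cheap-model", 0.002, healthy=False),  # simulate an outage
    Provider("mid-model", 0.01),
])
print(router.route("Summarise this invoice"))  # falls back to mid-model
print(router.cost_log)
```

Even this toy version hints at the maintenance surface: add retry budgets, streaming, per-tenant quotas, and compliance rules, and the sketch becomes the full-time infrastructure project bunq eventually handed off.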

Sovereignty Is No Longer Abstract

“In Europe, AI sovereignty is no longer an abstract policy debate; it’s a direct consequence of today’s geopolitical reality,” said Sohrab Hosseini, co-founder of Orq.ai. “Enterprises need to know where AI inference runs, who controls the infrastructure, and how quickly they can adapt as conditions change. Those decisions are enforced at the routing layer, which is why we made it available as a standalone product.”

The Orq.ai Router gives teams explicit control over how requests are routed across providers and regions. Routing policies can be defined around geography, latency, cost, or custom constraints, allowing organisations to adapt infrastructure decisions without rewriting applications or re-architecting systems.

Critically, unlike most AI routers, Orq.ai can run entirely within a customer’s own infrastructure, supporting deployments across both public and private models. For European enterprises navigating the EU AI Act and GDPR compliance, that architectural flexibility is not optional — it is the baseline.

Pricing That Doesn’t Punish Scale

The way Orq.ai prices the Router reflects its architectural philosophy. Most AI gateways apply percentage-based markups on top of underlying model costs — often in the range of 5 to 6 per cent — which compounds quickly as usage scales. The Orq.ai Router takes a different approach: routing itself is available without a platform markup. Teams pay only for tracing and logging of processed data, based on volume. That pricing model matters because it aligns incentives correctly. Organisations are not penalised for scaling usage; they pay for the observability and governance capabilities they actually need.

“As soon as AI systems move beyond a single model, routing turns from plumbing into a production bottleneck. Making the router standalone lets teams regain control early, with a single line of code,” said Hosseini.

From Routing to Full Agent Lifecycle

The Router’s standalone release is designed as an entry point, not an endpoint. Teams can deploy it independently today to address immediate routing and cost-control challenges, then expand into Orq.ai’s broader agent lifecycle capabilities — prompt management, experimentation, and deployment orchestration — as operational requirements grow. That modular approach reflects how production AI infrastructure actually evolves: teams start with the immediate pain point (routing), prove the value, then expand scope incrementally.

Orq.ai has raised €7.3 million in total funding and now serves over 500 engineering and product teams across regulated sectors. The company is SOC 2 certified, GDPR-compliant, and aligned with the EU AI Act — credentials that matter significantly when selling into European financial services, healthcare, and government customers.

The founding team includes Sohrab Hosseini, who previously led tech operations at Neocles and Transdev, and Anthony Diaz, former CTO at Agrúpalo, both based in Amsterdam. The 25-person distributed team reflects Orq.ai’s positioning as a European-first AI infrastructure company built for sovereignty-conscious enterprises.

Sohrab Hosseini and Anthony Diaz, co-founders of Orq.ai

Other customers beyond bunq include Zonneplan, a Dutch energy tech company that used Orq.ai to build an AI chatbot that automated 98% of customer support tickets during billing peaks, and Evergrowth, a customer intelligence platform that accelerated AI development by 4x through faster prompt iteration and granular cost tracking. These deployments span fintech, energy, and SaaS sectors where reliability, compliance, and cost transparency are not negotiable.

The Router is available immediately as a standalone product. For teams already running multi-model AI systems in production and discovering that routing has quietly become their most critical infrastructure decision, the value proposition is straightforward: regain control without building it yourself, avoid vendor markup that scales with usage, and maintain the flexibility to run inference wherever sovereignty or compliance demands. That is not a hypothetical pitch; it is the lesson bunq and others learned the expensive way.