Verda and Siili Solutions launch a sovereign LLM-as-a-Service offering
Verda (formerly DataCrunch) and Siili Solutions Plc have launched a strategic partnership to give European enterprises and public-sector organizations low-barrier, cost-efficient access to sovereign AI models.
Delivered in an LLM-as-a-Service model, LLM Gateway offers a wide range of inference endpoints for open-source models, all running on NVIDIA's cutting-edge GPU hardware, hosted entirely in Finland and powered by 100% renewable energy.
This joint solution is sovereign and GDPR-compliant by design, unlocking efficient LLM serving for use cases ranging from rapid prototyping to production-grade systems in highly regulated environments.
Built for Europe’s most regulated sectors
This offering targets industries where privacy and operational continuity are critical: finance, healthcare, public services, and large enterprises modernizing their AI stacks.
All compute stays within the EU's jurisdiction, giving organizations a fully sovereign and compliant environment with no reliance on US hyperscalers or external LLM vendors.
Unlocking friction-free LLM serving
By combining each partner's strengths, we ensure that LLM Gateway delivers efficient LLM serving at competitive pricing, without engineering overhead.
LLM Gateway is a one-stop-shop endpoint that connects to multiple AI models. The endpoint is managed by Siili's Solution Architects and runs on the Verda Cloud platform.
The serverless approach ensures performant and scalable LLM inference with out-of-the-box auto-scaling and predictable pay-per-token pricing.
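For illustration, here is a minimal sketch of how a client might call such a pay-per-token gateway. It assumes an OpenAI-compatible chat completions API; the base URL, model name, and environment variable below are placeholders, not the actual LLM Gateway interface.

```python
# Minimal sketch of calling a hosted inference gateway.
# Assumptions: an OpenAI-compatible /v1/chat/completions endpoint and
# placeholder base URL, model name, and API key (not the real LLM Gateway API).
import os
import requests

BASE_URL = "https://llm-gateway.example.eu/v1"   # hypothetical endpoint
API_KEY = os.environ["GATEWAY_API_KEY"]          # e.g. a key issued during the beta

response = requests.post(
    f"{BASE_URL}/chat/completions",
    headers={"Authorization": f"Bearer {API_KEY}"},
    json={
        "model": "open-source-model-name",        # placeholder model identifier
        "messages": [
            {"role": "user", "content": "Summarize the GDPR in two sentences."}
        ],
        "max_tokens": 200,
    },
    timeout=30,
)
response.raise_for_status()
data = response.json()
print(data["choices"][0]["message"]["content"])
# With pay-per-token pricing, billing would typically be metered from the
# token counts reported in data["usage"].
```

Because the service is serverless, a client like this would not provision or scale any GPU capacity itself; auto-scaling and per-token billing are handled by the platform.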
Get started
You can try LLM Gateway by signing up for the beta program.
Or learn more about the Verda x Siili Solutions partnership.