
EPFL spin-off Anyway Systems challenges the need for large data centers in AI

Tech

15 December 2025

Large cloud data centers currently power most AI inference workloads, but EPFL spin-off Anyway Systems aims to reduce reliance on centralized infrastructure by enabling advanced AI models to run on local networks. | © iStock

EPFL researchers have developed a new distributed AI software that enables powerful language models to run locally, challenging the dominance of cloud-based data centers while strengthening data privacy, sovereignty and sustainability.

Researchers at EPFL have developed new software that could significantly reduce reliance on large cloud-based data centers for artificial intelligence, with implications for data privacy, AI sovereignty and sustainability. The technology, now spun off into a start-up called Anyway Systems, enables organizations to run powerful AI models locally, without sending data to third-party cloud providers.

Today, most AI applications rely on cloud infrastructure. When a user submits a query, data is sent to remote servers where the AI performs inference before returning a result. While effective, this model concentrates computing power in the hands of a few large technology companies and raises concerns around confidentiality, control of sensitive data, energy consumption and national sovereignty.
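The round trip described above can be sketched in a few lines of Python. This is purely illustrative: `run_model` and `cloud_inference` are hypothetical names standing in for a provider's remote endpoint, not any real API.

```python
# Illustrative sketch of the typical cloud inference round trip.
# All names here are hypothetical, not a real provider's API.

def run_model(prompt: str) -> str:
    """Stand-in for inference executing on a provider's remote servers."""
    return f"answer to: {prompt}"

def cloud_inference(prompt: str) -> str:
    # 1. The user's data leaves the local network as a request payload ...
    payload = {"prompt": prompt}
    # 2. ... inference runs on hardware the user does not control ...
    result = run_model(payload["prompt"])
    # 3. ... and only the result is returned.
    return result

print(cloud_inference("summarize this contract"))
```

The point of the sketch is step 1: the raw query, which may contain sensitive data, crosses a trust boundary before any computation happens.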

Developed at EPFL’s Distributed Computing Laboratory (DCL) by researchers Gauthier Voron, Geovani Rizk and Professor Rachid Guerraoui, Anyway Systems proposes an alternative approach. The software allows users to download open-source AI models and deploy them on local networks by coordinating multiple machines into an on-premise computing cluster. Using self-stabilization techniques, the system optimizes available hardware and maintains robustness without requiring centralized data centers.
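The basic idea of coordinating several local machines into one inference cluster can be illustrated with a toy pipeline-parallel sketch: a model's layers are split into contiguous slices, each slice is assigned to one machine, and activations flow machine-to-machine so the input never leaves the local network. This is a minimal simulation under those assumptions, not Anyway Systems' actual software, and it omits the self-stabilization and fault-tolerance machinery the article mentions.

```python
# Toy sketch of sharding a model's layers across machines on a local network.
# Hypothetical code; the real system additionally handles failures and
# rebalancing via self-stabilization.

from dataclasses import dataclass, field

@dataclass
class Worker:
    """One machine in the on-premise cluster, holding a slice of the model."""
    name: str
    layers: list = field(default_factory=list)

    def forward(self, x: float) -> float:
        # Run this machine's slice of layers in order.
        for layer in self.layers:
            x = layer(x)
        return x

def shard_layers(layers, workers):
    """Assign contiguous, roughly equal slices of layers to each worker."""
    chunk = -(-len(layers) // len(workers))  # ceiling division
    for i, w in enumerate(workers):
        w.layers = layers[i * chunk:(i + 1) * chunk]

def cluster_forward(workers, x):
    """Pipeline the activation through the machines; only intermediate
    activations travel between nodes, and all traffic stays on the LAN."""
    for w in workers:
        x = w.forward(x)
    return x

# Toy "model": each layer is just a simple function.
layers = [lambda v: v + 1, lambda v: v * 2, lambda v: v - 3, lambda v: v * v]
workers = [Worker("node-a"), Worker("node-b")]
shard_layers(layers, workers)
print(cluster_forward(workers, 2.0))  # same result as running all layers on one machine
```

Because the slices are applied in the same order as the original layer list, the distributed result matches single-machine execution exactly; the design trade-off is the extra network hop per slice, which is why the article notes modest response-time costs.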

Privacy, sovereignty and sustainability

According to EPFL, large language models can be deployed on a small number of standard machines equipped with commodity GPUs, at a fraction of the cost typically associated with specialized AI server infrastructure. Installation can be completed in under an hour, with all data remaining inside the local network, ensuring privacy and data sovereignty.

The implications extend beyond security. AI inference is estimated to account for up to 90% of AI-related computing power, driving the rapid expansion of energy-intensive data centers. By enabling distributed, local inference, Anyway Systems could help reduce the environmental footprint of AI deployments while maintaining accuracy, albeit with modest trade-offs in response time.

The technology builds on years of research in distributed systems, fault tolerance and optimization at EPFL, with roots in earlier work on blockchain and decentralized computing. Recently, Anyway Systems was selected as one of six inaugural grantees of the Startup Launchpad AI Track powered by UBS, supporting its transition from research to market.