Smaller, Greener, Local: The AI Strategy Challenging Big Tech

A David-versus-Goliath shift is quietly taking shape in artificial intelligence, as a small Nordic startup sets out to challenge Big Tech’s dominance of cloud-based large language models (LLMs).

Copenhagen-based open-source company NobodyWho has raised €2 million in pre-seed funding to accelerate its vision of running Small Language Models (SLMs) directly on users’ devices, bypassing the cloud altogether.

Backing for a European Sovereign AI Alternative

The round was backed by PSV Tech, The Footprint Firm, and Norrsken Evolve, reflecting growing investor interest in decentralised, energy-efficient and privacy-first AI infrastructure. NobodyWho’s ambition is explicitly European: to offer a sovereign alternative to AI systems controlled by a handful of non-European hyperscalers.

Built on a Decade of Local AI Experience

NobodyWho builds on almost a decade of experience in local AI technologies and is the brainchild of award-winning entrepreneur and artist Cecilie Waagner Falkenstrøm, who serves as the company’s CEO. Falkenstrøm has worked in interactive AI since 2016 and, together with co-founder and CTO Asbjørn Olling, has led projects ranging from UN-commissioned initiatives to a 2021 edge-AI experiment aboard the International Space Station.

The Limits of Cloud-Based AI

Today’s dominant AI paradigm relies on massive cloud-hosted models that require enormous computational resources, constant internet connectivity, and the continuous transfer of user data to third-party servers. According to NobodyWho, this model drives up costs, creates vendor lock-in, increases emissions, and leads to a structural loss of European data sovereignty.

A Device-First Approach to AI

NobodyWho takes a different approach. Its open-source engine allows SLMs to run locally on laptops and mobile devices, so user data never leaves the device. This device-first architecture aligns closely with privacy and security principles by eliminating reliance on centralised cloud infrastructure.

“These models are still large by most standards,” says Falkenstrøm, “but they are far smaller than systems such as ChatGPT. They are equivalent to the first generation of large models — and they are more than sufficient for a plethora of real-world scenarios.”

Security, Resilience and Cost Advantages

Local execution also brings architectural resilience. Instead of concentrating computation in central servers vulnerable to disruption, inference is distributed across thousands or even millions of devices. Cost dynamics shift as well: users provide their own hardware, meaning there is no growing cloud inference bill as usage scales. This opens advanced AI to NGOs, public-sector organisations and early-stage startups previously priced out of cloud-based models.
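The cost dynamic described above can be made concrete with a back-of-envelope comparison. The figures below (per-query API price, queries per user) are hypothetical placeholders, not numbers from the article; the point is the shape of the curves, not the amounts.

```python
# Illustrative comparison: cloud inference bills grow linearly with usage,
# while on-device inference has no marginal serving cost once the model
# ships with the app. All pricing figures are hypothetical.
CLOUD_COST_PER_QUERY = 0.002        # assumed $/query for a hosted LLM API
QUERIES_PER_USER_PER_MONTH = 300    # assumed usage per active user

def monthly_cloud_bill(users: int) -> float:
    """Cloud serving cost scales with total query volume."""
    return users * QUERIES_PER_USER_PER_MONTH * CLOUD_COST_PER_QUERY

def monthly_local_bill(users: int) -> float:
    """On-device inference: users supply the hardware, so marginal cost is zero."""
    return 0.0

for users in (1_000, 100_000, 1_000_000):
    print(f"{users:>9} users  cloud: ${monthly_cloud_bill(users):>12,.2f}"
          f"  local: ${monthly_local_bill(users):,.2f}")
```

Under these assumptions the cloud bill for a million users runs to hundreds of thousands of dollars a month, while the local-inference bill stays flat, which is the scaling argument the company is making.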

Cutting AI’s Carbon Footprint

Environmental impact is another major driver behind local AI adoption. Training and running LLMs has become one of the fastest-growing sources of emissions in the tech sector. NobodyWho claims its local-first SLM approach delivers up to 100x lower training footprint and as much as 500x lower inference footprint, positioning the company at the intersection of AI and climate technology.
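To see what the claimed multipliers imply, the sketch below applies the stated best-case factors (100x for training, 500x for inference) to baseline emissions figures. The baselines are invented placeholders for illustration only; the article provides no absolute emissions data.

```python
# Back-of-envelope application of the company's claimed reduction factors.
# Baseline values are hypothetical placeholders, not data from the article.
TRAINING_REDUCTION = 100     # claimed best-case factor vs. cloud LLM training
INFERENCE_REDUCTION = 500    # claimed best-case factor vs. cloud LLM inference

baseline_training_tco2 = 500.0   # assumed tCO2e to train a large cloud model
baseline_inference_gco2 = 4.0    # assumed gCO2e per cloud-served query

slm_training_tco2 = baseline_training_tco2 / TRAINING_REDUCTION
slm_inference_gco2 = baseline_inference_gco2 / INFERENCE_REDUCTION

print(f"SLM training:  {slm_training_tco2} tCO2e (vs {baseline_training_tco2})")
print(f"SLM inference: {slm_inference_gco2} gCO2e/query (vs {baseline_inference_gco2})")
```

Even if the real-world factors land well below these best-case claims, the direction of the argument is clear: per-query emissions shrink by orders of magnitude when inference moves on-device.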

Open Source at the Core

Open source underpins the company’s strategy. NobodyWho keeps its inference engine, libraries and developer integrations fully open, focusing on infrastructure rather than proprietary models. Its engine supports more than 10,000 open-source language models across multiple devices and operating systems.

“The models are already there,” says Falkenstrøm. “The real bottleneck is making them practical to deploy.”

Making Local AI Developer-Friendly

To lower barriers for non-ML specialists, NobodyWho integrates directly with major developer frameworks. With newly launched Python support, the company aims to make running a local language model as simple as adding a standard software dependency.

An Open-Core Business Model

Revenue comes through an open-core approach. While core tools remain free, NobodyWho offers paid fine-tuning services, allowing teams to avoid managing costly compute infrastructure themselves. Once fine-tuned, models can be deployed to millions of devices without additional inference costs.

Growing Developer Adoption

Momentum is already building. More than 5,000 developers are using NobodyWho via GitHub, supported by an active Discord community that contributes feedback and use cases.

A Different Path for Europe’s AI Future

For Falkenstrøm, the strategy reflects both pragmatism and values. “Europe will not win the ‘bigger is better’ race against the US or China,” she says. “But smaller, local models create a different way to compete — one grounded in privacy, sustainability and democratic control.”

As demand grows for affordable and privacy-respecting AI, NobodyWho is betting that the future of artificial intelligence will be shaped not by size, but by decentralisation — and by models that run where the data is stored.
