GEA

Proposal · Education as the twelfth Moonshot

The European Moonshot for Education

AI-powered learning as sovereign infrastructure

AI in education is an infrastructure question, and the infrastructure has already been decided — just not by any government. In mid-2025, Google embedded LearnLM into the productivity suite already used by 170 million students and educators across more than 230 countries and territories. No learner opted in. No pedagogical authority reviewed it. A school administrator toggled a setting, and the largest AI tutoring deployment in history was live.

No one decided

The MFF 2028–2034 proposal lists eleven Moonshot projects — fusion energy, quantum computing, clean aviation, next-generation AI, data sovereignty, automated transport, regenerative therapies, the space economy among them. The system that trains the scientists, engineers and physicians for all eleven — the floor on which every other Moonshot stands — is absent. Education is not on the list.

The activation decision was architecturally embedded in a prior platform decision. Google Workspace for Education had been adopted through procurement processes, data processing agreements and, in Europe, data protection impact assessments (DPIAs). When LearnLM arrived as a feature update inside that platform in mid-2025, no new procurement decision was required. No new pedagogical review. No new DPIA. The governance mechanisms that would normally apply to a new educational technology deployment never engaged.

The engineering is serious. In controlled studies, students using LearnLM under human tutor supervision were 5.5 percentage points more likely to solve novel problems than those working with human tutors alone. The deployment at scale is an accomplishment that deserves honest acknowledgement. But the architecture decision that embedded this technology into a generation's daily learning was made without anyone deciding.

170 million students. No one decided.

Two architectures, one fork

The OECD has measured what happens when general-purpose AI enters education without intentional pedagogical architecture. Students with access to general-purpose generative AI produce higher-quality outputs — better-structured essays, more complete problem sets. But when those students sit exams without AI access, the advantage disappears. In some cases it reverses. The OECD's language is precise: "Offloading cognitive tasks to general-purpose chatbots creates risks of metacognitive laziness and disengagement."

The students performed more. They did not learn more. Purpose-built educational AI with intentional pedagogical design — tools that scaffold thinking rather than replace it — shows sustained improvements that persist when the AI is removed. LearnLM is closer to this category than to a general-purpose chatbot; its Socratic scaffolding is real. But effective tutoring without data sovereignty, longitudinal profiles, auditable knowledge governance, institutional permanence and offline capability is a partial answer to a structural question.

Five structural constraints define what no platform company — regardless of engineering talent — can deliver. First, data sovereignty: a policy promise that student data will not train models is reversible by a terms-of-service update; an architecture where the cognitive profile is processed only on the learner's local system, with no interface through which it could be bulk-extracted, requires a fiduciary structure no company whose revenue depends on data aggregation can credibly adopt. Second, the longitudinal profile: a learning record that accumulates over years is safe only inside that fiduciary structure, never as a platform asset. Third, knowledge governance: no platform maintains a structured, auditable knowledge graph separating universal scientific consensus from culturally sovereign content. Fourth, institutional permanence: a publicly traded corporation has fiduciary duties to shareholders; if a tutoring product becomes premium, is discontinued or changes its pedagogical logic, no governance body prevents it. Fifth, offline capability: everything is cloud-based — for the 272 million children out of school, many without reliable electricity, these tools do not exist.

These are not failures of execution. They are consequences of structure. A company that monetises attention cannot build a system designed to make itself unnecessary. A company governed by shareholder return cannot place a learner's cognitive profile beyond its own reach. A company that operates in the cloud cannot serve a child who has no connection. The gap between platform products and trustee infrastructure is not a quality gap. It is a category difference.

Whoever operates the infrastructure must not control the knowledge.

Architecture decisions, once the installed base is built, do not reverse. The generation now in school will either own its learning or rent it. That choice is being made — by default, in admin consoles, one toggle at a time.

Eleven doors. No floor.

The MFF 2028–2034 proposal funds fusion energy, quantum computing, clean aviation, next-generation AI, data sovereignty, automated transport, regenerative therapies and the space economy, among others. Eleven Moonshot projects in total. Every one of them depends on people the current education system does not produce in sufficient number — scientists for fusion, engineers for quantum computing, physicians for regenerative therapies.

The Commission has built the chain link by link. GDPR was the answer to data misuse. The AI Act was the answer to uncontrolled artificial intelligence. The Chips Act was the answer to semiconductor dependence. Each Moonshot recognised a domain where dependence on external architectures posed a structural risk. Cognitive sovereignty — control over the systems that shape how a generation learns to think, what knowledge they encounter, and who holds the record of their intellectual development — is the next link in that chain. Not completing it is not caution. It is incoherence.

The economics compress into two facts. For two hundred years, Baumol's cost disease meant every additional student required an additional teacher — costs grew linearly with enrolment. This architecture breaks that logic: the marginal cost of an additional learner approaches zero. And the cost of the status quo is not stagnation but haemorrhage — the World Bank estimates twenty-one trillion dollars in lost lifetime earnings for the current generation; Hanushek and Woessmann calculated that raising OECD PISA scores by twenty-five points over twenty years would yield between one hundred fifteen and two hundred sixty trillion dollars in GDP gains. The EU invested seventy billion euros in education through the Recovery and Resilience Facility. PISA scores declined.
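The Baumol contrast can be made concrete with a toy cost model. The unit costs below are hypothetical placeholders, not figures from this paper; only the structural difference matters: linear total cost in the teacher model versus a large fixed cost with near-zero marginal cost in the infrastructure model.

```python
# Toy model of the Baumol contrast. All numbers are illustrative placeholders.

def teacher_model(students, cost_per_student=1.0):
    # Traditional provision: every additional learner needs additional staff,
    # so total cost grows linearly with enrolment.
    return students * cost_per_student

def infrastructure_model(students, fixed=1_000_000.0, marginal=0.001):
    # Shared infrastructure: one large fixed cost, near-zero marginal cost.
    return fixed + students * marginal

for n in (1_000_000, 10_000_000, 100_000_000):
    avg = infrastructure_model(n) / n
    print(f"{n:>11,} learners: average cost per learner = {avg:.4f}")
```

In the teacher model the average cost per learner is constant regardless of scale; in the infrastructure model it falls toward the marginal cost as enrolment grows, which is the precise sense in which the marginal cost of an additional learner approaches zero.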

The technology that makes AI-powered education universal is arriving regardless of any policy decision. Inference costs for GPT-3.5-level performance fell 280-fold in under two years. Edge AI hardware doubles in performance every 1.9 years on the same form factor. Satellite connectivity is projected to reach global coverage by 2028 at five to ten dollars per month in developing countries. Every learner on Earth will have an AI tutor within a decade. What remains undecided is the architecture.
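Those trend figures compress into a planning horizon with simple arithmetic. A minimal sketch using only the numbers quoted above, and assuming both trends simply continue, which is itself an assumption:

```python
# Back-of-envelope projection from the figures quoted above.
# Assumption: both trends continue unchanged over the horizon shown.

doubling_time_years = 1.9            # edge AI performance doubling period
horizon_years = 10                   # "within a decade"
perf_multiplier = 2 ** (horizon_years / doubling_time_years)
print(f"Edge performance over {horizon_years} years: ~{perf_multiplier:.0f}x")

cost_drop = 280                      # inference cost fall, GPT-3.5-level performance
period_years = 2                     # "in under two years": take two years as a floor
annual_factor = cost_drop ** (1 / period_years)
print(f"Implied inference cost decline: at least ~{annual_factor:.0f}x per year")
```

On these assumptions, same-form-factor edge hardware improves roughly 38-fold over a decade, and inference costs fall by at least an order of magnitude per year — compounding rates at which today's cloud-only frontier becomes tomorrow's offline commodity.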

Eleven doors. No floor.

The sovereignty chain

Europe has built sovereignty in three layers — chip, data, digital. The fourth, cognitive, is missing. Without it, the other three protect a perimeter inside which no one decides what learners learn.

[Figure: the sovereignty chain. Chip: built. Data: built. Digital: built. Cognitive: proposed.]
Three layers of sovereignty exist. The fourth is the gap.

More than seventy per cent of European cloud infrastructure runs on US hyperscalers. The CLOUD Act allows US authorities to demand data from US companies regardless of where the servers are located. When the International Criminal Court's chief prosecutor was locked out of his Outlook account following US sanctions, the institution replaced Microsoft Office entirely. Now consider what a sovereign education architecture holds: not spreadsheets but how a person thinks, where they struggle, which memories matter. Storing this on CLOUD Act infrastructure means delegating cognitive sovereignty over a generation to a foreign jurisdiction.

The architecture requires, at its foundation, that no entity other than the learner can access their cognitive profile. No regulation forbids access — the architecture ensures that no entity other than the learner's local system processes the cognitive profile, and no interface exists through which it could be bulk-extracted or queried by third parties. The most intimate data about a human being is not a medical history. It is the record of how their mind learns.

A state that demands access gets nothing — not because a contract forbids it, but because no interface exists.

This single architectural requirement eliminates two of the three global powers capable of building at scale. The US technology sector operates under business models that treat learning data as a platform asset. China operates under a state model that treats learning data as a governance instrument. Europe remains. Not by aspiration — by elimination.

Why Europe — and why now

Europe is the only actor that combines a regulatory framework making data sovereignty a legal baseline, existing compute infrastructure, a seven-year budget mechanism capable of sustained commitment, and institutional precedent for building sovereign infrastructure through multilateral cooperation. A coalition of middle powers or a UNESCO-led initiative could contribute — the CERN model anticipates exactly this: European initiation leading to global participation. But initiation requires an actor with both institutional capacity and the political will to move first.

The obvious objection is Europe's mixed track record on large-scale digital infrastructure. Gaia-X was supposed to deliver European cloud sovereignty and largely stalled, captured by the hyperscalers who joined the consortium. The European Health Data Space is years behind schedule. But Gaia-X failed because it tried to compete with hyperscalers on their own terms — general-purpose cloud infrastructure. This project does not compete with Google on general-purpose AI. It builds specific-purpose infrastructure for a domain where the institutional requirements — data fiduciary, knowledge graph governance, offline operation, progressive de-adaptation — are structurally incompatible with the platform model.

The political window is concrete and closing. The MFF 2028–2034 trilogue is underway. Final agreement is expected by the end of 2027. Once agreed, budget lines are fixed for seven years. Horizon Europe proposes one hundred seventy-five billion euros, the European Competitiveness Fund's Digital Leadership Window adds fifty-four point eight billion, Erasmus+ stands at forty point eight billion. Education is absent from the Moonshot list. Every year of delay is also a year in which the commercial installed base grows by tens of millions, making the alternative harder to build and the lock-in harder to reverse.

Three commissioners' portfolios converge on this project. Mînzatu holds Skills and Education — Erasmus+ at forty point eight billion in the proposed MFF. Zaharieva holds Research and Innovation, with Horizon Europe at one hundred seventy-five billion. Virkkunen holds Tech Sovereignty — the AI Act, digital sovereignty, the infrastructure agenda. The institutional bridge exists at the personal level. What does not exist is the project that walks across it.

Precedent: CERN, Galileo, GSM

December 1949. Louis de Broglie stands before the European Cultural Conference in Lausanne and proposes what no single nation can afford: a laboratory for particle physics at the scale the science demands. Europe's best physicists have already left for Brookhaven. Governments hear "nuclear research" and think of the bomb. The proposal stalls. Six months later, Isidor Rabi introduced a resolution at the fifth UNESCO General Conference. In February 1952, eleven countries signed the provisional council agreement. In September 1954, CERN existed. Five years from idea to organisation. Today: twenty-five member states, more than twelve thousand visiting scientists from over eighty countries, and — as a side product — the World Wide Web, invented by Tim Berners-Lee in 1989.

Galileo was proposed not because GPS was technically inadequate, but because critical navigation infrastructure should not depend on a single foreign power's military system. Costs escalated to approximately twenty-two billion euros over twenty years. At multiple points the project was declared nearly dead. Each time, member states voted to continue. Civilian-controlled satellite navigation now serves nearly four billion users.

In the early 1980s, each European country operated its own mobile telephone standard. Calls stopped at borders. The United States chose competing proprietary standards; the market fragmented for a decade. Europe chose differently: GSM, an open standard any manufacturer could implement, any country could adopt. The decision was political, not technological. GSM became the global standard.

Three precedents, one pattern. The lesson is not that Europe builds smoothly. It is that when Europe decides critical infrastructure must not depend on a foreign power, the infrastructure gets built — even when the path is difficult. CERN did not try to out-fund Brookhaven. It built something Brookhaven's institutional form could not build.

The ask

The demand is specific. Education as a twelfth Moonshot project in the MFF 2028–2034 — not as an extension of Erasmus+, but as infrastructure on the level of fusion energy and quantum computing. The budget category exists. The architecture is described in a companion paper (Pochmann 2026). The first step is not to build the system. It is to commission the feasibility study — the same step that moved CERN from an idea at a cultural conference to a provisional council in under three years.

A feasibility study is a decision to look. Refusing to look is also a decision. The technical capacity for a first phase exists: Apertus, released by ETH Zürich, EPFL and CSCS in September 2025, delivers open models at eight billion and seventy billion parameters, trained across more than one thousand languages and compliant with the EU AI Act. Nineteen AI Factories have been selected across Europe; five Gigafactories are in planning. For a pilot, Europe has enough.

The demand is already being articulated — in fragments. UNESCO's Giannini warns that commercial AI risks eroding the human practices education depends on. The UN Special Rapporteur demands regulation and acknowledges the UN has no technical plan. More than fifty African states signed the Africa Declaration on Artificial Intelligence in Kigali, demanding sovereign computing infrastructure. Each sees a piece. No one has assembled them.

AI-powered education is arriving with the same inevitability with which GSM unified a continent's networks. Whether the tutor that accompanies a generation operates as sovereign infrastructure or as a platform product is not a technology question. It is a question of who holds the record of how a generation learned to think.

One path produces citizens who own their learning. The other produces users who rent it.
Moonshot paper, in full. DOI 10.5281/zenodo.18759299, CC BY 4.0. Companion: Architecture paper, DOI 10.5281/zenodo.18759134.