Thin sovereignty, thick autonomy

Illustration: Vahram Muradyan

13 February 2026

Europe faces an intense cyber threat landscape in which both state-sponsored and non-state actors endanger not only data confidentiality but also the functioning of critical infrastructure, democratic institutions, and everyday public services on which hundreds of millions depend.

Yet Europe’s response to these cybersecurity challenges is increasingly entangled with a broader project of digital sovereignty, driven by valid concerns about American and Chinese technological dominance, alarm over foreign interference and online harms, and a desire to assert regulatory authority and strategic autonomy. The resulting political impulse often equates security with control: treating the protection of European societies as synonymous with controlling digital infrastructure, excluding or restricting non-European providers, and reasserting state authority over digital spaces. But this can misdiagnose what cybersecurity requires: resilience, options, and cooperation among European and non-European partners.

Some degree of digital sovereignty is essential for law enforcement, national security, and constitutional legitimacy. But digital sovereignty is neither achievable nor desirable for Europe at scale, and it does not, in itself, reliably support Europe’s cybersecurity interests, which are rooted in the resilience and continued functioning of digitally dependent societies. An overemphasis on sovereign control risks fragmentation, inefficiency, and strategic self-constraint, resulting in greater vulnerability. Europe should therefore pursue a design that keeps sovereignty thin and targeted while building thick autonomy, positioning it to deal effectively with persistent digital stress.

Sovereignty versus autonomy

The digital sovereignty debate often conflates authority with capability. Sovereignty concerns enforceability – who can compel access, impose obligations, and exercise jurisdiction. Digital sovereignty is best understood as ultimate authority over digital infrastructure and data: the capacity to enforce rules and to govern citizens’ lives in the digital domain. Digital autonomy, by contrast, concerns performance under stress – the ability to maintain digital functions under adverse circumstances.

‘Thin’ sovereignty therefore refers to narrowly scoped legal and political control, with ‘thick’ autonomy denoting the practical operational capacity to adapt, recover, and continue functioning under pressure. For cybersecurity, resilience depends less on ownership or jurisdiction than on options, redundancy, and recoverability. Here, autonomy preserves options, while sovereignty constrains them.

Digital sovereignty has become politically compelling in Europe because it bundles several agendas – platform power, online harms, economic competitiveness, and geopolitical risk – into a single intuitive promise: ‘take back control.’ This resonates for two main reasons. First, very large technology companies increasingly exercise quasi-governmental influence over infrastructure, standards, and information environments, blurring the line between private product decisions and public governance. Second, Europe depends on foreign technology providers, including those from the US and China, whose policies are not always consistent with Europe’s rights-based and regulatory approach to governing digital technologies. 

Europe cannot realistically replicate the scale, capital intensity, and global ecosystems of US and Chinese hyperscalers in the foreseeable future, making exclusive reliance on European platforms neither feasible nor sustainable. This constraint makes autonomy – not control – the most plausible organising principle for resilience at scale. While EU policy already invokes ‘strategic autonomy,’ practice too often equates it with sovereignty – treating autonomy as infrastructure ownership rather than operational capability. 

Europe’s cybersecurity interests

At its core, cybersecurity is about the functioning of society itself. Modern economies, public services, democratic processes, and societal trust depend on the reliable operation of digital systems. By protecting the confidentiality, integrity, and availability of digital systems, cybersecurity helps ensure that data can be trusted, services remain accessible, and institutions can operate predictably. When cybersecurity fails, the consequences are immediate and material: hospitals are disrupted, markets stall, public trust erodes, and governments struggle to deliver basic services. Cybersecurity therefore serves a deeper interest than technical risk reduction – it underpins societal resilience.

EU initiatives like NIS2, DORA, and the Cyber Resilience Act reflect this logic of resilience engineering, framing cybersecurity as a matter of economic security and societal resilience rather than purely defence. The objective is not invulnerability, but the ability of the single market and public institutions to function predictably under persistent threat. When resilience is the benchmark, the test for sovereignty is straightforward: does it improve Europe’s ability to function when systems are stressed, or does it merely redistribute legal authority without strengthening operational capacity?

Does digital sovereignty support Europe’s cybersecurity interests?

Digital sovereignty supports cybersecurity in narrow, high-sensitivity domains where authority must be technically absolute, such as classified systems, lawful access, and continuity-of-government functions. In these cases, foreign exposure directly undermines security. This approach is increasingly recognised in Europe and beyond: Canada’s emerging digital policy similarly prioritises operational resilience while reserving strict control for core state functions.

Beyond core high-sensitivity domains, treating sovereignty as a system-wide objective undermines cybersecurity for three reasons. First, sovereignty-at-scale risks fragmentation: divergent national approaches to ‘sovereign’ infrastructure, procurement, and standards reduce interoperability and slow collective detection and response in a threat environment that rewards speed and coordination. This problem is structural: the EU is not a unitary state, and ‘European’ authority over digital systems is split across member states with different threat perceptions, legal powers, and levels of trust – conditions that make a unified sovereign stack difficult to design and even harder to operate. 

Second, thick sovereignty is often inefficient: duplicating large-scale digital infrastructure, like data centres and cloud platforms, without global scale drains resources and slows innovation. Gaia-X, the European sovereign cloud initiative launched in 2020, has struggled with precisely this problem. Despite substantial investment, it remains fragmented across national implementations and has limited market adoption. Third, and most paradoxically, maximalist sovereignty produces self-constraint. In seeking to avoid dependence, Europe narrows its own options by excluding useful technologies, weakening alliances, and entrenching new forms of lock-in, thereby reducing the very autonomy that resilience requires.

The right question is therefore not whether Europe should be digitally sovereign, but where sovereignty is indispensable, where digital autonomy delivers better cybersecurity outcomes, and how to manage the inevitable trade-offs between legal compliance, operational continuity, and security effectiveness. In practice, however, many sovereignty cases are driven by jurisdiction and rights, not by technical insecurity. 

Jurisdictional sovereignty, cyber risk, and the limits of control: FISA Section 702 and the case of Denmark

Many digital sovereignty efforts are driven not by cybersecurity concerns but by legal requirements, especially around data protection and jurisdiction. Legality and cybersecurity do not automatically align: a policy can reduce jurisdictional exposure while still posing cybersecurity challenges like operational fragility, transition risk, or weakened defence.

The legal controversy surrounding Section 702 of the US Foreign Intelligence Surveillance Act, which permits US intelligence agencies to compel American technology companies to provide access to non-US persons’ data, illustrates why digital sovereignty concerns are not merely rhetorical. Reinforced by the US CLOUD Act, Section 702 raises genuine concerns about foreign jurisdictional reach, to which the European Court of Justice responded in its Schrems II ruling, limiting data flows between Europe and the US. But Section 702 does not vindicate maximalist sovereignty. The risks it exposes are jurisdictional (intelligence access, constitutional protections), not operational (ransomware, supply-chain attacks, availability failures). Excluding major platforms to address intelligence-access concerns may therefore weaken defences against more immediate threats while doing little to achieve meaningful autonomy in practice.

Denmark’s post-Schrems II restrictions on Google Workspace and Microsoft 365 in schools and parts of the public sector illustrate the limits of this approach. These decisions reflected legitimate GDPR compliance concerns about children’s data and foreign legal compulsion – not claims about technical insecurity. Similar restrictions by data protection authorities in Austria, France, and Italy show how shared EU legal obligations, not national sovereignty agendas, have driven convergent reactions.

Yet Denmark’s experience also highlights why sovereignty-driven responses do not automatically strengthen cybersecurity: Danish restrictions imposed transition costs and workflow disruptions, and likely also had second-order security effects, including impacts on patching continuity, identity management, and incident-response readiness. By extending sovereignty logic to routine administrative systems – rather than clearly distinguishing them from genuinely high-sensitivity state functions – Denmark narrowed its own options and accepted new operational constraints, without materially improving security. 

Thin sovereignty and thick autonomy as policy

If Europe’s cybersecurity interest is ultimately the continuity of digitally dependent societies, then digital sovereignty should be treated as a tool in a resilience toolkit, not as the organising principle for the entire digital world. The practical design challenge is to selectively apply sovereign control where it materially protects legitimacy and high-consequence functions, while building autonomy at scale so Europe can still operate when dependencies fail. This design challenge can be translated into three practical principles: segment sovereignty by sensitivity, govern dependencies rather than exclude them, and build collective resilience.

Principle 1: Sovereignty by sensitivity – classify and segment 

To preserve both constitutional legitimacy and operational resilience, Europe should operationalise a small number of risk tiers tied to consequence, not geography. Tier 1 would include classified or continuity-of-government systems, requiring strict sovereignty – measures like EU- or national-only infrastructure, on-premises hosting, and software and hardware supply chain control. Tier 2 would comprise essential services, like healthcare, energy, finance, and transport. These are best treated by governing dependencies, for example, disclosing jurisdictional exposure and providing contractual and technical mitigations against foreign legal compulsion. Tier 3, consisting of general administration and consumer services, should face only minimal requirements – for example, service-provider portability and transparency obligations.

The EU Cybersecurity Certification scheme (EUCC), which requires service providers to disclose jurisdictional exposure and security controls, already provides the governance architecture for Tier 2. Europe now faces a political challenge: preventing ‘tier inflation’, whereby routine systems are classed as essential to avoid procurement friction. France’s restrictions on cloud services in education illustrate this risk: extending Tier 1 requirements to Tier 3 systems narrows options without improving security. This principle avoids that error by making sensitivity classifications auditable, evidence-based, and tied to operational consequence.

Principle 2: Governing dependencies – immunity, not residency

For the broad middle of the economy (Tiers 2-3), resilience comes from transparency and enforceability, not ownership. The EUCC operationalises this: providers of products and services seeking ‘substantial’ or ‘high’ cybersecurity assurance certification must disclose jurisdictional exposure (which foreign laws can compel data access?), document security architecture (eg, encryption, access controls, incident response), and accept contractual obligations that create ‘immunity’ from arbitrary foreign interference – for example, requiring providers to notify European customers of foreign legal demands and to exhaust legal challenges before compliance. The EUCC does not eliminate foreign jurisdiction, but it makes exposure visible, bounded, and mitigatable. This approach has precedent: Estonia’s X-Road platform – used across the Baltics and beyond – allows governments to use foreign digital infrastructure while maintaining cryptographic sovereignty: data is encrypted with keys held by national authorities, not service providers. Operational autonomy does not require infrastructure ownership; it requires transparent, enforceable conditions that preserve control even when infrastructure is foreign-owned. The EUCC creates this architecture; Europe must now enforce it consistently across member states.

Principle 3: Building collective resilience – autonomy through coalition

Thick autonomy is not isolationist. Europe’s resilience is strengthened by interdependence with trusted partners, not by replicating full digital stacks within its borders. First, coordinated procurement can increase collective switching power. When multiple governments align procurement requirements, especially around portability and interoperability, vendors face stronger incentives to reduce lock-in without Europe having to own infrastructure itself. Second, federated threat intelligence and incident response amplify resilience at scale. Europe’s computer security incident response team (CSIRT) networks already benefit from close cooperation with international partners; enhancing interoperability for major incidents would convert episodic coordination into durable surge capacity. Third, greater recognition of baseline security certifications reduces compliance friction while preserving security standards, allowing vendors to serve multiple trusted markets without fragmenting defensive ecosystems.

The limits of sovereignty

Europe’s digital sovereignty impulse is understandable, but, as a cybersecurity strategy, it works only when bounded. Sovereignty is indispensable where authority must be absolute: constitutional legitimacy, classified systems, and continuity of government. Treated as a system-wide project, however, it can erode resilience by fragmenting interoperability, draining resources, and narrowing operational options. Europe’s cybersecurity interest is not control for its own sake, but the ability to function under stress. That requires thick autonomy: governable dependencies, operational portability, and collective response capacity – within Europe and with trusted partners. Designed as autonomy rather than control, sovereignty can protect legitimacy without undermining resilience.

The views expressed in this essay are those of the author and do not reflect the policy or position of NATO or any other organisation with which the author is, or has been, affiliated. All information used to produce this essay is from publicly available sources.

Read the other 2025-2026 Binding Hook-Munich Security Conference Essay Prize Competition winners here.