Technology and Power — Essay Series
How emerging technologies reshape economic power, governance and global competition.
AI Infrastructure and the New Front Line: Cybersecurity, Sovereignty and the Future of UK Resilience
The core issue
The UK’s primary national security exposure no longer lies in physical infrastructure alone.
It is the AI-enabled information systems that underpin the economy, democracy, and defence.
Cybersecurity and resilience policy must now treat artificial intelligence infrastructure as core national security infrastructure.
Power implications
- AI infrastructure is becoming sovereignty infrastructure
- Speed of AI deployment is now a security variable
- Information integrity is a core resilience frontier
Note: ‘Deployment’ does not refer only to the speed at which AI is integrated into defence and military systems. It also encompasses the pace of AI adoption across civilian critical infrastructure (such as energy, telecommunications and digital networks) and in protecting the information environment (for example, countering disinformation and cyber threats). The speed of deployment across these interconnected domains now directly affects national resilience and security.
Introduction: a shifting attack surface
As the UK Parliament considers the Cybersecurity and Resilience Bill, a structural shift in national security must be recognised. The primary attack surface facing advanced economies is no longer confined to physical assets or traditional cyber networks. It now includes the artificial intelligence–enabled information systems that shape decision-making, public discourse and operational capability across the state and economy.
Cybersecurity can therefore no longer be treated as a discrete technical domain. It has become inseparable from the governance of artificial intelligence, the ownership of digital infrastructure and the integrity of information environments.
The front line of national resilience now runs through AI-enabled information systems.
These reflections draw on parliamentary discussions, including an APPG AI roundtable held in the UK Parliament on artificial intelligence in defence, cybersecurity and national resilience, together with wider expert dialogue across the defence, policy, research and industry communities on the implications of AI-enabled systems for national security and democratic stability.
The Cybersecurity and Resilience Bill provides a critical opportunity to align legislative frameworks with this emerging reality.
From cyber defence to systemic resilience
The Bill’s focus on strengthening cyber resilience across critical national infrastructure is timely. Yet evidence across defence, policy and technology communities indicates that cybersecurity can no longer be understood as a standalone technical function. It is now structurally linked to:
- artificial intelligence deployment and governance
- data and compute infrastructure
- cloud ownership and control
- information integrity and democratic resilience
Adversaries increasingly operate across these domains simultaneously. Cyber intrusion, disinformation, economic pressure and AI-driven influence campaigns form part of a continuous spectrum of strategic activity.
The distinctions between peace and conflict, civilian and military, and influence and attack have become progressively blurred.
A resilience framework grounded in 20th-century assumptions risks failing to address this integrated threat environment.
AI infrastructure as sovereignty infrastructure
Artificial intelligence is no longer an auxiliary capability layered onto existing systems. It is rapidly becoming part of the infrastructure through which sovereignty itself is exercised.
Control over:
- data
- compute
- cloud architecture
- and advanced AI models
increasingly determines the capacity of states to act autonomously, coordinate with allies and maintain operational readiness.
Where critical AI infrastructure is privately owned, externally controlled or governed through closed architectures, strategic dependence can emerge regardless of formal regulatory authority. This has implications not only for defence and intelligence operations but also for economic resilience and democratic governance.
The central policy question is therefore not simply how systems are protected, but:
- resilience for whom, and under whose control?
Speed as a security variable
A further structural vulnerability lies in the pace at which advanced capabilities can be adopted and deployed. In an environment where adversaries innovate continuously, slow procurement and deployment processes create exposure.
Many public-sector procurement frameworks remain designed for static hardware acquisition rather than adaptive, learning software systems. The result is a recurring pattern: promising pilot programmes that fail to scale, innovations that do not reach operational use, and delayed adoption that reduces deterrence.
In high-tempo digital environments, speed of deployment becomes a core dimension of cybersecurity.
Resilience therefore depends not only on preventing breaches but on maintaining operational tempo – the capacity to detect, decide and respond at machine speed. Procurement reform, institutional agility and technological integration must be understood as central components of national cyber defence.
Information warfare and democratic resilience
Perhaps the most significant and under-addressed vulnerability lies in the integrity of the information environment itself. Advanced AI systems now enable sustained influence operations capable of shaping narratives, perceptions and social cohesion at scale.
Such operations typically occur below the threshold of armed conflict yet can have destabilising effects on democratic processes and public trust. Responsibility for defending information integrity remains fragmented across institutions, creating gaps in national resilience.
Cybersecurity frameworks that focus narrowly on system integrity, without addressing information integrity, risk overlooking one of the most consequential dimensions of contemporary strategic warfare.
Artificial intelligence infrastructure is becoming a core determinant of national security and democratic resilience.
Ethics, accountability and meaningful control
Ethical governance of AI in defence and security contexts is often framed as a constraint on innovation. In practice, it is a prerequisite for legitimacy and resilience. Effective oversight requires more than symbolic human involvement; it depends on meaningful human control, clear accountability and sovereign oversight of critical infrastructure.
In high-tempo operational environments, poorly designed human-in-the-loop arrangements can increase rather than reduce risk. Resilience depends not only on whether humans remain involved, but on how decision authority and responsibility are structured within complex systems.
Ethics, accountability and operational effectiveness must therefore be addressed together rather than treated as separate domains.
Implications for the Cybersecurity and Resilience Bill
For the UK’s legislative framework to remain effective in an AI-enabled security environment, several principles should be explicitly recognised:
- Cyber resilience is inseparable from artificial intelligence capability and governance
- AI infrastructure now constitutes critical national infrastructure
- Speed of technological adoption is a core security variable
- Information integrity is central to democratic resilience
- Sovereign oversight of digital infrastructure underpins accountability and strategic autonomy
The UK possesses significant expertise across defence, technology and policy communities. The challenge is not one of insight but of institutional alignment: ensuring that legislative frameworks reflect the reality that national resilience increasingly depends on AI-enabled information systems.
Conclusion: securing the new front line
The nature of national security is changing. The systems that shape economic activity, public discourse and defence capability are now deeply intertwined with artificial intelligence and digital infrastructure. Protecting these systems requires a shift from viewing cybersecurity as a technical domain to recognising it as a core component of national strategy.
The Cybersecurity and Resilience Bill offers an opportunity to make this shift explicit.
Resilience in the AI era will depend not only on defending networks, but on ensuring that the infrastructure of intelligence itself remains secure, accountable and aligned with democratic governance.
Technology and Power — Essay Series
This series explores how emerging technologies reshape economic structures, governance systems and global power relations.
Professor Birgitte Andersen is Professor of the Economics and Management of Innovation and leads research on the political economy of emerging technologies.





