Five Eyes in the Library of Babel
26 Jul 2019

Recently I considered the differences between American and European understandings of technology in a strategic context. There’s a further wrinkle to be teased out concerning a specific group of Western nations: those that belong to the Five Eyes alliance.

The Five Eyes, an intelligence-sharing arrangement among the US, the UK, Canada, New Zealand and Australia, owes its origins to World War II. Significantly, cooperation began with the sharing of intelligence on foreign communications—through interception, collection and analysis, and the associated cryptographic tools.

In those days, intelligence-gathering comprised collection by spies (human intelligence) and the interception of communications (signals intelligence). Human intelligence doesn’t scale well, and running agents typically puts lives at risk. Signals intelligence, by contrast, scales much more readily, and with the US the pre-eminent global technological hub for many years, the Five Eyes community derived a considerable advantage.

The Five Eyes is, at its core, a trust-based form of technological cooperation. Technologies collect the information, decrypt it and share it among the members—a deep-level enabler of cooperation and action in a competitive world. Once secret, the Five Eyes arrangement has become a public talisman, signifying membership of an elite club defined by technological advantage. Member nations are conscious of being in an inner circle, even within other alliance structures such as NATO, or the US ‘hub and spokes’ arrangement in Asia.

And for good reason. In the post-9/11 environment, the importance of identifying weak signals in an ever-noisier world has increased. And so has Five Eyes governments’ reliance on the technological prowess of their intelligence agencies to counter a growing array of threats.

It’s only reasonable, then, to assume that the Five Eyes partners have a sophisticated understanding of the significance of technology. On the face of it, in terms of intelligence and security, that appears to be the case. In the cyber domain, intelligence agencies pursue an offensive strategy, ranging from the discovery and exploitation of vulnerabilities for espionage to active attacks. Advice to the Five Eyes governments on 5G reflected the agencies’ technical know-how.

Yet information technologies are not inherently pro-Western in their strategic affiliation. They lend themselves to unconventionality, and so to the disruption of traditional business models. They’re asymmetric in nature, empowering weaker actors. They’re also dynamic and interconnected, so changes—whether in malware or in spying tools—propagate quickly through the broader environment. But technological advantage can be fleeting, and the democratisation of encryption capabilities means the intelligence agencies have had to rely on legislation to try to shore up their operations in their own backyards.

Long accustomed to the privileges of technological pre-eminence, Five Eyes countries now find themselves wrestling with a new question: is what’s good for their agencies good for them, or do they have a broader set of strategic needs in relation to information technology?

That question isn’t an implied criticism of a strong stance on Huawei or the Chinese government. In fact, it’s the opposite. My concern is that an intelligence- and security-driven focus may deny us a more comprehensive and more effective counter to authoritarian powers.

None of this is to downplay the difficulty of finding sense in Jorge Luis Borges’s Library of Babel. Intelligence and national security agencies still have a strong and urgent mission, but they’re struggling to apply 20th-century business models to a world of infinite data and computational capacity, one in which the technological centre of gravity resides deep in the civilian world, not in the military or national security domain.

Persisting with a 20th-century, security-led view of technology can lead us down some unintended paths. For example, security assumes threat-based models. We can count the ways: coercion (in its many forms), terrorism, cyber threats and, increasingly, interference and subversion. We are learning to fear not only our actual and potential adversaries, but our own systems and, increasingly, each other. Our ability to look after our own individual needs and security is eroded; instead, we are expected to turn to authority. Aside from the moral hazard that creates, it’s not healthy for our society. It risks a pervasive fatalism—what Timothy Snyder calls a ‘politics of eternity’—while encouraging rally-around-the-flag behaviour and undermining our ability to withstand authoritarianism.

A security mindset may also limit diversity and natural human messiness. That’s an issue with any system design: it’s easier to build systems for a standardised model of humans, as the Chinese Communist Party well knows. It’s also a reason why many technology systems fail. But activities seen through a narrow lens of security are likely to be assessed for their security utility rather than for exploration and innovation. That leaves us less able to identify opportunities, or to anticipate technological change and its consequences.

Lastly, a security focus foreshortens the future. It decreases our ability to think and act long term, because the concern is always about the here and now. In the constant now, even the smallest deviation grabs attention, demanding resources. It plays into the same dynamics that drive Twitter. It prevents us from seeing the deep ocean currents and long-term trends.

Decision-makers need to encourage diversity, creativity, and long-term thinking and action. The Five Eyes relationship has benefited from that approach in the past, and it will remain a valuable asset to its members. But we would do well to remind ourselves of its limits. We may find, for example, that we need to re-emphasise the human in security and intelligence. And we should remember that security isn’t a substitute for strategy, which frequently demands risk-taking.

In short, Five Eyes countries need to reset their understanding of technology, both to tolerate risk and to appreciate the value of diversity and resilience in their societies and economies.