What can supercomputers tell us about the Third Offset?
2 Jun 2017


Paraphrased, the US Third Offset strategy runs something like ‘bolstering US conventional deterrence through technological and operational innovation’. The strategy hinges on innovation, and sees the integration of autonomy into weapon systems and organisational constructs as a decisive advantage. But innovation is prized around the world, and it’s risky to pin hopes on a possible monopoly on innovation, or the ability to leverage innovation into military effects.

Last year Xi Jinping, recognising that ‘core technologies’ aren’t yet indigenously controlled in China, issued a clear directive for scientists to ‘respond to major strategic demands’. Substantial alignment between China’s state and technological sectors is increasingly evident, and strategically-targeted science funding is ramping up as part of the 13th five-year plan. Evidence in the public domain suggests that, at least in certain key areas, dividends are emerging. In the fields of artificial intelligence and quantum information science, China is ascendant on the world stage. There’s a similar story in supercomputing, which serves as a good case study, and it reminds us that the US isn’t the only nation attempting to flip technological innovation into operational capability.

High performance computing (HPC) is vital because of the ‘classic trinity’ of national security applications: cryptanalysis, weapons design and weather forecasting. Historically, the US has led the way, but several recent developments leave an American lead in serious doubt. The Top500 list indexes the fastest commercial supercomputers in the world; as of June 2016, China held the top two spots, and it has held the number one position since 2013. In June 2015, there were 233 American machines in the top 500, compared to 37 from China. In June 2016 China edged out the US to lead the tally, and most recently, in November 2016, there was parity, with both at 171 entries.

While the Top500 tracks commercial systems, applications like decryption require specialised, classified machines. The Intercept recently reported that detailed plans for an NSA supercomputer were unwittingly hosted on the open internet. The easily-accessible plans dated from 2012, and one expert guessed that the machine could ‘pretty much wipe the floor’ with past and current commercial systems for codebreaking. Still, classified machines don’t exist in a vacuum: like a pyramid, a broad commercial base of expertise supports the know-how that enables more sensitive undertakings.

Secret plans left lying around don’t help the cause, but, even if there’s more to that story, those in the know still worry that the strategic landscape is shifting. In a report released in March (PDF), HPC experts at the NSA and US Department of Energy argued that absent ‘aggressive action’ and a surge of investment, the US ‘will not control its own future’ in supercomputing. While that’s a not-so-subtle call for more funding, the technical meeting from which the report resulted was held before President Trump announced relevant budget cuts. With only ‘minor disagreement on the timescale,’ the consensus was that China would qualitatively lead the US from as early as 2020, particularly as hardware reflected China’s ‘impressive advancements in indigenously developed system software and HPC applications’.

Underlying the American assessment is the Chinese-developed and manufactured TaihuLight system, the fastest (publicly-revealed) supercomputer in the world. According to the NSA–DOE report, unlike previous Chinese supercomputers, which were ‘unimpressive except for running benchmarks,’ the TaihuLight system is a ‘serious, innovative “capability” machine’ that demonstrates ‘real-world applications running well at scale’.

Chinese researchers envisage the system predominantly as ‘a strategic capability for improving the country’, but US experts noted that heightened interest in HPC represents a ‘good proxy’ for the tools needed to design, develop and analyse many national security and modern weapons systems. In other words: ‘Leading-edge HPC is now instrumental to getting a world-class, large-scale engineering system out the door.’ The TaihuLight system demonstrates both a substantial will to operationalise the products of intense innovation—and the ability to do so successfully.

Other factors also argue against the idea that innovation can be quarantined. Commercial technology development increasingly precedes military requirements, and that’s a process that Beijing can direct far better than the US can. The indigenisation of China’s semiconductor industry, for example, has been rapid and thus far successful (the TaihuLight system uses all-Chinese chips), and was prioritised after US chipmakers embargoed certain supercomputing projects in China. That highlights the challenges the US will face in keeping technologies to itself. If Shenzhen, China’s ‘Silicon Delta,’ received orders to jump, there’d be no doubting the outcome. In a post-Snowden, Silicon Valley-dominated world, the Pentagon’s probably wishing it had that kind of clout.

What’s more, even as China builds its own innovative capability, it’s hedging its bets by investing in US startups and high-tech European firms. The scale of this effort is worrying the Pentagon, and that’s independent of large-scale IP theft and outright efforts to steal sensitive technology (PDF).

The US Third Offset strategy is ultimately contingent on the ability to innovate faster and better than the adversary, on the ability to leverage that innovation into operational capability, and on the capacity to maintain the advantages afforded by the first two abilities. Provided the need to ‘offset’ an adversary remains, and that technology is viewed as the best vector to accomplish that task, any future strategy will have to make a similar gamble on innovation. But, on the face of current trends in supercomputing and other high-tech fields, it’s not at all clear that the bet will land in favour of the US and its allies.