The company with Aussie roots that’s helping build China’s surveillance state

The Chinese government’s surveillance state in Xinjiang, where an estimated 1.5 million Uyghurs are being detained in ‘re-education camps’, has created a booming business for high-tech surveillance companies. Koala AI Technology is one of the many Chinese artificial intelligence start-ups riding a wave of Chinese government demand for surveillance technology. But unlike its competitors, Koala AI may have benefited from connections with Australian universities and Australian government funding. The company is led by scientists who worked and studied in Australia before relocating to China through Chinese government talent-recruitment schemes.

In 2011, Heng Tao Shen became one of the University of Queensland’s youngest-ever professors at age 34. Three years later he was recruited through the Chinese government’s Thousand Talents Plan to work at the University of Electronic Science and Technology of China (UESTC), where he became head of its School of Computer Science and Engineering and was given a laboratory and research team to lead. He founded Koala AI a year later, in 2015. His LinkedIn page and personal page on the UQ website say that he only left for UESTC in 2017, but Koala AI’s own reporting indicates that he was already working there in 2014.

According to Shen, ‘a shortage of leading talents in China’s artificial intelligence industry is becoming a stifling force on development’. In 2016, a senior Chinese government official estimated that the country had an AI talent shortage of over 5 million people.

Shen turned back to Australia to address this need, hiring colleagues and students from Australia while juggling his professorships at UESTC and UQ, where he still holds an honorary professorship. Most of Koala AI’s executives also worked or studied at Australian universities, sometimes under Shen or his colleagues, before joining the company with financial support from Chinese government talent-recruitment schemes. Shen continues to collaborate extensively with Australian scientists on technologies directly related to the AI security systems offered by his company.

Members of Koala AI’s research team reportedly include Thousand Talents Plan scholars currently working at the University of New South Wales and UQ, as well as leading scientists from the University of Melbourne and the National University of Singapore. The Thousand Talents Plan sometimes allows participants to spend most of their time at their overseas ‘base’, provided they also work in China for a few months each year.

Koala AI also draws on research from the Center for Future Media at UESTC, which functions as the company’s R&D wing. Both are run by Shen, and they work ‘hand in hand’ on AI research; Chinese media has reported that ‘all effective research outputs from the Center for Future Media are plugged into the R&D of Koala AI products, assisting the development of the AI industry’. The centre has hosted visiting professors from the University of Adelaide and the University of Queensland.

Koala AI claims to be worth ¥1 billion (A$200 million) and aims to become western China’s first AI unicorn—a company worth over US$1 billion—by 2020.

At the same time, the Chinese Communist Party has greatly expanded its oppression of religious and ethnic minorities across the country and in Xinjiang in particular. According to one of the leading experts on Xinjiang, Adrian Zenz, spending on security-related construction there tripled in 2017. ASPI’s International Cyber Policy Centre found that the size of concentration camps in Xinjiang has grown by more than 400% since 2016. High-tech surveillance systems enable repression and control of China’s population by the government’s powerful internal security and public security ministries and agencies.

As AI becomes increasingly sophisticated, oppression in Xinjiang now relies on algorithms and apps as much as it does on batons and boots. Leading companies such as Huawei, Hikvision and iFlytek all supply surveillance and public security technology to the region, where their products are likely facilitating human rights abuses.

Koala AI’s co-founder, University of Adelaide graduate Shen Fumin, believes the future of security lies in AI. ‘Using AI to define “new security” is the future we foresee, and it’s also our mission’, he told a Chinese media outlet. At a conference in Xinjiang, CEO Heng Tao Shen demonstrated a surveillance system that the company supplies to the Altay region at Xinjiang’s northwestern edge. The system helps the Chinese government manage its border with Kazakhstan, through which many Uyghurs and Kazakhs seek to flee the region. It can uncover, categorise and recognise targets, alerting police to ‘suspicious individuals and cars’.

Products like these are driving the rapid expansion of Koala AI, which also runs a joint laboratory with China’s Ministry of Public Security. Co-founder Shen Fumin said in 2019: ‘The company is developing so quickly. The government’s assistance and the billion-dollar market for intelligent security means that we can’t rest for even a moment.’

Koala AI describes its surveillance system as an example of ‘self-dependent development’—a priority for China as it tries to end its reliance on technology from abroad—but Heng Tao Shen’s past research was supported by as much as $2.6 million in funding from the Australian Research Council. Up to $1.6 million of that funding covered projects Shen worked on after he established Koala AI and set up a laboratory at a Chinese university. Research he carried out with ARC funding focused on surveillance-related topics such as event recognition in videos. The funding agreement for one of the ARC schemes, a Future Fellowship, prohibits recipients from holding other fellowships that are remunerated or might impair their duties to the ARC. Recipients of ARC funding are also required to avoid and report any conflicts of interest.

Visuals from a demonstration of Koala AI’s surveillance system, showing part of China’s border with Kazakhstan.

In the United States, the Thousand Talents Plan has attracted scrutiny for its links to economic espionage, but not yet for its human rights implications. The scheme is a flagship of the Chinese government’s technology-transfer efforts, which rely heavily on reversing China’s brain drain by encouraging overseas scientists to bring their expertise to China. Since its establishment in 2008, it has recruited over 7,000 leading scientists and entrepreneurs from abroad.

The Federal Bureau of Investigation’s growing scrutiny of the program has sent it underground. In September 2018, the Chinese government circulated a notice instructing official outlets to censor references to it, and thousands of web pages have since been taken offline. Organisations recruiting scientists for the Thousand Talents Plan have been instructed to do so covertly, inviting potential participants to China under the guise of academic conferences and avoiding communication with them by email. At least five participants in Chinese government talent-recruitment schemes have been charged with crimes including economic espionage and fraud. In addition, dozens of US and Australian employees of governments or universities are believed to have joined Chinese government talent-recruitment programs without declaring their external employment.

However, the human rights implications of these applications of AI research and technology transfer in China are just as worrying. And, as the case of Koala AI shows, Western universities and even government funding may be used to help carry out research as well as train, fund and recruit talent for AI-enabled state surveillance.

The enormity of the abuses the Chinese government is committing in Xinjiang means that the Australian government, universities and scientists must do more to scrutinise the end users of their research. Meeting basic ethical standards would affect only a tiny share of research collaboration between Australia and China. But, currently, our universities do not appear to have sufficient internal mechanisms to enforce and ensure compliance with their policies on conflicts of interest and external employment, to understand whom their employees are collaborating with, or to identify participants in foreign talent-recruitment programs. Without change on this front, universities could be unable to meet the guidelines on research collaboration that they are seeking from the government.

Better aligning Australia’s engagement with China with our interests and values will often be difficult. But there are clear red lines, and aiding technology-enhanced human rights abuses is surely one of them.