Can TikTok alone tackle CCP-linked information ops?
18 Sep 2023

In a welcome development last year, TikTok announced that it would start publishing details of the covert influence operations it identifies and removes from its platform globally, as part of its quarterly community guidelines enforcement reports.

Since then, the platform has published three quarterly reports identifying 22 separate covert influence operations originating in countries as varied as Russia, Azerbaijan, Ireland, Georgia, Kenya, and Taiwan. But there has been one glaring exception: China.

The omission is curious, given that almost every other major social media platform has reported on the presence of covert operations linked to the Chinese party-state. Yet it is hardly surprising: TikTok is owned by ByteDance, a Chinese company over which the ruling Chinese Communist Party (CCP) has decisive leverage.

Late last month, one of TikTok’s major competitors decided to give it a helping hand. Facebook owner Meta published details of a Chinese influence campaign it described as the ‘largest known cross-platform covert influence operation in the world’. TikTok was among the more than 50 online platforms and forums on which Meta found evidence of the Chinese political spam network known as ‘Spamouflage’. After The Guardian asked TikTok what action it would take against the accounts, the platform removed 284 of them.

It is a positive step that TikTok has finally acted against these accounts, but it is one the platform could have taken itself months ago. At ASPI, we’ve been actively monitoring Spamouflage accounts on the Chinese-owned video-sharing app for the past year. In April, my colleagues published ‘Gaming Public Opinion’, which laid out concrete evidence of this type of activity on the app. TikTok’s trust and safety team might like to give it a read; some of the accounts identified in that report are still up on the platform.

Many of the videos shared in the Spamouflage operation offered positive commentary on China’s Xinjiang region, including testimonials from local Uyghurs, likely corralled by the propaganda department, responding to reports of forced labour in Xinjiang. Meta’s investigation of the accounts sharing this content found links to ‘individuals associated with Chinese law enforcement.’

None of this comes as a surprise. Three years ago, in our report on TikTok and WeChat, my colleagues and I wrote that Xinjiang-related ‘state-linked information campaigns are highly likely to be taking place on TikTok,’ but that the company would be unlikely to ‘conduct any transparent investigation to stop state-actor manipulation of its platform.’

How were we able to predict this so confidently? Because back in 2018, ByteDance founder Zhang Yiming stated on the record that he would ensure his products served to promote the CCP’s propaganda agenda. In fact, as we noted in our 2020 report, ByteDance works closely with PRC public security bureaus not just to disseminate that propaganda, but to produce it in the first place.

The various transparency reports TikTok puts together, including those it publishes as part of its obligations as a signatory to the Australian Code of Practice on Disinformation and Misinformation, certainly seem comprehensive. As it notes in its 2022 transparency report, TikTok is a ‘dedicated signatory’ that ‘opts in to all Objectives and Outcomes’ under the code. Indeed, in some ways TikTok has gone further than other platforms in dealing with the online harms that are prevalent across the industry.

But given the facts outlined above, TikTok clearly cannot be trusted to root out CCP-led information operations on its platform of its own accord. And while Meta has been of assistance in this instance, there is a need for a more sustainable solution, one that does not rely on competitors to police the information operations taking place on rival platforms. It is time we moved from the current self-regulatory model, in which platforms are left to create, implement and enforce their own rules and standards, to one in which government can provide oversight and enforce compliance.

As various efforts to regulate TikTok in the US flounder and stall, the Australian government’s proposed Combatting Misinformation and Disinformation Bill 2023 might finally put us on a path towards this co-regulatory model. If passed, the bill would empower the Australian Communications and Media Authority (ACMA) to gather crucial information about TikTok’s efforts to counter foreign interference. ACMA would also have the power to levy substantial fines if, as appears to be the case with the Spamouflage accounts, TikTok’s efforts to deal with such operations are untimely or inadequate.

The draft bill is not perfect. In our own feedback on it (to be published soon), my colleague Albert Zhang and I propose no fewer than 18 recommendations for how it could be improved. But it would, at the very least, give the government the ability to force digital platforms like TikTok to do more to combat information operations on their platforms. The clock has been ticking on TikTok for far too long. It’s time we actually did something about it.