Why TikTok isn’t really a social media app
12 Mar 2021

There’s one thing we’re all getting wrong about TikTok: it’s not really a social media app. As TikTok Australia’s general manager told the Senate Select Committee on Foreign Interference through Social Media in September last year, the app is ‘less about social connection and more about broadcasting creativity and expression’.

Put another way, think of TikTok more as the modern incarnation of a media publisher—like a newspaper or a TV network—than as a social forum like Facebook or Twitter. That’s because TikTok is much more assertively curatorial than its competitors. It’s not a forum, it’s an editor. Its algorithm decides what each user sees, and it’s the opacity of that algorithm that presents the most worrying national security risk.

It may sound like an insignificant distinction, but TikTok’s emphasis on an ‘interest graph’ instead of a ‘social graph’ took the app’s competitors completely by surprise and has largely gone over lawmakers’ heads. The app, owned by Chinese technology company ByteDance, hit 2.3 billion all-time downloads in August 2020, so it’s high time policymakers understood exactly what makes TikTok tick.

An essay by Eugene Wei should be at the top of their reading list. A San Francisco–based start-up investor and former Amazon and Facebook employee, Wei dissects TikTok’s strategy and shows how its recommendation engine keeps users glued to their screens. It does so not by connecting them with friends or family, but by closely analysing their behaviour on the app and serving them more of what they’re interested in.

Wei’s opus, which approaches 20,000 words and is only the first in a three-part series, explains how TikTok differs from the major social media platforms we’re more familiar with. Put simply, on Facebook and Twitter, the content users see is largely determined by who they follow. On TikTok, the user doesn’t have to follow anyone. Instead, the algorithm quickly learns from how users interact with the content served to them in the app’s ‘For You’ feed, and uses that behaviour to decide what to deliver next.
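
To make the contrast concrete, here’s a minimal sketch in Python of the two candidate-selection approaches. It’s an illustration of the distinction Wei draws, not TikTok’s actual code; the types and field names (Video, followed_accounts, topic_affinity) are invented for the example.

```python
# Illustrative only: a follow-based 'social graph' feed versus an
# interest-based 'interest graph' feed. All names are hypothetical.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Video:
    creator: str
    topics: frozenset[str]

@dataclass
class User:
    followed_accounts: set[str] = field(default_factory=set)
    # Learned from viewing behaviour, not from who the user follows.
    topic_affinity: dict[str, float] = field(default_factory=dict)

def social_graph_feed(user: User, pool: list[Video]) -> list[Video]:
    # Facebook/Twitter-style: you mostly see posts from accounts you follow.
    return [v for v in pool if v.creator in user.followed_accounts]

def interest_graph_feed(user: User, pool: list[Video], k: int = 10) -> list[Video]:
    # TikTok-style: rank the whole pool by inferred interest; follows optional.
    def score(v: Video) -> float:
        return sum(user.topic_affinity.get(t, 0.0) for t in v.topics)
    return sorted(pool, key=score, reverse=True)[:k]
```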

The approach is similar to that of Spotify and Netflix, whose recommendation algorithms take note of which songs and movies you listen to or watch in full and which you skip to decide what new content to suggest. As Wei puts it, ‘TikTok’s algorithm is so effective that it doesn’t feel like work for viewers. Just by watching stuff and reacting, the app learns your tastes quickly. It feels like passive personalization.’
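
As a rough sketch of that passive personalisation, the snippet below (reusing the hypothetical topic_affinity mapping from the earlier example) updates interest scores from watch-time alone: a video watched to the end nudges its topics up, a quick skip nudges them down. The update rule and its weights are invented for illustration; no platform publicly documents its scoring this way.

```python
# Illustrative implicit-feedback update: no likes, comments or follows
# needed, just how much of the video was watched. Weights are arbitrary.
def update_affinity(affinity: dict[str, float],
                    topics: frozenset[str],
                    watch_fraction: float,
                    rate: float = 0.1) -> None:
    # Map watch behaviour to a signal in [-1, 1]: an instant skip (0.0)
    # reads as negative interest, a full watch (1.0) as positive.
    signal = 2.0 * min(max(watch_fraction, 0.0), 1.0) - 1.0
    for topic in topics:
        current = affinity.get(topic, 0.0)
        affinity[topic] = current + rate * (signal - current)
```

A few dozen such updates are enough for a ranking like the one sketched earlier to start reflecting a viewer’s tastes, which is Wei’s point: the system learns simply from watching.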

It’s a strategy, Wei argues, that allowed a team of Chinese engineers—who didn’t necessarily have a good understanding of the cultures in the places where the app is available—to take the world by storm.

TikTok didn’t just break out in America. It became unbelievably popular in India and in the Middle East, more countries whose cultures and language were foreign to the Chinese Bytedance product teams. Imagine an algorithm so clever it enables its builders to treat another market and culture as a complete black box. What do people in that country like? No, even better, what does each individual person in each of those foreign countries like? You don’t have to figure it out. The algorithm will handle that. The algorithm knows.

But that’s not the only thing the algorithm knows. In a recent Protocol China exposé, a former censor at ByteDance said the company’s ‘powerful algorithms not only can make precise predictions and recommend content to users—one of the things it’s best known for in the rest of the world—but can also assist content moderators with swift censorship’.

The former employee, who described working at ByteDance as like being ‘a tiny cog in a vast, evil machine’, said that even live-streamed shows on the company’s apps are ‘automatically transcribed into text, allowing algorithms to compare the notes with a long and constantly-updated list of sensitive words, dates and names, as well as Natural Language Processing models. Algorithms would then analyze whether the content was risky enough to require individual monitoring.’
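
Going only on the workflow the former censor describes (transcribe the stream, match against a sensitive-term list, score with an NLP model, escalate to a human), a schematic version might look like the sketch below. Every name, term and threshold is a placeholder; this illustrates the described pipeline, not ByteDance’s systems.

```python
# Schematic of the described moderation pipeline: transcript ->
# sensitive-term match + model risk score -> escalate for human review.
# SENSITIVE_TERMS and the threshold are placeholders; toy_risk_model
# stands in for a real NLP classifier.
SENSITIVE_TERMS = {"placeholder-name", "placeholder-date"}

def toy_risk_model(text: str) -> float:
    # Stand-in for an NLP model; returns a fabricated risk score.
    return 0.9 if "placeholder-topic" in text else 0.1

def needs_human_review(transcript: str, threshold: float = 0.8) -> bool:
    text = transcript.lower()
    if any(term in text for term in SENSITIVE_TERMS):
        return True                       # hard keyword hit: escalate at once
    return toy_risk_model(text) >= threshold  # otherwise defer to the model
```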

There’s no doubt that TikTok and its parent company have the ability to monitor and censor. The question is whether they’ll continue to use it. Certainly, the blunt censorship that typified TikTok’s earlier approach to content moderation, and that remains par for the course on ByteDance’s domestic apps, is unlikely to continue, especially after the public scrutiny of TikTok’s censoring of content related to the Tiananmen Square massacre, Black Lives Matter protests and Beijing’s persecution of Uyghurs and other ethnic minorities.

But there’s ample room for ByteDance to covertly tweak users’ feeds, subtly nudging them towards content favoured by governments and ruling parties—including the Chinese Communist Party. After all, it’s an approach that would be in line with the strategy that China’s Ministry of Foreign Affairs and state media are already deploying.

Beijing is exploiting pre-existing grievance narratives and amplifying pro-CCP Western influencers in the knowledge that Western voices are more likely to penetrate target online networks than official CCP spokespeople. The strategy, referred to as ‘borrowing mouths to speak’ (借嘴说话), is reminiscent of the Kremlin’s approach and is perfectly suited to being covertly deployed on Chinese-owned and -operated social media apps.

Just as experiments have shown that TikTok’s algorithm can hurtle users from a politically neutral feed into a far-right firehose of content, so too can it easily be used to send users down any extreme rabbit hole. By design, the app groups people into ‘clusters’ (otherwise known as filter bubbles) based on their preferences. TikTok’s executives stress that they have measures in place to ensure people don’t become trapped in those filter bubbles. TikTok’s recommendation system ‘works to intersperse diverse types of content along with those you already know you love’, the company claims. The goal, they say, is to ensure that users are exposed to ‘new perspectives and ideas’, but who decides which new perspectives and ideas?
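
TikTok’s description of ‘interspersing diverse types of content’ maps onto a familiar exploration pattern in recommender design, sketched generically below: mostly in-cluster items, plus a tunable fraction drawn from an out-of-cluster pool. The names and rates are invented; the concern the paragraph above raises lives in the two inputs the code takes on trust, namely who fills the out_of_cluster pool and who sets the rate.

```python
import random

def build_feed(in_cluster: list[str],
               out_of_cluster: list[str],
               k: int = 10,
               explore_rate: float = 0.2,
               seed: int | None = None) -> list[str]:
    # Mostly serve content from the user's cluster ('filter bubble'),
    # interspersed with a slice from elsewhere. Whoever curates the
    # out_of_cluster pool decides which 'new perspectives' users see.
    rng = random.Random(seed)
    n_explore = min(round(k * explore_rate), len(out_of_cluster))
    feed = rng.sample(in_cluster, min(k - n_explore, len(in_cluster)))
    feed += rng.sample(out_of_cluster, n_explore)
    rng.shuffle(feed)
    return feed
```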

What’s to stop Beijing from pressuring TikTok to encourage communities of Xinjiang denialists to flourish on the platform, for instance? As our report revealed, there’s already evidence that this is happening. Our analysis of the hashtag #Xinjiang showed a depiction of the region that glosses over the human-rights tragedy unfolding there and instead provides a more politically convenient version for the CCP, replete with smiling and dancing Uyghurs.

The power of social media apps has been underestimated before. When Facebook started as a ‘hot or not’ website in a Harvard dorm room in 2003, who would have expected it would go on to play a role in inciting violence 13,000 kilometres away?

So how do policymakers deal with a Chinese-owned social media app that isn’t really a social media app but a modern-day interactive TV station, whose editorial decisions are made by an opaque algorithm developed and maintained in Beijing?

It’s past time governments realised the unique problem TikTok presents and tailored solutions to deal with it properly.