Editors’ picks for 2021: ‘Why TikTok isn’t really a social media app’

Posted on January 3, 2022 @ 06:00

Originally published 12 March 2021.

There’s one thing we’re all getting wrong about TikTok: it’s not really a social media app. As TikTok Australia’s general manager told [1] the Senate Select Committee on Foreign Interference through Social Media in September last year, the app is ‘less about social connection and more about broadcasting creativity and expression’.

Put another way, think of TikTok more as the modern incarnation of a media publisher—like a newspaper or a TV network—than as a social forum like Facebook or Twitter. That’s because TikTok is much more assertively curatorial than its competitors. It’s not a forum, it’s an editor. Its algorithm decides what each user sees, and it’s the opacity of that algorithm that presents the most worrying national security risk.

It may sound like an insignificant distinction, but TikTok’s emphasis on an ‘interest graph [2]’ instead of a ‘social graph [3]’ took the app’s competitors completely by surprise, and has largely gone over the heads of most lawmakers. The app, owned by Chinese technology company ByteDance, hit 2.3 billion all-time downloads in August 2020, so it’s high time policymakers understood exactly what makes TikTok tick.

An essay by Eugene Wei should be at the top of their reading list. A San Francisco–based start-up investor and former Amazon and Facebook employee, Wei dissects TikTok’s strategy and shows how its recommendation engine keeps users glued to their screens. It does so not by connecting them with friends or family, but by closely analysing their behaviour on the app and serving them more of what they’re interested in.

Wei’s opus [4], which approaches 20,000 words and is only the first in a three-part [5] series [6], explains how TikTok is not the same as the major social media platforms we’re more familiar with. Put simply, on Facebook and Twitter, the content that users see is largely decided by who they follow. On TikTok, however, the user doesn’t have to follow anyone. Instead, the algorithm very quickly learns from how users interact with the content they’re served in the app’s ‘For You’ feed to decide what it should deliver to them next.

The approach is similar to that of Spotify and Netflix, whose recommendation algorithms take note of which songs and movies you listen to or watch in full and which you skip to decide what new content to suggest. As Wei puts it [7], ‘TikTok’s algorithm is so effective that it doesn’t feel like work for viewers. Just by watching stuff and reacting, the app learns your tastes quickly. It feels like passive personalization.’
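To make the distinction concrete, the difference between a social graph and an interest graph can be sketched in a few lines of code. The snippet below is purely illustrative and not TikTok’s actual system: it assumes a simple model in which each video carries topic tags and the only signal is how much of a video the user watched.

```python
from collections import defaultdict

class InterestFeed:
    """Toy interest-graph recommender: ranks videos by learned topic affinity,
    not by who the user follows. Illustrative only."""

    def __init__(self, learning_rate=0.1):
        self.affinity = defaultdict(float)  # topic -> estimated interest
        self.lr = learning_rate

    def record_view(self, video_topics, watch_fraction):
        # Watching most of a video nudges its topics up; skipping nudges them down.
        signal = watch_fraction - 0.5
        for topic in video_topics:
            self.affinity[topic] += self.lr * signal

    def rank(self, candidate_videos):
        # Score each candidate by the sum of the user's topic affinities.
        def score(video):
            return sum(self.affinity[t] for t in video["topics"])
        return sorted(candidate_videos, key=score, reverse=True)

# Example: two views are enough to start reordering the feed.
feed = InterestFeed()
feed.record_view(["cooking", "comedy"], watch_fraction=0.95)  # watched in full
feed.record_view(["politics"], watch_fraction=0.1)            # skipped quickly
candidates = [
    {"id": 1, "topics": ["politics"]},
    {"id": 2, "topics": ["cooking"]},
]
print([v["id"] for v in feed.rank(candidates)])  # -> [2, 1]
```

Note that nothing in this loop depends on a friend list: the feed is shaped entirely by watch behaviour, which is why the real system can operate across markets its builders don’t understand.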

It’s a strategy, Wei argues, that allowed a team of Chinese engineers—who didn’t necessarily have a good understanding of the cultures in the places where the app is available—to take the world by storm.

TikTok didn’t just break out in America. It became unbelievably popular in India and in the Middle East, more countries whose cultures and language were foreign to the Chinese Bytedance product teams. Imagine an algorithm so clever it enables its builders to treat another market and culture as a complete black box. What do people in that country like? No, even better, what does each individual person in each of those foreign countries like? You don’t have to figure it out. The algorithm will handle that. The algorithm knows.

But that’s not the only thing the algorithm knows. In a recent Protocol China [8] exposé, a former censor at ByteDance said the company’s ‘powerful algorithms not only can make precise predictions and recommend content to users—one of the things it’s best known for in the rest of the world—but can also assist content moderators with swift censorship’.

The former employee, who described working at ByteDance as like being ‘a tiny cog in a vast, evil machine’, said that even live-streamed shows on the company’s apps are ‘automatically transcribed into text, allowing algorithms to compare the notes with a long and constantly-updated list of sensitive words, dates and names, as well as Natural Language Processing models. Algorithms would then analyze whether the content was risky enough to require individual monitoring.’
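A rough sketch of the kind of pipeline the former censor describes might look like the following. Everything here is a placeholder: the transcription step, the sensitive-term list and the risk threshold are assumptions for illustration, not ByteDance’s actual system, which reportedly also runs natural-language-processing models alongside keyword matching.

```python
import re

# Placeholder list standing in for the 'constantly updated' sensitive terms
# described by the former censor. Entries are deliberately fake.
SENSITIVE_TERMS = {"example-banned-name", "example-banned-date"}
RISK_THRESHOLD = 2  # hits before a stream is escalated to a human moderator

def transcribe(audio_chunk):
    # Stand-in for a speech-to-text step; assumed to return plain text.
    raise NotImplementedError("plug in a speech-to-text service here")

def flag_transcript(transcript):
    """Count matches against the sensitive-term list and decide whether the
    live stream should be routed to individual monitoring."""
    words = re.findall(r"[\w-]+", transcript.lower())
    hits = [w for w in words if w in SENSITIVE_TERMS]
    return {"hits": hits, "escalate": len(hits) >= RISK_THRESHOLD}

# Example with a mocked transcript instead of real audio:
result = flag_transcript("chat mentions example-banned-name and example-banned-date")
print(result)  # {'hits': [...], 'escalate': True}
```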

There’s no doubt that TikTok and its parent company have these abilities to monitor and censor. The question is whether they will continue to use them. Certainly, the blunt censorship that typified TikTok’s earlier approach to content moderation and is par for the course on ByteDance’s domestic apps is unlikely to continue, especially after the public scrutiny over TikTok’s censoring of content related to the Tiananmen Square massacre [9], Black Lives Matter protests [10] and Beijing’s persecution of Uyghurs and other ethnic minorities [11].

But there’s ample room for ByteDance to covertly tweak users’ feeds, subtly nudging them towards content favoured by governments and ruling parties—including the Chinese Communist Party. After all, it’s an approach that would be in line with the strategy that China’s Ministry of Foreign Affairs and state media are already deploying.

Beijing is exploiting [12] pre-existing grievance narratives and amplifying pro-CCP Western influencers in the knowledge that Western voices are more likely to penetrate target online networks than official CCP spokespeople. The strategy, referred to as ‘Borrowing mouths to speak’ (借嘴说话), is reminiscent of the Kremlin’s approach and is perfectly suited to being covertly deployed on Chinese-owned and -operated social media apps.

Just as experiments have shown [13] that TikTok’s algorithm can hurtle users from a politically neutral feed into a far-right firehose of content, so too can it easily be used to send users down any extreme rabbit hole. By design, the app groups people into ‘clusters’ [14] (otherwise known as filter bubbles [15]) based on their preferences. TikTok’s executives stress that they have measures in place to ensure people don’t become trapped in those filter bubbles. TikTok’s recommendation system ‘works to intersperse diverse types of content along with those you already know you love’, the company claims [16]. The goal, they say, is to ensure that users are exposed to ‘new perspectives and ideas’, but who decides which new perspectives and ideas?
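The point is easy to see in code: ‘interspersing diverse types of content’ ultimately comes down to a ratio, and to a choice about what counts as ‘diverse’, that whoever controls the algorithm gets to set. The toy feed-assembly function below is an assumption-laden sketch, not TikTok’s recommendation system; the function name, the diversity_ratio parameter and the video labels are all invented for illustration.

```python
import random

def build_feed(in_cluster, out_of_cluster, diversity_ratio=0.2, length=10, seed=0):
    """Toy 'For You' feed assembly: mostly videos matching the user's cluster,
    with a tunable share of outside content interspersed. Whoever sets
    diversity_ratio, and decides what goes in the outside pool, shapes the feed."""
    rng = random.Random(seed)
    feed = []
    for _ in range(length):
        pool = out_of_cluster if rng.random() < diversity_ratio else in_cluster
        if pool:
            feed.append(pool.pop(0))
    return feed

# Example: a 10-slot feed drawn from a user's cluster plus an 'outside' pool.
cluster_videos = [f"cluster-{i}" for i in range(20)]
outside_videos = [f"outside-{i}" for i in range(20)]
print(build_feed(cluster_videos, outside_videos))
```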

What’s to stop Beijing from pressuring TikTok to encourage communities of Xinjiang denialists to flourish on the platform, for instance? As our report revealed [17], there’s already evidence that this is happening. Our analysis of the hashtag #Xinjiang showed a depiction of the region that glosses over the human-rights tragedy unfolding there and instead provides a more politically convenient version for the CCP, replete with smiling and dancing Uyghurs.

The power of social media apps has been underestimated before. When Facebook started as a ‘hot or not’ website [18] in a Harvard dorm room at the turn of the millennium, who would have expected it would go on to play a role in inciting violence [19] 13,000 kilometres away?

So how do policymakers deal with a Chinese-owned social media app that isn’t really a social media app but a modern-day interactive TV station, whose editorial decisions are made by an opaque algorithm developed and maintained in Beijing?

It’s past time governments recognised the unique problem TikTok presents and tailored solutions to deal with it properly.



URL to article: https://www.aspistrategist.org.au/editors-picks-for-2021-why-tiktok-isnt-really-a-social-media-app/

URLs in this post:

[1] told: https://www.aph.gov.au/Parliamentary_Business/Hansard/Hansard_Display?bid=committees/commsen/1a5e6393-fec4-4222-945b-859e3f8ebd17/&sid=0002

[2] interest graph: https://en.wikipedia.org/wiki/Interest_graph#:~:text=Relationship%20of%20interest%20graph%20to%20social%20graph,-Interest%20graphs%20and&text=Much%20as%20social%20graphs%20are,follow%20them%20across%20the%20web.

[3] social graph: https://en.wikipedia.org/wiki/Social_network

[4] opus: https://www.eugenewei.com/blog/2020/8/3/tiktok-and-the-sorting-hat

[5] three-part: https://www.eugenewei.com/blog/2020/9/18/seeing-like-an-algorithm

[6] series: https://www.eugenewei.com/blog/2021/2/15/american-idle

[7] puts it: https://twitter.com/eugenewei/status/1290629794359435267?s=20

[8] Protocol China: https://www.protocol.com/china/i-built-bytedance-censorship-machine

[9] Tiananmen Square massacre: https://www.theguardian.com/technology/2019/sep/25/revealed-how-tiktok-censors-videos-that-do-not-please-beijing

[10] Black Lives Matter protests: https://www.cnbc.com/2020/06/02/tiktok-blacklivesmatter-censorship.html

[11] Beijing’s persecution of Uyghurs and other ethnic minorities: https://www.theguardian.com/technology/2019/nov/28/tiktok-says-sorry-to-us-teenager-blocked-after-sharing-xinjiang-videos

[12] exploiting: https://www.aspi.org.au/report/trigger-warning

[13] experiments have shown: https://www.vice.com/en/article/kzdwn9/tiktok-cant-save-us-from-algorithmic-content-hell

[14] groups people into ‘clusters’: https://www.axios.com/inside-tiktoks-killer-algorithm-52454fb2-6bab-405d-a407-31954ac1cf16.html

[15] filter bubbles: https://www.inputmag.com/culture/tiktok-lifts-the-cover-off-its-algorithm-data-practices

[16] the company claims: https://newsroom.tiktok.com/en-us/how-tiktok-recommends-videos-for-you

[17] our report revealed: https://s3-ap-southeast-2.amazonaws.com/ad-aspi/2020-09/TikTok%20and%20WeChat.pdf?7BNJWaoHImPVE_6KKcBP1JRD5fRnAVTZ=

[18] started as a ‘hot or not’ website: https://www.thecrimson.com/article/2003/11/19/facemash-creator-survives-ad-board-the/

[19] play a role in inciting violence: https://www.nytimes.com/2018/11/06/technology/myanmar-facebook.html
