TikTok's Influence: China's Invisible Hand | Giada Condello | The Blogs

TikTok’s Influence: China’s Invisible Hand

TikTok running on an iPhone with China's flag displayed behind it. Image by Solen Feyissa/Unsplash

Since its launch in 2017, TikTok, the video app from Chinese tech giant ByteDance, has skyrocketed in popularity with 1 billion monthly users and a remarkable 232 million downloads in Q4 2023. Now it stands proudly among the top five social media platforms and it has become a leading source of information among the younger generations, who are estimated to make up 60% of its total users. 

But why has a seemingly harmless app featuring yoga videos, recipes, and funny dance moves attracted such significant attention from Western governments?

The answer lies in the nuanced nature of TikTok’s content. While some users’ feeds are filled with harmless content like healthy kale smoothies or quinoa salad recipes, many others are exposed to material that is not as innocent as it may seem. In essence, TikTok appears to have connections to China’s global interests and agenda, suggesting a deeper complexity behind its seemingly casual appearance.

On April 25th, the US Senate passed legislation requiring ByteDance to divest TikTok within one year and sell it to a government-approved buyer. Failure to comply will result in a ban on the app. But this didn't happen out of the blue. Already one year earlier, in March 2023, concerns were raised by Christopher Wray, the Director of the FBI, who warned that TikTok's parent company, ByteDance, "is, for all intents and purposes, beholden to the CCP (Chinese Communist Party)". This means that, under Chinese law, the Chinese government has the authority to access ByteDance's data, potentially reaching millions of devices, shaping narratives and collecting information. Data belonging to American citizens (and not only them) that the US government is not willing to share with the Chinese Communist Party.

Indeed, despite the reassurances of TikTok's Singaporean CEO Shou Chew before Congress, who claimed that user data is stored outside of China and that the company has implemented measures to distance itself from Chinese government influence, it is certain that the CCP has created several legal mechanisms to increase its involvement in the internal business decisions of Chinese companies. These mechanisms, such as the "golden share," grant the government the power to influence business decisions, which, in the case of tech companies like ByteDance, may extend to content and its approval. ByteDance appears to be legally compelled to cooperate with national intelligence efforts under the National Intelligence Law, and to engage in "monitoring, preventing, and handling cybersecurity risks and threats arising both within and outside the mainland territory of the People's Republic of China," as stipulated in the state Cybersecurity Law. The Chinese government could therefore easily use its influence over ByteDance to shape content moderation policies in ways that align with its interests, such as censoring content critical of the government or promoting content that advances its narratives.

But what is the connection with American youth, then? According to a study published by the Pew Research Center, roughly a third of young Americans now use TikTok as a main source of news. Among US adults overall, that share has more than quadrupled, from 3% in 2020 to 14% in 2023, making the hypothetical scenario of China influencing American public opinion through the app not so far-fetched.

In fact, it appears this is already happening. The Network Contagion Research Institute analysed the hashtag ratios between Instagram and TikTok on national and regional topics considered 'sensitive' by the CCP. The results showed a concerning underrepresentation on TikTok of content that doesn't align with the Chinese government's agenda (Tiananmen Square, Taiwan, Tibet, to mention just a few) relative to content that does. The same tendency extends to international arenas of strategic interest to China, such as Ukraine, Israel and the region of Kashmir. In the case of the current Israel-Hamas war, for example, the hashtag ratio of pro-Israel content on TikTok was 5.0 to 11 times lower than on Meta's platform, indicating possible content manipulation to shape user perception on those specific topics.

This seems to be further evidenced by the company Humanz, which analysed 117 billion posts on TikTok and Instagram for the month of October 2023. It found that pro-Palestinian content and tags were published 15 times more often than pro-Israeli ones: 109.61 billion pro-Palestinian hashtags versus 7.39 billion pro-Israel ones, meaning pro-Palestinian material made up almost 94% of the total analyzed content.
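The percentages follow directly from the two counts Humanz reported; a quick sanity check, using the figures exactly as given:

```python
# Figures as reported by Humanz for October 2023 (hashtags/posts analysed).
pro_palestinian = 109.61e9
pro_israel = 7.39e9

total = pro_palestinian + pro_israel    # 117 billion items analysed
ratio = pro_palestinian / pro_israel    # ~14.8x, i.e. roughly "15 times more often"
share = pro_palestinian / total         # ~93.7%, i.e. "almost 94% of the total"

print(round(ratio, 1), round(share * 100, 1))  # prints 14.8 93.7
```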

Another testimony in support of this trend was given by Barak Herscowitz, TikTok’s top government relations representative in Israel. Herscowitz acknowledged that the company’s management is aware of its internal content regulation issues. He confirmed that TikTok displayed bias in approving content for advertising campaigns and in general content moderation, particularly regarding the ongoing conflict between Israel and Hamas. 

Despite TikTok’s policy prohibiting political or issue-based advertising, Herscowitz observed highly political and graphic pro-Palestine campaigns being pre-approved by moderators and running on the platform. This occurred while the official TikTok account of the State of Israel was blocked from running any campaigns since the conflict began. Herscowitz suggested that TikTok appeared to selectively apply its policies, choosing when to enforce them and when not to.

Additionally, he disclosed that on the night of the initial release of the hostages, several TikTok employees were observed celebrating the release of convicted Palestinian terrorists in an internal company chat. They expressed support for Hamas and the Houthis, and advocated for BDS. It was subsequently revealed that all of these employees were part of TikTok's Trust and Safety team: the moderators in charge of approving, blocking or removing content on the platform. I am sure you are now starting to connect the dots as to why TikTok appears to amplify pro-Palestinian campaigns.

However, the issue extends beyond the moderators. While Instagram's algorithm appears to learn from user interactions with content, and also partially applies chronological ordering to posts from accounts the user follows, TikTok's algorithm, according to an investigation by the Wall Street Journal, seems to learn even from users' apparent lack of engagement. This doesn't imply that TikTok can read minds, as some TikTokers humorously suggest, but rather that users are actively signalling their interests even when they don't realise it.

In the investigation, 100 automated accounts were used to watch hundreds of thousands of videos, and it became evident that every user, regardless of the preferences selected during registration, underwent the same initial "test" by the app. This test involved displaying a selection of the platform's most popular videos, all pre-approved by whom? Well, the trustworthy moderators. During this initial test phase the algorithm begins to gather information on the user's preferences but, unlike Instagram, it seems to rely on a single metric: 'time watched'.

Here’s how it seems to work: when a user stops scrolling through videos, hesitates, pauses, or returns to a particular piece of content to rewatch it, the algorithm records that behaviour. At this point, the app acknowledges that the user has indicated a preference and may be interested in a specific topic, falling within a broader category — let’s generalise it as ‘the Gaza War’, for example.  The algorithm then suggests more specific videos from this category to understand what evoked the user’s interest in that general topic. Did horrible images of suffering children, destroyed buildings, rows of small white bags, and actual fighting scenes from the Gaza Strip keep users on the app longer? If so, the algorithm will prioritise showing those types of content, quickly displaying almost exclusively those videos.
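The feedback loop described above can be sketched in a few lines of Python. This is a toy illustration of the reported mechanism, not TikTok's actual system; the class name, categories and watch times are all hypothetical:

```python
# Toy watch-time recommender: illustrative only, not TikTok's real code.
# The only signal is seconds watched; pauses, hesitations and rewatches
# all show up as extra seconds spent on a category.

class WatchTimeFeed:
    def __init__(self, categories):
        self.scores = {c: 0.0 for c in categories}   # accumulated watch time
        self.test_queue = list(categories)           # initial "test phase"

    def next_video(self):
        if self.test_queue:
            # Test phase: show one popular video from each category first.
            return self.test_queue.pop(0)
        # Afterwards, exploit: keep serving the most-watched category.
        return max(self.scores, key=self.scores.get)

    def record(self, category, seconds_watched):
        self.scores[category] += seconds_watched

feed = WatchTimeFeed(["recipes", "dance", "gaza-war"])
history = []
for _ in range(10):
    video = feed.next_video()
    history.append(video)
    # Hypothetical user: lingers 30s on war footage, 2s on everything else.
    feed.record(video, 30 if video == "gaza-war" else 2)

print(history)  # after the 3-video test phase, every pick is "gaza-war"
```

Seven of the ten "recommendations" land in a single category purely because the simulated user lingered there; no like, comment or share was ever needed.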

The problem is that the algorithm's ability to become extremely specific seems to push users away from the mainstream mix of interests and toward its margins. And once users reach that point, they are deep in what has been called "the TikTok rabbit hole".

Now consider that the average TikTok user spends 95 minutes on the app daily, and that the only "reality" they see, thanks to the algorithm's selection and the moderators' work, is extreme and heavily biased towards or against one specific topic. What will likely happen? Users will probably radicalise their views on that topic. It creates a sort of 'guided reality', and this is how extreme content like the "Letter to America" written by the terrorist Bin Laden gets praised and goes viral worldwide on this app.

At this point, it's undeniable that this app has played a role in promoting a specific viewpoint and in building political pressure on governments to stop supporting Israel and its war against Hamas and terrorism. It began with months of highly radical propaganda, using videos that compelled users to repeatedly view distressing images from Gaza. Some of these images, it must be noted, were either fabricated or sourced from the conflict in Syria, but that often went unnoticed. By then it would hardly matter to point that out, since you cannot forget the scream of a mother crying over a dead child, or the sight of the small body itself. Nobody can. I still remember images from October 7 that I wish I hadn't seen myself. All these horrible images of war and death have brought thousands of users, who are in fact real people, to march in the streets calling for their countries to stop supporting Israel. They have brought students to occupy universities and demand divestment from Israel. But they have also brought violence and a concerning wave of antisemitism across Europe and the States.

But why would TikTok overlook the presence of Hamas supporters among its employees or vigorously promote the Palestinian cause, besides keeping users engaged on their app for longer periods? 

Perhaps the answer lies in China's own motivations and its partnership with Iran, a country that launched some 300 missiles and drones at Israel on April 14th, and which funds terrorist proxies in the Middle East like Hamas and Hezbollah, not to mention extremist radical groups in Europe.

China is deeply invested in economic and strategic pursuits with Iran. It has signed a 25-year agreement with Tehran, investing $400 billion in exchange for a discounted oil supply, and it aims to expand its economic and strategic influence across Asia, Africa, and Europe through the Belt and Road Initiative, in which Iran is a crucial partner. This alignment may explain certain political campaigns on TikTok, or their absence, as with the Mahsa Amini campaign, which drew millions of supportive posts and hashtags on other platforms, or the undeniable anti-Israel, pro-Palestine and pro-Hamas content, which mirrors Iran's stance and actions in the Middle East. Furthermore, China and Iran share common obstacles in navigating US sanctions and hegemony, while striving for a new world order in which they have greater influence.

About the Author
Giada Condello is an Italian professional who decided to move to Israel after falling in love with the city of Tel Aviv and the Israeli chutzpah. Since then, she has been calling Israel her home and she is grateful to be part of a culture that empowers individuals and encourages them to take risks and speak their minds. She holds a Bachelor's Degree in International Development and Cooperation from the University of Bologna and a Master's Degree in Intercultural Cooperation for Development from the University of Trieste.