Inside TikTok’s AI-powered algorithms – POLITICO


This article is part of a series, Bots and ballots: How artificial intelligence is reshaping elections worldwide, presented by Luminate.

When Hamas attacked Israel on Oct. 7, many sought updates from their main source for news: social media. 

But unlike previous global conflicts, where the digital discourse was dominated by Facebook and X (formerly Twitter), the ongoing Middle East crisis has seen people flock to TikTok, in their millions, to share news and express opinions.

Even as the video-sharing app’s popularity has ballooned, the inner workings of its complex, artificial intelligence-powered algorithms remain a mystery.

Individuals see only a fraction of what is posted daily on TikTok. And what they do see is highly curated by the company’s automated systems, which are designed to keep people glued to their smartphones. Using artificial intelligence in the form of machine learning and so-called recommender systems, these tools determine, within milliseconds, what content to display to each user.
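TikTok has not published how those systems weigh different signals, but the general recipe for a recommender system is well understood: for each candidate video, predict how strongly a given viewer will engage with it, then rank the candidates by that prediction. The sketch below illustrates the idea; the features, weights and scoring formula are hypothetical, not TikTok’s.

```python
# Illustrative only: a toy recommender ranking step.
# The features, weights and scoring formula are hypothetical, not TikTok's model.
from dataclasses import dataclass


@dataclass
class Candidate:
    video_id: str
    predicted_watch_time: float   # model's estimate, in seconds
    predicted_like_prob: float    # probability the viewer taps "like"
    predicted_share_prob: float   # probability the viewer shares it


def score(c: Candidate) -> float:
    # Combine engagement predictions into a single ranking score.
    return (0.6 * c.predicted_watch_time
            + 20.0 * c.predicted_like_prob
            + 40.0 * c.predicted_share_prob)


def rank_feed(candidates: list[Candidate], k: int = 10) -> list[Candidate]:
    # Return the top-k candidate videos by predicted engagement.
    return sorted(candidates, key=score, reverse=True)[:k]
```

In a real system the predictions would come from models trained on billions of past interactions, but the ranking principle is the same: whatever is predicted to hold attention rises to the top of the feed.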

POLITICO set out to shed light on how TikTok’s algorithms work, and to root out which side in the war in the Middle East — Israeli or Palestinian — was winning hearts and minds on the social network now heavily favored by young people. 

That’s become a hot political question after pro-Israeli groups and some Western lawmakers accused TikTok — owned by Beijing-based ByteDance — of unfairly promoting pro-Palestinian content for potential political impact. TikTok denies the accusations.

The conflict’s political effects are already evident in partisan clashes across Western democracies as people pick sides in the war — and decide how to vote. U.S. President Joe Biden’s support for Israel has drawn criticism from Arab-Americans, and it could eventually cost him the November election. In the United Kingdom, the populist independent candidate George Galloway harnessed pro-Palestinian sentiment to win a seat in the British parliament in March. University campus protests have erupted on both sides of the Atlantic. 

TikTok’s algorithms are crucial to how all kinds of political content reaches social media feeds. Examining the company’s algorithms is a good proxy for how artificial intelligence is now a key player in determining what we see online.

POLITICO teamed up with Laura Edelson, a researcher at Northeastern University in Boston, to track pro-Palestinian and pro-Israeli TikTok content over four months between Oct. 7, 2023, and Jan. 29, 2024. 

That involved creating a list of 50 popular hashtags, like #IStandWithIsrael or #SavePalestine, that could be directly associated with either side. Apolitical hashtags, like #Gaza or #Israel, were used to collect data on posts that did not have a specific leaning.
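In practice, that kind of classification boils down to checking each post’s hashtags against the two partisan lists. Here is a minimal sketch of the idea; the abbreviated hashtag sets and the labels are illustrative placeholders, not the study’s actual code or categories.

```python
# Illustrative sketch of hashtag-based labeling; not the study's actual code.
PRO_ISRAEL = {"#istandwithisrael", "#standwithisrael"}
PRO_PALESTINIAN = {"#savepalestine", "#freepalestine"}


def label_post(hashtags: list[str]) -> str:
    tags = {t.lower() for t in hashtags}
    israel = bool(tags & PRO_ISRAEL)
    palestine = bool(tags & PRO_PALESTINIAN)
    if israel and palestine:
        return "mixed"           # carries hashtags from both lists
    if israel:
        return "pro-israel"
    if palestine:
        return "pro-palestinian"
    return "neutral"             # e.g. only #Gaza or #Israel


print(label_post(["#Gaza", "#SavePalestine"]))  # -> pro-palestinian
```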

In total, Edelson analyzed 350,000 TikTok posts from the United States.

To make the data more digestible, she broke the posts down into three-day windows around specific events: the initial Hamas attacks (Oct. 7-9), Israel’s invasion of Gaza (Oct. 27-29) and the release of the first Israeli hostages (Nov. 24-27). As a control for bias, she also included Nov. 6-8 in the analysis, as a proxy for periods when no major events took place.
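That grouping step amounts to matching each post’s timestamp against a handful of date ranges. A minimal sketch follows, using the dates from the article; everything else is an illustrative placeholder rather than the study’s actual code.

```python
# Illustrative sketch: bucket posts into the three-day event windows described
# above, plus the Nov. 6-8 control period. Not the study's actual code.
from datetime import date

WINDOWS = {
    "hamas_attacks":   (date(2023, 10, 7),  date(2023, 10, 9)),
    "gaza_invasion":   (date(2023, 10, 27), date(2023, 10, 29)),
    "hostage_release": (date(2023, 11, 24), date(2023, 11, 27)),
    "control":         (date(2023, 11, 6),  date(2023, 11, 8)),
}


def window_for(post_date: date):
    # Return the name of the event window a post falls into, or None.
    for name, (start, end) in WINDOWS.items():
        if start <= post_date <= end:
            return name
    return None


print(window_for(date(2023, 10, 28)))  # -> gaza_invasion
```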

“TikTok, like other social media platforms, amplifies some content more than others,” said Edelson. “That can have a distorting effect on what people see in their feeds.”

What emerged was evidence of TikTok grappling with its role — in real-time — as one of the main global digital town squares where people gather to express their opinions and, often, disagree. 

Over the four-month period, Edelson’s research found approximately 20 times more pro-Palestinian content produced, based on the hashtags analyzed, compared with pro-Israeli material. Yet that didn’t necessarily equate to more pro-Palestinian posts winding up in the average person’s TikTok feed.

Instead, Edelson found three distinct times when the likelihood of people seeing pro-Israeli or pro-Palestinian content in their TikTok feeds changed markedly — no matter how much overall material was being produced by either side. 

TikTok did not respond to specific requests for comment about the Northeastern University research. In a blog post in April, the company said it had removed more than 3.1 million videos and suspended more than 140,000 livestreams in Israel and Palestine for violating its terms of service.

Much about how these social media algorithms work is unknown. It is unclear who within companies — engineers, policy officials or top executives — determines how they function. It’s also difficult to determine when changes are made, although regulatory efforts by the European Union and the United States are trying to shine a larger spotlight on these practices. 

What follows is an example of how, when you dig into the numbers, much of what users see on social media depends on complex algorithms that are regularly tweaked with little — if any — oversight.

The TikTok posts were collected separately via Junkipedia, a repository of social media content managed by the National Conference on Citizenship, a nonprofit organization. They represent the most viewed partisan posts over each time period.

Oct. 7 – Oct. 27: Pro-Palestinian content dominates

Oct. 27 – Dec. 15: Pro-Israel material takes the lead

Dec. 15 – Jan. 29: Both sides lose their audience

The TikTok effect

Many — especially those above the age of 30 — see the video-sharing network as fluff, mostly dance crazes and digital fads with nothing to do with politics. 

They’re mistaken. 

Edelson said that TikTok was similar to other social media giants in that its algorithms were designed to promote what is popular. The reasoning: to serve up what people want to see so they stick around as long as possible.

That’s OK when it’s viral videos of dogs or cute babies. It’s something completely different when it’s highly charged political content about a geopolitical hotspot where people are dying every day. Such events leave social networks like TikTok and their automated curation models in the unenviable position of determining what is popular — at the risk of crowding out minority opinions.

“When it comes to politics, like anything else, the discourse of social media prioritizes the majority,” added Edelson. “We should think very seriously about what that means.”

This article is part of a series, Bots and ballots: How artificial intelligence is reshaping elections worldwide, presented by Luminate. The article is produced with full editorial independence by POLITICO reporters and editors. Learn more about editorial content presented by outside advertisers.
