The fast-growing platform’s poor track record during recent voting abroad does not bode well for elections in the U.S., researchers said.
The same qualities that allow TikTok to fuel viral dance fads can also make inaccurate claims difficult to contain. Credit: Anjum Naveed/Associated Press
By Tiffany Hsu
Aug. 14, 2022
In Germany, TikTok accounts impersonated prominent political figures during the country’s last national election. In Colombia, misleading TikTok posts falsely attributed a quotation from one candidate to a cartoon villain and allowed a woman to masquerade as another candidate’s daughter. In the Philippines, TikTok videos amplified sugarcoated myths about the country’s former dictator and helped his son prevail in the country’s presidential race.
Now, similar problems have arrived in the United States.
Ahead of the midterm elections this fall, TikTok is shaping up to be a primary incubator of baseless and misleading information, in many ways as problematic as Facebook and Twitter, say researchers who track online falsehoods. The same qualities that allow TikTok to fuel viral dance fads — the platform’s enormous reach, the short length of its videos, its powerful but poorly understood recommendation algorithm — can also make inaccurate claims difficult to contain.
Baseless conspiracy theories claiming that voter fraud in November is a certainty are widely viewed on TikTok, which globally has more than a billion active users each month. Users cannot search for the #StopTheSteal hashtag, but a workaround, #StopTheSteallll, had accumulated nearly a million views before TikTok disabled it after being contacted by The New York Times. Some videos urged viewers to vote in November while citing debunked rumors raised during the congressional hearings into the Jan. 6, 2021, attack on the Capitol. TikTok posts have garnered thousands of views by claiming, without evidence, that predictions of a surge in Covid-19 infections this fall are an attempt to discourage in-person voting.
The spread of misinformation has left TikTok struggling with many of the same knotty free speech and moderation issues that Facebook and Twitter have faced, and have addressed with mixed results, for several years.
But the challenge may be even more difficult for TikTok to address. Video and audio — the bulk of what is shared on the app — can be far more difficult to moderate than text, especially when they are posted with a tongue-in-cheek tone. TikTok, which is owned by the Chinese tech giant ByteDance, also faces many doubts in Washington about whether its business decisions about data and moderation are influenced by its roots in Beijing.
“When you have extremely short videos with extremely limited text content, you just don’t have the space and time for nuanced discussions about politics,” said Kaylee Fagan, a research fellow with the Technology and Social Change Project at the Harvard Kennedy School’s Shorenstein Center.
TikTok had barely been introduced in the United States at the time of the 2018 midterm elections and was still largely considered an entertainment app for younger people during the 2020 presidential election. Today, its American user base spends an average of 82 minutes a day on the platform, three times as long as on Snapchat or Twitter and twice as long as on Instagram or Facebook, according to a recent report from the app analytics firm Sensor Tower. TikTok is becoming an increasingly important destination for political content, often produced by influencers.
The service blocked so-called deepfake content and coordinated misinformation campaigns ahead of the 2020 election, made it easier for users to report election falsehoods and partnered with 13 fact-checking organizations, including PolitiFact. Researchers like Ms. Fagan said TikTok had worked to shut down problematic search terms, though its filters remain easy to evade with creative spellings.