With the presidential election just weeks away, TikTok is facing scrutiny for inconsistently enforcing its ban on political advertising, according to a new report by the nonprofit Global Witness.
In September, researchers tested the political ad moderation systems of TikTok, Facebook, and YouTube by submitting misleading ads that violated all three platforms’ policies. These ads included false claims about voting requirements and calls to action that echoed the January 6 Capitol attack. The goal was to see how well each platform could detect and reject such disinformation.
While Facebook and YouTube managed to filter out most of the problematic ads, TikTok's performance raised eyebrows. Despite having banned political ads since 2019, the platform approved four of the eight misleading submissions. Notably, TikTok rejected ads that named specific candidates but allowed others promoting voter suppression to slip through.
Ava Lee, Global Witness’s campaign lead for digital threats, expressed surprise at TikTok’s lax moderation. “The ads that got through were quite concerning,” she noted, emphasizing that disinformation rarely presents itself so clearly in real life.
In response, TikTok spokesperson Ben Rathe acknowledged that four ads were incorrectly approved but emphasized that they did not run on the platform. He reaffirmed the company’s commitment to enforcing its political ad ban.
This issue is not new for TikTok. A similar study before the 2022 U.S. midterm elections found that the platform approved 90% of political disinformation ads. Furthermore, misleading ads were recently identified in Ireland ahead of European Union elections.
By comparison, Facebook approved one misleading ad in the recent test, while YouTube rejected all of the researchers' submissions and paused their account pending identity verification.
Experts caution against treating these results as definitive assessments of the platforms' overall capabilities. Katie Harbath, CEO of Anchor Change and a former Facebook public policy director, said that while the report highlights TikTok's challenges in moderating political content, it does not fully capture how effectively the platforms enforce their content policies.
“This situation underscores the ongoing gap between tech companies’ policies and their actual enforcement,” Harbath said. As the election approaches, the spotlight on social media’s role in political discourse intensifies, raising questions about accountability and the spread of misinformation.