
Cyberspace: Why TikTok deleted over 3.6m videos by Nigerian content creators – Report

*ByteDance’s popular video-sharing application TikTok has explained its removal of over 3.6 million videos from Nigerian content creators during Q1 2025, citing artificial boosting of popularity and uploading of harmful content, among other regulatory infractions

Isola Moses

ByteDance’s popular video-sharing application TikTok announced the elimination of over 3.6 million videos from Nigerian content creators during First Quarter (Q1) 2025 due to community guidelines violations.

The company said most of the affected content creators in the country had created fake accounts, uploaded harmful content, or engaged in artificial engagement to boost popularity on the global platform.

TikTok’s Q1 2025 Community Guidelines Enforcement Report indicated this represented a significant 50 percent surge compared to the preceding quarter, when 2.4 million videos from Nigeria faced similar removal actions.

The data also highlights the platform’s commitment to maintaining a secure, respectful, and reliable online space.

TikTok achieved a 98.4 percent proactive detection rate, meaning content was eliminated before user reports, with 92.1 percent of violating videos removed within a 24-hour timeframe.

The report further noted that during March 2025, TikTok also eliminated 129 accounts across West Africa in connection with “covert influence operations”.

Worldwide, the platform removed over 211 million videos in Q1 2025, representing an increase from 153 million in the previous quarter, with automation handling more than 184 million removals.

The platform achieved a 99 percent global proactive detection rate, showcasing enhanced capabilities in swiftly identifying and eliminating harmful material.

Regarding spam and artificial engagement, TikTok eliminated 44.7 million comments from fraudulent accounts between January and March 2025.

On fake ‘likes’, other infractions

The platform also removed 4.3 billion fake likes during the same period.

These eliminated likes, followers, and follow requests originated from automated or inauthentic mechanisms, according to TikTok.

The company stated: “We remain vigilant in our efforts to detect external threats and safeguard the platform from fake accounts and engagement.

“These threats persistently probe and attack our systems, leading to occasional fluctuations in the reported metrics within these areas.

“Despite this, we are steadfast in our commitment to promptly identify and remove any accounts, content, or activities that seek to artificially boost popularity on our platform.”

Although TikTok LIVE facilitates real-time connections between creators and audiences for community building, the platform has strengthened its LIVE Monetisation Guidelines, providing clearer guidance on content ineligible for monetisation, it said.

TikTok noted that LIVE content moderation has remained a primary focus. During Q1 2025, the company disclosed it banned 42,196 LIVE rooms and interrupted 48,156 streams in Nigeria for community guidelines violations.

Despite these extensive interventions, harmful content constitutes less than 1 percent of all material uploaded to TikTok globally, the report said.

Despite its enforcement efforts, TikTok continues to face scrutiny from various countries regarding platform usage, content, and security concerns, especially in the United States (US).

Notably, in October 2024, 13 US states plus the District of Columbia (DC) initiated legal action against the social media company, alleging inadequate protection of young users from harm.

These separate lawsuits, filed in New York, California, Washington D.C., and eleven additional states, claim TikTok’s platform deliberately creates addictive experiences, exploiting children’s vulnerabilities for profit maximisation.

Besides, the mounting legal challenges escalate TikTok’s regulatory battles with the US authorities, with plaintiffs demanding financial penalties and enhanced accountability for the Chinese-owned corporation.

On child mental health and content moderation, US state officials argued that TikTok’s software intentionally maintains prolonged user engagement, especially among children, raising mental health concerns and questioning the effectiveness of the platform’s content moderation.
