The European Union will conduct an investigation into whether TikTok, owned by ByteDance, violated online content regulations designed to safeguard children and ensure transparent advertising, an official announced on Monday.
EU industry chief Thierry Breton made the decision following a review of TikTok’s risk assessment report and its replies to requests for information, confirming an earlier report by Reuters. The move puts the social media platform at risk of substantial fines.
“Today we open an investigation into TikTok over suspected breach of transparency & obligations to protect minors: addictive design & screen time limits, rabbit hole effect, age verification, default privacy settings,” Breton said on X.
The European Union’s Digital Services Act (DSA), which came into effect on February 17, mandates stringent measures for all online platforms, especially large ones like TikTok, to combat illegal online content and ensure public safety.
ByteDance, TikTok’s China-based parent company, could face fines of up to 6% of its global turnover if TikTok is found to have violated DSA regulations.
TikTok affirmed its commitment to collaborating with experts and the industry to ensure the safety of young users on its platform. The company expressed readiness to provide a detailed explanation of its efforts to the European Commission.
“TikTok has pioneered features and settings to protect teens and keep under 13s off the platform, issues the whole industry is grappling with,” a TikTok spokesperson said.
The European Commission said the investigation will focus on the design of TikTok’s system, including algorithmic systems which may stimulate behavioural addictions and/or create so-called ‘rabbit hole effects’.
The investigation will also examine whether TikTok has implemented adequate and proportionate measures to safeguard the privacy, safety, and security of minors. Additionally, the Commission is scrutinizing TikTok’s provision of a dependable database on advertisements to enable researchers to analyze potential online risks.
This is the second formal inquiry opened under the Digital Services Act (DSA), after Elon Musk’s social media platform, X, came under EU scrutiny in December last year.