The European Union has launched a formal investigation into TikTok’s practices to determine whether the social media giant is adequately protecting minors on its platform and adhering to the bloc’s Digital Services Act. This move comes amidst growing concerns over the impact of social media on children and the need for stricter regulations to ensure their safety online.
At the heart of the investigation are questions regarding the effectiveness of TikTok’s age verification tools in preventing children from accessing inappropriate content. The European Commission has raised doubts about the reasonableness, proportionality, and effectiveness of these measures, prompting a closer examination of TikTok’s functionalities, systems, and policies related to child protection.
In addition to concerns about age verification, the investigation will assess TikTok’s compliance with requirements aimed at mitigating the risk of users becoming addicted to the platform and at safeguarding minors’ privacy and safety. Transparency around advertisements on the platform and researchers’ access to platform data will also be scrutinized as part of the probe.
TikTok, which boasts almost 136 million monthly active users in the EU, has asserted that it has implemented features and settings to protect teenagers and prevent those under the age of 13 from accessing the platform. The company has expressed willingness to cooperate with the investigation and looks forward to explaining its efforts in detail to the European Commission.
The investigation falls under the framework of the Digital Services Act, which imposes stricter obligations on the largest tech companies operating in the EU. Platforms with more than 45 million monthly active users in the bloc are designated “very large online platforms” under the act and face potential fines of up to 6% of their annual global revenue for non-compliance.
This is the second formal probe launched by the European Commission against a major social media company within a short timeframe. In December, the commission opened proceedings against X, formerly known as Twitter, to determine whether it had failed to meet legal obligations to combat the spread of illegal content and disinformation.
Both TikTok and X had previously been asked to provide more information on their compliance with the Digital Services Act. The formal investigations follow those requests and underscore the EU’s commitment to ensuring that tech companies uphold their responsibilities to protect users’ rights online, particularly those of minors.
As the investigation unfolds, stakeholders will be closely watching for developments and potential outcomes that could have far-reaching implications for the regulation of social media platforms in the EU and beyond.