The European Commission is considering stricter regulations to enhance children’s safety on social media platforms. During a press briefing in Brussels, spokesperson Thomas Regnier emphasized the need for potential age restrictions to adapt to evolving online environments and protect young users.
Regnier highlighted the Digital Services Act, a key piece of legislation introduced by the EU, as essential for safeguarding children online. However, he acknowledged the rapid changes in social media and the possibility of further action. “If we need to go a step further to protect our kids, then why not?” he stated, indicating a willingness to explore additional measures.
Commission President’s Vision for Digital Safety
Regnier’s comments followed recent statements by Commission President Ursula von der Leyen at a conference in New York. She advocated establishing a legal framework for a “digital majority age” that would set a minimum age for access to online services, particularly platforms popular among children.
Von der Leyen plans to convene a panel of experts by year-end to explore actionable strategies for this initiative.
Concerns over Popular Platforms
Regnier noted the growing influence of certain platforms, specifically mentioning that in Germany, about half of children aged 6 to 13 are using TikTok.
The figure is striking because these children are younger than the minimum age set out in the platform’s own terms of service. That gap between actual user demographics and the stated rules has fueled discussion about the need for stricter regulation.
In response, Paolo Ganino, head of policy communications for TikTok in Europe, defended the platform’s practices.
He stated that TikTok removes around 6 million suspected underage accounts each month as part of a broader trust and safety program aimed at protecting teenagers and families.
Balancing Safety and Freedom
As the EU navigates these complex issues, various stakeholders, including the U.S. government and tech firms, have expressed concerns about the potential implications of the EU’s content regulations for civil liberties and free speech. Regnier clarified the Commission’s position, stating, “We will not, as a public institution, decide what social media our citizens can use. This is not our role.” The focus, he said, is on holding companies accountable for following rules that ensure safe online interactions for users.
Age Restrictions and Future Directions
While the Digital Services Act provides a framework for content moderation, Regnier made clear that it does not offer a legal basis for setting a minimum age for social media use. Nonetheless, a growing number of European governments are pushing for stricter enforcement of age limits on online platforms, reflecting a broader appetite for both national and collective EU-level action to protect minors.
How these initiatives will unfold remains to be seen, but the European Commission is preparing to take a proactive approach to creating a safer online environment for children, one that could set a precedent for global standards in digital safety.