Technology

EU Pushes to Make Social Media Safer and Less Addictive for Children

Europe's regulators are targeting the mechanics of addiction itself, not just content, after finding TikTok's infinite scroll and algorithmic design breach EU law.

By Marcus Williams · 3 min read

The European Commission's February 6, 2026 preliminary finding that TikTok violated the Digital Services Act did not cite illegal content or data theft. It cited the way the platform was designed to keep users scrolling: infinite scroll, algorithmic amplification of emotionally charged content, push notifications engineered to interrupt, and the deliberate absence of stopping cues. That distinction matters enormously. European regulators are no longer asking platforms to add parental controls on top of addictive architecture. They are demanding platforms tear out the architecture itself.

The Commission's investigation into TikTok, which began in February 2024, focuses on the "rabbit hole effect" of the platform's recommendation systems, the risk of minors misrepresenting their age, and the platform's obligations to ensure a high level of privacy, safety, and security for minors. If the preliminary findings are confirmed, TikTok could face a fine of up to 6% of its global annual turnover. TikTok's parent company ByteDance reportedly targeted revenues as high as $186 billion last year, making the financial exposure significant. TikTok rejected the preliminary findings outright, asserting that "the Commission's preliminary findings present a categorically false and entirely meritless depiction of our platform."
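As a back-of-the-envelope illustration (assuming the reported $186 billion revenue figure holds), the 6% cap implies exposure on the order of $11 billion:

```python
# Illustrative only: TikTok's maximum DSA fine exposure, assuming
# ByteDance's reported revenue target of $186 billion for last year.
reported_revenue_usd = 186e9   # ByteDance's reported revenue target
dsa_fine_cap_rate = 0.06       # DSA ceiling: 6% of global annual turnover

max_fine_usd = reported_revenue_usd * dsa_fine_cap_rate
print(f"Maximum fine: ${max_fine_usd / 1e9:.2f} billion")  # → $11.16 billion
```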

The enforcement action against TikTok is the sharpest edge of a broader regulatory push. In November 2025, MEPs adopted a non-legislative report by 483 votes in favour, 92 against, and 86 abstentions, calling for an EU-wide minimum age of 16 for social media access and bans on the most harmful addictive practices. The specific features MEPs want disabled by default for users under 18 include infinite scrolling, autoplay, pull-to-refresh, reward loops, and harmful gamification. Rapporteur Christel Schaldemose framed the dual rationale plainly: a higher access bar at 16, paired with stronger structural safeguards for those who do get in.

The legislative machinery to enforce that vision is still being assembled. The long-discussed Digital Fairness Act failed to materialize in 2025 and is now scheduled for Q4 2026. When it arrives, it is expected to address dark patterns, manipulative design, loot boxes, and engagement-based advertising targeting children. In the interim, the Commission has leaned on the DSA's existing Article 28 framework: guidelines published in July 2025 tackle addictive design by recommending that features such as read receipts, which encourage excessive use, be disabled by default, and address cyberbullying by ensuring minors can block and mute any user.

Age verification sits at the center of the access debate, and Brussels acknowledges it remains technically and legally unresolved. The Commission is developing an EU-wide age verification system as an interim measure until the EU Digital Identity Wallet launches in 2026, allowing users to confirm they are over 18 without transferring personal data. Without a workable verification layer, any minimum age requirement risks becoming unenforceable.

AI-generated illustration

The platforms have already moved on advertising. Snapchat, TikTok, and Meta's Instagram and Facebook no longer allow advertisers to show targeted ads to underage users in the EU. But MEPs and advocates argue that advertising restrictions alone miss the deeper problem: recommendation engines that are structurally rewarded for maximizing engagement, regardless of the user's age or mental state.

The implications extend well beyond Europe. If the Commission requires fundamental design changes such as disabling infinite scroll or restructuring the recommendation algorithm, these would likely affect TikTok's product globally. That is precisely the leverage Brussels is betting on: that forcing a product redesign inside a market of 450 million people makes a separate global product economically irrational for any platform to maintain. A 2025 Eurobarometer survey found that more than 90% of Europeans believe there is an urgent need to take action to protect children online, particularly regarding the mental health impact of social media.

Whether the regulation can outrun the design is the central question. The Commission can fine, suspend, and compel behavioral changes under the DSA. What it cannot yet guarantee is that the next generation of engagement mechanics, already being developed in the same engineering culture that produced infinite scroll, won't simply route around whatever the current rules require.
