BRUSSELS — In a landmark move that could reshape the future of social media, the European Commission has officially charged TikTok with breaching the Digital Services Act (DSA) over its “addictive design” features. The preliminary findings, released on Friday, accuse the video-sharing giant of engineering its platform to exploit psychological vulnerabilities, trapping users—particularly children—in a cycle of compulsive use that fuels a growing youth mental health crisis.

The Commission’s investigation, which began two years ago, concludes that TikTok’s core interface features, including infinite scroll and autoplay, effectively shift users’ brains into “autopilot mode.” By delivering a constant stream of dopamine-inducing short videos, the regulator argues, TikTok undermines user self-control and poses severe risks to the mental and physical well-being of minors. If the charges hold, TikTok’s parent company, ByteDance, could face fines of up to 6% of its global annual turnover.

The ‘Autopilot’ Trap: How TikTok Hooks Users

At the heart of the EU’s charges is the concept of addictive app design. The Commission’s findings detail how TikTok’s “highly personalized” recommender system works in tandem with interface choices to maximize time on device. Unlike traditional media, which has natural endpoints, TikTok’s feed is endless.

Regulators specifically highlighted the “infinite scroll” mechanism, which removes natural stopping points, and the “autoplay” feature, which eliminates the user’s deliberate decision to start the next video. According to the Commission, these features are not neutral design choices but calculated mechanisms that “fuel the urge to keep scrolling.”

“Scientific research shows that this may lead to compulsive behavior and reduce users’ self-control,” the Commission stated in its preliminary view. It argues that TikTok failed to conduct adequate risk assessments of these addictive algorithms, effectively ignoring the potential for behavioral addiction among its massive user base of young people.

A Crisis of Compulsion: The Toll on Youth Mental Health

The implications of these design choices extend far beyond wasted time. The investigation links TikTok’s design to tangible mental health harms, including anxiety, depression, and severe sleep deprivation. By keeping minors glued to their screens late into the night, the app disrupts the sleep and circadian rhythms essential to healthy development.

The “Rabbit Hole” Effect

A critical component of the charges focuses on the so-called “rabbit hole effect.” The Commission found that TikTok’s algorithms often steer users toward increasingly extreme or harmful content based on their initial interactions. For vulnerable teenagers, this can mean a rapid descent into feeds dominated by content related to self-harm, eating disorders, or unrealistic beauty standards.

While TikTok has introduced measures to mitigate these risks, such as screen time limits and parental controls, EU regulators deemed these efforts insufficient. The Commission noted that the current time-management tools are “easy to dismiss” and lack the friction needed to interrupt compulsive use by children. The findings suggest that the platform prioritizes engagement metrics over the safety of its most vulnerable users.

The DSA Strikes: Legal Threats and Massive Fines

This action represents one of the most significant enforcement moves under the EU’s Digital Services Act to date. The DSA, a sweeping set of regulations designed to tame Big Tech, places a heavy burden on “Very Large Online Platforms” (VLOPs) to prove they are mitigating systemic risks.

Henna Virkkunen, the Commission’s Executive Vice-President for Tech Sovereignty, emphasized the gravity of the situation. “Social media addiction can have detrimental effects on the developing minds of children and teens,” she told reporters. “The Digital Services Act makes platforms responsible for the effects they can have on their users.”

The stakes for TikTok are enormous. A non-compliance decision could result in a fine running into the billions of dollars. Furthermore, the EU could order TikTok to fundamentally redesign its app for the European market, potentially forcing the removal of infinite scroll or a restructuring of its algorithmic feed.

TikTok’s Defiant Stance: “Categorically False”

TikTok has responded to the charges with swift defiance. In a statement issued shortly after the Commission’s announcement, a spokesperson described the findings as a “categorically false and entirely meritless depiction” of the platform.

“We will take whatever steps are necessary to challenge these findings through every means available to us,” the company stated. TikTok argues that it has invested heavily in robust safety features and that its supervision tools are among the best in the industry. It points to features such as the 60-minute daily screen time prompt for users under 18 as evidence of its commitment to digital well-being.

However, the battle lines are now drawn. TikTok has a window to respond in writing to the preliminary findings. If its defense fails to sway regulators, the Commission will move toward a final non-compliance decision. As the debate over social media addiction intensifies globally, this case could set a precedent that forces a worldwide reckoning for the attention economy.