
The European Union has issued preliminary charges against TikTok for breaching online content regulations, focusing on the app’s “addictive design” features that could harm users’ wellbeing, especially children.
The European Commission announced the findings on February 6, 2026, after a year-long investigation under the Digital Services Act (DSA).
This landmark EU law requires large platforms to assess and mitigate risks arising from harmful content and design practices.
Key Allegations
TikTok’s features, including infinite scroll, autoplay videos, push notifications, and a highly personalized recommender system, were cited as problematic.
These elements constantly reward users with new content, fueling endless scrolling and shifting users into “autopilot mode.”
This can lead to compulsive behavior and reduced self-control, the Commission alleged.
The platform reportedly failed to properly assess risks to physical and mental health, particularly for minors and vulnerable adults.
It allegedly ignored signs of excessive use, such as late-night activity by children or frequent app openings, and lacked effective mitigations such as robust screen-time tools or parental controls.
Potential Consequences
TikTok may need to fundamentally change its app design in Europe to comply.
Failure could result in a fine of up to 6% of parent company ByteDance’s global annual turnover—a potentially massive penalty given the company’s scale.
The Commission emphasized protecting young users; EU tech chief Henna Virkkunen said TikTok is expected to redesign its service to make it safer, especially for minors.
TikTok strongly rejected the claims. A spokesperson called the findings “categorically false and entirely meritless,” vowing to challenge them through all available means.
This follows prior DSA scrutiny, including a 2025 settlement over advertising transparency.
The case highlights growing EU pressure on Big Tech to prioritize user safety over engagement-driven models.