The European Commission has preliminarily found TikTok in violation of the Digital Services Act over addictive design features that can harm users' mental health, an assessment that, if confirmed, could result in a fine of up to 6% of the company's global turnover.
The European Commission has launched a formal investigation into TikTok's addictive design features, finding that the social media platform's infinite scroll, autoplay, push notifications, and personalized recommendation systems violate the EU's Digital Services Act (DSA). According to preliminary findings released today, TikTok has failed to adequately assess how these features could harm users' physical and mental well-being, particularly affecting minors and vulnerable adults.
The commission's investigation revealed that TikTok's design actively fuels users' urge to keep scrolling by shifting their brains into "autopilot mode" through constant rewards of new content. This design approach potentially reduces self-control and leads to compulsive behavior patterns among users. The platform has also disregarded important indicators of compulsive use, including the time minors spend on the app during nighttime hours and how frequently users open the application throughout the day.
If these findings are confirmed, TikTok could face a substantial fine of up to 6% of its global annual turnover. The commission has outlined specific requirements for TikTok to avoid penalties, including implementing mandatory screen time breaks, adapting its recommendation system to reduce addictive patterns, and disabling key addictive features that drive compulsive usage.
EU tech commissioner Henna Virkkunen emphasized the seriousness of the situation, stating: "Social media addiction can have detrimental effects on the developing minds of children and teens. The Digital Services Act makes platforms responsible for the effects they can have on their users. In Europe, we enforce our legislation to protect our children and our citizens online."
While TikTok has implemented some mitigation measures, such as parental controls and screen-time management tools, the commission found these efforts likely ineffective. The tools are described as easy to dismiss and as requiring parents to enable them manually, creating barriers to meaningful protection for young users.
This investigation is part of a broader regulatory crackdown on TikTok's practices in Europe. In November, French prosecutors opened a criminal investigation into the platform, accusing it of failing to safeguard the mental health of children. The Irish Data Protection Commission has already imposed significant penalties on TikTok for other violations, including a €530 million fine in May 2025 for illegally transferring personal data of European Economic Area users to China, violating GDPR regulations.
Two years prior, the Irish watchdog fined TikTok €345 million for violating children's privacy by processing their data and employing "dark patterns" during registration and video posting. These repeated violations demonstrate a pattern of regulatory non-compliance that has drawn increasing scrutiny from European authorities.
The investigation highlights the growing concern among regulators about the impact of social media design on mental health, particularly for younger users. The DSA represents a significant shift in how platforms are held accountable for the psychological effects of their design choices, moving beyond traditional data protection concerns to address the broader societal impact of addictive technology.
TikTok's response to these findings will be crucial in determining whether it faces the maximum fine or can implement meaningful changes to its platform design. The company has previously defended its practices, arguing that it provides tools for users to manage their time on the platform and that its recommendation system is designed to show relevant content rather than create addiction.
This case could set an important precedent for how social media platforms are regulated in Europe, potentially influencing design practices globally as companies seek to comply with the DSA's requirements. The outcome could reshape how platforms approach user engagement, balancing business objectives against user well-being and regulatory compliance.

The investigation also comes amid broader concerns about TikTok's data practices and its relationship with the Chinese government. European regulators have been increasingly skeptical of the platform's data handling and its potential use for surveillance or influence operations. The combination of addictive design concerns and data protection violations creates a comprehensive regulatory challenge for TikTok in the European market.
As the investigation progresses, other social media platforms will be watching closely to understand how the DSA will be enforced and what design changes might be required to avoid similar penalties. The case represents a significant test of Europe's ability to regulate powerful tech platforms and protect users from potentially harmful design practices that prioritize engagement over well-being.