Inside TikTok’s Plan to Tighten Age Checks With Technology in Europe
In Focus
- TikTok plans to introduce an age-verification system in Europe
- The technology analyzes user profiles and behavioral signals to identify underage users
- Specialist moderators will review accounts flagged by the system
TikTok plans to deploy an age-detection tool in Europe in the coming weeks. According to Reuters, the move comes as regulatory pressure mounts over how the short-video platform identifies and blocks accounts belonging to users under 13.
Technology Specially Designed for Europe
According to the company, TikTok’s age-verification technology was developed specifically for Europe to help it comply with the region’s regulations. The Chinese-owned platform is rolling out the age-detection system across Europe after piloting it in the region for a year.
The system analyzes user profile details, behavioral signals, and shared videos to estimate whether an account belongs to an underage user. But the company does not automatically ban flagged accounts. Instead, TikTok said specialist moderators review them first.
TikTok’s under-13 age checks come to Europe as authorities in the region scrutinize how social media platforms verify users’ ages under data protection rules. European regulators have previously expressed concern that current approaches are either overly invasive or ineffective.
The ByteDance-owned platform collaborated with Ireland’s Data Protection Commission to develop the age-verification system. TikTok’s European rollout comes as YouTube updates its parental controls to improve teen safety online; the updated controls allow parents to limit how much time children spend watching content on the platform.
Global Concern Over Teen Online Safety
Social media safety for children on TikTok and other platforms has become a global concern since Australia imposed the first ban on users under 16 last year. Although the effectiveness of that ban has been questioned, many countries appear to be following the same path to keep teens safe online.
Last year, TikTok was among the social media platforms that committed to complying with Australia’s social media ban despite their initial opposition. The platforms had argued that the ban would push teens toward poorly monitored, more dangerous corners of the internet.
Recently, Meta blocked over 500,000 under-16 accounts even as it urged the Australian government to reconsider the ban. The social media giant blocked 330,000 Instagram accounts and 173,000 Facebook accounts belonging to teens between December 4 and 11, 2025.
EU Countries Prioritize Social Media Safety
TikTok’s push to enforce child safety age limits comes weeks after the European Parliament called for barring children under 15 from social media unless parents explicitly approve access.
The call reflects a growing push to protect children amid rising social media use. Several European countries are considering banning children from social media platforms; in Denmark, the government is exploring a ban for users under 15.
Other countries, such as France, are looking to introduce laws requiring age verification for access to adult content on social platforms. In the U.K., the Online Safety Act 2023 requires tech firms to apply age checks to keep underage users safe.
