TikTok will soon restrict users under 18 from using beauty filters that alter their facial features, in response to growing concerns over the impact of these filters on teenage mental health and self-esteem. The new policy, set to roll out globally in the coming weeks, will block filters like “Bold Glamour,” which smooths skin, plumps lips, and reshapes eyes—effects that are hard to distinguish from reality. However, filters for comedic purposes, such as those adding animal ears or exaggerated features, will still be accessible to teens.
This move follows a report from Internet Matters that highlighted how beauty filters contribute to a distorted view of reality by normalizing idealized images.
Many teens, particularly girls, reported feeling pressure to match these altered appearances and said they came to see their unfiltered faces as “ugly” after prolonged use. Dr. Nikki Soo from TikTok confirmed the changes, emphasizing the platform’s aim to reduce social pressure on young users and promote healthier online habits.
To enforce the restrictions, TikTok plans to introduce automated age verification systems using machine learning to detect users misrepresenting their age, as part of a broader effort to remove underage accounts. TikTok also deletes around 20 million accounts quarterly for violating age policies. While TikTok acknowledges the challenges, it remains committed to a “safety-first” approach, allowing users to appeal bans if they believe they’ve been blocked in error.
This policy shift aligns with upcoming regulations in the UK under the Online Safety Act, which will require more stringent age checks. The NSPCC has welcomed TikTok’s move but stressed the need for more action across other platforms. TikTok also plans to enhance age verification for under-13 users, a group that has historically been hard to monitor.
Other platforms are following suit, with Roblox restricting younger users from accessing violent content, and Instagram introducing “teen accounts” that allow parents to monitor activity. The Molly Rose Foundation’s Andy Burrows pointed out that these changes are largely in response to new EU and UK regulations, underscoring the need for stronger online protection laws. As 2025 regulations near, TikTok’s efforts reflect a broader shift in social media toward prioritizing user safety.