The European Union has opened a formal investigation into whether Snapchat has breached Digital Services Act (DSA) regulations regarding the safeguarding of children using its app.
Regulators say that the company, whose audience demographic has always skewed young, may not be doing enough to protect minors from grooming and "recruitment for criminal purposes." The EU is also looking into whether Snapchat’s younger users are too easily accessing information on how to buy illegal drugs and age-restricted products.
Brussels argues that while Snapchat requires users to be at least 13 years old to sign up for an account, its self-declared age assurance system may not be an adequate means of keeping those under the minimum age off the platform. The European Commission also says the current measures fail to assess whether users are younger than 17, which it says is necessary to provide an "age-appropriate experience." It further alleges that adults are able to exploit the current system to lie about their own age and impersonate minors.
Investigators believe that the app itself doesn’t allow users to report accounts they suspect belong to people younger than the minimum age requirement. Moreover, they argue that reporting illegal content found on the app is not easy enough, and that Snapchat may not be informing its users about "possibilities for redress."
Other issues being looked at by the European Commission include child and teen accounts being recommended to other users by Snapchat’s Find Friends feature and insufficient guidance on available account safety features.
The investigators are now in the process of gathering evidence, sending out interview invitations and requesting information from Snap. The Commission says the investigation is based on analysis of the last three years of risk assessment reports filed by Snapchat, as well as an information request it sent on October 10, 2025.
"The safety and wellbeing of all Snapchatters is a top priority, and our teams have worked for years to raise the bar on safety," a Snapchat spokesperson said in a statement to Engadget. "Snapchat is designed to help people communicate with close friends and family in a positive, trusted environment, with privacy and safety built in from the start - including additional protections for teens. As online risks evolve, we continuously review, strengthen, and invest in these safeguards."
The company added that it has acted proactively and transparently in its efforts to meet the DSA’s requirements, and said it would fully cooperate with the Commission throughout its investigation.
Snap is one of a number of social media companies currently facing increased scrutiny over the safety of minors using their platforms. In 2023, the company added new features designed to make it harder for teenagers to connect with strangers. One of these measures involved increasing the number of mutual friends users must have before appearing in search and suggested accounts.
Along with TikTok, the company recently settled a lawsuit that accused its platform of fostering social media addiction. The case was brought by a 20-year-old woman who said she’d been harmed as a child by addictive features on Meta, YouTube, TikTok and Snap. This week, a jury ruled against Meta and YouTube in the trial, with the companies ordered to pay the woman, identified as K.G.M. in official documents, $6 million in damages.