The provisional ruling represents the most recent violation of DSA regulations by Meta and could lead to another substantial penalty.


Meta is violating Europe’s Digital Services Act (DSA) by failing to prevent children under 13 from accessing Facebook and Instagram, according to a preliminary ruling by the European Commission.
The Commission announced the decision on Wednesday following a nearly two-year investigation, stating that Meta lacks adequate measures to prevent children under 13 from using its services or to identify and remove those already on its platforms. For example, minors can simply enter a fake birth date when registering for Facebook and Instagram to falsely claim they are over 13 (the minimum age under Meta’s own policies), with no effective system in place to verify their real age.
“Meta’s stated general conditions indicate their services are not aimed at minors under 13,” the EU’s tech chief, Henna Virkkunen, said in a statement. “However, our initial findings indicate that Instagram and Facebook are doing very little to restrict access for children below this age.”
The existing tools on Facebook and Instagram for reporting users under 13 are also “hard to navigate and ineffective,” according to the Commission, which found that even when an underage user is reported, there is frequently no follow-up to remove the child from the platform. These failures put Meta at odds with the DSA, which requires it to “actively identify and manage the risks” associated with under-13s using its platforms.
The EU’s statement characterizes Meta’s own risk evaluation for safeguarding minors from unsuitable experiences as “unfinished and capricious.” The Commission contends it conflicts with “extensive evidence from across the European Union” suggesting that 10-12 percent of children under 13 are using Facebook and/or Instagram.
“Additionally, Meta appears to have ignored readily accessible scientific research indicating that younger children are more susceptible to potential dangers associated with services such as Facebook and Instagram,” the Commission stated. A separate investigation into concerns that Facebook and Instagram may cause “behavioral dependencies in children” was opened at the same time as the age-verification inquiry and remains ongoing.
Meta now has the chance to address the violations, with the Commission urging it to revise its risk assessment processes and implement more effective age verification on Instagram and Facebook. Should Meta fail to comply and receive a non-compliance ruling, it faces fines of up to six percent of its global annual revenue. Based on the $201 billion in revenue Meta reported for 2025, that could amount to roughly $12 billion.
Meta has expressed disagreement with the EU’s preliminary conclusions in a statement to The Guardian:
“We are clear that Instagram and Facebook are designed for individuals aged 13 and older, and we have measures in place to identify and eliminate accounts belonging to anyone under that age. We continually invest in technologies to identify and remove underage users and will provide more information next week regarding additional measures that will be implemented soon.”