
Who Has the Power
The European Union accused Meta on Wednesday of failing to stop underage users from accessing Facebook and Instagram, a charge brought under the bloc’s tough digital rules that require social media sites to protect minors. The EU’s executive branch said Meta Platforms lacked effective measures to prevent children younger than 13 from signing up, and that it was not doing enough to identify and remove such children after they had opened accounts. Meta’s own minimum age to open an account on Facebook or Instagram is 13.
The European Commission said Meta is also inadequately assessing the risk of children younger than 13 being exposed to age-inappropriate experiences on the platforms. The accusation lands squarely on one of the biggest corporate gatekeepers of online life, with Brussels now using its regulatory apparatus to force the company to answer for what it allows on its platforms.
What Meta Says, What the Commission Says
Meta disagreed with the decision, saying that it has measures in place to detect and remove accounts belonging to anyone younger than 13. “Understanding age is an industry-wide challenge, which requires an industry-wide solution, and we will continue to engage constructively with the European Commission on this important issue,” the company said, adding that it would have more to share next week about additional measures it plans to roll out.
That is the familiar language of managed reform: promises of future measures, constructive engagement, and the suggestion that the problem is shared across the industry rather than rooted in the platform’s own design and enforcement failures. Meanwhile, the EU says the company is not doing enough now.
Brussels is targeting Meta with the Digital Services Act, a sweeping set of regulations that requires tech companies operating in the 27-nation bloc to do more to clean up online platforms and protect internet users. Meta now has the chance to respond to the preliminary findings before the commission issues its final decision. Violations can result in hefty fines of up to 6% of a company’s worldwide annual revenue.
Children at the Bottom, Fines at the Top
Henna Virkkunen, an executive vice president at the European Commission, said the bloc’s investigation launched in 2024 found that Instagram and Facebook “are doing very little” to prevent children from getting access despite their own terms and conditions indicating “their services are not intended for minors under 13.” Virkkunen said, “The DSA requires platforms to enforce their own rules: terms and conditions should not be mere written statements, but rather the basis for concrete action to protect users – including children.”
The commission’s case is built on the gap between what Meta says on paper and what it actually does in practice. Meta’s own minimum age is 13, yet the EU says the company lacks effective measures to stop children younger than 13 from signing up and is not doing enough to identify and remove them after accounts are opened. The platform’s written rules, in other words, are only as real as the enforcement behind them.
The investigation was opened in 2024, and the company now faces preliminary findings under the Digital Services Act. The commission has not yet issued its final decision, but the threat hanging over Meta is clear: fines that can reach up to 6% of worldwide annual revenue. That is the language of the apparatus when it decides a platform has failed to police its own gates.
For now, the children the rules are supposed to protect remain the ones caught in the middle, while Meta and the European Commission trade statements about compliance, age detection, and future measures. The company says it will have more to share next week. The commission says the platform is doing very little. The system, as usual, keeps the paperwork moving while the people it claims to protect stay exposed.