Meta CEO Mark Zuckerberg and other top brass at the social media giant were well aware that millions of users on Instagram are underage, according to newly released claims in a lawsuit filed by dozens of states over the company’s alleged failure to protect teens from harm.
The details were included in an unredacted complaint filed late last week by 33 states including New York. Originally filed in October, the suit alleges that the company behind Facebook, Instagram and WhatsApp “ignored the sweeping damage” it caused to young users while implementing addictive features designed to keep them hooked on the apps.
“Within the company, Meta’s actual knowledge that millions of Instagram users are under the age of 13 is an open secret that is routinely documented, rigorously analyzed and confirmed, and zealously protected from disclosure to the public,” the lawsuit alleges.
The Instagram parent received “over 1.1 million reports of under-13 users on Instagram” through in-app reporting systems from the first quarter of 2019 through the second quarter of 2023 alone, the complaint alleged.
However, the company “disabled only a fraction of those accounts” and “routinely continued to collect children’s data without parental consent,” the suit alleges. Meta’s actions allegedly violated the Children’s Online Privacy Protection Act, a federal law barring the collection of personal data from users under 13 without parental consent.
The unredacted version of the lawsuit also shed new light on Meta’s internal response to whistleblower allegations by former employee Frances Haugen, who famously testified on Capitol Hill in 2021 that Zuckerberg’s firm prioritized profits over user safety despite internal research showing its platforms were causing harm.
Haugen’s allegations were first reported in a sweeping investigation by the Wall Street Journal in 2021, which revealed in part that Meta executives knew Instagram was particularly toxic for teen girls – some of whom developed body image issues, anxiety or even suicidal thoughts.
The suit cites messages from Meta spokesperson Stephanie Otway to Instagram chief Adam Mosseri in which she noted that the Journal’s “arguments [are] based on our own research so [they] are difficult to rebut” and that she was “mostly worried about the fallout from the article . . . [and] that our own research confirmed what everyone has long suspected[.]”
In an internal chat from November 2021, Mosseri reportedly wrote, “tweens want access to Instagram, and they lie about their age to get it now.” The state attorneys general argue that is one of many signs that Meta employees knew about its underage user problem.
A month later, however, Mosseri declared to Senate members during a highly publicized hearing that children under 13 were “not permitted on Instagram.”
The lawsuit claims that company research from March 2021 revealed Instagram’s recommendation algorithm was “shown to recommend content related to eating disorders when it received indications that the user had engaged with content relating to eating disorders in the past.”
Meta employees created test accounts in which they followed profiles with names suggestive of eating disorders. Soon after, the algorithm suggested “accounts related to anorexia, such as @milkyskinandbones, @skinny._.binge, @_skinandbones__, and @applecoreanorexic.”
The lawsuit notes that Meta executives briefed Zuckerberg “as early as 2017” that targeting children under the age of 13 as potential users would boost “the rate of acquisition when users turned 13.”
The complaint cited an internal Meta document from 2018 in which the company admitted that it does “very little to keep U13s off our platform.”
The Post has reached out to Meta for comment on the newly unredacted complaint.
The states seek unspecified financial damages, as well as “injunctive relief” blocking Meta from engaging in the harmful business practices outlined in the suit.
Meta previously said it was “disappointed” that the states opted to file suit instead of working with the company to address concerns. The company said it blocks users under 13 from using Instagram and does not knowingly collect data from anyone under 13.
As The Post previously reported, the states also allege that Instagram and Facebook bombard young users with notifications that disrupt their sleep and distract them during school.
The notifications include so-called “haptic alerts” such as phone vibrations and pulses, as well as sound or banner notifications, emails and “badge notifications” that display a red indicator showing unread messages.
Meta is one of several social media companies to face pressure from state and federal officials over their purported failure to police content served to young users.
Instagram’s chief rival, Chinese-owned TikTok, recently faced scrutiny after users began praising Osama bin Laden in the wake of Hamas’s attack on Israel.