Meta drastically downplayed the degree to which teens were being shown alarming pictures and videos on Instagram that depicted self-harm – even as the tech giant’s own research showed the problem was far more prevalent than publicly known, according to allegations in newly revealed court papers.
The latest allegations surfaced late last week in an unredacted version of a bombshell lawsuit filed in October by 33 states, including New York – and they cite Meta’s claim in 2021 that “less than 0.05% of views were of content that violated our standards against Suicide & Self-Injury.”
Meta — whose billionaire chief executive Mark Zuckerberg faces a possible grilling on Capitol Hill early next year over Instagram and child safety — made the bold statistical claim in its “Community Standards Enforcement Report” for the third quarter of 2021.
However, a “contemporaneous” internal Meta survey that same year found that “6.7% of surveyed Instagram users had seen self-harm content within the last seven days,” according to the suit.
The number purportedly jumped to 8.4% for users aged 13 to 15.
“In other words, while a reader of the CSER Reports could reasonably understand that self-harm content on Instagram is rarely encountered by users—far less than 1% of the time—in reality, Meta knew from its user experience surveys that self-harm content is commonly encountered on Instagram,” the lawsuit claimed.
In the lawsuit, the states also describe internal dissent among Meta insiders about company research showing the potential harm that Instagram and other platforms caused for teen users.
In one case, a Meta employee reportedly likened the company’s attempts to downplay research on social harms to the tactics of the tobacco industry.
“In September 2021, other Meta employees expressed criticism and concern towards the company for emphasizing that research into social media’s impact is inconclusive, when in fact Meta had conducted research on this issue with more definitive findings,” the suit says.
“An employee stated that Meta’s portrayal of the research as inconclusive would be akin to representations of tobacco companies, which similarly relied on uncertainty in scientific studies to deny that cigarettes caused cancer,” it adds.
The newly surfaced allegations against Meta drew a sharp rebuke from Fairplay, an online children’s safety group that has been critical of Meta’s handling of youth safety issues.
Fairplay executive director Josh Golin described Meta’s alleged understatement of self-harm content on Instagram as “beyond the pale.”
“The AGs’ unredacted complaint shows just how ruthless Meta is when it comes to extracting profit, time and attention from its core young user base,” Golin said. “Young people’s developmental vulnerabilities are viewed as a business opportunity and the company will do anything to hook kids and teens on IG, even sending them down dangerous rabbit holes.”
When reached for comment, a Meta spokesperson noted that the company “wants teens to have safe, age-appropriate experiences online” and has released more than 30 tools meant to help teens and their parents navigate social media use.
“We’ve spent a decade working on these issues and hiring people who have dedicated their careers to keeping young people safe and supported online,” the spokesperson said in a statement. “The complaint mischaracterizes our work using selective quotes and cherry-picked documents.”
The Meta spokesperson blasted the lawsuit for drawing an “absurd comparison” between the social media industry and the tobacco industry.
“Unlike tobacco, Meta’s apps add value to people’s lives,” the spokesperson said.
The state attorneys general also claim that Meta executives were well aware that its tactics resulted in an addictive dopamine hit for young users.
The suit cites a 2020 internal Meta presentation that detailed the company’s efforts to study adolescent biology and neuroscience in order to “gain valuable unchanging insights to inform product strategy today.”
The document purportedly noted that “teens are insatiable when it comes to ‘feel good’ dopamine effects” and that Instagram has an effective way to take advantage of this through its “Explore” page and recommendation algorithm.
“Every time one of our teen users finds something unexpected their brains deliver them a dopamine hit,” the document allegedly said.
A Meta spokesperson said the company encourages “teens to take regular breaks from Instagram and have intentionally designed features like ‘Quiet Mode’ that encourage them to leave the app and pause notifications if they’ve been scrolling for just a few minutes at night.”
“We also give parents tools to set time limits and scheduled breaks where teens are required to leave our apps,” the spokesperson added.
Earlier this week, The Post reported on the states’ allegation that the existence of millions of underage users on Instagram was an “open secret” at Meta.
The plaintiffs allege that Meta “disabled only a fraction of those accounts” and “routinely continued to collect children’s data without parental consent” in violation of the law.
The states seek unspecified financial damages, as well as “injunctive relief” blocking Meta from engaging in the harmful business practices outlined in the suit.