Meta employees flew into a panic in 2020 after the company was confronted by an Apple executive whose 12-year-old child had been “solicited” on Instagram, according to newly revealed details of a lawsuit.
The kerfuffle came to light in a newly unredacted portion of a sweeping civil lawsuit filed in December by New Mexico Attorney General Raul Torrez, who alleges underage Facebook and Instagram users are exposed to potential contact with sex predators and bombarded with adult sexual content on the apps.
An unredacted internal Meta document cited in the complaint showed company employees “scrambling” to address the unnamed Apple executive’s concerns – a reaction that was purportedly driven by fears that Instagram could be kicked out of Apple’s App Store.
“This is the kind of thing that pisses Apple off to the extent of threating [sic] to remove us from the App Store,” the document said, according to the lawsuit.
The lawsuit said workers at Mark Zuckerberg’s social-networking giant discussed whether there was a timeline for when the company would “stop adults from messaging minors on IG direct.” In 2021, Meta began restricting adults 19 and older from sending private messages to teen users who don’t follow them.
According to the unredacted details, Meta employees pushed to quickly remove accounts related to the phrase “sugardaddy,” warning that Apple would “reply with 100 more accounts if we’re not able to take these down.”
In one internal chat from July 2020, a Meta employee reportedly asked, “what specifically are we doing for child grooming (something I just heard about that is happening a lot on TikTok)?” Another employee responded, “somewhere between zero and negligible” and added that “child safety is an explicit non-goal this half.”
New Mexico’s complaint also alleged in an unredacted section that Meta “knew about the huge volume of inappropriate content being shared between adults and minors they did not know.”
The suit cited an internal 2021 presentation estimating that “100,000 children per day received online sexual harassment, such as pictures of adult genitalia.”
A separate 2021 document detailed Meta’s discussion of concerns related to Facebook’s “People You May Know” feature and allegedly showed the company was aware the feature “had a direct link to trafficking,” according to the lawsuit.
One Facebook employee purportedly wrote that “in the past, PYMK contributed up to 75% of all inappropriate adult-minor contact.” Another worker responded by asking, “How on earth have we not just turned off PYMK between adults and children?” and described the inaction as “really, really upsetting.”
Another internal presentation on child safety issues from March 2021 said Meta was “underinvested in minor sexualization on IG, notable on sexualized comments on content posted by minors,” according to the suit.
A Meta spokesperson said the company has spent “a decade working on these issues and hiring people who have dedicated their careers to keeping young people safe and supported online.”
“The complaint mischaracterizes our work using selective quotes and cherry-picked documents,” the Meta spokesperson said in a statement.
Apple did not immediately return a request for comment.
The New Mexico attorney general’s investigation into Meta included the creation of test accounts on Facebook and Instagram that purportedly portrayed children aged 14 or younger.
The suit alleges Meta and its CEO Mark Zuckerberg have engaged in “unacceptable” conduct by failing to properly crack down on the sick content.
“For years, Meta employees tried to sound the alarm about how decisions made by Meta executives subjected children to dangerous solicitations and sexual exploitation,” Torrez said in a statement. “Meta executives, including Mr. Zuckerberg, consistently made decisions that put growth ahead of children’s safety.”
Portions of the suit unredacted earlier this month revealed that two key corporate advertisers, Walmart and Tinder parent Match Group, had confronted Meta for running ads next to content that sexualized underage users.
Meta faces a separate lawsuit from a coalition of 33 state attorneys general who alleged the company has profited from “millions” of underage users and built addictive features into its apps that helped fuel a youth mental health crisis.
In the face of mounting legal scrutiny, Meta has publicly touted its efforts to boost online safety for teen users.
Earlier this month, Meta said it has begun “automatically placing teens into the most restrictive content control setting on Instagram and Facebook” and limiting search results for upsetting topics such as suicide, eating disorders and self-harm.
With Post wires