Meta boss Mark Zuckerberg allegedly rejected a proposal to ban filters from his social media sites that simulate the effects of plastic surgery despite the harm the body-altering technology poses to young girls’ mental health, according to a lawsuit.
The complaint, filed last week by 33 states, including New York, against the tech behemoth behind Facebook and Instagram, claimed the filters are part of a set of “psychologically manipulative platform features” that exploit youngsters’ vulnerabilities.
Visual filters, in particular — which have photo-brightening, skin-smoothing, eye-brightening and even face-lifting effects — “promote young users’ body dysmorphia,” per the largely unredacted copy of the suit distributed Monday by California Attorney General Rob Bonta.
“Meta knows that what it is doing is bad for kids — period,” Bonta told Bloomberg, which earlier reported on the lawsuit. “Thanks to our unredacted federal complaint, it is now there in black and white, and it is damning.”
The states claim Meta has embraced a business model designed to maximize the amount of time teen users spend on its apps through “harmful and psychologically manipulative product features” — despite public assurances that the platforms are safe.
The lawsuit targets Meta for its use of “dopamine-manipulating recommendation algorithms,” an emphasis on a “likes” system that was “known by Meta to harm young users,” alerts that nudged young users to use Facebook and Instagram at night or during school, and visual “filters” that allegedly fueled body dysmorphia.
It also alleges that Meta violated a law blocking the collection of personal data from users under 13, quoting a paragraph from Instagram’s terms of use that reads: “We use your personal data, such as information about your activity and interests, to show you ads that are more relevant to you.”
“While filters exist across every major social platform and smartphone camera, Meta bans those that directly promote cosmetic surgery, changes in skin color or extreme weight loss,” a Meta spokesperson said. “We clearly note when a filter is being used and we work to proactively review effects against these rules before they go live.”
The suit, which was initially filed in federal court in Oakland, Calif., last month, adds to mounting scrutiny of Meta, whose Instagram Reels was put on blast by a Wall Street Journal report Monday for allegedly recommending “risqué footage of children as well as overtly sexual adult videos” to adult users who follow children.
In one instance, an ad promoting the dating app Bumble was sandwiched between a video of a person caressing a “life-size latex doll” and another clip of an underage girl exposing her midriff, according to The Journal, which set up test accounts to probe Instagram’s algorithm.
In other cases, Zuckerberg’s Meta-owned app showed a Pizza Hut commercial next to a video of a man lying in bed with a purported 10-year-old girl, while a Walmart ad was displayed next to a video of a woman exposing her crotch.
The report represents another high-profile headache for Meta’s 39-year-old CEO, who is facing an advertiser revolt after some companies cited in The Journal’s report suspended ads across Meta’s platforms, which include Facebook.