Family Blames Instagram for Teen’s Eating Disorder in Lawsuit Challenging Section 230 Protections

Citing the Facebook Papers, two parents are targeting Instagram's algorithms rather than its third-party content in hopes of circumventing Section 230.

A personal injury lawsuit filed in California federal court on Monday alleges Instagram’s parent company Meta purposely crafted products to addict young users, steering one 11-year-old girl down a years-long path of physical and psychological harm.

The case, brought by the Social Media Victims Law Center on behalf of now-19-year-old Alexis Spence, asserts Instagram “consistently and knowingly” targeted its product at young children while ignoring internal warnings about its worsening effects on the mental health of its users.

“As a result of Alexis’ addiction to Instagram, she had to undergo professional counseling, in-patient programs, out-patient programs, participate in eating disorder programs and will likely require help in the form of a service dog for the rest of her life, as well as ongoing medical attention to ensure she does not digress,” the lawyers said.

Their suit hinges on and directly cites the Facebook Papers, the trove of internal files leaked last fall by Facebook whistleblower Frances Haugen. Among them were confidential reports and presentations portraying Instagram as a blight on the mental health of adolescents. The lawsuit is among the first to use the documents against Meta in actual court rather than the court of public opinion.

The suit is also the latest in a wave of new cases around the country hoping to find a way around the liability shield extended to website owners and operators under Section 230 of the Communications Decency Act. Passed in 1996, Section 230 is considered foundational to the internet as it exists today, enabling large tech companies and everyday users to moderate their own websites without fear of being buried in lawsuits over content posted by third parties.

“There’s a concerted effort across the country to re-frame lawsuits against internet services as attacking their software tools, not attacking the content that’s published using them,” Eric Goldman, a law professor at Santa Clara University, told Gizmodo by phone.

Meta declined to comment on the Spence case, citing active litigation, but a spokesperson pointed Gizmodo to a range of features they said are designed to help people struggling with body image issues.

Spence joined Instagram in the fifth grade, two years too young to join the app under its minimum age requirement, according to the suit. The complaint alleges her addiction to Instagram was the result of deliberate efforts by the company to design a product that is inherently addictive and has toothless support features.

Congress has already taken notice of the suit. Rep. Tom Malinowski, a New Jersey Democrat who co-authored legislation last year targeting social media algorithms that amplify content interfering with civil rights, said the Spence case was just the latest to exhibit the “real-world harms caused by sophisticated algorithms designed to keep us glued to our screens — eating disorders, suicides, mass shootings, insurrections.”

Soon after she joined, according to the complaint, Instagram’s algorithm rapidly drove the younger Spence toward an endless stream of problematic content: underweight models and links to extreme dieting websites promoting “anorexia, negative body image and self-harm,” her lawyers say. This eventually spurred an eating disorder, they said, followed by years of anxiety, depression, and suicidal ideation. Hospitalization ultimately became necessary.

The Facebook Papers, which Gizmodo began making public for the first time in April, revealed to the world that Meta knew how some of its users felt about their use of social media: badly. A leaked survey conducted by the company found that many teens blamed Instagram directly for anxiety and depressive episodes. These self-diagnoses were reported “unprompted,” the company said, and were “consistent across all groups.”

“Thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse,” read one leaked presentation from March 2020.

Facebook responded to the leak at the time by downplaying the negative effects, which it described as “quite small” in scale. Executives noted that the leaked research was not considered scientific by any academic standard: the bulk of it was conducted by marketing staff, and the ill effects were based on subjective, retrospective analysis self-reported by users. At a congressional hearing last year, Meta CEO Mark Zuckerberg pointed to contrary research linking social media usage with positive mental-health outcomes. Similar surveys conducted by the Pew Research Center in 2018, for example, found that teens were more likely to associate social media apps with positive emotions—though the sentiment was “far from unanimous.”

At the same time, documents showed that Meta—known simply as Facebook at the time—had endeavored to ingratiate its brand with users years too young to actually join its platforms. Internal research, never intended for the public eye, coldly portrayed children as young as 10 years old as a “valuable” and “untapped” resource pivotal to the company’s growth.

The immunity granted under Section 230 is not absolute, as a recent federal court ruling shows. About a year ago, the Ninth Circuit determined there was a sufficient distinction between claims over content published by a website—which would be covered under Section 230—and allegations of negligence aimed at the underlying software behind a social network’s functionality. Spence’s case against Instagram appears to be another attempt to build on that ruling.

Goldman, a staunch supporter of Section 230 who believes efforts to amend it are largely politically motivated, was also critical of the Ninth Circuit ruling.

That case, which remains unsettled, centers on the “speed filter” feature removed by Snapchat last year. The filter was designed to let users capture how fast they were moving, but allegedly led some users to pursue bragging rights by driving recklessly at excessive speeds. Several deaths between 2015 and 2017 have been attributed to the filter, including those of three young men in Wisconsin who fatally crashed into a tree. The app clocked their car moving at 123 miles per hour shortly before the crash.

The Ninth Circuit held that Section 230 was not an applicable defense so long as the plaintiffs remain focused on Snapchat’s software design, and not the content shared on its platform. Similarly, Spence’s lawyers seem eager to target the addictive aspects of Instagram’s algorithm rather than solely the content Spence may have consumed.

“Again, you’re talking about the algorithm and the way that the complaint may be framed is really more about the overall service, that everything about the service was designed to encourage usage and that encouraged amount of usage is what caused the problem,” Goldman said. “It doesn’t mean they’ll win, but they may have found a way to get around Section 230.”

It’s difficult to separate a product that promotes content from the content it promotes, Goldman said, even if the courts find a legally significant distinction. “I understand at the big structural level, yes, they’re not trying to assert that they’re suing for third-party content, they’re suing based on the software design,” he said. “But to me those collapse together in every material way.”

Malinowski’s bill, the Protecting Americans from Dangerous Algorithms Act, was one of numerous bills introduced last year aimed at amending Section 230. Whether due to the sheer complexity of the issue, unforeseeable ramifications for the global internet economy, or partisan disagreement over what precisely to amend, none of the bills gained much traction.

“Large social media platforms should not enjoy blanket legal immunity for harmful content that they actively amplify, promote, or recommend to their users,” Malinowski told Gizmodo.

Today, when users encounter content on Instagram related to self-harm or eating disorders, the app is supposed to flag “potentially triggering images” and blur them out, while pointing users to hotline resources offered by groups such as the National Eating Disorders Association.

Additional safeguards are also in place, Meta says, including age verification tools, restrictions on DMs between adults and teens, and default privacy settings for any user under the age of 16.

If you or someone you know is experiencing thoughts of suicide or self-harm, please call 911 or the National Suicide Prevention Hotline at 1-800-273-8255. Victims of social media cyberbullying can contact SMVLC at www.socialmediavictims.org or by calling 1-800-834-6994.
