Suit: TikTok, other apps liable for 11-year-old's social media addiction, suicide
2022 PRODDBRF 0023
By Katie Pasek
WESTLAW Products Liability Daily Briefing
February 18, 2022
(February 18, 2022) - An 11-year-old girl took her own life after becoming addicted to Instagram, Snapchat and TikTok despite the platform owners' claims that their apps are not habit-forming, a federal court lawsuit has alleged.
Rodriguez v. Meta Platforms Inc. et al., No. 22-cv-401, amended complaint filed (N.D. Cal. Feb. 16, 2022).
The social media companies are liable for Selena Rodriguez's death because they designed the apps to be addictive and encouraged their excessive use while knowing that they have a detrimental effect on users' mental health, according to the Feb. 16 amended complaint.
"Defendants have failed to design their products with any protections to account for and ameliorate the psychosocial immaturity of their minor users," the amended complaint says.
Selena's mother, Tammy Rodriguez, originally sued Instagram owner Meta Platforms Inc. — formerly Facebook Inc. — and Snapchat owner Snap Inc. on Jan. 20 in the U.S. District Court for the Northern District of California.
The amended complaint adds as defendants TikTok Inc. and ByteDance Inc., which together operate the TikTok social media platform.
The suit seeks compensatory damages, punitive damages and other relief from the California-based companies.

Suit: Apps lack safeguards

According to the amended complaint, the defendants publicly claim that their products are not addictive despite having internal research showing the apps can cause extensive mental harm to users.
"Defendants have invested billions of dollars to intentionally design their products to be addictive" because they generate revenue from the number of advertisements a user views on the app, the suit says.
The apps use rewards and complex algorithms to "exploit human psychology" and keep users seeking progressively more stimulating content, which then increases the time they spend on the platform and the number of ads they view, the amended complaint says.
Minor users are particularly susceptible to these "manipulative algorithms" because their brains have not yet developed impulse control and risk-evaluation capabilities, the plaintiff says.
The suit says Selena Rodriguez struggled with a social media addiction for more than two years before it led to her July 21, 2021, suicide.
Rodriguez became depressed and sleep deprived, developed an eating disorder, and had to be hospitalized for emergency psychiatric care, the amended complaint says.
Tammy Rodriguez says that although she was unaware of the "clinically addictive" effects of the defendants' apps, she attempted to limit her daughter's social media use and to obtain mental health care for her.
The suit says Selena Rodriguez also received solicitations for sexually exploitative content from other app users. Through Snapchat, she sent sexual images that her classmates obtained and ridiculed her for sending, the plaintiff says.

Alleged state, federal law violations

The defendants manufactured and marketed the addictive apps with inadequate safeguards to protect minors from exploitative content and without controls for parents to monitor and limit their use, the suit says.
The companies also failed to warn minor users and their parents of the apps' addictive qualities, the plaintiff says.
The complaint claims that the defendants promoted their products to underage users while concealing their harmful risks in violation of California's law prohibiting fraudulent business practices, Cal. Bus. & Prof. Code § 17200.
Meta Platforms and Snap also benefited financially from knowingly facilitating the sexual solicitation and exploitation of minors through their apps, in violation of 18 U.S.C.A. §§ 1591 and 1595, the suit says.
Jennie Lee Anderson of Andrus Anderson LLP and Matthew Bergman of Social Media Victims Law Center represent the plaintiff.