Meta, Google, TikTok, Snap fail to stop lawsuits claiming their apps are addictive and harmful
Social media giants must face lawsuits brought by school districts nationwide claiming that their platforms are addictive and are harming the country’s youth, U.S. District Judge Yvonne Gonzalez Rogers ruled this week.
Meta, Google, TikTok, and Snap will indeed head to court to face the high-dollar damages cases, which will proceed under a narrower scope, Rogers decided. The ruling comes despite a contrary decision by a Los Angeles Superior Court judge this summer, Bloomberg reported. As a result, the companies won’t face hundreds of Los Angeles-based claims, but must still answer more than 150 additional cases.
In the ruling, Rogers agreed that the companies “deliberately fostered compulsive use of their platforms” by students, placing a burden on school districts. Other claims were thrown out under Section 230 of the Communications Decency Act, which shields internet users and providers from most civil suits.
Meta, Google, and Snap have denied the allegations. TikTok has not commented on this decision specifically, but has been vocal about its teen safety efforts in the past. Last week, the same California judge ruled that Meta must face joint lawsuits from 34 state attorneys general who allege the company’s social media platforms are exacerbating the youth mental health crisis. Earlier this month, a similar coalition of state attorneys general filed lawsuits against TikTok over its “addictive algorithm” and allegedly false safety marketing; internal documents revealed in the case showed that TikTok’s executives were aware of the addictive nature of its For You page.
Rogers has overseen dozens of such cases, including a major class action against Meta, Google parent company Alphabet, and several other social media companies brought by parents and their children. In a 2023 decision related to that case, Rogers said the platforms could be sued for negligence over “defective products,” but, citing Section 230 once again, declined to let claims proceed that private messaging tools, notifications, and algorithmic recommendations were connecting minors to adults and causing harm.