Judge Rejects Social Media Giants' Bid to Dismiss Nationwide Lawsuits, Highlights Dangers to Children's Mental Health
A federal judge has rejected attempts by major social media companies to dismiss nationwide litigation accusing them of illegally enticing and addicting children, marking a significant victory for affected families. The ruling could lead to numerous safety claims and make legal defense more challenging for the companies.
Major social media companies, including Alphabet, Meta Platforms, ByteDance, and Snap, have failed in their bid to dismiss nationwide litigation accusing them of illegally enticing millions of children into addiction to their platforms, harming their mental health. The ruling covers hundreds of lawsuits filed on behalf of individual children who allegedly suffered negative physical, mental, and emotional health effects from social media use, including anxiety, depression, and even suicide.
🔴 SOCIAL MEDIA ADDICTION JUDGE ALLOWS SOME CLAIMS TO PROCEED, AS META, GOOGLE, SNAP & TIKTOK 'MUST' FACE YOUTH ADDICTION CLAIMS.
— Breaking Market News (@financialjuice) November 14, 2023
The decision is seen as a significant victory for families affected by the dangers of social media. More than 140 school districts and 42 states, along with the District of Columbia, have filed similar lawsuits against the industry, underscoring the scale of concern over youth addiction to social media platforms. The companies have denied the allegations, saying that protecting children has always been a core aspect of their work. The judge nonetheless rejected their arguments that they were immune from suit under the First Amendment of the US Constitution and under Section 230 of the federal Communications Decency Act.
In her ruling, Judge Yvonne Gonzalez Rogers stated that the plaintiffs' claims went beyond third-party content: the defendants could be held liable for providing defective parental controls, failing to help users limit screen time, and creating barriers to deactivating accounts. She noted that the companies could have implemented age-verification tools and warned parents when their children were online, and that failing to do so could harm users. As product makers, she found, the companies owed their users a duty of care and could be sued for negligence for failing to design reasonably safe products and to warn users of known defects.
However, she clarified that the companies had no legal obligation to protect users from harm caused by third-party users of their platforms. While the ruling does not find that social media platforms are causing harm or hold them legally liable for it, it could pave the way for numerous safety claims and make the companies' legal defense more challenging. Child safety on social media has drawn increasing attention, and lawmakers have been pushing for new child-protection laws, including age verification requirements.