A girl started getting suicidal thoughts and crippling depression after playing Roblox from the age of seven, a shocking lawsuit claims.
The child, identified only as L.L., started playing the popular video game in 2022 after becoming hooked on YouTube and TikTok, the suit filed in Los Angeles says.
But her mental health dramatically declined and she spiraled into crisis due to her online addiction, the court papers argue.
The claims emerged in a wider lawsuit against tech giants including Discord, YouTube, TikTok, Meta and Google, alleging children were driven to self-harm by what they saw on the sites.
Twenty-two plaintiffs from 14 states are named in the suit, their ages ranging from just 11 to 21.
It comes a week after a Los Angeles jury found Instagram and YouTube harmed a young girl due to features designed to hook kids.
In the new papers, L.L. was said to have been driven to mental health problems due to what she saw online and on Roblox, which she was “unable to avoid.”
The documents, filed in LA Superior Court, said: “These harms L.L. didn’t seek out or want. They were harms L.L. became unable to avoid.”
Roblox, one of the most popular online products among American teenagers, “failed to take reasonable steps to protect the children it targeted,” the lawsuit says.
Another alleged victim mentioned in the suit is M.F., who was said to have been subject to “suicidal ideation” on the messaging app Discord.
She started using it in 2019 when she was just 13, and formed an addiction that led to “self-harm in the form of cutting, exploitation, suicidal ideation, and other serious mental health harms not experienced prior to when such use began,” the papers say.
The suit added: “Discord is popular for communities of neo-Nazis and white supremacists to boost the ideas that undergird their movements.”
M.F. was also using Instagram, Snapchat and TikTok when her mental state declined, according to the documents.
Another mentioned in the lawsuit was Carmen Heneghan, who committed suicide as a result of her addiction, according to her family.
The teen started using Instagram, Snapchat, and TikTok apps at the age of 12 and quickly “developed harmful dependencies on these platforms,” the suit says.
Heneghan’s parents tried to keep her off the apps, and “their efforts to restrict access to them became the greatest source of conflict in their home,” according to the papers.
Her addiction resulted in her suicide on August 22, 2021; she was only 16. Friends remembered her for her excellence in Irish dance, something she lost interest in due to her addiction, the suit argues.
Last week, a Los Angeles jury found that Meta’s Instagram and Google’s YouTube harmed a young user with features designed to hook children.
The tech giants were found liable for $3 million in compensatory damages for the harm caused, and the jury awarded a further $3 million in punitive damages. Of the total, Meta was held responsible for $4.2 million and Google for $1.8 million.
The plaintiffs in the new lawsuit are seeking economic damages to be determined by the time of trial, along with medical expenses, funeral expenses, and punitive and other damages.
Other companies mentioned in the lawsuit include Meta, ByteDance, and Google. Requests for comment from all the companies named in the lawsuit were not immediately returned.

