By Clare Duffy | CNN Health
Los Angeles
For years, social media giants have argued against claims that their platforms harm young people's mental health. Starting Tuesday, they will for the first time have to defend against those claims before a jury in a court of law.
A 19-year-old, identified as KGM, and her mother, Karen Glenn, sued TikTok, Meta, Snap and Googleβs YouTube, alleging that the companies knowingly created addictive features that harmed her mental health and led to self-harm and suicidal thoughts.
TikTok agreed to settle the case under undisclosed terms one day before the suit was set to go to trial, according to the plaintiff's attorney Mark Lanier. Snap settled last week, also under undisclosed terms.
Parents, advocates, health experts, tech whistleblowers and teens themselves have for years worried that social media platforms can get young people hooked on scrolling, enable bullying, disrupt their sleep and send them down harmful content rabbit holes. Tech executives have repeatedly been hauled before Congress, at one point even apologizing to parents who say their children died or were harmed because of social media. But the companies have nonetheless faced few consequences or regulations in the United States.
KGM's case seeks unspecified monetary damages. The outcome could influence how more than 1,000 similar personal injury cases against Meta, Snap, TikTok and YouTube are resolved. Lanier told CNN he hopes the Snap and TikTok settlements in the KGM case will set the stage for the companies to settle the other cases.
Top executives from Meta, TikTok and YouTube are expected to take the witness stand during the trial, which takes place in Los Angeles and is set to last several weeks.
In recent years, TikTok, Meta, YouTube and Snap have rolled out safety features and policies, as well as parental control tools, that they say protect young users.
The four social media companies are involved in other cases this year as well, including some brought by school districts and state attorneys general. Losses could put the tech companies on the hook for billions of dollars in damages and force them to change their platforms.
"For parents whose children have been exploited, groomed, or died because of big tech platforms, the next six weeks are the first step toward accountability after years of being ignored by these companies," said Sarah Gardner, CEO of the non-profit Heat Initiative, which advocates for child safety online. "These are the tobacco trials of our generation, and for the first time, families across the country will hear directly from big tech CEOs about how they intentionally designed their products to addict our kids."
The KGM case
KGM's lawsuit alleges that the social media giants intentionally designed their platforms to be addictive, despite knowing the risks to young people.
KGM, a California teen, started using social media at age 10, despite her mom's attempts to use third-party software to block access to the platforms, according to court documents. "Defendants design their products in a manner that enables children to evade parental consent," the complaint states.
The "addictive design" of Instagram, TikTok and Snapchat and frequent notifications led her to use the platforms compulsively, the suit alleges, which corresponded with a decline in her mental health.
Features that recommend other users to connect with on Snapchat and Instagram "facilitated and created connections between minor Plaintiff K.G.M. and complete strangers, including predatory adults and others she did not know in real life," the complaint states. Instagram and TikTok also allegedly "targeted" KGM with "depressive" and "harmful social comparison and body image" content.
On Instagram, KGM alleges she was bullied and sextorted, a scam in which a bad actor threatens to share explicit photos of a person unless they send money or more photos. It took two weeks and "K.G.M.'s friends and family spamming and asking other Instagram users to report the persons targeting" her for Meta to address the problem, according to the complaint.
"Defendants' knowing and deliberate product design, marketing, distribution, programming and operational decision and conduct caused serious emotional and mental harms to K.G.M. and her family," the complaint states. "Those harms include, but are not limited to, dangerous dependency on their products, anxiety, depression, self-harm, and body dysmorphia."
KGM's is one of several bellwether cases in a larger multi-district litigation consolidating around 1,500 personal injury suits alleging similar harms caused by TikTok, YouTube, Meta and Snap.
What the companies say
In 2024, then-US Surgeon General Vivek Murthy called on Congress to mandate a tobacco-style warning label on social media platforms in light of the "mental health crisis" among young people, something state attorneys general have also advocated for. And a Pew Research Center study published last year indicated that nearly half of US teens believe social media has "mostly negative" effects on people their age.
But tech leaders have for years rejected the idea that social media harms young people's mental health. They point to a lack of conclusive research on the subject and argue that their platforms provide benefits such as entertainment and connection to friends.
Tech giants have also repeatedly relied on Section 230, a federal law that shields them from liability over content that their users post, as a defense against safety claims. But Los Angeles Superior Court Judge Carolyn Kuhl, who is overseeing the KGM and related cases, said last year that jurors should consider whether design features implemented by the companies, like endlessly scrolling feeds, rather than content alone, have contributed to mental health harms.
Snap has previously said that Snapchat was "designed differently from traditional social media: it opens to the camera, not a feed, and has no public likes or social comparison metrics."
Snapchat's youth safety measures include parental control tools, message warnings designed to prevent sextortion and mechanisms for removing age-inappropriate content.
Asked for comment, a Meta spokesperson pointed CNN to a website dedicated to its response to the youth mental health lawsuits, where the company claims the suits "misportray our company and the work we do every day to provide young people with safe, valuable experiences online."
"We have listened to parents, researched the issues that matter most, and made real changes to protect teens online," Meta states. "Despite the snippets of conversations or cherry-picked quotes that plaintiffs' counsel may use to paint an intentionally misleading picture of the company, we're proud of the progress we've made, we stand by our record of putting teen safety first, and we'll keep making improvements."
Meta's teen safety features include "teen accounts," which launched in 2024 to provide default privacy protections and content limits for teen users on Instagram. It also provides parental supervision tools and uses AI to try to identify minor users regardless of the age they provide when they sign up for Meta's platforms.
In a statement to CNN, YouTube spokesperson José Castañeda said the allegations in the youth mental health lawsuits are "simply not true."
"Providing young people with a safer, healthier experience has always been core to our work. In collaboration with youth, mental health and parenting experts, we built services and policies to provide young people with age-appropriate experiences, and parents with robust controls," he said in the statement.
YouTube's youth safety measures include restrictions on certain kinds of sensitive content, such as violent or sexually suggestive videos, as well as AI identification of minor users. It also offers parental control tools and last week rolled out an option for parents to limit or block their kids from scrolling through its short-form video feed, among other new offerings.
TikTok did not respond to a request for comment on this story.
The youth safety and parental control features TikTok has rolled out in recent years include adding default privacy settings and disabling late-night notifications. Last year, it introduced a "guided meditation" feature aimed at getting teens to cut back on scrolling.
Despite those efforts, many parents and advocates say social media platforms have still failed to protect young users. Soon, a jury will have a chance to decide if they agree.