Earlier this year, we covered what appears to be the first of several lawsuits filed on behalf of parents by the Social Media Victims Law Center. In that lawsuit, the mother of an eleven-year-old who committed suicide sued Meta and Snap, claiming Snapchat’s algorithmically enabled feedback loops drove her daughter to her death. The suit recounted the last few years of her daughter’s life, which increasingly revolved around social media use. Despite the mother’s efforts to limit her daughter’s interactions with these services, along with seeking psychiatric intervention, her daughter ultimately took her own life.
It’s natural to seek some form of closure or justice after a tragedy, but trying to hold social media platforms directly responsible for the actions of their users isn’t likely to achieve either of those goals. What isn’t foreclosed by Section 230 immunity is shielded by the First Amendment. Even if the plaintiff somehow manages to get past these arguments, they still have to show how the platform contributed to the user’s death.
Those hurdles aren’t deterring the Social Media Victims Law Center from filing more lawsuits with similar allegations concerning similar tragedies. This CNN report covers the story of another family dealing with the suicide of a child, one that has also secured representation from the law center.
Christopher James Dawley, known as CJ to his friends and family, was 14 years old when he signed up for Facebook, Instagram and Snapchat. Like many teenagers, he documented his life on those platforms.
CJ worked as a busboy at Texas Roadhouse in Kenosha, Wisconsin. He loved playing golf, watching “Doctor Who” and was highly sought after by top-tier colleges. “His counselor said he could get a free ride anywhere he wanted to go,” his mother Donna Dawley told CNN Business during a recent interview at the family’s home.
But throughout high school, he developed what his parents felt was an addiction to social media. By his senior year, “he couldn’t stop looking at his phone,” she said. He often stayed up until 3 a.m. on Instagram messaging with others, sometimes swapping nude photos, his mother said. He became sleep deprived and obsessed with his body image.
On January 4, 2015, while his family was taking down their Christmas tree and decorations, CJ retreated into his room. He sent a text message to his best friend – “God’s speed” – and posted an update on his Facebook page: “Who turned out the light?” CJ held a 22-caliber rifle in one hand, his smartphone in the other and fatally shot himself. He was 17. Police found a suicide note written on the envelope of a college acceptance letter. His parents said he never showed outward signs of depression or suicidal ideation.
The wrongful death lawsuit [PDF] (which CNN didn’t include in its report for unknown reasons) presents a bunch of product liability claims, along with references to recent congressional hearings about social media moderation efforts. The biggest problem facing the plaintiffs isn’t Section 230 immunity or First Amendment protections. It’s the fact that these allegations are foreclosed by the statute of limitations. Both wrongful death and product liability suits must be brought within three years. (There is an exemption that extends the product liability statute of limitations, but it only applies to latent diseases caused by products or to cases where the manufacturer has explicitly promised the product would last more than 15 years.)
Here’s how the lawsuit hopes to avoid the statute of limitations issues.
Plaintiff did not discover, or in the exercise of reasonable diligence could not have discovered, that CJ’s death by suicide was caused by the Defendant’s unreasonably dangerous products until September or October of 2021.
This refers to the information exposed by Facebook whistleblower Frances Haugen, which provided details on the inner workings of the platform’s algorithms, and how they were skewed to ensure the company made more money even if it meant making the experience worse (and potentially more dangerous) for users.
By claiming they had no idea how much Meta and Snap manipulated users until this date, the plaintiff is apparently hoping the court will consider September 2021 to be the starting point of the injury, rather than the date her son committed suicide, which was more than seven years ago. Whether the court will agree to start the clock nearly seven years after the tragedy remains to be seen, but the rest of the arguments are similar to those raised in lawsuits brought against social media services by victims of terrorist attacks… and not a single one of those lawsuits has resulted in a win for the plaintiffs.
This lawsuit goes out of its way to avoid referring to any content hosted by Snapchat as a contributing factor, but it does specifically cite moderation efforts, algorithms, and other newsfeed tweaks that would appear to raise First Amendment and Section 230 questions, even though it’s clear the plaintiff and their reps don’t want those issues raised. Pleading around them is a nice try, but it’s unlikely to be successful. Similar cases have been dismissed on Section 230 and First Amendment grounds, and this one will likely meet the same fate.
There’s also the problem that, generally speaking, you can’t blame someone’s suicide on a third party. Courts frown upon such things.
This firm’s efforts appear to be in good faith… or at least in better faith than the social media/terrorism lawsuits filed en masse by 1-800-LAW-FIRM and Excolo Law. But that doesn’t mean these better-intentioned efforts are any more likely to succeed.