These 3 ‘addictive’ social media UX features are on trial
The world’s biggest tech companies are facing a legal showdown that could fundamentally change the way that social media is designed.
The trial is taking place in the Los Angeles County Superior Court, where jury selection started on January 27. It’s testing out a new legal theory intended to spur greater regulation of social media platforms like TikTok, Snap, YouTube, and Meta’s Facebook and Instagram: Lawyers are gearing up to argue that the companies behind these platforms are designing their sites to be deliberately addictive, resulting in direct personal injury to users, especially children.
Overall, the trial is expected to consist of nine cases, which judges across the country have selected as some of the strongest bellwethers for this new argument. First on the docket is a case brought by a 20-year-old plaintiff identified as K.G.M., who says that a lack of sufficient guardrails on social media sites during her youth led to compulsive use and mental health harms including depression, anxiety, body dysmorphia, self-harm, and risk of suicide.
The defendants named in K.G.M.’s initial suit were ByteDance, the former majority owner of TikTok; Snap, which owns Snapchat; Google, the owner of YouTube; and Meta. However, both Snap and ByteDance settled the suit in the days leading up to jury selection for undisclosed sums, leaving just Meta and Google.
These initial verdicts are expected to serve as a testing ground for a second set of federal cases, scheduled for trial this summer, in which several school districts, states, and attorneys general plan to argue that social media is a public nuisance and addictive to children.
At the crux of all of these suits lies a design-based claim: These tech companies are using intentionally engineered tricks to foster addictive behaviors among young users. Court documents point out several specific user experience (UX) choices as evidence of this pattern. Here are a few of the key examples in question.
Endless scroll
“Endless (or infinite) scroll” is a chief concern across almost all of the cases that have been filed. It refers to any feature that lets users scroll through a continuously replenished stream of content, such as short-form video, without interruption.
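To make the mechanism concrete, here is a minimal, generic sketch in TypeScript of how an infinite-scroll feed is commonly wired up on the web. It is purely illustrative and assumes a hypothetical `/api/feed` endpoint; it is not code from TikTok, Meta, or any other defendant.

```ts
// Illustrative only: a generic infinite-scroll loader, not any platform's actual code.
// When a sentinel element near the bottom of the feed comes into view, another page of
// content is fetched and appended, so the feed never reaches a natural end point.

type FeedItem = { id: string; html: string };

// Hypothetical fetcher; a real app would call its own feed API here.
async function fetchNextPage(cursor: number): Promise<FeedItem[]> {
  const res = await fetch(`/api/feed?cursor=${cursor}`);
  return res.json();
}

function setUpInfiniteScroll(feed: HTMLElement, sentinel: HTMLElement): void {
  let cursor = 0;
  const observer = new IntersectionObserver(async (entries) => {
    if (!entries[0].isIntersecting) return;
    const items = await fetchNextPage(cursor);
    cursor += items.length;
    for (const item of items) {
      const el = document.createElement("article");
      el.innerHTML = item.html;
      feed.appendChild(el);
    }
    // The sentinel is moved back below the newly appended items,
    // ready to trigger the next load as soon as the user nears the bottom again.
    feed.appendChild(sentinel);
  });
  observer.observe(sentinel);
}
```

Because loading is triggered by scroll position rather than by a user decision, there is never a "you're all caught up" moment unless the designer deliberately adds one.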
One court document, filed by the Florida attorney general’s office against Meta, claims that infinite scroll “makes it difficult for young users to disengage [from the content] because there is no natural end point for the display of new information.”
In a court filing before ByteDance’s settlement, K.G.M. testified that TikTok’s endless scroll feature disrupted her sleep and caused her to become addicted to the app. According to confidential internal messages obtained by NPR back in October, TikTok is aware of the addictive nature of its endlessly scrolling “For You” feed, and has even calculated the number of videos it takes for a user to get hooked: 260.
Ephemeral content
Another pattern of social media design that’s frequently cited in these legal documents is “ephemeral content.” This refers to any post that can be viewed only within certain time limits, like a once-viewable snap on Snapchat or a 24-hour Instagram Story.
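The underlying logic is simple. Below is a minimal TypeScript sketch of a time-limited visibility check, purely illustrative and not drawn from Snap’s or Meta’s actual systems; the `Story` type and field names are assumptions for the example.

```ts
// Illustrative only: a generic expiry check for time-limited posts.
// A post is shown only while it is inside its visibility window, which creates the
// "view it now or never" time pressure the filings describe.

interface Story {
  id: string;
  postedAt: Date;
  ttlHours: number; // e.g. 24 for a day-long story
}

function isStillVisible(story: Story, now: Date = new Date()): boolean {
  const ageMs = now.getTime() - story.postedAt.getTime();
  return ageMs < story.ttlHours * 60 * 60 * 1000;
}

// Surfacing the stories closest to expiring first heightens the sense of urgency.
function sortByUrgency(stories: Story[], now: Date = new Date()): Story[] {
  return [...stories]
    .filter((s) => isStillVisible(s, now))
    .sort(
      (a, b) =>
        a.postedAt.getTime() + a.ttlHours * 3_600_000 -
        (b.postedAt.getTime() + b.ttlHours * 3_600_000)
    );
}
```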
The Florida attorney general’s office specifically called out Meta’s visual design cues on Instagram Stories indicating that “the content would soon disappear forever,” noting that this tactic made young users feel more compelled to keep clicking on new content to avoid potential social consequences.
“Meta designed such ephemeral content features to induce a sense of ‘FOMO’ in young users, that is, a ‘fear of missing out,’ which would drive teen engagement,” the filing reads.
Algorithmic recommendations
One of the most troubling details in K.G.M.’s testimony concerns the algorithmic recommendations she has encountered on social media, which she says have repeatedly steered her toward content with disturbing or damaging themes.
“I have gotten a lot of content promoting that kind of stuff—just like body checking, posts [of] what I eat in a day—just a cucumber—making people feel bad if they don’t eat like that,” she said in her deposition.
Per the Florida attorney general’s filing, Meta’s algorithms direct users to concerning content like this by design. Its platforms, the document reads, “periodically [present] users with ‘emotionally gripping content to provoke intense reactions’ (e.g., relating to eating disorders, self-harm, suicide, violence, body-image issues, and more), a result of what Meta purportedly refers to as the algorithms’ ‘preference amplification.’ Despite Meta’s representations to the contrary, this design results in harm to young users.”
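The feedback loop the filing describes can be illustrated with a toy ranker. The TypeScript sketch below is an assumption-laden simplification, not Meta’s actual algorithm or its “preference amplification” system: it simply shows how optimizing purely for predicted engagement tends to amplify whatever a user already reacts to most strongly.

```ts
// Illustrative only: a toy engagement-weighted ranker, not any platform's real system.
// If emotionally intense posts earn outsized engagement, a ranker that optimizes only
// for predicted engagement will keep surfacing more of them.

interface Post {
  id: string;
  topic: string;
  predictedEngagement: number; // hypothetical model score in [0, 1]
}

function score(post: Post, topicAffinity: Map<string, number>): number {
  // Past reactions to a topic boost future posts on that topic: the feedback loop.
  const affinity = topicAffinity.get(post.topic) ?? 0;
  return post.predictedEngagement * (1 + affinity);
}

function rankFeed(candidates: Post[], topicAffinity: Map<string, number>): Post[] {
  return [...candidates].sort(
    (a, b) => score(b, topicAffinity) - score(a, topicAffinity)
  );
}

// After each interaction, the user's affinity for that topic grows, so the next
// ranking pass weights similar content even more heavily.
function recordInteraction(topicAffinity: Map<string, number>, post: Post): void {
  topicAffinity.set(post.topic, (topicAffinity.get(post.topic) ?? 0) + 1);
}
```

Nothing in such a loop distinguishes healthy interests from harmful ones; it amplifies whichever content provokes the strongest reaction, which is the dynamic the plaintiffs allege harms young users.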
For their part, K.G.M.’s lawyers are grounding their arguments in precedents from cases holding that products with purposefully addictive designs should be off-limits to kids.
“Borrowing heavily from the behavioral and neurobiological techniques used by slot machines and exploited by the cigarette industry, [d]efendants deliberately embedded in their products an array of design features aimed at maximizing youth engagement to drive advertising revenue,” the lawsuit alleges. It adds: “Like the cigarette industry a generation earlier, [d]efendants understand that a child user today becomes an adult user tomorrow.”