Suing Social Media Won’t Save the Children — But It Could Silence Everyone
Logging onto Facebook in the mid-2000s or opening Instagram in its early days felt less like entering a casino and more like flipping through a scrapbook. Users saw posts in chronological order. If you wanted to see something specific, you searched for it by keyword. You followed people you knew and saw their posts appear on your timeline. Your feed reflected those choices, showing you the pages you liked and intentionally engaged with.
The online world was your domain. Today, that world is largely gone.
In a landmark trial unfolding in Los Angeles, lawyers for a now-20-year-old woman, identified as Kaley, argue that Instagram and YouTube intentionally engineered addictive platforms that harmed her mental health, CNN reported. Her attorney, Mark Lanier, described the apps as “digital casinos,” telling jurors that the swipe of a finger resembles “a handle of a slot machine.”
“This case is about two of the richest corporations who have engineered addiction in children’s brains,” Lanier said, pointing to features like infinite scroll, autoplay, likes, and beauty filters as dopamine-triggering mechanisms.
Kaley’s lawsuit, which is one of 1,500 similar cases, claims she developed anxiety, body dysmorphia, and suicidal thoughts after years of heavy social media use. According to her legal team, she began using YouTube at age six and Instagram at age nine. Phone records show that at 16, she once spent more than 16 hours in a single day on Instagram. Lanier cited internal documents, including a Meta strategy memo suggesting the company must “bring them in as tweens” to “win big with teens.” He argued that digital platforms intentionally created addictive “loops.”
Meta and YouTube strongly dispute those claims. YouTube’s attorney, Luis Li, told jurors flatly: “Ms. GM, Kaley GM, is not addicted to YouTube.” He cited internal data showing she averaged 29 minutes per day on the platform since 2020 and watched only about four minutes per day of autoplay-recommended videos. “Folks, infinite scroll is not infinite,” Li argued. Meta’s lawyer, Paul Schmidt, pointed to Kaley’s difficult upbringing and therapist testimony suggesting social media was not the “throughline” of her mental health struggles. Both companies emphasize safety features, parental controls, “take a break” reminders, and options to disable likes or autoplay.
There is no doubt that social media algorithms can hook users — not just kids, but adults too. Anyone who has opened an app to check one notification and resurfaced to reality 45 minutes later knows this. The modern social media feed is no longer chronological; it is curated by opaque recommendation systems optimized to maximize engagement. This is a deliberate effort to keep you scrolling, to keep you watching, and to keep you coming back for more.
Social media companies are not charities. It takes money to run these digital platforms, and their executives are incentivized to rake in the millions of dollars that follow. Their business model depends on advertising. The longer your eyes stay on the screen, the more ads you see. The more precisely your behavior can be predicted through data tracking, the more valuable you are to advertisers. Algorithms are not designed to enrich your life. They are designed to increase time-on-platform. This may entertain users at first, but many eventually realize they are being steered toward emotionally draining content that keeps them staring at screens for hours.
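To make that incentive concrete, here is a deliberately simplified sketch in Python. Everything in it (the `Post` fields, the scoring rule, the 10.0 weight) is invented for illustration; no platform publishes its actual ranking model, and this is not anyone's real code.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: float                # seconds since epoch
    predicted_watch_seconds: float  # a model's guess at how long you will linger
    predicted_ad_clicks: float      # a model's guess at ad interactions

def engagement_score(post: Post) -> float:
    # The objective is time-on-platform and ad value, not recency
    # and not the user's stated preferences. The 10.0 weight is arbitrary.
    return post.predicted_watch_seconds + 10.0 * post.predicted_ad_clicks

def rank_feed(posts: list[Post]) -> list[Post]:
    # Highest predicted engagement first, regardless of when a post was
    # published or whether the user follows its author.
    return sorted(posts, key=engagement_score, reverse=True)
```

Nothing in that objective rewards leaving you better informed or happier; it rewards whatever keeps the session going.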
In the early days, users built their own digital worlds. Social media was a place to connect with friends you knew in real life, to share photos, and to express yourself. Back then, the only way to curate your feed was to choose whom to follow: you opted in, and you searched for the content you wanted.
Today, we are spoon-fed content based on what the algorithm predicts will hold our attention, a captive backdrop for ads: text posts, photos, and videos that inspire outrage, aspiration, validation, and envy. In a matter of moments, social media users ride an emotional rollercoaster, twisting and turning through a hyper-commercialized theme park of self-branding. This experience is increasingly shaped by forces we do not see. In a business-driven branding culture, social media is the manipulative host of our escapism.
And yet, as concerning as these dynamics are, the lawsuit against Meta and YouTube should fail if it attempts to regulate user content.
Digital platforms are not blameless, and they are fertile ground for psychological damage. But the legal precedent this litigation could set is more dangerous still.
Section 230 of the Communications Decency Act of 1996 shields platforms from liability for user-generated content. Judge Carolyn Kuhl has instructed jurors that they cannot hold the companies liable for allowing or recommending third-party content. That protection is foundational to online free speech. If platforms could be sued for the speech of their users, the likely result would not be a safer Internet. It would be a far more censored and politically policed one.
If any reform comes from this wave of litigation, it should target not user content but algorithmic design.
There is a meaningful distinction between hosting speech and engineering compulsive engagement systems. The former implicates the First Amendment; the latter implicates product design and consumer transparency.
If courts or lawmakers intervene, they should focus on requiring algorithmic transparency. Platforms should be compelled to disclose, in clear language, how their recommendation systems prioritize content. And they should offer a simple chronological feed as the default, with algorithmic amplification turned off unless users manually opt in, as sketched below.
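Continuing the hypothetical sketch above (reusing its `Post` and `rank_feed`), the reform amounts to flipping a single default. The `FeedSettings` flag below is an assumption of this illustration, not an existing setting on any platform.

```python
from dataclasses import dataclass

@dataclass
class FeedSettings:
    # Chronological unless the user explicitly opts in to algorithmic ranking.
    algorithmic_feed_opt_in: bool = False

def build_feed(posts: list[Post], followed: set[str],
               settings: FeedSettings) -> list[Post]:
    if settings.algorithmic_feed_opt_in:
        # The engagement-ranked feed sketched earlier, chosen deliberately.
        return rank_feed(posts)
    # Default: only accounts the user follows, newest first.
    chosen = [p for p in posts if p.author in followed]
    return sorted(chosen, key=lambda p: p.timestamp, reverse=True)
```

The point of the sketch is that a chronological default is not technically hard; the obstacle is the business model, not the engineering.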
Choice is the key.
Right now, most users are automatically placed into algorithm-driven feeds optimized for engagement. A genuine reform would reverse that default. Let the baseline be a chronological feed of the accounts you follow. Let users search for what they want rather than have the platform serve up whatever it calculates will keep them scrolling. Let them decide whether they want an algorithmic feed designed to “optimize” their experience, instead of falling through that trapdoor unknowingly.
Personal responsibility still matters. We choose to download these apps. We choose to open them. We choose how much time we spend on them. Parents bear responsibility for monitoring their children. Adults are responsible for their own habits. Users can take control of their own behavior and their feeds right now, but doing so requires discipline and a working knowledge of the user-experience controls and data-collection permissions these platforms bury in their settings.
Yes, social media platforms are constructed to be sticky. Of course they are. Companies want customers. They want advertising dollars. They want user growth. This is neither shocking nor new. Television networks chase ratings. Casinos design floors without clocks. Grocery stores place candy at eye level for children.
The answer to manipulative design is not to torch the principle of free speech. It is to restore user control.
Technology will continue to evolve as human weakness remains constant. The solution isn’t to sue away temptation, hoping for the death of social media. It is to demand transparency and restore meaningful user choice without dismantling the legal protections that safeguard free expression. Reform the algorithms if you must, but don’t tamper with what people can say on the Internet.
Julianna Frieman is a writer who covers culture, technology, and civilization. She has an M.A. in Communications (Digital Strategy) from the University of Florida and a B.A. in Political Science from UNC Charlotte. Her work has been published by the Daily Caller, The American Spectator, and The Federalist. Follow her on X at @juliannafrieman. Find her on Substack at juliannafrieman.substack.com.