Trial Set to Start Against Meta, TikTok, and YouTube, Alleging Addictive Technology
Outside of being a lawyer, I'm a parent. And one thing that has deeply troubled me is watching the impact social media and technology have had on this generation of children. The evidence has only become clearer with time: social media is not good for kids' development.
And yet, these companies remain largely unregulated and unfettered in their ability to reach and control our children.
Trial starts this week in a California lawsuit against social media behemoths Meta, TikTok, and YouTube, alleging the companies knowingly and intentionally designed their algorithms to target children, causing serious mental health issues across a generation. This case is expected to serve as a bellwether for numerous other lawsuits making similar allegations.
What Are These Lawsuits About?
At the core, these cases aren’t about individual bad posts or one-off incidents. They’re about product design.
Plaintiffs allege that social media companies intentionally engineered features that exploit known psychological vulnerabilities in young users, including:
Infinite scrolling
Algorithmic content amplification
Push notifications and streaks
Likes, hearts, and follower counts
Personalized recommendation systems
Internal research, some of which has already been made public, allegedly showed increased risks of anxiety, depression, sleep disruption, eating disorders, and suicidal ideation in young users. The lawsuits claim companies knew this and chose growth and profit anyway.
That’s not “free speech.” That’s alleged negligence and deception.
Who Is Suing?
The plaintiffs vary, but the legal theories overlap:
States argue consumer protection violations and public nuisance
School districts seek damages for the resources required to address student mental health crises
Parents and minors allege product liability, negligence, and failure to warn
Many cases focus on harm to children because minors lack the capacity—and often the choice—to meaningfully consent to the risks imposed by these platforms.
This Isn’t About Parenting or Personal Responsibility
One of the loudest defenses from social media companies is that parents should simply “monitor usage better.”
Courts are increasingly skeptical of that argument.
Why? Because you can’t supervise away a defective product.
If a toy is coated in lead paint, the manufacturer doesn’t escape liability by saying parents should’ve watched their kids more closely. If a car has a faulty airbag, the blame doesn’t fall on the driver.
The lawsuits argue social media should be treated the same way: if a product is foreseeably dangerous, especially to children, the company that designed it bears responsibility.
What Makes These Cases Different from Past Tech Lawsuits?
Two things stand out:
The evidence
Plaintiffs now have internal company documents, whistleblower testimony, and independent research showing companies understood the risks.
The framing
These cases don’t attack content. They attack design choices, a crucial distinction that helps plaintiffs avoid First Amendment and immunity roadblocks.
Courts are being asked to consider whether social media platforms are less like neutral bulletin boards and more like intentionally engineered addiction machines.
Why This Matters, Even If You’re Not on Social Media
These lawsuits aren’t just about kids who spend too much time online. They’re about who pays the cost when corporations externalize harm.
School counselors overwhelmed. Emergency rooms flooded with youth mental health crises. Families watching their children spiral. Public systems absorbing the fallout.
When companies profit while communities pay the price, the civil justice system exists to rebalance that equation.
What Happens Next?
These cases are still working their way through courts, and outcomes will vary. But even at this stage, the lawsuits have already changed the conversation.
They force transparency.
They force accountability.
And they force a long-overdue question:
If a company knowingly designs a product that harms children, shouldn’t it be held responsible?
The courts may soon answer that question. And whatever the outcome, these cases signal a shift: tech companies are no longer immune from the basic rule that applies to every other industry.
If you create a dangerous product, you answer for the harm it causes.