The jury just heard the final words in a trial that could finally break the "Wild West" era of social media. In a Los Angeles courtroom on Thursday, March 12, 2026, lawyers for a 20-year-old woman named Kaley (identified in court as K.G.M.) finished their closing arguments against Meta and Google. After a month of high-stakes testimony, including a rare appearance by Mark Zuckerberg himself, the case is now in the hands of twelve citizens.
This isn't just about one person's screen time. It's a bellwether trial, the first of its kind, designed to test whether tech giants are legally responsible for engineering "addiction" in children. If the jury sides with Kaley, it could trigger an avalanche across the roughly 10,000 other lawsuits waiting in the wings.
The Gazelle and the Lion Strategy
Mark Lanier, Kaley's attorney, didn't hold back during his final appeal. He showed the jury an image of a lion stalking a herd of gazelles. He argued that Meta and YouTube aren't just platforms; they're predators that don't go after the strong. They target the "weakest"—the vulnerable, developing brains of children.
Lanier’s case rests on the idea that features like the infinite scroll and autoplay are "defective products." Think of them like a slot machine in a child's pocket. He compared Instagram’s endless feed to free tortilla chips at a restaurant. You don't eat them because you’re hungry; you eat them because they're there, salty, and designed to keep you chewing.
The plaintiff's team brought out internal documents from "Project Myst," a Meta study that reportedly showed the company knew traumatized or stressed kids were the most likely to get hooked. Lanier’s point was sharp: "I don’t naysay the opportunity to make money, but when you’re making money off of kids, you have to do it responsibly."
The Defense Blames the Home Life
Meta and Google aren't rolling over. Their defense has been a mix of "it’s not our fault" and "it’s the parents’ fault." Meta’s lawyer, Paul Schmidt, spent a lot of time digging into Kaley’s medical records and home life. He told the jury that her depression and suicidal thoughts weren't born in an app; they were the result of a "turbulent" family dynamic and personal trauma that existed before she ever hit "post."
Basically, the defense is saying social media was a symptom, not the cause. They pointed out that while Kaley's therapists believe in social media addiction, they never actually diagnosed her with it. Google took it a step further, arguing that YouTube isn't even a social media platform and that Kaley had been using it for only about 30 minutes a day in recent months, hardly the mark of a "junkie."
What Mark Zuckerberg Said Under Oath
The most dramatic moment of the trial came on February 18, when Mark Zuckerberg took the stand for his first-ever jury testimony. It was a cold, tense confrontation. Lanier grilled him on internal PowerPoints titled "Creating the Future" that described how to target "younger generations."
Zuckerberg’s defense was classic "Big Tech":
- He claimed existing research doesn't prove a causal link between social media and mental health harm.
- He refused to pledge money to help victims, saying he "disagreed with the characterization" of the problem.
- He argued Meta has invested billions in safety features.
But the jury also saw unsealed emails where Meta executives allegedly ignored staff warnings about beauty filters causing body dysmorphia. It painted a picture of a company that knew the risks but chose the "like" button over the "lock" button.
Why Section 230 Might Not Save Them This Time
For decades, social media companies have hidden behind Section 230 of the Communications Decency Act. This law says they aren't responsible for what users post. If someone bullies you on Facebook, you can't sue Facebook for what was said.
Kaley’s legal team is trying to bypass that shield by suing over product design.
- Infinite Scroll: Removes natural "stopping cues" that tell the brain to take a break.
- Push Notifications: Exploits impulsivity to drag users back into the app.
- Algorithms: Specifically tuned to keep you scrolling even when you want to stop.
By arguing that the code is a defective product—rather than the content—they're forcing the court to look at social media like a defective car or a dangerous toy.
What Happens if the Jury Says Yes
This isn't just about a payout for Kaley. TikTok and Snap already settled their portions of this case for undisclosed amounts. The fact that Meta and Google are the last ones standing shows they’re terrified of the precedent.
If the jury finds them liable:
- Design Changes: We could see a legal mandate to kill the infinite scroll or limit notifications for minors.
- The $270 Valuation: Testimony revealed that Meta internally valued some teen users at $270 each to the company. A liability verdict would make that "profit" look very small next to potential damages.
- Mass Settlements: Thousands of school districts and families are watching. A liability verdict here means Meta and Google likely start writing checks to settle the other 1,600+ cases before they ever hit a courtroom.
Practical Steps for Parents Right Now
Don't wait for a jury to tell you what's healthy. Experts in the trial highlighted specific red flags that indicate a child has moved from "using" to "addicted":
- Tolerance: Does your kid need more and more screen time to feel the same level of "fine"?
- Withdrawal: Do they get physically agitated or aggressive when the phone is taken away?
- Neglect: Are they skipping meals, sleep, or real-life friends to scroll?
Most experts in the trial suggested that "parental controls" are largely ineffective against a trillion-dollar algorithm. The most effective move is a hard delay: keep them off these platforms until their prefrontal cortex has a fighting chance—ideally age 16. If they're already on it, move the charging station out of the bedroom tonight. Blue light is the least of your worries when an algorithm is trying to "leverage" your child's self-worth for an ad impression.