The British government is picking a fight it has already lost, using a playbook that was outdated by 2012.
Prime Minister Keir Starmer’s recent vow to "fight" social media firms to protect children from "addictive" algorithms sounds noble in a soundbite. It plays well with worried parents. It looks great on a campaign flyer. But in the actual machinery of the attention economy, it is worse than useless. It is a distraction that grants Big Tech exactly what it wants: a regulatory framework that treats the symptoms while subsidizing the disease.
We are witnessing a classic political performance. Politicians treat "algorithms" like a supernatural force—a digital boogeyman that can be exorcised with enough legislative paperwork. They talk about "addiction" as if it’s a bug in the software.
It isn't a bug. It’s the product.
The Myth of the Passive Victim
The prevailing narrative—the one Starmer is peddling—is that children are helpless, passive recipients of "algorithmic harm." This is a comforting lie. It suggests that if we just "fix" the math, we fix the child.
I’ve spent fifteen years inside the engine rooms of platforms you use every day. I’ve seen how these systems are built. The "addiction" Starmer wants to fight is not a one-way street. It is a feedback loop.
When a 14-year-old spends six hours a day on TikTok, they are not just being "fed" content. They are training the system. Every swipe, every micro-pause, every re-watch is a data point. The algorithm is a mirror. It doesn't tell people what to think; it reflects what they already desire—often their darkest, most impulsive urges.
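That feedback loop can be sketched in a few lines. This is a hypothetical, drastically simplified model, not any real platform's ranker (real systems use learned embeddings and far richer signals), but the loop is the same shape: engagement updates the weights, and the weights choose the next item.

```python
from collections import defaultdict

def update_interest(profile, topic, watch_seconds, rewatched):
    """Strengthen a user's inferred interest in a topic based on
    implicit engagement signals (watch time, re-watches)."""
    signal = watch_seconds / 60.0 + (2.0 if rewatched else 0.0)
    profile[topic] += signal
    return profile

def next_item(profile, candidates):
    """Serve the candidate whose topic the profile currently weights highest."""
    return max(candidates, key=lambda item: profile[item["topic"]])

profile = defaultdict(float)
candidates = [{"id": 1, "topic": "sports"}, {"id": 2, "topic": "diet"}]

# One long re-watched "diet" video outweighs everything else the system
# knows about this user: the next recommendation reflects the behaviour back.
update_interest(profile, "diet", watch_seconds=180, rewatched=True)
print(next_item(profile, candidates)["id"])  # → 2
```

Nobody wrote a rule saying "show teenagers diet content." The user's own behaviour wrote it.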
If Starmer "fights" the algorithm, he is essentially trying to legislate human curiosity. It’s like trying to ban the wind because it makes the sea too choppy. You can’t regulate a mirror into showing you a different face.
The "Addiction" Strawman
Politicians love the word "addiction" because it implies a lack of agency. It removes responsibility from the user and the parent, placing it entirely on the "pusher."
But let’s get precise. In clinical psychology, "addiction" has a specific meaning: compulsive use that persists despite serious harm, typically with tolerance and withdrawal. Neither the DSM-5 nor the ICD-11 recognizes "social media addiction" as a diagnosis; the closest formal entries concern gaming. If we use the term loosely to describe anything that people find difficult to stop doing, then reading books, playing sports, or engaging in heated political debates are all "addictive."
The real issue isn't the time spent; it’s the displacement. What is social media replacing? It’s replacing physical play, unsupervised social interaction, and—crucially—the ability to be bored. Starmer’s focus on "fixing the firms" ignores the collapse of the "third space" for young people. When parks are closed, youth clubs are defunded, and kids are banned from hanging out in shopping centers, where else are they supposed to go?
They go to the only place that welcomes them 24/7.
By framing this as a war against "evil" tech firms, the government avoids looking at its own failure to provide a physical world worth living in. It’s cheaper to pass a "safety" bill than it is to rebuild the social infrastructure of a nation.
Why Big Tech Loves Regulation
Here is the secret Starmer won't tell you: Big Tech wants these regulations.
Meta, Google, and TikTok have thousands of lawyers and compliance officers. They can navigate a 500-page "Online Safety Act" without breaking a sweat. In fact, they welcome it.
Why? Because a complex regulatory environment is the ultimate "moat." It kills off the competition. A three-person startup in a garage cannot afford the legal team required to prove their algorithm is "non-addictive" to a government regulator. By imposing these hurdles, Starmer is effectively cementing the dominance of the very giants he claims to be fighting.
He is turning the current tech giants into the "Big Tobacco" of the digital age—regulated, taxed, and permanently entrenched.
The Algorithmic Fallacy: Why Transparency is a Trap
One of the cornerstones of the current political push is "transparency." The idea is that if we can just see how the algorithm works, we can fix it.
This is a fundamental misunderstanding of how modern machine learning operates.
Most "algorithms" are no longer a set of "if/then" rules written by a human in a hoodie. They are neural networks that have evolved through trillions of iterations. Even the engineers who built the foundation of these systems cannot tell you exactly why a specific video was served to a specific user at 3:00 PM on a Tuesday.
Demanding "algorithmic transparency" from TikTok is like demanding "neuron transparency" from a human brain. You can look at the map all day, but you won't understand the consciousness.
The Real Cost of "Safety"
When the government demands that platforms "protect" children, what it is actually demanding is mass surveillance.
To know if a user is a child, the platform must verify their identity. To verify identity, you need a passport, a face scan, or a government ID. To ensure that "harmful" content isn't reaching them, the platform must scan every message, every image, and every video sent in private groups.
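Written out as a pipeline, the chain of requirements looks like this. Everything below is a hypothetical sketch, not any platform's real implementation; the function names are invented. The point is that each stage is implied by the one before it, not chosen.

```python
def comply_with_safety_mandate(user, message):
    """Hypothetical sketch of what 'protect children' requires in practice.
    Each step follows from the previous one; none is optional."""
    # 1. To apply child-specific rules, the platform must know the user's age.
    if user.get("verified_age") is None:
        # 2. Age can only be proven with a government-tied credential,
        #    which creates a permanent identity record.
        user["verified_age"] = demand_id_document(user)
    # 3. To keep "harmful" content from minors, every message must be
    #    scanned, including private ones -- there is no narrower way
    #    to enforce the rule.
    if user["verified_age"] < 18 and classify_harm(message):
        return "blocked"
    return "delivered"

def demand_id_document(user):
    # Placeholder for a passport upload or face scan.
    return 15

def classify_harm(message):
    # Placeholder for an automated filter reading every private message.
    return "diet" in message.lower()

print(comply_with_safety_mandate({"name": "alice"}, "new diet trick"))  # → blocked
```

Note what the code never does: it never checks whether the user wanted to be identified or scanned. The mandate leaves no branch for that.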
Starmer’s "fight" for child safety is, in practice, a push for a "De-Anonymized Internet."
If you want a world where every single digital interaction is tied to a government-verified identity and monitored by an automated safety filter, then by all means, support these bills. But don't pretend you're doing it for the kids. You're doing it to build a panopticon.
What Actually Works (But Isn't Politically Useful)
If a government actually cared about the mental health of children, it wouldn't be arguing with Mark Zuckerberg about his recommendation engine. It would be doing things that are far more radical—and far more difficult.
- Mandate Interoperability, Not Safety. Instead of telling platforms how to moderate, tell them they have to let users leave. Force platforms to allow users to take their data—and their social graph—to a different app. If a platform is "addictive," let a competitor build a "clean" interface that uses the same data but filters out the garbage. This breaks the monopoly on the attention span.
- Tax the Harvest, Not the Sale. The current business model is "extract data, sell ads." Taxing the collection of data—literally making it expensive to track a child—is the only thing that will change the behavior of these firms. As long as the data is free to harvest, they will find a way to harvest it, regardless of the "safety" labels.
- Invest in Physical Resistance. The only "antidote" to the digital world is the physical one. If we spent half the money we spend on "online safety" audits on building high-quality, free, safe physical spaces for teenagers, the "addiction" problem would solve itself.
The Brutal Truth
The reason Starmer’s rhetoric feels so empty to those of us who have worked inside this industry is that it ignores the most uncomfortable fact of all:
Parents are part of the problem. It is easier to blame an algorithm for a child's depression than it is to admit that we have handed our children "digital pacifiers" because we are too tired, too busy, or too distracted by our own phones to engage with them.
The algorithm didn't steal your child's childhood. It just filled the void that was already there.
Starmer isn't "fighting" for you. He’s performing for you. He’s giving you a villain to hate so you don't have to look at the mirror.
Stop asking the government to fix the software. Start asking why the hardware of our society—our schools, our streets, our homes—has become so hostile to young people that they feel they have no choice but to escape into a screen.
Until we fix the real world, the virtual one will always win.
Put down the bill. Open the door.