Willy Wonka Runs the Internet - and I Want To Check His Factory
Like Charlie’s grandfather, Grandpa Joe, the CMA is finally waking up.
Britain’s Competition and Markets Authority has begun inspecting the algorithms that shape our digital experience. It has launched a formal ‘Analysing Algorithms’ programme, it is reviewing code, and under the new Digital Markets, Competition and Consumers Act it now has the power to impose conduct rules on firms like Google and Apple.
So, the regulators are stirring. But have they really thought it all through? The CMA will approach its investigation through the lens of fair competition but not, I imagine, with an eye on exactly what type of content is being surfaced.
I think there’s a deeper issue hiding in plain sight. When an algorithm is designed to optimise for ad revenue, it will inevitably prioritise content that maximises engagement and clicks. That means sensational, familiar, populist material. Not nuance. Not complexity. Certainly not public service content.
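To make the mechanism concrete, here is a toy sketch - entirely hypothetical, not any platform's real code - of a feed ranker that scores items purely on predicted engagement and ad revenue. The item names and numbers are invented for illustration.

```python
# Toy sketch of an engagement-optimised ranker (hypothetical, illustrative only).
from dataclasses import dataclass

@dataclass
class Item:
    title: str
    predicted_ctr: float  # invented click-through estimate
    ad_value: float       # invented revenue per click

def rank_feed(items):
    # Score = expected ad revenue. Nothing in this formula rewards
    # educational or civic value, so such items sink by default.
    return sorted(items, key=lambda i: i.predicted_ctr * i.ad_value, reverse=True)

feed = rank_feed([
    Item("Celebrity meltdown clip", 0.12, 0.50),
    Item("Local council budget explainer", 0.02, 0.10),
    Item("Outrage reaction video", 0.10, 0.40),
])
# The sensational clip tops the feed; the civic explainer comes last.
```

Nothing here is malicious. The ranking is simply blind to any value it was never asked to measure - which is exactly the point.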
This is where it gets dangerous because if the algorithm ignores or suppresses content that doesn’t drive revenue, then whole genres risk disappearing. Education, civic information, slow storytelling. Not because they aren’t valuable but because they aren’t profitable.
That’s where public service broadcasters should come in - especially those like the BBC, which receives £3.4 billion a year from the public. As my regular readers know, over the past few weeks I’ve been increasingly critical of the BBC’s desire to chase profitability over its founding principles - to inform and educate. If it’s allowed to abandon its mission, it won’t just hurt the industry. It will hurt itself.
One producer even told me, and I don’t know if this is true, that the BBC has started to charge producers to put content on its Sounds platform. I sorely hope this isn’t the case because if so, it speaks volumes about how far Aunty has drifted.
If we want a plural creative economy - not just one driven by attention metrics - then organisations with public value missions must open their platforms, and their algorithms, to outsiders. That’s how you support real diversity. That’s how you create opportunity.
But this isn’t just about the BBC. It’s about all of us. How can anyone build a sustainable content business without understanding the rules of engagement? What earns money? What gets surfaced? What gets buried? These are strategic questions, not just creative ones.
Much like the government once compelled Sky to keep the BBC and other broadcasters high up on the EPG, we now need rules that apply to digital platforms - to Google, to TikTok, to whatever’s coming next. If culture is shaped by algorithm, then the rules of the algorithm must be known.
Right now, though, no one outside the platforms understands how they work and that’s the danger. We are standing outside Willy Wonka’s chocolate factory, hoping the Oompa Loompas are nice.
So who is going to sort this out?
Ask any experienced TV exec whether digital is the future, and they’ll nod along. But then ask what they’ve made for digital. What they’ve commissioned, edited, launched, scaled.
Suddenly, it gets quiet.
The truth is, most leaders in our side of the content industry built their careers in a different world. One where TV ruled the cultural agenda. Where getting a commission meant broadcast slots and BARB ratings. Where you could walk into a room and talk to someone in charge.
Digital doesn’t work like that. You don’t pitch to an algorithm. You don’t get feedback. You don’t even know who made the decisions. There are no commissioners - only curation engines.
So when traditional media tries to go digital, it often does so with TV logic. Great storytelling, great packaging, great craft - and then it sinks. Why? Because no one in the building understands how their content is being surfaced or buried. We need someone to fight our corner - someone who understands both types of content, as a commissioner and a creator - to break into the factory and see how the sweets are made.
There’s still a myth - told by execs from those very companies - that “the best content will win.” That the algorithm is somehow Darwinian in its design. But I think in a world of infinite choice, we don’t just need survival of the fittest. We need discoverability for different content, and a system that gives new voices a chance to be heard.
The algorithm isn’t neutral. It never was.
We’re told these systems are neutral. That the algorithm just gives people more of what they like. But let’s be honest: we’ve seen this movie before.
Remember when VW got caught rigging its emissions tests? The code inside the car knew when it was being tested and changed its behaviour. That wasn’t a bug. It was the design.
So when tech giants tell us the code is fair and unbiased, forgive me if I ask to look under the bonnet.
Because here’s the problem: no one outside the system really knows how content gets chosen, elevated or suppressed. It’s not just about news. It’s about culture. Comedy. Music. Education. Diversity. Innovation. What gets seen shapes what gets made.
And the platforms aren’t just showing us what we want. They’re steering what we get.
Legacy media is a compliance prison. But at least we can see the guards.
We all know TV is over-regulated. There are compliance officers, fairness reviews, watershed rules, taste committees.
But here’s the irony: at least we know what the rules are. Everyone in the system can read the manual. There are appeals processes. Independent watchdogs. Legal frameworks. You can challenge the decision.
Compare that to digital. You post a video. It dies. No views, no explanation. You post another - 2 million views in an hour. Why? No one knows. Not even the platform staff. Or worse: they do know, but they won’t tell us.
That’s what digital creators face every day. A roulette wheel that now serves as our main cultural filter.
The algorithm is designed to give you what you want but that’s the problem. Because what you want isn’t always what you need.
If you like Star Wars, your feed fills with Star Wars. Lightsabers. Fan theories. Behind-the-scenes footage. Merchandise. Re-edited trailers. Endless sequels. But would it ever show you Game of Thrones? Or Succession? Or even something non-fiction? Probably not. Because those aren’t safe bets. They’re not what the algorithm thinks you’ll click.
And when the system only serves what you already enjoy, it becomes harder to discover anything else. Harder to grow. Harder to change your mind.
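The narrowing effect can be sketched in a few lines. This is a hypothetical, simplified recommender - no real platform works this crudely - that scores items by overlap with what you've already liked. The tags and titles are invented.

```python
# Toy filter-bubble sketch (hypothetical, illustrative only):
# recommend whatever most resembles your existing history.
liked_tags = {"star wars", "sci-fi"}  # invented viewing history

catalogue = {
    "Lightsaber fan theory":     {"star wars", "sci-fi"},
    "Star Wars trailer re-edit": {"star wars"},
    "Game of Thrones recap":     {"fantasy"},
    "Council budget explainer":  {"news", "civic"},
}

def score(tags):
    # Jaccard-style overlap with your history: anything outside
    # your existing tastes scores zero and is never surfaced.
    return len(tags & liked_tags) / len(tags | liked_tags)

feed = sorted(catalogue, key=lambda title: score(catalogue[title]), reverse=True)
```

Under this scoring, Game of Thrones and the civic explainer both score exactly zero - not because they're bad, but because the system only measures resemblance to what you already watch.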
It’s like food. We all like chips. But if your diet is nothing but fast food, you’ll end up addicted, overweight, and undernourished. Where are the healthy salads? The surprising flavours? The roughage that makes you uncomfortable but keeps you balanced?
In the old world, the menu was limited. You got the salad whether you wanted it or not. Now? You just swipe past it. Or the algorithm hides it from you altogether.
That’s not just personal preference. That’s a cultural distortion.
I think we need to regulate the algorithm like we regulate food.
Here’s a question - would you eat something made by a giant corporation without knowing what was in it?
Of course not. Every snack has to disclose its ingredients. Every medication lists its side effects. Every toy has a safety warning. That’s because we’ve decided that when something goes into your body or your child’s hands, it has to be accountable.
So why don’t we treat the stuff that goes into our minds the same way?
Digital media shapes our perception of the world. It informs our kids’ views on race, gender, politics, sex, history, values and it does so invisibly, behind closed doors, based on logic no government or regulator has ever properly inspected.
We need an ingredients label for the feed. A ‘nutritional breakdown’ of why content is shown. A regulator who understands both the code and the culture.
We need a tour of Wonka's factory - not just a ride in his glass elevator.
We need to understand how the new system works. Who built it. Who audits it. Who decides what gets elevated or hidden. Who’s checking for bias, for harm, for integrity.
If a single platform can decide what a billion people see each day, that platform must be open to inspection.
Not to censor but to understand it. To learn from it. To level the playing field.
Let’s stop pretending the algorithm is like Wonka’s magic. It’s code. Someone wrote it. It can be studied, regulated, and made fair.
Otherwise we'll all end up like Augustus Gloop - sucked into the pipes of the factory...