You open Netflix, scroll through a list of “Top Picks,” and settle on a show. You feel like you chose it. But did you?
In today’s world, algorithms quietly shape our decisions - from what we watch and buy to who we date and even what we believe. The illusion of choice is no longer just a marketing trick. It’s a systemic feature of algorithmic design.
Recommendation engines are everywhere. Netflix, YouTube, Amazon - they all use algorithms to predict what you’ll like. And they’re good at it. According to Kartik Hosanagar, a professor at Wharton, 80% of Netflix viewing hours and about a third of Amazon purchases come from algorithmic suggestions. That’s not just helpful; it’s transformative.
But personalization has a dark side. These systems don’t just reflect your preferences—they shape them. The more you engage, the more the algorithm learns, and the narrower your options become. You’re not exploring; you’re being guided.
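That narrowing loop can be sketched in a few lines of Python. This is a toy model, not any real platform’s code - the category names, starting scores, and reinforcement rule are all invented for illustration:

```python
# Toy recommender that only learns from clicks (all values hypothetical).
scores = {"drama": 1.0, "comedy": 1.1, "documentary": 1.0, "horror": 1.0}

def recommend(scores):
    # Always surface the category with the highest engagement score.
    return max(scores, key=scores.get)

history = []
for _ in range(20):
    pick = recommend(scores)
    history.append(pick)
    scores[pick] += 0.5  # each click reinforces the same category

# After a few rounds, the "feed" collapses to a single category.
print(set(history))  # {'comedy'}
```

A tiny initial preference (1.1 vs 1.0) is enough: the top pick gets reinforced, which makes it the top pick again, and every other option is starved of exposure. Real systems are far more sophisticated, but the feedback dynamic is the same.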
Algorithms don’t have opinions. They learn from data—your clicks, pauses, purchases, and patterns. But if that data reflects bias, the algorithm absorbs it. In one case, courtroom algorithms used to predict recidivism were found to be biased against Black defendants. Not because someone programmed racism, but because historical data encoded it.
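A toy example makes the mechanism concrete - the neighborhood names and counts below are invented, and “training” is reduced to memorizing majority labels, but it shows how a model that never sees a protected attribute can still absorb bias through proxies in historical labels:

```python
from collections import Counter

# Hypothetical historical records: (neighborhood, labeled_high_risk).
# The labels reflect past enforcement patterns, not actual risk.
history = (
    [("north", True)] * 80 + [("north", False)] * 20
    + [("south", True)] * 20 + [("south", False)] * 80
)

def train(records):
    # "Training" here just memorizes the majority label per feature.
    counts = {}
    for feature, label in records:
        counts.setdefault(feature, Counter())[label] += 1
    return {f: c.most_common(1)[0][0] for f, c in counts.items()}

model = train(history)
print(model)  # {'north': True, 'south': False}
```

Race never appears as an input, yet if neighborhood correlates with race, the model reproduces the disparity baked into the labels. Nobody programmed the bias; the data carried it in.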
This is the heart of the illusion: you feel in control, but the system is steering you based on patterns you didn’t choose.
Most algorithms operate opaquely. You don’t know why a post showed up in your feed or why a product was recommended. Even with efforts like Explainable AI, the logic behind many decisions remains inaccessible to users. That lack of transparency makes resistance nearly impossible. How do you push back against a system you can’t see?
The more you engage with certain content, the more you’re shown similar content. This creates echo chambers—especially on social media—where your beliefs are reinforced and alternatives are filtered out.
It’s not censorship. It’s curation. And it’s subtle enough that you might not notice until your worldview feels algorithmically sealed.
So have we lost our free will? Not entirely. But we’re outsourcing more decisions than we realize. As Hosanagar puts it, “Most of us really do not have the free will that we think we do.” That doesn’t mean we should ditch algorithms. It means we need to understand them - and reclaim agency where we can.
Algorithms aren’t evil. But they’re not neutral either. The illusion of choice is powerful—but awareness is the antidote.