The Problem with Algorithmic Recommendations


You don’t choose what you see on most platforms anymore. Algorithms do. They decide which posts appear in your feed, which videos play next, which products get recommended, which news articles surface.

This seems helpful. Personalization! Relevance! No more scrolling through irrelevant content! And sometimes it is helpful. But the trade-offs are significant and mostly invisible until you look for them.

The Optimization Target

Algorithms optimize for engagement. Watch time. Click-through rate. Time on platform. These metrics proxy for value—if you’re spending time, you must be getting value, right?

Except engagement doesn’t equal value. Rage scrolling is engaging. Clickbait is engaging. Outrage is incredibly engaging. Content that makes you angry keeps you clicking more than content that makes you think.

So algorithms surface content that triggers emotional responses, not necessarily content that’s accurate, important, or valuable. They’re optimizing for the wrong thing, but they’re very good at it.
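The dynamic above can be sketched in a few lines. This is a toy illustration, not any platform's actual ranker: the post data and scoring formula are invented for the example. The point is structural — accuracy never enters the score, so the ranker cannot prefer it.

```python
# Hypothetical posts; the metrics and the "accurate" flag are invented.
posts = [
    {"title": "Careful analysis", "clicks": 120, "watch_secs": 300, "accurate": True},
    {"title": "Outrage bait",     "clicks": 900, "watch_secs": 450, "accurate": False},
    {"title": "Useful tutorial",  "clicks": 200, "watch_secs": 600, "accurate": True},
]

def engagement_score(post):
    # A toy proxy for "value": clicks plus watch time.
    # Note that post["accurate"] never enters the calculation --
    # the ranker optimizes what it can measure, not what's true.
    return post["clicks"] + post["watch_secs"]

ranked = sorted(posts, key=engagement_score, reverse=True)
print([p["title"] for p in ranked])
# -> ['Outrage bait', 'Useful tutorial', 'Careful analysis']
```

The inaccurate but provocative post wins not because anyone chose it, but because the objective function is blind to everything except engagement.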

The Filter Bubble

Recommendation algorithms learn what you like and show you more of it. Sounds great. In practice, it means you see progressively narrower slices of perspective.

Like true crime videos? YouTube will feed you endless true crime. Political content? You’ll see more extreme versions of your existing views. The algorithm isn’t trying to radicalize you—it’s just showing you what keeps you watching.

But constant reinforcement of existing beliefs without encountering different perspectives creates distorted worldviews. You think everyone agrees with you because your feed shows only agreement. Dissenting views become invisible.
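The narrowing isn't driven by any single decision; it's a feedback loop. Here's a toy simulation of that loop — a "recommend in proportion to past clicks, user clicks what's shown" cycle, with all numbers invented. A small initial tilt toward one topic tends to compound, because each recommendation it wins makes the next one more likely.

```python
import random

random.seed(0)  # fixed seed so the run is repeatable

# Invented starting click counts; "politics" has a slight head start.
clicks = {"politics": 6, "science": 5, "cooking": 5}

for _ in range(200):
    topics = list(clicks)
    weights = [clicks[t] for t in topics]
    # The "algorithm" shows a topic in proportion to past engagement...
    shown = random.choices(topics, weights=weights)[0]
    # ...and the user engages with whatever is shown, reinforcing it.
    clicks[shown] += 1

total = sum(clicks.values())
print({t: round(clicks[t] / total, 2) for t in clicks})
```

This is a rich-get-richer process: early random fluctuations get locked in, and the final topic mix says as much about the loop as about the user's real preferences.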

The Novelty Trap

Algorithms favor new content over good content. A mediocre new video gets recommended over an excellent video from last week. Recency is weighted heavily.

This creates pressure to constantly produce new content even if you don’t have anything new to say. Quality suffers. Depth suffers. Evergreen content that remains valuable gets buried by the new and mediocre.
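One common way to model heavy recency weighting is exponential decay on a quality score — an assumed form for illustration, not any platform's published formula. With aggressive decay, a mediocre new item outscores an excellent week-old one:

```python
import math

def recency_score(quality, age_days, decay=2.0):
    # Assumed scoring form: quality discounted by exponential age decay.
    # A small `decay` constant means recency dominates quality.
    return quality * math.exp(-age_days / decay)

mediocre_new = recency_score(quality=0.4, age_days=0)   # 0.40
excellent_old = recency_score(quality=0.9, age_days=7)  # ~0.03

print(mediocre_new > excellent_old)  # prints True
```

After a week, the excellent video retains only about 3% of its score under this decay rate. Evergreen value is structurally invisible to a ranker shaped like this.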

The Amplification Problem

Algorithms amplify whatever generates engagement. Sometimes that’s harmless. Sometimes it’s conspiracy theories, misinformation, or extremist content.

Not because platforms want to promote bad content, but because controversial content generates engagement. Debunking a conspiracy theory doesn’t engage as well as the conspiracy theory itself.

The algorithm doesn’t know or care about truth. It knows about clicks. That’s a problem when false but engaging content outperforms true but boring content.

The Manipulation Vulnerability

Once you know how algorithms work, you can game them. SEO isn’t about creating good content anymore—it’s about creating content algorithms reward. Same with social media, YouTube, TikTok.

Creators optimize for the algorithm rather than for audiences. Clickbait headlines, exaggerated thumbnails, engagement bait (“like and subscribe!”), video length adjusted to hit sweet spots for recommendation.

The result is an internet increasingly designed to please algorithms rather than humans. We’ve outsourced taste to machines optimized for engagement, and we’re getting exactly what we asked for.

The Loss of Serendipity

Before algorithms, you encountered unexpected things. You’d flip through a magazine and find an article on a topic you’d never considered. You’d channel surf and discover a documentary.

Algorithms reduce serendipity. You see more of what you like, less of everything else. Discovery becomes constrained by your existing preferences, which prevents your preferences from evolving.

The internet feels smaller because your view of it is smaller. The algorithm shows you a personalized slice, and that slice feels like the whole thing.

The Control You’ve Lost

You don’t decide what’s important anymore. Algorithms do. You might care deeply about updates from a specific friend or creator, but you’ll see their posts only if the algorithm decides to show them.

Chronological feeds gave you control—you saw everything in order and decided what to engage with. Algorithmic feeds take that control. You see what the algorithm surfaces, miss what it doesn’t.

You can’t opt out on most platforms. There’s no “just show me everything chronologically” option, or if there is, it’s buried and intentionally degraded.

The Business Incentive

Platforms make money from ads. Ads require attention. Algorithms maximize attention. Therefore, algorithms serve platform business models, not user interests.

When those align, everyone wins. But when they don’t—when keeping you engaged means showing you content that’s bad for you—the platform chooses engagement every time.

This isn’t evil. It’s economics. But the result is platforms optimized for their benefit, not yours, even when those interests conflict.

What Actually Helps

Actively curate. Unfollow accounts that don’t add value. Seek out diverse perspectives intentionally. Don’t rely on algorithms to show you what matters.

Use RSS feeds for content you care about. RSS is chronological and complete. No algorithm decides what you see. It’s old technology, but it puts you in control.
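Reading an RSS feed takes only the standard library. A minimal sketch, using an inline sample feed (in practice you'd fetch the feed's URL): every item comes back in the order the publisher listed it, with no ranking step anywhere.

```python
import xml.etree.ElementTree as ET

# Inline sample feed standing in for a fetched RSS document.
sample_rss = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Example Blog</title>
  <item><title>Newest post</title><pubDate>Tue, 05 Mar 2024 10:00:00 GMT</pubDate></item>
  <item><title>Older post</title><pubDate>Mon, 04 Mar 2024 10:00:00 GMT</pubDate></item>
</channel></rss>"""

root = ET.fromstring(sample_rss)
# Every item, exactly as the feed lists it -- nothing filtered or reordered.
for item in root.iter("item"):
    print(item.findtext("title"), "|", item.findtext("pubDate"))
```

That's the whole appeal: the feed is complete and chronological by construction, so there's no scoring function to game and nothing deciding what you're allowed to see.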

Limit algorithmic platforms. You don’t have to quit social media, but reducing time on algorithm-driven feeds reduces their influence over what you think about.

Seek out human curation. Newsletters, magazines, humans recommending things—these have biases too, but they’re more transparent and less optimized for engagement metrics.

Be aware of the influence. Knowing algorithms are shaping what you see helps you question whether your views are based on representative information or algorithmic filtering.

For Content Creators

The pressure to optimize for algorithms is real. Livelihoods depend on visibility, and visibility depends on pleasing algorithms.

But there’s value in resisting. Creating for humans rather than algorithms produces better work, even if it gets less immediate engagement. Building direct relationships with audiences—email lists, memberships, subscriptions—reduces algorithmic dependence.

Not everyone can afford to ignore algorithms. But those who can should consider whether optimizing for engagement is worth the trade-offs in quality and integrity.

The Regulatory Question

Should algorithms be regulated? Required to be transparent? Forced to give users control? These are active debates without clear answers.

Forcing platforms to offer chronological feeds seems reasonable. Requiring transparency about why content gets recommended seems fair. But the details are complicated and enforcement is challenging.

Individual awareness and choice might be more effective than regulation. Platforms respond to user behavior, and user behavior can change: if enough people demand control and better recommendations, platforms will follow.

The Bigger Picture

Algorithms aren’t evil. They’re tools. But they’re tools optimized for specific outcomes that don’t always align with user well-being.

Being aware of that misalignment helps you use platforms more intentionally. You can benefit from recommendations while resisting their influence on what you think, what you care about, and how you see the world.

The algorithm wants your attention. Whether you give it should be your choice, made consciously, not something that happens by default because the interface is designed to keep you scrolling.

Take back some control. Your attention is yours. Spend it deliberately.