Most people believe they choose what news they consume. They open an app, scroll through a feed, and read what catches their eye.
What they do not fully appreciate is that the feed has already been curated for them by an algorithm that made thousands of decisions about what to show and what to hide before they ever opened the app.
Understanding how those decisions are made is one of the most important pieces of media literacy for navigating the current information environment.
What an Algorithm Actually Is
An algorithm, in the context of social media, is a set of mathematical rules that determines which content to show each user, in what order, and for how long.
These algorithms are not neutral. They are designed with a specific objective: to maximize user engagement, typically measured as time spent on the platform, posts liked or shared, and return visits. Every decision the algorithm makes is in service of that objective.
The algorithm does not know or care whether the content it is showing you is accurate, important, or good for your understanding of the world. It knows whether it is engaging.
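A minimal sketch makes this concrete. The signal names and weights below are purely illustrative assumptions, not any platform's real values; the point is that the ranking objective consults predicted engagement and never consults accuracy.

```python
# Hypothetical engagement-based ranking. Weights and signals are
# illustrative, not taken from any real platform.
from dataclasses import dataclass

@dataclass
class Post:
    title: str
    predicted_clicks: float      # model's estimate, 0..1
    predicted_shares: float      # model's estimate, 0..1
    predicted_watch_secs: float  # expected viewing time
    is_accurate: bool            # note: never consulted below

def engagement_score(p: Post) -> float:
    # The objective is engagement only; accuracy plays no role.
    return (2.0 * p.predicted_clicks
            + 5.0 * p.predicted_shares
            + 0.1 * p.predicted_watch_secs)

posts = [
    Post("Measured policy analysis", 0.10, 0.02, 40.0, True),
    Post("Outrage-bait rumor", 0.60, 0.30, 25.0, False),
]
feed = sorted(posts, key=engagement_score, reverse=True)
# The rumor ranks first even though is_accurate is False.
```

Any content that reliably triggers clicks and shares floats to the top of a feed built this way, whatever its relationship to the truth.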
How Engagement Is Measured
Social media algorithms measure engagement through a range of signals: clicks, likes, reactions, shares, comments, time spent viewing, and return visits.
Research from Facebook's internal teams, revealed in documents published by The Wall Street Journal as part of the Facebook Files investigation in 2021, showed that the company's own researchers had found its engagement-based ranking system was amplifying divisive and misinformation-heavy content because that content generated stronger engagement signals than accurate, measured content.
Facebook's researchers found that posts that generated "angry" reactions were more likely to contain misinformation than posts that generated other reactions. Despite this finding, the angry reaction was weighted five times more heavily than a standard like in the engagement algorithm, because it drove more engagement.
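The effect of that reported weighting is easy to see in a toy scoring function. The weights here simply echo the reported five-to-one ratio of an "angry" reaction to a standard like; they are not Facebook's actual formula.

```python
# Illustrative reaction weighting, echoing the reported 5:1
# angry-to-like ratio. Not any platform's real formula.
def reaction_score(reactions: dict[str, int]) -> float:
    weights = {"like": 1.0, "angry": 5.0}
    return sum(weights.get(kind, 1.0) * count
               for kind, count in reactions.items())

calm_post = {"like": 100}
angry_post = {"like": 20, "angry": 40}
# calm_post scores 100; angry_post scores 220
# despite drawing fewer total reactions (60 vs 100).
```

Under a weighting like this, a post that provokes anger can outrank one that draws far more approval, which is exactly the dynamic the internal research described.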
The Filter Bubble Effect
One of the most studied consequences of algorithmic content selection is the filter bubble: the tendency of algorithms to show users more of what they have already engaged with, gradually narrowing the range of information they are exposed to.
If you like a post criticizing one political party, the algorithm learns that this type of content engages you and shows you more of it. Over time, your feed fills with content that reinforces your existing political views and rarely exposes you to serious engagement with opposing perspectives.
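That feedback loop can be simulated in a few lines. Everything here is an illustrative assumption: the topics, the like probabilities, and the update rule. The mechanism it demonstrates is the one described above: each like nudges the feed toward more of the same.

```python
# Toy feedback-loop simulation of a filter bubble.
# Topics, probabilities, and the update rule are illustrative.
import random

random.seed(0)
feed_weights = {"party_a_criticism": 0.5, "party_b_criticism": 0.5}
like_prob = {"party_a_criticism": 0.9, "party_b_criticism": 0.1}

for _ in range(200):
    # show a topic in proportion to its current feed weight
    topic = random.choices(list(feed_weights),
                           weights=list(feed_weights.values()))[0]
    if random.random() < like_prob[topic]:
        feed_weights[topic] += 0.05  # each like reinforces that topic

share_a = feed_weights["party_a_criticism"] / sum(feed_weights.values())
print(f"share of feed devoted to party A criticism: {share_a:.0%}")
```

Because the topic the user likes gets shown more often, and showing it more often generates more likes, the loop is self-reinforcing: the feed drifts heavily toward the preferred topic.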
Research on filter bubbles is more nuanced than the popular conception suggests. A 2023 study published in Science and co-authored by researchers from Meta found that exposure to algorithmically ranked content on Facebook did increase the proportion of content from like-minded sources, but that this effect was somewhat smaller than commonly assumed.
Platform Differences
Not all social media algorithms work the same way.
Facebook's algorithm prioritizes content that generates strong emotional reactions, particularly from people in your existing social network.
Twitter/X's algorithm has changed significantly since Elon Musk's acquisition of the platform in 2022. The current algorithm amplifies content from paying X Premium subscribers, regardless of the quality or accuracy of that content.
YouTube's recommendation algorithm is particularly powerful because it determines not just what appears in your feed but what plays next after a video ends. Research by journalists and academics has documented cases of the algorithm steering viewers toward progressively more extreme content.
TikTok's algorithm is widely regarded as the most powerful content recommendation system currently deployed at scale. Unlike Facebook, which relies heavily on your social network, TikTok's algorithm infers your interests from your viewing behavior and recommends content from creators you have never followed based entirely on predicted engagement.
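A behavior-only interest model of the kind described can be sketched very simply. The topics and the update rule are hypothetical; the point is that watch behavior alone, with no follows or social graph, is enough to build a preference profile.

```python
# Toy interest inference from viewing behavior alone (no follows).
# Topics and the update rule are illustrative assumptions.
from collections import defaultdict

interests: defaultdict[str, float] = defaultdict(float)

def record_view(topic: str, watch_fraction: float) -> None:
    # A longer watch counts as a stronger interest signal.
    interests[topic] += watch_fraction

for topic, frac in [("cooking", 0.95), ("politics", 0.10),
                    ("cooking", 0.80), ("sports", 0.30)]:
    record_view(topic, frac)

top_interest = max(interests, key=interests.__getitem__)
```

After only four views, the profile already ranks cooking far above everything else, which is why a feed built this way can feel uncannily accurate so quickly.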
What You Can Do
Understanding how algorithms work gives you some ability to push back against their effects, though the tools available to individual users are limited.
Actively seek out sources rather than relying on the feed. Going directly to news outlets you trust, rather than waiting for their content to appear in your algorithmic feed, puts you in control of your information diet.
Use chronological feeds where available. Most major platforms offer the option to view content in chronological order rather than algorithmic order.
Be aware of your engagement patterns. Every time you like, share, or spend significant time viewing a piece of content, you are teaching the algorithm about your preferences.
Use RSS readers or email newsletters for news. These tools deliver content directly from sources you have chosen, bypassing algorithmic curation entirely.
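For readers curious what "bypassing algorithmic curation" looks like in practice, RSS is simple enough to parse with a standard library alone. The feed XML below is an inline sample for illustration; in practice you would fetch an outlet's real feed URL.

```python
# Minimal RSS parsing with the Python standard library.
# SAMPLE_RSS is an inline illustration; real use would fetch
# an outlet's feed URL (e.g. with urllib.request).
import xml.etree.ElementTree as ET

SAMPLE_RSS = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Example Outlet</title>
  <item><title>Budget vote scheduled</title>
        <link>https://example.com/a</link></item>
  <item><title>Court ruling explained</title>
        <link>https://example.com/b</link></item>
</channel></rss>"""

def headlines(rss_xml: str) -> list[tuple[str, str]]:
    root = ET.fromstring(rss_xml)
    # Items arrive in the order the publisher chose:
    # no engagement-based reranking.
    return [(item.findtext("title"), item.findtext("link"))
            for item in root.iter("item")]

for title, link in headlines(SAMPLE_RSS):
    print(title, "-", link)
```

The key property is in the comment: the items come back in the publisher's order, so nothing between you and the source is optimizing for your engagement.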
The Deeper Issue
Algorithmic content curation is not going away. The platforms that use it are too large, too profitable, and too embedded in daily life for that to change in the near term.
But understanding how it works changes your relationship to the information you consume. When you see a piece of content that makes you angry, you can ask: is this appearing in my feed because it is important, or because an algorithm determined it would generate engagement?
PressGrade scores media figures and outlets on the behavioral criteria that distinguish reliable sources from unreliable ones. Knowing which sources score well is the first step toward building an information diet that serves your understanding rather than an algorithm's engagement metrics.