
Who Needs Information?
Art imitates life, imitates art, imitates life, imitates art…
In 1987, Pink Floyd co-founder Roger Waters released a solo album, Radio K.A.O.S. In typical Waters fashion, it is dripping with themes critical of the state of the world and the power dynamics that shape it.
Track two, "Who Needs Information," tragically seems even more true today than it was thirty-eight years ago.
It is a critique of the overwhelming flood of media that confronts us day in and day out, and, more to the point, of just how much that media distracts, shapes, and influences society.
The lyrics resonate eerily well when viewed through a contemporary lens. Social media platforms and digital publishers have largely displaced traditional media outlets. They have become the modern-day purveyors of news, opinions, research papers, misinformation, user-generated content, blogs, disinformation, live streaming, and on and on...
Looking back at the state of the media when this song was released, it appears laughably innocent. The sheer volume and variety of content available then seems quaint by today's standards. MSNBC and Fox News were still a decade away. The iPhone, Facebook, and Twitter were still two decades away; TikTok, three.
These platforms are no longer tools for "connection," as the already antiquated term "social media" once implied. Instead, they are meticulously designed information distribution ecosystems that rely and thrive on engagement.
At the heart of this designed ecosystem are algorithms that dictate every bit of information we see. More crucially, they dictate WHO SEES WHAT.
The algorithms don't tell us what to click on, not directly. But by carefully curating what we see, they act as gatekeepers of information, subtly shaping our perceptions of the world. These algorithms don’t just show us content; they decide what’s worth our attention, what’s relevant to us, and what will keep us engaged. The result? A deeply personalized experience that maximizes engagement but fractures our shared understanding of reality.
The goal of algorithmic personalization is to maximize your time spent on the platform. That means it pushes content to you based on what it “believes” you are most likely to engage with: comment on, like, share, and so on.
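To make that concrete, here is a toy sketch of what engagement-driven ranking looks like in principle. The class, the weights, and the scoring formula are invented for illustration; real platforms use large machine-learned models and thousands of signals, but the objective is the same: rank by predicted engagement, not by accuracy.

```python
from dataclasses import dataclass

@dataclass
class Post:
    id: str
    topic: str
    # Hypothetical model outputs: the predicted probability that THIS user
    # will click, comment on, or share this post. Note what is missing
    # from the inputs: any measure of accuracy or truth.
    p_click: float
    p_comment: float
    p_share: float

def engagement_score(post: Post) -> float:
    # Illustrative weights: comments and shares keep people on the
    # platform longer than a passive click, so they count for more.
    return 1.0 * post.p_click + 3.0 * post.p_comment + 5.0 * post.p_share

def rank_feed(candidates: list[Post]) -> list[Post]:
    # The feed is simply the candidate posts sorted by predicted engagement.
    return sorted(candidates, key=engagement_score, reverse=True)
```

Under a scoring rule like this, a post engineered to provoke comments will outrank a sober correction every time, even when both are shown to the same person.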
At the heart of this system is a simple yet profound truth: what you see on social media is not what everyone else sees. Your feed is tailored to you—your interests, your biases, your past behavior. The person across the street, your co-worker, your kid’s soccer coach—they’re all seeing different versions of the same platform. They are all getting information that is tailored to their interests, tastes, and behaviors. This personalization might seem harmless when it comes to taste in music, sports, or entertainment. But what happens when it extends to critical information—news, facts, and truth itself?
What happens when there are thousands of different "tastes" for important and critical information? Different "tastes" for truth, veracity, trust, etc.?
And what happens when the information distribution ecosystem itself is reinforcing and amplifying different truths, veracity, and trust?
As a society, we are all seeing different, fragmented bits of information instead of a consistent base of fact or verified truth.
Algorithms personalize what appears on your screen based on what you have previously engaged with, both on and off the platform. They build a data profile on you from your online and in-app behaviors, and they align that profile with other users who have similar profiles to build "user groups." These user groups are known more popularly as echo chambers.
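As a purely illustrative sketch of that grouping step, imagine each user's profile reduced to a vector of per-topic engagement counts. Users whose vectors point in similar directions get bucketed together and served similar content. The topic labels, counts, and similarity threshold below are invented for the example; real platforms cluster on far richer signals.

```python
import math

# Hypothetical engagement profiles: counts of interactions per topic.
profiles = {
    "user_a": {"local_news": 40, "gardening": 25, "conspiracy_x": 1},
    "user_b": {"local_news": 35, "gardening": 30, "conspiracy_x": 0},
    "user_c": {"conspiracy_x": 50, "outrage_politics": 45, "gardening": 2},
}

def cosine_similarity(p: dict, q: dict) -> float:
    # Similarity of two profiles, ignoring overall activity level.
    topics = set(p) | set(q)
    dot = sum(p.get(t, 0) * q.get(t, 0) for t in topics)
    norm = math.sqrt(sum(v * v for v in p.values())) * math.sqrt(sum(v * v for v in q.values()))
    return dot / norm if norm else 0.0

def group_users(profiles: dict, threshold: float = 0.9) -> list[set]:
    # Greedy grouping: each user joins the first group whose representative
    # they closely resemble. This is the seed of an echo chamber.
    groups: list[set] = []
    for user, profile in profiles.items():
        for group in groups:
            representative = next(iter(group))
            if cosine_similarity(profile, profiles[representative]) >= threshold:
                group.add(user)
                break
        else:
            groups.append({user})
    return groups

print(group_users(profiles))  # user_a and user_b land together; user_c stands alone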
The Fragmentation of Truth
When algorithms prioritize engagement over accuracy, they create a world where there are thousands of different “tastes” for truth. One person’s feed might be filled with credible news sources and fact-checked information, while another’s is dominated by conspiracy theories, sensationalist headlines, or outright misinformation. This isn’t just a matter of differing opinions; it’s a fundamental breakdown of a shared baseline of facts.
The problem is compounded by the way these algorithms work. They don’t just reflect our preferences—they reinforce them. If you engage with a certain type of content, the algorithm will show you more of it, creating a feedback loop that amplifies your existing beliefs. Over time, this leads to echo chambers, where conflicting viewpoints are filtered out and alternative perspectives are rarely, if ever, seen. The result is a society where people aren’t just disagreeing on solutions—they’re disagreeing on reality itself.
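The feedback loop described above can be shown with a deliberately simplified simulation: if the share of your feed devoted to one viewpoint tracks how much you engaged with it last round, a small initial lean compounds quickly. The update rule and the "bias" parameter here are assumptions made for the sketch; the point is the compounding, not the specific numbers.

```python
def simulate_feedback_loop(initial_share: float, rounds: int = 10,
                           bias: float = 1.5) -> list[float]:
    """Toy rich-get-richer model of a personalized feed.

    'initial_share' is the fraction of the feed devoted to one viewpoint.
    Each round, the algorithm over-serves whatever drew relatively more
    engagement last round; 'bias' > 1 controls how aggressively.
    Both the rule and the parameter are illustrative, not any platform's.
    """
    share = initial_share
    history = [round(share, 3)]
    for _ in range(rounds):
        favored = share ** bias
        other = (1.0 - share) ** bias
        share = favored / (favored + other)
        history.append(round(share, 3))
    return history

# A feed that starts just 55/45 in favor of one viewpoint drifts toward
# near-total saturation within about ten rounds.
print(simulate_feedback_loop(0.55))
```

Run it and the trajectory only moves in one direction: the majority viewpoint crowds out everything else, which is the echo chamber in miniature.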
The Amplification of Division
What happens when the information distribution ecosystem itself is designed to reinforce and amplify these fragmented truths? Polarization, distrust in institutions, and the erosion of shared norms are all symptoms of a system that prioritizes engagement over truth. When everyone is exposed to different versions of reality, it becomes nearly impossible to have meaningful conversations, let alone find common ground.
This isn’t just a theoretical concern. Studies have shown that social media algorithms can exacerbate political polarization, spread misinformation, and even influence elections. The algorithms aren’t inherently malicious—they’re just doing what they were designed to do: keep us engaged. But in the process, they’re reshaping how we see the world and how we relate to one another.
Reclaiming Our Shared Reality
So, what can we do? The first step is awareness. Recognizing that our feeds are curated by algorithms—not by some neutral force—can help us approach social media with a more critical eye. Here are a few ways to push back against the fragmentation of truth:
- Diversify Your Sources: Don’t rely on social media for news. Seek out reputable, fact-based sources and make an effort to expose yourself to multiple perspectives.
- Question Your Feed: If a piece of content triggers a strong emotional reaction, pause and ask yourself: Why am I seeing this? Is this designed to provoke me?
- Break the Echo Chamber: Follow people and organizations that challenge your worldview. Engage in respectful conversations with those who hold different opinions.
- Demand Accountability: Hold social media platforms accountable for their role in spreading misinformation. Support regulations that promote transparency and ethical design.
The algorithms may be the gatekeepers of information, but we don’t have to accept their version of reality. By taking control of how we consume and engage with information, we can begin to rebuild a shared understanding of the world—one that’s based on shared facts, not fragments.
In the end, the question is: do we want to be a UNIFIED society based on our actual shared reality, or a divided one based on corporate, profit-driven engagement?