
The Invisible Hand of Social Media: How Algorithms Shape What We See

Writer: Claire Roper · Sep 29 · 2 min read

In today’s digital world, social media platforms are more than just places to connect with friends or share photos. They are the front page of the internet. For many people, they are the first (and sometimes only) source of news. But behind every post you see lies an invisible hand: the algorithm.


Algorithms are designed to filter, prioritise, and recommend content based on your past activity. They decide which stories, videos, and news articles appear on your feed, and which ones never reach you. On the surface, this seems convenient because you get content tailored to your interests. But the reality is more complex and, in some cases, dangerous.


[Image: people using smartphones, surrounded by social media icons connected by glowing lines, symbolising digital connectivity.]

The Filter Bubble Effect


Social media algorithms create what is often called a filter bubble. Instead of showing a balanced mix of viewpoints, they serve up more of what you already like, agree with, or engage with. Comment on one political article and you will likely see more from the same perspective; click on a few health-related posts and the system may feed you more of the same, some of it reliable, some of it misleading.


This bubble can make people believe their view of the world is widely shared and uncontested, when in fact it is one of many perspectives. Over time, this narrows awareness, increases polarisation, and makes it harder to recognise misinformation.
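To see why the bubble narrows rather than corrects itself, here is a minimal, hypothetical sketch in Python. The weights and the boost value are invented for illustration, not taken from any real platform, but the feedback loop is the point: every click nudges the feed further toward the viewpoint that was clicked.

```python
# Hypothetical sketch of a filter-bubble feedback loop: each time the
# user engages with a topic, the recommender increases that topic's
# share of the feed, so one viewpoint gradually crowds out the other.

interests = {"viewpoint_a": 0.5, "viewpoint_b": 0.5}  # an initially balanced feed

def engage(topic, boost=0.2):
    # Engagement nudges the weight toward the topic the user clicked,
    # then renormalises so the weights stay a probability mix.
    interests[topic] += boost
    total = sum(interests.values())
    for t in interests:
        interests[t] /= total

for _ in range(5):  # five clicks on the same viewpoint
    engage("viewpoint_a")

print({t: round(w, 2) for t, w in interests.items()})
# viewpoint_a now dominates the feed (~80% of the mix)
```

Nothing here checks whether viewpoint_a is accurate or representative; the loop only measures engagement, which is exactly how a balanced starting feed drifts into a one-sided one.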



News, But Not the Same News


One of the biggest dangers is that people can live in entirely different realities online. Two individuals scrolling at the same time may see completely different versions of the same story, or miss it altogether.


For example, during major world events, some users may see posts highlighting humanitarian concerns, while others may be shown conspiracy theories or sensationalist takes. Each person walks away thinking they are informed, but the information they have absorbed could be radically different.



Engagement Over Accuracy


Why does this happen? Algorithms are designed to maximise engagement, not accuracy. The goal is to keep you scrolling, clicking, and reacting. Unfortunately, sensational, emotional, or controversial content often gets promoted over calm, factual reporting. This means misinformation can spread quickly, while nuanced stories struggle to gain traction.
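To make that concrete, here is a hypothetical Python sketch of an engagement-first ranker. The posts, predicted rates, and scoring weights are all invented for illustration; what matters is that accuracy appears nowhere in the scoring formula.

```python
# Hypothetical illustration: a feed ranker that orders posts purely by
# predicted engagement (clicks and shares) and never considers accuracy.

posts = [
    {"title": "Calm, factual report",  "predicted_clicks": 0.02, "predicted_shares": 0.01},
    {"title": "Outrageous hot take!!", "predicted_clicks": 0.15, "predicted_shares": 0.08},
    {"title": "Nuanced explainer",     "predicted_clicks": 0.03, "predicted_shares": 0.01},
]

def engagement_score(post):
    # Accuracy is not a term in this formula -- only attention signals.
    return 0.6 * post["predicted_clicks"] + 0.4 * post["predicted_shares"]

feed = sorted(posts, key=engagement_score, reverse=True)
print([p["title"] for p in feed])
# The sensational post ranks first; the calm, factual report ranks last.
```

A ranker like this is not malicious; it is simply optimising the only thing it measures. That is why sensational content can outcompete careful reporting without anyone deciding that it should.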



Why It Matters


When communities can no longer agree on a shared set of facts, constructive debate becomes nearly impossible. If one group sees a flood of articles about climate change denial while another only sees urgent warnings about environmental crises, the two sides will struggle to even start a conversation.


This fragmentation undermines trust in media, weakens democracy, and makes it easier for false or misleading narratives to take hold.



What Can We Do?


  • Be aware of the filter bubble. Recognise that your feed is curated, not neutral.

  • Seek out diverse sources. Do not rely on one platform or one news outlet for information.

  • Check before you share. Engagement is the currency of social media, so avoid spreading misinformation by accident.

  • Support quality journalism. Trusted reporting plays a critical role in keeping people informed with facts, not just opinions.


Social media is not going away, and neither are algorithms. By understanding how they work and their potential dangers, we can make more conscious choices about how we consume and share information.
