When Social Networks Killed Optimism - and How We Get It Back

Governments and social networks do not decide what the future is. People do.

There was a time when the future felt exciting. Technology meant progress. We imagined a connected world where ideas flowed freely and where shared knowledge would make life better for everyone.

Then social networks arrived. The promise of connection turned into an endless feed of outrage, self-comparison, and distraction. Platforms discovered that anger travels faster than hope, and they built algorithms around it.


Why Now? How We Got Here

The earliest social networks showed posts in the order they were made. If you wanted to see more from a friend, you followed them. That was it.

Then came the algorithmic feed: posts no longer appear in the order they were written, but in whatever sequence is most likely to keep you scrolling. Smartphones and cheap mobile data put that feed in our hands all day. The longer we stay on the app, the more ads we see, the more data is collected, and the more profit is made.

This shift to an attention economy means the goal is not to inform or connect, but to hold our gaze. The easiest way to do that is to provoke strong emotions.
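
To make that shift concrete, here is a toy sketch in Python - not any platform's real ranking code, and the posts and scores are invented - showing the difference between the two models: a chronological feed sorts by time, while an engagement-ranked feed sorts by whatever the system predicts will hold your gaze.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    timestamp: int               # seconds since some epoch
    predicted_engagement: float  # the platform's guess at how long you'll linger

def chronological_feed(posts):
    # The early model: newest first, nothing more.
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def engagement_feed(posts):
    # The attention-economy model: whatever is predicted to hold
    # your gaze rises to the top, regardless of when it was posted.
    return sorted(posts, key=lambda p: p.predicted_engagement, reverse=True)

posts = [
    Post("friend", timestamp=100, predicted_engagement=0.2),   # a calm life update
    Post("stranger", timestamp=50, predicted_engagement=0.9),  # outrage bait
]

print([p.author for p in chronological_feed(posts)])  # ['friend', 'stranger']
print([p.author for p in engagement_feed(posts)])     # ['stranger', 'friend']
```

Same two posts, opposite order. Nothing about the content changed; the only thing that changed is what the sort key rewards.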

The Outrage Economy

Twitter proved that division keeps people scrolling. Facebook followed. TikTok perfected the formula: short bursts of content designed to hit emotions before the viewer knows what is happening. Rage, envy, mockery, fear - all served in a rapid-fire loop.

Why does this work? Negativity holds attention, and attention is money.

TikTok’s Split Personality

Here is the twist: TikTok is not the same everywhere.

In China, where the app is called Douyin, the experience is tightly curated by the state. Children see science experiments, educational videos, and uplifting stories. The feed reinforces pride, ambition, and discipline.

Outside China, the algorithm leans more toward the chaotic. Content that divides, distracts, and derails tends to gain the highest reach. Whether this is deliberate strategy or simply the by-product of engagement optimisation, the outcome is similar: two different cultures growing up with two very different visions of the future.

The Psychological Toll

  • Shortened attention span: the brain adapts to novelty on demand, making real life feel dull and slow by comparison.
  • Constant comparison: everyone else’s highlight reel becomes your measure of failure, fuelling low self-worth.
  • Anxiety loops: bad news, real or exaggerated, arrives faster than we can process it, leaving a constant sense of unease.
  • Reward cycle disruption: real-world progress feels slow and unrewarding compared to the instant dopamine hit of likes, shares, and comments.
  • Conspiracy creep: repeated exposure to sensational or misleading claims makes them feel more plausible over time.
  • Fake news fatigue: the sheer volume of misinformation erodes trust, making it harder to tell what is true and tempting some people to disengage completely.

Manipulation at Scale

These same algorithms can be exploited by anyone with the resources. State actors can push divisive narratives into foreign feeds to make other countries more fractured and distracted. Coordinated misinformation campaigns can influence elections, undermine trust, or sway public opinion without the targets even realising it is happening.

For instance, on the eve of a recent U.S. presidential election, digital ads backed by a Russian-linked group targeted African American voters with messages urging them not to vote because "neither of the candidates would serve Black voters." This kind of algorithm-fed misinformation doesn’t just disrupt our attention; it actively erodes trust in democracy and suppresses civic participation. (source)

Another high-profile example was the Cambridge Analytica scandal. The company harvested the personal data of millions of Facebook users without their consent, building psychological profiles to target people with highly tailored political ads. These ads were designed to push emotional buttons, reinforce existing biases, and in some cases discourage certain groups from voting at all. This showed how personal data and algorithmic targeting could be combined to influence democratic outcomes on a massive scale.

How Optimism Dies Online

  • Long-term thinking collapses: the future shrinks to the next swipe.
  • Good news gets buried: it does not go viral as easily as outrage or drama.
  • Perception warps reality: if your feed is full of disaster, you start believing disaster is all there is.

The Real-World Cost

  • People lose faith in progress and stop attempting hard, long-term work.
  • Communities fragment into hostile subgroups.
  • Young people internalise hopelessness, which becomes self-fulfilling.
  • People end up voting against their own interests.

Reasons to Be Hopeful

  • Human ingenuity: science, medicine, and clean energy are moving faster than ever.
  • Offline revival: tired of online toxicity, people are rebuilding local spaces and finding their own third place.
  • User control: decentralised platforms and better tools put more power in individual hands.
  • Media literacy: younger audiences are learning to spot and resist algorithmic manipulation.

What We Can Do About It

  • Curate your feed - unfollow accounts that drain optimism, follow ones that inspire.
  • Spend more time in offline spaces that build community.
  • Support platforms and media that reward depth over outrage.
  • Teach younger generations media literacy as a core skill.

Who Decides the Future?

Governments influence the landscape, but they do not decide the future. Tech companies design the platforms, but they do not own our agency. People decide the future.

If enough of us choose curiosity over cynicism, collaboration over division, and real conversation over manufactured outrage, optimism returns. The algorithms will not change themselves, but we can change what we feed them.