A Difficult Truth: The Next Disaster Won’t Be Natural — It’ll Be Digital

This isn’t a story about technology. It’s a warning about what it’s doing to us — and what’s coming next.


Author’s Note (2025)

I first wrote this in 2021, when I started noticing signs that something was deeply wrong with the way technology was shaping our lives.

Back then, it felt like a warning for the future. But today, in 2025, I’ve seen enough to know — that future is already here, especially in Thailand.

That’s why I’ve translated this into English. Not as a professional writer, but as someone who just wants to tell the truth as I saw it — before more of it becomes reality.


A Story That’s Hard to Tell

This is going to be long, complicated, and honestly, hard for most people to fully understand.

But it needs to be written. I can't just stay silent anymore.

If you're someone who wants to know what kind of world we’re heading into in the next few years — then I think this is something you should read.

You’ll probably need to unpack it piece by piece. I’ve tried my best to write it in a way that even people who aren’t into tech — people from my mom’s generation to my future grandchildren — can follow. Still, no matter how hard I try, it feels like it’s just... hard to explain.

At one point, this whole thing left me completely drained — mentally and emotionally. I was overwhelmed, hopeless, and honestly, in one of the darkest states I’ve ever experienced. It took me days just to recover enough to write this down.

So here’s where I’ll start:


What This Is About

We're about to face one of the most serious social disasters in human history — and it’s coming very soon.

This isn’t a prediction. It’s not just some theory I made up. It’s based on what leading people in the tech industry — the very ones who built the social media platforms we use every day — are now saying. These are the people who have seen what’s wrong, especially when it comes to the ethical impact of what they helped create.

Much of what I’m going to share comes from a documentary called The Social Dilemma. Toward the end, the film tries to offer some ways to prevent what’s coming. But personally? I can’t see a way out. I really believe we’re going to have to face this — and soon.


Where It All Begins

The root of it all? It starts with the social media platforms we use: Facebook, Twitter, Instagram, YouTube, Google — all of them.

You might ask: How could that cause a disaster?

I mean, we just check Facebook now and then to see what our friends are up to. We watch some fun or educational videos on YouTube. Maybe we chat with family on LINE. It’s not like we’re addicted. We’re gaining knowledge, connection, and helpful content. So what’s the problem?

Well — there’s no short answer. But the first step is to understand the business model behind all of this.


Your Attention Is Their Product

(We’ve become the product — without knowing it)

Every kind of social media shares one core engine: it’s built to grab and keep your attention for as long and as often as possible. Because your attention is what they sell to their customers — the advertisers.

That’s the real business model. That’s how they make money. And this is where the threat begins — the global race for your attention.


Behind the Curtain: How Social Media Works

There are two key forces working behind social media platforms:

  1. Technology designers (people)
  2. Artificial Intelligence (the code)

Technology Designers

These people are experts in human psychology — not just smart, but deeply trained in understanding how our subconscious works. Their job is to design every part of your experience to make you want to stay longer, come back more often, and develop habits around the app.

They decide:

  • What kind of notifications to send
  • What buttons to show
  • What words to use
  • Which finger movement feels best — why your thumb taps here, or your index finger scrolls there

None of this is random. It’s all designed to nudge your subconscious into coming back.

One clear example? The “tagging” feature in Facebook photos. Have you ever been tagged and not clicked to see what it was?


Artificial Intelligence (AI)

Then there’s AI — algorithms that live invisibly behind the screens of Facebook, YouTube, or Instagram. They’re not humanoid robots; they’re complex decision-making code.

Here’s how they work:

Say we program an AI to play chess. We give it the rules. In the beginning, it loses to everyone — even kids. But with every game, it learns. Bit by bit, it figures out what works. It repeats, refines, improves — millions of times — until it starts winning. Eventually, it beats the world champion.

Now, replace chess with what these companies really want: your attention.

The AI is told: “Make this person stay on the platform as long as possible, and come back as often as possible.”

And then it goes to work.

The scary part? Even the people who built these systems admit they no longer fully understand how the AI works. Once it’s given a goal, no one really knows how it achieves it — or what it might do along the way.

Its only goal: your attention. And what it’s good at? Changing your behavior.
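
If you like to see things concretely, here is a tiny Python sketch of the same idea. It is not anyone’s real code, just a toy “trial-and-error” learner I made up: it is only told to maximize watch time, and the content categories and numbers are invented.

```python
import random

# Toy illustration only -- not any platform's real code. The learner is told a
# single goal ("maximize watch time") and discovers, by trial and error, which
# kind of content keeps a simulated user hooked the longest.

CONTENT_TYPES = ["friends' photos", "how-to videos", "outrage posts", "conspiracy clips"]

# Invented numbers: average seconds this imaginary user watches each type.
# The learner never sees this table; it has to discover it on its own.
TRUE_WATCH_TIME = {
    "friends' photos": 12,
    "how-to videos": 20,
    "outrage posts": 35,
    "conspiracy clips": 50,
}

def simulated_watch(content: str) -> float:
    """One recommendation: return noisy watch time in seconds."""
    return max(0.0, random.gauss(TRUE_WATCH_TIME[content], 8))

totals = {c: 0.0 for c in CONTENT_TYPES}
counts = {c: 0 for c in CONTENT_TYPES}

# Try each type once so the learner has a starting estimate.
for c in CONTENT_TYPES:
    totals[c] += simulated_watch(c)
    counts[c] += 1

# Epsilon-greedy loop: mostly show what has worked best, occasionally experiment.
for _ in range(10_000):
    if random.random() < 0.1:
        choice = random.choice(CONTENT_TYPES)                             # explore
    else:
        choice = max(CONTENT_TYPES, key=lambda c: totals[c] / counts[c])  # exploit
    totals[choice] += simulated_watch(choice)
    counts[choice] += 1

# The learner ends up pushing whatever is most engaging. Nothing in its goal
# asks whether that content is true, healthy, or good for the person watching.
for c in sorted(CONTENT_TYPES, key=lambda c: counts[c], reverse=True):
    print(f"{c:17s} shown {counts[c]:5d} times, avg watch {totals[c] / counts[c]:5.1f}s")
```

Run it and you should see the “conspiracy clips” row climb to the top, purely because it scores highest on the one metric the learner was given.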


The Fallout: Addiction, Depression, and Suicide

Social media has created one of the most extreme social shifts in human history.

We’ve never, in the history of our species, had to connect with or respond to this many people. The next generation is growing up basing its sense of self-worth on what other people think of it.

Yes, we’ve always cared what people think — but not at this scale.

The behavior of young people around the world has already been reshaped.

Social media is slowly erasing our self-worth. Many people now get plastic surgery to look more like their filtered selfies. Depression is rising sharply in young people. Suicide rates have surged since the rise of these platforms. Kids are less able to cope with real-life problems.


“We curate our lives around this perceived sense of perfection, because we get rewarded in these short-term signals — hearts, likes, thumbs up. We conflate that with value and truth, and instead, it's fake, brittle popularity that's short-lived. And it leaves you more vacant and empty than before you did it. It forces you into this vicious cycle of, 'What’s the next thing I need to do, because I need that back?' Think about two billion people having that every day.”

Chamath Palihapitiya, Former VP of Growth at Facebook


Why Can’t Humans Just Adapt to It?

Because our brains can’t keep up.

Computing power has grown by many orders of magnitude within a few decades, while our brains have barely changed in tens of thousands of years.

In an instant, tech has rewired how humans connect with each other — relationships that evolved over millions of years.

Some people might think they’re immune: “I’m not addicted. I don’t even post much. I don’t care about likes.”

But if you still have an account — you’re in the game. Whether you’re young or old, the AI knows:

  • If you’re introverted
  • If you like to silently browse
  • What you pause to look at
  • What kinds of videos you’re drawn to

It knows you better than you realize — and it knows how to work on people like you.


What the Tech Creators Do with Their Own Kids

Because these effects are so severe, many of the people who built this technology do not let their own kids use it. At all. No exceptions.

Even platforms like YouTube Kids — made specifically for children — still run on the same underlying AI principles that manipulate adult attention.

Nobody wants an algorithm working on their child’s subconscious.

“(Behind your screen), there are thousands of the world’s smartest engineers, paired with AI supercomputers, trained in human psychology. They know everything about you — your behaviors, your weaknesses — while you know nothing about them. You and this system do not have the same goals. So ask yourself — who do you think is going to win?”

Tristan Harris, Former Design Ethicist at Google, Co-founder of the Center for Humane Technology


Surveillance Capitalism: When Behavior Becomes Data

Every little thing you do on your phone is being recorded — what videos you watch, what articles you click, whose photos you like, where you travel, what you buy, who you linger on when scrolling, and even what you do late at night.

That data — from billions of people — is fed into these systems to fuel the AI, non-stop.

They know more about us than any government or organization ever has in human history.

The AI then sorts and categorizes people into different psychological groups — and packages them as products for advertisers.

With enough money, you can buy access to highly specific behavioral types. It’s not just about reaching “men aged 25–34.” It’s “people who tend to feel lonely at night,” or “teenagers who are insecure about their appearance.”
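
To make that concrete, the sketch below shows, in the simplest possible terms, how behavioral logs could be bucketed into segments like the ones above. Every field name and threshold here is something I invented for illustration; it is not any real advertising platform’s system.

```python
from dataclasses import dataclass

# Toy illustration only -- invented fields and thresholds, not any real
# advertising system. It shows the general shape of the idea: raw behavior
# in, narrow "audience segments" out.

@dataclass
class UserProfile:
    user_id: str
    typical_active_hour: int     # hour of day (0-23) the person usually scrolls
    selfie_edits_per_week: int   # how often they retouch photos of themselves
    age: int

def segments_for(user: UserProfile) -> list[str]:
    """Assign a user to hypothetical segments an advertiser could pay to reach."""
    tags = []
    if user.typical_active_hour >= 23 or user.typical_active_hour <= 3:
        tags.append("late-night scrollers")
    if user.age < 20 and user.selfie_edits_per_week > 10:
        tags.append("teens anxious about their appearance")
    if not tags:
        tags.append("general audience")
    return tags

print(segments_for(UserProfile("u-001", typical_active_hour=1, selfie_edits_per_week=14, age=17)))
# -> ['late-night scrollers', 'teens anxious about their appearance']
```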


Conspiracies, Fake News, and the Death of Shared Reality

Here’s where it gets terrifying.

Because now, your attention isn’t just being harvested — it’s being shaped.

Let me sharpen the wording a bit: your attention isn’t the only product. What’s really being sold is the gradual, almost invisible change in your behavior and perception.

And when your attention shifts, even a little, it can be manipulated toward extreme beliefs. That’s how fake news leads to polarization — and possibly even civil war.

Remember, the AI has the same goal: Keep you watching. Keep you scrolling.

If the system notices you’re likely to be interested in a certain idea — even a conspiracy — it will feed you more and more of it.

It doesn’t care if it’s true or false. It only cares that you stay engaged.

And little by little, you go from “I’m just curious about this” to “This is the truth — and everyone else is blind.”

On Twitter, fake news spreads six times faster than real news. And because attention equals money, fake news is more profitable.
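
Here is one more toy sketch of that feedback loop. The numbers are completely made up; the only thing it tries to show is that when a feed optimizes for engagement alone, and extreme content happens to be “stickier” per view, the mix a person sees drifts toward the extreme even if they started out mostly interested in ordinary news.

```python
# Toy feedback-loop simulation -- all numbers invented. "Stickiness" is an
# assumed engagement rate per view; the feed only cares about expected
# engagement (interest x stickiness), never about truth.

stickiness = {"balanced news": 0.02, "one-sided takes": 0.10, "conspiracy": 0.30}

# The user starts out mostly interested in ordinary news.
interest = {"balanced news": 0.70, "one-sided takes": 0.20, "conspiracy": 0.10}

for week in range(20):
    # Promote whatever maximizes expected engagement right now.
    shown = max(interest, key=lambda c: interest[c] * stickiness[c])
    # Each exposure nudges the user's interest toward what was shown
    # and away from everything else.
    for c in interest:
        change = 0.05 if c == shown else -0.025
        interest[c] = min(1.0, max(0.0, interest[c] + change))

print({c: round(v, 2) for c, v in interest.items()})
# The "conspiracy" share climbs week after week while the others shrink,
# even though it started as the user's smallest interest.
```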


You’ll start to see others who don’t think like you as:

  • Ignorant
  • Uneducated
  • Brainwashed
  • Stupid
  • Gullible
  • Not worth listening to

And the other side? They think the exact same things about you.

Billions of people are being nudged into this kind of thinking — at the same time.


Real-World Consequences: From Myanmar to the U.S.

The results are already happening.

Social media has made it incredibly easy — and cheap — to spark extreme political conflicts, spread conspiracy theories, manipulate belief systems, and divide societies.

One of the clearest, most violent examples happened close to our region: The Rohingya genocide in Myanmar.

It was fueled, in large part, by misinformation spread through Facebook.

But it’s not just there.

It can happen anywhere — regardless of how smart or “developed” a country is.

Even in the U.S., one of the most powerful nations in the world, these exact same tactics were used in the Russian interference around the 2016 election.

And that’s just one case.


Is There Any Way to Stop This?

The film The Social Dilemma offers a few ideas — but honestly, they feel pretty thin. It would take a miracle. But we have no choice. We have to try.

Tristan Harris and others featured in the documentary run the Center for Humane Technology, a group that pushes for laws, policies, and public awareness to rein in these broken business models.


And What About Us? (A Thai Perspective)

In Thailand, we use this technology with almost no limits.

After everything you’ve read so far, maybe you can now see it more clearly: We already have many of the ingredients needed for a future social disaster.

Or maybe we’ll be fine. Maybe we’re the one nation whose kids are stronger. Maybe our psychology is better. Maybe our brains are just built differently from the rest of the world.

Right?


The Final Word

I’ll leave you with one last quote from Tristan Harris:

“We’re so worried about when technology will surpass human strength and intelligence. But what we should really be concerned about is when it surpasses human weaknesses. And that moment... has already happened. Addiction. Division. Outrage. Extremism. Vanity. These are the human vulnerabilities that technology has already hacked. And this… is checkmate for humanity.”

Tristan Harris, Former Google Design Ethicist, Co-founder of the Center for Humane Technology
