Welcome to Mental Garden. The following letter is part of our “Distilling Books” collection, in which we extract the most revealing ideas from literature.
🏷️ Categories: History, Decision making and biases.
Have you noticed?
Experts fail constantly: in economics, geopolitics, technology, public health. And still, we keep listening. Trusting. Betting on them. We did it during the housing bubble and during COVID; we're doing it now with artificial intelligence, and we'll keep doing it. Because we crave certainty, and they offer it.
But what if the problem isn’t that they’re incompetent and wrong?
What if the problem is us?
This article explores that collective self-deception. We’ll discuss the book Future Babble by Dan Gardner to understand why predictions fail, why we keep believing them anyway, and what to do to think more clearly in a world full of noise.
You won’t learn to predict the future.
You’ll learn something better: how not to depend on those who claim they can.
1. The Need to Know Tomorrow
Humans need to know what’s coming to feel safe.
It’s an emotional need. When the world turns unstable—because of war, an economic crisis, or a pandemic—our desire for certainty spikes.
What will happen tomorrow?
Will I be safe?
Will my investment perform?
Will there be another recession?
That’s when the experts show up.
The modern-day equivalent of the Oracle of Delphi. People with degrees, graphs, complex models, and a confidence that feels... hypnotic. We cling to them because they know more than the average person about the thing that’s keeping us up at night.
We place all our trust in them, even though history tells a different story.
In 1911, historians like G. P. Gooch predicted the end of wars between great powers. In 1914, World War I began.
In the 1970s, economists warned we would run out of oil. By the mid-1980s, prices had crashed amid a global glut.
In the 1990s, they said Japan would dominate the global economy. Instead, the country slid into a decade of stagnation.
In 2020, Fernando Simón, the epidemiologist who directed Spain's Coordination Centre for Health Alerts and Emergencies, repeatedly downplayed the pandemic, insisting Spain would see only a handful of COVID-19 cases.
Failure after failure, prediction after prediction, we keep listening.
Our craving for certainty leads us to believe them—even when they’re wrong.
2. A Complex World
The word that explains it all is “complexity.”
Experts fail because the world is a complex system, perhaps too complex to ever predict accurately. There are countless variables and countless ways they can interact, and a tiny change can trigger drastic outcomes that even the most brilliant mind could never foresee.
Here’s the most astonishing case I’ve ever seen: the monkey bite.
In 1920, King Alexander of Greece was bitten by a monkey while trying to break up a fight between it and his dog. The wound became infected, and the young king died unexpectedly. His death created a power vacuum and political instability that dragged Greece deeper into its war with Turkey, a conflict that cost more than 250,000 lives.
It all started with a monkey bite.
Could anyone have predicted that? Absolutely not.
Edward Lorenz called it the butterfly effect: a butterfly flaps its wings in Brazil and sets off a tornado in Texas. As poetic as it sounds, it's real. In a world this complex, detailed long-range predictions aren't just hard; they're impossible.
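You can see this sensitivity for yourself. Below is a minimal sketch in Python of Lorenz's famous three-variable convection model; the integration scheme, step size, and run length are my own arbitrary choices for the demo, not anything from the book. Two simulations start with initial conditions that differ by one part in a billion, and by the end they have nothing in common.

```python
# Lorenz's three-variable convection model, integrated with a basic
# Euler scheme. The constants (sigma, rho, beta) are his classic values;
# the step size and run length are arbitrary choices for this demo.

def lorenz_step(x, y, z, dt=0.005, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Advance the Lorenz system by one Euler step."""
    dx = sigma * (y - x)
    dy = x * (rho - z) - y
    dz = x * y - beta * z
    return x + dt * dx, y + dt * dy, z + dt * dz

def simulate(x0, steps=6000):
    """Run the system from (x0, 1, 1) and return the final state."""
    x, y, z = x0, 1.0, 1.0
    for _ in range(steps):
        x, y, z = lorenz_step(x, y, z)
    return x, y, z

baseline = simulate(1.0)           # the "true" starting point
perturbed = simulate(1.0 + 1e-9)   # the butterfly: off by a billionth

print("baseline :", [round(v, 2) for v in baseline])
print("perturbed:", [round(v, 2) for v in perturbed])
# The two final states bear no resemblance to each other, even though
# the starting points were identical to nine decimal places.
```

No amount of extra brainpower fixes this: unless you measure the starting point with infinite precision, the forecast degrades into noise.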
3. We Are Biased
Our brains evolved to be fast and efficient decision-makers—but they’re not perfect. We have flaws.
That’s where biases come from.
The illusion of control makes us think we have more influence than we actually do. Confirmation bias makes us ignore evidence that contradicts our beliefs. The availability heuristic makes us judge probability based on how easily something comes to mind.
These and many more biases contaminate our thinking—even that of experts.
And when you combine these biases with a complex world, predictions fall apart.
4. Certainty Sells (Big Time)
If people need security, someone will always sell it…
Here’s the most striking study on record: over two decades, Philip Tetlock tracked 284 experts and collected 27,450 predictions from them.
His conclusion?
The most famous experts—the ones who spoke with the most confidence—were the worst predictors. Only one quality set the best apart from the rest: a broad, interdisciplinary view of reality. The Greek poet Archilochus once wrote, “The fox knows many things, but the hedgehog knows one big thing.”
In other words, the best forecasters are foxes, not hedgehogs.
“Hedgehogs” see the world through one big idea—they’re blind to the complexity around them. But because they sound confident, the media loves them. Meanwhile, “foxes,” who are open to doubt and draw from many disciplines, are more accurate but less popular. Because saying “maybe” doesn’t sell.
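How do you actually crown one forecaster "better" than another? Gardner doesn't dwell on the math, but a standard tool (the one Tetlock popularized in his later forecasting work) is the Brier score: the average squared gap between the probability you stated and what actually happened. Here's a toy sketch; both forecasters and every number in it are invented purely for illustration.

```python
# Brier score: mean squared error between stated probabilities and
# outcomes (1 = the event happened, 0 = it didn't). Lower is better;
# 0 is perfect, and always saying "50%" earns exactly 0.25.

def brier_score(forecasts: list[tuple[float, int]]) -> float:
    """Average (probability - outcome)^2 over all forecasts."""
    return sum((p - outcome) ** 2 for p, outcome in forecasts) / len(forecasts)

# Hypothetical forecasters; names and numbers invented for illustration.
hedgehog = [(0.95, 0), (0.99, 0), (0.90, 1), (0.95, 0)]  # loud near-certainties
fox = [(0.60, 1), (0.40, 0), (0.70, 1), (0.55, 0)]       # hedged "maybes"

print(f"hedgehog: {brier_score(hedgehog):.3f}")  # ~0.699, badly miscalibrated
print(f"fox:      {brier_score(fox):.3f}")       # ~0.178, close to reality
```

The scoring rule punishes exactly the thing television rewards: a confident "95% certain" that doesn't happen costs far more than a cautious "60% likely" that does.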
That’s the business of the charlatan: selling false certainty with hypnotic confidence.
You don’t have to be an expert; you just have to look like one.
That’s the power of the Law of Authority.
5. We Love Stories, Not Data
Humans are wired to learn through stories.
A prediction wrapped in a compelling story is irresistible. But cold data, shown as charts, equations, or models? Boring. Most people won’t understand it.
Here’s an incredible case: Paul Ehrlich, a respected biologist, predicted that hundreds of millions would starve to death in a global famine during the 1970s.
He was wrong.
But his narrative was clear, persuasive, and easy to understand. He won awards. His book The Population Bomb was a bestseller and influenced politics—presidents even consulted him.
A prediction turned into a gripping story becomes irresistible.
Few can conduct deep, data-driven analysis of something like a famine—and most will get it wrong. But anyone can grasp a story that sounds logical. Ehrlich turned his calculations into a book—and that’s why it worked.
We cling to stories that make us feel like we understand.
Even when we don’t understand anything at all.
6. What Can We Do?
It’s not about stopping listening. It’s about listening better.
We shouldn’t ignore expert advice—but we must become more skeptical about their predictions. Their advice can be helpful. Their predictions? Not so much.
Back to the fox and hedgehog poem—we need a fox mindset: accept the world’s complexity, learn from many sources, and seek diverse views of reality. Don’t be the hedgehog who knows one thing and ignores the rest.
A great example of fox thinking: U.S. President Jimmy Carter.
In 1977, Carter warned, based on expert predictions, that the world was running out of oil. He launched an energy policy focused on efficiency, alternative sources, and reduced consumption. Less than a decade later, oil prices crashed. The shortage never came.
But his strategy still made sense.
He was wrong about the prediction but right about the direction. Good decisions don't depend on predicting the future; they depend on preparing for multiple possible outcomes. Even though the forecast failed, improving energy efficiency was a sensible move in almost any scenario.
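Here's that idea made concrete. In the sketch below, every probability and payoff is invented for illustration; the point is the structure, not the numbers. A strategy that bets everything on one confident forecast is compared against one that holds up across several futures.

```python
# Toy comparison of two strategies across possible futures. All numbers
# are invented; judge a decision by how it fares across scenarios,
# not by how well it matches one confident prediction.

scenarios = {  # assumed probability of each future (sums to 1)
    "oil stays scarce": 0.3,
    "oil glut": 0.5,
    "prices stay flat": 0.2,
}

payoffs = {  # how each strategy fares in each future
    "bet it all on scarcity": {"oil stays scarce": 10, "oil glut": -8, "prices stay flat": -2},
    "invest in efficiency": {"oil stays scarce": 6, "oil glut": 2, "prices stay flat": 3},
}

for strategy, table in payoffs.items():
    expected = sum(p * table[s] for s, p in scenarios.items())
    worst = min(table.values())
    print(f"{strategy}: expected={expected:+.1f}, worst case={worst:+d}")
# The efficiency strategy has a lower ceiling but never loses badly:
# it is the move that makes sense whichever prediction comes true.
```

That is Carter's bet in miniature: the all-in strategy only wins if the forecast is right, while the robust one pays off in every future.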
The real skill isn’t predicting the future.
It’s navigating uncertainty with a cool head.
That skill can be learned. And it starts when you stop searching for the infallible expert and build your own tools for the moments when the path becomes unclear (because it will). Next time someone confidently says, “This is what’s going to happen,” take a breath, remember this article… and smile.
Now you know what’s behind the curtain.
✍️ Your turn: What failed predictions will you never forget? The COVID-19 ones were the wildest of all. The more uncertainty there is, the greater the hunger for certainty. Every day a different expert would say something new…
💭 Quote of the Day: “There is a severe overconfidence problem among many experts.” — Dan Gardner
References 📚
Gardner, D. (2011). Future Babble: Why Expert Predictions Are Wrong, and Why We Believe Them Anyway.