What is a good decision, Mr. Gigerenzer?
“It’s important not to let yourself be frightened”
The risk researcher Gerd Gigerenzer studies how people make decisions. He discovered that many people neither understand statistics nor trust their intuition – sometimes with catastrophic consequences.
Interview by Esther Kogelboom / Der Tagesspiegel
02.07.2025
Mr. Gigerenzer, the chance of rain is 30 percent. Was it reckless of me to ride my bike to the Max Planck Institute without a rain jacket?

Did you get wet?

No.

The chance of rain is a good example of how we often think we understand something without realizing that we actually don’t. I’ve led several studies on what people think “chance of rain” means. Most Berliners think it will rain for 30 percent of the time, that is, seven to eight hours. Many people in Milan, on the other hand, believe it means it will rain in 30 percent of the area, probably not where they live. In New York, people think it means it rains on 30 percent of days with similar weather conditions. And most Spaniards believe it means that three meteorologists are predicting rain and seven are not.
So what’s actually correct?

Meteorologists mean that, in comparable weather conditions in the past, it rained on 30 percent of the days at a specific measurement station. The astonishing thing is that very few people even ask about the reference class: thirty percent of what? When it comes to rain, the potential damage is low; you get wet. But in healthcare, the digital world, or financial matters, the consequences can be far more serious. The diagnosis is statistical illiteracy, a mental pandemic that hardly anyone notices.

Not everyone has the resources to become a hobby statistician.

You don’t need to study statistics to ask the simple question “Thirty percent of what?” This kind of statistical thinking can be learned by anyone in just a few days. It should be part of every school subject and taught using real-life examples. How important such thinking is can be seen in a case from England: in the mid-1990s, 13,000 more women than average had an abortion within a year because they had stopped taking the pill and consequently became unintentionally pregnant. The British drug authorities had announced that women who take the new generation of the pill have a 100 percent increased risk of thrombosis compared to those who take the old generation of the pill.
“It’s not just anti-vaxxers who spread false theories, but also many people who can’t interpret statistics.” – Gerd Gigerenzer
100 percent – that sounds frightening.
Only if you don’t know that the media often use “relative” risks to scare people. That’s why many British women, especially teenagers, reacted with alarm and stopped taking the pill. The study behind it had shown that out of every 7,000 women who took the previous generation of the pill, one woman developed thrombosis. This number rose to two among those taking the new generation of the pill. So, the absolute increase in risk is 1 in 7,000, but the relative increase is 100 percent.
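The arithmetic behind that headline fits in a few lines of Python. The function names are ours; the figures of one and two thrombosis cases per 7,000 women are those of the study cited above:

```python
# Absolute vs. relative risk increase, using the pill/thrombosis
# numbers from the study cited in the interview.

def absolute_risk_increase(old_cases, new_cases, population):
    """Extra cases per person when switching from the old to the new pill."""
    return (new_cases - old_cases) / population

def relative_risk_increase(old_cases, new_cases):
    """Percentage change relative to the old risk."""
    return (new_cases - old_cases) / old_cases * 100

ari = absolute_risk_increase(1, 2, 7000)  # 1 extra case in 7,000 women
rri = relative_risk_increase(1, 2)        # doubling: a 100 percent increase

print(f"Absolute increase: about 1 in {round(1 / ari)}")
print(f"Relative increase: {rri:.0f} percent")
```

The same underlying data yield “1 in 7,000” or “100 percent” depending on which quotient you report, which is exactly the lever the headlines used.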
You don’t need a degree in statistics to understand this. It simply shows how important it is to think for yourself and not let fear take over.
You also like to cite the example of the Markus Lanz talk show.
In 2021, there was an episode featuring, among others, the virologist Melanie Brinkmann and the then Prime Minister of Lower Saxony, Stephan Weil (SPD). Due to misinterpreted numbers, the impression was given that COVID-19 vaccinations were ineffective.
I give many lectures to medical societies, so I know that most doctors don’t receive good training in understanding health statistics. But what surprises me is how incredibly slowly this is changing. The virologist later apologized; as far as I know, Mr. Lanz did not, even though millions of viewers must have come away from the show with the impression that COVID-19 vaccination was useless.
Lanz could have used a later episode to explain what the numbers actually meant and how one can learn from their own mistakes. It’s not just anti-vaxxers who spread false theories, but also many people who don’t know how to read statistics. Understanding numbers today is as important as reading and writing.
What are the consequences of doctors’ “blindness to numbers” for patients?
Unnecessary medications, unnecessary CT scans, unnecessary surgeries, and unnecessary cancer screenings. We could have a healthcare system that is much more effective and less expensive. But of course, not everyone wants that. The medical device industry or the pharmaceutical industry would have to accept losses if doctors understood the evidence better and if patients had the courage to ask questions.
You mean patients let themselves be too easily intimidated?
Studies show that most patients ask hardly any questions. And before you say, “But I can’t go to medical school”: everyone can ask about the benefits and harms of a treatment. Ovarian cancer screening, for example, is one of the most popular out-of-pocket (IGeL) services.
If a doctor offers it to you and says the public health insurance doesn’t cover it, but “surely your health is worth 40 to 50 euros,” then you should stop and think. Check what medical societies say: none of them recommend this screening. Studies show that no lives are saved, but many healthy women end up having their ovaries removed unnecessarily.
There’s a reason why public insurers don’t pay for this service. On top of that, most of the suspicious findings detected by ultrasound are false positives. To be “safe,” doctors often remove the ovaries. According to our estimates, this happens to more than 10,000 healthy women every year in Germany. That could be avoided—if people had the courage to think for themselves and get informed.
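Why most suspicious findings are false positives follows from Bayes’ rule whenever the condition is rare. The prevalence, sensitivity, and false-positive figures below are purely illustrative assumptions, not numbers from the interview; they only show the mechanism:

```python
# Illustrative sketch: with a rare condition, even a decent test
# produces mostly false alarms. All three inputs are hypothetical.
prevalence = 0.0005          # assume 5 in 10,000 women affected
sensitivity = 0.90           # assumed probability the test flags a real case
false_positive_rate = 0.05   # assumed rate of flagging healthy women

true_pos = prevalence * sensitivity
false_pos = (1 - prevalence) * false_positive_rate

# Positive predictive value: chance a flagged woman is actually ill.
ppv = true_pos / (true_pos + false_pos)
print(f"Probability a suspicious finding is real: {ppv:.1%}")
```

Under these assumptions the positive predictive value lands below one percent: for every real case detected, the screening flags dozens of healthy women.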
In what areas outside of medicine do you think risk literacy is also important?
Do you remember the terrorist attack at the Christmas market in 2016 at Breitscheidplatz? Afterwards, then-Interior Minister Thomas de Maizière decided to test automatic facial recognition systems at Berlin Südkreuz train station to track down suspected terrorists more quickly. Volunteers played the role of suspects, of whom there were perhaps 400 to 600 in all of Germany at the time.
After a year, the new Interior Minister, Horst Seehofer, proudly presented the results and announced that automatic facial recognition should now be introduced at all German train stations. Two figures were presented as results: the hit rate was 80 percent. That means 80 percent of the simulated potential terrorists were correctly identified, and 20 percent were missed. The false alarm rate was only 0.1 percent. The impression was: artificial intelligence is almost foolproof.
What happened then?
Then came widespread protest—for ethical reasons. After all, who wants to live in a country where surveillance cameras are everywhere?
What was overlooked, however, was that the whole thing simply cannot work: According to Deutsche Bahn, around 12 million people pass through train stations every day. Aside from the 400 to 600 actual suspects, these are ordinary people like you and me, going to work or on vacation. And with a false alarm rate of 0.1 percent, that means 12,000 false suspects every day—people who are completely innocent, yet would have to be stopped, searched, and identified by police.
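That back-of-the-envelope calculation can be checked in a few lines of Python, using only the figures cited above (12 million daily passengers, a 0.1 percent false alarm rate, an 80 percent hit rate, and roughly 500 suspects as the midpoint of the 400 to 600 range):

```python
# Rough check of the Berlin Suedkreuz figures from the interview.
daily_passengers = 12_000_000  # Deutsche Bahn estimate cited above
false_alarm_rate = 0.001       # 0.1 percent
hit_rate = 0.80
suspects = 500                 # midpoint of the 400-600 cited

false_alarms_per_day = daily_passengers * false_alarm_rate
print(f"Innocent people flagged per day: {false_alarms_per_day:,.0f}")

# Even if every suspect walked through a station daily, true hits
# would be dwarfed by the false alarms:
true_hits_at_most = suspects * hit_rate
print(f"At most {true_hits_at_most:.0f} true hits per day")
```

A hit rate that sounds impressive in isolation is swamped by the base rate: almost everyone flagged would be innocent.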
The project was then discontinued.
„The human brain, by contrast, evolved in an uncertain world—and that’s one reason why we have intuition, judgment, trust, and emotions.“ – Gerd Gigerenzer
When does facial recognition work?
It works for authentication, such as when you unlock your smartphone, and for identification, for example, when a surveillance camera records a crime in a subway station. It does not work for mass surveillance. AI is not a hammer that can solve every problem—not all problems are nails. AI can solve certain problems, but with others, it struggles significantly.
Where does it work well, and where does it run into problems?
Generative AI like ChatGPT and industrial applications involving robotics work very well. But when it comes to predicting human behavior, complex algorithms face difficulties—for instance, in selecting candidates for a job. In general, AI performs well when we have a stable, well-defined world, like in chess. But not when we’re dealing with uncertainty. This is called the principle of a stable world.
Humans are a major source of uncertainty. And even when predicting the spread of flu or coronavirus, big data isn’t very helpful, because viruses mutate quickly. The human brain, by contrast, evolved in an uncertain world—and that’s one reason why we have intuition, judgment, trust, and emotions.
„Nothing is certain in this world except death and taxes.“ – Gerd Gigerenzer
Being able to reliably predict human behavior would be of considerable value to many companies.
Not just to companies. If you’re on trial in the U.S., it’s likely the judge will rely on an algorithm such as COMPAS. This system predicts the likelihood that you will commit another crime within the next two years if released. But neither the judge nor the defendant understands how this assessment is made.
How accurate is this black box, really?
Studies show that COMPAS’s predictions are no better than those made by ordinary people. I’ve trained U.S. federal judges—intelligent and impressive individuals—but very few have training in statistical thinking or AI. All of this makes it easy for companies to sell useless crystal balls disguised as AI.
There’s also the issue of defensive decision-making. A judge may want to release you, but the algorithm assigns you a high risk score. In such situations, some judges will play it safe and follow the algorithm’s recommendation—not because they agree, but to protect themselves in case you do commit another crime. That way, they won’t have to justify their decision later.
„Intuition is a form of unconscious intelligence.“ – Gerd Gigerenzer
Many situations in which humans have to make decisions are marked by uncertainty. There are no guarantees.
Nothing is certain in this world except death and taxes.
But there’s a big difference between calculable risks and uncertainties.
Exactly. If you go to the casino tonight and play roulette, you’re taking a calculable risk. You can compute how much you’ll lose in the long run. Everything that can happen is a number between 0 and 36.
Is a casino an appealing idea to you, Mr. Gigerenzer?
I find it boring. There are fascinating studies about people in casinos. Addicted slot machine players often say they don’t go to the casino to win—but for the sense of safety. In the “machine zone,” nothing unexpected can happen to them—unlike in interactions with other people.
But in most decisions we face in life, we’re dealing with uncertainty. New things can happen, unexpected events occur; that’s the 37, the number that isn’t on the roulette wheel. If you start a company, you can’t calculate how it will turn out. That’s the reason we have intelligence; intelligence is more than just calculation.
Under uncertainty, we need emotions and trust—in others and in ourselves.
„You must not pit intuition against conscious thinking—that’s the biggest mistake, and one that even behavioral psychologists keep making.“ – Gerd Gigerenzer
You say intuition has long been dismissed as “feminine” and urgently needs to be rehabilitated—along with gut decisions.
Intuition is a form of unconscious intelligence. It’s a feeling that draws on years of experience. You sense very quickly what you should or shouldn’t do—but you can’t explain why. Intuition should not be confused with arbitrariness or a divine revelation.
How many years do I need before I can make valid intuitive decisions? Can that be defined?
The more, the better. You must not place intuition in opposition to conscious thinking—that’s the biggest mistake, and one that even behavioral psychologists keep making when they contrast an intuitive “System 1” with a rational “System 2.”
A good doctor senses that something is off with a patient she has known for years, though she can’t explain it. That’s intuition. And it triggers conscious action—she’ll run tests now. Intuition and thinking work together, not against each other.
Can intuition be brought out and refined?
A friend of mine couldn’t decide between two women. So he made a pro-con list of all the traits that mattered to him in a relationship. Then he scored each woman on each trait, assigned weights, added everything up… and when he saw the result, he felt: That’s not right. So he chose the other woman. The calculation had provoked a protest from the intuition he had been ignoring.
A faster method is this: flip a coin—toss it high. While it’s in the air, you’ll likely feel which result you’re secretly hoping for. And then, you don’t even have to look at how it lands.
„You can measure the fear of responsibility in companies and public administrations by how much they spend on consulting firms.“ – Gerd Gigerenzer
What if I repeat the experiment and then my intuition tells me something different?
Then it could be that you don’t have a clear intuition—or that it’s being drowned out.
Drowned out? By what?
By fears. I asked executives of large companies listed on the DAX: How many of the important professional decisions you made—after seeing all the data—ended up being intuitive decisions? The answer was usually “50 percent.” But those same executives would never admit this publicly because they fear being criticized.
It’s better to justify decisions with understandable numbers, preferably Big Data, right?
With a gut decision, you have to take responsibility yourself—and today we live in a society where fewer and fewer leaders do that. So what do they do? They look for reasons afterward. That means assigning an employee to find reasons and then presenting the decision to the public as a purely data-based one. The word intuition never comes up.
The expensive version is hiring a consulting firm to justify a decision already made afterward. That’s a loss of money—and above all, time.
Do you have an example of that?
Well, I worked with one of the biggest consulting firms in the world and asked the principal over dinner: “Would you be willing to tell me how many of your client contacts consist of justifying decisions already made?” He replied, “Mr. Gigerenzer, if you don’t mention my name, I’ll tell you. It’s more than half of all cases.”
What a waste of time, intelligence, and resources! You can measure the fear of responsibility in companies and public administrations by how much they spend on consulting firms.
Germany urgently needs a culture of performance, not a culture of cover-ups.
The interview was originally conducted in German. Link: https://www.tagesspiegel.de/wissen/was-ist-eine-gute-entscheidung-herr-gigerenzer-es-ist-wichtig-sich-nicht-angst-machen-zu-lassen-13948831.html