With the gyrations of the stock market and unsettling political and financial climates causing jitters around the world, writer Dan Gardner offers timely insight in his new book Future Babble: Why Expert Predictions Are Next to Worthless and You Can Do Better.
The Canadian journalist and author of the international bestseller The Science of Fear examines why the surest-sounding experts, the ones we most want to believe, are often the least credible. That has implications for the information on which many of us rely to make decisions about health, medicine and, yes, the economy.
So should we believe ‘expert’ predictions about anything — or would we be better off having our cats randomly bat at our keyboards to get an answer?
Some do modestly better than a dart-throwing chimp. I emphasize modestly and emphasize some. In the seminal research on expert predictions, [psychologist] Philip Tetlock found that the average expert was about as accurate as a dart-throwing chimp. [The research involved 27,450 predictions by political scientists, economists and journalists, which were followed up to determine their accuracy.]
There was a distribution with two groups. One group [of experts] was worse than the chimp. The other was modestly better, but still miles from perfect. However, they did have some predictive insight.
Tetlock looked for all the things you think would have distinguished the delusional from the others. Nothing you think might make a difference did make a difference. The one thing that did was a style of thinking.
The group that [did worse than the chimps tended to have] one big idea, one big analytical tool that they used over and over to stamp out their predictions. They insisted on simplicity, clarity and final answers. They were sure, and were far more likely to say that [something was] ‘certain’ or ‘impossible.’
The other group was far more likely to use many analytical tools, more likely to get information from diverse sources, was more comfortable with uncertainty, and they would say ‘possibly,’ ‘maybe.’ They were much less likely [to use words like] ‘impossible’ or ‘certain.’
Tetlock borrowed some terms from an Isaiah Berlin essay; Berlin in turn got them from an ancient Greek poet: ‘the fox knows many things, but the hedgehog knows one big thing.’ In terms of predicting the future more accurately, the fox style of thinking is far superior.
Does that track with political orientation at all?
Liberals often self-define as being comfortable with complexity — as opposed to hedgehog-like conservatives. [But] there was no correlation between accuracy and political position. There are liberal foxes and hedgehogs, and conservative foxes and hedgehogs. What mattered was the style of thinking, not its political content.
Now, there’s a confusingly named professor in a study of experts that you write about.
‘Myron Fox’ is the man everyone should model themselves on if they want to be a media superstar. The name was invented by the researchers who created the character: ‘Dr. Fox’ is an erudite, confident academic who gave a lecture specifically designed for the experiment. The researchers hired an actor to play the role and wrote a lecture for Dr. Fox to deliver, which was complete gibberish but was brilliantly delivered.
[The original lecture was given to an audience of psychiatrists, psychologists and social workers, and was about “mathematical game theory as applied to physician education.” It was full of contradictory statements and non-sequiturs.]
Did the audience recognize that he was talking gibberish? The answer is no. Even educated audiences who witnessed the performance were very impressed. It is one of the most depressing pieces of social science research ever conducted.
He should have been called Dr. Hedgehog.
Yes! But the story of Dr. Fox is connected to Tetlock’s research [in another way, too]. Tetlock found an inverse correlation between accuracy and fame [in expert predictions]. The more famous the expert was, the less accurate.
And Dr. Fox helps explain why. It’s people who tell simple, clear, compelling stories, and who are perfectly confident that they are right, who become media superstars. Reporters turn to them, audiences turn to them, corporations pay huge money for them to give lectures. But as it turns out, the characteristics of hedgehogs are what make Dr. Fox tick.
What keeps reporters hooked on these experts?
One reason is that the media don’t check accuracy rates of experts, so the consequences for making bad predictions are: Heads, I win; tails, I forget that we had a bet.
There’s also a psychological aversion to uncertainty that drives demand for expert forecasts. When a reporter wants to answer a question for the reader like ‘What will happen with the economy?’, a fox-style economist would say, ‘Well, I think there are nine key factors, maybe 10. Some point in one direction. Others point in a different direction. It may be possible that…’ By that point, the reporter is pulling his hair out; it isn’t a satisfying response.
A hedgehog would have one big idea, a simple story, a final answer, and that satisfies the psychological craving for certainty. Harry Truman once said that he wanted to hear from a one-armed economist [so that the guy wouldn’t say], ‘On the other hand…’
So why do we hate uncertainty so much?
There’s an even more basic psychological need for control. There’s a lot of research on this. When you put people in environments in which they have absolutely no control, bad things happen. Knowledge of what’s going to happen is part of control. Even if you can’t control the outcome, knowing what’s going to happen satisfies that need for control.
That’s part of how torture works, right?
I did an investigative series on torture many years ago. One of the most disturbing findings is the psychological processes [involved]. Physical torture is only one tool. Critical to every torture process is the use of uncertainty and lack of control.
The first thing the torturer tells the victim is, ‘You’re not in control. I control your fate.’ The second thing they do is ensure that the environment in which they exist is as uncertain as possible. You’re not tortured at the same time every day. They do not use the same torture every session. They understand, as a CIA interrogation manual once put it, that the anticipation of a blow can actually cause more suffering than the blow itself.
People don’t get that control makes all the difference in the world.
What they don’t understand is that underlying the actual act is uncertainty. When [the enemy] shoves your head underwater, you don’t know if he will pull your head up in time. The soldier [during exercises] knows [that he will].
But if the experts are wrong so often, doesn’t that leave us simply choosing to believe what we prefer? Why bother with science and medical research, then?
It’s natural and understandable that we have respect for experts, and it’s legitimate and reasonable. But we also have to apply a certain level of skepticism. Ph.D.s and awards and honors do not guarantee that a person is correct. That seems like such a simple point in the abstract, but it is quite amazing how people can [be convinced of ridiculous ideas] when faced with a Ph.D. or someone with amazing credentials.
We’re also deeply susceptible to confidence. We find it compelling, and we assume the confident expert must know the answer. We have to learn to distinguish between the type of expert who is worthy of serious consideration and the blowhard who is trying to bowl us over.
Frankly, when I hear somebody making grand pronouncements with perfect certainty, I write them off.
What about cases where the science is extremely well-established?
In trying to communicate with other people, there is constant pressure to speak with greater confidence because that is how effective communication is done. On the other side, reality is complicated and uncertain. Left with this struggle — this tension between being precise and being an effective communicator — there’s very often no perfect sweet spot.
The Royal Academy recently issued a report on the current state of climate science. It was a well-written, clear document that very carefully laid out the uncertainties on key points and also emphasized that there are always uncertainties.
They were savaged the world over: ‘Royal Academy Forced to Admit Uncertainty.’ They were forced to admit nothing. People who talk that way don’t know what science is.
The nature of reality is fundamentally uncertain, and the scientific process accepts that. When you hear an expert say with absolute confidence that ‘there’s no possibility that I’m wrong,’ you know you are not dealing with someone [who is credible].
What can we do about this?
One of the absolutely essential steps toward more rational thinking is awareness. You have to know about the biases and foibles we’re all subject to if you’re going to catch and correct your mistakes.
Why would people want to believe predictions of doom, which are very common?
The answer is that the psychological aversion to uncertainty is so compelling that believing that something bad will certainly happen is less psychologically burdensome than believing that something bad may happen.
Cancer patients, and anyone who has ever been told that they may have the disease and had to wait for results, know that waiting is hell. One of the most commonly reported responses when confronted with the fact that you do [have the disease or problem] is relief. At least I know. They actually feel better.
Human beings are wonderfully adaptable to even the most difficult circumstances, but in order to adapt, you have to know.