More on hedgehogs and foxes
To figure out why another blog links to my post about hedgehog economists and fox economists, I found this very interesting post at Economic Dreams--Economic Nightmares. There Dave Iverson quotes liberally from Louis Menand's New Yorker review of Philip Tetlock's 2005 book Expert Political Judgment: How Good Is It? How Can We Know? According to the review, Tetlock is a Berkeley psychologist who over 18 years tracked 82,361 predictions by 284 paid prognosticators and then graded them for accuracy. He also collected data from these pundits about their thinking processes. Some surprising findings (quotations are from the Menand review):
As a group, the pundits did worse than a random number generator.
Tetlock also found that specialists are not significantly more reliable than non-specialists in guessing what is going to happen in the region they study. Knowing a little might make someone a more reliable forecaster, but Tetlock found that knowing a lot can actually make a person less reliable.
. . . .
The experts' trouble in Tetlock's study is exactly the trouble that all human beings have: we fall in love with our hunches, and we really, really hate to be wrong. . . .
Tetlock's experts were also no different from the rest of us when it came to learning from their mistakes. Most people tend to dismiss new information that doesn't fit with what they already believe. Tetlock found that his experts used a double standard: they were much tougher in assessing the validity of information that undercut their theory than they were in crediting information that supported it.
[Tetlock believes] that he discovered something about why some people make better forecasters than other people. It has to do not with what the experts believe but with the way they think. Tetlock uses Isaiah Berlin's metaphor from Archilochus, from his essay on Tolstoy, "The Hedgehog and the Fox," to illustrate the difference. He says:
Low scorers look like hedgehogs: thinkers who "know one big thing," aggressively extend the explanatory reach of that one big thing into new domains, display bristly impatience with those who "do not get it," and express considerable confidence that they are already pretty proficient forecasters, at least in the long term. High scorers look like foxes: thinkers who know many small things (tricks of their trade), are skeptical of grand schemes, see explanation and prediction not as deductive exercises but rather as exercises in flexible "ad hocery" that require stitching together diverse sources of information, and are rather diffident about their own forecasting prowess.
A hedgehog is a person who sees international affairs to be ultimately determined by a single bottom-line force: balance-of-power considerations, or the clash of civilizations, or globalization and the spread of free markets. A hedgehog is the kind of person who holds a great-man theory of history, according to which the Cold War does not end if there is no Ronald Reagan. Or he or she might adhere to the "actor-dispensability thesis," according to which Soviet Communism was doomed no matter what. Whatever it is, the big idea, and that idea alone, dictates the probable outcome of events. For the hedgehog, therefore, predictions that fail are only "off on timing," or are "almost right," derailed by an unforeseeable accident. There are always little swerves in the short run, but the long run irons them out.
Foxes, on the other hand, don't see a single determining explanation in history. They tend, Tetlock says, "to see the world as a shifting mixture of self-fulfilling and self-negating prophecies: self-fulfilling ones in which success breeds success, and failure, failure but only up to a point, and then self-negating prophecies kick in as people recognize that things have gone too far."
Tetlock did not find, in his sample, any significant correlation between how experts think and what their politics are. His hedgehogs were liberal as well as conservative, and the same with his foxes. (Hedgehogs were, of course, more likely to be extreme politically, whether rightist or leftist.) He also did not find that his foxes scored higher because they were more cautious--that their appreciation of complexity made them less likely to offer firm predictions. Unlike hedgehogs, who actually performed worse in areas in which they specialized, foxes enjoyed a modest benefit from expertise. Hedgehogs routinely over-predicted: twenty per cent of the outcomes that hedgehogs claimed were impossible or nearly impossible came to pass, versus ten per cent for the foxes. More than thirty per cent of the outcomes that hedgehogs thought were sure or near-sure did not occur, against twenty per cent for foxes.
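To make "worse than a random number generator" concrete: Menand notes that Tetlock put most of his questions into a "three possible futures" form and had experts attach probabilities to each outcome. Below is a minimal sketch of how such forecasts can be scored, using the standard Brier score and entirely made-up numbers; it illustrates the scoring idea, not Tetlock's actual methodology or data.

```python
# Minimal sketch (illustrative only, not Tetlock's methodology): score
# probabilistic forecasts over three possible futures with Brier scores,
# then compare against a "know-nothing" baseline that assigns 1/3 to each.

def brier_score(probs, outcome):
    """Squared error between forecast probabilities and what happened.

    probs: probability assigned to each possible outcome (sums to 1).
    outcome: index of the outcome that actually occurred.
    Lower is better; 0.0 is a perfect forecast.
    """
    return sum((p - (1.0 if i == outcome else 0.0)) ** 2
               for i, p in enumerate(probs))

# Hypothetical forecasts: (assigned probabilities, index of actual outcome).
forecasts = [
    ([0.8, 0.1, 0.1], 1),    # confident and wrong
    ([0.7, 0.2, 0.1], 0),    # confident and right
    ([0.9, 0.05, 0.05], 2),  # very confident and wrong
]

expert = sum(brier_score(p, o) for p, o in forecasts) / len(forecasts)
uniform = sum(brier_score([1/3, 1/3, 1/3], o) for _, o in forecasts) / len(forecasts)

print(f"expert Brier score:   {expert:.3f}")   # 1.105 on these made-up numbers
print(f"uniform baseline:     {uniform:.3f}")  # always 0.667 for three outcomes
```

A forecaster who is confidently wrong often enough scores worse than a baseline that never claims to know anything at all, which is roughly where Tetlock's experts ended up as a group.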
Why did I like Iverson's post? Confirmation bias, probably.
According to Daniel Kahneman in The Surety of Fools, we all overestimate the accuracy of our judgments--even when presented with hard evidence that our past judgments about such matters have been no better than random guesses, or even much worse.
As one example, Kahneman analyzed the rankings of 25 individual advisors in a wealth management firm over 8 years. These rankings were used by the firm to set annual bonuses. Analysis showed there were no consistent stars or consistent losers: correlating the rankings for every pair of years (28 pairs in all) gave an average correlation of 0.01--the pattern over 8 years was almost totally random. In other words, the bonuses were based on luck. Did this information, when reported to the advisors and executives, affect any behavior at all? No, and Kahneman was not surprised by that.
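Kahneman's test is simple enough to replicate in spirit. His data are not public, so the sketch below simulates a skill-free world--25 advisors re-ranked independently each year for 8 years--and then averages the correlation between the rankings for every pair of years, as in the analysis he describes. The numbers are simulated assumptions, not the firm's data.

```python
# Sketch of the analysis Kahneman describes, on simulated (skill-free) data:
# if year-to-year performance is pure luck, advisor rankings should be
# uncorrelated from one year to the next.
import itertools
import numpy as np

rng = np.random.default_rng(0)
n_advisors, n_years = 25, 8

# A skill-free world: each year's ranking is an independent shuffle.
rankings = np.array([rng.permutation(n_advisors) for _ in range(n_years)])

# Correlate the rankings for every pair of years (8 choose 2 = 28 pairs),
# then average -- mirroring Kahneman's 28 pairwise correlations.
pair_corrs = [np.corrcoef(rankings[i], rankings[j])[0, 1]
              for i, j in itertools.combinations(range(n_years), 2)]
print(f"average correlation across {len(pair_corrs)} year pairs: "
      f"{np.mean(pair_corrs):+.3f}")
```

With no persistent skill in the simulation, the average pairwise correlation hovers near zero--essentially the 0.01 Kahneman found in the real rankings.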
Kahneman suggests that humans have a "bias toward coherence," which makes us overconfident when an idea is internally consistent, comes easily to mind, and fits with what we think we know:
The confidence we experience as we make a judgment is not a reasoned evaluation of the probability that it is right. Confidence is a feeling, one determined mostly by the coherence of the story and by the ease with which it comes to mind, even when the evidence for the story is sparse and unreliable. The bias toward coherence favors overconfidence. An individual who expresses high confidence probably has a good story, which may or may not be true.