Book review notes: Superforecasting: The Art and Science of Prediction

Book review: Superforecasting: The Art and Science of Prediction by Philip E. Tetlock and Dan Gardner
I listened to the audiobook while driving interstate
Phil Tetlock is a professor of psychology and political science at the University of Pennsylvania who spent decades studying the predictions of experts
He did a study of 284 experts and 27,000 predictions on political, social, and economic outcomes over a 21-year span
The forecasters had good credentials, relevant work experience, and advanced degrees
Tetlock’s first book is “Expert Political Judgment”
He concluded that experts did “little better than guessing”. (the bad news)
A government agency, the Intelligence Advanced Research Projects Activity (IARPA), was created and decided to run a forecasting tournament
Tetlock started the Good Judgment Project (GJP). There were 2,800 GJP volunteers in the first year of the tournament; the top 2 percent were called “superforecasters.”
Foresight is a real and measurable skill that can be learned and cultivated. (the good news)
The secret of the success of the GJP is that it was carried out with scientific rigor
About 70 percent of superforecasters remain accurate and don’t regress to the mean; from one year to the next, they got better.
Experts
Traditional experts rarely measure their accuracy or keep score. Feedback is required in any system to make it more accurate, like a closed-loop feedback control system.
Pundits avoid scoring their accuracy because it doesn’t help their careers.
The more famous they were, the less accurate they were
“tight, simple, clear stories that grab and hold audiences.” Famous people are better at selling their opinions than they are at predictions.
The experts were better at storytelling and persuasion than at forecasting.
Intuition
“Illusions of knowledge” and the fallacy of intuition
Daniel Kahneman and Thinking, Fast and Slow: intuition can lead to incorrect conclusions
Reference Blink by Malcolm Gladwell: Fast thinking (intuition) can be trained over time
An overreliance on intuition leads to poor decisions: “we move too fast from confusion and uncertainty to a clear and confident conclusion without spending any time in between.” This is Daniel Kahneman’s fast thinking, jumping to conclusions; it’s why we want to know how the movie ends.
Foxes vs. hedgehogs, based on a famous essay on thinking styles by the philosopher Isaiah Berlin: foxes know a little about a lot of things, and hedgehogs know one big thing
Make prediction a science by measuring it and studying the results, based on scientific techniques that work. Understand the system
Score your accuracy
Glenn Brier, a meteorologist, developed the Brier score. It ranges from 0 (perfectly correct) to 2 (perfectly incorrect); 0.5 is exactly what you get by always guessing 50/50. The Brier curve is non-linear: it is a squared-error measure, so the penalty grows quadratically as a forecast gets more wrong.
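A minimal sketch of how that two-category Brier score works for a single yes/no question (the forecasts below are made up, just to show the arithmetic):

```python
# Two-category Brier score for a yes/no question: sum of squared errors over
# both outcomes. 0 = perfectly correct, 2 = perfectly wrong, 0.5 = always 50/50.
def brier_score(forecast: float, outcome: int) -> float:
    return (forecast - outcome) ** 2 + ((1 - forecast) - (1 - outcome)) ** 2

print(brier_score(1.0, 1))  # 0.0 -> said 100%, it happened
print(brier_score(0.5, 1))  # 0.5 -> hedged at 50/50
print(brier_score(0.0, 1))  # 2.0 -> said 0%, it happened anyway
```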
Forecasting is possible, even though the majority of people are bad at it.
Freakonomics episodes on prediction:
Note: If harder predictions are rewarded more than easy ones (reward for risk taking), future predictions can improve via scoring. Like how the Olympics awards higher scores for more difficult moves.
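A hypothetical sketch of that idea (not from the book): weight each question’s score by a made-up difficulty factor so that nailing a hard question counts for more.

```python
# Hypothetical difficulty-weighted scoring: all forecasts, outcomes, and
# difficulty weights below are invented for illustration.
def brier(p: float, outcome: int) -> float:
    return (p - outcome) ** 2 + ((1 - p) - (1 - outcome)) ** 2

questions = [
    # (forecast, what actually happened, difficulty weight: 1.0 easy ... 3.0 hard)
    (0.9, 1, 1.0),  # easy question, got it right
    (0.7, 1, 3.0),  # hard question, got it right -> rewarded more
    (0.6, 0, 2.0),  # medium question, got it wrong
]

# Lower Brier is better, so flip it into a "skill" term (2 - brier) before weighting.
weighted_skill = sum(w * (2 - brier(p, o)) for p, o, w in questions)
print(f"difficulty-weighted skill: {weighted_skill / sum(w for _, _, w in questions):.3f}")
```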
A model superforecaster has 4 traits:
Philosophy – live with uncertainty and percentages of likelihood, have a healthy sense of humility, never believe in “fate”
Thinking style – good problem solving and logic skills, intelligence (but not super-intelligence), Rationality Quotient (RQ), open-mindedness, openness to experience, constantly improving, embrace feedback, good with numbers but doesn’t overcomplicate things
Methods – pragmatic thinking, like the fox – jack of all trades, not committed to any one idea (agnostic), higher-resolution thinking results in greater accuracy (saying 55% instead of rounding to 50% or 60%), update your estimate whenever new information is encountered (see the small updating sketch after this list), get feedback from others, be aware of and avoid confirmation bias, avoid knee-jerk reactions, be aware of anchoring and availability biases (airplane crashes are more traumatic than car crashes, but cars crash much more often)
Work ethic – fixed vs. growth mindset (Carol Dweck), grit (Angela Duckworth). Grit is perseverance in the service of long-term goals.
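One way to picture the “update whenever new information arrives” habit is a Bayes-style revision of a probability estimate. The book frames this informally; the starting estimate and the likelihoods below are invented just to show the mechanics.

```python
# Minimal sketch of incremental updating with Bayes' rule. The initial estimate
# and the likelihoods attached to each piece of news are made up for illustration.
def bayes_update(prior: float, p_news_if_yes: float, p_news_if_no: float) -> float:
    """Return the revised probability after seeing one piece of news."""
    numerator = p_news_if_yes * prior
    return numerator / (numerator + p_news_if_no * (1 - prior))

estimate = 0.30  # starting forecast: 30% chance the event happens
# Each tuple: (P(this news if the event will happen), P(this news if it won't))
news = [(0.8, 0.4), (0.7, 0.5), (0.3, 0.6)]

for p_yes, p_no in news:
    estimate = bayes_update(estimate, p_yes, p_no)
    print(f"updated estimate: {estimate:.2f}")  # rises on supporting news, falls on the last item
```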
Note: Keith Stanovich, professor emeritus of applied psychology and human development at the University of Toronto, compares IQ and RQ, the rationality quotient. Those with high RQs exhibit adaptive behavioral acts, efficient behavioral regulation, sensible goal prioritization, reflectivity, and the proper treatment of evidence. These qualities are very consistent with those of the superforecasters.
Note: The Wisdom of Crowds by James Surowiecki. You can gain from diversity by capturing and aggregating the views of different individuals. The crowd at a county fair accurately guessed the weight of an ox when their individual guesses were averaged.
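A minimal sketch of that kind of aggregation, with invented numbers standing in for the ox-weighing crowd:

```python
# Wisdom-of-crowds aggregation: individual guesses are noisy, but their average
# can land close to the truth. All numbers here are made up for illustration.
guesses = [950, 1200, 1010, 1150, 880, 1090, 1230, 990, 1120, 1060]  # pounds
true_weight = 1085  # pounds, hypothetical

crowd_estimate = sum(guesses) / len(guesses)
avg_individual_error = sum(abs(g - true_weight) for g in guesses) / len(guesses)

print(f"crowd average: {crowd_estimate:.0f} lbs (truth: {true_weight} lbs)")
print(f"average individual error: {avg_individual_error:.0f} lbs, "
      f"crowd error: {abs(crowd_estimate - true_weight):.0f} lbs")
```

With these made-up numbers the average of the guesses lands far closer to the truth than the typical individual guess, which is the point of aggregating diverse views.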
Methodology
How we think is more important than what we know. Smart is required, genius is not. Required skills: open-mindedness, intellectual humility, nondeterministic thinking, numeracy, frequent updating, and hard work.
Bird example: how much money would you be willing to pay to save 2,000 migratory birds from dying? $80. 20,000 birds? $78. 200,000 birds? $88. It seems we are better at qualitative analysis than we are at quantitative analysis. Daniel Kahneman calls this “scope insensitivity.”
Working in teams
Teams of superforecasters can work better than lone wolves. The plus is sharing information and aggregating views. The drawbacks are groupthink and social loafing.
Training
Training improves forecasting.
You can filter for the right people and you can train anyone to think like the right people.
Leadership
Leadership (always definitive) is at odds with superforecasting (always weighing the options).
The flip side
Nassim Taleb and the black swan event. Black swans are unpredictable events that have a significant impact.
If the event were predictable, then it wouldn’t be a black swan.
Housing bubble, tech bubble, pretty much any bubble. The 9/11 attacks.