Knowing What’s Coming
The outcome of our addiction to prediction need not be random
 
Making predictions is second nature to us. Open any newspaper or switch on any television channel and you will find an endless stream of forecasts: whether the Reserve Bank of India will cut interest rates, whether the Sensex will scale 26,000, whether oil prices will go up, who will win the elections in Uttar Pradesh or become the American president, who will win the India-Pakistan match... These are forecasts that concern a lot of people. But we are all forecasters in our own lives too. We make guesses, which is forecasting, about when we will get married, when to change jobs or buy a house, whether the retirement corpus will last, and so on. Humans are curious to know what is coming and are willing to pay for it: small change to roadside card-pulling parrots, or big money to gurus, whether they wear robes or sharp suits. 
 
But how good are humans at forecasting? You can tell conclusively only if such forecasts are actually recorded and verified. Philip Tetlock has done exactly that. He ran a project from 1984 to 2004 which analysed 82,361 predictions made by 284 experts in fields like political science, economics and journalism. He found that about 15% of events that experts said had little or no chance of happening did, in fact, happen, while about 27% of the 'sure things' did not. Tetlock concluded that the experts did little better than a 'dart-throwing chimpanzee'. The details of the project are captured in Tetlock's book Expert Political Judgment, published in 2005. Tetlock, who teaches at the Wharton School of Business, became the world's foremost expert on the so-called experts. 
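The kind of audit described above, comparing what forecasters said would happen with what actually happened, can be sketched in a few lines of Python. The data here is invented purely for illustration and is not from Tetlock's study:

```python
# Illustrative sketch: checking whether stated probabilities match
# observed outcomes -- the basic idea behind verifying forecasts.

def calibration(forecasts):
    """forecasts: list of (stated_probability, outcome) pairs,
    where outcome is 1 if the event happened, else 0.
    Returns the observed frequency for each stated probability."""
    buckets = {}
    for p, outcome in forecasts:
        buckets.setdefault(p, []).append(outcome)
    return {p: sum(v) / len(v) for p, v in buckets.items()}

# Hypothetical records: events called 'impossible' (p=0.0) and
# 'sure things' (p=1.0). With real expert data, Tetlock found the
# 'impossible' happened ~15% of the time and the 'sure things'
# failed ~27% of the time.
records = [(0.0, 1), (0.0, 0), (0.0, 0), (0.0, 0), (0.0, 0),
           (0.0, 0), (0.0, 1),
           (1.0, 1), (1.0, 0), (1.0, 1), (1.0, 1)]

print(calibration(records))
```

A perfectly calibrated forecaster's 0% calls would come true 0% of the time and the 100% calls all of the time; any gap, as in this toy sample, is exactly the sort of miscalibration the project measured.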
 
However, while the study gave the impression that the average 'expert' is "roughly as accurate as a dart-throwing chimp," the metaphor invited misinterpretation. Tetlock's work was far richer and his findings subtler than that. He showed, for instance, that near-term forecasts were far more accurate than forecasts about events three or more years out. Superforecasting (written with Dan Gardner, a Canadian journalist) is a sequel designed to set the record straight. It is, again, based on studies of actual forecasting, but this time not of the 'experts' who dominate the media.
 
The book is based on a forecasting tournament sponsored by America's intelligence agencies, which had a lot of egg on their face following the dreadful invasion of Iraq. The contest ran from 2011 to 2015 and posed 500 near-term geopolitical questions ("Will India or Brazil become a permanent member of the UN Security Council in the next two years?") to five teams, including Tetlock's Good Judgment Project (GJP). In all, the tournament gathered one million individual judgements. As the tournament progressed, a small number of GJP forecasters began to lead the pack. Using only an Internet connection, they consistently beat the other highly competent teams, led by top researchers in their fields. In year one, the GJP team beat the official control group by 60%; in year two, by 78%. It also beat its university-affiliated competitors and even intelligence analysts who had access to classified data. After two years, GJP was doing so much better that the other teams were dropped. The conclusion of the two books, put together: while forecasting is tough, many ordinary people are better at it than the so-called 'experts'; also, prophecy can be practised and improved.
 
Who were these people who could predict so well? You would be amazed to know that the crop of 'super-forecasters' included housewives, a social services worker, unemployed factory workers, professors of mathematics, a film-maker, a retired official from the US department of agriculture, and so on. Did they have anything in common? They had a 'growth mindset', as Tetlock puts it. They were focused, open-minded, careful, curious and, above all, self-critical. They were willing to learn from mistakes, to reflect on their own thinking and to stay humble. The best forecasters were more interested in knowing why they were right or wrong than in whether they were right or wrong. Tetlock found "commitment to self-improvement to be the strongest predictor of performance." Unfortunately, these people may not be telegenic, articulate or even confident. So, they are bad for TV, which will continue to bombard us with untested, simplistic and wrong predictions from 'experts'. This book is a must-read for anyone curious about how we can think and guess better.