Improving Predictions and Reducing Biases
Three related pieces that I came across recently:
Prediction is becoming a testable science, according to this article in The Economist. The underlying idea is very much about tapping the collective wisdom of the crowd, a concept I first read about in The New Yorker columnist James Surowiecki’s book The Wisdom of Crowds. Another book, Superforecasting by Philip Tetlock, elaborated the concept further. Since then the Good Judgement project has floated multiple challenges to enable better decisions by asking, well, the crowd; in fact, anyone can participate in the forecasting challenges that the Good Judgement team floats on a regular basis. The Economist’s article talks about Cosmic Bazaar, ‘a forecasting tournament created by the British government to improve its intelligence analysis’. Its forecasters are drawn mostly from government departments, unlike Good Judgement, which sources capabilities openly and lets anyone take part. A few key insights help make better forecasts: revising predictions in light of new data, reducing biases by being inclusive of differing opinions, and using conditional probabilities. For now, a lot of these crowdsourcing experiments are restricted to intelligence communities, but I wonder how long it will be before businesses start leveraging crowdsourcing for generating insights too.
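Revising a forecast with conditional probabilities is, in essence, an application of Bayes’ rule. A minimal sketch of what that updating looks like in practice (the event and all the numbers below are invented purely for illustration):

```python
def bayes_update(prior, p_evidence_given_h, p_evidence_given_not_h):
    """Return the posterior probability of a hypothesis after seeing evidence.

    prior: probability of the hypothesis before the new data arrived
    p_evidence_given_h: how likely this evidence is if the hypothesis is true
    p_evidence_given_not_h: how likely it is if the hypothesis is false
    """
    numerator = p_evidence_given_h * prior
    marginal = numerator + p_evidence_given_not_h * (1 - prior)
    return numerator / marginal

# Hypothetical forecast: a 30% prior that some event will occur.
# A new report arrives that is twice as likely if the event is real
# (0.8 vs 0.4), so the forecast should move up, but not to certainty.
posterior = bayes_update(prior=0.30, p_evidence_given_h=0.8, p_evidence_given_not_h=0.4)
print(round(posterior, 3))  # 0.462
```

The point the forecasters make is exactly this discipline: new data should move the estimate by a calibrated amount, neither ignored nor allowed to overwrite the prior entirely.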
Staying on the topic of making better judgements, Tim Harford’s interview with Daniel Kahneman, the author of Thinking, Fast and Slow, is worth a read. Kahneman’s work has focused on studying biases and noise in human judgement. Most of us professionals act in different ways in different situations, so how do we reduce noise? “If you average many (independent) judgements, the noise will go down,” says Kahneman in the interview. This reminded me of the principles behind superforecasting as well. One excellent example Kahneman gives is about grading exam sheets, where performance on the first essay at times influences the evaluator’s judgement of the second and third. To escape this bias, Kahneman recommends: “read one question across all booklets, and write the grade at the back of the booklet so you will not see it when you read the second and the third. To assess each question separately gives a fairer view of the student’s overall performance”. The interview touches on various themes, some light and some dense, including Kahneman’s tumultuous relationship with Amos Tversky, with whom he collaborated on much of the research behind the book.
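Kahneman’s claim that averaging many independent judgements reduces noise is a standard statistical result: the spread of an average of n independent, unbiased estimates shrinks by a factor of roughly √n. A small simulation makes it concrete (the “true value” and noise level here are made up for the demonstration):

```python
import random
import statistics

random.seed(42)

TRUE_VALUE = 100.0
NOISE_SD = 10.0  # each judge's independent, unbiased error


def judgement():
    # one noisy judgement of the true value
    return random.gauss(TRUE_VALUE, NOISE_SD)


def averaged_judgement(n):
    # the average of n independent judgements
    return statistics.fmean(judgement() for _ in range(n))


# Compare the spread of lone judges vs panels of 25, over many trials
singles = [averaged_judgement(1) for _ in range(2000)]
panels = [averaged_judgement(25) for _ in range(2000)]

print(round(statistics.stdev(singles), 1))  # close to 10: a lone judge's noise
print(round(statistics.stdev(panels), 1))   # close to 2: shrunk by about sqrt(25)
```

Note the caveat hiding in Kahneman’s parenthesis: the judgements must be independent. If the judges all share the same bias, averaging reduces the random scatter but leaves the shared bias untouched.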
A third perspective on biases in decision-making appears in this short essay by Cal Newport, in which he summarises a recent paper published in Nature on how we solve problems. ‘When presented with a challenging scenario, humans cannot evaluate every possible solution, so we instead deploy heuristics to prune this search space down to a much smaller number of promising candidates,’ writes Newport. Interestingly, however, when we do so ‘we’re biased toward solutions that add components instead of those that subtract them’. According to Newport, this bias has lasting consequences: we struggle to simplify, instead multiplying complexity, overburdening our solutions and eventually our calendars, and ending up with chronic cognitive overload. So while in these situations our brains tell us to do more, the solution may well lie in doing less. The practical application in a corporate environment is to ‘radically reduce their (employees) responsibilities, then leave them alone to execute’. An interesting thought, and worth a discussion.
A quote that I came across:
In an unpredictable world, a good routine is a safe haven of certainty.