Superforecaster

A superforecaster is a person who makes forecasts that can be shown by statistical means to have been consistently more accurate than the general public or experts. Superforecasters sometimes use modern analytical and statistical methodologies to augment estimates of base rates of events; research finds that such forecasters are typically more accurate than experts in the field who do not use analytical and statistical techniques. The term "superforecaster" is a trademark of Good Judgment Inc.

Etymology
The term combines the prefix super-, meaning "over and above" or "of high grade or quality", with forecaster, meaning one who predicts an outcome that might occur in the future.

History
The term is attributed to Philip E. Tetlock, following results from The Good Judgment Project and his subsequent book with Dan Gardner, Superforecasting: The Art and Science of Prediction.

In December 2019, a Central Intelligence Agency analyst writing under the pseudonym "Bobby W." suggested that the intelligence community should study superforecaster research into how individuals with "particular traits" make better forecasts, and how such people might be leveraged.

In February 2020, Dominic Cummings echoed Tetlock and others in suggesting that studying superforecasting was more worthwhile than listening to political pundits.

Science
Superforecasters estimate the probability of an event and revise that estimate whenever the circumstances underlying it change. Estimates draw on personal impressions, public data, and input from other superforecasters, while attempting to remove bias. In The Good Judgment Project, one set of forecasters was given training on how to translate their understanding into a probabilistic forecast, summarised by the acronym "CHAMP": Comparisons, Historical trends, Average opinions, Mathematical models, and Predictable biases.
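The estimate-and-aggregate process described above can be sketched in code. The following is a minimal illustration, not Good Judgment's actual method: the aggregation rule (averaging individual forecasts in log-odds space, with an optional "extremizing" factor that pushes the pooled estimate away from 0.5) and all the numbers are assumptions chosen for the example.

```python
import math

def logit(p):
    """Convert a probability to log-odds."""
    return math.log(p / (1 - p))

def inv_logit(x):
    """Convert log-odds back to a probability."""
    return 1 / (1 + math.exp(-x))

def aggregate(probs, extremize=1.0):
    """Pool several forecasters' probabilities by averaging in
    log-odds space; extremize > 1 pushes the result away from 0.5."""
    avg = sum(logit(p) for p in probs) / len(probs)
    return inv_logit(extremize * avg)

# Three hypothetical forecasters' current estimates for one event.
panel = [0.60, 0.70, 0.65]
print(round(aggregate(panel), 3))                 # pooled estimate
print(round(aggregate(panel, extremize=2.5), 3))  # extremized estimate
```

Averaging in log-odds rather than in raw probability space is one common way to combine probability judgments; the extremizing factor reflects the idea that a panel agreeing on 65% collectively carries more evidence than any single member's 65%.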

A study published in 2021 used a Bias, Information, Noise (BIN) model to study the underlying processes enabling accuracy among superforecasters. The conclusion was that superforecasters' ability to filter out "noise" played a more significant role in improving accuracy than bias reduction or the efficient extraction of information.
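The accuracy measure behind such studies is typically the Brier score, the mean squared error of probability forecasts against 0/1 outcomes. The sketch below is a hypothetical simulation, not the BIN model itself; it shows why noise alone degrades that score: a forecaster whose reports jitter randomly around an unbiased estimate scores worse on average than one who reports the same estimate consistently.

```python
import random

def brier(forecast, outcome):
    """Squared error between a probability forecast and a 0/1 outcome."""
    return (forecast - outcome) ** 2

random.seed(0)
# Hypothetical events, each occurring with a true 70% probability.
events = [1 if random.random() < 0.7 else 0 for _ in range(10_000)]

# A consistent forecaster reports the true rate every time; a "noisy"
# one jitters around the same unbiased mean (clamped to [0, 1]).
consistent = sum(brier(0.7, e) for e in events) / len(events)
noisy = sum(brier(min(max(0.7 + random.uniform(-0.25, 0.25), 0), 1), e)
            for e in events) / len(events)

print(consistent < noisy)  # noise alone worsens the average score
```

Because Brier scores are squared errors, zero-mean jitter adds its variance directly to the expected score, which is one intuition for why filtering out noise matters even for a forecaster with no systematic bias.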

Effectiveness
In the Good Judgment Project, "the top forecasters... performed about 30 percent better than the average for intelligence community analysts who could read intercepts and other secret data".

Training forecasters with specialised techniques may increase forecaster accuracy: in the Good Judgment Project, one group was given training in the "CHAMP" methodology, which appeared to increase forecasting accuracy.

Superforecasters sometimes judge events to be less than 50% likely that nevertheless happen: Bloomberg notes that in the month of the June 2016 Brexit referendum, they put the probability of a Leave vote at 23%. On the other hand, the BBC notes that they accurately predicted Donald Trump's success in the 2016 Republican Party primaries.

Superforecasters also made a number of accurate and important forecasts about the coronavirus pandemic, which "businesses, governments and other institutions" have drawn upon. In addition, they have made "accurate predictions about world events like the approval of the United Kingdom’s Brexit vote in 2020, Saudi Arabia’s decision to partially take its national gas company public in 2019, and the status of Russia’s food embargo against some European countries also in 2019".

Aid agencies are also using superforecasting to determine the probability of droughts becoming famines, while the Center for a New American Security has described how superforecasters aided them in predicting future Colombian government policy. Goldman Sachs drew upon superforecasters' vaccine forecasts during the coronavirus pandemic to inform their analyses.

The Economist notes that in October 2021, Superforecasters accurately predicted events that occurred in 2022, including "election results in France and Brazil; the lack of a Winter Olympics boycott; the outcome of America's midterm elections, and that global Covid-19 vaccinations would reach 12bn doses in mid-2022". However, they did not forecast the emergence of the Omicron variant. The following year, The Economist wrote that all eight of the Superforecasters’ predictions for 2023 were correct, including on global GDP growth, Chinese GDP growth, and election results in Nigeria and Turkey.

In February 2023, Superforecasters made better forecasts than readers of the Financial Times on eight out of nine questions that were resolved at the end of the year.

Traits
One of Tetlock's findings from the Good Judgment Project was that cognitive and personality traits mattered more than specialised knowledge in predicting the outcomes of world events, often more accurately than intelligence agencies. In particular, a 2015 study found that the key predictors of forecasting accuracy were "cognitive ability [IQ], political knowledge, and open-mindedness". Superforecasters "were better at inductive reasoning, pattern detection, cognitive flexibility, and open-mindedness". In the Good Judgment Project, the superforecasters "scored higher on both intelligence and political knowledge than the already well-above-average group of forecasters" taking part in the tournament.

People

 * Elaine Rich, a superforecaster who participated in the Good Judgment Project.
 * Andrew Sabisky, who resigned from his position as advisor to the United Kingdom government at Downing Street, with chief advisor Dominic Cummings telling journalists "read Philip Tetlock's Superforecasters, instead of political pundits who don't know what they're talking about".
 * Nick Hare, former head of futures and analytical methods at the Ministry of Defence (MoD).
 * Reed Roberts, a former PhD student in chemistry.
 * Jonathon Kitson
 * Jean-Pierre Beugoms
 * Dan Mayland
 * Kjirste Morrell
 * Dominic Smith