I recently read the book Superforecasting: The Art and Science of Prediction, by Philip E. Tetlock, a social psychologist at the University of Pennsylvania, and coauthor Dan Gardner, a journalist. The book explores the elements of excellent prediction. In it, the authors refer to the Good Judgment Project, which Tetlock created with his wife and fellow psychologist, Barbara Mellers, in 2011. The Good Judgment Project compared the forecasting acumen of nearly 3,000 ordinary volunteers to that of so-called “experts,” testing whether these amateurs (a retired computer programmer, a social services worker, and a homemaker, among others) could be trained to forecast political, economic, and foreign policy events better than credentialed experts from MIT, the CIA, the Pentagon, and elsewhere. You can probably guess the winner: the amateurs far outperformed the experts. In fact, the experts often did quite poorly; on certain types of questions, they were no better than random chance. Or as Tetlock has famously put it, the experts did little better than a “dart-throwing chimp.”

Tetlock’s point is that we are all forecasters. I would add this kicker: The best forecasters gather evidence before jumping to conclusions; they think probabilistically; and, perhaps most importantly, they change their minds when facts change. They admit their mistakes and change course.

Sounds easy, right? I wish. We all suffer from some degree of what I call “belief severance.” We rationalize, contort, and contrive our minds in crazy ways to avoid changing our theories or beliefs. We stubbornly want to prove ourselves right, refusing to accept new evidence that flies in the face of those beliefs, just so we can keep insisting we are right (even when it’s now clear we’re not).

Riffing off of Tetlock’s research, here are some fascinating consistencies in the approach taken by the best amateur forecasters:

They are probabilistic: Wouldn’t it be cool if our kids had math classes that taught not only the ‘how’ of math but also its practical application?! If they did, our kids would learn all about probabilities, including how to interpret a statement like “an 80 percent chance of rain.” Does that mean it will rain for 80 percent of the day and stay dry for the other 20 percent? Or does it mean there’s an 80 percent chance of any rain at all, and if it does rain, will it rain for 10 percent of the day or for 100 percent of it? Understanding these nuances of probability theory is both fascinating and important. It’s too much to explain in a few sentences, but I’d suggest Sal Khan’s introduction to probability theory (four quick videos on the Khan Academy site), or a clever example called “The Last Banana” on the TED-Ed site, as good places to open your mind to the basic concepts.
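To make the distinction concrete, here is a minimal Python sketch (my own illustration, not from the book) of the standard interpretation: the 80 percent applies to whether it rains at all on a given day, not to what fraction of the day is wet.

```python
import random

def simulate_forecast(p_rain=0.8, days=100_000, seed=42):
    """Simulate many days under an '80 percent chance of rain' forecast.

    Each simulated day either has rain (probability p_rain) or stays dry.
    Returns the fraction of simulated days on which it rained.
    """
    rng = random.Random(seed)
    rainy_days = sum(rng.random() < p_rain for _ in range(days))
    return rainy_days / days

print(simulate_forecast())  # fraction of rainy days, close to 0.80
```

Over many days, roughly 80 percent see rain, but the forecast says nothing about how much of each rainy day is wet; that would require a separate statement.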

They never rush to judgment: While we might have immediate “gut reactions” and reflexive convictions about which direction to take, we should try to avoid the allure of making those quick judgments final. The best forecasters don’t succumb to “confirmation bias.” They remain open to a range of possible explanations until the evidence overwhelmingly favors one of them.

They are open to small ideas: We are in an age of ‘big ideas’: massive generalizations that make for sensationalized television and quick sound bites, but are dangerous in the context of building successful businesses. Companies suffer when they let big ideas get in the way of being excellent forecasters of important small ideas.

They are aggregators of perspectives: I have a somewhat obsessive passion for finding as much information and gathering as many perspectives as possible to help me understand and contextualize my thoughts. I describe it as ‘going to page 20 in a Google search.’ (Research suggests that only about 0.29 percent of people, roughly one in 350, go that deep.) Consider the wunderkind statistician Nate Silver: he showed that forecasts become more accurate when they aggregate the results of every distinct survey rather than leaning on any single one. The best thinkers, in other words, take every available thought (or, in Silver’s case, every survey or poll), aggregate and assess them, and only then form their forecast. This often means letting go of our ego in search of a more unbiased, accurate, and powerful truth.
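The aggregation idea can be sketched in a few lines of Python. This is my own toy illustration, not Silver’s actual model, and the numbers (a candidate’s true support, poll sizes) are invented: averaging many noisy polls yields a steadier estimate than any single poll.

```python
import random

def aggregate_polls(true_support=0.52, n_polls=30, sample_size=800, seed=7):
    """Simulate n_polls independent polls of sample_size voters each.

    Each poll is a noisy estimate of true_support; the aggregate is the
    simple average of all polls. Returns (individual polls, average).
    """
    rng = random.Random(seed)
    polls = [
        sum(rng.random() < true_support for _ in range(sample_size)) / sample_size
        for _ in range(n_polls)
    ]
    return polls, sum(polls) / len(polls)

polls, average = aggregate_polls()
print(min(polls), max(polls))  # individual polls scatter around the true value
print(average)                 # the average lands much closer to it
```

Any single poll can miss by a few points, but the errors largely cancel in the average; that variance reduction is the statistical heart of the “aggregate every survey” approach.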

They start outside to get in: It’s easy for anyone to look inside and have an opinion or a forecast: we just repeat what we already think. That’s simple, but it’s usually wrong (or, at a minimum, not ideal). Better to start by looking at the entire forest: gather facts about the forest, then study the individual trees and gather facts about them, too. Only then will we have the deep contextual perspective we need to form forecasts about how to plant, build, and shape the trees that are our daily work. Daniel Kahneman, the great thinker and one of the fathers of behavioral economics, calls this the “outside view.” Start with the more abstract big picture, and then let that big picture refine your thoughts on the tasks at hand.

And here’s the biggie I referred to earlier:

They admit when they’re wrong: When accused of being inconsistent, the legendary British economist John Maynard Keynes is said to have quipped back: “When the facts change, I change my mind. What do you do, sir?” Many people (who are not superforecasters) do not change their minds when the facts change. Instead, they fall into a downward spiral of defensiveness and stubbornness. This is dangerous! Opinions in any organization or business must be open to discussion, distillation, disagreement, dissent, and, when warranted, discard. Opinions may be ours, but they are not us, and they do not define us. Facts are meant to be discovered; they are not screaming out at us. We must be diligent explorers, searching out the relevant facts that matter most. And if we find a fact that proves our opinion wrong, embrace it! Be wrong: being disproven by a new fact is excellent. It’s normal, and it’s valued in a fact-driven, drama-free environment. Doing this serves us well as forecasters.

I believe that intellectual curiosity is at the core of a purpose-driven life. The authors of Superforecasting illustrate intellectual curiosity with a simple example: Do you take the question “Who will win the presidential election in Ghana?” as pointless, or as an opportunity to learn something about Ghana?

This may sound corny, but I constantly try to remind the people who work at Uptake, the company I run, as well as myself, to “be super”: super in our efforts to tenaciously learn and discover the unarguable facts; super in our refusal to rush to judgment about the quality of our opinions or the opinions of others; super in seeing both the outside and the inside; super in our refusal to let easy big ideas define how we pursue the complicated small steps; and super in seeing when we’re right, admitting when we’re wrong, and then gracefully transitioning to the more probable path of success.

We all can be more super every day. It may not be easy, but I don’t believe that “easy” and “super” have anything to do with each other.
