It wasn’t that long ago that we didn’t have much use for probabilities in our daily lives outside of weather reports, election predictions, baseball and financial markets. But that’s all changed with the growing datafication of the economy and society. Probabilities have been playing an increasing role in our work and personal lives given our newfound ability to quantify just about anything. In all kinds of everyday situations, from medical diagnoses to financial decisions, we now have to accept the fact that it’s impossible to predict what will actually happen. Instead, we have to get used to living in a complex world of uncertainties and probabilities. We have to learn how to deal with the very messy world of big data, and how to best apply our learning to make good decisions and predictions.
Physics went through such a transition about 100 years ago. In the deterministic world of classical mechanics, there’s always a real truth. The same objects, subject to the same forces, will always yield the same results. Elegant mathematical models can be used to make perfect predictions within the accuracy of their human-scale measurements. Early in the 19th century, French mathematician and scientist Pierre-Simon Laplace observed that if we knew the precise state of the universe as represented by the position and speed of every one of its particles, classical mechanics would enable us to calculate all past and future states of the universe.
But, this predictable world began to fall apart in the early 20th century. 19th century classical physics was replaced by a kind of 20th century magical mystery tour, ruled by the new principles of quantum mechanics and relativity. Classical mechanics could not explain the counter-intuitive and seemingly absurd behavior of energy and matter at atomic and cosmological scales.
At these scales, the world is intrinsically unpredictable. Instead of a deterministic world, we now have a world based on probabilities. You cannot predict all the future states of an electron, for example, based on its present state. You can map out its behavior, but only as probability distributions of all the possible states it could be in. Moreover, the Heisenberg uncertainty principle tells you that it’s impossible to know the exact state of a particle. You cannot simultaneously determine its position and velocity to arbitrary precision, no matter how good your measurement tools are.
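The uncertainty principle has a precise quantitative form. For position and momentum it is usually written as:

$$\sigma_x \, \sigma_p \ge \frac{\hbar}{2}$$

where $\sigma_x$ and $\sigma_p$ are the standard deviations of position and momentum measurements, and $\hbar$ is the reduced Planck constant. The tighter you pin down one quantity, the wider the spread in the other must be, no matter the instrument.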
This transition, from a world view based on scientific determinism to one based on probabilities, is not intuitive and quite difficult to get used to. Probabilities are inherently hard to grasp, especially for an individual event like a war or an election, said David Leonhardt in a recent NY Times column. “People understand that if they roll a die 100 times, they will get some 1’s. But when they see a probability for one event, they tend to think: Is this going to happen or not? They then effectively round to 0 or to 100 percent… And when the unlikely happens, people scream: The probabilities were wrong!”
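Leonhardt’s die example is easy to check with a quick simulation. The sketch below, which uses only Python’s standard library (the counts will vary from run to run unless the seed is fixed), shows both halves of his point: the 1’s do show up in 100 rolls, and a one-off “10 percent event” really does happen in roughly one of every ten trials.

```python
import random

random.seed(42)  # fixed seed so runs are reproducible

# Roll a fair die 100 times and count the 1's.
rolls = [random.randint(1, 6) for _ in range(100)]
ones = rolls.count(1)
print(f"1's in 100 rolls: {ones}")  # typically around 100/6, i.e. about 17

# Now a single event with a 10% probability, repeated many times:
# it occurs in roughly 10% of the trials, even though on any
# one trial it usually doesn't happen.
trials = 10_000
hits = sum(random.random() < 0.10 for _ in range(trials))
print(f"10% events that occurred: {hits} of {trials}")
```

Rounding 10 percent to zero, in other words, means being surprised about one time in ten.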
He cites as examples the 2016 election, where almost everyone expected Hillary Clinton to win because the latest polls put her probability of winning at between 72% and 85%; and the 2017 Super Bowl, where in the final minutes of the game the Atlanta Falcons had a 99% probability of beating the New England Patriots.
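Neither outcome shows the forecasts were “wrong.” A minimal sketch (the probabilities here are just the figures quoted above, used for illustration) makes the base rates concrete: an 85% favorite loses about 15 times in 100, and even a 99% favorite loses about once in 100.

```python
import random

random.seed(7)  # fixed seed for reproducible runs

def upset_rate(win_prob: float, n: int = 100_000) -> float:
    """Fraction of n simulated contests in which the favorite loses."""
    losses = sum(random.random() >= win_prob for _ in range(n))
    return losses / n

# An 85% favorite (roughly the high end of the 2016 forecasts)
print(f"85% favorite loses: {upset_rate(0.85):.1%} of the time")

# Even a 99% favorite loses about 1 time in 100 -- unlikely,
# but far from impossible.
print(f"99% favorite loses: {upset_rate(0.99):.1%} of the time")
```

An upset is exactly what a well-calibrated forecast says should happen some of the time.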
Why is it so hard for people to deal with probabilities in everyday life? It’s because, when it comes to individual events, “people can’t resist saying that a probability was ‘right’ if it was above 50 percent and ‘wrong’ if it was below 50 percent,” notes Leonhardt. “I think part of the answer lies with Kahneman’s insight: Human beings need a story,” he added, referring to Princeton Professor Emeritus Daniel Kahneman. Professor Kahneman was the recipient of the 2002 Nobel Prize in Economics “for having integrated insights from psychological research into economic science, especially concerning human judgment and decision-making under uncertainty.”
In the 1970s, the prevailing view among social scientists was that people are generally rational and in control of the way they think and make decisions. It was thought that people only departed from rational behaviors because powerful emotions like fear, hatred or love distorted their judgment. Then in 1974, Kahneman and his longtime collaborator Amos Tversky, who died in 1996, published Judgment under Uncertainty: Heuristics and Biases, an article that challenged these assumptions. In a series of experiments, they demonstrated that human behavior often deviated from the predictions of the previous rational models, and that these deviations were due to the machinery of cognition, that is, to the biases and mental shortcuts or heuristics that we use for making everyday decisions, rather than to our emotional state.
Kahneman explained their research in his 2011 bestseller Thinking, Fast and Slow. Its central thesis is that our mind is composed of two very different systems of thinking, System 1 and System 2. System 1 is the intuitive, fast and emotional part of our mind. Thoughts come automatically and very quickly to System 1, without us doing anything to make them happen. System 2, on the other hand, is the slower, logical, more deliberate part of the mind. It’s where we evaluate and choose between multiple options, because only System 2 can think of multiple things at once and shift its attention between them.
System 1 typically works by developing a coherent story based on the observations and facts at its disposal. This helps us deal efficiently with the myriad simple situations we encounter in everyday life. Research has shown that the intuitive System 1 is actually more influential in our decisions, choices and judgments than we generally realize.
But, while enabling us to act quickly, System 1 is prone to mistakes. It tends to be overconfident, creating the impression that we live in a world that’s more coherent and simpler than the actual real world. It suppresses complexity and information that might contradict its coherent story, unless System 2 intervenes because it realizes that something doesn’t quite feel right. System 1 does better the more expertise we have on a subject. Mistakes tend to happen when operating outside our areas of expertise.
Making sense of probabilities, numbers and graphs requires us to engage System 2, which, for almost everyone, takes quite a bit of focus, time and energy. Thus, most people will try to evaluate the information using a simple System 1 story: Who will win the election? Who will win the football game?
This is not surprising. Storytelling has played a central role in human communications since time immemorial. Storytelling predates writing. Oral narratives were used by many ancient cultures as a way of passing along their traditions, beliefs and learning from generation to generation. Over the centuries, the nature of storytelling has significantly evolved with the advent of writing and the emergence of new technologies that enabled stories to be embodied in a variety of media, including books, films, and TV. Everything else being equal, it’s our preferred way of absorbing information.
“It’s not enough to say an event has a 10 percent probability,” wrote Leonhardt. “People need a story that forces them to visualize the unlikely event, so they don’t round 10 to zero.” “Imagine that a forecast giving Candidate X a 10 percent chance included a prominent link, ‘How X wins.’ It would explain how the polling could be off and include a winning map for X. It would all but shout: This really may happen.”
Big data is increasingly important to help us make sense of the complex, unpredictable world around us. But doing so effectively requires two complementary sets of skills. We need math, statistics and computer science to analyze and extract insights from all that data. But, we then need to help the data tell its story. And, that means translating the probabilities, tables and graphs which our brains typically have trouble processing into the simple stories that humans have long preferred.