The Wisdom of Being Wrong: Rethinking Certainty in a Complex World

Roderik looking at an elephant statue from six sides

Picture an elephant in India in your mind’s eye. You can imagine describing it as a complete animal, with all its parts connected. Now imagine asking six people who are blind and have never heard of an elephant to explore a real-life elephant and tell us what it is, each approaching the animal from a different direction. One feels its side and claims it’s a brick wall, warm from the summer sun. Another feels its leg and claims it’s a column from the temple they pray in. Another feels its tusk and claims it’s a spear, while yet another feels its trunk and claims the elephant is like a snake. Each person in this story can only describe a part of the whole, and yet each would say they know ‘the truth’ of what an elephant is.

Now let’s take this story and see how it applies to the discovery of new knowledge. When I give talks, a story I often tell is that of how I grew up. As a boy in the Netherlands, cycling through the countryside (sometimes I embellish by adding ‘wearing my wooden clogs, past the windmills’), I passed a lot of canals and lakes, and in those bodies of water I saw a lot of swans. Thousands of them. I even went to the Rijksmuseum in Amsterdam to do some “historical research”. All of this led me to draw one important conclusion: “all swans are white”. At this point, invariably, someone in the audience claims this to be untrue, as - and I quote - ‘they have seen a black swan’. I ask how they can be sure it was a swan, or whether they painted it. And together we learn that even though my initial theory was based on extensive research on my side, it didn’t hold up against additional evidence.

Both of these stories teach us the following lesson: science, and the process of discovery, doesn’t help us be ‘more right’, it helps us be ‘less wrong’.

The Concept of 'Less Wrong'

So now that we have a vague understanding of the core concepts covered in this article, let’s take a moment to describe them. The approach of being 'Less Wrong' is centred around the idea of iterative learning and continuous improvement. It acknowledges the limitations of our current knowledge and emphasises the process of gradually reducing errors or inaccuracies through feedback, experimentation, and adaptation.

Today’s world is characterised by rapid technological advancements and global interconnectedness, and so the complexity and unpredictability of problems have escalated. Traditional models and frameworks, while still valuable, often cannot keep pace with the speed of change and the emergence of novel challenges. This reality underscores the importance of flexibility and adaptability in our approach to problem-solving. For instance, the COVID-19 pandemic demonstrated how assumptions in public health, global trade, and workplace dynamics can be upended overnight, requiring swift reevaluation of strategies and practices. Businesses that were able to pivot their operations quickly, whether through adopting remote work, adjusting supply chains, or innovating new services, exemplified the 'Less Wrong' mindset by adapting to new information and circumstances. This adaptability has become a crucial asset, enabling individuals and organisations not just to survive but to thrive in an era defined by its uncertainty. Embracing a 'Less Wrong' approach allows us to navigate this uncertainty more effectively, continually refining our understanding and strategies in response to an ever-changing world. In a later example I will show how a metric we used in a previous job to inform our decision-making quickly became a drag once it was used to provide more certainty, effectively flipping from a metric that helped us be ‘Less Wrong’ to one that forced us to be ‘More Right’.

The concept of ‘More Right’

I define the approach of being 'More Right' as aiming for precision and certainty from the outset. It involves making decisions based on the belief that there is a single, most correct answer or solution available, often relying heavily on existing data, models, or frameworks.

The focus on certainty is one of the pitfalls of the ‘More Right’ approach. This can be seen when we are overconfident in models or predictions. An unwavering belief in certainty can close off avenues for learning and adaptation. On the flipside, there might be times when this unwavering belief is necessary; think of life-and-death decisions where there is simply no room to learn in the moment (there is plenty of time for reflection later). Overall though, I believe the ‘More Right’ mindset is something holding us back.

Unfortunately, the ‘More Right’ mindset is the one I see prevailing everywhere. There is a reason we got to the point where the ‘More Right’ approach is an easy one to fall back on even when it is not beneficial, and that has everything to do with how we are taught to learn.

We are taught to be ‘More Right’

In traditional educational settings, students are often presented with problems that have clear, definitive answers. From early mathematics where 1+1=2, to more advanced subjects, the focus is on finding the one correct solution. This method of teaching instils a belief that every problem has a single right answer. This approach is deeply ingrained; it is rewarded through grades and reinforced by the authority of educators, shaping our fundamental approach to problem-solving.

Albert Einstein reportedly said, "If I had an hour to solve a problem, I would spend 55 minutes thinking about the problem and five minutes thinking about solutions." This quote, whoever actually said it, underscores the crucial insight that understanding the problem in depth is often more critical than the solution itself. In the real world, problems are rarely handed to us fully formed and well-defined. Unlike in the classroom, identifying the true nature of the problem is often the most significant challenge.

In my professional career I have often worked with people coming straight from university, and this is where the pattern becomes very clear. Many people starting their careers respond to questions by thinking hard and providing a single answer (often expecting a certain amount of praise for the solution they have just come up with to boot). What I am looking for is someone who first works with me for the metaphorical 55 minutes before we move on to a potential solution.

We are reinforced to be ‘More Right’

So we have been taught to be ‘More Right’ from the outset. Why do we keep doing it? One of the reasons we often cannot break the cycle of wanting to be ‘More Right’ is that we are reinforced to do so. Several cognitive biases serve to reinforce our attitude:

Confirmation Bias
This is the tendency to search for, interpret, favour, and recall information in a way that confirms one's preconceptions or hypotheses. In the context of problem-solving, confirmation bias can lead us to overlook new or contradictory information that could lead to a better understanding of the problem, effectively keeping us trapped in the belief that our initial understanding of the problem (our "1+1=2") is correct.

In a real-world or business setting we can often believe we know what the whole metaphorical elephant looks like when we are only holding a part. When someone else tells us they have a completely different view of the world (i.e. they are holding a different part of the elephant), it is easy to close our ears and ignore the new information.

Sunk Cost Fallacy
This bias occurs when we continue a behaviour or endeavour as a result of previously invested resources (time, money, or effort), rather than cutting our losses and changing course. In product development, this can manifest as continuing down a predetermined path even when it becomes clear that this path may not lead to the best outcome, simply because of the investment already made.

Part of the reason it is so hard to break out of the sunk cost fallacy in real-world and business settings is that we are often punished for ‘changing our minds’. Just look at the headlines for any example of a politician derided for changing their mind, with no context given on why they might have changed it, and no suggestion that maybe you should follow suit and change your mind too! Flip-flop along…

Scientific Discovery and Uncertainty: The Black Swan

Remember the story I told you at the beginning, about growing up in the Dutch countryside and seeing thousands of swans, all white, which led me to the belief that all swans are white? Well, this was the prevailing belief in Europe until the late 17th century. The consensus among naturalists and the general populace was that all swans were white. This certainty was rooted in centuries of observations documented across Europe and was considered an unchallengeable truth, embodying the 'More Right' approach to knowledge.

In the late 17th century, with the exploration of Australia by European sailors, the unexpected discovery of black swans challenged these long-held beliefs. This discovery served as a direct confrontation to the 'More Right' mindset, introducing a significant level of uncertainty into what was previously considered a closed matter.

As a ‘closed matter’, any new evidence was met with a lot of scepticism. This resistance to revising established beliefs is a hallmark of the 'More Right' mindset. Over time, the evidence of black swans was accepted, marking a pivotal moment where the scientific community began to embrace a 'Less Wrong' approach by acknowledging the limitations of its previous understanding.

So what happened next? The broader implications of accepting the existence of black swans led to new, critical questions about variation in nature, hereditary principles, and the mechanisms of genetic information transfer. You could argue that this opened up avenues for research that eventually contributed to the development of modern genetics and our understanding of biodiversity.

To emphasise, the black swan discovery exemplifies the value of embracing uncertainty in scientific advancement. It illustrates the shift towards a 'Less Wrong' mindset, where scientific progress is seen not as the pursuit of absolute certainty, but as a continual process of refinement and understanding based on incorporating new, sometimes unexpected, evidence.

Finally, the legacy of the black swan event is also seen in shaping the philosophy of science, the philosophy that answers the question ‘How do we learn?’. It serves as a reminder of the importance of remaining open to revising our understandings in the light of new evidence. This story underlines the dynamic, ever-evolving nature of knowledge and the critical role of uncertainty in driving the quest for deeper understanding. 

There are many more examples of scientists who used these principles to make scientific discoveries. One of them is Marie Curie, who made her discoveries on ‘radioactivity’ (a term she coined) at the end of the 19th and the start of the 20th century. At this time scientific and social dogmas were strong - the ‘More Right’ mindset prevailed. As the first woman to be awarded a Nobel Prize (and the first person to win Nobel Prizes in two different sciences), she broke social dogmas. In her research, she had to challenge scientific dogmas by questioning established scientific truths. Her approach demonstrated the importance of persistence, curiosity, and a willingness to question.

Practical Applications and Illustrations

In my own experience as a Data Scientist, and as a manager of Data Scientists in various roles, I have observed the ‘More Right’ mindset in action many times in this field as well. Let’s first see what that looks like in general, and I will follow it up with an example from my own past work. In the rapidly evolving world of data science, the quest for accuracy often leads us down a path where models dictate decisions with little room for questioning. This pursuit of being "More Right" can sometimes blind us to the complex realities outside our datasets.

Yet, what if we shifted our mindset from striving to be "More Right" to becoming "Less Wrong"? This approach acknowledges the inherent uncertainties in our models, encouraging flexibility, continuous learning, and adaptability. It's akin to using a weather forecast for planning an outing—being prepared for different outcomes rather than betting everything on a single prediction.

What does Being More Right with a Data Science Model look like?

Mindset
"This model has a high accuracy rate, so its predictions should dictate our strategy. If we follow what the model says, we'll achieve the best outcome."

What They Might Say
"According to our predictive model, launching Product X in Market Y will yield a 20% increase in sales. We should reallocate our budget to capitalise on this opportunity immediately."

Approach
The model's predictions are taken as near certainties, and strategies are developed based on the assumption that the model's output is the "right" course of action. Decisions are made with high confidence in the model's ability to predict future states accurately, possibly without sufficient consideration of uncertainties or external factors the model might not account for.

What does Being Less Wrong with a Data Science Model look like?

Mindset
"This model helps us reduce uncertainty and improve our decisions, but it's not perfect. Let's see where it can guide us and where we need to question its assumptions."

What They Might Say
"Our model indicates a trend in customer preferences, which suggests we should adjust our strategy in these areas. However, we'll proceed with small, iterative tests to validate these insights further before making significant investments."

Approach
Use the model as a tool for exploration and hypothesis generation. The focus is on learning from the model's output, questioning its assumptions, and continuously refining the model based on new data and insights. Decisions are made cautiously, with an awareness of the model's limitations.
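One concrete way to practise this ‘Less Wrong’ use of a model is to report the uncertainty around a headline metric instead of a single point estimate. Below is a minimal sketch in Python using a bootstrap confidence interval; the evaluation set, the sample size, and the 83% accuracy figure are all hypothetical, made up purely for illustration:

```python
import random

random.seed(42)

# Hypothetical evaluation data: 1 = model predicted correctly, 0 = it did not.
# 830 correct out of 1000 gives a point accuracy of 83%.
outcomes = [1] * 830 + [0] * 170

def bootstrap_accuracy_ci(outcomes, n_resamples=2000, alpha=0.05):
    """Bootstrap a confidence interval for accuracy, rather than
    trusting the single point estimate on its own."""
    n = len(outcomes)
    estimates = []
    for _ in range(n_resamples):
        # Resample the evaluation set with replacement and recompute accuracy.
        resample = [random.choice(outcomes) for _ in range(n)]
        estimates.append(sum(resample) / n)
    estimates.sort()
    lo = estimates[int((alpha / 2) * n_resamples)]
    hi = estimates[int((1 - alpha / 2) * n_resamples) - 1]
    return lo, hi

low, high = bootstrap_accuracy_ci(outcomes)
print(f"Point accuracy: 83.0%, 95% CI roughly [{low:.1%}, {high:.1%}]")
```

If a business decision only makes sense when accuracy is above, say, 85%, the interval tells you the model has not earned that confidence yet - exactly the kind of signal a lone point estimate hides.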

Being both ‘Less Wrong’ and ‘More Right’

At Booking.com I worked on a complex internal tool that was conceptually difficult to use. Because of that complexity it was not always used correctly. To improve the quality of usage we decided to create an internal metric to measure how the tool was used by different users. We made it our internal goal to use this metric to improve the tool's usage. We would do this by making changes to the tool or the support we provided and looking at how the metric changed. This setup exemplifies a 'Less Wrong' approach, where the focus is on iteratively improving a system based on feedback and metrics without imposing rigid success criteria.

The narrative takes a turn when the internal metric, designed as a feedback mechanism for continuous improvement, was discovered by other people in the organisation. Some of our users, and their leaders, discovered that the tool showed they were using our product suboptimally and asked for the metric to be repurposed as an external performance measure for users. This shift represents a move towards a 'More Right' approach, where success is measured against a fixed standard, potentially overlooking the nuances of individual use cases and the limited control users have over influencing the metric.

This shift led to unintended consequences, such as user frustration and potentially counterproductive behaviour. In essence it would lead to some of our users trying to figure out ‘which hoops to jump through’. This illustrates the pitfalls of the 'More Right' mindset when applied inappropriately, especially in complex, adaptive systems where direct control over outcomes is limited.

This anecdote also exemplifies Goodhart's Law - "When a measure becomes a target, it ceases to be a good measure" - and the Cobra Effect, where a solution to a problem exacerbates it through unintended consequences. These principles underscore the challenges of applying a 'More Right' mindset to dynamic and complex systems.

The Role of Experimentation and Learning

What happened in the example of the internal metric made me realise how important it is to recognise the context in which different approaches should be applied. In environments characterised by complexity and uncertainty, a 'Less Wrong' approach that emphasises flexibility, adaptation, and learning from feedback is often more effective. However, the temptation to shift towards a 'More Right' approach by setting concrete targets can lead to suboptimal outcomes if not carefully managed.

Experimentation and A/B testing become critical tools in this approach. They allow us to test hypotheses about both the nature of the problem and potential solutions in a controlled, measurable way. This not only helps in refining the problem statement but also in identifying solutions that work 'less wrong' compared to our previous attempts. This iterative learning process embraces failure as a stepping stone to better understanding and solutions, moving away from the quest for a single 'right' answer towards a continuous improvement model.
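As a sketch of what such a controlled, measurable test can look like: the two-proportion z-test below asks whether the difference between a control and a variant is plausibly more than noise. The conversion counts are hypothetical, and only the Python standard library is used:

```python
import math

def ab_test(conv_a, n_a, conv_b, n_b):
    """Two-sided two-proportion z-test: is the difference between
    variant B's and variant A's conversion rate more than noise?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis of "no difference".
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF (via erf).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical experiment: 10,000 users per arm,
# 5.2% conversion in control vs 6.1% in the variant.
z, p = ab_test(conv_a=520, n_a=10_000, conv_b=610, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

A small p-value here does not make the idea ‘right’; it only makes the ‘no difference’ explanation less plausible - one more step towards being less wrong.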

However, be careful when you think that ‘just doing experimentation’ leads you to automatically apply the ‘Less Wrong’ mindset. You can still do experimentation with a ‘More Right’ approach by using it to just look for evidence that your idea is great. In future articles I will delve deeper into this pitfall. (Of course if you are keen to hear my thoughts on this, and how they might apply to your company, go to the contact form at www.rdrkk.nl and let’s set up a chat!)

The shift from a 'More Right' to a 'Less Wrong' mindset is not just a change in problem-solving technique; it's a fundamental change in how we view the process of learning and addressing challenges. By embracing uncertainty, valuing experimentation, and learning from each step, we can navigate the complexities of the real world more effectively. This approach not only leads to better problem-solving but also fosters a culture of innovation and adaptability, crucial for success in today's rapidly changing environment.

Conclusion

In this article I have given you an overview of the ‘Less Wrong’ and ‘More Right’ mindsets, or attitudes: how the ‘More Right’ attitude has historically prevailed, but was also constantly countered, leading to wonderful learnings and lessons. I will dive deeper into this topic in a set of future articles and offer concrete strategies for fostering a 'Less Wrong' mindset within teams and organisations, such as promoting small-scale testing, highlighting uncertainty, and encouraging the questioning of underlying assumptions.

To summarise the key points: We have all been taught to be ‘More Right’ in our approach to learning and problem solving. Now is the time to learn how you can be ‘Less Wrong’. Being ‘Less Wrong’ is about humility, adaptability, and continuous learning in the face of uncertainty. It is not an easy path, and the journey towards understanding is always ongoing, but being ‘Less Wrong’ offers a more flexible and resilient path forward.

How about you? I am very curious to hear your experiences with data-driven decision-making and situations where a ‘Less Wrong’ approach could have made a difference. And of course I would love to hear from you if you think I am absolutely wrong, so I can be a little less so!
