
Are these the three most powerful words a leader can say?

Dionne Lew

When leadership is confused with the need to know everything it can lead to cultures of bluff where people feel it’s more important to give a response (including a wrong one) than acknowledge doubt.

Instead, these three simple words from a leader can establish a very different context: I don’t know.

“I don’t know” puts the focus on rigour and says many things including:

  1. Let’s not assume;
  2. We need data not anecdotes;
  3. Let’s find out.

Then why is saying it so difficult?

For one, we like to believe that certainty is possible.

This is despite knowing that many of the things we once thought to be true we now know to be false (that the earth is flat or that ulcers are caused by stress, for example), and that this will most likely happen again in the future.

This does not make earlier theories wrong so much as shows that what we know at any point in time is limited by the questions we ask, the assumptions we make and the tools that are available for assessing them.

We also look to others (in particular experts) to provide grounding even if it’s illusory. This is understandable because uncertainty makes us uncomfortable and impacts the way that we make decisions.

For example, in a recent article on risk literacy, Gerd Gigerenzer, director of the Centre for Adaptive Behaviour and Cognition at the Max Planck Institute for Human Development, points out that each year expert forecasters predict the following year’s exchange rates with a record that is hardly better than chance.

Despite this, we pay the forecasting industry $200 billion a year to do so.

Why? We need the sense of certainty these forecasts promise.

Likewise, Gigerenzer says patients assume their doctors are all-knowing and are reluctant to ask for evidence or second opinions, yet still feel better after a consultation. This is despite studies showing that many medical experts do not properly understand statistics such as base rates, which are important for assessing probability.

When we project this need for certainty onto someone in a leadership position, knowing and leading become confused, even though knowing everything has little to do with leadership ability.

We also assume that because leaders make decisions they are certain, when the reality is that leaders continually make decisions with imperfect information.

This does not make them uninformed. On the contrary, leaders have in-depth knowledge of a subject area, which combined with technical and strategic ability and experience, puts them in the best position to make these calls.

In this way, a good leader avoids the extremes of:

  1. Analysis paralysis – demanding more and more information in the hope of creating certainty before doing anything, often to deflect attention from an inability to make decisions; and
  2. Just-do-itism – confusing uninformed hunches with strategy and just diving in, because of an inability to deal with ambiguity or a lack of understanding of how to assess risk.

The problem with cultures that demand certainty is that they privilege fast-talkers or technical experts who are able to bamboozle others with jargon.

This can make ‘personality’ a greater consideration in promotion or succession than qualities such as critical thinking and emotional intelligence, which are integral to good judgment.

But does uncertainty mean we throw rigour to the wind, shrug our shoulders and say that if we can never be certain then learning does not matter? Or that a (truly) expert opinion has little more value than an uninformed guess?

Not at all.

First, we need to separate out the certainties (for example, the math used by an engineer to make sure that a bridge stands up) from the uncertainties (the conscious and unconscious biases that influence the way they research, share and implement what they have learned or how that information will be processed at the executive table).

Technical information, though vital, exists in context. Decision-making requires judgment, a quality that does not easily yield to measurement. And yet, as Deborah Nanschild and Heather Davis point out in ‘The V Factor’ it is these tacit and unexamined choices underpinning our judgments that form the real foundation of decision-making.

We also need to understand that we can deal with uncertainty in informed ways. Developing and testing theories, acquiring evidence, reassessing assumptions and refining the process is why we can now fly to the moon or cure diseases.

We can establish similar processes at work, such as peer reviews, pre-mortems or project boards specifically designed to surface unknowns and come up with solutions.

People who dedicate their lives to studying a discipline know a great deal more about that area than others. This does not mean we should box them into commenting only on their area of expertise, but we should not assume they know, or should know, everything else (and nor should they).

In light of all this, it’s important to keep an open mind and find ways to include the quiet, the considered and the reluctant (not just the loudest or most confident) in decision-making.

If we stopped automatically applauding answers but encouraged healthy (not debilitating) reflection and doubt we could create a very different workplace culture.

You can navigate uncertainty from a position of personal and professional strength if you:

  1. Understand that it’s normal to crave certainty;
  2. Accept that you do not know everything, and that nor does anyone else;
  3. Ask questions about information you’re given rather than accepting it at face value, even from experts. If it’s impenetrable, ask them to rephrase it in terms you can understand;
  4. Actively manage the desire of groups to converge too quickly on information (groupthink) before ideas have been thrashed out;
  5. Be aware of biases that stop you from accepting new information because it’s unknown or uncomfortable;
  6. Be willing to let go of old information in the light of new evidence, even if it means earlier beliefs were wrong;
  7. Find ways of including people who aren’t confident or articulate in decision-making;
  8. When it’s true, say to someone: I was wrong;
  9. When it’s true, say: I don’t know.

And even when it’s not, try asking yourself: what if I am wrong? Or: can I think of at least one unknown?