Don't let optimism get in the way of good change management

Christian Nyvang Qvick, Senior Consultant, LEAD

We tend to be too optimistic and don't think through the bad scenarios. In a management context, this often results in change projects that go over time and over budget and fail to meet success criteria, writes Christian Nyvang Qvick.

78 percent of Swedish drivers believe they drive better than the average driver. 94 percent of American lecturers believe they teach better than the average teacher. And 82 percent of French men believe they are better lovers than the average Frenchman.

The above is something to smile about. It goes without saying that we can't all be better than average. But at the same time, these are good examples that help illustrate what a bias actually is: a systematic error in our thinking.

It means we deviate from what is rational or common sense, and it leads us to not see the world as it really is.

This also applies to leaders. And it creates problems when they need to implement big changes.

When you're overly optimistic about the future

There is a bias called the planning fallacy. This bias describes our human tendency to underestimate how long a task will take.

It applies to both small and large tasks related to change, where we think 'it's a quick fix' - but the everyday friction of technical difficulties, delayed deliveries from business partners, unforeseen tasks that suddenly land on our desk or sudden illness in the team means that the original schedule slips.

And there are a number of reasons why we tend to think that implementing change is easy and elegant.

Unrealistic optimism is a bias that describes our human tendency to believe that bad things won't happen to us. In other words, we imagine that the implementation of the new IT system will go smoothly. There will be no technical issues along the way. And our IT manager won't suddenly quit, causing the project to lose momentum until we find a new project manager two months later.

This is linked to two other biases that also describe our tendency to ignore risks related to the future. Catastrophe neglect means that we tend to imagine worst-case scenarios that are not nearly bad enough. Normality bias describes our tendency not to think about or be wary of rare or abnormal events that haven't happened before. The examples are almost endless. In February 2020, few leaders would have predicted that a worldwide pandemic would hamper a lot of ongoing change projects in Danish organizations.

Similarly, few business leaders were prepared for the fact that a rare natural phenomenon like the hurricane in 1999 would end up destroying production facilities that had to be rebuilt before companies could produce as before. And few could have imagined the consequences the financial crisis would have for many companies. In short, we are wired in such a way that we intuitively don't think these kinds of risky thoughts.

Finally, there is one last relevant bias. Wishful thinking describes our tendency to overestimate the likelihood of something happening based on how much we want it to happen, rather than on facts or rational considerations. This is where top management ends up formulating an overly ambitious vision that is, in practice, impossible to achieve within the desired timeframe, but where utopian dreams seduce them into forgetting the realities of the business.

The Big Problem: The Iron Law of Planning

And why is it a problem that leaders can be affected by the biases described above?

Through his research, Danish professor Bent Flyvbjerg has shown that three things happen repeatedly in large-scale change projects, such as construction projects where a new production hall is built, a bridge is constructed or a railroad network is established. They go over time. They go over budget. And they fail to meet the success criteria that were initially set.

Finally, he demonstrates that this pattern repeats itself over and over again. Overall, he calls this phenomenon the Iron Law of Planning.

So what can you do to avoid letting your tendency toward over-optimism lead you to plan change projects that run over time, exceed their budget and fail to realize the expected success criteria?

'Debiasing': Manage your biases

Successful debiasing means that you are able to manage the biases you could potentially be affected by. In practice, you can 'control' your tendency toward over-optimism in several ways.

Premortem analysis. Premortem is Latin for 'before death' - which makes sense when you hear what a premortem analysis is all about. You and your team mentally fast-forward to the deadline for implementing your change project and imagine that everything has gone wrong. You then try to find explanations for why the change failed. One suggestion could be that, during the presentation of the change, the management team spoke so much management jargon that no one understood it - and therefore no one knew how to contribute to the change. Then turn the list around and get an overview of what you need to do to prevent the change from failing - for example, avoid speaking in abstract platitudes and instead make it concrete how employees and managers at all levels can contribute to the change.

Reference class forecasting. Here you start by determining the time, budget and success criteria for the change project you need to implement. Next, you look back at similar projects in your own organization and your network, or seek knowledge from an external consultant. How much did these projects go over time? How much did they go over budget? To what extent did they meet their success criteria? Finally, you take the data from these projects, add the average time and budget overruns to your original estimate and use the data to make any necessary adjustments to the success criteria. By doing this, you supplement your 'inside perspective' with an 'outside perspective' - and avoid falling into the trap that bias researcher Daniel Kahneman calls WYSIATI (What You See Is All There Is).
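
To make the arithmetic concrete, here is a minimal sketch in Python of what such an adjustment could look like. All project figures below are invented purely for illustration; they are not data from the article, from Flyvbjerg's research or from real reference projects.

```python
# Minimal sketch of reference class forecasting.
# All figures are hypothetical, purely for illustration.

# Your own "inside" estimate for the change project
inside_months = 6
inside_budget = 2_000_000  # DKK

# Reference class: comparable past projects (estimated vs. actual)
reference_projects = [
    {"est_months": 4, "act_months": 6, "est_budget": 1_500_000, "act_budget": 2_100_000},
    {"est_months": 8, "act_months": 11, "est_budget": 3_000_000, "act_budget": 3_900_000},
    {"est_months": 5, "act_months": 7, "est_budget": 1_800_000, "act_budget": 2_300_000},
]

def average_overrun(projects, est_key, act_key):
    """Average ratio of actual to estimated values across the reference class."""
    ratios = [p[act_key] / p[est_key] for p in projects]
    return sum(ratios) / len(ratios)

time_factor = average_overrun(reference_projects, "est_months", "act_months")
budget_factor = average_overrun(reference_projects, "est_budget", "act_budget")

# The "outside perspective": scale the inside estimate by the typical overrun
adjusted_months = inside_months * time_factor
adjusted_budget = inside_budget * budget_factor

print(f"Typical time overrun factor:   {time_factor:.2f}")
print(f"Typical budget overrun factor: {budget_factor:.2f}")
print(f"Adjusted plan: {adjusted_months:.1f} months, {adjusted_budget:,.0f} DKK")
```

In this invented example, comparable projects ran roughly 40 percent over time and 30 percent over budget, so the original plan of 6 months and 2 million DKK would be adjusted to roughly 8-9 months and about 2.7 million DKK before the success criteria are finalized. The point is simply that the adjustment comes from the track record of similar projects rather than from your own optimism.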

Use your critical sense

Finally, you can take comfort in the fact that simply knowing about the biases above reduces the risk of 'falling into the trap'.

So keep these biases in mind the next time you're planning a change, and you'll increase the likelihood that your own optimism won't get in the way of your good intentions.
