Data-driven decision making and its effectiveness on mega projects
Context
Jim Barksdale, the former CEO of Netscape, once said, “If we have data, let’s look at data. If all we have are opinions, let’s go with mine.” Having worked on several major projects, I have witnessed strategic decisions being made on the basis of opinion at various stages of the project. There was a clear need for a single source of truth, but this was not easy to achieve as the projects transitioned through various stages and levels of maturity. Changes in leadership added further complexity to these mega projects. These anecdotal observations point to a general problem and to the need to understand the role of ‘data-based evidence’ in forming decisions.
In this article we will look at a bit of history to set the context, examine how decision quality can be evaluated, outline some supporting processes, and explore how technology can be an enabler of decision making on major projects during the execution phase.
Understanding the history of data-driven decision making (DDD) is important, as it lets us see whether practice has changed over the years and whether lessons can be drawn from the past. Data as a discipline has its roots in the scientific management techniques of the early 1900s and Frederick Winslow Taylor’s world-famous work “The Principles of Scientific Management” (Taylor, 1911). There was an intent to apply scientific management techniques to data, but it was limited by the technology available at the time. Willard Brinton’s (1917) introduction to data visualization was a precursor to more modern data analytics and dashboards. A positive correlation has been established between business performance and decision-making practice (Thomas et al., 1993; Mackie et al., 2007), and since a program is a temporary organization, a correlation between program performance and decision practices should be expected (Thiry, 2004), although this should be read in conjunction with the decision quality framework discussed later. Flyvbjerg (2007) has also observed that the main challenges of megaprojects are inadequate, unreliable or misleading information, and conflicts between decision making, policy and planning.
Note that I have specifically not covered ‘behavioural biases’, as they have been well researched; I would highly recommend the international bestseller ‘Thinking, Fast and Slow’ by Daniel Kahneman for readers who want to explore behavioural biases further.
This article therefore takes a technical focus and stays away from concepts like ‘strategic misrepresentation’, defined by Bent Flyvbjerg as ‘the tendency to deliberately and systematically distort or misstate information for strategic purposes’. I would strongly encourage readers to explore these concepts to gain a comprehensive understanding of decision making on major projects.
What’s working and what could be better?
Having worked on large, complex capital projects for over 20 years, I have seen that data-driven decision making as a concept was very much prevalent in several pockets of the organisations involved. In most cases, however, it was not integrated to provide insights to a governance forum or steering committee; it existed only in specific parts of the project. The idea of data-driven decision making is therefore not new, but new technology has made the handling of large datasets much easier. It is also observed that the data or insights valued by the program leadership team were not always valued by the construction manager on site. This is well understood: project managers and engineers look at ‘on site’ day-to-day data for decision making, while the steering committee relies on macro-level data for strategic decision making.
Decision making and Decision Quality framework
If we think through decisions and outcomes, we will see that these are two different things, because every choice carries uncertainty. You could make a good decision and still get a bad outcome. This is particularly relevant for major project performance. To illustrate the point, a major project like the Sydney Opera House could be judged a bad decision with a poor outcome when we consider the iconic project’s cost and schedule overrun: it was 1300% over budget and took 14 years to build. However, the positive outcome, given its iconic nature and the tourism boost, cannot be denied. Similarly, a poor decision may have a good outcome because of excellent execution or just plain luck.
To judge the quality of a decision, there are six distinct elements of a good decision, as documented by Carl Spetzler, Hannah Winter and Jennifer Meyer (2016). These are:
1. Appropriate frame – the context of the decision we are making or the problem that exists.
2. Alternatives – the options that we have.
3. Relevant and reliable information – what we know and the information we have.
4. Clear values and trade-offs.
5. Sound reasoning.
6. Commitment to action.
When we look at decision making on many projects, the thread of data that should connect the information available for decision making is often missing. This is where the ‘single source of truth’ can seem like an impossible task. Decision making in several cases centers on opinions and political drivers rather than on data. It is also seen that the data available from the supply chain cannot always be regarded as reliable and relevant, particularly data shared with the client by the contractor. In major projects there is often a growing divergence between a ‘commercial’ program, or Gantt chart, and the ‘real’ program used for executing the project. Similar differences of opinion apply to cost forecasts, risk positions and so on.
The process for good decisions in projects – Integrated Project Controls (IPC)
Project controls does not guarantee success – it needs to be set up in the right way. On mega infrastructure projects and programs, controls are often established in silos – for example in planning, cost, risk, change and governance – with a lack of integration between processes. This still brings some benefit to the project, but it does not realize the true potential of integrated project controls (IPC).
With the increased pressure on the infrastructure sector to deliver projects successfully, it is more important than ever for the industry to make the leap to a full IPC model – systematically bringing together customer requirements, supplier-management values and status information from all data sources in a compatible form that allows for rational decision making. That pressure will also require integration of controls with the supply chain, as projects are increasingly competing for a limited pool of suppliers and resources.
An integrated model can provide data-driven insights from integrated data sources: advising on areas for efficiency, recommending management action to protect performance, and bringing project and market data together to compare performance and propagate learning across the program. The most effective decisions can only be made when they are informed by data and supported by proper integration, which increases the importance of data stewardship on projects. A focus on establishing a culture of project controls – agreeing some ‘ways of working’ early as a team – can help mitigate some of the behavioural aspects that undermine the effectiveness of project controls.
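As a concrete, deliberately simplified illustration of the kind of insight that only emerges when cost and schedule data are brought together, the sketch below computes standard earned-value ratios (CPI and SPI) for a couple of invented control accounts. The structure, account names and figures are assumptions for illustration only, not a prescription for how any particular project should organize its data.

```python
# A minimal, illustrative sketch only: the control-account names and figures
# are invented, and the metrics are standard earned-value ratios used here as
# one example of an insight that needs cost and schedule data side by side.
from dataclasses import dataclass


@dataclass
class ControlAccount:
    name: str
    planned_value: float   # value of work scheduled to date (from the schedule tool)
    earned_value: float    # value of work actually performed to date
    actual_cost: float     # cost incurred to date (from the cost system)


def performance_indices(account: ControlAccount) -> dict:
    """Cost and schedule performance indices: CPI = EV / AC, SPI = EV / PV."""
    return {
        "account": account.name,
        "CPI": round(account.earned_value / account.actual_cost, 2),
        "SPI": round(account.earned_value / account.planned_value, 2),
    }


if __name__ == "__main__":
    accounts = [
        ControlAccount("Earthworks", planned_value=12.0, earned_value=10.5, actual_cost=11.8),
        ControlAccount("Structures", planned_value=30.0, earned_value=31.0, actual_cost=28.5),
    ]
    for account in accounts:
        print(performance_indices(account))
```

The point of the sketch is not the specific metric but the dependency: neither ratio can be produced, or trusted, unless the scheduling and cost sources have been integrated and are kept consistent.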
Currently, project controls is not inherently part of early project development. By making project controls and data strategy an inherent part of project strategy and boardroom discussions, program teams can signal from day one that transparency over performance is central to effective delivery.
Early establishment of systems and normalization of data-driven information gathering would help to set up a reliable source of project data and inform effective decision making. This can also help in avoiding the biases and personality politics that can sometimes be at play in organizations – focusing all decisions on hard data and clear, rational argument. Achieving this takes rigor and buy-in from the leadership team, but the benefits should not be underestimated.
The process of getting started with integration can be perceived as cumbersome and cost intensive. This need not be the case. There are three key steps all teams should consider, which will make establishing IPC on a mega infrastructure project clear and simple – delivering confidence in program and project performance.
Firstly, invest in a culture of transparency on projects, with the necessary executive buy-in. A lack of transparency goes hand in hand with the sort of siloed approach that should be avoided if we want to encourage data-driven decision making at every level. Secondly, teams must establish a data strategy so that an integrated project controls system is in place; if all data points are not integrated and thought through, the full benefits of IPC will not be felt. Thirdly, front-end planning is vital, with a focus on critical path activities such as long-lead item procurement and the award of early contracts. By doing this, efficiencies on projects can be realized and delivery made more effective.
Technology
Technology has evolved to give us both the computing power and the ability to integrate several software tools, through APIs (application programming interfaces) or otherwise. This means that if cost, schedule (time), risk, change control and site data are coming from various interfacing tools, you can design a software architecture to integrate all of that data and use a visualization platform such as Power BI to present the results and insights.
So, as per the figure below (a generic integration diagram has been used to explain the concept), in a mega project organization there are three parts to be integrated together; a simplified code sketch follows the list.
Part A – Enterprise systems on a project or within a project organization
Part B - A reliable integration process for technology integration
Part C – Integrated cloud-based service with a data warehouse / data lake
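To make the three parts more tangible, here is a minimal extract-integrate-load sketch of the flow. The source systems, field names, WBS codes and figures are hypothetical placeholders, not tied to any specific tool; a real implementation would call each tool’s own API or export in Part A, run the joins in the integration layer of Part B, and write into the warehouse or lake of Part C.

```python
# A deliberately simplified extract-integrate-load sketch of Parts A, B and C.
# The source systems, field names, WBS codes and figures are placeholders for
# illustration; a real pipeline would call each tool's own API or export.
import csv
import datetime


def extract_schedule():
    # Part A: stand-in for the scheduling tool (in practice an API call or export).
    return [{"wbs": "1.2", "activity": "Piling", "percent_complete": 65}]


def extract_cost():
    # Part A: stand-in for the cost management system.
    return [{"wbs": "1.2", "budget": 4.2, "actual_to_date": 3.1}]


def integrate(schedule_rows, cost_rows):
    # Part B: join the sources on a shared key (here the WBS code) so that
    # downstream reporting sees one consistent record per element.
    cost_by_wbs = {row["wbs"]: row for row in cost_rows}
    combined = []
    for row in schedule_rows:
        merged = {**row, **cost_by_wbs.get(row["wbs"], {})}
        merged["extracted_at"] = datetime.date.today().isoformat()
        combined.append(merged)
    return combined


def load(rows, path="project_controls_extract.csv"):
    # Part C: land the integrated records where a data warehouse / data lake
    # and a visualization tool such as Power BI can pick them up.
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(rows[0].keys()))
        writer.writeheader()
        writer.writerows(rows)


if __name__ == "__main__":
    load(integrate(extract_schedule(), extract_cost()))
```

The same pattern extends to risk, change control and site data by adding further extractors that share the common keys, with the visualization layer sitting on top of the integrated store rather than on each source tool.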
Realizing infrastructure’s potential and the urgency required
Challenging perceptions around cost and schedule overruns relies on the industry taking proactive steps to change the narrative – both in terms of improving efficiency and in focusing on the bigger outcomes and benefits that mega infrastructure projects deliver. To fulfil the potential of global infrastructure to support economic recovery across international markets, and to bring tangible benefits to people all over the world, we need a consolidated effort to embed IPC at the beginning of any mega project. This may mean embracing technology and having a clear data strategy that is used and understood across the organization, rather than being shelfware used solely by the IT department.
By adopting an IPC approach with integrated data points, avoiding a siloed approach and embracing technology, we can produce actionable insights that generate better outcomes and enable projects to realize long-lasting benefits for the economy, society and the environment. As stakeholders and project leaders, we all need to be part of the shift in mindset required to meet this goal. We need to be aware that some of the problems faced by global infrastructure projects are often referred to as ‘wicked problems’, where participants have incomplete knowledge and the problems are interconnected. As stated above, IPC is an integral part of the solution, while recognizing the constraints that may need to be factored in from the contracting model used and from competitive forces.
Source: ACES Newsletter
Written by Abhi Datta