Thoughts on ‘Project Data Analytics’
The tools and techniques of data analytics have the potential to make a significant contribution to improving project performance. I have no argument with the aspiration to drive improvement, nor with the idea that emerging data analytics technologies should be used more in data-rich environments like projects.
But I am concerned that we might be expecting too much from Big Data, AI & ML.
I have not seen any underpinning problem analysis that demonstrates that lack of such capability is one of the main root causes of the under-performance of projects today. So I am left wondering whether we are tackling the most important issue, and whether project data analytics will really ‘move the needle’.
I am hoping that readers of this article can help those like me to answer questions such as...
To clarify, I am not saying data analytics approaches cannot drive some improvements to projects. But how much improvement, and are they the highest priority for our limited R&D resources?
Below I discuss four reasons for my hesitation.
1: We already know what to do.
One of the underlying assumptions behind many data analytics initiatives is that past project data is a mine of undiscovered information. If only we could access this, we could identify how we should change project management practices and thus transform project performance for the better.
But do we as a profession need data analysis to identify improvement leverage points?
From what I see, the profession has a poor track record of (1) Identifying systemic leverage points, (2) Critically analysing emerging ideas and hypotheses, and (3) Implementing and improving the best ideas.
Take, for example, the ideas of ‘Front-end Loading’, scheduling and execution methods such as critical chain project management, collaborative contracting, or the re-emergence of ‘agility’ as a critical success factor. These have been known for several decades, but are still absent from many projects.
If we really want to improve project performance, why are these not universally applied today?
In 1931 the Empire State Building opened in New York. It was on schedule, under budget, and was completed 20 months after the architect was selected (19.5 months after the main contractor was selected) – a level of performance today’s projects get nowhere near. This was not achieved by cutting corners or taking risks, nor by using big data or computers. They managed the project well, using methods that are still far too rare nearly 100 years later. If they achieved this with paper and brainpower, just think how projects should be performing with today’s technologies…
It does not seem that the project management profession is short of good ideas. We are short of consistent practice of proven good ideas. How will data analytics, and more proof, resolve this issue?
If I am wrong and data analytics does identify huge improvement opportunities, what has changed that will make those who decide how projects are delivered act on the findings, given how little has been done with proven good ideas in the past?
If you believe, no data is necessary. If you don't believe, will more data really change your mind?
2: Projects are Systems
The performance of complex systems (which most projects are) is driven by the interactions between the component elements much more than by the individual elements themselves. Any improvement in project performance therefore requires holistic analysis and implementation: you cannot make the system better by improving one or two parts in isolation.
There is a risk that some of the data analytics initiatives will make this problem worse, for example by embedding a common but sub-optimal practice more deeply into project organisations.
One example is reporting and dashboards that provide data to decision-making boards and assurance/governance bodies. An alternative is to improve the data available to project team members and work-package managers, so that they make better and faster decisions themselves, reducing the need for senior managers to be involved so much. These two approaches need very different data. Another example is reporting progress as variance-to-baseline. Whilst this is the most common approach, it is not the only approach, nor necessarily the best. For example, projects using the critical chain method are not interested in task-level variance from plan, yet they report significantly better overall delivery performance.
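To illustrate how different the data needs are, here is a minimal sketch of the buffer-consumption signal used in critical chain reporting. The thresholds and function are illustrative assumptions, not standard CCPM tooling: the project is judged by how fast the shared buffer is being consumed relative to progress along the critical chain, not by task-level variance.

```python
def buffer_status(chain_done_pct: float, buffer_used_pct: float) -> str:
    """Classify project health from buffer burn vs. chain progress.
    Threshold values here are illustrative, not standard."""
    ratio = buffer_used_pct / max(chain_done_pct, 1e-9)
    if ratio < 1.0:
        return "green"   # buffer burning slower than progress: on track
    elif ratio < 1.5:
        return "amber"   # watch: plan recovery actions
    else:
        return "red"     # act now: buffer will run out before the chain does

print(buffer_status(60, 30))   # green
print(buffer_status(40, 50))   # amber
print(buffer_status(20, 60))   # red
```

Note that nothing in this signal requires per-task baseline variances; it needs buffer and chain progress data instead, which is exactly the point about the two approaches needing different data.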
What is the use of speeding up the flow of, and reducing errors in, data if the underlying assumptions behind our model of how to do things are wrong? Maybe the core problem is not data analytics capability, but our understanding of how project work should be managed. I do not see this prerequisite being addressed widely enough.
3: The science of prediction
Might project management be like earthquakes?
A global authority on earthquakes says “Neither [we] nor any other scientists have ever predicted a major earthquake. We do not know how, and we do not expect to know how any time in the foreseeable future.”[1]
Why should projects be any different from earthquakes? In fact, I could argue that projects are theoretically much harder to predict than earthquakes or the weather. Weather predictions are improving all the time, but more than 14 days into the future, forecasts are less accurate than simply using historical averages [2]. Likewise hurricanes – 48 hours out they are now pretty predictable, but far less so a week out.
For projects, as for earthquakes, we can calculate accurate historical averages, and these can be useful. Just not for predicting what will happen in four weeks on a specific project.
How does it help a project manager or senior executive to know that 59% of projects with similar progress data, finished within 10% of target, 16% overshot by over 30%, and 12% came in more than 10% early/under-budget?
Is this a case of statistically accurate, but of no use to an individual project?
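The point can be made concrete with a toy simulation. The numbers below are synthetic and assumed, not real project data: a base rate (the mean overrun) can be estimated precisely from history while the spread of individual outcomes remains so wide that the average tells a single project manager almost nothing.

```python
import random

random.seed(42)
# Hypothetical historical overruns (% vs target): mostly modest, with a fat
# right tail of badly overshooting projects.
overruns = [random.gauss(5, 10) if random.random() < 0.85
            else random.gauss(40, 15)
            for _ in range(10_000)]

mean = sum(overruns) / len(overruns)
overruns.sort()
p10, p90 = overruns[1_000], overruns[9_000]

print(f"mean overrun: {mean:.0f}%")                    # one tidy number...
print(f"80% of projects fell between {p10:.0f}% and {p90:.0f}%")  # ...hiding a huge spread
```

The mean is statistically sound, yet the 10th-to-90th percentile band spans tens of percentage points – which is the gap between portfolio-level accuracy and project-level usefulness.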
A related issue: if organisations become more data-driven, how do they evaluate an innovative idea? By definition it has not been done before, so there will be no data on how the alternative performs. And the complexity and timescale of many projects mean that carrying out a robust scientific experiment to create this data is impracticable.
Data is a key element in process improvement, but only when integrated into a causal model of the underlying behaviour [3]. This is where I see significant potential for data analytics in project management – as a key component in refining our theories and hypotheses about why projects behave as they do. From what I have seen so far, the focus for project data analytics has been on identifying associations and correlations, which I believe has much less potential value than causal modelling.
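A toy illustration of why correlation mining without a causal model misleads. All data and relationships below are invented for the example: project size is a hidden common cause that drives both the volume of status reporting and the overrun, so reporting strongly “predicts” overrun even though it causes nothing, and cutting reporting would change nothing.

```python
import random

random.seed(1)
n = 5_000
size = [random.uniform(1, 10) for _ in range(n)]        # hidden common cause
reports = [3 * s + random.gauss(0, 1) for s in size]    # big projects report more
overrun = [2 * s + random.gauss(0, 1) for s in size]    # big projects overrun more

def corr(xs, ys):
    """Pearson correlation coefficient."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

print(f"corr(reports, overrun) = {corr(reports, overrun):.2f}")  # strong, but not causal
```

A purely associational analysis would flag reporting volume as a “driver” of overrun; a causal model that includes project size would correctly attribute both.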
4: The issue of 'what data' & 'what analysis'
If we look at the field of manufacturing and distribution, there is an analogous example of how overreliance on data and software can hide an easier and better improvement opportunity.
After decades of adapting MRP/ERP systems and investing in technologies to improve planning and forecasting, companies including Michelin, BT and Unilever discovered that they were addressing the wrong problem. They improved their supply-chain performance significantly by changing their ‘thoughtware’[4], rather than their software.
There are dozens of published cases where planners took a two-day course, turned off the system's built-in MRP algorithm for their most problematic items, and used a spreadsheet (programmed with what they had learned on the course) to manage replenishment orders [5]. Almost immediately, KPIs improved significantly – typically higher service levels, lower inventory and shorter lead times.
The key to this was replacing the underlying concepts used to manage the supply chains with a method known as DDMRP. Of course, at commercial scale, software is necessary to embed the method quickly across large organisations, and these systems do use data-analytic methods and AI/ML. But method came before technology.
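To give a flavour of what those spreadsheets implemented, here is a rough sketch of the DDMRP net-flow replenishment check. The zone sizes and function are illustrative assumptions, not the full method: the net flow position (on hand, plus open supply, minus qualified demand) is compared against buffer zones, and an order is raised back up to the top of the green zone when the position drops into yellow.

```python
def replenish_qty(on_hand: int, on_order: int, qualified_demand: int,
                  top_of_yellow: int, top_of_green: int) -> int:
    """Return the order quantity suggested by the net-flow equation.
    Zone boundaries are inputs here; DDMRP derives them from usage and lead time."""
    net_flow = on_hand + on_order - qualified_demand
    if net_flow <= top_of_yellow:          # position has dropped into yellow
        return top_of_green - net_flow     # order back up to top of green
    return 0                               # still in green: no order today

print(replenish_qty(on_hand=40, on_order=20, qualified_demand=30,
                    top_of_yellow=50, top_of_green=120))   # 90
print(replenish_qty(on_hand=80, on_order=40, qualified_demand=20,
                    top_of_yellow=50, top_of_green=120))   # 0
```

The logic fits in a few lines – the point of the example is that the improvement came from changing the decision rule, not from more data or heavier software.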
Before we look to automate data collection and analysis, should we not critically analyse the underlying methods used today to manage projects? If we build inefficient processes and false assumptions into our systems, we risk making wrong decisions faster.
“The more efficient you are at doing the wrong thing, the wronger you become.”
Professor Russell Ackoff (a pioneer in the field of systems thinking and management)
What would I like to see happen?
I would like to see the data analytics activities integrated into a wider initiative to improve our understanding of projects – one that develops causal hypotheses about how projects perform, then tests and refines them in the light of gathered data.
I see significant benefits in presenting the right data in real time to project team members, to allow them to make better decisions. This requires not only data analytics techniques, but also an understanding of what constitutes ‘the right data’ and ‘better decisions’.
I also see potential in the use of system simulation modelling when developing project and portfolio execution strategies, both as a tool applied in individual circumstances and in developing professional guidance more generally. Measured data should be used to test and improve our theories and hypotheses, and this needs more than data, dashboards and correlations.
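As a minimal sketch of the kind of system simulation meant here (synthetic task distributions and an assumed structure, not a real project model), consider merge bias: even if every parallel path has a 50/50 chance of finishing by its median date, a project with several parallel paths does not, because the latest path sets the finish.

```python
import random

random.seed(7)

def path_duration(median: float = 20.0) -> float:
    # Right-skewed durations, with the median fixed at `median` days.
    return median * random.lognormvariate(0, 0.3)

n = 20_000
on_time_single = sum(path_duration() <= 20 for _ in range(n)) / n
on_time_merged = sum(max(path_duration() for _ in range(4)) <= 20
                     for _ in range(n)) / n

print(f"one path on time by day 20:    {on_time_single:.0%}")   # ~50%
print(f"four merged paths on time:     {on_time_merged:.0%}")   # far lower
```

A few lines of simulation expose a systemic effect that no amount of task-level variance reporting would reveal, which is the sense in which simulation can inform execution strategy before any dashboard is built.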
This article is based on a paper submitted to the APM's Project Data Advisory Group, April 2021.
References
[1] U.S. Geological Survey (USGS), https://www.usgs.gov/faqs/can-you-predict-earthquakes [Accessed 4/4/21]
[2] p131 in “The Signal and the Noise”, by Nate Silver (Penguin 2012)
[3] See for example ‘The Book of Why’, by Judea Pearl & Dana Mackenzie (Penguin, 2018)
[4] ‘Thoughtware’ is a term used by Carol Ptak and Chad Smith, founders of the Demand Driven Institute, and developers of the DDMRP method.
[5] The author has personally spoken to people from three different companies who have done this.
Comments
Ian, I've never found data that helpful at the individual project level, other than as needed for execution; it is more useful at the portfolio level, where you can see what is driving performance or underperformance. That said, I just this morning posted an Innovation Tip about keeping metrics simple. In new product R&D: how fast is your company getting new products to revenue, and how much throughput or revenue are they creating? If both of these are increasing, your efforts are working.
Good points made, Ian. We have a fundamental issue to solve before we get to any real data analytics, and that is one of TRANSPARENCY. Unless we have a culture that relishes transparency, the quality of any data is dubious. We can try to get to that culture by putting in systems that extract data, but we would have to “adjust” the data until we could prove its provenance. My work with VCP is attempting to get to the base data and start that process. The issue we currently have with a lot of the improvement data is its cleanliness. The construction industry KPIs are a good example of data that really has no clear line of sight and therefore probably underplays the real state. Having said that, we have to press for continuing data analytics in order to improve our performance.
Ian, you make some very good points. My first reaction is: "What is the problem that we believe the extra data is going to solve?". In her excellent book "Stop Decorating the Fish", author Kristen Cox goes into the fallacy of thinking that adding data (or money, or technology, etc.) is the answer when the problem has not first been defined or understood properly. Data has its place, for sure. It can also cause "paralysis by analysis".