Transforming project success: exploring data-enabled investment decisions
Greg Krawczyk ChPP
Simplified, Proven Tools that Drive Successful Projects & Change | Chartered Project Professional | Creator of Outcomes Weekly | Innovating with Confluity
The Governance SIG and PMO SIG held a webinar on February 28th, 2023, exploring data-enabled investment decisions to transform project success.
The event was attended by 180 people who came to learn how to use effective governance to unlock data for their projects.
The event, which was free to members and non-members alike, was run via Microsoft Teams. The first in a series of events on "Effective governance, unlocking data and project success", it brought together three subject matter experts and leading thinkers from key APM specific interest groups:
- Project data analytics viewpoint – James Lea, Project Science
- PMO viewpoint – Michael Hooper, Atkins, APM PMO SIG
- Governance viewpoint – Jonathan Daley, DfT & APM Governance SIG
- Event facilitator – Greg Krawczyk, Age UK & APM Governance SIG
Attendees benefited from the experts' discussion of the problem with project success, including the concept of reference class forecasting to inform investment decisions and the principle that "all projects are data designed and data enabled." Interactive surveys were included in the event to validate key problem statements and identify the data-enabled approach required for investment decisions, which will form the basis of understanding for future events in the series and publications on this topic.
Summary
The panel discussions highlighted the importance of setting realistic expectations and developing robust business cases for investment decisions, utilising historical project performance data to enable accurate estimating. RCF adoption was assessed as low by the audience, despite their understanding of the importance and value of the technique. Building and sharing data is critical to making better investment decisions, and making investment data available to delivery teams requires alignment, quantification, governance, quality, training, and capturing lessons learned. The discussions emphasized the need for a more rigorous and transparent approach to project management, changing the culture around using data, synthesizing risks across projects, and being more proactive in learning from past failures to prevent them in the future.
Join in
To receive a copy of the future publications on this topic from the Governance SIG, please follow the community page for updates.
The second event in the series is now live and will explore the concept of Strategic Misrepresentation in a round-table format. It takes place in London on May 22nd at 6pm; you are cordially invited to help shape the debate and bring light to the concepts that could be deployed to mitigate strategic misrepresentation.
Mentimeter was used throughout for audience engagement and for capturing sentiment data against the four questions. Please see the slides and results online by clicking the Mentimeter image below.
Further Reading
For those looking to get into project data analytics, or to better understand the barriers to implementation, the following publications are recommended:
How to get into project data analytics - https://www.apm.org.uk/news/how-to-get-started-in-project-data-analytics-webinar/
Barriers to project data analytics and how to overcome them - https://www.apm.org.uk/blog/barriers-to-project-data-analytics-and-how-to-overcome-them/#author
Post Event Write-Up
The topic of project failure rates can be a sensitive one, so the event was not recorded to enable the audience to speak freely. Meeting summary notes were taken by hand and, with the help of ChatGPT, developed into a summary write-up.
Summary of Discussion - Event 1 in the Series "Effective governance, unlocking data and project success".
The three panellists, covering Data Analysis, PMO and Project Governance, were presented with four questions for discussion. At the end of each section, the audience was invited to engage with the topic and paint a picture with numbers via Mentimeter surveys.
The questions were as follows:
Q1 - Where can you link failed projects back to the investment decision?
Q2 - To enable RCF, how can we build and share data?
Q3 - How can we make data used for investment decisions available to delivery teams?
Q4 - Is it acceptable for a project to fail, when the stakeholders knowingly held the information which could have prevented its failure?
Q1 - Where can you link failed projects back to the investment decision?
Introduction
The recent panel discussion brought together experts in project management to share insights on managing project risk and the importance of setting realistic expectations for investment decisions. The first question of the discussion focused on identifying common problems encountered in project management. The panelists shared examples of challenges faced in setting realistic expectations and dealing with anchoring effects, strategic misrepresentation, and unanticipated complexity.
Identifying the Problem: Examples from the Panel
One of the participants provided an example of the anchoring effect on a project, in which the client gave a target price during the tender stage, prior to business case approval. Consequently, all commercial estimates were biased towards this amount, leading to cost inflation and difficulties in meeting the actual project cost. In another case, a panellist described how the project delivery team was instructed to deliver a project without being informed that the brief had been made more ambitious, and without the time or cost being increased during the business case phase. This resulted in difficulties in delivering the project to schedule and budget, and the team struggled to cope with the increased workload. Had they been provided with more information, this might not have been the case.
Another member of the panel spoke about the notion of a conspiracy of optimism and entryism, whereby strategic misrepresentation can result in setting unrealistic expectations for projects. He cited the example of the Queen Elizabeth class aircraft carriers, where the business case was not properly established. The defence estimators were challenged by private-sector scrutiny estimators during the business case stage, yet the project proceeded beyond the business case, with costs rapidly escalating to the level identified by the external assessors. This scenario could be regarded as a conspiracy of optimism.
A third panellist discussed how unforeseen complexity can affect project outcomes. He referred to the example of Crossrail, where the estimated project cost increased by 50%. The complexity was not entirely understood during the development of the business case, and the changes that were made resulted in substantial complexity on top of complexity, change on top of change.
Panel Feedback and Audience Responses
During the session, the panel shared their experiences with managing risks in projects and provided recommendations on how to avoid or mitigate these problems. The audience was also given the opportunity to provide feedback and responses to the discussion. The Mentimeter stats showed that the audience leaned towards strongly agreeing that expectations for the investment decision have a high impact on the projected success of a project. However, the audience rated their organization's performance at estimating time, cost, and benefits within their business cases poorly (2.6 out of 5). When asked whether their projects always achieve their objectives, the audience answered neutrally.
Conclusion
In conclusion, the panel discussion provided valuable insights into identifying common problems in project management, including anchoring effects, strategic misrepresentation, and unanticipated complexity. The session emphasized the importance of setting realistic expectations and developing robust business cases. The panel's feedback and audience responses also highlighted the need for improving estimation and risk management practices to achieve project success. By sharing experiences and best practices, project management professionals can better manage risks and set realistic expectations for investment decisions.
Q2 - To enable RCF, how can we build and share data?
Introduction
Reference class forecasting (RCF) is a technique used in project management to estimate the cost, time, and scope of a project based on data from similar past projects. However, to implement RCF effectively, project managers need to build and share data across the organization. This write-up presents a summary of the discussions and ideas shared by the panel of experts on how to enable RCF by building and sharing data.
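To make the technique concrete, below is a minimal sketch of how an RCF uplift might be computed, assuming the organization already holds outturn-versus-estimate cost ratios for a class of comparable past projects. The function, the ratios, and the figures are illustrative assumptions only; they are not drawn from the event or from any real data set.

# Minimal RCF sketch (Python): derive a cost uplift from the distribution of
# overruns observed in a reference class of comparable past projects.
# All names and figures are illustrative assumptions, not data from the event.

def rcf_uplift(overrun_ratios: list[float], acceptable_overrun_risk: float) -> float:
    """Uplift factor at the chosen confidence level.

    overrun_ratios: actual cost / estimated cost for each reference project.
    acceptable_overrun_risk: chance of overrun the investor will accept,
        e.g. 0.2 means the uplifted budget should still be exceeded by
        roughly 20% of comparable projects (a P80 budget).
    """
    ranked = sorted(overrun_ratios)
    # Linearly interpolated (1 - risk) quantile of the historical overruns.
    position = (1.0 - acceptable_overrun_risk) * (len(ranked) - 1)
    lower = int(position)
    upper = min(lower + 1, len(ranked) - 1)
    fraction = position - lower
    return ranked[lower] + (ranked[upper] - ranked[lower]) * fraction


# Illustrative reference class: outturn/estimate ratios for ten past projects.
reference_class = [1.05, 1.10, 1.18, 1.22, 1.30, 1.35, 1.42, 1.55, 1.70, 1.95]

base_estimate = 10_000_000  # inside-view estimate, in GBP
uplift = rcf_uplift(reference_class, acceptable_overrun_risk=0.2)
print(f"P80 uplift factor: {uplift:.2f}")
print(f"Risk-adjusted budget: £{base_estimate * uplift:,.0f}")

The same distribution also supports the risk-appetite point raised later in the Q&A: an organization comfortable with a higher chance of overrun would simply read off a lower percentile.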
Building and Sharing Data for RCF
During the discussion, the panellists shared different ideas on how to build and share data to enable RCF. James emphasized understanding the data available and modelling the noise within data sets. He used the analogy of a sat-nav and a car: just as we do not set off without knowing our destination, we should not embark on project delivery without proper data. Mike suggested the use of data lakes and AI, and preventing silos of data, to build and share data effectively. He also stressed the need for data translators to help project managers interpret and utilize data effectively. Jonathan urged the need to develop the skills to ask the right questions and to ensure that boards reach the right decisions based on the right input information. He further noted that a change of culture is required to achieve this.
Mentimeter Review
During the event, a Mentimeter review was conducted to assess the audience's past experience and confidence levels in implementing RCF. The review showed that organizations do not naturally use data from previous projects to inform investment decisions, with a score of 4.6 out of 10. The audience was also not confident about implementing RCF, with a score of 3.6 out of 10. The panel noted that this suggests a need for general training on RCF, and that different levels of seniority may require different training.
Q&A on Building and Sharing Data for RCF
During the Q&A session, several questions were asked, including:
- Risk Appetite: When using historical data to produce investment estimates, we must consider the risk appetite. For example, for organizations with a high failure acceptance threshold, providing RCF with an extremely high degree of certainty may not be appropriate.
- Data Trusts: A key part of the solution to ensuring that we have the right data will be the provision of data trusts. An organization that cracks this and provides reference class data to different projects could have a disruptive effect on the business case estimating process.
- PM Involvement: Cultures and behaviours will form a key part of the challenge in changing how we use data. One example given was ensuring the project manager is involved in the sales stage in commercial entities, effectively disbanding the sales and bid team and replacing them with a project manager as the lead. Mechanisms that incentivize the right behaviours need to be created, such as ensuring people have undertaken the relevant training.
- RCF at Task Level: The panel generally agreed that RCF could apply at the task level, but this would be an engineered approach to RCF. It was discussed that it is still important to get the outside view at the tier-one project or programme level, as it is only there that you can get an idea of the unknown unknowns and the outside view.
- Choosing the Right Reference Class: The panel discussed that the PM could be the custodian of data. One challenge would be ensuring the right classes of data are used. They noted that we do not always need exactly the same class of data, and we can look for examples based not just on the project type but on the project environment. They also emphasized that data trusts would be key in choosing the right reference class (a sketch of how such a selection might look follows this list).
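As a thought experiment, the following sketch (continuing the Python example above) illustrates how a shared store of project records, such as a hypothetical data trust, might be filtered down to a suitable reference class, relaxing the match on project type when an exact match would leave too few records. The record fields and example values are assumptions for illustration, not a real schema.

from dataclasses import dataclass


@dataclass
class ProjectRecord:
    name: str
    project_type: str      # e.g. "rail", "IT", "building"
    environment: str       # e.g. "urban brownfield", "greenfield"
    estimated_cost: float
    actual_cost: float

    @property
    def overrun_ratio(self) -> float:
        return self.actual_cost / self.estimated_cost


def select_reference_class(records, project_type=None, environment=None, minimum_size=5):
    """Filter a shared data set down to comparable past projects.

    If matching on both project type and environment leaves too few records,
    fall back to matching on environment alone, reflecting the panel's point
    that the class need not match the project type exactly.
    """
    matches = [r for r in records
               if (project_type is None or r.project_type == project_type)
               and (environment is None or r.environment == environment)]
    if len(matches) < minimum_size and project_type is not None:
        return select_reference_class(records, None, environment, minimum_size)
    return matches


# Usage: pick a class from the (hypothetical) data trust and feed the overrun
# ratios into the rcf_uplift function sketched earlier, for example:
# overruns = [r.overrun_ratio for r in
#             select_reference_class(data_trust, "rail", "urban brownfield")]

Which percentile to read from that class is then a governance choice tied to risk appetite, as noted in the first bullet above.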
Conclusion
Enabling RCF through building and sharing data is critical to making better investment decisions.
Q3 - How can we make data used for investment decisions available to delivery teams?
Investment decisions are critical to the success of any project. However, there is often a disparity between what sales teams estimate in tender responses and what is actually delivered by the delivery teams in terms of time and cost. To bridge this gap and ensure that delivery teams have access to accurate data, project management experts recently discussed the topic in a panel session. In this write-up, we summarize the key insights and recommendations shared by the panellists.
Aligning Sales and Delivery Teams
James, one of the panellists, emphasized the need to align sales and delivery teams to ensure continuity. He suggested that the project manager should be responsible for both bid delivery and project delivery, with the sales team reporting to the project manager. By doing so, the project manager can ensure that the estimate provided by the sales team aligns with what the delivery team can actually deliver in terms of time and cost.
Quantifying Project Data
James also stressed the importance of projects being data-designed and data-enabled, with everything in the project measured. He recommended quantifying as many parts of a project as possible to inform project estimates and develop accurate business cases. By leveraging the experience from hundreds or thousands of projects, project managers can chart the right course with an accurate estimate. He returned to the example of the sat-nav and emphasized that data-driven decisions can help avoid detours and reach the destination more efficiently.
Ensuring Availability of Foundation Data
Mike, another panellist, highlighted the challenge of disseminating foundation data. He recommended that the Association for Project Management (APM) play a key role in ensuring that data is disseminated and made available, with feedback loops in place.
Governing Data Quality and Training
Jonathan, the third panellist, emphasized the need for policies that govern the quality of data. He suggested identifying data leads who can help those using data and provide transparency. He also recommended that all stakeholders receive training on understanding data and avoiding misrepresentation. He stressed the importance of being clear on the risks and having policies in place to mitigate them.
Mentimeter Analysis
When asked about the transparency of data from the business case stage to the delivery team, the audience responded neutrally, with a score of 3.1 out of 5. This suggests that organizations have a long way to go to make business case information available to delivery teams.
Q&A
The panel also discussed the use of RCF as a tool for forecasting. While the panel acknowledged that RCF has been around for a while, they noted that it has not been used enough because the data required for mass use has not been made available. They recommended that a variety of techniques be used, and RCF should not be the only tool.
Lessons Learned
The panel also discussed the importance of lessons learned, a process which often does not happen in the right way. They recommended better corporate benchmarking and putting principles in place to ensure that lessons learned are captured. They also stressed the importance of capturing lessons learned at the beginning and end of the project and having a feedback loop in place. They noted that a project should not be allowed to close until its actual data has been captured.
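To illustrate that feedback loop, here is a minimal sketch (again using assumed field names rather than any real system) of a close-out gate that refuses to mark a project closed until its actuals and lessons learned have been captured back into the shared reference data.

from dataclasses import dataclass, field
from typing import Optional


@dataclass
class CloseOutRecord:
    project_name: str
    actual_cost: Optional[float] = None
    actual_duration_months: Optional[float] = None
    benefits_realised: Optional[float] = None
    lessons_learned: list = field(default_factory=list)

    def missing_items(self) -> list:
        """Actuals still outstanding before closure can be approved."""
        missing = []
        if self.actual_cost is None:
            missing.append("actual cost")
        if self.actual_duration_months is None:
            missing.append("actual duration")
        if self.benefits_realised is None:
            missing.append("benefits realised")
        if not self.lessons_learned:
            missing.append("lessons learned")
        return missing


def approve_closure(record: CloseOutRecord, reference_data: list) -> bool:
    """Close the project only once its actuals can feed the reference class."""
    outstanding = record.missing_items()
    if outstanding:
        print(f"Closure blocked for {record.project_name}: missing {', '.join(outstanding)}")
        return False
    reference_data.append(record)  # the feedback loop into the shared data set
    return True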
Conclusion
The panellists shared valuable insights and recommendations on making investment data available to delivery teams. To ensure that projects are delivered on time and within budget, project managers must align sales and delivery teams, quantify project data, ensure the availability of foundation data, govern data quality and training, and capture lessons learned. By adopting these best practices, organizations can achieve better project outcomes and deliver value to their stakeholders.
Q4 - Is it acceptable for a project to fail, when the stakeholders knowingly held the information which could have prevented its failure?
In the final section of our event, we discussed whether it is acceptable for a project to fail when stakeholders knowingly held information that could have prevented its failure. This question sparked an interesting debate among our panellists and audience members, who shared their thoughts on the matter.
Response One: Failure is Not an Option
James was the first to respond to this question, stating unequivocally that failure is not acceptable, particularly when stakeholders knowingly hold information that could have prevented it. He stressed that we need to address the culture around using data, emphasizing that failure should never be an option. To achieve this, we must get much better at focusing on lessons learned and actually learning from times where we didn't meet our targets.
Response Two: Projects are Too Big to Fail
Mike responded to the question by acknowledging that one of the problems is that projects are often too big to fail. He pointed out that while benefits are usually identified, they are not always detailed enough. Mike's response highlighted the need for a more rigorous approach to project management, particularly when it comes to assessing the risks and benefits of a project.
Response Three: Transparency and Risk Synthesis are Key
Jonathan responded to the question by emphasizing the importance of transparency and changing the culture around using data. He suggested that we need to be more effective at synthesizing risk across projects, programs, and portfolios so that risks are not hidden, knowingly or unknowingly. He also highlighted the need to look at smart contracting and procurement mechanisms on major projects.
Review of the Audience Response
Finally, we reviewed the audience response to the Mentimeter poll, which revealed that the questions relating to strategic misrepresentation and being economical with the truth scored the lowest. The audience gave the lowest score of the day (2.3 out of 5) when asked whether their organization's investment committees recognize strategic misrepresentation as an issue. When asked whether they feel confident in their ability to mitigate strategic misrepresentation, a score of 2.6 out of 5 suggested that the audience lacks confidence in this area.
Conclusion
In conclusion, our event sparked a fascinating discussion on the topic of project failure and the role of stakeholders in preventing it. Our panelists and audience members all agreed that a more rigorous and transparent approach to project management is required, particularly when it comes to assessing risks and benefits. They also highlighted the need to change the culture around using data and to be more effective at synthesizing risk across projects. Ultimately, we need to be more proactive in learning from past failures and in taking steps to prevent them from happening in the future.
Summary Conclusion
In summary, the panel discussions highlighted the importance of setting realistic expectations and developing robust business cases for investment decisions, utilising historical project performance data to enable accurate estimating. RCF adoption was assessed as low by the audience, despite their understanding of the importance and value of the technique. Building and sharing data is critical to making better investment decisions, and making investment data available to delivery teams requires alignment, quantification, governance, quality, training, and capturing lessons learned. The discussions emphasized the need for a more rigorous and transparent approach to project management, changing the culture around using data, synthesizing risks across projects, and being more proactive in learning from past failures to prevent them in the future.