Strategies to Detect, Prevent, and Correct Causes of Project Stress and Failure
Glen Alleman MSSM
Veteran. Applying Systems Engineering Principles, Processes & Practices to Increase the Probability of Program Success for Complex Systems in Aerospace & Defense, Enterprise IT, and Process and Safety Industries
A book in my library, The Seven Secrets of How to Think Like a Rocket Scientist, rang true during the SARS-CoV-2 pandemic, when I encountered not only blatant science deniers and climate deniers, but also the standard run of disinformation.
With a hard science background, I remember a life-changing experience.
In the mid-1970s, our experiment's state-of-the-art computing was the PDP-15, a wonderful 18-bit machine. Ours had an analog-to-digital converter that took signals from the experiment and converted them to digital data. From there, the time-series data were input to a Fast Fourier Transform algorithm written in FOCAL, which produced the signal's spectrum so we could see the physical aspects of what was going on.
That spectrum could then be assessed to see whether the experimental data matched the theoretical conjectures. I was an experimentalist and had office mates who were theorists. My claim to fame in those days was the development of that FFT code.
But before the automation of the signal processing, we used a Tektronix oscilloscope and a Polaroid camera to take pictures of the events happening in the experiment.
With a handful of pictures, we'd trot down to the office of the Principal Investigator, a Nobel Laureate, and show him the findings. His first words would always be, "Nice work, boys (there were rarely women in our domain in 1975), go back and get me five more." When we returned with five more Polaroids, he'd say, "Nice work. Now take everything apart, run it all again, and bring me back five more." With that done and back in his office, he'd say, "Really good work. Now write up the full list of equipment and sensors and how to assemble it, and I'll send that to a colleague in Maryland to see if he can repeat your work."
When those pictures came back positive for the event, he'd say, "More good work. Now write up your letter to Physical Review Letters, and if the readers don't crap all over your conjecture, come back to my office; you may be on to something. But before you do that, bring in your theorist buddies and have them explain WHY you should be seeing what you're now seeing and why, having been confirmed, others are seeing it as well."
When I Hear Anyone Make Any Statement About Anything Beyond the Butt Simple Obvious
When I hear statements about anything, from particle physics to agile software development, to COVID vaccines, to landing on Mars, to "here's how we ignore the standards of project management and have a better solution," that 1970s laboratory training comes to mind, with the simplest of all questions...
You Got Any Evidence for Your Claim, Evidence That Someone Else Besides You Can Confirm?
It doesn't have to be double-blind peer-reviewed, but can you produce any tangible evidentiary materials (which is the fancy science and contractual term) that support the claim being made? No? Then please STFU!
Our world is full of unsubstantiated claims about all kinds of things: claims missing the principles on which to base them, and missing the data to support them with some reasonable statistical confidence. By the way, how many samples are needed to produce 80% confidence? Clopper-Pearson is the first place to start answering that question.
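To make the Clopper-Pearson point concrete, here is a minimal sketch, using only the Python standard library, of the Clopper-Pearson "exact" confidence interval for a binomial proportion. The function names and the bisection approach are my own; the bounds themselves are the standard ones, obtained by inverting the binomial CDF.

```python
import math

def binom_cdf(k, n, p):
    """P(X <= k) for X ~ Binomial(n, p)."""
    return sum(math.comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def clopper_pearson(k, n, confidence=0.80):
    """Two-sided exact confidence interval for k successes in n trials."""
    alpha = 1.0 - confidence

    def bisect(f):
        # f is decreasing in p on [0, 1]; find its root by bisection.
        lo, hi = 0.0, 1.0
        for _ in range(80):
            mid = (lo + hi) / 2
            if f(mid) > 0:
                lo = mid
            else:
                hi = mid
        return (lo + hi) / 2

    # Lower bound: the p where P(X >= k | p) = alpha/2
    lower = 0.0 if k == 0 else bisect(lambda p: alpha / 2 - (1 - binom_cdf(k - 1, n, p)))
    # Upper bound: the p where P(X <= k | p) = alpha/2
    upper = 1.0 if k == n else bisect(lambda p: binom_cdf(k, n, p) - alpha / 2)
    return lower, upper
```

For example, 20 successes in 20 trials yields an 80% interval with a lower bound near 0.89, while a claim backed by only a handful of samples yields an interval so wide it supports almost nothing.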
So when you do anything about anything, ask yourself: any evidence? Then ask: is this evidence credible? Then: what level of credibility does it have?
This applies to everything from software development methods, business process improvement methods, financial forecasting, and weather forecasting to Deep Inelastic Scattering of neutrinos at the Stanford Linear Accelerator.
Evidence Talks, BS Walks
Here are some useful BS-detector resources, but start with Carl Sagan's Baloney Detection Kit from his book The Demon-Haunted World, which should be mandatory reading for every teenager and adult in the nation, since we're now in a conspiracy-theory-based world.
- "Deeper into Bullshit," G. A. Cohen
- "How to Improve the Use of Metrics," Tibor Braun, Nature, Vol. 465, 17 June 2010
- "On Bullshit," Harry Frankfurt, Princeton University Press, 2005
- "On the reception and detection of pseudo-profound bullshit," Gordon Pennycook, James Allan Cheyne, Nathaniel Barr, and Derek J. Koehler, Judgment and Decision Making, Vol. 10, No. 6, November 2015, pp. 549-563
- "Storks Deliver Babies (p=0.008)," Robert Matthews, Teaching Statistics, Volume 22, Number 2, Summer 2000.
- Logically Fallacious: The Ultimate Collection of Over 300 Logical Fallacies, Bo Bennett, Archieboy Holdings, LLC, April 2015.
- Extraordinary Popular Delusions, Charles Mackay, Dover Publications, 2016.
- DIB Guide: Detecting Agile BS
Dealing with Claims with No Evidence
Starting with Root Cause Analysis of Possible Project Failure Sources and Taking Corrective and Preventive Actions to Remove Those Causes or Create Barriers to Them, Increasing the Probability of Success
Primary Causes of Project Failure
- Four Programmatic Sources of Failure
- Unrealistic performance expectations, with missing Measures of Effectiveness and Measures of Performance
- Unrealistic Cost and Schedule estimates based on inadequate risk-adjusted growth models
- Inadequate assessment of risk and unmitigated exposure to these risks with proper handling plans
- Unanticipated technical issues without alternative plans and solutions to maintain the effectiveness of the product or service
- System Engineering Sources of Failure
- Inadequate understanding of requirements
- Lack of systems engineering discipline and authority
- Lack of technical planning and oversight
- Lack of subject matter expertise
- Program Management Sources of Failure
- Incomplete, obsolete, inflexible system architectures
- Overestimates of technology maturity
- Failure to identify risks and their handling
Strategies to Prevent and Correct Sources of Program Failure
- Programmatic Strategy
- Define Effectiveness and Performance Expectations
- Definitize Cost and Schedule estimates to needed levels of accuracy and precision
- Develop Risk Management models based on Epistemic and Aleatory Uncertainties.
- Define Technical Performance Measures and Key Performance Parameters
- Systems Engineering
- Model System Requirement dependencies
- Install Systems engineering discipline
- Provide risk-based Technical Planning and Oversight
- Assure Subject Matter Expertise available for all system elements
- Program Management
- Define System and Programmatic Architectures
- Assess Technology Maturity and Readiness Levels
- Define and apply measures of Physical Percent Complete and use them to produce Estimates to Complete and Estimates at Completion.
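The last bullet can be made concrete with the standard earned-value formulas. This is an illustrative sketch (the function and variable names are my own, not from any standard tool): Physical Percent Complete yields Earned Value, and the Cost Performance Index then produces the Estimate at Completion (EAC) and Estimate to Complete (ETC).

```python
def earned_value(bac, physical_pct_complete, actual_cost):
    """bac: Budget at Completion; actual_cost: money spent to date."""
    ev = bac * physical_pct_complete   # Earned Value: budget for work actually done
    cpi = ev / actual_cost             # Cost Performance Index
    eac = bac / cpi                    # EAC, assuming current cost efficiency continues
    etc = eac - actual_cost            # ETC: what remains to be spent
    return {"EV": ev, "CPI": cpi, "EAC": eac, "ETC": etc}

# A program with a $1.0M budget, 40% physically complete, $500K spent:
status = earned_value(1_000_000, 0.40, 500_000)
# CPI = 0.8, so the EAC grows to $1.25M, with $750K still to spend.
```

Note the assumption baked into EAC = BAC / CPI: the cost efficiency to date continues for the remaining work. Other EAC formulas exist for cases where that assumption does not hold.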
Tools and Process Needed to Prevent or Correct Project Failure
- Identifying corrective actions
- Reducible impacts
- Irreducible impacts
- Identifying preventive actions
- Root Cause Analysis
- Pre-mortems
- Barriers
- Implementing corrective actions
- Implementing preventive actions
- Confirming the effectiveness of preventive action
- Confirming the effectiveness of corrective actions
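The reducible and irreducible impacts listed above can be modeled together with a simple Monte Carlo simulation. A minimal sketch, with made-up numbers for illustration: aleatory (irreducible) cost variability is drawn from a distribution, while an epistemic (reducible) risk is a discrete event with a probability and an impact.

```python
import random

def cost_at_percentile(n_trials=20_000, percentile=0.80, risk_prob=0.30, seed=1):
    """80th-percentile total cost under aleatory spread plus one epistemic risk."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n_trials):
        base = rng.triangular(90, 130, 100)           # aleatory: low, high, mode
        risk = 25 if rng.random() < risk_prob else 0  # epistemic: probability x impact
        totals.append(base + risk)
    totals.sort()
    return totals[int(percentile * n_trials)]
```

Lowering `risk_prob` (a preventive action, buying down the epistemic risk) pulls in the 80th-percentile cost; the triangular spread remains, since aleatory variability cannot be bought down, only covered with margin.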