Root Cause Analysis Program Dashboard
Originally published 2/27/2018 at www.sologic.com
Like any other process, your root cause analysis (RCA) program requires periodic reporting to help you stay on track. Developing an RCA Program Dashboard can help. Here are some ideas:
RCA Volume:
For a given period, track the following:
- RCAs Opened
- RCAs Closed
- RCAs In Process
The numbers of RCAs opened and closed span the entire period. Some RCAs closed during the period will have been opened in previous periods, and some RCAs opened in the current period will still be in process at the end, to be closed in a future period. But the “closing ratio” of opens to closes provides insight into the resources required. For instance, if you are opening more RCAs than you are closing, you may have a resource constraint. Examining the closing-ratio trend over time will tell you whether your caseload is a temporary anomaly or a new normal, providing insight into staffing requirements. The number of RCAs in process provides a point-in-time look at what is currently in the pipeline and an indication of throughput velocity. For instance, if you opened 15 RCAs in a month, closed 18, and have a current in-process count of 50, you will want to take a closer look at what is (or is not) going on!
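The arithmetic above can be sketched in a few lines. This is a minimal, illustrative example using the figures from the paragraph (15 opened, 18 closed, 50 in process); the variable names and the backlog burn-down estimate are assumptions, not part of the original dashboard:

```python
# Hypothetical monthly RCA volume figures (illustrative only).
opened, closed, in_process = 15, 18, 50

# Closing ratio of opens to closes; a value above 1.0 means
# the backlog grew during the period.
closing_ratio = opened / closed

# Rough estimate of how many periods it would take to clear the
# in-process backlog at the current net closure rate (an assumption;
# real throughput will vary period to period).
net_closed_per_period = closed - opened
periods_to_clear = (in_process / net_closed_per_period
                    if net_closed_per_period > 0 else float("inf"))

print(f"Closing ratio: {closing_ratio:.2f}")                # 0.83
print(f"Periods to clear backlog: {periods_to_clear:.1f}")  # 16.7
```

With 50 RCAs in process and a net closure rate of only 3 per month, the backlog would take well over a year to clear — exactly the kind of signal that warrants a closer look.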
Return on Investment (ROI):
We track three numbers to report ROI:
- Problem Impact: The total financial impact of all problems.
- Cost of Solutions: The total cost of all approved solutions.
- Cost of Investigation: The total cost of investigating problems.
These figures should be documented for each RCA. For periodic reporting, aggregate the figures for all RCAs closed during the period and compare them. The result should look something like this graph:
One way to read this graph:
“We experienced $5,500,000 in losses in Q4, we authorized an additional $300,000 to mitigate future similar losses, and we spent $50,000 investigating to make sure that our solutions were really the best things we could do.”
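The aggregation behind that reading can be sketched as follows. The per-RCA records here are hypothetical (two closed RCAs chosen so the totals match the example figures above); field names are illustrative:

```python
# Hypothetical per-RCA figures for RCAs closed during the quarter.
closed_rcas = [
    {"impact": 4_000_000, "solutions": 180_000, "investigation": 30_000},
    {"impact": 1_500_000, "solutions": 120_000, "investigation": 20_000},
]

# Sum each of the three ROI figures across all closed RCAs.
totals = {
    key: sum(rca[key] for rca in closed_rcas)
    for key in ("impact", "solutions", "investigation")
}

print(f"Problem Impact:        ${totals['impact']:,}")         # $5,500,000
print(f"Cost of Solutions:     ${totals['solutions']:,}")      # $300,000
print(f"Cost of Investigation: ${totals['investigation']:,}")  # $50,000
```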
Be sure to include qualitative information as well. Some problems have negative impacts that don’t translate into money; these should also be reported. For example:
“After spending $25,000 establishing the ‘New Employee Mentor’ program, all employees and supervisors report a noticeable and positive impact on morale.”
Event Location, Type, Frequency, and Impact:
It’s important to document where events occurred as well as frequency and severity. The following table is one way to report this information:
You can do the same with event Type:
These are simple tables, but they do a good job of showing both count and impact in the same report section.
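A table like that can be produced directly from the event records. This is a minimal sketch, assuming a simple list of event dictionaries; the location names and dollar figures are hypothetical, and the same pattern works for event type:

```python
from collections import defaultdict

# Hypothetical event records (illustrative locations and impacts).
events = [
    {"location": "Plant A", "impact": 250_000},
    {"location": "Plant A", "impact": 75_000},
    {"location": "Plant B", "impact": 1_200_000},
]

# Tally count and total financial impact per location.
counts = defaultdict(int)
impacts = defaultdict(int)
for event in events:
    counts[event["location"]] += 1
    impacts[event["location"]] += event["impact"]

# Print a simple count-and-impact table, highest impact first.
print(f"{'Location':<10}{'Count':>7}{'Impact':>14}")
for loc in sorted(counts, key=impacts.get, reverse=True):
    print(f"{loc:<10}{counts[loc]:>7}{impacts[loc]:>14,}")
```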
Solution Effectiveness:
It’s possible to report on solution effectiveness by mapping each solution from the period to some form of effectiveness hierarchy. We like the classic Hierarchy of Controls chart, but you can probably think of a few ways of scoring solution effectiveness. The hierarchy of controls chart we use looks like this:
Here is a brief description of each level, using automobile safety as an example:
- Elimination: Removes the hazard altogether. Not getting into a car at all is the safest way to avoid being injured in one.
- Substitution: Substituting one thing for another less-risky thing. Traveling by train substitutes one mode of travel for another.
- Engineering: Engineering a less-risky option. Since the first seatbelts were introduced, cars have been engineered to be less risky in a variety of ways.
- Administrative: Using rules, regulations, procedures, and training to reduce risk. Outlawing the use of a mobile device while driving is an administrative control.
- Behavior: Getting people to behave in a less-risky way. The choice to not use a mobile device while driving is a behavior modification.
- Protection: Personal protective equipment is the last line of defense. Seat belts are a form of personal protective equipment in cars. Race car drivers wear 5-point harnesses, helmets, head/neck restraints, and fire-resistant clothing.
Solutions from lower levels of this chart are harder to maintain over time and are therefore generally less effective. We like to challenge investigation teams to find higher-value, creative solutions that are more effective without breaking the bank. The table below is one way of reporting solution effectiveness:
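One simple way to build such a table is to tag each approved solution with its hierarchy level and tally the results. The solution list below is hypothetical; the level names follow the hierarchy described above, ordered most effective first:

```python
from collections import Counter

# Hierarchy of Controls levels, most effective first.
LEVELS = ["Elimination", "Substitution", "Engineering",
          "Administrative", "Behavior", "Protection"]

# Hypothetical hierarchy-level tags for solutions approved this period.
solutions = ["Engineering", "Administrative", "Administrative",
             "Protection", "Substitution"]

# Count how many solutions landed at each level of the hierarchy.
tally = Counter(solutions)

print(f"{'Level':<16}{'Solutions':>10}")
for level in LEVELS:
    print(f"{level:<16}{tally[level]:>10}")
```

A period where most solutions cluster at the Administrative, Behavior, and Protection levels is a cue to challenge teams toward higher-level controls.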
Developing a simple RCA program dashboard is neither difficult nor time-consuming, particularly considering the value it provides managers. The time it takes to produce this report can be reduced significantly with good RCA software tools that automate the reports and graphs. We encourage you to borrow, adjust, amend, and improve on the example dashboard elements above. If you have ideas, we would love to hear them in the comments below.
Comments:
- (IT service management practitioner, 7 years ago): “The only challenge I see is with the ROI. We may not always be in a position to derive the cost of all the problems, the cost of investigation, and the cost of the solutions, for various reasons: the technical teams could be part of internal shared services, the IT services may be outsourced, we may not have well-defined hourly or daily rates for the technical teams involved, or we may have raised a problem and started the RCA just to improve overall performance and user experience rather than in response to full downtime. Everything you said is absolutely correct; I am only trying to highlight the practical challenges.”
- (Safety engineer, U.S. Department of Energy, 7 years ago): “Brian, it would be awesome if you became a speaker in our MBA program.”
- (Aseptic fill consultant, 7 years ago): “Suggest adding recurrence. Is the same problem being seen again under the same or different circumstances? If so, this may be due to the RCA coming to the wrong conclusion, an ineffective corrective action, or too narrow a scope being considered.”