Those Numbers Are Wrong!


I am currently creating a Measurement Systems Analysis (MSA) Module for CI Mastermind. As such, it is perfect timing to write an article on the always popular topic of “those numbers are wrong!” I will cut to the chase and give you the punchline of this article. First, the numbers are rarely wrong; they just are not quite right, or not right enough. Second, whenever you present numbers, do so in the context that measurement error is always present. You need an understanding, and ideally a quantification, of just how much error exists.


As we continue the theme of collecting and utilizing data to improve our processes, let’s look at measurement and measurement systems as the people, processes, and tools we use to collect data. The output of the measurement process is data. Given that measurement is a process, it is susceptible to the same waste, variation, and errors as any other process. In short, data is almost always flawed to some degree. This is measurement error, and it is the reason why measurement system analysis is so darn important!
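

If it helps to make “quantify the error” concrete, here is a minimal sketch in Python. It assumes you have repeated readings of a reference item whose true value is known; the names and values are hypothetical, purely for illustration.

```python
# A minimal sketch of quantifying measurement error, assuming repeated
# measurements of a reference item whose true value is known.
# All names and values here are hypothetical illustrations.
import statistics

TRUE_VALUE = 10.00  # known reference standard (hypothetical)
measurements = [10.02, 9.98, 10.05, 10.01, 9.97, 10.03]  # repeated readings

mean_reading = statistics.mean(measurements)
bias = mean_reading - TRUE_VALUE            # accuracy: systematic offset
precision = statistics.stdev(measurements)  # precision: spread of repeats

print(f"Bias (accuracy error): {bias:+.3f}")
print(f"Std dev (precision):   {precision:.3f}")
# If the bias and spread are small relative to the decisions at stake,
# the data is "right enough" to act on.
```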


I cannot begin to count the number of times I have sat in a meeting with a group of people all looking at data when someone in the room blurts out, “You can’t trust those numbers; the data is all wrong!” It is even better (note my sarcasm) when someone comes up to me afterwards and shares this epiphany. Fortunately, I run into this less and less, or at least people do not try to disrupt meetings with it anymore. “I don’t trust their data” is a blunt, thoughtless way to convey skepticism about the measurement process. I like to challenge the people who make this statement with the question, “Well, how wrong is it?” There is rarely a direct answer, because the statement is usually a diversion from a difficult conversation rather than an actual concern over some decision or change being proposed.


This is a passionate subject for me, so I will move away from my rant and be constructive. Yes, the data is wrong, or really I’d prefer we say that it is not 100% correct. This is okay, because we can still make good decisions with 80% correct data. When you run into this sort of resistance tactic, your job is to take the discussion into a thoughtful decomposition of how the data was created. What are the data sources? Who collected the data? What labels are we applying in the collection process, and do we have a shared Operational Definition? This will slow the discussion down, and it may require a side meeting, but it avoids the dead-end debates of “the data is all wrong,” “we don’t know; let’s throw it all out,” or “I am going to make decisions based on my own experience and judgment” (all personal biases included, of course).


We were really good at understanding our measurement systems at Motorola. It was ingrained in the culture, and I assume they are still good at this today. It allowed us to move from shoot-from-the-hip decisions to thoughtful discussion, improvements to our measurement processes, and, ultimately, better decisions.


The behaviors I just described are why my second point on measurement system analysis is so important. The bottom line: whenever you are planning to create data and present it, YOU MUST DO THE MEASUREMENT SYSTEM ANALYSIS. It is much easier to move the discussion along when you can quickly shut down “the data is wrong.” I have experienced this in an open forum on more than one occasion. When I had my facts together, I quickly ended the debate and gained immediate credibility. This is the outcome you want.


Years ago, I had an employee perform a staffing analysis for one of my clients. Staffing analysis in administrative processes is time consuming and complex. We ask a series of specific questions, organize the responses, and often re-ask questions to validate the answers. The measurement process is heavily dependent on interviewing techniques, which create all kinds of variation. Nonetheless, I have performed enough of these to know that the mean responses are repeatable, which makes the results useful in decision making.


After about three weeks of conducting interviews, my employee came to me very concerned. She told me that the way she conducts the interview can bias the results. In fact, she stated that she could make the results fit her own bias if she wanted. I said, “I know, and I trust that you will follow the standard approach. Please follow the process. It was designed to avoid these errors.” This is the point: a standard approach will provide consistent results. Always analyze your measurement system. Identify problems in the system and the improvements that will reduce potential measurement errors. Apply the measurement process multiple times and check the results against a known standard. Continuously improve the system to reduce bias and improve precision.
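

To show what “apply the measurement process multiple times” can look like in numbers, here is a minimal sketch in Python. It is a hypothetical illustration, not the actual staffing study: two interviewers each ask the same question twice of the same respondents, and we look at within-interviewer spread (repeatability) and between-interviewer spread (reproducibility).

```python
# A minimal repeatability/reproducibility sketch. The interviewers,
# respondents, and hours-per-week answers below are all hypothetical.
import statistics

# interviewer -> one [first answer, repeat answer] pair per respondent
trials = {
    "interviewer_A": [[12.0, 12.5], [8.0, 8.5], [20.0, 19.0]],
    "interviewer_B": [[13.0, 13.5], [9.0, 9.5], [21.0, 20.5]],
}

# Repeatability: how much the same interviewer's repeat answers differ.
repeat_ranges = [max(pair) - min(pair)
                 for pairs in trials.values() for pair in pairs]
print(f"Mean within-interviewer range: {statistics.mean(repeat_ranges):.2f} hrs")

# Reproducibility: how much the interviewers' overall means differ.
means = {who: statistics.mean(x for pair in pairs for x in pair)
         for who, pairs in trials.items()}
print(f"Between-interviewer gap: {max(means.values()) - min(means.values()):.2f} hrs")
```

If the repeat answers are stable and the interviewers land close to one another, the mean responses are trustworthy enough to use, which is exactly the experience I described above.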


The easiest measurement system analysis is the MSA Drilldown. It is as simple as documenting the measurement process, including all inputs; reviewing each input and process step for errors in accuracy or precision; and then noting your findings, developing countermeasures, and improving the measurement process. I think MSA sounds complex to people, so they avoid the discussion. The MSA Drilldown is quite basic, simple, and useful. Certainly there are more powerful and complex methods, but even a simple analysis will allow you to gain the credibility you need to stay on a data-based decision-making path.
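

Because the MSA Drilldown is just a structured walk through the measurement process, even a simple worksheet captures it. Here is a minimal sketch of that structure in Python; the steps, findings, and countermeasures shown are hypothetical examples, not a prescribed template.

```python
# A minimal sketch of an MSA Drilldown worksheet as data, following the
# three steps above: document each input or step, review it for accuracy
# or precision errors, and record a countermeasure. Rows are hypothetical.
from dataclasses import dataclass

@dataclass
class DrilldownRow:
    step: str            # measurement process step or input
    error_type: str      # "accuracy", "precision", or "none found"
    finding: str         # what could go wrong, observed or suspected
    countermeasure: str  # how the measurement process will be improved

drilldown = [
    DrilldownRow("Interview question wording", "accuracy",
                 "Leading phrasing can bias responses upward",
                 "Use the standard script verbatim"),
    DrilldownRow("Response transcription", "precision",
                 "Free-text answers coded inconsistently",
                 "Agree on a shared Operational Definition for each label"),
]

for row in drilldown:
    print(f"{row.step}: {row.error_type} -> {row.countermeasure}")
```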


The staffing analysis process was something I had learned from another Black Belt. It had been applied previously at other companies, and I had practice applying it as well. It was a well-designed system but still carried many flaws. The last bit of advice I will give is to never hide the flaws. My employee feared that if she shared these flaws with the client, they would lose confidence in the outcome. In reality, the client’s confidence in our result is directly correlated to their confidence in the measurement process we are applying. They need to understand the potential flaws in our methods. When they use our data to make a decision, they need to consider the error that exists in the measurements, and the solution must mitigate this potential error. Simply stated, “I don’t want them to be overconfident.”


When my employee reported the results, I made sure that she started by educating the client on the measurement process we followed, specifically highlighting the assumptions she made and the areas where she had her own concerns. The resulting data pointed to staffing gaps and process opportunities, and both were followed up with changes. At no point did anyone say, “These numbers are wrong.” The statement could not be made, because we were transparent about the measurement process and the potential for error. All that remained was constructive discussion on the actions to take next.
