Data Sanity (Part 4 of 5): "Adopt the New Philosophy" (Deming Point 2) and "Institute Leadership" (Point 7) to STOP Unwitting Destruction

[For those of you who missed the background: Part1, Part2 and Part3. Here is Part5]

[Note from Davis: This post is slightly longer than usual, but I designed it to be a breezy read. I have no doubt you will have an uncanny sense of deja vu while reading it. The extent and seriousness of this huge, mostly hidden, problem demands nothing less than a thorough exposure in one sitting to get your attention. You also have permission to re-read it :-) ]

There is a penalty for ignorance. We are paying through the nose -- Dr. Deming

"Unknown or Unknowable?" The biggest opportunity of which hardly anyone seems aware...or knows how to deal with effectively. 

According to Mark Graham Brown (from his book Keeping Score), 50 percent of time leaders spend in meetings involving data is waste, 80 percent of the pounds (kilos) of published financial data is waste, 60 percent of the pounds (kilos) of published operational data is waste, and middle management wastes one hour a day poring over these reports. Do you have any idea what this costs -- but does it even matter?

In meetings involving leaders and middle managers, there seems to be a virtually universal code of managerial laws for using operational data. It also has a unique vocabulary that implicitly requires the ubiquitous use of the words "tough," "stretch," "low-hanging fruit," "accountability," and the phrase "I don't like these numbers," as well as the two tantrums "Find out what happened here!" and "What are you going to do about it?!" as often as possible. 

Nine Universal Leadership/Managerial Laws of Data Display and Use...

..."Perfectly Designed" to Create Confusion, Conflict, Complexity, and Chaos...and increase Costs

Does something magical happen in January (or at the beginning of any fiscal year)? On January 2nd, most of us come to work at the same organization that existed on December 31st. But:

(1) As of January 1, the past no longer exists except as yearly averages, with two appropriate exceptions:

   (1a) the current year-to-date figure can be compared only with the exact same time span of the previous year

  • Trap: assuming seasonality and treating any differences as special causes

   (1b) the current month's performance can be compared only with this same month's performance 12 months ago.

  • Trap: assuming each month is a special cause and treating the two months' year-over-year difference as a special cause
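
A quick way to feel the size of this trap: the following is a minimal Python sketch (mine, not from the article) that assumes a completely stable, made-up process -- monthly results averaging 100 with a standard deviation of 10 -- and asks how big the "this month versus the same month last year" gap routinely gets when nothing has changed.

```python
import random

# Hypothetical stable process: every month is drawn from the same
# distribution -- all common cause, nothing ever "happened."
random.seed(1)
MEAN, SD = 100.0, 10.0          # made-up process average and spread

diffs = []
for _ in range(10_000):
    this_month = random.gauss(MEAN, SD)
    same_month_last_year = random.gauss(MEAN, SD)
    diffs.append(this_month - same_month_last_year)

diffs.sort()
print("Typical year-over-year 'change' when NOTHING changed:")
print(f"  middle 95% of differences: {diffs[250]:.1f} to {diffs[9750]:.1f}")
# Roughly +/- 2 * sqrt(2) * SD -- about +/- 28 here -- and all of it noise.
```

Under those assumptions, year-over-year swings of 25 to 30 in either direction are routine -- exactly the kind of gap that triggers "Find out what happened here!"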

(2) If at all possible, the data should be presented in tabular form, on which we will draw little circles around any numbers we don't like.

   (2a) (Optional): Put "Why?!" or "What happened?!" in red ink next to the circle and mail it to the appropriate person.

  • Trap: Human variation in perception of and response to variation, treating it all as special cause -- plus unnecessary time spent arguing over whose circles are most important and/or coming to consensus about which are vitally important for action now (aka MBLC: management by little circles).

Balestracci's Profound Law of Numbers:  Given a set of numbers, one will be the largest, one will be the smallest, 10 percent will be the top (and bottom) 10 percent, and 25 percent will be the top (and bottom) quartiles -- and the biggest difference between any two months will be the biggest difference since...the last biggest difference that happened.

I wonder whether a better explanation of MBLC might be: management by literal chaos (or confusion or complexity or conflict)?

[Image]

Where would you put your circles?
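
To watch the Profound Law of Numbers do its guaranteed work, here is a minimal sketch (mine, with made-up Python data): 20 hypothetical departments, every one of them drawn from the very same stable process.

```python
import random

# 20 hypothetical departments, one month of results, ALL drawn from the
# same stable process -- nothing is special about any of them.
random.seed(2)
results = {f"Dept {i + 1}": random.gauss(50, 5) for i in range(20)}

ranked = sorted(results.items(), key=lambda kv: kv[1])
worst_name, worst_value = ranked[0]
best_name, best_value = ranked[-1]

print(f"Guaranteed 'worst': {worst_name} ({worst_value:.1f})")
print(f"Guaranteed 'best':  {best_name} ({best_value:.1f})")
print("Bottom quartile (5 of 20 -- by definition, every single month):")
for name, value in ranked[:5]:
    print(f"  {name}: {value:.1f}")
# Circle these and demand explanations, and you are managing pure noise.
```

Every month, by definition, someone is "worst," someone is "best," and five departments fill the bottom quartile. Little circles guaranteed.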

(3a) For important data, we may need an occasional graph of no more than the last 24 months using one or more of the three following displays: (1) separate yearly averages superimposed on the running 24-month record and trend line(s) added whenever possible, (2) plotted year-over-year by month in a "copulating earthworm plot" to compare differences and look for seasonality (the only exception for possibly using more than 2 years), or (3) as year-to-date side-by-side monthly bar graphs of each of the two months' performances.

(3b) More preferable and less confusing is the past 12 months (only) of data to see how we're doing -- as bar graphs with a trend line (always). 

  • Trap: More human variation in reaction to such nonsensical displays. I did not make up the following figure -- a display of a number that made people sweat:

 

[Image]

What are you supposed to DO with this?
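
Purely as an illustration with made-up data (nothing to do with the original figure), here is what "always add a trend line" does to 12 months of common-cause numbers: ordinary least squares dutifully reports a slope for pure noise.

```python
import random
import statistics

# 12 months of made-up data from a stable process: no trend exists.
random.seed(3)
months = list(range(1, 13))
values = [random.gauss(100, 10) for _ in months]

# Ordinary least-squares slope, computed by hand.
mx, my = statistics.mean(months), statistics.mean(values)
slope = (sum((x - mx) * (y - my) for x, y in zip(months, values))
         / sum((x - mx) ** 2 for x in months))

print(f"Fitted 'trend': {slope:+.2f} units per month")
# The slope is never exactly zero, so a "trend" is always found --
# up or down -- even though these numbers contain no trend at all.
```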

(4) When displaying financial data, use rolling averages whenever possible.

  • Trap: Common cause data can exhibit strong evidence of very obvious special causes that don't exist! To demonstrate, here are time plots of the exact same data: (top plot) randomly generated data that has no special causes; (middle plot) its rolling averages of 4 (e.g., analogous to the commonly used 4-quarter rolling average); (bottom plot) its rolling averages of 12 (e.g., equivalent to the common 12-month rolling average, often used in calculating "days outstanding accounts receivable").

 

[Image]

Yes...the exact same data!
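
For the skeptical, here is a rough Python re-creation of that demonstration (my sketch, with freshly generated random data rather than the figure's): common-cause data smoothed with rolling averages of 4 and 12, compared on how often each series actually reverses direction.

```python
import random

def rolling_mean(series, window):
    """Plain rolling average: mean of each consecutive `window` values."""
    return [sum(series[i:i + window]) / window
            for i in range(len(series) - window + 1)]

def reversal_rate(series):
    """Fraction of steps where the series switches between rising and falling."""
    ups = [b > a for a, b in zip(series, series[1:])]
    flips = sum(a != b for a, b in zip(ups, ups[1:]))
    return flips / (len(ups) - 1)

random.seed(4)
rates = {"raw data": [], "rolling avg of 4": [], "rolling avg of 12": []}
for _ in range(2_000):
    raw = [random.gauss(100, 10) for _ in range(48)]   # no special causes
    rates["raw data"].append(reversal_rate(raw))
    rates["rolling avg of 4"].append(reversal_rate(rolling_mean(raw, 4)))
    rates["rolling avg of 12"].append(reversal_rate(rolling_mean(raw, 12)))

for name, vals in rates.items():
    print(f"{name:>17}: reverses direction {sum(vals) / len(vals):.0%} of the time")
# Raw common-cause data zig-zag (about two-thirds of steps reverse direction);
# the rolling averages reverse only about half the time, so they drift in
# long, smooth runs that the eye happily reads as trends that aren't there.
```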

(5) The difference between this month's performance and last month's performance might need to be explained, especially if trending in the wrong direction by "too much."

  • Trap: once again treating the difference and perceived trend as a special cause due to human variations in perceptions of the exhibited variation and how large it "should" be.

(6) The performances of this month, last month, and same month 12 months ago give an idea of the overall trend and may need a trend line so we can compare it with last month's trend. Then we can update our projection of year end performance.

  • Trap: once again treating common cause as special cause and, additionally, a 33 percent risk of calling three data points a trend -- either all going up or all down -- when it isn't.
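
Where does that 33 percent come from? Three values can arrive in 3! = 6 possible orders, and exactly 2 of those orders are monotone. A tiny simulation with made-up random numbers (a sketch, not the article's) confirms it:

```python
import random

# How often do three consecutive common-cause values happen to be
# strictly increasing or strictly decreasing -- i.e., get called a "trend"?
random.seed(5)
trials = 100_000
monotone = 0
for _ in range(trials):
    a, b, c = (random.random() for _ in range(3))
    if a < b < c or a > b > c:
        monotone += 1

print(f"'Trend' declared in {monotone / trials:.1%} of trials")
# Two of the six possible orderings are monotone, so pure noise gets
# labelled a trend about 2/6 = 33 percent of the time.
```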

(7) If possible, convert a table of numbers to its traffic light equivalents. Any green indicator is fine. We will discuss who should get recognition; but, more preferably, one of these tougher strategies should be considered to get even better results: 

   (7a) (Optional): Use a tough reward process to stretch them further: If they get [pick a number] greens-in-a-row, we will tell them "Send out for pizza and send us the bill"...and then stretch their "red, yellow, green" endpoints

   (7b) (Optional): Set a standard that no more than [pick a number] months in-a-row can be non-green and will require a special report, as will [pick a number] reds-in-a-row.

   (7c) (Optional...and very tough): For very important numbers, show them we mean business! (e.g., customer satisfaction survey results): go around as a leadership group weekly or monthly and plant a red, yellow, or green flag in each department based on their most recent result [DB: I didn't make this up!]

  • TRAP: high risk of treating common cause as special...and destroying cultural morale (and any remaining respect for the leaders)
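
To put a rough number on this trap, here is a sketch with entirely made-up ingredients -- a stable process averaging 95 with a standard deviation of 10, scored against arbitrary cut points of green at 100 or better and red below 90 -- asking how often the "special report" rules of (7b) fire when nothing whatsoever is wrong.

```python
import random

# Hypothetical stable process (made-up mean 95, sd 10) scored against
# arbitrary traffic-light cut points: green >= 100, red < 90, else yellow.
random.seed(6)

def colours_for_year():
    values = [random.gauss(95, 10) for _ in range(12)]
    return ["green" if v >= 100 else "red" if v < 90 else "yellow"
            for v in values]

def has_run(colours, wanted, length, complement=False):
    """True if `length` consecutive months are (or, with complement=True,
    are NOT) the wanted colour."""
    run = 0
    for c in colours:
        hit = (c != wanted) if complement else (c == wanted)
        run = run + 1 if hit else 0
        if run >= length:
            return True
    return False

trials = 20_000
years = [colours_for_year() for _ in range(trials)]
non_green = sum(has_run(y, "green", 3, complement=True) for y in years)
reds = sum(has_run(y, "red", 3) for y in years)

print(f"3 non-greens in a row ('special report!'): {non_green / trials:.0%} of years")
print(f"3 reds in a row       ('special report!'): {reds / trials:.0%} of years")
# The process never changed, so every report these rules trigger is a
# reaction to pure common cause.
```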

(8) Current month and year-to-date performances are compared to goals and recorded as variances.  

   (8a) (Optional: getting tough): The [arbitrary percentage] of people having the largest variances will need to write a special report about what they're going to do about it and present their results to us next month.

  • Trap 1: treating all variances as special cause and choosing an arbitrary percentage of people as a cutoff for needing explanations -- assuming they are special causes when many are probably not.
  •  Trap 2: time wasted with people preparing these nonsense reports and unnecessarily presenting them at the next meeting. 
  • Trap 3: at this subsequent meeting once again allowing human variation on perceived variation to demand questionable actions based on these reports, especially asking, "Where is your low-hanging fruit?"

(9) All goals must end with a "0" or a "5" with one exception: for what we know to be an impossible situation, we will ask for only a 3 percent stretch. 

  • Trap: Using goals to motivate. Treating individual differences between current performances and their goals as special causes. No use of an I-chart to gain knowledge of the process's actual performance vis-a-vis any goal to suggest that maybe a common cause strategy is required.
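
For contrast, here is roughly what gaining that knowledge takes -- a minimal I-chart sketch with made-up monthly data and a made-up goal, using the conventional limits of the average plus or minus 2.66 times the mean moving range.

```python
# Minimal I-chart (individuals chart) sketch with made-up monthly data.
data = [52, 47, 55, 49, 51, 58, 46, 50, 53, 48, 54, 50]   # hypothetical results
goal = 45                                                  # hypothetical goal

average = sum(data) / len(data)
moving_ranges = [abs(b - a) for a, b in zip(data, data[1:])]
mr_bar = sum(moving_ranges) / len(moving_ranges)

# Conventional I-chart limits: average +/- 2.66 * (mean moving range).
upper = average + 2.66 * mr_bar
lower = average - 2.66 * mr_bar

print(f"Process average:        {average:.1f}")
print(f"Natural process limits: {lower:.1f} to {upper:.1f}")
print(f"Goal of {goal} is "
      f"{'inside' if lower <= goal <= upper else 'outside'} those limits.")
```

If the goal sits inside the natural limits, individual months will meet or miss it by chance alone; if it sits outside them, the current process cannot deliver it at all. Either way, the chart says the same thing: a common cause strategy -- changing the process -- is what's required, not interrogating monthly variances.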

==========================================================

Not only do all of these laws apply to the everyday use of routine data, but they really come into play during the dreaded, tiresome annual ritual of the budget. 

  • Trap: treating every year as a special cause...and taking up to 30 percent of people's time with creating, adjusting, re-adjusting, and routine cost-cutting meetings throughout the year.

To summarize the use of these laws of data and display:

[Image]

Great gig for a six or seven figure salary, eh?

Isn't all this nonsense merely a common cause symptom of a much, much deeper systemic problem -- data INsanity? 

"Unknown or Unknowable?" -- perhaps, but who needs figures to see this widespread organizational cancer as a staggering cost?

Might data INsanity and its toxic consequences be the root cause of Dr. Deming's disgust with American management?

How many tools have I used so far?

"Rewind...and plot something important over time"

Next time: wrap-up and a sobering challenge. (Part 5)

Mark Anderson

Researcher & Analyst

2 days ago

I've lived through it all (my biggest career accomplishment).

Debbie George

Business Development Executive South West

7 years ago

Karen Lodge, Marie Feeney. Any of this sound familiar???

Krae Stumpf MBA PMP CLSSBB

Healthcare Revenue Cycle Management

7 years ago

Sometimes pushed into them, other times pulled into them, and sadly, a few times of my own volition, I have fallen into the "traps" illustrated here. Thank you... wonderful reminder.

Mark Graban

I help organizations and leaders drive continuous improvement and spark innovation through Lean management, building a culture of learning from mistakes, and fostering psychological safety. 3 Shingo Book Awards.

7 years ago

Statistical literacy is not a luxury. It should be a must for any organization. Learning better methods isn't hard... but unlearning "the way we've always managed" is a real challenge. "But we have to label data points as red and green." Do you? "But we have to hold people accountable!" Oh, you mean blaming them? You HAVE to do that? Read more from Davis or read Don Wheeler's "Understanding Variation" to learn more. Better managing metrics isn't calculus or rocket science... it's basic arithmetic.

Just the blog I needed, Davis, as a refresher and reminder: plot the dots over time, institute the model for improvement, then plot the dots again and ask whether an improvement has been made.

