A Short Primer on Statistical Process Control

It is only in the state of statistical control that statistical theory provides, with a high degree of belief, prediction of performance in the immediate future.

W. Edwards Deming, 1993

A Little History

The most crucial breakthrough in the modern quality movement came in 1931 with the publication of Walter Shewhart’s Economic Control of Quality of Manufactured Product. Shewhart became the first “to recognize that variability was a fact of industrial life and that it could be understood (and managed) using the principles of probability and statistics”. Shewhart gave the quality movement its theoretical underpinnings when he defined the problem of managing quality as one of differentiating between acceptable variation, or common causes, and unacceptable variation, or special causes. Both kinds of causes produce variation. However, Shewhart recognized that it is possible to live with common causes because, while they can never be fully eliminated, they allow the process to function with a predictable level of variation. Thus, a company that has only common causes to contend with in its processes will produce products of a predictable level of quality to which it can peg warranties, product claims, and prices. Special causes, however, are by definition unpredictable, can wreak havoc with a process, and offer management no basis on which to predict the quality level of its products or manage change.[1]

In 1924 Shewhart developed a diagram that showed whether a process was stable and measured the capability of the process over time. The diagram contained the methodology for creating a control chart, to be used by Western Electric technicians. To use a control chart, a technician would sample a production process at regular intervals and plot the results on a chart. Using a mathematical formula developed by Shewhart, he could establish boundaries of variation for a given process within which it is considered stable, predictable, and in control. Outside of those boundaries the process is deemed unstable, unpredictable, and out of control.
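The limit calculation Shewhart’s method implies can be sketched in a few lines of code. The version below is a minimal individuals (XmR) chart, which derives three-sigma limits from the average moving range using the standard 2.66 constant; the weekly defect counts are invented purely for illustration.

```python
# Compute natural process limits for an individuals (XmR) control chart.
# The constant 2.66 converts the average moving range into three-sigma limits.

def xmr_limits(samples):
    """Return (center, lower, upper) control limits for a list of measurements."""
    center = sum(samples) / len(samples)
    # Moving range: absolute difference between consecutive measurements.
    moving_ranges = [abs(b - a) for a, b in zip(samples, samples[1:])]
    avg_mr = sum(moving_ranges) / len(moving_ranges)
    upper = center + 2.66 * avg_mr
    lower = max(0.0, center - 2.66 * avg_mr)  # a count cannot fall below zero
    return center, lower, upper

# Hypothetical weekly defect counts sampled from a process:
weekly_defects = [12, 14, 13, 15, 11, 14, 16, 13, 12, 15]
center, lcl, ucl = xmr_limits(weekly_defects)
print(f"center={center:.1f}, LCL={lcl:.1f}, UCL={ucl:.1f}")
```

A point plotted outside the computed limits signals a likely special cause worth investigating; points inside them reflect ordinary common-cause variation.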

Shewhart’s techniques were used extensively in World War II and again in more recent years by W. Edwards Deming and others as a basis for improving product quality, both in Japan and in many US companies. As a result of the successes of statistical process control (SPC) in industrial settings, the techniques have been adopted for use in many other business areas. SPC differs from other process improvement approaches—such as problem management and zero defects—in that it provides more constructive help in reducing the causes of defects and improving quality. SPC techniques can be applied to software processes just as they have been to industrial processes, both to establish process stability and predictability and to serve as a basis for process improvement.

How This Relates to Organizations

Stability of an organization’s software processes for any given attribute is determined by measuring the attribute and tracking the results over time. If one or more measurements fall outside the range of chance variation, or if systematic patterns are apparent, the process may not be stable. To achieve a stable and predictable state of operation, deviations must be discovered and removed.
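Besides single points outside the limits, the “systematic patterns” mentioned above can also be checked mechanically. The sketch below implements one classic run rule (a long run of consecutive points on the same side of the center line); the run length of eight is one common convention, not the only one.

```python
def run_rule_violation(samples, center, run_length=8):
    """Detect a systematic pattern: run_length consecutive points on one
    side of the center line, one of the classic control-chart run rules."""
    streak = 0
    last_side = 0
    for x in samples:
        side = (x > center) - (x < center)  # +1 above, -1 below, 0 on the line
        if side != 0 and side == last_side:
            streak += 1
        else:
            streak = 1 if side != 0 else 0
        last_side = side
        if streak >= run_length:
            return True
    return False
```

A run of this kind suggests the process mean has shifted even though no individual point has crossed a control limit.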

Process stability is considered by many to be at the core of process management. Stability is essential to an organization’s ability to produce products according to plan and to improve processes to produce better and more competitive products.

What is stability? Recall Shewhart’s observation that almost all characteristics of products and processes display variation when measured over time.

In equation form, the concept is:

Total Variation = [Common Cause Variation] + [Assignable Cause Variation]

Common cause variation is the normal variation of the process, caused by the interaction among process components (people, machines, material, environment and methods). The results of common cause variation are random, but they vary within predictable bounds. When a process is stable, the random variations come from a constant system of chance causes. The variation is predictable and unexpected results are extremely rare.

The key word here is predictable. Predictable is synonymous with in control. Processes can vary in known, nonrandom ways and still satisfy the requirements of a controlled process. Controlling a process means making it behave the way we want it to. Control will enable an organization to do two things:

· Predict results

· Produce products that have characteristics desired by our clients

With control, we can commit to dates when products will be delivered and live up to such commitments.

To measure process performance, we must measure as many product quality and process-performance attributes as needed, doing this at several points in time to obtain sequential records of their behavior. It is these sequential records that are the basis for establishing statistical process control and, therefore, for asserting the stability and capability of the process. If a process is in statistical control, and if a sufficiently large proportion of measured results fall within specified limits, the process is termed capable. Thus, a capable process at InnovaSystems is a stable process whose performance satisfies customer requirements.
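As a rough illustration of this capability test, the sketch below judges a process capable when a sufficiently large fraction of its measured results falls inside the specification limits. The 0.99 threshold and the function name are illustrative assumptions, not standard values from the text.

```python
def is_capable(measurements, spec_lower, spec_upper, required_fraction=0.99):
    """Judge capability: a stable process whose results fall within the
    client's specification limits often enough.  The default threshold
    of 0.99 is illustrative, not a standard value."""
    within = sum(spec_lower <= m <= spec_upper for m in measurements)
    return within / len(measurements) >= required_fraction
```

In practice this check is only meaningful after stability has been established; an unstable process gives no basis for predicting the fraction of conforming output.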

Figure 1 is an example of a control chart indicating the number of reported but unresolved problems backlogged over the first 35 weeks of testing. The dark center line indicates that the problem resolution process is averaging about 14 backlogged problems. The upper control limit (dashed line) for backlogged problems is about 25, and the lower control limit is about 4.5. Note that the upper limit is exceeded at one point, indicating that there may be a problem in the problem-resolution process. There may be a particular situation that caused problems to pile up. If so, corrective action may need to be taken to return the process to its normal (characteristic) behavior. Also note that the lower limit is exceeded at one point. This may indicate exceptionally high performance and it may be worth studying the work habits employed during that period to see if the methods used could improve the system overall.
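Flagging the out-of-limit points described above is straightforward once the limits are known. The sketch below uses the approximate limits from Figure 1 (lower 4.5, upper 25) with invented weekly backlog counts.

```python
def out_of_control_points(samples, lcl, ucl):
    """Return (index, value) pairs for measurements outside the control limits."""
    return [(i, x) for i, x in enumerate(samples) if x < lcl or x > ucl]

# Hypothetical backlog counts echoing Figure 1 (limits 4.5 and 25):
backlog = [13, 15, 12, 27, 14, 16, 11, 3, 14]
print(out_of_control_points(backlog, lcl=4.5, ucl=25.0))
# flags index 3 (27, above the UCL) and index 7 (3, below the LCL)
```

Each flagged point is a candidate special cause: the high excursion may need corrective action, while the low one may reveal a practice worth copying.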

Note that these chart limits are estimates for the natural limits of the process, based on the measurements that were made. The natural process limits together with the dark center line are sometimes referred to as the “voice of the process”. The performance derived from this voice is not necessarily the performance needed to satisfy client expectations. If a stable process produces too many nonconforming products, then the process requires improvement; the variability must be reduced or the average lowered.

An important point is that performance of workers will always vary because they are human and because it is in the nature of all worldly things to be variable. Therefore, the first objective is to control the environment and the components of the system—whether they are people, tools or computers—so that the natural variability remains within certain predictable limits. The next objective is to fine-tune the environment and the components of the system to shrink the variation.


[Control chart plot: Number of Unresolved Problem Reports by week, with a center line at about 14, an upper control limit near 25, and a lower control limit near 4.5]

Figure 1: Example of a Control Chart




[1] Gabor, Andrea, The Man Who Discovered Quality: How W. Edwards Deming Brought the Quality Revolution to America, New York: Penguin Books, 1990.

