Precision and Accuracy

by Anne Marie Helmenstine, Ph.D.

Updated December 29, 2017

Accuracy and precision are two important factors to consider when taking data measurements. Although the two terms are often used together, they describe different things: accuracy reflects how close a measurement is to a known or accepted value, while precision reflects how reproducible measurements are, even if they are far from the accepted value.

You can think of accuracy and precision in terms of hitting a bullseye.


Accurately hitting the target means you are close to the center of the target, even if all of the marks land on different sides of the center. Precisely hitting a target means all the hits are closely spaced, even if they are very far from the center of the target. Measurements that are both precise and accurate are repeatable and very near the true value.

Definition of Accuracy

There are two common definitions of the term accuracy. In math, science, and engineering, accuracy refers to how close a measurement is to the true value.

The ISO (International Organization for Standardization) applies a stricter definition, in which accuracy refers to a measurement that gives both true and consistent results. Under the ISO definition, an accurate measurement has no systematic error and no random error. Essentially, the ISO advises that the term accurate be reserved for measurements that are both close to the true value and precise.

Definition of Precision

Precision is how consistent results are when measurements are repeated.

Precise values differ from each other because of random error, which is a form of observational error. 

Examples of Accuracy and Precision

You can think of accuracy and precision in terms of a basketball player. If the player always makes a basket, even though he strikes different portions of the rim, he has a high degree of accuracy.


If he doesn't make many baskets, but always strikes the same portion of the rim, he has a high degree of precision. A player whose free throws always go through the basket in exactly the same way has a high degree of both accuracy and precision.

Experimental measurements offer another example of precision and accuracy. If you measure the mass of a 50.0-gram standard sample and get values of 47.5, 47.6, 47.5, and 47.7 grams, your scale is precise, but not very accurate. If your scale gives you values of 49.8, 50.5, 51.0, and 49.6 grams, it is more accurate than the first balance, but not as precise. The more precise scale would be better to use in the lab, provided you made an adjustment for its error.
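
As a rough sketch of how these two ideas can be quantified, the short Python snippet below (the variable names and the use of the 50.0-gram sample are purely illustrative) treats the average offset from the true value as a measure of accuracy and the spread of the readings as a measure of precision.

    from statistics import mean, stdev

    TRUE_MASS = 50.0  # grams; the known value of the standard sample

    scale_a = [47.5, 47.6, 47.5, 47.7]  # precise, but not accurate
    scale_b = [49.8, 50.5, 51.0, 49.6]  # more accurate, but less precise

    def summarize(name, readings):
        avg = mean(readings)
        error = avg - TRUE_MASS    # accuracy: offset of the average from the true value
        spread = stdev(readings)   # precision: how tightly the readings cluster
        print(f"{name}: mean = {avg:.2f} g, error = {error:+.2f} g, spread = {spread:.2f} g")

    summarize("Scale A", scale_a)
    summarize("Scale B", scale_b)

Run on the numbers above, the first scale shows a very small spread but an offset of roughly 2.4 grams, while the second shows a small offset but a larger spread, matching the precise-but-inaccurate versus accurate-but-imprecise distinction described in the text.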

Mnemonic To Memorize the Difference

An easy way to remember the difference between accuracy and precision is:

  • ACcurate is Correct. (or Close to real value)
  • PRecise is Repeating. (or Repeatable)

Accuracy, Precision, and Calibration

Do you think it is better to use an instrument that records accurate measurements or one that records precise measurements? If you weigh yourself on a scale three times and each time the number is different, yet close to your true weight, the scale is accurate.

Yet, it might be better to use a scale that is precise, even if it is not accurate. In this case, all of the measurements would be very close to each other and "off" from the true value by about the same amount. This is a common issue with scales, which often have a "tare" button to zero them.

While scales and balances may allow you to tare or make an adjustment to make measurements both accurate and precise, many instruments require calibration. A good example is a thermometer. Thermometers often read more reliably within a certain range and give increasingly inaccurate (but not necessarily imprecise) values outside of that range. To calibrate an instrument, record how far off its measurements are from known or true values. Keep a record of the calibration to ensure proper readings. Many pieces of equipment require periodic calibration to ensure accurate and precise readings.
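
A minimal sketch of that record-and-correct idea, assuming the instrument is off by a simple constant amount (real calibrations often require corrections at several points across the range), might look like the following; the function names and temperature values are only illustrative.

    def calibration_offset(instrument_reading, reference_value):
        """Record how far the instrument reads from a known or true value."""
        return instrument_reading - reference_value

    def corrected(raw_reading, offset):
        """Apply the recorded calibration offset to a later raw reading."""
        return raw_reading - offset

    # Example: a thermometer reads 100.8 degrees C in boiling water at standard
    # pressure, where the reference value is 100.0 degrees C.
    offset = calibration_offset(100.8, 100.0)  # +0.8 degrees, kept as the calibration record
    print(corrected(99.3, offset))             # a later raw reading, corrected to 98.5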

