
Precision is not accuracy, interpretation is not truth

Despite having been taught the difference between precision and accuracy, geologists often confuse the two. For example, we can calculate reserves to three, five, or even ten decimal places. But when a geologist works with a field that has a billion barrels in place, that precision does not make the number correct, that is, accurate. We are putting too fine a point on it.

Before computers, there might be only a couple of wells in an area, and reserves were calculated from estimates. Inputs included average pay thickness, average porosity, average saturation, and a recovery factor calculated for an interval of interest. We would multiply them to come up with an estimate that might be off by as much as 25 percent or more. Because this was done by hand, we recognized the assumptions involved and accepted the lack of precision and accuracy. We knew that the answer approached the ‘truth’ as the number of wells in the prospect increased. Now that we use gridded contour maps or build geomodels, and because the computer can give us an answer to as many decimal places as we like, we tend to believe the output as unvarnished truth. Even if we recognize that this ‘most likely’ case is one of many possible outcomes, we provide the maps and numbers to management without saying so explicitly.
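As a minimal sketch of that back-of-the-envelope arithmetic, the standard volumetric formula fits in a few lines of Python; every input value below is invented for illustration, not taken from any real field:

```python
# Standard volumetric estimate of recoverable oil.
# All input values are hypothetical, for illustration only.

AREA_ACRES = 5000.0       # prospect area
PAY_FT = 50.0             # average net pay thickness
POROSITY = 0.20           # average porosity (fraction)
SW = 0.35                 # average water saturation (fraction)
BO = 1.2                  # formation volume factor (rb/stb)
RECOVERY_FACTOR = 0.40    # fraction of in-place oil recovered

BBL_PER_ACRE_FT = 7758.0  # reservoir barrels in one acre-foot

ooip = BBL_PER_ACRE_FT * AREA_ACRES * PAY_FT * POROSITY * (1 - SW) / BO
recoverable = ooip * RECOVERY_FACTOR

print(f"OOIP:        {ooip / 1e6:.0f} million bbl")
print(f"Recoverable: {recoverable / 1e6:.0f} million bbl")

# Because the estimate is a plain product, a 10 percent error in any
# single input moves the answer by roughly 10 percent; the decimal
# places in the printout say nothing about its accuracy.
```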

The correctness of our calculations depends on the accuracy of many inputs, including estimates of porosity, thickness, saturation, and recovery factor. Each of these carries an error bar that reflects the assumptions that go into correcting the raw logs and interpreting them. Each log has a vertical resolution that depends on the specific instrument configuration and design. Corrected logs such as gamma ray, resistivity, and density curves are used to create interpreted logs such as shale volume, porosity, and saturation, and perhaps a pay flag that provides net pay thickness. These interpretations rest on assumptions of their own, such as the logs being ‘on depth’ and free of instrument error. The log values are then averaged when they are upscaled from the 15 cm sample interval of the logging tool to the 1 or 2 metre (or more) cell thickness of a geomodel.
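A small synthetic example, with invented porosity values, shows what that averaging does to a thin tight streak when 15 cm samples are upscaled into a single 2 m cell:

```python
import numpy as np

# Synthetic porosity log sampled every 0.15 m (values are invented).
# A thin tight streak sits inside an otherwise good 2 m interval.
porosity = np.array([0.22, 0.21, 0.23, 0.22,
                     0.03, 0.02, 0.03,        # tight streak, ~0.45 m
                     0.21, 0.22, 0.23, 0.22,
                     0.21, 0.22])

# Upscaling to one 2 m geomodel cell: a single arithmetic average.
cell_value = porosity.mean()

print(f"Log min/max:    {porosity.min():.2f} / {porosity.max():.2f}")
print(f"2 m cell value: {cell_value:.3f}")
# The cell reports ~0.17 porosity throughout; the tight streak that
# might act as a flow barrier has vanished from the model.
```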

Logs are averaged over the entire pay interval when we make a gridded contour map. When we build a geomodel or make a contour map we assume the values at the wells are ‘true’ rather than close approximations or averages. We hold the value at the well constant while using some statistical process (e.g. sequential Gaussian simulation) or facies distribution concept (e.g. fluvio-estuarine system) to extrapolate values away from the wells. Sometimes we impose trends on the data when we use a mental model (a preconceived bias). Using variograms in a geomodel, for example, lets us dictate how far from a well, and in what direction, porosity or saturation values extend their influence. These processes have built-in assumptions about the distribution of rock properties away from wells. The most egregious assumption is that one or two wells, each sampling perhaps a 10 cm diameter column of rock with tools that sense at most a metre or two into the formation, can represent an area of tens of square kilometres or more.
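To make the variogram point concrete, here is a hedged sketch assuming a simple exponential variogram model, one common choice among several; the 800 m range is an invented example, not a value from the essay:

```python
import numpy as np

def exponential_variogram(h, sill=1.0, range_m=800.0):
    """Exponential model: gamma(h) = sill * (1 - exp(-3h / range_m)).

    range_m is the practical range, the distance at which correlation
    with the well value has decayed to about 5 percent.
    """
    return sill * (1.0 - np.exp(-3.0 * h / range_m))

for h in [0.0, 200.0, 400.0, 800.0, 1600.0]:           # metres
    correlation = 1.0 - exponential_variogram(h)       # unit sill
    print(f"{h:6.0f} m from well: correlation {correlation:.2f}")

# Choosing range_m = 800 m dictates that porosity measured at the
# well says a lot about the rock 200 m away and almost nothing about
# the rock 1.6 km away. That choice belongs to the modeller, not to
# the data.
```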

So how does this affect the geologist? You report that your prospect contains 411.6 million barrels of recoverable oil. You know this because your geomodel told you. The following year you drill a new well, or change your variogram distance, or change your thickness cutoff, or decide that effective porosity is a better measure than total porosity in your area, or modify your recovery factor from 40 percent to 35 percent. Suddenly your reserves are 340.4 million barrels recoverable. You just lost 71 million barrels, equivalent to a good-sized field. You have to report this loss to your boss. Worse, she has to report it to her boss. You can bet someone will want to know what happened.

Without acknowledging it explicitly, your boss took your reserve number as ‘the truth.’ You neglected to put a range of possible outcomes around your first number, one that took into account all of the estimates and assumptions that went into it. Because of the precision available from your computer model, you inadvertently communicated to your boss, or your boss assumed, that the answer must be essentially true. In fact there could still be a 25 percent error in the actual recoverable estimate. Your second number was about 17 percent smaller than the first, which suggests that both lie within the valid range of probable outcomes. The variation is within the background noise of the estimate’s accuracy.
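One way to put that range around the number is a quick Monte Carlo pass; the input distributions below are assumed purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(seed=42)
n = 100_000

# Hypothetical uncertain inputs for a roughly billion-barrel field.
ooip = rng.normal(1.0e9, 0.15e9, n)       # barrels in place
recovery = rng.uniform(0.30, 0.45, n)     # recovery factor

recoverable = ooip * recovery

p10, p50, p90 = np.percentile(recoverable, [10, 50, 90])
print(f"P10: {p10 / 1e6:,.0f} million bbl")
print(f"P50: {p50 / 1e6:,.0f} million bbl")
print(f"P90: {p90 / 1e6:,.0f} million bbl")

# With these (invented) distributions, both 411.6 and 340.4 million
# bbl fall well inside the P10-P90 band: the 'loss' is a move within
# the uncertainty, not a change in what we actually know.
```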

This realization applies not just to estimates of reserves but to every aspect of our work. The precision being used must not exceed the accuracy of the data. We must recognize and communicate that the interpreted answer is merely our humble attempt to approach the truth.

 
