How many different interpretations could be made of the same seismic image? This is a question that few interpreters consider when faced with a seismic image to interpret. That’s because we try to match the image to something familiar and home in on those qualities of the image that are most similar to those we expect. Once we have settled on the concept that, in our own eyes, best matches the image, we find it very difficult to move away from this initial concept and reflect on alternatives. Although not commonly considered in seismic image analysis, the conceptual uncertainty associated with model choice during interpretation, in combination with human biases, has the potential to severely influence the final subsurface model.
Consider the data used to produce a seismic image. Like any remotely sensed data, it has a limited resolution and an inherent uncertainty. In fact, each step in the process from seismic acquisition to processing involves decisions that narrow the conceptual interpretations that are most likely for the dataset. For example, the type of acquisition chosen may be influenced by, amongst other things, the terrain, economics, and practicalities, but also by the depth and structural style of the target. The processing also leaves its imprint, with the chosen parameters having a direct impact on the features highlighted in the final seismic image. These decisions, whether technical, economic, or practical, are based on prior knowledge that itself has differing levels of certainty.
On top of this uncertainty in the data is the influence of human bias at the interpretation stage. Humans are prone to using heuristics (rules of thumb) that allow large volumes of complicated data to be processed quickly: cunning short-cuts. They are very powerful, but not always correct. For example, imagine listening to a telephone conversation from the far end of an open-plan office; from the snippets heard it is possible to construct the full conversation, or at least the conversation expected based on the elements heard and any other prior knowledge you can bring into play (such as person X has just split up with person Y). When we interpret, we use all our prior knowledge, consciously or sub-consciously, to help us: the geographical location, first-hand knowledge of the tectonic or sedimentary regime, and second-hand knowledge from papers and books, or what others have said. This prior knowledge helps us to fill in the ‘holes’ in the data presented to us; it filters all the possible concepts down to those that are most likely. We use it to focus on the elements of the data that are most important to confirm the concepts in our heads.
But sometimes heuristics set us on completely the wrong track, and once we are heading in the wrong direction it is hard to turn around. Some of the human biases that affect interpretational outcomes and our ability to use heuristics effectively are:
- Availability bias. Favouring the decision, model, or interpretation that is most readily brought to mind.
- Confirmation bias. Seeking out opinions and facts that support one’s own beliefs or hypotheses.
- Anchoring bias. Failing to adjust away from experts’ beliefs, dominant approaches, or initial ideas.
- Optimistic bias. An ‘it won’t happen to me’ mentality, or ‘there is definitely oil in this prospect!’
Failure to consider the other possible interpretations or concepts, the full breadth of the conceptual uncertainty challenge, could mean a dry well or a missed opportunity.
Bond, C E, A D Gibbs, Z K Shipton, and S Jones (2007). What do you think this is? ‘Conceptual Uncertainty’ in geoscience interpretation. GSA Today 17, 4–10, DOI 10.1130/GSAT01710A.1.
Krueger, J I, and D C Funder (2004). Towards a balanced social psychology: Causes, consequences, and cures for the problem-seeking approach to social behavior and cognition. Behavioral and Brain Sciences 27, 313–327, DOI 10.1017/S0140525X04000081.
Rankey, E C, and J C Mitchell (2003). That’s why it’s called interpretation: Impact of horizon uncertainty on seismic attribute analysis. The Leading Edge 22, 820–828, DOI 10.1190/1.1614152.
Tversky, A, and D Kahneman (1974). Judgment under uncertainty: Heuristics and biases. Science 185, 1124–1131, DOI 10.1126/science.185.4157.1124.