Author(s): John Kaldi (presenter), Emeritus Professor of Petroleum Geology and Engineering, Australian School of Petroleum and Energy Resources, University of Adelaide, Australia
Net pay is the portion of the formation thickness within the hydrocarbon column that can be produced economically. The most common methods for determining net pay rely on wireline log cut-offs, such as gamma-ray (sand vs. shale percentage), resistivity (hydrocarbon vs. water saturation), and porosity (rock vs. pore volume). This presentation details case histories from around the world that highlight the pitfalls of relying uncritically on these methodologies. Overly simplistic assumptions about wireline log response, combined with a poor understanding of the formation's fundamental rock and fluid properties, have led to misinterpretation of pay in many formations. Examples include incorrect v-shale cut-offs caused by shale clasts, incorrect density-porosity calculations caused by extreme mineral densities, incorrect neutron-porosity cut-offs in microporous formations, and fluid compositions misinterpreted on resistivity logs because of conductive minerals in the reservoir.

Errors in quantifying reservoir properties may occur even when core data are plentiful, most commonly because of core-plug sampling bias, whether from poor core and/or core-plug recovery or from human sampling bias; both result in over- or under-representation of particular rock types.

Moreover, production mechanism and net-pay interpretation are often not considered together. Pay on primary production may not equate to pay on water-flood if the flooded formation is discontinuous or sweep efficiency is low. In addition, production or injection zone tests (e.g., PLT) are rarely correlated to rock types in a formation, leading to mismatches between actual and interpreted pay zones. Net-pay determination improves markedly when petrophysical properties are linked to field-specific production mechanisms.
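To make the cut-off method concrete, the sketch below applies the three standard wireline cut-offs (v-shale, porosity, water saturation) to tally net pay. This is a hypothetical illustration, not from the presentation: the cut-off values, matrix density, and log samples are assumed for the example, and the fixed quartz matrix density shows exactly the kind of assumption (extreme mineral densities, conductive minerals, microporosity) that the case histories warn can bias each term.

```python
# Hypothetical net-pay tally from log cut-offs (illustrative values only).

RHO_MATRIX = 2.65  # g/cc, assumed quartz matrix; wrong for heavy minerals
RHO_FLUID = 1.00   # g/cc, assumed fluid density

def density_porosity(rho_bulk):
    """Standard density-porosity transform; biased if matrix density is wrong."""
    return (RHO_MATRIX - rho_bulk) / (RHO_MATRIX - RHO_FLUID)

def net_pay(samples, vsh_max=0.40, phi_min=0.08, sw_max=0.60):
    """Sum the thickness of intervals that pass all three cut-offs."""
    return sum(
        s["thick"] for s in samples
        if s["vsh"] <= vsh_max
        and density_porosity(s["rhob"]) >= phi_min
        and s["sw"] <= sw_max
    )

# Assumed half-metre log samples: one pay interval, three cut for
# shaliness, low porosity, and high water saturation respectively.
logs = [
    {"thick": 0.5, "vsh": 0.10, "rhob": 2.45, "sw": 0.30},  # pay
    {"thick": 0.5, "vsh": 0.55, "rhob": 2.40, "sw": 0.30},  # shaly: cut
    {"thick": 0.5, "vsh": 0.15, "rhob": 2.62, "sw": 0.30},  # tight: cut
    {"thick": 0.5, "vsh": 0.12, "rhob": 2.44, "sw": 0.80},  # wet: cut
]
print(net_pay(logs))  # → 0.5
```

A mis-set matrix density (say, in a siderite-rich zone) shifts every computed porosity and can silently move intervals across the `phi_min` cut-off, which is the mechanism behind the density-porosity errors the abstract describes.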