Static Model & STOIIP Calculation: Challenges & Lack-of-Data Considerations

Mehdi Behnamrad, Behnam Jan Ahmad, Geostatic Modelers


Keywords: 3D reservoir model, Gas-oil contact (GOC) and Oil-water contact (OWC), Uncertainty analysis, Static model, Stratigraphic modelling, Structural modelling, Facies modelling, Inversion model


A 3D reservoir model consists of the following elements: (1) a structural model consisting of horizons and faults; (2) fluid contacts, the gas-oil contact (GOC) and the oil-water contact (OWC); (3) a sedimentological model; (4) a porosity model; (5) a permeability model; (6) a net-to-gross (NTG) model; and (7) a water saturation model. In geostatistical prediction from data, the user develops a statistical model that may comprise both systematic trends and a random component (Stein, 1999). The expected value of this distribution (its mean) is generally computed by the geostatistical method as a predicted value with an associated prediction error variance. Geostatistical simulation is the computer-based simulation of a spatially distributed variable such that it honors the spatial dependence of the data and its values at known data points (Journel and Huijbregts, 1978; Goovaerts, 1997). Two main sources of uncertainty can be distinguished in a static model:

1. The inherent variability of nature. Natural variability is an inherent feature of all geological objects and processes. No geological object is completely homogeneous; even minerals are not, as their real crystal structure differs from the theoretical one. The degree of variability of a geological object may differ greatly depending on the geological processes involved.

2. Incomplete knowledge. The main uncertainties related to geological investigations are due to incomplete knowledge and the limited possibilities of the investigations. Some degree of uncertainty is inherent in any geologic model because of the differences (or errors) between the true and sampled values of the data used in building the model. If the sample data available for constructing the model are sparse, the level of uncertainty is expected to be high. Conversely, if the sample data are abundant and well distributed across the volume of interest, the level of uncertainty is comparatively low.
In geologic model building, the two main sources of uncertainty are structural data and petrophysical data. Structural uncertainties may result from processing and interpretation of seismic data, mapping of faults, structural and stratigraphic correlation of log data, "picking" of structure tops from seismic and/or log data, description of the depositional environment, etc. Petrophysical uncertainties may derive from measurements of porosity, permeability, and water saturation, determination of gross and net thickness, definition and assignment of facies, location of fluid contacts, etc. Practically all of the data used in the construction of geologic models carry some degree of uncertainty, so it is important to develop methodologies for assessing and quantifying the uncertainty of the models built from them. One method that can be used to assess this uncertainty quantitatively is the generation of multiple, equiprobable realizations of the model. Even though the realizations are equally probable, they differ from one realization to the next, and these differences are measures of the uncertainties present in the models. As a result, the realizations can serve as input for uncertainty analysis: by statistical analysis, several measures of uncertainty can be generated from the set of multiple realizations. These statistical metrics may include the mean, standard deviation, and cumulative probability distribution of variables such as total pore volume and oil-in-place volume, which can be calculated from the multiple realizations. A typical cumulative probability curve represents the distribution of oil-in-place volumes over the multiple realizations.
Realizations are commonly classified nominally by the P10, P50, and P90 quantiles. Under the exceedance convention, for instance, the P10 quantile designates a volume such that 10% of the realizations have oil-in-place volumes equal to or larger than it. Note that in some organizations the designations of the magnitudes of the P10, P50, and P90 quantiles are reversed. From the cumulative probability curve calculated for a variable, the probability of achieving any level of outcome for that variable can be estimated. The cumulative probability curve can also be used to select specific realizations (geologic models) at the P10, P50, and P90 quantiles for further analysis in reservoir flow simulation.
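As an illustration, the exceedance-convention classification described above can be sketched in a few lines of Python (all numbers are hypothetical; `stoiip` stands in for oil-in-place volumes computed from 500 equiprobable realizations):

```python
import numpy as np

rng = np.random.default_rng(seed=42)

# Hypothetical oil-in-place volumes (MMstb) from 500 equiprobable realizations.
stoiip = rng.lognormal(mean=np.log(120.0), sigma=0.3, size=500)

# Exceedance convention used in the text: P10 is the volume that 10% of
# realizations meet or exceed (the high case), P90 the volume that 90%
# meet or exceed (the low case).
p10 = np.percentile(stoiip, 90)   # exceeded by 10% of realizations
p50 = np.percentile(stoiip, 50)
p90 = np.percentile(stoiip, 10)   # exceeded by 90% of realizations

assert p90 <= p50 <= p10  # low case <= median <= high case

# Pick the realization closest to each quantile for flow simulation.
models = {q: int(np.argmin(np.abs(stoiip - v)))
          for q, v in {"P10": p10, "P50": p50, "P90": p90}.items()}
```

The realizations nearest the P10, P50, and P90 volumes can then be carried forward as the high, mid, and low geologic models for reservoir flow simulation.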

Static Model:

The most important phase of a reservoir study is probably the definition of the static model of the reservoir rock, given both the large number of activities involved and its impact on the end results. As we know, the production capacity of a reservoir depends on its geometrical/structural and petrophysical characteristics. The availability of a representative static model is therefore an essential condition for the subsequent dynamic modelling phase. A static reservoir study typically involves five main stages, carried out by experts in the various disciplines (Cosentino, 2001).



1-Stratigraphic modelling:

Stratigraphic modelling has rules for building horizons:

Horizons are divided into four categories, and each category has its own way of being connected to other horizons; corresponding correlation rules apply to layers and sub-layers.

The main uncertainty in the stratigraphic model is errors in the correlation scheme; fewer deterministic horizons reduce this uncertainty, so keep the zonation scheme simple! Where the main zone boundaries come from seismic interpretation, there will be an element of uncertainty away from the wells due to depth conversion and horizon picking. The geophysicist may be able to assess the degree of uncertainty to allow a stochastic approach to the horizon model for each sub-grid; this can be fraught with danger, especially if there are too many closely spaced horizons, leading to boundaries cutting each other and disrupting the grid integrity. Proceed with care when building stochastically defined horizons.
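The grid-integrity danger mentioned above can be illustrated with a toy sketch (the depths, noise level, and grid size are all hypothetical): two horizons 20 m apart, each perturbed by an independent 8 m cell-wise depth error, cross somewhere in nearly every realization. A real workflow would use spatially correlated noise conditioned to the wells, which is exactly why stochastic horizon modelling must be approached with care:

```python
import numpy as np

rng = np.random.default_rng(7)
nx = ny = 50

# Two hypothetical interpreted horizons (depth in m), 20 m apart.
top = 2000.0 + 5.0 * rng.standard_normal((ny, nx))
base = top + 20.0

def perturb(surface, sigma, rng):
    # Cell-wise stochastic depth error: a crude stand-in for depth-conversion
    # uncertainty away from the wells (a real model would use spatially
    # correlated noise conditioned to the well picks).
    return surface + sigma * rng.standard_normal(surface.shape)

kept = 0
n_real = 100
for _ in range(n_real):
    t = perturb(top, 8.0, rng)
    b = perturb(base, 8.0, rng)
    # Keep only realizations where the horizons do not cross anywhere;
    # crossing boundaries would disrupt the grid integrity.
    if np.all(b > t):
        kept += 1

crossing_rate = 1.0 - kept / n_real
```

With these (hypothetical) numbers the crossing rate is close to 100%, showing how quickly closely spaced stochastic horizons corrupt the grid unless the perturbation is constrained.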

2-Structural modelling:

A structural model consists of horizons and faults. Structural-framework functionality solves many of the problems posed by complex fault relationships and feeds the construction of geocellular models, including stair-step faults to handle complex geometries. Consequently, these tools improve both the time to model and the quality of geocellular grids.


Fault and fracture networks, as arrays of increased permeability, have significant effects on the productivity and economic viability of hydrogeothermal developments: they can significantly impact drilling, resource operations, and potential recovery. Fault and fracture networks also control the compartmentalization of reservoir rocks for underground storage and the sealing characteristics of cap rocks. The inherent structural and petrophysical complexity of fault zones produces correspondingly complex flow patterns inside and across the fault zone (Antonellini and Aydin, 1994, 1995; Caine et al., 1996; Fisher and Knipe, 2001; Fowles and Burley, 1994; Odling et al., 2004); thus faults can act both as pathways and obstacles to subsurface fluid flow (Caine et al., 1996; Chester and Logan, 1986; Manzocchi et al., 2008, 1999; Seront et al., 1998) and considerably influence petroleum migration, accumulation, and recovery. Characterizing fault properties and understanding fault impact on flow paths and reservoir dynamics through modelling remain key issues for optimizing production and exploration strategies (Fisher and Jolley, 2007). However, these efforts are hampered by the inherent difficulty of describing the structural complexity and petrophysical heterogeneity of entire fault zones based on spatially constrained outcrops representing a limited range of scales compared to those observed in the subsurface (Braathen et al., 2009). Limitations related to modelling conventions, grid types, grid resolution, and cost further constrain the level of detail that can be included in field-sized simulation models (Manzocchi et al., 2010, 2008). A few examples of errors in fault modelling include:

1) a lack of systematically quantified descriptions of the spatial distribution of fault rock facies (Braathen et al., 2009) and of key petrophysical data such as two-phase flow properties of fault rocks (Al-Busafi et al., 2005; Al-Hinai et al., 2008; Tueckmantel et al., 2012); 2) a high computational cost related to the necessity of using high-resolution grids; and 3) a lack of robust upscaling techniques for highly heterogeneous rocks.
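One common way fault rock properties enter a simulation model is as a transmissibility multiplier on the cell faces crossed by the fault. A minimal sketch, using a harmonic-resistance formulation in the spirit of Manzocchi et al. (1999), with hypothetical permeabilities and thicknesses:

```python
def fault_transmissibility_multiplier(k1, k2, kf, L1, L2, tf):
    """Transmissibility multiplier for a fault between two grid cells,
    from the harmonic average of flow resistances (in the spirit of
    Manzocchi et al., 1999). k: permeability (mD), L: cell size (m),
    tf: fault rock thickness (m)."""
    # Resistance of the unfaulted cell pair (half-cell to half-cell).
    r_unfaulted = 0.5 * L1 / k1 + 0.5 * L2 / k2
    # Resistance with a slice of each half-cell replaced by fault rock.
    r_faulted = ((0.5 * L1 - 0.5 * tf) / k1
                 + (0.5 * L2 - 0.5 * tf) / k2
                 + tf / kf)
    return r_unfaulted / r_faulted

# Hypothetical example: 100 mD sands, a 0.3 m fault rock of 0.01 mD
# (e.g. from a shale-gouge-ratio based estimate), 100 m cells.
tm = fault_transmissibility_multiplier(100.0, 100.0, 0.01, 100.0, 100.0, 0.3)
```

Even a thin, low-permeability fault rock collapses the multiplier to a few percent, which is why the lack of measured fault rock properties listed above translates directly into flow-simulation uncertainty.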

3-Facies modelling:

In the context of facies modelling, this could result in various interpretations of facies architecture, associations, geometries, and the way they are distributed in space. Classification of lithofacies and their accurate representation in a 3D cellular geologic model is critical to the project, because permeability and fluid saturations for a given porosity and height above free water vary considerably among lithofacies (Dubois et al., 2003). Reducing uncertainties at the reservoir scale is a key challenge due to the limited number of exploration wells; 4D forward stratigraphic modelling is a powerful tool here, as the physics, sedimentology, and stratigraphy allow the building of predictive models of depositional environments and lithologies.


If the geological analysis has been carried out thoroughly, there should be little uncertainty in the facies model. However, there are always unknowns, commonly the direction of channels or the proportion of net and non-net facies. The well data should drive these inputs to the model, but there is seldom a wholly conclusive set of data. Any attempt to introduce a trend that is not represented by the wells or the seismic means that the model will diverge from the input data, making the comparison difficult. The solution is to model a number of scenarios that represent the possible solutions: for instance, model channels that flow in two or three directions across a structure to see what impact this has on net rock volume (NRV).
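The divergence problem can be made concrete with a toy sketch (the well count, proportions, and imposed trend are all hypothetical): a realization driven by the well-derived net fraction honors the wells, while a realization driven by an imposed regional trend that the wells do not support diverges from them measurably:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical net/non-net indicator logs from 5 wells (True = net sand),
# drawn around a ~35% net fraction.
wells = [rng.random(50) < 0.35 for _ in range(5)]
well_ntg = float(np.mean([w.mean() for w in wells]))

def simulate_ntg(target_ntg, ncells, rng):
    # Toy unconditional facies realization: each cell is net with the
    # target proportion (a real workflow would use variogram-based
    # indicator or object modelling conditioned to the wells).
    return float((rng.random(ncells) < target_ntg).mean())

# Scenario 1: net proportion driven by the wells.
ntg_wells = simulate_ntg(well_ntg, 100_000, rng)
# Scenario 2: an imposed regional trend not supported by the wells.
ntg_trend = simulate_ntg(0.55, 100_000, rng)

# How far the trend-driven model diverges from the well data.
divergence = abs(ntg_trend - well_ntg)
```

Comparing each scenario's global net fraction back against the well statistics is a cheap first check before ranking scenarios on NRV.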

4-Petrophysical modelling:

Petrophysical modelling is the interpolation or simulation of continuous data (for example, porosity or permeability) throughout the model grid. Two calculation procedures may be used: deterministic and/or probabilistic. The deterministic method is by far the most common: a single value is selected for each parameter, input into an appropriate equation, and a single answer is obtained. The probabilistic method, on the other hand, is more rigorous and less commonly used. It uses a distribution curve for each parameter and, through Monte Carlo simulation, develops a distribution curve for the answer. Assuming good data, a lot of qualifying information can be derived from the resulting statistics, such as the minimum and maximum values, the mean (average value), the median (middle value), the mode (most likely value), the standard deviation, and the percentiles. The probabilistic methods have several inherent problems: they are affected by all input parameters, including the most likely and maximum values, and one cannot back-calculate the input parameters associated with the reserves. Only the end result is known, not the exact value of any input parameter. Deterministic methods, on the other hand, calculate reserve values that are more tangible and explainable, since all input parameters are exactly known; however, they may ignore the variability and uncertainty in the input data, whereas probabilistic methods allow more of the variance in the data to be incorporated. In recent years, the quantification, understanding, and management of subsurface uncertainties have become increasingly important for oil and gas companies as they strive to optimize reserve portfolios, make better field development decisions, and improve day-to-day technical operations such as well planning.
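As a minimal deterministic sketch of the interpolation step, the following uses inverse-distance weighting as a simple stand-in for kriging (the well coordinates and porosities are hypothetical):

```python
import numpy as np

def idw(xy_wells, values, xy_targets, power=2.0):
    """Inverse-distance-weighted interpolation: a minimal deterministic
    stand-in for kriging of a continuous property such as porosity."""
    d = np.linalg.norm(xy_targets[:, None, :] - xy_wells[None, :, :], axis=2)
    d = np.maximum(d, 1e-9)            # avoid division by zero at well locations
    w = 1.0 / d ** power
    return (w * values).sum(axis=1) / w.sum(axis=1)

# Hypothetical well locations (m) and porosities.
wells = np.array([[100.0, 200.0], [800.0, 300.0], [400.0, 900.0]])
phi = np.array([0.18, 0.12, 0.22])

# Interpolate onto a few grid nodes (the first sits on a well).
grid = np.array([[100.0, 200.0], [450.0, 450.0]])
phi_grid = idw(wells, phi, grid)
```

At a well location the estimate collapses to the well value, and away from the wells it is a weighted average bounded by the data, which is characteristic of deterministic interpolators; simulation methods would instead reproduce the full variability.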
Stochastic approaches based on the standard volumetric equation are now commonly used for screening and value assessment of hydrocarbon assets. Uncertainty in static volumes and recoverable reserves is quantified by Monte Carlo sampling of probability distributions for the controlling parameters in the volumetric equation, with volumes calculated by simple multiplication of the sampled values from each input distribution. As the Monte Carlo sampling and direct multiplication are very fast, thousands of Monte Carlo loops can be run to provide reliable output distributions. Although these approaches are very fast, it is often difficult to estimate the intrinsic dependencies between the input parameters, and they provide no quantification or visualization of the spatial location and variability of the uncertainty (Kamali et al., 2013).
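A minimal sketch of this stochastic volumetric workflow, assuming independent inputs and the standard volumetric equation STOIIP = GRV × NTG × φ × (1 − Sw) / Bo (all distributions hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2024)
n = 10_000  # Monte Carlo loops

# Hypothetical independent input distributions for the standard
# volumetric equation STOIIP = GRV * NTG * phi * (1 - Sw) / Bo.
grv = rng.triangular(200e6, 300e6, 450e6, n)   # gross rock volume, m3
ntg = rng.uniform(0.5, 0.8, n)                 # net-to-gross
phi = rng.normal(0.18, 0.02, n)                # porosity
sw  = rng.uniform(0.25, 0.40, n)               # water saturation
bo  = rng.normal(1.25, 0.05, n)                # formation volume factor, rm3/sm3

# Direct multiplication of the sampled values, one loop per realization.
stoiip = grv * ntg * phi * (1.0 - sw) / bo     # stock-tank m3

# Exceedance quantiles of the output distribution.
p90, p50, p10 = np.percentile(stoiip, [10, 50, 90])
```

Note that the inputs are sampled independently here; as stated above, estimating the intrinsic dependencies between them is often the difficult part, and this sketch also says nothing about where in the reservoir the uncertainty resides.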



Challenges & Lack of Data:

Integrated static workflows are a powerful tool for quantifying subsurface risks and guiding decisions during field development planning. In general, the challenges in static modelling include several parts:

1-Updating the relevant software and annual training on it: Software developers correct and refine their equations every year, yet we often continue to use the same software versions for several years in important departments and decision-making roles. Annual training on the software, and on the new features added to it, transfers up-to-date knowledge. Such training is very important for increasing the ability and knowledge of Iranian users, and it is very valuable in the long term because it can:

•Help us find new plays, increase oil and gas reserves, and optimize production and revenues.

•Give us access to best-in-class, independent advice and software in Exploration and Production.

•Share knowledge and technology with us.

•Reduce risk and optimize costs.

•Support future exploration activities.

 For example:

We still use variogram-based methods, whether two-point or multi-point, for distributing rock types and properties, and we still prefer grid cells that are orthogonal or nearly so. In most models, permeability heterogeneity is assumed to occur only at a length scale of several grid cells, and we represent heterogeneity at that scale. But when closely spaced measurements are made, the range of the bed-parallel variogram of porosity or permeability turns out to be smaller than a typical grid cell, in most cases less than 20 m. Most of the variance therefore lies inside the grid cell and cannot be modelled by standard methods. Moreover, in many depositional environments sand beds merge with shales, flow-limiting shales are organized and not parallel to the model layering, and flow-expediting coarse layers may have organized dip too, or be arranged into threads. All these geological geometries are hard to represent in geological models, so they are ignored, or merged and smeared into a representation of the reservoir that fails to capture the flow structure. These issues can be addressed with the relevant new algorithms and updated equations in the software.
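The sub-cell variance point can be checked directly: with an exponential variogram whose practical range is 20 m, essentially all of the variance is already expressed at lags shorter than a typical 100 m grid cell (the variogram model and cell size are hypothetical but representative):

```python
import numpy as np

def exponential_variogram(h, sill=1.0, range_m=20.0):
    # Exponential variogram model; reaches ~95% of the sill at the
    # practical range range_m.
    return sill * (1.0 - np.exp(-3.0 * h / range_m))

cell = 100.0   # typical areal grid-cell size, m

# Fraction of the total variance (sill = 1) already expressed within one
# cell width, for a bed-parallel porosity/permeability variogram of 20 m range.
inside_cell = exponential_variogram(cell)
```

With these numbers, over 99% of the variance sits at lags shorter than the cell, so cell-to-cell geostatistics cannot see it, which is exactly the limitation described above.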

2-One of the most important issues is the lack of data, or of good-quality data, for static modelling:

The data taken into consideration can be divided into three groups regarding the degree of their uncertainty:

•Quantitative data. They are the results of measurements, with a relative error of less than about 25%.

•Semi-quantitative data. They are also the results of measurements, but with a relative error of more than about 25%.

The starting point of static modelling is geophysical data. The most widespread source of subsurface data is 2-D and, increasingly, 3-D seismic. Borehole data, such as well logs and rock samples, provide crucial complementary and calibration parameters. Software programs today can provide all of this functionality, and geophysicists continue to respond to E&P needs with better, higher-resolution data, more accurate processing, and sophisticated integration of seismic data. With its high spatial resolution, seismic data play a key role not only in defining reservoir structure and geometry but also in understanding reservoir characteristics such as porosity, water saturation, thickness, and lateral extension, providing high-resolution images of the subsurface structure.





Regarding the above, many of our fields have only 2D seismic and need new 3D seismic projects, while other fields that do have 3D geophysics lack high-quality primary data or good-quality interpretations. For example, in the Sub-Coastal Fars fields with carbonate target reservoirs, the most important tests (XPT, MDT, DST, etc.) are often not run because of mud losses, the rising cost of drilling, and the rush to complete the well. Under these circumstances the STOIIP cannot be calculated exactly, and this increases the reservoir uncertainties.