SPE 149785

Oilfield Data Mining Workflows for Robust Reservoir Characterization

Keith R. Holdaway, SPE, SAS Institute Inc.
Copyright 2012, Society of Petroleum Engineers

This paper was prepared for presentation at the SPE Intelligent Energy International held in Utrecht, The Netherlands, 27–29 March 2012. This paper was selected for presentation by an SPE program committee following review of information contained in an abstract submitted by the author(s). Contents of the paper have not been reviewed by the Society of Petroleum Engineers and are subject to correction by the author(s). The material does not necessarily reflect any position of the Society of Petroleum Engineers, its officers, or members. Electronic reproduction, distribution, or storage of any part of this paper without the written consent of the Society of Petroleum Engineers is prohibited. Permission to reproduce in print is restricted to an abstract of not more than 300 words; illustrations may not be copied. The abstract must contain conspicuous acknowledgment of SPE copyright.
Abstract "There are more things in heaven and earth, Horatio, than are thought of in your philosophy" "Hamlet" William Shakespeare To fully appreciate the plethora of disparate data sets across the upstream geoscientific silos, it is essential to establish a sound and consistent suite of workflows that embrace data management facets as well as diverse data mining modules established under the banner of soft computing or artificial intelligence. When does one implement a neural network, a decision tree or non-linear regression techniques? Will Genetic Algorithms or Fuzzy Logic be appropriate for my objective function? This paper sets out to answer some important questions around data mining workflows underpinned by exploratory data analysis, confirmatory data analysis, descriptive and predictive modeling to establish sound and important reservoir characterization decision-cycles. Case studies are presented to illustrate effective and successful studies based on advanced statistical analysis and AI workflows in sandstone and carbonate reservoirs. Can such workflows be adopted in unconventional reservoirs? Determining accurate and effective hydraulic fracturing packages are keys in tight gas plays, and this paper explicates how data mining workflows can be successfully applied in such unconventional plays. Introduction We find ourselves in the midst of a data explosion, and those raw data are sourced across a multi-disciplinary environment in the petroleum industry. To master complex problems inherent in heterogeneous subsurface reservoir systems, we must break down the walls built around the traditional disciplines of petroleum engineering, geophysics, geology, petrophysics and geochemistry. Professional curiosity is trumped by necessity to uncover the knowledge hidden across all upstream data sets as a multi-disciplinary analysis, underpinned by a multivariate suite of advanced analytical workflows, is implemented through data mining methodologies. Thus we are required to complement the conventional wisdom of interpretation and deterministic approaches steeped in first principals with the emergence of soft computing techniques. It is important to grasp that the evolution of soft computing methodologies is based on the robust and tractable premise that unlike hard computing it is tolerant of uncertainties, imprecisions and partial truths. Intelligent reservoir characterization depends on the paradigm shift that is requisite to realize more profound understanding of complex subsurface systems; moving from the empirical sciences, the theoretical and computational to data mining and non-trivial extraction of implicit, previously unknown and potentially useful information from raw data. Artificial intelligence that embraces a menu of genetic algorithms, fuzzy logic and advanced data mining techniques morphs to a Platonic or inductive reasoning by pursuing a pathway from the general to the specific. Statistics too operate on patterns within the data, but tend to match pre-determined patterns to data in a deductive or Aristotelian way moving from the details to the general, big picture perspective.
Discussion
The technologies detailed in this paper have already proven to be beneficial in optimizing production from low-permeability and fractured reservoirs such as shale gas, tight gas, coal seam gas or coal bed methane, and pre-salt deep-water plays. The intelligent techniques can be implemented to quantify uncertainty, assess risk and perform advanced data management roles, and thus address the following business issues that permeate the oil and gas industry during E&P:
1. High exploration risk
2. Uncontrollable OPEX and CAPEX in E&P
3. Low recovery factor in extant assets
4. Short life-cycle of producing wells
Methodologies

Workflow 1: Pre-Salt Plays
One important workflow entails detailed analysis of seismic attributes that can subsequently be mapped, via a predictive algorithm based on artificial neural networks, to reservoir properties, thereby establishing a more robust reservoir characterization of a very complex subsurface system. Owing to the large velocity contrast between the fast salt and the surrounding sediments, and to the irregularity of the salt bodies, it is of paramount importance to attain seismic fidelity, or high resolution, to build a reliable geologic model, see Figure 1. In order to correctly image the sub-salt section it is necessary to create a velocity model that describes the salt body and the sedimentary velocity field, and then apply depth migration. In most cases the salt image as well as the sub-salt image is improved. However, in some cases it can be noted that the sub-salt sedimentary dips close to the salt body are not correct, even though the base-of-salt image is correct. Additionally, coherent noise trains and random noise obfuscate the picture and diminish the seismic attributes' value as predictor variables prior to implementing soft computing methodologies to characterize the reservoir.
Figure 1: Seismic line across a salt play
Both land and marine seismic acquisition surveys are negatively impacted by noise, be it random or coherent in nature, that obscures the seismic reflections, or representative signals, sought by geophysicists in their goal to interpret and identify both structural and stratigraphic oil and gas plays as potential hydrocarbon reservoirs. Optimization of the signal-to-noise ratio in seismic sections, profiles of the subsurface geologic strata, is a target for continuous improvement. The oil and gas industry, primarily the geophysical service companies that acquire and process the seismic data, devotes enormous energy, resources and a considerable amount of budget towards processes and solutions that ultimately deliver noise attenuation methodologies. Noise can play a very prominent role in pre-salt traps, and even after pre-stack depth migration, velocity filtering and deconvolution, the amplitudes and frequencies remain obscured by noise remnants. So, like any upstream data management workflow that traditionally embraces filtering, imputation and transformation to collate a robust data set, the seismic data necessitate a noise attenuation methodology. In the marine environment there are noise issues that could result in poor signal-to-noise ratio in the seismic sections, see Figure 2. If more than one seismic vessel, for example, acquires data in a neighboring area, the energy generated by these vessels causes interference in the signal recorded by the other vessels, which is known as seismic interference (SI) noise. This is recorded along with the desired reflection data. This interference can be easily identified in the seismic section. Within a single trace, the interference consists of isolated, short-duration events that can hardly be recognized. However, across numerous traces, a characteristic trend develops that is different from the trends of the real reflections representing the actual geologic configuration. For reflection times greater than the water-bottom time, this high-energy noise overrides weak reflections and is destructive to many pre-stack processes such as surface multiple prediction, pre-stack migration and Amplitude versus Offset (AVO) trends. There is also an issue in marine seismic data acquisition with backscattered noise, or bursts of seismic energy generated by a seismic source that have been backscattered off a surface diffraction point such as a rig, see Figure 3, arriving back at the seismic recorders (hydrophones) during the recording cycle.1
Figure 2: Time and Depth Pre-stack Migration
Figure 3: Seismic shot across salt dome with unattenuated noise
It is feasible to apply advanced statistical analysis techniques to attenuate noise, both coherent and random. The main benefits of adopting a data-driven predictive analytical approach on shot records of seismic data include effective and near real-time discrimination of coherent noise trains and random noise from primaries, or seismic event signal responses. The data-driven modeling approach is intended to uncover relationships that may be formalized as rules for a production system. This is accomplished through Exploratory Data Analysis (EDA) and predictive models that provide continuous monitoring of seismic acquisition data. Implementing these extracted rules and relationships during data acquisition in the field enables removal of noise, both coherent and random, to provide clean shot records free of such dispersive events.
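To make the EDA step concrete, the following is a minimal sketch, assuming the shot gather is available as a 2-D NumPy array of traces by time samples; the array name, dimensions and sample interval are illustrative, and a real workflow would read the gather from SEG-Y rather than synthesize it.

```python
import numpy as np

# Hypothetical shot gather: n_traces x n_samples array of amplitudes.
# In practice this would be read from SEG-Y; here it is synthesized for the example.
rng = np.random.default_rng(0)
n_traces, n_samples, dt = 120, 1000, 0.004          # dt in seconds (4 ms), assumed
shot_gather = rng.normal(size=(n_traces, n_samples))

# Trace-to-trace amplitude statistics: the spatial trends that EDA inspects.
rms_per_trace = np.sqrt((shot_gather ** 2).mean(axis=1))

# Average amplitude spectrum across traces: the frequency content of the record.
freqs = np.fft.rfftfreq(n_samples, d=dt)
avg_spectrum = np.abs(np.fft.rfft(shot_gather, axis=1)).mean(axis=0)

print("RMS amplitude range:", rms_per_trace.min(), rms_per_trace.max())
print("Dominant frequency (Hz):", freqs[np.argmax(avg_spectrum)])
```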
The approach is based on a two-step data-driven methodology, without first principles, to identify patterns in the amplitude and frequency content of the time series data. This approach is intended to identify events and to discriminate between noise, both coherent and random, and the primaries or signal energy that represents geologic strata. Removal of coherent and random noise from the acquired seismic data creates two data sets:
1. Seismic signal that displays the earth's reflectivity series, or those stationary components that correspond with the localized reflectors.
2. Coherent and dispersive noise trains with aliasing qualities in the F-K domain, and ambient random noise.
Firstly, an EDA workflow is implemented to fully surface the underlying seismic attributes of the shot records. The amplitude data can be viewed from trace to trace for spatial trends, and spectral analysis can be run to appreciate underlying seismic attributes such as frequency content. Through EDA of the seismic data it is possible to identify the best predictive modeling techniques: Artificial Neural Network (ANN), Self-Organizing Maps (SOMs), decision tree analysis or linear/non-linear regression methodologies. Spectral analysis is then run post EDA. This initial stage identifies appropriate modeling techniques for the data mining stage and hence suggests hypotheses worth testing.

Secondly, a methodology based on predictive analytics is used. The intent is to break the amplitude signal into its most uniquely elemental components utilizing Principal Component Analysis (PCA), a statistical technique adopted to analyze and transform multivariate datasets. This approach adopts a set of N-dimensional vectors as inputs that represent the entire volume of a shot record. The process ranks the Principal Components in accordance with their contribution to the total variance of the seismic traces in the entire dataset. The PCs are uncorrelated and as such are appropriate for further advanced analytical processes. Decision tree analysis is also considered, given the nature of continuous events inherent in the seismic trace attributes. A decision tree is a logical model that shows how the value of a target variable can be predicted by using the values of a suite of predictor variables. Singular Value Decomposition2 (SVD) is also implemented. On the one hand, it can be seen as a method for transforming correlated variables into a set of uncorrelated ones that better expose the various relationships among the original data items. At the same time, SVD is a method for identifying and ordering the dimensions along which data points exhibit the most variation. This ties in to the third way of viewing SVD: once we have identified where the most variation is, it is possible to find the best approximation of the original data points using fewer dimensions. Hence, SVD can be seen as a method for data reduction.

Having cleansed the seismic data, it is now feasible to implement further soft computing techniques, such as supervised and unsupervised neural networks, to ascertain the reservoir characterization. We shall discuss both approaches. The main difference between them lies in the amount of a priori information that is provided. Certain conventional statistical techniques are usually based on particular assumptions; one such assumption is the requirement for statistical independence of the input variables in regression analysis and in the majority of clustering algorithms. The PCA process mentioned earlier provides such independence in the sense that the principal components are uncorrelated.
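As an illustration of the PCA/SVD decomposition described above, the following is a minimal sketch on a synthetic trace matrix; the variance threshold and array shape are assumptions for the example, not values taken from the paper, and the low-rank reconstruction simply stands in for the "signal" component while the residual stands in for noise.

```python
import numpy as np

# Hypothetical shot record matrix: rows = traces, columns = time samples.
rng = np.random.default_rng(1)
traces = rng.normal(size=(120, 1000))

# Singular Value Decomposition of the trace matrix.
U, s, Vt = np.linalg.svd(traces, full_matrices=False)

# Rank the components by their contribution to total variance,
# analogous to ranking Principal Components of the shot record.
explained = (s ** 2) / np.sum(s ** 2)
k = np.searchsorted(np.cumsum(explained), 0.90) + 1   # components for ~90% variance (assumed cut-off)

# Reconstruct a low-rank "signal" estimate and treat the remainder as noise.
signal_estimate = U[:, :k] @ np.diag(s[:k]) @ Vt[:k, :]
noise_estimate = traces - signal_estimate

print(f"{k} components retain ~90% of the variance")
```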
Supervised Neural Network3
The advantage of a neural network is its ability to work with ambiguous and non-linear data. A representative training data set describes the problem, such as separating the seismic attributes from a pre-salt 3D data set to discriminate between gas- and water-bearing zones. "Supervised" essentially refers to the type of training inherent in the network implementation: the training set has both input and output known. Thus the internal weights of the neural network are adapted to optimize the relationship between the input values and the associated output values. In this case the training set was the seismic attributes found at well and random locations identified a priori to training the network. A training set consisting of the seismic attributes amplitude envelope (raw and instantaneous), maximum amplitude, coherency and frequency attributes (instantaneous phase or frequency) was used. The attributes were normalized to ensure equal importance was given to each of them. In theory we could have used all the seismic attributes as input to the Artificial Neural Network (ANN) and the performance should be satisfactory. However, in practice the training time required is very significant, without perceptible improvement in the performance or the tolerance or accuracy of the results. The ANN devotes excessive time learning futile correlations amongst a very large number of input variables owing to oddities in individual training scenarios. Hence it is necessary to identify those input variables that enable the ANN to focus on significant correlations. This is where PCA is traditionally
employed to determine whether any variables are highly correlated and thus should be combined. It also surfaces those combinations of variables that contain large spreads in the data on average.
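The following is a minimal, hypothetical sketch of such a supervised network, using a small multi-layer perceptron from scikit-learn in place of the paper's implementation; the attribute matrix, labels and network size are synthetic assumptions intended only to show the normalize-train-validate pattern described above.

```python
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Hypothetical training set: seismic attributes extracted at well and random
# locations (rows), with a gas/water label known from the wells.
rng = np.random.default_rng(2)
n_locations = 400
attributes = rng.normal(size=(n_locations, 5))   # e.g. amplitude envelope, max amplitude,
                                                 # coherency, instantaneous phase/frequency
labels = (attributes[:, 0] + 0.5 * attributes[:, 2]
          + rng.normal(scale=0.5, size=n_locations) > 0).astype(int)

X_train, X_test, y_train, y_test = train_test_split(attributes, labels,
                                                    test_size=0.25, random_state=0)

# Normalization gives each attribute equal importance before training the ANN.
model = make_pipeline(StandardScaler(),
                      MLPClassifier(hidden_layer_sizes=(10,), max_iter=2000, random_state=0))
model.fit(X_train, y_train)
print("Hold-out accuracy:", model.score(X_test, y_test))
```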
Unsupervised Neural Network
The unsupervised neural network approach does not predict a particular parameter but instead performs an unbiased analysis resulting in groups, or clusters, of similar data vectors such as seismic traces. A data swath representative of a horizon interval subset was selected from the 3D seismic data volume. The data were "unwrapped" to transform them into a matrix of rows and columns, and this matrix was then used as input to a Self-Organizing Map. The visual output from the unsupervised neural network is a suite of maps that represent groupings of CMPs based on their similarity.

Workflow 2: CBM or CSG Unconventional Reservoirs
In recent decades, Coal Bed Methane (CBM) has become an important source of energy in the United States, Canada and other countries. It is a form of natural gas extracted from coal beds. Australia has rich deposits, where it is known as Coal Seam Gas (CSG). The term refers to methane adsorbed into the solid matrix of the coal. It is called "sweet gas" because of its lack of hydrogen sulfide. The presence of this gas is well known from its occurrence in underground coal mining, where it presents a serious safety risk. Coal bed methane is distinct from a typical sandstone or other conventional gas reservoir, as the methane is stored within the coal by a process called adsorption. The methane is in a near-liquid state, lining the inside of pores within the coal (called the matrix). The open fractures in the coal (called the cleats) can also contain free gas or can be saturated with water. Unlike much natural gas from conventional reservoirs, coal bed methane contains very little of the heavier hydrocarbons such as propane or butane, and no natural gas condensate. It often contains up to a few percent carbon dioxide. Some coal seams, such as those in certain areas of the Illawarra Coal Measures in NSW, Australia, contain little methane, the predominant coal seam gas being carbon dioxide.

It is imperative to adopt an Intelligent Methodology (IM) approach that incorporates an enterprise business suite of workflows, integrating stakeholders, technologies and processes, in order to attain efficient, repeatable and effective analytical results that accelerate the decision-making cycles to formulate strategic and tactical field development plans. Historical data are analyzed after an EDA and cleansing process to identify hidden patterns and trends that are signatures representing events, see Figure 4.
Figure 4: Data Mining Historical Observations to Predict Future Events

Unlike a conventional gas field, CSG requires drilling groups of wells where interference between them will improve overall gas production by facilitating the more rapid removal of large volumes of water. Interference is a function of coal seam
permeability, well hydraulic fracture length and well spacing. Dewatering depends on the permeability of the coal (cleats and induced fracture) and well spacing. However, this leads to a paradox of a possibly lower initial gas production from a larger spacing, because of the greater timeframe for removing water, yet a higher ultimate recovery per well from a lower number of wells. An optimum well spacing for the most economical development of a CSG field is usually determined by simulation, where the effects on interference of permeability, fracture lengths and permeability anisotropy are considered.

Permeability is a key factor for CBM. Coal itself is a low-permeability reservoir. Almost all the permeability of a coal bed is usually considered to be due to fractures, which in coal are in the form of cleats and joints. The permeability of the coal matrix is negligible by comparison. Coal cleats are of two types, butt cleats and face cleats, see Figure 5, which occur at nearly right angles. The face cleats are continuous and provide paths of higher permeability, while butt cleats are discontinuous and end at face cleats. Joints are larger fractures through the coal that may cross lithological boundaries. Hence, on a small scale, fluid flow through coal bed methane reservoirs usually follows rectangular paths. The ratio of permeability in the face cleat direction over the butt cleat direction may range from 1:1 to 17:1. Because of this anisotropic permeability, drainage areas around coal bed methane wells are often elliptical in shape.
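A minimal numeric illustration of that anisotropy effect is sketched below; it assumes the common approximation that the drainage ellipse elongates along the face-cleat direction with an axis ratio equal to the square root of the permeability ratio. This rule of thumb is not stated in the paper and is included only to make the elliptical-drainage remark tangible.

```python
import math

# Illustrative calculation (not from the paper): for anisotropic permeability,
# a standard approximation elongates the drainage ellipse along the face-cleat
# direction with an axis ratio of sqrt(k_face / k_butt).
for k_ratio in (1, 4, 17):                    # face-cleat : butt-cleat permeability ratio
    aspect = math.sqrt(k_ratio)               # major-to-minor axis ratio of the drainage ellipse
    print(f"k_face/k_butt = {k_ratio:>2}:1  ->  drainage ellipse aspect ratio ~ {aspect:.1f}:1")
```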
Figure 5: CSG and Conventional Reservoirs

To anticipate reservoir damage and production downtime through pre-conditioned monitoring of data across all the upstream disciplines, it is essential to ensure production attains predefined targets and to sustain asset performance in the context of a global economy. Data integration, risk assessment and quantification of uncertainty are key issues, and as the problems become more complex and the cost associated with poor predictions (dry holes) increases, the need for proper integration of disciplines, data fusion, risk reduction and uncertainty management becomes very important. Soft computing methods offer an excellent opportunity to address all these issues, particularly integrating information from various sources with varying degrees of uncertainty, establishing relationships between measurements and reservoir properties, and assigning risk factors or error bars to predictions.
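One simple way to attach such error bars, sketched below on assumed synthetic data, is to fit quantile regressors alongside a point-estimate model so that every prediction of a reservoir property carries a lower and upper bound; the feature set, target property and 80% interval are illustrative choices, not the paper's.

```python
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

# Hypothetical data: a few log/seismic measurements (X) and a reservoir property (y).
rng = np.random.default_rng(3)
X = rng.normal(size=(300, 4))
y = 2.0 * X[:, 0] - X[:, 1] + rng.normal(scale=0.5, size=300)

# Quantile regression via gradient boosting: lower/upper bounds act as error bars.
lower = GradientBoostingRegressor(loss="quantile", alpha=0.10, random_state=0).fit(X, y)
upper = GradientBoostingRegressor(loss="quantile", alpha=0.90, random_state=0).fit(X, y)
point = GradientBoostingRegressor(loss="squared_error", random_state=0).fit(X, y)

x_new = X[:1]
print("Prediction:", point.predict(x_new)[0],
      "80% interval:", (lower.predict(x_new)[0], upper.predict(x_new)[0]))
```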
The individual, autocratic approach of domain silos, each operating with its own different and distinctive solutions, must be removed. Consistency of information across the upstream organization leads to collaboration and the ability to make timely and strategic decisions. All GGRE data must be integrated to provide as much information as possible across the upstream geosciences and to ensure accurate models, as real-time data are correlated with historical data prior to building static and dynamic models. Analysis of disparate data enables targets to be attained and process changes to be made in a timely manner, ensuring more flexible and alternative plans.
Figure 6: Well Completion Design Evolution in CSG Plays
The ability to maximize recovery from existing wells and minimize deferred production helps mitigate the greatest cost uncertainty in CSG exploitation: the total number of wells required to economically exploit the hydrocarbons, Figure 6. During the CSG to LNG (Liquefied Natural Gas) process, low initial rates require an immediate focus on OPEX reduction, whereas for conventional gas the production focus is on OPEX reduction towards the end of a field's life-cycle. When developing advanced analytical workflows for CSG/CBM it is important to work within any time or data constraints, as the data may be limited in scope and availability on account of the highly heterogeneous nature of the coal seams and the associated variability in well rates and spud times. It is essential to automate a data management workflow that filters, transforms, imputes and normalizes the data (a minimal sketch of such a step follows the list below). This ensures the optimum available data for the analytical steps and thus a more accurate quantification of uncertainty and mitigation strategies for the risk associated with OPEX limitations. Among the data points to collate:
- Well performance (rate and pressure)
- Initial rate of decline (Di): Decline Curve Analysis (DCA)
- Rate of decline (qg): DCA
- Arps' coefficient that reflects the performance trend (b): DCA
- Bottom Hole Flowing Pressure (BHFP) and material balances for EUR
- Rate Transient Analysis (RTA) and/or Pressure Transient Analysis (PTA) for permeability and skin
- Determine well profiles for best/worst performers based on data points:
  o Skin
  o Permeability
  o BHFP
  o Water production (Sw)
  o Near-wellbore relative permeability impacts
- Are there well integrity issues?
  o Sanding
  o Scaling/paraffin
  o Drilling system parameters
  o Drilling operational variables
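The following is a minimal, hypothetical sketch of the automated filter/impute/normalize step referred to above; the column names and values are invented for illustration and merely stand in for the collated data points listed here.

```python
import numpy as np
import pandas as pd

# Hypothetical well-level data mart with gaps; column names are illustrative.
wells = pd.DataFrame({
    "Di":   [0.85, 0.72, np.nan, 0.91],    # initial decline rate from DCA
    "b":    [0.9, np.nan, 1.1, 0.8],       # Arps' b coefficient
    "BHFP": [350.0, 410.0, 290.0, np.nan], # bottom-hole flowing pressure, psi
    "perm": [2.1, 0.4, 1.3, 0.9],          # permeability from RTA/PTA, md
})

# Filter obviously erroneous values, impute gaps, then normalize each column.
wells = wells[wells["perm"] > 0]
imputed = wells.fillna(wells.median(numeric_only=True))
normalized = (imputed - imputed.min()) / (imputed.max() - imputed.min())

print(normalized.round(2))
```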
It is proposed that the optimum methodology for CSG or CBM unconventional plays integrates, cleanses and aggregates the data into a data mart primed for a particular analytical suite of processes, tailored to identify those parameters that have the most statistical impact on the objective function, see Figure 7a and Figure 7b.
Figure 7a: Analytical Workflows
Figure 7b: Analytical Workflows

The objective function may be expressed as recovery factor, plateau duration or production optimization. There are many production inhibitors, such as skin damage and sanding, that can be predicted by generating models informed by exploratory data analysis. Aggregating and integrating data sets from across the geo-scientific silos to produce a robust data set tailored for specific analytical studies is the foundation for all such studies. Analytical workflows can be implemented to attain the following goals (a sketch of goal d follows the list):
a. Establish variables that are key production indicators
b. Identify critical parameters and their range of values
c. Automate normalization and remediation of all data for missing and erroneous values
d. Identify the objective function, i.e. a target variable such as recovery factor, liquid carry-over or cumulative non-zero production over a certain period, and determine sensitivity studies to identify key drivers
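The sketch below is one hedged illustration of that sensitivity/key-driver step, using a random-forest feature ranking on wholly synthetic completion and reservoir parameters; the variable names and the choice of random forest are assumptions for the example rather than the paper's prescribed technique.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import RandomForestRegressor

# Hypothetical aggregated data set: completion/reservoir parameters vs. an
# objective function such as cumulative gas over a fixed period.
rng = np.random.default_rng(4)
data = pd.DataFrame({
    "frac_length":  rng.uniform(50, 400, 500),
    "well_spacing": rng.uniform(300, 1500, 500),
    "perm":         rng.lognormal(0.0, 1.0, 500),
    "net_coal":     rng.uniform(2, 30, 500),
})
target = (0.01 * data["frac_length"] + 2.0 * np.log(data["perm"] + 1)
          + 0.05 * data["net_coal"] + rng.normal(scale=0.5, size=500))

model = RandomForestRegressor(n_estimators=200, random_state=0).fit(data, target)

# Rank candidate key performance drivers by their relative importance.
ranking = pd.Series(model.feature_importances_, index=data.columns).sort_values(ascending=False)
print(ranking.round(3))
```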
Such workflows, Figure 8, can identify key performance drivers and offer strategies and tactics for well completion methods and optimized hydraulic fracture treatment designs. A probabilistic approach helps quantify uncertainty and assess risk for individual field development plans. Important results from production performance studies adopting the aforementioned workflows include an automated methodology to characterize impairment, classify wells as good or bad candidates for well stimulation, predict performance outcomes of particular operational parameters and increase production with faster decision cycles.
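As a hedged illustration of the probabilistic approach, the short Monte Carlo sketch below propagates assumed per-well EUR and well-count distributions into field-level P90/P50/P10 figures; the distributions and numbers are invented for the example and bear no relation to any actual field.

```python
import numpy as np

# Illustrative Monte Carlo sketch (not from the paper): propagate uncertainty in
# per-well EUR and well count into a field-level distribution, then report
# P90/P50/P10 percentiles as is common in probabilistic field development planning.
rng = np.random.default_rng(5)
n_trials = 10_000

eur_per_well = rng.lognormal(mean=np.log(1.5), sigma=0.4, size=n_trials)  # Bcf per well, assumed
well_count = rng.integers(400, 601, size=n_trials)                         # assumed range of wells
field_eur = eur_per_well * well_count

# P90 is the value exceeded with 90% probability, i.e. the 10th percentile.
p90, p50, p10 = np.percentile(field_eur, [10, 50, 90])
print(f"Field EUR (Bcf): P90={p90:.0f}, P50={p50:.0f}, P10={p10:.0f}")
```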
Figure 8: Well Performance Methodology

Analytical workflows can incorporate a decline curve analysis step, implementing a web-based OPF (Oilfield Production Forecasting) solution to identify short- and long-term forecasts for oil, gas and water production. Implementing mature SAS forecasting models and first principles such as Arps' empirical algorithms (a minimal sketch of the hyperbolic form follows the list below), you can accurately estimate well performance and EUR (Estimated Ultimate Recovery) and measure the impact, positive or negative, of well remediation techniques. Comparing real-time production data rates and type curves against forecasted trends, you can:
- Quickly and efficiently identify those wells that require remediation
- Segment the field via well profile clustering
- Ratify from a field, reservoir or well perspective whether current production falls within the confidence intervals of expectation, and act accordingly
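The following sketch evaluates Arps' hyperbolic decline for an assumed qi, Di and b and reports a rate forecast and a 20-year cumulative as an EUR proxy; all inputs are illustrative placeholders, not field data, and the closed-form cumulative applies only for 0 < b < 1.

```python
import numpy as np

# Minimal sketch of Arps' hyperbolic decline (illustrative inputs, not field data):
#   q(t)  = qi * (1 + b*Di*t)**(-1/b)
#   Np(t) = qi / ((1 - b) * Di) * (1 - (1 + b*Di*t)**(1 - 1/b))   for 0 < b < 1
qi = 3000.0    # initial rate, Mscf/day (assumed)
Di = 0.0015    # initial decline rate, 1/day (assumed)
b = 0.8        # Arps' coefficient reflecting the performance trend (assumed)

t = np.arange(0.0, 20 * 365.0, 30.0)                 # 20-year forecast, ~monthly steps
q = qi * (1.0 + b * Di * t) ** (-1.0 / b)            # rate forecast
Np = qi / ((1.0 - b) * Di) * (1.0 - (1.0 + b * Di * t) ** (1.0 - 1.0 / b))

print(f"Rate after 5 years: {q[t >= 5 * 365][0]:.0f} Mscf/day")
print(f"20-year cumulative (EUR proxy): {Np[-1] / 1e6:.2f} Bscf")
```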
Consistent production reporting with Key Performance Indicators (KPIs), visualized in a web-based suite of solutions, renders clear and insightful information across a collaborative team. An enormous volume of data and decisions is generated in every study, with an associated wealth and range of knowledge (technical and business decisions) including study uncertainties, alternative development decisions and their associated (combined and individual) impact on the study target objective function. Yet the ability to easily access, review, interrogate and visualize the accumulated range of study outcomes in a fast and responsive manner must be put into place in order to take advantage of raw data transformed into knowledge. Data accumulated from Case Based Reasoning analogies and predictive models over time build a database that can be refined and tied to events, enabling a clear and effective range of solutions to issues that are reported in real time. A comprehensive business intelligence stack thus proffers a suite of reporting solutions based on dashboards and portals.
Figure 9: Data Driven Models versus Theoretical Models

It is important to adopt a data-driven methodology, Figure 9, to ascertain both descriptive and predictive models. You can then marry such models with the knowledge garnered by studying first principles, and thus attain a joint deterministic and stochastic evaluation of all available raw data, translating it into knowledge in a timely manner that shortens the decision cycles and relieves the geoscientists of devoting precious time to fundamental but very important tasks such as data management and quantification of uncertainty in high-dimensional data sets. CSG is fundamentally different to conventional gas from both economic and technical perspectives, as CSG CAPEX can be as much as fifty percent greater than CSG to LNG life-cycle CAPEX. Invariably thousands of wells are drilled over the field's lifetime, and significant facilities and infrastructure are required. Development of such resources is usually based on statistical averages of well performance derived from pilot projects; thus the issue is not so much ascertaining the Original Gas In Place (OGIP) as determining the number of wells required and identifying the economics behind water extraction and disposal. Such business issues can be addressed by implementing advanced analytical methodologies.

Moving Domain Analysis (MDA) Workflow
Moving Domain Analysis4 enables a much broader, more statistical view of production data. It uses available production and completion data to identify areas of interference between existing wells and to quantify the impact of completion and stimulation practices on well performance and drainage area. It may be used to quantify infill drilling opportunities over large areas, to evaluate completion practices and stimulation treatment sizes, and to locate areas for more in-depth, detailed engineering studies. MDA also provides an unbiased and consistent analysis across a study area.

Database Construction and Analysis Methodology
The methodology employed is characterized into three areas:
1. Moving Domain Analysis (MDA)
2. Reservoir analysis
3. Quantifying infill reserves and spotting infill wells
The analysis approach implemented MDA to blend original gas-in-place estimates with drainage area calculations. MDA is a mosaic of localized performance studies that blends analogy, statistics and conventional engineering to identify infill locations. Three types of information are utilized: (1) the magnitude of production performance, (2) the geographic location of that performance, and (3) the date when this performance was observed. A basic principle is that infill expectation is based on previous performance around the infill location in both time and geographic position, while considering the amount of undrained acreage available. MDA expertise makes use of production indicators to measure well quality, estimate long-term recovery, and estimate gas volumes that may be in communication with other wellbores. A production indicator is a means to estimate long-term production from short-term data. Since at least five years of production history was available for each well, a five-year cumulative gas volume was used as the main production indicator. In many cases, a "Best-Year" can be used as a production indicator; this is the summation of the highest 12 consecutive months of production divided by 12 (Mscf/month). However, the five-year cumulative volume is a better short-term indicator of long-term performance (20-year EUR) due to its longer time span.
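A minimal sketch of those two production indicators on a synthetic monthly gas series is shown below; the decline shape and volumes are invented purely to demonstrate the rolling 12-month "Best-Year" and five-year cumulative calculations.

```python
import numpy as np
import pandas as pd

# Illustrative "Best-Year" production indicator (monthly volumes are synthetic):
# the highest 12 consecutive months of gas production divided by 12 (Mscf/month).
rng = np.random.default_rng(6)
months = pd.period_range("2005-01", periods=72, freq="M")
gas = pd.Series(3000 * np.exp(-0.02 * np.arange(72)) * rng.uniform(0.8, 1.2, 72), index=months)

best_year = gas.rolling(window=12).sum().max() / 12.0          # Best-Year, Mscf/month
five_year_cum = gas.iloc[:60].sum()                            # five-year cumulative, Mscf

print(f"Best-Year indicator: {best_year:.0f} Mscf/month")
print(f"Five-year cumulative: {five_year_cum:.0f} Mscf")
```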
Conclusions

Figure 10: Business Analytics in CSG or CBM Underpins Advantage

It has been shown that, by adopting a comprehensive suite of business analytics, Figure 10, incorporating workflows around data management and descriptive and predictive models, the exploitation of resources in both deep-water pre-salt plays and unconventional Coal Bed Methane or Coal Seam Gas plays is enhanced to attain business value and Return on Investment (ROI), Figure 11.
In the case of seismic processing, suppressing noise and garnering appropriate attributes, there are obvious benefits inherent in an advanced analytical approach:
- Seismic data acquisition
  o Cleaner shot records – 2D and 3D
  o Predict coherent noise trains (ground roll)
  o Cleanse random noise
  o Retain cleansed signal and noise records
- Amplitude and frequency attributes mapped to reservoir properties
- Coherent and random noise attenuation at time of seismic data acquisition
- Real-time interpretation of major signals
- Ensure effective data acquisition parameters
The case detailing CBM/CSG establishes production and drilling methodologies that optimize well performance and studies the data to identify optimum locations for drilling and exploiting such unconventional resources. Expectations from "Well Engineering Optimization" workflows:
- Accelerated decision-making capability
- Identify/separate well production from reservoir conformance issues
- Reductions in OPEX, e.g. field staffing
- Increased production rates from the existing well portfolio
- Reduced deferred production due to unplanned well interventions
- Increased reserves from the existing well portfolio
Figure 11: Business Drivers for Analytical Methodology
Acknowledgements
The author wishes to thank SAS Institute Inc., Cary, NC, USA for kind permission to publish this paper.

Glossary
ANN     Artificial Neural Network
AVO     Amplitude versus Offset
BHFP    Bottom Hole Flowing Pressure
CAPEX   Capital Expenditure
CBM     Coal Bed Methane
CMP     Common Mid-Point
CSG     Coal Seam Gas
DCA     Decline Curve Analysis
Di      Initial rate of decline
EDA     Exploratory Data Analysis
EUR     Estimated Ultimate Recovery
F-K     Frequency-Wavenumber Domain
GGRE    Geology, Geophysics and Reservoir Engineering
IM      Intelligent Methodology
KPI     Key Performance Indicator
LNG     Liquefied Natural Gas
MDA     Moving Domain Analysis
OPEX    Operational Expenditure
OPF     Oilfield Production Forecasting
PCA     Principal Components Analysis
PTA     Pressure Transient Analysis
qg      Rate of gas production
ROI     Return on Investment
RTA     Rate Transient Analysis
SI      Seismic Interference
SOM     Self-Organizing Map
SVD     Singular Value Decomposition
Sw      Water saturation
References
1. Haines, S. and Guitton, A.: "Coherent Noise Suppression in Electroseismic Data with Non-Stationary Prediction-Error Filters", Stanford Exploration Project, Report 113, July 8, 2003.
2. Cary, P. and Zhang, C.: "Ground Roll Attenuation via SVD and Adaptive Subtraction", 2009 CSPG CSEG CWLS Convention, Calgary, Alberta, Canada.
3. Aminzadeh, F. and de Groot, P.: "Neural Networks and Other Soft Computing Techniques with Applications in the Oil Industry".
4. Martin, J.P.: "Moving Domain Analysis to Identify Infill Well Potential".