The nature and extent of the SCRF affiliates program allows for paradigm-changing research in the field of geostatistics and numerical reservoir modeling. We are not bound by the short-term deadlines and limited scope of project-based research. Below are some current research themes, each pursued by an individual student or a group of researchers, often in collaboration with affiliate member companies. The list provides an overview of ongoing research. Reports on this research are available to affiliate members; reports more than four years old are available to the general public (see the Resources tab).
Geological characterization of naturally fractured reservoirs is potentially associated with large uncertainty. In practice, however, the geological modeling of discrete fracture networks (DFN) is considerably disconnected from uncertainty modeling based on conventional flow simulators. Our research spans fracture modeling from geology to flow: we develop methodologies for turning DFN models into usable training images for actual reservoir modeling and for quantifying flow uncertainty.
Uncertainty in the geological structure significantly influences the overall uncertainty in a reservoir, yet structural uncertainty is still not widely incorporated in actual reservoir forecasting. Integrating all sources of structural uncertainty effectively requires generating many structural reservoir models. To address this, we focus on only those sources of structural uncertainty that significantly influence reservoir response and decision making. The resulting computational gains bring us closer to our goal of better utilizing knowledge of structural uncertainty when making reservoir decisions.
Model complexity has been an implicit problem in geomodeling, reservoir simulation, uncertainty quantification and history matching workflows. A typical example is modeling a reservoir with hundreds of faults and tens of facies on billions of grid cells, running flow simulations, and only then finding that most of the components included in the models are not significant for the modeling purpose. We use several static and dynamic proxies to understand the impact of targeted complexities on fluid-flow uncertainty without creating structural grids or running flow simulations. This information then feeds into workflows that determine the structural and geological complexity actually needed for a given modeling purpose.
Pattern-based algorithms for generating reservoir models from training images are fast and versatile alternatives to pixel-based algorithms. Our research focuses on implementing new algorithms that draw on the latest computer science research in texture modeling and synthesis while steering it towards geostatistical applications involving conditioning. We rely on techniques such as image quilting to simulate very large multi-million-cell models in a matter of seconds. This research is in collaboration with Gregoire Mariethoz of the University of Lausanne.
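The core idea of image quilting can be illustrated with a minimal sketch: the output grid is tiled with overlapping patches copied from the training image, and at each step a candidate patch is chosen whose overlap region best matches what has already been placed. This is only a toy unconditional version (patch size, overlap width and the tolerance factor are illustrative choices, and the published algorithm additionally cuts an optimal seam through the overlap and handles conditioning data):

```python
import numpy as np

def quilt(train, out_size, patch=16, overlap=4, rng=None):
    """Toy image-quilting sketch: tile the output with training-image
    patches, picking candidates whose overlap with already-placed
    pixels has near-minimal squared error."""
    rng = np.random.default_rng(rng)
    step = patch - overlap
    H, W = out_size
    out = np.zeros((H, W))
    th, tw = train.shape
    # pre-extract every candidate patch from the training image
    cands = np.array([train[i:i + patch, j:j + patch]
                      for i in range(th - patch + 1)
                      for j in range(tw - patch + 1)])
    for i in range(0, H - patch + 1, step):
        for j in range(0, W - patch + 1, step):
            if i == 0 and j == 0:
                out[:patch, :patch] = cands[rng.integers(len(cands))]
                continue
            err = np.zeros(len(cands))
            if i > 0:   # mismatch against the patch above
                err += ((cands[:, :overlap, :] -
                         out[i:i + overlap, j:j + patch]) ** 2).sum(axis=(1, 2))
            if j > 0:   # mismatch against the patch to the left
                err += ((cands[:, :, :overlap] -
                         out[i:i + patch, j:j + overlap]) ** 2).sum(axis=(1, 2))
            # sample among near-best candidates to keep stochasticity
            best = rng.choice(np.flatnonzero(err <= err.min() * 1.1 + 1e-9))
            out[i:i + patch, j:j + patch] = cands[best]
    return out
```

Because whole patches are copied rather than simulated pixel by pixel, large grids are filled in a few passes, which is the source of the speed advantage mentioned above.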
Traditional seismic inversion approaches have focused on reducing the mismatch between data and model within a single fixed geological scenario. The problem with this approach is that uncertainty related to geological interpretation is either ignored or the inversion must be repeated for each interpreted scenario. This research proposes to first assess the consistency of postulated geological scenarios with the field seismic by defining a pattern similarity between the seismic data and forward-simulated data. Low-probability scenarios are rejected, and the remaining scenarios are used in iterative inversion to generate realistic geological models.
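The rejection step can be sketched as a distance-based outlier test: for each scenario, compare the observed data's distance to that scenario's forward-simulated responses against the spread of distances within the ensemble itself. This is a generic sketch, not the project's actual similarity measure; the Euclidean distance, the leave-one-out null distribution and the `alpha` threshold are all illustrative assumptions standing in for a pattern-similarity metric:

```python
import numpy as np

def falsify_scenarios(d_obs, forward_sims, alpha=0.05):
    """Sketch of scenario falsification: keep a scenario only if the
    observed data sits no farther from its forward-simulated responses
    than those responses typically sit from each other."""
    keep = {}
    for name, sims in forward_sims.items():
        sims = np.asarray(sims)                        # (n_real, n_samples)
        d_to_obs = np.linalg.norm(sims - d_obs, axis=1).min()
        # null distribution: leave-one-out nearest-neighbour distances
        dists = np.linalg.norm(sims[:, None] - sims[None, :], axis=2)
        np.fill_diagonal(dists, np.inf)
        null = dists.min(axis=1)
        # keep if obs distance is within the (1 - alpha) quantile of null
        keep[name] = bool(d_to_obs <= np.quantile(null, 1 - alpha))
    return keep
```

Scenarios that fail the test are dropped before any expensive iterative inversion, so inversion effort is only spent on geologically plausible interpretations.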
Sensitivity analysis is an essential component of uncertainty quantification and decision making. Proper evaluation of parameter sensitivities requires running flow simulations on a large number of scenarios, which is computationally expensive. It is therefore attractive to employ much faster proxy flow simulations, which come at the price of reduced accuracy. Our research focuses on integrating proxies, with all their imperfections, into the recently developed distance-based generalized sensitivity analysis workflow. One of the main challenges is accurately estimating the classification error introduced by proxy flow responses and its overall impact on uncertainty quantification and decision making.
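The workflow's core mechanism can be sketched as follows: models are clustered by their (possibly proxy) flow responses, and a parameter is deemed sensitive when its distribution within a cluster departs strongly from its distribution over all models. This is a simplified stand-in, assuming plain k-means and an L1 distance between quantile functions, where the full workflow uses distance-based clustering of flow responses and resampling-based significance tests:

```python
import numpy as np

def dgsa_sensitivity(params, responses, n_clusters=3, rng=0):
    """Sketch of distance-based generalized sensitivity analysis:
    cluster models by response, then score each parameter by how far
    its per-cluster CDF departs from its CDF over all models."""
    rng = np.random.default_rng(rng)
    n, p = params.shape
    # crude k-means on the response vectors (stand-in for clustering
    # in a response distance space)
    centers = responses[rng.choice(n, n_clusters, replace=False)]
    for _ in range(20):
        labels = np.argmin(((responses[:, None] - centers[None]) ** 2)
                           .sum(axis=2), axis=1)
        centers = np.array([responses[labels == k].mean(axis=0)
                            if (labels == k).any() else centers[k]
                            for k in range(n_clusters)])
    sens = np.zeros(p)
    grid = np.linspace(0.0, 1.0, 50)
    for j in range(p):
        x = params[:, j]
        full_cdf = np.quantile(x, grid)
        d = [np.abs(np.quantile(x[labels == k], grid) - full_cdf).mean()
             for k in range(n_clusters) if (labels == k).sum() > 1]
        sens[j] = max(d) if d else 0.0
    return sens
```

Replacing the flow responses with proxy responses changes only the clustering step, which is exactly where the classification error discussed above enters the analysis.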
Surface-based models maximize the use of depositional rules in reservoir modeling. One challenge in surface-based modeling is the limited understanding of the underlying geological processes. Geomorphic experiments are a useful tool for improving that understanding because comprehensive measurements are readily accessible. In collaboration with Prof. Chris Paola and his Sedimentology Group at the National Center for Earth Surface Dynamics, we apply data mining and stochastic methods to quantify, compare and mimic the stacking patterns of sand bodies, linking geomorphic experiments to real reservoirs.
Reservoir models optimally constrained to both seismic and flow responses can provide a better description of the reservoir and thus a more reliable forecast. However, joint inversion of time-lapse seismic and production data is complex and challenging, with uncertainties at each step of the process. An ongoing study focuses on the Norne field, located in the southern part of the Nordland II area in the Norwegian Sea. Another project addresses time-lapse monitoring through joint integration of electromagnetic and seismic data. The usefulness of time-lapse seismic and EM data will be evaluated for various reservoir settings (rock types and fluid contents) and recovery methods (water injection, gas injection, steam injection, etc.).
This interdisciplinary project conducts a systematic study to quantify the uncertainties in basin and petroleum system modeling (BPSM) using basin models constructed from a real dataset collected in the Piceance Basin, Colorado. Generalized Sensitivity Analysis (GSA) was first conducted to identify the most sensitive parameters among the various model inputs. A stochastic modeling approach will then be introduced to establish a robust uncertainty quantification workflow for BPSM. This study will not only enhance our understanding of the parameters most critical to the model outcome but also help optimize data gathering in future exploration activities.
Prediction-Focused Analysis & Modeling: direct forecasting with production data without history matching
Inverse modeling is traditionally re-applied whenever new observed data become available. We investigate several methods, such as non-linear principal component analysis, functional data analysis and canonical correlation analysis, to forecast directly from the data without explicit, time-consuming inversion of models.
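The direct-forecasting idea can be sketched in its simplest linear form: prior model realizations provide paired samples of simulated data and forecast variables; both are reduced to a few components, a statistical relationship is fit between the two reduced spaces, and the observed data are pushed through that relationship to obtain a forecast with no model inversion. The sketch below assumes PCA (via SVD) plus least-squares regression as the linear stand-in for the CCA and functional-data machinery mentioned above:

```python
import numpy as np

def direct_forecast(D, H, d_obs, n_comp=3):
    """Linear sketch of prediction-focused analysis: D (n, nd) holds
    simulated data and H (n, nh) the forecast variable for n prior
    realizations; regress reduced forecasts on reduced data, then
    evaluate the fit at the observed data d_obs."""
    Dm, Hm = D.mean(axis=0), H.mean(axis=0)
    # principal directions of the data and forecast ensembles
    Vd = np.linalg.svd(D - Dm, full_matrices=False)[2]
    Vh = np.linalg.svd(H - Hm, full_matrices=False)[2]
    d_scores = (D - Dm) @ Vd[:n_comp].T
    h_scores = (H - Hm) @ Vh[:n_comp].T
    # least-squares map from data scores to forecast scores
    B = np.linalg.lstsq(d_scores, h_scores, rcond=None)[0]
    h_hat = ((d_obs - Dm) @ Vd[:n_comp].T) @ B
    return Hm + h_hat @ Vh[:n_comp]   # back to the forecast space
```

The only expensive step is generating the prior ensemble of forward simulations; once that exists, forecasting for any new observation is a handful of matrix products, which is what makes the approach attractive compared with repeated history matching.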