Results 1 - 12 of 12
1.
Sci Rep ; 14(1): 13589, 2024 Jun 12.
Article in English | MEDLINE | ID: mdl-38866943

ABSTRACT

The calibration of reservoir models using production data can enhance the reliability of predictions. However, history matching often leads to only a few matched models, and the original geological interpretation is not always preserved. Therefore, there is a need for stochastic methodologies for history matching. The Ensemble Kalman Filter (EnKF) is a well-known Monte Carlo method that updates reservoir models in real time. When new production data becomes available, the ensemble of models is updated accordingly. The initial ensemble is created using the prior model, and the posterior probability function is sampled through a series of updates. In this study, EnKF was employed to evaluate the uncertainty of production forecasts for a specific development plan and to match historical data to a real field reservoir model. This study represents the first attempt to combine EnKF with an integrated model that includes a genuine oil reservoir, actual production wells, a surface choke, a surface pipeline, a separator, and a PID pressure controller. The research optimized a real integrated production system, considering the constraint that there should be no slug flow at the inlet of the separator. The objective function was to maximize the net present value (NPV). Geological data was used to model uncertainty using Sequential Gaussian Simulation. Porosity scenarios were generated, and conditioning the porosity to well data yielded improved results. Ensembles were employed to balance accuracy and efficiency, demonstrating a reduction in porosity uncertainty due to production data. This study revealed that utilizing a PID pressure controller for the production separator can enhance oil production by 59% over 20 years, resulting in the generation of 2.97 million barrels of surplus oil in the field and significant economic gains.
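
The EnKF analysis step at the heart of this study can be sketched in a few lines of numpy. The following is a minimal illustration rather than the paper's implementation; the state size, well placement, and noise levels are invented for the example:

```python
import numpy as np

def enkf_update(X, d_obs, H, R, rng):
    """One EnKF analysis step.

    X     : (n_state, n_ens) ensemble of state vectors (e.g. gridded porosity)
    d_obs : (n_obs,) observed production data
    H     : (n_obs, n_state) linear observation operator
    R     : (n_obs, n_obs) observation-error covariance
    """
    n_ens = X.shape[1]
    A = X - X.mean(axis=1, keepdims=True)           # ensemble anomalies
    Cf = A @ A.T / (n_ens - 1)                      # forecast covariance
    K = Cf @ H.T @ np.linalg.inv(H @ Cf @ H.T + R)  # Kalman gain
    # Perturb observations so the analysis ensemble keeps the right spread
    D = d_obs[:, None] + rng.multivariate_normal(np.zeros(len(d_obs)), R, n_ens).T
    return X + K @ (D - H @ X)

rng = np.random.default_rng(0)
X = rng.normal(0.2, 0.05, size=(500, 100))          # 100 porosity realizations
H = np.zeros((3, 500))
H[[0, 1, 2], [10, 250, 480]] = 1.0                  # three observed "well" cells
X_a = enkf_update(X, np.array([0.25, 0.18, 0.22]), H, 0.01**2 * np.eye(3), rng)
```

Each new batch of production data triggers another such update, which is how the method assimilates history in real time.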

2.
Philos Trans A Math Phys Eng Sci ; 380(2233): 20220039, 2022 Oct 03.
Article in English | MEDLINE | ID: mdl-35965471

ABSTRACT

We analyze JUNE: a detailed model of COVID-19 transmission with high spatial and demographic resolution, developed as part of the RAMP initiative. JUNE requires substantial computational resources to evaluate, making model calibration and general uncertainty analysis extremely challenging. We describe and employ the uncertainty quantification approaches of Bayes linear emulation and history matching to mimic JUNE and to perform a global parameter search, hence identifying regions of parameter space that produce acceptable matches to observed data, and demonstrating the capability of such methods. This article is part of the theme issue 'Technical challenges of modelling real-life epidemics and examples of overcoming these'.


Subject(s)
COVID-19, Bayes Theorem, Humans, Uncertainty
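
Bayes linear emulation, as used here to mimic JUNE, rests on the adjusted expectation and variance formulas. A minimal sketch of that adjustment, with illustrative priors rather than anything taken from the JUNE analysis:

```python
import numpy as np

def bayes_linear_adjust(E_f, V_f, C_fD, E_D, V_D, D):
    """Bayes linear adjustment of a scalar quantity f by data D.

    E_f, V_f : prior expectation and variance of f
    C_fD     : prior covariance Cov[f, D], shape (n,)
    E_D, V_D : prior expectation (n,) and variance matrix (n, n) of D
    D        : observed data (n,)
    """
    w = np.linalg.solve(V_D, D - E_D)
    E_adj = E_f + C_fD @ w                           # adjusted expectation
    V_adj = V_f - C_fD @ np.linalg.solve(V_D, C_fD)  # adjusted variance
    return E_adj, V_adj

E_adj, V_adj = bayes_linear_adjust(
    E_f=1.0, V_f=0.5,
    C_fD=np.array([0.3, 0.1]),
    E_D=np.array([1.2, 0.8]),
    V_D=np.array([[0.4, 0.05], [0.05, 0.2]]),
    D=np.array([1.5, 0.7]),
)
```

The adjusted mean and variance then feed the implausibility calculation that drives the global parameter search.
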
3.
Epidemics ; 39: 100574, 2022 06.
Article in English | MEDLINE | ID: mdl-35617882

ABSTRACT

Uncertainty quantification is a formal paradigm of statistical estimation that aims to account for all uncertainties inherent in the modelling process of real-world complex systems. The methods are directly applicable to stochastic models in epidemiology; however, they have thus far not been widely used in this context. In this paper, we provide a tutorial on uncertainty quantification of stochastic epidemic models, aiming to facilitate the use of the uncertainty quantification paradigm for practitioners with other complex stochastic simulators of applied systems. We provide a formal workflow including the important decisions and considerations that need to be taken, and illustrate the methods over a simple stochastic epidemic model of UK SARS-CoV-2 transmission and patient outcome. We also present new approaches to visualisation of outputs from sensitivity analyses and uncertainty quantification more generally in high input and/or output dimensions.


Subject(s)
COVID-19, Epidemics, COVID-19/epidemiology, Calibration, Humans, SARS-CoV-2, Uncertainty
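
The tutorial's running example is a simple stochastic epidemic model. A minimal Gillespie-style stochastic SIR simulator of the same general kind (the parameter values are illustrative, not the paper's calibrated UK SARS-CoV-2 values):

```python
import numpy as np

def stochastic_sir(beta, gamma, N=1000, I0=5, t_end=100.0, seed=1):
    """Exact (Gillespie) simulation of a stochastic SIR epidemic."""
    rng = np.random.default_rng(seed)
    S, I, R, t = N - I0, I0, 0, 0.0
    times, infected = [t], [I]
    while I > 0 and t < t_end:
        rate_inf = beta * S * I / N        # S -> I events
        rate_rec = gamma * I               # I -> R events
        total = rate_inf + rate_rec
        t += rng.exponential(1.0 / total)  # time to next event
        if rng.random() < rate_inf / total:
            S, I = S - 1, I + 1
        else:
            I, R = I - 1, R + 1
        times.append(t)
        infected.append(I)
    return np.array(times), np.array(infected)

t, I = stochastic_sir(beta=0.3, gamma=0.1)
```
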
4.
Biometrics ; 78(3): 1195-1208, 2022 09.
Article in English | MEDLINE | ID: mdl-33837525

ABSTRACT

The presence of protein aggregates in cells is a known feature of many human age-related diseases, such as Huntington's disease. Simulations using fixed parameter values in a model of the dynamic evolution of expanded polyglutamine (PolyQ) proteins in cells have been used to gain a better understanding of the biological system. However, there is considerable uncertainty about the values of some of the parameters governing the system. Currently, appropriate values are chosen by ad hoc attempts to tune the parameters so that the model output matches experimental data. The problem is further complicated by the fact that the data only offer a partial insight into the underlying biological process: the data consist only of the proportions of cell death and of cells with inclusion bodies at a few time points, corrupted by measurement error. Developing inference procedures to estimate the model parameters in this scenario is a significant task. The model probabilities corresponding to the observed proportions cannot be evaluated exactly, and so they are estimated within the inference algorithm by repeatedly simulating realizations from the model. In general such an approach is computationally very expensive, and we therefore construct Gaussian process emulators for the key quantities and reformulate our algorithm around these fast stochastic approximations. We conclude by highlighting appropriate values of the model parameters leading to new insights into the underlying biological processes.


Subject(s)
Algorithms, Protein Aggregates, Bayes Theorem, Humans, Kinetics, Markov Chains, Monte Carlo Method, Peptides, Stochastic Processes
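
The computational idea of the paper, replacing repeated expensive stochastic simulation with fast Gaussian process approximations of the key quantities, can be sketched with scikit-learn; the toy "simulator" below merely stands in for the PolyQ model:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, WhiteKernel

def expensive_simulator(theta, n_rep=200, seed=0):
    """Toy stand-in: estimate a probability by repeated stochastic simulation."""
    rng = np.random.default_rng(seed)
    return (rng.random(n_rep) < 1.0 / (1.0 + np.exp(-theta))).mean()

# Train the emulator on a small design of simulator runs ...
design = np.linspace(-3, 3, 15)[:, None]
y = np.array([expensive_simulator(t[0]) for t in design])
gp = GaussianProcessRegressor(RBF(1.0) + WhiteKernel(1e-3), normalize_y=True)
gp.fit(design, y)

# ... then query it cheaply thousands of times inside the inference algorithm
theta_grid = np.linspace(-3, 3, 5000)[:, None]
mean, sd = gp.predict(theta_grid, return_std=True)
```
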
5.
Stat Appl Genet Mol Biol ; 19(2)2020 07 13.
Article in English | MEDLINE | ID: mdl-32649296

ABSTRACT

A major challenge in plant developmental biology is to understand how plant growth is coordinated by interacting hormones and genes. To meet this challenge, it is important to not only use experimental data, but also formulate a mathematical model. For the mathematical model to best describe the true biological system, it is necessary to understand the parameter space of the model, along with the links between the model, the parameter space and experimental observations. We develop sequential history matching methodology, using Bayesian emulation, to gain substantial insight into biological model parameter spaces. This is achieved by finding sets of acceptable parameters in accordance with successive sets of physical observations. These methods are then applied to a complex hormonal crosstalk model for Arabidopsis root growth. In this application, we demonstrate how an initial set of 22 observed trends reduce the volume of the set of acceptable inputs to a proportion of 6.1 × 10⁻⁷ of the original space. Additional sets of biologically relevant experimental data, each of size 5, reduce the size of this space by a further three and two orders of magnitude respectively. Hence, we provide insight into the constraints placed upon the model structure by, and the biological consequences of, measuring subsets of observations.


Subject(s)
Arabidopsis/growth & development, Plant Growth Regulators/physiology, Plant Roots/growth & development, Analysis of Variance, Arabidopsis/genetics, Arabidopsis/metabolism, Bayes Theorem, Computer Simulation, Gene Expression Regulation, Plant/genetics, Models, Biological, Plant Roots/genetics, Plant Roots/metabolism
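
The quoted volume reductions are computed as the fraction of sampled inputs surviving an implausibility cutoff after each wave of observations. A minimal Monte Carlo sketch of that calculation (the "emulated outputs", observations, and variances are toy values, not the crosstalk model's):

```python
import numpy as np

def implausibility(pred_mean, pred_var, z, obs_var):
    """Standard history-matching implausibility for one output."""
    return np.abs(z - pred_mean) / np.sqrt(pred_var + obs_var)

rng = np.random.default_rng(2)
x = rng.uniform(-1, 1, size=(1_000_000, 3))     # Monte Carlo sample of inputs

keep = np.ones(len(x), dtype=bool)
waves = [  # (emulated output, observation, observation variance) per wave
    (x[:, 0] + x[:, 1] ** 2, 0.3, 0.01),
    (np.sin(3 * x[:, 2]) + x[:, 0], 0.5, 0.01),
]
for pred, z, v in waves:
    keep &= implausibility(pred, 0.005, z, v) < 3.0   # 3-sigma cutoff
    print("non-implausible volume fraction:", keep.mean())
```
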
6.
Philos Trans A Math Phys Eng Sci ; 378(2173): 20190334, 2020 Jun 12.
Article in English | MEDLINE | ID: mdl-32448071

ABSTRACT

Cardiac contraction is the result of integrated cellular, tissue and organ function. Biophysical in silico cardiac models offer a systematic approach for studying these multi-scale interactions. The computational cost of such models is high, due to their multi-parametric and nonlinear nature. This has so far made it difficult to perform model fitting and prevented global sensitivity analysis (GSA) studies. We propose a machine learning approach based on Gaussian process emulation of model simulations using probabilistic surrogate models, which enables model parameter inference via a Bayesian history matching (HM) technique and GSA on whole-organ mechanics. This framework is applied to model healthy and aortic-banded hypertensive rats, a commonly used animal model of heart failure disease. The obtained probabilistic surrogate models accurately predicted the left ventricular pump function (R² = 0.92 for ejection fraction). The HM technique allowed us to fit both the control and diseased virtual bi-ventricular rat heart models to magnetic resonance imaging and literature data, with model outputs from the constrained parameter space falling within 2 SD of the respective experimental values. The GSA identified Troponin C and cross-bridge kinetics as key parameters in determining both systolic and diastolic ventricular function. This article is part of the theme issue 'Uncertainty quantification in cardiac and cardiovascular modelling and simulation'.
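
A global sensitivity analysis of this kind becomes affordable once an emulator is available. The sketch below estimates a first-order Sobol index by conditional-variance binning on a two-parameter toy model; nothing in it is specific to the rat heart model:

```python
import numpy as np

def first_order_sobol(f, i, n=200_000, d=2, bins=50, seed=3):
    """Estimate S_i = Var(E[Y | X_i]) / Var(Y) by binning X_i."""
    rng = np.random.default_rng(seed)
    X = rng.uniform(0, 1, size=(n, d))
    Y = f(X)
    edges = np.linspace(0, 1, bins + 1)
    idx = np.digitize(X[:, i], edges[1:-1])        # bin index 0..bins-1
    cond_means = np.array([Y[idx == b].mean() for b in range(bins)])
    return cond_means.var() / Y.var()

model = lambda X: X[:, 0] ** 2 + 0.1 * X[:, 1]     # Y depends mainly on X_0
print(first_order_sobol(model, 0))                 # ~0.99
print(first_order_sobol(model, 1))                 # ~0.01
```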

7.
Int J Mol Sci ; 18(12)2017 Dec 01.
Article in English | MEDLINE | ID: mdl-29194393

ABSTRACT

Because they provide a natural and flexible description of the nonlinear dynamic behavior of complex systems, agent-based models (ABMs) have been widely used for immune system simulation. However, it is crucial for an ABM that appropriate estimates of its key parameters be obtained by incorporating experimental data. In this paper, a systematic procedure for immune system simulation is developed by integrating an ABM with a regression method under the framework of history matching, and a novel parameter estimation method that incorporates experimental data into the ABM simulator is proposed. First, we employ the ABM as the simulator of the immune system. Then, a dimension-reduced generalized additive model (GAM) is trained as a statistical regression model on the input and output data of the ABM, and serves as an emulator during history matching. Next, we reduce the parameter input space by introducing an implausibility measure to discard implausible input values. Finally, the model parameters are estimated with the particle swarm optimization (PSO) algorithm by fitting the experimental data over the non-implausible input values. A real Influenza A Virus (IAV) data set is used to demonstrate the performance of the proposed method; the results show that the method not only achieves good fitting and predictive accuracy but also has favorable computational efficiency.


Subject(s)
Influenza A Virus/immunology, Influenza, Human/immunology, Models, Biological, Algorithms, Computer Simulation, Humans, Nonlinear Dynamics, Regression Analysis
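
The final estimation step, PSO restricted to the non-implausible inputs, reduces to a generic particle swarm minimizing the misfit to data. A minimal self-contained sketch (the inertia and acceleration constants, and the toy curve-fitting problem, are ours, not the paper's):

```python
import numpy as np

def pso_minimize(loss, lo, hi, n_particles=40, iters=200, seed=4):
    """Minimal particle swarm optimization over the box [lo, hi]."""
    rng = np.random.default_rng(seed)
    d = len(lo)
    x = rng.uniform(lo, hi, size=(n_particles, d))
    v = np.zeros_like(x)
    pbest, pbest_val = x.copy(), np.array([loss(p) for p in x])
    gbest = pbest[pbest_val.argmin()]
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, d))
        v = 0.7 * v + 1.5 * r1 * (pbest - x) + 1.5 * r2 * (gbest - x)
        x = np.clip(x + v, lo, hi)
        vals = np.array([loss(p) for p in x])
        better = vals < pbest_val
        pbest[better], pbest_val[better] = x[better], vals[better]
        gbest = pbest[pbest_val.argmin()]
    return gbest, pbest_val.min()

# Toy usage: recover two parameters of a decay curve from noisy "experimental" data
t = np.linspace(0, 10, 50)
data = 2.0 * np.exp(-0.3 * t) + np.random.default_rng(0).normal(0, 0.02, 50)
loss = lambda p: np.sum((p[0] * np.exp(-p[1] * t) - data) ** 2)
theta, err = pso_minimize(loss, lo=np.array([0.1, 0.01]), hi=np.array([5.0, 1.0]))
```
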
8.
Environmetrics ; 27(8): 507-523, 2016 Dec.
Article in English | MEDLINE | ID: mdl-28042255

ABSTRACT

Expensive computer codes, particularly those used for simulating environmental or geological processes, such as climate models, require calibration (sometimes called tuning). When calibrating expensive simulators using uncertainty quantification methods, it is usually necessary to use a statistical model called an emulator in place of the computer code when running the calibration algorithm. Though emulators based on Gaussian processes are typically many orders of magnitude faster to evaluate than the simulator they mimic, many applications have sought to speed up the computations by using regression-only emulators within the calculations instead, arguing that the extra sophistication brought by the Gaussian process is not worth the extra computational power. This was the case for the analysis that produced the UK climate projections in 2009. In this paper, we compare the effectiveness of both emulation approaches within a multi-wave calibration framework that is becoming popular in the climate modeling community, called "history matching." We find that Gaussian processes offer significant benefits for the reduction of parametric uncertainty over regression-only approaches. We find that in a multi-wave experiment, a combination of regression-only emulators initially, followed by Gaussian process emulators for refocussing experiments, can be nearly as effective as using Gaussian processes throughout, for a fraction of the computational cost. We also discover a number of design- and emulator-dependent features of the multi-wave history matching approach that can cause apparent, yet premature, convergence of our estimates of parametric uncertainty. We compare these approaches to calibration in idealized examples and apply them to a well-known geological reservoir model.
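
The paper's central comparison, regression-only versus Gaussian process emulation of the same simulator runs, can be reproduced in miniature by training both on one small design and scoring held-out error. The cubic toy simulator and design sizes below are illustrative:

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF
from sklearn.linear_model import LinearRegression
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(5)
simulator = lambda x: np.sin(5 * x[:, 0]) + x[:, 0] ** 3

X_train = rng.uniform(-1, 1, (30, 1)); y_train = simulator(X_train)
X_test = rng.uniform(-1, 1, (500, 1)); y_test = simulator(X_test)

reg = make_pipeline(PolynomialFeatures(3), LinearRegression()).fit(X_train, y_train)
gp = GaussianProcessRegressor(RBF(0.3)).fit(X_train, y_train)

for name, m in [("regression-only", reg), ("Gaussian process", gp)]:
    rmse = np.sqrt(np.mean((m.predict(X_test) - y_test) ** 2))
    print(f"{name}: held-out RMSE = {rmse:.4f}")
```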

9.
Bull Volcanol ; 77(10): 83, 2015.
Article in English | MEDLINE | ID: mdl-26500386

ABSTRACT

Mathematical models of natural processes can be used as inversion tools to predict unobserved properties from measured quantities. Uncertainty in observations and in model formulation affects the efficacy of inverse modelling. We present a general methodology, history matching, that can be used to investigate the effect of observational and model uncertainty on inverse modelling studies. We demonstrate history matching on an integral model of volcanic plumes that is used to estimate source conditions from observations of the rise height of plumes during the eruptions of Eyjafjallajökull, Iceland, in 2010 and Grímsvötn, Iceland, in 2011. Sources of uncertainty are identified and quantified, and propagated through the integral plume model. A preliminary sensitivity analysis is performed to identify the uncertain model parameters that strongly influence model predictions. Model predictions are assessed against observations through an implausibility measure that rules out model inputs that are considered implausible given the quantified uncertainty. We demonstrate that the source mass flux at the volcano can be estimated from plume height observations, but the magmatic temperature, exit velocity and exsolved gas mass fraction cannot be accurately determined. Uncertainty in plume height observations and entrainment coefficients results in a large range of plausible values of the source mass flux. Our analysis shows that better constraints on entrainment coefficients for volcanic plumes and more precise observations of plume height are required to obtain tightly constrained estimates of the source mass flux.
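
The implausibility measure used here divides the model-observation mismatch by the total quantified uncertainty and rules out inputs exceeding a cutoff. A minimal sketch for the plume-height inversion, assuming a simplified power-law plume-rise relation with made-up coefficients in place of the paper's integral model:

```python
import numpy as np

def plume_height_km(Q, k=0.24, b=0.25):
    """Simplified power-law rise relation H = k * Q**b (coefficients illustrative)."""
    return k * Q ** b                      # Q: source mass flux in kg/s

rng = np.random.default_rng(6)
Q = 10 ** rng.uniform(4, 8, 1_000_000)     # candidate source mass fluxes, kg/s

H_obs, var_obs = 6.0, 0.5 ** 2             # observed rise height and its variance (km)
var_model = 0.8 ** 2                       # model discrepancy (entrainment etc.)

I = np.abs(H_obs - plume_height_km(Q)) / np.sqrt(var_obs + var_model)
plausible = Q[I < 3.0]                     # retain inputs with implausibility < 3
print(f"plausible mass flux: {plausible.min():.2e} to {plausible.max():.2e} kg/s")
```

Even this toy version reproduces the qualitative conclusion: modest uncertainty in the observed height translates into a wide plausible range for the source mass flux.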

10.
Fuel (Lond) ; 148: 87-97, 2015 May 15.
Article in English | MEDLINE | ID: mdl-29563647

ABSTRACT

Coal seam degasification improves coal mine safety by reducing the gas content of coal seams and also by generating added value as an energy source. Coal seam reservoir simulation is one of the most effective ways to help with these two main objectives. As in all modeling and simulation studies, how the reservoir is defined and whether observed productions can be predicted are important considerations. Using geostatistical realizations as spatial maps of different coal reservoir properties is a more realistic approach than assuming uniform properties across the field. In fact, this approach can help with simultaneous history matching of multiple wellbores to enhance the confidence in spatial models of different coal properties that are pertinent to degasification. The problem that still remains is the uncertainty in geostatistical simulations originating from the partial sampling of the seam, which does not properly reflect the stochastic nature of coal property realizations. Stochastic simulations, and using individual realizations rather than E-type maps, make evaluation of uncertainty possible. This work advances Karacan et al. (2014) by assessing the uncertainty that stems from the geostatistical maps. In this work, we batched 100 individual realizations of 10 coal properties that were randomly generated to create 100 bundles and used them in 100 separate coal seam reservoir simulations for simultaneous history matching. We then evaluated the history matching errors for each bundle and identified the single set of realizations that minimized the error for all wells. We further compared the errors with those of the E-type and of the average realization of the best matches. Unlike Karacan et al. (2014), which used E-type maps and averages of quantile maps, using these 100 bundles created 100 different history-match results from separate simulations, along with distributions of results (for in-place gas quantity, for example) from which uncertainty in coal property realizations could be evaluated. The study helped to determine the realization bundle, consisting of the spatial maps of coal properties, that resulted in minimum error. In addition, it was shown that both the E-type and the average of the realizations that gave the best matches for individual wells approximated the same properties reasonably well. Moreover, the determined realization bundle showed that the study field initially had 151.5 million m3 (cubic meters) of gas and 1.04 million m3 of water in the coal, corresponding to Q90 of the entire range of probability for gas and close to Q75 for water. In 2013, in-place fluid amounts decreased to 138.9 million m3 and 0.997 million m3 for gas and water, respectively.
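
The bundle-selection logic, one full simulation per bundle of realizations followed by keeping the bundle with the smallest combined history-matching error across all wells, reduces to a loop like the following (the stub "simulation" and the RMSE metric are illustrative):

```python
import numpy as np

rng = np.random.default_rng(7)
observed = rng.random((20, 120))          # 20 wells x 120 monthly production rates

def run_reservoir_simulation(bundle):
    """Stub standing in for one full simulation driven by one realization bundle."""
    r = np.random.default_rng(bundle)
    return observed + r.normal(0, 0.05 + 0.01 * bundle, observed.shape)

def history_match_error(simulated):
    """RMSE per well, summed over all wells (simultaneous history matching)."""
    return np.sqrt(np.mean((simulated - observed) ** 2, axis=1)).sum()

errors = [history_match_error(run_reservoir_simulation(b)) for b in range(100)]
best_bundle = int(np.argmin(errors))
print("best bundle:", best_bundle, "with error:", min(errors))
```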

11.
J Nat Gas Sci Eng ; 10: 51-67, 2013 Jan.
Article in English | MEDLINE | ID: mdl-26191096

ABSTRACT

The Black Warrior Basin of Alabama is one of the most important coal mining and coalbed methane production areas in the United States. Methane control efforts through degasification that started almost 25 years ago for the sole purpose of ensuring mining safety resulted in more than 5000 coalbed methane wells distributed within various fields throughout the basin. The wells are completed mostly in the Pratt, Mary Lee, and Black Creek coal groups of the Upper Pottsville formation and present a unique opportunity to understand the methane reservoir properties of these coals and to improve their degasification performance. The Brookwood and Oak Grove fields in the Black Warrior Basin are probably two of the most important fields in the basin due to current longwall coal mining activities. In this work, methane and water production from 92 vertical wellbores, some completed 20 years ago, drilled over a large active coal mine district located in these two fields, was analyzed by history matching techniques. The boreholes were completed in the Mary Lee coal group, or in combinations of the Pratt, Mary Lee, and Black Creek groups. History matching models were prepared and run according to the properties of each coal group. Decline curve analyses showed that effective exponential decline rates of the wells were between 2% and 25% per year. Results of production history matching showed, although they varied by coal group, that pressure decreased by as much as 80%, to nearly 25 psi in some areas, with corresponding decreases in methane content. Water saturation in the coals decreased from 100% to between 20 and 80%, improving gas relative permeabilities to as much as 0.8. As a result of primary depletion, the permeability of the coal seams increased by between 10 and 40% compared to the original permeability, which varied between 1 and 10 md depending on depth and coal seam. These results not only can be used for diagnostic and interpretation purposes, but can also be used as parameter distributions in probabilistic simulations, as illustrated in the last section of this paper. They can also be used in conjunction with spatial modeling and geological considerations to calculate potential methane emissions in operating mines.
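
The quoted decline rates follow the standard Arps exponential model. A quick illustration of what the abstract's 2% versus 25% effective annual decline implies for a well's rate (the initial rate is illustrative):

```python
def rate_after(q_i, d_eff, years):
    """Exponential decline with effective annual decline fraction d_eff:
    the rate falls by d_eff each year, q(t) = q_i * (1 - d_eff)**t."""
    return q_i * (1.0 - d_eff) ** years

q_i = 100.0                  # initial gas rate, Mscf/day (illustrative)
for d in (0.02, 0.25):       # the 2% and 25% per year bounds from the abstract
    print(f"{d:.0%}/yr decline -> rate after 10 yr: {rate_after(q_i, d, 10):.1f} Mscf/day")
```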

12.
Int J Coal Geol ; 114: 96-113, 2013 Jul 30.
Article in English | MEDLINE | ID: mdl-26435557

ABSTRACT

Coal seam degasification and its efficiency are directly related to the safety of coal mining. Degasification activities in the Black Warrior basin started in the early 1980s using vertical boreholes. Although the Blue Creek seam, which is part of the Mary Lee coal group, has been the main seam of interest for coal mining, vertical wellbores have also been completed in the Pratt, Mary Lee, and Black Creek coal groups of the Upper Pottsville formation to degasify multiple seams. Currently, the Blue Creek seam is further degasified 2-3 years in advance of mining using in-seam horizontal boreholes to ensure safe mining. The studied location in this work lies between Tuscaloosa and Jefferson counties in Alabama and was degasified using 81 vertical boreholes, some of which are still active. When the current longwall mine expanded its operation into this area in 2009, horizontal boreholes were also drilled in advance of mining for further degasification of only the Blue Creek seam to ensure a safe and productive operation. This paper presents an integrated study and a methodology for combining history matching results from vertical boreholes with production modeling of horizontal boreholes, using geostatistical simulation, to evaluate the spatial effectiveness of in-seam boreholes in reducing gas-in-place (GIP). Results of this study showed that the in-seam boreholes had an estimated effective drainage area of 2050 acres, with cumulative production of 604 MMscf of methane during ~2 years of operation. With horizontal borehole production, GIP in the Blue Creek seam decreased from an average of 1.52 MMscf to 1.23 MMscf per acre. It was also shown that effective gas flow capacity, which was independently modeled using vertical borehole data, affected horizontal borehole production. GIP and effective gas flow capacity were also used to predict the remaining gas potential of the Blue Creek seam.
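
As a consistency check, the abstract's own figures tie together: the drop in GIP per acre multiplied by the effective drainage area roughly equals the reported cumulative production.

```python
# Consistency check on the reported figures (units: MMscf per acre, acres)
gip_drop_per_acre = 1.52 - 1.23                  # decrease in GIP, MMscf/acre
drainage_area_acres = 2050                       # estimated effective drainage area
print(gip_drop_per_acre * drainage_area_acres)   # ~595 MMscf vs. 604 MMscf reported
```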
