Results 1 - 20 of 38
1.
ISA Trans ; 144: 201-210, 2024 Jan.
Article in English | MEDLINE | ID: mdl-37940470

ABSTRACT

This work addresses the leaderless/leader-following stochastic scaled consensus problem for second-order stochastic multi-agent systems (SMASs) in a noisy environment. Scaled consensus means that the ratios among the agents' states asymptotically tend to designated constants rather than to a common convergence value. To lessen the influence of communication noise, stochastic approximation protocols with time-varying gains are designed for the underlying system, where the time-varying gain removes the restriction to nonnegative values. Compared with existing consensus results under communication noise, the major challenge is that introducing a time-varying gain renders Lyapunov-based techniques inapplicable. To cope with this, a state decomposition method is utilized, and a series of necessary and sufficient conditions is established for interacting agents with constant or zero velocity, provided the topology contains a spanning tree. Furthermore, it is shown that consensus and bipartite consensus can be seen as two special cases of this work. Finally, the validity of the results is demonstrated by a simulation example.
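As a rough illustration of the mechanism described above (not the authors' second-order protocol), the following sketch runs a first-order scaled consensus update over a hypothetical three-agent line topology, with noisy neighbour measurements and a decreasing stochastic-approximation gain; the scale factors s_i, the topology, and the noise level are made-up values:

```python
import random

random.seed(0)

# Hypothetical 3-agent line topology; scale factors s_i set the target ratios.
neighbors = {0: [1], 1: [0, 2], 2: [1]}
s = [1.0, 2.0, -4.0]          # designated scales (signs allowed)
x = [5.0, -1.0, 3.0]          # initial states
noise_std = 0.1

def spread(x):
    """Max pairwise gap among the scaled states s_i * x_i."""
    z = [s[i] * x[i] for i in range(3)]
    return max(z) - min(z)

initial = spread(x)
for k in range(5000):
    a_k = 1.0 / (k + 2)       # decreasing gain: sum a_k = inf, sum a_k^2 < inf
    new_x = []
    for i in range(3):
        # noisy measurement of each neighbour's scaled state
        u = sum((s[j] * x[j] + random.gauss(0, noise_std)) - s[i] * x[i]
                for j in neighbors[i])
        new_x.append(x[i] + a_k * u / s[i])
    x = new_x

final = spread(x)
print(initial, final)   # the scaled states s_i * x_i cluster together
```

Because the gain sequence is square-summable, the accumulated noise stays bounded while the consensus error keeps contracting, so the ratios among states approach the designated constants.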

2.
BMC Med Res Methodol ; 23(1): 257, 2023 11 03.
Article in English | MEDLINE | ID: mdl-37924007

ABSTRACT

BACKGROUND: The use of mixed-effect models with a specific functional form, such as the Sigmoidal Mixed Model and the Piecewise Mixed Model (or Changepoint Mixed Model) with abrupt or smooth random change, allows the defined parameters to be interpreted to understand longitudinal trajectories. Currently, no R interface packages can easily fit the Sigmoidal Mixed Model while allowing the inclusion of covariates, or incorporate recent developments to fit the Piecewise Mixed Model with random change. RESULTS: To facilitate modeling of the Sigmoidal Mixed Model and the Piecewise Mixed Model with abrupt or smooth random change, we have created an R package called nlive. All needed pieces, such as functions, covariance matrices, and initial-value generation, were programmed. The package implements recent developments such as the polynomial smooth transition of the piecewise mixed model, which has improved properties over the Bacon-Watts formulation, and the stochastic approximation expectation-maximization (SAEM) algorithm for efficient estimation. It was designed to aid interpretation of the output through features such as annotated output, warnings, and graphs. Functionality, including run time and convergence, was tested using simulations. A data example illustrates the package's use, output features, and interpretation. The package, implemented in the R software, is available from the Comprehensive R Archive Network (CRAN) at https://CRAN.R-project.org/package=nlive. CONCLUSIONS: The nlive package for R fits the Sigmoidal Mixed Model and the Piecewise Mixed Model with abrupt or smooth random change. nlive allows these models to be fitted with only five mandatory arguments that are intuitive enough for less sophisticated users.


Subject(s)
Algorithms, Software, Humans
3.
Entropy (Basel) ; 25(8)2023 Aug 18.
Article in English | MEDLINE | ID: mdl-37628265

ABSTRACT

The variational Bayesian method solves nonlinear estimation problems by iteratively computing the integral of the marginal density. Many researchers have demonstrated that its performance depends on the linear approximation used when computing the variational density in each iteration and on the degree of nonlinearity of the underlying scenario. In this paper, two methods for computing the variational density, namely the natural gradient method and the simultaneous perturbation stochastic approximation method, are used to implement a variational Bayesian Kalman filter for maneuvering target tracking using Doppler measurements. The measurements are collected from a set of sensors subject to single-hop network constraints. We propose a distributed-fusion variational Bayesian Kalman filter for a networked maneuvering-target tracking scenario, and both the evidence lower bound and the posterior Cramér-Rao lower bound of the proposed methods are presented. The simulation results are compared with centralized fusion in terms of posterior Cramér-Rao lower bounds, root-mean-squared errors, and the 3σ bound.
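The simultaneous perturbation method mentioned above can be sketched in isolation: SPSA estimates a full gradient from only two noisy function evaluations per step, whatever the dimension. The objective, gain sequences, and constants below are illustrative choices, not those of the paper:

```python
import random

random.seed(1)

# Toy objective with noisy evaluations; true minimizer at (2, -3).
def loss(theta):
    noise = random.gauss(0, 0.01)
    return (theta[0] - 2) ** 2 + (theta[1] + 3) ** 2 + noise

theta = [0.0, 0.0]
for k in range(1, 501):
    a_k = 0.1 / k ** 0.602        # standard SPSA gain-sequence exponents
    c_k = 0.1 / k ** 0.101
    delta = [random.choice([-1, 1]) for _ in theta]   # Rademacher perturbation
    plus = loss([t + c_k * d for t, d in zip(theta, delta)])
    minus = loss([t - c_k * d for t, d in zip(theta, delta)])
    # one gradient estimate from just two function evaluations
    theta = [t - a_k * (plus - minus) / (2 * c_k * d)
             for t, d in zip(theta, delta)]

print(theta)  # close to the true minimizer (2, -3)
```

The two-evaluation gradient estimate is what makes SPSA attractive when each function evaluation is an expensive simulation or filter run.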

4.
Res Sq ; 2023 Apr 10.
Article in English | MEDLINE | ID: mdl-37090666

ABSTRACT

Background: The use of mixed-effect models with a specific functional form, such as the Sigmoidal Mixed Model and the Piecewise Mixed Model (or Changepoint Mixed Model) with abrupt or smooth random change, allows the defined parameters to be interpreted to understand longitudinal trajectories. Currently, no R interface packages can easily fit the Sigmoidal Mixed Model while allowing the inclusion of covariates, or incorporate recent developments to fit the Piecewise Mixed Model with random change. Results: To facilitate modeling of the Sigmoidal Mixed Model and the Piecewise Mixed Model with abrupt or smooth random change, we have created an R package called nlive. All needed pieces, such as functions, covariance matrices, and initial-value generation, were programmed. The package implements recent developments such as the polynomial smooth transition of the piecewise mixed model, which has improved properties over the Bacon-Watts formulation, and the stochastic approximation expectation-maximization (SAEM) algorithm for efficient estimation. It was designed to aid interpretation of the output through features such as annotated output, warnings, and graphs. Functionality, including run time and convergence, was tested using simulations. A data example illustrates the package's use, output features, and interpretation. The package, implemented in the R software, is available from the Comprehensive R Archive Network (CRAN) at https://CRAN.R-project.org/package=nlive. Conclusions: The nlive package for R fits the Sigmoidal Mixed Model and the Piecewise Mixed Model with abrupt or smooth random change. nlive allows these models to be fitted with only five mandatory arguments that are intuitive enough for less sophisticated users.

5.
Psychometrika ; 88(4): 1407-1442, 2023 Dec.
Article in English | MEDLINE | ID: mdl-35648266

ABSTRACT

In recent years, the four-parameter model (4PM) has received increasing attention in item response theory. The purpose of this article is to provide more efficient and more reliable computational tools for fitting the 4PM. In particular, this article focuses on the four-parameter normal ogive (4PNO) model and develops efficient stochastic approximation expectation-maximization (SAEM) algorithms to compute the marginalized maximum a posteriori estimator. First, a data augmentation scheme is used for the 4PNO model, which makes the complete-data model an exponential family, and a basic SAEM algorithm is then developed for the 4PNO model. Second, to overcome the drawbacks of the basic SAEM algorithm, we develop an improved SAEM algorithm for the 4PNO model, called the mixed SAEM (MSAEM). Results from simulation studies demonstrate that (1) the MSAEM provides estimates that are more accurate than or comparable to those of the other estimation methods while being computationally more efficient, and (2) the MSAEM is more robust to the choice of initial values and of the priors for item parameters, which is a valuable property for practical use. Finally, a real data set is analyzed to show the good performance of the proposed methods.
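The SAEM idea, replacing the intractable E-step expectation with a simulated draw of the latent data and smoothing the complete-data sufficient statistics with decreasing weights, can be sketched on a toy problem far simpler than the 4PNO model (here, the mean of a right-censored normal sample; all constants are illustrative):

```python
import random

random.seed(2)

# Simulated right-censored sample: z ~ N(mu_true, 1), values above c are censored.
mu_true, c, n = 1.5, 2.0, 400
y = []
for _ in range(n):
    z = random.gauss(mu_true, 1.0)
    y.append(z if z <= c else None)        # None marks a censored observation

# SAEM: replace the E-step expectation with one simulated draw of the latent
# data, then smooth the complete-data sufficient statistic with 1/k weights.
mu, s1 = 0.0, 0.0
for k in range(1, 301):
    z_sim = []
    for obs in y:
        if obs is not None:
            z_sim.append(obs)
        else:
            while True:                     # rejection sampler for N(mu, 1) | z > c
                d = random.gauss(mu, 1.0)
                if d > c:
                    z_sim.append(d)
                    break
    gamma = 1.0 / k                         # stochastic-approximation step size
    s1 += gamma * (sum(z_sim) - s1)         # smoothed sufficient statistic
    mu = s1 / n                             # M-step: normal-mean update

print(mu)  # close to mu_true
```

The single simulated draw per iteration is what distinguishes SAEM from Monte Carlo EM, which would need many draws per E-step; the 1/k smoothing recovers the averaging.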


Subject(s)
Algorithms, Psychometrics, Computer Simulation
6.
Accid Anal Prev ; 179: 106878, 2023 Jan.
Article in English | MEDLINE | ID: mdl-36334543

ABSTRACT

A proper calibration process is of considerable importance for traffic safety evaluations that use simulation models. By allowing a pure with-and-without comparison under identical circumstances that is not directly testable in the field, microsimulation-based approaches have drawn considerable attention for evaluating the performance of emerging technologies, such as connected vehicle (CV) safety applications. Unlike traditional approaches for evaluating mobility impacts, safety evaluations of such applications demand simulation models that are well calibrated to match real-world safety conditions. This paper proposes a novel calibration framework that combines traffic conflict techniques and multi-objective stochastic optimization so that operational and safety measures can be calibrated simultaneously. The distribution of conflicts across severity levels, categorized by time-to-collision (TTC), is applied as the safety performance measure. The simultaneous perturbation stochastic approximation (SPSA) algorithm, which can efficiently approximate the gradient of the multi-objective stochastic loss function, is used to optimize model parameters by minimizing the total simulation error of both operational and safety performance measures. The proposed calibration methodology is implemented using the open-source software SUMO on a simulation network of the Flatbush Avenue corridor in Brooklyn, NY. Seventeen key parameters are calibrated using the SPSA algorithm and compared with real-world traffic conflicts extracted from vehicle trajectories in 14 hours of high-resolution aerial and traffic surveillance videos. Representative days are identified to create variation envelopes for the performance measures. Four acceptability criteria, including control for time-variant outliers and inliers and bounded dynamic absolute and system errors, are adopted for the analysis of results.
The results show that the calibrated parameters significantly improve the ability of the simulation model to represent real-world safety conditions (i.e., traffic conflicts) as well as operational conditions. The case study also demonstrates the usefulness of aerial imagery and the applicability of the proposed calibration framework, so the calibrated model can be used to evaluate the safety benefits of CV applications more accurately.


Subject(s)
Traffic Accidents, Humans, Traffic Accidents/prevention & control
7.
J Comput Graph Stat ; 32(2): 448-469, 2023.
Article in English | MEDLINE | ID: mdl-38240013

ABSTRACT

Inference for high-dimensional, large-scale and long-series dynamic systems is a challenging task in modern data science. Existing algorithms, such as the particle filter or sequential importance sampler, do not scale well to the dimension of the system and the sample size of the dataset, and often suffer from sample degeneracy for long series data. The recently proposed Langevinized ensemble Kalman filter (LEnKF) addresses these difficulties in a coherent way. However, it cannot be applied when the dynamic system contains unknown parameters. This article proposes the so-called stochastic approximation LEnKF for jointly estimating the states and unknown parameters of the dynamic system, where the parameters are estimated on the fly based on the state variables simulated by the LEnKF under the framework of stochastic approximation Markov chain Monte Carlo (MCMC). Under mild conditions, we prove its consistency in parameter estimation and ergodicity in state variable simulations. The proposed algorithm can be used in uncertainty quantification for long-series, large-scale, and high-dimensional dynamic systems. Numerical results indicate its superiority over existing algorithms. We employ the proposed algorithm in state-space modeling of sea surface temperature with a long short-term memory (LSTM) network, which indicates its great potential in the statistical analysis of complex dynamic systems encountered in modern data science. Supplementary materials for this article are available online.

8.
Front Psychol ; 13: 971126, 2022.
Article in English | MEDLINE | ID: mdl-36506999

ABSTRACT

In the estimation of item response models, normality of the latent traits is frequently assumed. However, this assumption may be untenable in real testing. In contrast to the conventional three-parameter normal ogive (3PNO) model, a 3PNO model incorporating Ramsay-curve item response theory (RC-IRT), denoted the RC-3PNO model, allows for flexible latent trait distributions. We propose a stochastic approximation expectation maximization (SAEM) algorithm to estimate the RC-3PNO model with non-normal latent trait distributions. The simulation studies of this work reveal that the SAEM algorithm produces more accurate item parameter estimates for the RC-3PNO model than for the 3PNO model, especially when the latent density is not normal, as in the cases of a skewed or bimodal distribution. Three model selection criteria are used to select the optimal number of knots and the degree of the B-spline functions in the RC-3PNO model. A real data set from the PISA 2018 test is used to demonstrate the application of the proposed algorithm.

9.
Philos Trans A Math Phys Eng Sci ; 380(2239): 20210282, 2022 Dec 26.
Article in English | MEDLINE | ID: mdl-36335950

ABSTRACT

Using counterdiabatic (CD) driving, which aims at suppressing diabatic transitions, in digitized adiabatic evolution has garnered immense interest in quantum protocols and algorithms. However, improving the approximate CD terms with a nested-commutator ansatz is a challenging task. In this work, we propose a technique for finding optimal coefficients of the CD terms using a variational quantum circuit. Through classical optimization routines, the parameters of this circuit are optimized to provide the coefficients corresponding to the CD terms. Their improved performance is then exemplified in Greenberger-Horne-Zeilinger state preparation on the nearest-neighbour Ising model. Finally, we also show the advantage over the usual quantum approximate optimization algorithm in terms of fidelity with bounded time. This article is part of the theme issue 'Shortcuts to adiabaticity: theoretical, experimental and interdisciplinary perspectives'.

10.
BMC Med Res Methodol ; 22(1): 258, 2022 10 01.
Article in English | MEDLINE | ID: mdl-36183071

ABSTRACT

BACKGROUND: Current dose-finding designs for phase I clinical trials can correctly select the maximum tolerated dose (MTD) in a range of 30-80%, depending on various conditions, based on a sample of 30 subjects. However, there is still an unmet need for efficiency and cost saving. METHODS: We propose a novel dose-finding design based on Bayesian stochastic approximation. The design utilizes dose-level information through local adaptive modelling and is free of assumptions on toxicity probabilities and hyper-parameters. It allows a flexible target toxicity rate and varying cohort sizes, and we extend it to accommodate historical information via a prior effective sample size. We compare the proposed design to some commonly used methods in terms of accuracy and safety by simulation. RESULTS: On average, our design can improve the percentage of correct selection to about 60% when the MTD resides at an early or middle position in the search domain, and it performs comparably to other competitive methods otherwise. A free online software package is provided to facilitate application, and a simple decision tree for the design can be pre-printed beforehand. CONCLUSION: The paper proposes a novel dose-finding design for phase I clinical trials. Applying the design to future cancer trials can greatly improve efficiency and consequently save costs and shorten the development period.
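The design itself is Bayesian, but its stochastic approximation core can be illustrated with the classical Robbins-Monro dose-finding recursion, which nudges the dose against the gap between observed toxicity and the target rate. The toxicity curve and step constants below are hypothetical, not the article's design:

```python
import math
import random

random.seed(3)

target = 0.30                       # target toxicity rate

def tox_prob(dose):                 # hypothetical logistic dose-toxicity curve
    return 1.0 / (1.0 + math.exp(-(dose - 5.0)))

dose = 2.0
for n in range(1, 1001):
    y = 1 if random.random() < tox_prob(dose) else 0   # observe one patient
    dose = dose - (5.0 / n) * (y - target)             # Robbins-Monro step

print(dose)  # near the dose where P(toxicity) = 0.30
```

With a 1/n step size, toxic responses push the dose down and non-toxic responses push it up, so the recursion settles at the dose whose toxicity probability equals the target.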


Subject(s)
Neoplasms, Research Design, Bayes Theorem, Clinical Trials Phase I as Topic, Computer Simulation, Dose-Response Relationship (Drug), Humans, Maximum Tolerated Dose
11.
Stat Commun Infect Dis ; 14(1): 20210001, 2022 Jan 01.
Article in English | MEDLINE | ID: mdl-35880974

ABSTRACT

Objectives: Characterizing features of viral rebound trajectories and identifying host, virological, and immunological factors that are predictive of these trajectories are central to HIV cure research. We investigate whether key features of HIV viral decay and CD4 trajectories during antiretroviral therapy (ART) are associated with characteristics of HIV viral rebound following ART interruption. Methods: Nonlinear mixed-effects (NLME) models are used to model viral load trajectories before and following ART interruption, incorporating left censoring due to the lower detection limits of viral load assays. A stochastic approximation EM (SAEM) algorithm is used for parameter estimation and inference. To circumvent the computational intensity of maximizing the joint likelihood, we propose an easy-to-implement three-step method. Results: We evaluate the performance of the proposed method through simulation studies and apply it to data from the Zurich Primary HIV Infection Study. We find that some key features of viral load during ART (e.g., viral decay rate) are significantly associated with important characteristics of viral rebound following ART interruption (e.g., viral set point). Conclusions: The proposed three-step method works well. We have shown that key features of viral decay during ART may be associated with important features of viral rebound following ART interruption.

12.
J Appl Stat ; 49(6): 1519-1539, 2022.
Article in English | MEDLINE | ID: mdl-35707109

ABSTRACT

Online learning is a method for analyzing very large datasets ('big data') as well as data streams. In this article, we consider the case of constrained binary logistic regression and show the benefit of using processes with online standardization of the data, in particular to avoid numerical explosions and to allow the use of shrinkage methods. We prove the almost sure convergence of such a process and propose using a piecewise-constant step size so that the step size does not decrease too quickly and does not reduce the speed of convergence. We compare twenty-four stochastic approximation processes with raw or online-standardized data on five real or simulated data sets. Results show that, unlike processes with raw data, processes with online-standardized data can prevent numerical explosions and yield the best results.
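A minimal sketch of the idea, assuming nothing beyond the description above: plain online logistic regression where each incoming example is standardized with running (Welford) means and variances before the gradient step, so a badly scaled feature cannot blow up the update. The simulated stream and step-size schedule are illustrative:

```python
import math
import random

random.seed(4)

# Simulated stream with wildly different feature scales; raw-scale gradients
# for x2 would be huge, which is what online standardization prevents.
def example():
    x1 = random.gauss(0, 1)
    x2 = random.gauss(1000, 200)
    logit = 2 * x1 + 0.005 * (x2 - 1000)
    label = 1 if random.random() < 1 / (1 + math.exp(-logit)) else 0
    return [x1, x2], label

n, mean, m2 = 0, [0.0, 0.0], [0.0, 0.0]   # Welford running moments
std = [1.0, 1.0]
w, b = [0.0, 0.0], 0.0
for k in range(1, 20001):
    x, label = example()
    n += 1
    for j in range(2):
        d = x[j] - mean[j]
        mean[j] += d / n
        m2[j] += d * (x[j] - mean[j])
        if m2[j] > 0:
            std[j] = math.sqrt(m2[j] / n)
    z = [(x[j] - mean[j]) / std[j] for j in range(2)]   # online-standardized input
    raw = w[0] * z[0] + w[1] * z[1] + b
    p = 1 / (1 + math.exp(-max(-30.0, min(30.0, raw))))
    step = 1.0 / k ** 0.6                               # slowly decreasing step size
    for j in range(2):
        w[j] -= step * (p - label) * z[j]
    b -= step * (p - label)

# Held-out accuracy using the frozen standardization.
correct = 0
for _ in range(2000):
    x, label = example()
    z = [(x[j] - mean[j]) / std[j] for j in range(2)]
    p = 1 / (1 + math.exp(-max(-30.0, min(30.0, w[0] * z[0] + w[1] * z[1] + b))))
    correct += int((p > 0.5) == (label == 1))
acc = correct / 2000
print(acc)
```

Running the same loop on the raw features would multiply the gradient by values near 1000, illustrating the numerical-explosion problem the article targets.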

13.
Front Big Data ; 5: 686416, 2022.
Article in English | MEDLINE | ID: mdl-35647535

ABSTRACT

Elasticsearch is currently the most popular search engine for full-text database management systems. By default, its configuration does not change while it receives data. However, when Elasticsearch stores a large amount of data over time, the default configuration becomes an obstacle to improving performance. In addition, the servers that host Elasticsearch may have limited resources, such as internal memory and CPU. A general solution to these problems is to dynamically tune the configuration parameters of Elasticsearch in order to improve its performance. The sheer number of parameters involved makes this configuration a complex task. In this work, we apply the simultaneous perturbation stochastic approximation (SPSA) method to optimizing Elasticsearch with multiple unknown parameters. Using this algorithm, our implementation observes performance and automatically changes the configuration parameters of Elasticsearch to improve performance. The proposed solution makes it possible to change the configuration parameters of a running Elasticsearch instance without restarting it. The results show a more than 40% improvement in the combined data insertion capacity and the system's response time.

14.
Psychometrika ; 87(4): 1473-1502, 2022 12.
Article in English | MEDLINE | ID: mdl-35524934

ABSTRACT

Latent variable models have long played a central role in psychometrics and related fields. In many modern applications, inference based on latent variable models involves one or several of the following features: (1) the presence of many latent variables, (2) observed and latent variables that are continuous, discrete, or a combination of both, (3) constraints on parameters, and (4) penalties on parameters to impose model parsimony. The estimation often involves maximizing an objective function based on a marginal likelihood or pseudo-likelihood, possibly with constraints and/or penalties on the parameters. Solving this optimization problem is highly non-trivial due to the complexities brought by these features. Although several efficient algorithms have been proposed, a unified computational framework that takes all of these features into account has been lacking. In this paper, we fill this gap. Specifically, we provide a unified formulation of the optimization problem and propose a quasi-Newton stochastic proximal algorithm. Theoretical properties of the proposed algorithm are established. Its computational efficiency and robustness are shown by simulation studies under various settings for latent variable model estimation.
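For the penalty part of such problems, the proximal step for an L1 penalty is simple soft-thresholding. The sketch below runs a plain (not quasi-Newton) proximal-gradient loop on a tiny made-up least-squares problem, just to show how the proximal step produces exact zeros:

```python
# Soft-thresholding is the proximal operator of the L1 penalty; a plain
# proximal-gradient loop on a tiny least-squares problem shows the idea.

def soft_threshold(v, t):
    return [max(abs(vi) - t, 0.0) * (1 if vi > 0 else -1) for vi in v]

# Hypothetical design: y = X beta with a sparse true beta = (3, 0, -2, 0).
X = [[1, 0, 0, 0], [0, 1, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1], [1, 1, 1, 1]]
beta_true = [3.0, 0.0, -2.0, 0.0]
y = [sum(Xi[j] * beta_true[j] for j in range(4)) for Xi in X]

lam, step = 0.5, 0.1
beta = [0.0] * 4
for _ in range(500):
    # gradient of 0.5 * ||y - X beta||^2
    resid = [sum(Xi[j] * beta[j] for j in range(4)) - yi for Xi, yi in zip(X, y)]
    grad = [sum(X[i][j] * resid[i] for i in range(len(X))) for j in range(4)]
    # gradient step on the smooth part, proximal step on the penalty
    beta = soft_threshold([b - step * g for b, g in zip(beta, grad)], step * lam)

print(beta)  # sparse: zero coordinates are exactly zero after thresholding
```

The proximal step handles the non-differentiable penalty exactly, which is why such algorithms can impose parsimony without smoothing the penalty away.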


Subject(s)
Algorithms, Theoretical Models, Likelihood Functions, Psychometrics, Computer Simulation
15.
BMC Infect Dis ; 22(1): 20, 2022 Jan 04.
Article in English | MEDLINE | ID: mdl-34983387

ABSTRACT

BACKGROUND: The CD4 cell count signifies the health of an individual's immune system. Data-driven models enable clinicians to accurately interpret potential information, examine the progression of the CD4 count, and deal with patient heterogeneity due to patient-specific effects. Quantile-based regression models can be used to describe the entire conditional distribution of an outcome and to identify the effects of covariates at the respective locations. METHODS: This study uses a quantile mixed-effects model that assumes an asymmetric Laplace distribution for the error term. The model also incorporates multiple random effects to account for correlation among observations. Exact maximum likelihood estimation was implemented using the stochastic approximation expectation-maximization algorithm to estimate the parameters. This study used data from the Centre for the AIDS Programme of Research in South Africa (CAPRISA) 002 Acute Infection Study. The response variable is the longitudinal CD4 count from HIV-infected patients who were initiated on highly active antiretroviral therapy (HAART), and the explanatory variables are relevant baseline characteristics of the patients. RESULTS: The analysis obtained robust parameter estimates at various locations of the conditional distribution. For instance, our results showed that baseline BMI (at [Formula: see text] 0.05: [Formula: see text]), baseline viral load (at [Formula: see text] 0.05: [Formula: see text] [Formula: see text]), and post-HAART initiation (at [Formula: see text] 0.05: [Formula: see text]) were significant factors for the CD4 count across the fitted quantiles. CONCLUSIONS: CD4 cell recovery in response to post-HAART initiation was observed across all fitted quantile levels.
Compared to HIV-infected patients with low viral load levels at baseline, HIV-infected patients who enrolled in treatment with a high baseline viral load showed a significant negative effect on CD4 cell counts at the upper quantiles. HIV-infected patients registered with a high BMI at baseline had improved CD4 cell counts after treatment, but physicians should not ignore this group of patients clinically. It is also crucial for physicians to closely monitor patients with a low BMI before and after starting HAART.
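The quantile-regression machinery above reduces, in the simplest intercept-only case, to a stochastic approximation recursion on the check (pinball) loss; this toy sketch estimates a single quantile of a simulated stream (all constants are illustrative):

```python
import random

random.seed(5)

tau = 0.75                      # target quantile level
q = 0.0
for k in range(1, 50001):
    yv = random.gauss(10.0, 2.0)              # stream of simulated responses
    # stochastic (sub)gradient step on the check loss rho_tau(y - q):
    # move up with weight tau when y >= q, down with weight 1 - tau when y < q
    q += (20.0 / k) * (tau - (1.0 if yv < q else 0.0))

print(q)  # near the true 0.75 quantile of N(10, 2): 10 + 2 * 0.6745 ~ 11.35
```

The asymmetric weights tau and 1 - tau are exactly what the asymmetric Laplace likelihood encodes, which is why maximizing it targets a conditional quantile rather than the mean.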


Subject(s)
HIV Infections, Highly Active Antiretroviral Therapy, CD4 Lymphocyte Count, HIV Infections/drug therapy, Humans, South Africa/epidemiology, Viral Load
16.
J Stat Plan Inference ; 221: 90-99, 2022 Dec.
Article in English | MEDLINE | ID: mdl-37711732

ABSTRACT

Bayesian response-adaptive clinical trials are currently evaluating experimental therapies for several diseases. Adaptive decisions, such as pre-planned variations of the randomization probabilities, attempt to accelerate the development of new treatments. In most cases, the design of response-adaptive trials requires time-consuming simulation studies to describe operating characteristics, such as type I/II error rates, across plausible scenarios. We investigate large-sample approximations of pivotal operating characteristics in Bayesian Uncertainty directed trial Designs (BUDs). A BUD trial utilizes an explicit metric u to quantify the information accrued during the study on parameters of interest, for example the treatment effects. The randomization probabilities vary over time to minimize the uncertainty summary u at completion of the study. We provide an asymptotic analysis (i) of the allocation of patients to treatment arms and (ii) of the randomization probabilities. For BUDs with outcome distributions belonging to the natural exponential family with quadratic variance function, we illustrate the asymptotic normality of the number of patients assigned to each arm and of the randomization probabilities. We use these results to approximate relevant operating characteristics such as the power of the BUD. We evaluate the accuracy of the approximations through simulations under several scenarios for binary, time-to-event and continuous outcome models.

17.
J Pharmacokinet Pharmacodyn ; 48(4): 581-595, 2021 08.
Article in English | MEDLINE | ID: mdl-33884580

ABSTRACT

First-order conditional estimation (FOCE) has been the most frequently used estimation method in NONMEM, a leading program for population pharmacokinetic/pharmacodynamic modeling. However, with growing data complexity, the performance of FOCE is challenged by long run times, convergence problems and model instability. NONMEM 7 introduced expectation-maximization (EM) estimation methods and FOCE with the FAST option (FOCE FAST). In this study, we compared the performance of FOCE, FOCE FAST, and two EM methods, namely importance sampling (IMP) and stochastic approximation expectation-maximization (SAEM), utilizing the rich pharmacokinetic data of oxfendazole and its two metabolites obtained from a first-in-human single ascending dose study in healthy adults. All methods yielded similar parameter estimates, but great differences were observed in parameter precision and modeling time. For the simpler models (i.e., models of oxfendazole and/or oxfendazole sulfone), FOCE and FOCE FAST were more efficient than the EM methods, with shorter run times and comparable parameter precision. FOCE FAST was about twice as fast as FOCE, but it was prone to premature termination. For the most complex model (i.e., the model of all three analytes, one of which had a high proportion of data below the quantification limit), FOCE failed to reliably assess parameter precision, while the parameter precision obtained by IMP and SAEM was similar, with SAEM being the faster method. IMP was more sensitive to model misspecification; without pre-systemic metabolism, the IMP analysis failed to converge. With parallel computing introduced in NONMEM 7.2, modeling speed increased less than proportionally as the number of CPUs increased from 1 to 16.


Subject(s)
Statistical Models, Pharmacokinetics, Benzimidazoles/pharmacokinetics, Benzimidazoles/pharmacology, Fenbendazole/pharmacokinetics, Fenbendazole/pharmacology, Humans, Nonlinear Dynamics, Pharmacology
18.
Med Decis Making ; 41(4): 386-392, 2021 05.
Article in English | MEDLINE | ID: mdl-33504258

ABSTRACT

Policy makers need decision tools to determine when to use physical distancing interventions to maximize the control of COVID-19 while minimizing the economic and social costs of these interventions. We describe a pragmatic decision tool to characterize adaptive policies that combine real-time surveillance data with clear decision rules to guide when to trigger, continue, or stop physical distancing interventions during the current pandemic. In model-based experiments, we find that adaptive policies characterized by our proposed approach prevent more deaths and require a shorter overall duration of physical distancing than alternative physical distancing policies. Our proposed approach can readily be extended to more complex models and interventions.
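A toy version of such a trigger rule, using a deterministic SIR model with made-up thresholds and transmission rates rather than the article's model, shows how a surveillance-triggered policy caps peak prevalence:

```python
# Toy deterministic SIR with a surveillance-triggered distancing rule:
# turn distancing on when prevalent infections exceed a trigger threshold,
# turn it off below a release threshold. All parameters are illustrative.

def run(adaptive):
    s, i, r = 0.999, 0.001, 0.0
    beta_off, beta_on, gamma = 0.30, 0.10, 0.10
    on, peak = False, 0.0
    for _ in range(1000):
        if adaptive:
            if i > 0.02:
                on = True       # trigger distancing
            elif i < 0.005:
                on = False      # relax it
        beta = beta_on if on else beta_off
        new, rec = beta * s * i, gamma * i
        s, i, r = s - new, i + new - rec, r + rec
        peak = max(peak, i)
    return peak

print(run(False), run(True))    # adaptive policy keeps the peak far lower
```

The trigger/release pair forms a hysteresis band, so the policy does not flap on and off around a single threshold; that is the kind of clear decision rule the decision tool formalizes.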


Subject(s)
COVID-19/prevention & control, Cost-Benefit Analysis, Decision Support Techniques, Pandemics, Physical Distancing, Policy Making, Policies, Costs and Cost Analysis, Decision Making, Humans, Theoretical Models, SARS-CoV-2
19.
J Comput Graph Stat ; 30(3): 794-807, 2021.
Article in English | MEDLINE | ID: mdl-35936018

ABSTRACT

This article introduces a nonparametric approach to spectral analysis of a high-dimensional multivariate nonstationary time series. The procedure is based on a novel frequency-domain factor model that provides a flexible yet parsimonious representation of spectral matrices from a large number of simultaneously observed time series. Real and imaginary parts of the factor loading matrices are modeled independently using a prior that is formulated from the tensor product of penalized splines and multiplicative gamma process shrinkage priors, allowing for infinitely many factors with loadings increasingly shrunk towards zero as the column index increases. Formulated in a fully Bayesian framework, the time series is adaptively partitioned into approximately stationary segments, where both the number and locations of partition points are assumed unknown. Stochastic approximation Monte Carlo (SAMC) techniques are used to accommodate the unknown number of segments, and a conditional Whittle likelihood-based Gibbs sampler is developed for efficient sampling within segments. By averaging over the distribution of partitions, the proposed method can approximate both abrupt and slowly varying changes in spectral matrices. Performance of the proposed model is evaluated by extensive simulations and demonstrated through the analysis of high-density electroencephalography.

20.
Stat Comput ; 32: 7, 2021 Dec 06.
Article in English | MEDLINE | ID: mdl-35125678

ABSTRACT

Bayesian modelling enables us to accommodate complex forms of data and make comprehensive inference, but the effect of partial misspecification of the model is a concern. One approach in this setting is to modularize the model and prevent feedback from suspect modules, using a cut model. After observing data, this leads to the cut distribution, which normally does not have a closed form. Previous studies have proposed algorithms to sample from this distribution, but these algorithms have unclear theoretical convergence properties. To address this, we propose a new algorithm, the stochastic approximation cut (SACut) algorithm, as an alternative. The algorithm is divided into two parallel chains: the main chain targets an approximation to the cut distribution, while the auxiliary chain is used to form an adaptive proposal distribution for the main chain. We prove convergence of the samples drawn by the proposed algorithm and present the exact limit. Although SACut is biased, because the main chain does not target the exact cut distribution, we prove that this bias can be reduced geometrically by increasing a user-chosen tuning parameter. In addition, parallel computing can easily be adopted for SACut, which greatly reduces computation time.
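The cut idea itself (sample the first module's posterior from its own data only, then propagate draws forward with no feedback from the suspect module) can be shown directly in a conjugate toy model where every step has a closed form; the data and priors below are invented for illustration:

```python
import random

random.seed(6)

# Two-module toy model: theta ~ N(0, 1) prior, y1 ~ N(theta, 1) (module 1);
# phi ~ N(theta, 1), y2 ~ N(phi, 1) (module 2, suspected of misspecification).
y1 = [1.8, 2.2, 2.0, 1.9]          # module-1 data
y2 = [-5.0, -4.5, -5.5]            # suspect module-2 data

# Conjugate posterior for theta from module-1 data ONLY: feedback is "cut".
n1 = len(y1)
post_var1 = 1.0 / (1.0 + n1)
post_mean1 = post_var1 * sum(y1)

draws_theta, draws_phi = [], []
for _ in range(20000):
    theta = random.gauss(post_mean1, post_var1 ** 0.5)   # cut draw for theta
    # conditional conjugate posterior for phi given theta and module-2 data
    n2 = len(y2)
    v = 1.0 / (1.0 + n2)
    m = v * (theta + sum(y2))
    draws_theta.append(theta)
    draws_phi.append(random.gauss(m, v ** 0.5))

avg_theta = sum(draws_theta) / len(draws_theta)
print(avg_theta)  # matches the module-1 posterior mean: the suspect data y2
                  # never feed back into theta
```

In non-conjugate models this nested sampling has no closed form, which is exactly the setting where algorithms such as SACut are needed.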
