Results: 1 - 18 of 18
1.
Sensors (Basel) ; 24(12)2024 Jun 19.
Article in English | MEDLINE | ID: mdl-38931771

ABSTRACT

Managing car parking systems is a complex process because multiple constraints, both organizational and operational, must be considered. In this paper, a constraint optimization model for dynamic parking space allocation is introduced, and an ad hoc algorithm is proposed and explained to achieve the goals of the model. This paper contributes an intelligent prioritization mechanism that considers user schedule shifts and parking constraints and assigns suitable parking slots based on a dynamic distribution. The proposed model is implemented to demonstrate the applicability of our approach, and a benchmark based on well-defined metrics is constructed to validate the model and the results achieved.
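To illustrate the core idea of priority-aware slot assignment, here is a minimal Python sketch (not the authors' model): users and slots are matched by solving a linear assignment problem in which higher-priority users see lower costs for nearby slots. All data, weights, and sizes are hypothetical.

```python
# Minimal sketch: assign users to parking slots by minimizing a
# priority-weighted cost. Data and weighting are illustrative only.
import numpy as np
from scipy.optimize import linear_sum_assignment

rng = np.random.default_rng(0)
n_users, n_slots = 5, 8                              # hypothetical sizes
distance = rng.uniform(10, 200, (n_users, n_slots))  # meters to entrance
priority = np.array([3, 1, 2, 3, 1])                 # higher = more important user

# Higher-priority users get proportionally cheaper access to close slots,
# so the solver prefers giving them the best positions.
cost = distance / priority[:, None]

rows, cols = linear_sum_assignment(cost)             # optimal one-to-one assignment
for u, s in zip(rows, cols):
    print(f"user {u} (priority {priority[u]}) -> slot {s} ({distance[u, s]:.0f} m)")
```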

2.
Sci Rep ; 14(1): 13589, 2024 Jun 12.
Article in English | MEDLINE | ID: mdl-38866943

ABSTRACT

The calibration of reservoir models using production data can enhance the reliability of predictions. However, history matching often leads to only a few matched models, and the original geological interpretation is not always preserved. Therefore, there is a need for stochastic methodologies for history matching. The Ensemble Kalman Filter (EnKF) is a well-known Monte Carlo method that updates reservoir models in real time. When new production data becomes available, the ensemble of models is updated accordingly. The initial ensemble is created using the prior model, and the posterior probability function is sampled through a series of updates. In this study, EnKF was employed to evaluate the uncertainty of production forecasts for a specific development plan and to match historical data to a real field reservoir model. This study represents the first attempt to combine EnKF with an integrated model that includes a genuine oil reservoir, actual production wells, a surface choke, a surface pipeline, a separator, and a PID pressure controller. The research optimized a real integrated production system, considering the constraint that there should be no slug flow at the inlet of the separator. The objective function was to maximize the net present value (NPV). Geological data was used to model uncertainty using Sequential Gaussian Simulation. Porosity scenarios were generated, and conditioning the porosity to well data yielded improved results. Ensembles were employed to balance accuracy and efficiency, demonstrating a reduction in porosity uncertainty due to production data. This study revealed that utilizing a PID pressure controller for the production separator can enhance oil production by 59% over 20 years, resulting in the generation of 2.97 million barrels of surplus oil in the field and significant economic gains.
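For readers unfamiliar with the EnKF update, the following Python sketch performs one stochastic analysis step on a synthetic state vector; it is not the paper's integrated reservoir workflow, and all dimensions, observation locations, and noise levels are made up.

```python
# Sketch of a single stochastic EnKF analysis step on a synthetic state
# (e.g., gridded porosity); the reservoir/well model itself is not included.
import numpy as np

rng = np.random.default_rng(1)
n_state, n_obs, n_ens = 50, 4, 100

ensemble = rng.normal(0.2, 0.05, (n_state, n_ens))    # prior porosity ensemble
H = np.zeros((n_obs, n_state))                        # observation operator
H[np.arange(n_obs), [5, 15, 25, 35]] = 1.0            # observe 4 grid cells
obs = np.array([0.22, 0.18, 0.25, 0.21])              # measured values
R = 0.01**2 * np.eye(n_obs)                           # observation error covariance

# Ensemble anomalies and sample covariance
mean = ensemble.mean(axis=1, keepdims=True)
A = ensemble - mean
P = A @ A.T / (n_ens - 1)

# Kalman gain and perturbed-observation update of every member
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
perturbed = obs[:, None] + rng.multivariate_normal(np.zeros(n_obs), R, n_ens).T
analysis = ensemble + K @ (perturbed - H @ ensemble)

print("prior spread    :", A.std())
print("posterior spread:", (analysis - analysis.mean(axis=1, keepdims=True)).std())
```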

3.
MethodsX ; 10: 102181, 2023.
Article in English | MEDLINE | ID: mdl-37152671

ABSTRACT

Many-objective truss structure problems, ranging from small to large scale and from few to many design variables, are investigated in this study. Mass, compliance, first natural frequency, and buckling factor are assigned as objective functions. Since only a limited number of optimization methods have been developed for many-objective truss optimization, it is important to assess the performance of modern algorithms on these problems in order to develop more effective techniques in the future. This study therefore contributes a comparative investigation of eighteen well-established algorithms, in various dimensions, using four metrics for solving challenging truss problems with many objectives. The statistical analysis is based on the best mean and standard deviation of the objective function outcomes, and on Friedman's rank test. MMIPDE is the best algorithm in the overall comparison, while SHAMODE with the whale optimisation approach and SHAMODE are the runners-up.
• A comparative test to measure the efficiency of eighteen state-of-the-practice methods is performed.
• Small to large-scale truss design challenges are proposed for the validation.
• The performance is measured using four metrics and Friedman's rank test.
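As a small illustration of how many-objective results are compared, the following Python sketch extracts the non-dominated (Pareto) set from a batch of objective vectors; the candidate designs are random placeholders, not truss solutions.

```python
# Sketch: extract the non-dominated (Pareto) front from a set of objective
# vectors, assuming all objectives are minimized (mass, compliance, etc.).
import numpy as np

def pareto_front(F):
    """Return a boolean mask of non-dominated rows of F (lower is better)."""
    n = F.shape[0]
    nondominated = np.ones(n, dtype=bool)
    for i in range(n):
        if not nondominated[i]:
            continue
        # some j dominates i if it is <= in every objective and < in at least one
        dominates_i = np.all(F <= F[i], axis=1) & np.any(F < F[i], axis=1)
        if dominates_i.any():
            nondominated[i] = False
    return nondominated

rng = np.random.default_rng(2)
F = rng.random((200, 4))          # 200 candidate designs, 4 objectives
mask = pareto_front(F)
print(f"{mask.sum()} non-dominated designs out of {len(F)}")
```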

4.
Surg Endosc ; 37(6): 4754-4765, 2023 06.
Article in English | MEDLINE | ID: mdl-36897405

ABSTRACT

BACKGROUND: We previously developed grading metrics for quantitative performance measurement in simulated endoscopic sleeve gastroplasty (ESG) to create a scalar reference for classifying subjects as experts or novices. In this work, we used synthetic data generation and expanded our skill-level analysis using machine learning techniques. METHODS: We used the synthetic data generation algorithm SMOTE to expand and balance our dataset of seven actual simulated ESG procedures with synthetic data. We performed optimization to seek optimal metrics for classifying experts and novices by identifying the most critical and distinctive sub-tasks. We used support vector machine (SVM), AdaBoost, K-nearest neighbors (KNN), Kernel Fisher discriminant analysis (KFDA), random forest, and decision tree classifiers to classify surgeons as experts or novices after grading. Furthermore, we used an optimization model to create weights for each task and separate the clusters by maximizing the distance between the expert and novice scores. RESULTS: We split our dataset into a training set of 15 samples and a testing set of five samples. We put this dataset through six classifiers, SVM, KFDA, AdaBoost, KNN, random forest, and decision tree, resulting in 0.94, 0.94, 1.00, 1.00, 1.00, and 1.00 training accuracy, respectively, and 1.00 testing accuracy for SVM and AdaBoost. Our optimization model increased the separation between the expert and novice groups from 2 to 53.72. CONCLUSION: This paper shows that feature reduction, in combination with classification algorithms such as SVM and KNN, can be used to classify endoscopists as experts or novices based on the results recorded using our grading metrics. Furthermore, this work introduces a non-linear constraint optimization to separate the two clusters and find the most important tasks using weights.
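The general pipeline can be sketched in Python as follows: oversample a small, imbalanced dataset with SMOTE, then train an SVM to separate experts from novices. The feature values and class sizes below are placeholders, not the paper's grading metrics.

```python
# Sketch: balance a tiny, imbalanced skill dataset with SMOTE, then classify
# expert vs. novice with an SVM. Features are random placeholders.
import numpy as np
from imblearn.over_sampling import SMOTE
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0.8, 0.1, (5, 6)),    # 5 "expert" procedures
               rng.normal(0.4, 0.1, (2, 6))])   # 2 "novice" procedures
y = np.array([1] * 5 + [0] * 2)

# SMOTE synthesizes minority-class samples by interpolating between neighbours
X_bal, y_bal = SMOTE(k_neighbors=1, random_state=0).fit_resample(X, y)

X_tr, X_te, y_tr, y_te = train_test_split(
    X_bal, y_bal, test_size=0.25, stratify=y_bal, random_state=0)
clf = SVC(kernel="rbf").fit(X_tr, y_tr)
print("test accuracy:", accuracy_score(y_te, clf.predict(X_te)))
```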


Subject(s)
Gastroplasty, Humans, Algorithms, Machine Learning, Random Forest, Support Vector Machine
5.
Front Robot AI ; 9: 851846, 2022.
Article in English | MEDLINE | ID: mdl-35845255

ABSTRACT

This article describes an approach to multiagent search planning for a team of agents. A team of UAVs tasked with conducting a forest fire search was selected as the use case, although the solutions are applicable to other domains. Fixed-path methods for multiagent search (e.g., parallel track, expanding square) can produce predictable and structured paths, with the main limitations being poor management of agents' resources and limited adaptability, since they rely on predefined geometric paths. On the other hand, pseudorandom methods allow agents to generate well-separated paths, but they can be computationally expensive and can result in a lack of coordination among agents' activities. We present a hybrid solution that exploits the complementary strengths of fixed-pattern and pseudorandom methods, i.e., an approach that is resource-efficient, predictable, adaptable, and scalable. Our approach builds on the Delaunay triangulation of systematically selected waypoints to allocate agents to explore a specific region while optimizing a given set of mission constraints. We implement our approach in a simulation environment, comparing the performance of the proposed algorithm with fixed-path and pseudorandom baselines. The results demonstrate the resource efficiency, predictability, scalability, and adaptability of the generated paths. We also demonstrate the proposed algorithm's application on real UAVs.
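A minimal Python sketch of the triangulation step is shown below: waypoints sampled on a grid are Delaunay-triangulated and the resulting triangles are divided among agents. The region size, grid spacing, and allocation rule are illustrative assumptions.

```python
# Sketch: Delaunay triangulation of systematically sampled waypoints over a
# rectangular search region; triangle centroids are then divided among agents.
import numpy as np
from scipy.spatial import Delaunay

xs, ys = np.meshgrid(np.linspace(0, 1000, 6), np.linspace(0, 600, 4))
waypoints = np.column_stack([xs.ravel(), ys.ravel()])   # candidate waypoints

tri = Delaunay(waypoints)
centroids = waypoints[tri.simplices].mean(axis=1)       # one centroid per triangle

n_agents = 3
assignments = np.array_split(np.argsort(centroids[:, 0]), n_agents)
for a, idx in enumerate(assignments):
    print(f"agent {a}: {len(idx)} triangles to cover")
```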

6.
J Environ Manage ; 317: 115446, 2022 Sep 01.
Article in English | MEDLINE | ID: mdl-35751256

ABSTRACT

Distributed Constraint Optimization (DCOP)-based approaches, as the distributed version of constraint optimization, provide a framework for coordinated decision making by a team of agents. In this paper, an agent-based DCOP model is developed to allocate water and reclaimed wastewater to demands while considering the conflicting interests of the involved stakeholders. One of the well-known DCOP algorithms, ADOPT, is modified to incorporate an agent responsible for monitoring and conserving water resources. This new algorithm considers the social characteristics of agents and a new form of interaction between agents. For the first time in the literature, a real-world water and reclaimed wastewater allocation problem is formulated as a DCOP and solved using the Modified ADOPT (MADOPT) algorithm. To evaluate the MADOPT algorithm, it is applied to a water and reclaimed wastewater allocation problem in Tehran, Iran. The results illustrate the applicability and efficiency of the proposed methodology in dealing with large-scale multi-agent water resources systems. It is also shown that agents' selfishness and social relationships can affect their water use policies.
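As a toy illustration of casting water allocation as a constraint optimization problem, the Python sketch below solves a two-agent allocation with a conservation penalty by exhaustive search; it is centralized and deliberately simplistic, unlike the distributed MADOPT algorithm, and all figures are hypothetical.

```python
# Toy water-allocation constraint optimization solved by exhaustive search.
# Two demand agents choose allocations; a conservation term penalizes
# withdrawals beyond the available volume. All numbers are hypothetical.
from itertools import product

domains = {"agriculture": [10, 20, 30], "industry": [5, 10, 15]}  # Mm3/yr
available = 40

def utility(assign):
    total = sum(assign.values())
    benefit = 2.0 * assign["agriculture"] + 3.0 * assign["industry"]
    penalty = 10.0 * max(0, total - available)   # conservation constraint
    return benefit - penalty

best = max(
    (dict(zip(domains, vals)) for vals in product(*domains.values())),
    key=utility,
)
print("best allocation:", best, "utility:", utility(best))
```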


Subject(s)
Wastewater, Water, Algorithms, Iran, Wastewater/analysis, Water/analysis, Water Resources
7.
Ann Oper Res ; 289(1): 33-50, 2020 Jun.
Article in English | MEDLINE | ID: mdl-33343053

ABSTRACT

Understanding brain computation requires assembling a complete catalog of its architectural components. Although the brain is organized into several anatomical and functional regions, it is ultimately the neurons in every region that are responsible for cognition and behavior. Thus, classifying neuron types throughout the brain and quantifying the population sizes of distinct classes in different regions is a key subject of research in the neuroscience community. The total number of neurons in the brain has been estimated for multiple species, but the definition and population size of each neuron type are still open questions even in common model organisms: the so-called "cell census" problem. We propose a methodology that uses operations research principles to estimate the number of neurons of each type based on available information on their distinguishing properties. Thus, assuming a set of neuron type definitions, we provide a solution to the issue of assessing their relative proportions. Specifically, we present a three-step approach that includes literature search, equation generation, and numerical optimization. Computationally solving the set of equations generated by literature mining yields best estimates or most likely ranges for the number of neurons of each type. While this strategy can be applied to any neural system, we illustrate its usage on the rodent hippocampus.
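The equation-solving step can be illustrated with a tiny Python example: two literature-derived statements become linear equations that are solved for neuron-type counts under non-negativity bounds. The numbers and neuron types are invented for illustration.

```python
# Sketch: turning literature statements into linear equations and solving for
# neuron-type counts with bounded least squares. Counts and ratios are made up.
import numpy as np
from scipy.optimize import lsq_linear

# Unknowns: x = [pyramidal cells, interneurons] in some region.
# "Total neurons ~ 1.0e6"           ->  x0 + x1 = 1.0e6
# "Interneurons are ~10% of total"  ->  -0.1*x0 + 0.9*x1 = 0
A = np.array([[1.0, 1.0],
              [-0.1, 0.9]])
b = np.array([1.0e6, 0.0])

res = lsq_linear(A, b, bounds=(0, np.inf))   # counts cannot be negative
print("estimated counts:", res.x.round())
```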

8.
Sensors (Basel) ; 19(6)2019 Mar 23.
Article in English | MEDLINE | ID: mdl-30909562

ABSTRACT

Emergency observations are missions executed by Earth observation satellites to support urgent ground operations. They are becoming increasingly important for meeting the requirements of highly dynamic and highly time-sensitive observation missions, such as disaster monitoring and early warning. Considering the complex scheduling problem of Earth observation satellites under emergency conditions, a multi-satellite dynamic mission scheduling model based on mission priority is proposed in this paper. A calculation model of mission priority is designed for emergency missions based on seven impact factors. In the satellite mission scheduling, the resource constraints of scheduling are analyzed in detail, and the optimization objective function is built to maximize the observation mission priority and mission revenues and to minimize the waiting time of missions with urgent execution-time requirements. A hybrid genetic tabu search algorithm is then used to obtain the initial satellite scheduling plan. When new emergency missions arrive dynamically before the scheduling plan is released, a dynamic scheduling algorithm based on mission priority is proposed to handle the scheduling problem caused by the newly arrived missions and to obtain their scheduling plan. A simulation experiment was conducted for different numbers of initial and newly arrived missions, and the scheduling results were evaluated with a model performance evaluation function. The results show that the execution probability of high-priority missions increased because mission priority was taken into account in the model. When more satellite resources are available and new missions arrive dynamically, the satellite resources can be reasonably allocated to these missions based on mission priority. Overall, this approach reduces the complexity of the dynamic adjustment and maintains the stability of the initial scheduling plan.
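A minimal Python sketch of the dynamic-insertion idea follows: a newly arrived mission receives a weighted priority score and is greedily placed in the first feasible gap of an existing timeline. The factors, weights, and times are hypothetical simplifications of the paper's seven-factor priority model and hybrid genetic tabu search.

```python
# Sketch: score a newly arrived emergency mission with a weighted priority and
# greedily insert it into the first free gap of an existing observation plan.
scheduled = [(0, 120), (200, 330), (500, 620)]   # existing plan (start, end), seconds
new_duration = 90
window = (100, 700)                               # allowed execution window

factors = {"urgency": 0.9, "revenue": 0.6, "cloud_free": 0.8}   # illustrative scores
weights = {"urgency": 0.5, "revenue": 0.3, "cloud_free": 0.2}
priority = sum(weights[k] * factors[k] for k in factors)

def first_fit(plan, duration, window):
    """Return the earliest feasible start time inside the window, or None."""
    t = window[0]
    for start, end in sorted(plan) + [(window[1], window[1])]:
        if start - t >= duration:
            return t
        t = max(t, end)
    return None

start = first_fit(scheduled, new_duration, window)
print(f"priority={priority:.2f}, inserted at t={start}")
```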

9.
Mathematics (Basel) ; 7(6): 537, 2019 Jun.
Article in English | MEDLINE | ID: mdl-32483528

ABSTRACT

In this paper, we consider several new applications of the recently introduced mathematical framework of the Theory of Connections (ToC). This framework transforms constrained problems into unconstrained problems by introducing constraint-free variables. Using this transformation, various ordinary differential equations (ODEs), partial differential equations (PDEs) and variational problems can be formulated where the constraints are always satisfied. The resulting equations can then be easily solved by introducing a global basis function set (e.g., Chebyshev, Legendre, etc.) and minimizing a residual at pre-defined collocation points. In this paper, we highlight the utility of ToC by introducing various problems that can be solved using this framework including: (1) analytical linear constraint optimization; (2) the brachistochrone problem; (3) over-constrained differential equations; (4) inequality constraints; and (5) triangular domains.
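The following Python sketch shows a constrained expression for two point constraints: for any free function g(x), the expression satisfies y(0)=y0 and y(1)=y1 exactly, which is the mechanism ToC uses to turn constrained problems into unconstrained ones. The particular g and boundary values are arbitrary.

```python
# Sketch of a Theory of Connections constrained expression: for any free
# function g(x), y(x) below satisfies y(0)=y0 and y(1)=y1 exactly, so an
# optimizer can search over g without ever violating the boundary constraints.
import numpy as np

y0, y1 = 2.0, -1.0                         # hypothetical boundary values

def constrained(g, x):
    return g(x) + (1 - x) * (y0 - g(0.0)) + x * (y1 - g(1.0))

g = lambda x: np.sin(3 * x) + 0.5 * x**2   # arbitrary free function
x = np.linspace(0.0, 1.0, 5)
print(constrained(g, x))                   # endpoints are exactly y0 and y1
```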

10.
J Struct Biol ; 203(2): 120-134, 2018 08.
Article in English | MEDLINE | ID: mdl-29689299

ABSTRACT

2D electron crystallography can be used to study small membrane proteins in their native environment. Obtaining highly ordered 2D crystals is difficult and time-consuming. However, 2D crystals diffracting to only 10-12 Å can be prepared relatively conveniently in most cases. We have developed image-processing algorithms that allow a high-resolution 3D structure to be generated from cryo-electron crystallography images of badly ordered crystals. These include movie-mode unbending; refinement over sub-tiles of the images in order to locally refine the sample tilt geometry; implementation of different CTF correction schemes; and an iterative method to apply known constraints in real and reciprocal space to approximate amplitudes and phases in the so-called missing-cone regions. Applied to a dataset of the potassium channel MloK1, these algorithms show significant resolution improvements, to better than 5 Å.


Subject(s)
Crystallography, X-Ray/methods, Image Processing, Computer-Assisted/methods, Membrane Proteins/chemistry, Membrane Proteins/ultrastructure, Algorithms, Cryoelectron Microscopy/methods, Software
11.
Stat Methods Med Res ; 27(11): 3436-3446, 2018 11.
Article in English | MEDLINE | ID: mdl-28406062

ABSTRACT

In medical and epidemiologic studies, relative risk is usually the parameter of interest. However, calculating relative risk using the standard log-binomial regression approach often encounters non-convergence. A modified Poisson regression, which uses a robust variance, was proposed by Zou in 2004. Although the modified Poisson regression with the sandwich variance estimator is valid for estimating relative risk, the predicted probability of the outcome may be greater than the natural boundary of 1 for unobserved but plausible covariate combinations. Moreover, the lower and upper bounds of confidence intervals for predicted probabilities can fall outside (0, 1). Chu and Cole, in 2010, proposed a Bayesian approach to overcome this issue, using the posterior median for parameter estimation. However, the Bayesian approach may provide biased estimates, especially when the probability of the outcome is high. In this article, we propose an alternative constraint optimization approach for estimating relative risk. Our approach reaches similar or better performance than the Bayesian approach in terms of bias, root mean square error, coverage rate, and predictive probabilities. Simulation studies are conducted to demonstrate the usefulness of this approach. Our method is also illustrated with data from the Prospective Registry Evaluating Myocardial Infarction: Event and Recovery study.
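A minimal Python sketch of the constrained-estimation idea follows: the log-binomial likelihood is maximized subject to the constraint that every fitted probability stays at or below 1. The simulated data, starting values, and solver choice are illustrative and do not reproduce the authors' estimator.

```python
# Sketch: relative-risk (log-binomial) estimation as a constrained optimization,
# keeping every fitted probability inside (0, 1]. Data are simulated.
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(4)
n = 500
exposure = rng.binomial(1, 0.5, n)
X = np.column_stack([np.ones(n), exposure])       # intercept + exposure
true_beta = np.array([np.log(0.3), np.log(1.8)])  # baseline risk 0.3, RR 1.8
y = rng.binomial(1, np.exp(X @ true_beta))

def negloglik(beta):
    p = np.clip(np.exp(X @ beta), 1e-10, 1 - 1e-10)
    return -np.sum(y * np.log(p) + (1 - y) * np.log(1 - p))

# Inequality constraints: X_i beta <= 0 for every row, i.e. fitted p_i <= 1.
cons = [{"type": "ineq", "fun": lambda b: -(X @ b)}]
fit = minimize(negloglik, x0=np.array([-1.0, 0.0]), method="SLSQP",
               constraints=cons)
print("estimated relative risk:", round(float(np.exp(fit.x[1])), 3))
```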


Subject(s)
Risk Adjustment/methods, Risk Assessment, Algorithms, Bayes Theorem, Binomial Distribution, Biomedical Research/statistics & numerical data, Epidemiologic Studies, Female, Humans, Male, Poisson Distribution, Risk Adjustment/statistics & numerical data, Risk Assessment/statistics & numerical data
12.
Biom J ; 60(1): 207-215, 2018 01.
Article in English | MEDLINE | ID: mdl-29110320

ABSTRACT

The risk difference is an intelligible measure for comparing disease incidence in two exposure or treatment groups. Despite its convenience in interpretation, it is less prevalent in epidemiological and clinical areas where regression models are required in order to adjust for confounding. One major barrier to its popularity is that standard linear binomial or Poisson regression models can provide estimated probabilities outside the range of (0,1), resulting in possible convergence issues. For estimating adjusted risk differences, we propose a general framework covering various constraint approaches based on binomial and Poisson regression models. The proposed methods span ordinary least squares, maximum likelihood estimation, and Bayesian inference. Compared to existing approaches, our methods prevent estimates and confidence intervals of predicted probabilities from falling outside the valid range. Through extensive simulation studies, we demonstrate that the proposed methods solve the problem of estimates or confidence limits of predicted probabilities falling outside (0,1), while offering performance comparable to existing alternatives in terms of bias, variability, and coverage rates in point and interval estimation of the risk difference. An application study is performed using data from the Prospective Registry Evaluating Myocardial Infarction: Event and Recovery (PREMIER) study.


Subject(s)
Biometry/methods, Bayes Theorem, Humans, Risk Assessment
13.
Int J Approx Reason ; 90: 208-225, 2017 Nov.
Article in English | MEDLINE | ID: mdl-29755201

ABSTRACT

We consider causal structure estimation from time series data in which measurements are obtained at a coarser timescale than the causal timescale of the underlying system. Previous work has shown that such subsampling can lead to significant errors about the system's causal structure if not properly taken into account. In this paper, we first consider the search for system timescale causal structures that correspond to a given measurement timescale structure. We provide a constraint satisfaction procedure whose computational performance is several orders of magnitude better than previous approaches. We then consider finite-sample data as input, and propose the first constraint optimization approach for recovering system timescale causal structure. This algorithm optimally recovers from possible conflicts due to statistical errors. We then apply the method to real-world data, investigate the robustness and scalability of our method, consider further approaches to reduce underdetermination in the output, and perform an extensive comparison between different solvers on this inference problem. Overall, these advances build towards a full understanding of non-parametric estimation of system timescale causal structures from sub-sampled time series data.

14.
JMLR Workshop Conf Proc ; 52: 216-227, 2016 Aug.
Article in English | MEDLINE | ID: mdl-28203316

ABSTRACT

This paper focuses on causal structure estimation from time series data in which measurements are obtained at a coarser timescale than the causal timescale of the underlying system. Previous work has shown that such subsampling can lead to significant errors about the system's causal structure if not properly taken into account. In this paper, we first consider the search for the system timescale causal structures that correspond to a given measurement timescale structure. We provide a constraint satisfaction procedure whose computational performance is several orders of magnitude better than previous approaches. We then consider finite-sample data as input, and propose the first constraint optimization approach for recovering the system timescale causal structure. This algorithm optimally recovers from possible conflicts due to statistical errors. More generally, these advances allow for a robust and non-parametric estimation of system timescale causal structures from subsampled time series data.

15.
Int J Numer Method Biomed Eng ; 32(4): e02742, 2016 Apr.
Article in English | MEDLINE | ID: mdl-26249168

ABSTRACT

A computational study of an optimal control approach for cardiac defibrillation in a 3D geometry is presented. The cardiac bioelectric activity at the tissue and bath volumes is modeled by the bidomain model equations. The model includes intramural fiber rotation, axially symmetric around the fiber direction, and anisotropic conductivity coefficients, which are extracted from a histological image. The dynamics of the ionic currents are based on the regularized Mitchell-Schaeffer model. The controls enter in the form of electrodes, which are placed at the boundary of the bath volume with the goal of dampening undesired arrhythmias. The numerical optimization is based on Newton techniques. We demonstrate a parallel computing environment for the computation of potentials on multiple domains and for the higher-order optimization techniques.


Subject(s)
Electric Countershock, Heart Ventricles/anatomy & histology, Imaging, Three-Dimensional, Models, Cardiovascular, Algorithms, Computer Simulation, Numerical Analysis, Computer-Assisted
16.
Oncol Lett ; 10(4): 2043-2050, 2015 Oct.
Article in English | MEDLINE | ID: mdl-26622793

ABSTRACT

Intensity-modulated radiation therapy (IMRT) is able to achieve good target conformance with a limited dose to organs at risk (OARs); however, IMRT increases the irradiation volume and monitor units (MUs) required. The present study aimed to evaluate the use of an IMRT plan with fewer segments and MUs, while maintaining quality in the treatment of nasopharyngeal carcinoma. In the present study, two types of IMRT plan were therefore compared: The direct machine parameter optimization (DMPO)-RT method and the feedback constraint DMPO-RT (fc_DMPO-RT) method, which utilizes compensative feedback constraint in DMPO-RT and maintains optimization. Plans for 23 patients were developed with identical dose prescriptions. Each plan involved synchronous delivery to various targets, with identical OAR constraints, by means of 7 coplanar fields. The average dose, maximum dose, dose-volume histograms of targets and the OAR, MUs of the plan, the number of segments, delivery time and accuracy were subsequently compared. The fc_DMPO-RT exhibited superior dose distribution in terms of the average, maximum and minimum doses to the gross tumor volume compared with that of DMPO-RT (t=62.7, 20.5 and 22.0, respectively; P<0.05). The fc_DMPO-RT also resulted in a smaller maximum dose to the spinal cord (t=7.3; P<0.05), as well as fewer MUs, fewer segments and decreased treatment times than that of the DMPO-RT (t=6.2, 393.4 and 244.3, respectively; P<0.05). The fc_DMPO-RT maintained plan quality with fewer segments and MUs, and the treatment time was significantly reduced, thereby resulting in reduced radiation leakage and an enhanced curative effect. Therefore, introducing feedback constraint into DMPO may result in improved IMRT planning. In nasopharyngeal carcinoma specifically, feedback constraint resulted in the improved protection of OARs in proximity of targets (such as the brainstem and parotid) due to sharp dose distribution and reduced MUs.

17.
KDD ; 2015: 1265-1274, 2015 Aug.
Article in English | MEDLINE | ID: mdl-31452969

ABSTRACT

Computational phenotyping is the process of converting heterogeneous electronic health records (EHRs) into meaningful clinical concepts. Unsupervised phenotyping methods have the potential to leverage a vast amount of unlabeled EHR data for phenotype discovery. However, existing unsupervised phenotyping methods do not incorporate current medical knowledge and cannot directly handle missing or noisy data. We propose Rubik, a constrained non-negative tensor factorization and completion method for phenotyping. Rubik incorporates 1) guidance constraints to align with existing medical knowledge, and 2) pairwise constraints for obtaining distinct, non-overlapping phenotypes. Rubik also has built-in tensor completion that can significantly alleviate the impact of noisy and missing data. We utilize the Alternating Direction Method of Multipliers (ADMM) framework for tensor factorization and completion, which can be easily scaled through parallel computing. We evaluate Rubik on two EHR datasets, one of which contains 647,118 records for 7,744 patients from an outpatient clinic, and the other of which is a public dataset containing 1,018,614 CMS claims records for 472,645 patients. Our results show that Rubik can discover more meaningful and distinct phenotypes than the baselines. In particular, by using knowledge guidance constraints, Rubik can also discover sub-phenotypes for several major diseases. Rubik also runs around seven times faster than current state-of-the-art tensor methods. Finally, Rubik is scalable to large datasets containing millions of EHR records.
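As a greatly simplified illustration of the factorization idea (two-way instead of a higher-order tensor, and without Rubik's guidance or pairwise constraints and completion), the Python sketch below factorizes a patient-by-code count matrix into non-negative "phenotype" components; all data are synthetic.

```python
# Greatly simplified sketch: non-negative factorization of a patient-by-code
# count matrix into "phenotype" components. Rubik itself operates on
# higher-order EHR tensors with knowledge-guided constraints and completion.
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(5)
counts = rng.poisson(0.3, size=(100, 40))        # 100 patients, 40 clinical codes

model = NMF(n_components=5, init="nndsvda", max_iter=500, random_state=0)
patient_loadings = model.fit_transform(counts)   # patient membership per phenotype
phenotypes = model.components_                   # code weights per phenotype

top = np.argsort(phenotypes, axis=1)[:, -3:]     # 3 most heavily weighted codes
for k, codes in enumerate(top):
    print(f"phenotype {k}: top codes {codes.tolist()}")
```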

18.
Int J Mol Sci ; 12(1): 694-724, 2011 Jan 19.
Article in English | MEDLINE | ID: mdl-21340009

ABSTRACT

The aggregation of the amyloid-β-peptide (AβP) into well-ordered fibrils has been considered the key pathological marker of Alzheimer's disease. Molecular attributes related to the specific binding interactions, covalent and non-covalent, of a library of compounds targeting conformational scaffolds were computed employing static lattice atomistic simulations and array constructions. A combinatorial approach using isobolographic analysis was stochastically modeled employing Artificial Neural Networks and a Design of Experiments approach, namely an orthogonal Face-Centered Central Composite Design, for small molecules such as curcumin and glycosylated nornicotine exhibiting concentration-dependent behavior in modulating AβP aggregation and oligomerization. This work provides a mathematical and in silico approach that constitutes a new frontier in providing neuroscientists with a template for in vitro and in vivo experimentation. In the future, this could potentially allow neuroscientists to adopt this in silico approach for the development of novel therapeutic interventions in the neuroprotection and neurotherapy of Alzheimer's disease. In addition, the neuroprotective entities identified in this study may also be valuable in this regard.


Subject(s)
Alzheimer Disease/drug therapy, Computational Biology/methods, Curcumin/therapeutic use, Neuroprotective Agents/therapeutic use, Nicotine/analogs & derivatives, Alzheimer Disease/metabolism, Amyloid beta-Peptides/metabolism, Humans, Nicotine/therapeutic use