Results 1 - 20 of 25
1.
Comput Biol Med ; 181: 109062, 2024 Oct.
Article in English | MEDLINE | ID: mdl-39205344

ABSTRACT

We propose a state-of-the-art deep learning approach for accurate electrocardiogram (ECG) signal analysis, addressing both waveform delineation and beat type classification tasks. For beat type classification, we integrated two novel schemes into the deep learning model, significantly enhancing its performance. The first scheme is an adaptive beat segmentation method that determines the optimal duration for each heartbeat based on RR-intervals, mitigating the segmentation errors of conventional fixed-period segmentation. The second scheme incorporates the relative heart rate of the target beat compared with neighboring beats, improving the model's ability to accurately detect premature atrial contractions (PACs), which are easily confused with normal beats due to their similar morphology. Extensive evaluations on the PhysioNet QT Database, MIT-BIH Arrhythmia Database, and real-world wearable device data demonstrated the proposed approach's superior capabilities over existing methods in both tasks. The proposed approach achieved sensitivities of 99.81% for normal beats, 99.08% for premature ventricular contractions, and 97.83% for PACs in beat type classification. For waveform delineation, we achieved F1-scores of 0.9842 for non-waveform, 0.9798 for P-waves, 0.9749 for QRS complexes, and 0.9848 for T-waves. The approach significantly outperforms existing methods in PAC detection while maintaining high performance across both tasks. Integrating the two schemes into the deep learning model improved the accuracy of normal sinus rhythm and arrhythmia detection.
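The two schemes lend themselves to a compact sketch. The fraction of each RR-interval used for the window and the neighbourhood size for the relative-rate feature below are illustrative assumptions, not the paper's published configuration, and `adaptive_beat_segments` / `relative_rate_features` are hypothetical helper names.

```python
import numpy as np

def adaptive_beat_segments(r_peaks, frac=0.5):
    """Illustrative adaptive segmentation: each beat's window is scaled by its
    local RR-intervals rather than a fixed duration (the fraction is an
    assumption, not the paper's exact rule)."""
    r_peaks = np.asarray(r_peaks)
    rr = np.diff(r_peaks)                     # RR-intervals in samples
    segments = []
    for i in range(1, len(r_peaks) - 1):
        left = int(frac * rr[i - 1])          # extend back toward the previous beat
        right = int(frac * rr[i])             # extend forward toward the next beat
        segments.append((r_peaks[i] - left, r_peaks[i] + right))
    return segments

def relative_rate_features(r_peaks, k=2):
    """Relative heart-rate feature: the current RR-interval compared with the
    mean of k neighbouring RR-intervals on each side (again an illustrative choice)."""
    rr = np.diff(np.asarray(r_peaks)).astype(float)
    feats = []
    for i in range(k, len(rr) - k):
        neighbours = np.r_[rr[i - k:i], rr[i + 1:i + 1 + k]]
        feats.append(rr[i] / neighbours.mean())
    return np.array(feats)

r_peaks = np.array([100, 300, 520, 700, 910, 1120, 1330])  # toy R-peak sample indices
print(adaptive_beat_segments(r_peaks))
print(relative_rate_features(r_peaks))
```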


Subject(s)
Deep Learning, Electrocardiography, Heart Rate, Computer-Assisted Signal Processing, Humans, Electrocardiography/methods, Heart Rate/physiology, Factual Databases, Cardiac Arrhythmias/physiopathology, Cardiac Arrhythmias/diagnosis
2.
Sensors (Basel) ; 24(10)2024 May 14.
Article in English | MEDLINE | ID: mdl-38793978

ABSTRACT

The data incest problem causes inter-estimate correlation during data fusion, which yields inconsistent fusion results. The problem is especially serious in multi-sensor multi-vehicle (MSMV) systems because of the multiple relative position estimates involved, which not only lead to pessimistic estimation but also cause additional computational overhead. To address the data incest problem, we propose a new data fusion method termed the interval split covariance intersection filter (ISCIF). The general consistency of the ISCIF is proven, serving as supplementary proof for the split covariance intersection filter (SCIF). Moreover, a decentralized MSMV localization system comprising absolute and relative positioning stages is designed. In the absolute positioning stage, each vehicle uses the ISCIF algorithm to update its own position based on absolute measurements. In the relative positioning stage, the interval constraint propagation (ICP) method is implemented to preprocess the multiple relative position estimates and prepare the input data for the ISCIF, and the proposed ISCIF algorithm is then employed to realize relative positioning. Comparative simulations demonstrate that the proposed method achieves accurate and consistent results compared with state-of-the-art methods.
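For orientation, the fusion rule underlying this family of filters is plain covariance intersection, which remains consistent under unknown cross-correlation; the split and interval extensions described in the abstract build on it. The sketch below shows only that base rule, with illustrative numbers.

```python
import numpy as np
from scipy.optimize import minimize_scalar

def covariance_intersection(xa, Pa, xb, Pb):
    """Plain covariance intersection of two estimates with unknown correlation.
    The paper's SCIF/ISCIF additionally splits each covariance into dependent
    and independent parts and preprocesses inputs with interval constraint
    propagation; this sketch shows only the underlying CI fusion rule."""
    Pa_inv, Pb_inv = np.linalg.inv(Pa), np.linalg.inv(Pb)

    def fused_cov_trace(w):
        return np.trace(np.linalg.inv(w * Pa_inv + (1 - w) * Pb_inv))

    # Choose the weight that minimises the trace of the fused covariance.
    w = minimize_scalar(fused_cov_trace, bounds=(1e-6, 1 - 1e-6),
                        method="bounded").x
    P = np.linalg.inv(w * Pa_inv + (1 - w) * Pb_inv)
    x = P @ (w * Pa_inv @ xa + (1 - w) * Pb_inv @ xb)
    return x, P, w

# Example: fuse two 2-D position estimates with unknown cross-correlation.
x1, P1 = np.array([1.0, 2.0]), np.diag([0.5, 0.2])
x2, P2 = np.array([1.2, 1.9]), np.diag([0.3, 0.4])
x_f, P_f, w = covariance_intersection(x1, P1, x2, P2)
print(x_f, w)
```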

3.
Heliyon ; 10(4): e25786, 2024 Feb 29.
Article in English | MEDLINE | ID: mdl-38370234

ABSTRACT

In this paper, systems of multivariate interval linear equations with complex interval coefficients are examined, and a novel linear-algebra-based approach for locating all of their solutions is proposed. The key idea is to convert the system into an equivalent crisp polynomial system, which allows the computational machinery of Gröbner bases to be exploited. Once an appropriate Gröbner basis has been determined, all of the system's precise solutions can be calculated at once. A condition for the existence of a solution in complex interval linear systems is also established. In addition, an algorithm is devised to retrieve all solutions using the eigenvalue approach. A representative case is then solved using the proposed approach to demonstrate its efficiency and efficacy. The approach locates all solutions of linear systems with complex intervals and also determines whether a solution for the system exists. Finally, the technique is applied in the context of circuit analysis to demonstrate the effectiveness of the findings obtained.
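As a rough illustration of the Gröbner-basis step only (not the paper's interval-to-crisp conversion, which is its main contribution), the sketch below triangularises a small crisp polynomial system with SymPy and reads off all of its solutions; the toy system itself is an assumption.

```python
from sympy import symbols, groebner, solve

# Toy crisp polynomial system standing in for the system obtained after
# converting an interval linear system (that conversion is not reproduced here).
x, y = symbols('x y', complex=True)
system = [x**2 + y**2 - 5, x*y - 2]

# A lexicographic Groebner basis triangularises the system, so all solutions
# can be read off by back-substitution (here solve() performs that step).
G = groebner(system, x, y, order='lex')
solutions = solve(list(G.exprs), [x, y], dict=True)
print(G)
print(solutions)   # four exact solutions: (1, 2), (2, 1), (-1, -2), (-2, -1)
```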

4.
Sensors (Basel) ; 23(18)2023 Sep 14.
Article in English | MEDLINE | ID: mdl-37765941

ABSTRACT

Automation of visual quality inspection tasks in manufacturing with machine vision is beginning to be the de facto standard for quality inspection as manufacturers realize that machines produce more reliable, consistent and repeatable analyses much quicker than a human operator ever could. These methods generally rely on the installation of cameras to inspect and capture images of parts; however, there is yet to be a method proposed for the deployment of cameras which can rigorously quantify and certify the performance of the system when inspecting a given part. Furthermore, current methods in the field yield unrealizable exact solutions, making them impractical or impossible to actually install in a factory setting. This work proposes a set-based method of synthesizing continuous pose intervals for the deployment of cameras that certifiably satisfy constraint-based performance criteria within the continuous interval.
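To give a feel for the kind of constraint being certified, the sketch below checks a range/field-of-view constraint for a 2-D camera pose against a target point and then samples a small pose box on a grid. Grid sampling can only suggest feasibility; the set-based methods described in the abstract are what certify the whole continuous pose interval. All geometry and bounds are illustrative assumptions.

```python
import numpy as np

def sees_target(cam_xy, yaw, target, fov_deg=60.0, d_max=2.0):
    """Constraint check for a single 2-D camera pose: the target must lie
    within range d_max and within half the field of view of the optical axis."""
    v = np.asarray(target, dtype=float) - np.asarray(cam_xy, dtype=float)
    d = np.linalg.norm(v)
    bearing = np.arctan2(v[1], v[0])
    err = np.abs((bearing - yaw + np.pi) % (2 * np.pi) - np.pi)  # wrapped angle error
    return d <= d_max and np.degrees(err) <= fov_deg / 2

# Pose box to examine (illustrative): x, y in [0, 0.2] m, yaw in [40 deg, 50 deg].
target = (1.0, 1.0)
xs = np.linspace(0.0, 0.2, 5)
ys = np.linspace(0.0, 0.2, 5)
yaws = np.radians(np.linspace(40.0, 50.0, 5))
ok = all(sees_target((x, y), yaw, target) for x in xs for y in ys for yaw in yaws)
print("all sampled poses satisfy the constraint:", ok)
```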

5.
PeerJ Comput Sci ; 9: e1301, 2023.
Article in English | MEDLINE | ID: mdl-37346667

ABSTRACT

Acquiring reliable knowledge amidst uncertainty is a topical issue of modern science. Interval mathematics has proved to be of central importance in coping with uncertainty and imprecision. Algorithmic differentiation, being superior to both numeric and symbolic differentiation, is nowadays one of the most celebrated techniques in computational mathematics. In this connection, laying out a concrete theory of interval differentiation arithmetic, combining the subtlety of ordinary algorithmic differentiation with the power and reliability of interval mathematics, can extend real differentiation arithmetic markedly in both method and objective and surpass it in power as well as applicability. This article is intended to lay out a systematic theory of dyadic interval differentiation numbers that wholly addresses first- and higher-order automatic derivatives under uncertainty. We begin by axiomatizing a differential interval algebra and then present the notion of an interval extension of a family of real functions, together with some analytic notions of interval functions. Next, we put forward an axiomatic theory of interval differentiation arithmetic, as a two-sorted extension of the theory of a differential interval algebra, and provide proofs of its categoricity and consistency. Thereupon, we investigate the ensuing structure and show that it constitutes a multiplicatively non-associative S-semiring in which multiplication is subalternative and flexible. Finally, we show how to computationally realize interval automatic differentiation. Many examples are given, illustrating automatic differentiation of interval functions and families of real functions.
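A minimal sketch of the idea, assuming plain forward-mode dual numbers whose value and derivative components are both intervals (the paper's dyadic interval differentiation numbers are axiomatized far more carefully); only addition and multiplication are shown.

```python
from dataclasses import dataclass

Interval = tuple  # (lo, hi)

def iadd(a, b):
    return (a[0] + b[0], a[1] + b[1])

def imul(a, b):
    p = [a[0] * b[0], a[0] * b[1], a[1] * b[0], a[1] * b[1]]
    return (min(p), max(p))

@dataclass
class IntervalDual:
    """An enclosure of a value paired with an enclosure of its first derivative.
    Only + and * are implemented; the paper's theory covers higher orders and a
    full axiomatisation."""
    val: Interval
    der: Interval

    def __add__(self, other):
        return IntervalDual(iadd(self.val, other.val), iadd(self.der, other.der))

    def __mul__(self, other):
        # Product rule propagated with interval arithmetic: (fg)' = f'g + fg'
        return IntervalDual(imul(self.val, other.val),
                            iadd(imul(self.der, other.val),
                                 imul(self.val, other.der)))

# f(x) = x*x + x evaluated over x in [1, 2] with derivative seed [1, 1]:
x = IntervalDual((1.0, 2.0), (1.0, 1.0))
fx = x * x + x
print(fx.val, fx.der)   # enclosures of f([1, 2]) and f'([1, 2]) = [3, 5]
```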

6.
Sensors (Basel) ; 24(1)2023 Dec 25.
Article in English | MEDLINE | ID: mdl-38202973

ABSTRACT

This work establishes a complete methodology for solving continuous sets of camera deployment solutions for automated machine vision inspection systems in industrial manufacturing facilities. The methods presented herein generate constraints that realistically model cameras and their associated intrinsic parameters and use set-based solving methods to evaluate these constraints over a 3D mesh model of a real part. This results in a complete and certifiable set of all valid camera poses describing all possible inspection poses for a given camera/part pair, as well as how much of the part's surface is inspectable from any pose in the set. These methods are tested and validated experimentally using real cameras and precise 3D tracking equipment and are shown to accurately align with real imaging results according to the hardware they are modelling for a given inspection deployment. In addition, their ability to generate full inspection solution sets is demonstrated on several realistic geometries using realistic factory settings, and they are shown to generate tangible, deployable inspection solutions, which can be readily integrated into real factory settings.

7.
Materials (Basel) ; 15(13)2022 Jul 04.
Article in English | MEDLINE | ID: mdl-35806803

ABSTRACT

This paper studies the yield behavior of a woven carbon-fiber-reinforced silicon-carbide-matrix (C/SiC) composite under dynamic tensile loading. Experiments were carried out to obtain the tensile properties of the C/SiC composite over a strain rate range of 2 × 10⁻⁵/s to 99.4/s. A strain-rate-dependent yield criterion based on distortional strain energy density theory is established to describe the yield behavior, and interval uncertainty is considered for a more reliable yield prediction. Experimental results show that the yield stress, elastic modulus, and yield strain of the C/SiC composite grow with increasing strain rate, and the failure mode transitions from progressive crack extension to uneven fiber bundle breakage. The results predicted by the yield criterion match experimental data well. The experimental results are enveloped within an uncertainty level of 45% in the critical distortional energy density, corresponding to uncertainties of 14% and 11% in the yield stress and yield strain, respectively. With the support of the proposed strain-rate-dependent yield criterion, the yield behavior of the C/SiC composite under dynamic loading conditions can be predicted with reasonable accuracy.

8.
J Environ Manage ; 304: 114225, 2022 Feb 15.
Article in English | MEDLINE | ID: mdl-34871870

ABSTRACT

Estimating the recreational value of a coastal wetland park is useful for understanding wetland ecosystems and for nurturing a balanced relationship between wetland tourism development and natural conservation. This study aims to apply appropriate methodologies to accurately estimate the recreational value of a coastal wetland park. The Nansha Wetland in China was used as the study site, and its recreational value was divided into non-use value, estimated using the choice experiment method (CEM), and use value, estimated using travel cost interval analysis (TCIA). The data were collected via questionnaires consisting of different choice experiment scenarios and travel cost investigations. The results showed that the per capita and total non-use values were 116.97 CNY/17.80 USD and 24.56 million CNY/3.74 million USD, respectively, and the per capita and total use values were 313.95 CNY/47.79 USD and 65.93 million CNY/10.04 million USD. The per capita and total recreational values were therefore 430.92 CNY/65.59 USD and 90.49 million CNY/13.77 million USD. CEM was used to identify tourists' trade-offs and preferences among the selected wetland attributes; tourists were found to have the largest marginal willingness to pay (MWTP) for "mangrove coverage," followed by "species of rare birds" and "water visibility." TCIA was used to solve the under-dispersion problem in the number of trips. Based on these findings, several managerial implications were identified, including adjusting ticket prices based on the non-use value, regulating tourists' behaviors, enhancing the protection of mangroves, improving water quality and the living habitats of migratory birds, and promoting science education and popularization.


Subject(s)
Ecosystem, Wetlands, Conservation of Natural Resources, Costs and Cost Analysis, Recreational Parks, Travel
9.
ISA Trans ; 125: 252-259, 2022 Jun.
Article in English | MEDLINE | ID: mdl-34247764

ABSTRACT

An analytical investigation of a DC motor with interval uncertainties is performed in this study, and a new interval analysis approach is suggested for optimal control of the system. The main advantage of using an interval model for the uncertainties is that it makes the analysis independent of probability distribution models of the system; the system can therefore be analyzed with only information about the minimum and maximum bounds. Here, interval analysis is combined with linear quadratic regulator (LQR) feedback control to simulate and optimally control the DC motor under realistic conditions. To do this, Pontryagin's principle is used to solve the interval linear quadratic regulator and obtain the necessary conditions, which are then recast as an ordinary differential equation (ODE) through several algebraic manipulations. Afterward, by solving the interval nonlinear ODE system, a confidence interval for the feedback controller is obtained; the confidence interval guarantees that the solution is included within it. The Chebyshev inclusion approach is applied to find the solution of the ODE system with uncertainties. The step response of the suggested approach is compared with the centered approach and with the Monte Carlo method, a statistical approach. The simulation results indicate that the suggested approach yields tighter and more sensible results than the Monte Carlo method.
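As a rough companion to the abstract, the sketch below brackets the LQR gain of a two-state DC-motor model by solving the Riccati equation at every corner of an interval parameter box. This is only a corner-sampling sanity check under assumed parameter bounds, not the paper's Pontryagin/Chebyshev-inclusion treatment.

```python
import itertools
import numpy as np
from scipy.linalg import solve_continuous_are

def lqr_gain(A, B, Q, R):
    """Continuous-time LQR gain K = R^{-1} B^T P from the algebraic Riccati equation."""
    P = solve_continuous_are(A, B, Q, R)
    return np.linalg.solve(R, B.T @ P)

# Interval DC-motor parameters (illustrative bounds, not the paper's values).
R_a, L_a = (0.9, 1.1), (0.45, 0.55)        # armature resistance, inductance
Kt, Ke   = (0.009, 0.011), (0.009, 0.011)  # torque / back-EMF constants
J, b     = (0.009, 0.011), (0.09, 0.11)    # inertia, viscous friction

Q, Rw = np.eye(2), np.array([[1.0]])
gains = []
# Crude bracketing: evaluate the LQR gain at every corner of the parameter box.
for r, l, kt, ke, j, bf in itertools.product(R_a, L_a, Kt, Ke, J, b):
    A = np.array([[-r / l, -ke / l], [kt / j, -bf / j]])  # states: current, speed
    B = np.array([[1.0 / l], [0.0]])
    gains.append(lqr_gain(A, B, Q, Rw))

gains = np.array(gains)
print("gain lower bound:", gains.min(axis=0))
print("gain upper bound:", gains.max(axis=0))
```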

10.
Article in English | MEDLINE | ID: mdl-34886150

ABSTRACT

Ecological birdwatching tourism is an ecological product and an essential part of ecotourism, and realizing its recreation value is crucial for improving human well-being and for delivering the local benefits of ecosystem services in areas focused on biodiversity conservation, especially of bird species. In this study, we use travel cost interval analysis, a travel-cost-derived model with more easily satisfied assumptions and less restrictive data requirements, to evaluate the recreation value of an ecological birdwatching tourism destination, Mingxi County in China, and compare it with general ecotourism at the same destination. The results show that, firstly, the per capita recreation value of eco-birdwatching is 3.9 times that of general ecotourism, its per capita social benefit is three times that of general ecotourism, and its per capita economic benefit is 4.5 times that of general ecotourism. Secondly, compared with general ecotourists, the per capita travel costs of eco-birdwatchers are higher, and there are statistically significant differences between the two groups in expenses for catering, tickets, shopping, opportunity cost, and total travel expenses. Thirdly, compared with general ecotourists, the marginal cost of an individual eco-birdwatcher is higher, the travel intention of an eco-birdwatcher is more robust at the same cost level, and the price paid by a single eco-birdwatcher is higher at the same level of travel-intention demand. In short, ecological birdwatching has a higher marginal value than general ecotourism and brings higher social, economic, and ecological benefits, supporting a higher level of development for the local tourism industry.


Subject(s)
Ecosystem, Tourism, Animals, Biodiversity, Birds, China, Conservation of Natural Resources, Humans, Travel
11.
Materials (Basel) ; 14(9)2021 Apr 22.
Article in English | MEDLINE | ID: mdl-33922030

ABSTRACT

Road design parameters carry great uncertainties, and traditional point-value calculations cannot adequately reflect the complexity of actual projects. Moreover, road design calculations based on interval analysis are difficult to carry out with uncertain design parameters. To simplify the calculation of interval parameters in road design theory, asphalt pavement design is taken as the analysis object and the permanent-deformation calculation for the asphalt mixture is simplified by combining it with interval analysis theory. Considering the uncertainty of the design parameters, data that are bounded but of uncertain magnitude are expressed as intervals; an interval formula for the permanent deformation of the asphalt mixture is then derived, and interval results are obtained. To avoid the dependence of interval calculation on computer code, an interval calculation method that treats the upper and lower endpoint values as point operations is proposed according to the interval calculation rules. To overcome the contradiction between interval-expansion results and engineering applications, the multi-interval-variable formulas are split, the interval variable weights are reasonably assigned, and the single-interval results are combined, realizing a simplified calculation based on interval variable weight assignment. The analysis results show that the interval calculation method based on the point-operation rule is accurate and reliable, and that the simplified method based on interval variable weight assignment is effective and feasible. The simplified interval calculation method proposed in this paper provides a reference for the interval application of road design theory.
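The endpoint idea can be illustrated with a generic helper that evaluates a design formula at all endpoint combinations of its interval arguments. The power-law deformation model below is only a placeholder for the paper's asphalt-mixture formula, and the interval bounds are assumptions.

```python
import itertools

def interval_eval(f, *intervals):
    """Evaluate f over interval arguments by point operations at the endpoint
    combinations and return the min/max. This matches the 'endpoint values as
    point operations' idea only when f is monotone in each argument over the
    box, which must be checked for the real design formula."""
    values = [f(*corner) for corner in itertools.product(*intervals)]
    return min(values), max(values)

# Placeholder power-law permanent-deformation model (NOT the paper's formula):
# eps_p = a * N**b * sigma**c, with interval-valued coefficients and load.
model = lambda a, b, c, N, sigma: a * N**b * sigma**c

eps_lo, eps_hi = interval_eval(
    model,
    (1.0e-4, 1.5e-4),   # a
    (0.35, 0.45),       # b
    (0.9, 1.1),         # c
    (1e5, 1e5),         # N: fixed number of load repetitions
    (600.0, 700.0),     # sigma: deviator stress interval (kPa)
)
print(f"permanent strain enclosure: [{eps_lo:.4e}, {eps_hi:.4e}]")
```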

12.
Spectrochim Acta A Mol Biomol Spectrosc ; 244: 118827, 2021 Jan 05.
Article in English | MEDLINE | ID: mdl-32862077

ABSTRACT

In this paper, a new method for the simultaneous determination of nitrate, COD, and turbidity in water, based on UV-Vis absorption spectrometry combined with interval analysis, is studied. By analyzing the spectral absorption characteristics of nitrate, COD, and turbidity standard solutions and their mixtures, the absorption spectra in the ranges of 225-260 nm, 260-320 nm, and 320-700 nm were selected as the characteristic spectra of nitrate, COD, and turbidity, respectively. Multiplicative scatter correction was employed to compensate for turbidity in the absorption spectra of the mixture solutions in the 225-320 nm range. The turbidity-compensated spectra in the 225-260 nm range were then compensated for COD using the spectral difference method. The original spectra in the 320-700 nm range, the turbidity-compensated spectra in the 260-320 nm range, and the COD-compensated spectra in the 225-260 nm range were analyzed with the PLS algorithm to calculate the concentrations of nitrate, COD, and turbidity in the mixture solutions. The results showed that this method could simultaneously and accurately determine the concentrations of nitrate, COD, and turbidity. After interval analysis, all correlation coefficients (R²) between the predicted and true values of nitrate, COD, and turbidity were higher than 0.9, and the root mean square errors (RMSE) of the predicted values were between 0.696 and 2.337.
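The band-selection-plus-PLS step can be sketched with scikit-learn. The synthetic spectra, the 40/20 train-test split, and the choice of three PLS components below are assumptions for illustration; the paper's compensation steps (multiplicative scatter correction, spectral difference) are not reproduced.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import r2_score, mean_squared_error

# Synthetic stand-in data: rows are mixture spectra sampled at 1-nm steps from
# 200-700 nm, y holds nitrate concentrations (the real spectra are not reproduced).
rng = np.random.default_rng(0)
wavelengths = np.arange(200, 701)
X = rng.random((60, wavelengths.size))
y = rng.random(60)

# Interval analysis assigns each analyte its own characteristic band;
# here we cut the 225-260 nm sub-band used for nitrate.
band = (wavelengths >= 225) & (wavelengths <= 260)
X_band = X[:, band]

pls = PLSRegression(n_components=3)
pls.fit(X_band[:40], y[:40])
y_pred = pls.predict(X_band[40:]).ravel()
print("R2:", r2_score(y[40:], y_pred))
print("RMSE:", mean_squared_error(y[40:], y_pred) ** 0.5)
```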

13.
Entropy (Basel) ; 22(9)2020 Aug 25.
Article in English | MEDLINE | ID: mdl-33286701

ABSTRACT

In this paper, we first show that the variance used in Markowitz's mean-variance model for portfolio selection, along with its numerous modifications, often does not properly represent portfolio risk. We therefore propose another treatment of portfolio risk, as a measure of the possibility of earning unacceptably low portfolio profits, together with a simple mathematical formalization of this measure. In a similar way, we treat the criterion of maximizing the portfolio's return as a measure of the possibility of obtaining a maximal profit. As a result, we formulate the portfolio selection problem as a bicriteria optimization task. We then study the properties of the developed approach using critical examples of portfolios with interval-valued and fuzzy-valued returns, using the α-cut representation of fuzzy returns. To validate the proposed method, we compare its results with those obtained using fuzzy versions of seven widely reputed portfolio selection methods. Since our approach deals with a bicriteria task, the three most popular methods for aggregating local criteria are compared using a known example of a fuzzy portfolio consisting of five assets. It is shown that the results obtained with our approach to interval and fuzzy portfolio selection reflect the essence of this task better than those obtained by the widely reputed traditional methods for portfolio selection in the fuzzy setting.

14.
Materials (Basel) ; 13(7)2020 Mar 28.
Article in English | MEDLINE | ID: mdl-32231149

ABSTRACT

Deterministic damage detection methods often fail in practical applications due to ever-present uncertainties. Moreover, vibration-based model updating strategies are easily affected by measurement noises and could encounter ill-conditioning problems during inverse solutions. On this account, a model-free method has been proposed combining modal interval analyses with static measurements. Structural geometrical dimensions, material parameters and external loads are expressed by interval variables representing uncertainties. Mechanical formulas for static responses are then extended to their interval forms, which are subsequently solved using classic interval and modal interval analyses. The analytical interval envelopes of static responses such as deflections and strains are defined by the interval solutions, and damage can be detected when the measured responses intersect the envelopes. By this approach, potential damage can be found in a fast and rough way without any inverse solution process such as model updating. The proposed method has been verified against both numerical and experimental reinforced concrete beams whose strains were taken as the desirable responses. It was found that the strain envelopes provided by modal interval analysis were narrower than those by classic interval analysis. Modal interval analysis effectively avoids the phenomenon of interval overestimation. In addition, the intersection point also identifies the current external load, providing a loading alarm for structures.
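To make the envelope idea concrete, the sketch below bounds a simply supported beam's mid-span deflection under interval load and modulus and flags a measured response that falls outside the envelope. The geometry, section, and bounds are illustrative assumptions, and classic endpoint intervals are used rather than the paper's modal interval analysis.

```python
def deflection(P, E, L=4.0, I=8.0e-5):
    """Mid-span deflection of a simply supported beam under a central point
    load: delta = P*L**3 / (48*E*I). This stands in for the paper's strain
    responses; the geometry and section are illustrative."""
    return P * L**3 / (48.0 * E * I)

# Interval uncertainties on load and elastic modulus (illustrative bounds).
P_lo, P_hi = 9.0e3, 11.0e3          # N
E_lo, E_hi = 28.0e9, 32.0e9         # Pa

# Deflection grows with P and shrinks with E, so the envelope follows directly
# from the endpoint combinations (classic interval analysis; modal interval
# analysis would give a tighter envelope for non-monotone responses).
delta_lo = deflection(P_lo, E_hi)
delta_hi = deflection(P_hi, E_lo)

measured = 7.2e-3                   # m, a hypothetical measured deflection
if not (delta_lo <= measured <= delta_hi):
    print("response outside healthy envelope -> possible damage")
else:
    print(f"response within envelope [{delta_lo:.3e}, {delta_hi:.3e}] m")
```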

15.
Comput Biol Med ; 113: 103386, 2019 Oct.
Article in English | MEDLINE | ID: mdl-31446318

ABSTRACT

In this paper, we present a fully automated technique for robust detection of Atrial Fibrillation (AF) episodes in single-lead electrocardiogram (ECG) signals using discrete-state Markov models and Random Forests. METHODS: The ECG signal is first preprocessed using Stationary Wavelet Transforms (SWT) for noise suppression, signal quality assessment and subsequent R-peak detection. Discrete-state Markov probabilities modelling transitions between successive RR intervals along with other statistical quantities derived from the RR-interval series constitute the feature set to perform AF classification using Random Forests. Further enhancement in AF detection is achieved by using a post-processing false positive suppression algorithm based on autocorrelation analysis of the RR-interval series. DATASETS: The AF classifier was trained using the PhysioNet/Computing in Cardiology 2017 AF Challenge dataset and the Atrial Fibrillation Termination Database (AFTDB). The test datasets consist of the MIT-BIH Atrial Fibrillation Database (AFDB) and the MIT-BIH Arrhythmia Database (MITDB). RESULTS: Our algorithms achieved sensitivity, specificity and F-score values of 97.4%, 98.6% and 97.7% respectively on the AFDB dataset and 96.3%, 97.0% and 85.6% respectively on the MITDB dataset. It was also observed that inclusion of the false positive suppression step resulted in a 1.1% increase in specificity and a 4.0% increase in F-score for the MITDB dataset without any decrease in sensitivity. CONCLUSION: The proposed method of AF detection, combining Markov models and Random Forests, achieves high accuracy across multiple databases and demonstrates comparable or superior performance to several other state-of-the-art algorithms.
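A toy version of the RR-interval feature pipeline is sketched below: RR ratios are discretised into a small set of states, the empirical transition matrix is flattened into features, and a Random Forest separates regular from irregular rhythms. The state thresholds, feature set, and synthetic RR series are assumptions, not the paper's configuration.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def markov_transition_features(rr, n_states=3):
    """Discretise RR-interval ratios into shorter/similar/longer states, flatten
    the empirical transition-probability matrix into a feature vector and append
    simple RR statistics. Thresholds and state count are illustrative choices."""
    rr = np.asarray(rr, dtype=float)
    ratio = rr[1:] / rr[:-1]
    states = np.digitize(ratio, [0.85, 1.15])   # 0=shorter, 1=similar, 2=longer
    T = np.zeros((n_states, n_states))
    for s0, s1 in zip(states[:-1], states[1:]):
        T[s0, s1] += 1
    T = T / max(T.sum(), 1)
    stats = [rr.mean(), rr.std(), np.abs(np.diff(rr)).mean()]
    return np.concatenate([T.ravel(), stats])

# Toy training set: regular rhythms vs irregular (AF-like) RR series, in seconds.
rng = np.random.default_rng(1)
X, y = [], []
for _ in range(200):
    regular = 0.8 + 0.02 * rng.standard_normal(60)
    irregular = np.clip(0.8 + 0.25 * rng.standard_normal(60), 0.3, None)
    X += [markov_transition_features(regular), markov_transition_features(irregular)]
    y += [0, 1]

clf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
print("training accuracy:", clf.score(X, y))
```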


Subject(s)
Algorithms, Atrial Fibrillation, Factual Databases, Computer-Assisted Diagnosis, Electrocardiography, Cardiovascular Models, Atrial Fibrillation/diagnosis, Atrial Fibrillation/physiopathology, Humans, Markov Chains
16.
Comput Methods Programs Biomed ; 171: 109-117, 2019 Apr.
Article in English | MEDLINE | ID: mdl-27526628

ABSTRACT

BACKGROUND AND OBJECTIVE: This paper deals with improving parameter estimation, in terms of precision and computational time, for dynamical models in a bounded-error context. METHODS: To improve parameter estimation, an optimal initial state design is proposed, combined with a contractor. This contractor is based on a volumetric criterion, and an original condition for initializing it is given. Based on a sensitivity analysis, our optimal initial state design methodology consists of searching for the minimum value of a proposed criterion for the parameters of interest. In our framework, the uncertainty (on measurement noise and parameters) is assumed to be unknown but to belong to known bounded intervals; guaranteed state and sensitivity estimation are therefore considered. An elementary effect analysis on the number of sampling times is also implemented to achieve fast and guaranteed parameter estimation. RESULTS: The whole procedure is applied to a pharmacokinetics model, and simulation results are given. CONCLUSIONS: The improvement in parameter estimation, in terms of computational time and precision, for the case study highlights the potential of the proposed methodology.
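For context, the bounded-error setting works with boxes rather than point estimates. The sketch below runs a textbook interval constraint-propagation step on a one-compartment decay model, shrinking the parameter box so it stays consistent with one interval measurement; it is not the paper's volumetric contractor or optimal initial-state design, and the model and bounds are assumptions.

```python
import math

def intersect(a, b):
    lo, hi = max(a[0], b[0]), min(a[1], b[1])
    if lo > hi:
        raise ValueError("empty intersection: model inconsistent with data")
    return (lo, hi)

def contract_p1(p1, p2, t, y):
    """Contraction on y = p1 * exp(-p2 * t): any consistent p1 must lie in
    y * exp(p2 * t). With y > 0 and exp increasing in p2, the product interval
    comes straight from the endpoints."""
    lo = y[0] * math.exp(p2[0] * t)
    hi = y[1] * math.exp(p2[1] * t)
    return intersect(p1, (lo, hi))

# Prior parameter boxes and one bounded-error measurement at t = 1 h (illustrative).
p1, p2 = (5.0, 20.0), (0.1, 0.3)      # initial concentration, elimination rate
y_meas = (8.0, 9.0)                   # measured concentration with bounded error

print("contracted [p1]:", contract_p1(p1, p2, t=1.0, y=y_meas))
```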


Subject(s)
Computer Simulation, Biological Models, Nonlinear Dynamics, Algorithms, Glucose Oxidase/pharmacokinetics
17.
J Electrocardiol ; 51(3): 382-385, 2018.
Article in English | MEDLINE | ID: mdl-29779528

ABSTRACT

Differences between successive R-R intervals (RRIs) were normalized by the RRIs before and after the indexing beats (normalized DRs) in individuals with normal sinus rhythm (NSR); 98.89% of normalized DRs were found to fall within the mean ± 0.100 (≈ mean ± 3 SD), whereas 73.47% were outside this range in atrial fibrillation (AF). When 7 out of 20 normalized DRs fell outside 0.000 ± 0.100, NSR (n = 129) and AF (n = 108) could be discriminated with high sensitivity, specificity, and predictive values (>99.0% for all). This method could be used to detect AF candidates from a small number of heartbeats or arterial pulses.
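The decision rule translates almost directly into code. The exact normalization of the RR-interval differences is only paraphrased in the abstract, so the division by the sum of the two surrounding intervals below is an assumption, as are the synthetic RR series.

```python
import numpy as np

def normalized_drs(rri):
    """Difference of successive RR-intervals normalised by the surrounding
    RR-intervals; here the difference is divided by the sum of the two intervals
    (an assumption about the exact normalisation)."""
    rri = np.asarray(rri, dtype=float)
    return (rri[1:] - rri[:-1]) / (rri[1:] + rri[:-1])

def af_candidate(rri, window=20, threshold=0.100, min_outside=7):
    """Flag AF when at least 7 of 20 consecutive normalised DRs fall outside
    0.000 +/- 0.100, as reported in the abstract."""
    dr = normalized_drs(rri)
    for i in range(0, len(dr) - window + 1):
        if np.sum(np.abs(dr[i:i + window]) > threshold) >= min_outside:
            return True
    return False

rng = np.random.default_rng(2)
nsr = 0.85 + 0.02 * rng.standard_normal(40)                    # near-constant RR (s)
af = np.clip(0.7 + 0.2 * rng.standard_normal(40), 0.3, None)   # irregular RR (s)
print("NSR flagged:", af_candidate(nsr), "| AF flagged:", af_candidate(af))
```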


Subject(s)
Algorithms, Atrial Fibrillation/diagnosis, Electrocardiography/methods, Atrial Fibrillation/physiopathology, Differential Diagnosis, Female, Humans, Male, Predictive Value of Tests, Sensitivity and Specificity
18.
J Ultrasound Med ; 37(3): 745-753, 2018 Mar.
Article in English | MEDLINE | ID: mdl-28948639

ABSTRACT

OBJECTIVES: The aims of this study were to construct reference ranges for the time interval parameters of the ductus venosus during the early second trimester of pregnancy and to demonstrate the clinical utility in various fetal disorders. METHODS: The ductus venosus Doppler measurements of 331 healthy fetuses between 15 and 22 weeks' gestation were analyzed. The systolic time and diastolic time were subdivided into the systolic acceleration time, systolic deceleration time, diastolic acceleration time, and diastolic deceleration time. The median, 5th, and 95th regression lines for each variable were determined according to gestational age. The ductus venosus time interval parameters in cases of fetoplacental abnormalities were calculated and plotted against the reference ranges. RESULTS: With advancing gestation, the systolic acceleration time and total systolic time increased significantly (P < .001). In contrast to the systolic phase, the diastolic deceleration time decreased significantly during the early second trimester of pregnancy (P = .023). The systolic deceleration time, diastolic acceleration time, and diastolic time were relatively constant. Fetuses with tricuspid insufficiency, twin-twin transfusion syndrome, intrauterine fetal growth restriction, and anemia had abnormal ductus venosus times with different patterns. CONCLUSIONS: Predicted normal reference ranges for time interval variables in relation to gestational age were established. These could be helpful for assessing fetal cardiovascular function during the early second trimester of pregnancy.


Subject(s)
Fetal Diseases/diagnostic imaging, Fetal Diseases/physiopathology, Fetal Heart/diagnostic imaging, Fetal Heart/physiopathology, Second Trimester of Pregnancy, Prenatal Ultrasonography/methods, Adult, Blood Flow Velocity/physiology, Cross-Sectional Studies, Female, Humans, Pregnancy, Reference Values, Time
19.
Sci Total Environ ; 603-604: 760-771, 2017 Dec 15.
Article in English | MEDLINE | ID: mdl-28395953

ABSTRACT

In a municipal solid waste management system, decision makers have to develop insight into processes such as waste generation, collection, transportation, processing, and disposal. Many parameters in this system (e.g., waste generation rates, facility operating costs, transportation costs, and revenues) are associated with uncertainties. These parameter uncertainties often have to be modeled under data scarcity, with only information on extreme variations available, which makes it difficult to generate the probability distribution functions or membership functions required by stochastic or fuzzy mathematical programming, respectively. Moreover, if the uncertainties are ignored, problems such as insufficient capacities of waste management facilities or improper utilization of available funds may arise. To tackle these parameter uncertainties more efficiently, an algorithm based on interval analysis has been developed. This algorithm is applied to find optimal solutions for a facility location model, which is formulated to select the economically best locations for transfer stations in a hypothetical urban center. Transfer stations are an integral part of contemporary municipal solid waste management systems, and economic siting of transfer stations ensures the financial sustainability of the system. The model is written in the mathematical programming language AMPL with KNITRO as the solver. The developed model selects the five economically best locations out of ten potential locations, with an optimum overall cost of approximately [394,836, 757,440] Rs./day ([5906, 11,331] USD/day). Further, the need for uncertainty modeling is explained based on the results of a sensitivity analysis.


Subject(s)
Solid Waste, Uncertainty, Waste Disposal Facilities, Waste Management, Theoretical Models, Waste Disposal, Transportation
20.
J Glob Optim ; 68(2): 413-438, 2017.
Article in English | MEDLINE | ID: mdl-32055105

ABSTRACT

This article presents an arithmetic for the computation of Chebyshev models for factorable functions and an analysis of their convergence properties. Similar to Taylor models, Chebyshev models consist of a pair of a multivariate polynomial approximating the factorable function and an interval remainder term bounding the actual gap with this polynomial approximant. Propagation rules and local convergence bounds are established for the addition, multiplication and composition operations with Chebyshev models. The global convergence of this arithmetic as the polynomial expansion order increases is also discussed. A generic implementation of Chebyshev model arithmetic is available in the library MC++. It is shown through several numerical case studies that Chebyshev models provide tighter bounds than their Taylor model counterparts, but this comes at the price of extra computational burden.
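A stripped-down univariate version of the arithmetic conveys the idea: a Chebyshev model is a truncated Chebyshev expansion plus a remainder bound, and addition and multiplication propagate both parts. The symmetric remainder, the fixed order, and the restriction to [-1, 1] are simplifying assumptions relative to the multivariate models implemented in MC++.

```python
import numpy as np
from numpy.polynomial import chebyshev as C

def cheb_range_bound(coeffs):
    """Crude range bound on [-1, 1]: |T_k(x)| <= 1, so |p(x)| <= sum(|c_k|)."""
    return float(np.sum(np.abs(coeffs)))

class ChebModel:
    """Minimal univariate Chebyshev model on [-1, 1]: a Chebyshev polynomial of
    fixed order plus a symmetric remainder radius. Only a sketch of the
    propagation rules, not the MC++ implementation."""
    def __init__(self, coeffs, rem=0.0, order=2):
        self.order = order
        coeffs = np.atleast_1d(np.asarray(coeffs, dtype=float))
        # Truncate to the expansion order and fold the tail into the remainder.
        tail = coeffs[order + 1:]
        self.coeffs = coeffs[:order + 1]
        self.rem = rem + cheb_range_bound(tail)

    def __add__(self, other):
        return ChebModel(C.chebadd(self.coeffs, other.coeffs),
                         self.rem + other.rem, self.order)

    def __mul__(self, other):
        prod = C.chebmul(self.coeffs, other.coeffs)
        # Cross terms p1*R2, p2*R1 and R1*R2 are bounded into the new remainder.
        rem = (cheb_range_bound(self.coeffs) * other.rem
               + cheb_range_bound(other.coeffs) * self.rem
               + self.rem * other.rem)
        return ChebModel(prod, rem, self.order)

# x as a Chebyshev model (T_1), then an order-2 model for x**2 * (x + 1):
x = ChebModel([0.0, 1.0], order=2)
one = ChebModel([1.0], order=2)
m = x * x * (x + one)
print("coefficients:", m.coeffs, "remainder radius:", m.rem)
```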
