Results 1 - 20 of 55
1.
J Forensic Sci ; 69(5): 1699-1705, 2024 Sep.
Article in English | MEDLINE | ID: mdl-38978157

ABSTRACT

During an investigation using Forensic Investigative Genetic Genealogy, a novel approach for solving violent crimes and identifying human remains, reference testing (in which law enforcement requests a DNA sample from a person in a partially constructed family tree) is sometimes used when an investigation has stalled. Because the people considered for a reference test have not opted in to allow law enforcement to use their DNA profile in this way, reference testing is viewed by many as an invasion of privacy and by some as unethical. We generalize an existing mathematical optimization model of the genealogy process by incorporating the option of reference testing. Using simulated versions of 17 DNA Doe Project cases, we find that reference testing can solve cases more quickly (although many reference tests are required to substantially hasten the investigative process), but only rarely (<1%) solves cases that cannot otherwise be solved. Through a mixture of mathematical and computational analysis, we find that the most desirable people to test are at the bottom of a path descending from an ancestral couple that is most likely to be related to the target. We also characterize the rare cases where reference testing is necessary for solving the case: when there is only one descending path from an ancestral couple, which precludes the possibility of identifying an intersection (e.g., a marriage) between two descendants of two different ancestral couples.


Subject(s)
DNA Fingerprinting; Pedigree; Humans; DNA Fingerprinting/methods; Forensic Genetics/methods; Genetic Privacy; Likelihood Functions
2.
Med Decis Making ; : 272989X241262343, 2024 Jul 26.
Article in English | MEDLINE | ID: mdl-39056310

ABSTRACT

BACKGROUND: Methods to present the results of cost-effectiveness analyses under parameter uncertainty include cost-effectiveness planes (CEPs), cost-effectiveness acceptability curves/frontiers (CEACs/CEAF), expected loss curves (ELCs), and net monetary benefit (NMB) lines. We describe how NMB lines can be augmented to present NMB values that could be achieved by reducing or resolving parameter uncertainty. We evaluated the ability of these methods to correctly 1) identify the alternative with the highest expected NMB and 2) communicate the magnitude of parameter and decision uncertainty. METHODS: We considered 4 hypothetical decision problems representing scenarios with high-variance or correlated cost and effect estimates and alternatives with similar cost-effectiveness ratios. We used these decision problems to demonstrate the limitations of existing methods and the potential of augmented NMB lines to resolve these issues. RESULTS: CEPs and CEACs/CEAF could falsely imply a lack of sufficient evidence to identify the optimal option when cost and effect estimates have high variance, are correlated across alternatives, or when alternatives have similar cost-effectiveness ratios. Augmented NMB lines and ELCs can correctly identify the option with the highest expected NMB and communicate the potential benefit of resolving uncertainties. Like ELCs, augmented NMB lines provide information about the value of resolving parameter uncertainties, but they may be easier for decision makers to interpret. CONCLUSIONS: Our analysis supports recommending augmented NMB lines as an important method for presenting the results of economic evaluation studies under parameter uncertainty. HIGHLIGHTS: The results of cost-effectiveness analyses (CEAs) with uncertain cost and effect estimates are commonly presented using cost-effectiveness planes (CEPs), cost-effectiveness acceptability curves/frontiers (CEACs/CEAF), and expected loss curves (ELCs). Although currently not often used, net monetary benefit (NMB) lines can present CEA results to identify the alternative with the highest expected NMB given the current level of uncertainty. Furthermore, NMB lines can be augmented to 1) show value-of-information metrics, which measure the value of additional research to reduce or eliminate decision uncertainty, and 2) display confidence intervals along the NMB lines to ensure that NMB values are estimated accurately from a sufficiently large number of parameter samples. Using several decision problems, we demonstrate the limitations of existing methods for presenting CEA results under parameter uncertainty and how augmented NMB lines resolve these issues. Our analysis supports recommending augmented NMB lines as an important method for presenting CEA results under uncertainty because they 1) correctly identify the alternative with the highest expected NMB given the current evidence, 2) quantify the potential value of additional research to improve the decision by reducing or resolving uncertainty in model parameters, 3) help the analyst visually verify that enough parameter samples were used to estimate the expected NMB of alternatives, and 4) are easier for decision makers to interpret than other methods.
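
As a concrete illustration of the quantities these plots summarize, the hedged sketch below computes expected NMB, the probability each alternative is optimal, and the expected value of perfect information (EVPI) from parameter samples; the two alternatives, their cost/QALY distributions, and the WTP grid are invented for illustration, not taken from the paper.

```python
# Minimal sketch (not the authors' code): expected net monetary benefit (NMB)
# from probabilistic-analysis samples, plus the probability each alternative
# is optimal and the EVPI at each willingness-to-pay (WTP) value.
import numpy as np

rng = np.random.default_rng(0)
n = 5000
# Hypothetical cost/effect samples for two alternatives.
cost = {"A": rng.normal(10_000, 2_000, n), "B": rng.normal(14_000, 2_500, n)}
qaly = {"A": rng.normal(5.0, 0.6, n),      "B": rng.normal(5.3, 0.6, n)}

for wtp in (20_000, 50_000, 100_000):          # WTP per QALY
    nmb = {k: wtp * qaly[k] - cost[k] for k in cost}
    best = max(nmb, key=lambda k: nmb[k].mean())
    p_opt = np.mean(nmb["B"] > nmb["A"])       # Pr(B optimal) at this WTP
    # EVPI: expected gain from resolving all parameter uncertainty.
    evpi = np.maximum(nmb["A"], nmb["B"]).mean() - max(nmb[k].mean() for k in nmb)
    print(f"WTP={wtp}: best={best}, Pr(B optimal)={p_opt:.2f}, EVPI={evpi:,.0f}")
```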

3.
Heliyon ; 10(7): e28701, 2024 Apr 15.
Article in English | MEDLINE | ID: mdl-38596125

ABSTRACT

Modal verbs, with their multifaceted semantic nuances and varied grammatical configurations, present notable challenges for L2 learners and regularly intrigue L2 researchers. This study investigates and compares how English modal verbs are used by L2 learners from different L1 backgrounds. Drawing on the Turkish and Chinese learner subcorpora of the International Corpus of Learner English (ICLE), this work scrutinizes the overall frequencies of nine core English modal verbs, grouped into three major semantic classes, along with the influential lexico-syntactic variables of semantic class of collocated verb, grammatical pattern, and subject pronominality. The results of a Bayesian probabilistic analysis show that both the Turkish and Chinese learners primed similar modal verbs and constructional preferences when topic was not used as a normalizing factor. While the broader analysis reveals no statistically significant divergences between these two learner groups in English modal verb preferences, a pronounced contextual influence is evident when the dataset narrows to essays on a unified theme. This nuanced shift underscores the intricate relationship between essay topics and linguistic structures, thus emphasizing the pivotal role of context in modal verb usage.
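
As a hedged illustration of what a Bayesian comparison of corpus frequencies can look like (the paper's model is likely richer), the sketch below contrasts the per-token rate of a single modal verb across two subcorpora with Beta-Binomial posteriors; the counts and corpus sizes are invented.

```python
# Hedged sketch: Bayesian comparison of modal-verb rates in two learner
# corpora. Counts and token totals below are hypothetical placeholders.
import numpy as np

rng = np.random.default_rng(1)
count = {"Turkish": 420, "Chinese": 465}        # occurrences of "must"
tokens = {"Turkish": 200_000, "Chinese": 210_000}

# Beta(1,1) prior on the per-token rate -> Beta posterior per corpus.
post = {k: rng.beta(1 + count[k], 1 + tokens[k] - count[k], 50_000)
        for k in count}
p_diff = np.mean(post["Turkish"] > post["Chinese"])
print(f"Pr(rate_Turkish > rate_Chinese) = {p_diff:.3f}")
```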

4.
Proc Natl Acad Sci U S A ; 121(5): e2314215121, 2024 Jan 30.
Article in English | MEDLINE | ID: mdl-38261621

ABSTRACT

The competition-colonization (CC) trade-off is a well-studied coexistence mechanism for metacommunities. In this setting, it is believed that the coexistence of all species requires their traits to satisfy restrictive conditions limiting their similarity. To investigate whether diverse metacommunities can assemble in a CC trade-off model, we study their assembly from a probabilistic perspective. From a pool of species with parameters (corresponding to traits) sampled at random, we compute the probability that any number of species coexist and characterize the set of species that emerges through assembly. Remarkably, almost exactly half of the species in a large pool typically coexist, with no saturation as the size of the pool grows, and with little dependence on the underlying distribution of traits. Through a mix of analytical results and simulations, we show that this unlimited niche packing emerges as assembly actively moves communities toward overdispersed configurations in niche space. Our findings also apply to a realistic assembly scenario where species invade one at a time from a fixed regional pool. When diversity arises de novo in the metacommunity, richness still grows without bound, but more slowly. Together, our results suggest that the CC trade-off can support the robust emergence of diverse communities, even when coexistence of the full species pool is exceedingly unlikely.
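
A hedged sketch of assembly in the classic hierarchical competition-colonization model (Tilman-style occupancy equations; the paper's model may differ in detail): sample colonization rates for a pool, let the competitive hierarchy determine sequentially which species persist, and record the surviving fraction.

```python
# Illustrative sketch of assembly in a hierarchical CC model. Species are
# ranked by competitive ability (best competitor = lowest colonization rate);
# rank i's equilibrium occupancy is
#   p_i = 1 - m/c_i - sum over better, persisting j of p_j * (1 + c_j/c_i),
# and a species persists only if this occupancy is positive.
import numpy as np

rng = np.random.default_rng(2)
m = 0.1                                       # common mortality rate
n_pool, n_rep = 40, 200
richness = []
for _ in range(n_rep):
    c = np.sort(rng.uniform(m, 5.0, n_pool))  # CC trade-off ordering
    kept_c, kept_p = [], []
    for ci in c:                              # best competitors first
        pi = 1 - m / ci - sum(pj * (1 + cj / ci)
                              for cj, pj in zip(kept_c, kept_p))
        if pi > 0:                            # species joins the community
            kept_c.append(ci)
            kept_p.append(pi)
    richness.append(len(kept_p))
print("mean fraction of the pool coexisting:", np.mean(richness) / n_pool)
```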


Subject(s)
Bandages; Phenotype; Probability
5.
Heliyon ; 9(11): e21467, 2023 Nov.
Article in English | MEDLINE | ID: mdl-38034810

ABSTRACT

The aim of this study is to develop a quantitative microbial risk assessment (QMRA) for Salmonella spp. and to evaluate the risk of salmonellosis in the Taiwanese population after consumption of Taiwanese salty chicken (TSC). We assume that Salmonella spp. may contaminate the fresh raw chicken used in TSC. After transport to the diner, the fresh raw chicken is received, cleaned, and surface-washed by diner staff. The TSC is then cooked and sold to consumers. We set four different cross-contamination scenarios to evaluate the contamination level of Salmonella spp. in TSC. We used a Monte Carlo simulation method, a probabilistic analysis method, and exceedance risk to evaluate the risk of salmonellosis. When the exceedance risk was 5%, and taking the Taiwanese population above 19 years old as an example, the rate of contracting salmonellosis from the consumption of TSC will be 2.94% (2.94 million per 100 million people) if the chef does not clean their hands, knives, or cutting boards. However, if the chef washes their hands, knives, and cutting boards with cold water and soap, the salmonellosis illness rate from consuming TSC will be 1.93E-04% (193 per 100 million people). Sensitivity analysis indicates that the most important risk factor in the QMRA of TSC is the temperature of the fresh raw chicken during transportation, followed by the residual Salmonella spp. If the diner staff separate the cooking tools used for raw ingredients from those used for cooked food, the illness risk of salmonellosis will be very low.
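
For readers unfamiliar with the exceedance-risk mechanics, the sketch below shows the generic QMRA pattern, assuming a hypothetical lognormal dose distribution and a Beta-Poisson dose-response with commonly cited Salmonella parameters (alpha = 0.1324, beta = 51.45); none of these values are the study's.

```python
# Hedged Monte Carlo sketch of a QMRA-style calculation (illustrative inputs):
# sample a dose, apply a Beta-Poisson dose-response, read risk at 5% exceedance.
import numpy as np

rng = np.random.default_rng(3)
n = 100_000
# surviving Salmonella per serving (CFU), hypothetical lognormal
dose = rng.lognormal(mean=-2.0, sigma=1.5, size=n)
# Beta-Poisson parameters often cited for Salmonella; check against the study
alpha, beta = 0.1324, 51.45
p_ill = 1 - (1 + dose / beta) ** (-alpha)     # per-serving illness probability
print("risk at 5% exceedance (95th percentile):", np.quantile(p_ill, 0.95))
```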

6.
Liver Int ; 43(12): 2615-2624, 2023 12.
Article in English | MEDLINE | ID: mdl-37735959

ABSTRACT

BACKGROUND: Italy has a high HCV prevalence, and despite the approval of a dedicated fund for 'experimental screening' for 2 years, screening has not been fully implemented. We aimed to evaluate the long-term impact of the persisting delay in HCV elimination after the Coronavirus disease 2019 (COVID-19) pandemic in Italy. METHODS: We used a mathematical, probabilistic modelling approach evaluating three hypothetical screening scenarios, 'Inefficient', 'Efficient experimental' and 'WHO Target', differing by treatment rates over time. A Markov chain for liver disease progression evaluated the number of active infections, cases of decompensated cirrhosis (DC) and hepatocellular carcinoma (HCC), and HCV liver-related deaths up to the years 2030 and 2050. RESULTS: The 'WHO Target' scenario estimated 3900 patients with DC and 600 with HCC, versus 4400 and 600 cases, respectively, for both the 'Inefficient' and 'Efficient experimental' screening scenarios up to 2030. A sharp (10-fold) decrease in DC and HCC was estimated for the 'WHO Target' scenario compared with the other two scenarios in 2050: the forecast number of DC cases was 420 versus 4200 and 3800, and of HCC cases <10 versus 600 and 400, for the 'WHO Target', 'Inefficient' and 'Efficient experimental' scenarios, respectively. A significant decrease in the cumulative estimated number of liver-related deaths up to 2050 was observed for the 'WHO Target' scenario (52 000) versus the 'Inefficient' or 'Efficient experimental' scenarios (79 000 and 74 000 liver-related deaths, respectively). CONCLUSIONS: Our estimates highlight the need to extensively and efficiently address HCV screening and cure of HCV infection in order to avoid the forecast long-term HCV adverse outcomes in Italy.
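
A minimal sketch of the Markov-chain machinery the abstract describes, with invented annual transition probabilities and an invented initial infected pool standing in for the paper's calibrated Italian values:

```python
# Minimal Markov cohort sketch of liver-disease progression under a given
# treatment rate. State names follow the abstract; all numbers are
# placeholders, not the paper's calibrated inputs.
import numpy as np

states = ["infected", "cured", "DC", "HCC", "liver_death"]
P = np.array([  # annual transition probabilities (rows sum to 1)
    [0.90, 0.05, 0.03, 0.01, 0.01],   # infected: 5%/yr treated and cured
    [0.00, 1.00, 0.00, 0.00, 0.00],   # cured (simplification: absorbing)
    [0.00, 0.00, 0.80, 0.07, 0.13],   # decompensated cirrhosis
    [0.00, 0.00, 0.00, 0.70, 0.30],   # hepatocellular carcinoma
    [0.00, 0.00, 0.00, 0.00, 1.00],   # liver-related death (absorbing)
])
x = np.array([280_000, 0, 0, 0, 0.0])  # hypothetical infected pool in 2023
for year in range(2023, 2051):
    x = x @ P                          # advance the cohort one year
print(dict(zip(states, np.round(x))))  # state counts in 2050
```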


Subject(s)
COVID-19; Carcinoma, Hepatocellular; Hepatitis C, Chronic; Hepatitis C; Liver Neoplasms; Humans; Liver Neoplasms/drug therapy; Hepatitis C/diagnosis; Hepacivirus; Italy/epidemiology; Hepatitis C, Chronic/drug therapy; Liver Cirrhosis/drug therapy; Antiviral Agents/therapeutic use
7.
Comput Methods Programs Biomed ; 241: 107774, 2023 Nov.
Article in English | MEDLINE | ID: mdl-37651819

ABSTRACT

BACKGROUND AND OBJECTIVES: The healing outcomes of distal radius fracture (DRF) treated with a volar locking plate (VLP) depend on surgical strategies and postoperative rehabilitation. However, accurate prediction of healing outcomes is challenging due to a range of uncertainties in the clinical conditions of DRF patients, including fracture geometry, fixation configuration, and physiological loading. The purpose of this study is to investigate the influence of uncertainty and variability in fracture/fixation parameters on the mechano-biology and biomechanical stability of DRF, using a probabilistic numerical approach based on the results of a series of experimental tests performed in this study. METHODS: Six composite radius Sawbones fitted with a titanium VLP (VLP 2.0, Austofix) were loaded to failure at a rate of 2 N/s. The test results on the elastic and plastic behaviour of the VLP were used as inputs for a probabilistic computational model of DRF, which simulated mechano-regulated tissue differentiation and fixation elastic capacity at the fracture site. Finally, the probability of success in early indirect healing and fracture stabilisation was predicted. RESULTS: The titanium VLP is a strong and ductile fixation whose flexibility and elastic capacity are governed by flexion working length and bone-to-plate distance, respectively. A fixation with optimised designs and configurations is critical to mechanically stabilising the early fracture site. Importantly, the uncertainty and variability in fracture/fixation parameters could compromise early DRF healing. Uncertainty in physiological loading is the most adverse factor, followed by the negative impact of uncertainty in fracture geometry. CONCLUSIONS: The VLP 2.0 fixation made of grade II titanium is a desirable fixation that is strong enough to resist irreparable deformation during early recovery and ductile enough to deform plastically without implant failure during late rehabilitation.
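
A heavily simplified, hedged illustration of the probabilistic mechano-regulation idea: instead of the study's finite-element model, a toy construct-stiffness model propagates load, stiffness, and gap uncertainty to interfragmentary strain, classified with Perren-style strain bands. All numbers are placeholders.

```python
# Heavily hedged toy model (not the study's FE pipeline): propagate
# fixation/loading uncertainty to interfragmentary strain and classify the
# likely tissue outcome with rough Perren-style thresholds.
import numpy as np

rng = np.random.default_rng(4)
n = 20_000
load = rng.normal(100, 30, n).clip(min=0)          # axial load (N)
stiffness = rng.normal(1000, 250, n).clip(min=100)  # construct stiffness (N/mm)
gap = rng.uniform(1.0, 3.0, n)                      # fracture gap (mm)

strain = (load / stiffness) / gap                   # interfragmentary strain
# rough bands: <2% direct bone formation, 2-10% callus, >10% fibrous tissue
p_union = np.mean(strain < 0.02)
p_callus = np.mean((strain >= 0.02) & (strain <= 0.10))
print(f"P(direct bone formation)={p_union:.2f}, P(callus range)={p_callus:.2f}")
```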


Subject(s)
Fractures, Bone; Wrist Fractures; Humans; Uncertainty; Titanium; Probability
8.
Chemosphere ; 337: 139232, 2023 Oct.
Article in English | MEDLINE | ID: mdl-37364637

ABSTRACT

It is challenging to conduct groundwater contamination risk assessment in fractured aquifers containing large numbers of complex fractures, especially when uncertainty in the fracture network and in fluid-rock interactions is inevitable. In this study, a novel probabilistic assessment framework based on discrete fracture network (DFN) modeling is proposed to assess the uncertainty of groundwater contamination in fractured aquifers. The Monte Carlo simulation technique is employed to quantify the uncertainty of fracture geometry, and the environmental and health risks of the contaminated site are probabilistically analyzed in conjunction with the water quality index (WQI) and hazard index (HI). The results show that contaminant transport behavior in fractured aquifers can be strongly affected by the distribution of the fracture network. The proposed framework is capable of practically accounting for the uncertainties involved in the mass transport process and effectively assessing the contamination risk of fractured aquifers.
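
The risk layer of such a framework can be sketched independently of the DFN transport model: below, hypothetical simulated well concentrations are pushed through a standard chronic-daily-intake and hazard-index calculation. All exposure parameters are illustrative assumptions, not the study's values.

```python
# Hedged sketch of the probabilistic risk layer only (not the DFN model):
# Monte Carlo over simulated well concentrations to get a hazard index (HI).
import numpy as np

rng = np.random.default_rng(5)
n = 50_000
conc = rng.lognormal(-1.0, 0.8, n)   # contaminant in well water (mg/L)
IR, BW, RfD = 2.0, 70.0, 0.003       # intake L/day, body weight kg, RfD mg/kg-day
EF, ED, AT = 350, 30, 30 * 365       # exposure freq (d/yr), duration (yr), averaging (d)

cdi = conc * IR * EF * ED / (BW * AT)   # chronic daily intake, mg/kg-day
hi = cdi / RfD                          # hazard index, single pathway
print("P(HI > 1) =", np.mean(hi > 1))
```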


Subject(s)
Groundwater; Water Pollutants, Chemical; Water Pollutants, Chemical/analysis
9.
Article in English | MEDLINE | ID: mdl-37264680

ABSTRACT

OBJECTIVES: The correlations between economic modeling input parameters directly impact the variance, and may impact the expected values, of model outputs. However, correlation coefficients are not often reported in the literature. We aim to estimate the correlations between model inputs for probabilistic analysis from summary statistics. METHODS: We provide proof that for correlated random variables X and Y (e.g., inpatient visits and outpatient visits), the Pearson correlation coefficient of the sample means equals that of the samples themselves (corr(X, Y) = corr(X̄, Ȳ)). Therefore, when studies report summary statistics of correlated parameters, we can quantify the correlation coefficient between the parameters. RESULTS: We use examples to illustrate how to estimate the correlation coefficient between the incidence rates of non-severe and severe hypoglycemia events, and the correlation coefficients among five cost components for patients with diabetic foot ulcers. We further introduce three types of correlations for utilities and provide two examples of estimating correlations for utilities from published data. We also evaluate how correlations between cost parameters and utility parameters impact cost-effectiveness results using a Markov model for major depression. CONCLUSION: Incorporating the correlations can improve the precision of cost-effectiveness results and increase confidence in evidence-based decision-making. Further empirical evidence is warranted.
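
The identity used in the METHODS section follows in a few lines for i.i.d. paired samples (independence across pairs is the key assumption): averaging scales the covariance and both variances by the same factor 1/n, which cancels in the correlation.

```latex
% corr of sample means equals corr of samples, for n i.i.d. pairs (X_i, Y_i)
\begin{aligned}
\operatorname{Cov}(\bar{X},\bar{Y})
  &= \frac{1}{n^{2}}\sum_{i=1}^{n}\operatorname{Cov}(X_i,Y_i)
   = \frac{\operatorname{Cov}(X,Y)}{n},
\qquad
\operatorname{Var}(\bar{X}) = \frac{\operatorname{Var}(X)}{n},\\
\operatorname{corr}(\bar{X},\bar{Y})
  &= \frac{\operatorname{Cov}(X,Y)/n}
          {\sqrt{\operatorname{Var}(X)/n}\,\sqrt{\operatorname{Var}(Y)/n}}
   = \operatorname{corr}(X,Y).
\end{aligned}
```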


Subject(s)
Cost-Benefit Analysis; Humans
10.
Front Bioeng Biotechnol ; 11: 1182877, 2023.
Article in English | MEDLINE | ID: mdl-37008030

ABSTRACT

[This corrects the article DOI: 10.3389/fbioe.2022.808027.].

11.
Sensors (Basel) ; 23(5)2023 Mar 04.
Article in English | MEDLINE | ID: mdl-36905022

ABSTRACT

Blockchain technology has been gaining great interest from a variety of sectors, including healthcare, supply chain, and cryptocurrencies. However, Blockchain suffers from a limited ability to scale (i.e., low throughput and high latency). Several solutions have been proposed to tackle this. In particular, sharding has proved to be one of the most promising solutions to Blockchain's scalability issue. Sharding can be divided into two major categories: (1) sharding-based Proof-of-Work (PoW) Blockchain protocols, and (2) sharding-based Proof-of-Stake (PoS) Blockchain protocols. The two categories achieve good performance (i.e., high throughput with reasonable latency) but raise security issues. This article focuses on the second category. We start by introducing the key components of sharding-based PoS Blockchain protocols. We then briefly introduce two consensus mechanisms, namely PoS and practical Byzantine Fault Tolerance (pBFT), and discuss their use and limitations in the context of sharding-based Blockchain protocols. Next, we provide a probabilistic model to analyze the security of these protocols. More specifically, we compute the probability of committing a faulty block and measure security as the number of years to fail. We obtain approximately 4000 years to fail in a network of 4000 nodes, 10 shards, and a shard resiliency of 33%.
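
A sketch of the standard calculation behind such security analyses (our reading, not necessarily the paper's exact model): with random shard assignment, the number of faulty nodes in a shard is hypergeometric, and years-to-fail follows from the per-epoch failure probability. The adversarial fraction (25%) and reshuffling rate (daily) are assumptions the abstract does not state, so the printed figure will differ from the paper's ~4000 years.

```python
# Hedged sketch: hypergeometric shard-failure probability and years-to-fail.
from math import ceil
from scipy.stats import hypergeom

N_total, n_shards = 4000, 10
shard = N_total // n_shards        # 400 nodes per shard
faulty = N_total // 4              # assumed 25% adversarial fraction
threshold = ceil(shard / 3)        # shard fails at >= 1/3 faulty members

p_shard = hypergeom(N_total, faulty, shard).sf(threshold - 1)  # P(X >= thr)
p_epoch = 1 - (1 - p_shard) ** n_shards   # approx. P(any shard fails)/epoch
epochs_per_year = 365                     # assumed daily re-sharding
years_to_fail = 1 / (p_epoch * epochs_per_year)
print(f"P(one shard fails) = {p_shard:.2e}; years to fail ≈ {years_to_fail:,.1f}")
```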

12.
Appl Radiat Isot ; 194: 110718, 2023 Apr.
Article in English | MEDLINE | ID: mdl-36780765

ABSTRACT

The purpose of this study is to apply a probabilistic method to derive the derived concentration guideline levels (DCGLs) for decommissioning of Korea Research Reactors 1 and 2. A total of seven parameters were found to be sensitive for the target nuclides. The DCGLs of Co-60 and H-3 were 0.063 Bq/g and 85.470 Bq/g, respectively. The concentrations of gamma ray-emitting nuclides at the actual reactor sites were 7.7-215 times lower than the derived DCGLs for gamma ray-emitting nuclides.
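
A loosely hedged sketch of the probabilistic DCGL logic: sample a dose-to-source ratio (DSR) distribution over pathway parameters and divide the dose criterion by a conservative DSR percentile. The dose limit, DSR distribution, and percentile choice below are all assumptions, not the study's inputs.

```python
# Hedged sketch of a probabilistic DCGL calculation. The DSR distribution is
# a placeholder for what would come from sampled pathway parameters.
import numpy as np

rng = np.random.default_rng(13)
dose_limit = 0.1                                 # dose criterion (mSv/yr), assumed
dsr = rng.lognormal(np.log(1.2), 0.5, 50_000)    # mSv/yr per Bq/g, hypothetical
dcgl = dose_limit / np.quantile(dsr, 0.95)       # conservative percentile, assumed
print(f"DCGL ≈ {dcgl:.3f} Bq/g")
```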

13.
Eur J Health Econ ; 24(2): 307-319, 2023 Mar.
Article in English | MEDLINE | ID: mdl-35610397

ABSTRACT

Guidelines of economic evaluations suggest that probabilistic analysis (using probability distributions as inputs) provides less biased estimates than deterministic analysis (using point estimates) owing to the non-linear relationship of model inputs and model outputs. However, other factors can also impact the magnitude of bias for model results. We evaluate bias in probabilistic analysis and deterministic analysis through three simulation studies. The simulation studies illustrate that in some cases, compared with deterministic analyses, probabilistic analyses may be associated with greater biases in model inputs (risk ratios and mean cost estimates using the smearing estimator), as well as model outputs (life-years in a Markov model). Point estimates often represent the most likely value of the parameter in the population, given the observed data. When model parameters have wide, asymmetric confidence intervals, model inputs with larger likelihoods (e.g., point estimates) may result in less bias in model outputs (e.g., costs and life-years) than inputs with lower likelihoods (e.g., probability distributions). Further, when the variance of a parameter is large, simulations from probabilistic analyses may yield extreme values that tend to bias the results of some non-linear models. Deterministic analysis can avoid extreme values that probabilistic analysis may encounter. We conclude that there is no definitive answer on which analytical approach (probabilistic or deterministic) is associated with a less-biased estimate in non-linear models. Health economists should consider the bias of probabilistic analysis and select the most suitable approach for their analyses.
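The nonlinearity argument can be seen in a few lines: for a nonlinear output g, the probabilistic mean E[g(p)] and the deterministic g(p̂) diverge when the input distribution is wide and skewed (Jensen's inequality). The rate-to-mean-time example below is ours, not the paper's.

```python
# Quick illustration: running a nonlinear model at the point estimate
# (deterministic) versus averaging it over parameter draws (probabilistic)
# gives different answers when the input distribution is wide and skewed.
import numpy as np

rng = np.random.default_rng(6)
g = lambda p: 1 / p                    # e.g., mean time-to-event for rate p

p_hat = 0.1                            # point estimate of the rate
draws = rng.lognormal(np.log(p_hat), 0.7, 100_000)  # wide, skewed uncertainty

print("deterministic  g(p_hat)     =", g(p_hat))
print("probabilistic  mean of g(p) =", g(draws).mean())  # Jensen: larger
```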


Subject(s)
Cost-Benefit Analysis; Humans; Probability; Bias
14.
Motor Control ; 27(1): 112-122, 2023 Jan 01.
Article in English | MEDLINE | ID: mdl-35894912

ABSTRACT

Biomechanical trajectories are often routed through a chain of processing steps prior to statistical analysis. As changes in processing parameter values can affect these trajectories, care is required when choosing data processing specifics. The purpose of this Research Note was to demonstrate a simple way to propagate data processing parameter uncertainty to statistical inferences regarding biomechanical trajectories. As an example application, the correlation between foot contact duration and vertical ground reaction force during constant-speed treadmill walking was considered. Uncertainty was modeled using plausible-range uniform distributions in three data processing steps, and Monte Carlo simulation was used to construct probabilistic representations of both individual vertical ground reaction force measurements and the ultimate statistical results. Whereas an initial, plausible set of parameter values yielded a significant correlation between contact duration and late-stance vertical ground reaction force, Monte Carlo simulations revealed strong sensitivity, with "significance" reached in fewer than 40% of simulations and relatively little net effect of the magnitude of parameter uncertainty. These results indicate that propagating processing parameter uncertainty to statistical results promotes a cautious, nuanced, and robust view of observed effects. By extension, Monte Carlo simulations may yield greater interpretive consistency across studies involving data processing uncertainties.
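
A hedged sketch of the propagation recipe on synthetic data (not the authors' pipeline or dataset): draw a processing parameter, here a Butterworth low-pass cutoff, from a plausible-range uniform distribution, reprocess, re-run the test, and tally how often "significance" is reached.

```python
# Hedged sketch: propagate filter-cutoff uncertainty to a correlation test.
# Waveforms and durations below are synthetic stand-ins for real GRF data.
import numpy as np
from scipy import signal, stats

rng = np.random.default_rng(7)
fs, n_sub = 1000, 20
t = np.arange(0, 1, 1 / fs)
dur = rng.normal(0.7, 0.05, n_sub)                 # contact durations (s)
raw = [np.sin(2 * np.pi * 1.2 * t) + 0.3 * rng.standard_normal(t.size)
       for _ in range(n_sub)]                      # per-subject waveforms

sig_count, n_mc = 0, 500
for _ in range(n_mc):
    fc = rng.uniform(10, 30)                       # uncertain cutoff (Hz)
    b, a = signal.butter(4, fc / (fs / 2))         # low-pass Butterworth
    peak = [signal.filtfilt(b, a, w).max() for w in raw]
    r, p = stats.pearsonr(dur, peak)
    sig_count += p < 0.05
print("fraction of simulations reaching significance:", sig_count / n_mc)
```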


Subject(s)
Walking; Humans; Monte Carlo Method; Uncertainty; Computer Simulation
15.
J Forensic Sci ; 67(6): 2218-2229, 2022 Nov.
Article in English | MEDLINE | ID: mdl-36059116

ABSTRACT

The genealogy process is typically the most time-consuming part of, and a limiting factor in the success of, forensic genetic genealogy, which is a new approach to solving violent crimes and identifying human remains. We formulate a stochastic dynamic program that, given the list of matches and their genetic distances to the unknown target, chooses the best decision at each point in time: which match to investigate (i.e., find its ancestors and look for most recent common ancestors between the match and the target), which set of potential most recent common ancestors to descend from (i.e., find its descendants, with the goal of identifying a marriage between the maternal and paternal sides of the target's family tree), or whether to terminate the investigation. The objective is to maximize the probability of finding the target minus a cost associated with the expected size of the final family tree. We estimate the parameters of our model using data from 17 cases (eight solved, nine unsolved) from the DNA Doe Project. We assess the Proposed Strategy using simulated versions of the 17 DNA Doe Project cases and compare it to a Benchmark Strategy that ranks matches by their genetic distance to the target and only descends from known common ancestors between a pair of matches. The Proposed Strategy solves cases ≈10-fold faster than the Benchmark Strategy, and does so by aggressively descending from a set of potential most recent common ancestors between the target and a match even when this set has a low probability of containing the correct most recent common ancestor. Our analysis provides a mathematical foundation for improving the genealogy process in forensic genetic genealogy.
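
A skeletal, heavily simplified rendering of the decision loop this formulation implies (our abstraction with stubbed scores, not the authors' implementation or parameter estimates):

```python
# Toy decision loop: each step picks the action with the best score
# (estimated success probability minus a cost on expected tree growth) and
# stops when no action scores above zero. Scoring functions are stubs.
import random

random.seed(0)
c = 0.01                               # cost per expected added tree node

def candidate_actions(state):
    # stub: (name, p_success, expected_added_nodes) for each match
    return [(f"investigate match {i}",
             random.uniform(0, 0.3), random.randint(5, 60))
            for i in range(state["n_matches"])]

state = {"n_matches": 8, "tree_size": 0}
while True:
    name, p, size = max(candidate_actions(state),
                        key=lambda a: a[1] - c * a[2])
    if p - c * size <= 0:              # no action worth its expected cost
        print("terminate investigation; tree size", state["tree_size"])
        break
    state["tree_size"] += size
    if random.random() < p:            # stubbed chance this action succeeds
        print("target found via:", name, "| tree size", state["tree_size"])
        break
```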


Subject(s)
DNA; Forensic Genetics; Humans; Pedigree; DNA/genetics; Probability; Models, Genetic
16.
Materials (Basel) ; 15(15)2022 Jul 22.
Article in English | MEDLINE | ID: mdl-35897523

ABSTRACT

Considerable uncertainties in the mechanical properties of composites not only prevent efficient applications but also threaten the safety and reliability of structures. In order to determine the uncertainty in the elastic properties of unidirectional CFRP composites, this paper develops a probabilistic analysis method based on a micromechanics theoretical model and Monte Carlo simulation. Firstly, four commonly used theoretical models are investigated by calculating the deterministic elastic parameters of three unidirectional CFRP composites, which are compared with experimental outcomes. According to the error analyses, the bridging model performs best, with errors below 6%, which suggests that it can be used in probabilistic analyses. Furthermore, constituent parameters are regarded as normally distributed random variables, and Monte Carlo simulation is used to generate samples based on the statistics of the constituent parameters. The predicted probabilistic elastic parameters of the T800/X850 composite coincide with those from experiments, which verifies the effectiveness of the developed probabilistic analysis method. From the probabilistic analysis results, the statistics of the elastic parameters, the correlations between the elastic parameters, and their sensitivity to the constituents' properties are determined. The moduli E11, E22, and G12 of the T800/X850 composite follow the lognormal distribution, namely ln(E11) ~ N[5.15, 0.028²], ln(E22) ~ N[2.15, 0.024²], and ln(G12) ~ N[1.48, 0.038²], whereas its Poisson's ratio v12 obeys the normal distribution, namely v12 ~ N(0.33, 0.012²). Additionally, the correlation coefficients between v12 and E11/E22/G12 are small and can thus be ignored, whereas the correlation coefficients between any two of E11, E22, and G12 are larger than 0.5 and should be considered in reliability analyses of composite structures. The developed probabilistic analysis method based on the bridging model and Monte Carlo simulation is fast and reliable and can be used to efficiently evaluate the probabilistic elastic parameters of any unidirectional composite in the reliability design of structures in engineering practice.
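
A hedged sketch of the Monte Carlo step, substituting a simple rule of mixtures for the bridging model and using illustrative constituent statistics (not T800/X850 data); fitting a lognormal to the sampled E11 mirrors the reported ln(E11) ~ N[µ, σ²] form.

```python
# Hedged sketch: sample constituent properties, compute E11 with a simple
# rule of mixtures (a stand-in for the bridging model), fit a lognormal.
import numpy as np
from scipy import stats

rng = np.random.default_rng(8)
n = 100_000
Ef = rng.normal(294, 9, n)       # fiber longitudinal modulus (GPa), assumed
Em = rng.normal(3.2, 0.15, n)    # matrix modulus (GPa), assumed
Vf = rng.normal(0.58, 0.015, n)  # fiber volume fraction, assumed

E11 = Vf * Ef + (1 - Vf) * Em    # rule of mixtures (Voigt bound)
shape, loc, scale = stats.lognorm.fit(E11, floc=0)  # shape = sigma, ln(scale) = mu
print(f"ln(E11) ~ N[{np.log(scale):.2f}, {shape:.3f}^2]")
```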

17.
Front Bioeng Biotechnol ; 10: 808027, 2022.
Article in English | MEDLINE | ID: mdl-35721846

ABSTRACT

Understanding the sources of error is critical before models of the musculoskeletal system can be usefully translated. Using in vivo measured tibiofemoral forces, the impact of uncertainty in muscle-tendon parameters on the accuracy of knee contact force estimates of a generic musculoskeletal model was investigated following a probabilistic approach. Population variability was introduced to the routine musculoskeletal modeling framework by perturbing input parameters of the lower limb muscles around their baseline values. Using ground reaction force and skin marker trajectory data collected from six subjects performing body-weight squats, the knee contact force was calculated for the perturbed models. The combined impact of input uncertainties resulted in a considerable variation in the knee contact force estimates (up to 2.1 BW change in the predicted force), especially at larger knee flexion angles, hence explaining up to 70% of the simulation error. Although individual muscle groups exhibited different contributions to the overall error, variation in the maximum isometric force and pathway of the muscles showed the highest impacts on the model outcomes. Importantly, this study highlights parameters that should be personalized in order to achieve the best possible predictions when using generic musculoskeletal models for activities involving deep knee flexion.
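
To make the perturbation idea concrete, the toy sketch below replaces the full musculoskeletal model with a one-muscle static moment balance and perturbs the two parameter types the abstract flags as most influential (muscle pathway, via the moment arm, and maximum isometric force); every number is an illustrative assumption.

```python
# Heavily hedged toy version of the perturbation experiment: a one-muscle
# static moment balance stands in for the full musculoskeletal model.
import numpy as np

rng = np.random.default_rng(9)
n = 20_000
BW = 75 * 9.81                          # body weight (N), assumed subject
M_ext = 90.0                            # external knee flexion moment (N·m)

r = rng.normal(0.045, 0.005, n)         # quadriceps moment arm (m), perturbed
F_max = rng.normal(6000, 900, n)        # max isometric force (N), perturbed

F_musc = np.minimum(M_ext / r, F_max)   # required force, capped by strength
F_contact = (F_musc + BW) / BW          # crude contact force, in body weights
lo, hi = np.percentile(F_contact, [2.5, 97.5])
print(f"knee contact force ≈ {F_contact.mean():.1f} BW "
      f"(95% band {lo:.1f}-{hi:.1f} BW)")
```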

18.
Materials (Basel) ; 15(12)2022 Jun 09.
Article in English | MEDLINE | ID: mdl-35744162

ABSTRACT

Given the increasingly serious nature of environmental problems, many countries have recently declared carbon-neutrality policies and expended efforts to implement them. The domestic building industry aims to reduce its environmental impact using life-cycle assessments (LCAs) of buildings according to the Green Standard for Energy and Environmental Design. However, it is difficult to perform efficient LCAs because the required quantity takeoff process is complex, and the quantity takeoff sheet may not exist during the building's design phase. In this study, 21 building LCA cases were used to develop a simplified, more efficient method that enables building LCAs even when no quantity takeoff sheet exists. A standard quantity database of building materials was constructed from an analysis of the input quantities of building materials per unit area, and an LCA method for apartment buildings was proposed using this database. The input quantities of building materials were analyzed using a probabilistic analysis technique: probability distributions were derived using Monte Carlo simulations, and their goodness-of-fit was verified. Finally, the reliability of the proposed building LCA method was verified using a case study.
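
A hedged sketch of the database-driven idea: sample per-unit-area material quantities from fitted distributions, multiply by emission factors and floor area, and check the goodness-of-fit of the resulting distribution. The quantities, factors, and materials below are placeholders, not the study's database.

```python
# Hedged sketch: probabilistic embodied-carbon estimate from per-m2 material
# quantities. All distributions and emission factors are invented.
import numpy as np
from scipy import stats

rng = np.random.default_rng(10)
n, area = 50_000, 10_000          # simulations; gross floor area (m2)
qty = {                           # material use per m2 of floor area
    "concrete_m3": rng.normal(0.50, 0.08, n),
    "rebar_kg":    rng.normal(65.0, 8.0, n),
}
ef = {"concrete_m3": 350.0, "rebar_kg": 1.9}   # kgCO2e per unit, placeholders

total = sum(qty[m] * ef[m] for m in qty) * area / 1000   # tCO2e
# goodness-of-fit check of a candidate distribution, as in the study
ks = stats.kstest((total - total.mean()) / total.std(), "norm")
print(f"embodied carbon ≈ {total.mean():,.0f} tCO2e, KS p = {ks.pvalue:.2f}")
```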

19.
Am J Epidemiol ; 191(8): 1429-1443, 2022 07 23.
Article in English | MEDLINE | ID: mdl-35434739

ABSTRACT

Chronic traumatic encephalopathy (CTE) is a neurodegenerative disease associated with exposure to repetitive head impacts such as those from American football. Our understanding of this association is based on research in autopsied brains, since CTE can only be diagnosed postmortem. Such studies are susceptible to selection bias, which needs to be accounted for to ensure a generalizable estimate of the association between repetitive head impacts and CTE. We evaluated the relationship between level of American football playing and CTE diagnosis after adjusting for selection bias. The sample included 290 deceased male former American football players who donated their brains to the Veterans Affairs-Boston University-Concussion Legacy Foundation (VA-BU-CLF) Brain Bank between 2008 and 2019. After adjustment for selection bias, college-level and professional football players had 2.38 (95% simulation interval (SI): 1.16, 5.94) and 2.47 (95% SI: 1.46, 4.79) times the risk of being diagnosed with CTE as high-school-level players, respectively; these estimates are larger than estimates with no selection bias adjustment. Since CTE is currently diagnosed only postmortem, we additionally provide plausible scenarios for CTE risk ratios for each level of play during the former players' lifetime. This study provides further evidence to support a dose-response relationship between American football playing and CTE.
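
One standard way to adjust for the brain-bank selection described here is inverse-probability-of-selection weighting; the sketch below illustrates that generic technique on simulated data with assumed selection probabilities (the paper's actual adjustment and estimates may differ).

```python
# Hedged sketch of inverse-probability-of-selection weighting on simulated
# data. Outcome rates and selection probabilities are invented assumptions.
import numpy as np

rng = np.random.default_rng(11)
n = 290
levels = np.array(["high_school", "college", "professional"])
level = rng.choice(levels, n)
p_cte = {"high_school": 0.3, "college": 0.6, "professional": 0.7}
cte = rng.binomial(1, [p_cte[l] for l in level])     # simulated CTE outcome

# assumed probability of entering the brain bank, by level and outcome
p_sel = {("high_school", 1): 0.08,  ("high_school", 0): 0.02,
         ("college", 1): 0.10,      ("college", 0): 0.03,
         ("professional", 1): 0.20, ("professional", 0): 0.05}
w = np.array([1 / p_sel[(l, o)] for l, o in zip(level, cte)])

def weighted_risk(lv):
    m = level == lv
    return np.average(cte[m], weights=w[m])

rr = weighted_risk("college") / weighted_risk("high_school")
print(f"selection-weighted RR, college vs high school: {rr:.2f}")
```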


Subject(s)
Brain Concussion; Chronic Traumatic Encephalopathy; Football; Neurodegenerative Diseases; Brain; Chronic Traumatic Encephalopathy/diagnosis; Humans; Male
20.
Environ Toxicol Pharmacol ; 92: 103847, 2022 May.
Article in English | MEDLINE | ID: mdl-35283284

ABSTRACT

The purpose of this study was to assess the risk of aflatoxins due to multiple food consumption among the Zhejiang population. An ultra-high-performance liquid chromatography method coupled with tandem mass spectrometry was used to determine aflatoxins in 792 samples. Aflatoxins were detected in 27.1% of the samples at levels between 0.07 and 262.63 µg kg⁻¹, and aflatoxin B1 was the most frequently detected across the different types of samples. Aflatoxin B1 levels exceeding China's national tolerance limits were found in 0.8% of peanut oil samples, 3.39% of nut products, and 1.1% of condiments. Peanut oil had the highest incidence of aflatoxin, with levels ranging from 0.17 to 22.50 µg kg⁻¹. Using bags conferred limited advantages in reducing aflatoxin contents. Moreover, peanut and rice were the main contributors to dietary exposure to aflatoxins among Zhejiang residents. Finally, the margin of exposure values obtained for rice consumption were far below the safe margin of 10,000, indicating a potential risk to public health. The results point to the need for further prioritization of aflatoxin B1 risk-management actions in Zhejiang.
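
The margin-of-exposure logic reduces to MOE = BMDL10 / estimated daily intake; below is a hedged Monte Carlo sketch using the commonly cited EFSA BMDL10 of 170 ng/kg bw/day for aflatoxin B1 and invented consumption inputs (not the study's survey data).

```python
# Hedged sketch of a probabilistic margin-of-exposure (MOE) calculation.
# Concentration, consumption, and body-weight inputs are illustrative.
import numpy as np

rng = np.random.default_rng(12)
n = 100_000
conc = rng.lognormal(np.log(0.5), 1.0, n)     # AFB1 in rice (ug/kg), assumed
intake = rng.normal(250, 60, n).clip(min=0)   # rice consumption (g/day)
bw = rng.normal(60, 10, n).clip(min=30)       # body weight (kg)

edi = conc * intake / 1000 / bw * 1000        # exposure, ng/kg bw/day
moe = 170.0 / edi                             # BMDL10 / exposure
print("P(MOE < 10,000) =", np.mean(moe < 10_000))
```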


Subject(s)
Aflatoxins; Oryza; Aflatoxin B1/analysis; Arachis/chemistry; China; Chromatography, High Pressure Liquid; Food Contamination/analysis; Peanut Oil/analysis