Results 1 - 20 of 24
1.
J Neurodev Disord ; 16(1): 54, 2024 Sep 12.
Article in English | MEDLINE | ID: mdl-39266988

ABSTRACT

BACKGROUND: Common genetic variation has been shown to account for a large proportion of ASD heritability. Polygenic scores generated for autism spectrum disorder (ASD-PGS) using the most recent discovery data, however, explain less variance than expected, despite showing significant associations with ASD and other ASD-related traits. Here, we investigate the extent to which information loss on the target study's genome-wide microarray weakens the predictive power of the ASD-PGS. METHODS: We studied genotype data from three cohorts of individuals with high familial liability for ASD: the Early Autism Risk Longitudinal Investigation (EARLI), Markers of Autism Risk in Babies-Learning Early Signs (MARBLES), and the Infant Brain Imaging Study (IBIS), and one population-based sample, the Study to Explore Early Development Phase I (SEED I). Individuals were genotyped on different microarrays ranging from 1 to 5 million sites. Coverage of the top 88 genome-wide suggestive variants implicated in the discovery study was evaluated in all four studies before quality control (QC), after QC, and after imputation. We then created a novel method to assess coverage of the resulting ASD-PGS by correlating a PGS informed by a comprehensive list of variants with a PGS informed only by the available variants. RESULTS: Prior to imputation, none of the four cohorts directly or indirectly covered all 88 variants among the measured genotype data. After imputation, the two cohorts genotyped on 5-million-site arrays reached full coverage. Analysis of our novel metric showed generally high genome-wide coverage across all four studies, but a greater number of SNPs informing the ASD-PGS did not translate into improved coverage according to our metric. LIMITATIONS: The studies we analyzed had modest sample sizes. Our analyses included only microarrays with more than 1 million sites, so smaller arrays such as the Global Diversity Array and the PsychArray were not included. Our PGS metric for ASD is only generalizable to samples of European ancestries, though the coverage metric can be computed for traits that have sufficiently large discovery studies in other ancestries. CONCLUSIONS: We show that commonly used genotyping microarrays have incomplete coverage of common ASD variants, and imputation cannot always recover the lost information. Our novel metric provides an intuitive way to report information loss in PGS and an alternative to reporting the total number of SNPs included in the PGS. While applied only to ASD here, this metric can easily be used with other traits.


Subject(s)
Autism Spectrum Disorder, Genome-Wide Association Study, Humans, Autism Spectrum Disorder/genetics, Multifactorial Inheritance, Genetic Predisposition to Disease, Male, Female, Genotype, Polymorphism, Single Nucleotide
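Note: the coverage metric described in the abstract above is, in essence, a correlation between a PGS built from the full discovery variant list and a PGS built from only the variants available on the target array. A minimal sketch of that idea (the simulated effect sizes, genotypes and 80% coverage mask are illustrative, not the authors' data or code):

```python
import numpy as np

rng = np.random.default_rng(0)

n_individuals, n_variants = 500, 88            # e.g. the 88 suggestive discovery variants
effect_sizes = rng.normal(0, 0.1, n_variants)  # illustrative discovery weights
genotypes = rng.binomial(2, 0.3, size=(n_individuals, n_variants)).astype(float)

# Full PGS: weighted allele-count sum over every discovery variant.
pgs_full = genotypes @ effect_sizes

# "Available" PGS: only variants covered on the target array (after QC/imputation).
available = rng.random(n_variants) < 0.8       # hypothetical 80% coverage mask
pgs_available = genotypes[:, available] @ effect_sizes[available]

# Coverage metric: correlation between the two scores across individuals.
coverage = np.corrcoef(pgs_full, pgs_available)[0, 1]
print(f"PGS coverage metric (Pearson r): {coverage:.3f}")
```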
2.
Entropy (Basel) ; 26(4)2024 Mar 27.
Article in English | MEDLINE | ID: mdl-38667841

ABSTRACT

Cognitive science is confronted by several fundamental anomalies deriving from the mind-body problem. Most prominent are the problem of mental causation and the hard problem of consciousness, which can be generalized into the hard problem of agential efficacy and the hard problem of mental content. Here, it is proposed to accept these explanatory gaps at face value and to take them as positive indications of a complex relation: mind and matter are one, but they are not the same. They are related in an efficacious yet non-reducible, non-observable, and even non-intelligible manner. Natural science is well equipped to handle the effects of non-observables, and so the mind is treated as equivalent to a hidden 'black box' coupled to the body. Two concepts are introduced, corresponding to the two directions of coupling influence: (1) irruption denotes the unobservable mind hiddenly making a difference to observable matter, and (2) absorption denotes observable matter hiddenly making a difference to the unobservable mind. The concepts of irruption and absorption are methodologically compatible with existing information-theoretic approaches to neuroscience, such as measuring cognitive activity and subjective qualia in terms of entropy and compression, respectively. By offering novel responses to otherwise intractable theoretical problems from first principles, and by doing so in a way that is closely connected with empirical advances, irruption theory is poised to set the agenda for the future of the mind sciences.

3.
Entropy (Basel) ; 25(12)2023 Dec 15.
Article in English | MEDLINE | ID: mdl-38136543

ABSTRACT

The information loss paradox associated with black hole Hawking evaporation is an unresolved problem in modern theoretical physics. In a recent brief essay, we revisited the evolution of the black hole entanglement entropy via the Euclidean path integral (EPI) of the quantum state and allowed for the branching of semi-classical histories along the Lorentzian evolution. We posited that there exist at least two histories that contribute to the EPI, one an information-losing history and the other an information-preserving one. At early times the former dominates the EPI, while at late times the latter becomes dominant. By doing so, we recovered the essence of the Page curve, and thus unitarity, albeit with the turning point, i.e., the Page time, shifted much toward late times. In this full-length paper, we fill in the details of our arguments and calculations to strengthen our case. One implication of this modified Page curve is that the entropy bound may be violated. We comment on the similarities and differences between our approach and those of the replica wormhole and island conjectures.
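For context, the schematic form of the Page curve referred to above (the standard picture, not the authors' detailed EPI calculation) is:

```latex
% Schematic Page curve: the radiation entanglement entropy follows the smaller of the
% growing Hawking (thermal) entropy and the shrinking Bekenstein-Hawking entropy.
S_{\mathrm{rad}}(t) \;\approx\; \min\bigl\{ S_{\mathrm{Hawking}}(t),\, S_{\mathrm{BH}}(t) \bigr\},
\qquad
S_{\mathrm{Hawking}}(t_{\mathrm{Page}}) = S_{\mathrm{BH}}(t_{\mathrm{Page}}).
```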

4.
Entropy (Basel) ; 24(12)2022 Dec 06.
Article in English | MEDLINE | ID: mdl-36554189

ABSTRACT

The generalized likelihood ratio test (GLRT) for composite hypothesis testing problems is studied from a geometric perspective. An information-geometric interpretation of the GLRT is proposed based on the geometry of curved exponential families. Two geometric pictures of the GLRT are presented for the cases where the unknown parameters are and are not the same under the null and alternative hypotheses, respectively. A demonstration with a one-dimensional curved Gaussian distribution is introduced to elucidate the geometric realization of the GLRT. The asymptotic performance of the GLRT is discussed based on the proposed geometric representation. The study provides an alternative theoretical perspective for understanding problems of statistical inference.
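As background, a minimal numerical illustration of the GLRT itself, using a plain Gaussian mean test rather than the curved-exponential-family setting studied in the paper (variable names and data are invented):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Illustrative GLRT: H0: mu = 0 vs H1: mu unrestricted, Gaussian data with known sigma.
sigma, n = 1.0, 200
x = rng.normal(loc=0.15, scale=sigma, size=n)    # data generated slightly off the null

def loglik(mu):
    return np.sum(stats.norm.logpdf(x, loc=mu, scale=sigma))

mu_hat = x.mean()                                # MLE under the full (alternative) model
lr_stat = 2 * (loglik(mu_hat) - loglik(0.0))     # -2 log(likelihood ratio)

# Wilks: under H0 the statistic is asymptotically chi-squared with 1 degree of freedom.
p_value = stats.chi2.sf(lr_stat, df=1)
print(f"GLRT statistic = {lr_stat:.3f}, p-value = {p_value:.4f}")
```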

5.
Front Digit Health ; 4: 841853, 2022.
Article in English | MEDLINE | ID: mdl-36120716

ABSTRACT

Introduction: Electronic Health Records (EHRs) are essential data structures, enabling the sharing of valuable medical care information for a diverse patient population and being reused as input to predictive models for clinical research. However, issues such as the heterogeneity of EHR data and the potential compromise of patient privacy inhibit the secondary use of EHR data in clinical research. Objectives: This study aims to present the main elements of the MODELHealth project implementation and the evaluation method that was followed to assess the efficiency of its mechanism. Methods: The MODELHealth project was implemented as an Extract-Transform-Load system that collects data from the hospital databases, performs harmonization to the HL7 FHIR standard and anonymization using the k-anonymity method, before loading the transformed data into a central repository. The integrity of the anonymization process was validated by developing a database query tool. The information loss occurring due to the anonymization was estimated with the metrics of generalized information loss, discernibility and average equivalence class size for various values of k. Results: The average values of generalized information loss, discernibility and average equivalence class size obtained across all tested datasets and k values were 0.008473 ± 0.006216252886, 115,145,464.3 ± 79,724,196.11 and 12.1346 ± 6.76096647, respectively. The values of those metrics appear correlated with factors such as the k value and the dataset characteristics, as expected. Conclusion: The experimental results of the study demonstrate that it is feasible to perform effective harmonization and anonymization on EHR data while preserving essential patient information.
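Note: two of the metrics quoted above have simple standard definitions. A toy sketch of computing them on a k-anonymized table (the columns and data are hypothetical, not the MODELHealth schema; generalized information loss is omitted because it requires the attribute generalization ranges):

```python
import pandas as pd

# Toy k-anonymized table; 'age_band' and 'zip_prefix' play the role of generalized
# quasi-identifiers (hypothetical columns, not the MODELHealth schema).
df = pd.DataFrame({
    "age_band":   ["30-39", "30-39", "30-39", "40-49", "40-49", "40-49", "40-49"],
    "zip_prefix": ["111**", "111**", "111**", "222**", "222**", "222**", "222**"],
    "diagnosis":  ["A", "B", "A", "C", "A", "B", "C"],
})
k = 3
quasi_identifiers = ["age_band", "zip_prefix"]

class_sizes = df.groupby(quasi_identifiers).size()

# Average equivalence class size metric: (N / number of classes) / k  (1.0 is ideal).
c_avg = (len(df) / len(class_sizes)) / k

# Discernibility metric: each record is penalized by the size of its equivalence class.
discernibility = int((class_sizes ** 2).sum())

print(f"equivalence classes: {len(class_sizes)}, C_AVG = {c_avg:.3f}, DM = {discernibility}")
```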

6.
Article in English | MEDLINE | ID: mdl-36011759

ABSTRACT

To better explain the causes of gas explosion accidents, this paper proposes an accident-causation model of gas explosion accidents based on safety information transmission, building on existing accident-causation theory. On this basis, a new method for the prevention of gas explosion accidents can be developed. By analysing the connection between safety information transmission and the causal factors of gas explosion accidents, it is inferred that loss during safety information transmission is the key factor leading to accidents. Safety information transmission is a process chain in which information is transmitted between the information source and the information subject. This process involves the stages of information generation, conversion, perception, cognition, decision-making, and execution. Information loss is inevitable during transmission. When the loss of safety information affects the information subject's judgement of the current situation and its decision making, the likelihood of accidents increases. Therefore, in this study, we constructed an accident-causation model for gas explosion accidents based on the three elements and six stages of safety information transmission. Subsequently, the DEMATEL-ISM method was used to quantitatively analyse the causes of gas explosion accidents. By partitioning the accident causes into a multilevel hierarchical structure, the cause factors, effect factors, and root factors affecting accidents were identified, and countermeasures were proposed to provide a theoretical basis for the prevention of gas explosion accidents.


Subject(s)
Accidents, Explosions, Accidents, Occupational/prevention & control, Causality
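As background for the quantitative step mentioned above, a toy sketch of the DEMATEL part of the DEMATEL-ISM analysis (the 4x4 direct-influence matrix is invented; the paper's factor set and the ISM hierarchy are not reproduced):

```python
import numpy as np

# Toy DEMATEL step for four hypothetical causal factors (not the paper's factor set).
# direct_influence[i, j]: expert-rated influence of factor i on factor j (0-4 scale).
direct_influence = np.array([
    [0, 3, 2, 1],
    [1, 0, 3, 2],
    [0, 1, 0, 3],
    [2, 0, 1, 0],
], dtype=float)

# Normalize by the largest row sum, then compute the total-relation matrix T = N(I - N)^-1.
N = direct_influence / direct_influence.sum(axis=1).max()
T = N @ np.linalg.inv(np.eye(len(N)) - N)

R = T.sum(axis=1)   # total influence given by each factor
C = T.sum(axis=0)   # total influence received by each factor
for i, (prominence, relation) in enumerate(zip(R + C, R - C)):
    role = "cause" if relation > 0 else "effect"
    print(f"factor {i}: prominence={prominence:.2f}, relation={relation:+.2f} ({role})")
```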
7.
Entropy (Basel) ; 24(3)2022 Mar 11.
Article in English | MEDLINE | ID: mdl-35327902

ABSTRACT

In this paper, we focus on some aspects of the relation between spacetime and quantum mechanics and study the counterparts (in Set) of the categorical local symmetries of smooth 4-manifolds. In the set-theoretic limit, there emerge exotic smoothness structures on R4 (hence nonvanishing Riemannian curvature), which fit well with the quantum mechanical lattice of projections on infinite-dimensional Hilbert spaces. The method we follow is formalization localized on the open covers of the spacetime manifold. We discuss our findings in the context of the information paradox assigned to evaporating black holes. A black hole can evaporate entirely, but the smoothness structure of spacetime will be altered and, in this way, the missing information about the initial states of matter forming the black hole will be encoded. Thus, the possible global geometric remnant of black holes in spacetime is recognized as exotic 4-smoothness. The full-fledged verification of this proposal will presumably be possible within the scope of future research in quantum gravity theory.

8.
BMC Med Inform Decis Mak ; 22(1): 24, 2022 01 28.
Article in English | MEDLINE | ID: mdl-35090447

ABSTRACT

BACKGROUND: Data privacy is one of the biggest challenges for any organisation which processes personal data, especially in the area of medical research, where data include sensitive information about patients and study participants. Sharing of data is therefore problematic, which is at odds with the principle of open data that is so important to the advancement of society and science. Several statistical methods and computational tools have been developed to help data custodians and analysts overcome this challenge. METHODS: In this paper, we propose a new deterministic approach for anonymising personal data. The method stratifies the underlying data by the categorical variables and re-distributes the continuous variables through a k-nearest-neighbours-based algorithm. RESULTS: We demonstrate the use of the deterministic anonymisation on real data, including data from a sample of Titanic passengers and data from participants in the 1958 Birth Cohort. CONCLUSIONS: The proposed procedure makes data re-identification difficult while minimising the loss of utility (by preserving the spatial properties of the underlying data); the latter means that informative statistical analysis can still be conducted.


Subject(s)
Biomedical Research, Privacy, Data Anonymization, Humans
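Note: one plausible reading of the kNN-based redistribution described above, sketched on a toy Titanic-like table (this is an illustration of the general idea, not the authors' exact algorithm; column names and k are assumptions):

```python
import numpy as np
import pandas as pd
from sklearn.neighbors import NearestNeighbors

def knn_smooth(df, categorical, continuous, k=5):
    """Replace each record's continuous values with the mean of its k nearest
    neighbours within the same categorical stratum (illustrative sketch only;
    each point counts as its own neighbour here)."""
    out = df.copy()
    for _, idx in df.groupby(categorical).groups.items():
        stratum = df.loc[idx, continuous].to_numpy(dtype=float)
        n_neighbors = min(k, len(stratum))
        nn = NearestNeighbors(n_neighbors=n_neighbors).fit(stratum)
        _, neighbours = nn.kneighbors(stratum)
        out.loc[idx, continuous] = stratum[neighbours].mean(axis=1)
    return out

# Toy example with a Titanic-like schema (column names are illustrative).
df = pd.DataFrame({
    "pclass": [1, 1, 1, 3, 3, 3, 3],
    "sex": ["f", "m", "f", "m", "m", "f", "m"],
    "age": [29.0, 35.0, 41.0, 22.0, 25.0, 19.0, 30.0],
    "fare": [80.0, 75.0, 90.0, 7.5, 8.0, 7.0, 9.0],
})
print(knn_smooth(df, categorical=["pclass", "sex"], continuous=["age", "fare"], k=2))
```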
9.
Entropy (Basel) ; 24(10)2022 Oct 02.
Article in English | MEDLINE | ID: mdl-37420431

ABSTRACT

Unitarity demands that the black-hole final state (what remains inside the event horizon at complete evaporation) must be unique. Assuming a UV theory with infinitely many fields, we propose that the uniqueness of the final state can be achieved via a mechanism analogous to the quantum-mechanical description of dissipation.

10.
Int J Data Sci Anal ; 14(1): 65-87, 2022.
Article in English | MEDLINE | ID: mdl-34778513

ABSTRACT

Social media plays a vital role in information sharing at massive scale due to its easy access, low cost, and fast dissemination of information. Its capacity to disseminate information across a wide audience raises the critical challenge of determining the social data provenance of digital content. Social data provenance describes the origin, derivation process, and transformations of social content throughout its lifecycle. In this paper, we present a Big Social Data Provenance (BSDP) framework for key-value pair (KVP) databases using the novel concept of a Zero-Information Loss Database (ZILD). In the proposed framework, a large volume of social data is first fetched from social media (the Twitter network) through live streaming and simultaneously modelled in a KVP database using a query-driven approach. The framework is capable of capturing, storing, and querying provenance information for different query sets, including select, aggregate, standing/historical, and data update (i.e., insert, delete, update) queries on Big Social Data. We evaluate the performance of the proposed framework in terms of provenance-capturing overhead for select, aggregate, and data update queries, and average execution time for various provenance queries.
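For illustration only, a generic sketch of the two ingredients named above, a key-value store that keeps every version (zero information loss) and records provenance per operation (this is not the BSDP/ZILD implementation; class and field names are invented):

```python
import time
from collections import defaultdict

class ZeroLossKVStore:
    """Toy key-value store that never discards old versions and records provenance
    for every operation (a generic illustration, not the BSDP/ZILD implementation)."""

    def __init__(self):
        self._versions = defaultdict(list)   # key -> [(timestamp, value, op), ...]
        self._provenance = []                # flat operation log

    def put(self, key, value, source):
        record = (time.time(), value, "put")
        self._versions[key].append(record)
        self._provenance.append({"op": "put", "key": key, "source": source, "ts": record[0]})

    def delete(self, key, source):
        # Deletion is logical: a tombstone is appended, old versions remain queryable.
        record = (time.time(), None, "delete")
        self._versions[key].append(record)
        self._provenance.append({"op": "delete", "key": key, "source": source, "ts": record[0]})

    def get(self, key):
        history = self._versions.get(key, [])
        return history[-1][1] if history else None

    def provenance(self, key):
        return [p for p in self._provenance if p["key"] == key]

store = ZeroLossKVStore()
store.put("tweet:42", {"text": "hello"}, source="stream")
store.put("tweet:42", {"text": "hello, edited"}, source="update-query")
print(store.get("tweet:42"), store.provenance("tweet:42"))
```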

11.
BMC Anesthesiol ; 21(1): 38, 2021 02 05.
Article in English | MEDLINE | ID: mdl-33546588

ABSTRACT

BACKGROUND: Handovers of post-anesthesia patients to the intensive care unit (ICU) are often unstructured and performed under time pressure. Hence, they bear a high risk of poor communication, loss of information and potential patient harm. The aim of this study was to investigate the completeness of information transfer and the quantity of information loss during post-anesthesia handovers of critical care patients. METHODS: Using a self-developed checklist including 55 peri-operative items, patient handovers from the operating room or post-anesthesia care unit to the ICU staff were observed and documented in real time. Observations were analyzed for the amount of correctly and completely transferred patient data in relation to the written documentation in the anesthesia record and the patient's chart. RESULTS: During a ten-week study period, 97 handovers were included. The mean duration of a handover was 146 seconds, and interruptions occurred in 34% of all cases. While some items were transferred frequently (basic patient characteristics [72%], surgical procedure [83%], intraoperative complications [93.8%]), others were commonly missed (underlying diseases [23%], long-term medication [6%]). The completeness of information transfer was associated with the handover's duration [B coefficient (95% CI): 0.118 (0.084-0.152), p<0.001] and increased significantly in handovers exceeding a duration of 2 minutes (24% ± 11.7 vs. 40% ± 18.04, p<0.001). CONCLUSIONS: Handover completeness is affected by time pressure, interruptions, and inappropriate surroundings, which increase the risk of information loss. To improve completeness and ensure patient safety, an adequate time span for the handover and the implementation of communication tools are required.


Subject(s)
Checklist/methods, Communication, Critical Care/methods, Intensive Care Units, Operating Rooms, Patient Handoff/statistics & numerical data, Patient Safety/statistics & numerical data, Adult, Aged, Aged, 80 and over, Female, Germany, Humans, Male, Middle Aged, Prospective Studies, Time, Young Adult
12.
Entropy (Basel) ; 22(2)2020 Feb 18.
Article in English | MEDLINE | ID: mdl-33286002

ABSTRACT

Pseudo-density matrices are a generalisation of quantum states and do not obey monogamy of quantum correlations. Could this be the solution to the paradox of information loss during the evaporation of a black hole? In this paper we discuss this possibility, providing a theoretical proposal to extend quantum theory with these pseudo-states to describe the statistics arising in black-hole evaporation. We also provide an experimental demonstration of this theoretical proposal, using a simulation in the optical regime that tomographically reproduces the correlations of the pseudo-density matrix describing this physical phenomenon.

13.
Article in English | MEDLINE | ID: mdl-33082616

ABSTRACT

Significant progress has been made using fMRI to characterize the brain changes that occur in ASD, a complex neuro-developmental disorder. However, due to the high dimensionality and low signal-to-noise ratio of fMRI, embedding informative and robust brain-regional fMRI representations for both graph-level classification and region-level functional difference detection between ASD and healthy control (HC) groups is difficult. Here, we model the whole-brain fMRI as a graph, which preserves geometrical and temporal information, and use a Graph Neural Network (GNN) to learn from the graph-structured fMRI data. We investigate the potential of including a mutual information (MI) loss (Infomax), an unsupervised term encouraging large MI between each nodal representation and its corresponding graph-level summary representation, to learn a better graph embedding. Specifically, this work developed a pipeline including a GNN encoder, a classifier and a discriminator, which forces the encoded nodal representations both to benefit classification and to reveal the common nodal patterns in a graph. We simultaneously optimize the graph-level classification loss and the Infomax loss. We demonstrate that Infomax graph embedding improves classification performance as a regularization term. Furthermore, we found separable nodal representations of the ASD and HC groups in the prefrontal cortex, cingulate cortex, visual regions, and other social, emotional and execution-related brain regions. In contrast with a GNN trained with the classification loss only, the proposed pipeline can facilitate training more robust ASD classification models. Moreover, the separable nodal representations can detect the functional differences between the two groups and contribute to revealing new ASD biomarkers.
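A compact sketch of the kind of joint objective described above, a graph-classification loss plus a DGI-style Infomax term (layer sizes, the mean readout, and the 0.5 weighting are illustrative choices, not the authors' architecture):

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class GCNEncoder(nn.Module):
    """Single-layer GCN-style encoder: H = ReLU(A_hat X W)."""
    def __init__(self, in_dim, hid_dim):
        super().__init__()
        self.lin = nn.Linear(in_dim, hid_dim, bias=False)

    def forward(self, x, a_hat):          # x: [N, F], a_hat: normalized adjacency [N, N]
        return F.relu(a_hat @ self.lin(x))

class InfomaxClassifier(nn.Module):
    """Jointly optimizes graph classification and a DGI-style Infomax term."""
    def __init__(self, in_dim, hid_dim, n_classes):
        super().__init__()
        self.encoder = GCNEncoder(in_dim, hid_dim)
        self.classifier = nn.Linear(hid_dim, n_classes)
        self.disc = nn.Bilinear(hid_dim, hid_dim, 1)   # discriminator D(h, s)

    def forward(self, x, a_hat):
        h = self.encoder(x, a_hat)                     # node embeddings
        s = torch.sigmoid(h.mean(dim=0))               # graph-level readout (summary)
        logits = self.classifier(s)                    # graph-level class logits

        # Negative samples: corrupt the graph by shuffling node features.
        h_neg = self.encoder(x[torch.randperm(x.size(0))], a_hat)
        pos = self.disc(h, s.expand_as(h)).squeeze(-1)
        neg = self.disc(h_neg, s.expand_as(h_neg)).squeeze(-1)
        infomax_loss = F.binary_cross_entropy_with_logits(
            torch.cat([pos, neg]),
            torch.cat([torch.ones_like(pos), torch.zeros_like(neg)]))
        return logits, infomax_loss

# Toy usage on one random "graph" (real inputs would be fMRI-derived node features
# and a normalized brain-connectivity adjacency matrix).
n_nodes, in_dim = 116, 32
x = torch.randn(n_nodes, in_dim)
a_hat = torch.eye(n_nodes)                             # stand-in normalized adjacency
model = InfomaxClassifier(in_dim, hid_dim=64, n_classes=2)
logits, mi_loss = model(x, a_hat)
label = torch.tensor([1])
loss = F.cross_entropy(logits.unsqueeze(0), label) + 0.5 * mi_loss   # joint objective
loss.backward()
```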

14.
Biochem Biophys Rep ; 22: 100749, 2020 Jul.
Article in English | MEDLINE | ID: mdl-32181372

ABSTRACT

The question addressed here is whether information can be lost in gene expression. Information loss in gene expression disrupts cellular dynamics and can lead to serious defects, including cancer. Using the Gottesman and Preskill method for calculating information loss in black holes, a mechanism for calculating the amount of information transformation in gene expression is proposed. In this proposal, there are three different Hilbert spaces belonging to the degrees of freedom of DNA, RNA, and protein. The genetic sequence of the DNA is expressed as protein in two stages. In the first stage, it is shown that the internal stationary state of the cell can be represented by a maximally entangled two-mode squeezed state of DNA and mRNA. In the second stage, the state of the cell is described by a maximally entangled two-mode squeezed state of mRNA and protein. The amount of information transformation can be obtained by projecting the first-stage state onto the second-stage state. Evidently, for all finite values of the transcription factor concentration y, binding energy E and free energy F of the transcription factor, information is not lost in gene expression.

15.
Math Biosci Eng ; 16(5): 5584-5594, 2019 06 17.
Article in English | MEDLINE | ID: mdl-31499726

ABSTRACT

The authenticity of an image is crucial in many cases. Efficient detection of the JPEG compression history of a bitmap image can reveal possible tampering with the image. In this paper, we propose a lightweight but reliable JPEG compression detection method based on image information loss. An efficient feature, the decreasing percentage of zero coefficients with increasing JPEG compression quality factor, is proposed to detect the JPEG compression history of an image. In our method, estimated original images are first created. Then the given image and its estimated counterpart are compressed to obtain the JPEG coefficients. After that, the image information loss is calculated. Through this analysis, the goal of compression history detection can be achieved. Extensive experimental results demonstrate that the proposed method outperforms existing methods.
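A rough sketch of the underlying observation, namely that the share of zero quantized block-DCT coefficients shrinks as the JPEG quality factor grows (the quantization-table scaling follows the common libjpeg convention; this is not the authors' detection pipeline):

```python
import numpy as np
from scipy.fftpack import dct

# Standard JPEG luminance quantization table.
Q_LUMA = np.array([
    [16, 11, 10, 16, 24, 40, 51, 61],
    [12, 12, 14, 19, 26, 58, 60, 55],
    [14, 13, 16, 24, 40, 57, 69, 56],
    [14, 17, 22, 29, 51, 87, 80, 62],
    [18, 22, 37, 56, 68, 109, 103, 77],
    [24, 35, 55, 64, 81, 104, 113, 92],
    [49, 64, 78, 87, 103, 121, 120, 101],
    [72, 92, 95, 98, 112, 100, 103, 99],
], dtype=float)

def scaled_table(quality):
    """libjpeg-style scaling of the base table by a quality factor in [1, 100]."""
    scale = 5000 / quality if quality < 50 else 200 - 2 * quality
    return np.clip(np.floor((Q_LUMA * scale + 50) / 100), 1, 255)

def zero_coefficient_fraction(image, quality):
    """Fraction of quantized 8x8 block-DCT coefficients that are zero."""
    h, w = (d - d % 8 for d in image.shape)
    img = image[:h, :w].astype(float) - 128.0
    table = scaled_table(quality)
    zeros = total = 0
    for i in range(0, h, 8):
        for j in range(0, w, 8):
            block = img[i:i + 8, j:j + 8]
            coeffs = dct(dct(block.T, norm="ortho").T, norm="ortho")
            zeros += np.count_nonzero(np.round(coeffs / table) == 0)
            total += 64
    return zeros / total

rng = np.random.default_rng(0)
image = rng.integers(0, 256, size=(64, 64))
for q in (30, 60, 90):
    print(f"quality {q}: {zero_coefficient_fraction(image, q):.2%} zero coefficients")
```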

16.
Sensors (Basel) ; 17(3)2017 Feb 26.
Article in English | MEDLINE | ID: mdl-28245634

ABSTRACT

In this paper, we propose a computationally efficient spatial differencing matrix set (SDMS) method for two-dimensional direction of arrival (2D DOA) estimation with uniform rectangular arrays (URAs) in a low-grazing-angle (LGA) condition. By rearranging the auto-correlation and cross-correlation matrices in turn among different subarrays, the SDMS method can estimate the two parameters independently with one-dimensional (1D) subspace-based estimation techniques, where differencing is performed only on the auto-correlation matrices while the cross-correlation matrices are kept intact. Then, the pair-matching of the two parameters is achieved by extracting the diagonal elements of the URA. Thus, the proposed method can decrease the computational complexity, suppress the effect of additive noise and incur little information loss. Simulation results show that, in LGA conditions, the proposed method achieves improved performance compared to other methods under white or colored noise.

17.
J Med Imaging Radiat Sci ; 48(2): 137-143, 2017 Jun.
Article in English | MEDLINE | ID: mdl-31047361

ABSTRACT

INTRODUCTION: Quality in radiology images can be assessed by determining the levels of information retained or lost in an image. Information loss in images has recently been assessed via a method based on information theory and the use of a contrast-detail (CD) phantom. In this study, the traditional CD phantom (air-Perspex) and a modified CD phantom were used. METHODS: Using the Agfa DX-D 600 digital flat panel system, six phantom radiographs were acquired at 70 kVp and 20 mAs. Three x-ray images were acquired for each phantom. RESULTS: Our results demonstrate that the material within the CD phantom influences total information loss (TIL) and image quality figure (IQF) measurements. The modified CD phantom provides a more realistic account of TIL and IQF for soft-tissue radiology imaging. CONCLUSION: It is recommended that a low inherent subject contrast phantom, such as this modified CD phantom, be added to the image quality assessment processes of radiology departments. In addition, using both IQF and TIL to assess image quality will provide radiology departments with greater evidence on which to base decisions.

18.
J R Stat Soc Series B Stat Methodol ; 78(5): 1103-1130, 2016 Nov.
Article in English | MEDLINE | ID: mdl-27840585

ABSTRACT

We propose a framework for general Bayesian inference. We argue that a valid update of a prior belief distribution to a posterior can be made for parameters which are connected to observations through a loss function rather than the traditional likelihood function, which is recovered as a special case. Modern application areas make it increasingly challenging for Bayesians to attempt to model the true data-generating mechanism. For instance, when the object of interest is low dimensional, such as a mean or median, it is cumbersome to have to achieve this via a complete model for the whole data distribution. More importantly, there are settings where the parameter of interest does not directly index a family of density functions and thus the Bayesian approach to learning about such parameters is currently regarded as problematic. Our framework uses loss functions to connect information in the data to functionals of interest. The updating of beliefs then follows from a decision theoretic approach involving cumulative loss functions. Importantly, the procedure coincides with Bayesian updating when a true likelihood is known yet provides coherent subjective inference in much more general settings. Connections to other inference frameworks are highlighted.
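The general Bayesian update described above can be written compactly; with a loss function in place of the negative log-likelihood (a loss-scaling weight w is often included):

```latex
% General Bayesian update with loss function l(theta, x) (w is a loss-scaling weight):
\pi(\theta \mid x) \;\propto\; \exp\{-w\,\ell(\theta, x)\}\;\pi(\theta),
\qquad
\ell(\theta, x) = -\log f(x \mid \theta),\; w = 1 \;\Rightarrow\; \text{standard Bayesian updating.}
```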

19.
Bull Math Biol ; 78(4): 834-858, 2016 04.
Article in English | MEDLINE | ID: mdl-27090117

ABSTRACT

Mathematical models of population extinction have a variety of applications in areas such as ecology, paleontology and conservation biology. Here we propose and investigate two types of sub-exponential models of population extinction. Unlike the more traditional exponential models, sub-exponential models have a finite life duration. In the first model, the population is assumed to be composed of clones that are independent of each other. In the second model, we assume that the size of the population as a whole decreases according to the sub-exponential equation. We then investigate the "unobserved heterogeneity," i.e., the underlying inhomogeneous population model, and calculate the distribution of frequencies of clones for both models. We show that the dynamics of frequencies in the first model is governed by the principle of minimum Tsallis information loss. In the second model, the notion of "internal population time" is proposed; with respect to this internal time, the dynamics of frequencies is governed by the principle of minimum Shannon information loss. The results of this analysis show that the principle of minimum information loss is the underlying law for the evolution of a broad class of models of population extinction. Finally, we propose a possible application of this modeling framework to mechanisms underlying time perception.


Subject(s)
Extinction, Biological, Models, Biological, Animals, Biological Evolution, Humans, Mathematical Concepts, Population Dynamics, Time Perception
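For intuition about the finite life duration mentioned above, one standard sub-exponential (power-law) decay form is shown below; the paper's exact equations may differ:

```latex
% One standard sub-exponential (power-law) decay; unlike dN/dt = -kN, it reaches N = 0
% at a finite time T, which is the sense in which the life duration is finite.
\frac{dN}{dt} = -k\,N^{\,q}, \quad 0 \le q < 1
\;\;\Longrightarrow\;\;
N(t) = \bigl(N_0^{\,1-q} - (1-q)\,k\,t\bigr)^{\tfrac{1}{1-q}},
\qquad
N(T) = 0 \;\text{ at }\; T = \frac{N_0^{\,1-q}}{(1-q)\,k}.
```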
20.
Int J Thermophys ; 36(9): 2328-2341, 2015.
Article in English | MEDLINE | ID: mdl-26594081

ABSTRACT

Thermal waves are caused by pure diffusion: their amplitude decreases by more than a factor of 500 within a propagation distance of one wavelength. The diffusion equation, which describes the temperature as a function of space and time, is linear. For every linear equation the superposition principle is valid, which is known as the Huygens principle for optical or mechanical wave fields. This limits the spatial resolution, like the Abbe diffraction limit in optics. The resolution is the minimal size of a structure which can be detected at a certain depth. If an embedded structure at a certain depth in a sample is suddenly heated, e.g., by eddy currents or absorbed light, an image of the structure can be reconstructed from the temperature measured at the sample surface. To obtain the resolution, the image reconstruction can be considered as the time reversal of the thermal wave. This inverse problem is ill-conditioned, and therefore regularization methods have to be used, taking additional assumptions such as smoothness of the solutions into account. In the present work, for the first time, methods of non-equilibrium statistical physics are used to solve this inverse problem without the need for such additional assumptions and without the necessity of choosing a regularization parameter. For reconstructing an embedded structure by thermal waves, the resolution turns out to be proportional to the depth and inversely proportional to the natural logarithm of the signal-to-noise ratio. This result can be derived from the diffusion equation by using a delta source at a certain depth and setting the entropy production caused by thermal diffusion equal to the information loss. No specific model of the stochastic process of the fluctuations or of the distribution densities around the mean values was necessary to obtain this result.
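For context, the standard one-dimensional thermal-wave solution behind the "factor of 500" statement, together with the resolution scaling quoted in the abstract (the proportionality constant comes from the paper's derivation and is not reproduced here):

```latex
% Thermal wave solution of the diffusion equation (diffusivity alpha, angular frequency omega):
T(x,t) = T_0\, e^{-x/\mu} \cos\!\bigl(\omega t - x/\mu\bigr),
\qquad
\mu = \sqrt{2\alpha/\omega}, \qquad \lambda = 2\pi\mu,
\qquad
\frac{|T(x+\lambda)|}{|T(x)|} = e^{-2\pi} \approx \frac{1}{535},
% and the resolution scaling stated in the abstract (d = depth of the embedded structure):
\qquad
\delta \;\propto\; \frac{d}{\ln(\mathrm{SNR})}.
```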
