Results 1 - 10 of 10
1.
Nat Commun ; 14(1): 186, 2023 01 17.
Article in English | MEDLINE | ID: mdl-36650144

ABSTRACT

Dynamic processes on networks, be it information transfer on the Internet, contagious spreading in a social network, or neural signaling, take place along shortest or nearly shortest paths. Computing shortest paths is a straightforward task when the network of interest is fully known, and there is a plethora of computational algorithms for this purpose. Unfortunately, our maps of most large networks are substantially incomplete due to either the highly dynamic nature of networks, or the high cost of network measurements, or both, rendering traditional path finding methods inefficient. We find that shortest paths in large real networks, such as the network of protein-protein interactions and the Internet at the autonomous system level, are not random but are organized according to latent-geometric rules. If nodes of these networks are mapped to points in latent hyperbolic spaces, shortest paths in them align along geodesic curves connecting endpoint nodes. We find that this alignment is sufficiently strong to allow for the identification of shortest path nodes even in the case of substantially incomplete networks, where numbers of missing links exceed those of observable links. We demonstrate the utility of latent-geometric path finding in problems of cellular pathway reconstruction and communication security.
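The latent-geometric idea can be sketched in a toy example: if each node carries latent polar coordinates (r, θ) in the hyperbolic plane, a greedy walker that always steps to the neighbor hyperbolically closest to the destination tends to follow the geodesic. The graph, coordinates, and function names below are hypothetical illustrations, not the authors' code.

```python
import math

def hyperbolic_distance(p1, p2):
    """Distance between two points (r, theta) in the native
    representation of the hyperbolic plane (curvature -1)."""
    r1, t1 = p1
    r2, t2 = p2
    dtheta = math.pi - abs(math.pi - abs(t1 - t2))  # angular gap in [0, pi]
    if dtheta == 0:
        return abs(r1 - r2)
    x = (math.cosh(r1) * math.cosh(r2)
         - math.sinh(r1) * math.sinh(r2) * math.cos(dtheta))
    return math.acosh(max(1.0, x))  # clamp guards against rounding below 1

def greedy_geodesic_path(graph, coords, src, dst):
    """Repeatedly move to the neighbor hyperbolically closest to the
    destination; a toy version of latent-geometric path finding."""
    path = [src]
    while path[-1] != dst:
        here = path[-1]
        nxt = min(graph[here],
                  key=lambda v: hyperbolic_distance(coords[v], coords[dst]))
        if nxt in path:  # stuck in a loop: greedy routing failed
            return None
        path.append(nxt)
    return path
```

On a toy chain a-b-c with nearby angular coordinates, the greedy walk recovers the shortest path a → b → c.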


Subject(s)
Algorithms , Signal Transduction , Communication , Cell Communication
2.
Risk Anal ; 40(1): 183-199, 2020 01.
Article in English | MEDLINE | ID: mdl-28873246

ABSTRACT

Risk assessors and managers face many difficult challenges related to novel cyber systems. Among these challenges are the constantly changing nature of cyber systems caused by technical advances, their distribution across the physical, information, and sociocognitive domains, and the complex network structures often including thousands of nodes. Here, we review probabilistic and risk-based decision-making techniques applied to cyber systems and conclude that existing approaches typically do not address all components of the risk assessment triplet (threat, vulnerability, consequence) and lack the ability to integrate across multiple domains of cyber systems to provide guidance for enhancing cybersecurity. We present a decision-analysis-based approach that quantifies threat, vulnerability, and consequences through a set of criteria designed to assess the overall utility of cybersecurity management alternatives. The proposed framework bridges the gap between risk assessment and risk management, allowing an analyst to ensure a structured and transparent process of selecting risk management alternatives. The use of this technique is illustrated for a hypothetical, but realistic, case study exemplifying the process of evaluating and ranking five cybersecurity enhancement strategies. The approach presented does not necessarily eliminate biases and subjectivity necessary for selecting countermeasures, but provides justifiable methods for selecting risk management actions consistent with stakeholder and decisionmaker values and technical data.
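The core of such a decision-analysis framework — scoring management alternatives against weighted criteria — can be illustrated with a minimal weighted-sum utility sketch. The criteria names, weights, and scores below are hypothetical, not those of the paper.

```python
def rank_alternatives(scores, weights):
    """Rank cybersecurity management alternatives by weighted utility
    across threat / vulnerability / consequence criteria (higher is
    better).  Returns (ranking, utility-per-alternative)."""
    utilities = {alt: sum(weights[c] * s[c] for c in weights)
                 for alt, s in scores.items()}
    ranking = sorted(utilities, key=utilities.get, reverse=True)
    return ranking, utilities

# Hypothetical criteria weights (sum to 1) and normalized scores in [0, 1]:
weights = {'threat': 0.5, 'vulnerability': 0.3, 'consequence': 0.2}
scores = {'A': {'threat': 0.9, 'vulnerability': 0.2, 'consequence': 0.5},
          'B': {'threat': 0.4, 'vulnerability': 0.8, 'consequence': 0.9}}
```

Real applications would elicit weights from stakeholders, which is where the framework's structured, transparent process matters.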

3.
Phys Rev E ; 97(1-1): 012309, 2018 Jan.
Article in English | MEDLINE | ID: mdl-29448477

ABSTRACT

We analyze the stability of the network's giant connected component under the impact of adverse events, which we model through link percolation. Specifically, we quantify the extent to which the largest connected component of a network consists of the same nodes, regardless of the specific set of deactivated links. Our results are intuitive in the case of single-layered systems: the presence of large-degree nodes in a single-layered network ensures both its robustness and stability. In contrast, we find that interdependent networks that are robust to adverse events have unstable connected components. Our results bring novel insights to the design of resilient network topologies and the reinforcement of existing networked systems.
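The stability notion can be illustrated with a small sketch: run independent link-percolation trials and measure how similar the resulting giant components are, here via mean pairwise Jaccard similarity (a plausible reading of the abstract, not the authors' exact estimator).

```python
import random

def largest_component(nodes, edges):
    """Node set of the largest connected component (union-find)."""
    parent = {v: v for v in nodes}
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]  # path halving
            v = parent[v]
        return v
    for u, v in edges:
        parent[find(u)] = find(v)
    comps = {}
    for v in nodes:
        comps.setdefault(find(v), set()).add(v)
    return max(comps.values(), key=len)

def component_stability(nodes, edges, keep=0.7, trials=50, seed=0):
    """Mean pairwise Jaccard similarity of the giant component across
    independent link-percolation trials (each link kept with prob. keep)."""
    rng = random.Random(seed)
    giants = []
    for _ in range(trials):
        kept = [e for e in edges if rng.random() < keep]
        giants.append(largest_component(nodes, kept))
    sims = [len(a & b) / len(a | b)
            for i, a in enumerate(giants) for b in giants[i + 1:]]
    return sum(sims) / len(sims)
```

A stability near 1 means the giant component is made of essentially the same nodes no matter which links fail.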

4.
Sci Rep ; 8(1): 1859, 2018 01 30.
Article in English | MEDLINE | ID: mdl-29382870

ABSTRACT

Assessing and managing the impact of large-scale epidemics considering only the individual risk and severity of the disease is exceedingly difficult and could be extremely expensive. Economic consequences, infrastructure and service disruption, as well as the recovery speed, are just a few of the many dimensions along which to quantify the effect of an epidemic on society's fabric. Here, we extend the concept of resilience to characterize epidemics in structured populations, by defining the system-wide critical functionality that combines an individual's risk of getting the disease (disease attack rate) and the disruption to the system's functionality (human mobility deterioration). By studying both conceptual and data-driven models, we show that the integrated consideration of individual risks and societal disruptions under a resilience assessment framework provides an insightful picture of how an epidemic might impact society. In particular, containment interventions intended for a straightforward reduction of the risk may have a net negative impact on the system by slowing down the recovery of basic societal functions. The presented study operationalizes the resilience framework, providing a more nuanced and comprehensive approach for optimizing containment schemes and mitigation policies in the case of epidemic outbreaks.
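One plausible way to combine the two ingredients named in the abstract into a system-wide critical functionality is a simple product form; this is a hypothetical illustration, not necessarily the paper's exact definition.

```python
def critical_functionality(attack_rate, mobility):
    """Toy system-wide critical functionality over time: functionality
    falls both with the cumulative disease attack rate (fraction infected)
    and with mobility deterioration (1.0 = normal mobility).
    Hypothetical product form for illustration only."""
    return [(1.0 - a) * m for a, m in zip(attack_rate, mobility)]
```

Under this reading, an intervention that cuts the attack rate but depresses mobility for a long time can lower the time-integrated functionality, which is the trade-off the abstract highlights.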


Subject(s)
Disease Outbreaks/prevention & control , Epidemics , Models, Theoretical , Resilience, Psychological , Self-Management , Disease Management , Humans , Risk Assessment
5.
Sci Adv ; 3(12): e1701079, 2017 12.
Article in English | MEDLINE | ID: mdl-29291243

ABSTRACT

Urban transportation systems are vulnerable to congestion, accidents, weather, special events, and other costly delays. Whereas typical policy responses prioritize reduction of delays under normal conditions to improve the efficiency of urban road systems, analytic support for investments that improve resilience (defined as system recovery from additional disruptions) is still scarce. In this effort, we represent paved roads as a transportation network by mapping intersections to nodes and road segments between the intersections to links. We built road networks for 40 of the urban areas defined by the U.S. Census Bureau. We developed and calibrated a model to evaluate traffic delays using link loads. The loads may be regarded as traffic-based centrality measures, estimating the number of individuals using corresponding road segments. Efficiency was estimated as the average annual delay per peak-period auto commuter, and modeled results were found to be close to observed data, with the notable exception of New York City. Resilience was estimated as the change in efficiency resulting from roadway disruptions and was found to vary between cities, with increased delays due to a 5% random loss of road linkages ranging from 9.5% in Los Angeles to 56.0% in San Francisco. The results demonstrate that many urban road systems that operate inefficiently under normal conditions are nevertheless resilient to disruption, whereas some more efficient cities are more fragile. The implication is that resilience, not just efficiency, should be considered explicitly in roadway project selection and should inform investments related to disasters and other disruptions.
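The resilience measure described here — the change in delay caused by a random loss of road links — can be sketched on a toy road graph, using mean shortest-path length as a stand-in for delay. This is a simplification: the paper calibrates delays from traffic-based link loads, not hop counts.

```python
import random
from collections import deque

def mean_path_length(adj):
    """Mean shortest-path length over all connected ordered pairs (BFS)."""
    total = pairs = 0
    for s in adj:
        dist = {s: 0}
        q = deque([s])
        while q:
            u = q.popleft()
            for v in adj[u]:
                if v not in dist:
                    dist[v] = dist[u] + 1
                    q.append(v)
        total += sum(dist.values())
        pairs += len(dist) - 1
    return total / pairs

def delay_increase(adj, frac=0.05, seed=0):
    """Percent increase in mean path length after removing a random
    fraction of road links -- a toy proxy for the resilience metric."""
    base = mean_path_length(adj)
    rng = random.Random(seed)
    edges = sorted({tuple(sorted((u, v))) for u in adj for v in adj[u]})
    drop = set(rng.sample(edges, max(1, int(frac * len(edges)))))
    kept = {u: [v for v in nbrs if tuple(sorted((u, v))) not in drop]
            for u, nbrs in adj.items()}
    return 100.0 * (mean_path_length(kept) - base) / base

# Toy "ring road" with 6 intersections; losing any one segment
# forces traffic the long way around:
ring = {i: [(i - 1) % 6, (i + 1) % 6] for i in range(6)}
```

Comparing this percentage across city graphs is the spirit of the 9.5% (Los Angeles) vs. 56.0% (San Francisco) contrast above.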

6.
Risk Anal ; 37(9): 1644-1651, 2017 09.
Article in English | MEDLINE | ID: mdl-27935146

ABSTRACT

Recent cyber attacks provide evidence of increased threats to our critical systems and infrastructure. A common reaction to a new threat is to harden the system by adding new rules and regulations. As federal and state governments request new procedures to follow, each of their organizations implements their own cyber defense strategies. This unintentionally increases time and effort that employees spend on training and policy implementation and decreases the time and latitude to perform critical job functions, thus raising overall levels of stress. People's performance under stress, coupled with an overabundance of information, results in even more vulnerabilities for adversaries to exploit. In this article, we embed a simple regulatory model that accounts for cybersecurity human factors and an organization's regulatory environment in a model of a corporate cyber network under attack. The resulting model demonstrates the effect of under- and overregulation on an organization's resilience with respect to insider threats. Currently, there is a tendency to use ad-hoc approaches to account for human factors rather than to incorporate them into cyber resilience modeling. It is clear that using a systematic approach utilizing behavioral science, which already exists in cyber resilience assessment, would provide a more holistic view for decisionmakers.

7.
Sci Rep ; 6: 19540, 2016 Jan 19.
Article in English | MEDLINE | ID: mdl-26782180

ABSTRACT

Building resilience into today's complex infrastructures is critical to the daily functioning of society and its ability to withstand and recover from natural disasters, epidemics, and cyber-threats. This study proposes quantitative measures that capture and implement the definition of engineering resilience advanced by the National Academy of Sciences. The approach is applicable across physical, information, and social domains. It evaluates the critical functionality, defined as a performance function of time set by the stakeholders. Critical functionality is a source of valuable information, such as the integrated system resilience over a time interval, and its robustness. The paper demonstrates the formulation on two classes of models: 1) multi-level directed acyclic graphs, and 2) interdependent coupled networks. For both models, synthetic case studies are used to explore trends. For the first class, the approach is also applied to the Linux operating system. Results indicate that desired resilience and robustness levels are achievable by trading off different design parameters, such as redundancy, node recovery time, and available backup supply. The nonlinear relationship between network parameters and resilience levels confirms the utility of the proposed approach, which is of benefit to analysts and designers of complex systems and networks.
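The "integrated system resilience over a time interval" can be read as the normalized area under the critical-functionality curve; a minimal sketch under that assumption (trapezoidal rule, hypothetical normalization by a target functionality level):

```python
def resilience(times, cf, target=1.0):
    """Integrated system resilience: area under the critical-functionality
    curve cf(t), normalized by the target functionality over the interval.
    One common reading of NAS-style resilience; trapezoidal integration."""
    area = sum((cf[i] + cf[i + 1]) / 2.0 * (times[i + 1] - times[i])
               for i in range(len(times) - 1))
    return area / (target * (times[-1] - times[0]))
```

A system that never degrades scores 1.0; a disruption followed by recovery scores less, with deeper or slower-recovering dips scoring lower.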


Subject(s)
Models, Theoretical , Software , Disasters , National Academy of Sciences, U.S. , United States
8.
Med Phys ; 37(9): 5037-43, 2010 Sep.
Article in English | MEDLINE | ID: mdl-20964223

ABSTRACT

PURPOSE: To minimize respiratory motion artifacts, this work proposes quiescent period gating (QPG) methods that extract PET data from the end-expiration quiescent period and form a single PET frame with reduced motion and improved signal-to-noise properties. METHODS: Two QPG methods are proposed and evaluated. Histogram-based quiescent period gating (H-QPG) extracts a fraction of PET data determined by a window of the respiratory displacement signal histogram. Cycle-based quiescent period gating (C-QPG) extracts data with a respiratory displacement signal below a specified threshold of the maximum amplitude of each individual respiratory cycle. The performance of both QPG methods was compared to that of ungated and five-bin phase-gated images across 21 FDG-PET/CT patient data sets containing 31 thorax and abdomen lesions, as well as with computer simulations driven by 1295 different patient respiratory traces. Image quality was evaluated in terms of the lesion SUV(max) and the fraction of counts included in each gate as a surrogate for image noise. RESULTS: For all the gating methods, image noise artifactually increases SUV(max) when the fraction of counts included in each gate is less than 50%. While simulation data show that H-QPG is superior to C-QPG, the H-QPG and C-QPG methods lead to similar quantification-noise tradeoffs in patient data. Compared to ungated images, both QPG methods yield significantly higher lesion SUV(max). Compared to five-bin phase gating, the QPG methods yield a significantly larger fraction of counts with similar SUV(max) improvement. Both QPG methods result in increased lesion SUV(max) for patients whose lesions have longer quiescent periods. CONCLUSIONS: Compared to ungated and phase-gated images, the QPG methods lead to images with less motion blurring and an improved compromise between SUV(max) and fraction of counts. The QPG methods for respiratory motion compensation could effectively improve tumor quantification with minimal noise increase.
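The H-QPG selection step can be sketched as follows: histogram the respiratory displacement signal, take the most populated bin as the quiescent (end-expiration) amplitude, and keep only samples within a window around it. The bin count and window width below are hypothetical, not the paper's settings.

```python
def hqpg_mask(displacement, n_bins=50, window_frac=0.3):
    """Histogram-based quiescent-period gating sketch: return a boolean
    mask keeping samples whose respiratory displacement lies in a window
    centered on the most populated histogram bin (the quiescent,
    end-expiration amplitude).  Window width = window_frac of the full
    displacement range.  Hypothetical parameters, for illustration."""
    lo, hi = min(displacement), max(displacement)
    width = (hi - lo) / n_bins or 1.0
    counts = [0] * n_bins
    for x in displacement:
        counts[min(int((x - lo) / width), n_bins - 1)] += 1
    mode_center = lo + (counts.index(max(counts)) + 0.5) * width
    half = window_frac * (hi - lo) / 2.0
    return [abs(x - mode_center) <= half for x in displacement]
```

Samples kept by the mask would be summed into a single low-motion PET frame; widening the window trades more counts (less noise) for more residual motion, which is the quantification-noise tradeoff studied in the paper.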


Subject(s)
Positron-Emission Tomography/methods , Respiratory-Gated Imaging Techniques/methods , Rest , Tomography, X-Ray Computed/methods , Aged , Humans , Middle Aged , Respiration , Retrospective Studies
9.
J Nucl Med ; 47(12): 1960-7, 2006 Dec.
Article in English | MEDLINE | ID: mdl-17138738

ABSTRACT

UNLABELLED: The National Electrical Manufacturers Association (NEMA) NU 2-2001 performance measurements were conducted on the Discovery RX, a whole-body PET/CT system under development by GE Healthcare. The PET scanner uses 4.2 x 6.3 x 30 mm lutetium yttrium orthosilicate (LYSO) crystals grouped in 9 x 6 blocks. There are 24 rings with 630 crystals per ring, and the ring diameter is 88.6 cm. The transaxial and axial fields of view are 70.0 and 15.7 cm, respectively. The scanner has retractable septa and can operate in both 2-dimensional (2D) and 3-dimensional (3D) modes. 2D acquisitions use ring differences of +/-4 for direct and +/-5 for cross slices; 3D acquisitions use a ring difference of 23. The coincidence window width is 6.5 ns and the energy window is 425-650 keV. Other than the detectors, the system uses the same hardware and software as a Discovery ST. The CT scanner is a 16-slice LightSpeed; the performance characteristics of the CT component are not included herein. METHODS: Performance measurements of sensitivity, spatial resolution, scatter fraction, counting rate performance, and image quality were obtained using NEMA methodology. RESULTS: The system sensitivity in 2D and 3D was measured as 1.7 cps/kBq and 7.3 cps/kBq, respectively. The transaxial resolution for 2D (3D) was 5.1 mm full width at half maximum (FWHM) (5.0 mm) at 1 cm from gantry center, and the radial and tangential resolutions were 5.9 mm (5.9 mm) and 5.1 mm (5.2 mm) at 10 cm, respectively. The axial resolution for 2D (3D) was 4.8 mm FWHM (5.8 mm) and 6.3 mm (6.5 mm) at 1 cm and 10 cm from gantry center, respectively. The scatter fraction was 13.1% and 31.8% in 2D and 3D, respectively. The peak noise equivalent count rate (NECR) was 155 kcps at 92.1 kBq/mL in 2D and 117.7 kcps at 21.7 kBq/mL in 3D for a noise-free estimation of randoms. The contrasts of the 22, 17, 13, and 10 mm hot spheres in the image quality phantom in 2D (3D) were 74.6% (72.4%), 56.7% (59.5%), 46.2% (44.6%), and 17.9% (18.0%), respectively. CONCLUSION: The Discovery RX is a scanner that possesses high NECR, low scatter fraction, and good spatial resolution characteristics.
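The noise-equivalent count rate quoted above follows the standard NEMA form NEC = T²/(T + S + kR), with k = 1 for a noise-free randoms estimate (as reported here) and k = 2 for online delayed-window randoms subtraction:

```python
def nec_rate(trues, scatters, randoms, k=1.0):
    """Noise-equivalent count rate, NEC = T^2 / (T + S + k*R).
    All inputs in counts per second; k = 1 assumes a noise-free randoms
    estimate, k = 2 online randoms subtraction."""
    return trues ** 2 / (trues + scatters + k * randoms)
```

Sweeping this over count rates measured at a range of activity concentrations yields the NEC curve whose maximum is the peak NECR reported for each acquisition mode.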


Subject(s)
Imaging, Three-Dimensional/instrumentation , Positron-Emission Tomography/instrumentation , Subtraction Technique/instrumentation , Tomography, X-Ray Computed/instrumentation , Equipment Design , Equipment Failure Analysis/standards , Guidelines as Topic , Imaging, Three-Dimensional/standards , Phantoms, Imaging , Positron-Emission Tomography/standards , Reproducibility of Results , Sensitivity and Specificity , Subtraction Technique/standards , Tomography, X-Ray Computed/standards , United States
10.
Article in English | MEDLINE | ID: mdl-22072860

ABSTRACT

We investigated the use of partial collimation on a clinical PET scanner by removing septa from conventional 2D collimators. The goal is to improve noise equivalent count-rates (NEC) compared to 2D and 3D scans for clinically relevant activity concentrations. We evaluated two cases: removing half of the septa (2.5D); and removing two-thirds of the septa (2.7D). System performance was first modeled using the SimSET simulation package, and then measured with the NEMA NU2-2001 count-rate cylinder (20 cm dia., 70 cm long), and 27 cm and 35 cm diameter cylinders of the same length. An image quality phantom was also imaged with the 2.7D collimator. SimSET predicted the relative NEC curves very well, as confirmed by measurements, with 2.5D and 2.7D NEC greater than 2D and 3D NEC in the range of ~5-20 mCi in the phantom. We successfully reconstructed images of the image quality phantom from measured 2.7D data using custom 2.7D normalization. Partial collimation shows promise for optimized clinical imaging in a fixed-collimator system.
