Results 1 - 3 of 3
1.
Med Image Anal; 91: 103027, 2024 Jan.
Article in English | MEDLINE | ID: mdl-37992494

ABSTRACT

Established surgical navigation systems for pedicle screw placement have proven to be accurate, but still reveal limitations in registration or surgical guidance. Registration of preoperative data to the intraoperative anatomy remains a time-consuming, error-prone task that includes exposure to harmful radiation. Surgical guidance through conventional displays has well-known drawbacks, as information cannot be presented in-situ and from the surgeon's perspective. Consequently, radiation-free and more automatic registration methods with subsequent surgeon-centric navigation feedback are desirable. In this work, we present a marker-less approach that automatically solves the registration problem for lumbar spinal fusion surgery in a radiation-free manner. A deep neural network was trained to segment the lumbar spine and simultaneously predict its orientation, yielding an initial pose for preoperative models, which is then refined for each vertebra individually and updated in real-time with GPU acceleration while handling surgeon occlusions. Intuitive surgical guidance is provided through integration into an augmented reality-based navigation system. The registration method was verified on a public dataset with a median of 100% successful registrations, a median target registration error of 2.7 mm, a median screw trajectory error of 1.6°, and a median screw entry point error of 2.3 mm. Additionally, the whole pipeline was validated in an ex-vivo surgery, yielding 100% screw accuracy and a median target registration error of 1.0 mm. Our results meet clinical demands and emphasize the potential of RGB-D data for fully automatic registration approaches in combination with augmented reality guidance.


Subject(s)
Pedicle Screws , Spinal Fusion , Surgery, Computer-Assisted , Humans , Spine/diagnostic imaging , Spine/surgery , Surgery, Computer-Assisted/methods , Lumbar Vertebrae/diagnostic imaging , Lumbar Vertebrae/surgery , Spinal Fusion/methods
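
The per-vertebra refinement step described in the abstract above can be illustrated with a small point-cloud alignment sketch. This is not the paper's method: the function names, the plain ICP loop with a percentile-based outlier cut, and the use of NumPy/SciPy are assumptions for illustration only; the actual pipeline seeds the pose from a deep network's segmentation and orientation estimate and refines it with GPU acceleration while explicitly handling surgeon occlusions.

# Illustrative sketch (not from the paper): refine one vertebra's pose by
# aligning its preoperative surface points to an intraoperative RGB-D point
# cloud with a basic ICP loop. All names are hypothetical.
import numpy as np
from scipy.spatial import cKDTree

def kabsch(src, dst):
    """Best-fit rotation R and translation t mapping src points onto dst."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, mu_d - R @ mu_s

def refine_vertebra_pose(model_pts, depth_pts, init_T=np.eye(4), iters=30):
    """Refine an initial 4x4 pose (e.g. from a network's orientation output)."""
    T = init_T.copy()
    tree = cKDTree(depth_pts)
    for _ in range(iters):
        moved = model_pts @ T[:3, :3].T + T[:3, 3]
        dist, idx = tree.query(moved)
        # Crude stand-in for occlusion handling: drop the worst correspondences.
        keep = dist < np.percentile(dist, 80)
        R, t = kabsch(moved[keep], depth_pts[idx[keep]])
        dT = np.eye(4)
        dT[:3, :3], dT[:3, 3] = R, t
        T = dT @ T
    return T

Given vertebra model points and a depth-camera point cloud in the same metric units, this returns an updated 4x4 pose; a real implementation would add convergence checks and run the loop on the GPU to stay real-time.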
2.
Allergol Int; 68(2): 254-258, 2019 Apr.
Article in English | MEDLINE | ID: mdl-30598404

ABSTRACT

BACKGROUND: Patch testing with contact allergens to diagnose allergic contact dermatitis (ACD) is a traditional and useful tool. The most important decision is the distinction between allergic and irritant reactions, as this has direct implications for diagnosis and management. Our objective was to evaluate a new method of non-contact infrared reading of patch tests. Secondary objectives included assessing a possible correlation between the intensity of the patch test reaction and the temperature change. METHODS: A total of 420 positive reactions from patients were included in our study. An independent patch test reader assessed the positive reactions and classified them as allergic (of intensity + to +++) or irritant (IR). At the same time, a forward-looking infrared (FLIR) camera attachment for an iPhone was used to acquire infrared thermal images of the patch tests, and images were analyzed using the FLIR ONE app. RESULTS: Allergic patch test reactions were characterized by a temperature increase of 0.72 ± 0.67 °C compared to surrounding skin. Irritant reactions resulted in a temperature increase of only 0.17 ± 0.31 °C. The mean temperature difference between the two groups was highly significant (p < 0.0001) and was therefore used to predict the type of contact dermatitis. CONCLUSIONS: Thermography is a reliable and effective way to distinguish between allergic and irritant contact dermatitis.


Subject(s)
Dermatitis, Allergic Contact/diagnosis , Dermatitis, Irritant/diagnosis , Patch Tests , Thermography , Adolescent , Adult , Aged , Aged, 80 and over , Allergens/administration & dosage , Female , Humans , Irritants/administration & dosage , Male , Middle Aged , Sensitivity and Specificity , Skin Temperature , Young Adult
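
The decision rule implied by the reported group means can be sketched as a simple threshold on the temperature difference between the reaction site and nearby reference skin in a thermal image. The function name, the mask-based region handling, and the 0.45 °C cutoff are assumptions for illustration (the cutoff sits roughly between the reported allergic and irritant means); the paper reports the group difference but does not prescribe this threshold.

# Illustrative sketch (not from the paper): threshold the local temperature
# rise to separate allergic from irritant patch-test reactions.
import numpy as np

def classify_reaction(thermal_degC, reaction_mask, reference_mask,
                      cutoff_degC=0.45):
    """thermal_degC: 2D array of temperatures in degrees Celsius.
    reaction_mask / reference_mask: boolean arrays marking the reaction
    site and nearby unaffected skin."""
    delta = (thermal_degC[reaction_mask].mean()
             - thermal_degC[reference_mask].mean())
    label = "allergic" if delta >= cutoff_degC else "irritant"
    return label, delta

In practice the masks would come from the acquired FLIR ONE image after registering it to the patch layout, and the cutoff would be chosen from a labelled calibration set rather than assumed.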
3.
IEEE Trans Vis Comput Graph; 23(11): 2455-2462, 2017 Nov.
Article in English | MEDLINE | ID: mdl-28809696

ABSTRACT

We present a real-time method for rendering novel virtual camera views from given RGB-D (color and depth) data of a different viewpoint. Missing color and depth information due to incomplete input or disocclusions is efficiently inpainted in a temporally consistent way. The inpainting takes the location of strong image gradients into account as likely depth discontinuities. We present our method in the context of a view correction system for mobile devices, and discuss how to obtain a screen-camera calibration and options for acquiring depth input. Our method has use cases in both augmented and virtual reality applications. We demonstrate the speed of our system and the visual quality of its results in multiple experiments in the paper as well as in the supplementary video.
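
The abstract above can be illustrated with a simplified view-correction sketch: forward-warp the RGB-D frame into the novel camera and fill the resulting disocclusion holes. Everything here is an assumption for illustration; it uses nearest-pixel splatting without z-buffering and OpenCV's generic Telea inpainting as a stand-in for the paper's temporally consistent, gradient-guided inpainting, and K and T_new_from_old are a hypothetical pinhole intrinsic matrix and relative pose.

# Illustrative sketch (not the paper's method): warp an RGB-D frame to a new
# viewpoint, then inpaint disoccluded pixels with a generic OpenCV routine.
import numpy as np
import cv2

def render_novel_view(rgb, depth, K, T_new_from_old):
    """rgb: HxWx3 uint8, depth: HxW in metres, K: 3x3 intrinsics,
    T_new_from_old: 4x4 pose of the old camera in the new camera's frame."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    # Back-project every pixel to 3D in the source camera frame.
    rays = np.linalg.inv(K) @ np.stack([u, v, np.ones_like(u)]).reshape(3, -1)
    pts = rays * depth.reshape(1, -1)
    # Move the points into the novel camera frame and project them.
    pts = T_new_from_old[:3, :3] @ pts + T_new_from_old[:3, 3:4]
    proj = K @ pts
    z = proj[2]
    valid = z > 1e-6
    x = np.round(proj[0, valid] / z[valid]).astype(int)
    y = np.round(proj[1, valid] / z[valid]).astype(int)
    inside = (x >= 0) & (x < w) & (y >= 0) & (y < h)
    out = np.zeros_like(rgb)
    # Nearest-pixel splatting, ignoring depth ordering for brevity.
    out[y[inside], x[inside]] = rgb.reshape(-1, 3)[valid][inside]
    # Pixels never written to are disocclusions; inpaint them.
    holes = (out.sum(axis=2) == 0).astype(np.uint8)
    return cv2.inpaint(out, holes, 3, cv2.INPAINT_TELEA)

The paper's gradient-aware inpainting would instead restrict filling along likely depth discontinuities and keep the result temporally consistent across frames.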
