Results 1 - 4 of 4
1.
J Med Internet Res; 22(4): e13810, 2020 Apr 22.
Article in English | MEDLINE | ID: mdl-32319961

ABSTRACT

BACKGROUND: Several studies have shown that facial attention differs in children with autism. Measuring eye gaze and emotion recognition in children with autism is challenging, as standard clinical assessments must be delivered in clinical settings by a trained clinician. Wearable technologies may be able to bring eye gaze and emotion recognition into natural social interactions and settings. OBJECTIVE: This study aimed to test: (1) the feasibility of tracking gaze using wearable smart glasses during a facial expression recognition task and (2) the ability of these gaze-tracking data, together with facial expression recognition responses, to distinguish children with autism from neurotypical controls (NCs). METHODS: We compared the eye gaze and emotion recognition patterns of 16 children with autism spectrum disorder (ASD) and 17 children without ASD via wearable smart glasses fitted with a custom eye tracker. Children identified static facial expressions of images presented on a computer screen along with nonsocial distractors while wearing Google Glass and the eye tracker. Faces were presented in three trials, during one of which children received feedback in the form of the correct classification. We employed hybrid human-labeling and computer vision-enabled methods for pupil tracking and world-gaze translation calibration. We analyzed the impact of gaze and emotion recognition features in a prediction task aiming to distinguish children with ASD from NC participants. RESULTS: Gaze and emotion recognition patterns enabled the training of a classifier that distinguished ASD and NC groups. However, it was unable to significantly outperform other classifiers that used only age and gender features, suggesting that further work is necessary to disentangle these effects. 
CONCLUSIONS: Although wearable smart glasses show promise in identifying subtle differences in gaze tracking and emotion recognition patterns in children with and without ASD, the present form factor and data do not allow for these differences to be reliably exploited by machine learning systems. Resolving these challenges will be an important step toward continuous tracking of the ASD phenotype.


Subject(s)
Autism Spectrum Disorder/therapy, Emotions/physiology, Smart Glasses/standards, Wearable Electronic Devices/standards, Adolescent, Child, Female, Humans, Male, Phenotype
2.
J Psychiatr Res; 111: 140-144, 2019 Apr.
Article in English | MEDLINE | ID: mdl-30771619

ABSTRACT

Children with autism spectrum disorder (ASD) frequently exhibit language delays and functional communication deficits. Pivotal response treatment (PRT) is an effective intervention for targeting these skills; however, as with other behavioral interventions, response to PRT varies across individuals. Thus, objective markers capable of predicting treatment response are critically needed to identify which children are most likely to benefit from this intervention. In this pilot study, we investigated whether structural neuroimaging measures from language regions in the brain are associated with response to PRT. Children with ASD (n = 18) who were receiving PRT to target their language deficits were assessed with MRI at baseline. T1-weighted images were segmented with FreeSurfer, and morphometric measures of the primary language regions, the inferior frontal gyrus (IFG) and superior temporal gyrus (STG), were evaluated. Children with ASD and language deficits did not exhibit the anticipated relationships between baseline structural measures of language regions and baseline language abilities, as assessed by the number of utterances produced during a structured laboratory observation (SLO). Interestingly, the level of improvement on the SLO was correlated with baseline asymmetry of the IFG, and the size of the left STG at baseline was correlated with the level of improvement on standardized parental questionnaires. Although very preliminary, the observed associations between baseline structural properties of language regions and improvement in language abilities following PRT suggest that neuroimaging measures may help identify which children are most likely to benefit from specific language treatments, which could improve precision medicine for children with ASD.
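The baseline-asymmetry analysis above can be illustrated with a toy computation: a standard laterality (asymmetry) index, (L - R)/(L + R), correlated with language gains. This is a sketch of the general technique, not the authors' pipeline; the volumes and gain scores below are invented for illustration and bear no relation to the study's data.

```python
import math

def asymmetry_index(left, right):
    """Standard laterality index: (L - R) / (L + R).
    Positive values indicate leftward lateralization."""
    return (left - right) / (left + right)

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = math.sqrt(sum((x - mx) ** 2 for x in xs) *
                    sum((y - my) ** 2 for y in ys))
    return num / den

# Invented left/right IFG volumes (mm^3) and SLO utterance gains.
ifg = [(5200, 4800), (4700, 5100), (5500, 4900),
       (4600, 4600), (5100, 4500), (4800, 5000)]
gain = [14, 3, 18, 8, 16, 5]

ai = [asymmetry_index(left, right) for left, right in ifg]
r = pearson(ai, gain)
```

With these made-up numbers the leftward-lateralized children also show the larger gains, so `r` comes out strongly positive; real morphometric data would of course be noisier.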


Subject(s)
Autism Spectrum Disorder, Behavior Therapy/methods, Language Disorders, Language Therapy/methods, Outcome Assessment (Health Care), Prefrontal Cortex/diagnostic imaging, Temporal Lobe/diagnostic imaging, Autism Spectrum Disorder/complications, Autism Spectrum Disorder/diagnostic imaging, Autism Spectrum Disorder/therapy, Child, Child, Preschool, Female, Humans, Language Disorders/diagnostic imaging, Language Disorders/etiology, Language Disorders/therapy, Magnetic Resonance Imaging, Male, Neuroimaging, Pilot Projects
3.
Science; 363(6424), 2019 Jan 18.
Article in English | MEDLINE | ID: mdl-30545847

ABSTRACT

A wide range of human diseases result from haploinsufficiency, where the function of one of the two gene copies is lost. Here, we targeted the remaining functional copy of a haploinsufficient gene using CRISPR-mediated activation (CRISPRa) in Sim1 and Mc4r heterozygous mouse models to rescue their obesity phenotype. Transgenic-based CRISPRa targeting of the Sim1 promoter or its distant hypothalamic enhancer up-regulated its expression from the endogenous functional allele in a tissue-specific manner, rescuing the obesity phenotype in Sim1 heterozygous mice. To evaluate the therapeutic potential of CRISPRa, we injected CRISPRa-recombinant adeno-associated virus into the hypothalamus, which led to reversal of the obesity phenotype in Sim1 and Mc4r haploinsufficient mice. Our results suggest that endogenous gene up-regulation could be a potential strategy to treat altered gene dosage diseases.


Subject(s)
Clustered Regularly Interspaced Short Palindromic Repeats, Enhancer Elements, Genetic, Haploinsufficiency, Obesity/genetics, Promoter Regions, Genetic, Animals, Basic Helix-Loop-Helix Transcription Factors/genetics, Cell Line, Dependovirus, Disease Models, Animal, Female, Gene Expression Regulation, Gene Transfer Techniques, Heterozygote, Hypothalamus, Loss of Function Mutation, Male, Mice, Mice, Transgenic, Obesity/therapy, Phenotype, Receptor, Melanocortin, Type 4/genetics, Repressor Proteins/genetics, Up-Regulation, Weight Gain
4.
Appl Clin Inform; 9(1): 129-140, 2018 Jan.
Article in English | MEDLINE | ID: mdl-29466819

ABSTRACT

BACKGROUND: Recent advances in computer vision and wearable technology have created an opportunity to introduce mobile therapy systems for autism spectrum disorder (ASD) that can respond to the increasing demand for therapeutic interventions; however, feasibility questions must be answered first. OBJECTIVE: We studied the feasibility of a prototype therapeutic tool for children with ASD using Google Glass, examining whether children with ASD would wear such a device, whether providing the emotion classification would improve emotion recognition, and how emotion recognition differs between participants with ASD and neurotypical controls (NC). METHODS: We ran a controlled laboratory experiment with 43 children: 23 with ASD and 20 NC. Children labeled static facial images on a computer screen with one of 7 emotions in 3 successive batches: the first with no emotion information provided to the child, the second with the Glass providing the correct emotion label, and the third again without emotion information. We then trained a logistic regression classifier on the emotion confusion matrices generated by the two information-free batches to predict ASD versus NC. RESULTS: All 43 children were comfortable wearing the Glass. Participants with ASD and NC who completed the computer task with the Glass providing audible emotion labels (n = 33) showed increased accuracy in emotion labeling, and the logistic regression classifier achieved an accuracy of 72.7%. Further analysis suggests that the ability to recognize surprise, fear, and neutrality may distinguish ASD cases from NC. CONCLUSION: This feasibility study supports the utility of a wearable device for social affective learning in children with ASD and demonstrates subtle differences in how children with and without ASD perform on an emotion recognition task.
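The classification step described above can be sketched as follows. This is an illustrative reconstruction, not the authors' code: the confusion matrices, cohort sizes, and learning rate are all invented, so the study's reported 72.7% accuracy is not reproduced here.

```python
import math
import random

random.seed(0)

def make_confusion(diag_bias):
    """Synthetic 7x7 emotion-confusion matrix, flattened to 49 features.
    A larger diag_bias means more correct emotion classifications."""
    m = [[random.random() + (diag_bias if i == j else 0.0)
          for j in range(7)] for i in range(7)]
    total = sum(sum(row) for row in m)
    return [v / total for row in m for v in row]

# Invented cohorts: label 0 = neurotypical control, 1 = ASD,
# with ASD matrices carrying more off-diagonal (confused) mass.
X = ([make_confusion(3.0) for _ in range(20)] +
     [make_confusion(1.0) for _ in range(20)])
y = [0] * 20 + [1] * 20

# Plain stochastic-gradient logistic regression on the flattened matrices.
w, b = [0.0] * 49, 0.0
for _ in range(300):
    for xi, yi in zip(X, y):
        p = 1 / (1 + math.exp(-(sum(wi * v for wi, v in zip(w, xi)) + b)))
        g = p - yi                      # gradient of the log-loss
        w = [wi - g * v for wi, v in zip(w, xi)]
        b -= g

# Training accuracy on the synthetic data (easily separable by design).
accuracy = sum(
    (sum(wi * v for wi, v in zip(w, xi)) + b > 0) == bool(yi)
    for xi, yi in zip(X, y)
) / len(y)
```

In practice one would report held-out (cross-validated) accuracy rather than training accuracy, since 49 features against a few dozen children overfits readily.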


Subject(s)
Autistic Disorder/psychology, Behavior, Social Learning, Wearable Electronic Devices, Case-Control Studies, Child, Demography, Emotions, Feasibility Studies, Female, Humans, Logistic Models, Male, Models, Biological, Task Performance and Analysis