1.
J Sport Exerc Psychol; 42(1): 15-25, 2020 Feb 01.
Article in English | MEDLINE | ID: mdl-31883505

ABSTRACT

In 2 experiments, the authors investigated the effects of bimodal integration in a sport-specific task. Beach volleyball players were required to make a tactical decision, responding either verbally or via a motor response, after being presented with visual, auditory, or both kinds of stimuli in a beach volleyball scenario. In Experiment 1, players made the correct decision in a game situation more often when visual and auditory information were congruent than in trials in which they experienced only one of the modalities or incongruent information. Decision-making accuracy was greater when motor, rather than verbal, responses were given. Experiment 2 replicated this congruence effect using different stimulus material and showed a decreasing effect of visual stimulation on decision making as a function of shorter visual stimulus durations. In conclusion, this study shows that bimodal integration of congruent visual and auditory information results in more accurate decision making in sport than unimodal information.

2.
Ann Otol Rhinol Laryngol; 128(6_suppl): 139S-145S, 2019 Jun.
Article in English | MEDLINE | ID: mdl-31092038

ABSTRACT

OBJECTIVES: The present study investigated the effects of 3-dimensional deep search (3DDS) signal processing on the enhancement of consonant perception in bimodal and normal-hearing listeners. METHODS: Using an articulation-index gram and 3DDS signal processing, consonant segments that strongly affected performance were identified and intensified with a 6-dB gain. Consonant recognition was then measured unilaterally and bilaterally, before and after 3DDS processing, in both quiet and noise. RESULTS: The 3DDS signal processing benefited both groups, with greater benefit in noise than in quiet. The benefit rendered by 3DDS was greatest in the binaural listening condition. The ability to integrate acoustic features across ears was also enhanced with 3DDS processing. In listeners with normal hearing, manner and place of articulation improved in the binaural listening condition. In bimodal listeners, voicing, manner, and place of articulation also improved in the bimodal and hearing-aid-ear-alone conditions. CONCLUSIONS: Consonant recognition improved with 3DDS in both groups. This benefit suggests that 3DDS can be used as an auditory training tool to improve integration, particularly for bimodal users who receive little or no benefit from their current bimodal hearing.
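
The core manipulation described above, boosting the identified consonant segments by 6 dB, can be illustrated with a minimal sketch. The segment boundaries, signal, and function name below are hypothetical placeholders; the actual 3DDS identification step is not reproduced here.

import numpy as np

def apply_segment_gain(signal, segments, gain_db=6.0):
    """Boost selected time segments of a waveform by a fixed gain in dB.

    signal:   1-D array of audio samples
    segments: list of (start, end) sample indices marking consonant segments
    gain_db:  gain applied to those segments (6 dB, as in the study)
    """
    gain = 10.0 ** (gain_db / 20.0)  # 6 dB is roughly a factor of 2 in amplitude
    boosted = signal.copy()
    for start, end in segments:
        boosted[start:end] = boosted[start:end] * gain
    return boosted

# Hypothetical example: a 1-second signal at 16 kHz with two marked segments.
fs = 16000
signal = np.random.randn(fs) * 0.1
consonant_segments = [(2000, 3200), (9000, 10500)]  # placeholder boundaries
enhanced = apply_segment_gain(signal, consonant_segments)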


Subject(s)
Cochlear Implants, Hearing Aids, Hearing Loss/physiopathology, Hearing Loss/rehabilitation, Computer-Assisted Signal Processing, Speech Perception, Adult, Aged, Case-Control Studies, Female, Humans, Male, Middle Aged
3.
J Neurosci; 37(42): 10104-10113, 2017 Oct 18.
Article in English | MEDLINE | ID: mdl-28912157

ABSTRACT

Integrating inputs across sensory systems is a property of the brain that is vitally important in everyday life. More than two decades of fMRI research have revealed crucial insights into multisensory processing, yet the multisensory operations at the neuronal level in humans have remained largely unknown. Understanding the fine-scale spatial organization of multisensory brain regions is fundamental to shedding light on their neuronal operations. Monkey electrophysiology revealed that the bimodal superior temporal cortex (bSTC) is topographically organized according to the modality preference (visual, auditory, or bimodal) of its neurons. In line with invasive studies, a previous 3 Tesla fMRI study suggested that the human bSTC is also topographically organized according to modality preference when analyzed at 1.6 × 1.6 × 1.6 mm³ voxel resolution. However, it remained unclear whether this resolution can reveal an accurate spatial organization of the human bSTC. The present study addressed this issue by investigating the spatial organization of functional responses of the bSTC in 10 participants (of both sexes) at 1.5 × 1.5 × 1.5 mm³ and 1.1 × 1.1 × 1.1 mm³ using ultra-high-field fMRI at 7 Tesla. Relative to 1.5 × 1.5 × 1.5 mm³, the bSTC at 1.1 × 1.1 × 1.1 mm³ resolution showed larger selectivity for the visual and auditory modalities, stronger integrative responses in bimodal voxels, and organization into more distinct functional clusters, indicating a more precise separation of the underlying neuronal clusters. Our findings indicate that increasing the spatial resolution may be necessary and sufficient to achieve a more accurate functional topography of human multisensory integration.

SIGNIFICANCE STATEMENT: The bimodal superior temporal cortex (bSTC) is a brain region that plays a crucial role in the integration of visual and auditory inputs. The aim of the present study was to investigate the fine-scale spatial organization of the bSTC using ultra-high-field fMRI at 7 Tesla. Mapping the functional topography of the bSTC at a resolution of 1.1 × 1.1 × 1.1 mm³ revealed more accurate representations than at lower resolutions. This result indicates that standard-resolution fMRI may lead to incorrect conclusions about the functional organization of the bSTC, whereas high spatial resolution is essential to more accurately approach the neuronal operations of human multisensory integration.
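
As a back-of-the-envelope illustration (not a calculation from the paper), the step from 1.5 mm to 1.1 mm isotropic voxels shrinks the voxel volume by roughly a factor of 2.5, which is the sense in which the higher resolution can separate finer functional clusters:

# Voxel volumes for the isotropic resolutions mentioned above.
resolutions_mm = {"3 Tesla reference": 1.6, "7 Tesla standard": 1.5, "7 Tesla high-res": 1.1}

for label, edge in resolutions_mm.items():
    volume = edge ** 3
    print(f"{label}: {edge} mm isotropic -> {volume:.3f} mm^3 per voxel")

# Roughly how many 1.1 mm voxels fit into the volume of one 1.5 mm voxel:
print(f"Volume ratio 1.5^3 / 1.1^3 = {1.5 ** 3 / 1.1 ** 3:.2f}")  # about 2.54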


Subject(s)
Acoustic Stimulation/methods, Auditory Perception/physiology, Magnetic Resonance Imaging/methods, Photic Stimulation/methods, Temporal Lobe/physiology, Visual Perception/physiology, Adult, Brain Mapping/methods, Female, Humans, Male, Nerve Net/physiology, Random Allocation, Young Adult
4.
Int J Psychophysiol; 106: 14-20, 2016 Aug.
Article in English | MEDLINE | ID: mdl-27238075

ABSTRACT

Recent findings on audiovisual emotional interactions suggest that selective attention affects cross-sensory interaction from an early processing stage. However, the influence of attention manipulation on facial-vocal integration during the perception of emotional change remains unclear. To address this issue, we asked participants to detect emotional changes conveyed by prosody (vocal task) or facial expressions (facial task) while facial, vocal, and facial-vocal expressions were presented, and behavioral responses and the electroencephalogram (EEG) were recorded. Behavioral results showed that bimodal emotional changes were detected with shorter response latencies than in either unimodal condition, suggesting that bimodal emotional cues facilitated the detection of emotional change. Moreover, whereas the P3 amplitudes were larger for the bimodal change condition than for the sum of the two unimodal conditions regardless of attention direction, the N1 amplitudes were larger for the bimodal emotional change condition than for the sum of the two unimodal conditions under the attend-voice condition but not under the attend-face condition. These findings suggest that selective attention modulates facial-vocal integration during the perception of emotional change in early sensory processing, but not in later cognitive processing stages.


Subject(s)
Attention/physiology, Emotions/physiology, Evoked Potentials/physiology, Facial Recognition/physiology, Speech Perception/physiology, Adult, Electroencephalography, Facial Expression, Female, Humans, Male, Young Adult
5.
Soc Cogn Affect Neurosci; 11(7): 1152-61, 2016 Jul.
Article in English | MEDLINE | ID: mdl-26130820

ABSTRACT

The ability to detect emotional changes is of primary importance for social living. Though emotional signals are often conveyed by multiple modalities, how emotional changes in the vocal and facial modalities integrate into a unified percept has yet to be directly investigated. To address this issue, we asked participants to detect emotional changes delivered by facial, vocal, and facial-vocal expressions while behavioral responses and the electroencephalogram (EEG) were recorded. Behavioral results showed that bimodal emotional changes were detected with higher accuracy and shorter response latencies than in either unimodal condition. Moreover, the detection of emotional change, regardless of modality, was associated with enhanced amplitudes of the N2 and P3 components, as well as greater theta synchronization. More importantly, the P3 amplitudes and theta synchronization were larger for the bimodal emotional change condition than for the sum of the two unimodal conditions. These superadditive responses in P3 amplitude and theta synchronization were both positively correlated with the magnitude of the bimodal superadditivity in accuracy. Together, the behavioral and electrophysiological data illustrate an effect of audiovisual integration during the detection of emotional changes, most likely mediated by P3 activity and theta oscillations in brain responses.
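
The superadditivity criterion used here, a bimodal response exceeding the sum of the two unimodal responses, can be expressed as a small helper. The values and names below are invented for illustration and are not data from the study.

def superadditivity(bimodal, unimodal_a, unimodal_b):
    """Difference between the bimodal response and the sum of the unimodal responses.

    A positive value indicates a superadditive (integrative) response,
    e.g. a larger P3 amplitude or stronger theta synchronization for
    audiovisual emotional changes than the unimodal conditions predict.
    """
    return bimodal - (unimodal_a + unimodal_b)

# Hypothetical P3 amplitudes (in microvolts) for one participant:
p3_audiovisual, p3_face, p3_voice = 9.0, 4.2, 3.1
print(superadditivity(p3_audiovisual, p3_face, p3_voice))  # 1.7 -> superadditive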


Subject(s)
Cues (Psychology), Emotions, Facial Expression, Social Perception, Voice/physiology, Electroencephalography, Electroencephalography Phase Synchronization, Evoked Potentials/physiology, Female, Humans, Male, Psychomotor Performance/physiology, Reaction Time, Theta Rhythm/physiology, Young Adult