Results 1 - 2 of 2
1.
Front Psychol ; 14: 1161613, 2023.
Article in English | MEDLINE | ID: mdl-37384193

ABSTRACT

Brain activation during left- and right-hand motor imagery is a popular feature used for brain-computer interfaces. However, most studies so far have considered only right-handed participants. This study investigated how handedness influences brain activation during the imagination and execution of simple hand movements. EEG signals were recorded from 32 channels while participants repeatedly squeezed or imagined squeezing a ball with their left, right, or both hands. The data of 14 left-handed and 14 right-handed participants were analyzed with a focus on event-related desynchronization/synchronization (ERD/S) patterns. Both handedness groups showed activation over sensorimotor areas; however, the right-handed group tended to display more bilateral patterns than the left-handed group, in contrast to earlier findings. Furthermore, motor imagery elicited stronger activation than motor execution in both groups.
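The ERD/S analysis described above quantifies band-power changes relative to a pre-movement baseline. A minimal sketch of the classic band-power approach is shown below, assuming band-pass filtered single-channel trials and illustrative interval choices (a 0-1 s baseline and a 2-3 s activity window); the function name and synthetic data are hypothetical, not from the study.

```python
import numpy as np

def erd_percent(trials, fs, baseline=(0.0, 1.0), window=(2.0, 3.0)):
    """ERD% for one channel of band-pass filtered EEG.

    trials : array (n_trials, n_samples), each trial time-locked to the cue.
    ERD% = (A - R) / R * 100, where R is the mean band power in the
    baseline interval and A in the activity interval. Negative values
    indicate desynchronization (a power decrease during the task).
    """
    power = trials ** 2                        # instantaneous band power
    avg = power.mean(axis=0)                   # average across trials
    b0, b1 = (int(s * fs) for s in baseline)
    a0, a1 = (int(s * fs) for s in window)
    R = avg[b0:b1].mean()
    A = avg[a0:a1].mean()
    return (A - R) / R * 100.0

# Synthetic example: a 10 Hz (mu-band) rhythm whose amplitude halves
# after t = 2 s, mimicking desynchronization during imagery.
fs = 250
t = np.arange(0.0, 4.0, 1.0 / fs)
rng = np.random.default_rng(0)
trials = np.array([
    np.where(t < 2.0, 1.0, 0.5) * np.sin(2 * np.pi * 10 * t)
    + 0.05 * rng.standard_normal(t.size)
    for _ in range(20)
])

erd = erd_percent(trials, fs)
print(erd)  # strongly negative, i.e. desynchronization
```

Halving the oscillation amplitude quarters the band power, so the sketch yields an ERD of roughly -75%; real analyses would first band-pass the raw EEG around the individual mu/beta frequency.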

2.
Front Hum Neurosci ; 17: 1160800, 2023.
Article in English | MEDLINE | ID: mdl-37180552

ABSTRACT

Because of their accessibility and low cost, current brain-computer interfaces (BCI) used to detect subjective emotional and affective states rely largely on electroencephalographic (EEG) signals. Public datasets are available for researchers to design models for affect detection from EEG. However, few designs focus on optimally exploiting the nature of the stimulus elicitation to improve accuracy. In this experiment, a rapid serial visual presentation (RSVP) protocol was used to present human faces expressing emotion to 28 participants while EEG was recorded. We found that artificially enhanced human faces with exaggerated, cartoonish visual features significantly improve some commonly used neural correlates of emotion as measured by event-related potentials (ERPs). These images elicit an enhanced N170 component, well known to relate to facial visual encoding. Our findings suggest that studies of emotion elicitation could exploit consistent, high-detail, AI-generated stimulus transformations to study the characteristics of electrical brain activity related to visual affective stimuli. Furthermore, this result may be useful for affective BCI design, where higher accuracy in decoding affect from EEG can improve the user experience.
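The N170 effect reported above is measured from stimulus-locked ERPs: epochs are baseline-corrected, averaged across trials, and the component amplitude is read out in its typical latency window. A minimal sketch under these assumptions follows; the function name, the 130-200 ms search window, and the synthetic data are illustrative choices, not details from the paper.

```python
import numpy as np

def erp_n170(epochs, fs, baseline_s=0.1, search=(0.13, 0.20)):
    """Average stimulus-locked epochs and return the N170 amplitude.

    epochs : array (n_trials, n_samples); each epoch starts `baseline_s`
    seconds before stimulus onset. Epochs are baseline-corrected by
    subtracting the mean of the pre-stimulus interval, averaged across
    trials, and the N170 is taken as the most negative value in the
    130-200 ms post-stimulus window.
    """
    n_base = int(baseline_s * fs)
    corrected = epochs - epochs[:, :n_base].mean(axis=1, keepdims=True)
    erp = corrected.mean(axis=0)               # grand average across trials
    i0 = n_base + int(search[0] * fs)
    i1 = n_base + int(search[1] * fs)
    return erp[i0:i1].min()

# Synthetic example: a negative Gaussian deflection peaking ~170 ms
# after onset, buried in trial-by-trial noise.
fs = 500
t = np.arange(-0.1, 0.5, 1.0 / fs)             # epoch: -100..500 ms
rng = np.random.default_rng(1)
component = -4e-6 * np.exp(-((t - 0.17) ** 2) / (2 * 0.02 ** 2))
epochs = component + 1e-6 * rng.standard_normal((40, t.size))

n170 = erp_n170(epochs, fs)
print(n170)  # negative peak amplitude in volts
```

Averaging across 40 trials shrinks the noise by a factor of about sqrt(40), which is why the weak single-trial deflection becomes a clear negative peak in the average; the same logic underlies the ERP comparisons between stimulus conditions.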
