1.
Article in English | MEDLINE | ID: mdl-39229675

ABSTRACT

While artificial intelligence (AI) has made significant advances, its apparent lack of emotional ability has hindered effective communication with humans. This study explores how ChatGPT (ChatGPT-3.5, Mar 23, 2023 version) represents affective responses to emotional narratives and compares these responses to human responses. Thirty-four participants read affect-eliciting short stories and rated their emotional responses, and ten recorded ChatGPT sessions generated responses to the same stories. Classification analyses revealed successful identification of the affective categories of the stories, valence, and arousal within and across sessions for ChatGPT. Classification accuracies predicting the affective categories of the stories, valence, and arousal of humans from the affective ratings of ChatGPT, and vice versa, were not significant, indicating differences in the way the affective states were represented. These findings suggest that ChatGPT can distinguish emotional states and generate affective responses consistently, but that affective states are represented differently in ChatGPT and humans. Understanding these mechanisms is crucial for improving emotional interactions with AI.
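The within-session and cross-group (ChatGPT vs. human) decoding described in this abstract can be illustrated with a minimal sketch. This is not the study's code: the library choice (scikit-learn), the data shapes, and all variable names are assumptions, and the ratings below are random placeholders standing in for real valence/arousal data.

```python
# Hypothetical sketch of the cross-classification idea: train a classifier on one
# group's affective ratings and test on the other's. All data here are placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_stories, n_scales = 20, 2                    # assumed: valence and arousal ratings per story
story_category = np.repeat(np.arange(4), 5)    # assumed: 4 affective story categories, 5 stories each

human_ratings = rng.normal(size=(n_stories, n_scales))    # placeholder human ratings
chatgpt_ratings = rng.normal(size=(n_stories, n_scales))  # placeholder ChatGPT ratings

# Within-group decoding: can story category be predicted from ChatGPT's own ratings?
within_acc = cross_val_score(LogisticRegression(max_iter=1000),
                             chatgpt_ratings, story_category, cv=5).mean()

# Cross-group decoding: train on ChatGPT ratings, test on human ratings (and vice versa).
clf = LogisticRegression(max_iter=1000).fit(chatgpt_ratings, story_category)
cross_acc = clf.score(human_ratings, story_category)

print(f"within-ChatGPT accuracy: {within_acc:.2f}, ChatGPT->human accuracy: {cross_acc:.2f}")
```

In this scheme, high within-group accuracy with chance-level cross-group accuracy would correspond to the abstract's conclusion that ChatGPT's affective responses are internally consistent but organized differently from human responses.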

2.
Cognition; 249: 105830, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38810426

ABSTRACT

Prior studies have extensively examined the modality-general representation of affect across various sensory modalities, particularly focusing on auditory and visual stimuli. However, little research has explored the modality-general representation of affect between gustatory and other sensory modalities. This study investigated whether the affective responses induced by tastes and musical pieces could be predicted within and across modalities. For each modality, eight stimuli were chosen based on four basic taste conditions (sweet, bitter, sour, and salty). Participants rated their responses to each stimulus using both taste and emotion scales. Multivariate analyses, including multidimensional scaling and classification analysis, were performed. The findings revealed that auditory and gustatory stimuli in the sweet category were associated with positive valence, whereas those from the other taste categories were linked to negative valence. Additionally, auditory and gustatory stimuli in the sour taste category were linked to high arousal, whereas stimuli in the bitter taste category were associated with low arousal. This study revealed a potential mapping of gustatory and auditory stimuli onto core affect space in everyday experiences. Moreover, it demonstrated that emotions evoked by taste and music could be predicted across modalities, supporting a modality-general representation of affect.
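The two methods named in this abstract, multidimensional scaling and cross-modal classification, can be sketched as follows. This is not the authors' pipeline: the library (scikit-learn), the number of emotion scales, and all names are assumptions, and the ratings are random placeholders rather than study data.

```python
# Illustrative sketch of MDS on affect ratings plus cross-modal decoding
# (train on taste-evoked ratings, test on music-evoked ratings). Placeholder data only.
import numpy as np
from sklearn.manifold import MDS
from sklearn.svm import SVC

rng = np.random.default_rng(1)
n_stimuli, n_emotion_scales = 8, 10          # 8 stimuli per modality; scale count is assumed
taste_labels = np.repeat(np.arange(4), 2)    # sweet, bitter, sour, salty (2 stimuli each)

taste_ratings = rng.normal(size=(n_stimuli, n_emotion_scales))  # placeholder gustatory ratings
music_ratings = rng.normal(size=(n_stimuli, n_emotion_scales))  # placeholder auditory ratings

# Project all stimuli into a common 2-D space (interpretable as valence/arousal axes).
coords = MDS(n_components=2, random_state=0).fit_transform(
    np.vstack([taste_ratings, music_ratings]))

# Cross-modal decoding: a classifier trained on one modality scores the other.
clf = SVC(kernel="linear").fit(taste_ratings, taste_labels)
cross_modal_acc = clf.score(music_ratings, taste_labels)
print(f"taste->music classification accuracy: {cross_modal_acc:.2f}")
```

Above-chance cross-modal accuracy in such an analysis is what the abstract reports as evidence for a modality-general representation of affect.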


Subject(s)
Music, Taste Perception, Humans, Female, Male, Young Adult, Adult, Taste Perception/physiology, Affect/physiology, Auditory Perception/physiology, Taste/physiology, Acoustic Stimulation, Emotions/physiology, Arousal/physiology, Adolescent