Results 1 - 20 of 61
1.
Sci Rep ; 14(1): 17859, 2024 08 01.
Article in English | MEDLINE | ID: mdl-39090239

ABSTRACT

Recent research shows that emotional facial expressions impact behavioral responses only when their valence is relevant to the task. Under such conditions, threatening faces delay attentional disengagement, resulting in slower reaction times and increased omission errors compared to happy faces. To investigate the neural underpinnings of this phenomenon, we used functional magnetic resonance imaging to record the brain activity of 23 healthy participants while they completed two versions of the go/no-go task. In the emotion task (ET), participants responded to emotional expressions (fearful or happy faces) and refrained from responding to neutral faces. In the gender task (GT), the same images were displayed, but participants had to respond based on the posers' gender. Our results confirmed previous behavioral findings and revealed a network of brain regions (including the angular gyrus, the ventral precuneus, the left posterior cingulate cortex, the right anterior superior frontal gyrus, and two face-responsive regions) displaying distinct activation patterns for the same facial emotional expressions in the ET compared to the GT. We propose that this network integrates internal representations of task rules with sensory characteristics of facial expressions to evaluate emotional stimuli and exert top-down control, guiding goal-directed actions according to the context.


Subject(s)
Brain Mapping , Brain , Emotions , Facial Expression , Magnetic Resonance Imaging , Reaction Time , Humans , Male , Female , Emotions/physiology , Adult , Young Adult , Brain/physiology , Brain/diagnostic imaging , Reaction Time/physiology
2.
BMC Psychol ; 12(1): 459, 2024 Aug 29.
Article in English | MEDLINE | ID: mdl-39210484

ABSTRACT

BACKGROUND: Attentional processes are influenced by both stimulus characteristics and individual factors such as mood or personal experience. Research has suggested that attentional biases to socially relevant stimuli may occur in individuals with a history of peer victimization in childhood and adolescence. Based on this, the present study aimed to examine attentional processes in response to emotional faces at both the behavioral and neurophysiological levels in participants with experiences of peer victimization. METHODS: In a sample of 60 adult participants with varying severity of retrospectively reported peer victimization in childhood and adolescence, the dot-probe task was administered with angry, disgusted, sad, and happy facial expressions. In addition to behavioral responses, physiological responses (i.e., event-related potentials) were analyzed. RESULTS: Analyses of mean P100 and P200 amplitudes revealed altered P200 amplitudes in individuals with higher degrees of peer victimization. Higher levels of relational peer victimization were associated with increased P200 amplitudes in response to facial expressions, particularly angry and disgusted facial expressions. Hierarchical regression analyses showed no evidence for an influence of peer victimization experiences on reaction times or P100 amplitudes in response to the different emotions. CONCLUSION: Cortical findings suggest that individuals with higher levels of peer victimization mobilize more attentional resources when confronted with negative emotional social stimuli. Peer victimization experiences in childhood and adolescence appear to influence cortical processes into adulthood.
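To make the ERP measure concrete: mean component amplitudes such as the P100 and P200 are typically computed as the average voltage in a fixed post-stimulus window at selected electrodes. The sketch below is illustrative only; the sampling rate, epoch limits, and the 150-250 ms window are assumptions, not the study's actual parameters.

```python
import numpy as np

def mean_amplitude(epoch, sfreq, t_start, win_ms):
    """Mean voltage per channel in a post-stimulus window.

    epoch   : channels x samples array
    sfreq   : sampling rate in Hz
    t_start : time of the first sample relative to stimulus onset (s)
    win_ms  : (lo, hi) window edges in milliseconds
    """
    lo = int(round((win_ms[0] / 1000 - t_start) * sfreq))
    hi = int(round((win_ms[1] / 1000 - t_start) * sfreq))
    return epoch[:, lo:hi].mean(axis=1)

rng = np.random.default_rng(0)
epoch = rng.normal(0, 1, (32, 300))  # 32 channels, 600 ms epoch at 500 Hz
p200 = mean_amplitude(epoch, sfreq=500, t_start=-0.1, win_ms=(150, 250))
print(p200.shape)  # one mean amplitude per channel: (32,)
```

Group statistics such as the hierarchical regressions mentioned above would then be run on these per-participant, per-condition mean amplitudes.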


Subject(s)
Attention , Emotions , Evoked Potentials , Facial Expression , Humans , Male , Female , Evoked Potentials/physiology , Adult , Emotions/physiology , Young Adult , Attention/physiology , Electroencephalography , Peer Group , Bullying/psychology , Crime Victims/psychology , Facial Recognition/physiology , Retrospective Studies , Adolescent
3.
Article in English | MEDLINE | ID: mdl-38296873

ABSTRACT

Under natural viewing conditions, complex stimuli such as human faces are typically looked at several times in succession, implying that their recognition may unfold across multiple eye fixations. Although electrophysiological (EEG) experiments on face recognition typically prohibit eye movements, participants still execute frequent (micro)saccades on the face, each of which generates its own visuocortical response. This finding raises the question of whether the fixation-related potentials (FRPs) evoked by these tiny gaze shifts also contain psychologically valuable information about face processing. Here, we investigated this question by co-recording EEG and eye movements in an experiment with emotional faces (happy, angry, neutral). Deconvolution modeling was used to separate the stimulus ERPs to face onset from the FRPs generated by subsequent microsaccade-induced refixations on the face. As expected, stimulus ERPs exhibited typical emotion effects, with a larger early posterior negativity (EPN) for happy/angry compared with neutral faces. Eye tracking confirmed that participants made small saccades in 98% of the trials, which were often aimed at the left eye of the stimulus face. However, while each saccade produced a strong response over visual areas, this response was unaffected by the face's emotional expression, both for the first and for subsequent (micro)saccades. This finding suggests that the face's affective content is rapidly evaluated after stimulus onset, leading to only a short-lived sensory enhancement by arousing stimuli that does not repeat itself during immediate refixations. Methodologically, our work demonstrates how eye tracking and deconvolution modeling can be used to extract several brain responses from each EEG trial, providing insights into neural processing at different latencies after stimulus onset.
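The overlap-correction idea behind deconvolution modeling can be illustrated with a toy linear model: each event type gets a finite-impulse-response kernel, event onsets populate a design matrix, and least squares separates responses that overlap in time. Everything below (onsets, kernels, kernel length) is made up for illustration; it is not the authors' actual pipeline, which operated on real EEG with many covariates.

```python
import numpy as np

def deconvolve(signal, onsets_per_type, n_kernel):
    """Estimate one FIR response kernel per event type by least squares."""
    n = len(signal)
    cols = []
    for onsets in onsets_per_type:
        for lag in range(n_kernel):
            col = np.zeros(n)
            for t in onsets:          # put a 1 at each onset + lag
                if t + lag < n:
                    col[t + lag] = 1.0
            cols.append(col)
    X = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(X, signal, rcond=None)
    return beta.reshape(len(onsets_per_type), n_kernel)

# Two overlapping response kernels (stimulus-like and saccade-like):
k_stim = np.array([0., 1., 2., 1., 0.])
k_sacc = np.array([0., -1., -2., -1., 0.])
sig = np.zeros(200)
stim_onsets, sacc_onsets = [10, 60, 110], [12, 63, 114]
for t in stim_onsets:
    sig[t:t + 5] += k_stim
for t in sacc_onsets:
    sig[t:t + 5] += k_sacc

est = deconvolve(sig, [stim_onsets, sacc_onsets], n_kernel=5)
print(np.allclose(est[0], k_stim) and np.allclose(est[1], k_sacc))  # True
```

Because the stimulus-to-saccade latencies vary across trials (offsets of 2, 3, and 4 samples here), the design matrix is full rank and the two overlapping waveforms are recovered exactly in this noiseless toy case.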

4.
Cogn Emot ; 37(4): 835-851, 2023.
Article in English | MEDLINE | ID: mdl-37190958

ABSTRACT

According to the perceptual-attentional limitations hypothesis, the confusion between expressions of disgust and anger may be due to the difficulty in perceptually distinguishing the two, or to insufficient attention to their distinctive cues. The objective of the current study was to test this hypothesis as an explanation for the confusion between expressions of disgust and anger in adults using eye movements. In Experiment 1, participants were asked to identify each emotion in 96 trials composed of prototypes of anger and prototypes of disgust. In Experiment 2, fixation points oriented participants' attention toward the eyes, the nose, or the mouth of each prototype. Results revealed that disgust was less accurately recognised than anger (Experiments 1 and 2), especially when the mouth was open (Experiments 1 and 2), and even when attention was oriented toward the distinctive features of disgust (Experiment 2). Additionally, when attention was oriented toward certain zones, the eyes (which contain characteristics of anger) had the longest dwell times, followed by the nose (which contains characteristics of disgust; Experiment 2). Thus, although participants may attend to the distinguishing features of disgust and anger, these may not aid them in accurately recognising each prototype.


Subject(s)
Disgust , Adult , Humans , Anger , Emotions , Confusion , Face
5.
Front Psychol ; 14: 1127381, 2023.
Article in English | MEDLINE | ID: mdl-36949914

ABSTRACT

Introduction: Previous research has indicated altered attentional processing in individuals with experiences of maltreatment or victimization in childhood and adolescence. The present study examined the impact of child and adolescent experiences of relational peer victimization on attentional processes in adulthood when confronted with emotional facial expressions. Methods: As part of an online study, a community sample of adults completed a facial dot-probe task. In the present task, pictures of facial expressions displaying four different emotions (anger, disgust, happiness, and sadness) were used. Results: The results of the hierarchical regression analyses showed that retrospective reports of peer victimization made a significant contribution to the prediction of facilitated orienting processes for sad facial expressions. Experiences of emotional child maltreatment, on the other hand, made a significant contribution to the prediction of attentional biases for angry facial expressions. Discussion: Our results emphasize the relevance of experiences of emotional and relational maltreatment in childhood and in adolescence for the processing of social stimuli in adulthood. The findings regarding emotional child maltreatment are more indicative of attentional biases in the context of threat detection, whereas the altered attentional processes in peer victimization are more indicative of mood-congruent biases. These altered processes may be active in social situations and may therefore influence future social situations, behavior, feelings, and thus mental health.
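The bias scores that feed such regression analyses are conventionally computed as the difference between mean reaction times on incongruent trials (probe replaces the neutral face) and congruent trials (probe replaces the emotional face); positive values indicate facilitated orienting toward the emotion. A minimal sketch with invented reaction times:

```python
# Hypothetical dot-probe reaction times in milliseconds (illustrative only).
congruent_rts = [512, 498, 530, 505]    # probe at the emotional face's location
incongruent_rts = [540, 525, 551, 533]  # probe at the neutral face's location

def bias_index(congruent, incongruent):
    """Mean RT difference: positive = vigilance toward the emotional face."""
    return sum(incongruent) / len(incongruent) - sum(congruent) / len(congruent)

print(bias_index(congruent_rts, incongruent_rts))  # 26.0
```

One such index per participant and emotion category would then enter the hierarchical regression as the dependent variable.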

6.
Arch Suicide Res ; 27(3): 938-955, 2023.
Article in English | MEDLINE | ID: mdl-35787745

ABSTRACT

Individuals with suicidal behaviors pay more attention to negative signals than to positive ones. However, it is unclear whether this bias exists when suicide ideators perceive interpersonal stimuli (such as emotional faces), and the underlying neural mechanism of this attentional process remains unknown. The present study aimed to examine the attentional bias toward emotional facial expressions by employing event-related potentials in a population with suicide ideation. Twenty-five undergraduates with suicide ideation (SI group) and sixteen undergraduates without suicide ideation (NSI group) participated in a modified dot-probe task. Compared to the NSI group, the SI group exhibited: (1) a longer mean reaction time to fearful faces; (2) a larger N1 component to fearful faces; (3) a larger N1 component to the location of sad faces, as well as to the opposite location of fearful and happy faces; and (4) a larger N1 component to the contralateral location of happy faces, whereas the NSI group showed a larger N1 component to the ipsilateral location of happy faces. These results indicate that the SI group was more sensitive to negative emotions (fearful and sad faces) than to positive emotions (happy faces), and that negative interpersonal stimuli were processed by suicide ideators at an early attentional stage.


Subject(s)
Attentional Bias , Suicidal Ideation , Humans , Facial Expression , Emotions , Fear
7.
Heliyon ; 8(12): e11964, 2022 Dec.
Article in English | MEDLINE | ID: mdl-36561662

ABSTRACT

In this article, we tested the respective importance of low spatial frequencies (LSF) and high spatial frequencies (HSF) for conscious visual recognition of emotional stimuli using an attentional blink paradigm. Thirty-eight participants were asked to identify and report two targets (happy faces) embedded in a rapid serial visual presentation of distractors (angry faces). During the attentional blink, conscious perception of the second target (T2) is usually altered when the lag between the two targets is short (200-500 ms) but is restored at longer lags. The distractors between T1 and T2 were either non-filtered (broad spatial frequencies, BSF), low-pass filtered (LSF), or high-pass filtered (HSF). Assuming that prediction abilities could be at the root of conscious visual recognition, we expected that LSF distractors could result in a greater disturbance of T2 reporting than HSF distractors. Results showed that both LSF and HSF play a role in the emergence of exogenous consciousness in the visual system. Furthermore, HSF distractors strongly affected T1 and T2 reporting irrespective of the lag between targets, suggesting their role in facial emotion processing. We discuss these results with regard to other models of visual recognition.
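Spatial-frequency filtering of face stimuli of the kind described above is commonly implemented with a Gaussian filter in the Fourier domain. The sketch below is a generic illustration, not the authors' stimulus-preparation code; the cutoff value is arbitrary (published cutoffs are usually specified in cycles per image or per degree of visual angle).

```python
import numpy as np

def sf_filter(img, cutoff, mode="low"):
    """Low- or high-pass a grayscale image with a Gaussian Fourier filter."""
    f = np.fft.fftshift(np.fft.fft2(img))
    h, w = img.shape
    y, x = np.ogrid[:h, :w]
    dist2 = (y - h / 2) ** 2 + (x - w / 2) ** 2   # squared distance from DC
    gauss = np.exp(-dist2 / (2 * cutoff ** 2))    # low-pass transfer function
    mask = gauss if mode == "low" else 1.0 - gauss
    return np.real(np.fft.ifft2(np.fft.ifftshift(f * mask)))

img = np.random.default_rng(1).random((64, 64))  # stand-in for a face image
lsf = sf_filter(img, cutoff=4, mode="low")   # LSF version
hsf = sf_filter(img, cutoff=4, mode="high")  # HSF version
print(np.allclose(lsf + hsf, img))  # True: the two bands sum to the original
```

Using complementary low- and high-pass masks guarantees that the LSF and HSF versions partition the original image's frequency content, which keeps the two conditions comparable.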

8.
Policy Insights Behav Brain Sci ; 9(1): 137-144, 2022 Mar.
Article in English | MEDLINE | ID: mdl-36059861

ABSTRACT

Emotion understanding facilitates the development of healthy social interactions. To develop emotion knowledge, infants and young children must learn to make inferences about people's dynamically changing facial and vocal expressions in the context of their everyday lives. Given that emotional information varies so widely, the emotional input that children receive might particularly shape their emotion understanding over time. This review explores how variation in children's received emotional input shapes their emotion understanding and their emotional behavior over the course of development. Variation in emotional input from caregivers shapes individual differences in infants' emotion perception and understanding, as well as older children's emotional behavior. Finally, this work can inform policy and focus interventions designed to help infants and young children with social-emotional development.

9.
Article in English | MEDLINE | ID: mdl-36078304

ABSTRACT

Many studies have demonstrated that exposure to simulated natural scenes has positive effects on emotions and reduces stress. In the present study, we investigated emotional facial expressions while viewing images of various types of natural environments. Both automated facial expression analysis by iMotions' AFFDEX 8.1 software (iMotions, Copenhagen, Denmark) and self-reported emotions were analyzed. Attractive and unattractive natural images were used, representing either open or closed natural environments. The goal was to further understand the actual features and characteristics of natural scenes that could positively affect emotional states and to evaluate face reading technology to measure such effects. It was predicted that attractive natural scenes would evoke significantly higher levels of positive emotions than unattractive scenes. The results showed generally small values of emotional facial expressions while observing the images. The facial expression of joy was significantly higher than that of other registered emotions. Contrary to predictions, there was no difference between facial emotions while viewing attractive and unattractive scenes. However, the self-reported emotions evoked by the images showed significantly larger differences between specific categories of images in accordance with the predictions. The differences between the registered emotional facial expressions and self-reported emotions suggested that the participants more likely described images in terms of common stereotypes linked with the beauty of natural environments. This result might be an important finding for further methodological considerations.


Subject(s)
Emotions , Facial Expression , Emotions/physiology , Humans , Self Report
10.
Epilepsy Behav ; 134: 108821, 2022 09.
Article in English | MEDLINE | ID: mdl-35868157

ABSTRACT

Functional hemispherectomy results in good outcomes in cases of refractory epilepsy and constitutes a unique situation in which to study cerebral plasticity and the reorganization of lateralized brain functions, especially when surgery occurs in infancy or childhood. Previous studies have highlighted the remarkable ability of the brain to recover language after left hemispherectomy. This recovery entails a reorganization of language networks toward the right hemisphere, limiting the development of visuo-spatial abilities there, a phenomenon known as the crowding effect. Deficits in nonverbal functions have also been described as a more direct consequence of right hemispherectomy, but results from case studies have sometimes been contradictory. We conducted a group study to directly compare patients with left and right hemispherectomy and to address the effects of age at seizure onset and at surgery. We analyzed general visuo-spatial and visuo-perceptive abilities, including face and emotional facial expression processing, in a group of 40 patients aged 7-16 years with left (n = 24) or right (n = 16) functional hemispherectomy. Although the groups did not differ, on average, in general visuo-spatial and visuo-perceptive skills, patients with right hemispherectomy were more impaired in the processing of faces and emotional facial expressions than patients with left hemispherectomy. This may reflect a specific deficit in the perceptual processing of faces after right hemispherectomy. Results are discussed in terms of the limited plasticity of the left hemisphere for facial and configural processing.


Subject(s)
Drug Resistant Epilepsy , Hemispherectomy , Child , Functional Laterality , Humans , Language , Seizures
11.
Cogn Affect Behav Neurosci ; 22(6): 1404-1420, 2022 12.
Article in English | MEDLINE | ID: mdl-35761029

ABSTRACT

Daily life demands that we differentiate between a multitude of emotional facial expressions (EFEs). The mirror neuron system (MNS) is becoming increasingly implicated as a neural network involved with understanding emotional body expressions. However, the specificity of the MNS's involvement in emotion recognition has remained largely unexplored. This study investigated whether six basic dynamic EFEs (anger, disgust, fear, happiness, sadness, and surprise) would be differentiated through event-related desynchronisation (ERD) of sensorimotor alpha and beta oscillatory activity, which indexes sensorimotor MNS activity. We found that beta ERD differentiated happy, fearful, and sad dynamic EFEs at the central region of interest, but not at occipital regions. Happy EFEs elicited significantly greater central beta ERD relative to fearful and sad EFEs within 800 - 2,000 ms after EFE onset. These differences were source-localised to the primary somatosensory cortex, which suggests they are likely to reflect differential sensorimotor simulation rather than differential attentional engagement. Furthermore, individuals with higher trait anxiety showed less beta ERD differentiation between happy and sad faces. Similarly, individuals with higher trait autism showed less beta ERD differentiation between happy and fearful faces. These findings suggest that the differential simulation of specific affective states is attenuated in individuals with higher trait anxiety and autism. In summary, the MNS appears to support the skills needed for emotion processing in daily life, which may be influenced by certain individual differences. This provides novel evidence for the notion that simulation-based emotional skills may underlie the emotional difficulties that accompany affective disorders, such as anxiety.
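For reference, event-related desynchronisation is conventionally quantified as the percent change in band power relative to a pre-stimulus baseline, with negative values indicating desynchronisation. The power values below are invented for illustration, not taken from the study:

```python
def erd_percent(power, baseline_power):
    """Percent band-power change relative to baseline (negative = ERD)."""
    return 100.0 * (power - baseline_power) / baseline_power

# Hypothetical mean beta power (arbitrary units) in a post-onset window:
baseline = 10.0
happy, fearful, sad = 6.0, 7.5, 8.0
print([erd_percent(p, baseline) for p in (happy, fearful, sad)])
# [-40.0, -25.0, -20.0]  -> strongest desynchronisation for happy faces
```

Expressing power as a percent change normalizes away between-subject differences in absolute power, which is what makes condition contrasts such as happy vs. sad interpretable at the group level.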


Subject(s)
Autistic Disorder , Facial Expression , Humans , Emotions , Anxiety/psychology , Anxiety Disorders/psychology , Happiness
12.
An. psicol ; 38(2): 375-381, May 2022.
Article in Spanish | IBECS | ID: ibc-202898

ABSTRACT

Threatening expressions are detected quickly and accurately, warning the observer of the presence of a potential danger. During the detection process, a facial expression of surprise could play an important role as a cue for orientation in conditions of uncertainty that call for a swift and precise response. With a view to analysing this contingency, an experiment was conducted in which 70 subjects undertook a spatial cueing task that involved facial expressions of surprise (vs. neutral ones) as orientation cues, and facial expressions of fear, anger, happiness, and neutral ones as target stimuli. The results revealed a priming effect of the expression of surprise solely in the detection of the expression of anger, reducing response times and the percentage of errors. The data indicate that the expression of surprise, when processed as an independent stimulus, could prime the detection of those stimuli that constitute a direct threat, such as the expression of anger, with this being a crucial distinction for understanding the circumstances in which the expression of anger is detected more effectively than other kinds of expressions.


Subject(s)
Humans , Health Sciences , Emotions , Facial Expression , Anger , Fear
13.
Soc Cogn Affect Neurosci ; 17(6): 590-597, 2022 06 03.
Article in English | MEDLINE | ID: mdl-35077566

ABSTRACT

Costly punishment describes decisions of an interaction partner to punish an opponent for violating rules of fairness at the expense of personal costs. Here, we extend the interaction process by investigating the impact of a socio-emotional reaction of the opponent in response to the punishment that indicates whether punishment was successful or not. In a modified Ultimatum game, emotional facial expressions of the proposer in response to the decision of the responder served as feedback stimuli. We found that both honored reward following acceptance of an offer (smiling compared to neutral facial expression) and successful punishment (sad compared to neutral facial expression) elicited a reward positivity, indicating that punishment was the intended outcome. By comparing the pattern of results with a probabilistic learning task, we show that the reward positivity on sad facial expressions was specific for the context of costly punishment. Additionally, acceptance rates on a trial-by-trial basis were altered according to P3 amplitudes in response to the emotional facial reaction of the proposer. Our results are in line with the concept of costly punishment as an intentional act following norm-violating behavior. Socio-emotional stimuli have an important influence on the perception and behavior in economic bargaining.


Subject(s)
Decision Making , Punishment , Decision Making/physiology , Emotions/physiology , Facial Expression , Humans , Punishment/psychology , Reward
14.
Psychophysiology ; 59(1): e13945, 2022 01.
Article in English | MEDLINE | ID: mdl-34553782

ABSTRACT

Using still pictures of emotional facial expressions as experimental stimuli, reduced amygdala responses or impaired recognition of basic emotions were repeatedly found in people with psychopathic traits. The amygdala also plays an important role in short-latency facial mimicry responses. Since dynamic emotional facial expressions may have higher ecological validity than still pictures, we compared short-latency facial mimicry responses to dynamic and static emotional expressions between adolescents with psychopathic traits and normal controls. Facial EMG responses to videos or still pictures of emotional expressions (happiness, anger, sadness, fear) were measured. Responses to 500-ms dynamic expressions in videos, as well as the subsequent 1500-ms phase of maximal (i.e., static) expression, were compared between male adolescents with disruptive behavior disorders and high (n = 14) or low (n = 17) callous-unemotional (CU) traits, and normal control subjects (n = 32). Responses to still pictures were also compared between groups. EMG responses to dynamic expressions were generally significantly smaller in the high-CU group than in the other two groups, which generally did not differ. These group differences gradually emerged during the 500-ms stimulus presentation period but in general they were already seen a few hundred milliseconds after stimulus onset. Group differences were absent during the 1500-ms phase of maximal expression and during exposure to still pictures. Subnormal short-latency mimicry responses to dynamic emotional facial expressions in the high-CU group might have negative consequences for understanding emotional facial expressions of others during daily life when human facial interactions are primarily dynamic.


Subject(s)
Attention Deficit and Disruptive Behavior Disorders , Facial Expression , Facial Recognition/physiology , Recognition, Psychology/physiology , Adolescent , Emotions/physiology , Humans , Male , Reaction Time
15.
Cogn Sci ; 45(10): e13042, 2021 10.
Article in English | MEDLINE | ID: mdl-34606110

ABSTRACT

Previous studies have shown that the human visual system can detect a face and elicit a saccadic eye movement toward it very efficiently compared to other categories of visual stimuli. In the first experiment, we tested the influence of facial expressions on fast face detection using a saccadic choice task. Face-vehicle pairs were simultaneously presented and participants were asked to saccade toward the target (the face or the vehicle). We observed that saccades toward faces were initiated faster, and more often in the correct direction, than saccades toward vehicles, regardless of the facial expressions (happy, fearful, or neutral). We also observed that saccade endpoints on face images were lower when the face was happy and higher when it was neutral. In the second experiment, we explicitly tested the detection of facial expressions. We used a saccadic choice task with emotional-neutral pairs of faces and participants were asked to saccade toward the emotional (happy or fearful) or the neutral face. Participants were faster when they were asked to saccade toward the emotional face. They also made fewer errors, especially when the emotional face was happy. Using computational modeling, we showed that this happy face advantage can, at least partly, be explained by perceptual factors. Also, saccade endpoints were lower when the target was happy than when it was fearful. Overall, we suggest that there is no automatic prioritization of emotional faces, at least for saccades with short latencies, but that salient local face features can automatically attract attention.


Subject(s)
Emotions , Saccades , Attention , Facial Expression , Humans , Reaction Time
16.
Brain Sci ; 11(9)2021 Sep 13.
Article in English | MEDLINE | ID: mdl-34573224

ABSTRACT

The ability to rapidly process others' emotional signals is crucial for adaptive social interactions. However, to date it is still unclear how observing emotional facial expressions affects the reactivity of the human motor cortex. To provide insights on this issue, we employed single-pulse transcranial magnetic stimulation (TMS) to investigate corticospinal motor excitability. Healthy participants observed happy, fearful and neutral pictures of facial expressions while receiving TMS over the left or right motor cortex at 150 and 300 ms after picture onset. In the early phase (150 ms), we observed an enhancement of corticospinal excitability for the observation of happy and fearful emotional faces compared to neutral expressions specifically in the right hemisphere. Interindividual differences in the disposition to experience aversive feelings (personal distress) in interpersonal emotional contexts predicted the early increase in corticospinal excitability for emotional faces. No differences in corticospinal excitability were observed at the later time (300 ms) or in the left M1. These findings support the notion that emotion perception primes the body for action and highlights the role of the right hemisphere in implementing a rapid and transient facilitatory response to emotional arousing stimuli, such as emotional facial expressions.

17.
Handb Clin Neurol ; 183: 99-108, 2021.
Article in English | MEDLINE | ID: mdl-34389127

ABSTRACT

One of the most important means of communicating emotions is by facial expressions. About 30-40 years ago, several studies examined patients with right and left hemisphere strokes for deficits in expressing and comprehending emotional facial expressions. The participants with right- or left-hemispheric strokes attempted to determine if two different actors were displaying the same or different emotions, to name the different emotions being displayed, and to select the face displaying an emotion named by the examiner. Investigators found that the right hemisphere-damaged group was impaired on all these emotional facial tests and that this deficit was not solely related to visuoperceptual processing defects. Further studies revealed that the patients who were impaired at recognizing emotional facial expressions and who had lost these visual representations of emotional faces often had damage to their right parietal lobe and their right somatosensory cortex. Injury to the cerebellum has been reported to impair emotional facial recognition, as have dementing diseases such as Alzheimer's disease and frontotemporal dementia, movement disorders such as Parkinson's disease and Huntington's disease, traumatic brain injuries, and temporal lobe epilepsy. Patients with right hemisphere injury are also more impaired than left-hemisphere-damaged patients when attempting to voluntarily produce facial emotional expressions and in their spontaneous expression of emotions in response to stimuli. This impairment does not appear to be induced by emotional conceptual deficits or an inability to experience emotions. Many of the disorders that cause impairments of comprehension of affective facial expressions also impair facial emotional expression. Treating the underlying disease may help patients with impairments of facial emotion recognition and expression, but unfortunately, there have not been many studies of rehabilitation.


Subject(s)
Frontotemporal Dementia , Huntington Disease , Comprehension , Emotions , Facial Expression , Functional Laterality , Humans , Neuropsychological Tests
18.
Indian J Psychol Med ; 43(1): 51-57, 2021 Jan.
Article in English | MEDLINE | ID: mdl-34349307

ABSTRACT

BACKGROUND: Successful identification of emotional expression in patients is of considerable importance in the diagnosis of diseases and while developing rapport between physicians and patients. Despite the importance of such skills, this aspect remains grossly overlooked in conventional medical training in India. This study aims to explore the extent to which medical students can identify emotions by observing photographs of male and female subjects expressing different facial expressions. METHODS: A total of 106 medical students aged 18-25, without any diagnosed mental illnesses, were shown images of the six universal facial expressions (anger, sadness, fear, happiness, disgust, and surprise) at 100% intensity with an exposure time of 2 seconds for each image. The participants marked their responses after each image was shown. Collected data were analyzed using Statistical Package for the Social Sciences. RESULTS: Participants could identify 76.54% of the emotions on average, with higher accuracy for positive emotions (95.6% for happiness) and lower for negative emotions (46% for fear). There were no significant variations in identification with respect to sex of the observers. However, it was seen that participants could identify emotions better from male faces than those from female faces, a finding that was statistically significant. Negative emotions were identified more accurately from male faces, while positive emotions were identified better from female ones. CONCLUSIONS: Male participants identified emotions better from male faces, while females identified positive emotions better from female faces and negative ones from male faces.
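Per-emotion recognition accuracy of the kind reported above is simply the proportion of trials on which the chosen label matches the displayed emotion. A minimal sketch with invented (shown, chosen) response pairs, not the study's data:

```python
from collections import Counter

# Each pair is (emotion shown, emotion the participant chose); illustrative only.
responses = [
    ("happiness", "happiness"), ("happiness", "happiness"),
    ("fear", "surprise"), ("fear", "fear"),
    ("anger", "anger"), ("disgust", "anger"),
]

shown = Counter(s for s, _ in responses)                  # trials per emotion
correct = Counter(s for s, c in responses if s == c)      # correct per emotion
accuracy = {emo: 100.0 * correct[emo] / shown[emo] for emo in shown}
print(accuracy["happiness"], accuracy["fear"])  # 100.0 50.0
```

Tabulating the off-diagonal (shown, chosen) pairs the same way yields the confusion pattern, e.g. how often fear is mislabeled as surprise.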

19.
Front Psychiatry ; 12: 668019, 2021.
Article in English | MEDLINE | ID: mdl-34267686

ABSTRACT

Background: Alexithymia is characterized by difficulties identifying and describing one's emotions. Alexithymic individuals are impaired in recognizing others' emotional facial expressions, and alexithymia is quite common in patients suffering from major depressive disorder. The face-in-the-crowd task is a visual search paradigm that assesses the processing of multiple facial emotions. In the present eye-tracking study, the relationship between alexithymia and the visual processing of facial emotions was examined in clinical depression. Materials and Methods: Gaze behavior and manual response times of 20 alexithymic and 19 non-alexithymic depressed patients were compared in a face-in-the-crowd task. Alexithymia was measured via the 20-item Toronto Alexithymia Scale. Angry, happy, and neutral facial expressions of different individuals were shown as target and distractor stimuli. Our analyses of gaze behavior focused on latency to the target face, number of distractor faces fixated before fixating the target, number of target fixations, and number of distractor faces fixated after fixating the target. Results: Alexithymic patients exhibited generally slower decision latencies than non-alexithymic patients in the face-in-the-crowd task. The patient groups did not differ in latency to target, number of target fixations, or number of distractors fixated prior to target fixation. However, after having looked at the target, alexithymic patients fixated more distractors than non-alexithymic patients, regardless of expression condition. Discussion: According to our results, alexithymia is accompanied by impairments in the visual processing of multiple facial emotions in clinical depression. Alexithymia appears to be associated with delayed manual reaction times and prolonged scanning after the first target fixation in depression, but it might have no impact on the early search phase. The observed deficits could indicate difficulties in target identification and/or decision-making when processing multiple emotional facial expressions. The impairments of alexithymic depressed patients in processing emotions in crowds of faces do not seem limited to a specific affective valence. In group situations, alexithymic depressed patients might be slower than non-alexithymic depressed patients at processing interindividual differences in emotional expressions. This could represent a disadvantage in understanding non-verbal communication in groups.

20.
Front Psychol ; 12: 627561, 2021.
Article in English | MEDLINE | ID: mdl-34025503

ABSTRACT

Emotional facial expressions can inform researchers about an individual's emotional state. Recent technological advances open up new avenues to automatic Facial Expression Recognition (FER). Based on machine learning, such technology can tremendously increase the amount of processed data. FER is now easily accessible and has been validated for the classification of standardized prototypical facial expressions. However, its applicability to more naturalistic facial expressions remains uncertain. Hence, we test and compare the performance of three different FER systems (Azure Face API, Microsoft; Face++, Megvii Technology; FaceReader, Noldus Information Technology) with human emotion recognition (A) for standardized posed facial expressions (from prototypical inventories) and (B) for non-standardized acted facial expressions (extracted from emotional movie scenes). For the standardized images, all three systems classify basic emotions accurately (FaceReader is most accurate) and are mostly on par with human raters. For the non-standardized stimuli, performance drops markedly for all three systems, but Azure still performs similarly to humans. In addition, all systems, and humans alike, tend to misclassify some of the non-standardized emotional facial expressions as neutral. In sum, automated facial expression recognition can be an attractive alternative to human emotion recognition for both standardized and non-standardized emotional facial expressions. However, we also found limitations in accuracy for specific facial expressions; there is a clear need for thorough empirical evaluation to guide future developments in computer vision of emotional facial expressions.
