Results 1 - 20 of 37
1.
Health Educ Behav ; : 10901981241267879, 2024 Aug 28.
Article in English | MEDLINE | ID: mdl-39199019

ABSTRACT

Although multiple states have approved the legal recreational use of cannabis, its expanding legalization has raised public health concerns in the United States. Young adults (18-25 years old) have the highest percentage of cannabis use disorder of all age groups. The purpose of this study is to compare the cognitive and emotional responses of young adults who use cannabis and non-users to two anti-cannabis media campaigns that employed different message strategies. In total, 50 people (25 who use cannabis and 25 non-users) participated in the study: a 2 (cannabis use status: people who currently use cannabis/non-users) × 2 (Public Service Advertising [PSA] campaign: Don't be a Lab Rat [informational]/Stoner Sloth [narrative]) × 3 (message replication) experiment. Participants viewed six messages based on the combinations of the three message replications within the two campaigns. Participants' facial emotional responses were recorded during message exposure, and self-report questions were asked after viewing each message. Self-report indices showed no differences between the two campaigns for participants who use cannabis or non-users. However, after controlling for individual differences, participants who use cannabis displayed more negative emotional responses to the Don't be a Lab Rat messages than to the Stoner Sloth messages. Conversely, they experienced more positive emotional responses to the Stoner Sloth messages than to the Don't be a Lab Rat messages. The study provides insights for message design in public health campaigns addressing cannabis use, suggesting that psychophysiological measures can reveal responses not detected by traditional self-report measures.

2.
J Oral Rehabil ; 51(8): 1373-1378, 2024 Aug.
Article in English | MEDLINE | ID: mdl-38661360

ABSTRACT

BACKGROUND: The Fijian 'Bula Smile' is often described as the world's friendliest; however, this description remains anecdotal. OBJECTIVE: This study aimed to describe and compare the dynamics of Fijians' smiles with those of New Zealand Europeans. METHODS: An observational study was conducted on two age- and gender-matched ethnic groups, Fijians (FJ; N = 23) and New Zealand Europeans (NZ; N = 23). All participants were asked to watch amusing videos, and their reactions were video recorded. The videos were analysed by software to assess the frequency, duration, intensity and genuineness of smiling episodes. Based on the Facial Action Coding System, Action Unit 6 (AU6, cheek raiser), Action Unit 12 (AU12, lip corner puller) and Action Unit 25 (AU25, lips apart) were assessed. Data were analysed by generalised linear models after adjusting for personality traits. RESULTS: Fijians smiled longer than New Zealand Europeans (+19.9%; p = .027). Mean intensities of AU6 (+1.0; 95% CIs = 0.6-1.5; p < .001), AU12 (+0.5; 95% CIs = 0.1-0.9; p = .008) and AU25 (+22.3%; 95% CIs = 7.3%-37.3%; p = .005) were significantly higher in the FJ group than in the NZ group. CONCLUSION: The smiling features of Fijians and New Zealanders showed objective differences, the most distinctive being a higher activation of the Duchenne marker (AU6), regarded as a sign of smile genuineness, in the Fijian group.


Subject(s)
Smiling , White People , Humans , Smiling/physiology , Female , Male , New Zealand , Adult , Video Recording , Young Adult , European People , Pacific Island People
3.
J Headache Pain ; 25(1): 33, 2024 Mar 11.
Article in English | MEDLINE | ID: mdl-38462615

ABSTRACT

BACKGROUND: The present study used the Facial Action Coding System (FACS) to analyse changes in facial activities in individuals with migraine during resting conditions to determine the potential of facial expressions to convey information about pain during headache episodes. METHODS: Facial activity was recorded in calm and resting conditions by using a camera for both healthy controls (HC) and patients with episodic migraine (EM) and chronic migraine (CM). The FACS was employed to analyse the collected facial images, and intensity scores for each of the 20 action units (AUs) representing expressions were generated. The groups and headache pain conditions were then examined for each AU. RESULTS: The study involved 304 participants: 46 HCs, 174 patients with EM, and 84 patients with CM. Elevated headache pain levels were associated with increased lid tightener activity and reduced mouth stretch. In the CM group, moderate to severe headache attacks exhibited decreased activation in the mouth stretch, alongside increased activation in the lid tightener, nose wrinkle, and cheek raiser, compared to mild headache attacks (all corrected p < 0.05). Notably, lid tightener activation was positively correlated with the Numeric Rating Scale (NRS) level of headache (p = 0.012). Moreover, the lip corner depressor was identified to be indicative of emotional depression severity (p < 0.001). CONCLUSION: Facial expressions, particularly lid tightener actions, served as inherent indicators of headache intensity in individuals with migraine, even during resting conditions. This indicates that the proposed approach holds promise for providing a subjective evaluation of headaches, offering the benefits of real-time assessment and convenience for patients with migraine.
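The reported link between lid tightener activation and NRS headache level is a plain bivariate correlation. As a minimal illustration (the per-patient values below are invented for the example, not the study's data), Pearson's r can be computed directly:

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical per-patient pairs: mean lid-tightener intensity vs. NRS score.
lid_tightener = [0.2, 0.5, 0.9, 1.4, 1.8, 2.3]
nrs = [1, 3, 4, 6, 7, 9]
r = pearson_r(lid_tightener, nrs)  # close to +1 for these made-up values
```

In the study itself the correlation would be computed over the real per-patient AU intensity scores rather than toy numbers.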


Subject(s)
Facial Expression , Migraine Disorders , Humans , Migraine Disorders/complications , Headache , Pain , Depression
4.
Int J Psychol ; 59(1): 104-110, 2024 Feb.
Article in English | MEDLINE | ID: mdl-37848345

ABSTRACT

We aimed to understand which factor plays a functional role in the size coding of responses: the size of the switches, or the force required to trigger each switch. This question is relevant because it allows a better understanding of the processes underlying action coding. In each trial, participants saw a small or large object and, depending on its colour, had to press one of two switches. In the "size" condition, the response device consisted of two switches of different visual size that required the same amount of force. In the "force-feedback" condition, the response device consisted of two switches of identical visual size, but one switch required more force than the other. We found a compatibility effect in the "size" condition but not in the "force-feedback" condition, suggesting that the size coding of responses is due to the size of the switches.


Subject(s)
Psychomotor Performance , Humans , Feedback , Psychomotor Performance/physiology
5.
Psych J ; 13(2): 322-334, 2024 Apr.
Article in English | MEDLINE | ID: mdl-38105597

ABSTRACT

Empathic concern and personal distress are common vicarious emotional responses that arise when witnessing someone else's pain. However, the influence of perceived similarity on these responses remains unclear. In this study, we examined how perceived similarity with an injured target impacts vicarious emotional responses. A total of 87 participants watched a video of an athlete in pain preceded by a clip describing the athlete's trajectory, which indicated either high, moderate, or low similarity to the participants. Emotional self-reports, facial expressions, gaze behavior, and pupil diameter were measured as indicators of the participants' emotional responses. Participants in the moderate- and high-similarity groups exhibited greater empathic concern, as evidenced by their display of more sadness compared with those in the low-similarity group. Furthermore, those in the moderate-similarity group exhibited less avoidance by displaying reduced disgust, indicating lower personal distress compared with those in the low-similarity condition. Nevertheless, the high-similarity group displayed just as much disgust as the low-similarity group. These findings suggest that perceived similarity enhances empathic concern to others' suffering, but that high similarity can also lead to personal distress. Future studies on empathy should explore distinct vicarious states using multimodal measurements to further advance our understanding of these processes.


Subject(s)
Emotions , Empathy , Humans , Emotions/physiology , Pain/psychology , Self Report , Facial Expression
6.
Sensors (Basel) ; 23(22), 2023 Nov 09.
Article in English | MEDLINE | ID: mdl-38005462

ABSTRACT

Although electromyography (EMG) remains the standard, researchers have begun using automated facial action coding system (FACS) software to evaluate spontaneous facial mimicry despite the lack of evidence of its validity. Using the facial EMG of the zygomaticus major (ZM) as a standard, we confirmed the detection of spontaneous facial mimicry in action unit 12 (AU12, lip corner puller) via an automated FACS. Participants were alternately presented with real-time model performance and prerecorded videos of dynamic facial expressions, while simultaneous ZM signal and frontal facial videos were acquired. Facial videos were estimated for AU12 using FaceReader, Py-Feat, and OpenFace. The automated FACS is less sensitive and less accurate than facial EMG, but AU12 mimicking responses were significantly correlated with ZM responses. All three software programs detected enhanced facial mimicry by live performances. The AU12 time series showed a roughly 100 to 300 ms latency relative to the ZM. Our results suggested that while the automated FACS could not replace facial EMG in mimicry detection, it may be adequate when the expected effects are large. Researchers should be cautious with the automated FACS outputs, especially when studying clinical populations. In addition, developers should consider the EMG validation of AU estimation as a benchmark.
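The 100-300 ms latency of the AU12 trace relative to the ZM signal is the kind of quantity one estimates by maximizing the cross-correlation between the two time series. A minimal sketch with synthetic pulse signals at an assumed 100 Hz sampling rate (the function and data are illustrative, not the paper's pipeline):

```python
def estimate_lag(reference, follower, max_lag):
    """Lag (in samples) at which `follower` best aligns with `reference`,
    found by maximizing the raw cross-correlation over lags 0..max_lag."""
    n = len(reference)
    scores = {
        lag: sum(reference[i] * follower[i + lag] for i in range(n - max_lag))
        for lag in range(max_lag + 1)
    }
    return max(scores, key=scores.get)

# Synthetic traces: the AU12 pulse is the ZM pulse delayed by 20 samples,
# i.e. 200 ms at 100 Hz, inside the reported 100-300 ms window.
zm = [0.0] * 50 + [1.0] * 10 + [0.0] * 60
au12 = [0.0] * 70 + [1.0] * 10 + [0.0] * 40
lag = estimate_lag(zm, au12, max_lag=40)
latency_ms = lag * 10  # 100 Hz -> 10 ms per sample
```

On real recordings one would normalize both signals and search negative lags as well, but the alignment idea is the same.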


Subject(s)
Face , Facial Expression , Humans , Facial Muscles/physiology , Electromyography/methods , Videotape Recording , Emotions/physiology
7.
Elife ; 12, 2023 Oct 03.
Article in English | MEDLINE | ID: mdl-37787008

ABSTRACT

The social complexity hypothesis for communicative complexity posits that animal societies with more complex social systems require more complex communication systems. We tested the social complexity hypothesis on three macaque species that vary in their degree of social tolerance and complexity. We coded facial behavior in >3000 social interactions across three social contexts (aggressive, submissive, affiliative) in 389 animals, using the Facial Action Coding System for macaques (MaqFACS). We quantified communicative complexity using three measures of uncertainty: entropy, specificity, and prediction error. We found that the relative entropy of facial behavior was higher for the more tolerant crested macaques as compared to the less tolerant Barbary and rhesus macaques across all social contexts, indicating that crested macaques more frequently use a higher diversity of facial behavior. The context specificity of facial behavior was higher in rhesus as compared to Barbary and crested macaques, demonstrating that Barbary and crested macaques used facial behavior more flexibly across different social contexts. Finally, a random forest classifier predicted social context from facial behavior with highest accuracy for rhesus and lowest for crested, indicating there is higher uncertainty and complexity in the facial behavior of crested macaques. Overall, our results support the social complexity hypothesis.
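The "relative entropy" used here is Shannon entropy of the observed facial-behavior distribution normalized by its maximum. A small sketch with two made-up AU-usage sequences (the species labels and counts are invented to show the contrast, not the study's data):

```python
import math
from collections import Counter

def relative_entropy(observations):
    """Shannon entropy of the observed distribution, normalized by the
    maximum possible entropy (log2 of the number of distinct behaviors).
    1.0 = all observed behaviors used equally often; near 0 = one dominates."""
    counts = Counter(observations)
    total = sum(counts.values())
    h = -sum((c / total) * math.log2(c / total) for c in counts.values())
    h_max = math.log2(len(counts))
    return h / h_max if h_max > 0 else 0.0

# Hypothetical AU sequences: a "tolerant" species spreads use evenly across
# AUs (higher diversity), a "despotic" one concentrates on a few.
tolerant = ["AU10", "AU12", "AU25", "AU26", "AU10", "AU12", "AU25", "AU26"]
despotic = ["AU10", "AU10", "AU10", "AU10", "AU10", "AU10", "AU12", "AU25"]
```

Under this measure the evenly spread repertoire scores 1.0 and the concentrated one scores lower, matching the paper's interpretation of higher entropy as more diverse facial behavior.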


Subject(s)
Face , Social Behavior , Animals , Macaca mulatta , Aggression , Communication , Behavior, Animal
8.
Cell Rep ; 42(9): 113091, 2023 09 26.
Article in English | MEDLINE | ID: mdl-37656619

ABSTRACT

Our natural behavioral repertoires include coordinated actions of characteristic types. To better understand how neural activity relates to the expression of actions and action switches, we studied macaques performing a freely moving foraging task in an open environment. We developed a novel analysis pipeline that can identify meaningful units of behavior, corresponding to recognizable actions such as sitting, walking, jumping, and climbing. On the basis of transition probabilities between these actions, we found that behavior is organized in a modular and hierarchical fashion. We found that, after regressing out many potential confounders, actions are associated with specific patterns of firing in each of six prefrontal brain regions and that, overall, encoding of action category is progressively stronger in more dorsal and more caudal prefrontal regions. Together, these results establish a link between selection of units of primate behavior on one hand and neuronal activity in prefrontal regions on the other.


Subject(s)
Macaca , Prefrontal Cortex , Animals , Prefrontal Cortex/physiology
9.
Front Psychol ; 14: 1138916, 2023.
Article in English | MEDLINE | ID: mdl-37179867

ABSTRACT

Introduction: As self-rating scales are prone to many measurement distortions, there is a growing call for more objective measures based on physiological or behavioural indicators. Self-criticism is one of the major transdiagnostic factors in mental disorders, so it is important to be able to distinguish the characteristic facial features of self-criticizing. To the best of our knowledge, there has been no automated facial emotion expression analysis of participants self-criticizing via the two-chair technique. The aim of this study was to detect which facial action units were significantly more often present in participants performing self-criticism using the two-chair technique. The broader goal was to contribute to the scientific knowledge on objective behavioural descriptions of self-criticism and to provide a diagnostic complement to existing self-rating scales by exploring facial behavioural markers of self-criticism. Methods: The non-clinical sample consisted of 80 participants (20 men and 60 women) aged 19 to 57 years (M = 23.86; SD = 5.98). We used iMotions' Affectiva AFFDEX module (Version 8.1) to classify the participants' action units in the self-criticizing videos, and a multilevel model for the statistical analysis to account for the repeated-measures design. Results: The significant results suggest that the self-critical facial expression may comprise the following action units: Dimpler, Lip Press, Eye Closure, Jaw Drop, and Outer Brow Raise, which are related to contempt, fear, and embarrassment or shame; and Eye Closure and Eye Widen (in rapid sequence, Blink), which signal that highly negative stimuli are being emotionally processed. Discussion: These findings need to be examined further in clinical samples.

10.
Behav Res Methods ; 55(3): 1024-1035, 2023 04.
Article in English | MEDLINE | ID: mdl-35538295

ABSTRACT

Automated detection of facial action units in infants is challenging. Infant faces have different proportions, less texture, fewer wrinkles and furrows, and unique facial actions relative to adults. For these and related reasons, action unit (AU) detectors that are trained on adult faces may generalize poorly to infant faces. To train and test AU detectors for infant faces, we trained convolutional neural networks (CNN) in adult video databases and fine-tuned these networks in two large, manually annotated, infant video databases that differ in context, head pose, illumination, video resolution, and infant age. AUs were those central to expression of positive and negative emotion. AU detectors trained in infants greatly outperformed ones trained previously in adults. Training AU detectors across infant databases afforded greater robustness to between-database differences than did training database specific AU detectors and outperformed previous state-of-the-art in infant AU detection. The resulting AU detection system, which we refer to as Infant AFAR (Automated Facial Action Recognition), is available to the research community for further testing and applications in infant emotion, social interaction, and related topics.


Subject(s)
Facial Expression , Facial Recognition , Humans , Infant , Neural Networks, Computer , Emotions , Social Interaction , Databases, Factual
11.
Front Artif Intell ; 5: 942248, 2022.
Article in English | MEDLINE | ID: mdl-36277167

ABSTRACT

Data from 255 Thais with chronic pain were collected at Chiang Mai Medical School Hospital. After the patients self-rated their level of pain, a smartphone camera was used to capture their faces for 10 s at a one-meter distance. For those unable to self-rate, a video recording was taken immediately after the movement that caused the pain, and a trained assistant rated each video clip using the Pain Assessment in Advanced Dementia (PAINAD) scale. Pain was classified into three levels: mild, moderate, and severe. OpenFace© was used to convert the video clips into 18 facial action units (FAUs). Six classification models were used: logistic regression, multilayer perceptron, naïve Bayes, decision tree, k-nearest neighbors (KNN), and support vector machine (SVM). Among the models that used only the FAUs described in the literature (FAUs 4, 6, 7, 9, 10, 25, 26, 27, and 45), the multilayer perceptron was the most accurate, at 50%. The SVM model using FAUs 1, 2, 4, 7, 9, 10, 12, 20, 25, and 45, plus gender, had the best accuracy, 58%, among the machine-learning feature selections. Our open-source pipeline for automatically analyzing video clips for FAUs is not yet robust enough to classify pain in the elderly. A consensus method for transforming facial-recognition algorithm outputs into values comparable to human ratings, along with international good practice for reciprocal data sharing, may improve the accuracy and feasibility of a machine-learning facial pain rater.
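All six models share the same shape: map a vector of FAU intensities to a pain class. A minimal nearest-neighbour sketch of that pipeline (the FAU vectors and labels below are toy values, not the study's data, and the study used scikit-learn-style models rather than this hand-rolled classifier):

```python
import math

def predict_1nn(train, query):
    """Classify `query` by the label of its nearest training vector
    (Euclidean distance). `train` is a list of (features, label) pairs."""
    features, label = min(train, key=lambda fl: math.dist(fl[0], query))
    return label

# Toy FAU-intensity vectors (e.g., AU4, AU7, AU9) labelled by pain level.
train = [
    ([0.1, 0.2, 0.0], "mild"),
    ([0.2, 0.1, 0.1], "mild"),
    ([0.8, 0.7, 0.5], "moderate"),
    ([1.6, 1.4, 1.2], "severe"),
]
label = predict_1nn(train, [1.5, 1.5, 1.1])
```

A real evaluation would add train/test splitting and the feature selection the abstract describes; this only shows the feature-vector-to-label step.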

12.
Sensors (Basel) ; 22(17), 2022 Aug 30.
Article in English | MEDLINE | ID: mdl-36080983

ABSTRACT

Physical exercise has become an essential tool for treating various non-communicable (chronic) diseases, since it can counter symptoms and reduce some mortality risk factors without medication. One way to support people in exercising is to use artificial systems that monitor their progress. While one crucial aspect is monitoring the correct physical motions for rehabilitative exercise, another essential element is giving encouraging feedback during workouts. A coaching system can track a user's exhaustion and give motivating feedback accordingly to boost exercise adherence. To this end, this research investigates whether the subjective exhaustion level can be predicted using non-invasive and non-wearable technology. A novel data set was recorded with facial video as the primary predictor and individual exhaustion levels as the predicted variable. Sixty participants (30 male, 30 female) took part in the data recording. Seventeen facial action units (AUs) were extracted as predictor variables for perceived subjective exhaustion, measured using the Borg scale. Several regression and classification methods were evaluated on these variables to predict exhaustion. The results showed that decision-tree and support-vector methods provide reasonable predictions. Limitations of the results, such as the dependence on participants appearing in the training data set and on subjective variables (e.g., participants smiling during the exercises), are also discussed.


Subject(s)
Exercise Therapy , Exercise , Exercise Therapy/methods , Feedback , Humans
13.
Subst Use Misuse ; 57(10): 1618-1625, 2022.
Article in English | MEDLINE | ID: mdl-35869663

ABSTRACT

Background: The goal of this study was to test the interactive effects of negative urgency, state negative affect, and alcohol intoxication on intimate partner aggression (IPA) perpetration. Methods: Heavy drinkers who recently perpetrated IPA completed self-report measures of impulsivity, were administered an alcohol or control beverage, and completed a laboratory aggression task. State negative affect was assessed unobtrusively via the Facial Action Coding System. Results: Consistent with our prediction, negative urgency was significantly and positively related to IPA when state negative affect was also high, but this relation was not significant when state negative affect was low. Conclusions: These results have implications for understanding the role of negative affect and impulsivity in IPA perpetration and for understanding trait models of impulsivity in general.


Subject(s)
Alcoholic Intoxication , Intimate Partner Violence , Aggression/psychology , Alcohol Drinking/psychology , Alcoholic Intoxication/psychology , Ethanol , Humans , Impulsive Behavior , Intimate Partner Violence/psychology , Sexual Partners
14.
Sensors (Basel) ; 22(4)2022 Feb 10.
Sensors (Basel) ; 22(4), 2022 Feb 10.
Article in English | MEDLINE | ID: mdl-35214255

ABSTRACT

Parkinson's disease (PD) is a neurological disorder that mainly affects the motor system. Among other symptoms, hypomimia is considered one of the clinical hallmarks of the disease. Despite its great impact on patients' quality of life, it remains under-investigated. The aim of this work is to provide a quantitative index for hypomimia that can distinguish pathological from healthy subjects and that can be used in the classification of emotions. A face-tracking algorithm was implemented based on the Facial Action Coding System. A new easy-to-interpret metric (face mobility index, FMI) was defined from distances between pairs of geometric features, and a classification based on this metric was proposed. A comparison between healthy controls and PD patients was also provided. The results suggest that this index can quantify the degree of impairment in PD and can be used in the classification of emotions. Statistically significant differences were observed for all emotions when distances were taken into account, and for happiness and anger when FMI was considered. The best classification results were obtained with Random Forest and kNN according to the AUC metric.
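The abstract defines the FMI only as a metric over distances between pairs of geometric features; the paper's exact formula is not reproduced here. One plausible sketch of such an index, under the assumption that it measures how much landmark-pair distances change relative to a rest frame (landmark names and coordinates are invented):

```python
import math

def face_mobility_index(frames, pairs):
    """Mean absolute change, across frames, of the distances between the
    given landmark pairs, relative to the first (rest) frame.
    Each frame maps a landmark name to an (x, y) coordinate."""
    rest = frames[0]
    rest_dist = {(a, b): math.dist(rest[a], rest[b]) for (a, b) in pairs}
    deltas = [
        abs(math.dist(frame[a], frame[b]) - rest_dist[(a, b)])
        for frame in frames[1:]
        for (a, b) in pairs
    ]
    return sum(deltas) / len(deltas)

# Two toy frames: the lip corners move apart in the second frame (a smile),
# so the index is positive; identical frames would give 0.0, i.e. hypomimia
# would show up as values near zero.
frames = [
    {"lip_l": (0.0, 0.0), "lip_r": (4.0, 0.0)},
    {"lip_l": (-0.5, 0.2), "lip_r": (4.5, 0.2)},
]
fmi = face_mobility_index(frames, [("lip_l", "lip_r")])
```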


Subject(s)
Parkinson Disease , Emotions , Face , Facial Expression , Humans , Parkinson Disease/diagnosis , Quality of Life
15.
Behav Res Methods ; 54(4): 1912-1927, 2022 08.
Article in English | MEDLINE | ID: mdl-34755285

ABSTRACT

Understanding facial signals in humans and other species is crucial for understanding the evolution, complexity, and function of the face as a communication tool. The Facial Action Coding System (FACS) enables researchers to measure facial movements accurately, but we currently lack tools to reliably analyse data and efficiently communicate results. Network analysis can provide a way to use the information encoded in FACS datasets: by treating individual AUs (the smallest units of facial movements) as nodes in a network and their co-occurrence as connections, we can analyse and visualise differences in the use of combinations of AUs in different conditions. Here, we present 'NetFACS', a statistical package that uses occurrence probabilities and resampling methods to answer questions about the use of AUs, AU combinations, and the facial communication system as a whole in humans and non-human animals. Using highly stereotyped facial signals as an example, we illustrate some of the current functionalities of NetFACS. We show that very few AUs are specific to certain stereotypical contexts; that AUs are not used independently from each other; that graph-level properties of stereotypical signals differ; and that clusters of AUs allow us to reconstruct facial signals, even when blind to the underlying conditions. The flexibility and widespread use of network analysis allows us to move away from studying facial signals as stereotyped expressions, and towards a dynamic and differentiated approach to facial communication.
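The core data structure NetFACS builds on, AUs as nodes and co-occurrence as weighted edges, can be sketched in a few lines (the frames below are made up; NetFACS itself is an R package with resampling on top of this):

```python
from collections import Counter
from itertools import combinations

def au_cooccurrence(frames):
    """Count, over all frames, how often each unordered pair of AUs is
    active together. Each frame is a set of active AU names; the result
    maps sorted AU pairs (edges) to co-occurrence counts (edge weights)."""
    edges = Counter()
    for frame in frames:
        edges.update(combinations(sorted(frame), 2))
    return edges

# Made-up frames: AU6+AU12 (a Duchenne smile) co-occur in 3 of 4 frames,
# so that edge carries the highest weight in the network.
frames = [{"AU6", "AU12"}, {"AU6", "AU12"}, {"AU6", "AU12", "AU25"}, {"AU4"}]
edges = au_cooccurrence(frames)
```

From such an edge table one can compare observed co-occurrence against a resampled baseline, which is the kind of question NetFACS answers.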


Subject(s)
Facial Expression , Movement , Animals , Humans
16.
Front Robot AI ; 8: 699090, 2021.
Article in English | MEDLINE | ID: mdl-34869609

ABSTRACT

In this paper, we present a study aimed at understanding whether the embodiment and humanlikeness of an artificial agent can affect people's spontaneous and instructed mimicry of its facial expressions. The study followed a mixed experimental design and revolved around an emotion recognition task. Participants were randomly assigned to one level of humanlikeness (between-subject variable: humanlike, characterlike, or morph facial texture of the artificial agents) and observed the facial expressions displayed by three artificial agents differing in embodiment (within-subject variable: video-recorded robot, physical robot, and virtual agent) and a human (control). To study both spontaneous and instructed facial mimicry, we divided the experimental sessions into two phases. In the first phase, we asked participants to observe and recognize the emotions displayed by the agents. In the second phase, we asked them to look at the agents' facial expressions, replicate their dynamics as closely as possible, and then identify the observed emotions. In both cases, we assessed participants' facial expressions with an automated Action Unit (AU) intensity detector. Contrary to our hypotheses, our results disclose that the agent that was perceived as the least uncanny, and most anthropomorphic, likable, and co-present, was the one spontaneously mimicked the least. Moreover, they show that instructed facial mimicry negatively predicts spontaneous facial mimicry. Further exploratory analyses revealed that spontaneous facial mimicry appeared when participants were less certain of the emotion they recognized. Hence, we postulate that an emotion recognition goal can flip the social value of facial mimicry as it transforms a likable artificial agent into a distractor. Further work is needed to corroborate this hypothesis. Nevertheless, our findings shed light on the functioning of human-agent and human-robot mimicry in emotion recognition tasks and help us to unravel the relationship between facial mimicry, liking, and rapport.

17.
Front Psychol ; 12: 608483, 2021.
Article in English | MEDLINE | ID: mdl-34149502

ABSTRACT

The 2016 United States presidential election was exceptional for many reasons, most notably the extreme division between supporters of Donald Trump and Hillary Clinton. In an election that turned more upon the character traits of the candidates than their policy positions, there is reason to believe that the non-verbal performances of the candidates influenced attitudes toward them. Two studies, conducted before Election Day, experimentally tested the influence of Trump's micro-expressions of fear during his Republican National Convention nomination acceptance speech on how viewers evaluated his key leadership traits of competence and trustworthiness. Results from Study 1, conducted 3 weeks prior to the election, indicated generally positive effects of Trump's fear micro-expressions on his trait evaluations, particularly when viewers were first exposed to his opponent, Clinton. In contrast, Study 2, conducted 4 days before Election Day, suggests participants had by that point largely established their trait perceptions and were unaffected by the micro-expressions.

18.
Front Psychol ; 12: 605928, 2021.
Article in English | MEDLINE | ID: mdl-33716870

ABSTRACT

Emoji faces, which are ubiquitous in our everyday communication, are thought to resemble human faces and aid emotional communication. Yet, few studies examine whether emojis are perceived as a particular emotion and whether that perception changes based on rendering differences across electronic platforms. The current paper draws upon emotion theory to evaluate whether emoji faces depict anatomical differences that are proposed to differentiate human depictions of emotion (hereafter, "facial expressions"). We modified the existing Facial Action Coding System (FACS) (Ekman and Rosenberg, 1997) to apply to emoji faces. An equivalent "emoji FACS" rubric allowed us to evaluate two important questions: First, does the same emoji face anatomically "look" the same across platforms and versions? Second, do emoji faces perceived as a particular emotion category resemble the proposed human facial expression for that emotion? To answer these questions, we compared the anatomically based codes for 31 emoji faces across three platforms and two version updates. We then compared those codes to the proposed human facial expression prototype for the emotion perceived within the emoji face. Overall, emoji faces across platforms and versions were not anatomically equivalent. Moreover, the majority of emoji faces did not conform to human facial expressions for an emotion, although the basic anatomical codes were shared among human and emoji faces. Some emotion categories were better predicted by the assortment of anatomical codes than others, with some individual differences among platforms. We discuss theories of emotion that help explain how emoji faces are perceived as an emotion, even when anatomical differences are not always consistent or specific to an emotion.

19.
Front Psychol ; 12: 800657, 2021.
Article in English | MEDLINE | ID: mdl-35185697

ABSTRACT

Android robots capable of emotional interactions with humans have considerable potential for application to research. While several studies developed androids that can exhibit human-like emotional facial expressions, few have empirically validated androids' facial expressions. To investigate this issue, we developed an android head called Nikola based on human psychology and conducted three studies to test the validity of its facial expressions. In Study 1, Nikola produced single facial actions, which were evaluated in accordance with the Facial Action Coding System. The results showed that 17 action units were appropriately produced. In Study 2, Nikola produced the prototypical facial expressions for six basic emotions (anger, disgust, fear, happiness, sadness, and surprise), and naïve participants labeled photographs of the expressions. The recognition accuracy of all emotions was higher than chance level. In Study 3, Nikola produced dynamic facial expressions for six basic emotions at four different speeds, and naïve participants evaluated the naturalness of the speed of each expression. The effect of speed differed across emotions, as in previous studies of human expressions. These data validate the spatial and temporal patterns of Nikola's emotional facial expressions, and suggest that it may be useful for future psychological studies and real-life applications.

20.
Am J Phys Anthropol ; 173(3): 411-422, 2020 11.
Article in English | MEDLINE | ID: mdl-32820559

ABSTRACT

OBJECTIVES: While it has been demonstrated that even subtle variation in human facial expressions can lead to significant changes in the meaning and function of expressions, relatively few studies have examined primate facial expressions using similarly objective and rigorous analysis. Construction of primate facial expression repertoires may, therefore, be oversimplified, with expressions often arbitrarily pooled and/or split into subjective pigeonholes. Our objective is to assess whether subtle variation in primate facial expressions is linked to variation in function, and hence to inform future attempts to quantify complexity of facial communication. MATERIALS AND METHODS: We used Macaque Facial Action Coding System, an anatomically based and hence more objective tool, to quantify "silent bared-teeth" (SBT) expressions produced by wild crested macaques engaging in spontaneous behavior, and utilized discriminant analysis and bootstrapping analysis to look for morphological differences between SBT produced in four different contexts, defined by the outcome of interactions: Affiliation, Copulation, Play, and Submission. RESULTS: We found that SBT produced in these contexts could be distinguished at significantly above-chance rates, indicating that the expressions produced in these four contexts differ morphologically. We identified the specific facial movements that were typically used in each context, and found that the variability and intensity of facial movements also varied between contexts. DISCUSSION: These results indicate that nonhuman primate facial expressions share the human characteristic of exhibiting meaningful subtle differences. Complexity of facial communication may not be accurately represented simply by building repertoires of distinct expressions, so further work should attempt to take this subtle variability into account.
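The claim that contexts can be discriminated "at significantly above-chance rates" is typically backed by a resampling test like the bootstrapping used here. A simpler cousin, a label-permutation test, makes the logic concrete: shuffle the predicted context labels many times and ask how often chance matches the observed accuracy (the classifier output below is invented for illustration):

```python
import random

def permutation_p_value(y_true, y_pred, n_perm=2000, seed=0):
    """One-sided p-value: fraction of label shuffles whose accuracy is at
    least as high as the observed accuracy of the predictions."""
    rng = random.Random(seed)
    n = len(y_true)
    observed = sum(t == p for t, p in zip(y_true, y_pred)) / n
    shuffled = list(y_pred)
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(shuffled)
        acc = sum(t == p for t, p in zip(y_true, shuffled)) / n
        if acc >= observed:
            hits += 1
    return hits / n_perm

# Toy labels over the four SBT contexts; the hypothetical classifier is
# right on 14 of 16 cases, which random shuffles essentially never match.
contexts = ["Affiliation", "Copulation", "Play", "Submission"] * 4
preds = contexts[:14] + ["Affiliation", "Affiliation"]
p = permutation_p_value(contexts, preds)
```

A small p-value here means the discrimination cannot be explained by the label base rates alone, which is the sense in which "above chance" is meant.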


Subject(s)
Facial Expression , Macaca/physiology , Social Behavior , Animal Communication , Animals , Face/physiology , Female , Male , Tooth/physiology