Results 1 - 20 of 5,155
2.
J Int Neuropsychol Soc ; : 1-10, 2024 Sep 18.
Article in English | MEDLINE | ID: mdl-39291402

ABSTRACT

OBJECTIVES: This study investigated the relationship between various intrapersonal factors and the discrepancy between subjective and objective cognitive difficulties in adults with attention-deficit hyperactivity disorder (ADHD). The first aim was to examine these associations in patients with valid cognitive symptom reporting. The next aim was to investigate the same associations in patients with invalid scores on tests of cognitive symptom overreporting. METHOD: The sample comprised 154 adults who underwent a neuropsychological evaluation for ADHD. Patients were divided into groups based on whether they had valid cognitive symptom reporting and valid test performance (n = 117) or invalid cognitive symptom overreporting but valid test performance (n = 37). Scores from multiple symptom and performance validity tests were used to group patients. Using patients' scores from a cognitive concerns self-report measure and a composite index of objective performance tests, we created a subjective-objective discrepancy index to quantify the extent of cognitive concerns that exceeded difficulties on objective testing. Various measures were used to assess intrapersonal factors thought to influence the subjective-objective cognitive discrepancy, including demographics, estimated premorbid intellectual ability, internalizing symptoms, somatic symptoms, and perceived social support. RESULTS: Patients reported greater cognitive difficulties on subjective measures than were observed on objective testing. The discrepancy between subjective and objective scores was most strongly associated with internalizing and somatic symptoms. These associations were observed in both validity groups. CONCLUSIONS: Subjective cognitive concerns may be more indicative of the extent of internalizing and somatic symptoms than of actual cognitive impairment in adults with ADHD, regardless of whether they have valid scores on cognitive symptom overreporting tests.
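The abstract does not define how the subjective-objective discrepancy index was computed; one common construction standardizes both scores and takes the difference. A minimal sketch under that assumption (the T-score scaling and the example values are hypothetical, not taken from the study):

```python
def discrepancy_index(subjective_t, objective_t):
    """Subjective-objective discrepancy as a difference of z-scores.

    Assumes both measures are reported as T-scores (mean 50, SD 10);
    a positive value means more complaints than objective testing shows.
    """
    z_subj = (subjective_t - 50) / 10
    z_obj = (objective_t - 50) / 10
    return z_subj - z_obj

# A patient reporting many concerns (T=70) while testing near average (T=48)
print(round(discrepancy_index(70, 48), 2))  # 2.2
```

A positive index on this scale is what the study then relates to internalizing and somatic symptom measures.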

3.
Water Res ; 266: 122397, 2024 Sep 16.
Article in English | MEDLINE | ID: mdl-39288725

ABSTRACT

The concept of incorporating foam fractionation in aerated bioreactors at wastewater treatment plants (WWTPs) for the removal of per- and polyfluoroalkyl substances (PFAS) has recently been proposed. The extent of PFAS enrichment in aerated bioreactors' foams, as indicated by enrichment factors (EFs), has been observed to vary widely. Laboratory evidence has shown that factors affecting PFAS enrichment in foams include conductivity, surfactant concentrations and initial PFAS concentrations. However, real wastewaters are complex heterogeneous matrices with physical, chemical and biological characteristics potentially contributing to the phenomenon of PFAS partitioning into foams. In this study, we characterised mixed liquor suspensions, including conductivity, filament content, aqueous PFAS concentrations, surface tension and total suspended solids (TSS) concentrations, as well as foams, including bubble size and half-life. We used a linear mixed-effects model to establish relationships between PFAS enrichment in aerated bioreactor foams and the examined characteristics. We found that some of the examined characteristics, specifically filament content, surface tension and TSS concentrations measured in mixed liquor suspension, and foam half-life, are negatively and significantly associated with the enrichment of longer chain PFAS (with perfluorinated carbon number ≥ 6). Of these, filament content is the most important determinant of PFAS enrichment, potentially leading to an increase in, for example, the perfluorooctanoic acid (PFOA) EF from 3 to 100 between typical filamentous and non-filamentous suspended biomass. However, enrichment of shorter chain PFAS (with perfluorinated carbon number ≤ 5) is negligible and is not affected by the characteristics that were measured.
The findings of our study may serve as valuable information for the implementation of foam fractionation at WWTPs by elucidating the drivers that contribute to the enrichment of longer chain PFAS, under conditions typically found at WWTPs.
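An enrichment factor is conventionally the ratio of a compound's concentration in the foam to its concentration in the underlying liquid; the abstract does not spell the formula out, so this sketch (with made-up concentrations) only illustrates the scale of the reported PFOA contrast:

```python
def enrichment_factor(c_foam, c_liquid):
    """PFAS enrichment factor: concentration in the foam over concentration
    in the mixed liquor suspension (units cancel, so any shared unit works)."""
    return c_foam / c_liquid

# Illustrative concentrations only (e.g., ng/L): an EF of 3 corresponds to the
# typical filamentous biomass the study reports, and 100 to non-filamentous.
print(enrichment_factor(30.0, 10.0))    # 3.0
print(enrichment_factor(1000.0, 10.0))  # 100.0
```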

4.
Nat Med ; 2024 Sep 14.
Article in English | MEDLINE | ID: mdl-39277671

ABSTRACT

The goals of patient-centric care include advancing effective personalized treatment while minimizing toxicity. The phase 2 I-SPY2.2 trial uses a neoadjuvant sequential therapy approach in breast cancer to further these goals, testing promising new agents while optimizing individual outcomes. Here we tested datopotamab-deruxtecan (Dato-DXd) in the I-SPY2.2 trial for patients with high-risk stage 2/3 breast cancer. I-SPY2.2 uses a sequential multiple assignment randomization trial design that includes three sequential blocks of biologically targeted neoadjuvant treatment: the experimental agent(s) (block A), a taxane-based regimen tailored to the tumor subtype (block B) and doxorubicin-cyclophosphamide (block C). Patients are randomized into arms consisting of different investigational block A treatments. Algorithms based on magnetic resonance imaging and core biopsy guide treatment redirection after each block, including the option of early surgical resection in patients predicted to have a high likelihood of pathological complete response, the primary endpoint. There are two primary efficacy analyses: after block A and across all blocks for the six prespecified breast cancer subtypes (defined by clinical hormone receptor/human epidermal growth factor receptor 2 (HER2) status and/or the response-predictive subtypes). We report results of 103 patients treated with Dato-DXd. While Dato-DXd did not meet the prespecified threshold for success (graduation) after block A in any subtype, the treatment strategy across all blocks graduated in the hormone receptor-negative HER2-Immune-DNA repair deficiency- subtype with an estimated pathological complete response rate of 41%. No new toxicities were observed, with stomatitis and ocular events occurring at low grades.
Dato-DXd was particularly active in the hormone receptor-negative/HER2-Immune-DNA repair deficiency- signature, warranting further investigation, and was safe in other subtypes in patients who followed the treatment strategy. ClinicalTrials.gov registration: NCT01042379.

5.
Nat Med ; 2024 Sep 14.
Article in English | MEDLINE | ID: mdl-39277672

ABSTRACT

Sequential adaptive trial designs can help accomplish the goals of personalized medicine, optimizing outcomes and avoiding unnecessary toxicity. Here we describe the results of incorporating a promising antibody-drug conjugate, datopotamab-deruxtecan (Dato-DXd), in combination with the programmed cell death ligand 1 inhibitor durvalumab as the first sequence of therapy in the I-SPY2.2 phase 2 neoadjuvant sequential multiple assignment randomization trial for high-risk stage 2/3 breast cancer. The trial includes three blocks of treatment, with initial randomization to different experimental agent(s) (block A), followed by a taxane-based regimen tailored to tumor subtype (block B), followed by doxorubicin-cyclophosphamide (block C). Subtype-specific algorithms based on magnetic resonance imaging volume change and core biopsy guide treatment redirection after each block, including the option of early surgical resection in patients predicted to have a high likelihood of pathologic complete response, which is the primary endpoint assessed when resection occurs. There are two primary efficacy analyses: after block A and across all blocks for six prespecified HER2-negative subtypes (defined by hormone receptor status and/or response-predictive subtypes). In total, 106 patients were treated with Dato-DXd/durvalumab in block A. In the immune-positive subtype, Dato-DXd/durvalumab exceeded the prespecified threshold for success (graduated) after block A; and across all blocks, pathologic complete response rates were equivalent to the rate expected for the standard of care (79%), but 54% achieved that result after Dato-DXd/durvalumab alone (block A) and 92% without doxorubicin-cyclophosphamide (after blocks A + B). The treatment strategy across all blocks graduated in the hormone-negative/immune-negative subtype. No new toxicities were observed. Stomatitis was the most common side effect in block A. No patients receiving block A treatment alone had adrenal insufficiency.
Dato-DXd/durvalumab is a promising therapy combination that can eliminate standard chemotherapy in many patients, particularly in the immune-positive subtype. ClinicalTrials.gov registration: NCT01042379.

6.
JBJS Rev ; 12(9)2024 Sep 01.
Article in English | MEDLINE | ID: mdl-39283966

ABSTRACT

BACKGROUND: The utility and risks associated with the use of cervical collars in the postoperative period after cervical spine surgery have been a subject of debate. The purpose of this study was to systematically review the currently available evidence on the use of cervical collars after cervical spine surgery to assess their impact on outcomes. METHODS: A literature search of the PubMed database was performed using the keywords "cervical collar," "anterior cervical discectomy and fusion (ACDF)," "posterior cervical decompression and fusion," "laminoplasty," "post-operative orthotic bracing," "cervical decompression," and "cervical orthosis" in all possible combinations. All English studies with a level of evidence of I to IV that were published from May 1, 1986, to December 3, 2023, were considered for inclusion. RESULTS: A total of 25 articles meeting the inclusion criteria were identified and reviewed. Regarding anterior and posterior fusion procedures, cervical collar use demonstrated improved short-term patient-reported outcomes and pain control. While surgeon motivation for collar use was to increase fusion rates, this benefit is not well supported in the literature, with the majority of studies demonstrating no significant difference in fusion rates between patients who wore a cervical collar and those who did not. Regarding motion-preserving procedures such as cervical laminoplasty, patients with prolonged postoperative cervical collar use demonstrated increased rates of axial neck pain and decreased final range of motion (ROM). CONCLUSION: Surgeon motivation for postoperative cervical collar immobilization after fusion procedures is to increase fusion rates and improve postoperative pain and disability, despite this not being fully supported in the literature. After motion-sparing procedures, the benefits of collar immobilization diminish with prolonged use, which could lead to increased rates of axial neck pain and decreased ROM.
Cervical collar immobilization in the postoperative period should be considered its own intervention, with its own associated risk-benefit profile. LEVEL OF EVIDENCE: Therapeutic Level III. See Instructions for Authors for a complete description of levels of evidence.


Subject(s)
Cervical Vertebrae, Spinal Fusion, Humans, Cervical Vertebrae/surgery, Surgical Decompression, Diskectomy, Orthotic Devices
7.
Cancers (Basel) ; 16(17)2024 Aug 23.
Article in English | MEDLINE | ID: mdl-39272793

ABSTRACT

Hi-C sequencing is a DNA-based next-generation sequencing method that preserves the 3D genome conformation and has shown promise in detecting genomic rearrangements in translational research studies. To evaluate Hi-C as a potential clinical diagnostic platform, analytical concordance with routine laboratory testing was assessed using primary pediatric leukemia and sarcoma specimens. Archived viable and non-viable frozen leukemic cells and formalin-fixed paraffin-embedded (FFPE) tumor specimens were analyzed. Pediatric acute myeloid leukemia (AML) and alveolar rhabdomyosarcoma (A-RMS) specimens with known genomic rearrangements were subjected to Hi-C to assess analytical concordance. Subsequently, a discovery cohort consisting of AML and acute lymphoblastic leukemia (ALL) cases without known genomic rearrangements based on prior clinical diagnostic testing was evaluated to determine whether Hi-C could detect rearrangements. Using a standard sequencing depth of 50 million raw read-pairs per sample, or approximately 5X raw genomic coverage, we observed 100% concordance between Hi-C and previous clinical cytogenetic and molecular testing. In the discovery cohort, a clinically relevant gene fusion was detected in 45% of leukemia cases (5/11). This study provides an institutional proof-of-principle evaluation of Hi-C sequencing for medical diagnostic testing, as it identified several clinically relevant rearrangements, including those that were missed by current clinical testing workflows.
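The quoted equivalence of 50 million read-pairs to roughly 5X raw coverage can be reproduced with the standard coverage formula, assuming a 2 x 150 bp paired-end run (the read length is our assumption; the abstract does not state it):

```python
def raw_coverage(read_pairs, read_len_bp, genome_size_bp=3.1e9):
    """Raw genomic coverage from paired-end depth: total sequenced bases
    (pairs x 2 reads x read length) divided by genome size."""
    return read_pairs * 2 * read_len_bp / genome_size_bp

# 50 million read-pairs at an assumed 2 x 150 bp read length
cov = raw_coverage(50e6, 150)
print(f"{cov:.1f}X")  # 4.8X, i.e., approximately 5X as stated
```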

8.
Acute Crit Care ; 39(3): 359-368, 2024 Aug.
Article in English | MEDLINE | ID: mdl-39266271

ABSTRACT

BACKGROUND: Abnormal red blood cell distribution width (RDW) is associated with poor cardiovascular, respiratory, and coronavirus disease 2019 (COVID-19) outcomes. However, whether RDW provides prognostic insights regarding COVID-19 patients admitted to the intensive care unit (ICU) remained unknown. Here, we retrospectively investigated the association of RDW with 30-day and 90-day mortalities, duration of mechanical ventilation, and length of ICU and hospital stay in patients with COVID-19. METHODS: This study included 321 patients with COVID-19 aged >18 years who were admitted to the ICU between March 2020 and July 2022. The outcomes were mortality, duration of mechanical ventilation, and length of stay. RDW >14.5% was assessed in blood samples within 24 hours of admission. RESULTS: The mortality rate was 30.5%. Multivariable Cox regression analysis showed an association between increased RDW and 30-day mortality (hazard ratio [HR], 3.64; 95% CI, 1.54-8.65), 90-day mortality (HR, 3.66; 95% CI, 1.59-8.40), and shorter duration of invasive ventilation (2.7 ventilator-free days, P=0.033). CONCLUSIONS: Increased RDW in COVID-19 patients at ICU admission was associated with increased 30-day and 90-day mortalities and a shorter duration of invasive ventilation. Thus, RDW can be used as a surrogate biomarker for clinical outcomes in COVID-19 patients admitted to the ICU.
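As a sanity check on the reported Cox estimates: a Wald-type hazard ratio confidence interval is symmetric on the log scale, so the point estimate should equal the geometric mean of the CI bounds. A short sketch verifying this against the published figures:

```python
import math

def hr_from_ci(lower, upper):
    """Recover a Wald-type hazard ratio point estimate from its 95% CI
    as the geometric mean of the bounds (symmetry on the log scale)."""
    return math.exp((math.log(lower) + math.log(upper)) / 2)

# Reported 30-day mortality CI (1.54-8.65) is consistent with HR = 3.64
print(round(hr_from_ci(1.54, 8.65), 2))  # 3.65
# Reported 90-day mortality CI (1.59-8.40) is consistent with HR = 3.66
print(round(hr_from_ci(1.59, 8.40), 2))  # 3.65
```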

9.
Dementia (London) ; : 14713012241285485, 2024 Sep 14.
Article in English | MEDLINE | ID: mdl-39276146

ABSTRACT

Background: Social service professionals routinely use screening tools to assess for cognitive decline or identify suspected dementia in nursing home residents. Published literature lacks details about the specific tools used and how professionals use and perceive them in practice. The aim of this study is to better understand cognitive screening roles performed by nursing home social service professionals and how they view their use and efficacy. Methods: An online survey was administered to all 230 nursing homes in the US state of Alabama between October 2021 and March 2022. Fifty-three social service professionals who conduct resident cognitive screenings responded to the survey. Results: In addition to completing the US-mandated Brief Interview for Mental Status (BIMS) quarterly, 75% of participants reported using additional tools, most notably the Mini-Mental State Examination (MMSE). Participants reported using different tools for varied purposes. Those who used both the BIMS and MMSE rated the BIMS significantly higher on ease and time to administer while rating the MMSE higher on reliability and validity. Although most participants reported high levels of confidence using the tools, over half indicated interest in further training in cognitive assessment tools. Discussion: Findings provide evidence regarding who administers nursing home cognitive screenings, which tools are used, and professionals' experiences using those tools. Participant responses reveal the value of using multiple screening tools for improved detection of residents' cognitive status and decline, as well as a need for additional training in cognitive assessment. Findings also suggest that the primary tool used for cognitive screening may be quick and easy to use at the expense of perceived reliability and validity. Further evaluation of nursing home cognitive assessment is needed.

10.
Soc Sci Med ; 359: 117283, 2024 Aug 31.
Article in English | MEDLINE | ID: mdl-39232379

ABSTRACT

Workers' perception of control over work is a key construct in the relationship between the psychosocial work environment and health. While exposure to low job control has been prospectively linked to poor mental health, including depression and anxiety, there is less research examining the impact of prolonged exposure to low job control on mental health. Data from 5054 employed men from 2013 to 2021 in the Australian Longitudinal Study on Male Health were used to examine persistent and intermittent low job control and subsequent major depression symptoms. Persistent low job control was based on consecutive self-reports of low job control over waves 1 and 2. Combinations of low and high job control were classified as intermittent exposure, and continuous high job control over both waves was classified as persistent high job control. Major depression symptoms, derived from the Patient Health Questionnaire-9, were measured in wave 3. Generalised linear models and augmented inverse probability weighting were undertaken. There was a strong stepwise relationship between low job control and major depression. Compared to persistent high job control, intermittent low job control was associated with an increased risk of subsequent major depression symptoms of 32% (RR 1.32, 95% CI 0.82, 2.15), and persistent low job control of 103% (RR 2.03, 95% CI 1.21, 3.41). Compared to men exposed to persistent high job control, the average treatment effect for persistent low job control was 0.036 (95% CI 0.014, 0.058) and for intermittent low job control 0.019 (95% CI 0.006, 0.032), equating to risk ratios of 2.46 (95% CI 1.43, 3.50) and 1.79 (95% CI 1.14, 2.45), respectively. This study's findings have implications for public health and occupational policies, as they underscore the importance of reducing prolonged exposure to low job control to protect against the risk of major depression in the working population.
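The reported risk ratios can be reconciled with the average treatment effects on the risk-difference scale given a baseline risk. The ~2.4% baseline used below is back-calculated from the published figures, not reported directly, so this is only an illustrative consistency sketch:

```python
def risk_ratio(baseline_risk, ate):
    """Risk ratio implied by a baseline risk plus an average treatment
    effect (ATE) expressed as an absolute risk difference."""
    return (baseline_risk + ate) / baseline_risk

# Hypothetical baseline risk of major depression symptoms (~2.46%),
# back-calculated so that the persistent-exposure ATE reproduces RR 2.46.
p0 = 0.0246
print(round(risk_ratio(p0, 0.036), 2))  # 2.46 (matches the reported RR)
print(round(risk_ratio(p0, 0.019), 2))  # 1.77 (close to the reported 1.79)
```

The small gap on the intermittent estimate is expected: the published ATEs and RRs come from a weighted model, not from a single shared baseline.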

11.
J Phys Act Health ; : 1-5, 2024 Sep 09.
Article in English | MEDLINE | ID: mdl-39251194

ABSTRACT

BACKGROUND: To meet the World Health Organization goal of reducing physical inactivity by 15% by 2030, a multisectoral systems approach is urgently needed to promote physical activity (PA). We report the process of and findings from a codesigned systems mapping project to present determinants of PA in the context of urban New South Wales, Australia. METHODS: A participatory conceptual mapping workshop was held in May 2023 with 19 participants working in education, transportation, urban planning, community, health, and sport and recreation. Initial maps were developed and refined using online feedback from the participants. Interviews were conducted with 10 additional policymakers from relevant sectors to further refine the maps. RESULTS: Two systems maps were cocreated, identifying over 100 variables influencing PA and their interconnections. Five settings emerged from the adults' map (social and community, policy, built environment and transportation, health care, and workplace) and four from the young people's map (family, school, transportation, and community and environment). The maps share similarities, such as potential drivers within the transportation, community, and built environment sectors; however, the young people's map has a specific focus on the school setting, and the adults' map on workplace and health care settings. Interviews with policymakers provided further unique insights into understanding and intervening in the PA system. CONCLUSIONS: This codesigned participatory systems mapping process, supplemented by stakeholder interviews, provided a unique opportunity to bring together stakeholders across sectors to understand the complexity within the PA system and begin to identify leverage points for tackling physical inactivity in New South Wales.

12.
J Evid Based Soc Work (2019) ; : 1-19, 2024 Sep 09.
Article in English | MEDLINE | ID: mdl-39252456

ABSTRACT

PURPOSE: Children with behavioral issues in residential care settings have high rates of trauma, with a range of trauma experiences such as abuse and neglect, issues with attachment, and multiple disruptions in placements. Staff in these settings should have an understanding of trauma, its impact, and how to engage in trauma-informed practice. The purpose of this study was to examine whether a trauma-informed training, developed specifically based on the identified needs of a residential group care facility, had an impact on future staff attitudes and behaviors. MATERIALS AND METHODS: A 3-h training was delivered by the researchers. Three identical sessions were provided to all 48 staff, regardless of education and role, across a 3-day period. Prior to the training, staff were given a pretest survey measuring components of trauma-informed (TI) practice that indicated how often the staff members engaged in TI practice. Thirty days later, the same participants completed a posttest survey to gauge whether the training had an impact on their subsequent attitudes and behavior. RESULTS: There were improvements in many of the trauma-informed practice areas on the posttest survey. t-test analysis revealed that five trauma-informed practice areas had statistically significant improvements from the pretest survey. DISCUSSION: The findings present the opportunity for recommendations for trauma-informed training development and delivery, as well as implications for the field of social work. CONCLUSION: This study demonstrates the feasibility of administering a trauma-informed training program and observing relatively rapid improvements in attitudes and behavior among staff.
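The abstract reports t-tests on pretest/posttest scores without further detail; for the same respondents measured twice, a paired t statistic is the natural choice. A stdlib-only sketch with invented ratings (the test variant and the data are assumptions, not taken from the study):

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired-sample t statistic: mean of the per-person differences
    divided by the standard error of those differences."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / math.sqrt(n))

# Hypothetical pre/post ratings for one trauma-informed practice item (1-5)
pre  = [2, 3, 2, 4, 3, 2, 3, 2]
post = [3, 4, 3, 4, 4, 3, 4, 3]
print(round(paired_t(pre, post), 2))  # 7.0
```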

13.
Acta Biomater ; 2024 Aug 30.
Article in English | MEDLINE | ID: mdl-39218278

ABSTRACT

Primary open angle glaucoma (POAG) is currently the most prevalent cause of irreversible blindness globally. To date, there are few in vitro models that can faithfully recapitulate the complex architecture of the trabecular meshwork (TM) and the specialized trabecular meshwork cell (TMC) characteristics that are local to structurally opposing regions. This study aimed to investigate the parameters that govern TMC phenotype by adapting the extracellular matrix structure to mimic the juxtacanalicular tissue (JCT) region of the TM. Initially, TMC phenotypic characteristics were investigated within type I collagen matrices of controlled fiber density and anisotropy, generated through confined plastic compression (PC). Notably, PC-collagen presented biophysical cues that induced JCT cellular characteristics (elastin and α-ß-Crystallin protein expression, cytoskeletal remodeling, and increased mesenchymal and JCT-specific genetic markers). In parallel, a pathological mesenchymal phenotype associated with POAG was induced through localized transforming growth factor-beta 2 (TGFß-2) exposure. This resulted in a profile of alternative mesenchymal states (fibroblast/smooth muscle or myofibroblast) displayed by the TMC in vitro. Overall, the study provides an advanced insight into the biophysical cues that modulate TMC fate, demonstrating the induction of a JCT-specific TMC phenotype and transient mesenchymal characteristics that reflect both healthy and pathological scenarios. STATEMENT OF SIGNIFICANCE: Glaucoma is the most prevalent cause of blindness, with a lack of efficacy among current drug candidates. Reliable trabecular meshwork (TM) in vitro models will be critical for enhancing the field's understanding of healthy and disease states for pre-clinical testing. To date, trabecular meshwork cells (TMCs) display heterogeneity throughout the hierarchical TM; however, our understanding of how to recapitulate these phenotypes in vitro remains limited. This study hypothesizes the importance of specific matrix/growth factor spatial stimuli in governing TMC phenotype. By emulating certain biophysical/biochemical in vivo parameters, we introduce an advanced profile of distinct TMC phenotypic states reflecting healthy and disease scenarios, a notion that has not been stated before and a fundamental consideration for future TM 3D in vitro modelling.

14.
Front Public Health ; 12: 1430540, 2024.
Article in English | MEDLINE | ID: mdl-39109149

ABSTRACT

Mental health problems among the working population represent a growing concern with huge impacts on individuals, organizations, compensation authorities, and social welfare systems. The workplace presents both psychosocial risks and unique opportunities for intervention. Although there has been rapid expansion of workplace mental health interventions over recent decades, clear direction around appropriate, evidence-based action remains limited. While numerous workplace mental health models have been proposed to guide intervention, general models often fail to adequately consider both the evidence base and where best-practice principles alone inform action. Further, recommendations need to be updated as new discoveries occur. We seek to update the Framework for Mentally Healthy Workplaces based on new evidence of intervention effectiveness while also incorporating evidence-based principles. The updated model also integrates concepts from existing alternate models to present a comprehensive overview of strategies designed to enhance wellbeing, minimize harm, and facilitate recovery. Examples of available evidence and obstacles to implementation are discussed. The Framework is designed to support employers and managers in determining which strategies to apply and to guide future avenues of research.


Subject(s)
Workplace, Humans, Mental Health, Occupational Health, Mental Disorders, Health Policy, Administrative Personnel
16.
Leukemia ; 2024 Aug 27.
Article in English | MEDLINE | ID: mdl-39192036

ABSTRACT

Third-generation chimeric antigen receptor T cells (CARTs) for relapsed or refractory (r/r) chronic lymphocytic leukemia (CLL) may improve efficacy compared to second-generation CARTs due to their enhanced CAR design. We performed the first phase 1/2 investigator-initiated trial evaluating escalating doses of third-generation CARTs (HD-CAR-1) targeting CD19 in patients with r/r CLL and B-cell lymphoma. CLL eligibility criteria were failure of two therapy lines, including at least one pathway inhibitor, and/or allogeneic hematopoietic cell transplantation. Nine heavily pretreated patients received HD-CAR-1 at dose levels ranging from 1 × 10⁶ to 200 × 10⁶ CARTs/m². In-house HD-CAR-1 manufacturing was successful for all patients. While neurotoxicity was absent, one case of grade 3 cytokine release syndrome was observed. By day 90, six patients (67%) attained a CR, five of these (83%) with undetectable MRD. With a median follow-up of 27 months, 2-year PFS and OS were 30% and 69%, respectively. HD-CAR-1 products of responders contained significantly more CD4+ T cells compared to non-responders. In non-responders, a strong enrichment of effector memory-like CD8+ T cells with high expression of CD39 and/or CD197 was observed. HD-CAR-1 demonstrated encouraging efficacy and exceptionally low treatment-specific toxicity, presenting new treatment options for patients with r/r CLL. Trial registration: #NCT03676504.

17.
Sci Total Environ ; 950: 175283, 2024 Nov 10.
Article in English | MEDLINE | ID: mdl-39111449

ABSTRACT

There has been an increase in tile drained area across the US Midwest and other regions worldwide due to agricultural expansion, intensification, and climate variability. Despite this growth, spatially explicit tile drainage maps remain scarce, which limits the accuracy of hydrologic modeling and implementation of nutrient reduction strategies. Here, we developed a machine-learning model to provide a Spatially Explicit Estimate of Tile Drainage (SEETileDrain) across the US Midwest in 2017 at a 30-m resolution. This model used 31 satellite-derived and environmental features after removing less important and highly correlated features. It was trained with 60,938 tile and non-tile ground truth points within the Google Earth Engine cloud-computing platform. We also used multiple feature importance metrics and Accumulated Local Effects to interpret the machine learning model. The results show that our model achieved good accuracy, with 96% of points classified correctly and an F1 score of 0.90. When tile drainage area is aggregated to the county scale, it agreed well (r² = 0.69) with the reported area from the Ag Census. We found that Land Surface Temperature (LST) along with climate- and soil-related features were the most important factors for classification. The top-ranked feature is the median summer nighttime LST, followed by median summer soil moisture percent. This study demonstrates the potential of applying satellite remote sensing to map spatially explicit agricultural tile drainage across large regions. The results should be useful for land use change monitoring and hydrologic and nutrient models, including those designed to achieve cost-effective agricultural water and nutrient management strategies. The algorithms developed here should also be applicable for other remote sensing mapping applications.
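The reported F1 of 0.90 follows from precision and recall in the usual way; the confusion counts below are invented solely to reproduce that figure, not taken from the study:

```python
def f1_score(tp, fp, fn):
    """F1 as the harmonic mean of precision and recall."""
    precision = tp / (tp + fp)
    recall = tp / (tp + fn)
    return 2 * precision * recall / (precision + recall)

# Hypothetical confusion counts for the tile/non-tile classifier:
# precision = recall = 0.90, giving the paper's reported F1 of 0.90
print(round(f1_score(tp=900, fp=100, fn=100), 2))  # 0.9
```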

18.
J Psychol ; : 1-23, 2024 Aug 23.
Article in English | MEDLINE | ID: mdl-39177682

ABSTRACT

The expression of contextually appropriate emotions in the workplace is critical to fostering effective interpersonal interactions. What constitutes an appropriate emotional expression is determined by the display rules an employee perceives. Within the emotional labor framework, employees manage emotional expression at work (i.e., ensure alignment with display rules) through two primary strategies, known as surface acting and deep acting. Despite theoretical efforts to synthesize these strategies with the broader emotion regulation framework and its strategies of expressive suppression and cognitive reappraisal, no empirical examination of their relationship exists. The present study aimed to investigate this empirical relationship to clarify the extent to which these constructs (i.e., strategies) are unique across frameworks. A second aim was to assess whether method bias could explain any overlap between these constructs. A total of 800 participants (mean age = 22.4 years, 78.8% female) who worked across a range of service industries completed measures of emotion regulation and emotional labor under two conditions designed to manipulate the presence of method bias (i.e., varying the order of item administration). Participants also completed the DASS-21, a measure of affective symptom severity. Using multigroup analysis, the results indicated that analogous latent constructs (cognitive reappraisal and deep acting; expressive suppression and surface acting) yielded significant, small-to-moderate correlations and that the correlation coefficients were invariant regardless of how items were administered. The pattern of correlations with affective symptoms also differed across constructs. Together, the limited correlations between the analogous strategies and the differential associations with affective symptoms suggest a relative independence between these constructs.
Findings carry theoretical and practical implications across research and clinical settings.

19.
J Surg Oncol ; 2024 Aug 19.
Article in English | MEDLINE | ID: mdl-39155667

ABSTRACT

BACKGROUND: Large language models (LLMs; e.g., ChatGPT) may be used to assist clinicians and form the basis of future clinical decision support (CDS) for colon cancer. The objectives of this study were to (1) evaluate the response accuracy of two LLM-powered interfaces in identifying guideline-based care in simulated clinical scenarios and (2) define response variation between and within LLMs. METHODS: Clinical scenarios with "next steps in management" queries were developed based on National Comprehensive Cancer Network guidelines. Prompts were entered into OpenAI ChatGPT and Microsoft Copilot in independent sessions, yielding four responses per scenario. Responses were compared to clinician-developed responses and assessed for accuracy, consistency, and verbosity. RESULTS: Across 108 responses to 27 prompts, both platforms yielded completely correct responses to 36% of scenarios (n = 39). For ChatGPT, 39% (n = 21) were missing information and 24% (n = 14) contained inaccurate/misleading information. Copilot performed similarly, with 37% (n = 20) having missing information and 28% (n = 15) containing inaccurate/misleading information (p = 0.96). Clinician responses were significantly shorter (34 ± 15.5 words) than both ChatGPT (251 ± 86 words) and Copilot (271 ± 67 words; both p < 0.01). CONCLUSIONS: Publicly available LLM applications often provide verbose responses with vague or inaccurate information regarding colon cancer management. Significant optimization is required before use in formal CDS.
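The verbosity comparison (34 ± 15.5 clinician words vs. roughly 250-270 LLM words) is a simple word-count summary. A toy reproduction with hypothetical responses (the example strings are invented, not from the study):

```python
from statistics import mean, stdev

def word_stats(responses):
    """Mean and sample SD of response length in words."""
    counts = [len(r.split()) for r in responses]
    return mean(counts), stdev(counts)

# Hypothetical clinician-style answers; the study's real comparison set
# contained 27 scenarios and 108 LLM responses.
clinician = [
    "Proceed to segmental colectomy with en bloc lymphadenectomy.",
    "Repeat colonoscopy in one year.",
]
m, s = word_stats(clinician)
print(round(m, 1))  # 6.5
```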

20.
Article in English | MEDLINE | ID: mdl-39164888

ABSTRACT

BACKGROUND: We aimed to describe the association between insertion of a new long-term enteral feeding tube during admission for aspiration and in-hospital mortality. METHODS: This retrospective cohort study across 28 Canadian hospitals from 2015 to 2022 included consecutive patients who were admitted for aspiration. Patients were categorized based on new long-term enteral feeding tube insertion during hospital stay or not. The primary outcome was the time to death in hospital. Secondary outcomes included time to discharge alive and hospital readmission for aspiration within 90 days. We used propensity score weighting to balance covariates, and a competing risk model to describe in-hospital death and discharge. RESULTS: Of 12,850 patients admitted for aspiration, 852 (6.6%) patients received a long-term enteral feeding tube. In the hospital, 184 (21.6%) and 2489 (20.8%) patients in the enteral feeding tube group and no enteral feeding tube group died, respectively. Within 90 days of discharge, 127 (14.9%) and 1148 (9.6%) patients in the enteral feeding tube and no enteral feeding tube group were readmitted for aspiration, respectively. After balancing covariates, an enteral feeding tube was associated with a similar in-hospital mortality risk (subdistribution hazard ratio [sHR] = 1.05, 95% CI = 0.89-1.23; P = 0.5800), longer time to discharge alive (sHR = 0.58, 95% CI = 0.54-0.63; P < 0.0001), and a higher risk of readmission (risk difference = 5.0%, 95% CI = 2.4%-7.6%; P = 0.0001). CONCLUSION: Initiation of long-term enteral tube feeding was not uncommon after admission for aspiration and was not associated with an improvement in the probability of being discharged alive from the hospital or readmitted for aspiration.
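The propensity score weighting the study describes balances covariates by reweighting each patient by the inverse probability of the treatment actually received; a minimal sketch of the weight itself (the propensity values are made up for illustration):

```python
def ipw_weight(treated, propensity):
    """Inverse-probability-of-treatment weight: 1/p for treated patients,
    1/(1-p) for untreated, where p is the estimated propensity score."""
    return 1.0 / propensity if treated else 1.0 / (1.0 - propensity)

# A feeding-tube patient with a 10% predicted probability of receiving a
# tube is strongly up-weighted; a comparable no-tube patient is near 1.
print(ipw_weight(True, 0.10))             # 10.0
print(round(ipw_weight(False, 0.10), 2))  # 1.11
```

Weights like these make the tube and no-tube groups comparable on measured covariates before fitting the competing-risk model.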
