Results 1 - 20 of 124
1.
Pharmacoepidemiol Drug Saf; 33(9): e70001, 2024 Sep.
Article in English | MEDLINE | ID: mdl-39252433

ABSTRACT

PURPOSE: This retrospective real-world study compared overall survival (OS) between patients with BRCA wild-type (BRCAwt) recurrent epithelial ovarian cancer (OC) who received niraparib second-line maintenance (2LM) versus active surveillance (AS), using target trial emulation with cloning and inverse probability of censoring weighting (IPCW) methodology to minimize immortal time bias. METHODS: Eligible patients from a United States-based, deidentified, electronic health record-derived database were diagnosed with epithelial OC (January 1, 2011-May 31, 2021), were BRCAwt, and completed second-line (2L) therapy (January 1, 2017-March 2, 2022). Patient data were cloned at index (2L last treatment date), assigned to niraparib 2LM and AS cohorts, and censored when treatment deviated from clone assignment. Follow-up was measured from index to the earliest of study end (May 31, 2022), last activity, or death. Median OS (mOS) and hazard ratios were estimated from stabilized IPCW Kaplan-Meier curves and Cox regression models. RESULTS: Overall, 199 patients received niraparib 2LM, and 707 had their care managed with AS. Key characteristics were balanced across cohorts after cloning and stabilized IPCW. Median follow-up was 15.6 and 9.3 months pre-cloning. IPCW mOS was 24.1 months (95% CI: 20.9-29.5) and 18.4 months (95% CI: 15.1-22.8) in the niraparib 2LM and AS cohorts, respectively (hazard ratio, 0.77; 95% CI: 0.66-0.89). CONCLUSIONS: This real-world study provides supportive evidence of an OS benefit for patients with BRCAwt recurrent OC who received 2LM niraparib monotherapy compared with those whose care was managed with AS. The analytic strategies implemented were useful in minimizing immortal time bias and measured confounding.
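
For readers who want to see the mechanics, here is a minimal sketch of the clone-censor-weight workflow this abstract describes: clone each eligible patient into both strategies at index, artificially censor a clone when observed care deviates from its assigned strategy, and fit weighted Kaplan-Meier and Cox models. The DataFrame df2l, all column names (observed_strategy, months_to_deviation, sw for stabilized IPC weights), and the use of the lifelines package are illustrative assumptions, not the authors' code.

import pandas as pd
from lifelines import KaplanMeierFitter, CoxPHFitter

def make_clones(df):
    # One clone per strategy at index (completion of second-line therapy).
    return pd.concat(
        [df.assign(strategy="niraparib_2LM"), df.assign(strategy="active_surveillance")],
        ignore_index=True,
    )

def censor_at_deviation(clones):
    # Artificially censor a clone once observed care deviates from its assigned strategy.
    dev = clones["observed_strategy"] != clones["strategy"]
    clones.loc[dev, "followup_months"] = clones.loc[dev, "months_to_deviation"]
    clones.loc[dev, "death"] = 0
    return clones

# df2l: one row per eligible patient at index; "sw" holds stabilized IPC weights
# estimated elsewhere (e.g., from pooled logistic models of remaining uncensored).
clones = censor_at_deviation(make_clones(df2l))
clones["niraparib"] = (clones["strategy"] == "niraparib_2LM").astype(int)

km = KaplanMeierFitter()
for name, g in clones.groupby("strategy"):
    km.fit(g["followup_months"], g["death"], weights=g["sw"], label=name)
    print(name, "weighted median OS (months):", km.median_survival_time_)

cox = CoxPHFitter()
cox.fit(clones[["followup_months", "death", "niraparib", "sw"]],
        duration_col="followup_months", event_col="death",
        weights_col="sw", robust=True)  # robust SEs because clones are not independent
cox.print_summary()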


Subject(s)
Indazoles; Neoplasm Recurrence, Local; Ovarian Neoplasms; Piperidines; Humans; Female; Piperidines/therapeutic use; Piperidines/administration & dosage; Indazoles/therapeutic use; Indazoles/administration & dosage; Middle Aged; Retrospective Studies; Aged; Ovarian Neoplasms/drug therapy; Ovarian Neoplasms/mortality; Carcinoma, Ovarian Epithelial/drug therapy; Carcinoma, Ovarian Epithelial/mortality; Poly(ADP-ribose) Polymerase Inhibitors/therapeutic use; Poly(ADP-ribose) Polymerase Inhibitors/administration & dosage; Adult; Watchful Waiting; United States/epidemiology; Maintenance Chemotherapy/methods; Databases, Factual
2.
Am J Epidemiol; 2024 Sep 09.
Article in English | MEDLINE | ID: mdl-39252558

ABSTRACT

By evaluating published emulations of oncology RCTs in which both the active and comparator groups are sourced from real-world data (RWD) and target trial results are available for benchmarking, this systematic review aims to gain insight into factors related to emulation performance. Thirteen oncology emulation studies using various types of RWD were identified through an online database search of PubMed through 2022. Based on the ROBINS-I tool, most studies (N=8) had a serious risk of overall bias, driven by risk of bias from confounding. Approximately half of the studies (N=6) fully proxied the RCT entry criteria. Of 11 RWD studies that provided sufficient detail to quantify emulation performance, the emulation HR estimate fell within the 95% CI of the trial estimate in 9 studies. There were no clear trends between risk of bias, or the degree to which the entry criteria were proxied, and emulation performance. Findings may have been influenced by publication bias and researcher degrees of freedom, as only one emulation study pre-registered its protocol. Tools for comprehensively characterizing factors that affect emulation performance, including the real-world clinical context as it relates to the RCT research question, are needed to evaluate the feasibility of an RCT emulation.

3.
Transpl Int; 37: 13208, 2024.
Article in English | MEDLINE | ID: mdl-39267619

ABSTRACT

Living donation (LD) transplantation is the preferred treatment for kidney failure as compared to donation after brain death (DBD), but age may play a role. We compared the 1-year estimated glomerular filtration rate (eGFR) after kidney transplantation for recipients of LD and DBD stratified by recipient and donor age between 2015 and 2018 in a matched cohort. The strength of the association between donation type and 1-year eGFR differed by recipient age (P interaction < 0.0001). For LD recipients aged 40-54 years versus same-aged DBD recipients, the adjusted odds ratio (aOR) for eGFR ≥60 mL/min/1.73 m2 was 1.48 (95% CI: 1.16-1.90). For DBD recipients aged ≥ 60 years, the aOR was 0.18 (95% CI: 0.12-0.29) versus DBD recipients aged 40-54 years but was 0.91 (95% CI: 0.67-1.24) versus LD recipients aged ≥60 years. In the matched cohort, 4-year graft and patient survival differed by donor age and type. As compared with DBD grafts, LD grafts increased the proportion of recipients with 1-year eGFR ≥60 mL/min/1.73 m2. Recipients aged ≥60 years benefited most from LD transplantation, even if the donor was aged ≥60 years. For younger recipients, large age differences between donor and recipient could also be addressed with a paired exchange program.


Subject(s)
Brain Death; Glomerular Filtration Rate; Graft Survival; Kidney Transplantation; Living Donors; Humans; Middle Aged; Adult; Male; Female; Age Factors; Aged; Tissue Donors; Retrospective Studies
4.
Am J Epidemiol; 2024 Sep 12.
Article in English | MEDLINE | ID: mdl-39270677

ABSTRACT

Postexposure vaccination has the potential to prevent or modify the course of clinical disease among those exposed to a pathogen. However, due to logistical constraints, postexposure vaccine trials have been difficult to implement in practice. In place of trials, investigators have used observational data to estimate the effectiveness or optimal timing window for postexposure vaccines, but the relationship between these analyses and those that would be conducted in a trial is often unclear. Here, we define several possible target trials for postexposure vaccination and show how, under certain conditions, they can be emulated using observational data. We emphasize the importance of the incubation period and the timing of vaccination in trial design and emulation. As an example, we specify a protocol for postexposure vaccination against mpox and provide a step-by-step description of how to emulate it using data from a healthcare database or contact tracing program. We further illustrate some of the benefits of the target trial approach through simulation.
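
To make the timing considerations concrete, here is a small hypothetical sketch of arm assignment for one emulated postexposure vaccination trial: time zero is the exposure date, and individuals count as vaccinated only if vaccination occurs within a pre-specified grace period chosen relative to the incubation period. The DataFrame contacts, every column name, and the 14-day grace period are assumptions for illustration, not the protocol specified in the paper.

import pandas as pd

GRACE_DAYS = 14  # assumed grace period, chosen relative to the pathogen's incubation period

def assign_arm(row):
    # Vaccinated arm: vaccinated within the grace period after exposure;
    # unvaccinated arm: not vaccinated within the grace period.
    if pd.notna(row["vaccination_date"]):
        delay = (row["vaccination_date"] - row["exposure_date"]).days
        if 0 <= delay <= GRACE_DAYS:
            return "postexposure_vaccination"
    return "no_postexposure_vaccination"

eligible = contacts[
    (contacts["prior_vaccination"] == 0)
    & (contacts["disease_onset_before_exposure"] == 0)
]
eligible = eligible.assign(arm=eligible.apply(assign_arm, axis=1))

# Follow-up runs from exposure (time zero) to disease onset, end of study, or loss to
# follow-up; people vaccinated after the grace period would be handled by censoring or
# cloning, as discussed in the paper.
print(eligible.groupby("arm")["disease_within_21_days"].mean())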

5.
Am J Obstet Gynecol; 2024 Aug 05.
Article in English | MEDLINE | ID: mdl-39111517

ABSTRACT

BACKGROUND: The effect of primary cytoreductive surgery vs interval cytoreductive surgery on International Federation of Gynecology and Obstetrics stage IV ovarian cancer outcomes remains uncertain and may vary depending on the stage and the location of extraperitoneal metastasis. Emulating target trials through causal assessment, combined with propensity score adjustment, has become a leading method for evaluating interventions using observational data. OBJECTIVE: This study aimed to assess the effect of primary vs interval cytoreductive surgery on progression-free and overall survival in patients with International Federation of Gynecology and Obstetrics stage IV ovarian cancer using target trial emulation. STUDY DESIGN: Using the comprehensive French national health insurance database, we emulated a target trial to explore the causal impacts of primary vs interval cytoreductive surgery on stage IV ovarian cancer prognosis (Surgery for Ovarian cancer FIGO 4: SOFI-4). The clone method with inverse probability of censoring weighting was used to adjust for informative censoring and to balance baseline characteristics between the groups. Subgroup analyses were conducted based on the stages and extraperitoneal metastasis locations. The study included patients younger than 75 years of age, in good health condition, who were diagnosed with stage IV ovarian cancer between January 1, 2014, and December 31, 2022. The primary and secondary outcomes were respectively 5-year progression-free survival and 7-year overall survival. RESULTS: Among the 2772 patients included in the study, 948 (34.2%) were classified as having stage IVA ovarian cancer and 1824 (65.8%) were classified as having stage IVB ovarian cancer at inclusion. Primary cytoreductive surgery was performed for 1182 patients (42.6%), whereas interval cytoreductive surgery was conducted for 1590 patients (57.4%). The median progression-free survival for primary cytoreductive surgery was 19.7 months (interquartile range, 19.3-20.1) as opposed to 15.7 months (interquartile range, 15.7-16.1) for those who underwent interval cytoreductive surgery. The median overall survival was 63.1 months (interquartile range, 61.7-65.4) for primary cytoreductive surgery in comparison with 55.6 months (interquartile range, 53.8-56.3) for interval cytoreductive surgery. The findings of our study indicate that primary cytoreductive surgery is associated with a 5.0-month increase in the 5-year progression-free survival (95% confidence interval, 3.8-6.2) and a 3.9-month increase in 7-year overall survival (95% confidence interval, 1.9-6.2). These survival benefits of primary over interval cytoreductive surgery were observed in both the International Federation of Gynecology and Obstetrics stage IVA and IVB subgroups. Primary cytoreductive surgery demonstrated improved progression-free survival and overall survival in patients with pleural, supradiaphragmatic, or extra-abdominal lymph node metastasis. CONCLUSION: This study advocates for the benefits of primary cytoreductive surgery over interval cytoreductive surgery for patients with stage IV ovarian cancer and suggests that extraperitoneal metastases like supradiaphragmatic or extra-abdominal lymph nodes should not automatically preclude primary cytoreductive surgery consideration in suitable patients.
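
The stabilized inverse probability of censoring weights used by the clone method here are typically estimated from discrete-time (pooled logistic) models of remaining uncensored. A minimal sketch of that step follows; the person-month DataFrame pm, the covariate names, and the truncation rule are assumptions, and the models are far simpler than what the SOFI-4 emulation would require.

import numpy as np
import statsmodels.formula.api as smf

# pm: one row per clone per month of follow-up, with uncensored = 1 while the clone's
# observed care is still compatible with its assigned strategy (0 in the month of
# artificial censoring).
denom = smf.logit("uncensored ~ month + age + stage_ivb + ecog + ca125", data=pm).fit(disp=0)
numer = smf.logit("uncensored ~ month", data=pm).fit(disp=0)

pm["p_denom"] = denom.predict(pm)
pm["p_numer"] = numer.predict(pm)

# Stabilized weight: cumulative product of numerator over denominator probabilities
# within each clone, usually truncated at an upper percentile to limit variance.
pm = pm.sort_values(["clone_id", "month"])
pm["sw"] = (pm.groupby("clone_id")["p_numer"].cumprod()
            / pm.groupby("clone_id")["p_denom"].cumprod())
pm["sw"] = pm["sw"].clip(upper=np.percentile(pm["sw"], 99))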

6.
J Clin Epidemiol; 174: 111504, 2024 Aug 17.
Article in English | MEDLINE | ID: mdl-39159770

ABSTRACT

OBJECTIVES: To quantify the ability of two new comorbidity indices to adjust for confounding, by benchmarking a target trial emulation against the randomized controlled trial (RCT) result. STUDY DESIGN AND SETTING: Observational study including 18,316 men from Prostate Cancer data Base Sweden 5.0, diagnosed with prostate cancer between 2008 and 2019 and treated with primary radical prostatectomy (RP, n = 14,379) or radiotherapy (RT, n = 3,937). The adjusted risk of death from any cause after adjustment for comorbidity using two new comorbidity indices, the multidimensional diagnosis-based comorbidity index and the drug comorbidity index, was compared with the risk obtained after adjustment for the Charlson comorbidity index (CCI). RESULTS: Risk of death was higher after RT than RP (hazard ratio [HR] = 1.94; 95% confidence interval [CI]: 1.70-2.21). The difference decreased when adjusting for age, cancer characteristics, and CCI (HR = 1.32, 95% CI: 1.06-1.66). Adjustment for the two new comorbidity indices further attenuated the difference (HR 1.14, 95% CI 0.91-1.44). Emulation of a hypothetical pragmatic trial that also included older men with any type of baseline comorbidity largely confirmed these results (HR 1.10; 95% CI 0.95-1.26). CONCLUSION: Adjustment for comorbidity using the two new indices yielded a risk of death from any cause in line with the result of the RCT. Similar results were seen in a broader study population, more representative of clinical practice.

7.
Clin Perinatol; 51(3): 605-616, 2024 Sep.
Article in English | MEDLINE | ID: mdl-39095099

ABSTRACT

The authors summarize the methodology for a new pragmatic comparative effectiveness research investigation, Cooling Prospectively Infants with Mild Encephalopathy (COOLPRIME), which uses sites' existing mild hypoxic-ischemic encephalopathy (HIE) treatment preference (hypothermia or normothermia) to assess hypothermia effectiveness and safety. COOLPRIME's primary aim is to determine the safety and effectiveness of hypothermia compared with normothermia in mild HIE. Engagement of Families and Community Affected by Hypoxic-Ischemic Encephalopathy strongly favored an effectiveness trial over an efficacy trial, leading to the COOLPRIME design.


Subject(s)
Comparative Effectiveness Research; Hypothermia, Induced; Hypoxia-Ischemia, Brain; Humans; Hypothermia, Induced/methods; Hypoxia-Ischemia, Brain/therapy; Infant, Newborn; Prospective Studies; Infant; Treatment Outcome
8.
Article in English | MEDLINE | ID: mdl-39169475

ABSTRACT

BACKGROUND: The target trial framework was developed as a strategy to design and analyze observational epidemiologic studies with the aim of reducing bias due to analytic decisions. It involves designing a hypothetical randomized trial to answer a question of interest and systematically considering how to use observational data to emulate each trial component. AIMS: The primary aim of this paper is to provide a detailed example of the application of the target trial framework to a research question in oral epidemiology. MATERIALS AND METHODS: We describe the development of a hypothetical target trial and emulation protocol to evaluate the effect of preconception periodontitis treatment on time-to-pregnancy. We leverage data from Pregnancy Study Online (PRESTO), a preconception cohort, to ground our example in existing observational data. We discuss the decision-making process for each trial component, as well as limitations encountered. RESULTS: Our target trial application revealed data limitations that precluded us from carrying out the proposed emulation. Implications for data quality are discussed, and we provide recommendations for researchers interested in conducting trial emulations in the field of oral epidemiology. DISCUSSION: The target trial framework has the potential to improve the validity of observational research in oral health, when properly applied. CONCLUSION: We encourage the broad adoption of the target trial framework in the field of observational oral health research and demonstrate its value as a tool to identify directions for future research.

9.
Am J Epidemiol; 2024 Aug 06.
Article in English | MEDLINE | ID: mdl-39108175

ABSTRACT

Studying the effect of duration of treatment on prognostic outcomes using real-world data is challenging because only people who survive for a long time can receive a treatment for a long time. Specifying a target trial helps overcome this challenge. We aimed to estimate the effect of different durations of treatment with antihypertensive drugs with anticholinergic properties (AC AHT) on the risk of vascular dementia and Alzheimer's disease by emulating a target trial using the UK CPRD GOLD database (2001-2017). Comparing treatment for 3-6 years versus ≤3 years yielded null results for both types of dementia. Comparing a longer duration of treatment, >6 years versus ≤3 years, yielded a 10-year risk ratio of 0.69 (95% CI, 0.54-0.90) for vascular dementia and 0.91 (95% CI, 0.77-1.10) for Alzheimer's disease. For illustration, we performed an analysis that failed to emulate a target trial by assigning exposure categories using post-baseline information, obtaining implausibly beneficial estimates. Our findings indicate a modest benefit of longer durations of treatment with AC AHT on vascular dementia and highlight the value of target trial emulation for avoiding selection bias when evaluating the effect of different durations of treatment.

10.
Open Forum Infect Dis; 11(8): ofae446, 2024 Aug.
Article in English | MEDLINE | ID: mdl-39183812

ABSTRACT

Background: We aimed to determine the effectiveness of switching to bictegravir in maintaining an undetectable viral load (<50 copies/mL) among people with HIV (PWH) as compared with continuing dolutegravir-, efavirenz-, or raltegravir-based antiretroviral therapy using nationwide observational data from Mexico. Methods: We emulated 3 target trials comparing switching to bictegravir vs continuing with dolutegravir, efavirenz, or raltegravir. Eligibility criteria were PWH aged ≥16 years with a viral load <50 copies/mL and at least 3 months of current antiretroviral therapy (dolutegravir, efavirenz, or raltegravir) between July 2019 and September 2021. Weekly target trials were emulated during the study period, and individuals were included in every emulation if they continued to be eligible. The main outcome was the probability of an undetectable viral load at 3 months, which was estimated via an adjusted logistic regression model. Estimated probabilities were compared via differences, and 95% CIs were calculated via bootstrap. Outcomes were also ascertained at 12 months, and sensitivity analyses were performed to test our analytic choices. Results: We analyzed data from 3 028 619 PWH (63 581 unique individuals). The probability of an undetectable viral load at 3 months was 2.9% (95% CI, 1.9%-3.8%), 1.3% (95% CI, .9%-1.6%), and 1.2% (95% CI, .8%-1.7%) higher when switching to bictegravir vs continuing with dolutegravir, efavirenz, and raltegravir, respectively. Similar results were observed at 12 months and in other sensitivity analyses. Conclusions: Our findings suggest that switching to bictegravir could be more effective in maintaining viral suppression than continuing with dolutegravir, efavirenz, or raltegravir.
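
As an illustration of the per-emulation contrast described here, the sketch below fits an adjusted logistic model for an undetectable viral load at 3 months, standardizes predicted probabilities to the eligible population under each regimen, and bootstraps the difference. The stacked person-trial DataFrame trials, its column names, and the covariates are assumptions, not the study's actual model.

import numpy as np
import statsmodels.formula.api as smf

def standardized_difference(df):
    # Adjusted outcome model, then g-computation under "switch" and "continue".
    m = smf.logit("undetectable_3m ~ switch_to_bic + age + cd4 + years_on_art", data=df).fit(disp=0)
    p_switch = m.predict(df.assign(switch_to_bic=1)).mean()
    p_stay = m.predict(df.assign(switch_to_bic=0)).mean()
    return p_switch - p_stay

est = standardized_difference(trials)

# Nonparametric bootstrap over unique individuals (person-trials from the same person
# are resampled together, mirroring the repeated weekly emulations).
rng = np.random.default_rng(0)
ids = trials["person_id"].unique()
boots = []
for _ in range(500):
    sample_ids = rng.choice(ids, size=len(ids), replace=True)
    boot_df = trials.set_index("person_id").loc[sample_ids].reset_index()
    boots.append(standardized_difference(boot_df))
lo, hi = np.percentile(boots, [2.5, 97.5])
print(f"risk difference at 3 months: {est:.3f} (95% CI {lo:.3f}, {hi:.3f})")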

11.
BMC Health Serv Res; 24(1): 943, 2024 Aug 19.
Article in English | MEDLINE | ID: mdl-39160528

ABSTRACT

BACKGROUND: Research suggests an association between COVID-19 infection and certain financial hardships in the shorter term and among single-state and privately insured samples. Whether COVID-19 is associated with financial hardship in the longer term or among socially vulnerable populations is unknown. Therefore, we examined whether COVID-19 was associated with a range of financial hardships 18 months after initial infection among a national cohort of Veterans enrolled in the Veterans Health Administration (VHA), the largest national integrated health system in the US. We additionally explored the association between Veteran characteristics and financial hardship during the pandemic, irrespective of COVID-19. METHODS: We conducted a prospective, telephone-based survey. Out of 600 Veterans with COVID-19 from October 2020 through April 2021 who were invited to participate, 194 Veterans with COVID-19 and 194 matched comparators without a history of infection participated. Financial hardship outcomes included overall health-related financial strain, two behavioral financial hardships (e.g., taking less medication than prescribed due to cost), and seven material financial hardships (e.g., using up most or all savings). Weighted generalized estimating equations were used to estimate risk ratios (RR) and 95% confidence intervals (CI) of financial hardship by COVID-19 status, and to assess the relationship between financial hardship and Veteran age, VHA copay status, and comorbidity score, irrespective of COVID-19 status. RESULTS: Among 388 respondents, 67% reported at least one type of financial hardship since March 2020, with 21% reporting behavioral hardships and 64% material hardships; 8% reported severe-to-extreme health-related financial strain. Compared with uninfected matched comparators, Veterans with a history of COVID-19 had greater risks of severe-to-extreme health-related financial strain (RR: 4.0, CI: 1.4-11.2), taking less medication due to cost (RR: 2.9, 95% CI: 1.0-8.6), and having a loved one take time off work to care for them (RR: 1.9, CI: 1.1-3.6). Irrespective of COVID-19 status, Veterans aged < 65 years had a greater risk of most financial hardships compared with Veterans aged ≥ 65 years. CONCLUSIONS: Health-related financial hardships such as taking less medication due to cost and severe-to-extreme health-related financial strain were more common among Veterans with a history of COVID-19 than among matched comparators. Strategies are needed to address health-related financial hardship after COVID-19. TRIAL REGISTRATION: NCT05394025, registered 05-27-2022.


Subject(s)
COVID-19; Financial Stress; Veterans; Humans; COVID-19/epidemiology; COVID-19/economics; United States/epidemiology; Prospective Studies; Male; Female; Veterans/statistics & numerical data; Middle Aged; Financial Stress/epidemiology; Aged; SARS-CoV-2; Adult; Pandemics/economics; United States Department of Veterans Affairs
12.
BMC Med; 22(1): 306, 2024 Jul 29.
Article in English | MEDLINE | ID: mdl-39075484

ABSTRACT

BACKGROUND: The net benefit of aspirin cessation in older adults remains uncertain. This study aimed to use observational data to emulate a randomized trial of aspirin cessation versus continuation in older adults without cardiovascular disease (CVD). METHODS: Post hoc analysis using a target trial emulation framework applied to the immediate post-trial period (2017-2021) of a study of low-dose aspirin initiation in adults aged ≥ 70 years (ASPREE; NCT01038583). Participants from Australia and the USA were included if they were free of CVD at the start of the post-trial intervention period (time zero, T0) and had been taking open-label or randomized aspirin immediately before T0. The two groups in the target trial were as follows: aspirin cessation (participants who were taking randomized aspirin immediately before T0; assumed to have stopped at T0 as instructed) versus aspirin continuation (participants on open-label aspirin at T0 regardless of their randomized treatment; assumed to have continued at T0). The outcomes after T0 were incident CVD, major adverse cardiovascular events (MACE), all-cause mortality, and major bleeding during 3, 6, and 12 months (short-term) and 48 months (long-term) follow-up. Hazard ratios (HRs) comparing aspirin cessation to continuation were estimated from propensity-score (PS) adjusted Cox proportional-hazards regression models. RESULTS: We included 6103 CVD-free participants (cessation: 5427, continuation: 676). Over both short- and long-term follow-up, aspirin cessation versus continuation was not associated with elevated risk of CVD, MACE, and all-cause mortality (HRs, at 3 and 48 months respectively, were 1.23 and 0.73 for CVD, 1.11 and 0.84 for MACE, and 0.23 and 0.79 for all-cause mortality, p > 0.05), but cessation had a reduced risk of incident major bleeding events (HRs at 3 and 48 months, 0.16 and 0.63, p < 0.05). Similar findings were seen for all outcomes at 6 and 12 months, except for a lowered risk of all-cause mortality in the cessation group at 12 months. CONCLUSIONS: Our findings suggest that deprescribing prophylactic aspirin might be safe in healthy older adults with no known CVD.
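
A compact sketch of the propensity-score-adjusted Cox comparison summarized above, for one outcome and one follow-up window; the DataFrame post, the baseline covariates, and the use of scikit-learn and lifelines are illustrative assumptions rather than the ASPREE post-trial analysis code.

from sklearn.linear_model import LogisticRegression
from lifelines import CoxPHFitter

covars = ["age", "sex", "diabetes", "smoking", "frailty_score"]  # assumed baseline covariates

# Propensity score for aspirin cessation (vs continuation) at time zero.
ps_model = LogisticRegression(max_iter=1000).fit(post[covars], post["cessation"])
post["ps"] = ps_model.predict_proba(post[covars])[:, 1]

# Cox model for incident CVD with the propensity score as an adjustment covariate;
# analogous models would be fit for MACE, mortality, and major bleeding, and over
# the 3-, 6-, 12-, and 48-month follow-up windows.
cph = CoxPHFitter()
cph.fit(post[["followup_months", "cvd_event", "cessation", "ps"]],
        duration_col="followup_months", event_col="cvd_event")
print(cph.hazard_ratios_["cessation"])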


Subject(s)
Aspirin; Cardiovascular Diseases; Humans; Aspirin/administration & dosage; Aspirin/therapeutic use; Aged; Male; Female; Cardiovascular Diseases/prevention & control; Aged, 80 and over; Platelet Aggregation Inhibitors/administration & dosage; Australia; United States; Hemorrhage/chemically induced
13.
Am J Psychiatry; 181(7): 630-638, 2024 Jul 01.
Article in English | MEDLINE | ID: mdl-38946271

ABSTRACT

OBJECTIVE: Antidepressants are commonly used to treat bipolar depression but may increase the risk of mania. The evidence from randomized controlled trials, however, is limited by short treatment durations, providing little evidence for the long-term risk of antidepressant-induced mania. The authors performed a target trial emulation to compare the risk of mania among individuals with bipolar depression treated or not treated with antidepressants over a 1-year period. METHODS: The authors emulated a target trial using observational data from nationwide Danish health registers. The study included 979 individuals with bipolar depression recently discharged from a psychiatric ward. Of these, 358 individuals received antidepressant treatment, and 621 did not. The occurrence of mania and bipolar depression over the following year was ascertained, and the intention-to-treat effect of antidepressants was analyzed by using Cox proportional hazards regression with adjustment for baseline covariates to emulate randomized open-label treatment allocation. RESULTS: The fully adjusted analyses revealed no statistically significant associations between treatment with an antidepressant and the risk of mania in the full sample (hazard rate ratio=1.08, 95% CI=0.72-1.61), in the subsample concomitantly treated with a mood-stabilizing agent (hazard rate ratio=1.16, 95% CI=0.63-2.13), and in the subsample not treated with a mood-stabilizing agent (hazard rate ratio=1.16, 95% CI=0.65-2.07). Secondary analyses revealed no statistically significant association between treatment with an antidepressant and bipolar depression recurrence. CONCLUSIONS: These findings suggest that the risk of antidepressant-induced mania is negligible and call for further studies to optimize treatment strategies for individuals with bipolar depression.


Subject(s)
Antidepressive Agents; Bipolar Disorder; Mania; Humans; Bipolar Disorder/drug therapy; Antidepressive Agents/adverse effects; Antidepressive Agents/therapeutic use; Male; Female; Denmark/epidemiology; Adult; Mania/chemically induced; Middle Aged; Registries; Proportional Hazards Models
14.
Cancers (Basel); 16(14), 2024 Jul 11.
Article in English | MEDLINE | ID: mdl-39061155

ABSTRACT

This manuscript examines the synergistic potential of prospective real-world/time data/evidence (RWTD/E) and randomized controlled trials (RCTs) to enrich healthcare research and operational insights, with a particular focus on its impact within the sarcoma field. Through exploring RWTD/E's capability to provide real-world/time, granular patient data, it offers an enriched perspective on healthcare outcomes and delivery, notably in the complex arena of sarcoma care. Highlighting the complementarity between RWTD/E's expansive real-world/time scope and the structured environment of RCTs, this paper showcases their combined strength, which can help to foster advancements in personalized medicine and population health management, exemplified through the lens of sarcoma treatment. The manuscript further outlines methodological innovations such as target trial emulation and their significance in enhancing the precision and applicability of RWTD/E, underscoring the transformative potential of these advancements in sarcoma care and beyond. By advocating for the strategic incorporation of prospective RWTD/E into healthcare frameworks, it aims to create an evidence-driven ecosystem that significantly improves patient outcomes and healthcare efficiency, with sarcoma care serving as a pivotal domain for these developments.

15.
Eur Heart J Open; 4(4): oeae054, 2024 Jul.
Article in English | MEDLINE | ID: mdl-39011092

ABSTRACT

Aims: In patients with advanced heart failure requiring dobutamine infusion, it is usually recommended to initiate beta-blockers after weaning from dobutamine. However, beta-blockers are sometimes initiated under dobutamine infusion in a real-world scenario. The association between such early beta-blocker initiation and clinical outcomes is unknown. Therefore, this study investigates the association between initiating beta-blockers under dobutamine infusion and survival outcomes. Methods and results: This observational study with a multicentre inpatient-care database emulated a pragmatic randomized controlled trial (RCT) of the beta-blocker initiation strategy. First, 1151 patients on dobutamine and not on beta-blockers on the day of heart failure admission (Day 0) were identified. Among 1095 who met eligibility criteria, patients in whom beta-blockers were eventually initiated under dobutamine infusion by Day 7 (early initiation strategy) were 1:1 matched to those in whom they were not (conservative strategy). The methods of cloning, censoring, and weighting were applied to emulate the target trial. Patients were followed up for up to 30 days. The primary outcome was all-cause death. Among 780 matched patients (median age, 81 years), the adjusted hazard ratio was 1.11 (95% confidence interval 0.75-1.64, P = 0.59) for the early initiation strategy. The estimated 30-day all-cause mortalities in the early initiation strategy and the conservative strategy were 19.3% (10.6-30.7) and 16.2% (9.2-25.3), respectively. The results were consistent when we used different days (i.e. Day 5 and Day 9) instead of Day 7 to determine the strategies. Conclusion: The present observational study emulating a pragmatic RCT found no positive or negative association between beta-blocker initiation under dobutamine infusion and overall survival.

16.
Sichuan Da Xue Xue Bao Yi Xue Ban; 55(3): 653-661, 2024 May 20.
Article in Chinese | MEDLINE | ID: mdl-38948274

ABSTRACT

Objective: Non-alcoholic fatty liver disease (NAFLD) and alcohol-associated fatty liver disease (ALD) are the most common chronic liver diseases. Hepatic steatosis is an early histological subtype of both NAFLD and ALD. Excessive alcohol consumption is widely known to lead to hepatic steatosis and subsequent liver damage. However, reported findings concerning the association between moderate alcohol consumption and hepatic steatosis remain inconsistent. Notably, alcohol consumption as a modifiable lifestyle behavior is likely to change over time, but most previous studies measured alcohol intake only once at baseline. These inconsistent findings from existing studies do not inform decision-making concerning policies and clinical guidelines, which are of greater interest to health policymakers and clinician-scientists. Additionally, recommendations on the types of alcoholic beverages are not available. Assessing the effects of two or more hypothetical alcohol consumption interventions on hepatic steatosis can answer questions about the population risk of hepatic steatosis if everyone changed from heavy drinking to abstinence, if everyone kept drinking moderately, or if all drinkers switched from red wine to beer. Thus, we simulated a target trial to estimate the effects of several hypothetical interventions, including changes in the amount of alcohol consumption or the types of alcoholic beverages consumed, on hepatic steatosis using longitudinal data, to inform decisions about alcohol-related policymaking and clinical care. Methods: This longitudinal study included 12,687 participants from the UK Biobank (UKB), all of whom participated in both baseline and repeat surveys. We excluded participants with missing data related to components of alcohol consumption and fatty liver index (FLI) in the baseline and the repeat surveys, as well as those who had reported liver diseases or cancer at the baseline survey. We used FLI as an outcome indicator and divided the participants into non-, moderate, and heavy drinkers. The surrogate marker FLI has been endorsed by many international organizations' guidelines, such as those of the European Association for the Study of the Liver. The calculation of FLI was based on laboratory and anthropometric data, including triglyceride, gamma-glutamyl transferase, body mass index, and waist circumference. Participants responded to questions about the types of alcoholic beverages, which were defined in 5 categories: red wine, white wine/fortified wine/champagne, beer or cider, spirits, and mixed liqueurs, along with the average weekly or monthly amounts of alcohol consumed. Alcohol consumption was defined as pure alcohol consumed per week and was calculated according to the amount of alcoholic beverages consumed per week and the average ethanol content by volume in each alcoholic beverage. Participants were categorized as non-drinkers, moderate drinkers, and heavy drinkers according to the amount of their alcohol consumption. Moderate drinking was defined as consuming no more than 210 g of alcohol per week for men and 140 g of alcohol per week for women. We defined the following hypothetical interventions for the amount of alcohol consumed: sustaining a certain level of alcohol consumption from baseline to the repeat survey (e.g., none to none, moderate to moderate, heavy to heavy) and changing from one alcohol consumption level to another (e.g., none to moderate, moderate to heavy). The hypothetical interventions for the types of alcoholic beverages were defined in a similar way to those for the amount of alcohol consumed (e.g., red wine to red wine, red wine to beer/cider). We applied the parametric g-formula to estimate the effect of each hypothetical alcohol consumption intervention on the FLI. To implement the parametric g-formula, we first modeled the probability of time-varying confounders and FLI conditional on covariates. We then used these conditional probabilities to estimate the FLI value for each participant under a specific hypothetical intervention on alcohol consumption level. Confidence intervals were obtained from 200 bootstrap samples. Results: For alcohol consumption from baseline to the repeat surveys, 6.65% of the participants were sustained non-drinkers, 63.68% were sustained moderate drinkers, and 14.74% were sustained heavy drinkers, while 8.39% changed from heavy drinking to moderate drinking. Regarding the types of alcoholic beverages from baseline to the repeat surveys, 27.06% of the drinkers sustained their intake of red wine. Regardless of the baseline alcohol consumption level, hypothetical interventions that increased alcohol consumption above baseline were associated with a higher FLI than sustaining the baseline level. When comparing sustained non-drinking with the hypothetical intervention of changing from non-drinking to moderate drinking, the mean ratio of FLI was 1.027 (95% confidence interval [CI]: 0.997-1.057). When comparing sustained non-drinking with the hypothetical intervention of changing from non-drinking to heavy drinking, the mean ratio of FLI was 1.075 (95% CI: 1.042-1.108). When comparing sustained heavy drinking with the hypothetical intervention of changing from heavy drinking to moderate drinking, the mean ratio of FLI was 0.953 (95% CI: 0.938-0.968). The hypothetical intervention of changing to red wine in the UKB was associated with lower FLI levels, compared with sustained consumption of other types of alcoholic beverages. For example, when comparing sustained spirits consumption with the hypothetical intervention of changing from spirits to red wine, the mean ratio of FLI was 0.981 (95% CI: 0.948-1.014). Conclusions: Regardless of the current level of alcohol consumption, interventions that increase alcohol consumption could raise the risk of hepatic steatosis in Western populations. The findings of this study could inform the formulation of future practice guidelines and health policies. If quitting drinking is challenging, red wine may be a better option than other types of alcoholic beverages in Western populations.
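
A highly simplified sketch of the parametric g-formula step described in this abstract: model follow-up FLI given baseline covariates and the alcohol-consumption categories, predict FLI for every participant under two hypothetical interventions, and take the ratio of means with a bootstrap confidence interval. The single outcome model, the variable names, and the DataFrame ukb are assumptions; the actual analysis also models time-varying confounders.

import numpy as np
import statsmodels.formula.api as smf

def mean_fli_under(df, model, baseline_level, followup_level):
    # Predict FLI with everyone's consumption set to the hypothetical intervention.
    counterfactual = df.assign(alcohol_baseline=baseline_level, alcohol_followup=followup_level)
    return model.predict(counterfactual).mean()

def fli_ratio(df):
    m = smf.ols("fli_followup ~ C(alcohol_baseline) + C(alcohol_followup) + age + sex + bmi",
                data=df).fit()
    # e.g., sustained heavy drinking vs changing from heavy to moderate drinking.
    sustained = mean_fli_under(df, m, "heavy", "heavy")
    changed = mean_fli_under(df, m, "heavy", "moderate")
    return changed / sustained

est = fli_ratio(ukb)
boots = [fli_ratio(ukb.sample(frac=1, replace=True)) for _ in range(200)]
lo, hi = np.percentile(boots, [2.5, 97.5])
print(f"mean FLI ratio: {est:.3f} (95% CI {lo:.3f}, {hi:.3f})")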


Subject(s)
Alcohol Drinking; Non-alcoholic Fatty Liver Disease; Humans; Alcohol Drinking/adverse effects; Alcohol Drinking/epidemiology; Longitudinal Studies; Non-alcoholic Fatty Liver Disease/etiology; Non-alcoholic Fatty Liver Disease/epidemiology; Male; Female; Alcoholic Beverages/adverse effects; Fatty Liver, Alcoholic/etiology; Middle Aged; Fatty Liver/etiology; Cohort Studies
17.
Am J Epidemiol; 2024 Jul 02.
Article in English | MEDLINE | ID: mdl-38957996

ABSTRACT

Non-benzodiazepine hypnotics ("Z-drugs") are prescribed for insomnia, but might increase risk of motor vehicle crash (MVC) among older adults through prolonged drowsiness and delayed reaction times. We estimated the effect of initiating Z-drug treatment on the 12-week risk of MVC in a sequential target trial emulation. After linking New Jersey driver licensing and police-reported MVC data to Medicare claims, we emulated a new target trial each week (July 1, 2007 - October 7, 2017) in which Medicare fee-for-service beneficiaries were classified as Z-drug-treated or untreated at baseline and followed for an MVC. We used inverse probability of treatment and censoring weighted pooled logistic regression models to estimate risk ratios (RR) and risk differences with 95% bootstrap confidence limits (CLs). There were 257,554 person-trials, of which 103,371 were Z-drug-treated and 154,183 untreated, giving rise to 976 and 1,249 MVCs, respectively. The intention-to-treat RR was 1.06 (95% CLs 0.95, 1.16). For the per-protocol estimand, there were 800 MVCs and 1,241 MVCs among treated and untreated person-trials, respectively, suggesting a reduced MVC risk (RR 0.83 [95% CLs 0.74, 0.92]) with sustained Z-drug treatment. Z-drugs should be prescribed to older patients judiciously but not withheld entirely over concerns about MVC risk.
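
A bare-bones sketch of turning the weighted pooled logistic regression described here into 12-week risks, risk ratios, and risk differences; the person-week DataFrame pw (with combined treatment and censoring weights in column w), the hazard model specification, and the column names are simplifying assumptions.

import numpy as np
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Weighted discrete-time hazard model over person-weeks 1..12 of each person-trial;
# confounding is handled by the weights, so the model includes only treatment and time.
hazard = smf.glm("mvc ~ zdrug + week + I(week**2) + zdrug:week",
                 data=pw, family=sm.families.Binomial(),
                 var_weights=pw["w"]).fit()

def risk_at_12_weeks(treated):
    # One row per person-trial, carried through weeks 1..12 under the assigned treatment.
    weeks = pw.drop_duplicates("person_trial_id").assign(zdrug=treated)
    surv = np.ones(len(weeks))
    for t in range(1, 13):
        h = hazard.predict(weeks.assign(week=t))
        surv = surv * (1 - h)
    return float(np.mean(1 - surv))

r1, r0 = risk_at_12_weeks(1), risk_at_12_weeks(0)
print("12-week risk ratio:", r1 / r0, "risk difference:", r1 - r0)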

18.
Am J Epidemiol; 2024 Jul 04.
Article in English | MEDLINE | ID: mdl-38965743

ABSTRACT

Women and other people of childbearing potential living with HIV (WLHIV) have a higher risk of adverse birth outcomes than those without HIV (WWHIV). A higher risk of anemia in WLHIV could partially explain this disparity. Using a birth outcomes surveillance study in Botswana, we emulated target trials corresponding to currently available or feasible interventions on anemia. The first target trial evaluated two interventions: initiate multiple micronutrient supplementation (MMS), and MMS or iron and folic acid supplementation by 24 weeks gestation. The remaining target trials evaluated the interventions: eliminate anemia before pregnancy; and jointly eliminate anemia before pregnancy and initiate MMS. We estimated the observed disparity in adverse birth outcomes between WLHIV and WWHIV and compared the observed disparity measure (ODM) to the counterfactual disparity measure (CDM) under each intervention. Of 137,499 individuals (22% WLHIV), the observed risk of any adverse birth outcome was 26.0% in WWHIV and 34.5% in WLHIV (ODM, 8.5% [95% CI, 7.9-9.1%]). CDMs (95% CIs) ranged from 6.6% (4.8-8.4%) for the intervention to eliminate anemia and initiate MMS to 8.4% (7.7-9.1%) for the intervention to eliminate anemia only. Preventing anemia and expanding MMS may reduce HIV disparities in birth outcomes, but interventions with greater impact should be identified.
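
In generic counterfactual notation (a sketch, not the authors' exact estimands), the observed and counterfactual disparity measures contrasted in this abstract can be written as follows, with Y an adverse birth outcome, H HIV status, and Y^g the potential outcome under hypothetical intervention g (e.g., initiating MMS by 24 weeks' gestation, or eliminating anemia before pregnancy):

\[
\mathrm{ODM} = \Pr(Y = 1 \mid H = 1) - \Pr(Y = 1 \mid H = 0), \qquad
\mathrm{CDM}_{g} = \Pr(Y^{g} = 1 \mid H = 1) - \Pr(Y^{g} = 1 \mid H = 0)
\]

Under this formulation, a CDM below the observed disparity of 8.5% indicates that intervention g would close part of the gap between WLHIV and WWHIV.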

19.
Am J Epidemiol; 2024 Jul 05.
Article in English | MEDLINE | ID: mdl-38973750

ABSTRACT

The 2018 World Cancer Research Fund/American Institute for Cancer Research recommends sustained strategies of physical activity and diet for cancer prevention, but evidence for long-term prostate cancer risk is limited. Using observational data from 27,859 men in the Health Professionals Follow-up Study, we emulated a target trial of recommendation-based physical activity and dietary strategies and 26-year risks of prostate cancer, adjusting for risk factors via the parametric g-formula. Compared with no intervention, limiting sugar-sweetened beverages showed a 0.4% (0.0-0.9%) lower risk of lethal (metastatic or fatal) disease and 0.5% (0.1-0.9%) lower risk of fatal disease. Restricting consumption of processed foods showed a 0.4-0.9% higher risk of all prostate cancer outcomes. Estimated risk differences for clinically significant disease were close to null for strategies involving fruits and non-starchy vegetables, whole grains and legumes, red meat, and processed meat, as well as under a joint strategy of physical activity and diet. Compared with a "low adherence" strategy, maintaining recommended physical activity levels showed a 0.4% (0.1-0.8%) lower risk of lethal and 0.5% (0.2-0.8%) lower risk of fatal disease. Adhering to specific components of current physical activity and dietary recommendations may help to prevent lethal and fatal prostate cancer over 26 years.

20.
Am J Epidemiol; 2024 Jul 19.
Article in English | MEDLINE | ID: mdl-39030720

ABSTRACT

There is mounting interest in the possibility that metformin, indicated for glycemic control in type 2 diabetes, has a range of additional beneficial effects. Randomized trials have shown that metformin prevents adverse cardiovascular events, and metformin use has also been associated with reduced cognitive decline and cancer incidence. In this paper, we dig more deeply into whether metformin prevents cancer by emulating target randomized trials comparing metformin to sulfonylureas as first-line diabetes therapy using data from the Clinical Practice Research Datalink, a U.K. primary care database (1987-2018). We included individuals with diabetes, no prior cancer diagnosis, no chronic kidney disease, and no prior diabetes therapy who initiated metformin (N=93,353) or a sulfonylurea (N=13,864). In our cohort, the estimated overlap-weighted additive separable direct effect of metformin compared to sulfonylureas on cancer risk at 6 years was -1% (95% CI: -2.2%, 0.1%), which is consistent with metformin providing anywhere from no direct protection against cancer incidence to substantial protection. The analysis faced two methodological challenges: poor overlap, and pre-cancer death as a competing risk. To address these issues while minimizing nuisance model misspecification, we develop and apply double/debiased machine learning estimators of overlap-weighted separable effects in addition to more traditional effect estimates.
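
As a rough sketch of the overlap-weighting idea mentioned here (leaving out the competing-risk handling and the double/debiased machine learning augmentation, which are the harder parts of the paper), the snippet below cross-fits a propensity-score model and forms the simple overlap-weighted contrast of 6-year cancer risk; the DataFrame cprd, its column names, and the gradient-boosting learner are placeholders.

import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import KFold

X = cprd[["age", "sex", "hba1c", "bmi", "smoking", "year"]].to_numpy()
a = cprd["metformin"].to_numpy()     # 1 = metformin, 0 = sulfonylurea
y = cprd["cancer_by_6y"].to_numpy()  # simplistic binary outcome; the paper instead
                                     # treats pre-cancer death as a competing risk

# Cross-fitted propensity scores to limit overfitting bias from the flexible learner.
ps = np.zeros(len(a))
for train, test in KFold(n_splits=5, shuffle=True, random_state=0).split(X):
    model = GradientBoostingClassifier().fit(X[train], a[train])
    ps[test] = model.predict_proba(X[test])[:, 1]

# Overlap weights: treated get 1 - ps, untreated get ps, which downweights
# regions of poor overlap between the two treatment groups.
w = np.where(a == 1, 1 - ps, ps)
risk_metformin = np.average(y[a == 1], weights=w[a == 1])
risk_sulfonylurea = np.average(y[a == 0], weights=w[a == 0])
print("overlap-weighted 6-year risk difference:", risk_metformin - risk_sulfonylurea)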
