Results 1 - 5 of 5
1.
Perm J; 27(3): 79-91, 2023 Sep 15.
Article in English | MEDLINE | ID: mdl-37545198

ABSTRACT

Background: Since 2015, the Veterans Health Administration (VHA) Diffusion of Excellence Program has supported the spread of practices developed by frontline employees. Shark Tank-style competitions encourage "Sharks" nationwide (VHA medical center/regional directors) to bid for the opportunity to implement practices at their institutions. Methods: The authors evaluated bidding strategies (2016-2020), developing the "QuickView" practice comparator to promote informed bidding. Program leaders distributed QuickView and revised versions in subsequent competitions. Our team used in-person observation, online chats after the competition, bidder interviews, and bid analysis to evaluate QuickView use. Bids were ranked based on demonstrated understanding of the resources required for practice implementation. Results: Sharks stated that QuickView supported preparation before the competition and suggested improvements. Our revised tool reported necessary staff time and incorporated a "WishList" from practice finalists detailing minimum requirements for successful implementation. Bids from later years reflected increased review of facilities' current states before the competition and increased understanding of the resources needed for implementation. The percentage of bids describing local need for the practice rose from 2016 to 2020: 4.7% (6/127), 62.1% (54/87), 78.3% (36/46), 80.6% (29/36), and 89.7% (26/29). The percentage of bids committing specific resources rose following QuickView introduction: 81.1% (103/127) in 2016, 69.0% (60/87) in 2017, 73.9% (34/46) in 2018, 88.9% (32/36) in 2019, and 89.7% (26/29) in 2020. Discussion: In the years following QuickView/WishList implementation, bids reflected increased assessment before the competition of both local needs and available resources. Conclusion: Selection of a new practice for implementation requires an understanding of local need, necessary resources, and fit. QuickView and WishList appear to support these determinations.


Subject(s)
Organizational Innovation, Veterans Health Services
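
For readers who want to recompute the bid-trend percentages reported in the abstract above, a minimal Python sketch using the counts given there (the variable names are the editor's, not the authors'):

```python
# Counts of bids describing local need, taken from the abstract: (numerator, denominator).
local_need_bids = {
    2016: (6, 127),
    2017: (54, 87),
    2018: (36, 46),
    2019: (29, 36),
    2020: (26, 29),
}

for year, (described_need, total_bids) in local_need_bids.items():
    pct = 100 * described_need / total_bids
    print(f"{year}: {pct:.1f}% ({described_need}/{total_bids}) of bids described local need")
```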
2.
Implement Sci Commun; 4(1): 6, 2023 Jan 16.
Article in English | MEDLINE | ID: mdl-36647162

ABSTRACT

BACKGROUND: There are challenges associated with measuring sustainment of evidence-informed practices (EIPs). First, the terms sustainability and sustainment are often conflated even though they are distinct: sustainability assesses the likelihood of an EIP being in use in the future, while sustainment assesses the extent to which an EIP is (or is not) in use. Second, grant funding often ends before sustainment can be assessed. The Veterans Health Administration (VHA) Diffusion of Excellence (DoE) program is one of few large-scale models of diffusion; it seeks to identify and disseminate practices across the VHA system. The DoE sponsors "Shark Tank" competitions, in which leaders bid on the opportunity to implement a practice with approximately 6 months of implementation support. As part of an ongoing evaluation of the DoE, we sought to develop and pilot a pragmatic survey tool to assess sustainment of DoE practices. METHODS: In June 2020, surveys were sent to 64 facilities that were part of the DoE evaluation. We began analysis by comparing the alignment of quantitative and qualitative responses; some facility representatives reported in the open-text box of the survey that their practice was on a temporary hold due to COVID-19 but answered the primary outcome question differently. As a result, the team reclassified the primary outcome of these facilities to Sustained: Temporary COVID-Hold. Following this reclassification, the number and percentage of facilities in each category were calculated. We used directed content analysis, guided by the Consolidated Framework for Implementation Research (CFIR), to analyze open-text responses. RESULTS: Representatives from forty-one facilities (64%) completed the survey. Among responding facilities, 29/41 sustained their practice, 1/41 partially sustained their practice, 8/41 had not sustained their practice, and 3/41 had never implemented their practice. Sustainment rates increased across Cohorts 1-4. CONCLUSIONS: The initial development and piloting of our pragmatic survey allowed us to assess sustainment of DoE practices. Planned updates to the survey will enable flexibility in assessing sustainment and its determinants at any phase after adoption. This assessment approach can flex with the longitudinal and dynamic nature of sustainment, including capturing nuances in outcomes when practices are on a temporary hold. If additional piloting shows the survey is useful, we plan to assess the reliability and validity of this measure for broader use in the field.
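
The reclassification step described in the Methods can be read as a simple decision rule. A hypothetical Python sketch (the function name, category labels, and keyword matching are illustrative assumptions, not the evaluation team's actual code):

```python
# Illustrative sketch: reclassify survey responses whose open-text comments
# indicate a temporary COVID-19 hold, then tally the final categories.
from collections import Counter

def classify_sustainment(primary_outcome: str, open_text: str) -> str:
    """Return the final sustainment category for one facility's survey response."""
    text = open_text.lower()
    covid_hold = "covid" in text and ("hold" in text or "paused" in text)
    if covid_hold and primary_outcome in {"Not sustained", "Partially sustained"}:
        return "Sustained: Temporary COVID-Hold"
    return primary_outcome

responses = [
    ("Sustained", ""),
    ("Not sustained", "Practice is on hold due to COVID-19 staffing shifts."),
    ("Never implemented", ""),
]
counts = Counter(classify_sustainment(outcome, text) for outcome, text in responses)
print(counts)
```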

3.
Front Health Serv; 3: 1223277, 2023.
Article in English | MEDLINE | ID: mdl-38420338

ABSTRACT

Introduction: The Veterans Health Administration (VHA) Diffusion of Excellence (DoE) program provides a system to identify, replicate, and spread promising practices across the largest integrated healthcare system in the United States. DoE identifies innovations that have been successfully implemented in the VHA through a Shark Tank-style competition. VHA facility and regional directors bid the resources needed to replicate promising practices. Winning facilities/regions receive external facilitation to aid in replication/implementation over the course of a year. DoE staff then support diffusion of successful practices across the nationwide VHA. Methods: Organized around the Reach, Effectiveness, Adoption, Implementation, and Maintenance (RE-AIM) framework, we summarize results of an ongoing long-term mixed-methods implementation evaluation of DoE. Data sources include: Shark Tank application and bid details, tracking of practice adoptions through a Diffusion Marketplace, characteristics of VHA facilities, focus groups with Shark Tank bidders, structured observations of DoE events, surveys of DoE program participants, and semi-structured interviews of national VHA program office leaders, VHA healthcare system/facility executives, practice developers, implementation teams, and facilitators. Results: In the first eight Shark Tanks (2016-2022), 3,280 Shark Tank applications were submitted; 88 were designated DoE Promising Practices (i.e., practices receiving facilitated replication). DoE has effectively spread practices across the VHA, with 1,440 documented instances of adoption/replication, including 180 in facilities located in rural areas. Leadership decisions to adopt innovations are often based on big-picture considerations such as constituency support and linkage to organizational goals. The DoE Promising Practices with the greatest national spread were successfully replicated at new sites during the facilitated replication process, have close partnerships with VHA national program offices, and tend to be less expensive to implement. Two indicators point to sustainment: 56 of the 88 Promising Practices are still being diffused across the VHA, and 56% of facilities originally replicating the practices have sustained them, even up to 6 years after the first Shark Tank. Conclusion: DoE has developed a sustainable process for the identification, replication, and spread of promising practices as part of a learning health system committed to providing equitable access to high-quality care.
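
To make the spread figures reported above easier to reuse, a minimal sketch that groups them under RE-AIM dimensions; the grouping and variable names are the editor's assumptions, not the authors' analysis code:

```python
# Illustrative sketch: spread metrics from the abstract, organized by RE-AIM dimension.
reaim_summary = {
    "Reach": {"shark_tank_applications": 3280, "promising_practices": 88},
    "Adoption": {"documented_adoptions": 1440, "rural_adoptions": 180},
    "Maintenance": {"practices_still_diffusing": 56, "facility_sustainment_rate": 0.56},
}

still_diffusing = reaim_summary["Maintenance"]["practices_still_diffusing"]
total_practices = reaim_summary["Reach"]["promising_practices"]
print(f"{still_diffusing}/{total_practices} Promising Practices "
      f"({100 * still_diffusing / total_practices:.0f}%) still being diffused")
```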

4.
Implement Sci; 17(1): 7, 2022 Jan 22.
Article in English | MEDLINE | ID: mdl-35065675

ABSTRACT

BACKGROUND: The challenges of implementing evidence-based innovations (EBIs) are widely recognized among practitioners and researchers. Context, broadly defined as everything outside the EBI, includes the dynamic and diverse array of forces working for or against implementation efforts. The Consolidated Framework for Implementation Research (CFIR) is one of the most widely used frameworks for guiding assessment of contextual determinants of implementation. The original 2009 article invited critique in recognition of the need for the framework to evolve. As implementation science has matured, gaps in the CFIR have been identified and updates are needed. Our team is developing the CFIR 2.0 based on a literature review and a follow-up survey of authors. We propose an Outcomes Addendum to the CFIR to address recommendations from these sources that outcomes be included in the framework. MAIN TEXT: We conducted a literature review and surveyed corresponding authors of included articles to identify recommendations for the CFIR. Both sources recommended adding implementation and innovation outcomes. Based on these recommendations, we make conceptual distinctions between (1) anticipated implementation outcomes and actual implementation outcomes, (2) implementation outcomes and innovation outcomes, and (3) CFIR-based implementation determinants and innovation determinants. CONCLUSION: An Outcomes Addendum to the CFIR is proposed. Our goal is to offer clear conceptual distinctions between types of outcomes for use with the CFIR, and perhaps other determinant frameworks as well. These distinctions can help bring clarity as researchers consider which outcomes are most appropriate to evaluate in their research. We hope that sharing this in advance will generate feedback and debate about the merits of our proposed addendum.


Subject(s)
Implementation Science, Motivation, Humans
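
The three conceptual distinctions described in the abstract could be captured as simple types in an evaluation dataset. A hypothetical Python sketch (the class and field names are illustrative, not part of the proposed addendum):

```python
# Illustrative sketch: encoding the addendum's distinctions between outcome types
# and between anticipated vs. actual outcomes.
from dataclasses import dataclass
from enum import Enum

class OutcomeType(Enum):
    IMPLEMENTATION = "implementation"   # how well the EBI was implemented
    INNOVATION = "innovation"           # effects attributable to the EBI itself

@dataclass
class Outcome:
    outcome_type: OutcomeType
    anticipated: bool   # True = anticipated (pre-implementation), False = actual (observed)
    description: str

example = Outcome(OutcomeType.IMPLEMENTATION, anticipated=False,
                  description="Practice adopted at 80% of targeted clinics")
print(example)
```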
5.
Implement Sci; 16(1): 67, 2021 Jul 2.
Article in English | MEDLINE | ID: mdl-34215286

ABSTRACT

BACKGROUND: Qualitative approaches, alone or in mixed methods, are prominent within implementation science. However, traditional qualitative approaches are resource intensive, which has led to the development of rapid qualitative approaches. Published rapid approaches are often inductive in nature and rely on transcripts of interviews. We describe a deductive rapid analysis approach using the Consolidated Framework for Implementation Research (CFIR) that relies on notes and audio recordings. This paper compares our rapid versus traditional deductive CFIR approach. METHODS: Semi-structured interviews were conducted for two cohorts of the Veterans Health Administration (VHA) Diffusion of Excellence (DoE). The CFIR guided data collection and analysis. In cohort A, we used our traditional CFIR-based deductive analysis approach (directed content analysis), in which two analysts completed independent in-depth manual coding of interview transcripts using qualitative software. In cohort B, we used our new rapid CFIR-based deductive analysis approach (directed content analysis), in which the primary analyst wrote detailed notes during interviews and immediately "coded" the notes into an MS Excel CFIR construct-by-facility matrix; a secondary analyst then listened to the audio recordings and edited the matrix. We tracked time for our traditional and rapid deductive CFIR approaches using a spreadsheet and captured transcription costs from invoices. We retrospectively compared our approaches in terms of effectiveness and rigor. RESULTS: Cohorts A and B were similar in the amount of data collected. However, our rapid deductive CFIR approach required 409.5 analyst hours, compared with 683 hours for the traditional deductive CFIR approach. The rapid deductive approach also eliminated $7,250 in transcription costs. The facility-level analysis phase provided the greatest savings: 14 hours/facility for the traditional analysis versus 3.92 hours/facility for the rapid analysis. Data interpretation required the same number of hours for both approaches. CONCLUSION: Our rapid deductive CFIR approach was less time intensive and eliminated transcription costs, yet was effective in meeting evaluation objectives and establishing rigor. Researchers should consider the following when employing our approach: (1) team expertise in the CFIR and qualitative methods, (2) the level of detail needed to meet project aims, (3) the mode of data to analyze, and (4) the advantages and disadvantages of using the CFIR.


Subject(s)
Implementation Science, Humans, Qualitative Research, Retrospective Studies
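
A minimal sketch of what a CFIR construct-by-facility matrix like the one described in the Methods might look like if built in Python rather than MS Excel; the facility names, the constructs shown, and the example note are illustrative, not the authors' data:

```python
# Illustrative sketch: a construct-by-facility matrix whose cells hold analyst notes.
import pandas as pd

facilities = ["Facility A", "Facility B", "Facility C"]
constructs = [
    "Inner Setting: Leadership Engagement",
    "Outer Setting: External Policies & Incentives",
    "Characteristics of Individuals: Self-efficacy",
]

# Analysts paste interview notes (or summaries from the audio review) into each cell.
matrix = pd.DataFrame("", index=constructs, columns=facilities)
matrix.loc["Inner Setting: Leadership Engagement", "Facility A"] = (
    "Chief of staff championed the practice; weekly check-ins reported."
)
print(matrix)
```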