Results 1 - 3 of 3
1.
Acad Pediatr ; 24(1): 147-154, 2024.
Article in English | MEDLINE | ID: mdl-37245666

ABSTRACT

OBJECTIVE: The COVID-19 pandemic resulted in training programs restructuring their curricula. Fellowship programs are required to monitor each fellow's training progress through a combination of formal evaluations, competency tracking, and measures of knowledge acquisition. The American Board of Pediatrics administers subspecialty in-training examinations (SITE) to pediatric fellowship trainees annually and board certification exams at the completion of fellowship. The objective of this study was to compare SITE scores and certification exam passing rates before and during the pandemic. METHODS: In this retrospective observational study, we collected summative data on SITE scores and certification exam passing rates for all pediatric subspecialties from 2018 to 2022. Trends over time were assessed using analysis of variance (ANOVA) to test for trends across years within one group and t tests to compare groups before and during the pandemic. RESULTS: Data were obtained from 14 pediatric subspecialties. Comparing prepandemic to pandemic scores, Infectious Diseases, Cardiology, and Critical Care Medicine saw statistically significant decreases in SITE scores. Conversely, Child Abuse and Emergency Medicine saw increases in SITE scores. Emergency Medicine saw a statistically significant increase in certification exam passing rates, while Gastroenterology and Pulmonology saw decreases in exam passing rates. CONCLUSIONS: The COVID-19 pandemic resulted in the restructuring of didactics and clinical care based on the needs of the hospital. There were also societal changes affecting patients and trainees. Subspecialty programs with declining scores and certification exam passing rates may need to assess their educational and clinical programs and adapt them to trainees' learning needs.
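The group comparison described in the methods (prepandemic versus pandemic scores) rests on a two-sample t test. As a minimal sketch in pure Python, using hypothetical standardized scores rather than the study's data, a Welch's t test (which does not assume equal variances) could look like:

```python
import math
from statistics import mean, variance

def welch_t(a, b):
    """Welch's two-sample t statistic and approximate degrees of freedom."""
    m1, m2 = mean(a), mean(b)
    v1, v2 = variance(a), variance(b)   # sample variances
    n1, n2 = len(a), len(b)
    se = math.sqrt(v1 / n1 + v2 / n2)   # standard error of the mean difference
    t = (m1 - m2) / se
    # Welch-Satterthwaite approximation for degrees of freedom
    df = (v1 / n1 + v2 / n2) ** 2 / (
        (v1 / n1) ** 2 / (n1 - 1) + (v2 / n2) ** 2 / (n2 - 1)
    )
    return t, df

# Hypothetical standardized SITE scores (illustrative only, not the study's data)
prepandemic = [210, 205, 198, 220, 215, 208]
pandemic = [195, 200, 190, 205, 198, 192]
t, df = welch_t(prepandemic, pandemic)  # t > 0 means prepandemic mean is higher
```

The resulting t statistic would then be compared against a t distribution with `df` degrees of freedom to obtain a p-value; in practice a library routine such as `scipy.stats.ttest_ind` would handle that step.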


Subject(s)
COVID-19, Pandemics, Humans, United States, Child, Educational Measurement, Certification, Education, Medical, Graduate/methods
2.
BMC Med Educ ; 23(1): 286, 2023 Apr 27.
Article in English | MEDLINE | ID: mdl-37106417

ABSTRACT

BACKGROUND: The American Board of Anesthesiology piloted 3-option multiple-choice items (MCIs) for its 2020 administration of the 150-item subspecialty in-training examinations for Critical Care Medicine (ITE-CCM) and Pediatric Anesthesiology (ITE-PA). The 3-option MCIs were transformed from their 4-option counterparts, which were administered in 2019, by removing the least effective distractor. The purpose of this study was to compare physician performance, response time, and item and exam characteristics between the 4-option and 3-option exams. METHODS: An independent-samples t-test was used to examine differences in physician percent-correct scores; a paired t-test was used to examine differences in response time and item characteristics. The Kuder-Richardson Formula 20 (KR-20) was used to calculate the reliability of each exam form. Both the traditional method (a distractor selected by fewer than 5% of examinees and/or showing a positive correlation with total score) and the sliding scale method (adjusting the frequency threshold for distractor selection by item difficulty) were used to identify non-functioning distractors (NFDs). RESULTS: Physicians who took the 3-option ITE-CCM (mean = 67.7%) scored 2.1 percentage points higher than those who took the 4-option ITE-CCM (65.7%). Accordingly, 3-option ITE-CCM items were significantly easier than their 4-option counterparts. No such differences were found between the 4-option and 3-option ITE-PAs (71.8% versus 71.7%). Item discrimination (4-option ITE-CCM [an average of 0.13], 3-option ITE-CCM [0.12]; 4-option ITE-PA [0.08], 3-option ITE-PA [0.09]) and exam reliability (0.75 and 0.74 for the 4- and 3-option ITE-CCMs, respectively; 0.62 and 0.67 for the 4- and 3-option ITE-PAs, respectively) were similar between the two formats for both ITEs. On average, physicians spent 3.4 seconds (55.5 versus 58.9) and 1.3 seconds (46.2 versus 47.5) less per item on 3-option items than on 4-option items for the ITE-CCM and ITE-PA, respectively. Using the traditional method, the percentage of NFDs dropped from 51.3% in the 4-option ITE-CCM to 37.0% in the 3-option ITE-CCM, and from 62.7% to 46.0% for the ITE-PA; using the sliding scale method, the percentage of NFDs dropped from 36.0% to 21.7% for the ITE-CCM and from 44.9% to 27.7% for the ITE-PA. CONCLUSIONS: Three-option MCIs function as robustly as their 4-option counterparts. The time saved on each item creates an opportunity to increase content coverage within a fixed testing period. The results should be interpreted in the context of exam content and the distribution of examinee abilities.
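The exam reliability figures above come from the Kuder-Richardson Formula 20, which for dichotomously scored items is KR-20 = (k/(k-1)) * (1 - Σ pᵢqᵢ / σ²), where k is the number of items, pᵢ and qᵢ are the proportions answering item i correctly and incorrectly, and σ² is the variance of total scores. A minimal pure-Python sketch, with a tiny hypothetical response matrix rather than real exam data:

```python
from statistics import pvariance

def kr20(responses):
    """KR-20 reliability for dichotomous (0/1) item responses.
    `responses` is a list of examinees, each a list of 0/1 item scores."""
    k = len(responses[0])  # number of items
    n = len(responses)     # number of examinees
    # sum of p*q (proportion correct * proportion incorrect) across items
    pq_sum = 0.0
    for i in range(k):
        p = sum(r[i] for r in responses) / n
        pq_sum += p * (1 - p)
    # population variance of examinees' total scores
    total_var = pvariance([sum(r) for r in responses])
    return (k / (k - 1)) * (1 - pq_sum / total_var)

# Hypothetical responses: 5 examinees x 4 items (illustrative only)
data = [
    [1, 1, 1, 0],
    [1, 1, 0, 0],
    [1, 0, 0, 0],
    [1, 1, 1, 1],
    [0, 0, 0, 0],
]
rel = kr20(data)  # 0.8 for this toy matrix
```

Values closer to 1 indicate more internally consistent forms; the 0.62-0.75 range reported above is typical for relatively short in-training exams.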


Subject(s)
Educational Measurement, Physical Examination, Humans, United States, Child, Educational Measurement/methods, Reproducibility of Results
3.
Genet Med ; 24(1): 225-231, 2022 01.
Article in English | MEDLINE | ID: mdl-34906492

ABSTRACT

PURPOSE: The American Board of Medical Genetics and Genomics (ABMGG) certifying examinations (CEs) are designed to assess the relevant basic knowledge, clinical knowledge, and diagnostic skills of board-eligible candidates in primary specialty areas. The ABMGG in-training examinations (ITEs) provide formative feedback on knowledge and learning over time and assess readiness to attempt board certification. This study addresses the validity of the ABMGG ITE by evaluating its relationship with performance on the CE using established psychometric approaches. METHODS: Statistical analysis included bivariate Pearson correlation coefficients and linear regression to evaluate the strength of associations between ITE scores and CE scores. Logistic regression was used to assess the association between ITE scores and the probability of passing each CE. RESULTS: Logistic regression results indicated that ITE scores accounted for 22% to 44% of the variability in CE outcomes. Across 3 certification cycles, for every 1-point increase in ITE score, the odds of earning a passing score increased by a factor of 1.12 to 1.20 for the general CE, 1.14 to 1.25 for the clinical CE, and 1.12 to 1.20 for the laboratory CEs. CONCLUSION: The findings show a positive association between performance on the ITE and both performance on and the likelihood of passing the ABMGG CE.
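The odds ratios reported above have a direct logistic-regression interpretation: if logit(p) = β₀ + β₁·(ITE score), then each 1-point score increase multiplies the odds of passing by exp(β₁). A minimal sketch with hypothetical coefficients (not the study's fitted values) illustrates the relationship:

```python
import math

def pass_probability(ite_score, intercept, beta):
    """Predicted pass probability from a logistic model:
    logit(p) = intercept + beta * ite_score."""
    z = intercept + beta * ite_score
    return 1 / (1 + math.exp(-z))

# Hypothetical fitted coefficients (illustrative only); beta chosen so the
# per-point odds ratio is 1.15, inside the 1.12-1.25 range reported above
intercept = -55.0
beta = math.log(1.15)
odds_ratio = math.exp(beta)  # odds multiplier per 1-point ITE increase

p1 = pass_probability(400, intercept, beta)
p2 = pass_probability(401, intercept, beta)
# the ratio of odds at adjacent scores recovers exp(beta)
ratio = (p2 / (1 - p2)) / (p1 / (1 - p1))
```

Note that the odds ratio is constant across the score range, while the change in predicted *probability* per point depends on where on the logistic curve the score falls.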


Subject(s)
Genetics, Medical, Internship and Residency, Certification, Clinical Competence, Educational Measurement/methods, Genomics, Humans, United States