Original Research Paper

Diagnostic radiology curriculum for emergency medicine trainees in India: comparing simulation and didactics

Kevin Durgun1, Leonardo Aliaga2, Katherine Douglass3, and Tania Ahluwalia4

1MD, MPH, Staff Physician, Department of Emergency Medicine, Brooke Army Medical Center, Ft Sam Houston, United States

2MD, Clinical Assistant Professor, Department of Emergency Medicine, Stanford University School of Medicine, Stanford, United States

3MD, MPH, Associate Professor, Department of Emergency Medicine, George Washington University School of Medicine & Health Sciences, Washington, DC, United States

4MD, MPH, Assistant Professor of Pediatrics and Emergency Medicine, Children’s National Medical Center, Washington, DC; Department of Emergency Medicine, George Washington University School of Medicine & Health Sciences, Washington, DC, United States


Abstract

Background: Emergency medicine (EM) education in India lacks effective diagnostic radiology interpretation curricula. Active learning curricula delivered through virtual asynchronous learning have the potential to fill this gap in international emergency medical education. This study aimed to evaluate an active learning radiology curriculum among Indian EM trainees.

Methods: We conducted a randomized controlled trial at multiple EM training programs in India. Trainees were randomized to receive an active learning curriculum (experimental) or a passive learning curriculum (control). Trainees took a pre-test, an initial post-test, and a delayed one-month post-test to assess their diagnostic accuracy in computed tomography (CT) brain and musculoskeletal X-ray (MSK XR) interpretation. We obtained surveys assessing perceptions of test difficulty. The primary outcome was the change from pre-test to initial post-test scores. The secondary outcomes were the changes from pre-test to one-month post-test scores and in self-efficacy scores.

Results: A total of 79 trainees from 12 sites across North, West, and South India were enrolled, and 59 completed the one-month post-test. The active learning cohort (ALC) showed a significant improvement in CT brain scores from pre-test to initial post-test of 9.8% (95% CI, 3.1% to 16.6%; p = 0.006) with a medium effect size (d = 0.7). There was no significant improvement at the one-month post-test for the ALC, nor on the CT brain post-tests for the passive learning cohort (PLC). Neither cohort showed a significant improvement on the MSK XR post-tests. Linear regressions comparing the ALC and PLC did not find a statistically significant effect to explain the mean score difference between cohorts. Self-efficacy scores significantly increased for both the ALC (p < 0.001) and the PLC (p = 0.004).

Conclusion: This pilot study showed modest benefits of an ALC compared to a PLC for improving CT brain interpretation in an international setting. Future research can refine active learning curricula to improve diagnostic radiology interpretation among EM trainees in international settings.

Key Words: Emergency medicine, Simulation, Clinical education, Health Education, Graduate Medical Education

Date submitted: 26-August-2024

Email: Kevin Durgun (kdurg001@gmail.com)

This is an open access journal, and articles are distributed under the terms of the Creative Commons Attribution-Non Commercial-Share Alike 4.0 License, which allows others to remix, tweak, and build upon the work non-commercially, as long as appropriate credit is given and the new creations are licensed under the identical terms.

Citation: Durgun K, Aliaga L, Douglass K, and Ahluwalia T. Diagnostic radiology curriculum for emergency medicine trainees in India: comparing simulation and didactics. Educ Health 2025;38:46-56

Online access: www.educationforhealthjournal.org
DOI: 10.62694/efh.2025.169

Published by The Network: Towards Unity for Health


Background

Timely and accurate interpretation of diagnostic imaging in the emergency department is essential to treat disease and reduce morbidity and mortality.1,2 Rapid interpretation of computed tomography (CT) brain imaging is necessary for time-sensitive pathologies such as intracranial hemorrhage and acute ischemic stroke.3 Accurate interpretation of musculoskeletal X-rays (MSK XRs) becomes essential in community or resource-limited settings where a radiologist’s interpretation may not be immediately available.4 Leadership of emergency medicine (EM) residency programs across the United States (US) specifically highlight CT brain and MSK XR interpretation as key skills for graduating EM residents.5–7 However, EM residents receive comparatively little formal training in these skills, which can lead to patient harm from delayed or incorrect diagnoses. Minimal time outside clinical shifts is dedicated to emergency radiology education, and programs that provide formal instruction often use passive learning didactic lectures.5

In many low- and middle-income countries, even less is known about the quality and speed of radiologic diagnosis. A radiology workforce shortage affects countries across the world.8–10 In India, 20,000 radiologists, based primarily in cities, provide services for a country of 1.4 billion people.11 Non-radiologists with varying interpretation skills often perform the initial evaluation and diagnosis.12 This highlights the opportunity to enhance the training of emergency physicians in India by introducing innovative teaching strategies into training programs to improve workforce capabilities.13,14

A promising solution to enhance emergency radiology curricula for EM trainees in India is to adapt successful strategies used for EM residents in the US. These include active learning modules that use radiology simulation software such as Pacsbin.15–18 Pacsbin is a web-based platform that simulates a typical radiology viewer, allowing users to scroll through CTs and interact with radiology studies mirroring their clinical context. Users can practice manipulating diagnostic imaging in a simulation setting that affords time and space to learn, make mistakes, and iteratively build skills. This mode of learning is conceptually supported by constructivist learning theory and can benefit residents by producing more durable learning.19 These strategies demonstrated increased self-efficacy and knowledge in diagnostic radiology interpretation among radiology and EM residents.16,17 Other studies on simulation-based education for EM trainees in India have shown improved learner self-efficacy and knowledge.20

It remains unclear how these radiology teaching strategies would translate to Indian EM training programs across a virtual or asynchronous platform. This pilot study aimed to evaluate a simulation-based active learning strategy for improving the diagnostic accuracy of CT brain and MSK XR interpretation among EM trainees in Indian training programs.

Methods

This multicenter, prospective, randomized, controlled, parallel-group trial compared a simulation-based active learning strategy to a didactic lecture passive learning strategy. The Ronald Reagan Institute of Emergency Medicine at George Washington University (GWU) partners with 20 institutions from seven private hospital systems across India to deliver a structured three-year EM training program (GWU-MEM). All partner sites were invited to participate, and 12 sites from two hospital systems agreed to join (Table 1). Participating sites and participants were recruited by convenience sampling. The GWU-MEM training is a hybrid three-year curriculum combining didactics, online modules, and simulation-based learning with quarterly in-person EM teaching by GWU-MEM faculty and clinical and didactic teaching by local Indian EM physicians. Trainees from all three postgraduate years, representing male and female genders, were included (Table 1). These programs represent a diverse group of trainees throughout India, with training site locations ranging from urban to rural settings. Each site offered comparable medical, surgical, and radiologic services. The study was conducted virtually at these 12 EM training sites in India from April 2023 to June 2023.

Table 1 Participant demographics after randomized assignment to study cohorts.

At the volunteering sites, 111 EM trainees were invited by email to participate in the study, and 79 initially agreed. Four participants withdrew after completing the consent process. The remaining 75 participants were randomized to an active learning cohort (ALC) or a passive learning cohort (PLC) using an internet-based random assignment generator.21 Each participant received a unique, randomly generated, three-digit alphanumeric code as their study identification to blind results and protect participant data.
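The allocation and blinding steps described above can be sketched in code. This is an illustrative sketch only; the study used an internet-based random assignment generator (Research Randomizer), not this script, and the participant names here are hypothetical.

```python
import random
import string

def assign_cohorts(participants, seed=None):
    """Randomly split participants into active (ALC) and passive (PLC)
    cohorts and assign each a unique three-character alphanumeric study ID,
    mirroring the blinding scheme described in the Methods."""
    rng = random.Random(seed)
    shuffled = participants[:]
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    allocation = {}
    used_ids = set()
    for i, name in enumerate(shuffled):
        # Draw codes until a unique three-character ID is found
        while True:
            code = "".join(rng.choices(string.ascii_uppercase + string.digits, k=3))
            if code not in used_ids:
                used_ids.add(code)
                break
        allocation[code] = ("ALC" if i < half else "PLC", name)
    return allocation

# Hypothetical roster matching the 75 randomized participants
alloc = assign_cohorts([f"trainee_{n}" for n in range(75)], seed=42)
```

With an odd number of participants, simple halving yields unequal arms (37 vs. 38), which is consistent with the uneven cohort sizes reported in the Results.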

Participants were given a pre-trial survey of perceptions, attitudes, educational experiences, and access to diagnostic radiology education and services, adapted from Villa et al.5 For the pre-test, initial post-test, and delayed post-test one month after completing the teaching modules, participants had one week and a single attempt for each test, tracked by their assigned study ID. Links to the active and passive learning modules were distributed by email, with one week for asynchronous completion. After the initial post-test, participants completed a self-efficacy survey to evaluate confidence in their learning.

The ALC (experimental arm) accessed radiology teaching cases on Pacsbin. Participants attempted to identify critical findings before receiving didactic teaching content tailored for each case, creating an active learning component. The PLC (control arm) viewed pre-recorded videos with narrated didactic slide presentations reviewing the same radiology cases as the ALC, including videos that scrolled through the CTs for comparable content exposure. Participants could access their assigned learning modules multiple times during one week between the pre-test and post-test.

The teaching content for CT brain interpretation (Pacsbin cases and instructional videos) was adapted from curricular material developed for EM residents at a US training program.17 New content for the MSK XR cases was developed for this study, following the same format as the CT brain educational material. Teaching points focused on image manipulation heuristics to aid in diagnostic interpretation. The CT brain curriculum covered intracranial hemorrhage, acute ischemic stroke, and increased intracranial pressure cases, while the MSK XR curriculum included upper and lower extremity pathologies relevant to EM. Each module introduced a clinical vignette with an image, followed by a teaching intervention and display of the pathology. The active and passive learning modules were reviewed before the study to ensure consistency in teaching content.

The primary outcome was the change from pre-test to initial post-test scores for each cohort, measured through a multiple-choice and image-hotspot test assessing diagnostic accuracy of CT brain and MSK XR interpretation. The test was delivered on Qualtrics,22 with each question linked to a radiology case hosted on Pacsbin. Participants selected a diagnosis from a multiple-choice question for each case and then clicked on a follow-up image to indicate the pathology location (hotspot). This hotspot selection helped assess whether the selected areas matched the multiple-choice answers. CT brain cases from a prior study’s test were used.16 A new test was created for the MSK XR content. The same CT brain and MSK XR tests were used as the pre-test, initial post-test, and delayed post-test. The secondary outcome measures were the change from pre-test to one-month post-test scores and the change in self-efficacy scores for each cohort, the latter measured through post-test surveys using a Likert scale. Participants rated their confidence in their exam answers from 1 (low confidence) to 5 (high confidence) and rated test difficulty from 1 (least difficult) to 10 (most difficult) for each test.

Our sample size calculation indicated that 68 participants were needed to detect a 5% difference in post-test scores, with 80% power at a 0.05 significance level, based on an estimated effect size of 0.7 from prior US studies.17 Pre-test and post-test scores were analyzed using two-tailed paired t-tests and multiple linear regression, while self-efficacy data were assessed with a Wilcoxon signed-rank test. Effect sizes were calculated for statistically significant results. All statistical analyses were performed using SPSS version 29 (IBM). The George Washington University Institutional Review Board approved this study, and all participants provided informed written consent.
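The paired analysis described above (two-tailed paired t-test with Cohen's d and a 95% confidence interval for the mean score change) can be illustrated with a short sketch. The study used SPSS; this Python version with SciPy is an assumption-laden illustration, and the pre/post scores shown are hypothetical, not study data.

```python
import numpy as np
from scipy import stats

def paired_analysis(pre, post):
    """Two-tailed paired t-test of pre/post scores with Cohen's d for
    paired samples and a 95% CI for the mean difference."""
    pre, post = np.asarray(pre, float), np.asarray(post, float)
    diff = post - pre
    t_stat, p_value = stats.ttest_rel(post, pre)
    # Cohen's d for paired samples: mean difference / SD of differences
    d = diff.mean() / diff.std(ddof=1)
    # 95% CI for the mean difference via the t distribution
    sem = diff.std(ddof=1) / np.sqrt(len(diff))
    lo, hi = stats.t.interval(0.95, len(diff) - 1, loc=diff.mean(), scale=sem)
    return {"t": t_stat, "p": p_value, "d": d, "ci": (lo, hi)}

# Hypothetical percentage scores for five trainees
res = paired_analysis([50, 55, 60, 65, 70], [60, 63, 72, 70, 80])
```

Reporting the effect size alongside the p-value, as the paper does, distinguishes a practically meaningful improvement from a merely statistically detectable one.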

Results

The study invited 111 EM trainees from 12 training sites across India to participate. Of these, 75 completed the pre-test and were randomized to either the ALC or the PLC (Figure 1). Table 1 shows the distribution of trainees by location, postgraduate year, and gender.



Figure 1 Participant recruitment, allocation, and follow-up flowchart

For the CT brain assessments, the ALC showed a significant improvement from the pre-test to the initial post-test, with a mean score increase of 9.8% (95% CI, 3.1% to 16.6%; p = 0.006) and a medium effect size (Cohen’s d = 0.7). The PLC showed no statistically significant change in CT brain scores from pre-test to initial post-test, nor did either cohort show a significant change in CT brain scores from pre-test to one-month post-test. For MSK XR assessments, neither cohort demonstrated statistically significant changes from pre-test to initial post-test or from pre-test to one-month post-test (Table 2). Linear regression analysis did not reveal a significant relationship between cohort and mean test score difference (Table 3). However, significant coefficients were found in the MSK XR analysis for the relationship between program location and initial post-test mean score difference.

Table 2 Paired t-test of test scores with effect sizes

Table 3 Linear regression, mean score differences from pre-test to initial post-test and one-month post-test for CT brain and MSK-XR.

Results of the Wilcoxon signed-rank analysis of the post-module self-efficacy survey are described in Table 4. Participant confidence in diagnostic interpretation increased for both the PLC and the ALC. The PLC median confidence score increased from 2 to 3 (n = 36, p < 0.01, r = 0.33), and the ALC median score also increased from 2 to 3 (n = 30, p < 0.01, r = 0.43). Participant perception of test difficulty significantly decreased from a median of 7 to 5 (n = 30, p < 0.05, r = −0.264) for the ALC MSK XR test.
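The Wilcoxon signed-rank test with an effect size r = Z/√n, as reported above, can be sketched as follows. This is an illustrative Python/SciPy version under stated assumptions (the study used SPSS), with hypothetical Likert ratings; |Z| is recovered from the two-sided p-value via the normal quantile function.

```python
import numpy as np
from scipy import stats

def wilcoxon_effect_size(pre, post):
    """Wilcoxon signed-rank test on paired Likert ratings, with effect
    size r = Z / sqrt(n), where n counts pairs with nonzero differences
    (zero differences are dropped by the default zero_method)."""
    pre, post = np.asarray(pre), np.asarray(post)
    result = stats.wilcoxon(post, pre)
    n = np.count_nonzero(post - pre)        # pairs contributing to the test
    z = stats.norm.isf(result.pvalue / 2)   # |Z| from the two-sided p-value
    r = z / np.sqrt(n)
    return result.pvalue, r

# Hypothetical 1–5 confidence ratings for 15 trainees: most improve by one point
p_value, r = wilcoxon_effect_size([2] * 15, [3] * 12 + [2] * 2 + [1])
```

The r values in Table 4 (0.33 and 0.43) correspond to small-to-medium effects by conventional benchmarks for this statistic.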

Table 4 Descriptive statistics, Wilcoxon Signed Rank Test, and effect size of participant self-efficacy

Participant attitudes generally indicate that EM trainees should be responsible for independently interpreting diagnostic radiology imaging. The complete survey data is presented in Table 5. The results show a perception of limited time dedicated to radiology instruction, with 72% of participants reporting no hours or 0–2 hours per month spent on radiology education. Most education is provided by EM faculty or other EM trainees rather than faculty or trainees from different specialties. Nearly 60% of participants noted that radiology education occurs during shifts, while 40.5% identified didactic lectures as the primary source of instruction.

Table 5 Emergency medicine trainee perceptions and attitudes towards diagnostic radiology education and clinical practice availability of diagnostic radiology

Discussion

This pilot study highlights the value of an ALC in EM training within an international context. The results align with previous studies conducted in the U.S. regarding CT interpretation17 and confidence.16

The linear regression analysis examining the relationship between cohort and change in test scores gives pause for certainty, as it does not meet the threshold for conventional statistical significance. However, given the study’s small sample size, there is a modest suggestion that an ALC may offer greater benefits. Future studies with a larger sample size, improved follow-up testing, and more robust engagement with the material could help clarify the uncertainties observed in the paired t-tests and linear regression analysis.

Study improvements should also address the technical issues encountered when accessing MSK XR images. As detailed in the limitations section, these access problems may explain why the MSK XR assessments did not show a statistically significant benefit from the ALC. An alternative explanation could be that radiograph interpretation involves less manipulation of images compared to CT scans, resulting in less pronounced benefits from the ALC. Regardless, the improvements observed in CT interpretation likely reflect the effectiveness of the teaching strategy. The consistency with US studies likely reflects a shared understanding of universal diagnostic radiology principles across settings.

The data gathered from the EM radiology curriculum needs assessment provides valuable context for the demand for quality training in the Indian setting. It offers evidence of a shared understanding between American and Indian contexts regarding the importance of diagnostic radiology interpretation as a critical EM skill. The current literature provides little data on the educational needs of EM training, including radiology education, in international settings. This needs assessment guided the focus of this study, identifying key radiology topics for the intervention, similar to US studies.5,17 This data can also serve as a foundation for future interventions aimed at enhancing EM radiology education internationally, potentially expanding beyond simulation-based learning topics.

This study contributes to a growing body of research on simulation-based teaching interventions in EM training and suggests an educational benefit for the international context. While some commentaries have raised concerns about the costs associated with simulation-based education, particularly those involving high-fidelity mannequins and tissue replication,23 a notable strength of this study was its low implementation cost for a subscription to radiology simulation software (Pacsbin).15 Additionally, lessons learned from the pandemic were leveraged to utilize virtual spaces and interconnectivity.24

Although technical issues were encountered, they can be addressed with modest investments in improving content image sources, enhancing expertise in Digital Imaging and Communications in Medicine (DICOM) image processing, and refining PACS simulation services. As internet services and bandwidth continue to improve, access to remote locations is increasingly feasible through satellite internet services. The scalability of this modality depends on its adoption by individual training programs, which is often guided by the effectiveness of the curriculum. Future research to better characterize the relationship between learning strategy and test scores is recommended. Future studies should aim to increase sample sizes, implement strategies to reduce cohort dropout rates, and explore alternative statistical testing to evaluate other factors that may influence exam scores, such as residency programs and hospital settings.

Limitations

A disproportionate number of participants from the ALC dropped out of the study. Efforts to reach out to participants who had not communicated an intent to withdraw but were nearing, or at, deadlines helped minimize losses. This study occurred during the end-of-year exam period in the participants’ curriculum, which may have led some to leave the study without prior communication in order to focus on required program exams. Additionally, the ALC involved more frequent use of Pacsbin images, and reports of technical issues accessing MSK images may have contributed to the higher dropout rate in this cohort.

Technical issues were reported by a small number of participants who had difficulty accessing the MSK XR Pacsbin images. None of the participants reported issues accessing CT images. These technical issues occurred sporadically, without identifiable patterns that could be reproduced during troubleshooting. Since the ALC required regular use of Pacsbin for module training, participants who experienced technical difficulties opted to leave at that time. Four participants (three in the ALC and one in the PLC) reported leaving the study after being unable to load the test images despite multiple attempts. Others informally reported uncertainty about whether they had experienced these issues and self-reported continuing the study. No technical issues were reported in the feedback surveys, which raises concern about underreporting that may have influenced mean post-test scores. Issues with mounting images were discussed with Pacsbin support staff, who could not provide an immediate solution but suggested that local internet bandwidth limitations and DICOM image file sizes might be contributing factors.

Other limitations include the potential for discussions among participants regarding test and module contents. Since the content was delivered and accessed remotely, it was difficult to confirm in-person module use or participant discussions. Although access to modules and testing was tracked through unique participant identifiers, and participants were strictly instructed not to share or discuss content, they may nonetheless have discussed questions, answers, or content. However, there was no confirmation of such events occurring.

The study may be limited by volunteer bias as not all programs agreed to join the study, and of those programs, not all trainees volunteered or continued participating. Selection bias is also a potential factor, as the programs represent trainees in the GWU-MEM program in private hospitals in India and may not reflect conditions in public or state-run hospitals. However, the data captured in this study suggests that this approach could be applied in international settings for low- and middle-income populations, provided they have an electronic media device such as a smartphone, tablet, or computer and a reliable internet connection.

Future studies could explore or mitigate technical issues by utilizing alternative PACS simulation software, DICOM image processing tools, or collaborating with radiology technicians to standardize image file sizes. Additionally, more resources could be allocated to improving follow-up with participants and ensuring the accuracy of participant activity, either through dedicated staff to monitor training progress or by implementing incentives to encourage continued participation.

Conclusions

This study provides evidence that supports active learning curricula within EM education in international training contexts. These findings contribute to the growing body of knowledge surrounding active learning strategies, their impact on health workforce development, and the design of distance education curricula. These implications may extend beyond EM to other training programs, and are particularly pertinent during quarantines or travel restrictions such as those of the recent pandemic. This study suggests the potential to leverage active learning methodologies to optimize cost-effectiveness in education, particularly in scenarios where traditional in-person teaching may be transitioned to software module-based formats, thereby reducing travel and expenses. This aspect of EM education, particularly XR and CT diagnostic imaging interpretation, appears well-suited to such strategies. Future research should focus on comparing the effectiveness of active versus passive learning methodologies to evaluate their impact on clinical outcomes related to diagnostic interpretation education.

References

1. Hautz WE, Kämmer JE, Hautz SC, Sauter TC, Zwaan L, Exadaktylos AK, et al. Diagnostic error increases mortality and length of hospital stay in patients presenting through the emergency room. Scandinavian Journal of Trauma, Resuscitation and Emergency Medicine. 2019;27(1). https://doi.org/10.1186/s13049-019-0629-z

2. Newman-Toker DE, Peterson SM, Badihian S, Hassoon A, Nassery N, Parizadeh D, et al. Diagnostic Errors in the Emergency Department: A Systematic Review. Agency for Healthcare Research and Quality (AHRQ); 2022. https://doi.org/10.23970/ahrqepccer258

3. Emberson J, Lees KR, Lyden P, Blackwell L, Albers G, Bluhmki E, et al. Effect of treatment delay, age, and stroke severity on the effects of intravenous thrombolysis with alteplase for acute ischaemic stroke: a meta-analysis of individual patient data from randomised trials. The Lancet. 2014;384(9958):1929–35. https://doi.org/10.1016/s0140-6736(14)60584-5

4. Tranovich M, Gooch C, Dougherty J. Radiograph Interpretation Discrepancies in a Community Hospital Emergency Department. Western Journal of Emergency Medicine. 2019;20(4):626–32. https://doi.org/10.5811/westjem.2019.1.41375

5. Villa S, Wheaton N, Lai S, Jordan J. Radiology Education Among Emergency Medicine Residencies: A National Needs Assessment. Western Journal of Emergency Medicine. 2021;22(5):1110–6. https://doi.org/10.5811/westjem.2021.6.52470

6. Accreditation Council for Graduate Medical Education. ACGME program requirements for graduate medical education in emergency medicine. Accreditation Council for Graduate Medical Education; 2022 [updated 2023 Jul 1; cited 2023]. Available from: https://www.acgme.org/globalassets/pfassets/programrequirements/110_emergencymedicine_2022.pdf

7. Accreditation Council for Graduate Medical Education. Emergency Medicine Milestones. Accreditation Council for Graduate Medical Education; 2021. Available from: https://www.acgme.org/globalassets/pdfs/milestones/emergencymedicinemilestones.pdf

8. Radiology Facing a Global Shortage [press release]. Oak Brook, IL: Radiological Society of North America; 2022.

9. Jamil H, Tariq W, Ameer MA, Asghar MS, Mahmood H, Tahir MJ, et al. Interventional radiology in low- and middle-income countries. Annals of Medicine Surgery. 2022;77:103594. https://doi.org/10.1016/j.amsu.2022.103594

10. Jeganathan S. The Growing Problem of Radiologist Shortages: Australia and New Zealand’s Perspective. Korean Journal of Radiology. 2023;24(11):1043. https://doi.org/10.3348/kjr.2023.0831

11. Thakur A. Shortage of radiologists a concern, fostering research in AI can augment capabilities of radiologists: Experts. Indian Express. 08/09/2023.

12. McCollum ED, Higdon MM, Fancourt NSS, Sternal J, Checkley W, De Campo J, et al. Training physicians in India to interpret pediatric chest radiographs according to World Health Organization research methodology. Pediatric Radiology. 2021;51(8):1322–31. https://doi.org/10.1007/s00247-021-04992-2

13. Misra A, Yadav DC, Kole T. Emergency care in India beyond 75 years of independence – problems and solutions. Journal of Global Health. 2023;13. https://doi.org/10.7189/jogh.13.03015

14. Davey K, Blanchard J, Douglass K, Verma A, Jaiswal S, Sheikh W, et al. Emergency medicine in India: Time for more than applause. Academic Emergency Medicine. 2021;28(5):600–2. https://doi.org/10.1111/acem.14244

15. Technologies OM. Pacsbin. Baltimore, MD; 2023. Available from: https://www.pacsbin.com

16. McRoy C, Patel L, Gaddam DS, Rothenberg S, Herring A, Hamm J, et al. Radiology Education in the Time of COVID-19: A Novel Distance Learning Workstation Experience for Residents. Academic Radiology. 2020;27(10):1467–74. https://doi.org/10.1016/j.acra.2020.08.001

17. Aliaga L, Clarke S. Rethinking Radiology: An Active Learning Curriculum for Head Computed Tomography Interpretation. Western Journal of Emergency Medicine. 2022;23(1):47–51. https://doi.org/10.5811/westjem.2021.10.53665

18. Morin CE, Hostetter JM, Jeudy J, Kim WG, McCabe JA, Merrow AC, et al. Spaced radiology: encouraging durable memory using spaced testing in pediatric radiology. Pediatric Radiology. 2019;49(8):990–9. https://doi.org/10.1007/s00247-019-04415-3

19. Brame CJ. Active Learning. Elsevier; 2019. p. 61–72. https://doi.org/10.1016/b978-0-12-814702-3.00004-4

20. Ahluwalia T, Toy S, Gutierrez C, Boggs K, Douglass K. Feasible and effective use of a simulation-based curriculum for post-graduate emergency medicine trainees in India to improve learner self-efficacy, knowledge, and skills. International Journal of Emergency Medicine. 2021;14(1). https://doi.org/10.1186/s12245-021-00363-8

21. Urbaniak GC, Plous S. Research Randomizer (Version 4.0). Social Psychology Network; 2023. Available from: http://www.randomizer.org/

22. Qualtrics. Provo, UT, USA: Qualtrics; 2023.

23. Davis D, Warrington SJ. Simulation Training and Skill Assessment in Emergency Medicine. StatPearls. Treasure Island, FL: StatPearls Publishing; 2024.

24. Dunn S, Milman BD, Bavolek RA, Bralow L, Jones D, Kane BG, et al. Impacts of the COVID-19 Pandemic on United States Emergency Medicine Education: A Council of Residency Directors in Emergency Medicine (CORD) Task Force Survey-Based Analysis. Cureus. 2023. https://doi.org/10.7759/cureus.35994


© Education for Health.


Education for Health | Volume 38, No. 1, January-March 2025
