Anita Samuel1, Eulho Jung2, and Beth King3
1PhD, Associate Professor, Department of Health Professions Education, School of Medicine, Uniformed Services University of the Health Sciences, Bethesda, United States
2PhD, Assistant Professor, Department of Health Professions Education, School of Medicine, Uniformed Services University of the Health Sciences, Bethesda, United States
3MPP, Program Evaluator, Department of Health Professions Education, School of Medicine, Uniformed Services University of the Health Sciences, Bethesda, United States
ABSTRACT
Background: Prensky’s concept of the “digital native,” though flawed, has taken hold in the popular psyche, especially in the United States. Medical schools in particular subscribe to this notion, overlooking digital divides in access and competencies. Orientation programs are an appropriate time to train students in technology skills; however, orientation sessions often fail to adequately cover the technology skills medical students need to succeed. To address this gap, we introduced a digital skills session as part of medical school orientation to bridge the digital skills divide.
Methods: This study examines students’ perceptions of the digital skills session and their confidence in utilizing digital tools. Anonymous pre- and post-surveys were administered to assess students’ confidence using various digital tools. The quantitative and qualitative data obtained through the pre- and post-surveys were analyzed.
Results: A total of 475 students participated in the workshop over three years. Evaluations showed significant improvement in confidence with various digital tools, especially those that were less familiar. Fifty-eight percent of respondents found training on unfamiliar tools and practical tips valuable, and half saw no need for improvement. The digital skills workshop revealed diverse digital competencies, highlighting that the assumption that students are digital natives, inherently proficient with technology, is flawed.
Discussion: This study suggests that a digital divide exists within the group labeled “digital natives.” Furthermore, participants desired workshops that respected their status as adult learners, emphasizing choice, relevance to their needs, and immediate practical application of skills. These findings point to the importance of structuring orientation workshops that not only equip students with digital skills but also align with their developmental needs and real-world applications, thereby promoting digital equity in medical education.
Key Words: digital skills workshop, medical school orientation, technology skills gap, digital divide in medical education, student confidence in digital tools
Date submitted: 4-March-2025
Email: Anita Samuel (anita.samuel@usuhs.edu)
This is an open access journal, and articles are distributed under the terms of the Creative Commons Attribution-Non Commercial-Share Alike 4.0 License, which allows others to remix, tweak, and build upon the work non-commercially, as long as appropriate credit is given and the new creations are licensed under the identical terms.
Citation: Samuel A, Jung E, and King B. Empowering medical students: navigating the digital frontier in education. Educ Health 2025;38:237-243
Online access: www.educationforhealthjournal.org
DOI: 10.62694/efh.2025.292
Published by The Network: Towards Unity for Health
Variations in digital literacy, access, and enthusiasm among students1 have challenged the assumption made by Marc Prensky and others that students born after the 1990s inherently know how to use technology, and that digital natives are “young and expert users of digital technologies.”2,3,4 Medical students, although frequently assumed to be digitally competent, have been shown to exhibit diverse levels of confidence and proficiency with digital tools essential for their education.5,6,7
Recognizing this disparity, some higher education institutions have integrated digital skills sessions into orientation programs, which traditionally aim to facilitate students’ transition by introducing institutional resources and expectations.8,9 However, these sessions often remain narrow, focusing primarily on basic institutional technologies such as email or security protocols. This narrow focus can leave gaps that are problematic for medical students, who must navigate a dense curriculum from the first year.
We introduced a digital skills session as a part of medical school orientation to bridge the digital skills divide and ensure that all incoming medical students are well-prepared for the digital demands of their education. This study investigates medical students’ perceptions of these digital skills sessions and evaluates their impact on student confidence and proficiency.
Uniformed Services University of the Health Sciences is a medical school in the Northeast United States. Each medical class cohort comprises approximately 150 students. The educational journey for matriculated students begins with a three-week orientation that incorporates activities such as academic support and wellness services, student organizations, career development services, library services, and technology skills such as setting up email accounts, accessing institutional electronic resources, and configuring systems for wireless connectivity. For incoming students in 2020, a new digital skills orientation workshop was added to prepare them for the virtual learning environment they were entering during the socially distanced COVID-19 lockdown period.
The digital skills workshop was crafted as a 90-minute, immersive, and interactive session. Students were introduced to essential digital proficiencies tailored to their first year in medical school at Uniformed Services University of the Health Sciences. This instructional experience was designed to foster active student engagement, offering opportunities not only to familiarize themselves with these digital tools but also to actively practice using them. The workshop encompassed a comprehensive range of topics, including an orientation to the university’s Learning Management System (LMS) and the suite of Google tools that comprise the university’s digital ecosystem. Beyond showing students how to use the technologies, the workshop focused on providing suggestions for the effective integration of technology to enhance learning.
Anonymous pre- and post-surveys were administered to assess students’ confidence using various digital tools. For each tool, students rated their confidence on a 5-point Likert scale from Not at all confident (1) to Extremely confident (5). The post-survey also included questions about the design of the workshop: six Likert-scale questions assessing how students felt about the workshop (from Strongly Disagree (1) to Strongly Agree (5)) and three open-ended questions. The data collected from three cohorts of medical students, 2020, 2021, and 2022, are utilized in this study. This study was deemed as not requiring Institutional Review Board (IRB) review (Protocol DBS.2023.639).
The quantitative and qualitative data obtained through the pre- and post-surveys were analyzed. Descriptive statistics on the overall differences between pre- and post-orientation scores were calculated. Because the data were collected anonymously, a paired or repeated measures t-test was not feasible; instead, an independent samples t-test was used to compare the means of the pre- and post-surveys and identify statistically significant differences between the two data sets. The open-ended responses on the feedback forms were collated and thematically analyzed. Since this topic is understudied, no existing theories or models were available to inform the data analysis.
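The comparison described above can be sketched in code. The example below is a minimal, hypothetical illustration of Welch’s form of the independent samples t-test, which does not assume equal variances (relevant here, since Levene’s test rejected that assumption); the ratings are invented for demonstration and are not the study’s data.

```python
import math
from statistics import mean, variance

def welch_t_test(a, b):
    """Independent samples t-test without assuming equal variances
    (Welch's t-test), returning the t statistic and the
    Welch-Satterthwaite degrees of freedom."""
    na, nb = len(a), len(b)
    va, vb = variance(a), variance(b)  # sample variances (n - 1 denominator)
    se2 = va / na + vb / nb            # squared standard error of the difference
    t = (mean(b) - mean(a)) / math.sqrt(se2)
    df = se2 ** 2 / ((va / na) ** 2 / (na - 1) + (vb / nb) ** 2 / (nb - 1))
    return t, df

# Hypothetical pre- and post-workshop confidence ratings (5-point Likert scale)
pre = [2, 3, 2, 3, 1, 2, 3, 2, 4, 2]
post = [4, 4, 3, 5, 4, 3, 4, 5, 4, 4]

t, df = welch_t_test(pre, post)
print(f"t = {t:.2f}, df = {df:.1f}")
```

A positive t with a small p-value (looked up against the t distribution with df degrees of freedom) would indicate significantly higher mean confidence in the post-workshop group.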
Across the three years, approximately 475 students participated in the digital skills workshops. Table 1 provides a breakdown of respondents for the pre- and post-surveys and the open-ended questions in the post-survey. There were 432 responses for the pre-survey and 295 for the post-survey in total.
Table 1 Responses received from the pre- and post-workshop surveys
The largest gains in confidence were associated with less common digital tools and tools to which students might not have had prior access; accordingly, Google Tasks and Sakai (the LMS) showed the largest gains in 2020 and 2021. In 2022, the university transitioned to Canvas, a learning management system with which students were more familiar, and their incoming confidence was high. Gmail was the one tool with which students arrived highly confident, so there was little gain by the end of the workshop. There was roughly a 0.5-point gain in confidence for Google Calendar. Between 2021 and 2022, the gain in confidence for Google Meet dropped dramatically, from 0.91 to 0.38. (See Figure 1.)

Figure 1 Difference in confidence level for the digital tools pre- and post-workshop
An independent samples t-test comparing participants’ technology confidence levels pre- and post-workshop showed significant differences in confidence for all technologies between the two groups. Additionally, Levene’s test indicated a significant difference in variances (p < .05), suggesting that the assumption of equal variances was not met. (See Table 2.)
Table 2 Independent samples t-test
Levene’s test for each technology’s confidence level between pre- and post-workshop was significant, with p-values less than .001, indicating a substantial violation of the assumption of equal variances between the two groups. (See Table 3.)
Table 3 Homogeneity of Variances Test (Levene’s)
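For readers unfamiliar with the homogeneity check reported above, the sketch below shows how Levene’s test statistic can be computed for two groups from absolute deviations around each group’s mean; the ratings are invented for demonstration, not the study’s data.

```python
from statistics import mean

def levene_two_groups(a, b):
    """Levene's test statistic W for two groups, computed on absolute
    deviations from each group's mean; under the null hypothesis of
    equal variances, W follows an F(1, N - 2) distribution."""
    za = [abs(x - mean(a)) for x in a]
    zb = [abs(x - mean(b)) for x in b]
    n_a, n_b = len(za), len(zb)
    n = n_a + n_b
    zbar_a, zbar_b = mean(za), mean(zb)
    grand = (sum(za) + sum(zb)) / n
    # between-group and within-group sums of squares of the deviations
    between = n_a * (zbar_a - grand) ** 2 + n_b * (zbar_b - grand) ** 2
    within = sum((z - zbar_a) ** 2 for z in za) + sum((z - zbar_b) ** 2 for z in zb)
    return (n - 2) * between / within  # (N - k) / (k - 1) with k = 2 groups

# Hypothetical ratings: the first group is far more spread out than the second
pre = [1, 2, 5, 1, 5, 2, 4, 1]
post = [4, 4, 5, 4, 5, 4, 4, 5]

W = levene_two_groups(pre, post)
print(f"W = {W:.2f}")
```

A large W relative to the F(1, N − 2) critical value, as in this toy example, indicates that the equal-variance assumption does not hold.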
Three Likert-scale questions were asked to gauge student satisfaction with the session. After the first year, a fourth question, “This workshop was the right length,” was added to the survey. Across all three years, most students Agreed or Strongly Agreed that the workshop activities contributed to their learning and were satisfied with the workshop. The satisfaction score dropped from 93% in 2020 to 76% in 2022. Only 77% of the respondents in 2022 agreed that the workshop was the right length.
The three open-ended questions provided space for students to give free responses to what they found most useful and least useful about the workshops and elicited suggestions on how to improve the workshop. Topics and the teaching process were identified as being the most useful aspects of the workshops. Training on Canvas was found to be useful by 17% of respondents and “tips and tricks” were identified as useful by 11%. Thirty percent of respondents found all aspects of the workshop to be useful, and 17% of respondents commented that the workshops covered topics with which they were already familiar. Fifty percent of respondents did not have any suggestions for change.
Three themes emerged from thematic analysis of the open-ended responses: familiarity with technology tools, unfamiliarity with technology tools, and applicability of the workshop content. (See Table 4 for themes and quotations.) When discussing familiar technology tools, participants liked learning new things and disliked working through tools they already knew, leading to the suggestion that these tools be eliminated from the workshop. For unfamiliar tools, however, participants were dissatisfied that not enough time had been spent on them and suggested setting aside more time for these tools. Finally, participants noted the applicability of the content to their learning; time spent on tools they were already familiar with was considered redundant.
Table 4 Thematic analysis of open-ended responses
The findings from this study on student perceptions of a digital skills workshop highlight the diversity in medical students’ digital skills and the need for an andragogical approach to designing such workshops. The study revealed a range of self-confidence in technological proficiency among learners, demonstrating the existence of a digital divide in digital skill sets. The disparity in confidence across different digital tools and participants’ self-declared lack of digital competency question the common assumption that current students are digital natives “who speak and breathe the language of computers and the culture of the web in which they were born”.10 These assumptions about the digital competencies of digital natives may overlook the reality that digital exposure does not equate to digital competence.1
As adult learners, students wanted more agency over the workshop in terms of flexibility in attendance and choice of topics.11 They did not want to spend time on “unnecessary information” when they knew “pretty much all of it.” When the information presented fell below their zone of proximal development (ZPD), students lost interest.12 However, when the session aligned with the students’ ZPD, they were appreciative of the “small tips and tricks even for ‘veteran’ users.” As one respondent noted, “I learned things which surprised me because I’m pretty tech savvy.” Some participants also asked for scaffolding materials such as “[a]n FAQ sheet that is organized by topic” or “[a] shared document of tips and tricks” which they noted would be helpful.
Students also commented on the immediate applicability of the skills they were learning. They wanted “more time spent on Sakai (LMS) on how to submit assignments,” but appreciated “[b]eing able to immediately apply/practice the methods being taught.” Applicability and relevance to practice motivated participants to engage with the workshop.
We acknowledge that this study has some limitations. One challenge was the reduced number of participants completing the post-survey, which may bias the estimation of gains attributable to the workshop. Furthermore, the pre- and post-workshop surveys were anonymous, limiting the analysis that could be conducted on the data; it was not possible to identify racial or socioeconomic patterns. We plan to modify the survey instrument to collect demographic information to allow for a more nuanced analysis. While respondents reported high satisfaction with the workshop’s contribution to their learning, we would like to explore how this initial boost in digital skills impacts their subsequent academic experiences.
This study suggests that assumptions about medical students’ inherent digital competencies are not universally valid. The significant variability in student confidence and skill highlights the necessity for tailored digital skills orientation workshops. New technologies adopted by medical schools to enhance the learning experiences of their students may require adjustment of workshop content. Without training programs for these new technologies, the existing digital divide may become wider.13 Importantly, this study underscores the value of an andragogical approach, advocating for personalized learning experiences that respect adult learners’ diverse prior knowledge and immediate application needs. Future orientation programs should focus not only on introducing digital tools but also on contextualizing their use within medical education and practice in order to bridge the digital divide and foster digital equity. Incorporating continuous, scaffolded digital skills training throughout medical education could further ensure that technological competencies are sustainably developed and aligned with evolving clinical environments.
The opinions and assertions expressed herein are those of the author(s) and do not necessarily reflect the official policy or position of the Uniformed Services University or the US Department of Defense.
The authors declare that they did not receive funding for this research.
1. Reid L, Button D, Brommeyer M. (2023). Challenging the myth of the digital native: A narrative review. Nursing Reports. 13(2):573–600. https://doi.org/10.3390/nursrep13020052
2. Prensky M. (2010). Teaching digital natives: Partnering for real learning. Thousand Oaks, CA: Corwin Press.
3. Dastane O, Haba HF. (2023). The landscape of digital natives research: a bibliometric and science mapping analysis. FIIB Business Review, 23197145221137960. https://journals.sagepub.com/doi/abs/10.1177/23197145221137960
4. Adjin-Tettey TD. (2020). Can ‘digital natives’ be ‘strangers’ to digital technologies? An analytical reflection. Inkanyiso: Journal of Humanities and Social Sciences.12(1): 11–23. https://www.ajol.info/index.php/ijhss/article/view/199513
5. Bigdeli S, Kaufman D. (2017). Digital games in medical education: Key terms, concepts, and definitions. Medical Journal of the Islamic Republic of Iran. 31(52). https://doi.org/10.14196/mjiri.31.52
6. Essary AC. (2011). The impact of social media and technology on professionalism in medical education. The Journal of Physician Assistant Education. 22(4):50–53. https://journals.lww.com/jpae/abstract/2011/22040/the_impact_of_social_media_and_technology_on.9.aspx
7. Kennedy G, Gray K, Tse J. (2008). ‘Net Generation’ medical students: technological experiences of pre-clinical and clinical students. Medical Teacher. 30(1):10–16. https://doi.org/10.1080/01421590701798737
8. Ellaway RH, Cooper G, Al-Idrissi T, Dubé T, Graves L. (2014). Discourses of student orientation to medical education programs. Medical Education Online. 19(1), 23714. https://doi.org/10.3402/meo.v19.23714
9. Wolcott GV, Reckmeyer WJ, Conner AK, Flores R. (2021). Becoming a champion of orientation. In Handbook of Research on the Changing Role of College and University Leadership. IGI Global. p. 259–273.
10. Benini S, Murray L. (2013). Critically evaluating Prensky in a language learning context: The “digital native/immigrants debate” and its implications for CALL. In L. Bradley & S. Thouesny (Eds.), 20 years of EUROCALL: Learning from the past, looking to the future. Proceedings of the 2013 EUROCALL Conference. Universidade de Evora.
11. Knowles MS. (1978). Andragogy: Adult learning theory in perspective. Community College Review. 3:9–20.
12. Kaufman DM. (2018). Teaching and learning in medical education: How theory can inform practice. Understanding medical education: evidence, theory, and practice. Wiley Blackwell, Oxford, UK. p.37–69.
13. Wang C, Boerman SC, Kroon AC, Möller J, de Vreese CH. (2024). The artificial intelligence divide: Who is the most vulnerable? New Media & Society. 14614448241232345. https://doi.org/10.1177/14614448241232345
© Education for Health.
Education for Health | Volume 38, No. 3, July-September 2025