Medical educators are quietly rethinking how they prepare students for an AI-driven healthcare world, moving beyond simple technical training to address deeper questions about professional identity and patient care. At the University of North Carolina at Chapel Hill, one researcher is leading this shift by helping students understand not just how AI works, but how it fundamentally changes what it means to be a healthcare professional.

## Why Are Medical Schools Suddenly Focused on AI's Impact on Professional Identity?

Bethany Stearns, a clinical assistant professor in the Radiologic Science division at UNC Chapel Hill, has spent nearly a decade as a radiologic technologist and is now investigating a question that most medical educators haven't yet asked: Does automating certain tasks change how students see themselves as healthcare professionals?

Her PhD research explores whether AI integration affects students' professional identity, examining whether they begin to see themselves primarily as caregivers, as technologists, or somewhere in between, and how that shift could influence patient care.

This focus reflects a growing recognition that AI training in healthcare education can't be purely technical. Stearns emphasized that AI is rapidly reshaping both clinical practice and education, making it essential for students to understand its applications, capabilities, and ethical implications. Her goal is to prepare students not only to use these systems effectively but also to critically understand how AI integration affects patient care and professional practice.

## What Does Comprehensive AI Training for Healthcare Students Actually Look Like?

Stearns' engagement with AI spans three interconnected areas: professional development, scholarship, and curriculum design. Rather than treating AI as a standalone technical skill, she's weaving it throughout how students learn to think about their roles in healthcare.
- Algorithm Development and Application: In February 2026, Stearns presented "AI Demystified: The Journey from Creation to Function" at the Association of Collegiate Educators in Radiologic Technology annual conference, exploring how AI algorithms are developed, trained, and applied in medical imaging, and how curriculum can evolve to prepare students for AI-integrated clinical practice.
- Ethical Use of AI Tools: She will expand on this work at the June 2026 ASRT Educational Symposium with her presentation "Navigating Artificial Intelligence in Professional Writing: Ethics, Guidelines, and Best Practices," which examines the capabilities and limitations of AI tools in academic and professional writing, emphasizing responsible use, strategies for crafting effective prompts, evaluation of AI-generated suggestions for accuracy, and the ethical considerations of AI-generated content.
- Clinical Integration and Quality Control: AI is increasingly used to assist with equipment positioning, protocols, quality control, and diagnostics in radiologic science, requiring students to understand how these systems fit into real-world clinical workflows.

"AI is rapidly reshaping both clinical practice and education, making it essential for students to understand its applications, capabilities, and ethical implications," emphasized Stearns.

— Bethany Stearns, Clinical Assistant Professor in Radiologic Science at University of North Carolina at Chapel Hill

## How to Prepare Healthcare Students for an AI-Integrated Career

- Build Technical Literacy First: Students need to understand how AI algorithms are created, trained, and deployed in clinical settings, not just how to use them as black-box tools.
- Integrate Ethics Into Every Lesson: Rather than treating ethics as a separate module, weave discussions of responsible AI use, bias, and limitations throughout the curriculum so students develop ethical reasoning alongside technical skills.
- Examine Professional Identity Shifts: Help students reflect on how automation changes their role and responsibilities, ensuring they maintain a patient-centered perspective even as tasks become automated.
- Practice Evaluating AI Output: Teach students to critically assess AI-generated suggestions and recommendations, understanding both the capabilities and limitations of these systems in clinical contexts.

Stearns' work also includes a systematic review of AI-augmented portable imaging systems and service on UNC's Department of Health Sciences Generative AI ad hoc Committee beginning in May 2026. This institutional commitment signals that the university recognizes AI education as a core competency for future healthcare professionals, not a peripheral skill.

The broader implication is clear: as AI becomes embedded in clinical practice, medical schools can no longer treat it as optional or purely technical training. Instead, educators must help students develop both the technical skills to work alongside AI systems and the critical thinking to understand how these tools reshape professional practice. By focusing on professional identity alongside technical competency, Stearns and her colleagues are preparing a generation of healthcare workers who can harness AI's potential while keeping patient care at the center of their decision-making.