The AI Tools Health Students Are Actually Using (And Why It Matters for Your Future Care)
Health professions students are actively integrating artificial intelligence tools into their education, with ChatGPT emerging as the dominant platform for learning support, knowledge acquisition, and academic tasks. A qualitative study of 51 health sciences students at the University of Ottawa found that while AI adoption is widespread and viewed favorably, there is a critical gap: most students lack structured guidance on responsible AI use, and educational institutions are only beginning to address this training need.
Which AI Tools Are Health Students Actually Using?
ChatGPT (OpenAI) stands out as the most frequently used AI tool among health professions students, according to the University of Ottawa research. The study included participants from 10 different health professions, with most respondents being women between the ages of 20 and 29 years. Students reported using AI tools for a variety of academic and clinical learning purposes, driven by curiosity, peer influence, and the desire to improve work efficiency. The adoption pattern reflects a broader trend: AI is no longer a novelty in health education but an expected part of the learning toolkit.
What makes this finding significant is not just that students use these tools, but how they integrate them into their learning processes. Students described AI as a complementary tool that facilitated knowledge acquisition, skill development, writing, and problem-solving. However, the research also revealed that students critically evaluated AI results and emphasized the importance of technical skills, critical thinking, and digital literacy. This suggests that while AI adoption is rapid, students recognize its limitations and are developing safeguards on their own.
How Are Health Professions Students Using AI for Learning?
- Knowledge Acquisition: Students use AI tools like ChatGPT to explain complex medical concepts, answer subject-specific questions instantly, and create study materials tailored to their learning needs.
- Skill Development: AI supports the development of clinical reasoning, writing proficiency, and problem-solving abilities by providing immediate feedback and alternative explanations for difficult topics.
- Academic Task Support: Students leverage AI for writing assistance, essay drafting, quiz creation, and curriculum development, reducing time spent on routine tasks and freeing capacity for deeper learning.
- Simulated Clinical Environments: AI tools can create realistic clinical scenarios, evaluate student communication with simulated patients, and provide personalized feedback on clinical performance.
The study found that students did not passively accept AI outputs. Instead, they integrated critical evaluation into their workflow, cross-referencing AI-generated information with textbooks, peer discussions, and instructor feedback. This active engagement suggests that the concern about students blindly trusting AI may be overstated, at least among this cohort. However, the research also highlighted a troubling gap: without structured training, students are essentially teaching themselves how to use these tools responsibly.
Why the Lack of Formal AI Training Is a Problem
Despite the widespread adoption of AI in health professions education, most institutions have not yet developed comprehensive training programs on responsible AI use. The University of Ottawa study found that students themselves recommended peer learning, hands-on demonstrations, and access to online resources as the most effective strategies for raising awareness and training their peers on proper AI use. This peer-driven approach reflects a reality: students are ahead of their institutions in AI adoption, and formal education is playing catch-up.
The implications are significant. Health professions students are learning to use AI tools that will shape their clinical decision-making, diagnostic reasoning, and patient care practices. If their training is informal and peer-driven rather than structured and evidence-based, there is a risk that problematic habits, biases in AI systems, or misunderstandings about AI limitations could become embedded in the next generation of healthcare providers. The study emphasized that structured guidance, critical evaluation skills, and peer-supported training are essential to ensure responsible and effective adoption of AI in health education.
What About Equity and Access?
The research also flagged an important equity concern. Many AI tools are designed primarily for English and sometimes require payment to access advanced features. At a bilingual institution like the University of Ottawa, where courses are taught in French, English, or both, ensuring equitable access to AI resources for all students is essential. Socioeconomic and linguistic factors can influence which students benefit from AI tools and which are left behind. This gap could widen disparities in health professions education and, downstream, in clinical practice.
The broader context matters here. A 2022 national Canadian survey of nearly 2,000 health professions students showed that AI was viewed favorably by students, who also believed that the technology would positively influence their future careers. This optimism is warranted, but it also underscores the urgency of developing equitable, evidence-based training programs before AI adoption becomes so entrenched that it is difficult to course-correct.
What Should Health Professions Programs Do Now?
- Develop Structured Curricula: Health professions programs should create formal, evidence-based training modules on AI use, covering both the capabilities and limitations of tools like ChatGPT, as well as ethical considerations and potential biases in AI systems.
- Ensure Equitable Access: Institutions must guarantee that all students, regardless of socioeconomic status or language preference, have access to AI tools and training, preventing the creation of a two-tiered system where some students benefit while others do not.
- Foster Critical Evaluation Skills: Rather than banning or discouraging AI use, programs should teach students how to critically evaluate AI outputs, recognize when AI may be unreliable, and integrate AI insights with clinical judgment and evidence-based practice.
- Support Peer Learning: The study found that students themselves recommended peer learning as an effective training strategy, suggesting that institutions should formalize and support peer-led AI training initiatives rather than relying solely on top-down instruction.
The University of Ottawa research highlights a critical moment in health professions education. AI tools are already embedded in how students learn, study, and prepare for clinical practice. The question is no longer whether to integrate AI into health education, but how to do so in a way that enhances learning outcomes, fosters skill acquisition, and ensures responsible and equitable adoption. Without deliberate action, the informal, peer-driven approach that currently dominates could leave gaps in students' understanding of AI's limitations and ethical implications, potentially affecting patient care quality and safety for years to come.