Universities are moving beyond teaching students about artificial intelligence and are instead building ecosystems where students solve actual business and research problems with AI tools. Old Dominion University's new MonarchSphere incubator, launched in partnership with Google Public Sector, represents a fundamental shift in how institutions define and teach AI literacy. Rather than treating AI as a subject to study in isolation, the university is embedding it into applied, cross-disciplinary projects that connect students, faculty, and industry partners around real-world challenges.

What's Wrong With Traditional AI Education in Universities?

Most college AI programs excel at building foundational knowledge, but they struggle to bridge the gap between theory and practice. Students learn algorithms and model architectures in classrooms, yet rarely encounter the messy reality of implementing AI solutions in actual organizations. This disconnect leaves graduates unprepared for how AI actually gets used in the workplace, where technical skill alone isn't enough: graduates also need judgment, ethical reasoning, and the ability to work across disciplines.

MonarchSphere was created specifically to address this gap. Rather than positioning itself as a standalone lab or makerspace focused on exploration, the incubator functions as an applied ecosystem connected to academic programs, research initiatives, institutional operations, and external partners. This structure enables something traditional classrooms cannot: students working on problems that matter to real organizations, with stakes and complexity that mirror actual professional environments.

"MonarchSphere was created to address a gap we were seeing between learning about AI and actually applying it in meaningful ways.
Traditional academic programs are very strong in building foundational knowledge, but they are not always structured to support rapid, cross-disciplinary experimentation tied to real problems," explained Chrysoula Malogianni, senior associate vice president of digital innovation and chief digital experience officer at Old Dominion University.

How Does AI Literacy Actually Work Across Different Fields?

At Old Dominion, AI literacy isn't one-size-fits-all. The university recognizes that understanding AI in healthcare looks fundamentally different from understanding it in engineering, business, or education. Rather than offering a generic "AI for everyone" course, the institution embeds AI literacy within specific disciplines so students understand how the technology reshapes their particular field.

This approach builds on decades of work around digital literacy and information literacy, but with a crucial difference: AI is now reshaping how work actually gets done across every sector. At ODU, AI literacy means the entire community can engage with AI in ways that are informed, intentional, and appropriate to their field. That includes understanding what these systems can and cannot do, using them in context, evaluating outputs critically, and making sound decisions rather than relying on them blindly.

Steps to Implement Applied AI Learning in Your Institution

- Connect Real Problems to Student Projects: Partner with industry, government, and research organizations to identify actual challenges that AI could help solve, then structure student work around these problems rather than abstract exercises.
- Build Cross-Disciplinary Teams: Require students from different majors to collaborate on the same project, mirroring how AI work actually happens in organizations where engineers, domain experts, and ethicists must work together.
- Embed Applied Courses Within the Ecosystem: Move beyond one-off projects by integrating structured learning experiences directly into the applied environment, where students can move from theory to hands-on work with real tools and datasets.
- Train Faculty as Facilitators, Not Just Lecturers: Prepare instructors to guide students through open-ended problems and uncertainty, helping them develop judgment and problem-framing skills alongside technical competence.

What Are Students Actually Building in MonarchSphere?

One concrete example from MonarchSphere involves student pathways and academic advising. Many students struggle to connect what they're studying with where it leads professionally. Through the incubator, ODU is exploring how AI can support this process more effectively, helping students understand career pathways, options, and decisions. But the work goes deeper than just using an AI tool: students are thinking through the logic behind recommendations, understanding what data is appropriate to use, and evaluating what makes an output trustworthy.

In engineering contexts, students work with data to support modeling, simulation, and optimization problems. In healthcare-related projects, they explore how AI can support analysis, decision support, and workflow improvement, always within appropriate ethical boundaries. The common thread across all these projects is that the work is grounded in real use cases. Students aren't completing abstract exercises; they're engaging with problems that require both technical understanding and professional judgment.

How Does This Change What Students Actually Learn?

Inside a traditional classroom, students work within defined structures with clear expectations and predetermined outcomes. Inside MonarchSphere, students engage with open-ended problems, collaborate across disciplines, and navigate genuine uncertainty. This fundamentally changes how they develop as thinkers and professionals.
Rather than simply completing assignments, students help frame problems, test ideas, evaluate outputs, and think critically about how solutions would actually be used in practice. The skills that emerge from this experience are distinctly different from what traditional courses produce. Students develop comfort working in real-world settings, stronger professional judgment, and a deeper understanding of how technology fits into actual workflows. They learn not just to use AI tools, but to think about when and how to use them responsibly.

Measuring success in this model requires a different approach than traditional grading. ODU looks at several indicators: student engagement in applied experiences, exposure to real use cases and external partners, and the kinds of skills students are developing. Over time, the university plans to track how students approach problems, how they use AI tools, and how they communicate their thinking to stakeholders.

Is This Model Spreading Beyond Old Dominion?

While MonarchSphere is positioned as the first AI incubator in higher education, the broader shift toward applied, ecosystem-based learning is happening globally. Across Europe and Asia, universities and governments are recognizing that AI adoption among students is already widespread, regardless of whether formal policies exist. This reality is forcing institutions to rethink how they prepare students.

Global usage data reveals the urgency of this shift. According to a 2025 report from the Higher Education Policy Institute, 92% of UK university students use AI in academic work. The Digital Education Council found that 86% of students across 16 countries regularly use AI in their studies, with 54% using it weekly. In Singapore, 84% of students ages 15 to 25 use AI tools for homework at least weekly, with 100% reporting some AI use.

This widespread adoption is happening faster than policy frameworks can keep up.
Fewer than 20 countries have issued national policy guidelines for AI in education, yet students are already using these tools at scale. This gap between adoption and governance means institutions like Old Dominion are essentially pioneering the models that will eventually become standard practice.

The MonarchSphere approach offers a replicable model for other universities grappling with the same challenge: how to prepare students for a world where AI is not a specialized skill but a foundational literacy that cuts across every discipline and profession. By connecting students to real problems, external partners, and cross-disciplinary collaboration, universities can move beyond the question of "what should we teach about AI?" to the more urgent question of "how do we help students develop the judgment and skills they'll actually need in AI-enabled workplaces?"