Eleven universities and organizations just received up to $25,000 each to expand AI education and literacy across K-12 schools, community colleges, and local communities. The Institute for Trustworthy AI in Law and Society (TRAILS), a coalition of four academic institutions, announced the Broader Impact Awards in March 2026 to help diverse groups understand and engage with artificial intelligence responsibly. Rather than focusing on cutting-edge AI research, these projects prioritize teaching students, teachers, and families how to think critically about AI systems.

What Are Schools Actually Teaching About AI?

The funded projects reveal a practical, ground-level approach to AI education that goes beyond buzzwords. Several initiatives focus on helping educators integrate AI literacy into existing curricula.

Sarah McGrew of the University of Maryland and Virginia Byrne of Morgan State University are leading "AI Literacy Codesign Workshops with Social Studies Teachers," which convenes middle and high school educators to develop lessons that help students critically evaluate AI-generated text and media. Through collaborative workshops, teachers align AI literacy concepts with existing curricular standards and pilot classroom-ready materials.

In elementary education, one particularly innovative project combines human teachers with AI tutoring. "High-Dosage Human-Enabled AI Tutoring to Build Trust in Using AI to Teach SoR with Young Learners," led by Valerie Riggs of Morgan State, pairs an adaptive AI reading tutor with educators who provide emotional support in a 1:4 ratio. The project measures gains in literacy, student engagement, and trust in AI-assisted instruction while developing guidance for scaling human-centered AI in schools.

How Are Community Colleges Preparing Students for an AI-Driven Workforce?

Community colleges, often overlooked in AI education discussions, are getting direct support through the TRAILS initiative.
Krishna Bista of Morgan State University is launching "AI Learning Labs: Cross-Sector Seminars and Participatory Design Tools for Community Colleges." The project features a three-week online seminar series with TRAILS researchers and external experts, engaging community college and higher education leaders in applied learning and participatory design activities focused on responsible AI adoption. The initiative also produces a curated repository of tools to guide ethical, context-sensitive AI implementation across diverse campuses.

"Our goal is to help close the loop among academia, industry and communities, all of whom are focused on harnessing the power of AI to advance the social good," explained Hal Daumé III, professor of computer science at the University of Maryland and director of TRAILS.

Steps to Build AI Capacity in Under-Resourced Schools

- Pair Teacher Candidates with Mentors: Martha James of Morgan State is hosting a four-day workshop that pairs teacher candidates with mentor teachers to build shared expertise in evaluating AI accuracy, fairness, and risk, particularly in under-resourced schools.

- Develop Classroom-Ready Materials: Rather than relying on generic AI resources, educators co-design lessons aligned with existing standards, ensuring AI literacy fits naturally into social studies, reading, and other core subjects.

- Create Accessible Toolkits for Families: Elizabeth Morgan of Morgan State is developing an "AI for Family Advocacy Toolkit" to help families of children receiving special education services understand how AI tools can support individualized education plans (IEPs) and 504 plans.

- Train Librarians on AI-Powered Search Tools: Ryan O'Grady of the University of Maryland is developing training programs to help librarians navigate emerging AI-driven search tools such as Google Gemini and Perplexity AI, working with a Maryland county library system to refine educational materials.
The breadth of these initiatives reflects a recognition that AI literacy cannot be confined to computer science classrooms. Teachers, librarians, families, and school administrators all need practical knowledge about how AI systems work and where they can fail.

Beyond K-12, the TRAILS awards also fund summer camps and youth outreach programs. "Digital Futures: AI and Business Analytics Workshops for Baltimore Youth," led by Maxim Bushuev of Morgan State, launches a summer camp for high school students in the greater Baltimore area, building on previous programming that reached more than 200 students and counselors. The initiative connects AI and business analytics skills to real-world career pathways.

Similarly, "Training Researchers in AI-Enabled Learning and Systems: A Summer Academy for G9-12 Students," led by Ekundayo Shittu of George Washington University, hosts a summer camp in Washington, D.C., where students develop technical and analytical skills while exploring the social and ethical implications of AI. The trustworthy AI curricula for these camps are being co-designed alongside TRAILS' annual AI Summer Camp at the University of Maryland, with the goal of eventually expanding the program to similar initiatives across the country.

Why Is Public Understanding of AI Trust Important?

Beyond formal education, the TRAILS awards also fund creative public engagement. Jordan Boyd-Graber of the University of Maryland is leading "Creating an AI Trust Game Show," a project that helps members of the public learn to calibrate trust in AI through a game-show-style competition. Teams craft questions, evaluate AI outputs, and compare human strengths with machine capabilities, offering a public window into where AI systems succeed and where they fall short.

The broader portfolio of awards also addresses AI governance and organizational readiness.
Brandeis Marshall, founder and CEO of DataedX Group, is developing an "AI Governance Assessment Toolkit for Resource-Limited Organizations" to help small and midsize enterprises evaluate AI risks, strengthen safeguards, and align digital tools with organizational values. Meanwhile, Katie Shilton and María Isabel Magaña from the University of Maryland are creating success metrics for participatory AI initiatives and evaluating projects from the TRAILS database and other National Science Foundation AI Institutes.

David Broniatowski, professor of engineering management and systems engineering at George Washington University and deputy director of TRAILS, emphasized the long-term vision: "The Broader Impact Program is designed not only to support promising pilot projects, but also to build lasting connections that ensure AI technologies are developed with transparency, accountability and community trust at their core."

Collectively, the 11 awards reflect TRAILS' mission to broaden participation in trustworthy AI and ensure that communities most affected by emerging technologies have a voice in shaping them. By investing in educators, students, families, librarians, and small businesses, TRAILS leaders hope to build sustainable pipelines of talent and informed stakeholders who can guide AI innovation responsibly.