The EdTech industry is at a crossroads. Over 7,000 community members are converging at the ASU+GSV Summit in San Diego this April to grapple with a fundamental problem: the education technology market is drowning in AI tools that promise transformation but fail to earn the trust of the teachers and students who must use them daily.

This year's summit arrives at a genuinely pivotal moment for EdTech. Artificial intelligence is dramatically reshaping what's possible in product features and business models, and major technology companies are leaning hard into learning use cases. Yet this rapid innovation is happening against a backdrop of contracting public education budgets, declining university enrollment, and widespread cultural skepticism around screen time and learning tools.

What's Driving the EdTech Trust Problem?

The core issue isn't that AI tools don't work. Rather, the sector has become what industry leaders call an overcrowded "junk drawer" of shallow applications that prioritize rapid deployment over pedagogical integrity. Ben Kornell, co-founder of Edtech Insiders, will moderate a panel at the summit specifically addressing this challenge, examining why so many AI education tools fail to deliver on their promises and what it takes to build solutions that educators actually want to use.

The problem reflects a broader tension in the EdTech market. Venture capital and big tech companies are racing to capture the education space, but speed to market often comes at the expense of rigorous learning science. Teachers and students can sense the difference between tools built with genuine pedagogical thinking and those designed primarily to capture market share.

How Are Industry Leaders Proposing to Raise Standards?

The summit's programming reflects a sector-wide push to establish higher standards for AI education tools.
Rather than accepting the status quo of mediocre applications, industry leaders are proposing concrete steps to rebuild trust and ensure that new technologies actually improve learning outcomes.

- Prioritize Pedagogical Integrity: Tools must be designed with genuine learning science principles at their core, not as an afterthought. This means involving educators in the design process and testing claims against rigorous evidence before launch.
- Emphasize User Agency: Teachers and students need tools that respect their autonomy and decision-making, rather than systems that automate away human judgment or create dependency on AI recommendations.
- Establish Credible Evaluation Standards: The summit will feature a dedicated session on "AI and Efficacy: What Works, What Doesn't, and What We Need to Know," sponsored by ElevenLabs, to help educators and decision-makers separate genuine impact from marketing hype.
- Support Teacher-Centered Design: A panel featuring Dr. Rachel Book and Charlie Thayer of Lincoln Learning Solutions will explore how AI should be designed to support teachers, not replace them, ensuring that technology enhances rather than undermines educator expertise.

These aren't abstract principles. They represent a deliberate shift in how the EdTech sector thinks about innovation. Instead of asking "What can AI do?" the conversation is shifting to "What should AI do to genuinely improve learning?"

What Role Will Emerging Technologies Play?

The summit will also showcase cutting-edge developments that could reshape education technology. Major tech companies are demonstrating new hardware and software, including Google Beam and Apple's latest hardware designed specifically for students. These tools represent the next frontier of how consumer technology meets the classroom, but their success will depend on whether they're built with the same commitment to pedagogical integrity that industry leaders are now demanding.
The conversation around AI tutoring, in particular, has become more nuanced. While AI tutors have been positioned as a "Holy Grail" in education, promising unlimited scalability and 24/7 availability, the summit will feature a panel titled "Can Tutoring Save Us?" that digs into whether this vision is realistic or an aspirational myth. This reflects a broader maturation in how the sector thinks about AI's role in education.

Why Does This Matter for Students and Schools?

The stakes are high. Schools are making significant investments in AI-powered learning tools, and students are increasingly relying on these systems for homework help, tutoring, and personalized learning. If the tools flooding the market are shallow or poorly designed, schools waste resources and students miss opportunities for genuine learning gains. Conversely, if the sector successfully raises its standards, the potential benefits are enormous: more effective personalized learning, better support for struggling students, and tools that genuinely enhance what teachers can accomplish in the classroom.

The ASU+GSV Summit represents a critical moment for the EdTech industry to course-correct. By bringing together 7,000 founders, investors, educators, and technologists to confront the trust crisis head-on, the sector has an opportunity to establish new norms around what constitutes acceptable AI education technology. The question hanging over this year's summit is simple but profound: How do we navigate these turbulent times and ensure that innovation serves learning, not just market growth?

For educators and school leaders evaluating AI tools, the message is clear. Don't settle for promises. Demand evidence. Ask whether tools were designed with genuine pedagogical thinking. And insist that any AI system respect the expertise and autonomy of teachers. The EdTech industry is finally having this conversation. It's time for schools to demand that vendors listen.