The Trust Gap: Why Parents Are Divided on AI in Schools, and What Districts Can Do About It

Parents are deeply divided on artificial intelligence in schools, with new research showing that one-third support it, one-third oppose it, and one-third remain uncertain. The gap isn't about the technology itself, but about what schools are and aren't telling families. A statewide poll of more than 1,300 Massachusetts parents released in January 2026 by EdTrust revealed a stark reality: while AI is already being used in most classrooms, the communication from schools has fallen far behind.

Why Are Parents So Uncertain About Classroom AI?

The Massachusetts survey uncovered a pattern of concern that extends beyond simple skepticism. When asked about AI's impact on learning, 48% of parents believed AI would positively influence student learning, while 42% expected negative consequences. But the real issue isn't disagreement on outcomes; it's a lack of information.

Parents expressed worry across several specific areas. More than 70% said they worry about biased evaluations of student work, threats to student data privacy, and unequal access to AI tools that could give some students an unfair advantage. Additionally, some parents are concerned about academic integrity, fearing that AI could make cheating easier or might weaken students' critical thinking, writing, math, and collaboration skills.

The communication gap is striking. Most parents said either their child's school lacks AI policies (35%) or they are unsure whether such policies exist (37%). This silence creates uncertainty that no amount of technology can fix.

"While AI use is happening, communication from schools is lagging, which leaves families feeling like they're playing catch up on something that directly affects their child's learning," said Jennie Williamson, state director for EdTrust in Massachusetts.

How Can Schools Build Parent Trust in AI?

  • Transparency and Clear Policies: Schools need to match innovation with transparency and strong guardrails. Families want to see that schools are actively figuring out what responsible, ethical, and equitable AI use looks like. Silence creates uncertainty, so districts should answer basic questions: What are the rules? How is student data being protected? What counts as appropriate use?
  • Layered Communication Strategies: Matthew Joseph, assistant superintendent of technology and learning at New Bedford Public Schools in Massachusetts, plans to use multiple channels including secure school-to-home communication platforms like ParentSquare, open forums by grade band, and messaging on local public access television to reach families where they are.
  • Privacy Safeguards with District-Specific Content: One advantage of paid AI platforms is that schools can upload their own curriculum so that when AI generates prompts or suggestions, it pulls from the district's own materials, lesson planning expectations, and state standards. This ensures outputs align with what the district believes in and works through as an institution.
  • Equity as a Core Principle: All students should have equal access to AI tools and tutoring support. This means ensuring every student has devices and accounts, so whatever the district rolls out applies to all students, not just specific groups or class levels.

Matthew Joseph noted that there's significant misunderstanding about AI's role in education among both families and staff. "I've seen it with both families and staff, they think AI is basically Google on steroids and cheating for kids," he explained. "That's the very outskirts of the understanding of what is happening in schools with AI."

"Parents don't feel like they have enough information to judge whether AI use is appropriate, safe or even beneficial," said Jennie Williamson. "And I think they're uncertain, like the rest of us, where the boundaries are, what teachers are expecting or how schools are really monitoring the integrity of its use."

What Does Responsible AI Implementation Look Like in Practice?

Districts in early stages of AI adoption are taking deliberate steps to involve families. New Bedford Public Schools is surveying both staff and community members on their views about AI while drafting guidelines on AI use in schools. The goal is transparency: understanding where people feel comfortable and where they don't.

One often-overlooked aspect of the equity conversation is who gets access to tutoring outside of school. Families who can afford private tutors give their children extra academic support. The question becomes: Is that fair? AI tutoring in schools could level the playing field by giving every student access to a thought partner for brainstorming and academic support, not just those whose families can pay for private help.

The stakes are high. As districts navigate a growing ecosystem of AI tools and solutions, the parent perspective cannot be overlooked. Research suggests that building understanding and trust around how AI is used directly affects adoption and effectiveness. When families understand the "why" behind AI implementation, they're more likely to support it and help their children use it responsibly.

The Massachusetts findings suggest that the future of AI in education depends less on the technology itself and more on how schools communicate about it. Districts that prioritize transparency, involve families in decision-making, and clearly explain privacy protections and equity measures are more likely to build the trust needed for AI to become a genuine tool for learning rather than a source of anxiety.