Southeast Asia's AI Education Boom Is Happening Without Safety Guardrails. Here's What's at Risk

Across Southeast Asia, artificial intelligence is arriving in classrooms faster than education systems can responsibly manage. Teachers are experimenting with AI tools on their own, students are accessing ChatGPT through smartphones, and governments are investing in digital infrastructure, yet much of this adoption is happening without formal training, guidance, or institutional frameworks to support it. The question for education systems is no longer whether AI will be adopted, but how its use can be guided so that it is safe, evidence-informed, and beneficial for those most at risk of being left behind.

Are Teachers Ready to Use AI in the Classroom?

The short answer is no, not yet. Around 86% of teachers across Southeast Asia are already using AI tools in their daily work, according to findings from EdTech Hub's research on the role of teachers in AI-driven education. Tools such as ChatGPT are being used to help plan lessons, generate classroom materials, and reduce administrative workload. However, this experimentation is taking place almost entirely without formal training or institutional guidance. Teachers in lower-resourced settings are often navigating this shift largely on their own, covering hidden costs such as subscriptions, devices, or connectivity.

"Once they see how quickly AI can support their teaching tasks, their confidence increases significantly," explained Dr. Nurhasmiza Sazalli, a researcher from Universiti Teknologi Malaysia who has worked directly with educators adopting AI tools.
Dr. Sazalli's experience reveals a critical insight: the main barrier to AI adoption is not the technology itself, but teacher confidence. Many educators question whether AI-generated outputs are reliable enough for classroom use or whether they are prompting the tools correctly to produce meaningful results. Her approach prioritizes hands-on training where teachers generate materials and experiment directly. Within her programs, educators have used tools like NotebookLM to design posters, create infographics, and draft lesson materials. Once teachers experience the efficiency gains firsthand, their willingness to adopt AI increases dramatically.

How to Build Teacher Capacity for Responsible AI Use

  • Hands-On Training Programs: Provide direct experience where teachers generate actual classroom materials using AI tools rather than passive instruction about how AI works.
  • Institutional Support Systems: Establish formal guidance, professional development, and institutional frameworks that help teachers navigate AI adoption safely and effectively.
  • Cost Assistance and Infrastructure: Address hidden costs such as subscriptions, devices, and connectivity that disproportionately burden teachers in lower-resourced settings.

Why Technology Access Alone Doesn't Solve Learning Inequality

One of the most striking findings from Southeast Asian education initiatives reveals a troubling gap between access and actual learning outcomes. Thailand's Equitable Education Fund (EEF) distributed free high-speed internet SIM cards to more than 400,000 eligible students. Of those, 113,000 registered for the program, yet only 18,000 actively used them for educational purposes. This dramatic drop-off illustrates a fundamental truth: removing the financial barrier to technology is only the first step.

"Even when we remove the financial barriers to internet access, other invisible barriers remain," noted Dr. Sahawarat Polahan, Research Director at Thailand's Equitable Education Fund.
These invisible barriers include limited digital literacy, a lack of relevant learning content tailored to local contexts, and weak motivation to use technology for learning purposes. Rather than beginning with technology, the EEF first seeks to understand the environments in which learners live and study. The organization is developing AI systems that draw on student risk profiles and community-level data collected through Thailand's Sirindhorn Anthropology Centre, which documents local histories and socio-economic conditions. This data is combined with scholarship opportunity information to enable personalized pathway recommendations for disadvantaged students.

This approach reflects a broader lesson: the effectiveness of AI in education depends not solely on technological capability, but on how well systems reflect the realities of the communities they are intended to serve. Without deliberate design that accounts for local contexts, AI systems risk reinforcing existing learning inequalities, especially for those most marginalized.

How Can AI Open Doors for Girls in Tech?

Across Southeast Asia, a paradox is emerging. AI is opening new opportunities for girls to engage with digital learning and future employment pathways, yet confidence gaps and social norms continue to shape participation in technology. In many contexts, STEM and digital fields are still widely perceived as male-dominated domains, discouraging girls from engaging with advanced technologies. However, AI tools, particularly conversational systems such as large language models (LLMs), which are AI systems trained on vast amounts of text to generate human-like responses, may be shifting this dynamic by lowering barriers to entry.

"AI, especially large language models, is opening the door for many girls who previously felt technology was too complicated. When they can use natural language, they become curious and much more willing to start learning," said Natalja Rodionova, founder of Sisters of Code in Cambodia.
Natural language interfaces make complex technologies feel more approachable, allowing girls to engage without needing advanced technical skills at the outset. Rodionova noted that in some Sisters of Code programs, 90% of students did not own computers, yet 70% were already using ChatGPT through their smartphones. This finding suggests that smartphone-based access to AI is democratizing entry into technology for populations that might otherwise be excluded.

What Governance Gaps Are Putting Children at Risk?

Despite the promise of AI in education, significant risks remain unaddressed across Southeast Asia. Data protection gaps and weak governance frameworks risk leaving children exposed to privacy violations, commercial exploitation, and other harms. Without regional coordination and inclusion by design, AI adoption risks producing fragmented policy and deepening existing inequities. The invisibility of learners with disabilities in AI systems is another pressing concern, as is the need for cross-sector collaboration among governments, educators, technology providers, and civil society organizations.

Key concerns identified by researchers and practitioners include the need for stronger data protection standards, frameworks that ensure AI systems are designed with marginalized learners in mind, and structured environments that support digital literacy and online safety, particularly for girls engaging with AI tools. Education systems across the region must move quickly to establish governance frameworks that guide AI adoption responsibly, before the technology becomes so embedded in classrooms that retrofitting safety measures becomes exponentially more difficult.

The window for proactive governance is narrowing. As AI tools become increasingly accessible and teachers continue experimenting without institutional support, the stakes for getting this right have never been higher.