The Physics Problem Holding Back AI Materials Discovery: Why Algorithms Need to Learn the Rules of Nature

Machine learning promises to revolutionize how scientists discover new materials, but only if AI models can be taught to understand the physical world, not just recognize patterns in data. From the Bronze Age to today's Silicon Age, discoveries of new materials have shaped human history. Now, researchers are grappling with a fundamental challenge: building AI systems that respect the laws of physics rather than simply learning statistical patterns from experimental data.

Why Do Standard AI Approaches Fall Short in Materials Science?

Scientists today rely on computer simulations grounded in quantum mechanics to discover and understand new materials. While these methods are powerful, they come with significant limitations. The computational cost is high, the process is slow, and the scope of what can be explored is restricted. New AI-powered methods promise to overcome these constraints, but they introduce a new problem: how do you build machine learning models that actually understand physics?

This question sits at the heart of a Princeton Laboratory for Artificial Intelligence Distinguished Lecture scheduled for April 24, 2026. Elif Ertekin, an associate professor of mechanical science and engineering at the University of Illinois Urbana-Champaign, will discuss the intersection of physics and machine learning in materials discovery. Her work focuses on a critical insight: AI tools designed for materials science must be built with the realities of physics in mind, not treated as generic data-processing systems.

How Can AI Models Learn to Respect Physical Laws?

Ertekin's lecture, titled "Physical Mechanisms or Learned Patterns? Embedding Symmetries and Equations of Motion as Inductive Biases for Materials Simulations," explores bridges between three domains:

  • Quantum Mechanics: The fundamental physics that governs how atoms and molecules behave and interact with one another.
  • Machine Learning: The computational techniques that allow systems to learn patterns from data without explicit programming.
  • Predictive Materials Simulation: The ability to forecast how new materials will perform before they are synthesized in the laboratory.

The key innovation is embedding what researchers call "inductive biases" into AI models. Rather than letting algorithms discover patterns purely from data, scientists can build in constraints that reflect known physics. This approach forces the model to learn within the boundaries of what is physically possible, making predictions more reliable and the discovery process more efficient.
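One common way to embed a symmetry as an inductive bias is to feed the model only features that already respect the symmetry. The sketch below (illustrative only, not Ertekin's method) predicts a toy "energy" from sorted interatomic distances, so the prediction is invariant under translation, rotation, and atom relabeling by construction, no matter what weights the model learns:

```python
import numpy as np

rng = np.random.default_rng(0)

def pairwise_distances(positions):
    """Interatomic distances are unchanged by translating or rotating
    the whole structure; sorting also makes them permutation-invariant."""
    diffs = positions[:, None, :] - positions[None, :, :]
    d = np.linalg.norm(diffs, axis=-1)
    i, j = np.triu_indices(len(positions), k=1)
    return np.sort(d[i, j])

def toy_energy_model(features, weights):
    """Stand-in for a learned model: any function of symmetry-invariant
    features inherits those symmetries automatically."""
    return float(features @ weights)

positions = rng.normal(size=(4, 3))   # 4 atoms in 3-D
weights = rng.normal(size=(6,))       # C(4, 2) = 6 pair distances

e0 = toy_energy_model(pairwise_distances(positions), weights)

# Rotate and translate the entire structure: the prediction is unchanged.
theta = 0.7
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
moved = positions @ R.T + np.array([1.0, -2.0, 3.0])
e1 = toy_energy_model(pairwise_distances(moved), weights)

assert np.isclose(e0, e1)
```

A generic model fed raw coordinates would have to learn these invariances from data, and could easily violate them between training points; here they hold exactly, for free.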

Ertekin is an Andersen Faculty Scholar and associate head for graduate programs in the Mechanical Science and Engineering Department at the University of Illinois. She has received the NSF CAREER Award, the Minerals, Metals and Materials Society Early Career Faculty Fellow Award, and the Emerging Leader Award from the Society of Women Engineers. She currently serves as the director of the Network for Computational Nanotechnology Nanomanufacturing Node and co-director of the HDR Institute for Data-Driven Dynamical Design.

What Makes This Approach Different from Generic AI?

The distinction between embedding physics and ignoring it may seem subtle, but it has profound implications for materials discovery. Generic machine learning models, trained on large datasets without physical constraints, can produce predictions that violate the laws of thermodynamics or quantum mechanics. These models might identify a material composition that looks promising based on statistical patterns in the training data, but would never actually work in the real world.

By contrast, AI systems designed with symmetries and equations of motion as built-in constraints can only learn relationships that are physically plausible. This makes the models more trustworthy and reduces the number of dead-end predictions that waste laboratory time and resources. It also means fewer computational cycles are wasted exploring impossible material configurations.
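A standard example of building equations of motion into a model (again illustrative, not drawn from the lecture) is to learn a potential energy function and define forces as its negative gradient, F = -dE/dx. The force field is then conservative by construction, so simulating the dynamics conserves total energy instead of drifting unphysically, as a separately learned force model can:

```python
import numpy as np

# Toy "learned" 1-D potential: a radial-basis expansion with fixed weights.
centers = np.linspace(-2.0, 2.0, 5)
weights = np.array([0.5, -1.0, 2.0, -1.0, 0.5])

def energy(x):
    return float(np.sum(weights * np.exp(-(x - centers) ** 2)))

def force(x):
    # Analytic F = -dE/dx of the same expansion, not a separate model,
    # so the force field is conservative by construction.
    return float(np.sum(weights * 2.0 * (x - centers)
                        * np.exp(-(x - centers) ** 2)))

# Velocity Verlet integration of Newton's equations of motion.
x, v, dt, m = 0.1, 0.0, 1e-3, 1.0
e_start = energy(x) + 0.5 * m * v ** 2
for _ in range(10_000):
    a = force(x) / m
    x += v * dt + 0.5 * a * dt ** 2
    a_new = force(x) / m
    v += 0.5 * (a + a_new) * dt
e_end = energy(x) + 0.5 * m * v ** 2

drift = abs(e_end - e_start)
assert drift < 1e-3  # total energy stays (nearly) constant
```

This is the design used by modern machine-learned interatomic potentials: the network's job is to fit the energy surface, while the physics of motion is supplied by the structure of the model rather than learned from data.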

The implications extend beyond academic research. Materials discovery is a bottleneck in many industries, from energy storage and semiconductors to aerospace and renewable energy. Faster, more reliable discovery could accelerate the development of batteries with higher energy density, solar cells with better efficiency, and structural materials that are lighter and stronger. The economic stakes are enormous.

Why Is This Conversation Happening Now?

The timing of Ertekin's lecture reflects a broader recognition in the AI and materials science communities that the field has reached an inflection point. Early enthusiasm about applying deep learning to materials discovery has given way to a more nuanced understanding of what works and what doesn't. Researchers are realizing that the most promising path forward combines the pattern-recognition power of machine learning with the constraint-satisfaction power of physics.

This lecture is part of the Princeton Laboratory for Artificial Intelligence's Distinguished Lecture Series, which invites leaders in artificial intelligence to campus each semester to discuss transformative applications across disciplines. The event is hosted by AI for Accelerating Invention, one of the AI Lab's three research initiatives. It represents a shift in how the AI community thinks about domain-specific applications: not as opportunities to apply generic algorithms to new problems, but as chances to develop AI systems that are deeply informed by the science they aim to accelerate.

For materials scientists, engineers, and anyone interested in how AI can solve real-world problems, the message is clear. The future of AI-driven discovery lies not in building smarter algorithms in isolation, but in creating systems that understand and respect the fundamental laws governing the domains they explore.