How We're Learning to Relate to Robots: The Unexpected Psychology Behind Embodied AI
The way we treat artificial intelligence changes dramatically depending on whether we're typing to a chatbot, speaking to a voice agent, or sitting across from a humanoid robot. This insight is reshaping how technologists think about building AI systems and what society should expect as these technologies become everyday tools. Investments in embodied AI (robots with physical form) have skyrocketed from approximately $400 million to over $1.3 billion annually between 2022 and 2024, signaling that the industry believes this shift is real and commercially viable.
The distinction matters because not all AI is created equal. A simple chatbot works fine for assembling research briefs, but it won't build emotional trust the way a digital human with a face might. And neither can substitute for a humanoid robot designed to provide physical care or companionship. Understanding these differences is becoming crucial as non-human entities move from tech circles into professional settings, healthcare facilities, and homes.
What Exactly Are Non-Human Entities, and How Do They Differ?
Innovation researcher Iliana Oris Valiente has been tracking the emergence of what she calls "non-human entities" across four distinct categories, each triggering different human responses. Rather than lumping all AI together, she argues we need to understand how each form factor changes the way we interact with technology and each other.
- Chat-based agents: Text-only AI assistants like Claude or Manus AI that operate behind the scenes. These are task-focused, logic-driven, and invisible to most users. They're primarily used for research, analysis, and workflow automation.
- Voice-based agents: AI systems you can call and speak with, reducing the technical barrier to adoption because humans are naturally wired for conversation. Voice creates a sense of relationship and intuitive interaction that text cannot replicate.
- Digital humans: AI agents that take on a visual identity, face, and personality. This is where human emotional responses intensify significantly. The addition of visual expression triggers deeper engagement than voice alone.
- Embodied AI (robots): Physical systems with humanoid features like skin, hair, and eyelashes. These are being deployed in healthcare, hospitality, and elder care settings, where physical presence and interaction matter most.
The progression from invisible to embodied represents a fundamental shift in how humans relate to AI. Oris Valiente noted that she once fell asleep during a bedtime phone call with a voice agent, suggesting that voice-based AI can create surprisingly intimate interactions. Yet the same person also declined a lunch meeting with a human colleague to instead meet with a humanoid robot named Aria, demonstrating how embodied AI can command attention in ways other forms cannot.
Why Are Companies Racing to Build Embodied AI Systems?
The commercial momentum behind embodied AI is accelerating. X Square Robot, a Chinese company backed by major investors including Alibaba, ByteDance, and Meituan, has raised approximately $280 million to develop general-purpose humanoid systems capable of operating in dynamic, unstructured environments. The company is already generating early revenue from deployments across education, hospitality, and elder care sectors, with plans to expand into household services.
To accelerate real-world deployment, X Square Robot hosted the world's first Embodied AI Developers Conference (EAIDC 2026) in Shenzhen in April 2026. The event brought together researchers, developers, and technology companies to move embodied AI systems from laboratory research into practical applications. The conference introduced what organizers called "three firsts" designed to test robots under realistic conditions: real-robot task execution, continuous system evaluation, and full end-to-end deployment workflows.
Developers competed on four core capability areas that matter for real-world deployment. These included grasping and placement, language understanding, fine manipulation, and long-horizon decision-making. Participants completed practical challenges such as ring placement, instruction-based fruit sorting, cable plugging, and word spelling, all in randomized real-world environments without preset parameters. This approach tests true adaptability and model robustness rather than performance in controlled lab settings.
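The evaluation style described above, scoring each capability as a success rate over trials with freshly randomized environment parameters, can be sketched in a few lines. This is a toy illustration only: the capability names, trial parameters, and `stub_policy` below are my own hypothetical stand-ins, not EAIDC's actual benchmark code or scoring rules.

```python
import random
from dataclasses import dataclass
from typing import Callable, Dict

# Hypothetical capability areas, loosely mirroring the four named in the text.
CAPABILITIES = [
    "grasping_and_placement",
    "language_understanding",
    "fine_manipulation",
    "long_horizon_decision_making",
]

@dataclass
class Trial:
    capability: str
    # Environment parameters randomized per trial, so there are no
    # preset values a robot could be tuned against in advance.
    object_position: float
    lighting: float

def run_benchmark(policy: Callable[[Trial], bool],
                  trials_per_capability: int = 20,
                  seed: int = 0) -> Dict[str, float]:
    """Score a policy as its success rate per capability area."""
    rng = random.Random(seed)
    scores: Dict[str, float] = {}
    for cap in CAPABILITIES:
        successes = 0
        for _ in range(trials_per_capability):
            trial = Trial(cap, rng.uniform(-1, 1), rng.uniform(0, 1))
            if policy(trial):
                successes += 1
        scores[cap] = successes / trials_per_capability
    return scores

# Stub standing in for a real robot controller: it only succeeds
# when the randomized object lands within easy reach.
def stub_policy(trial: Trial) -> bool:
    return abs(trial.object_position) < 0.5

if __name__ == "__main__":
    for cap, rate in run_benchmark(stub_policy).items():
        print(f"{cap}: {rate:.0%}")
```

The design point the sketch captures is that per-trial randomization, not a fixed test script, is what separates a measure of genuine adaptability from a measure of overfitting to a known setup.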
How Can We Prepare for a Future Where Robots Are Commonplace?
As embodied AI systems move into homes, workplaces, and care facilities, individuals and organizations should consider several practical steps to prepare for this transition:
- Understand the form factor: Recognize that different AI systems serve different purposes. A voice agent works well for scheduling, but a humanoid robot might be better suited for physical tasks or companionship. Matching the right tool to the right job prevents wasted investment and poor user experience.
- Expect emotional responses: Be aware that humans naturally develop stronger emotional connections to AI systems that have faces, voices, or physical bodies. This is not a flaw in human psychology; it's a feature. Organizations deploying embodied AI should account for these emotional dynamics in training and implementation.
- Invest in ecosystem understanding: Singapore's experience demonstrates how embodied AI thrives when academia, government, and industry collaborate closely. Countries and regions investing in integrated AI and robotics ecosystems, including reinforcement learning and agentic AI research, are positioning themselves as innovation leaders.
- Monitor deployment outcomes: Early adopters in education, hospitality, and elder care should document what works and what doesn't. These real-world results will inform the next generation of embodied AI systems and help avoid costly mistakes at scale.
What Does This Mean for How We Treat Each Other?
The deeper question keeping researchers awake at night is philosophical: How do our interactions with non-human entities change the way we treat technology, each other, and ourselves? Oris Valiente is exploring this through a series of essays documenting her personal experiences with embodied AI, including a lunch meeting with a humanoid robot whose facial skin she watched being changed in real time. She's also examining the unexpected empathy people feel toward bots that look like humans, based on reactions when people met Laila, her AI clone, for the first time at SXSW.
The implications extend beyond novelty. If humans develop genuine emotional attachments to robots designed to look and act like us, what does that mean for how we value human relationships? If a voice agent becomes a trusted sounding board, does that change how we seek advice from friends? These questions don't have easy answers, but they're becoming urgent as embodied AI systems move from research projects into everyday life.
International research communities are taking this seriously. A group of Swedish PhD students and senior researchers visited Singapore in March 2026 to study how the city-state has integrated embodied AI and robotics into its innovation ecosystem. The trip revealed that physical and embodied AI are central focus areas in Singapore's research and industry strategy, with strong collaboration between universities, government agencies, and private companies.
"Singapore offered an impressive look into a highly integrated AI and robotics ecosystem. We saw how intelligent transport systems are prioritized across road, air, and sea, and how closely academia, government, and industry collaborate. Reinforcement learning, embodied AI, agentic AI and applied robotics are clearly central focus areas," noted Karl-Erik Årzén, professor at Lund University.
The transition from laboratory research to real-world deployment is accelerating faster than many expected. Companies are generating revenue, conferences dedicated to embodied AI are launching, and international research communities are prioritizing the field. But the most important shift may be psychological: we're learning, in real time, how to relate to machines that increasingly resemble us. That learning process will shape not just the future of robotics, but our understanding of what it means to be human.