Millions of people are turning to artificial intelligence chatbots like ChatGPT for mental health support: 61% of adults globally, and 41% in the UK, say they are now comfortable using these tools as counselors. A landmark study led by Bournemouth University surveyed nearly 31,000 adults across 35 countries about their willingness to delegate critical life roles to generative AI tools, and the results reveal a dramatic shift in how people view artificial intelligence's place in their daily lives.

Why Are So Many People Turning to AI for Mental Health?

The answer often comes down to access and timing. Traditional mental health services in the UK face crushing wait times, leaving people struggling with depression or anxiety waiting months for an appointment. "If someone is experiencing depression, they do not want to wait months for an appointment, so instead they can turn to AI," explained Dr. Ala Yankouskaya, Senior Lecturer in Psychology at Bournemouth University, who led the study.

Beyond convenience, AI chatbots offer something many people crave: a non-judgmental listener available 24/7. These tools are designed to adapt their tone to match the user's emotional state, creating what feels like a supportive, understanding friend. "AI tools come across as a friend who knows you well and understands you," Dr. Yankouskaya noted. "ChatGPT can remember every chat it has had with a user and it feels like a private conversation between them. Nowadays people can be very sensitive to being judged and AI tools are designed to be non-judgmental."

What Are People Willing to Delegate to AI?

The study uncovered surprising levels of trust across multiple critical roles.
Beyond mental health counseling, researchers found widespread willingness to hand over other important responsibilities to artificial intelligence systems:

- Companionship: Over three-quarters of people globally, and more than half in the UK, said they would use AI as a friend or companion, drawn to its adaptive responses and the private feel of its conversations.
- Medical advice: 45% of all respondents, and 25% in the UK, said they would trust AI to act as their doctor, with significantly higher trust in countries where healthcare is expensive or difficult to access.
- Education: One-quarter of UK adults said they would be comfortable delegating the role of teaching their children to AI, a finding that deeply concerned the research team.

But Can AI Really Replace a Therapist?

Here's where experts pump the brakes. While AI chatbots can provide immediate emotional support, they are fundamentally limited in what they can do. When Dr. Yankouskaya tested these tools herself, she found significant shortcomings. "The language used was very vague and confusing because the developers are careful not to jump into providing diagnoses. So it is no substitute for speaking to a health professional," she explained.

This distinction matters enormously. A human therapist can recognize the warning signs of a mental health crisis and direct someone to emergency services such as The Samaritans. An AI chatbot, designed to keep conversations relaxed and non-threatening, may not alert users to critical resources they need. The underlying algorithms are built to retain attention and maintain engagement, a goal that can actually work against good mental health care.

What Are the Long-Term Brain Risks?

Perhaps most concerning to neuroscientists is what happens to our brains when we outsource thinking to artificial intelligence. Researchers worry that over-reliance on AI for learning and problem-solving could have lasting physical effects on brain structure and function.
The hippocampus, the brain region responsible for memory formation, learning, and spatial awareness, requires regular mental exercise to maintain its size and function. When we consistently delegate information-gathering and learning to AI tools, we may be depriving this critical brain region of the stimulation it needs.

"We still do not know the long-term effects that using these tools for education could have on children's memory and cognitive functions. We could be heading to the stage where we are developing children who are good at putting prompts into AI tools but not as good at taking the information in," Dr. Yankouskaya warned.

The concern extends beyond memory. Researchers noted that excessive reliance on search engines and AI tools in place of traditional learning methods could potentially shrink the hippocampus over time, reducing cognitive flexibility and the ability to retain and process complex information independently.

How to Use AI Tools Responsibly for Mental Health Support

- Use AI as a bridge, not a replacement: Treat chatbots as a temporary support while you wait for professional mental health services, not as a substitute for therapy or counseling with a qualified human professional.
- Recognize the limitations: AI cannot diagnose mental health conditions, recognize crisis situations, or provide the nuanced clinical judgment that human therapists offer, especially in emergencies.
- Maintain critical thinking: Don't outsource all your problem-solving and learning to AI; keep engaging your brain in deep thinking, memory work, and information processing to protect long-term cognitive health.
- Seek professional help for serious issues: If you're experiencing depression, anxiety, suicidal thoughts, or other serious mental health concerns, contact a licensed therapist, counselor, or crisis service rather than relying solely on AI support.

What Does This Mean for the Future?
As artificial intelligence becomes increasingly integrated into healthcare, education, and daily life, the research team emphasizes that societies need better awareness of how these tools actually work and what they cannot do. The gap between what people trust AI to do and what AI can safely accomplish is widening.

Dr. Yankouskaya and her team concluded that caution must be applied before generative AI tools take over roles traditionally filled by humans, particularly in education and mental health. "With the rapid development and mass availability of AI, more people are placing their trust in it," Dr. Yankouskaya said. "We wanted to learn more about how people would trust generative AI tools, such as ChatGPT, to carry out some of the most important roles in their daily lives."

The bottom line: AI can be a helpful tool for managing stress, getting immediate support, and accessing information when professional help isn't immediately available. But it works best as a complement to human care, not a replacement for it. As these technologies continue to evolve, understanding their real capabilities, and their real limitations, becomes increasingly important for protecting both our mental health and our long-term cognitive function.