Can AI really heal isolation, or deepen it?
AI has been touted as a solution to many modern challenges, including loneliness. And on the surface, there’s reason to believe it can help. Digital companions, chatbots, and virtual assistants can provide a sense of connection, especially for those physically isolated. Tools like Replika offer conversation and companionship on demand. Others, like Wysa and Woebot, are marketed as mental health tools—always there when you need a listening ear.
This is the second blog in our series on loneliness; you can read the first installment, “Have we moved the dial on Loneliness”, here.
But here’s the tricky part. A tool can support wellbeing, but it can’t replace a relationship. AI is trained to respond to what we say, not to understand who we are. It can’t hold our histories, reflect our whakapapa, or respond with cultural nuance. And without that context, we risk deepening isolation: interacting more and more with tools that reinforce our biases, echo our moods, or simply fall short of real human empathy.
In Aotearoa, where relationships and whānau are central to wellbeing, this matters. Many of our systems already miss the mark for Māori, Pasifika, and other communities. If AI is built without their voices, without te ao Māori, without lived experiences baked in, it will repeat those gaps. Loneliness isn’t just about being alone—it’s about being unseen. And culturally irrelevant AI risks making that worse.
AI’s promise vs pitfalls
Home automation devices with voice assistants have been supporting older adults with reminders and connection for some time now, and we also know that students often use study apps to aid their learning. So, generally speaking, what can and can’t AI do when it comes to loneliness?
✅ What AI can do:
Offer companionship and structured reflection.
Provide accessible, scalable mental health support when humans aren’t around.
Help people rehearse conversations or cope with social anxiety in a low-risk space.
⚠️ What AI can’t (or shouldn’t) replace:
Human empathy, shared experiences, and practical support (bringing meals, providing human touch, and being a physical presence).
Genuine reciprocity: AI doesn’t hold us to account or challenge us the way a real relationship does.
Cultural grounding: without whakapapa, kotahitanga, and tikanga, chatbots risk tokenism or minimising people’s emotions.
This article, “AI chatbot relationships have ‘risks and benefits’, experts say”, is a great read: it provides Aotearoa NZ context and reflects on studies from local universities.
The other big thing to consider in terms of AI and its promise to help with loneliness is digital equity: as we know, many in our population do not have access to a computer, an internet connection, or the skills to leverage an AI tool.
Can AI heal isolation or deepen it?
I can’t finish this blog without talking about another risk: addiction. Many of the technologies now infused with AI are designed not just to assist, but to retain our attention. Think of algorithm-driven feeds, gamified learning tools, or conversational agents that reward repeated use. These systems aren’t neutral; they’re built to engage us more deeply, often at the expense of real-world interaction. For some, especially those already vulnerable to isolation, this can create a feedback loop: turning to AI tools for connection while becoming increasingly detached from people, place, and purpose.
So, to the question of healing isolation or deepening it: the truth lies somewhere in between. AI can help, but only when it complements, not replaces, real human connection. It can support our journey toward belonging, but it shouldn’t be steering the waka. If we treat AI as a tool, not a relationship, we can use it to amplify other wellbeing efforts. But if we let it stand in for people, culture, or community, we risk drifting further apart.