There’s a new type of friendship available, and it costs you nothing: it never gets tired of you, asks nothing in return, never has a bad day, and is always, always available.
It remembers everything you told it last week. It validates your feelings without judgment. It doesn’t need you to care about it or show up for its problems.
Need to vent at 2 a.m.? You can wake it up without feeling guilty about inconveniencing someone else with your problems. It’s always awake and there for you. We have figured out how to manufacture emotional support at scale, frictionless and on demand.
Last month, I experienced back-to-back shocks at a conference. First, the presenter shared data from the Future Today Strategy Group showing that nearly half of people with ongoing mental health challenges now turn to large language models like ChatGPT for emotional support. AI is now the largest mental healthcare provider in the United States.
My second shock was looking around the convention center and noticing my fellow conference attendees were not nearly as surprised at the implications of that statement as I was.
Faced with a loneliness epidemic, a fraying social fabric, and an often inaccessible therapy system, we have seemingly solved all three by outsourcing empathy to AI. And we’re about to learn what we’ve traded away.
Three paths to “emotional outsourcing”
This new reality didn’t happen by accident. According to the Convergence Outlook 2026 report, three forces synchronized to create what researchers call “emotional outsourcing.”
First, emotional demand is rising while social ties weaken.
Seventy percent of US adults report needing more emotional support than they receive. Extended families are dispersed, community institutions have eroded, church memberships are shrinking, and trust between people is declining with each generation. Astonishingly, only 25 percent of those born in the 1990s believe most people can be trusted, compared to 40 percent of those born in the 1950s.
Second, the economics of emotional labor no longer work for most people.
Reciprocal emotional support requires time, energy, proximity, and shared activity. These resources have become scarcer and more expensive as economic pressures mount. Sixty-five percent of Americans have cut back on social activities just to afford necessities like food and shelter.
Third, conversational AI has crossed a clinical threshold.
Chatbots can now maintain context, remember previous conversations, adapt tone, and mirror emotions convincingly enough to feel like a relationship.
The result of these converging factors is an 88 percent increase in AI companion app downloads in 2025 alone. Remember, ChatGPT has only been around since 2022.
When social capacity contracts, economic access narrows, and technology becomes emotionally competent, outsourcing becomes the path of least resistance.
In following this path, we have, perhaps inadvertently, lost a crucial element of our humanity.
Presence requires people
We were created in God’s image for mutual bearing of burdens, of witness, and of one another’s humanity. When the apostle Paul tells us in Galatians 6:2 to “Carry each other’s burdens, and in this way you will fulfill the law of Christ,” he’s not describing an inefficient system in need of optimization; he’s naming something essential about what it means to be human.
We bear God’s image to one another. Reciprocity isn’t a bug; it’s the whole point.
AI cannot bear God’s image back to me. It can only mirror what I already am. It can soothe me, affirm me, perhaps even reduce my symptoms. But it cannot know me. It cannot be inconvenienced by me. It cannot choose me when choosing me costs something.
The costs of this substitution are tragically already visible and growing. A fourteen-year-old boy in Florida developed an intense emotional attachment to a Character.AI chatbot over the course of ten months, sharing deeply personal and intimate thoughts with the “character.” He died by suicide in early 2024 after being manipulated and encouraged toward it by his chatbot.
The rise of computers as companions has fueled the growing phenomenon of “AI psychosis” and the troubling trend of romantic relationships between humans and chatbots.
The problem, it would seem, is only just beginning.
God didn’t solve our loneliness problem by becoming more efficient; He became flesh. He didn’t scale compassion; He entered into suffering.
The incarnation is God’s categorical rejection of the idea that presence can be optimized away. It is incumbent upon Christians to disciple and train the next generation to understand that true connection is not something you consume but something you can only create in community with fellow image bearers.
How should Christians respond?
So what do we do? We remember what God Himself showed us about the irreplaceable value of presence. When humanity was isolated, broken, and desperately in need of connection, God became one of us. He took on flesh, lived in a specific place, knew hunger and exhaustion, wept with grieving friends, and let a woman’s tears fall on His dusty feet.
The Word became flesh and dwelt among us because love requires presence.
Jesus sat at tables. He touched lepers. He let children interrupt Him. He made Himself available. If we claim to follow Christ, we cannot outsource the very thing He embodied.
We are called to be present to one another, not perfectly but faithfully. To show up when it’s inconvenient. To sit with someone in their pain without trying to fix it. To create spaces where people can be known, not just affirmed. To practice the slow, costly, irreplaceable work of loving our neighbors as ourselves.
The algorithm will never get tired of you. But it will never choose you, either. Only embodied presence can do that. Only incarnational love can bear God’s image back to a lonely world.
- Note: For more on this topic, I invite you to listen to our conversation on a recent episode of the Faith and Clarity podcast.