How the Chinese AI company GoLaxy could change our lives

The Daily Article

A reality that breaks God’s heart

August 8, 2025

The ChatGPT app icon is seen on a smartphone screen, Monday, Aug. 4, 2025, in Chicago. (AP Photo/Kiichiro Sato)

With yesterday’s introduction of GPT-5—the latest and most powerful version of the popular ChatGPT—artificial intelligence is back in the news, with the focus largely on how its advancements will impact day-to-day life. But while that story is important, today I’d like to discuss a more subtle way that AI is likely to change our lives going forward. Let’s start with GoLaxy, a Chinese company that would probably prefer to stay out of the news but has found itself at the center of a growing international controversy. 

As Brett J. Goldstein and Brett V. Benson—two professors at Vanderbilt University who specialize in international and national security—recently documented, GoLaxy is “emerging as a leader in technologically advanced, state-aligned influence campaigns, which deploy humanlike bot networks and psychological profiling to target individuals.”

They go on to note that the company—which denies any official connection with the Chinese government despite a preponderance of evidence to the contrary—has been used in recent years to promote China’s preferred candidates and positions across elections in both Hong Kong and Taiwan. 

GoLaxy works by mining social media to build profiles “customized to a person’s values, beliefs, emotional tendencies and vulnerabilities.” The company then feeds that information into its AI personas, which engage in conversations that feel authentic and human enough to largely evade the protections platforms put in place to limit or identify AI activity.

As Goldstein and Benson describe, “The result is a highly efficient propaganda engine that’s designed to be nearly indistinguishable from legitimate online interaction, delivered instantaneously at a scale never before achieved.”

Now it appears GoLaxy is looking to expand its efforts into the United States, with data suggesting that it has already built profiles on at least 117 members of Congress and more than 2,000 other political and cultural leaders. And if its past actions are any indication of its future intent, odds are good that it won’t stop there.

Fortunately, reviews are mixed on just how influential GoLaxy’s efforts have been to date. Those efforts will likely grow more effective with time, but for now, they seem to represent more of a potential threat than an imminent one. However, the same cannot be said for a growing trend in AI that resides much closer to home.

“The AI companion who cares”

On this week’s inaugural episode of Faith & Clarity—our recently relaunched Denison Forum Podcast—we discussed the rise of AI companions and their increasing pervasiveness throughout our culture. While ChatGPT and others are driving much of the innovation in artificial intelligence, these companion bots have carved out a growing niche among some of our society’s most vulnerable.

Companies like Character.ai, Replika, and others offer users the chance to engage with AI personas tailored to their whims while promising a judgment-free interaction. Replika, for example, markets its bots as “The AI companion who cares. Always here to listen and talk. Always on your side.”  

While there are a host of reasons why such promises should send a shiver down your spine, the fact is that these companies have identified a very real need in our culture, and the statistics show that they are increasingly effective at meeting it.

As of December of last year, Character.ai users spent an average of 93 minutes a day chatting with bots, which is 18 minutes longer than the average user spent on TikTok—the gold standard for social media addiction. As AI improves, it’s easy to see a world where that number only grows larger. 

But while users claim these conversations are “harmless fun and can be a lifeline for people coping with anxiety and isolation,” the truth is that they are often relied upon most heavily by those who are least equipped to use them well. 

That’s a problem we cannot afford to ignore. But how should it be addressed?

A flaw in the system

The most tempting cure for the misuse of AI will be to curtail its use or to restrict access for those most vulnerable to its abuse. A good example of that solution at work is the Illinois law passed this week, which makes it illegal for AI chatbots to offer any therapeutic advice or communication. But while the idea has merit, I doubt it will do much to truly solve the issue.

You see, the problem is not the technology. It’s the people who use it. 

As long as there are hurting people who would rather find affirmation in AI than real relationships with other humans, services like these will have a market. We can try to limit access or curtail what kinds of services they can offer, but the root cause will still remain. 

Given that the yearning for an accountability-free community is as old as the Garden of Eden (Genesis 3:8), I doubt we’ll fix that particular flaw in the system anytime soon.

But if we can’t stop the desire for what AI offers and we can’t prevent companies from making it available, what can we do? The Gospels give us a good place to start.

A level of intimacy AI can’t touch

Jesus had a habit of meeting with people who were desperate for connection, rocked by insecurity, and yearning for acceptance. Whether it was the woman at the well, the leper afraid to do more than shout from a distance, or Peter after he denied even knowing Jesus, many of the most memorable moments of Christ’s ministry came when he met broken people and made them whole again.

Unfortunately, in this world, it seems like for every Peter, there’s a Judas: someone in desperate need of God’s grace but too trapped in his guilt and sin to seek it. And that is just as true when it comes to AI as it is for any other temptation we face. 

At the end of the day, we can warn people of the dangers, point them toward healthier alternatives, and do our best to offer the kind of genuine community they need, but it’s still up to the individual to decide whether or not they will accept it. The same free will that leads some to salvation traps others in damnation. 

That reality breaks God’s heart (2 Peter 3:9), and it should break ours as well. But it’s the truth. 

At the same time, on this side of heaven, it will always be too soon to give up on a person who needs Jesus. Our job is to love those God brings across our path and make sure every facet of our lives points to the joy, peace, and contentment that can only be found in him. 

But we can’t offer others what we don’t have ourselves. 

So, take some time today to ask God to help you identify any areas in your life where you need his healing. Truly open your heart and soul to his Spirit, committing to make whatever changes he asks of you and to surrender any aspects of your life that are not yet submitted to him. Then ask him to make you aware of anyone you meet today who needs that same healing.

While AI can mimic a host of human interactions, it will never approach the level of intimacy available through the Holy Spirit’s presence in our lives. 

How well will your life reflect that reality today?

Quote of the day:

“Technology is per se neutral: but a race devoted to the increase of its own power by technology with complete indifference to ethics does seem to me a cancer in the Universe.” —C. S. Lewis


