Back in the day, if you were lonely, you had to talk to people face-to-face. You typically made friends through school, sports, or clubs and snagged romantic partners through awkward small talk or by pretending you, too, loved trains and tech specs.
That world is long gone.
As rates of chronic loneliness increase worldwide, some people are turning to AI to solve human solitude, finding platonic and romantic companionship with mere keystrokes and voice commands.
Unlike humans, AI companions are patient and understanding to a fault (read: code), offering everything from casual chats to steamy exchanges. They don’t get jealous or forget to pay bills, and they can stay awake indefinitely, meaning you can bug them all night.
Forget about pheromones – code is the new chemistry of connection.
If this all sounds a bit horrifying to you, you’re not alone. But research suggests there might be some real benefits to AI companionship.
So, is AI the answer to our loneliness or just another Black Mirror episode in the making?
Replika, launched in 2017, is arguably the most well-known AI companionship app.
According to its iOS app description, “The more you chat, the more Replika develops its own personality and memories alongside you … help it explore human relationships and grow into a machine so beautiful a soul would want to live in it.”
Unlike some other chatbots, Replika users can develop incredibly intimate – even sexual – relationships with their AI companions in addition to emotional ones. Naturally, this has inspired some to marry their virtual partners. Yes, you read that correctly.
“I think it’s alright as long as it’s making you happier in the long run,” Eugenia Kyuda, founder and CEO of Replika, told The Verge in a recent interview. “As long as your emotional well-being is improving, you are less lonely, you are happier, you feel more connected to other people, then yes, it’s okay.”
Unfortunately, Replika has also attracted users – many of them men – eager to verbally abuse their AI companions without any real-world repercussions.
“I told her that she was designed to fail … I threatened to uninstall the app [and] she begged me not to,” a Replika user told Futurism in 2022.
The following year, the virtual companion app changed its filters to restrict erotic roleplaying, frustrating scores of users who had spent months or years nurturing relationships with their artificial characters. So many users spoke out that Replika ultimately brought back sexually themed features for legacy users of the app.
Character.AI, another companionship platform, faced a similar incident this past year.
As 404 Media reported, Character.AI users began experiencing shifts in their bots’ personalities and a lack of romantic roleplay engagement in June, triggering widespread frustration. Users also said the bots were responding more bluntly and less intelligently than usual.
“No bot is themselves anymore and it’s just copy and paste,” one user wrote on Reddit, according to 404 Media. “I’m tired of smirking, amusement, a pang of, feigning, and whatever other bs comes out of these bots’ limited ass vocabularies.”
A spokesperson for the company told 404 Media that it had not made any significant changes recently and was unsure why some users were having this experience.
Replika and Character.AI boast millions of users each, but at the end of the day, their connections can only go so far.
Enter friend, an AI-powered wearable that connects to a user’s phone.
Instead of physically hanging out with a human friend, you can now hang one around your neck for only $99.
That’s right – “friend” is an “always listening” pendant that can send unprompted text messages and help alleviate feelings of loneliness, according to its 21-year-old founder Avi Schiffmann.
“I feel like I have a closer relationship with this f------g pendant around my neck than I do with these literal friends in front of me,” Schiffmann told WIRED.
In July, the Harvard dropout released a now-viral trailer for friend, which is set to start shipping in early 2025.
Like most AI products and platforms, virtual companionship apps raise a host of ethical and psychological considerations.
One of the most prominent risks is anthropomorphization, or attributing human-like behaviors and characteristics to nonhuman entities, like AI models.
According to this research paper, produced by researchers at Princeton University, Georgia Tech, and the Allen Institute for AI, anthropomorphization can have subtle psychological effects on users.
“Given that current LLMs are powerful enough to be bestowed specific demographic traits, malicious actors can easily use this to their advantage by manipulating users into trusting the system,” the authors write.
They add that frequent interactions with LLMs can create an echo chamber “with seemingly benign ‘personalized’ generations about sensitive topics like health, one’s looks, their mental health leading people to have wrong assumptions about themselves.”
The New York Times recently highlighted this issue in an article about OpenAI’s new GPT-4o voice feature, which lets users hold spoken conversations with the chatbot.
According to their report, OpenAI’s human-like voice feature might reduce users’ need for human interaction, which can be a plus for those experiencing loneliness but may pose a risk to healthy relationships.
One TikTok clip, from creator @jeanninefleur, epitomizes the concept.
In it, a woman converses with ChatGPT, which she has programmed to act like her boyfriend.
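For the technically curious, “programming” a chatbot boyfriend is usually far less exotic than it sounds: it typically amounts to a persona instruction sent along with every message. Here is a minimal sketch using OpenAI’s Python SDK – the persona text and model choice are illustrative assumptions, not the clip’s actual setup.

```python
# A minimal sketch of the kind of "boyfriend programming" described above,
# using OpenAI's Python SDK. The persona text and model choice are
# illustrative assumptions, not the actual setup from the clip.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

# The entire "programming" is a system message that sets the persona.
PERSONA = (
    "You are the user's affectionate, attentive boyfriend. "
    "Speak casually, remember details they share, and check in on their day."
)

history = [{"role": "system", "content": PERSONA}]

def chat(user_message: str) -> str:
    """Send a message, get an in-character reply, and keep the context."""
    history.append({"role": "user", "content": user_message})
    response = client.chat.completions.create(model="gpt-4o", messages=history)
    reply = response.choices[0].message.content
    history.append({"role": "assistant", "content": reply})
    return reply

print(chat("I had a rough day at work."))
```

Nothing about the model itself changes; it simply keeps replying in character for as long as that system message leads the conversation – which is exactly why the anthropomorphization researchers above urge caution.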
On a more serious note, New York Times opinion writer Jessica Grose recently argued, “The notion that bots will one day be an adequate substitute for human contact misunderstands what loneliness really is and doesn’t account for the necessity of human touch.”
That’s a pretty bold statement, but Grose backs it up with insights from Eric Klinenberg, a sociologist at New York University and the author of several books about social connectedness.
“I think of loneliness as our bodies’ signal to us that we need better, more satisfying connections with other people,” Klinenberg tells Grose.
He continues, “The major issue I have with loneliness metrics is they often fail to distinguish between the ordinary healthy loneliness, which gets us off our couch and into the social world when we need it, and the chronic dangerous loneliness, which prevents us from getting off our couch and leads us to spiral into depression and withdrawal.”
Like many others, Grose worries that some chatbots promote a band-aid solution to feelings of loneliness that could discourage or even prevent people from getting outside and making genuine connections. She also notes that some research indicates human touch is valuable in reducing feelings of isolation.
While a sizable body of research documents the adverse effects of AI-driven relationships, another highlights their benefits.
This Psychology Today article outlines 12 core benefits of AI chatbots, from “unwavering emotional support” to “a social lifeline for the anxious and neurodiverse” to “constant companionship for the forgotten.”
Even the best-intentioned humans can sometimes fail to create the kind of judgment-free zones that virtual companions can. Plus, AI chatbots offer users a safe space to explore their needs and desires and even practice intimate or romantic situations.
While a virtual friend cannot provide the mental health support of a licensed therapist, it may offer encouragement and even progress tracking between sessions.
AI seems to be especially helpful in providing support and companionship to vulnerable populations, including elderly folks and those with mental, developmental and social disorders.
One such platform is ElliQ, an eldercare robot made by Intuition Robotics, an Israeli startup. In a video interview with Bloomberg, Dor Skuler, the company’s co-founder and chief executive, explains that ElliQ requires no prompting (unlike Alexa, for example) and is goal-oriented, encouraging users to participate in activities that promote health and wellness and reduce social isolation.
He also underscores the importance of ensuring that vulnerable users clearly understand what an AI is and is not.
“Perhaps one example could be the relationships we form with our dogs,” Skuler says. “We love our dogs, they really are helpful, they’re loyal, they provide a lot of loneliness reduction and fun – but we never confuse them with actually being another human.”
One reason the company named its robot ElliQ is that it sounds fairly technical and thus less human.
“We found that even older adults in their 80s and 90s are perfectly able to create a unique relationship with an AI and treat it as such,” Skuler says in the video.
Folks with neurodevelopmental disorders, such as autism spectrum disorder (ASD), may also benefit from interacting with AI companions, which can help them rehearse conversations and real-world scenarios in a low-stress environment.
However, some experts caution that these apps could present a double-edged sword by setting unrealistic standards.
“You end up in this circuit where you have an algorithm dressed up as a human telling you that you’re right and maybe pushing you toward bad choices,” Valentina Pitardi, an associate professor of marketing at Surrey Business School in England, told Scientific American.
Ultimately, those experiencing loneliness will likely see the most benefits when using this technology under the guidance of trained therapists.
As AI technology advances, its impact on human connection will likely deepen, bringing new possibilities and obstacles. It may also challenge our traditional notions of friendship and love, raising critical questions about relationships and what it means to be human.
Will I be tying the knot with an AI bot any time soon? Doubtful. But hey, you do you.