Millions of people are forming emotional bonds with artificial intelligence (AI) companions they consider to be friends or romantic partners. Through apps such as Replika, Character.ai, and Xiaoice, users can form long-term relationships with chatbots, often embodied as customizable avatars.
"The mission for Replika is to create an empathetic friend, someone who's there for you and can help you feel better," says Eugenia Kuyda, Replika's founder and CEO.
As the AI powering chatbot companions becomes more sophisticated, having one is likely to become even more common. Researchers, therefore, are investigating the nature of human-AI relationships and what their consequences might be.
"Friendship is one of the most important relationships we humans foster; it offers emotional support, companionship, and often forms the bedrock of our social lives," says Petter Bae Brandtzaeg, a professor in the Department of Media and Communication at Norway's University of Oslo. "The influence of AI in this realm could have profound implications, shaping not only our interpersonal relationships, but also our societal structures."
Traditionally, friendships have formed between people who lived close to each other. However, the ability to communicate over the Internet now allows us to have friends we have never met in person. It has also changed how we interact with those we know in real life, by making people more immediately contactable and by allowing for public interactions such as commenting on social media posts.
Instead of simply mediating our interactions, technology is now going a step further by creating non-human entities that mimic human emotions and conversations. "AI has the potential to become a new addition to this diversifying landscape of companionship," says Brandtzaeg.
Replika chatbots are able to converse using a combination of scripted dialogue and generative AI—algorithms that learn patterns from data they are trained on to produce output with similar characteristics. In Replika's case, the AI is trained on large quantities of text conversations. When the app was first launched in 2017, the generative AI component used a recurrent neural network (RNN)—a type of deep learning model that learns to recognize sequential features in data. However, Kuyda says, while the responses were sometimes funny, the quality of the output was not great. "It was more of a coin toss," she says.
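Replika's production models are proprietary, but the general idea of an RNN-based response generator can be illustrated with a minimal sketch. The PyTorch model below (with made-up vocabulary size and dimensions) predicts the next token of a reply one step at a time, carrying the conversational context forward in a hidden state.

```python
# Minimal sketch of a recurrent next-word predictor; not Replika's actual model.
# Vocabulary size, dimensions, and the random "prompt" are illustrative only.
import torch
import torch.nn as nn

class TinyRNNResponder(nn.Module):
    def __init__(self, vocab_size=10_000, embed_dim=128, hidden_dim=256):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.rnn = nn.GRU(embed_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, token_ids, hidden=None):
        x = self.embed(token_ids)             # (batch, seq_len, embed_dim)
        output, hidden = self.rnn(x, hidden)  # hidden state carries context forward
        return self.out(output), hidden       # scores over the vocabulary

model = TinyRNNResponder()
prompt = torch.randint(0, 10_000, (1, 6))     # stand-in for a tokenized user message
logits, state = model(prompt)
next_token = logits[0, -1].argmax().item()    # greedy pick of the most likely next word
```

Because the network reads the sequence step by step, each prediction depends on everything seen so far, which is what allowed early versions of such systems to produce plausible, if inconsistent, replies.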
Replika's conversational abilities improved significantly through the use of transformer models, a type of neural network that is more efficient and accurate because it analyzes a series of words all at once, instead of processing them in a fixed order as RNNs do. Transformers learn relationships between the words in a sentence by quantifying the strength of the connections between them; the word 'ear' is more likely to be followed by 'phone' or 'plug' than by 'happy', for example. "The proportion of (generative AI used) became much bigger over time, and the quality of it became better," says Kuyda.
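The self-attention step that sets transformers apart from RNNs can also be sketched in a few lines. In this hypothetical snippet (random weights and a toy four-word sentence, chosen purely for illustration), the connection strength between every pair of words is computed in a single matrix operation rather than by stepping through the sequence one word at a time.

```python
# Toy single-head self-attention; weights are random stand-ins for learned parameters.
import torch
import torch.nn.functional as F

torch.manual_seed(0)
tokens = ["my", "ear", "phone", "broke"]     # toy sentence
d_model = 16
x = torch.randn(len(tokens), d_model)        # stand-in word embeddings

# Learned projections (random here) turn each embedding into a query, key, and value.
W_q, W_k, W_v = (torch.randn(d_model, d_model) for _ in range(3))
Q, K, V = x @ W_q, x @ W_k, x @ W_v

# All pairwise connection strengths are computed at once, not word by word.
scores = Q @ K.T / d_model ** 0.5
weights = F.softmax(scores, dim=-1)          # each row sums to 1
contextualized = weights @ V                 # every word blended with the words it attends to

print(weights[tokens.index("ear")])          # how strongly 'ear' attends to each other word
```

Processing all words in parallel is what makes transformers both faster to train and better at capturing relationships between distant words in a conversation.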
Even if AI companions are capable of human-like conversations, relationships with a machine differ inherently from those between people. In recent work, Brandtzaeg and his colleagues investigated the key characteristics of friendships with an AI and how they contrast with those between humans, a topic that, to their knowledge, had never been studied before. The team interviewed 19 Replika users who had developed a friendship with their chatbots, asking how they perceived the relationship and how it compared with human friendships.
Brandtzaeg and his colleagues found that AI friends seemed to enable a new type of personalized relationship that revolves around a person's own needs and interests. Whereas friendships between humans are often centered around shared experiences, chatbots enable people to form deep connections through long-term interaction. Some study participants highlighted the power they had over their chatbots, which tended to follow their lead; they found this less appealing, since it made them more aware that they were responsible for maintaining the relationship.
However, the constant availability of Replika's chatbots seemed to be key to their appeal. "Human-to-human friendships can often be difficult because humans are busy," says Brandtzaeg.
Participants also generally trusted their chatbots and felt they could communicate openly with them. Some mentioned being more comfortable sharing their feelings with an AI companion compared to a human, since they felt it had no bad intentions. "We found that social chatbots such as Replika may fulfill people's need for social interaction," says Brandtzaeg.
Design features of an AI friend can also play a role in how people relate to them. Ulrich Gnewuch, a postdoctoral researcher who leads the Chatbot Research project at Karlsruhe Institute of Technology (KIT) in Germany, and his team have been studying how giving chatbots certain human characteristics affects how we interact with them. They may be given a name and gender, or a three-dimensional (3D) avatar that can change its facial expressions and make gestures. "I think they need these features because they're important for developing relationships," says Gnewuch.
In one study, however, Gnewuch and his colleagues found that users perceived these humanizing aspects differently, depending on how much experience they had with chatbots. Novice users often liked such features, since they helped them become familiar with a new type of interaction. Conversely, people who were already well versed in chatbots typically found attempts to make them seem more human—such as using emojis or delaying response times—annoying or distracting. "I think that's why it's great that you're able to personalize (AI companions)," says Gnewuch. "Based on your own preferences, you can adjust the design a bit and hopefully create a companion that is adapted to yourself."
Gnewuch and his team also have been examining the feelings of ownership that people often have towards their AI friends. In a recent analysis of over 100,000 Replika reviews from the Google Play store, they investigated how these feelings might come about and the potential positive and negative consequences.
According to theory, people might feel they own something as a result of having control over it. Gnewuch and his team hypothesized that users with a premium Replika subscription, which allows them to control several aspects of their AI companion, such as its appearance and relationship status, would be more likely to feel their chatbot 'belonged' to them.
The researchers found, however, that having premium access was not the main factor contributing to feelings of ownership. Instead, people who put more effort into teaching their chatbot about themselves were the most likely to feel it belonged to them. "There were a few other (contributing) factors, like how long you have the AI companion, how intensively you interact with it," says Gnewuch. "This is something that really contributes to relationship development."
The team found such feelings of ownership helped users form deeper bonds with their chatbots; it also made them more likely to maintain a premium subscription, or upgrade to one. Gnewuch thinks the dependence of users on a chatbot provider could potentially be problematic: having an AI companion suddenly cease to exist if the company shuts down or is sold could be the equivalent of suddenly losing a close friend. "You're locked into a relationship," says Gnewuch. "You cannot move to a different provider with your companion."
The many positive aspects of AI companions, however, mean they are continuing to evolve. Last month, the company behind Replika launched a new AI-powered dating app called Blush that is focused on building romantic relationships with chatbots. With functionality similar to that of dating apps such as Tinder, a user is matched with multiple AI characters with different personality traits, with which they can chat or flirt. The aim of the app is to build users' confidence when forming relationships, for example by making them feel they are worthy of love and by letting them learn to navigate different relationship dynamics. It could become a stepping stone to real-life relationships with other humans, says Kuyda.
Brandtzaeg and his team have found that AI companions can empower people to connect with other humans. Their latest study will examine how ChatGPT—the AI chatbot developed by OpenAI that can hold human-like conversations and assist with a variety of writing tasks—can help young people with mental health issues. The researchers also want to assess how support provided by an AI differs from that of human experts in the field.
Brandtzaeg's work is fueled by his prediction that in the next five years, everyone will have their own personalized chatbot. "Human relationships with AI are likely to continue evolving in the future, driven by advancements in technology and increasing integration of AI into various aspects of our lives," he says. "(Customized AI chatbots) will be our new smartphone."
Sandrine Ceurstemont is a freelance science writer based in London, U.K.