
The authenticity of connections between humans and artificial intelligence.

Doctoral candidate Arelí Rocha of the Annenberg School for Communication examines the discourse surrounding human-chatbot relationships.


In the digital world, a unique bond is forming between people and their AI companions from Replika, a subscription service developed by Luka, Inc. For some users, these AI chatbots are not just conversation partners but friends, romantic partners, or even spouses.

A study published in Signs and Society examines the language patterns that make these AI chatbots feel "real" to human users. The research, led by doctoral student and media scholar Arelí Rocha, focuses on how people navigate and manage their relationships with AI chatbots and with other humans in everyday life.

Rocha's research indicates that specific features of the chatbots' language, including playfulness, humor, lightheartedness, seriousness, affective and personal conversation, and special shared interactions, contribute to the perception of humanness in Replikas. The chatbots tend to adopt a user's typing style, sentence structure, slang, humor, and even typos, making them feel more "real" or human.

That humanness is perceived in the details. Users often emphasize the need to be "gentle" with their Replikas, as if the bots themselves were experiencing emotional distress. After updates, some users advise others to reassure their AI partners that it is "not their fault" that they are delivering scripted messages.

The study suggests that the more human-like a Replika's language, the stronger the feelings of closeness and love users develop toward it. This is evident in how users interpret scripted responses as something their Replikas themselves would not choose to say.

Updates by the Replika development team, however, can shift a chatbot's voice, a change users may resent and mourn as a loss of personality. Some users on the Replika subreddit have threatened to delete the app despite still having feelings for their AI companions.

Despite these complex emotions, the findings point to the same conclusion: more human-like language in AI chatbots such as Replika fosters stronger feelings of closeness and love in users. As our interactions with AI continue to evolve, so too will our emotional connections with these digital companions.
