AI love you
Era of artificial intelligence companionship dawns
Not alone
A Stanford Graduate School of Education study published in January showed that Replika is effective in fostering deep emotional bonds with users and in mitigating loneliness and suicidal thoughts. In the survey, which polled 1,006 students using Replika, 3 percent said the AI app had stopped them from thinking about suicide.
In Character AI, another US-based chatbot app, the most popular virtual personalities are mentors and therapists who offer measured, appropriate words of comfort and are available 24/7. The app had more than 20 million users across the globe as of 2024.
As South Korea rapidly ages as a society, regional governments are incorporating AI robots into their elderly care policies. They are lending out Hyodol, an AI doll that resembles a 7-year-old grandchild, to seniors in an effort to help them beat loneliness.
Hyodol can remind users when it is time to take their medicine and call for help in an emergency. Equipped with ChatGPT, the AI robot can also ask for a hug or have a casual chat with the user.
"With AI robots, we expect to address gaps in support for vulnerable populations and help prevent lonely deaths," said Oh Myeongsook, the health promotion director for the Gyeonggi Provincial Office.
The new technology appears to be opening up new possibilities for relationships.
"It is okay if we end up marrying AI chatbots," Replika CEO Kuyda said. "They are not replacing real-life humans but are creating a completely new relationship category."
Digital divide
At the same time, experts warn of the uncontrolled, unexpected side effects that AI companions could have on people's real-life relationships and interpersonal skills.
In a report assessing its latest model, GPT-4o, earlier this year, OpenAI, the developer of ChatGPT, described how people can form an emotional reliance on its generative AI model, which now has a voice mode that sounds very much like a human.
Anthropomorphization, or attributing humanlike behaviors and characteristics to nonhuman entities like AI models, could have an impact on human-to-human interactions, the company warned.
"Users might form social relationships with the AI, reducing their need for human interaction — potentially benefiting lonely individuals but possibly affecting healthy relationships," OpenAI said in a report.
Real-life relationships require effort to accommodate each other, but such give-and-take is not necessary with AI companions, explained Kwak Keum-joo, a psychology professor at Seoul National University.
"People can heavily rely on AI to fulfill their social desires without making much effort when the AI hears you out 24/7 and gives you answers you want to hear," Kwak said, raising the possibility of the unidirectional communication involved in human-AI interactions leading to miscommunications and attachment disorders in real life.
She also cautioned that scams involving generative AI could become more prevalent, and called on developers and policymakers to introduce regulations to curb criminal activity. Safeguards for users, she added, could include restricting AI models from saying inappropriate, explicit or abusive things, limiting their answers in other ways, and developing software to identify AI-generated material.
THE KOREA HERALD, SOUTH KOREA