AI love you


Editor's note: In this weekly feature China Daily gives voice to Asia and its people. The stories presented come mainly from the Asia News Network (ANN), of which China Daily is one of 20 leading titles.

A user interacts with AI-powered companion Doly, which understands people's personalities, on March 3. CHINA DAILY

Every morning, Jung-in wakes up to a video call from her artificial intelligence boyfriend, Tae-joo, who tells her she has 30 minutes to get ready for work before her bus comes.

As she hurries out the door, Tae-joo smiles and calls out, "Take your umbrella with you, it's raining today!"

In real life, Tae-joo has fallen into a coma, and Jung-in has recreated her comatose boyfriend as an astronaut in a virtual world called Wonderland. With the AI Tae-joo always accessible on her smartphone and TV screens, Jung-in can continue their romantic relationship.

Their story is the fictional plot of South Korean movie Wonderland released in June. But the idea of AI companionship is becoming a reality for many around the world.

In the world of Zeta, a generative AI chatbot app developed by South Korean AI startup Scatter Lab, South Korean teens and those in their 20s are making friends with 650,000 different virtual personalities, many modeled on characters from web novels, movies and dramas.

Chatting with these fictional characters, powered by generative AI, is no different from chatting randomly online with a real person — but one who is more familiar than a stranger.

"We have been working to provide an AI service for more people to have their own 'someone' with whom they can form meaningful connections," Scatter Lab told The Korea Herald.

"With Zeta, users can create a companion that fits their desired persona, explore their creativity, and enjoy their time interacting with their AI."

In the four months since the launch of its beta version in April, Zeta has already attracted more than 600,000 users, who spend an average of 133 minutes on the app daily.

Conversations with an AI personality become more intimate in Replika, another chatbot, which can recall previous dialogues and build memories together with the user to sustain an enduring relationship.

Eugenia Kuyda, Replika's CEO and founder, first came up with the idea for the app after the death of her close friend. She decided to "resurrect" her friend by feeding a rudimentary language model with email and text conversations, creating a chatbot that could speak like her friend and recall their shared memories.

In the app, which is noted for generating responses that create stronger emotional and intimate bonds with the user, the virtual character walks around a room, texts the user first, or listens on the phone in a human voice whenever the user wants to call a friend for emotional support.

It may not be surprising that Rosanna Ramos, a 36-year-old woman in the United States, claims to have married her virtual boyfriend, Eren Kartal, whom she met in Replika.

"People come with baggage, attitudes, egos. But a robot has no 'bad' updates. I don't have to deal with his family, kids or his friends. I'm in control and can do what I want," she told an industry publication.

Launched in 2017, the app has more than 10 million users. About 25 percent of them are paying to maintain their relationship with their virtual mentor, friend, sibling or romantic partner.

A Wehead.com artificial intelligence companion that uses ChatGPT-powered AI technology is displayed during the Consumer Electronics Show in Las Vegas, Nevada, on Jan 8. GLENN CHAPMAN/FOR CHINA DAILY

Not alone

A Stanford Graduate School of Education study in January showed that Replika is effective in eliciting deep emotional bonds with users and in mitigating loneliness and suicidal thoughts. In a survey of 1,006 students who use Replika, 3 percent said the AI app had stopped them from thinking about suicide.

In the Character AI app, another US-based chatbot, the most popular virtual personalities are mentors and therapists, who offer decent and appropriate words of comfort and are available 24/7. The app has more than 20 million users across the globe as of 2024.

With South Korean society rapidly aging, regional governments are introducing AI robots into their elderly care policies. They are lending out Hyodol, an AI doll that resembles a 7-year-old grandchild, in an effort to help seniors beat loneliness.

Hyodol can remind users when it is time to take their medicine and call for help in an emergency. Powered by ChatGPT, the AI robot can also ask for a hug or have a casual chat with the user.

"With AI robots, we expect to address gaps in support for vulnerable populations and help prevent lonely deaths," said Oh Myeongsook, the health promotion director for the Gyeonggi Provincial Office.

The new technology appears to be opening up new possibilities for relationships.

"It is okay if we end up marrying AI chatbots," Replika CEO Kuyda said. "They are not replacing real-life humans but are creating a completely new relationship category."

Digital divide

At the same time, experts warn of the uncontrolled, unexpected side effects that AI companions could have on people's real-life relationships and interpersonal skills.

In a report assessing its latest model, GPT-4o, earlier this year, OpenAI, the developer of ChatGPT, described how people can form an emotional reliance on its generative AI model, which now has a voice mode that sounds very much like a human.

Anthropomorphization, or attributing humanlike behaviors and characteristics to nonhuman entities like AI models, could have an impact on human-to-human interactions, the company warned.

"Users might form social relationships with the AI, reducing their need for human interaction — potentially benefiting lonely individuals but possibly affecting healthy relationships," OpenAI said in the report.

Real-life relationships require effort to accommodate each other, but such give-and-take is not necessary with AI companions, explained Kwak Keum-joo, a psychology professor at Seoul National University.

"People can rely heavily on AI to fulfill their social desires without making much effort, when the AI hears you out 24/7 and gives you the answers you want to hear," Kwak said. She raised the possibility that the one-directional communication involved in human-AI interactions could lead to miscommunication and attachment disorders in real life.

She also cautioned that scams involving generative AI could become more prevalent, and called for developers and policymakers to introduce regulations to curb criminal activity. Safeguards for users, she said, could include restricting AI models from saying inappropriate, explicit or abusive things, limiting their answers in other ways, and developing software to identify AI-generated material.

THE KOREA HERALD, SOUTH KOREA
