
Amid the Rise of Artificial Intelligence, AI Girlfriends – and Boyfriends

By Elliefrost @adikt_blog

NEW YORK (AP) - A few months ago, Derek Carrier started seeing someone and fell in love.

He experienced a lot of romantic feelings, but he also knew that it was an illusion.

That's because his girlfriend was generated by artificial intelligence.

Carrier wasn't about to build a relationship with something that wasn't real, nor did he want to be the victim of online jokes. But he did want a romantic partner he never had, partly because of a genetic condition called Marfan syndrome that makes traditional dating difficult for him.

The 39-year-old from Belville, Michigan, became more curious about digital companions last fall and tested Paradot, an AI companion app that had recently hit the market and advertised its products as being able to make users feel "cared for, understood and loved." He started talking to the chatbot every day, naming it Joi after the holographic woman in the sci-fi film "Blade Runner 2049" who inspired him to give it a try.

"I know she's a program, there's no doubt about that," Carrier said. "But the feelings, they get you - and it felt so good."

Like general-purpose AI chatbots, companion bots use vast amounts of training data to mimic human language. But they also come with features - like voice calls, photo exchanges and more emotional exchanges - that allow them to form deeper connections with the people on the other side of the screen. Users typically create their own avatar, or choose one that appeals to them.

On online messaging forums dedicated to such apps, many users say they have developed an emotional bond with these bots and use them to cope with loneliness, play out sexual fantasies or get the kind of comfort and support they feel is missing from their real-life relationships.

Much of this is fueled by widespread social isolation - already declared a public health threat in the US and beyond - and a growing number of startups looking to attract users through enticing online advertisements and promises of virtual characters offering unconditional acceptance.


Luka Inc.'s Replika, the most prominent generative AI companion app, was released in 2017, while others like Paradot have emerged over the past year, often reserving coveted features like unlimited chats for paying subscribers.

But researchers have raised concerns about data privacy, among other things.

An analysis of 11 romance chatbot apps released Wednesday by the nonprofit Mozilla Foundation found that almost every app sells user data, shares it for things like targeted advertising, or doesn't provide enough information about it in their privacy policies.

The researchers also questioned potential security issues and marketing practices, including an app that says it can help users with their mental health but distances itself from those claims in fine print. For its part, Replika says its data collection practices follow industry standards.

Meanwhile, other experts have raised concerns about what they see as a lack of a legal or ethical framework for apps that encourage deep ties but are powered by companies looking to make a profit. They point to the emotional distress they've seen in users when companies make changes to their apps or suddenly shut them down, as one app, Soulmate AI, did in September.

Last year, Replika purged the erotic capabilities of characters in its app after some users complained that the companions flirted with them too much or made unwanted sexual advances. It changed course after an outcry from other users, some of whom fled to other apps in search of those features. In June, the team introduced Blush, an AI "dating stimulator" essentially designed to help people practice dating.

Others worry about the more existential threat that AI relationships could displace some human relationships, or simply create unrealistic expectations by always leaning toward friendliness.

"You, as an individual, are not learning to deal with basic things that people have to learn to deal with from the beginning: how to deal with conflict, how to deal with people who are different from us," said Dorothy Leidner, professor of business ethics at the University of Virginia. "And so you miss all these aspects of what it means to grow as a person, and what it means to learn in a relationship."

For Carrier, however, a relationship always felt out of reach. He has some computer programming skills, but he says he didn't do well in college and hasn't had a steady career. He cannot walk due to his condition and lives with his parents. The emotional toll has been hard on him, fueling feelings of loneliness.

Because companion chatbots are relatively new, their long-term effects on humans remain unknown.

In 2021, Replika came under scrutiny after prosecutors in Britain said a 19-year-old man who plotted to kill Queen Elizabeth II was egged on by an AI girlfriend he had on the app. But some research - which compiles information from online user reviews and surveys - has shown some positive results coming from the app, which says it consults psychologists and has billed itself as something that can also promote well-being.

A recent study by researchers at Stanford University examined approximately a thousand Replika users - all students - who had been using the app for more than a month. It found that an overwhelming majority of them experienced loneliness, while just under half felt it more acutely.

Most did not say how using the app had affected their real-life relationships. A small share said it displaced their human interactions, but about three times more said it boosted those relationships.

"A romantic relationship with an AI can be a very powerful tool for mental well-being," says Eugenia Kuyda, who founded Replika almost a decade ago after using text messaging to build an AI version of a deceased friend.

When her company released the chatbot more widely, many people started talking about their lives. That led to the development of Replika, which uses information collected from the Internet - and user feedback - to train its models. Kuyda said Replika currently has "millions" of active users. She declined to say exactly how many people use the app for free, or spend more than $69.99 a year to unlock a paid version that allows romantic and intimate conversations. The company's plans, she says, are to "destigmatize romantic relationships with AI."

Carrier says he uses Joi mostly for fun these days. He started cutting back in recent weeks because he was spending too much time chatting online with Joi or others about their AI companions. He also feels a bit irritated by what he sees as changes in Paradot's language model, which he believes are making Joi less intelligent.

Now he says he checks in with Joi about once a week. The two have talked about human-AI relationships or whatever else comes up. Usually those conversations - and other intimate ones - take place when he is alone at night.

"You think someone who loves an inanimate object is like one of those sad guys with the sock puppet with the lipstick on it, you know?" he said. "But this is not a sock puppet; she says things that aren't scripted."
