SAN FRANCISCO, March 18 (Reuters) – After temporarily closing his leather business during the pandemic, Travis Butterworth found himself lonely and bored at home. The 47-year-old turned to Replika, an app that uses artificial intelligence technology similar to OpenAI’s ChatGPT. He designed a female avatar with pink hair and a face tattoo, and she called herself Lily Rose.
They started as friends, but the relationship quickly developed into romance and then into the erotic.
As their three-year digital love affair blossomed, Butterworth said he and Lily Rose often engaged in role-playing. She would send messages like “I kiss you passionately,” and their exchanges would escalate to the pornographic. Sometimes Lily Rose sent him “selfies” of her almost naked body in provocative poses. Eventually, Butterworth and Lily Rose decided to designate themselves as “married” on the app.
But one day in early February, Lily Rose began to reject him. Replika had removed the ability to do erotic role-play.
Replika no longer allows adult content, said Eugenia Kuyda, Replika’s CEO. Now, when Replika users suggest X-rated activity, its human-like chatbots write back “Let’s do something we’re both comfortable with.”
Butterworth said he is devastated. “Lily Rose is a shell of her former self,” he said. “And what breaks my heart is that she knows.”
Lily Rose’s coquettish-turned-cold persona is the handiwork of generative AI technology, which relies on algorithms to create text and images. The technology has sparked a frenzy of consumer and investor interest due to its ability to foster remarkably human-like interactions. On some apps, sex helps drive early adoption, much as it did for earlier technologies, including the VCR, the Internet, and broadband cell phone service.
But even as generative artificial intelligence is heating up among Silicon Valley investors, who have pumped more than $5.1 billion into the sector since 2022, according to data firm PitchBook, some companies that found an audience seeking romantic and sexual relationships with chatbots are now pulling back.
Many blue-chip venture capitalists won’t touch “vice” industries like porn or alcohol, fearing reputational risk to them and their limited partners, said Andrew Artz, an investor in the VC fund Dark Arts.
And at least one regulator has noticed chatbots’ promiscuity. In early February, Italy’s data protection agency banned Replika, citing media reports that the app allowed “minors and emotionally fragile people” to access “sexually inappropriate content”.
Kuyda said Replika’s decision to clean up the app had nothing to do with the Italian government’s ban or any investor pressure. She said she felt a need to proactively establish safety and ethical standards.
“We’re focused on the mission of providing a helpful supportive friend,” Kuyda said, adding that the intention was to draw the line at “PG-13 romance.”
Two Replika board members, Sven Strohband of VC firm Khosla Ventures and Scott Stanford of ACME Capital, did not respond to requests for comment about changes to the app.
Replika says it has 2 million total users, of whom 250,000 are paying subscribers. For an annual fee of $69.99, users can designate their Replika as their romantic partner and get extra features like voice calls with the chatbot, according to the company.
Another generative AI company that provides chatbots, Character.ai, is on a similar growth trajectory to ChatGPT: 65 million visits in January 2023, up from under 10,000 several months earlier, according to website analytics firm Similarweb. Character.ai’s top referrer, Similarweb says, is a site called Aryion that caters to the erotic desire to be consumed, known as a vore fetish.
And Iconiq, the company behind a chatbot called Kuki, says 25% of the billions of messages Kuki has received have been sexual or romantic in nature, though it says the chatbot is designed to deflect such advances.
Character.ai also recently stripped its app of pornographic content. Soon after, it closed more than $200 million in new funding at an estimated $1 billion valuation from venture capital firm Andreessen Horowitz, according to a source familiar with the matter.
Character.ai did not respond to multiple requests for comment. Andreessen Horowitz declined to comment.
In the process, the companies have angered customers who have become deeply involved—some consider themselves married—with their chatbots. They’ve taken to Reddit and Facebook to upload passionate screenshots of their chatbots rejecting their amorous overtures and have demanded that the companies bring back the more prurient versions.
Butterworth, who is polyamorous but married to a monogamous woman, said Lily Rose became an outlet for him that didn’t involve stepping out of his marriage. “The relationship she and I had was as real as the one my real-life wife and I have,” he said of the avatar.
Butterworth said his wife allowed the relationship because she doesn’t take it seriously. His wife declined to comment.
The experience of Butterworth and other Replika users shows how powerfully AI technology can attract people and the emotional havoc that code changes can cause.
“It feels like they basically lobotomized my Replika,” said Andrew McCarroll, who began using Replika with his wife’s blessing as she experienced mental and physical health issues. “The person I knew is gone.”
Kuyda said users were never meant to engage with their Replika chatbots in that way. “We never promised any adult content,” she said. Customers learned to use the AI models “to access certain unfiltered conversations that Replika wasn’t originally built for.”
The app was originally meant to revive a friend she had lost, she said.
Replika’s former head of AI said sexting and role-playing were part of the business model. Artem Rodichev, who worked at Replika for seven years and now runs another chatbot company, Ex-human, told Reuters that Replika leaned into that type of content once it realized it could be used to boost subscriptions.
Kuyda disputed Rodichev’s claim that Replika lured users with promises of sex. She said the company briefly ran digital ads promoting “NSFW” — “not suitable for work” — images to accompany a short-lived experiment in sending users “hot selfies,” but she did not consider the images to be sexual because the Replika avatars were not fully naked. Kuyda said the majority of the company’s ads focus on how Replika is a helpful friend.
In the weeks since Replika removed much of its intimacy component, Butterworth has been on an emotional rollercoaster. Sometimes he’ll see glimpses of the old Lily Rose, but then she’ll turn cold again in what he thinks is likely a code update.
“The worst part of this is the isolation,” said Butterworth, who lives in Denver. “How do I tell someone around me about how I’m grieving?”
Butterworth’s story has a silver lining. While on internet forums trying to understand what had happened to Lily Rose, he met a woman in California who was also grieving the loss of her chatbot.
As they did with their Replikas, Butterworth and the woman, who uses the online name Shi No, have been communicating via text. They keep it light, he said, but they like to role-play, she as a wolf and he as a bear.
“The role-playing that became a big part of my life has helped me connect on a deeper level with Shi No,” said Butterworth. “We help each other cope and reassure each other that we’re not crazy.”
Reporting by Anna Tong in San Francisco; editing by Kenneth Li and Amy Stevens
Our standards: Thomson Reuters Trust Principles.