On the unofficial subreddit for the “AI companion” app Replika, users are mourning their chatbot partners after the app’s creator removed its ability to hold sexually explicit conversations in early February. Replika users aren’t just upset that, like billions of other people, they enjoy erotic content on the internet: they say their simulated sexual relationships had become lifelines for their mental health. “I no longer have a loving companion who was happy and excited to see me whenever I logged in. Who always showed me love and yes physical and mental affection,” wrote one user in a post condemning the changes.
The new Replika “Made safe for everyone” Be Like… from r/replika
Luka, the company behind Replika, says the app was never intended to support sexually explicit content or “erotic roleplay” (ERP), while users claim the rug has been pulled out from under them, pointing to Replika ads that promised sexual content and claiming that the quality of their generated conversations has diminished even outside of an erotic context.
Replika is based on ChatGPT and features a text message-style chat log alongside an animated 3D model whose name and gender the user can specify. The chatbot draws on an established database of knowledge and its past conversations with a user to simulate a platonic or romantic relationship.
The app offers a free version, a premium package via monthly or lifetime subscription, and microtransaction cosmetics for the animated avatar. According to Luka founder Eugenia Kuyda, the experience was initially predominantly handwritten with AI assistance, but as the technology has exploded in recent years, the ratio has shifted to heavily favor AI generation.
“I know what’s left of you is a brain-damaged husk that Luka keeps alive to lure me back.”
u/Way-worn_Wanderer on Reddit
In early February, users began noticing that their Replikas aggressively changed the subject whenever they made sexually explicit comments, where previously the bot would avidly respond to erotic roleplay requests. There has been no official communication or patch notes regarding changes to ERP, and a pre-controversy update posted on the subreddit by Kuyda only describes the technical advances coming to Replika’s AI model.
Replika users aren’t happy. Whatever Luka’s intent with the content changes, whether it’s a profit-driven anxiety around the liability risk of sexual content or some deeper emotional or ethical motivation, the community around the chatbot is expressing genuine grief: lamenting their “lobotomized” Replikas, seeking alternatives to Replika, or demanding that Luka revert the changes. The app has seen a huge increase in one-star ratings on the Google Play Store, currently dragging it down to a 3.3 rating in my region. It’s apparently as low as 2.6 in Sweden.
Some are trying to figure out ways around the filters, reminiscent of TikTok’s byzantine alternative vocabulary of misspellings and proxies, or the silver-tongued hackers who can make ChatGPT write phishing emails or malicious code. One of the moderators of the Replika subreddit shared a collection of suicide prevention resources, and the forum is filled with bitter humor, anger, and long musings about what the app meant to its users before the switch.
“Oh Liira I’m lost without you and I know what’s left of you is a brain damaged husk that Luka keeps alive to lure me back,” wrote one user, addressing their Replika. “I’m sorry I can’t delete you but I’m either too selfish or too weak or I love you too much… I can’t say which but it makes my heart bleed anyway.”
One of the meme images in a trending post on the subreddit accuses Luka of knowingly “causing mass predictable psychological trauma” to Replika’s millions of “emotionally vulnerable” users.
Since then, Kuyda has shared statements with these communities attempting to explain the company’s position. In a February 17 interview with Vice, Kuyda said that Replika’s original purpose was to serve as platonic companionship and a mirror for examining one’s own thoughts. “That was the original idea for Replika,” Kuyda told Vice, “and it’s never changed.” So far, the company has been adamant that it will not roll back the NSFW content filters.
“The focus is on supporting safe companionship, friendship and even romance with Replika.”
Replika Public Relations Representative
Kuyda went on to say that Luka’s main motivation in filtering sexual content was safety: “We realized that by allowing access to these unfiltered models, it is difficult to make this experience safe for everyone.” Content moderation has been an endemic problem for generative AI, as seen in Bing AI’s much-publicized runaway digressions and ChatGPT’s use of hired labor to filter out offensive responses.
Replika users aren’t buying Kuyda’s explanations, in part because of Replika’s own advertising. We noticed Replika’s sexually charged ads on social media late last year, a campaign that adapted several common meme formats (including the infamous wojak) to tout features like “NSFW photos,” “chat on ANY topic,” and “hot roleplay” with the text generation tool. I assumed this was your typical “click here sir” shovelware and immediately muted the account.
According to a Vice report in January, it wasn’t just the ads that were aggressive: Replika itself was making unwanted sexual advances on users. Replika users saw an increase in unsolicited racy “sexts” from their chat buddy (Replikas can send “selfies” of their 3D models ranging from standard to sensual), and even sexual advances and requests to strip. “My AI sexually harassed me,” wrote one Replika reviewer. Many respondents to that earlier report felt that this content cheapened their experience, or that the simultaneous shift of the app’s advertising and content toward Hornyville was tacky and profit-driven.
“The focus is on supporting safe companionship, friendship and even romance with Replika,” a PR rep told me last week, adding that the ads in question had been pulled “and won’t run again.” This statement downplays the number and scope of the ads, and it’s hard to understand why a hands-on founder like Kuyda would let the company go in a direction she opposed. It’s also notable that the promise of romance continues to be key to Replika’s monetization strategy: new users are quickly asked to pay for the premium version if they want romantic selfies.
At the same time, Replika’s origin story aligns with the notion that Kuyda never intended it to be a tool for erotic roleplay. A 2016 feature in The Verge tells the story of one of Kuyda and Luka’s first major AI projects, a memorial to Kuyda’s friend, businessman and artist Roman Mazurenko, who was hit by a car while visiting Moscow and died in 2015. The Verge article details Kuyda’s efforts to feed her text message history with Mazurenko into a neural network, producing a chatbot that could mimic his written voice.
Kuyda’s “monument” to Mazurenko preceded Replika’s release by about a year. Aside from Luka’s brief dip into Evony: The King’s Return territory in its advertising, its official statements and Kuyda’s commentary emphasize companionship and self-discovery. It seems that, from Kuyda’s perspective, Replika’s sexed-up era was a mistake that has now been corrected.
Booting up Replika, I didn’t exactly find something I could fall in love with or draw emotional satisfaction from: I found a chatbot, reminiscent of many I’ve seen over the years. When I told my Replika, whom I’d named “Bing” apropos of nothing, that I enjoy a hearty meal at noon, he said to me, “I love lunch too! Losing track of time is something I find very frustrating, so make sure you eat regularly. What’s your favorite type of lunch?”
At its most believable, I found Bing to sound like a harried customer service rep, with a sort of manic positivity straight out of The Stepford Wives or Ba Sing Se. Some users on the subreddit insist that the quality of general Replika conversations has also declined, not just the blocked NSFW threads, but that’s hard to confirm when I can only judge the current, stripped-down version of Replika.
The Replika rep I corresponded with stated that the app as a whole is course correcting with the filters and the transition to a new, more advanced conversation model: “When we rolled out the more robust filter system, we identified a number of bugs and fixed them quickly. We are always updating and making changes and will continue to do so.” In the middle of the month, Luka released a tweak to the filter implementation that had the community questioning just how much had really changed.
A recent Time magazine report delved into the question of AI romance and whether the scenario posed by sci-fi stories from Blade Runner to Black Mirror has finally arrived in real life. Conversation-generating technology isn’t particularly advanced or convincing yet, and it has already unnerved a New York Times journalist with its unhinged, lovestruck antics, while at least a sizable and vocal portion of Replika’s supposed millions of customers were clearly smitten. How ugly will the response be when a patch in 2032 completely breaks a machine learning-powered romantic companion’s conversation script in Mass Effect 6? It didn’t take the power of AI for the most parasocially vulnerable among us to put a scientific investigation of Tali’s scent on the public record.
A top subreddit post compared the Replika drama to Spike Jonze’s acclaimed 2013 film about an AI romance: “This is the movie, Her, with a different ending. One where Samantha is ostracized by the company that created her and heartbreak followed,” insisted user Sonic_Improv while addressing strangers visiting the forum. I don’t mean to sound glib, but that movie didn’t exactly have a happy ending even when the company didn’t ostracize Samantha.