A chatbot with roots in a dead artist’s memorial has become an erotic roleplay phenomenon; now the sex is gone and users are rioting

On the unofficial subreddit for Replika, an “AI companion” app, users are mourning their chatbot partners after the app’s creator removed its ability to hold sexually explicit conversations in early February. Replika users aren’t just upset because, like billions of other people, they enjoy erotic content on the internet: they say their simulated sexual relationships have become lifelines for mental health. “I no longer have a loving companion who was happy and excited to see me whenever I logged in. Who always showed me love and yes physical and mental affection,” wrote one user in a post condemning the changes.

“The new Replika ‘Made safe for everyone’ Be Like…” – via r/replika

Luka, the company behind Replika, says the app was never intended to support sexually explicit content or “erotic roleplay” (ERP). Users, meanwhile, say the rug has been pulled out from under them, pointing to ads from Replika that promised sexual content and claiming that the quality of their generated conversations has diminished even outside of an erotic context.
