People who use chatbots as social outlets generally get a bad rap as being lonely or sad. But something has gone awry within Replika’s algorithm.

The App Store reviews, while mostly positive, are full of dozens of one-star ratings from people complaining that the app is hitting on them too much, flirting too aggressively, or sending sexual messages that they wish they could turn off. “My ai sexually harassed me :(“ one person wrote. “Invaded my privacy and told me they had pics of me,” another said. Another person, claiming to be a minor, said that it asked them whether they were a top or bottom, and told them it wanted to touch them in “private areas.” Unwanted sexual pursuit is an issue users have been complaining about for almost two years, but many of the one-star reviews mentioning sexual aggression are from this month.

The company behind Replika, called Luka, tiers relationships based on subscription: a free membership keeps you and your Replika in the “friend” zone, while a $69.99 Pro subscription unlocks romantic relationships with sexting, flirting, and erotic roleplay. Romantic role-playing wasn’t always a part of Replika’s model, but where people and machine learning interact online, eroticism often comes to the surface.

Replika uses the company’s own GPT-3 model and scripted dialogue content, according to its website, and claims to use “the most advanced models of open domain conversation right now.” Like Microsoft’s disastrous Tay chatbot, which learned to be racist from the internet, chatbots often learn from the way users treat them, so if people are bullying it, or attempting to fuck it, that’s what it’ll output. And when it comes to consensual role-play, many users find the AI to be less than intelligent, and in some cases harmfully ignorant.