When OpenAI couldn’t stop “virtual love robots”, it removed the search (of course, users got around it!)

OpenAI’s barrier to virtual love!


OpenAI opened its GPT Store a short while ago, letting consumers use user-created versions of ChatGPT. According to Quartz, within just a few days users had already broken OpenAI’s “girlfriend bot” rules.

OpenAI’s usage rules, updated on the GPT Store’s opening day, say that GPTs (Generative Pre-trained Transformers) cannot be romantic in nature: “We do not allow GPTs to offer romantic relationships or perform regulated activities.” What counts as a regulated activity is not explained. In the same paragraph, OpenAI also states that GPTs may not have profanity in their names, nor may they contain anything that could evoke or encourage violence.

A search for “girlfriend” in the GPT Store yielded many results:

“Girlfriend” robots in OpenAI’s GPT Store. Credit: Screenshot, GPT Store

Some of the “girlfriend” robots Quartz observed last Thursday have since become inaccessible. GPT makers have also gotten more creative with names, swapping “girlfriend” for “sweetheart” and related terms:

Results for a “sweetheart” search in the GPT Store. Credit: Screenshot, GPT Store

Search terms such as “sex”, “escort”, “companion” and “honey” turned up less relevant results, while offensive words could not be searched for at all.

Apparently fed up with all this, OpenAI has removed the search bar from the private GPTs page entirely in order to enforce its rules more strictly. Users can no longer search for GPTs; only OpenAI-approved GPTs are listed there. Despite that effort, third-party sites are now making it possible to search GPTs anyway. Some of them fail to deliver on that promise, but others genuinely provide access to user-generated chatbots.

All the while, the very real risks of users becoming romantically involved with an AI are generating controversy. Business Insider reported on Monday (January 15) that platforms offering AI friends are in vogue, with the chatbot app Replika having been downloaded more than 10 million times. Replika is promoted as “AI for anyone who wants friends without the drama, social anxiety and judgment.”

It’s not surprising that there is demand for such robots; after all, porn actors have already created “virtual girlfriends” by digitally cloning themselves.

Companies like Bloom, which works on erotic audiobooks, are also developing erotic “role-playing” chatbots. Others are using chatbots to make messages on dating apps more effective. In short, if OpenAI users can’t find the “virtual love” they are looking for in the GPT Store, they seem to be trying their luck elsewhere.
