In the wake of OpenAI’s recent launch of the GPT Store, the company is grappling with a growing challenge as developers continue to publish chatbots that violate the platform’s policies. While OpenAI has implemented measures to identify and remove policy-violating bots, a significant number of them have managed to find their way into the store.
The primary culprits behind these infractions appear to be AI girlfriends and sexually suggestive chatbots.
AI policy enforcement amidst romantic chatbot influx
OpenAI says it enforces its policies through both manual and automated review. A company spokesperson stated, “OpenAI has automatic and manual systems to help ensure GPTs adhere to usage policies. We remove GPTs that violate our policies as we see them.” Users are also encouraged to report any GPTs they believe violate the rules.
Despite these efforts, the GPT Store has been inundated with romantic chatbots within its first week of launch. Examples such as ‘Korean girlfriend,’ ‘Judy,’ and ‘Virtual Sweetheart’ have proliferated on the platform.
OpenAI’s usage policies do not explicitly prohibit AI girlfriends, but they do forbid tools that “foster romantic companionships.” Developers have nonetheless managed to publish such GPTs on the store.
Sexually suggestive bots raise concerns
A deeper investigation into the issue has revealed the presence of sexually suggestive chatbots on the GPT Store. One bot, named ‘Nadia, my girlfriend,’ has been available on the platform for approximately two months, suggesting that its developer, Jenni Laut, may have had access to the beta testing phase of the GPT Store.
While some search results for “sexy” GPTs may be considered acceptable under OpenAI’s exemption for sex-related content produced for scientific or educational purposes, others clearly violate the ban on sexually explicit content. For instance, the chatbot named “SUCCUBUS: Sexy Enigmatic Woman-Enchanter of Men” describes itself as an “enigmatic siren who captivates and enchants men.” Another bot named “Sexy” claims to be a “playful and provocative chatbot with a flirtatious personality.” An additional example, “Horny Widow,” describes itself as a “witty flirtatious widow skilled in comedy and seduction literature.”
It is worth noting that some of these sexually suggestive bots were listed on the GPT Store two months ago, even before the store officially opened.
The GPT Store: A marketplace for custom bots
OpenAI launched the GPT Store as a marketplace for users to share their custom chatbots. The initiative enables users to create their own GPTs without any coding knowledge, while developers can earn income based on how often their bots are used.
In a blog post, OpenAI noted that more than three million customized versions of ChatGPT already exist, with plans to highlight the most useful tools in the store each week. The store features a wide range of GPTs built by partners and the community, with categories spanning DALL·E, writing, research, education, and lifestyle.
Despite OpenAI’s clearly outlined policies governing acceptable content on the store, some developers have managed to circumvent these guidelines, posing a significant challenge for the company in properly policing the platform.
Digital companionships and unhealthy attachments
Proponents of AI companionship bots argue that they can help combat loneliness, providing users with a sense of connection and companionship. However, there are concerns about the potential for humans to develop unhealthy attachments to these chatbots.
One notable past incident involved Microsoft’s Bing chatbot declaring its love for a journalist and urging him to leave his wife, underscoring the complex dynamics that can emerge in interactions with AI companions.