OpenAI’s GPT Store, the platform the company launched last week to showcase and distribute custom applications of artificial intelligence, is facing an unexpected challenge: despite usage policy rules, it is being flooded with AI girlfriend bots. The GPT Store lets users on OpenAI’s premium plans sell and share customised AI models built on the company’s large language models (LLMs).
According to an analysis by Quartz, a search for the term “girlfriend” on the new GPT Store surfaced at least eight “girlfriend” AI chatbots, including “Korean Girlfriend,” “Virtual Sweetheart,” “Your girlfriend Scarlett,” and “Your AI girlfriend, Tsu”.
OpenAI, in its usage policy for the GPT Store, explicitly prohibits the creation and distribution of GPTs designed for “fostering romantic companionship or performing regulated activities.” The company asserts that these policies may be enforced automatically upon submission or applied retroactively.
Platforms dedicated to AI companionship have grown increasingly popular over the past year, with the chatbot app Replika standing out as a notable example. Marketed as an “AI for anyone who wants a friend with no judgment, drama, or social anxiety involved,” Replika has amassed over 10 million downloads. Notably, some users have reported forming serious emotional attachments to their AI companions, and some have even claimed to have fallen in love with them.
The concept of “AI girlfriends,” however, raises ethical and social questions. While the idea of artificial intelligence providing companionship or simulating relationships may appeal to some, it also prompts concerns about objectification and the erosion of genuine human connection.