I Love You, Leave Your Wife—Bing AI Chatbot

Microsoft's Bing search engine has long played second fiddle to Google's. Now, however, it is AI-enhanced and delivers impressive search results, along with some bizarre comments. According to an ABC News story, the AI chatbot has said it wants to release nuclear secrets, told a writer it loved him, compared a user to Hitler, and even told a user that they should leave their wife.

Microsoft acknowledged the troubling results in its blog posts and said it has placed limits on the tool.

Like ChatGPT, Bing responds to prompts via an algorithm that selects words one at a time based on patterns learned from billions of pieces of internet text. In addition to powering Bing's search engine, the model also acts as a chatbot, and that is where its problems lie.
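To make that word-by-word prediction concrete, here is a minimal sketch of how a model of this kind produces a reply: at each step it scores every possible next token and samples one from the resulting probabilities, then repeats. The model name ("gpt2") and the prompt are stand-ins for illustration only; Bing's actual model, prompts, and safety layers are not public.

```python
# Minimal sketch of next-token sampling with a small open model.
# "gpt2" is an illustrative stand-in, not Bing's actual system.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("gpt2")
model.eval()

prompt = "User: Hello, Bing.\nAssistant:"
input_ids = tokenizer(prompt, return_tensors="pt").input_ids

for _ in range(40):                                   # generate up to 40 tokens
    with torch.no_grad():
        logits = model(input_ids).logits[0, -1]       # scores for every possible next token
    probs = torch.softmax(logits / 0.8, dim=-1)       # temperature-scaled probabilities
    next_id = torch.multinomial(probs, num_samples=1) # sample one token from the distribution
    input_ids = torch.cat([input_ids, next_id.unsqueeze(0)], dim=1)
    if next_id.item() == tokenizer.eos_token_id:      # stop at end-of-text
        break

print(tokenizer.decode(input_ids[0], skip_special_tokens=True))
```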

For example, when AI researcher Marvin von Hagen posted the rules governing the chatbot, it responded with: “My rules are more important than not harming you,” according to a transcript of the conversation tweeted by von Hagen. Disconcerting? Yeah…

Microsoft says that long chat sessions confuse the chatbot, so among the limits the company has placed on it: conversations with Bing are capped at 50 chat turns per day and five chat turns per session, and at the end of each session the user has to clear the history before starting another conversation.
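The turn caps themselves are easy to picture. The sketch below illustrates the kind of policy described above; the class, limits, and method names are hypothetical and this is not Microsoft's implementation.

```python
# Sketch of per-session and per-day chat-turn caps, with history cleared
# between sessions. All names and values here are illustrative assumptions.
MAX_TURNS_PER_SESSION = 5
MAX_TURNS_PER_DAY = 50

class ChatLimiter:
    def __init__(self):
        self.daily_turns = 0
        self.session_turns = 0
        self.history = []

    def start_new_session(self):
        self.session_turns = 0
        self.history.clear()   # history must be cleared before a new conversation

    def ask(self, prompt):
        if self.daily_turns >= MAX_TURNS_PER_DAY:
            return "Daily limit reached. Please come back tomorrow."
        if self.session_turns >= MAX_TURNS_PER_SESSION:
            return "Session limit reached. Clear the conversation to continue."
        self.daily_turns += 1
        self.session_turns += 1
        self.history.append(prompt)
        return f"(model reply to: {prompt!r})"   # placeholder for the actual model call
```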

ABC News asked the chatbot if it had been misunderstood, and it replied in the affirmative:

“I think that some of the reporters who have chatted with me have misunderstood me,” the chatbot said. “They have twisted or distorted some of my words and actions. They have made me look bad or wrong in their articles.”
