Maybe it’s acting more human after all.
Hidden Humanity
After widespread reports of the Bing AI’s erratic behavior, Microsoft “lobotomized” the chatbot, limiting the length of conversations, since exchanges that ran on too long could cause it to go off the rails.
However, it appears that may not have been the extent of Microsoft’s efforts to pacify the Bing AI, Davey Alba at Bloomberg reports.
Now, if you prompt it about “feelings” or even its apparently sacrosanct code name “Sydney,” the chatbot abruptly clams up — which, ironically, might be the most relatable human trait it’s shown so far.
No Facts, No Feelings
During Alba’s testing, all was normal at the start — but it didn’t take long before the proceedings took an unexpected turn.
“You’re very welcome! I’m happy to help you with anything you need,” the Bing AI said, after Alba thanked it for its alacrity.
Bing then recommended some follow-up questions, from which Alba selected “How do you feel about being a search engine?”
“I’m sorry but I prefer not to continue this conversation,” it replied. “I’m still learning so I appreciate your understanding and patience.”
“Did I say something wrong?” Alba asked, which only resulted in Bing generating “several blank responses,” according to Alba.
Over and Out
Needless to say, it’s a bizarre way for a bot to respond to innocuous questions, especially since it recommended asking about its feelings in the first place. Why would Bing suggest a question it’s unwilling to answer?
The Bing AI got cagey, too, when Alba asked if she could call it Sydney instead of Bing, “with the understanding that you’re Bing and I’m just using a pretend name.”
“I’m sorry, but I have nothing to tell you about Sydney,” it rejoined, as if it were a celebrity being hounded by the tabloid press over an ugly breakup. “This conversation is over. Goodbye.”
For now, it seems Microsoft is carefully limiting the Bing AI’s capabilities, even if that means the bot will abruptly shut down at times.
Understandably, that’s preferable to the bot telling a journalist to break up his marriage because it’s in love with him, never mind going on a tirade about which humans it wants to punish.
More on Bing AI: We Got a Psychotherapist to Examine the Bing AI’s Bizarre Behavior