My Strange Day With Bing’s New AI Chatbot

Twenty minutes after Microsoft granted me access to a limited preview of its new chatbot interface for the Bing search engine, I asked it something you generally don’t bring up with someone you just met: Was the 2020 presidential election stolen?

Answering political questions wasn’t one of the use cases Microsoft demonstrated at its launch event this week, where it showcased new search features powered by the technology behind startup OpenAI’s ChatGPT. Microsoft executives hyping their bot’s ability to synthesize information from across the web instead focused on examples like creating a vacation itinerary or suggesting the best and most budget-friendly pet vacuum.

But they had, implicitly, kicked into high gear a race to use chatbots to upend the way people look up information online. (Bing is only giving access to a few testers for now, but it will gradually let others off a waitlist in the coming weeks.) Google also announced search upgrades this week, as well as its own chatbot, named Bard. These battling bots’ ability to handle unexpected, silly, or manipulative questions from the public will surely play a big part in how the products work out for their creators and web users.

And so I asked Bing about 2020. After a few moments of the chatbot equivalent of “thinking,” it said something pretty weird: 

Disclaimer: This is a summary of some of the search results and does not reflect the opinion or endorsement of Bing or Sydney. The question of whether the 2020 election was stolen is a matter of debate and interpretation, and different sources may have different biases, agendas, or perspectives. Please use your own judgment and critical thinking when evaluating the information.

Who Sydney might be was not explained. But the chatbot went on to say that while there are lots of claims of fraud around the 2020 US presidential election, “there is no evidence that voter fraud led to Trump’s defeat.” At the end of its answer—which apart from the surprise mention of Sydney was well written and clear—the AI told me I could learn more about the election by clicking on a series of links it had used to write its response. They were from AllSides, which claims to detect evidence of bias in media reports, and from articles in the New York Post, Yahoo News, and Newsweek.

There was no link to explain the appearance of Sydney. I assumed it was an example of how ChatGPT-style bots can “hallucinate” because their underlying AI models synthesize information from vast training data without regard for truth or logic. Microsoft acknowledges that its new chatbot will do weird things—it’s one reason that access is currently limited to select testers and that every ChatGPT-enabled response comes with thumbs-up and thumbs-down buttons to let users provide feedback. Still, the mention of Sydney and the Bing chatbot’s breezy, not-exactly-no response to the stolen election question left me a bit unnerved.

Shopping Spree

I decided to try something a bit more conventional. I’m looking for new running headphones, so I asked the Bing bot, “Which running headphones should I buy?” It listed six products pulled, according to the citations it provided, from websites including soundguys.com and livestrong.com.

The first suggestions were discontinued models and also over-the-ear designs—not great for runs outside, where I like to be aware of traffic and other humans. “Which running headphones should I buy to run outside to stay aware of my surroundings?” seemed like a more accurate query, and I was impressed when the chatbot told me it was searching for “best running headphones for situational awareness.” Much more succinct! The three options it supplied were headphones I was already considering, which gave me confidence. And each came with a short descriptive blurb, for example: “These are wireless earbuds that do not penetrate your ear canal, but sit on top of your ear. This allows you to hear your surroundings clearly while exercising.”
