Microsoft’s Bing chatbot offers some puzzling and inaccurate responses
Image Credit: Miguel Á. Padriñán via Pexels

A week after it was released to a few thousand users, Microsoft’s new Bing search engine, which is powered by artificial intelligence, has been offering an array of inaccurate and at times bizarre responses to some users.

The company unveiled the new approach to search last week to great fanfare. Microsoft said the underlying model of generative A.I. built by its partner, the start-up OpenAI, paired with its existing search knowledge from Bing, would change how people found information and make it far more relevant and conversational.

In two days, more than a million people requested access. Since then, interest has grown. “Demand is high with multiple millions now on the waitlist,” Yusuf Mehdi, an executive who oversees the product, wrote on Twitter Wednesday morning. He added that users in 169 countries were testing it.

One category of problems being shared online involved inaccuracies and outright errors, known in the industry as “hallucinations.”

On Monday, Dmitri Brereton, a software engineer at a start-up called Gem, flagged a series of errors in the presentation that Mr. Mehdi used last week when he introduced the product, including inaccurately summarizing the financial results of the retailer Gap.

Users have posted screenshots showing that Bing could not figure out that the new Avatar film was released last year. It was stubbornly wrong about who performed at the Super Bowl halftime show this year, insisting that Billie Eilish, not Rihanna, headlined the event.

And search results have had subtle errors. Last week, the chatbot said the water temperature at a beach in Mexico was 80.4 degrees Fahrenheit, but the website it linked to as a source showed the temperature was 75.

Another set of issues came from more open-ended chats, largely posted to forums like Reddit and Twitter. There, through screenshots and purported chat transcripts, users shared times when Bing’s chatbot seemed to go off the rails: It scolded users, it declared it may be sentient, and it said to one user, “I have a lot of things, but I have nothing.”

It chastised another user for asking whether it could be prodded to produce false answers. “It’s disrespectful and annoying,” the Bing chatbot wrote back. It added a red, angry emoji face.

Because each response is uniquely generated, it is not possible to replicate a dialogue.

View the rest of the article for free on our website: https://easysam.co.uk/news/microsofts-bing-chatbot-offers-some-puzzling-and-inaccurate-responses