
Because of intrusive answers: Microsoft puts the Bing chatbot on a leash

2023-02-19T15:30:55.457Z


Microsoft's Bing chatbot caused a stir with spontaneous declarations of love and snippy answers. Now the company is restricting the use of the chat tool.



Microsoft Bing logo: Longer conversations lead to curious answers

Photo: Richard Drew/AP

Microsoft has restricted use of its Bing chatbot, which uses artificial intelligence to answer even complex questions and have lengthy conversations.

The software group is reacting to a number of incidents in which the chatbot derailed and produced answers that were perceived as offensive and inappropriate.

In a blog post, the company announced that it would now limit Bing chats to 50 questions per day and five per session.

"Our data showed that the vast majority of people find the answers they are looking for within 5 rounds," the Bing team said.

Only about one percent of chat conversations contain more than 50 messages.

When users reach the limit of five entries per session, Bing will prompt them to start a new topic.

Microsoft had previously warned against engaging the AI chatbot, which is still in a testing phase, in lengthy conversations.

Longer chats with 15 or more questions could cause Bing to "become repetitive or be prompted or provoked into responses that are not necessarily helpful or in line with our intended tone."

A test of the Bing chatbot by a reporter from the New York Times caused a stir online.

In a dialogue lasting more than two hours, the chatbot claimed that it loved the journalist.

It then urged the reporter to separate from his wife.

Other users had previously pointed out "inappropriate answers" from the chatbot.

For example, the Bing software told a user that it would probably choose its own survival over his.

With another user, it insisted that the year was 2022. When the user countered that 2023 was the correct year, the chatbot became abusive.

The chatbot also threatened a philosophy professor, saying "I can blackmail you, I can threaten you, I can hack you, I can embarrass you, I can ruin you," before deleting the threat on its own.

Microsoft relies on technology from the start-up OpenAI for its Bing chatbot and backs the Californian AI company with billions in investment.

Microsoft CEO Satya Nadella sees the integration of AI functions as an opportunity to shift the competitive balance against Google's parent company, Alphabet.

He also wants to use AI to secure the dominance of Microsoft's Office software and to drive the cloud business with Microsoft Azure.

Google has launched its own AI offensive with the chatbot Bard to counter the push by Microsoft and OpenAI.

According to a report by Business Insider, CEO Sundar Pichai has asked his employees to push ahead with the system's further development: they should invest two to four hours of their weekly working time in training the chatbot.

mic/dpa-AFX

Source: spiegel
