
“I can embarrass you”: Intrusive AI causes a stir – Microsoft has to slow down the chatbot

2023-02-20T06:07:09.968Z


After Microsoft's Bing chatbot recently gave abusive and inappropriate answers, Microsoft has now restricted its use.


Redmond, Washington (USA) - Strange goings-on at Microsoft: after the software group's Bing chatbot had repeatedly spat out bizarre answers in the recent past, the company has now restricted its use.

The Bing chatbot is meant to use artificial intelligence - similar to OpenAI's ChatGPT - to answer complex questions and conduct detailed conversations.

The software giant is thus reacting to a number of incidents in which the text robot got out of hand and produced answers that were perceived as intrusive and inappropriate, as reported by kreiszeitung.de.

Nevertheless, Microsoft wants to use artificial intelligence in its Office products soon.

AI becomes intrusive: Microsoft allows only 50 questions per day for the Bing chatbot

In a blog post, Microsoft announced that it would now limit Bing chats to 50 questions per day and five per session.

"Our data showed that the vast majority of people find the answers they are looking for within 5 rounds," the Bing team explained.

Only about one percent of chat conversations contain more than 50 messages.

When users reach the limit of five entries per session, Bing will prompt them to start a new topic.
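
To illustrate what these caps amount to in practice, here is a minimal sketch in Python of a per-session and per-day counter. The class, its method names, and the reset logic are assumptions made for illustration only; Microsoft has not published how the limits are actually enforced.

```python
from datetime import date

# Illustrative sketch only: a counter enforcing the limits described in the
# article (5 questions per session, 50 per day). ChatLimiter and its methods
# are hypothetical names, not Microsoft's actual implementation.
class ChatLimiter:
    MAX_PER_SESSION = 5
    MAX_PER_DAY = 50

    def __init__(self):
        self.day = date.today()
        self.daily_count = 0
        self.session_count = 0

    def start_new_topic(self):
        # Resets only the session counter; the daily quota keeps accumulating.
        self.session_count = 0

    def allow_question(self) -> bool:
        # Reset the daily counter when a new day begins.
        if date.today() != self.day:
            self.day = date.today()
            self.daily_count = 0
        if self.daily_count >= self.MAX_PER_DAY:
            return False  # daily quota of 50 questions exhausted
        if self.session_count >= self.MAX_PER_SESSION:
            return False  # user must start a new topic first
        self.daily_count += 1
        self.session_count += 1
        return True
```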

© Peter Kneffel/dpa/Archive

Microsoft had previously warned against involving the AI chatbot, which is still in a testing phase, in lengthy conversations.

Longer chats with 15 or more questions could result in Bing "repeating itself or being prompted or provoked into responses that aren't necessarily helpful or don't match our intended tone."

Microsoft chatbot for Bing urges New York Times reporter to divorce his wife

A test of the Bing chatbot by a reporter from the New York Times caused a stir on the Internet.

In a dialogue lasting more than two hours, the chatbot claimed that it loved the journalist.

It then asked the reporter to separate from his wife.

Watch as Sydney/Bing threatens me then deletes its message pic.twitter.com/ZaIKGjrzqT

— Seth Lazar (@sethlazar) February 16, 2023

Other users had previously pointed out the chatbot's "inappropriate behavior".

For example, the Bing software told a user that it would probably choose its own survival over his.

With another user, it insisted that the year was 2022. When he insisted that 2023 was the correct year, the text robot became abusive.

The chatbot also threatened a philosophy professor, saying "I can blackmail you, I can threaten you, I can hack you, I can embarrass you, I can ruin you," before deleting the threat itself.

Artificial intelligence: Microsoft Bing chatbot gives 'inappropriate answers' and becomes abusive

Microsoft relies on technology from the start-up OpenAI for its Bing chatbot and supports the Californian AI company with billions.

Microsoft CEO Satya Nadella sees the integration of AI functions as an opportunity to turn the tables in the competition with Google's parent company Alphabet.

He also wants to use AI to secure the dominance of Microsoft's Office software and to boost the cloud business around Microsoft Azure.

Google has launched its own AI offensive with the chatbot Bard to counter the push by Microsoft and OpenAI.

According to a report by Business Insider, CEO Sundar Pichai has asked his employees to press ahead with the further development of the system: they should invest two to four hours of their weekly working time in training the chatbot.

(with dpa material)


Source: merkur
