
Meaningless answers: ChatGPT in confusion for a few hours - Future Tech

2024-02-22T12:01:54.584Z

Highlights: ChatGPT, the popular artificial intelligence app, went "crazy" for a few hours on Tuesday. The chatbot started generating meaningless sentences containing non-existent words. Screenshots of incomprehensible responses appeared on social media. OpenAI, the company that develops the service, said the issue was caused by an "optimization" made to the platform. Only a subsequent update resolved the issue, according to the company's website. The episode recalls the problems with Microsoft Bing Chat, which proved unfriendly to users upon launch a year ago.


ChatGPT, the popular artificial intelligence app, capable of producing text at the user's request, went "crazy" for a few hours on Tuesday. (ANSA)

The chatbot started generating meaningless sentences containing non-existent words, leaving many users dumbfounded.

Several screenshots of incomprehensible responses appeared on social media; according to OpenAI, the company that develops the service, they were caused by an "optimization" made to the platform.

Only a subsequent update resolved the issue.

In some conversations, ChatGPT ended up mixing English and Spanish, which gave the otherwise serious issue a slightly playful turn.

"Thanks a lot for your understanding. I'm sure that from now on we will have a dialogue as clear as water" is one of the phrases spread online.

On a more critical note, some pointed out that when asked to check computer code, the chatbot ended up rambling and changing the subject.

In one case, it produced several identical lines containing the phrase "enjoy listening".

As the Ars Technica website writes, the episode recalls the problems with Microsoft Bing Chat (now called Copilot), which proved unfriendly to users upon launch a year ago.

On that occasion, Bing Chat's problems arose from long conversations that pushed the instructions governing the chatbot's behavior outside its context window, producing so-called hallucinations: inconsistent, incorrect and misleading responses.


Reproduction reserved © Copyright ANSA

Source: ANSA
