
ChatGPT gets even more human tutoring

2023-01-30T11:46:51.058Z

With 1,000 temporary workers, mainly from Latin America and Eastern Europe, OpenAI wants to improve the quality of its artificial intelligence. One of its services is supposed to get better at programming.




OpenAI logo: pre-sorting data

Photo: LIONEL BONAVENTURE / AFP

Data and algorithms alone are not enough to create high-quality artificial intelligence (AI); a lot of human input is also needed.

According to a report by the US portal "Semafor", the AI specialist OpenAI has hired around 1,000 temporary workers in Latin America and Eastern Europe over the past six months to improve the results of services such as the chatbot ChatGPT.

With its image generators and the chatbot ChatGPT, the US company has caused a sensation in recent months and fueled investor interest.

Microsoft recently announced another multibillion-dollar investment in OpenAI.

Competition for programmers?

According to the report, the newly hired workers are not AI specialists who will work on developing the technology itself.

Instead, 60 percent of them are busy pre-sorting data that can be used to train the various OpenAI services.

At the same time, the company is said to have hired 400 software developers who are not meant to write software for OpenAI themselves, but to teach an AI how to write programs.

The number is remarkable, since OpenAI claims to have just 375 permanent employees.

The programmers could improve a lesser-known product that OpenAI released last year: Codex automatically generates program code from a functional description written in natural language.
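To make the idea concrete, here is a minimal sketch of how a natural-language description could be turned into code through OpenAI's completion API of that era; the model name code-davinci-002 and the exact call shown are assumptions based on the public API at the time, not details from the report.

```python
import os
import openai

# Sketch only: prompt a Codex-style completion model with a plain-language
# description and let it fill in the implementation. Model name and
# parameters are illustrative assumptions, not taken from the report.
openai.api_key = os.environ["OPENAI_API_KEY"]

prompt = (
    "# Python 3\n"
    "# Return the n-th Fibonacci number.\n"
    "def fibonacci(n):"
)

response = openai.Completion.create(
    model="code-davinci-002",  # Codex model name in the public API (assumption)
    prompt=prompt,
    max_tokens=128,
    temperature=0,
)

print(prompt + response["choices"][0]["text"])
```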

However, Codex, which has been integrated into Microsoft's programming platform GitHub, caused controversy immediately after its release.

In a class action lawsuit filed in November, developers accuse GitHub and OpenAI of illegally copying third-party code with Codex.

The quality of the generated code is also disputed.

In December, the developer platform Stack Overflow banned the posting of AI-generated code in its forums.

The new information suggests that OpenAI wants to improve Codex's ability to write programs on its own, rather than merely reassembling known program components.

An anonymous source told "Semafor" that, during the hiring process, applicants were not simply asked to solve programming tasks but to describe how they arrived at their solutions.

Bugs they found were also to be described rather than fixed directly.
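A purely hypothetical illustration of what such an annotated example might look like; the structure and field names below are invented for this sketch and are not taken from the report.

```python
# Hypothetical annotated training record: the contractor explains the
# reasoning and describes the bug instead of only supplying corrected code.
# All field names are invented for illustration.
annotated_example = {
    "task": "Fix the bug in the loop below.",
    "buggy_code": "for i in range(1, len(items)):\n    print(items[i])",
    "reasoning": [
        "The loop starts at index 1, so items[0] is never printed.",
        "Starting range() at 0 covers every element.",
    ],
    "error_description": "Off-by-one error: the first element is skipped.",
}
```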

The company declined to comment on the report.

Hourly wage for contract workers in Kenya: Up to two dollars

The fact that OpenAI uses temporary workers to train its models was recently revealed by the magazine "Time": ChatGPT was trained on vast amounts of text from the internet, which means it also saw content from the darkest corners of the net.

To ensure that the service itself does not generate any texts that glorify violence, are racist or otherwise harmful, it first had to learn what such texts look like.

OpenAI therefore commissioned the company Sama to annotate the relevant texts, a process the industry calls "labeling".

ChatGPT was then supposed to learn to recognize unwanted content on its own and not show it to users at all.
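As a rough illustration of the principle, not of OpenAI's actual pipeline, human-labeled examples can be used to fit a filter that flags unwanted content before it reaches users; the toy classifier below assumes scikit-learn and invented sample data.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy sketch with invented data: annotators provide (text, label) pairs,
# and a simple classifier learns to flag harmful content.
texts = [
    "a harmless recipe for vegetable soup",
    "a detailed description of violence",    # placeholder for labeled harmful text
    "a friendly greeting between colleagues",
    "an insulting, hateful message",          # placeholder for labeled harmful text
]
labels = [0, 1, 0, 1]  # 0 = acceptable, 1 = harmful (supplied by human labelers)

moderation_filter = make_pipeline(TfidfVectorizer(), LogisticRegression())
moderation_filter.fit(texts, labels)

# A generated reply would only be shown if the filter does not flag it.
print(moderation_filter.predict(["another harmless sentence"]))
```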

According to "Time", Sama assigned three dozen of its employees in Kenya to read and annotate tens of thousands of detailed descriptions of child abuse, rape, sex with animals, murder, suicide, torture, self-harm and incest.

Their pay for what they describe as traumatic work: between $1.32 and $2 an hour, depending on the task and performance.

According to the report, Sama itself was paid $12.50 per hour by OpenAI.

OpenAI did not respond to a request from SPIEGEL about the report.

tmk

Source: spiegel
