By Matt O'Brien and Barbara Ortutay -
The Associated Press
Facebook announced on Tuesday that it will remove its facial recognition program and erase the data of more than one billion users.
"This will represent one of the biggest changes in the use of facial recognition in the history of technology," said Jerome Pesenti, vice president of artificial intelligence at Facebook's new parent company Meta, in a blog post on the company.
"
Its removal will mean the erasure of the individual facial recognition templates of more than a billion people,
" he added.
Pesenti said the company was trying to assess positive use cases for the technology "in the face of growing concerns from society, especially as regulators have yet to provide clear rules."
[Photo: The Facebook logo next to the company's new name, Meta. Dado Ruvic / Reuters]
Facebook's turnaround comes after a hectic few weeks.
On Thursday, it announced a new name for the company (though not for the social network) that, executives said, will help it focus on building technology for what it envisions as the next iteration of the internet: the metaverse.
The company is also facing perhaps its biggest public relations crisis to date, after documents leaked by whistleblower Frances Haugen, a former Facebook employee, showed that the company was aware of the harm its products were causing and often did little or nothing to mitigate it.
More than a third of Facebook's daily active users, about 640 million people, have opted in to having their faces recognized by the social network's system.
But Facebook had already begun scaling back its use of facial recognition in recent years, after introducing the technology more than a decade ago.
In 2019, the company ended its practice of using facial recognition software to identify users' friends in uploaded photos and automatically suggest tagging them. Facebook was sued in Illinois over the tag-suggestion feature.
This decision is "a good example of trying to make product decisions that are good for the user and the business," said Kristen Martin, professor of technology ethics at the University of Notre Dame.
She added that the move demonstrates the power of regulatory pressure, as the facial recognition system has been the target of harsh criticism for more than a decade.