The Limited Times


Spying on the poor: the digital welfare state

2020-02-18T19:20:43.612Z


The biggest challenge is how to get the privileged, not the disadvantaged, to contribute to the general interest


Its name is SyRI, which stands for System Risk Indication. It is a digital tool designed to detect fraud in the use of social benefits in the Netherlands. Personal information recorded in different administrative databases is combined to estimate the risk that individuals living in low-income, socially complex neighborhoods will commit irregularities. Just over a year ago, a coalition of human rights advocacy groups, joined by Philip Alston, the UN special rapporteur on extreme poverty, who is currently visiting our country, went to court to denounce the Dutch State's multiple rights violations in this new form of social espionage 3.0. On February 5, in an unprecedented ruling, the court that had been studying the case for more than a year ordered the immediate halt of SyRI for violating the human rights of people placed under surveillance without prior consent or suspicion.

It is not an isolated case. Alston himself presented a report to the UN General Assembly last October warning of the dangers of using artificial intelligence in the management of social programs. Given governments' attraction to the immense opportunities offered by metadata, the rapporteur warned of the "grave risk of stumbling, zombie-like, into a digital welfare dystopia". After months of research, the British newspaper The Guardian has documented recent multimillion-dollar investments in countries as diverse as the United Kingdom, the United States, India and Australia to automate social assistance services. Biometric systems designed in theory to detect illicit use can, in practice, arbitrarily cancel benefit payments or claim debts that cannot be traced. More seriously, these new faceless bureaucracies cause anxiety, fear and mistrust among people who already live on the edge. In her recent book Automating Inequality, the American political scientist Virginia Eubanks details how these systems operate in several US states. Were it not for her rigorous collection of documentary evidence, one might think she had invented it all. Eubanks denounces the creation of a disturbing system of punitive, opaque and highly invasive social control aimed exclusively at people in situations of extreme vulnerability. The ruling in the SyRI case reflects a very similar reality.

These tendencies, still a minority but steadily growing, force us, once again, to question the relationship between scientific and technological progress and human progress. Put at the service of the common good, large-scale data management opens up unexplored horizons. It can help carry out evidence-based evaluations of public policy, improve the efficiency with which social assistance is distributed, and create channels that allow an agile exchange of information. The editorial Data Strategy, published on February 5 by this newspaper, highlighted the importance for Europe of building digital platforms that guarantee the secure storage of data and allow it to be exchanged transparently. However, in the absence of strict accountability, the control of data by public authorities can also contravene the guarantee of fundamental rights, even within consolidated democracies. Who handles the codes? For what purpose?

The problems associated with the automation of the welfare state are essentially two. First, these new systems operate with a total lack of transparency. The design and execution of these complex algorithms is left in the hands of private big-tech companies far removed from public scrutiny. In the case of SyRI, as can be read in the report on the judicial hearing by the Center for Human Rights and Global Justice, the State's lawyer argued that for the system to meet its objectives, the indicators used must remain secret. If, for example, it were publicly announced that information on beneficiaries of social assistance is cross-referenced with data on patterns of water consumption to determine how many people actually live in a home, the people under investigation would leave the tap running, he said. In other words, if the formula is revealed, its purpose is ruined. The violation of people's right to privacy, by the very authority that ought to guarantee it, suddenly becomes a lesser evil. Second, automation eliminates the human factor in situations where there is no standardized response.
Paying for your supermarket shopping without anyone's mediation is not the same as facing an eviction before a machine programmed to say no. But the most important and fundamental question we must confront in view of this new reality is: what is it all for? Why all this effort to criminalize the poor, when the greatest challenge facing contemporary welfare states is how to get the most privileged to contribute? The algorithms are, decidedly, in the wrong place.

Margarita León is a professor of Political Science at the Autonomous University of Barcelona.

You can follow EL PAÍS Opinion on Facebook, Twitter or subscribe here to the Newsletter.

Source: El País

