
The advances of algorithmic reason, a major challenge for democracy

2024-03-07T09:36:45.699Z

Highlights: The advances of algorithmic reason pose a major challenge for democracy. Politics in the digital age has to consider users as reflective subjects. Algorithmic reason is governed by the promise of satisfying our preferences, once it is assumed capable of identifying them accurately, writes Daniel Innerarity, who argues that algorithmic governance is deeply depoliticizing and that we must protect the indeterminacy of the future. The promise of algorithms is that if we let them sift through the data we have inadvertently generated, they can determine who we are, what we need, and what we want.


Politics in the digital age has to consider users as reflective subjects.


Much of our discomfort with democracy has to do with the fact that the political system does not know us well enough and, if we are honest, that we ourselves do not know very well what is best for us: we are not capable of processing all the available information, and we are largely unaware of the options open to us.

There are criticisms of “surveillance capitalism” (Zuboff) on the grounds that whoever governs us (states or companies) knows too much about us, but the opposite could also be criticized: that they do not know our preferences and interests well enough, that they do not represent us properly and do not know what we want.

The basis of this discomfort would be the ignorance of power and not its excess of knowledge about us.

Another dimension of this ignorance is temporal in nature.

The time of politics establishes some solemn moments of verification of popular opinion (elections, referendums, consultations, surveys, accountability) which are followed by long periods, too long, of delegation, trust and even betrayal of the voters.

They may have known what we wanted at a given moment (and on a few matters), but they have, so to speak, stopped knowing it.

We then have three problems of ignorance: that of the rulers, that of the governed and that which comes from this discontinuity in the verification of the popular will.

What if, then, we had a technology that allowed us to remedy these deficiencies, that is, one that kept us, rulers and governed alike, continually informed about everything relevant to our collective decisions?

The promise of algorithms is precisely that if we let them sift through the data we have inadvertently generated, they can determine who we are, what we need, and what we want.

We could call this new form of democracy a reco-cracy, that is, a democracy in which the demos is constituted as the final aggregate of all the recommendations received.

It would be an implicit democracy, where what we want in fact would be established as the new sovereign.

Citizens would be consulted implicitly on the most varied issues, without being overloaded with excessive complexity and with the assurance that their points of view would be taken into account in decision-making.

The world imagined by algorithmic reason is governed by the promise of satisfying our preferences, once it is assumed capable of identifying them accurately, without any desire for authoritarian prescription.

And here I have a double suspicion: that algorithmic rationality implies both undue interference and an unjustified curtailment of our political will; that, conceived in this way, it is others who decide what we should prefer; and that it takes for granted that we can only prefer what we have preferred in the past.

The first objection asks whether it is really about our preferences.

Are they offering us what we want or do we end up wanting what they offer us?

There is a constructive dimension to what recommendation algorithms do with our preferences: although they present themselves as merely identifying preferences, they may to a certain extent be inducing them.

The second objection is that they are preferences that correspond to the past and that excessively determine our future ones.

Personalization and recommendation algorithms are configured on the basis of information about past decisions, interests and preferences.

The problem with these systems based on machine learning is that they give us “more of the same.”

This model is especially inadequate for those activities that, like politics in a democratic society, have the purpose of intervening in the world with the aim of changing it.

An algorithm could not have generated movements like #MeToo or #Seacabó, which imply a deliberate break with the sexist practices of the past.

How do we want to understand the reality of our societies if we do not introduce into our analyses, in addition to our de facto behaviors, the enormous asymmetries in terms of power, the injustices of this world and our aspirations to change it?

Are we willing to let the data that feed algorithms turn our past into the future?

The digital age makes us dream of the horizontalization of power, the apotheosis of networks, the return of the sovereign individual; but, instead of releasing spontaneity, enabling bifurcation and unpredictable alteration, we have a system that locks us into the calculation of the possible.

In this sense, recommendation systems, despite appearances, lie outside the control of the subjects: they are based on non-reflective behavior rather than on expressed preferences or the object of deliberation.

It is a form of knowledge and communication that excludes self-reflection in the process of learning about oneself.

Algorithmic governance is very depoliticizing.

Politics in the digital age has to consider users as reflective and political subjects, for which we must moderate the weight of the past in algorithmic governance and protect the indeterminacy of the future.

Daniel Innerarity is Professor of Political Philosophy at the University of the Basque Country, Spain.

Director of the School of Transnational Governance of the European University Institute, Florence.

Copyright La Vanguardia, 2024

Source: clarin
