
AI and the not-so-new ideal of impossible beauty

2024-04-12T05:00:58.121Z

Highlights: We live in a society that imposes ideals on us that are impossible to fulfill. We have so normalized the pressure around our image that most of the time we are not aware that it exists, and we interpret as normal a physical perfection that simply does not exist. Virtual women like Shudu, Miquela, Imma Gram and Aitana López seem to be real, even though they aren't. The ideal of perfection now reaches a new facet of cruelty: the new ideal does not sleep, does not get sick, does not gain weight.


Not only are manipulations of reality beginning to appear; we also interpret as normal a physical appearance that does not even exist.


Mental health is a recurring topic that is beginning to appear in written media, in public policies and even in conversations in the real world (the one beyond the screens). It is something that worries us. And with reason. We know that, as a society, we are burdened with problems that make life more difficult for everyone, for communities and for individuals.

Among the issues that endanger our mental well-being are material factors (essential for a decent quality of life), but also the growing social pressure toward a supposed perfection. We live in a society that imposes ideals on us that are impossible to fulfill, enormous demands that contradict one another. We don't quite know who imposed them, but we abide by them and reproduce them.

One of those overwhelming demands has to do with our physical appearance. Part of the enormous pressure we feel derives from the desire (almost a need) to emulate unattainable body ideals, which manifests itself in various discomforts and a series of dysfunctions. In fact, we have so normalized the pressure around our image that most of the time we are not aware that it exists. Along with it, we have also normalized hypersexualization and the symbolic fusion between what is sold and who sells it (or, rather, the image of who sells it). Body, object and the emulation of feelings in constant confusion.

Ideals around the body are nothing new. What changes is the form of that pressure, its focus (which part of the body is going to be criticized now) and even the ideal prototype: Jean Harlow, Betty Grable, Marilyn Monroe, Twiggy, Kate Moss and Cindy Crawford have been aspirations and images of femininity, ideal women unattainable for the average mortal. If the ideals were already complicated in themselves, the arrival of Photoshop represented a twist: it was no longer just that it was difficult for any woman to look like the great models; the great models themselves stopped looking like themselves. The gap between reality and the imposed ideal grew ever wider.

The ideal of perfection now reaches a new facet of cruelty. Not only are manipulations of reality appearing (I insist: the reference women stop looking like themselves, or do so at an unaffordable cost in money, interventions and pain), but we interpret as normal a physical perfection that simply does not exist. I am referring to women who are not real: the models and influencers created by artificial intelligence (AI): Shudu, Miquela, Imma Gram, Aitana López. Perfect. None of them, by the way, appears to be over 20 years old.

We could think that the anthropomorphism of AI is really just a continuation of an existing problem: a growing pressure around the body that imposes measurements and ideals that cause suffering, dysphoria, hunger and pain. But here we face a relevant nuance: virtual influencers seem to be real women. Even though they aren't. And it is impossible to look like someone who does not exist.

The difference between the pressure of the past and the current situation is that the idealized (and retouched) images used to refer to a real person. A person who slept, who could change her physical appearance, even gain weight (and be utterly vilified for it) and age. Change. The new ideal does not sleep, does not get sick, does not gain weight.

These transmitters of messages (mostly noise with which to fill the hours), free of social problems or needs, create not only content aimed at selling different products but also a deep body dissatisfaction. They don't sleep, they don't age, they don't get sick and, what's more, they never get angry. They don't have periods or migraines, they don't suffer from acne or retain fluids.

If this technology and its implementation of digital humans may be disruptive, their psychosocial effect is not disruptive at all. From the corset we moved on to the scalpel, and from there, in a small leap (small for the industry, very large and backwards for our well-being), to the algorithm. I am not even talking here about deepfakes, revenge porn or image theft, all symptoms of a society that continues to confuse women with objects.

The big problem we face with this digitalization of beauty is no longer the pressure of an unattainable physical ideal, but rather that we stop being aware that it is unattainable. Are we able to distinguish a human influencer from one created by artificial intelligence? Should we educate younger people to understand that the ideal of beauty is completely fictitious, or should we ask ourselves how ethical this new use of digital tools is?

Once again, artificial intelligence is nothing more than an instrument at the service of vested interests, and it can be as good and as useful (or as destructive) as we want it to be. For me, the question is not whether the algorithm is good, bad or average, but what limits we, as a society, should impose on its uses. Basically, the question is, once again, what kind of society we want to be and how much we care about the well-being of the people in it.

Irene Lebrusán is a Doctor in Sociology, professor at the Autonomous University of Madrid and author of the book Housing in Old Age: Problems and Strategies for Aging in Society.

Source: El País
