The Limited Times


If emotions can be recorded and shared thanks to AI and VR

2021-02-18T15:43:37.027Z




By Alessio Jacona *

It doesn't matter how many words, selfies and videos we post on social media every day: what we want to represent, the story of what we do, see and discover, remains partial, because we cannot complete it by conveying our real emotions, the broad, articulated set of sensations that the very experience we want to share generates in us.



What if we could?

What if, in addition to photos and videos, we could post our feelings on social media, to allow others to relive them?

Imagine being able to experience, thanks to an "augmented" virtual reality (VR), a roller coaster ride or a visit to a museum, savoring the space-time perception (as well as the visual, auditory and tactile sensations) experienced by another person.



Or better still, consider what it might mean to use the same technology, on the one hand, to objectively diagnose (as if it were a blood test) something as elusive as a pathology of the mind; and on the other, to treat it by creating ad hoc virtual environments that induce sensations capable of counteracting distress and restoring balance, while reducing or eliminating the use of medicines.



Twenty-six years after the release of the science fiction film Strange Days (whose plot revolves around something very similar), someone in Italy is very close to creating a system capable of recording our emotions and transmitting them to others thanks to the combined use of biomedical sensors, virtual reality and artificial intelligence.



The four-year European project is called EXPERIENCE, an acronym for "EXtended-PErsonal Reality: augmented recording and transmission of virtual senses through artificial-IntelligENCE", and is included among the Future and Emerging Technologies of Horizon 2020 devoted to the use of artificial intelligence in the social sciences and neuroscience.



It is coordinated by the bioengineer Gaetano Valenza of the E. Piaggio Research Center and the Department of Information Engineering of the University of Pisa, in partnership with the University of Siena, the University of Padua, the University of Rome "Tor Vergata", the Polytechnic University of Valencia, the Karolinska Institutet, the French CEA in Paris, the CSEM Swiss Center for Microelectronics and the Spanish start-up Quatechnion.



"The project has two main lines of research, one technological and the other scientific-medical," explains Valenza. "The first is to enable everyone to easily create virtual reality environments, making this technology usable not only as an audience but also as content creators, so that these environments can be shared online as simply as a photo. The second," he continues, "is the scientific challenge of measuring emotions: being able to quantify and define them in such a way that they can be poured into virtual reality, creating a much richer experience."

And is this where AI comes in?



«When it comes to recording emotions and transmitting, or inducing, them, the set of complex algorithms we call artificial intelligence carries enormous weight, first of all because it helps us manage an enormous amount of data.

These come from multiple sensors assigned to the most diverse tasks: from electrocardiogram detection to the recording of body or eye movements;

from the analysis of the sound of the voice to that of brain activity, from the monitoring of breathing to that of skin electrical activity.

AI allows us to analyze all of it with a single model, processing everything together».
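The idea of feeding many sensor streams to a single model can be sketched with a minimal example. Everything below is illustrative, not the project's actual pipeline: sensor names, feature dimensions and the random "trained" weights are all placeholder assumptions standing in for real physiological features and a real emotion classifier.

```python
import numpy as np

# Hypothetical per-trial features from each sensor stream mentioned in the
# interview (dimensions are made up for illustration).
rng = np.random.default_rng(0)
n_trials = 40
ecg  = rng.normal(size=(n_trials, 4))   # heart-rate variability statistics
eda  = rng.normal(size=(n_trials, 2))   # skin electrical activity
eeg  = rng.normal(size=(n_trials, 8))   # brain-activity band powers
gaze = rng.normal(size=(n_trials, 3))   # eye-movement summaries

# Early fusion: concatenate every modality into one feature vector per trial,
# so a single model sees all the data together.
X = np.hstack([ecg, eda, eeg, gaze])

# A single linear model then scores the fused vector (random weights here
# stand in for a trained emotion model).
w = rng.normal(size=X.shape[1])
scores = X @ w
print(X.shape, scores.shape)  # (40, 17) (40,)
```

This "early fusion" of concatenated features is only one of several ways to combine modalities; the point is simply that one model operates on all the streams at once rather than analyzing each sensor in isolation.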



But then the recorded and measured emotions must also be "transferred"...


«By analyzing the collected data, and knowing how the heart, brain and peripheral nervous system interact, we are able to define an emotion as a sensory map of the body.

Moving on to its induction, AI is used to do two things above all: the first is to manipulate virtual reality by acting in real time on video and audio to generate specific emotions.

Imagine an upbeat song, with its charge of joy: if, for example, I listen to it at half speed, it triggers very different sensations.

Here, something similar can be extended to all sensory aspects, effectively piloting the experience ».
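The half-speed-song example corresponds to a very simple signal operation. The sketch below, using a synthetic tone rather than any real song, shows the naive version: interpolating twice as many samples and playing them back at the original rate doubles the duration (and, as a side effect, drops the pitch by an octave).

```python
import numpy as np

# One second of a 440 Hz test tone at an illustrative sample rate.
sr = 8000
t = np.arange(sr) / sr
tone = np.sin(2 * np.pi * 440 * t)

# Naive "half speed": resample to twice as many points, then play back
# at the original rate, so the clip lasts twice as long.
n_out = 2 * len(tone)
x_old = np.linspace(0.0, 1.0, num=len(tone))
x_new = np.linspace(0.0, 1.0, num=n_out)
slowed = np.interp(x_new, x_old, tone)

print(len(tone), len(slowed))  # 8000 16000
```

Real-time systems of the kind described in the interview would more plausibly use time-stretching techniques (e.g. phase-vocoder methods) that change tempo without shifting pitch, but the principle of algorithmically reshaping the stimulus is the same.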




And the second thing?


«It's a more delicate aspect: our goal is to measure the perception of time.

At the physiological level, it is detected by specific brain cells and by mechanisms such as circadian rhythms, while at the cognitive level it depends on one's mental, emotional and psycho-physical state.

For example, a boring or pleasant conversation can give the impression that the passage of time accelerates or slows down.

We want to use artificial intelligence both to measure this perception and to manipulate it, inducing specific cognitive-emotional states through appropriate video, audio and sensory stimulation».




Is the artificial intelligence we have today powerful enough to succeed?



"The algorithms we have, not unlike those used, for example, to create AlphaGo, the Go-playing champion software, have the problem of being "black boxes": they take in data and generate usable output, but they do not explain which of the analyzed parameters really define a certain type of experience, an element absolutely necessary for the results to have clinical relevance.

We are working to develop new, more interpretable algorithms, able to highlight the various steps by which they arrive at their conclusions.

Only in this way is it possible to arrive at a diagnosis, derived from the recording of emotions, and at a medical treatment based on the induction of ad hoc experiences."



It sounds like science fiction ...


«And yet a branch of medicine already uses virtual reality, for example to treat phobias: a patient is gradually exposed to what scares him, helping him overcome his fears.

Likewise, it is possible to identify depressive symptoms and suicidal tendencies simply by analyzing a subject's social media posts.

Both can be done even more effectively using augmented virtual reality, what we call "extended and personal virtual reality"».




Someone will object that this technology is reminiscent of the Matrix and could also be used for malicious purposes ...


"Given that every step we take is strictly monitored from an ethical point of view, I believe it is characteristic of every great technological advance to pose challenges of this kind.

Our studies have shown that it is possible to estimate a person's psycho-physical and emotional state by monitoring only their heartbeat, and as we speak there are millions of smartwatches out there that do exactly that and could be used to gather this kind of information.

It depends on who produces and manages that technology.

After all, every time we post words, images or videos, already today we leave a very deep personal signature on the net.

It is good that everyone is aware of this."

* Journalist expert in innovation and curator of the Artificial Intelligence Observatory ANSA.it

Source: ansa

