The Limited Times


The bizarre method that will get you to stop arguing with Waze | Israel Hayom

2023-11-29T12:00:39.772Z

Highlights: Researchers at Pennsylvania State University reported interesting findings about how tailoring a voice assistant's personality affects the user experience. Users rated assistants perceived as having personalities similar to their own as more socially and intellectually attractive, and even more trustworthy. The more similar the assistants sounded to the subjects, the more likely the subjects were to change their minds about vaccines. Lead author Eugene C. Snyder emphasized that the finding, a tendency to resist certain types of information precisely when it comes from a source that should be perceived as more reliable, is counterintuitive.


Waze sends you down one route yet again, and you object and decide to take another? Researchers have found a strange explanation for this phenomenon, and, paradoxically, it led them to spread fake news.


The use of voice assistants such as Siri, Alexa, and Google Assistant is becoming increasingly common. Waze, too, is a kind of voice assistant, specializing in one particular domain. However, not all voice assistants are treated equally. A recent study by researchers at Pennsylvania State University produced interesting findings about how tailoring a voice assistant's personality affects the user experience. ChatGPT summarized the findings for us.

The study examined how personalization, that is, perceived similarity between the user's personality and the voice assistant's (the illusion that the voice assistant is similar in nature to the user), affects the user experience. It found a strong preference for assistants with an extroverted personality, which includes traits such as speaking louder, at a faster pace, and at a lower pitch. Users rated assistants perceived as having personalities similar to their own as more socially and intellectually attractive, and even more trustworthy.

In a particularly bizarre part of the study, the researchers fed unvaccinated subjects misinformation about vaccines through voice assistants, and the more similar the assistants sounded to the subjects, the more likely the subjects were to change their minds about the vaccines (that is, merely hearing the misinformation from a voice assistant tailored to them apparently led them to reject the very position it advocated).

Study co-author S. Shyam Sundar, who holds the James P. Jimirro Chair in Media Effects at Penn State University (Jimirro was a founder of Disney's television channel), emphasized the importance of perceived similarity for trustworthiness, and the potential of personalized voice assistants to increase user trust. Another co-author, Saeed Abdullah, highlighted the potential to improve interactions with personal assistants by matching them to their users; in other words, if Waze's voice sounds like you, you might stop second-guessing its directions. Lead author Eugene C. Snyder emphasized that the finding of a tendency to resist certain types of information precisely when it comes from a source that should be perceived as more reliable is counterintuitive. It suggests that people more easily recognize the discrepancies or errors in information when it is delivered this way, he said.


Source: israelhayom

