
“The app reads my thoughts”: when the TikTok algorithm completely exposes us

2022-01-23T11:56:47.605Z


In a few clicks, the famous application can detect many aspects of its users' lives, to the point of generating a certain unease.


As a teenager, Caroline, 25, wondered about her sexual orientation. But it was in 2020, when she downloaded the TikTok application, that she was finally able to put words to what she had always suspected. “I am bisexual,” she concluded in front of a screen that kept filling up with videos on the subject. “They described my experiences and my feelings very precisely; it is thanks to them that I came to accept myself,” says this American, a law firm employee based in New Orleans. “It's very liberating not to have to cut that part of my life out anymore.”

But how could a simple music video app lead to a “coming out”? Caroline and many other Internet users all point to an essential cog in the platform, which counts a billion users worldwide: its mysterious, and seemingly very powerful, content recommendation algorithm. Are you a fan of dance and fashion? TikTok will offer you videos of young people creating (or imitating) original choreographies, or of bloggers in the front rows of the latest Fashion Week. A lover of animals and good gags? Your feed will fill up with clumsy cats falling headfirst down flights of stairs. A young man passionate about fitness and bodybuilding? Here comes a stream of content with tips for “putting on weight”.

The TikTok algorithm knows more about me than my dad does, I'm going to sleep

— mira 🌻 (@miiradoraa) July 2, 2021

Hidden behind the “For You” page (often abbreviated “FYP”), the app's personalized video feed and main gateway, the recommendation algorithm is jealously guarded by ByteDance, the Chinese group that owns TikTok.

An internal document analyzed by the New York Times sketched its outlines last December: in addition to suggesting very popular videos, the application analyzes the videos the user “likes” and comments on, but also how long they watch each piece of content, and whether they watch it at all.
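To make the mechanism concrete, here is a minimal sketch of how signals like these (predicted likes, comments, watch time, rewatches) could be combined into a single ranking score. The signal names and weights below are illustrative assumptions, not the formula from the leaked document.

# Hypothetical sketch: combine predicted engagement signals into one score
# and rank candidate videos for a user's feed. Weights are invented.

def video_score(p_like, p_comment, expected_watch_time, p_rewatch,
                w_like=1.0, w_comment=2.0, w_watch=0.5, w_rewatch=1.5):
    """Weighted sum of predicted engagement signals for one (user, video) pair."""
    return (w_like * p_like
            + w_comment * p_comment
            + w_watch * expected_watch_time
            + w_rewatch * p_rewatch)

# Rank candidate videos from most to least promising for this user.
candidates = {
    "dance_clip": video_score(0.30, 0.05, 0.8, 0.10),
    "cat_fail":   video_score(0.55, 0.02, 0.9, 0.25),
    "fyp_vlog":   video_score(0.10, 0.01, 0.4, 0.05),
}
feed = sorted(candidates, key=candidates.get, reverse=True)
print(feed)  # ['cat_fail', 'dance_clip', 'fyp_vlog']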

It is this analysis that allows the application to refine its perception of the Internet user, comments Aurélie Jean, author of the book “Do Algorithms Make the Law?” (Éditions de l'Observatoire).

“Whatever it is, an algorithm will use your static profile (the one you define on the platform, with your gender, age, where you live and your language) and your dynamic profile, based on your behavior. It includes the videos you like and comment on, the types of accounts you follow, those who follow you, and how often you comment, like and log on…”
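A small sketch of how the two profiles she describes might be represented in code; every field name here is an assumption made for illustration, not an actual TikTok data model.

# Illustrative data structures for a declared "static" profile and a
# behavior-driven "dynamic" profile. Interactions only touch the latter.
from dataclasses import dataclass, field

@dataclass
class StaticProfile:
    gender: str
    age: int
    country: str
    language: str

@dataclass
class DynamicProfile:
    liked_videos: list = field(default_factory=list)       # video ids
    commented_videos: list = field(default_factory=list)
    followed_accounts: list = field(default_factory=list)
    followers: list = field(default_factory=list)
    sessions_per_day: float = 0.0                           # connection frequency

@dataclass
class UserProfile:
    static: StaticProfile
    dynamic: DynamicProfile

    def record_like(self, video_id: str) -> None:
        """Each interaction updates the dynamic profile, never the static one."""
        self.dynamic.liked_videos.append(video_id)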

These analyses can go very far, according to the specialist.

“From the types of posts you like, other algorithms can extract information such as the sentiment generated by a piece of content. If, on TikTok, people write things on their posts, we can extract a positive, negative or neutral sentiment, and it can feed the description of the type of content you like. We can also find out whether you like posts about politics, beauty, fashion, movies or animals.”
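The idea of extracting sentiment from post text and feeding it back into a profile of liked content can be illustrated with a toy example; the word lists, topics and labels below are invented, and a real system would use trained language models rather than keyword matching.

# Toy sketch: classify the sentiment of a liked post's text and count how
# often the user likes each (topic, sentiment) combination.

POSITIVE = {"love", "great", "beautiful", "happy", "proud"}
NEGATIVE = {"hate", "awful", "sad", "angry", "lonely"}

def sentiment(text: str) -> str:
    words = set(text.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    if score > 0:
        return "positive"
    if score < 0:
        return "negative"
    return "neutral"

def update_profile(profile: dict, liked_post_text: str, topic: str) -> None:
    """Increment the counter for this (topic, sentiment) pair."""
    key = (topic, sentiment(liked_post_text))
    profile[key] = profile.get(key, 0) + 1

profile = {}
update_profile(profile, "So proud of my rescue cat, I love her", topic="animals")
update_profile(profile, "Feeling sad and lonely tonight", topic="personal")
print(profile)  # {('animals', 'positive'): 1, ('personal', 'negative'): 1}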

“TikTok knew I liked fashion, good restaurants and animals”

It is this mechanism that seems to have understood everything about Fiona, a public relations student in Paris specializing in luxury.

"Very quickly, TikTok offered me fashion content, and sometimes videos of people with a style of clothing, which suited me well, even though I hadn't done any research on it," she recalls.

In addition to fashion, the 21-year-old's main passion, the famous algorithm has drawn up a precise map of her other interests: “animals, outings in Paris, good restaurants”.

Nathan (first name has been changed), a Malaysia-based TikTok fan, appears to have the same profile.

He noticed that TikTok knew he liked animals, good food, and dark humor.

But it was in a tarot-reading video about his personal aspirations - esotericism has recently become particularly popular on social networks and among young people - that he recognized himself the most.

"I'm no card-shooter, but there's one that constantly pops up on my page describing how I feel, how I should move forward in my life.

She is 80% right in what she says, whether it concerns me, my work or someone else who I look like, ”sums up the 27-year-old public relations officer.


TikTok can therefore read you almost as a fortune teller reads the cards.

But there is nothing magical about it.

The "beauty" of the "FYP" algorithm, "is that a user's interest profile can be created without it representing a burden for him", summarizes Eugene Wei, former product manager who went through Amazon, Oculus or Hulu, and specialist in tech giants, in his personal blog.

By watching (or swiping past) videos of cats, dancers, or gags set to music, Internet users tell TikTok what they like (or not) much more easily than on Facebook, Twitter or YouTube, where they must follow other people, add friends or subscribe to specific pages to signal their interests.

“Everything is personalized passively, by consuming content. Since the videos are very short, the volume of data a user provides over a given time is enormous. Since the videos are entertaining, the identification process is effortless, and it is even fun for the user,” continues Eugene Wei.
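A back-of-the-envelope calculation illustrates Wei's point about data volume: the shorter the items, the more accept-or-skip decisions a platform collects per hour of use. The durations below are rough assumptions, not measured figures.

# Rough comparison of explicit preference signals per hour of viewing,
# assuming one swipe/skip/finish decision per item watched.

def signals_per_hour(avg_item_seconds: float) -> float:
    return 3600 / avg_item_seconds

platforms = {
    "Short clips (assumed ~20 s each)": signals_per_hour(20),
    "Long videos (assumed ~10 min each)": signals_per_hour(600),
}
for name, rate in platforms.items():
    print(f"{name}: ~{rate:.0f} preference signals per hour")
# Short clips (assumed ~20 s each): ~180 preference signals per hour
# Long videos (assumed ~10 min each): ~6 preference signals per hour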

“I have the impression that the app reads my thoughts”

TikTok is not limited to its addicts' interests: it also knows how to pry into their private lives.

Caroline saw her newsfeed change within days of her breakup.

“My ex left me in mid-September, and within three days I no longer saw videos about romantic relationships the way I had before. I saw people talking about their own breakups, others saying how funny the situation was, or others saying ‘don't text your ex!’,” recalls the 25-year-old American.

the more time goes on, the more I sincerely think the TikTok algorithm knows my life better than I do myself, I don't know if I'm scared or fascinated lol

— opaline (@espritsaturnien) March 21, 2021

The app even seems capable of providing its own psychological diagnoses.

For a long time, Fiona has known that she is hypersensitive.

“I had never looked for content to try to understand it. But one day, I had several videos in my feed explaining what it was and how to manage my hypersensitivity,” describes the student. “I feel like the application reads my thoughts, it's not possible!”

An algorithm with harmful effects?

When used in a positive way, an algorithm as powerful as TikTok's can therefore help you get to know yourself better.

But it can also reinforce more harmful beliefs.

Published last May, a study by the American NGO Media Matters expressed concern that TikTok's suggestion algorithm could, for example, lead its users into sometimes dangerous “niches”.

By interacting with videos with transphobic overtones, the study's authors quickly found themselves exposed to other hateful content, ranging from “misogynistic”, “racist” or even “anti-Semitic” videos to “conspiracy theories”, “symbols of hatred” and “calls to violence”. “Transphobia was a gateway to hatred that could lead to even greater far-right radicalization,” said the study by the NGO, which is considered left-leaning in the United States.

Journalists at The Wall Street Journal, meanwhile, created several fake accounts with minimal information to analyze where the app would naturally lead them.

Again, these accounts were trapped by the algorithm: an account that showed sensitivity to depression was flooded with videos about sadness and loneliness, while an account interested in politics was steered toward videos from the QAnon conspiracy movement.

The heart of the business model

To avoid these abuses, TikTok says it works “proactively” to remove content that does not comply with its terms of use. “Between April and June 2021, over 81.5 million videos were removed for violating our Community Guidelines or Terms of Service, 94.1% of them before they were even reported,” explains the platform, which confirms that it analyzes many user interactions to refine its video suggestions, in order to create a “For You” page unique to each user. But does the application use more personal data, such as the content of in-app messages or data from third-party applications? Contacted by Le Parisien, the platform did not answer this specific question.

In any case, TikTok does not seem shaken by the many questions and suspicions surrounding the algorithm of its “For You” page. “Without their algorithms, social networks would not work. Their role is to offer interesting content. And these algorithms feed and reinforce an economic model, which is consumption over short periods of time, to make you stay there as long as possible, even if you see content that is debatable,” analyzes Aurélie Jean.

The social network - which has become a medium in its own right - is now part of the daily life of its users, who are not really afraid of being laid bare in a few clicks.

"I find it worrying, but it doesn't surprise me more than that.

Today, we are so surrounded everywhere…”, resigns Fiona, in Paris.

"I like it when people manage to identify me correctly", adds Nathan, in Malaysia.

With TikTok, he finds that feeling again, he says.

"I'm like

, oh, I'm normal, because other people feel the same way

.

And there, I tell myself that I am not alone.

»

Source: Le Parisien
