
Internals leaked

What TikTok knows about the danger of the addiction algorithm

News
16.10.2024 13:35

More than a dozen US states have recently filed a lawsuit against TikTok. The short video app is accused of addicting children and young people. TikTok rejects the allegations, but leaked internal documents that were never intended for the public show that the company has most likely been aware of the danger posed by its algorithm for years.

The attorneys general of 13 US states and the capital Washington filed a lawsuit against the subsidiary of the Chinese ByteDance Group a few days ago - krone.at reported. They accuse TikTok of deliberately designing its platform so that children and young people want to spend ever more time on it, pointing to features such as endless scrolling through automatically playing videos, which they argue leave children in particular hooked and craving more.

TikTok, which is currently also fighting a US law aimed at forcing a change of ownership, rejected the accusations and pointed to robust safety precautions and limits on how much time young users can spend on the platform. However, the app, which according to the Youth Internet Monitor is used by almost two thirds of children and young people in Austria as well, is likely to have misled the public about the dangers and risks it poses.

Hooked in less than 35 minutes
This emerges from internal documents that are part of the more than two-year investigation into TikTok by the 14 attorneys general. NPR (National Public Radio) was able to view and report on them, despite a confidentiality agreement between the US states and TikTok, because the redactions were applied incorrectly when the lawsuit was filed in the US state of Kentucky.

The approximately 30 pages of now not-so-secret documents - mainly summaries of internal studies and communications - show that TikTok was aware of the great addictive potential of its platform as well as the dangers and risks of potentially harmful content for its underage users, but nevertheless took hardly any measures against it.

For example, TikTok determined the exact number of video views needed to establish a habit and keep users glued to their smartphone screens: 260, after which, according to government investigators, "it is likely that a user will become addicted to the platform".

In the previously redacted portion of the complaint, Kentucky authorities say, "While this may seem substantial, TikTok videos can be as short as eight seconds and are automatically played to viewers in rapid succession." Thus, an average user is "likely to become addicted to the platform in less than 35 minutes," NPR quotes.

Two thirds of young people in Austria use TikTok. (Image: saferinternet.at)

Compulsive use and its consequences
Another internal document shows that the company was aware that the many features designed to keep young people on the app lead to a constant and irresistible urge to open the app again and again. TikTok's own research states that "compulsive use correlates with a number of negative mental health outcomes, such as loss of analytical skills, memory formation, contextual thinking, conversational depth, empathy, and increased anxiety," the lawsuit states.

In addition, the documents show that TikTok was aware that "compulsive use also interferes with essential personal obligations such as adequate sleep, work/school, and connecting with loved ones."

Countermeasures "not all that useful"
"Our goal is not to reduce time spent," a TikTok project manager is quoted as saying in the documents. In a chat message reflecting the same view, another employee said the aim was to "contribute to user retention".

Accordingly, videos produced specifically by TikTok to encourage users to stop scrolling endlessly and take a break were not widely advertised. According to the report, a senior employee admitted that although these videos were "a good talking point" for policy makers, they were otherwise "not all that useful".

The same applies to a tool developed by TikTok, which limits the app's usage time to 60 minutes per day by default. However, the leaked documents show that TikTok measured the success of this tool by how it "improves public trust in the TikTok platform via media coverage" - and not by how it reduces the amount of time young people spend on the app.

Regular breaks can be scheduled in the app's "Digital Wellbeing" settings, but these are rarely used. (Image: TikTok)

Internal tests showed that the tool had only a minor impact on actual usage time: it dropped by around 1.5 minutes, from roughly 108.5 minutes a day before the tool to around 107 minutes with it. According to NPR, the attorneys general's complaint states that TikTok "did not revisit this issue" as a result.

Beautiful people preferred
The multi-state lawsuit against TikTok also centers on the company's beauty filters, which users can place over videos to look slimmer and younger or to have fuller lips and bigger eyes. One of them, the so-called "Bold Glamour" filter, uses artificial intelligence to change people's faces so that they resemble models with high cheekbones and a pronounced jawline.

Two pictures of "Krone" editor Clara Milena Steiner - one without filters, one "optimized". And strictly speaking, both pictures are a lie. (Image: Clara Milena Steiner, Krone KREATIV)

As the documents show, TikTok is aware of the damage these beauty filters can cause to young users. Internally, employees therefore proposed "providing users with educational materials about self-image disorders" and launching a campaign "to raise awareness of issues related to low self-esteem (caused by excessive filter use and other problems)".

They also suggested putting a banner or video in front of the filters containing "an educational statement about filters and the importance of positive body image/mental health". However, this did not happen - on the contrary, the documents reveal that the app's algorithm specifically favors beautiful people.

An internal report analyzing TikTok's main video feed found that "a large number of (...) unattractive subjects" filled the feed of all users. In response, Kentucky investigators say, TikTok revised its algorithm to prioritize users the company considered beautiful.

"By [TikTok] changing the TikTok algorithm to display fewer 'unattractive subjects' in the 'For You' feed, the company took active steps to promote a narrow standard of beauty, even if it could negatively impact its young users," Kentucky authorities are quoted as saying. Meanwhile, TikTok outwardly stated that one of its "most important commitments is to support the safety and well-being of teenagers".

Deprived of sleep, food and eye contact
According to the documents, an unnamed company executive described in drastic terms what the TikTok algorithm could mean for young users instead: The reason children watch TikTok is the power of the app's algorithm, "but I think we need to be clear about what that could mean for other opportunities. And when I say other opportunities, I literally mean sleeping, eating, moving around the room and looking someone in the eye."

"Arms race for attention"
TikTok's own research concluded that children are the most susceptible to being sucked into the app's endless video feed. "As expected, the younger the user, the better the performance on most engagement metrics," reads a 2019 document from the short video app, which, according to an internal presentation, is in an "arms race for attention".

Nevertheless, the company has been cautious about deleting accounts of users suspected of being under 13 years old. An internal document on "younger users/U13" states that TikTok has instructed its moderators not to take action on reports of underage users unless their account identifies them as under 13.

Harmful content
This is particularly problematic because, according to the documents, TikTok is also slow to remove potentially harmful content. A separate study, for example, refers to self-harm videos that were viewed more than 75,000 times before TikTok identified and removed them. At the same time, the documents show that a lot of harmful content, such as material depicting eating disorders, drug use, dangerous driving, or gore and violence, is in fact "allowed", contrary to the platform's official community guidelines.

The content can often be found on TikTok and is just not "recommended", which means that it is not displayed in users' "For You" feeds or has a lower priority in the algorithm, writes NPR, referring to the incorrectly redacted documents. Accordingly, TikTok admits internally that there are significant "leak rates" of illegal content that are not removed.

According to the report, these "leak rates" include the "normalization of pedophilia" (35.71 percent), "sexual harassment of minors" (33.33 percent), "physical abuse of minors" (39.13 percent), the "glorification of sexual abuse of minors" (50 percent) or the "fetishization of minors" (100 percent).

TikTok: Publication "highly irresponsible"
TikTok itself once again rejected the accusations and pointed to "robust safety precautions", including the proactive removal of suspected underage users, as well as "voluntary safety features" such as the option to limit screen time. What's more, the short video app castigated NPR's publication of the documents, which are under court seal, as "highly irresponsible".

"Unfortunately," the radio network quoted a company spokesperson as saying, "this complaint cherry-picks misleading quotes and takes outdated documents out of context to misrepresent our commitment to public safety." It is now up to the courts to decide which of the two sides is right in the end.

This article has been automatically translated; read the original article here.
