According to recent research, the popular Chinese social network may be serving videos unsuitable for younger audiences, pushing them toward extreme weight-loss goals.
Video content about "diet" and "weight loss" available on TikTok may be fueling eating disorders among young users. This is what emerges from a disturbing Wall Street Journal investigation into the extreme challenges many girls take on to lose weight to the point of becoming skeletal, contributing to the spread of eating disorders as a social scourge. The reporters created a hundred bogus accounts that browsed the Chinese application largely at random, with little human intervention, simulating the behavior of kids. After the accounts spent time watching content related to alcohol, gambling, and weight loss (none of it blocked by the platform), TikTok's algorithm adjusted accordingly, increasing the number and frequency of diet and weight-loss videos in the For You feed.
By the end of the experiment, of the approximately 255,000 videos the automated accounts watched in total, 32,700 contained a description or metadata matching a list of hundreds of weight-loss keywords: 11,615 videos had text descriptions containing keywords relevant to eating disorders, while 4,402 had a combination of keywords that normalized eating disorders. What's more, to avoid being flagged by the platform, the descriptions of some videos used altered spellings of eating-disorder keywords, replacing a few letters with a number or an asterisk, for example.
In response to the news outlet's report, TikTok announced it was working on new ways to let users engage safely with the platform and its content. The idea is to develop a strategy for recognizing video content that may not violate TikTok's policies but could be harmful if viewed repeatedly; the company is also considering a tool that would let users (or their parents, when children use the app) block videos containing certain words or hashtags from appearing on the For You page.
"Even though the experiment conducted by the WSJ does not reflect the experience most people have on TikTok, even one person who has that experience is one too many," a TikTok spokesperson said. "We allow educational or recovery-oriented content because we understand that it can help people see that there is hope, but content that promotes, normalizes, or glorifies disordered eating is prohibited."
TikTok is not the first social network to come under fire for the negative influence it exerts on its users, especially younger ones: another recent investigation, also conducted by the WSJ, showed how Instagram can seriously damage adolescents' mental health and undermine girls' self-esteem and self-image. Following that research, the social media company announced a feature to steer teens away from potentially harmful content, as well as a "Take a break" function that invites users to close the app once they have spent a set amount of time on the platform (10, 20, or 30 minutes).
Sources: TikTok / Wall Street Journal