A 13-year-old user signed up for the app and typed 'OnlyFans' into the search bar, looking for adult content. Social networks' algorithms do not miss a thing: as the teenager scrolled through the videos in his 'For You' feed, the recommendations began to fill with pornographic videos, even though he was no longer searching for that kind of content.
Do not panic: this is not a real case. The account was automated, created by The Wall Street Journal to understand what TikTok shows its youngest users. An analysis of the videos it was served found that the algorithms can easily surface pornographic content and videos about drugs, alcohol, or eating disorders. The Journal then began reporting this type of media so that its distribution would be restricted.
The application does not differentiate between videos for adults and for minors; however, TikTok wants to create a tool that filters content for young people. According to the terms of service, users must be at least thirteen years old and need their parents' consent to open an account in the app.
How does this type of content appear to us?
In truth, TikTok only needs one key piece of information to figure out what a user wants: how long they linger on a piece of content. From this signal, the application can infer your interests and offer you a personalized feed.
Another example: a bot was programmed to pause on videos with drug-related hashtags. At first it lingered on a recording of a young man looking for marijuana in the middle of a forest; by the next day, videos of cakes made with the drug began to appear. Eventually, the feed was full of inappropriate content that can harm minors.
Importantly, if users come across something they don't want to see, they can select 'Not interested' to avoid that type of content.
Regarding this situation, TikTok contacted 20BITS to clarify: "The activity and the resulting experience of these bots does not in any way represent the behavior or viewing experience of a real person."
"We are continually working to improve our systems and are reviewing how to help prevent even the most unusual viewing habits from creating negative cycles, especially for our younger users," the company continued.