Online algorithms are leaving teens vulnerable to harmful content.
Social media platforms are a ubiquitous presence in the modern world. Today, a person's social circle includes not only those in their nearby geographical area, but also anyone around the world with access to the same social media channels. Thanks to site algorithms, users can see others' life updates from day to day.
The dangers of social media use have been well documented and pose a threat to people of all ages. Those risks appear even greater for teens, however, and some platforms may be amplifying the threat with algorithms that put users face-to-face with the very content that could harm their health and well-being.
Recent research by the Center for Countering Digital Hate points to a pattern of the social media platform TikTok promoting content related to self-harm and eating disorders almost immediately after users show some interest in such subjects.
These findings are highly concerning but not necessarily surprising. All social media platforms use algorithms that track user behavior in order to serve more content related to demonstrated interests. That isn't much of a problem when the content is benign (a user interested in baking may see more cooking content, for instance), but it takes on a sinister tone when the suggestions present users with content that may deepen their physical or mental health struggles.
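To make that feedback loop concrete, here is a minimal, hypothetical sketch of an engagement-driven feed ranker. This is not TikTok's algorithm; the topic tags, weights, and scoring function are invented purely to illustrate how a few seconds of watch time, or a single like, can tilt every subsequent recommendation toward the same subject.

```python
from collections import defaultdict

# Hypothetical sketch only: real recommendation systems are far more
# complex. The point is to show how small engagement signals compound.

def update_interests(interests, video_topics, watch_seconds, liked):
    """Boost the user's score for each topic on an engaged-with video."""
    signal = watch_seconds + (10.0 if liked else 0.0)  # likes weigh heavily
    for topic in video_topics:
        interests[topic] += signal
    return interests

def rank_feed(candidates, interests):
    """Order candidate videos by the user's accumulated topic scores."""
    def score(video):
        return sum(interests[t] for t in video["topics"])
    return sorted(candidates, key=score, reverse=True)

# A brief pause on one sensitive video is enough to reorder the feed.
interests = defaultdict(float)
interests = update_interests(interests, ["dieting"], watch_seconds=3, liked=False)

candidates = [
    {"id": 1, "topics": ["baking"]},
    {"id": 2, "topics": ["dieting", "body-image"]},
]
print([v["id"] for v in rank_feed(candidates, interests)])  # -> [2, 1]
```

In this toy model there is no notion of whether a topic is harmful; the ranker simply rewards whatever the user lingered on, which is precisely the dynamic the researchers set out to measure.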
One of the challenges in studying the harms of social media and other online platforms is that their behavior is hard to test directly. Rather than relying on reports from actual users, which can be tough to secure and are often unreliable, the Center for Countering Digital Hate created a set of accounts based in various countries, each registered with a stated age of 13.
With these accounts in place, the researchers used a variety of methods to engage with content related to eating disorders or self-harm, such as pausing briefly on relevant videos or, in some cases, “liking” them. Once this activity was completed, shortly after the accounts were opened, the suggested content that made its way into the accounts' feeds was monitored and tracked.
Some of the content that appeared in these feeds was truly disturbing, including videos discussing plans for suicide and videos presenting cosmetic surgery as a solution to body image issues. Whether this type of content deserves a place even on a platform restricted to adults is debatable, but it is hard to find an argument for presenting it to accounts that have self-identified as belonging to 13-year-olds. Moreover, it is common knowledge that many people register social media accounts claiming to be 13 when they are actually younger, so harmful videos could be circulating among even younger children.
What young people are exposed to online via algorithms can significantly shape how they view the world and their own place in it. With access to platforms like TikTok so easy to obtain, and with dangerous content served so quickly after joining, parents and guardians must stay vigilant to avoid potentially devastating consequences for the young people in their care.
Sources:
TikTok Is Flooding Vulnerable Teenage Girls With Self-Harm Content: Report