Instagram would recommend sexual Reels to teenagers aged 13 and over

Instagram recommends Reels with sexual content to teens as young as 13, even if they're not specifically looking for racy videos, according to separate tests conducted by The Wall Street Journal and Laura Edelson, a professor at Northeastern University. Both created new accounts and set their age to 13 for the tests, which took place mainly from January to April of this year. Instagram reportedly served moderately racy videos from the start, including clips of women dancing sensually or emphasizing their bodies. Accounts that watched these videos and skipped other Reels then began receiving recommendations for more explicit content.

Some of the recommended Reels showed women miming sexual acts, while others promised to send nudes to users who commented on their accounts. The test accounts also reportedly saw videos of people exposing their genitals, and in one case a teen test account was shown "video after video of anal sex." It took only three minutes after the accounts were created for sexual Reels to appear, and within 20 minutes of watching them, the recommended Reels feed was dominated by creators producing sexual content.

Notably, the Journal and Edelson ran the same test on TikTok and Snapchat and found that neither platform recommended sexual videos to the teen accounts they created. Those accounts didn't see age-inappropriate recommendations even after actively searching for such content and following the creators who produce it.

The Journal says Meta employees have flagged similar issues in the past, citing undisclosed documents detailing internal research into harmful experiences young teens have on Instagram. Meta's own safety staff previously ran the same kind of test and got similar results, the publication reports. Company spokesperson Andy Stone, however, dismissed the report, telling the Journal: "It was an artificial experience that doesn't match the reality of how teens use Instagram." He added that the company "has made efforts to further reduce the volume of sensitive content that teens might see on Instagram and has significantly reduced those numbers over the past few months."

Last January, Meta introduced significant privacy updates aimed at protecting teen users and automatically placed teens in its most restrictive content-control setting, which they cannot opt out of. The Journal's testing was done after these updates were rolled out, and it was still able to reproduce the results as recently as June. Meta released the updates shortly after the Journal published the results of a previous experiment, which found that Instagram Reels would show "risqué images of children as well as overtly sexual adult videos" to test accounts that exclusively followed teenage influencers and preteens.
