Sexually Explicit Videos Featured by Instagram’s Algorithm
It seems that Instagram's algorithm is willing to serve up dark content, even content featuring minors, to those who follow specific accounts, while keeping it hidden from most users. What is being done about this?
From Fox Business. Instagram’s Reels video service is designed to show users streams of short videos on topics the system decides will interest them, such as sports, fashion or humor.
The Meta Platforms-owned social app does the same thing for users its algorithm decides might have a prurient interest in children, testing by The Wall Street Journal showed.
The Journal sought to determine what Instagram’s Reels algorithm would recommend to test accounts set up to follow only young gymnasts, cheerleaders and other teen and preteen influencers active on the platform. …
The Journal set up the test accounts after observing that the thousands of followers of such young people's accounts often include large numbers of adult men, and that many of the accounts that followed those children had also demonstrated an interest in sex content related to both children and adults. The Journal also tested what the algorithm would recommend after its accounts followed some of those users as well, which produced more-disturbing content interspersed with ads.
In a stream of videos recommended by Instagram, an ad for the dating app Bumble appeared between a video of someone stroking the face of a life-size latex doll and a video of a young girl with a digitally obscured face lifting up her shirt to expose her midriff. In another, a Pizza Hut commercial followed a video of a man lying on a bed with his arm around what the caption said was a 10-year-old girl.
The Canadian Centre for Child Protection, a child-protection group, independently ran similar tests, with similar results.
Meta said the Journal’s tests produced a manufactured experience that doesn’t represent what billions of users see. …
The Journal reported in June that algorithms run by Meta, which owns both Facebook and Instagram, connect large communities of users interested in pedophilic content. The Meta spokesman said a task force set up after the Journal’s article has expanded its automated systems for detecting users who behave suspiciously, taking down tens of thousands of such accounts each month. The company also is participating in a new industry coalition to share signs of potential child exploitation.
Companies whose ads appeared beside inappropriate content in the Journal’s tests include Disney, Walmart, online dating company Match Group, Hims, which sells erectile-dysfunction drugs, and The Wall Street Journal itself. Most brand-name retailers require that their advertising not run next to sexual or explicit content. …
Meta created Reels to compete with TikTok, the video-sharing platform owned by Beijing-based ByteDance. Both products feed users a nonstop succession of videos posted by others, and make money by inserting ads among them. Both companies' algorithms show each user the videos the platform calculates are most likely to keep that user engaged, based on his or her past viewing behavior.
The Journal reporters set up the Instagram test accounts as adults on newly purchased devices and followed the gymnasts, cheerleaders and other young influencers. The tests showed that following only the young girls triggered Instagram to begin serving videos from accounts promoting adult sex content alongside ads for major consumer brands, such as one for Walmart that ran after a video of a woman exposing her crotch.
When the test accounts then followed some users who followed those same young people's accounts, the recommendations became even more disturbing. The platform served a mix of adult pornography and child-sexualizing material, such as a video of a clothed girl caressing her torso and another of a child pantomiming a sex act.
Experts on algorithmic recommendation systems said the Journal’s tests showed that while gymnastics might appear to be an innocuous topic, Meta’s behavioral tracking has discerned that some Instagram users following preteen girls will want to engage with videos sexualizing children, and then directs such content toward them. …
Current and former Meta employees said in interviews that the tendency of Instagram algorithms to aggregate child sexualization content from across its platform was known internally to be a problem. Once Instagram pigeonholes a user as interested in a particular subject, they said, its recommendation systems are trained to push more related content to that user.
Preventing the system from pushing noxious content to users interested in it, they said, requires significant changes to the recommendation algorithms that also drive engagement for normal users. Company documents reviewed by the Journal show that the company’s safety staffers are broadly barred from making changes to the platform that might reduce daily active users by any measurable amount.
The test accounts showed that advertisements were regularly added to the problematic Reels streams. Ads encouraging users to visit Disneyland for the holidays ran next to a video of an adult acting out having sex with her father, and another of a young woman in lingerie with fake blood dripping from her mouth. …
Even before the 2020 launch of Reels, Meta employees understood that the product posed safety concerns, according to former employees.
Part of the problem is that automated enforcement systems have a harder time parsing video content than text or still images. Another difficulty arises from how Reels works: Rather than showing content shared by users’ friends, the way other parts of Instagram and Facebook often do, Reels promotes videos from sources they don’t follow. …
The Journal informed Meta in August about the results of its testing. In the months since then, tests by both the Journal and the Canadian Centre for Child Protection show that the platform continued to serve up a series of videos featuring young children, adult content and apparent promotions for child sex material hosted elsewhere. …
After the Journal began contacting advertisers about the placements, and those companies raised questions, Meta told them it was investigating the matter and would pay for brand-safety auditing services to determine how often a company’s ads appear beside content it considers unacceptable.
Meta hasn’t offered a timetable for resolving the problem or explained how in the future it would restrict the promotion of inappropriate content featuring children. …
Share this article to raise awareness of the sinister content on social media.
(Excerpt from Fox Business. Photo Credit: Souvik Banerjee on Unsplash)