Instagram Videos Sexualizing Children Shown To Adults Who Follow Preteen Influencers | ZeroHedge

In June, we noted that Meta's Instagram was caught facilitating a massive pedophile network, with the service promoting pedo-centric content to pedophiles who used coded emojis, such as a slice of cheese pizza.

According to the Wall Street Journal, Instagram allowed pedophiles to search for content with explicit hashtags such as #pedowhore and #preteensex, which were then used to connect them to accounts advertising child-sex material for sale from users going under names such as "little slut for you." And according to the National Center for Missing & Exploited Children, Meta accounted for more than 85% of child pornography reports, the Journal reported.

Companies whose ads appeared next to inappropriate content included Disney, Walmart, Match.com, Hims and the Wall Street Journal itself.

And yet, no mass exodus of advertisers…

They haven't stopped…

According to a new report by the Journal, Instagram's 'Reels' service – which shows users short video clips the company's algorithm thinks they will find of interest – has been serving up clips of sexualized children to adults who follow preteen influencers, gymnasts, cheerleaders and other categories that child predators are attracted to.

A separate experiment run by the Canadian Centre for Child Protection produced similar results.

Meta defended itself, telling the Journal that the tests were a 'manufactured' experience that doesn't represent what billions of users see. That said, the company refused to comment on why its algorithms "compiled streams of separate videos showing children, sex and advertisements." It did say it gave advertisers tools in October to provide greater control over where their ads appear (so it's their fault since October?), and that Instagram "either removes or reduces the prominence of four million videos suspected of violating its standards each month."

The Journal contacted several advertisers whose ads appeared next to inappropriate videos, and several said they were 'investigating,' and would pay for brand-safety audits from an outside firm.

Meta's 'Reels' feature was designed to compete with TikTok, a video-sharing platform owned by Beijing-based ByteDance that likewise features a nonstop flow of user-posted videos monetized by inserting ads among them.

According to the report, experts on algorithmic recommendation systems said the Journal's testing showed that while gymnastics, for example, might appear to be a benign topic, Meta's behavioral tracking has determined that some Instagram users following preteen girls also want to engage with videos sexualizing children – and then serves those users that content.

And despite both the Journal and the Canadian Centre for Child Protection informing Meta in August about their findings, "the platform continued to serve up a series of videos featuring young children, adult content and apparent promotions for child sex material hosted elsewhere."

So, more catering to pedos. And advertisers don't seem to care.
