Instagram home to ‘vast pedophile network’: Wall Street Journal
Instagram has become a haven for large, organised pedophile networks and its algorithms even promote child porn, according to a bombshell report published last week in the Wall Street Journal.
The Journal’s 2,800-word tour de force, the result of collaboration with researchers from Stanford University and the University of Massachusetts Amherst, was titled simply “Instagram Connects Vast Pedophile Network”. Its revelations were damning.
Innumerable accounts “openly devoted to the commission and purchase of underage-sex content” are currently active on the platform, thanks to Instagram enabling explicit hashtags like #pedowhore and #preteensex. The Meta-owned site even directed users to accounts that advertise child-sex material for sale.
Especially popular with teens, Instagram is estimated to have more than 1.3 billion users.
Among those to take notice of the report was Twitter CEO Elon Musk, who retweeted a screenshot of the Journal’s headline with a two-word caption — “Extremely concerning” — garnering almost 50 million views.
As the Brownstone Institute’s Jeffrey Tucker highlighted, Instagram’s pedophile ecosystem thrived even as the platform went on a Covid-era crusade against accounts that questioned masks, lockdowns and vaccines.
Summarising the scope of Instagram’s dereliction, the Journal reports:
Pedophiles have long used the internet, but unlike the forums and file-transfer services that cater to people who have interest in illicit content, Instagram doesn’t merely host these activities. Its algorithms promote them. Instagram connects pedophiles and guides them to content sellers via recommendation systems that excel at linking those who share niche interests.
The Instagram accounts that sell such content “often claim to be run by the children themselves and use overtly sexual handles incorporating words such as ‘little slut for you’.”
By using hashtags associated with underage sex, the researchers found 405 accounts purportedly run by children as young as 12 that offered “self-generated” child-sex material. Tens of thousands of unique users followed such accounts.
The report continues:
Certain accounts invite buyers to commission specific acts. Some menus include prices for videos of children harming themselves and “imagery of the minor performing sexual acts with animals,” researchers at the Stanford Internet Observatory found. At the right price, children are available for in-person “meet ups.”
Pedophile-run accounts also use hashtags of their own, such as #pedobait and variations of #mnsfw (“minor not safe for work”), and mixed “brazenness with superficial efforts to veil their activity,” according to the Journal. Pedophiles also use emojis as code, such as the map emoji as shorthand for “minor-attracted person” or an image of cheese pizza, which shares its initials with “child porn”.
One of the report’s lead researchers was David Thiel, chief technologist at the Stanford Internet Observatory, who had previously worked on security and safety issues at Meta. He said that “Instagram’s problem comes down to content-discovery features, the ways topics are recommended and how much the platform relies on search and links between accounts.” He added, “You have to put guardrails in place for something that growth-intensive to still be nominally safe, and Instagram hasn’t.”
Another problem, according to researchers, was Meta’s inability to scan for new child exploitation images, since its automated screening software is only designed to detect already-circulating material.
Additionally — inexplicably — Instagram presented users who searched for illegal material with a pop-up screen warning that “these results may contain images of child sexual abuse,” yet still provided the option to “see results anyway”. After the Wall Street Journal questioned Instagram about that option, the company removed it without explaining why it had been offered in the first place.
Meta’s own employees who have worked on Instagram child-safety initiatives estimate the number of accounts that exist primarily to follow child-sex material to be in the “high hundreds of thousands, if not millions,” according to the Journal.
Meta told the Journal that it has taken down 27 pedophile networks over the last two years and has plans for more removals. In January alone, it also cancelled almost 500,000 accounts for violating its child-safety policies. Since the Journal began its inquiries, Instagram “has blocked thousands of hashtags that sexualize children, some with millions of posts, and restricted its systems from recommending users search for terms known to be associated with sex abuse”.
To any common-sense observer, however, these efforts are too little, too late.
In the words of MercatorNet Editor Michael Cook, “Here we have the world’s most advanced technology and the smartest guys in the room” and yet the result is an enterprise that “would make Dr Josef Mengele blush”.
The platform of choice for global influencers currently enjoys an influence on global society that should make every one of us shudder.
Where to from here, Meta?