Investigation reveals Instagram's algorithm regularly suggests explicit content to users as young as 13 years old
A recent Wall Street Journal and Northeastern University (NEU) study has found that Instagram’s algorithm routinely suggests sexually explicit content to users as young as 13.
The study, conducted by Laura Edelson, a computer science professor at NEU, involved researchers setting up multiple Instagram accounts registered to 13-year-olds to monitor the content suggested by Instagram's Reels feed over a seven-month period ending in June.
Initially, the accounts received a mix of benign videos, such as comedy skits, car videos and stunt clips. But after several sessions, the content shifted to a steady stream of sexually suggestive videos, often depicting women dancing seductively or graphically describing their bodies.
In some instances, an account was exposed to adult-oriented videos and Instagram profiles within just a few minutes of being created, and within as little as 20 minutes its feed was saturated with such material. In some cases, Instagram even recommended videos marked as "disturbing" to teen accounts, which Meta attributed to an error.
The study showed that both teen and adult accounts received sexually suggestive content at similar rates, indicating that Instagram's recommendations do little to differentiate between minors and adults.
Meta denies accusations, but several reports reveal otherwise
Instagram parent company Meta dismissed the findings and claimed that they were part of "an artificial experiment that doesn't match the reality of how teens use Instagram."
"As part of our long-running work on youth issues, we established an effort to further reduce the volume of sensitive content teens might see on Instagram, and have meaningfully reduced these numbers in the past few months," said Meta spokesperson Andy Stone. (Related:
Former Meta director says Instagram FAILED to protect teen users.)
However, internal documents and external analyses tell a different story.
A 2023 investigation by the Wall Street Journal, together with researchers from the University of Massachusetts Amherst and Stanford University, revealed that Instagram not only connects pedophiles but also guides them to content sellers through its sophisticated recommendation algorithms.
Researchers found that Instagram's search functions and hashtag features further ease the discovery of underage sex content, including not only videos and images but also potential in-person meetings. After setting up test accounts, researchers observed that interacting with just one account associated with this network led to an overwhelming influx of recommendations for similar content.
"Instagram connects pedophiles and guides them to content sellers via recommendation systems that excel at linking those who share niche interests, the
Journal and the academic researchers found. Following just a handful of these recommendations was enough to flood a test account with content that sexualizes children," the report noted.
The researchers found more than 400 sellers of "self-generated child sex material," meaning accounts that claim to be run by the children involved. Some claimed to be as young as 12, and many used overtly sexual handles that made no attempt to hide their purpose. These accounts had thousands of unique followers, and many of the associated users discussed ways to access minors.
This investigation was supported by a 2022 internal Meta document reviewed by the Wall Street Journal. The document revealed that Instagram exposes younger users to more pornography, gore and other harmful and explicit content compared to adults. Teenagers reported higher rates of encountering bullying, violence and unwanted nudity. Specifically, teens saw three times more prohibited posts containing nudity, 1.7 times more violent content and 4.1 times more bullying content than users over 30.
Visit MetaTyranny.com to read more articles about the social media giant.
Watch Dr. Duke Pesta discuss how Instagram is targeting little children with its new app below.
This video is from the Counter Culture Mom channel on Brighteon.com.
More related stories:
New Mexico attorney general files lawsuit against Meta for allowing pedophiles on Facebook and Instagram to CONTACT CHILDREN.
Mark Zuckerberg’s Meta struggling to remove child sexual abuse materials from Facebook and Instagram.
Norwegian Data Protection Authority orders Meta to stop tracking Facebook and Instagram users for targeted advertising.
PEDO PLATFORM: Instagram’s algorithms promote a "vast pedophile network," report reveals.
EU authorities slap Meta with record $1.3B fine for data privacy law violations.
Sources include:
YourNews.com
LifeSiteNews.com
Brighteon.com