The Algorithmic Underbelly: How Social Media Fuels Online Sexual Offences

Social media platforms have revolutionised communication, but a dark side lurks behind the endless scroll. Algorithms, the engines curating our online experiences, can inadvertently create breeding grounds for online sexual offences. This article delves into how these systems contribute to a growing problem, drawing on recent UK media reports.

Echo Chambers and Desensitisation: Algorithms personalise content based on user engagement. This can create echo chambers, as highlighted by the London School of Economics: https://blogs.lse.ac.uk/medialse/2023/02/08/the-online-safety-bill-needs-more-algorithmic-accountability-to-make-social-media-safe/. When users see only content that reinforces their existing views, repeated exposure can desensitise them to harmful material and normalise unhealthy sexual behaviours or violence, as reported by the National Society for the Prevention of Cruelty to Children (NSPCC): https://learning.nspcc.org.uk/news/2024/january/online-harms-protecting-children-and-young-people.
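
To make the feedback loop concrete, here is a deliberately simplified Python sketch. Everything in it is an illustrative assumption, not any platform's actual system: content is reduced to a single "topic" number, and predicted engagement is modelled as closeness to the user's current interests.

```python
"""Toy model of an engagement-driven feedback loop (illustrative only)."""
import random

random.seed(1)

# A pool of 500 content items, each tagged with a topic position in [0, 1].
pool = [random.random() for _ in range(500)]

profile = 0.5      # the user's starting interest position (an assumption)
seen = set()

for session in range(10):
    # Rank the pool by predicted engagement: items closest to the
    # user's current interests come first.
    feed = sorted(pool, key=lambda topic: abs(topic - profile))[:20]
    seen.update(feed)

    # The user engages with the top of the feed, and the profile
    # drifts toward what was consumed -- the feedback loop.
    consumed = feed[:5]
    profile = 0.8 * profile + 0.2 * sum(consumed) / len(consumed)

# Despite 500 available items, the loop keeps re-serving the same
# narrow slice: the user's effective content universe stays tiny.
print(f"items ever shown: {len(seen)} of {len(pool)}")
```

Running this shows the feed locking onto roughly the same twenty items out of five hundred, which is the echo-chamber dynamic in miniature: the ranking objective never has a reason to show anything outside the user's established interests.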


Predatory Targeting: The same personalisation algorithms that curate content can be exploited by predators. By tracking user behaviour and interests, perpetrators can target vulnerable individuals, especially children, with grooming tactics or age-inappropriate content. Ofcom (the Office of Communications) recently addressed this issue in its plan to tackle online child sexual abuse and grooming: https://www.ofcom.org.uk/news-centre/2023/tech-firms-must-clamp-down-on-illegal-online-materials.


The Content Conundrum: Social media platforms struggle to strike a balance between freedom of expression and content moderation. Algorithms designed to remove harmful content, as mandated by the Online Safety Act: https://www.gov.uk/government/news/online-safety-law-to-be-strengthened-to-stamp-out-illegal-content, can be overly broad, silencing legitimate conversations about sexuality, as argued on the London School of Economics blog: https://blogs.lse.ac.uk/medialse/2023/02/08/the-online-safety-bill-needs-more-algorithmic-accountability-to-make-social-media-safe/. Conversely, inadequate filtering allows explicit material to circulate freely, raising concerns about the influence of pornography on harmful sexual behaviour among children, as reported by the Children’s Commissioner for England.


The Rabbit Hole Effect: Recommendation algorithms often prioritise engagement over safety. This can lead users down a rabbit hole of increasingly graphic or exploitative content, exposing them to material they would not otherwise have encountered, as highlighted by the NSPCC: https://learning.nspcc.org.uk/news/2024/january/online-harms-protecting-children-and-young-people. This is particularly concerning for young people still exploring their sexuality.
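
The escalation dynamic can also be sketched in a few lines. The sketch below is purely illustrative and rests on one stated assumption: that within this toy topic space, more intense content attracts slightly more engagement. Under that assumption, a recommender that optimises engagement alone drifts steadily toward more extreme material.

```python
"""Toy model of the 'rabbit hole' effect (illustrative assumptions only)."""

def predicted_engagement(intensity):
    # Assumption for illustration: engagement rises with intensity.
    return 0.1 + 0.8 * intensity

position = 0.1                  # the user starts on mild content
for step in range(8):
    # Candidate items are 'adjacent' to what the user last watched.
    candidates = [position - 0.05, position, position + 0.05]
    candidates = [min(max(c, 0.0), 1.0) for c in candidates]

    # An engagement-only objective always prefers the more intense
    # neighbour, so recommendations escalate step by step.
    position = max(candidates, key=predicted_engagement)
    print(f"step {step}: recommended intensity = {position:.2f}")
```

No single step looks dramatic, which is exactly the problem: each recommendation is only marginally more intense than the last, yet the cumulative drift is substantial.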


What Can Be Done?

There is no easy solution, but several measures would help:

  • Social media platforms need to invest in robust content moderation systems that can identify and remove harmful content while allowing for open conversation.
  • Algorithmic transparency is crucial. Users should understand how algorithms curate their feeds and have options to personalise settings for content safety (a simplified sketch of such a setting follows this list).
  • Education is key. Equipping users, especially children, with digital literacy skills is essential to navigate online spaces safely. Open communication about healthy sexuality and online dangers is vital.
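
The sketch below illustrates how a user-controlled safety setting could interact with ranking. It is a minimal illustration of the idea, not any platform's real system: the harm probabilities stand in for the output of a moderation classifier, and the names, scores, and thresholds are all assumptions made for the example.

```python
"""Sketch of safety-aware re-ranking with a user-controlled setting."""
from dataclasses import dataclass

@dataclass
class Item:
    title: str
    engagement: float   # predicted engagement, 0..1 (assumed)
    harm_prob: float    # a moderation classifier's harm estimate, 0..1 (assumed)

def rank(items, safety_weight):
    """Order items by engagement minus a weighted harm penalty.
    Items above a hard harm threshold are removed outright."""
    eligible = [i for i in items if i.harm_prob < 0.9]
    return sorted(eligible,
                  key=lambda i: i.engagement - safety_weight * i.harm_prob,
                  reverse=True)

feed = [
    Item("cooking video",        engagement=0.4, harm_prob=0.01),
    Item("borderline content",   engagement=0.8, harm_prob=0.60),
    Item("clearly harmful post", engagement=0.9, harm_prob=0.95),
]

for w in (0.0, 1.0):
    print(f"safety_weight={w}:", [i.title for i in rank(feed, w)])
```

Two design points are visible even in this toy: clearly illegal or harmful material is blocked outright rather than merely down-ranked, while the user-facing safety weight governs how borderline content is traded off against engagement; the latter is the kind of setting that algorithmic transparency would let users see and adjust.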

Social media algorithms are powerful tools, but left unchecked, they can create hazardous environments. By acknowledging the risks and working towards solutions, as outlined in recent UK media reports and policy efforts, we can make online spaces safer for everyone.

Report indecent images and videos of children here! Reporting is quick, easy and anonymous. It can lead to the removal of criminal content and even the rescue of a victim of sexual exploitation from further abuse.