Guest blog: AI has made online sex work a lot more dangerous

Image by the brilliant Stuart F Taylor

There are so many ways AI is affecting our lives it can sometimes be difficult to keep up. I’ve written before about how depressing it is that people want to use robots to write porn, but there are even more sinister uses of AI tools out there. I’m very grateful to today’s guest blogger, Kate O’Kelly, for this deeply troubling piece about AI image recognition tools, and the danger they pose to people doing online sex work.

AI has made online sex work a lot more dangerous

The oldest profession in the world has always been stigmatised and it’s become even riskier lately, thanks to AI.

I’ll never forget a man coming into my cam room in 2020 and writing, ‘I know you’. I was instantly terrified that I was about to be outed by a colleague at my day job as a journalist, but it never happened. I later spoke to another model who said that it’s a common occurrence for men to pretend to know models to scare them – getting off on our reactions.

Sadly, this has now taken a more sinister turn because of AI – countless models in my agency are being shaken by anonymous clients coming into their chats and calling them by their real names. It’s an almost weekly occurrence. Some clients even make it clear that they have done serious research into the models’ personal lives.

The dangers of this can’t be overstated.

I assumed that camming would be a get-rich-quick scheme when I started in 2019 and that no one would ever find out. As a journalist on a low wage, I needed the cash. Someone more experienced told me it was inevitable that I’d be found out and I brushed it off – until it happened. My so-called best friend told everyone after my mental health declined for reasons unrelated to webcamming, falsely assuming, like so many people, that sex work is inherently exploitative and harmful.

What followed was the most traumatic few weeks of my life, and it took me years to eventually repair my relationship with my family. I never thought I’d risk it again, but I had no choice when AI largely took my day job in 2024 and I was forced to take a massive pay cut. Facing losing my home and serious financial difficulties, I contacted my old agency. Its well-meaning owner claimed that AI tools would give me more anonymity this time around and created a profile for me with fake images that resembled my body type and hair colour.

While AI is now banned on the camming site I use, there is no way of stopping the clients from using advanced facial recognition tools to easily discover models’ identities with nothing more than an undetectable screenshot. And the tools are getting better by the day.

I regularly reverse search my face to ensure none of my streams have been leaked. Mercifully, these occurrences are few and far between, as the website I use does not stream to other adult sites. But to put how advanced the technology is into context, streams that I assumed would be long gone from when I started out on less secure sites in 2019 are now appearing in the results – with little more of me in them than the bottom half of my face.

The ability for clients to identify models with screenshots of private calls using AI isn’t mentioned by agency owners trying to find new recruits on platforms like TikTok. With more and more people losing their jobs to AI and struggling with the cost of living, this upsets me because the consequences of being found out can be devastating.

As I’ve already been outed, and I trust all the friends who know I’ve returned to camming, I am not a good target for blackmail or anything similar. If someone were to leak a stream along with my real identity, I’d simply say it was from my first camming stint, which everyone knows about. But I don’t want that to happen either, as my real identity is in no way associated with sex work and I am on the hunt for a new, vanilla career.

But while the subject isn’t mentioned on TikTok, the problem is all over Reddit. One model said that in a bid to protect herself, she has deleted all her personal photos from the internet and ‘never uses’ her real name anywhere. Others have said that they have resorted to wearing masks, but as I was identified by AI facial recognition with nothing more than half my face in a stream, I’m not sure how effective this would be.

Summing up the problem, users agreed that the spread of this technology can’t be stopped. One wrote: ‘You can’t control that there are more and more technological tools to stalk women. If you’ve chosen to show your face at [webcamming], as almost all of us do, you can’t control the porn half of your life. What you can control is the other half: not putting photos of yourself online under your real name.’

As implied above, even outside of sex work, these tools are harming women. On dating apps, men can find out everything about a potential date, including their workplace, with nothing more than a quick search.

While I’ve got less to lose than other models, and I’m comfortable in the knowledge that the work is legal, I still find the prevalence of AI unsettling. I would make considerably less money if I never showed my face, so the best I can do to protect myself is to never show my face and nudity at the same time. I don’t want a stalker – or my naked body leaked with my real name. I’ve been harassed as a journalist and I know that websites like X do little to protect people.

What’s more, I’ve noticed that an increasing number of men on webcam ask to see my whole face with nudity, but I never do it. I’ve a horrible feeling that they want it for nefarious reasons. Others simply ask to see my face up close, and I’m scared that they want it to find my real identity with AI – for as little as £1 a minute.

I don’t know what the solution is, but at the very least, the companies offering AI searches of people’s faces should – from an ethical standpoint – not include explicit material, regardless of whether it is of Bonnie Blue or someone doing online sex work as a side hustle.

While I intend to eventually retire from webcamming for good, it’s quite literally keeping me afloat after AI largely took my day job. I make between £300 and £500 a week for around two hours of streaming at night, and I enjoy the work ninety percent of the time. I just wish that I wasn’t left fearing the very real possibility that a man will come into my room and call me by my real name.


1 Comment

  • Jaimie says:

    This is horrific but not surprising. The bitter irony is that someone who turns to cam work to pay the bills because of AI then has the same tech turned against her again.
    I didn’t know about reverse image searching, etc., until someone stalked me on Instagram (just one of many reasons why I’m no longer on Insta or Facebook). It scared the shit out of me, but fortunately, my account wasn’t promoting adult content, so there wasn’t that blow-back as well. I no longer have any social media for my ‘regular’ life (really don’t miss it either).
    Somebody will be able to demonstrate some positive outcomes from AI at some point, I suppose. In the meantime, I’m just going to keep hating on it with every fibre of my being.
