Welcome, friends, to my grubby little corner of the internet. A corner so strewn with obscenity that the UK government has decided you must prove you’re a grown-up before you can access certain parts of it. The UK’s new Online Safety Act has come into force, so UK people might have noticed a bunch of websites suddenly demanding you take a selfie, share your credit card details, or jump through another hoop to prove that you’re over 18. Quite a few of my friends have been discussing this in the pub, because for understandable reasons people who aren’t embedded in the world of online pornography or internet law are suddenly curious about why the internet is now so very broken. They’re also often convinced that the government will change its mind and therefore no one really needs to worry. I’ve had this conversation so many times now that I reckon I’ve got the basis for a fairly solid layperson’s guide to age verification: what it is, how it affects you, and why we absolutely, genuinely do need to worry.
What is the Online Safety Act?
The Online Safety Act is a piece of legislation designed to do many things, but the thing we’re most concerned with here is ‘protecting the children’. The Act aims to protect children from various ‘online harms’ including things like eating disorder inspiration content, content which might tell them how to harm themselves or others, and – ’twas ever thus – pornography.
Much of the content that falls into these categories – the latter especially – is completely legal to produce, and legal for adults to enjoy. However, the government wants to stop children from stumbling across it accidentally, so it is attempting to compel websites to add age-gates – to their sites as a whole if they’re adult, or to various functions and sections if they contain a mix of content. BlueSky users in the UK were recently asked to prove their age to see any posts marked as ‘adult’, as well as use the DM function, and now even sites like Spotify have had to implement it, because some of the music videos they show are 18+ only.
Protecting children seems like a good idea! What’s the harm in age verification?
This is a big question so let’s break it down.
Privacy
This is the biggest one from an individual user perspective. Firstly, in order to prove your age you’re being asked to hand over some fairly important personal details: in some cases a photo of your face, in others your credit card details, a passport or a driver’s licence. Incidentally, I know a lot of people who are saying ‘I just uploaded the selfie to age verify because then I’m not giving them much data’ – I beg to differ. You know who else uses selfies (or, to use the technical term, ‘biometrics’) to verify who you are? Actual banks! I use my face to approve transactions in the NatWest app, and I’m not the only one. Selfies are still data, and these days that’s important data, my friends!
Usually the company you’re handing these details to is a third party, often one you will never have heard of before. You’re being asked to just trust that they’ll delete whatever you sent after having verified your age. Do you? Do you believe it will work 100% of the time? How often do you get emails from companies explaining they’ve been the subject of a hack and your personal data has been exposed? Do you still feel safe handing these personal details over, often via a site where you’re viewing things which you’d prefer to remain private?
The data that is being collected for age verification purposes is extremely tempting to hackers, some of whom have already had great success targeting ID verification data from companies like Uber and TikTok. And for most sites, you don’t get a choice about who verifies you. The Open Rights Group has already highlighted the fact that Reddit, Grindr and BlueSky have selected age verification companies with problematic privacy policies, and at the moment there is no specific regulation outlining the security standards that these companies should meet in order to qualify to process this extremely personal information about you – only existing data protection law. Is the company asking for your ID legit? Well, you just have to take that on trust.
Let’s say all the current age verification providers are incredibly robust, though. They all have brilliant security, adhere to the highest standards of care and competence when dealing with your private data (stop laughing at the back!), and you have nothing to worry about as an individual when it comes to your details getting leaked or hacked. The question still remains… should you be sharing this information with random websites anyway? After all, the companies may be as legit as they claim, but once you’ve trained the population of an entire country to routinely hand over their credit card details in order to access content, you have given them an incredibly bad habit that it’s going to be tough to break. A habit which bad actors (like those who want to set up a new phishing scam, for instance) will be ready and willing to exploit. You don’t just prove your age once, after all, you potentially have to do it dozens of times, to access a bunch of different websites. Everything from BlueSky to PornHub to Spotify and even maybe Wikipedia. It becomes a weekly or perhaps monthly occurrence. Just as individual users don’t tend to read every website’s terms and conditions, it’s unlikely they’re all going to do due diligence checks on every provider who asks for ID, especially once they’ve become used to just handing that data over.
And although that may not be a problem for you, you tech-savvy cleverclogs, if you’ve ever found yourself in the position of unpaid IT support for one of your less knowledgeable friends or relatives, hopefully you can see why it’s a huge problem for the UK population more broadly.
Censorship of non-porn material
If you wondered why I mentioned Wikipedia in the section above, and thought ‘how the hell can an online encyclopaedia count as porn?!’, let me explain. It’s not because if you look hard enough on Wiki you can find the occasional dick, it’s because Wikipedia involves community editing. And where there is this kind of collaboration, there is the potential for users to send each other content that falls under the scope of the Act. And therefore Wiki – famously very wary of collecting data from its anonymous contributors – must now insist that the unpaid volunteers who edit the site hand over the kinds of personal info mentioned above. Wikipedia is challenging this in court, because they believe (I think correctly) that this will lead to greater censorship of the platform – wherever you require people to hand over ID, you reduce the opportunities for people from marginalised or persecuted groups to contribute. A spokesperson for Wikipedia said that insisting on age verification could leave users open to “data breaches, stalking, vexatious lawsuits or even imprisonment by authoritarian regimes”.
Wiki isn’t the only site concerned about censorship, though, and early implementation of age verification in the UK shows that it – like any attempt to regulate ‘porn’ – ends up catching a lot of other content too. Reddit users have already begun documenting subreddits that have been age-gated, including support forums for sexual assault survivors and help on how to quit smoking. On top of this, a tonne of LGBTQ+ content and sex education has already been caught in the net. Expect much much more of this to happen going forward.
This isn’t a question of just getting the government to write in exemptions, either. Although Wikipedia may earn an exemption through the court case (I hope it does), one of the core problems with the Online Safety Act in implementation is that the definitions are incredibly broad and the penalties are potentially extremely harsh. Websites need to assess whether they have a ‘significant number’ of UK users – what’s significant? 10% of total traffic? 10 people? 10,000 visitors per month? They also need to consider whether it’s likely to be accessed by children. What does that mean – ‘likely’? Sites which aren’t marketed to children or shared in any spaces where children are likely to be browsing… are they exempt? We don’t know. What we do know, however, is that sites which do not comply will be investigated by Ofcom, and potentially fined up to 10% of their annual revenue or £18 million – whichever is greater. The chilling effect of penalties like this, especially when combined with ‘guidance’ from the regulator that could generously be described as ‘vague’, means that any site with any content that could potentially be classed as ‘harmful to children’ would be taking a giant leap into the expensive unknown if they didn’t proactively comply. And compliance with ‘age verification’ is costly and time-consuming: I personally can’t afford to do it, which is why I’ve just blanket blocked UK users from hearing the audio. Many other sites – both adult and non-adult – are coming to the same conclusion. Check out the Blocked page, from the Open Rights Group, which is tracking site closures and blocks as a result of the Act. Submit any sites you know of that are doing this too – let’s keep track of what we’re losing.
So there’s another harm: you’re not just losing access to this content unless you hand over private details, in many many cases (particularly with smaller sites and services) you’re losing access to it entirely. Even flashing your passport won’t get you to the content, because the site owner can’t afford to hire a bouncer to check your ID.
Transferring power from small porn sites to the big players
So, since July 25th most porn is now either age-gated or blocked to UK users entirely. You might think this is good, because isn’t that the point of the law? Ehhhh. Kind of. But also not really. The sites that child safety campaigners are usually thinking of when they make these arguments are the big, flash, video-based tube sites which offer porn for free. Those sites have the money and tech expertise to implement age verification without trouble, so they aren’t going anywhere any time soon. At the other end of the scale, though, smaller sites and hobby sites which don’t make a profit are more likely to struggle. As mentioned above, many of them are already shutting down or blocking UK users from some or all of the content, meaning those who are looking for erotic material will end up turning to the exact sites that anti-porn crusaders are so up in arms about.
As I say, this probably isn’t going to bother you that much if you think all porn is bad by default. But if you are an adult who recognises the value of legal erotic media (in whatever form), then hopefully you can see why this is a chilling situation. The UK government’s implementation of AV, without any exemption or concession for small sites, essentially means that those with the deepest pockets will get the most traffic. That means the large, ‘free’ porn tube sites – already an extremely dominant force in the adult industry, hoovering up a lot of the money and even shaping how we define ‘porn’ in the first place – will only become bigger and more powerful. Meanwhile those smaller sites trying to swim against the tide, offering a view of sexuality that is broader and more diverse than what you see on the front page of TubeFuck will struggle to get traction. As TechDirt put it this week:
“This is exactly what happens when you regulate the internet as if it’s all just Facebook and Google. The tech giants can absorb the compliance costs, but everyone else gets crushed.”
Blocking UK users means that small, independent sites are harder to find. They will lose traffic, and therefore revenue (for further marketing) as well as word of mouth. What’s more they will drop down in search algorithms because when users click through they won’t see the content they have searched for. Example: my own site has already seen a decrease in search rank since I had to block users from hearing the audio porn back in March. For understandable reasons, if someone in the UK searches ‘audio porn‘ and they get sent to a site where none of the audio plays, they’re more likely to click away and not come back. So Google downranks me. This rank isn’t something that can easily be regained, and it affects people outside the UK too – if sites are forced to block or restrict certain users, those sites may be recommended less frequently in algorithms, so you’re less likely to come across them too. You can help with this by sharing links that you like, or supporting your favourite porn creators financially, but ultimately you as a user don’t have much say in what stays and what goes from the internet. The sites which can afford to age verify will stay, and those which can’t will eventually go. Leaving only the big players – the ones anti-porn campaigners and child safety advocates use as their examples when they talk about the ‘harms’ of porn in the first place.
I guess at least kids won’t be able to see the sexy shit though, right? … Right?!
Training users to bypass filters and blocks
This part is going to be tricky, so bear with me. The UK regulator Ofcom has released guidance to sites on how to comply with the Online Safety Act, and they’ve expressly said that sites like mine should avoid encouraging people to use certain tools to bypass blocks. I’m interpreting that to mean that I also shouldn’t tell you how to do it. It would be absurd for me not to be allowed to acknowledge that it’s possible, though. I personally use a [TOOL I AM NOT ALLOWED TO MENTION] on a daily basis, because I have a block in place to stop UK users from listening to the audio on my site, so without one of these tools I wouldn’t be able to bypass my own site block and therefore do my job. Lol.
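(For the curious: the block itself is nothing exotic. Here’s a minimal sketch of how a server-side country block might work – not my actual code, just an illustration, and it assumes the site sits behind a CDN such as Cloudflare, which attaches a two-letter country code to each request in its `CF-IPCountry` header.)

```python
# Hypothetical sketch of a server-side geo-block for audio content.
# Assumes a CDN/proxy (e.g. Cloudflare) attaches an ISO 3166-1
# country code to each incoming request via the CF-IPCountry header.

BLOCKED_COUNTRIES = {"GB"}  # "GB" is the ISO 3166-1 code for the UK


def should_serve_audio(request_headers: dict) -> bool:
    """Return False for requests geolocated to a blocked country."""
    country = request_headers.get("CF-IPCountry", "").upper()
    return country not in BLOCKED_COUNTRIES


# A UK visitor still gets the text of a story, but no audio player:
should_serve_audio({"CF-IPCountry": "GB"})  # → False
should_serve_audio({"CF-IPCountry": "US"})  # → True
```

Note that IP geolocation is imprecise and – as the rest of this section explains – trivially sidestepped, which is rather the point.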
What’s more, I’ve noticed an uptick in people playing the audio on my site since the Online Safety Act came into force. I’m not sure it’s statistically significant so I’m keeping a close eye on it, but that stat gels with news I’ve read elsewhere about these tools suddenly seeing a huge spike in both online searches and downloads from app stores. If more people are bypassing blocks, more people will presumably be bypassing my block too. Friends with whom I have chatted about this in the pub tend to move swiftly from ‘isn’t the law an ass?’ to swapping recommendations for these kinds of tools so I think it’s safe to say that the Online Safety Act itself has done far more to educate UK users about them than my own little site ever could.
I’m not encouraging you to use these, but I do need to mention them in order to say that I think any law which drives consumers to try and bypass content filters is generally a bad idea. Why are you making it so that the population of your country has to bypass blocks… in order to access perfectly legal material? Some content blocks are in place for a very good reason, for instance to prevent people being exposed to child sexual abuse material, terrorism material, and other types of content which are genuinely illegal. Last week Labour MP Peter Kyle told us that “If you want to overturn the Online Safety Act you are on the side of predators. It is as simple as that.” But it’s not as simple as that at all – perhaps the opposite. There are perfectly legal tools which will get you round the age verification requirements imposed by the UK government, but most people didn’t bother using them before because… why would they? Now that large chunks of the internet are functionally broken for UK users, more of them will use these workarounds, and many will now be in a space where even content which is blocked because it’s genuinely illegal (and harmful to adults, as well as children) is easier to access. For the same reason I am noticing an uptick in people listening to my (legal, ethical) audio porn since July 25th, I imagine sites which host genuinely illegal material will see the same.
On top of this, it’s worth noting that those who say ‘ah I’m OK, I can just use a [TOOL I AM NOT ALLOWED TO MENTION]!’ often forget that the knowledge – of what these are and how to use them – is not something everyone has. By nudging UK users towards these tools, the government has essentially created a two-tier internet. Some adults, those who are tech-savvy and who can afford it, can continue to browse without handing over their personal data or putting themselves at risk of the hacks and scams mentioned above. Others, who don’t have the knowledge, money or time to investigate these things, are exposed to significant privacy risks. Again, worth remembering: these are risks they’re being pushed to take to access perfectly legal material.
Breaking accessibility
This is a personal bugbear of mine, and I wish more people would take note of it: because of the way the law defines ‘porn’, text content is out of scope of the Online Safety Act while audio content is in. That means sites like mine which offer audio as an alternative to text (for blind readers, and anyone else who’d otherwise use a screenreader) have been forced to break important accessibility functions in order to comply with the law. Feels pretty crappy, right? When I discuss this issue in the pub with my friends, people are genuinely gobsmacked by the idea that it is OK for me to publish erotic stories when they’re written down, but not OK for me to post the accompanying audio. Even more absurd when you realise that AI text-to-speech generators exist, and can read my words aloud (but without a lot of the tone, some of which is often valuable in conveying things like consent and enthusiasm), but ‘paying other creators to record the work into a mic’ magically turns that content into something that is harmful to children.
Isn’t all this worth it if it protects children, though?
Ehhhh. Hmm. This is debatable. I’m going to have a crack. Firstly, it’s not entirely clear that this new law will protect children. Sure, it might prevent some children from stumbling across porn by accident, or being sent TubeFuck links by shitty kids in their class at school who want to show a blow job video for shock value in the group chat, but there are other factors at play that make this question a lot more complex.
- If parents and carers believe the internet has now been successfully child-proofed, are many of them going to relax how they monitor their child’s usage of the internet, and no longer feel the need to educate them about safe browsing habits or use the filtering tools that are readily available on their kids’ devices and connections?
- Can children bypass the age checks anyway? The answer to this is already ‘yes, and people have shown how this is done in some quite hilarious ways’, but I’m not allowed to go into detail about them here.
- Is the harm children might face from accidentally seeing some adult content greater than the harm to those children as they grow into adults who have been trained to upload their personal data to any website that asks for it?
This last point is the one I find most compelling, personally. The thing about kids is that they turn into adults, and I think it’s more important for us to gift the adults they will become with a free and open internet, where people have access to sex education content, LGBTQ+ content, and erotic material rather than persistent digital surveillance. I obviously don’t want kids to watch porn, it’s not for them, but I also don’t think that this law will do much to prevent that from happening – certainly not enough to outweigh the massive harms caused to all of us by implementing age verification as it stands.
So how do we stop kids from seeing porn, if not by age verification?
We actually already have measures to deal with this: back in 2011 the government worked with ISPs (internet service providers) to come up with a Code of Practice on implementing ‘parental controls’ for all new customers. In 2013 this was adopted by all the major players. So when you (an adult – because you have to be over 18 to do this) register for an internet connection, you are offered adult content filtering by default. You can tweak this, if you like, for example you can decide you’re happy for your family to access social media sites but not pornography. Or if you don’t anticipate any children using your connection, you can opt out of adult filters altogether. Research conducted in 2022, however, found that although 61% of parents were aware of these filters, only 27% actually used them. Again, sing it with me: lol.
The kind of filter that the UK government is attempting to apply to the whole internet – the one that means you have to hand over private data in order to look at perfectly legal content or join in discussions on subreddits or even… watch music videos on Spotify! – that technology already exists. You can apply it at the ISP level, and it is much much better and more comprehensive than anything the government could implement across the entire web. Everyone in the UK who signs up to a new internet connection is offered this kind of filter. There are additional controls available at a device level too – apps and services that limit a child’s internet use, even allowing the parent/guardian to implement granular detail such as blocking individual pages and sites. Internet filters already exist! Parents have had access to these tools for a long long time!
If we want to stop children from accessing porn I’d argue that we should use… them. Use the tools we already have, make sure that parents and guardians are aware of them, and provide education on how they can be used. If those adults don’t know how to do this, that’s a skill/education issue, and one that can and should be solved by giving them better information and guidance – not by blanket age-gating the entire internet, preventing every adult in the UK from accessing perfectly legal content on their own connections, many of which will never be accessed by children to begin with.
Kids exist in the world, and that world has many dangers. It’s important to protect them from those dangers where we can take reasonable and proportionate steps to do so. That’s why we fence off playgrounds, insist on DBS checks for adults working with children, make kids wear cycle helmets when they ride their bikes, etc. But ‘reasonable’ and ‘proportionate’ are key here. What we don’t do is insist that the entire world be made a space as safe as a kids’ playground to the detriment of adults who just want to live their lives. We don’t make pavements out of kid-safe squishy tarmac in case they fall over, insist on DBS checks for all adults in case they ever encounter a child in the wild, or tell drivers to hit the brakes and wait for a child to disappear over the horizon if they spot one nearby on a bike. We make kids’ spaces safe for them, so there are fenced off areas where children can play and parents can relax a little, but at the same time acknowledge that we can’t bubble-wrap the entire world, because adults need to move about that world as well.
This law breaks the internet, destroys people’s online privacy, kills independent websites, wrecks accessibility, trains users to routinely use tools that will bypass government blocks of genuinely illegal content, and offers less protection to children than the tools that are already freely available from every major ISP. That is the harm. And, as I said in the intro, I think it’s one that we should definitely be worried about.
This age verification thing won’t last long, will it?
One of the biggest and most frustrating questions I get asked in the pub about this is: when will the government repeal it? Or at least stop enforcing it?
For the last decade or so, every time the UK government has floated an idea to censor the internet, when I explain to friends and loved ones what the plans are, they all scoff and say ‘but that’s ludicrous! It’ll never happen’ or something along those lines. Now that it actually has happened – arguably in an even worse form than the Digital Economy Act which was its last incarnation – people tell me ‘well this is ludicrous! It can’t last long!’ and I have to smash my head into the table and get my face all sticky with the cider I spilled while angrily gesticulating as I explained the points above.
At the time of writing, a petition to ‘repeal the online safety act’ has almost 500,000 signatures. The people who scoffed at this when it was only affecting small sites are now paying attention because now they are getting a taste of its impact on the wider internet – being compelled to take a selfie to access BlueSky DMs, or Reddit, or whichever big site it might be. And great, I’m super glad people are paying attention now! Unfortunately, repealing a law is a pretty complex process, and it isn’t one the government is especially interested in doing in this instance. So far its response to the petition has been a big fat ‘no’ with a side order of ‘shut up, perverts.’ More worryingly, the response reads to me as if any attempt to ‘take people’s concerns on board’ will come in the form of guiding Ofcom in how to enforce, but in a vague and generic way with no clarity whatsoever. For example:
“Ofcom will take a sensible approach to enforcement with smaller services that present low risk to UK users, only taking action where it is proportionate and appropriate, and will focus on cases where the risk and impact of harm is highest.”
What do they mean by ‘sensible’? What counts as a ‘smaller service’? What’s ‘low risk’? What is ‘proportionate and appropriate’ enforcement? How do I as a small site owner determine the ‘impact’ of harm? I’ve diligently done my ‘illegal content risk assessment’ and ‘children’s access risk assessment’, as mandated by the law, and I’ve also blocked UK users from hearing my audio. But I’d argue very strongly that the ‘risk and impact of harm’ of my audio is negligible given that it’s just readings of erotic stories which are perfectly legal in their text-based form. I’d actually argue that there is more harm in me turning off accessibility features than keeping them online. So am I small and insignificant and harmless enough to avoid enforcement? Maybe. I can’t bank on it though. And nor can my colleagues who run other sex blogs. Nor can the owner of a tiny community-run forum giving kink safety tips, or an LGBTQ+ Mastodon instance.
I think this point is an important one to make, because when people shrug off the impact of laws like this and say ‘ah well they’ll repeal it eventually anyway’, what they fail to take into account is that the law is having an effect on these sites right now. Immediately. Some are shutting down. Some are breaking features or blocking UK users from accessing them at all – and remember what I said above? Doing this will have an impact worldwide, not just in the UK, as these sites become harder to discover in things like search. Our government, which apparently cares so much about ‘growth’, is driving a tank through many small businesses – in the adult space and beyond – because it has not properly thought through the implementation of a law that has a very broad impact. On privacy, security, communication, speech and education… and also, yeah, on wanking.
Can’t believe I’ve got this far through a 5,000 word post without explicitly mentioning wanking, so let’s do this.
To be honest with you, even if the Online Safety Act somehow magically avoided all the other issues and only affected your ability to watch/read/listen to porn while you wank… you’re an adult and you should be allowed to do that! Ideally without the government demanding to see your papers before you proceed! You should be able to enjoy erotic content without putting your privacy and safety at risk. I know it sounds odd to end a piece like this by saying ‘I want the government to recognise your right to wank’ but so often this debate descends into talking about the peripherals, as if all our carefully-argued points will be moot if we acknowledge the (healthy, acceptable) desires of adults to see/read/listen to porn. We fall into the trap that the moral prudes have set by treating porn as if it’s inherently shameful or wrong. So then people don’t care much if pornographers are destroyed, the bigger issue is whether other sites might be taken down with them.
Some of us have to stand up for porn as a social good in and of itself, though, so I’m gonna be one of them. Some porn is shit, for sure, and some is exploitative and harmful (there is exploitation and harm in every single industry, porn is not unique here). But a lot of porn, erotica, and other sex-related content is a valuable contribution to society and culture. If you rip that up – or just make it so that the only people who can publish it are massive corporations which put profits over people – then you are doing significant harm. That harm cannot easily be undone if you repeal this law in a year’s time. The ethical pornographers who delete their sites today can’t just pick it all up where they left off in 2026 or after the next election. I find it depressing and chilling that the discourse on age verification (and other forms of adult censorship) so rarely acknowledges this. Burning the books today means you can’t read them tomorrow. Destroying independent, ethical adult websites today means the landscape changes significantly in future.
This is harm, too. Destroying work by people who are creating legal media that represents adult desires is incredibly harmful. It’s a significant loss to the adult you are now, and the adults your children will one day grow up to become.