Age verification: what’s the harm?

Image by the brilliant Stuart F Taylor

Welcome, friends, to my grubby little corner of the internet. A corner so strewn with obscenity that the UK government has decided you must prove you’re a grown-up before you can access certain parts of it. The UK’s new Online Safety Act has come into force, so people in the UK might have noticed a bunch of websites suddenly demanding you take a selfie, share your credit card details, or jump through another hoop to prove that you’re over 18. Quite a few of my friends have been discussing this in the pub, because for understandable reasons people who aren’t embedded in the world of online pornography or internet law are suddenly curious about why the internet is now so very broken. They’re also often convinced that the government will change its mind and therefore no one really needs to worry. I’ve had this conversation so many times now that I reckon I’ve got the basis for a fairly solid layperson’s guide to age verification: what it is, how it affects you, and why we absolutely, genuinely do need to worry.

What is the Online Safety Act?

The Online Safety Act is a piece of legislation designed to do many things, but the thing we’re most concerned with here is ‘protecting the children’. The Act aims to protect children from various ‘online harms’ including things like eating disorder inspiration content, content which might tell them how to harm themselves or others, and – ’twas ever thus – pornography.

Much of the content that falls into these categories – the latter especially – is completely legal to produce, and legal for adults to enjoy. However, the government wants to stop children from stumbling across it accidentally, so it is attempting to compel websites to add age-gates – to their sites as a whole if they’re adult, or to various functions and sections if they contain a mix of content. BlueSky users in the UK were recently asked to prove their age to see any posts marked as ‘adult’, as well as to use the DM function, and now even sites like Spotify have had to implement it, because some of the music videos they show are rated 18+.

Protecting children seems like a good idea! What’s the harm in age verification?

This is a big question so let’s break it down.

Privacy

This is the biggest one from an individual user perspective. Firstly, in order to prove your age you’re being asked to hand over some fairly important personal details. In some cases that means a photo of your face; in others, your credit card details, a passport or a driver’s licence. Incidentally, I know a lot of people who are saying ‘I just uploaded the selfie to age verify because then I’m not giving them much data’ – I beg to differ. You know who else uses selfies (or, to use the technical term, ‘biometrics’) to verify who you are? Actual banks! I use my face to approve transactions in the NatWest app, and I’m not the only one. Selfies are still data, and these days that’s important data, my friends!

Usually the company you’re handing these details to is a third party, often one you will never have heard of before. You’re being asked to just trust that they’ll delete whatever you sent after having verified your age. Do you? Do you believe it will work 100% of the time? How often do you get emails from companies explaining they’ve been the subject of a hack and your personal data has been exposed? Do you still feel safe handing these personal details over, often via a site where you’re viewing things which you’d prefer to remain private?

The data that is being collected for age verification purposes is extremely tempting to hackers, some of whom have already had great success targeting ID verification data from companies like Uber and TikTok. And for most sites, you don’t get a choice about who verifies you. The Open Rights Group has already highlighted the fact that Reddit, Grindr and BlueSky have selected age verification companies with problematic privacy policies, and at the moment there is no specific regulation outlining the security standards that these companies should meet in order to qualify to process this extremely personal information about you – only existing data protection law. Is the company asking for your ID legit? Well, you just have to take that on trust.

Let’s say all the current age verification providers are incredibly robust, though. They all have brilliant security, adhere to the highest standards of care and competence when dealing with your private data (stop laughing at the back!), and you have nothing to worry about as an individual when it comes to your details getting leaked or hacked. The question still remains… should you be sharing this information with random websites anyway? After all, the companies may be as legit as they claim, but once you’ve trained the population of an entire country to routinely hand over their credit card details in order to access content, you have given them an incredibly bad habit that it’s going to be tough to break. A habit which bad actors (like those who want to set up a new phishing scam, for instance) will be ready and willing to exploit. You don’t just prove your age once, after all: you potentially have to do it dozens of times, to access a bunch of different websites. Everything from BlueSky to PornHub to Spotify, and maybe even Wikipedia. It becomes a weekly or perhaps monthly occurrence. Just as individual users don’t tend to read every website’s terms and conditions, it’s unlikely they’re all going to do due diligence checks on every provider who asks for ID, especially once they’ve become used to just handing that data over.

And although that may not be a problem for you, you tech-savvy cleverclogs, if you’ve ever found yourself in the position of unpaid IT support for one of your less knowledgeable friends or relatives, hopefully you can see why it’s a huge problem for the UK population more broadly.

Censorship of non-porn material

If you wondered why I mentioned Wikipedia in the section above, and thought ‘how the hell can an online encyclopaedia count as porn?!’, let me explain. It’s not because if you look hard enough on Wiki you can find the occasional dick; it’s because Wikipedia involves community editing. And where there is this kind of collaboration, there is the potential for users to send each other content that falls within the scope of the Act. And therefore Wiki – famously very wary of collecting data from its anonymous contributors – must now insist that the unpaid volunteers who edit the site hand over the kinds of personal info mentioned above. Wikipedia is challenging this in court, because they believe (I think correctly) that this will lead to greater censorship of the platform – wherever you require people to hand over ID, you reduce the opportunities for people from marginalised or persecuted groups to contribute. A spokesperson for Wikipedia said that insisting on age verification could leave users open to “data breaches, stalking, vexatious lawsuits or even imprisonment by authoritarian regimes”.

Wiki isn’t the only site concerned about censorship, though, and early implementation of age verification in the UK shows that it – like any attempt to regulate ‘porn’ – ends up catching a lot of other content too. Reddit users have already begun documenting subreddits that have been age-gated, including support forums for sexual assault survivors and help on how to quit smoking. On top of this, a tonne of LGBTQ+ content and sex education has already been caught in the net. Expect much much more of this to happen going forward.

This isn’t a question of just getting the government to write in exemptions, either. Although Wikipedia may earn an exemption through the court case (I hope it does), one of the core problems with the Online Safety Act in implementation is that the definitions are incredibly broad and the penalties are potentially extremely harsh. Websites need to assess whether they have a ‘significant number’ of UK users – what’s significant? 10% of total traffic? 10 people? 10,000 visitors per month? They also need to consider whether it’s likely to be accessed by children. What does that mean – ‘likely’? Sites which aren’t marketed to children or shared in any spaces where children are likely to be browsing… are they exempt? We don’t know. What we do know, however, is that sites which do not comply will be investigated by Ofcom, and potentially fined up to 10% of their annual revenue or £18 million – whichever is greater. The chilling effect of penalties like this, especially when combined with ‘guidance’ from the regulator that could generously be described as ‘vague’, means that any site with any content that could potentially be classed as ‘harmful to children’ would be taking a giant leap into the expensive unknown if they didn’t proactively comply. And compliance with ‘age verification’ is costly and time-consuming: I personally can’t afford to do it, which is why I’ve just blanket blocked UK users from hearing the audio. Many other sites – both adult and non-adult – are coming to the same conclusion. Check out the Blocked page from the Open Rights Group, which is tracking site closures and blocks as a result of the Act. Submit any sites you know of that are doing this too – let’s keep track of what we’re losing.

So there’s another harm: you’re not just losing access to this content unless you hand over private details, in many many cases (particularly with smaller sites and services) you’re losing access to it entirely. Even flashing your passport won’t get you to the content, because the site owner can’t afford to hire a bouncer to check your ID.

Transferring power from small porn sites to the big players

So, since July 25th most porn is now either age-gated or blocked to UK users entirely. You might think this is good, because isn’t that the point of the law? Ehhhh. Kind of. But also not really. The sites that child safety campaigners are usually thinking of when they make these arguments are the big, flash, video-based tube sites which offer porn for free. Those sites have the money and tech expertise to implement age verification without trouble, so they aren’t going anywhere any time soon. At the other end of the scale, though, smaller sites and hobby sites which don’t make a profit are more likely to struggle. As mentioned above, many of them are already shutting down or blocking UK users from some or all of the content, meaning those who are looking for erotic material will end up turning to the exact sites that anti-porn crusaders are so up in arms about.

As I say, this probably isn’t going to bother you that much if you think all porn is bad by default. But if you are an adult who recognises the value of legal erotic media (in whatever form), then hopefully you can see why this is a chilling situation. The UK government’s implementation of AV, without any exemption or concession for small sites, essentially means that those with the deepest pockets will get the most traffic. That means the large, ‘free’ porn tube sites – already an extremely dominant force in the adult industry, hoovering up a lot of the money and even shaping how we define ‘porn’ in the first place – will only become bigger and more powerful. Meanwhile those smaller sites trying to swim against the tide, offering a view of sexuality that is broader and more diverse than what you see on the front page of TubeFuck will struggle to get traction. As TechDirt put it this week:

“This is exactly what happens when you regulate the internet as if it’s all just Facebook and Google. The tech giants can absorb the compliance costs, but everyone else gets crushed.”

Blocking UK users means that small, independent sites are harder to find. They will lose traffic, and therefore revenue (for further marketing) as well as word of mouth. What’s more, they will drop down in search algorithms because when users click through they won’t see the content they have searched for. Example: my own site has already seen a decrease in search rank since I had to block users from hearing the audio porn back in March. For understandable reasons, if someone in the UK searches ‘audio porn’ and they get sent to a site where none of the audio plays, they’re more likely to click away and not come back. So Google downranks me. This rank isn’t something that can easily be regained, and it affects people outside the UK too – if sites are forced to block or restrict certain users, those sites may be recommended less frequently in algorithms, so you’re less likely to come across them too. You can help with this by sharing links that you like, or supporting your favourite porn creators financially, but ultimately you as a user don’t have much say in what stays and what goes from the internet. The sites which can afford to age verify will stay, and those which can’t will eventually go. Leaving only the big players – the ones anti-porn campaigners and child safety advocates use as their examples when they talk about the ‘harms’ of porn in the first place.

I guess at least kids won’t be able to see the sexy shit though, right? … Right?!

Training users to bypass filters and blocks

This part is going to be tricky, so bear with me. The UK regulator Ofcom has released guidance to sites on how to comply with the Online Safety Act, and they’ve expressly said that sites like mine should avoid encouraging people to use certain tools to bypass blocks. I’m interpreting that to mean that I also shouldn’t tell you how to do it. It would be absurd for me not to be allowed to acknowledge that it’s possible, though. I personally use a [TOOL I AM NOT ALLOWED TO MENTION] on a daily basis, because I have a block in place to stop UK users from listening to the audio on my site, so without one of these tools I wouldn’t be able to bypass my own site block and therefore do my job. Lol.

What’s more, I’ve noticed an uptick in people playing the audio on my site since the Online Safety Act came into force. I’m not sure it’s statistically significant so I’m keeping a close eye on it, but that stat gels with news I’ve read elsewhere about these tools suddenly seeing a huge spike in both online searches and downloads from app stores. If more people are bypassing blocks, more people will presumably be bypassing my block too. Friends with whom I have chatted about this in the pub tend to move swiftly from ‘isn’t the law an ass?’ to swapping recommendations for these kinds of tools so I think it’s safe to say that the Online Safety Act itself has done far more to educate UK users about them than my own little site ever could.

I’m not encouraging you to use these, but I do need to mention them in order to say that I think any law which drives consumers to try and bypass content filters is generally a bad idea. Why are you making it so that the population of your country has to bypass blocks… in order to access perfectly legal material? Some content blocks are in place for a very good reason, for instance to prevent people being exposed to child sexual abuse material, terrorism material, and other types of content which are genuinely illegal. Last week Labour MP Peter Kyle told us that “If you want to overturn the Online Safety Act you are on the side of predators. It is as simple as that.” But it’s not as simple as that at all – perhaps the opposite. There are perfectly legal tools which will get you round the age verification requirements imposed by the UK government, but most people didn’t bother using them before because… why would they? Now that large chunks of the internet are functionally broken for UK users, more of them will use these workarounds, and many will now be in a space where even content which is blocked because it’s genuinely illegal (and harmful to adults, as well as children) is easier to access. For the same reason I am noticing an uptick in people listening to my (legal, ethical) audio porn since July 25th, I imagine sites which host genuinely illegal material will see the same.

On top of this, it’s worth noting that those who say ‘ah I’m OK, I can just use a [TOOL I AM NOT ALLOWED TO MENTION]!’ often forget that the knowledge – of what these are and how to use them – is not something everyone has. By nudging UK users towards these tools, the government has essentially created a two-tier internet. Some adults, those who are tech-savvy and who can afford it, can continue to browse without handing over their personal data or putting themselves at risk of the hacks and scams mentioned above. Others, who don’t have the knowledge, money or time to investigate these things, are exposed to significant privacy risks. Again, worth remembering: these are risks they’re being pushed to take to access perfectly legal material.

Breaking accessibility

This is a personal bugbear of mine, and I wish more people would take note of it: because of the way the law defines ‘porn’, text content is out of scope of the Online Safety Act while audio content is in. That means sites like mine which offer audio as an alternative to text (for blind readers, and anyone else who’d otherwise use a screenreader) have been forced to break important accessibility functions in order to comply with the law. Feels pretty crappy, right? When I discuss this issue in the pub with my friends, people are genuinely gobsmacked by the idea that it is OK for me to publish erotic stories when they’re written down, but not OK for me to post the accompanying audio. Even more absurd when you realise that AI text-to-speech generators exist, and can read my words aloud (but without a lot of the tone, some of which is often valuable in conveying things like consent and enthusiasm), but ‘paying other creators to record the work into a mic’ magically turns that content into something that is harmful to children.

Isn’t all this worth it if it protects children, though?

Ehhhh. Hmm. This is debatable. I’m going to have a crack. Firstly, it’s not entirely clear that this new law will protect children. Sure, it might prevent some children from stumbling across porn by accident, or being sent TubeFuck links by shitty kids in their class at school who want to show a blow job video for shock value in the group chat, but there are other factors at play that make this question a lot more complex.

  1. If parents and carers believe the internet has now been successfully child-proofed, are many of them going to relax how they monitor their child’s usage of the internet, and no longer feel the need to educate them about safe browsing habits or use the filtering tools that are readily available on their kids’ devices and connections?
  2. Can children bypass the age checks anyway? The answer to this is already ‘yes, and people have shown how this is done in some quite hilarious ways’, but I’m not allowed to go into detail about them here.
  3. Is the harm children might face from accidentally seeing some adult content greater than the harm to those children as they grow into adults who have been trained to upload their personal data to any website that asks for it?

This last point is the one I find most compelling, personally. The thing about kids is that they turn into adults, and I think it’s more important for us to gift the adults they will become with a free and open internet, where people have access to sex education content, LGBTQ+ content, and erotic material rather than persistent digital surveillance. I obviously don’t want kids to watch porn (it’s not for them), but I also don’t think that this law will do much to prevent that from happening – certainly not enough to outweigh the massive harms caused to all of us by implementing age verification as it stands.

So how do we stop kids from seeing porn, if not by age verification?

We actually already have measures to deal with this: back in 2011 the government worked with ISPs (internet service providers) to come up with a Code of Practice on implementing ‘parental controls’ for all new customers. In 2013 this was adopted by all the major players. So when you (an adult – because you have to be over 18 to do this) register for an internet connection, you are offered adult content filtering by default. You can tweak this if you like: for example, you can decide you’re happy for your family to access social media sites but not pornography. Or if you don’t anticipate any children using your connection, you can opt out of adult filters altogether. Research conducted in 2022, however, found that although 61% of parents were aware of these filters, only 27% actually used them. Again, sing it with me: lol.

The kind of filter that the UK government is attempting to apply to the whole internet – the one that means you have to hand over private data in order to look at perfectly legal content or join in discussions on subreddits or even… watch music videos on Spotify! – that technology already exists. You can apply it at the ISP level, and it is much much better and more comprehensive than anything the government could implement across the entire web. Everyone in the UK who signs up to a new internet connection is offered this kind of filter. There are additional controls available at a device level too – apps and services that limit a child’s internet use, even allowing the parent/guardian to implement granular detail such as blocking individual pages and sites. Internet filters already exist! Parents have had access to these tools for a long long time!

If we want to stop children from accessing porn I’d argue that we should use… them. Use the tools we already have, make sure that parents and guardians are aware of them, and provide education on how they can be used. If those adults don’t know how to do this, that’s a skill/education issue, and one that can and should be solved by giving them better information and guidance – not by blanket age-gating the entire internet, preventing every adult in the UK from accessing perfectly legal content on their own connections, many of which will never be accessed by children to begin with.

Kids exist in the world, and that world has many dangers. It’s important to protect them from those dangers where we can take reasonable and proportionate steps to do so. That’s why we fence off playgrounds, insist on DBS checks for adults working with children, make kids wear cycle helmets when they ride their bikes, etc. But ‘reasonable’ and ‘proportionate’ are key here. What we don’t do is insist that the entire world be made a space as safe as a kids’ playground to the detriment of adults who just want to live their lives. We don’t make pavements out of kid-safe squishy tarmac in case they fall over, insist on DBS checks for all adults in case they ever encounter a child in the wild, or tell drivers to hit the brakes and wait for a child to disappear over the horizon if they spot one nearby on a bike. We make kids’ spaces safe for them, so there are fenced off areas where children can play and parents can relax a little, but at the same time acknowledge that we can’t bubble-wrap the entire world, because adults need to move about that world as well.

This law breaks the internet, destroys people’s online privacy, kills independent websites, wrecks accessibility, trains users to routinely use tools that will bypass government blocks of genuinely illegal content, and offers less protection to children than the tools that are already freely available from every major ISP. That is the harm. And, as I said in the intro, I think it’s one that we should definitely be worried about.

This age verification thing won’t last long, will it?

One of the biggest and most frustrating questions I get asked in the pub about this is: when will the government repeal it? Or at least stop enforcing it?

For the last decade or so, every time the UK government has floated an idea to censor the internet, when I explain to friends and loved ones what the plans are, they all scoff and say ‘but that’s ludicrous! It’ll never happen’ or something along those lines. Now that it actually has happened – arguably in an even worse form than the Digital Economy Act, which was its last incarnation – people tell me ‘well this is ludicrous! It can’t last long!’ and I have to smash my head into the table and get my face all sticky with the cider I spilled while angrily gesticulating as I explained the points above.

At the time of writing, a petition to ‘repeal the online safety act’ has almost 500,000 signatures. The people who scoffed at this when it was only affecting small sites are now paying attention because now they are getting a taste of its impact on the wider internet – being compelled to take a selfie to access BlueSky DMs, or Reddit, or whichever big site it might be. And great, I’m super glad people are paying attention now! Unfortunately, repealing a law is a pretty complex process, and it isn’t one the government is especially interested in doing in this instance. So far its response to the petition has been a big fat ‘no’ with a side order of ‘shut up, perverts.’ More worryingly, the response reads to me as if any attempt to ‘take people’s concerns on board’ will come in the form of guiding Ofcom in how to enforce, but in a vague and generic way with no clarity whatsoever. For example:

“Ofcom will take a sensible approach to enforcement with smaller services that present low risk to UK users, only taking action where it is proportionate and appropriate, and will focus on cases where the risk and impact of harm is highest.”

What do they mean by ‘sensible’? What counts as a ‘smaller service’? What’s ‘low risk’? What is ‘proportionate and appropriate’ enforcement? How do I as a small site owner determine the ‘impact’ of harm? I’ve diligently done my ‘illegal content risk assessment’ and ‘children’s access risk assessment’, as mandated by the law, and I’ve also blocked UK users from hearing my audio. But I’d argue very strongly that the ‘risk and impact of harm’ of my audio is negligible given that it’s just readings of erotic stories which are perfectly legal in their text-based form. I’d actually argue that there is more harm in me turning off accessibility features than keeping them online. So am I small and insignificant and harmless enough to avoid enforcement? Maybe. I can’t bank on it though. And nor can my colleagues who run other sex blogs. Nor can the owner of a tiny community-run forum giving kink safety tips, or an LGBTQ+ Mastodon instance.

I think this point is an important one to make, because when people shrug off the impact of laws like this and say ‘ah well they’ll repeal it eventually anyway’, what they fail to take into account is that the law is having an effect on these sites right now. Immediately. Some are shutting down. Some are breaking features or blocking UK users from accessing them at all – and remember what I said above? Doing this will have an impact worldwide, not just in the UK, as these sites become harder to discover in things like search. Our government, which apparently cares so much about ‘growth’, is driving a tank through many small businesses – in the adult space and beyond – because it has not properly thought through the implementation of a law that has a very broad impact. On privacy, security, communication, speech and education… and also, yeah, on wanking.

Can’t believe I’ve got this far through a 5,000 word post without explicitly mentioning wanking, so let’s do this.

To be honest with you, even if the Online Safety Act somehow magically avoided all the other issues and only affected your ability to watch/read/listen to porn while you wank… you’re an adult and you should be allowed to do that! Ideally without the government demanding to see your papers before you proceed! You should be able to enjoy erotic content without putting your privacy and safety at risk. I know it sounds odd to end a piece like this by saying ‘I want the government to recognise your right to wank’ but so often this debate descends into talking about the peripherals, as if all our carefully-argued points will be moot if we acknowledge the (healthy, acceptable) desires of adults to see/read/listen to porn. We fall into the trap that the moral prudes have set by treating porn as if it’s inherently shameful or wrong. So then people don’t care much if pornographers are destroyed; the bigger issue is whether other sites might be taken down with them.

Some of us have to stand up for porn as a social good in and of itself, though, so I’m gonna be one of them. Some porn is shit, for sure, and some is exploitative and harmful (there is exploitation and harm in every single industry, porn is not unique here). But a lot of porn, erotica, and other sex-related content is a valuable contribution to society and culture. If you rip that up – or just make it so that the only people who can publish it are massive corporations which put profits over people – then you are doing significant harm. That harm cannot easily be undone if you repeal this law in a year’s time. The ethical pornographers who delete their sites today can’t just pick it all up where they left off in 2026 or after the next election. I find it depressing and chilling that the discourse on age verification (and other forms of adult censorship) so rarely acknowledges this. Burning the books today means you can’t read them tomorrow. Destroying independent, ethical adult websites today means the landscape changes significantly in future.

This is harm, too. Destroying work by people who are creating legal media that represents adult desires is incredibly harmful. It’s a significant loss to the adult you are now, and the adults your children will one day grow up to become.

 

20 Comments

  • LizzieB says:

    I’m not massively in the weeds of the law because everything is horrible in the world, but why is it “OK for me to publish erotic stories when they’re written down, but not OK for me to post the accompanying audio”? Why is the audio the issue? Is it in case a child of non-reading age hits the play button? Because I kinda feel that if they can’t read, they won’t be able to follow a lovely erotic story when read to them, but I’m not a child developmental expert.

    I always feel laws like this are so badly thought-out and planned. If they want to stop CSAM being sent on Meta platforms, then make a law that punishes Meta if police find that CSAM has been sent on Meta platforms. If they want to stop eating disorder content on TikTok, start fining TikTok for ED content. The issue is these big platforms not taking accountability for being a publisher and an enabler of this content, but for some reason they decided it’s better to attempt to blanket-ban porn via age verification than to actually change or enforce existing laws.

    • Girl on the net says:

      Goooood question re: audio. To be honest if a child who were too young to read had been given unsupervised access to the internet I reckon they’d have bigger problems than clicking ‘play’ on my audio (parental neglect for a start), but that’s not the reason. The reason is if anything even weirder than that. It’s a result of a problem that arises any time governments try to define ‘porn’ in a meaningful way. Basically, in drafting the law, they had to somehow define what counts as ‘pornography’ so they could say that XYZ sites are ‘pornography’ and need to age gate. But how do you define porn? There’s a broad definition of ‘content that is produced solely or principally for the purposes of sexual arousal’, but then there are exemptions to this written into the law, and text is one of them. So the government clearly doesn’t think that text alone is enough to cause material harm to children (though I’d argue that *some* text could, but I doubt any of mine ever would), but because audio is a different format, it could be in scope of the law. There’s more information on this over at Neil’s amazing website on the Online Safety Act here. But yeah, trying to define porn while exempting sites which might just be giving text-based info leads to this bizarre clash – and it’s still not actually clear to me whether my content is in scope. When the act says that ‘audio’ can be classed as porn… do they really mean audio of me reading stories aloud? Or is the intention of the law to capture sites which publish audio files of people having sex/masturbating? It’s not clear. And Ofcom has not yet issued clarity on that.

      And yeah I agree – I think a lot of this law was drafted under the guise of ‘doing something’ about big platforms (not just Meta and TikTok but adult platforms like PornHub too) and in drafting it, ministers refused to listen to any of the technical experts who were telling them that the impact on smaller sites – and the net more broadly – would be ruinous. If it’s the big sites they want to focus on, they should say that, and exempt small sites from being forced to comply in ways they can’t afford to do anyway.

  • Clair says:

    Yes to all of this.

    And in particular, the prudishness that people show towards sex content drives me nuts. Every single one of us is here because our ancestors shagged.

  • Jaimie says:

    Great article which neatly encapsulates my thoughts and worries. I think you’re dead right about the law (enacted by the Conservatives, incidentally, but mindlessly implemented by Labour) being about being seen to be doing something. Surely those who drafted it and their technical advisers didn’t fail completely to foresee the ‘tool which you refer to’, immediately rendering the act impotent? I can’t believe that, which just adds weight to the argument that this was about appearances rather than a genuine belief that it would do what it said on the tin.
    I have some (quite a lot, actually) knowledge and experience of the legal system as it applies to England and Wales, and I am very familiar with a number of acts of parliament which are vague, deliberately, in how definitions are drafted. As you rightly say above, the definition of ‘pornography’ is especially problematic and particularly so for sites like ours. I (or you) can publish some prose which is the most unutterable filth imaginable (yum), but if it is not accompanied by a picture, you’re all good. Add a picture, though, and you’re OK as long as the picture itself is not pornographic in nature.
    Right.
    So, I like taking pictures of my arse. Maybe a bit of cleavage. Flash of stocking-top sort of thing. I use them to illustrate my sleazy prose. Are they porn, though? If they are, I’m up shit creek and waiting for a knock on the door from the Ofcom brown-shirts. It could be argued that they are pornographic, as, potentially, you could infer that I took those pictures for the purpose of sexual arousal. I want them to be suggestive, but you’ll find nary a nipple in sight, let alone anything more graphic. Do they fall within the definition, then? Who the fuck knows, frankly. Common sense dictates that they do not, but if we were in the realms of common sense, you wouldn’t have just written this article.
    We are only likely to get real clarity when a body of case law – decided precedent – from the High Court and above, begins to emerge. That’s likely to take a long time, though: The big players, as you say, can shrug off the cost of AV and just get on with it. The ones likely to be caught and want to challenge – like you and me – are the ones not likely to be able to bankroll a judicial review, and would likely just draw stumps and shut down or geo-block the UK entirely. Result: the internet becomes less diverse, less vibrant, and even more in the grip of big-money players.
    What an absolute cluster-fuck. Jx

  • mark says:

    For what it’s worth, here’s what Liana K has said (she used to be on TV in Canada and now does mental health videos on YouTube): https://www.youtube.com/watch?v=TTxaF-oFwrI

    I have posted your blog to her on Bluesky, though I don’t know if she’ll read it – I shared it because in this post you counter some of the points she’s making.

    For what it’s worth, I think the advice she’s giving is aimed mainly at everyone trying to view content online, rather than at the producers of content.

  • John says:

    I’m going to read this properly over the weekend. But jeez, GOTN, you’ve clearly put a lot of time and thought into this piece. And for that I thank you. It is important…

  • SpaceCaptainSmith says:

    Well… yeah. Many people have been warning for years that this would be a disaster, and here we are, proving them right.
    I don’t think there’s much I can add to this post. There are so many examples of content which has been wrongly deemed suitable for adults only (from support groups for self-harm to posts about Gaza) but it’s not necessary to list them all to reinforce the point.
    I’d instead like to share a different piece which is about the impact on sex workers specifically. This isn’t so much about age verification itself as other parts of the act, but the consequences are the same: the people providing legal services for adults are unfairly punished, while the ones causing actual harm to children escape unaffected.
    https://www.polyesterzine.com/features/sex-worker-online-safety-act

  • fuzzy says:

    I resent living in a world where graphic depictions of violence and gore are acceptable but graphic sex between consenting adults is verboten. I resent living in a world where meanness and greed and fascism and genocide are ok to tolerate but gods help us if we want to pay a sex educator or pornographer or erotica author or sex worker and can’t use a credit card because the fucking puritan credit card companies have “policies” which are the equivalent of what is going on here with the internet.

    Boooooooo.

    • Girl on the net says:

      “I resent living in a world where graphic depictions of violence and gore are acceptable but graphic sex between consenting adults is verboten” Honestly, this times a million. Write it in the fucking sky. I don’t understand it, and I find it really repulsive to be honest.

  • bob bobbly says:

    I read this elsewhere, and I am paraphrasing, but the issue of trust could have been solved by the UK government providing an age verification service. Individuals sign up and give their data to this service. Then, when you need to prove your age to a third-party service, you request a time-limited token or code from the government service that vouches for the age of that person but crucially is anonymous and does not contain any information about them. The third-party service can verify the token and allow access. This could have been a model for any government wanting to enforce age restrictions. I appreciate that we would be trusting the UK Government, but so long as there was no way of tracing the token request and token verification back to the individual, it has to be preferable to trusting these dodgy or shadowy age verification services.
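A minimal sketch of the token scheme described above, in Python. Everything here – the field names, the key handling, and the use of HMAC in place of a real public-key signature – is an illustrative assumption, not anything the Act or any real service specifies. The issuing service signs a payload containing only an ‘over 18’ claim, an expiry time, and a random nonce, so the verifying site learns nothing about who the user is:

```python
import base64
import hashlib
import hmac
import json
import secrets
import time

# Hypothetical signing key, held only by the issuing (government) service.
# A real deployment would use a public-key signature (e.g. Ed25519) so that
# sites could verify tokens WITHOUT holding the issuer's secret.
SECRET_KEY = secrets.token_bytes(32)


def issue_token(ttl_seconds: int = 300) -> str:
    """Issuer side: mint an anonymous, time-limited 'over 18' token.

    The payload carries no identity: just a claim, an expiry timestamp,
    and a random nonce so each token is single-use in spirit."""
    payload = base64.urlsafe_b64encode(json.dumps({
        "claim": "over18",
        "exp": int(time.time()) + ttl_seconds,
        "nonce": secrets.token_hex(8),
    }).encode())
    sig = hmac.new(SECRET_KEY, payload, hashlib.sha256).digest()
    return payload.decode() + "." + base64.urlsafe_b64encode(sig).decode()


def verify_token(token: str) -> bool:
    """Site side: check the signature and expiry, and learn nothing else."""
    try:
        payload_b64, sig_b64 = token.split(".")
    except ValueError:
        return False  # malformed token
    expected = hmac.new(SECRET_KEY, payload_b64.encode(), hashlib.sha256).digest()
    if not hmac.compare_digest(base64.urlsafe_b64decode(sig_b64), expected):
        return False  # signature doesn't match: forged or tampered
    payload = json.loads(base64.urlsafe_b64decode(payload_b64))
    return payload.get("claim") == "over18" and payload.get("exp", 0) > time.time()
```

Even this sketch shows where the trust problem moves rather than disappears: with a shared secret, any verifying site could mint its own tokens, so a real scheme would need asymmetric signatures – and ideally blind signatures, so the issuer cannot link a token request to its later use.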

    • Girl on the net says:

      Yep that would be a better way of doing it and might marginally help with one of the many problems. But only marginally, and only one.

  • finagle says:

    This is just adding to the enforcement side of your discussion. All your points about the harm and ineffectiveness of the technology I definitely +1.
    From the IT security and privacy side, the Online ‘Safety’ Act is unenforceable. As was the Digital Economy Act before it, which was in turn a response to previously failed legislative attempts, and which led to the OSA in an attempt to move the burden from the middleware, where it was being bypassed, to the end sites, who can’t enforce it in any useful way. This was after the ISPs highlighted that they could not do it when tasked with it way back before the DEA. The ISPs log all http(s) traffic, but can’t do anything with https other than record the sites visited. The OSA is in many ways an attempt to block the holes in the DEA by people who have no idea about technology. There has been a surge in ‘certain kinds of tools’ since the DEA, and the OSA was obsolete even before it was passed because of the uptick in those tools. Like the ones we both use all the time.
    Whilst the government cannot outlaw those tools without breaking home working for every moderate-sized company upwards and ending up on the same human rights violation lists as China, they are relying on private services like Cloudflare to make them unusable by treating all their public endpoints as botnets.
    Meanwhile the use of Ofcom to enforce the OSA is at least a nod, if a private one, that the law is unenforceable. Ofcom have no background or skills in the internet content side. They have no competent staff, no crawlers, and even if they did, it has so far proven impossible to accurately identify adult material. In fact the only tools available for identifying CSAM rely on having known CSAM to compare against directly. Whilst AI may (in the far future) increase the hit rate, training it and filtering the output is too big a task for the budget assigned. What Ofcom will be doing for enforcement will basically be getting reports of sites handed in, and researching them manually.
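The hash-comparison point above can be made concrete: detection tools don’t ‘recognise’ abuse imagery, they compare uploads against a database of hashes of files that humans have already identified, so novel material passes straight through. A minimal sketch using exact SHA-256 matching (real systems such as PhotoDNA use perceptual hashes that survive resizing and re-encoding; the database entry below is just an illustrative placeholder):

```python
import hashlib

# Hypothetical database of hashes of previously identified files.
# This entry is the SHA-256 of the bytes b"test", standing in for a real record.
known_hashes = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}


def matches_known(file_bytes: bytes) -> bool:
    """Return True only if this exact file has been seen and hashed before.

    Any file not already in the database -- i.e. all new material --
    returns False, which is precisely the limitation described above."""
    return hashlib.sha256(file_bytes).hexdigest() in known_hashes
```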
    The first time they bring any kind of action the law is then going to be tested in court. Merely passing a law has no effect in the UK until it has been tested in court. Unless the site in question is the originator of the content, the defendant is going to point to Safe Harbor and platform exceptions in case law history. For those that don’t know, when the first audio recording devices appeared there were attempts to ban them for copyright infringement, whilst still allowing them to be used by those bringing the case. The courts decided the mechanism was legal, the intended use mattered on a case by case basis, and that has been tested and retested in law repeatedly. Meta et al live on that law, because they point to it every single time anyone attempts to make them responsible for how their platform is used (Cambridge Analytica). So the only sites that can be enforced against are the originators, like your good self. Even then they have latitude.
    Meanwhile, companies like Meta and Google sit back, pushing out media on how diligent and safe they are whilst breaking other UK laws like GDPR every single second of every single day. They are now using age verification to enrich the data they hold on every user, increasing the value of their main asset: data about their users.
    Which brings us on to GDPR. The ICO are hell on wheels compared to Ofcom. Ofcom primarily exists so BT (historically, and other companies now) could claim to be a regulated monopoly whilst doing what they liked. The ICO exist to enforce privacy and security, and they are not for example staffed by directors of BT on sabbatical. But when you look at the UK record of enforcement on GDPR, which is clear and well written legislation, they are way behind for example Ireland. Have a look at the fines handed out. GDPR goes WAY beyond OSA in terms of fines but the maximum levied so far is I think 7 or 8 orders of magnitude lower than the law allows, even 5 years into enforcement. (GDPR had a 1 year education and tolerance period post May ’19). That’s with a dedicated, expert, informed, committed staff. Compare and contrast to Ofcom. Especially when Ofcom are being asked to judge what is significant or worth pursuing. It looks like a blatant nod to ‘yeah, not going to try to enforce this **** either’.
    So when it comes to respecting OSA, the government clearly don’t.
    Meanwhile it is being used to breach privacy regulations on one hand, increase shareholder value, drive competition out of business and so on. One might almost think that members of government were employees of big tech companies (Mr Miliband?).
    Whilst the government, richly ignorant of how tech works, continue to defend the existence of the law, since it makes them look like they care, the real test of the law is probably never going to happen. It’s unlikely any case will ever see court. DEA was abandoned before it was live, because it was recognised to be unenforceable. OSA has managed to go live, but I very much doubt a test case will ever make it to court, because if it does, the law will not need repealing, the case will be thrown out effectively voiding it.

    • Girl on the net says:

      As someone below has already commented, there’s quite a lot in your comment that doesn’t reflect how the law in the UK actually works. I’d also argue that your point “It looks like a blatant nod to ‘yeah, not going to try to enforce this **** either’.” is demonstrably untrue – Ofcom has already begun enforcement on a number of websites, you can see some here: https://www.ofcom.org.uk/enforcement And here’s a specific example: https://www.ofcom.org.uk/online-safety/illegal-and-harmful-content/investigation-into-4chan-and-its-compliance-with-duties-to-protect-its-users-from-illegal-content

      I think your point about the law being tested in court is interesting because I would love to see this law tested in court (particularly under, for example, the right to privacy and whether the OSA is a proportionate means of achieving a legitimate aim), but the idea that laws don’t mean anything until they’re tested in court is damaging. Even if you discount every possible practical step that the government, law or regulatory bodies *could* take against a site owner, the chilling effect of a law like this (especially with the potential fines on the table) is dramatic and significant. It’s easy to say ‘well, wait till it’s challenged in court’, but your average site owner can’t wait till it’s challenged in court because they could lose not just their business/hobby but potentially also their home, if the fines levied were high enough. Would this happen? Is it likely? I hope not. But it’s possible. And all the while it’s possible, we will lose access to valuable content and the people making it may well stop altogether.

  • Jon says:

    Given how closely this coincides with the escalating attack on trans rights I can’t help but worry that some of the impacts are not accidents, but part of the plan all along.

    I am also deeply bothered by the fact that, even with what seems to be a renewed commitment to sex education to address the problems associated with (certain kinds of) online pornography, there are some apparent gaps which will undermine its protective potential. One cannot deny that the internet is full of horrors, but hiding them from children is not the solution. Relying on tech giants to do this (when many think tech giants contributed to the problems) strikes me as baffling.

    Finally, there’s the irony that many of the adults who argue ‘think of the children’ have already succumbed to various online harms themselves. Maybe we should rename the country The United Kingdom of Mumsnet.

    • Girl on the net says:

      “what seems to be a renewed commitment to sex education” – honestly, this isn’t really anything other than the government trying to shape *what* is taught (see their chilling insistence that younger kids shouldn’t be taught things that might encourage them to question their gender). There’s not a renewed commitment to sex ed, that would come with long term cash and support for those who are doing good work in this area. I think this is another shiny way for them to say they’re Doing Something but without putting any money behind it, or actually listening to experts about what the right thing to do is.

      “the irony that many of the adults who argue ‘think of the children’ have already succumbed to various online harms themselves” Yes. Absolutely this. It’s so depressing. Many people still genuinely seem convinced that even if this is a privacy nightmare and a problem in many other ways, kids will still be protected. They won’t though.

  • Olaf says:

    There is a whole reasoning problem behind “protecting the children” by hiding the real world from them. At some point they are not children any more and will have to face reality. But they have not been able to prepare for this. That is soooo much better than receiving an education….

  • John says:

    I can’t speak to Finagle’s technical knowledge, but the comments on enforceability are demonstrably, obviously wrong.

    “Merely passing a law has no effect in the UK until it has been tested in court.”

    Passing a law has immediate effects even if some people disregard it or breach it. The court cannot strike down primary legislation.

    • Girl on the net says:

      Yeah agreed, I’ll reply properly above. But even aside from the fact that passing laws *does* have practical effect (Ofcom is granted budget for enforcement, for example), it also has an immediate and chilling effect on content creators. I’ve been complying with the law since before July, because I couldn’t risk enforcement action. I know others who have closed their websites/projects in advance of it coming into effect.

  • Bruce says:

    It’s all about ‘control’; it’s not about doing good, it’s about exerting POWER. That’s what governments do. Maybe if every Member was voted out at the end of their first term, some might get the message that it’s supposed to be about SERVING constituents, and that CONTROLLING constituents is Not Acceptable. I am not going to hold my breath.

  • Jon says:

    Regarding the government and sex education – yeah, that’s fair. I am pretty sure that had the previous government been in power much longer they would have rolled back the legislation which was trying to make sex education mandatory. The curricula I’ve studied recently have a lot of issues and are struggling to balance conflicting demands but there are some points which could be positive if the right people with the right motivations were involved. Unfortunately and frustratingly, that seems less and less likely to be the case given the current ideological cowardice and backtracking we are seeing.
