Last week journalists had a ball with the news that We Vibe has agreed to pay out millions of dollars to users of its ‘smart’ sex toys, after a legal dispute surrounding the data that the company collected about toy use. Some users were understandably annoyed by the data collection, and they launched the class action that We Vibe has now settled. But it hasn’t put me off smart sex toys: not one tiny little bit. And I’m going to try to explain why.
Let’s take a look at some of the details of the We Vibe data story. Firstly, as demonstrated at Def Con 24, it’s potentially possible for someone else to control your vibrator. This sounds scary, though the same is true of pretty much any ‘smart’ system – lights, heating controls, smart meters, etc. could all, in principle, be hacked and controlled remotely. The We Vibe toy is actually far less worrying than some other equipment, because in order for someone to ‘hack’ it they need to be really close – within Bluetooth range (We Vibe says basically in the same room).
Then there’s data collection: what did the toy send back to the company?
- Temperature (e.g. for diagnostics). The toy sent info on its internal temperature, and in the original Def Con talk the presenters pointed out that this could potentially tell someone whether the toy was in use at the time: if the toy is running hot, it’s probably being used.
- Usage (patterns, frequency and length of use of the toy). In the media reports I’ve seen, this appears to be the thing people are most concerned about: that – according to some headlines – the company was ‘spying on your sex life.’
As I’ve said to many of my other sex-tech-nerdy pals over the last week, personally the idea that a company might have this data does not bother me. I use my Doxy around ten times per week, give or take: there. Have some data. I use it on a fairly high pulse setting. Have that data too. Click here to listen to me having one of my first gurning orgasms with it. It’s fine. I don’t care.
But – and this is a ‘but’ so large you could project it on the side of Big Ben – I have a very lax attitude to certain kinds of data and info about my life. Sure, I’m an anonymous blogger, and so I am hyper-paranoid about people taking my pictures or knowing my location at any given time. But I am an anonymous sex blogger, and so the information about how often I wank doesn’t seem particularly damaging to me. But I need to seriously recalibrate my comfort radar when it comes to sex tech and data, because it’s clearly not something that I can just shrug off – lots of people really are nervous about it.
How scary is ‘smart’ sex toy data?
Any sex toy that sends data over the internet could potentially be hacked. There: that’s the scary headline, and you’ll probably read it in a number of places over the next few years.
But there are two issues here. Firstly, whether your data is personally identifiable. You might not want someone knowing exactly how many times per week you masturbate, but would you be as concerned with a company collecting anonymised data? To me it seems like quite a different thing for a company to know that I am currently wanking versus knowing that an anonymous someone somewhere is currently having a wank.
The second issue is that data by itself is not good or evil – it is a tool that can be used for a number of things. Your data – wherever it is stored – is useful. To companies, and governments, for sure, but also to you.
I have a thermostat sitting in my living room which not only knows when I am inside the house and moving, it also tries to sneakily turn the heating on when my partner hasn’t even bothered to go and fetch a jumper. I hate it with a passion that runs deep. But it is also quite useful. It has pleased my inner tight-arse by lowering our heating bills, and so I give it a grudging respect, and I haven’t yet hurled it out of the window. I also have smart lightbulbs, which I hate equally, but use because the benefits outweigh my irritation. I have a couple of phones, which use apps, which occasionally request that I ‘find friends through my contacts.’ My life is a minefield of data – things which collect it and things which try to give it away, and while I’m reasonably savvy about this stuff, I am frequently surprised to learn about just how much of my data is being processed and the purposes it’s used for.
Data can be useful
Each month I get an email from Tesco with tailored vouchers giving me money off my shopping. They’re based on things I buy frequently, like fruit pastilles, cheese, more cheese, and cheese biscuits. Sometimes it reads like those food shame programmes where they lay out all the food you ate in a week to try and shock you into not eating cheeseburgers. But it is also helpful. If they sent me vouchers for 20p off my next purchase of kale, I’d chuck the vouchers straight in the bin. As it is, I use them and am quite pleased that my cheese habit is subsidised by Tesco’s data-collection and marketing efforts.
Sex is seen as much more intimate and personal than your shopping list. But is it more intimate than, say, your medical data?
A paper released last week examined the controversial data sharing of patient information from The Royal Free London NHS Trust to Google’s DeepMind. Now: this project is quite cool, in my opinion. The aim is to use some of the capabilities of Google’s AI to help diagnose problems and improve patient outcomes. Basically (and incredibly simplistically), if you have lots of health data and a means of making sense of it, it may be possible to identify health problems earlier on, recognise patterns in treatment or diagnosis, and use those to offer better care and save lives. According to the BBC write-up:
“more than 26 doctors and nurses at the Royal Free are now using Streams [the app developed to help medical staff interpret patient data] and that each day it alerts them to 11 patients at risk of AKI.”
Neat, right? But in doing something like this, health data from Royal Free has been transferred to Google servers. Last week’s paper raises questions about the ethics of data sharing – how it was done, and the lack of public consultation and consent from patients. Google and DeepMind have issued a statement that notes:
“every trust in the country uses IT systems to help clinicians access current and historic information about patients, under the same legal and regulatory regime.”
Essentially: what DeepMind and the Royal Free are doing is an interesting thing, which could have extremely positive results, but just because they’re doing a good thing doesn’t make them immune to questions around privacy and security.
When it comes to data sharing, we need to be really cautious with companies who would happily collect as much data from us as possible, because that data is used for marketing and is therefore very valuable to them. It’s often said that ‘if you’re getting a service for free, you’re the product’ – essentially that tools like Facebook work on a ‘free use’ model because they make their money from people buying your time/attention via ads rather than you buying the service itself. We make these trade-offs whenever we tick a ‘terms and conditions’ box – and often we are aware that data sharing can benefit us, and we agree to those terms in full knowledge of the exchange – whether we’re getting better health outcomes, tailored offers, or provision of a service that we would otherwise have to pay for.
Of course organisations shouldn’t collect, analyse and distribute our data on a whim: transparency and consent is vital here. But data opens up some really interesting learning possibilities too, and it should be possible to take advantage of these benefits in a way that respects user consent.
Sex toy data is useful too
I can see the value in data gathered by smart sex toys. Most obviously, having data on frequency of masturbation/sex might be useful for me in spreading my own propaganda: ‘Masturbation is normal and healthy and – hey! Here’s how often people do it!’ And on a more personal level I know I’m not the only one who’d be interested in stats and info about my own sex life: if there was a way to measure when and how often I had sex and masturbated, and I could tie this to other factors in my life (mood, diet, exercise, etc) it would be easier for me to spot where problems lay. If I’m taking pills that affect my libido, for instance, data on my sex life might make it easier for me to spot where problems arise, and share knowledge with my doctor that could back up my case.
I can also see the value in sex toy companies having anonymised information about settings – those who are using vibrators and other smart sex toys for the first time can have some tips and advice on things they might like to try, and companies get to improve their toys – honing what they do, improving on cool stuff, and even maybe dropping features/functions that aren’t used that often.
The key things here, though, are ‘anonymity’ and ‘consent.’ Is the data that’s stored (and therefore potentially hackable) anonymised so that no one can link it to you specifically? And do you, the customer, know exactly what is being collected and why?
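To make that anonymity-and-consent point a bit more concrete, here’s a minimal sketch of what a consent-gated, de-identified upload could look like. To be clear: the names here (`UsageEvent`, `build_payload`) are entirely made up for illustration – no real sex toy app necessarily works this way – but it shows the two checks I care about: nothing leaves the device without opt-in, and what does leave can’t be linked back to you.

```python
from dataclasses import dataclass

# Hypothetical example only: illustrating consent-gated, anonymised telemetry.
# These names are invented for this sketch, not taken from any real toy's API.

@dataclass
class UsageEvent:
    user_id: str    # known locally, on the user's own phone
    minutes: int    # length of the session
    intensity: int  # vibration setting used, 1-10

def build_payload(event: UsageEvent, consented: bool):
    """Return the data to upload, or None if the user hasn't opted in."""
    if not consented:
        return None  # no consent: nothing leaves the device at all
    return {
        # aggregate-friendly fields only: no user_id, no exact timestamps
        "minutes_bucket": "short" if event.minutes < 10 else "long",
        "intensity": event.intensity,
    }

event = UsageEvent(user_id="alice@example.com", minutes=12, intensity=7)
assert build_payload(event, consented=False) is None
payload = build_payload(event, consented=True)
assert "user_id" not in payload  # nothing personally identifiable is sent
```

The design choice worth noticing is that the anonymisation happens on the device, before upload – so even if the company’s servers were hacked, there’d be no ‘who was wanking when’ data to steal in the first place.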
Sex toy data and shame
Regardless of what I think about my own sex toy data, many people do see sex toys as shameful or embarrassing, and there’s a hell of a lot of societal stigma that means this isn’t likely to change any time soon.
Epiphora put it best:
“I know, sex is a bristly subject for people. It’s sensitive and loaded, and should be handled as such. I sometimes forget that, living in a self-created sex-positive utopia where relaying all the juicy details of my masturbatory life to the internet is the norm.”
How should we deal with sex toy data?
As I mentioned above, this issue is not going to go away. We Vibe toys are not the only ‘smart’ sex toys on the market, and it’s only a matter of time before the next sex toy data issue bubbles up. Tech lawyer Neil Brown, in an overview of interesting legal issues in sex tech, gave this rather neat example:
“Take the ‘HUM’, for example, which is described as an ‘artificially intelligent vibrator’. The incorporation of AI, or other learning technologies, into a device may mean that, through assessment of your reactions to particular stimuli, the software on the device knows more about your preferences than you do. If these data are communicated back to the device manufacturer or any other party, a very rich and hugely personal dataset is created.”
There are many, many more examples of toys which do similarly clever things, and which may raise similar privacy questions from consumers. In the short term, companies that produce smart sex toys should – and most likely will – assess their privacy policies to make sure that they aren’t collecting more data than they need, and ensure consumers know exactly what’s being collected and why. That much is obvious.
But in the longer term, I’d like to see a much broader conversation about sex toy data, and the ways in which this information could benefit us. In the wake of the We Vibe settlement, sex tech companies will probably keep their heads down, keen to avoid scary headlines of their own. But sex toy data is useful, and toy companies know it. It’s useful to them (for diagnostics, improving products, etc) and it’s also useful to consumers (in tackling sex stigma, offering advice on toy use, etc).
Data – like sex itself – is morally neutral. It is a tool. That’s why I won’t stop using smart sex toys any more than I’ll stop using Facebook or shopping at Tesco: for me the benefits far outweigh the drawbacks. You might feel differently. But the only way either of us can make an informed choice is if we’re given all the information: about the risks, the benefits, and everything in between.