Sex toy data: the good, the bad and We Vibe

Image by the brilliant Stuart F Taylor

Last week journalists had a ball with the news that We Vibe has agreed to pay out millions of dollars to users of its ‘smart’ sex toys, after a legal dispute surrounding the data the company collected about toy use. Some users were understandably annoyed by that data collection, and launched the class action which We Vibe has now settled. But it hasn’t put me off smart sex toys: not one tiny little bit. And I’m going to try and explain why.

Let’s take a look at some of the details of the We Vibe data story. Firstly, as demonstrated at DEF CON 24, it’s potentially possible for someone else to control your vibrator. This sounds scary, but the same is true of pretty much any ‘smart’ system – lights, heating controls, smart meters, etc could all, in theory, be hacked and controlled remotely. The We Vibe toy is actually far less worrying than some other equipment, because in order for someone to ‘hack’ it they need to be really close (We Vibe says basically in the same room).

Then there’s data collection: what did the toy send back to the company?

  • Temperature (for diagnostics). The toy sent info on its internal temperature, and in the original talk the presenters pointed out that this could potentially tell someone whether the toy was in use at the time: if the toy is running hot, it’s probably being used.
  • Usage (patterns, frequency and length of use of the toy). In the media reports I’ve seen, this appears to be the thing people are most concerned about: that – according to some headlines – the company was ‘spying on your sex life.’

As I’ve said to many of my other sex-tech-nerdy pals over the last week, personally the idea that a company might have this data does not bother me. I use my Doxy around ten times per week, give or take: there. Have some data. I use it on a fairly high pulse setting. Have that data too. Click here to listen to me having one of my first gurning orgasms with it. It’s fine. I don’t care.

But – and this is a ‘but’ so large you could project it on the side of Big Ben – I have a very lax attitude to certain kinds of data and info about my life. As an anonymous blogger, I’m hyper-paranoid about people taking my pictures or knowing my location at any given time. But as an anonymous sex blogger, information about how often I wank doesn’t seem particularly damaging to me. So I need to seriously recalibrate my comfort radar when it comes to sex tech and data, because this clearly isn’t something everyone can just shrug off – lots of people really are nervous about it.

How scary is ‘smart’ sex toy data?

Any sex toy that sends data over the internet could potentially be hacked. There: that’s the scary headline, and you’ll probably read it in a number of places over the next few years.

But there are two issues here. Firstly, whether your data is personally identifiable. You might not want someone knowing exactly how many times per week you masturbate, but would you be as concerned with a company collecting anonymised data? To me it seems like quite a different thing for a company to know that I am currently wanking versus knowing that an anonymous someone somewhere is currently having a wank.

The second issue is that data by itself is not good or evil – it is a tool that can be used for a number of things. Your data – wherever it is stored – is useful. To companies, and governments, for sure, but also to you.

I have a thermostat sitting in my living room which not only knows when I am inside the house and moving, but also tries to sneakily turn the heating on when my partner hasn’t even bothered to go and fetch a jumper. I hate it with a passion that runs deep. But it is also quite useful: it has pleased my inner tight-arse by lowering our heating bills, so I give it a grudging respect, and I haven’t yet hurled it out of the window. I also have smart lightbulbs, which I hate equally but use because the benefits outweigh my irritation. I have a couple of phones, which run apps, which occasionally request that I ‘find friends through my contacts.’ My life is a minefield of data – things which collect it and things which try to give it away – and while I’m reasonably savvy about this stuff, I am frequently surprised to learn just how much of my data is being processed, and what it’s used for.

Data can be useful

Each month I get an email from Tesco with tailored vouchers giving me money off my shopping. They’re based on things I buy frequently, like fruit pastilles, cheese, more cheese, and cheese biscuits. Sometimes it reads like those food shame programmes where they lay out all the food you ate in a week to try and shock you into not eating cheeseburgers. But it is also helpful. If they sent me vouchers for 20p off my next purchase of kale, I’d chuck the vouchers straight in the bin. As it is, I use them and am quite pleased that my cheese habit is subsidised by Tesco’s data-collection and marketing efforts.

Sex is seen as much more intimate and personal than your shopping list. But is it more intimate than, say, your medical data?

A paper released last week examined the controversial sharing of patient data between the Royal Free London NHS Foundation Trust and Google’s DeepMind. Now: this project is quite cool, in my opinion. The aim is to use some of the capabilities of Google’s AI to help diagnose problems and improve patient outcomes. Basically (and incredibly simplistically), if you have lots of health data and a means of making sense of it, it may be possible to identify health problems earlier, recognise patterns in treatment or diagnosis, and use those to offer better care and save lives. According to the BBC write-up:

“more than 26 doctors and nurses at the Royal Free are now using Streams [the app developed to help medical staff interpret patient data] and that each day it alerts them to 11 patients at risk of AKI.”

Neat, right? But in doing something like this, health data from Royal Free has been transferred to Google servers. Last week’s paper raises questions about the ethics of data sharing – how it was done, and the lack of public consultation and consent from patients. Google and DeepMind have issued a statement that notes:

“every trust in the country uses IT systems to help clinicians access current and historic information about patients, under the same legal and regulatory regime.”

Essentially: what DeepMind and the Royal Free are doing is an interesting thing, which could have extremely positive results, but just because they’re doing a good thing doesn’t make them immune to questions around privacy and security.

When it comes to data sharing, we need to be really cautious with companies who would happily collect as much data from us as possible, because that data is used for marketing and is therefore very valuable to them. It’s often said that ‘if you’re getting a service for free, you’re the product’: tools like Facebook work on a ‘free use’ model because they make their money from people buying your time and attention via ads, rather than from you buying the service itself. We make these trade-offs whenever we tick a ‘terms and conditions’ box – and often we are aware that data sharing can benefit us, and agree to those terms in full knowledge of the exchange, whether we’re getting better health outcomes, tailored offers, or a service we would otherwise have to pay for.

Of course organisations shouldn’t collect, analyse and distribute our data on a whim: transparency and consent are vital here. But data opens up some really interesting learning possibilities too, and it should be possible to take advantage of these benefits in a way that respects user consent.

Sex toy data is useful too

I can see the value in data gathered by smart sex toys. Most obviously, having data on frequency of masturbation/sex might be useful for me in spreading my own propaganda: ‘Masturbation is normal and healthy and – hey! Here’s how often people do it!’ And on a more personal level I know I’m not the only one who’d be interested in stats and info about my own sex life: if there were a way to measure when and how often I had sex and masturbated, and I could tie this to other factors in my life (mood, diet, exercise, etc), it would be easier for me to spot where problems lay. If I’m taking pills that affect my libido, for instance, data on my sex life could help me track the effect, and give me evidence to back up my case when talking to my doctor.

I can also see the value in sex toy companies having anonymised information about settings – those who are using vibrators and other smart sex toys for the first time can have some tips and advice on things they might like to try, and companies get to improve their toys – honing what they do, improving on cool stuff, and even maybe dropping features/functions that aren’t used that often.

The key things here, though, are ‘anonymity’ and ‘consent.’ Is the data that’s stored (and therefore potentially hackable) anonymised so that no one can link it to you specifically? And do you, the customer, know exactly what is being collected and why?

Sex toy data and shame

Regardless of what I think about my own sex toy data, many people do see sex toys as shameful or embarrassing, and there’s a hell of a lot of societal stigma that means this isn’t likely to change any time soon.

Epiphora put it best:

“I know, sex is a bristly subject for people. It’s sensitive and loaded, and should be handled as such. I sometimes forget that, living in a self-created sex-positive utopia where relaying all the juicy details of my masturbatory life to the internet is the norm.”

Check out her full blog post here, as she sums up this issue really neatly. I agree with what she’s said surrounding transparency and consent, and I also echo what she has said about We Vibe as a company. I can easily forgive them for this, not least because they have updated their privacy policy and app – and, ironically, they will probably now be better than many (though not all) sex tech companies when it comes to transparency around data.

How should we deal with sex toy data?

As I mentioned above, this issue is not going to go away. We Vibe toys are not the only ‘smart’ sex toys on the market, and it’s only a matter of time before the next sex toy data issue bubbles up. Tech lawyer Neil Brown, in an overview of interesting legal issues in sex tech, gave this rather neat example:

“Take the ‘HUM’, for example, which is described as an ‘artificially intelligent vibrator’. The incorporation of AI, or other learning technologies, into a device may mean that, through assessment of your reactions to particular stimuli, the software on the device knows more about your preferences than you do. If these data are communicated back to the device manufacturer or any other party, a very rich and hugely personal dataset is created.”

There are many, many more examples of toys which do similarly clever things, and which may raise similar privacy questions from consumers. In the short term, companies that produce smart sex toys should – and most likely will – assess their privacy policies to make sure that they aren’t collecting more data than they need, and ensure consumers know exactly what’s being collected and why. That much is obvious.

But in the longer term, I’d like to see a much broader conversation about sex toy data, and the ways in which this information could benefit us. In the wake of the We Vibe settlement, sex tech companies will probably keep their heads down, keen to avoid scary headlines of their own. But sex toy data is useful, and toy companies know it. It’s useful to them (for diagnostics, improving products, etc) and it’s also useful to consumers (in tackling sex stigma, offering advice on toy use, etc).

Data – like sex itself – is morally neutral. It is a tool. That’s why I won’t stop using smart sex toys any more than I’ll stop using Facebook or shopping at Tesco: for me the benefits far outweigh the drawbacks. You might feel differently. But the only way either of us can make an informed choice is if we’re given all the information: about the risks, the benefits, and everything in between.

6 Comments

  • SpaceCaptainSmith says:

    Good post*. It seems that (like pretty much everything else discussed here) informed consent is key.

    Personally speaking, I’m happy enough to let all kinds of companies track my personal data, but only once I’ve consented to it. I suspect Google has a more complete picture of my own sexuality than I do.

    *(though I may have lapsed in concentration slightly after the words ‘my inner tight-arse’…)

    • Girl on the net says:

“I suspect Google has a more complete picture of my own sexuality than I do.” Excellent point! I used to work in marketing and every day was a bit of a revelation in terms of data – not just what sites like Google can collect but what they can do. I remember a long time ago there was a discussion about ‘like’ and ‘share’ buttons on pages, and the information they collect about users. There was a time when it was possible for the buttons to track you *just because you were on the page* – made me think of the porn sites that include ‘like’ and ‘share’, and people laughing at the idea that they would ever share porn on their Facebook. The buttons could (potentially) track them anyway iirc. Identifiably. I am fuzzy on whether they actually did, but it’s safe to say Google probably knows a fair bit about all of us when it comes to wanking.

  • P says:

    So the question to be asked is why? Why are we as people so worried about sharing data!
    Does everybody wank? I’m sure not everybody does, but are the ones that don’t the odd ones out? Stood on the beach looking at the sea, no one remembers the individual drops of water. In an ocean of happy wankers, no one is going to remember the individual… are they???
    Hide in plain sight… maybe that’s the only way in this age of data collection.

    • Neil says:

      > Why are we as people so worried about sharing data!

      I don’t mean this to come across as critically as it might — for which I apologise — but I suspect that a willingness or level of comfort in releasing this kind of information comes from being in a relatively privileged position.

      Perhaps some would feel less free to experiment with, or learn about, their sexuality, if they felt that every choice, feeling or reaction was monitored and linked to their name? Some people may be willing to share their information about precisely what they do, when, and for how long, with their parents and friends. Others might be more circumspect, and consider that, while they might not have anything to hide in their actions, they do not wish to broadcast them either: they are their own private business.

      Perhaps some are in a country, or family / situation, where it is illegal, or simply dangerous, to be anything other than heterosexual, or where masturbation is judged. Would they want their data shared? I am unsure whether those who are comfortable with sharing their data are really best placed to make decisions such as defining open / sharing default settings for products and services, just because they feel safe in doing so.

      For those who wish to share, and who are capable of understanding the ramifications of doing so, should it be prohibited? I’d struggle with that, but I’m still hesitant:
      – there is an onus on the companies collecting the data to be phenomenally transparent about what they are collecting, how they are storing it, and what they are using it for, to enable that decision-making; and
      – the power of “big data”, and the ability to link together data sets (and identify users from allegedly anonymous or pseudonymous data sets), is considerable, and becoming cheaper and easier to exploit. Is the average person really able to appreciate the ramifications of their data sharing?

      I’m not sure it is as easy as saying “I don’t care, so the others are the odd ones out”. I suspect that there are many who would not feel comfortable posting on a website about sex, sex toys and sexuality — especially under their own name. That’s quite a privilege to feel happy and safe to do so.

      • Girl on the net says:

        Yep, no I agree and I don’t think you’re being too critical – I think you’re right that it’s much more than “I don’t care, so the others are the odd ones out” – maybe I wasn’t clear enough that I think *I* am likely the odd one out, gauging by the reaction to this story. That’s why I said I think I need to recalibrate my knee-jerk reactions to this – and I did also do a big section in the post on why I am likely very ‘odd one out’ because I write about sex toys and sexuality.

        “Perhaps some would feel less free to experiment with, or learn about, their sexuality, if they felt that every choice, feeling or reaction was monitored and linked to their name?” Yep, I agree, though I think we need to be super-careful with this, because the ‘linked to real name’ thing is so pivotal, and I think will make a huge difference in whether people are happy or not happy. But it’s also one of those details that is frequently fudged (or not mentioned) when these stories break in the media – whether people are actually identifiable or not.

        “Is the average person really able to appreciate the ramifications of their data sharing?” Good question, to which I’d say the answer is no, but then I would hope that over the next few years we’ll become more informed about it, and be able to make meaningful choices.

        • Neil says:

          I thought the way you positioned it in your piece was spot on: that, as a sex blogger, your perspective may not be that of the average person. (The “odd ones out” mention was in response to P’s comment, rather than your own, by the way!)

          Your point about “identifiability” is interesting. Even if the record does not include your name, could it still identify you? Legally – I know, not the most fun starting point – something doesn’t have to include a name to require protection as personal data. Looking perhaps at the other end of the spectrum, this piece comes to mind: https://www.schneier.com/blog/archives/2007/12/anonymity_and_t_2.html – from what was thought to be anonymous data to people’s names and identities.

          Of course, there may be no-one with the motivation to do this, and your point about the actual risk, rather than media hyperbole, is well made.
