Could you ever love a sex robot?

Awesome image by Stuart F Taylor

The question we usually ask about robots is this: ‘could a human ever love a sex robot?’

I don’t think that’s the right question. In fact, I think it’s the opposite of the right question.

What I’d ask instead is this: if a robot were programmed to care for you, speak to you, hold you, and do all the other things a lover would do… how could you not fall in love with them?

Robot love and empathy

A couple of weeks ago I was on a podcast alongside Dr Kathleen Richardson who runs the Campaign Against Sex Robots. Essentially she says that the development of machines we can have sex with is encouraging us to ‘switch off’ our empathy. Then, swiftly, she moves on from sex robots to sex work – a ludicrous comparison which is chillingly lacking in empathy itself.

When we spoke, she explained that robotic technology is very basic at the moment – which is true. Although some companies are doing really exciting things with sex tech, we’re a hell of a long way from artificial intelligence which can even reliably mimic human behaviour, let alone give us cause to examine whether robots can be conscious. However, she then drew a direct comparison between the kind of sex you’d have with one of these ‘toasters’ and sex you’d have with someone you’re paying.

There’s a separate discussion to be had about how shockingly cold that comparison is. It pushes the idea that sex workers are no more than receptacles, and it’s compounded by Dr Richardson repeatedly insisting that sex work is the only industry ‘where you’re allowed to enter another human body for your own pleasure’ – as if the dick is the only thing that counts in a sexual interaction, and as if all sexual interactions necessarily involve a dick somewhere. It objectifies sex workers and porn performers in a more extreme, direct, and callous way than I have ever seen elsewhere.

But I don’t want to go too deeply into her comparisons – you can read what she thinks in this Wired article if you like, and if you are a journalist or ethicist then I would love to see her being challenged more directly, as the Guardian podcast did. I would also like her to stop presenting her work as if it is ‘tech news’ – it isn’t tech news, it’s opinion about sex work. Although all credit to her, by labelling it a Campaign Against Sex Robots she’s certainly managed to achieve a much larger platform than she’d have got if she was up-front about what she’s actually campaigning against.

Anyway. That aside, what I want to talk about here is empathy. Because the thing that surprises me most in any discussion of robotics and sex is when people discuss empathy without noting that robotics and AI provide some of the most fascinating examples of human empathy in action.

Robots and empathy

I for one welcome our adorable automated overlords

Awesome image by Stuart F Taylor

PARO

Meet PARO: he’s a cuddly, cute pet seal. Or rather, he’s a robot designed to look like a cute seal, and respond in certain ways to human touch and interaction. ‘He’ is essentially no more than a toaster himself and yet when they meet PARO, people stroke him, play with him, chat to him, and interact in the same ways they would if ‘he’ were a living creature.

People are currently using PARO for all kinds of different things, including supporting patients with dementia and Alzheimer’s, and relieving loneliness and stress. There are research papers available if you’re into that kind of thing, but regardless of the medical benefits of having a ‘toaster’ you can stroke, it’s abundantly clear that no one interacting with this robot has switched off their empathy.

The Mars Rover

Responses to this tweet:

…made me laugh. They include things like ‘poor lil fella’, ‘saddest birthday ever’ and numerous links to this utterly heartbreaking XKCD comic. Go and read that comic and tell me it didn’t pull even one of your heartstrings.

Roomba

If you have a Roomba – the little robotic vacuum cleaner that bumbles round your house cleaning up after you – there’s an 80% chance that you have given it a name. Check out this interview with the CEO of iRobot:

In the beginning of Roomba, we all took turns answering the support line. Once, a woman called and explained that her robot had a defective motor. I said, “Send it back. We’ll send you a new one.” She said, “No, I’m not sending you Rosie.”

‘Inhumane’ tests on robots

Linked from that same piece is a fascinating exploration of bomb disposal robots:

At the Yuma Test Grounds in Arizona, the autonomous robot, 5 feet long and modeled on a stick-insect, strutted out for a live-fire test and worked beautifully, he says. Every time it found a mine, blew it up and lost a limb, it picked itself up and readjusted to move forward on its remaining legs, continuing to clear a path through the minefield.

Finally it was down to one leg. Still, it pulled itself forward. Tilden was ecstatic. The machine was working splendidly.

The human in command of the exercise, however — an Army colonel — blew a fuse.

The colonel ordered the test stopped.

Why? asked Tilden. What’s wrong?

The colonel just could not stand the pathos of watching the burned, scarred and crippled machine drag itself forward on its last leg.

This test, he charged, was inhumane.

Sex and empathy

I don’t need, surely, to point out that many humans love their sex dolls. Not just in the same way I love my Doxy Massager, but in ways that confer intelligence and consciousness – ways that empathise. Not everyone does, of course, and that’s unsurprising: for the same reason you’re unlikely to name your vacuum cleaner if it isn’t a Roomba, even responsive sex dolls aren’t particularly good at faking personality just yet. I’m confident that as technology gets better, and sex toys (or sex robots) get closer to mimicking human behaviour, we’ll see more people falling in love with their robots. Or their software, like in the film Her.

One of the things I find fascinating about Dr Richardson’s opinion on sex robots is that it’s founded on the totally false claim that robots will encourage us to have sex without empathy, while completely ignoring the part that empathy plays in the robotic interactions we have at the moment.

Robots are already serving coffee, hoovering floors, minesweeping, organising our schedules, talking to us, and so on, and not only do we often treat them as we would our human colleagues, we seem utterly incapable of not doing so. In fact, the only real way to avoid falling into this empathy trap is to understand the inner workings of the robots: we’re less likely to attribute personality to something we know to be a ‘toaster’ and more likely to give it personality if all we see are the external effects. This gets to the heart of why I find Dr Richardson’s sex work comparison so terrifying: she objectifies porn performers and sex workers so utterly that she fails to allow them any subjectivity whatsoever. That is how we erase empathy: when we cease to see people and only see ‘things.’

Could you love a sex robot?

Awesome image by Stuart F Taylor

One thing is clear from what we already know of robots: the more closely they mimic human beings, the better we empathise with them. We simply can’t help it. In fact, some robots are so good at mimicking humans they even run into the same problems as humans: Cortana (Microsoft’s version of the automated assistant Siri) encounters frequent sexual harassment. This does not occur because humans don’t see her as a person, but because they do. That they are sexist twats is annoying, of course, but the interaction happens at all because – no matter how basic the robot – if it does things which are ‘a bit human’, then far from switching off our empathy, we cannot hold it back.

Can a human love a robot? Yes, of course – we already do. We get emotionally attached to our Roombas, we mourn bomb-disposal robots, we watch films like Ex Machina or Her and weep when robots get sad, or rage at them when they betray those we thought they loved.

Robots will not encourage us to switch off our empathy. It’s a ridiculous notion, and like most wrong answers it’s so much less interesting than the truth: robots, far from making us colder, are teaching us new things about human warmth.

Could you love a robot? Of course. And I suspect in your lifetime you will – whether a sex robot, a household helper, or a therapeutic friend. Because you are human, and you cannot help but love.

27 Comments

  • callalillity says:

    That comic almost made me cry! Also, there is so much evidence for humans anthropomorphising all kinds of inanimate objects and bonding with them – Kathleen’s argument is very silly indeed.

  • RichardP says:

    I think the issue of whether humans can love inanimate objects can be answered pretty handily by anyone who was ever given a stuffed toy as a child.
    As to insinuating that having sex with a robot would reduce empathy for real people, my attitude is that anyone who feels this doesn’t have enough faith in the human race. And that’s coming from a cynical sod who assumes most people are bastards.

  • Dave says:

    The real question is can a robot love you?

    • Girl on the net says:

      That is definitely an interesting question, but I think we’re a hell of a long way off it. Someone on Twitter pointed out that one of the even more fascinating near-future issues with AI is how you deal with e.g. giving robotic helpers to vulnerable people, given how strongly we form attachments to the robots. Is it unfair to essentially make people form attachments to things which will one day break and require replacement?

      • El Stevo says:

        That’s no different to pets. :(

        • Girl on the net says:

          But pets are conscious though – when we attribute consciousness to pets and therefore make decisions (i.e. to care for them at the expense of ourselves, etc etc) we are correct to assume that they’ll suffer if we don’t. But robots won’t suffer if we don’t, and so empathising with them can potentially be harmful to us if we’re making other sacrifices to do so. I’ll see if I can dig out the link later – I’m a bit swamped right now and have only skimmed it, but it’s an interesting read.

          • Orathaic says:

            What is the difference?

            We can ascribe consciousness to robots, and if they fulfil enough of the qualities of it, then the fact that they can be reprogrammed doesn’t make their attributes any less real.

            Like, assume you could turn off the ‘love/affection’ a robot ‘feels’ – well, you can do that with a human too (probably requires killing the human, but we may be getting better at modifying brains) – so the ability to take the ‘feelings’ away doesn’t make them less real.

            What qualities do they need to ‘count’? A pet can die, a robot can be repaired… The issue I can see comes later, when our robot slaves become so human that we don’t want to use them as cheap labour anymore…

          • Girl on the net says:

            When we attribute consciousness to pets and therefore make decisions (i.e. to care for them at the expense of ourselves, etc etc) we are correct to assume that they’ll suffer if we don’t. But robots won’t suffer if we don’t, and so empathising with them can potentially be harmful to us if we’re making other sacrifices to do so.

            Appreciate that’s not a great summary – I’m exhausted and super busy atm. But this paper is the one I was talking about (thanks to @c_halestorm on twitter for sending it my way). Check out section on Costs and benefits of mis-identification with AI http://www.cs.bath.ac.uk/~jjb/ftp/Bryson-Slaves-Book09.pdf

          • Orathaic says:

            And what if the robots can suffer?

            I tend to ignore the philosophical and simply look at the behavioural, so if the behaviour is indistinguishable from suffering, it is.

            And I think we can get to that point of technology – maybe not by programming robots to suffer (which seems cruel) but by programming them to learn, and to develop relationships (which may be more effective than the set/generic responses Siri and Cortana have at present; sure, they learn from everyone who interacts with them, but they also respond to everyone generically, as if we were one homogeneous mass – we will see more and better personalisation if it sells better).

            If we program them to learn and to want fulfilling social interactions/emotional relationships, then being abandoned will, I believe, for all intents and purposes, cause them to suffer.

          • Girl on the net says:

            Did you read any of the thing I pointed you to?

          • Orathaic says:

            Thanks for the link to Bryson.

            One of her axioms is “Robots can be Servants without being people”

            I believe this is idealist and leads to damaging conclusions.

            A) it is very hard to tell whether a robot has become a person or not (especially if we program them to learn)

            B) the robots we have (like Siri/Cortana) are being programmed to be as effective as possible

            C) having a personal assistant who is a person is more effective than one who is a servant.

            Therefore I think we are currently striving towards creating people. And it will become hard to tell whether/when we have succeeded.

          • Girl on the net says:

            OK. I think we might be talking a bit at cross-purposes then. I get what you’re saying, I just don’t think that’s the main issue right now. It may well be eventually, but we’d need much more advanced AI before that stuff became a direct problem. I think that when we get to a point where there’s genuinely a question of robotic consciousness, then your a/b/c becomes an issue. But while robots are functional tools those questions seem nonsensical – we wouldn’t ask the same of a laptop, for instance, that it be designed more like a person so as to be more effective. And until we’re at that point, living as if we’re already there seems a waste.

          • Orathaic says:

            Sorry, I replied first, and then read it.

            *idealist, in terms of living in an ideal world.

          • Orathaic says:

            Yeah, I agree it isn’t much of a problem yet; have you seen the film – em, let me think – Robot and Frank?

            Or the French company which is making robots to be social companions around the house (not able to do any housework, but to keep people company). These are not tools in the way laptops are: when we’re designing things to fulfil social needs (whether people have sex with them or not), the C) part becomes more true than it is for, say, a Roomba.

            Japanese people who currently love fictional characters – fictional pop stars who exist as holographic singers – and can buy life-size stuffed toys are showing that there is demand for relationships* – though you could see falling birth rates and lack of interest in human relationships as a problem in Japan, and not an indication of where we are all headed.

            Very interesting topic!

            *e-relationships? Virtual relationships? What would we call them?

  • Vida says:

    Have you seen those male sex dolls? Fuck yes, I could love a sex robot, and I would order one that looked like Jason Momoa. My very own Jason Momoa sex robot. The only issue would be where to keep him, as I have children, and I don’t think they’d accept him as their dad.

    For a sex robot story that’ll make you cry, check out Sommer Marsden’s contribution to Geek Love. V moving. Here’s a link, but they make you log in to look, as it’s got sex in it, which is a bit irritating, and will probably put you off doing so, which is sad, because it’s a beautiful anthology full of great stories and art. http://www.drivethrufiction.com/product/112892/Geek-Love%3A-An-Anthology-of-Full-Frontal-Nerdity

  • Paratethys says:

    Isaac Asimov.
    Earth Is Room Enough
    p104 – Satisfaction Guaranteed

    Either you’ve read this story already and loved it, or you have yet to read something that, going by this article alone, was basically invented for, specifically, just you and right now.

  • Azkyroth says:

    Yet another example: apparently soldiers are prone to grieving destroyed battlefield robots, even ones that look nothing like humans.

  • H.H. says:

    Thanks for this really interesting article. I tried to listen to the podcast, but it wouldn’t download. My ignorance, probably. I wonder: how did Dr Richardson find you? Perhaps she enjoys porn (and your reviews of toys – read, sex robots) more than she’s willing to admit?

    • Girl on the net says:

      Ah, it was the Guardian podcast people who organised it. I didn’t know she’d be there because I was a last-minute guest. But yeah, she didn’t seek me out or anything – I imagine she’d be horrified by my blog!

  • SpaceCaptainSmith says:

    I recall reading somewhere about robots actually being used to *teach* empathy and interpersonal relations to autistic children. Which seems kind of ironic, but it’s true: some people find robots easier to empathise with than other humans.

    • Orathaic says:

      Really helpful for some autistic kids who find it difficult to read emotion on faces.

      The robots are able to perfectly replicate the same facial expression on demand which makes it easier to spot the patterns…

  • HH – you can download it through iTunes.

    GOTN – I think you were brilliant on the podcast and really showed up the intellectual quality of Richardson’s arguments. Strangely, for someone who is supposed to have studied her subject, her views on the lives of sex workers show a complete misunderstanding.

    I’ve only known one sex-worker (as far as I’m aware), she was a friend a long time ago and is finally marrying one of her clients – a real Pretty Woman story. I got her to do a guest post for me which is on this link: http://wp.me/p4bMUU-13N It shows that this can be a chosen profession and that some sex-workers really enjoy what they do.

    Okay, we all know there are those who are coerced or trafficked into the profession and that must be resisted at every opportunity, but no one has the right to tell us how we can or can’t use our bodies as long as it causes no harm to others.

    Now, robots. I love my bear. He’s getting old now and has had one or two repairs, but I love him. A different way from how I love my husband, but it is still love and I’d jump to his rescue if anyone were so much as to laugh at him. His shiny nose has performed the occasional service, but he is more just something to cuddle. So, could I love a machine? Possibly if I didn’t have Peter.

    Back to you as a radio voice – take every opportunity you can to participate in programmes – you have a great voice and are so clear-thinking and capable of making yourself clear. Really impressed.

  • This is an interesting one.

    If we were living in a world where humanoid robots walked and lived amongst us and were indistinguishable from flesh-and-blood humans (e.g. the Replicants in Blade Runner or the Cylons – particularly, since I’m a bloke, the female Cylons in Battlestar Galactica), then would loving/having sex with them cause any issues? I suspect probably not. In fact, if such robots were indistinguishable from humans, would they not, themselves, be in some way human? Is the definition of a human being something that looks and acts in all ways like a human being?

    As to the current state of robotics, for me the answer is simply no. Not because I am against the idea, but simply because, for me at least, a sex robot is still a form of “sex toy”. Now, I may be missing out, but I’ve never used sex toys. Nothing against them and I know that those who use them derive great pleasure/satisfaction from them; but, just not my thing – each to their own and all that.

    For me, the main pleasure from sex comes from the pleasure I give my partner. I’m fairly sure a sex robot wouldn’t feel pleasure from anything I do. Any form of sexual gratification would be from the robot to me, which would make it an elaborate form of masturbation. I’m not knocking masturbation (unless knocking one out counts), I’m as much of a wanker as the next person, but while I’ve never used a fleshlight, I’m pretty sure I wouldn’t love one if I did and I suspect that, to me at least, a sex robot would just be some sort of sophisticated fleshlight.

    It would be interesting (well I think it would) to discover if there were some sort of gender difference towards the possible appreciation of such things. I may be wrong, and I realise I’m about to sound like a monumental twat, but I suspect (possibly) that women may enjoy the experience more so than men. I base that on nothing more scientific than the fact that women tend to use toys more than men do, I have no real basis for the supposition.

    So, in summary, give me a Rachael or Caprica 6 and I’m probably all over it, Give me a Wall-E or a Metal Mickey, and probably not so much…

    KW

  • Charlie says:

    The Lakeland Heated Airer has just slipped down my ‘want list’ in favour of the Roomba. Damn you. And just reading the account of the military robot here made me a bit weepy…

  • Andrew says:

    Fascinating article and one I am going to have to look more deeply into. This is the eternal question for us all: when does a thing become capable of being alive? The simulation of emotions is not being alive, but where do you draw the line between simulating and the simulation being the real thing? A deep discussion that probably can’t be answered.

    I can’t find the podcast :( No links to it here and the Guardian website doesn’t help me at all.

  • Chris says:

    I have a hand-made robot; it is the greatest thing ever and it has everything. A really nice cock too.
