Today seems to be “Ethics Questions That We Shouldn’t Have To Ask Day,” and Andrew Sullivan, over at the Daily Beast, phrases his entry this way:
“Is Sex With A Robot Adultery?”
Gee, I don’t know, Andrew: is sex with a toaster adultery? What has Sullivan asking such nonsense is a new book called Robot Ethics, which has some legitimate issues to explore, along with some phony controversies included to get publicity and interviews. The field of robot ethics still includes little that hasn’t been thoroughly explored by Robert Heinlein and Isaac Asimov, and on “Star Trek: The Next Generation,” but as a few of these dilemmas are likely to cross from science fiction into reality in the foreseeable future, it is reasonable to dust off the issues again, as long as we don’t get silly about it. Getting overly excited for the Boston Globe, however, Josh Rothman writes:
“Already, fascinating moral questions are emerging. If a robot malfunctions and harms someone, who is responsible — the robot’s owner, its manufacturer, or the robot itself? Under what circumstances can robots be put in positions of authority, with human beings required to obey them? Is it ethically wrong for robots to prey upon our emotional sensitivities — should they be required to remind us, explicitly or implicitly, that they are only machines? How safe do robots need to be before they’re deployed in society at large? Should cyborgs — human beings with robot parts — have a special legal status if their parts malfunction and hurt someone? If a police robot uses its sensors to perform a surveillance operation, does that constitute a search? (And can the robot decide if there is probable cause?) Some of these questions are speculative; others are uncomfortably concrete.”
Yes, and some are stupid. “The robot itself” is not going to be held “responsible” under any legal system in existence or likely to come into existence. Machines don’t have ethics, and laws don’t punish machines. Would it be ethically wrong for robots to deceive people? Yes, and those who built and programmed them, not the robots, would be accountable. Cyborgs are people, and our laws can handle that hypothetical now: if my artificial leg goes flying off and kills someone, it is either my fault for not attaching it properly or the manufacturer’s for building it carelessly. Rothman seems to be thinking of a robot arm that runs amok, like Dr. Strangelove’s. Whether a malfunctioning arm is a robot or merely mechanical, responsibility will be determined the same way. And no, a robot cannot decide if there is probable cause.
Rothman is especially intrigued by this selection from the book:
“Upmarket sex dolls were introduced to the Korean public at the Sexpo exposition in Seoul in August 2005, and were immediately seen as a possible antidote to Korea’s Special Law on Prostitution that had been placed on the statute books the previous year. Before long, hotels in Korea were hiring out ‘doll experience rooms’ for around 25,000 won per hour ($25)…. This initiative quickly became so successful at plugging the gap created by the anti-prostitution law that, before long, establishments were opening up that were dedicated solely to the use of sex dolls… These hotels assumed, quite reasonably, that there was no question of them running foul of the law, since their dolls were not human. But the Korean police were not so sure. The news website Chosun.com… reported, in October 2006, that the police in Gyeonggi Province were ‘looking into whether these businesses violate the law . . . Since the sex acts are occurring with a doll and not a human being, it is unclear whether the Special Law on Prostitution applies.’”
Well, the fact that the Korean police are confused does not an ethical dilemma make. Robot sex is not adultery or prostitution: it is elaborate masturbation. There is only one human being involved, and if this can be called adultery, so can a spouse’s devotion to a car, the golf course, or the demon rum. I will concede that if the day comes when robots are so lifelike that humans don’t know when they are dealing with them, as in “Blade Runner,” the question of whether a man is guilty of adultery when he thinks he is having sex with a woman who is really a robot that looks like Daryl Hannah might be worth an ethics quiz (Pssst…he’d be guilty of attempted adultery). Before that highly doubtful occurrence, however, there are too many real and important ethical problems that need solving to spend much time on this.