Fixing This Problem Requires Leaping Onto a Slippery Slope: Should We?

Nicholas Kristof has sounded the alarm on the growing problem of artificial intelligence deepfakes online. I must admit, I was unaware of the extent of the phenomenon, which is atrocious. He writes in part,

[D]eepfake nude videos and photos … humiliate celebrities and unknown children alike. One recent study found that 98 percent of deepfake videos online were pornographic and that 99 percent of those targeted were women or girls…. Companies make money by selling advertising and premium subscriptions for websites hosting fake sex videos of famous female actresses, singers, influencers, princesses and politicians. Google directs traffic to these graphic videos, and victims have little recourse.

Sometimes the victims are underage girls…. While there have always been doctored images, artificial intelligence makes the process much easier. With just a single good image of a person’s face, it is now possible in just half an hour to make a 60-second sex video of that person. Those videos can then be posted on general pornographic websites for anyone to see, or on specialized sites for deepfakes.

The videos there are graphic and sometimes sadistic, depicting women tied up as they are raped or urinated on, for example. One site offers categories including “rape” (472 items), “crying” (655) and “degradation” (822)…. In addition, there are the “nudify” or “undressing” websites and apps… “Undress on a click!” one urges. These overwhelmingly target women and girls; some are not even capable of generating a naked male. A British study of child sexual images produced by artificial intelligence reported that 99.6 percent were of girls, most commonly between 7 and 13 years old.

Yikes. These images don’t qualify as child porn, because the laws against that are based on the actual abuse of the children in the photos. With the deepfakes, no children have been physically harmed. Right now, there are no laws directed at what Kristof is describing. He also links to two websites on the topic started by young women victimized by altered photos and deepfaked videos of them being spread online: My Image My Choice, and AI Heeelp!

It shouldn’t need to be said that this conduct is horribly unethical. The Ethics Alarms mantra is that when ethics fails, the law steps in and usually makes a mess: this situation is a typical looming trap. Kristof blames the lack of legal remedies on “a blasé attitude toward the humiliation of victims,” citing a survey that found that 74% of deepfake pornography users didn’t feel guilty about watching the videos. That’s because the consumers weren’t doing the victimizing, of course: most people have difficulty feeling accountable for harms that are one or more steps removed from them. Progressives, on the other hand, revel in that kind of expanded accountability; thus if you eat at Chick-fil-A, you are really participating in persecuting gays.

Kristof wants Big Tech companies like Google to be held civilly liable for allowing links to sites selling and showcasing deepfake porn on their platforms. “We have a hard-fought consensus established today that unwanted kissing, groping and demeaning comments are unacceptable, so how is this other form of violation given a pass?” the columnist writes. “How can we care so little about protecting women and girls from online degradation?”

I care about it, but I also know the environment we are currently in regarding online content. Once Big Tech platforms are banned from circulating cruel manufactured videos and photos, laws allowing them to similarly ban cruel words, controversial opinions and “misinformation” will be much closer to fruition and acceptance. I don’t trust our lazy, ethics-challenged legislators to write laws narrowly constructed to prevent deepfake porn without also hauling in legitimate free expression, like tuna nets capturing dolphins. I also know that many on the Left who want to criminalize “hate speech” (as they define it, of course) would be thrilled to see government control over online content vastly expanded.

I don’t know what the solution to this problem is. I do agree that it needs addressing. Before generating sweeping laws in a “Think of the children!” panic, we should start with 1) making parents aware of the problem; 2) persuading our young not to be so eager to post photos of themselves online and on social media; and 3) doing a better job of making sure that members of our society understand and practice the Golden Rule.

No, I know it’s not enough; this is an area where ethics alone won’t do the job, and new laws are inevitable. We shouldn’t kid ourselves, however. This is a slippery slope to wider censorship.

16 thoughts on “Fixing This Problem Requires Leaping Onto a Slippery Slope: Should We?”

  1. Creating a deepfake using the face of a child or adult does cause harm to the victim. Maybe Letitia James might start working to find a statute that would allow her to prosecute.

    • Can you copyright or trademark your own image, which would give victims the right to sue creators and purveyors of these fakes?

      • Crispin Glover established the first case law holding that a likeness can’t be used without permission, after he was replaced in the later Back to the Future sequels.

  2. Isaac Asimov postulated the Three Laws of Robotics, the first of which states: “A robot may not injure a human being or, through inaction, allow a human being to come to harm.” This should be the basis of our approach to AI.

    • Indeed.
      Of course, the reason Asimov created the three laws was so that he could write a whole lot of short stories around the logical conflicts that resulted from their existence.

      • True, and he ended up with a slew of short stories and six or ten novels from the concept. His Three Laws of Robotics are pretty much universally recognized in science fiction fandom.

        • The problem I have with the Three Laws of Robotics is that they depend upon the robot understanding what is “human”, what is “harm”, and what is “another robot.” AI as it currently exists is really no more than processed Big Data. The algorithms fundamentally have a reward metric and a penalty metric that help them “learn” what is… whatever the programmer wants them to “learn.” There is no deeper understanding. In fact, there is no understanding whatsoever: it is really a bank of numerical weights stored on a hard drive, to which user inputs are applied, the weights analyzed, perhaps some pseudo-randomness applied, and an output generated that falls within the bounds of all those weights. No cognition. No cogitating. Just input, number-crunching, output.

          This means that a robot could harm any human that doesn’t meet the “human” metric. Perhaps it is an awkward position in bad light with other random pieces of background that throw the calculation outside the bounds of “human”. Perhaps it is a series of weights that just happened to accurately classify every human on earth as “human”, except for Elijah Baley. It also means a robot could harm any human with an action that somehow fails to be classified as “harm”. Maybe the programmer spent so much time programming in all kinds of microaggressive triggers as harm that he overlooked that backing a car over someone is actually harm. Or maybe the learning algorithm finds so much child porn online that its weights are adjusted so that raping a child ends up classified as “not harm.” The toy sketch below illustrates how mechanical, and how brittle, that kind of classification really is.
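
          To make this concrete, here is a minimal, purely illustrative sketch (in Python) of that “bank of numerical weights” idea. Every number, feature, and the perceptron-style update rule below is invented for illustration; no real vision system is this simple, but the failure mode is the same in kind:

            # A classifier really is just stored numbers plus arithmetic; there is
            # no understanding anywhere in it. All values below are made up.

            def classify(features, weights, bias):
                # Weighted sum of the inputs; "human" is whatever lands above zero.
                score = sum(w * x for w, x in zip(weights, features)) + bias
                return "human" if score > 0 else "not human"

            def update(weights, bias, features, label, lr=0.1):
                # Perceptron-style reward/penalty step: nudge the weights toward
                # whatever the training signal says, right or wrong.
                score = sum(w * x for w, x in zip(weights, features)) + bias
                error = label - (1 if score > 0 else 0)  # +1 reward, -1 penalty
                return ([w + lr * error * x for w, x in zip(weights, features)],
                        bias + lr * error)

            weights, bias = [0.9, 0.7, -0.4], -0.5  # arbitrary "learned" weights

            well_lit = [1.0, 0.9, 0.1]  # typical pose, good lighting
            awkward = [0.3, 0.2, 0.9]   # odd angle, bad light, cluttered scene

            print(classify(well_lit, weights, bias))  # -> human
            print(classify(awkward, weights, bias))   # -> not human: same person
            # One "penalty" update after the miss nudges the weights slightly:
            weights, bias = update(weights, bias, awkward, label=1)

          The second input is the same person; the arithmetic simply lands on the other side of the threshold, which is all “misclassified as not human” means here.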

          Part of the context around the use of AI to generate pornographic images is that humans have been making pornography since practically the beginning. I’ve heard discussions that cave art might have been primitive nursery decoration, meant to soothe little babies terrified by the cruel, brutal world around them. But I’ve also read that cave art in some instances could have been primitive pornography. I’ve not followed up on that to find out if it is true, but look where we’ve gone from there. Humans invent sculpture, and they immediately sculpt people in the nude. They invent painting, and then there are nude paintings. They invent photography, and now there are naked pictures everywhere. They invent video, and now there are movies of naked people. They invent the internet…

          The proliferation of pornographic material has accelerated as technology has advanced. Part of this is because technology has made pornography easier to create and easier to acquire. When you had to draw your own pornography, it was scarcely worth the effort. When a skilled painter could paint pornography, you had to pay a hefty sum and probably find someone willing to model for the painting. Pictures were much easier, because taking a picture just takes a few minutes… er, seconds… er, happens practically instantaneously. But you had to have access to someone taking pictures, and that would be relatively rare. Then advance to Hugh Hefner and Playboy, and professional-grade photographs (and all those stories!) could be marketed to an audience that didn’t have to be affluent at all to pick up a copy and hide it securely under a bedroom mattress. And then came the Internet, which brought all the porn you could desire right into your home, without anyone outside your room knowing what you were doing.

          Except that the pornography has never been enough. Websites proliferate, and for all the free sites that are out there, there are also hundreds of subscription or membership websites that make a killing. Why are people paying when there is so much free pornography? Because it is never enough. Clicking through image after image, watching video after video, never really satisfies, and so people (men especially, but a growing number of women) keep searching for more, more, more. Normal coitus fails to satisfy, so it needs to be exotic positions, or exotic toys, or BDSM, or choking, or violence, or increasingly disgusting acts, because every previous sexual act becomes boring. And so the pornography proliferates, but never fast enough to satisfy the demand.

          Now enter AI, which can number-crunch pornography at blinding speed. Here we have, perhaps still only in nascent form, the generation of pornography faster than every set of actors across the world combined could produce it. Sadly, it will still fail to satisfy. People will demand more and more and more customized pornography. I think they might even find themselves disappointed with AI, because AI porn will be derivative. However, I guess if it can generate any specific act between specific people, there will be regular interest in what it can deliver, just as the older forms of pornography continue to be gateway drugs into ever harder hardcore porn.

          I think the ultimate upshot of applying these AI deepfakes to celebrities and even the girl next door is going to be, eventually, disappointment. The AI generation will give you the face and the general body and a rendering of what the body underneath the clothes might look like, but it will eventually lead to frustration. Yes, that’s sexy, and yes, I got my rocks off a few times, but now I’m not sure I’m really seeing what that person looks like. And if it is all fake, why not look at the really attractive fakes?

          I don’t think there is a law that can be written to suppress this next generation of exploitation. The fundamental problem is that anyone can claim that the nudity or exploitive activities depicted aren’t really what that person looks like or does. You could cut out the picture of your love interest from the school yearbook and paste her face over a supermodel’s body and share that image, but it isn’t really what she looks like unclothed. You can spend a great deal of time photoshopping some debauched action, but it still isn’t what that person actually did. There’s a great deal of outrage you can feel about having your actual likeness spliced into some depraved activity, but at the end of the day, the rest of the body involved really isn’t you.

          Like posing nude for paintings or sculptures, like taking nude photos in an art studio, like taking nude selfies and other sexting techniques, this AI rendering is a new face of an old problem. Take care not to let your image be used. Don’t do things that can escape into the wild, which, sadly, includes letting people take your picture. Maybe that is where the law needs to step in: if someone takes a picture of someone else without their consent, they are liable for theft and potential defamation. That might escape the free speech issue, since we’re talking about an action taken against someone else, not someone’s right to express an opinion. However, that would be tricky, given how many parents take pictures of their kids…

  3. This is intentionally defaming a person by photographically lying about that person. How about another category of legal defamation that covers this kind of thing, so the people creating these fake AI-generated photos and videos can be sued?

    • It ties back to the same problem as many other AI problems, though. Is the person who put in the prompt culpable? Everyone who views the image? The hosting site? The search engine? The programmers? The AI itself? Just as one ideally has a clear victim for there to be a crime, there also kind of needs to be a clear perpetrator.

      • As I wrote above, the one that created the false image/video is the one intentionally defaming others; the creator IS the clear perpetrator of the defamation. The fact that the software can be used for unethical or immoral purposes is not the fault of the programmers; it’s the fault of the user/abuser. The software is a tool just like lots of other tools that can be abused by unethical and immoral human beings. We don’t sue or ban hammers or other tools because they can be used by immoral people for illegal things.

  4. How to get a teacher you don’t like fired under the “Naked Teacher Principle”: create a deepfake of her. If the deepfake is done well enough, no school board will be able to tell the difference between the real and the fake.

    • I don’t think it’s difficult to establish the connection between actual “OnlyFans”-style productions and the real person; there are finances and everything.

      The challenge is in dissuading something that is easy to do AND catastrophic in nature. In cases like that, punishments that would typically be considered overly harsh are acceptable. How about 50 years imprisonment for producers of defamatory deepfakes?

      • There is now also a “genre” of art that creates obviously fake nudes of celebrities; they are overly stylized, but without a doubt made without the consent of the subjects.

        But if entire and *generally* described “genres” of art are protected as “art” regardless of *particular* egregious and unethical content, we can’t go after that.

  5. You can always try to ban the distribution of pornography without the explicit consent of the people depicted. That has its own problems, but it is less likely to cause issues than the other suggestions. 
