Ethics Alarms has honored some unlikely people as Ethics Heroes—Bill Clinton, Bill Maher, and Terry McAuliffe, for example. Twitter nabbing the distinction may be a record for cognitive dissonance, though. It is an unethical company, with a platform that does more damage than good. And yet…
Twitter announced that it was expanding its private information policy to forbid posting the "media of private individuals without the permission of the person(s) depicted."
Not that this policy is remotely enforceable; it isn't. "When we are notified by individuals depicted, or by an authorized representative, that they did not consent to having their private image or video shared, we will remove it," Twitter explains. "This policy is not applicable to media featuring public figures or individuals when media and accompanying tweet text are shared in the public interest or add value to public discourse."
Yes, the policy goes well beyond any legal restrictions: that's what makes it ethical rather than merely compliant. What is ethically admirable about the rule is that it calls attention to an ethical violation so common that few think it is a violation at all. When I allow a friend to take a photo of me, that is consent for that friend to make and have a copy of my likeness. It is not consent for my likeness to be circulated to the world on social media, included in facial recognition databases, manipulated digitally to embarrass or humiliate me, or used for any other purpose. No law will help me claim that I did not consent to circulation of my likeness, which is why Naked Teachers have a problem. The law assumes that such use can and should be anticipated when I let myself be photographed. That, however, is a legal fiction. I have seen, online, photos of me taken when I wasn't aware that I was in the picture. I hate photos of me.
This is Labor Day, after all…
At some point it becomes irresponsible and cowardly to criticize all of the rhetoric regarding abortion without making a serious proposal of my own. I feel like I've reached that point.
Let’s start with what we have to work with.
I have not labored to put these in order of priority or importance, and many constitute “but on the other hand” reflexes upon considering the previous point. I’ll bold the items that seem particularly important as I post them. I am certain that I will miss some or many points that need to be considered as well.
Thirty years ago, Spencer Elden, age four months, was photographed by a family friend naked and floating in a pool at the Rose Bowl Aquatics Center in Pasadena, California. The striking and cute photo was then sold by his parents to be the cover of "Nevermind," Nirvana's second album, which shot the Seattle band to international fame. (Never could stand Nirvana myself.)
Through the years Elden pretty much exhausted the opportunities to exploit his accidental celebrity, recreating the wet, wild and adorable moment for the album's 10th, 17th, 20th and 25th anniversaries (but not with his naughty bits exposed, of course). "It's cool but weird to be part of something so important that I don't even remember," he said in an interview with The New York Post in 2016, in which he posed holding the album cover at 25. Elden even reportedly has "Nevermind" tattooed on his chest. But this year he needs money, or has had a change of heart, or met up with an unethical lawyer, or something. Now Elden is suing Nirvana for damages, claiming his parents never signed a release authorizing the use of his image on the album and, more provocatively, that his nude infant image constitutes child pornography.
"The images exposed Spencer's intimate body part and lasciviously displayed Spencer's genitals from the time he was an infant to the present day," legal papers filed in California claim. Lasciviously? The album cover indeed showed Elden as a baby with his genitalia exposed. Maybe it also made tiny Spencer seem greedy, since the graphic artist digitally added a dollar bill on a fishing line, leaving the impression that the tot was trying to grab the dollar.
Of course, he IS greedy now.
Last week, Apple announced a plan to introduce new technology that will allow it to scan iPhones for images related to the sexual abuse and exploitation of children. Like so many technologies, however, these tools, which are scheduled to become operational soon, can be used for less admirable objectives.
Apple's innovation will allow parents to have their children's iMessage accounts scanned by Apple for sexual images sent or received. Parents would be notified if this material turns up on the phones of children under 13. All children will be warned if they seek to view or share a sexually explicit image. The company will also scan the photos adults store on their iPhones and check them against records corresponding with known child sexual abuse material provided by organizations like the National Center for Missing and Exploited Children.
Cool, right? After all, "Think of the children!!" (Rationalization #58) But while Apple promises to use this technology only to search for child sexual abuse material, the same technology can be used for other purposes, and without the phone owner's consent. The government could work with Apple to use the same technology to acquire other kinds of images or documents stored on computers or phones. The technology could be used to monitor political views or "hate speech."
Computer scientist Matthew Green, writing with security analyst Alex Stamos, warns,
“The computer science and policymaking communities have spent years considering the kinds of problems raised by this sort of technology, trying to find a proper balance between public safety and individual privacy. The Apple plan upends all of that deliberation. Apple has more than one billion devices in the world, so its decisions affect the security plans of every government and every other technology company. Apple has now sent a clear message that it is safe to build and use systems that directly scan people’s personal phones for prohibited content.”
Your Ethics Alarms Ethics Quiz of the Day:
Does the single beneficial use of the Apple technology make it ethical to place individual privacy at risk?