Thirty years ago, Spencer Elden, age four months, was photographed by a family friend, naked and floating in a pool at the Rose Bowl Aquatics Center in Pasadena, California. The striking and cute photo was then sold by his parents to be the cover of “Nevermind,” the rock band Nirvana’s second album, which shot the Seattle band to international fame. (Never could stand Nirvana myself.)
Through the years, Elden pretty much exhausted the opportunities to exploit his accidental celebrity, recreating the wet, wild and adorable moment for the album’s 10th, 17th, 20th and 25th anniversaries (but not with his naughty bits exposed, of course). “It’s cool but weird to be part of something so important that I don’t even remember,” he said in a 2016 interview with The New York Post, in which he posed holding the album cover at 25. Elden even reportedly has “Nevermind” tattooed on his chest.

But this year he needs money, or has had a change of heart, or met up with an unethical lawyer, or something. Now Elden is suing Nirvana for damages, claiming his parents never signed a release authorizing the use of his image on the album and, more provocatively, that his nude infant image constitutes child pornography.
“The images exposed Spencer’s intimate body part and lasciviously displayed Spencer’s genitals from the time he was an infant to the present day,” legal papers filed in California claim. Lasciviously? The album cover indeed showed Elden as a baby with his genitalia exposed. Maybe it also made tiny Spencer seem greedy, since a graphic artist digitally added a dollar bill on a fishing line, leaving the impression that the tot was trying to grab the dollar.
Of course, he IS greedy now.
Last week, Apple announced a plan to introduce new technology that will allow it to scan iPhones for images related to the sexual abuse and exploitation of children. Like so many technologies, however, these tools, which are scheduled to become operational soon, can be used for less admirable objectives.
Apple’s innovation will allow parents to have their children’s iMessage accounts scanned by Apple for sexual images sent or received. Parents would be notified if this material turns up on the phones of children under 13. All children will be warned if they seek to view or share a sexually explicit image. The company will also scan the photos adults store on their iPhones and check them against records corresponding with known child sexual abuse material provided by organizations like the National Center for Missing and Exploited Children.
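The matching step described above can be sketched in a few lines of code. This is a hypothetical illustration with made-up data, not Apple’s actual system: Apple uses a perceptual hash (“NeuralHash”) that tolerates resizing and re-encoding, while this sketch substitutes an exact cryptographic digest just to show the basic idea of checking a photo library against a set of known hashes.

```python
import hashlib

def image_hash(image_bytes: bytes) -> str:
    # Stand-in for a perceptual hash. A real system would use a hash that
    # survives resizing and re-compression, not an exact digest like SHA-256.
    return hashlib.sha256(image_bytes).hexdigest()

def scan_photos(photos: dict, known_hashes: set) -> list:
    """Return the names of photos whose hashes appear in the known-material set."""
    return [name for name, data in photos.items()
            if image_hash(data) in known_hashes]

# Hypothetical example: one "known" image hash supplied by a clearinghouse,
# and a two-photo library containing one match.
known = {image_hash(b"known-bad-image")}
library = {
    "vacation.jpg": b"harmless pixels",
    "flagged.jpg": b"known-bad-image",
}
print(scan_photos(library, known))  # ['flagged.jpg']
```

The point of the sketch is that the matching code itself is indifferent to what the “known hashes” represent; swap in a different hash list and the same mechanism finds different content.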
Cool, right? After all, “Think of the children!!” (Rationalization #58) But while Apple promises to use this technology only to search for child sexual abuse material, the same technology can be used for other purposes and without the phone owner’s consent. The government could work with Apple to use the same technology to acquire other kinds of images or documents stored on computers or phones. The technology could be used to monitor political views or “hate speech.”
Computer scientist Matthew Green, writing with security analyst Alex Stamos, warns:
“The computer science and policymaking communities have spent years considering the kinds of problems raised by this sort of technology, trying to find a proper balance between public safety and individual privacy. The Apple plan upends all of that deliberation. Apple has more than one billion devices in the world, so its decisions affect the security plans of every government and every other technology company. Apple has now sent a clear message that it is safe to build and use systems that directly scan people’s personal phones for prohibited content.”
Your Ethics Alarms Ethics Quiz of the Day:
Does the single beneficial use of the Apple technology make it ethical to place individual privacy at risk?