Ethics Quiz: Your Swedish Post-Mortem Avatar

Swedish scientists believe artificial intelligence can be used to make “fully conscious copies” of dead people, so a Swedish funeral home is currently looking for volunteers willing to let the scientists use their dead relatives in their experiments. The scientists want to build robot replicas and try to approximate the deceased’s personalities and knowledge in the robots’ artificial “brains.”

For those of you who are fans of the Netflix series “Black Mirror,” there was an episode closely on point, in which a grieving woman bought an AI-installed mechanical clone of her dead boyfriend. (This did not work out too well.)

I was about to dismiss objections to such “progress” as based on ick rather than ethics, when I wondered about the issues we already discussed in the posts here about zombie actors in movies and advertising. Is it ethical for someone else to program a virtual clone of me after I’m dead that will be close enough in resemblance to blur what I did in my life with what Jack 2.0 does using an approximation of my abilities, memories and personality?

I think I’m forced to vote “Unethical” on this one as a matter of consistency. Heck, I’ve written that it’s unethical for movies and novels to intentionally misrepresent the character of historical figures to such an extent that future generations can’t extract the fiction from the fact. (Other examples are here and here.) Respect for an individual has to extend to their reputation and how they wanted to present themselves when they were alive. Absent express consent, individuals should not have to worry that greedy or needy relatives, loved ones, artists or entrepreneurs will allow something that looks like, sounds like and sort of thinks like them to show up and do tricks after the eulogy.

I am not quite so certain about this branch of the issue, however, and am willing to be convinced otherwise. After all, pseudo Jack could stay inside, and only be programmed to do a nude Macarena while wearing a bikini for my wife, while no one else would be the wiser. Or nauseous. And after all, I’m dead. Why should I care? Well, the fact is I do care. For me, this is a Golden Rule issue.

Your Ethics Alarms Ethics Quiz of the Day is this:

Will the Swedes who elect to allow scientists to try to perfect Dad-in-a-Box for nostalgia, amusement, companionship and to take out the garbage be unethical, betraying their departed loved ones’ dignity?


30 thoughts on “Ethics Quiz: Your Swedish Post-Mortem Avatar”

  1. Is it ethical for someone else to program a virtual clone of me after I’m dead that will be close enough in resemblance to blur what I did in my life with what Jack 2.0 does using an approximation of my abilities, memories and personality?

    You’re living in your own Duncan Idaho.

    Two cheeze-y pop culture references for the price of one!!

  2. Yes it is unethical.

    This falls into the same category as creating human clones. Using AI to try – and I repeat, try – to recreate a former human persona is undeniably unethical because:

    1. The representation is the perception of the creator, not of the actual person. No one understands the nuances of another’s thoughts and behaviors other than the person in question.
    2. If applied to public figures, some could revise actual thoughts to reflect political beliefs and creeds that support a given narrative that would not reflect the actual character of the individual. Imagine Lincoln being programmed to reflect the progressive agenda.
    3. If it is unethical to reprogram a living person’s mind to obtain a desired political objective, then the use of AI to recreate a persona to reflect a desired mindset is as well.
    4. If 3 is considered ethical, then what stops the act of reprogramming non-deceased persons?

    AI may be useful in developing answers to complex scenarios, but the operative word is artificial, so applying it to a human persona is an improper use of technology, just as it would be to use AI to determine guilt or innocence in a court of law. Even if an algorithm can estimate the likelihood of an action based on probabilities, such algorithms cannot accurately estimate behavior driven by emotion.

  3. The interesting part of the question is what if Jack 2.0 does approximate you closely enough that it could be counted on to voice and enact your wishes? If it could, how would it be unethical to allow 2.0 to continue to do that? Would it be unethical not to allow you to continue (short of specific directions) if the technology was available and within your means? (After all, it’s typically unethical to not allow people to continue voicing and enacting their wishes when that power is in your hands.)

    Bonus question: what would that mean for human/robot rights?

  4. My answer is that most of it is “Ick,” but that it’s also a dangerous slippery slope. Initially, I don’t think the Swedes who participate in this early experiment will be unethical. However, I don’t think it’s healthy for them to interact with such creations, if they truly are “lifelike.” I don’t think the deceased person’s dignity is violated simply by being recreated. Should that person’s likeness be used to make viral revenge videos postmortem, well, that’s probably unethical.

    Consider where the future is going at the moment. If we can do this with a deceased person, we can probably do this with a living person. We can probably take a blank slate android and do a special order “Deep Fakes” to your favorite actor/actress and even give them the personality of one or a blend of their film characters.

    Once something is created (and eventually it will be) there’s no control on how it is used. Everything gets hacked and re-engineered so that the layman can have total freedom to recreate in his/her own vision.

    Perhaps the upside is that instead of Norman Bates killing people in the motel, he simply orders a deep fakes android of his mommy and produces snuff films of his mommy android for the dark web.

    Ugh. I hate to even sound like a person phobic of the future and technological advances, but if we can’t even ethically use current technology responsibly, I think we’ve got to take an extreme time-out from this kind of new-age scientific exploration until we grow up a bit and set some cultural guidelines.

  5. I agree it is unethical, but I’m coming to that conclusion from the other side. *If* it is close enough to pass for the dead person, *and* good enough to actually mimic the person, who’s to say that the AI programmed robot is not its own entity, separate from the original human. Barring p-zombies, you have created a new being with the sole objective of fulfilling your need to have the original one around. This is violation of the “do not use other peoples as means to an end” principle.

  6. An ‘approximation’ would, indeed, be unethical, because it is an approximation. It will present fiction as fact, which the libs and high school students are already doing. But, consider this: what if it becomes possible to record everything that is in your head. At that point, if it is downloaded into a simulacrum, it is no longer an approximation, but is responding as you would, based on your personality and knowledge-base. Still unethical? Frankly, I don’t know.

  7. Second try…even logged in to WordPress before posting, and still my comment would not post on the first try…

    I do not see a difference between technology-enhanced re-creations of formerly living persons and mere re-enactors. Both are dependent on someone – even many someones – other than the actual person, to create some virtuality (that’s probably not even a word), some notion, of the actual person. Fidelity could be high to non-existent. If confirmation bias is a struggle for you in relation to your perception of a particular person, then that person’s re-creation could either feed that bias, or it could induce serious cognitive dissonance.

    So it’s definitely an observer-beware set-up…so, who gets to decide what is harmful, thus unethical?

    I also am not sold on the idea that prior advance consent is mandatory, or, that re-creation is unethical if done without it.

    Relevant to that, is a commercial I have seen on TV recently, showing “[George] Washington crossing the Delaware [Turnpike, I believe].” It’s an absurdist depiction: Washington in his boat, guys pushing it across a road while traffic is held up; people honking their horns, Washington irritated, bellowing as one with road rage: “WE ALL HAVE PLACES TO GO! WE ALL HAVE PLACES TO GO!” and finally muttering bitterly, “BIIIG man with a BIIIG horn!” I hate the commercial, frankly. Even though I have seen it certainly more than a half-dozen times, it has annoyed me so much, I still cannot recall what the ad is about – have not paid attention to whatever (as far as I can discern) the advertiser is interested in my noticing. That, for me, is “really saying something,” because even though absurdism is my favorite kind of humor, the absurdism of that commercial is absolutely useless for me, even off-putting. I feel quite assured that George Washington would not consent to a likeness of himself being portrayed like that commercial portrays him. But do I think the commercial is unethical because of how it portrays Washington? No. It’s just ineffectively absurd (to me).

    I sometimes wish there was a way to take a global time-out with technology exploitation. But that will never happen through voluntary human conduct. We humans are stuck with the way we are, ever were, and ever will be – in conflict with forces both within, and beyond, our control.

  8. For those who don’t know me well, my full name is Adam Zanshin. Some of you know that last year my parents died suddenly in a horrific car crash. What I haven’t told anybody here, and what is also not to be found on my Facebook timeline, is that there was a significant inheritance to divide between me and my two sisters. To be honest, it was a painful process. Especially the farm; who would continue it?

    It got so bad that at one point we decided to ask Saul, a licensed mediator, to help out. In the third session the following discussion transpired.

    Adam: Dad always wanted me to have the farm.

    Lilith: Is that so? Let’s ask him.

    The door of the meeting room swings open and a man walks in.

    Eva [never been the brightest]: Dad?!? … How’s that … I mean … Who are you?

    Adam: Come on, Eva, that’s not dad. That’s a replica of dad.
    Seriously, Lilith? Was this built by Replicas-R-Us?
    That thing can’t voice the will of dad! It wasn’t even built with any input from me!
    Did dad approve or even know about this?

    Adam looks for support from Saul: This is not good, man. That’s not my father; this is so utterly, completely soulless.
    By the way, didn’t you read the latest ruling from the Supreme Court? The case of Noah v. American Airlines? One can have replicas of deceased family members for comfort, but the airline is not obliged to give them their own seat; you have to check them in, preferably in a body bag, and pay the normal fee for extra luggage.

    Saul: And this is relevant because … ?

    Adam: They also wrote that replicas don’t have any standing in legal cases.

    Saul: Ooops. Better call Saul.

    He grabs the phone and dials a number. After some whispered conversation the TV screen on the wall turns on and we see the head of another Saul on the screen.

    Screen Saul: Hi y’all. I just heard from my personal replica that the SC has decided that replicas don’t have any legal standing. That means that I will recuse myself from this case.
    Lilith? Are you there in the room? Listen honey, sorry it didn’t work out. Don’t call me. And I won’t call you. Just pay the bill I will send you.

    And as the camera zooms out, we see the real [sic] Saul on a Caribbean beach, a pina colada in his hand, flanked by two….

    A last bleep and the screen turns black.

  9. My gut response is that this is creepy as hell, and the ick factor is off the charts. No way would I ever authorize participation in such an endeavor.

    This world is for the living, not for avatars of the deceased. I recall a line from the movie Cocktail: “Bury the dead, they stink up the place.”

    As for the ethics of it, well, I feel confident my parents would be horrified speechless if I could speak to them again and proposed such a plan. So I’m going to say that the Golden Rule applies, even if it’s to the perception of my parents’ wishes and not the confirmed reality. Perhaps it is my bias and not their actual desires I perceive, but alas, I can only go with the corporeal, not the dearly departed.

  10. Quick take:

    1) Any “replica” that falls short of being a completely perfect copy of the original in terms of attitudes, personality, decision making, and behavior is unethical, because even a slight deviation from what the original would do/say/think is a deception that others will immediately assume is accurate.

    2) Any “replica” that *perfectly* mimics the original, flawlessly… not sure.

    Of course, won’t the ethics of this ultimately boil down to consent?

    If the original consents, then fine as long as the replica adheres to the restrictions placed on it by the original.

    If the original doesn’t consent, then it’s unethical out the door.

    That being said, of course, what’s the statute of limitations? We always hijack the image of people long gone when depicting them in art/culture, and we’ll never get them *just right*.

    • Consider that a replica would never be subject to any consequences. They can’t ever mimic their real-life counterparts because they aren’t weighing the risk/reward of their actions.

      Example 1: We moderate our opinions and speak gingerly because we don’t want to ruin any future job prospects – e.g., in social media discussions, socially within our community, or at work.

      Example 2: We might not be fully truthful if we think we can benefit – like telling your wife that she doesn’t appear to have gained weight.

      A perfect replica would have to have fear of consequences…otherwise, it’s just a drunk version of you with no inhibitions.

      • That would be the ultimate test of whether integrity is a solid component of your personality…

        If your replica continues to “behave” because it’s the right thing to do, rather than behave because there might be consequences.

        • It’s not that simple, though. “The right thing to do” could have the same “integrity” if the situation duplicated (or at least approximated, depending on its technology) something the living person had actually experienced or knew of, but the consequences of even the identical situations – legal, physical, emotional, social, etc. – can and will change and reinvent themselves, usually for the worse.

  11. Let’s not confound A.I.’s with recorded personalities. A.I.’s are a whole different topic and open up a serious ethical can of worms. And an ‘approximation’ is clearly unethical, since the responses of the simulacrum will be determined by the programmer, not by the deceased involved. However, if I record who you are and put THAT into the simulacrum, unethical or no?

  12. I am surprised no one has brought up Ray Bradbury’s “Marionettes, Inc.” so far. (Except maybe Zanshin’s comment above, in which case, slow clap!)

  13. Most of the things I would say have already been said here, so I’ll take this in a different direction.

    The desire to recreate a deceased friend or relative is based on the motivation of envy/dedication, which I define as the desire for a specific experience (imposing particular requirements and limitations on the experience you want).

    In a more general sense, as dragin-dragon points out, creating consciousness in such a way that we can choose its character traits raises questions about what sort of requirements and limitations we can ethically impose on it. The idea of making them imitate preexisting people is but one of many. If we are to tailor an artificial consciousness to any particular need, I posit that that could be ethical depending on the circumstances, but that it would be unethical, unwise, and ultimately impossible to attempt to prevent it from evolving as a person.

    People change (by definition, I’d argue), and a person who attempts to recreate a relative in a true AI may be disappointed that the AI changes as a person (even if that’s exactly what would have happened with the original person). Alternatively, they may very well change themselves and eventually grow tired of an AI that manages to stay the same.

    Taking it back to the general case, what of an AI which is not necessarily patterned after anyone in particular, but designed to be a near-ideal companion for a specific person? I see two ways this could go. If the person is mature and healthy, they don’t really need a “perfect” companion, and they are able to grow as a person and interact with other people who likewise grow and change, including the AI. No real problem there. If the person is not so well-adjusted, getting exactly what they want, regardless of whether it’s an imitation, an original work, a true AI, or an automaton, may trap them in their own addictive thoughts.

    Such addiction has existed ever since it became possible for humans to occasionally, if they’re lucky, get what they want without limits–even for a short time. Technology merely makes instant gratification that much easier. This is one of the four fundamental threats I’ve referred to in the past: the Apocalypse of Age, the threat from within ourselves. By becoming addicted to the eight basic desires, people may lose their values and stagnate or degenerate, imposing too many limitations or tearing them all down. That’s the threat that Tim LeVier and luckyesteeyoreman are referring to, and that’s something I’m trying to counteract by constructing a healthy meta-culture: a conceptual framework on which to base cultures. (It’s more or less finished, so the next step is presenting it compellingly.)

  14. What I don’t understand here is why this is being applied in the context of dead people, as if that’s some sort of requirement…?

    Certainly I should be able, as a still-living person, not only to consent to but to assist in the creation of a replicant/cylon/hologram/whatever copy of myself.

    Wouldn’t my assistance be crucial to the quality of the end-result?

    It’s this focus on making copies of the dead that raises my antennae to something smelling fishy here.


  15. Anyone who created an AI out of MY 78-year-old body unless it was for a scientific study of an obstreperous, curmudgeonly, grouchy old adjectival spoil-sport with slight cognitive impairment would deserve what they got. Heeheeheehohohohahunhhmmmm.
