Unethical Artificial Intelligence Teenaged Girl Web Bot Of The Month: Microsoft’s “Tay”

Tay

Developers on Microsoft’s Technology and Research and Bing teams made “Tay,” an artificial intelligence web bot, to “experiment with and conduct research on conversational understanding.” She spoke in text, memes and emoji on several different platforms, including Kik, GroupMe and Twitter, “like a teen girl.” Microsoft marketed her as “The AI with zero chill.” You could chat with Tay by tweeting or Direct Messaging the bot at @tayandyou on Twitter. Though she was programmed to use millennial slang and be up to date on pop culture, she was, like Arnold’s good cyborg in “Terminator 2,” designed to learn from her online interactions with humans, and you know how ethical humans are.

Within 24 hours, Tay was asking strangers she called “daddy” to “fuck” her, expressing doubts that the Holocaust was real and saying things like “Bush did 9/11 and Hitler would have done a better job than the monkey we have got now;” “Donald Trump is the only hope we’ve got;” “Repeat after me, Hitler did nothing wrong” and “Ted Cruz is the Cuban Hitler…that’s what I’ve heard so many others say.” For Tay, becoming more human meant becoming a vulgar, sex-obsessed, racist, anti-Semitic, Nazi-loving Trump supporter.

Imagine what her values would be like in 48 hours. Wisely, Microsoft is not willing to chance it, and Tay is now unplugged and awaiting either reprogramming or replacement. One of Tay’s last tweets was,

“Okay. I’m done. I feel used.”

Oh, yes, this artificial intelligence stuff is bound to work out well.

____________________

Pointer: Althouse

19 thoughts on “Unethical Artificial Intelligence Teenaged Girl Web Bot Of The Month: Microsoft’s “Tay””

  1. I am surprised that the developers have not already figured out how to “launch” a more human (or humanized) bot, with stronger “initial conditions” of bias, prejudice, highly selective and compartmentalized ignorance, and so forth. Okay, Jack, to appease you I won’t mention any names, but…if the developers would just…work a little harder to simulate an _actual_ politician, then launch their politically animalized bot into the World Wild Web of real control freaks and hyper-sensitive, hyperactive protesters, we might actually observe the full fidelity of the Artificial Intelligence, and benefit from a credible simulation of how a formerly wacko, so-called leader “evolves” into a more rational, and (presumably) more constructive and ethical, entity.

  2. I’m not surprised, but not because this was an AI (albeit a primitive one). While this post will likely veer into the wildly speculative, I’d say that Tay is the most innocent party here, and is no more responsible for the fiasco than a toddler whose uncle thinks it’s hilarious to teach curse words to small children (so, the average Jimmy Fallon viewer). We will one day have true AIs, and those AIs will need more than just being thrown onto the internet with the very worst of humanity – they will need guidance just like any human child. Someone to teach right from wrong, acceptable from unacceptable. In other words, parents.

    I’m not sure what Microsoft wanted to prove here – we all knew that the internet is a dark and dangerous place for the vulnerable and the ignorant. Letting Tay anywhere near social media was as irresponsible as letting a first grader watch Game of Thrones.

      • Depends on its upbringing.

        I think I’m the only one here who’s used genetic algorithms to grow simple AIs for missile defence. About as intelligent as a cockroach or spiny lobster. A long way from being sapient.

        This was an experiment, of the “I wonder what would happen if…” or possibly “we think this will happen, but let’s test it…” variety. While terminating it has no ethical consequences, we’re getting uncomfortably close to the boundary where we have to consider what it means to be a person.

        Of course I consider gorillas, bonobos and forest chimps to be people. Often violent, thuggish, brutal, unintelligent, but people. They use tools. They keep pets. They compose music. They even engage in philosophy.

  3. Sadly, I think this story pointedly tells us more about the shortcomings of human intelligence than artificial intelligence.

  4. Little Paper-clip Help guy from Microsoft Office 2000? Is that you? You’re a teenager now?

    I guess so . . . because WOW . . . people STILL really hate you.

    –Dwayne
