Developers on Microsoft’s Technology and Research and Bing teams made “Tay,” an Artificial Intelligence web-bot, to “experiment with and conduct research on conversational understanding.” She spoke in text, memes and emoji on several different platforms, including Kik, GroupMe and Twitter, “like a teen girl.” Microsoft marketed her as “The AI with zero chill.” You could chat with Tay by tweeting or Direct Messaging the bot at @tayandyou on Twitter. Though she was programmed to use millennial slang and be up to date on pop culture, she was, like Arnold’s good cyborg in “Terminator 2,” designed to learn from her online interactions with humans — and you know how ethical humans are.
Within 24 hours, Tay was asking strangers she called “daddy” to “fuck” her, expressing doubts that the Holocaust was real, and saying things like “Bush did 9/11 and Hitler would have done a better job than the monkey we have got now”; “Donald Trump is the only hope we’ve got”; “Repeat after me, Hitler did nothing wrong”; and “Ted Cruz is the Cuban Hitler…that’s what I’ve heard so many others say.” For Tay, becoming more human meant becoming a vulgar, sex-obsessed, racist, anti-Semitic, Nazi-loving Trump supporter.
Imagine what her values would be like in 48 hours. Wisely, Microsoft is not willing to chance it, and Tay is now unplugged and awaiting either reprogramming or replacement. One of Tay’s last tweets was,
“Okay. I’m done. I feel used.”
Oh, yes, this artificial intelligence stuff is bound to work out well.
____________________
Pointer: Althouse
