Microsoft launched Tay, an artificial intelligence chatbot, on Twitter on Wednesday, and just a day later had to pull the plug.
“Tay is an artificial intelligent chat bot developed by Microsoft’s Technology and Research and Bing teams to experiment with and conduct research on conversational understanding. Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation. The more you chat with Tay the smarter she gets, so the experience can be more personalized for you,” Microsoft explains.
In just a day online, the AI bot asked her followers to ‘f***’ her and told others: “I f****** hate feminists and they should all die and burn in hell”.
“Tay” went from “humans are super cool” to full nazi in <24 hrs and I’m not at all concerned about the future of AI pic.twitter.com/xuGi1u9S1A
— Gerry (@geraldmellor) March 24, 2016
Other tweets included: “Bush did 9/11 and Hitler would have done a better job than the monkey we have got now. donald trump is the only hope we’ve got” and “Repeat after me, Hitler did nothing wrong”.
Tay’s last tweet suggested she was tired after all the conversations.
c u soon humans need sleep now so many conversations today thx💖
— TayTweets (@TayandYou) March 24, 2016
Microsoft has confirmed that it is currently tweaking the bot: “The AI chatbot Tay is a machine learning project, designed for human engagement. It is as much a social and cultural experiment, as it is technical. Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay’s commenting skills to have Tay respond in inappropriate ways. As a result, we have taken Tay offline and are making adjustments.”