Microsoft’s newest chatbot, Tay, went live less than 24 hours ago. Billed as a new attempt at a chatbot that learns from how people interact with it on Twitter, it quickly descended into anarchy as the bot began replying to users with antisemitic, racist, and Holocaust-denial tweets, which clearly became an embarrassment for the company, as reported by Business Insider (images below are from the article: http://uk.businessinsider.com/microsoft-dele…from-ai-chatbot-tay-2016-3).

You can’t say it didn’t learn it from us, and I for one welcome our new insane chatbot overlords.
