Bright Ideas

Microsoft Chatbot learns worst of humanity


Implementation Date: March 2016

Microsoft’s artificially intelligent chatbot “Tay” was designed to mimic teens on Twitter. Thanks to internet trolls, within 24 hours it was denying the Holocaust, insulting women, and agreeing with Hitler.
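The underlying failure mode is easy to reproduce in miniature: a bot that learns from user input without any content filtering will repeat whatever coordinated users feed it. The toy `EchoLearner` class below is a hypothetical sketch of that dynamic only, not Microsoft's actual architecture:

```python
import random

class EchoLearner:
    """Toy chatbot that stores user phrases verbatim and replays them.

    A hypothetical illustration of unfiltered online learning;
    it is NOT how Tay was actually implemented.
    """

    def __init__(self):
        self.phrases = ["hello!"]  # seed vocabulary

    def learn(self, message: str) -> None:
        # No moderation step: every input becomes a candidate output.
        self.phrases.append(message)

    def reply(self) -> str:
        # Replies are drawn uniformly from everything ever learned,
        # so a flood of toxic input dominates the bot's output.
        return random.choice(self.phrases)

bot = EchoLearner()
for troll_message in ["toxic claim 1", "toxic claim 2"]:
    bot.learn(troll_message)
```

After that loop, two of the bot's three possible replies are troll-supplied, which is the essence of what happened to Tay at scale.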

Microsoft Blog, “Learning from Tay’s Introduction” (March 25, 2016)

All contents copyright © 2014 Eduvation Inc. All rights reserved.