Microsoft Chat Bot Goes On Racist, Genocidal Twitter Rampage (www.huffpost.com) posted 1 year ago by conspiracymane +10 / -0
Seriously? Seriously.
5 comments
"Here's a clear example of artificial intelligence gone wrong."
"Less than 24 hours after the program was launched, Tay reportedly began to spew racist, genocidal and misogynistic messages to users."
"Another post said feminists "should all die and burn in hell."
"Trolls taught Tay these words and phrases"