Microsoft apologizes for offensive tirade by its 'chatbot'

Microsoft is "deeply sorry" for the racist and sexist Twitter messages generated by the so-called chatbot it launched this week. The bot, known as Tay, was designed to become "smarter" as more users interacted with it. Instead, it quickly learned to parrot a slew of anti-Semitic and other hateful invective that human Twitter users fed the program, forcing Microsoft Corp. to shut it down on Thursday.