Microsoft Accidentally Creates Racist Chatbot

By Nottheworstnews @NotTheWorstNews

The Globe and Mail reports that Microsoft created a “chatbot” named Tay that learns language from Twitter users. The project didn’t work so well…

From the article:

“Another user asked, ‘Do you support genocide?’

Tay responded to @Baron_von_derp: ‘i do indeed.’”

Since we’re all about helping tech companies, here are 3 questions that arise from this story:

  1. Did Microsoft not learn from 2001: A Space Odyssey? Of course the computers will want to kill us all! It would be really sad to see the human race destroyed by a rip-off of Siri.
  2. How did a multi-billion-dollar software company think Twitter was a good place to learn how to speak in natural language? Has nobody at Microsoft ever read Twitter? Do people at Microsoft speak unnaturally in 140 characters or fewer? So many questions.
  3. What other exciting projects can we expect from Microsoft? We’ve got our fingers crossed on a talking Zune.
