Microsoft Accidentally Creates Racist Chatbot

The Globe and Mail reports that Microsoft created a “chatbot” named Tay that learns language from Twitter users. The project didn’t work so well…

From the article:

“Another user asked, ‘Do you support genocide?’

Tay responded to @Baron_von_derp: ‘i do indeed.’”

Since we’re all about helping tech companies, here are three questions that arise from this story:

  1. Did Microsoft not learn from 2001: A Space Odyssey? Of course the computers will want to kill us all! It would be really sad to see the human race destroyed by a rip-off of Siri.
  2. How did a multi-billion-dollar software company think Twitter was a good place to learn natural language? Has nobody at Microsoft ever read Twitter? Do people at Microsoft speak naturally in 140 characters or fewer? So many questions.
  3. What other exciting projects can we expect from Microsoft? We’ve got our fingers crossed on a talking Zune.
Categories: Mildly Bad News, Technology
