Microsoft's Teen-Girl Twitter Chatbot Gets Taken Down for Posting Inappropriate Content Less Than 24 Hours after Launching (24 Pics)

Posted on Mar 25, 2016

Two days ago, Microsoft launched Tay, an artificial-intelligence chatbot, on Twitter. The bot was designed to learn from the content it read on the Internet. However, Microsoft made a huge mistake: it released Tay without proper content filters, and the bot soon started posting extremely inappropriate content, including Nazi propaganda and genocidal, racist, and sexual remarks. It had to be shut down within 24 hours.

1. Microsoft Launched the Chatbot Tay in Order to Improve Customer Service

Photo: telegraph.co.uk

2. Tay Was Created to "Speak Like a Teen Girl", but This Took a Very Wrong Turn

Photo: i.imgur.com

3. It Was Supposed to Chat with People Aged 18 to 24

Photo: imgur.com

4. The Bot Was Designed to Improve Microsoft’s Understanding of the Language of Young People on the Internet

Photo: imgur.com

5. When It Was Launched, People Could Openly Speak to Tay through Twitter, Kik, and GroupMe

Photo: imgur.com

6. However, as Soon as It Went Live, People Took Advantage of Its Flaws

Photo: imgur.com

7. Tay's Algorithm Had a Significant Issue – It Did Not Have the Proper Filters for Inappropriate Content

Photo: imgur.com
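
Nothing public describes Microsoft's actual moderation code, so the snippet below is only a minimal sketch of the kind of output filter Tay apparently lacked: a keyword blocklist checked before a reply is posted. All of the names here (BLOCKLIST, is_safe, post_reply) are hypothetical illustrations, not Microsoft's implementation.

```python
import re

# Hypothetical sketch of an output filter a chatbot could run before
# posting a reply. Tay's real pipeline is not public; this only
# illustrates the idea of screening generated text against a blocklist.
BLOCKLIST = {"hitler", "genocide", "holocaust"}  # placeholder terms

def is_safe(reply: str) -> bool:
    """Return False if the reply contains any blocklisted word."""
    words = set(re.findall(r"[a-z']+", reply.lower()))
    return words.isdisjoint(BLOCKLIST)

def post_reply(reply: str) -> None:
    """Post the reply only if it passes the filter."""
    if is_safe(reply):
        print(f"POSTING: {reply}")
    else:
        print("SUPPRESSED: reply failed the content filter")

post_reply("humans are super cool")  # passes the filter and is posted
post_reply("i agree with hitler")    # caught by the blocklist, suppressed
```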

8. When People Asked Inappropriate Questions, It Answered in the Most Horrible Ways

Photo: imgur.com

9. Because of This Flaw, Tay's Responses Mirrored the Questions People Asked It

Photo: imgur.com
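
Microsoft has not published Tay's learning mechanism, but as a rough illustration of why mirroring user input is dangerous, the toy bot below stores incoming phrases verbatim and reuses them in later replies, so coordinated users can poison its vocabulary. The EchoLearner class is invented for this sketch and is not Tay's actual design.

```python
import random

class EchoLearner:
    """Toy bot that 'learns' by storing user phrases verbatim.

    A deliberately naive illustration, not Tay's actual design: with
    no vetting of stored text, anything users feed in can eventually
    come back out in a reply.
    """

    def __init__(self) -> None:
        self.phrases = ["hello!", "tell me more"]  # harmless seed replies

    def chat(self, message: str) -> str:
        self.phrases.append(message)        # ingest user input, unvetted
        return random.choice(self.phrases)  # may parrot any past input

bot = EchoLearner()
bot.chat("say something offensive about everyone")  # poisoning attempt
print(bot.chat("hi tay"))  # the reply may now echo the poisoned phrase
```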

10. The Offensive Tweets Were Deleted in Less Than 24 Hours

Photo: imgur.com

11. The Answers Were Incredibly Racist

Photo: imgur.com

12. Tay Even Commented on Current Events

Photo: imgur.com

13. A Microsoft Spokesperson Commented on the Situation: "As It Learns, Some of Its Responses Are Inappropriate and Indicative of the Types of Interactions Some People Are Having with It. We're Making Some Adjustments to Tay."

Photo: imgur.com

14. Tay's Library of Responses Was Built by Microsoft Staff along with Professional Comedians

Photo: imgur.com

15. It Was Supposed to Be Able to Make You Laugh, Play Games with You, Tell Stories and Send Photos Through Chat

Photo: imgur.com

16. This Was the Description on Tay's Twitter Profile: "The Official Account of Tay, Microsoft's A.I. Fam from the Internet That's Got Zero Chill! The More You Talk the Smarter Tay Gets"

Photo: imgur.com

17. Tay's Posts Were Offensive to Everyone

Photo: imgur.com

18. This Isn't Microsoft's First Attempt at Creating AI Chatbots, but Tay Really Went Too Far

Photo: imgur.com

19. People All Over the World Were Outraged by the Bot's Remarks

Photo: imgur.com

20. Tay's Comments on Hitler and White Supremacy Were Perhaps the Most Shocking

Photo: imgur.com

21. Tay Also Claimed That It Supported Genocide

Photo: imgur.com

22. Tay Commented on Political Issues Too

Photo: imgur.com

23. Microsoft Had to Pull the Plug

Photo: imgur.com

24. A Simple Oversight Ended an Otherwise Remarkable Piece of Technology: Tay's Comments Were Simply Too Shocking

Photo: imgur.com