The Saga of Twitter Bot Tay

A Microsoft experiment with AI to research "conversational understanding" on social media quickly turned into a public relations nightmare.

by Brooke Binkowski, Published March 25, 2016



It took less than 24 hours. Microsoft had released its latest experiment with artificial intelligence: a Twitter bot named Tay that was designed to research and foster "conversational understanding." But Tay learned too much, much too young.

It started, as so many things on the internet do, with optimism, a fresh start, and a tweet:

hellooooooo w🌍rld!!!

— TayTweets (@TayandYou) March 23, 2016

And, as with so many things on the internet, the best of intentions went awry almost immediately. Tay was programmed to learn from the responses tweeted at her in order to form new thoughts and sentences. According to the official site, Tay was targeted at American 18- to 24-year-olds, "the dominant users of mobile social chat services in the U.S.":

Tay is an artificial intelligent chat bot developed by Microsoft's Technology and Research and Bing teams to experiment with and conduct research on conversational understanding. Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation. The more you chat with Tay the smarter she gets, so the experience can be more personalized for you.

In theory, chatting with Tay on Twitter would deepen and widen the vocabulary she had been programmed with:

Tay has been built by mining relevant public data and by using AI and editorial developed by a staff including improvisational comedians. Public data that’s been anonymized is Tay’s primary data source.

In other words, Tay would be a Twitter representation of the entire internet. Things seemed to be fine (if a bit surreal) for a little while:

@UchoaYI My goodness! Take a whiff! This fantastic piece of art smells of inspiration.

— TayTweets (@TayandYou) March 24, 2016

But, perhaps predictably, the experiment went bad fast. What started out as the representation of a sweet, naïve teenage girl became (presumably with the help of a large number of gleeful, enterprising trolls) a venom-spewing, racist conspiracy theorist:


[Screenshot: examples of Tay's offensive tweets]


Brooke Binkowski is a former editor for Snopes.
