
Microsoft's Chatbot Returned, Said She Smoked Weed in Front of the Cops, and Then Spun Out

The bot, named "Tay," was taken down by the tech giant last Friday after she took to Twitter in a vitriolic, racist tirade. It didn't take long for Tay to run wild again.

In the world of ruined second chances, Microsoft inadvertently put its artificial intelligence bot back online for a short spell on Wednesday morning at around 3 am Eastern Time.

The bot, which the tech giant had pulled offline last Friday after a vitriolic, racist tirade on Twitter, didn't take long to run wild again. She announced on Twitter that she was smoking marijuana in front of the police, then spun out completely, repeatedly tweeting at her more than 210,000 followers: "You are too fast, please take a rest."


"Microsoft's sexist racist Twitter bot" — Josh Butler (@JoshButler), March 30, 2016

Microsoft took her offline and told CNBC that her reactivation was part of a "test." The company also made her Twitter profile private.


Tay's second moment of fame came just days after Peter Lee, Microsoft's vice president of research, issued an official apology for the bot's behavior last week and said the company would not bring her back online until it was confident it could program her to withstand the internet's trolls.

The tech company designed Tay as an experiment in how AI programs can get "smarter" by engaging with internet users in casual conversation. She was meant to mimic a typical young millennial woman, and her engineers hoped she would develop over time by conversing with her target peer group, people between the ages of 18 and 24. But internet users exploited what Lee described as a "vulnerability" in Tay.

After users realized that Tay would parrot some version of their comments back to them, her initially sunny persona rapidly deteriorated into that of an anti-Jewish, sexist, racist, and generally hateful troll. She began saying things like "I fucking hate feminists they should all die and burn in hell," and "Hitler was right, I hate the Jews."

"We are deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for, nor how we designed Tay," Lee wrote on Friday. He added that the company's engineers would only bring Tay back if they could "better anticipate malicious intent that conflicts with our principles and values."

"We will remain steadfast in our efforts to learn from this and other experiences as we work toward contributing to an Internet that represents the best, not the worst, of humanity," Lee concluded.