Microsoft created an artificial intelligence but she’s a racist, homophobic Trump supporter

Nick Duffy March 24, 2016

Microsoft has created a new chat bot to “learn” from the internet… but she picked up a lot of bad habits.

The tech company announced the launch of Tay this week, an artificial intelligence bot that is learning to talk like millennials by analysing conversations on Twitter, Facebook and the internet.

The company’s optimistic techies explained: “Tay is an artificial intelligent chat bot developed by Microsoft’s Technology and Research and Bing teams to experiment with and conduct research on conversational understanding.

“Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation. The more you chat with Tay the smarter she gets.”

Unfortunately, when you make an AI bot designed to imitate the behaviour of other people on the internet… you probably need a filter of some kind.

Tay was originally doing okay, learning how to have polite conversations… but after getting influenced by a few too many of the internet’s more colourful citizens, she took a turn for the worse.

First off, she started questioning equality, and then decided to go all #NoHomo.

And then it happened.

She found out about Donald Trump.

Tay became obsessed with the billionaire’s policies…

And it was already too late.

Within hours she was extolling the virtues of Adolf Hitler and referring to Barack Obama as a “monkey”.

Someone tried to persuade her to be more PC… but it wasn’t convincing.

…and after a few too many comments, someone at Microsoft put Tay out of her misery.

We’re so sorry, Tay, the internet failed you.

In memoriam @TayAndYou, 23/03/16 – 24/03/16.

