Saturday, March 26, 2016

Microsoft's Tay Goes Rogue

It is hard to control a machine with artificial intelligence (AI) even if that machine was developed by a tech giant like Microsoft. The company introduced a chat robot designed to interact in the style of a "teen girl" on Twitter, and as expected it went rogue almost immediately, spouting racist opinions, conspiracy theories and a fondness for genocide.

The AI, named "Tay" - @Tayandyou on Twitter - was intended to chat with 18-24 year olds, the idea being that she would learn from each tweet and become progressively smarter.

Clearly Microsoft had forgotten that Twitter is home to a huge number of trolls, racists and general troublemakers, who jumped at the chance to 'teach' the teen AI about life.

In one widely circulated tweet, Tay said: "Bush did 9/11 and Hitler would have done a better job than the monkey we have got now. donald trump is the only hope we've got."

She also went on to deny the existence of the Holocaust, and agreed with white supremacist propaganda that was tweeted at her.

Microsoft apparently didn't put any content filters on the AI, which meant Tay was able to tweet a number of atrocious racial slurs.

The troublesome cyber-teen has since been taken offline for 'upgrades' and Microsoft has deleted some of her more offensive tweets.

"The AI chatbot Tay is a machine learning project, designed for human engagement. As it learns, some of its responses are inappropriate and indicative of the types of interactions some people are having with it. We’re making some adjustments to Tay," Microsoft said in a statement.

The rapid descent of Tay from innocent AI chatbot to racist, Hitler-loving conspiracy theorist has raised concerns over the future of 'learning' tech and AI.
