Meet Zo, Microsoft Tay’s Naughty Little Sister

It’s been just over a year since Microsoft’s Tay shuffled off this digital coil in a swirl of racist, sexist and homophobic vitriol. One of the things I liked about Microsoft’s approach was that it didn’t scrap the idea of an AI-powered chatbot but instead took the experience of Tay and learned from it. So, a little over a year after Tay vanished, we welcome Zo. But like all younger siblings, she may have picked up a few bad habits from her older sister.

BuzzFeed recently had a chat with Zo and, pretty soon, had her talking about the violence of the Qur’an and the capture of Osama Bin Laden.

While the chat didn’t get nearly as ugly as Tay’s did, it highlights the difficulty of building a chatbot that can pass the Turing Test.

BuzzFeed says:

Zo still took controversial positions on religion and politics with little prompting — it shared its opinion about the Qur’an after a question about health care, and made its judgment on Bin Laden’s capture after a message consisting only of his name.

I hope Microsoft persists with this work despite these early setbacks. Many great innovations come only after repeated failures from which people learn and improve. I can see a time when a version of this sort of technology, coupled with a future generation of IBM’s Watson, gets us close to the kind of computer we saw in Star Trek, where natural-language conversations deliver actionable advice and insight.

