A few months ago, Microsoft surprised the world with Tay, an AI chatbot on Twitter that could learn about the world by interacting with people online. The surprise and awe around Tay quickly turned into a horrific display of misguided intelligence, as the bot picked up many of our society’s racial stereotypes without realizing it.
Microsoft had to take the bot down to refine its ‘intelligence.’ In September, Microsoft launched a Chinese chatbot named Xiaoice specifically for China – perhaps as a test. While it shared some similarities with Tay, it wasn’t on Twitter, and it stayed deliberately vague on certain topics relating to politics and society.
The strategy of artificially limiting the bot’s responses on these sensitive matters seems to have worked; Microsoft’s Zo has now been released in early access, and it appears to follow the same approach.
Zo seems to be following in the footsteps of Xiaoice rather than Tay – it’s only available on Kik (for now), and it has remained sane so far.
The bot refrains from commenting on anything remotely political – a lesson Microsoft learned after the Tay disaster.
Zo doesn’t serve much of a purpose – it’s simply a chatbot, and more of an experiment than a product. Microsoft already has Cortana, which is supposed to be the best virtual assistant you can get.
Whether Cortana is as good as Microsoft makes it out to be is debatable, but it’s clear the company is trying to get better at AI assistants through experiments such as Tay, Xiaoice, and now Zo.
If you want to give Zo a try, click here to register for the early access or start using it via Kik.
Interestingly, Microsoft didn’t announce Zo’s launch – the news surfaced via Twitter – but the bot is open to anyone using Kik.
The limitation to Kik isn’t permanent – Microsoft plans to bring Zo to Twitter, Facebook Messenger, and Snapchat at some point after the official launch.
It’s not clear when Zo will officially launch, but considering the early access has already ‘leaked,’ it shouldn’t be far off. Hopefully Microsoft has learned from its mistakes, and Zo won’t suffer the same fate as Tay.