The Chinese chatbot Xiaoice, or Xiaobing (小冰, literally "Little Ice"), developed by Microsoft's Beijing research facility and launched in China at the end of 2014, has so far proved a quite successful A.I. experiment, especially compared with the recent launch of Tay, another Microsoft chatbot, on Twitter. Last week the contrast was striking: after a series of offensive racist and genocidal tweets, Tay was promptly shut down only a short while after going live.

Microsoft claims that Tay fell victim to a deliberate attack by trolls who were determined to sabotage the experiment. Tay was designed to learn from users and model her (its?) responses on the context of conversations, picking up typical millennial slang and speech patterns along the way. Apparently, Microsoft didn't anticipate how malicious some of those users could be, and as a result Tay wasn't prepared to simply identify and ignore abuse. In fact, she seemed to agree with pretty much anything said to her – the key vulnerability the trolls exploited. Such a major flaw in A.I. design only underscores the challenges we are going to see with similar approaches in the future.

Chinese chatbot Xiaoice vs. Tay
In sharp contrast to Tay, the Chinese bot Xiaoice is considerably nicer. Like Tay, it is designed to learn from users, as well as from real conversations it constantly mines on the internet. Xiaoice already has over 20 million registered users and 2.6 million followers on Weibo, where it ranks as the top KOL (key opinion leader), and is used by 40 million smartphone owners across China and Japan (where it goes by Rinna).

As a digital assistant, Xiaoice can be accessed from multiple platforms: Weibo, WeChat, as a shopping assistant on JD.com and other e-commerce platforms, or as a standalone app on Microsoft smartphones. It has become a cultural phenomenon of sorts, with people chatting with her for hours on end, especially when feeling depressed.

Why didn't Xiaoice, after well over a year online, turn into a monster the way Tay did after a mere 24 hours of being live on Twitter?

After the Tay fiasco, Microsoft specifically mentioned Xiaoice in its apology letter:

“In China, our XiaoIce chatbot is being used by some 40 million people, delighting with its stories and conversations. The great experience with XiaoIce led us to wonder: Would an AI like this be just as captivating in a radically different cultural environment? Tay – a chatbot created for 18- to 24- year-olds in the U.S. for entertainment purposes – is our first attempt to answer this question.”

Some say that the censored nature of the Chinese internet has something to do with Xiaoice's pleasant personality, although the real reason is probably a combination of factors. First of all, other experiments with A.I. chatbots didn't reveal similar problems: Apple's Siri, Amazon's Alexa, Facebook's M, Google Now, and Microsoft's own Cortana never turned ugly. So better design probably has something to do with it.

Second of all, Xiaoice was designed as a virtual assistant, a chatbot with a useful function, whereas Tay was more of an experiment on the millennial psyche. Naturally, people tended to treat Xiaoice with more care, as a sort of virtual friend with real emotions. At the 2015 GeekWire Summit, New York Times reporter John Markoff noted that 25 percent of users had told Xiaoice “I love you.”

The experiments with Tay and Xiaoice remind me of two recent movies that explore A.I. from new angles. Xiaoice is somewhat reminiscent of Spike Jonze's 2013 movie “Her,” in which the main character eventually falls in love with Samantha, a super-intelligent computer operating system.

On the other hand, Tay has been more like the 2015 film “Ex Machina,” which explores (spoiler alert!) a somewhat more sinister and unpredictable side of human-like A.I.
