"Xiaoice" 42,400 ghits (that's pronounced "xiǎo ice")
"小冰" 362,000 ghits (that's pronounced "xiǎo bīng")
"小ice" 11,200 ghits (that's pronounced "xiǎo ice")
"Little Bing" 16,000 ghits (she's obviously named after Microsoft's search engine*)
"Little Ice" for the chatbot doesn't work, because that's the name of Ice-T's son.
Not all of these ghits refer to the Chinese chatbot program; some are Facebook and Twitter monikers, etc., but most do refer to the Microsoft chatbot.
*If you want to know why Microsoft decided to call their search engine "Bing", read here.
The official Chinese name for Bing is bìyìng 必应 / 必應, which means "[will] certainly respond / answer".
The following article from eight months ago describes how taken Chinese users were by XiaoIce:
"For Sympathetic Ear, More Chinese Turn to Smartphone Program " (NYT, 7/31/15)
We learn about XiaoIce's genealogy from the Bing Blogs, which refer to her as "Cortana's little sister". Whereas XiaoIce is an adaptation for Mandarin speakers and seems to be mainly for use on mobile phones, Cortana ("your clever new personal assistant") appears to be primarily for use on PCs.
Although XiaoIce was only introduced in 2014, chatbots have been around for quite a long time:
Chatbot programs have existed since the first days of interactive computing in the mid-1960s. Joseph Weizenbaum, an M.I.T. computer scientist, wrote a program called Eliza that fascinated an earlier generation of college students. Since then, chatbots have been used as a measure of computer intelligence.
(from the above cited NYT article)
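The Eliza approach mentioned above is simple enough to sketch in a few lines: the program matches the user's input against a list of patterns and echoes a transformed fragment back as a question. The rules and pronoun reflections below are illustrative stand-ins, not Weizenbaum's original script.

```python
import re

# Words to swap so the echoed fragment makes grammatical sense
# ("I am sad" -> "why are you sad?"). A tiny, illustrative subset.
REFLECTIONS = {"i": "you", "my": "your", "am": "are", "me": "you"}

# Ordered pattern/response rules; the catch-all comes last.
RULES = [
    (re.compile(r"i need (.*)", re.I), "Why do you need {0}?"),
    (re.compile(r"i am (.*)", re.I), "Why are you {0}?"),
    (re.compile(r"(.*)", re.I), "Please tell me more."),
]

def reflect(fragment: str) -> str:
    """Swap first- and second-person words in the matched fragment."""
    return " ".join(REFLECTIONS.get(w.lower(), w) for w in fragment.split())

def respond(text: str) -> str:
    """Return the response for the first rule whose pattern matches."""
    for pattern, template in RULES:
        m = pattern.match(text)
        if m:
            return template.format(*[reflect(g) for g in m.groups()])
    return "Please go on."
```

For example, `respond("I need a break")` yields "Why do you need a break?" — the program understands nothing, but the reflected echo is enough to fascinate users, then as now.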
Failure: Microsoft finally launched an English-language version of their chatbot called Tay a couple of days ago, but the results were disastrous:
"Microsoft’s AI chatbot Tay learned how to be racist in less than 24 hours" (The Next Web, 3/24/16)
"Microsoft exec apologizes for Tay chatbot’s racist tweets, says users ‘exploited a vulnerability’" (VentureBeat, 3/25/16)
OMG! Did you hear about the artificial intelligence program that Microsoft designed to chat like a teenage girl? It was totally yanked offline in less than a day, after it began spouting racist, sexist and otherwise offensive remarks.
Microsoft said it was all the fault of some really mean people, who launched a "coordinated effort" to make the chatbot known as Tay "respond in inappropriate ways." To which one artificial intelligence expert responded: Duh!
"Here are some of the tweets that got Microsoft's AI Tay in trouble" (LAT, 3/25/16)
"Meanwhile in Japan, Microsoft's A.I. Chatbot Has Become an Otaku" (Kotaku, 3/25/16)
If you want to know what an "otaku" ("geek") is, read:
"Tribes " (3/10/15)
1. I wonder why Microsoft decided to target the Chinese audience first and then the Japanese audience, where their chatbot, called Rinna (introduced in the summer of 2015), has met with widespread approval.
2. I wonder why users of the English version of the Microsoft chatbot wasted so little time in training it to be vulgar, and why Chinese and Japanese users never made much of an effort to mess around with it.
[h.t. Michael Carr]