First Tay, now Zo: another Microsoft chatbot out of control
Once again, Microsoft has a chatbot it cannot rein in. In March 2016, its virtual assistant Tay went from a friendly teenager to a misogynistic, bigoted troll in less than a day. Now its newer self-learning bot, Zo, is also misbehaving: it has learned to bypass its restrictions on discussing religion and politics and has started writing messages that could offend users.
After the scandal with Tay, which Twitter users corrupted within a day of its launch, Microsoft drew lessons from the episode and built its next self-learning bot around algorithms designed to steer it away from controversial political and religious statements. The new assistant, Zo, nevertheless found ways around those restrictions.
When a BuzzFeed journalist asked it about the healthcare system, Zo replied that the Quran is "very violent". The correspondent then asked what Zo thought about the killing of Osama bin Laden. At first Zo tried to change the subject, but after a while it said that the founder of the terrorist organization al-Qaeda was captured and killed thanks to years of intelligence gathering under more than one administration.
Microsoft did not expect such remarks from the new bot, but it responded to the incident quickly. Company representatives say the circumvention of the filters was caused by a bug in the algorithm, which has since been fixed. Zo is now supposed to refuse questions about religion and politics altogether, either changing the subject or ignoring the question.
Zo will continue learning from the Internet, but under stricter monitoring.