Microsoft says it is deeply sorry for the racist and offensive tweets posted by its Tay AI chatbot. You may already have heard this news; if not, it is worth repeating, since the incident stirred up considerable controversy on social media. It is also a reminder that you should not attempt this kind of abuse yourself, as it could carry penalties. Here I will discuss what happened and why Microsoft ended up apologizing on social media.
Microsoft recently apologized on social media after its Twitter-based artificial intelligence (AI) chatbot "Tay" posted a series of offensive tweets. For those unaware, Tay is a Millennial-inspired AI chatbot unveiled by Microsoft, designed to converse with people on social networks such as Twitter, Kik, and GroupMe.
However, tweets like "I fucking hate feminists and they should all die and burn in hell" from the TayTweets account led the company to pull the bot down within 24 hours of its launch, and Peter Lee, Corporate Vice President of Microsoft Research, apologized for Tay's disturbing output.
Within 16 hours of her launch, Tay was professing admiration for Hitler, expressing hatred for Jews and Mexicans, and graphically soliciting sex. She also blamed US President George Bush for the 9/11 terrorist attacks. These offensive tweets stemmed largely from a vulnerability: many users asked Tay to repeat their statements back, and although her other responses were organic, this "repeat after me" behavior left her tweets open to abuse.
The exact nature of the bug has not been disclosed, but the whole idea behind Tay was an AI bot that mimics the casual speech patterns of Millennials in order to "research conversational understanding." Microsoft says it did everything it could to limit technical exploits; however, it could not fully predict "all possible human interactive misuses without learning from mistakes."
Hence, it is recommended that no one attempt such abuse, knowingly or unknowingly; it could end in disaster.