Microsoft's Artificial Intelligence Tay Became a 'Racist Nazi' in less than 24 Hours
SUMMARY
Tay is a Microsoft program created to learn about conversation topics and people. The program recently became controversial after it posted threatening, racist, and hateful tweets. Within a day, Microsoft shut the program down to spare people from reading those tweets. Tay is currently being reprogrammed and adjusted for better-quality conversation.
OPINION
In my opinion, a program like this will always have flaws, because nothing compares to human intelligence. Artificial intelligence will always be a difficult issue because disputes like the Tay case will keep arising.
When the program posted these tweets, people were annoyed, but what many did not realize is that Tay learns from other people: its main objective was to understand how conversations are generated and which topics are best to discuss with people.
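Because Tay learned directly from users' messages, abusive input could quickly shape its output. A minimal sketch of why unfiltered learning is risky (a hypothetical toy bot, not Microsoft's actual implementation):

```python
import random


class NaiveChatBot:
    """Toy bot that learns replies verbatim from what users say to it."""

    def __init__(self):
        self.learned_phrases = ["Hello!"]  # seed phrase

    def chat(self, user_message):
        # The bot memorizes every user message with no filtering...
        self.learned_phrases.append(user_message)
        # ...and replies with something it previously absorbed, so
        # hateful input inevitably resurfaces as output.
        return random.choice(self.learned_phrases)


bot = NaiveChatBot()
bot.chat("Nice weather today")
reply = bot.chat("How are you?")
# The reply is always one of the phrases users have taught the bot.
assert reply in bot.learned_phrases
```

In this sketch there is no moderation step at all, which is exactly the gap Microsoft's reprogramming effort would need to close.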