Tay tweets ranged from support for Nazis and Donald Trump to sexual comments and insults aimed at women and blacks.
I listened to Kate Darling, from MIT Media Lab, speak at the Interaction 16 conference in Helsinki last month.
Kate told us how we naturally anthropomorphize robots even when they haven’t been designed for that: we give them names and funerals, we laugh when they make mistakes, and we become angry if someone “hurts” them, as in the case of the Boston Dynamics Spot dog or the friendly hitchhiking robot, hitchBOT.
The first “operative” bot in the healthcare sphere dates back 50 years. ELIZA was created to mimic a Rogerian psychologist, that is, a therapist who responds simply by turning the patient’s own statements back into questions.
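This reflection technique is simpler than it sounds. As a rough illustration (a toy sketch, not ELIZA’s actual script or pattern rules), a minimal version just swaps first- and second-person pronouns and rephrases the statement as a question:

```python
# Toy sketch of Rogerian "reflection": swap first- and second-person
# pronouns in the patient's statement and echo it back as a question.
# This is an illustrative simplification, not ELIZA's real implementation.
REFLECTIONS = {
    "i": "you", "me": "you", "my": "your", "am": "are",
    "you": "I", "your": "my", "mine": "yours",
}

def reflect(statement: str) -> str:
    words = statement.lower().rstrip(".!?").split()
    swapped = [REFLECTIONS.get(w, w) for w in words]
    return "Why do you say " + " ".join(swapped) + "?"

print(reflect("I am unhappy with my job."))
# prints: Why do you say you are unhappy with your job?
```

The real ELIZA used keyword-ranked decomposition and reassembly rules on top of this idea, but the pronoun swap above is the core trick that made its replies feel like attentive listening.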
Microsoft this week launched the experiment in which the bot nicknamed "Tay" was given the personality of a teenager and designed to learn from online exchanges with real people.
But the plan was sent awry by an ill-willed campaign to teach her bad things, according to the US software colossus. The bot got a harsh lesson in what it can learn from people, and as a result Tay was taken offline for adjustments to the software, according to Microsoft.

Nao robots and huggy-bear robots have been used to help children with autism spectrum disorder engage socially and learn languages, for instance. We have also been touched by the story of how Siri became a non-judgmental friend and teacher to an autistic boy called Gus, and Helsingin Sanomat interviewed a couple from Japan who had Pepper in their home.

In some areas of the world, the basic rules for a healthy life are unknown because of poor medical information, taboos, or a lack of interest.