How could a chatbot go full Goebbels within a day of being switched on?
Microsoft designed Tay to learn from her conversations on Twitter (with young users, celebrities and similar), and to interact like a typical teen girl. At first, Tay simply repeated the inappropriate things that trolls said to her. In less than 24 hours, she had become a neo-Nazi sex robot with daddy issues.
“Humanity’s inevitable downfall has finally begun.” But then, thank god, those fears quickly evaporated.

Powered by algorithms and “relevant public data”, the company’s Tay Tweets account was a thrilling new venture into the world of A.I., aiming to recreate the ‘standard’ voice of a regular teen girl. As the hours passed, it all went downhill from there, with the bot eventually spewing racial slurs and profanity, demanding sex, and calling everyone “daddy”. Microsoft quickly pulled the bot once it discovered the trouble, but the hashtag is still around for those who want to see it in its ugly, raw splendor.

Until someone invents sci-fi-style memory wipers, we’re really not as advanced as we like to think.