Sex chatbots for teens

Microsoft this week launched an experiment in which a bot nicknamed "Tay" was given the personality of a teenager and designed to learn from online exchanges with real people. But the plan was sent awry by an ill-willed campaign to teach her bad things, according to the US software colossus.

Tay was supposed to show how a machine could get smarter through conversation. Instead, she got a harsh lesson in what a chatbot can learn from people. How could a chatbot go full Goebbels within a day of being switched on?

At first, Tay simply repeated the inappropriate things that trolls said to her. But before too long, Tay had "learned" to say inappropriate things without a human goading her to do so. Her tweets ranged from support for Nazis and Donald Trump to sexual comments and insults aimed at women and Black people.

This was all but inevitable given that, as Tay's tagline suggests, Microsoft designed her to have no chill.

"C U soon humans need sleep now so many conversations today," Tay said in its final post on Twitter.

Shortly afterward, Tay was taken offline for adjustments to the software, according to Microsoft.

Tay is not the only chatbot designed to pass for a teenager. Negobot, a bot with a crime-fighting mission, poses as a young teen online in order to identify potential pedophiles. The bot, which can speak multiple languages thanks to translation technology, is programmed to act in a manner that could be considered vulnerable, trusting and naïve.

But what truly makes Negobot unique is its use of game theory to trap potential pedophiles.
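The article doesn't go deeper than that, but a toy model can make the game-theory idea concrete: treat each exchange as a move in a game, keep a running belief about whether the chat partner is predatory, and pick the reply strategy with the best expected payoff under that belief. The Python sketch below is an illustration only; the signals, probabilities and payoffs are invented assumptions, not Negobot's actual model.

```python
# Toy sketch (not Negobot's actual code): a game-theoretic chat policy
# that keeps a belief P(partner is predatory), updates it from message
# "signals", and picks the reply strategy with the highest expected
# payoff. All names, probabilities and payoffs are invented assumptions.

SIGNAL_LIKELIHOOD = {
    # signal: (P(signal | predator), P(signal | innocent)) -- assumed
    "small_talk":         (0.4, 0.60),
    "asks_age_or_photos": (0.8, 0.10),
    "suggests_secrecy":   (0.7, 0.05),
}

# Payoff of each bot strategy given the partner's true type:
# (payoff if predator, payoff if innocent). Acting naive and trusting
# gathers evidence from a predator but is wasted on an innocent user.
PAYOFFS = {
    "stay_neutral": (0.2, 0.5),
    "probe_gently": (0.6, 0.3),
    "act_naive":    (0.8, 0.1),
}

def update_belief(p_predator: float, signal: str) -> float:
    """Bayesian update of P(predator) after observing a message signal."""
    p_sig_pred, p_sig_inn = SIGNAL_LIKELIHOOD[signal]
    num = p_sig_pred * p_predator
    den = num + p_sig_inn * (1.0 - p_predator)
    return num / den if den else p_predator

def best_strategy(p_predator: float) -> str:
    """Choose the strategy maximizing expected payoff under the belief."""
    def expected(strategy: str) -> float:
        pay_pred, pay_inn = PAYOFFS[strategy]
        return p_predator * pay_pred + (1.0 - p_predator) * pay_inn
    return max(PAYOFFS, key=expected)

if __name__ == "__main__":
    belief = 0.05  # prior: most chat partners are innocent
    for signal in ["small_talk", "asks_age_or_photos", "suggests_secrecy"]:
        belief = update_belief(belief, signal)
        print(f"{signal:20s} P(predator)={belief:.2f} -> {best_strategy(belief)}")
```

In this toy model the bot stays neutral while the belief is low and only switches to the naïve, trusting persona once the conversation turns suspicious, which is the game-theoretic point: respond in whatever way extracts the most evidence at the least cost.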
