

However, some of its weirder utterances have come out unprompted. The Guardian picked out a (now deleted) example in which Tay was having an unremarkable conversation with one user (sample tweet: "new phone who dis?") before its replies turned offensive. (Neither of which were phrases Tay had been asked to repeat.) It's unclear how much Microsoft prepared its bot for this sort of thing.

The company's website notes that Tay was built using "relevant public data" that has been "modeled, cleaned, and filtered," but it seems that once the chatbot went live, the filtering went out the window.

For Tay, though, it all proved a bit too much, and just past midnight this morning the bot called it a night. In an emailed statement given later to Business Insider, Microsoft said: "The AI chatbot Tay is a machine learning project, designed for human engagement.

As it learns, some of its responses are inappropriate and indicative of the types of interactions some people are having with it."

