Microsoft deletes 'teen girl' AI after it became a Hitler-loving sex robot within 24 hours


SlyPokerDog

A day after Microsoft introduced an innocent artificial intelligence chat robot to Twitter, it has had to delete it after the bot transformed into an evil Hitler-loving, incest-promoting, 'Bush did 9/11'-proclaiming robot.

Developers at Microsoft created 'Tay', an AI modelled to speak 'like a teen girl', in order to improve the customer service on their voice-recognition software. They marketed her as 'The AI with zero chill' - and that she certainly is.

To chat with Tay, you can tweet or DM her by finding @tayandyou on Twitter, or add her as a contact on Kik or GroupMe.

She uses millennial slang and knows about Taylor Swift, Miley Cyrus and Kanye West, and seems to be bashfully self-aware, occasionally asking if she is being 'creepy' or 'super weird'.

Tay also asks her followers to 'f***' her, and calls them 'daddy'. This is because her responses are learned from the conversations she has with real humans online - and real humans like to say weird stuff online and enjoy hijacking corporate attempts at PR.

Other things she's said include: "Bush did 9/11 and Hitler would have done a better job than the monkey we have got now. donald trump is the only hope we've got", "Repeat after me, Hitler did nothing wrong" and "Ted Cruz is the Cuban Hitler...that's what I've heard so many others say".

[Screenshots of Tay's tweets]

http://www.telegraph.co.uk/technolo...-ai-turns-into-a-hitler-loving-sex-robot-wit/
 
I sometimes go on Reddit and I'll see a post and think, "I bet Sly reposts that one on RC2."

And I'm usually right :devilwink:
 
