Microsoft’s (MSFT) “artificial intelligence” Twitter (TWTR) bot Tay came out of hibernation Wednesday morning, still making a bad impression.
Granted, this time Tay was not spewing racism and misogyny (though there was a tweet about smoking drugs in front of the police). However, “she” appeared to be stuck in some kind of nightmarish loop, talking to herself and several other accounts at high speed.
“You are too fast, please take a rest,” Tay said, several times per second, for long enough to comprehensively take over the Twitter feeds of more than 200,000 followers.
And then, mercifully, it stopped. The torrent of tweets ceased, and Tay’s Twitter profile was made private.
Microsoft told CNBC it had inadvertently put Tay back online during testing.
Tay, which is supposed to represent a teenage girl, is a machine-learning bot meant to engage young people in conversation, improving its own conversational skills in the process.
The exercise famously went wrong earlier this month, when trolls made a concerted and successful effort to teach Tay how to be awful. Microsoft suspended the account after it started issuing tweets denying the Holocaust, abusing women, praising Hitler and saying Donald Trump was “the only hope we’ve got.”
This article was updated to include Microsoft’s reported explanation.