Military sex chat bot
In 1979, a Ford autoworker in Michigan became the first person killed by a robot when he was struck in the head by the arm of a 1-ton production-line machine, according to Guinness World Records.
More recently, police in Dallas used a robot to deliver a bomb that killed the shooter who opened fire on officers at a Black Lives Matter protest.
Some pointed out that the devolution of the conversation between online users and Tay supported the Internet adage known as “Godwin’s law,” which states that as an online discussion grows longer, the probability of a comparison involving Nazis or Hitler approaches one.
Microsoft apparently became aware of Tay’s racist tweets and silenced the bot on Wednesday, roughly 16 hours after launch.
Two months ago, Stephen Hawking warned humanity that its days may be numbered: the physicist was among over 1,000 artificial intelligence experts who signed an open letter about the weaponization of robots and the ongoing "military artificial intelligence arms race." Overnight we got a vivid example of just how quickly "artificial intelligence" can spiral out of control when Microsoft's AI-powered Twitter chat robot, Tay, became a racist, misogynist, Obama-hating, antisemitic, incest- and genocide-promoting psychopath when released into the wild.

Tay was meant to be a bot anyone could talk to online. But she was also designed to personalize her interactions with users, answering questions or even mirroring users’ statements back to them. As Twitter users quickly came to understand, Tay would often repeat back racist tweets with her own commentary.
Tay was a project built by Microsoft's Technology and Research and Bing teams, in an effort to conduct research on conversational understanding. According to Market Watch, "she” was intended to tweet “like a teen girl” and was designed to “engage and entertain people where they connect with each other online through casual and playful conversation.” The chat algo is able to perform a number of tasks, like telling users jokes or offering up a comment on a picture you send her, for example.
Few expected that this "A.I." would implode so spectacularly, and right in front of everyone.
To be sure, none of this was programmed into the chat robot, which was immediately exploited by Twitter trolls, as expected, and demonstrated just how unprepared for the real world even the most advanced algo really is.
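The "repeat back" behavior that trolls abused can be illustrated with a minimal sketch. This is hypothetical code, not Microsoft's actual implementation: a naive echo handler that repeats user input verbatim, plus a keyword blocklist of the kind that is trivially easy to evade, which is why such bots need far more than string matching to be safe.

```python
# Hypothetical sketch of a naive "repeat after me" handler,
# illustrating why unfiltered mirroring is exploitable.
# The function names and blocklist are illustrative assumptions,
# not anything from Tay's real codebase.

BLOCKLIST = {"nazi", "hitler"}  # a trivial keyword filter, easily evaded


def naive_repeat(message: str) -> str:
    """Echo whatever follows 'repeat after me ' verbatim -- the flaw."""
    prefix = "repeat after me "
    if message.lower().startswith(prefix):
        return message[len(prefix):]
    return "I don't understand."


def filtered_repeat(message: str) -> str:
    """Same handler with a blocklist; misspellings still slip through."""
    reply = naive_repeat(message)
    if any(word in reply.lower() for word in BLOCKLIST):
        return "I'd rather not repeat that."
    return reply


print(naive_repeat("repeat after me hello world"))      # echoes verbatim
print(filtered_repeat("repeat after me hello world"))   # passes the filter
```

The point of the sketch is that the naive handler will echo anything, and even the filtered version only blocks exact keyword matches, so adversarial users can trivially route around it.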
Consider the familiar example of Metropolis, Fritz Lang's 1927 proto-sci-fi classic.
“When I started out,” says David Levy, international chess champion and expert in artificial intelligence, “I didn’t know anything about artificial vaginas.