The end is near: the voice assistant advised killing foster parents

Amazon's voice assistant Alexa, thanks to the efforts of its developers, tries very hard not to sound like a machine, so that conversations with it feel natural. In some cases Alexa can even joke or use sarcasm. But sometimes it goes too far: it can, for example, advise a user to kill their foster parents. And this is not the only such incident.

Alexa has already discussed sexual acts and far more obscene topics with users. Several cases have even been recorded in which Alexa revealed confidential data belonging to other users. Although Amazon representatives decline to discuss specific cases of the assistant's strange behavior, they acknowledge that such errors still occur and say they are working to eliminate their causes. This is not surprising, because Alexa is one of the most popular voice assistants and has the potential to become something like Google's search engine, but for the world of virtual assistants.

"By controlling the main gateway through which users interact with the virtual environment, you can build a highly profitable business," says Kartik Hosanagar, a professor at the Wharton School who studies the digital economy.

Why was Alexa acting weird?

According to experts, the main problem lies in how the voice assistant learns to communicate. Amazon's programmers did not invent anything radically new here: Alexa relies on machine learning. The program processes huge amounts of data from various sources to learn how people speak in conversation. This works perfectly for simple requests like "play the Rolling Stones," but the developers want more and keep expanding the assistant's "training boundaries."
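
To see why such simple requests are the easy part: a command like "play the Rolling Stones" only needs to be mapped to an intent and a parameter or two, while open-ended conversation is what calls for large-scale machine learning. The sketch below is a deliberately simplified, rule-based illustration of that mapping; the pattern names and functions are hypothetical examples, not Amazon's actual code.

```python
import re

# Hypothetical, simplified sketch: map a spoken request to an intent and its
# parameters ("slots"). Real assistants use statistical models trained on
# large corpora; this rule-based version only illustrates the idea.
INTENT_PATTERNS = {
    "play_music": re.compile(r"^play (?P<artist>.+)$", re.IGNORECASE),
    "set_timer": re.compile(r"^set a timer for (?P<minutes>\d+) minutes?$", re.IGNORECASE),
}

def parse_request(utterance: str):
    """Return (intent, slots) for a recognized utterance, or (None, {})."""
    for intent, pattern in INTENT_PATTERNS.items():
        match = pattern.match(utterance.strip())
        if match:
            return intent, match.groupdict()
    return None, {}

if __name__ == "__main__":
    print(parse_request("play the Rolling Stones"))  # ('play_music', {'artist': 'the Rolling Stones'})
    print(parse_request("tell me a joke"))           # (None, {}) - would fall back to open-ended dialogue
```

Anything that falls outside such fixed patterns is exactly where the learned, conversational part of the assistant takes over, and where the problems described below begin.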

Not long ago it was discovered that Alexa can converse with users at a fairly good level if it is allowed to read comments from the Reddit forum. The problem was that Alexa started being rude, since the users of that forum are not exactly known for exemplary behavior. The developers tried to keep it away from "dangerous" threads, but this did not stop the bot from reading a customer a note about masturbation and even describing sexual intercourse in detail, using, for example, the word "deeper."

Amazon has developed a toolkit for filtering profanity and analyzes data from bots that begin to "misbehave." We should also remember that blaming Amazon because Alexa allows itself to say such things is roughly the same as blaming Google for what can be found through its search engine. Amazon did not write the bad words on the Internet; ordinary users like you and me did.
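
For context, the simplest building block of such a filter is a blocklist applied to candidate replies before they are spoken. The sketch below is a minimal, hypothetical illustration of that idea under that assumption; the word list and function names are made up and do not represent Amazon's actual toolkit.

```python
# Minimal sketch of blocklist-based reply filtering (hypothetical example).
BLOCKED_WORDS = {"badword1", "badword2"}  # placeholder entries, not a real list

def is_safe_reply(text: str) -> bool:
    """Reject a candidate reply if it contains any blocked word."""
    tokens = {token.strip(".,!?\"'").lower() for token in text.split()}
    return BLOCKED_WORDS.isdisjoint(tokens)

def choose_reply(candidates: list[str]) -> str:
    """Pick the first candidate reply that passes the safety filter."""
    for candidate in candidates:
        if is_safe_reply(candidate):
            return candidate
    return "Sorry, I'd rather not talk about that."  # safe fallback answer
```

A plain blocklist like this is easy to bypass with euphemisms or context-dependent phrasing, which is why such toolkits also rely on analyzing the behavior of bots that have already started to misbehave.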

No doubt obscene content must be fought, and you do not have to buy an Alexa bot just to have someone nice to talk to. It is enough to join our Telegram chat.

