To recap: A couple was talking about hardwood floors, and the conversation somehow triggered one of their Amazon Echos, which then sent the conversation as an audio message to one of the friends on their contact list.

And Amazon confirmed the incident after engineers checked the device's log. The company provided Mashable with the following statement on what happened:

"Echo woke up due to a word in background conversation sounding like 'Alexa.' Then, the subsequent conversation was heard as a 'send message' request. At which point, Alexa said out loud 'To whom?' At which point, the background conversation was interpreted as a name in the customer's contact list. Alexa then asked out loud, '[contact name], right?' Alexa then interpreted background conversation as 'right.' As unlikely as this string of events is, we are evaluating options to make this case even less likely."

After reviewing the logs and confirming the events to the couple, an Amazon engineer reportedly apologized "15 times in a matter of 30 minutes" and said "this is something we need to fix."

Okay, this isn't good, but I really don't think it's something to freak out about.

Don't get me wrong: what happened to these people shouldn't have happened in the first place and shouldn't ever happen again. I, personally, wouldn't want a friend calling to tell me my Echo sent them a voice message that I didn't intentionally send.

But there are still some things we don't know about the event that would help clarify the incident. In which room was the Echo device that sent the audio message? If it was in the same room, how do you miss or ignore Alexa when it asked "To whom?" and "[contact name], right?", not to mention the telltale light that activates when the device is listening? If it was an Echo in another, distant room, though, my question is for Amazon: does Alexa just pick a random person if it didn't hear a clear utterance?

The volume Alexa was set to is also a factor. If Alexa was set to a low volume, it's more likely the couple could have missed Alexa's multiple prompts. But if it was set to a higher volume, it's a different story. I own multiple Echos, and it's impossible not to hear one if the volume is at even 50 percent (assuming I'm within earshot).

Looking over Amazon's summary, it sounds like a series of misinterpretations so unlikely that it would rarely, if ever, happen to most people. Alexa wasn't spying on the couple; it just misunderstood what it heard. It's hard to know exactly what words in the conversation led Alexa down this path, but it's safe to say the odds of this happening to other Echo owners are pretty low.

How low? This is only one case out of "tens of millions" of Echo devices sold. Amazon doesn't share sales figures, but analysts peg the number at around 20 million. If we go by that, one screw-up out of 20 million equals only 0.000005 percent of all devices.

I agree, Alexa sending out an unwanted audio message is bad. But at the same time, wasn't it also doing its job? We expect digital assistants to understand our voices even in the most challenging and difficult conditions, and it's Alexa's duty to make sense of what it hears. In this case, yes, it heard the audio and interpreted it incorrectly. But it still tried its best to understand. And isn't that what we all demand from our digital assistants? Isn't that the biggest frustration with these things, when they hear but fail to understand?

If anything, this incident is a wake-up call for everyone, on both the consumer and technology sides. Not to banish the devices from our homes, but to be more cognizant of the tradeoff between convenience and privacy at work.

There's still a long way to go before we get AI that won't f*ck up. No AI is perfect, and I'm willing to bet this will happen again in the future, whether it's Alexa, Google Assistant, or another assistant. Accidents are natural and inevitable when we're talking about AI (remember how Google Photos screwed up and categorized two black people as "gorillas"?). Just as there have been, and will likely be, more accidents involving self-driving cars, there will probably be more screw-ups as AI becomes more woven into our lives. The answer isn't to stop using AI; it's to improve it.

Unless Amazon releases the actual audio logs, there's no way to know for sure which words and phrases Alexa misinterpreted. Still, there are a few things you can do to prevent your Alexa-enabled device from misunderstanding you. If you're worried about accidentally triggering Alexa (maybe you often say words that sound similar to "Alexa"), you can change your Echo's wake word to "Echo," "Amazon," or "computer." Just make sure those words aren't more likely to trigger your device than "Alexa."
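The sequence Amazon describes (wake word, then a "send message" request, then "To whom?", then a contact match, then a "right?" confirmation) amounts to a small dialog state machine. Here is a minimal sketch of that flow in Python. This is purely illustrative: Alexa's real pipeline is not public, and every name here (the states, the `sounds_like` matcher, the contacts) is invented for the example.

```python
# Toy sketch of the confirmation flow described in Amazon's statement.
# Not Alexa's actual implementation; all identifiers are hypothetical.

CONTACTS = ["Alice", "Bob"]

def sounds_like(heard: str, target: str) -> bool:
    # Stand-in for acoustic matching; a real system would score
    # phonetic similarity, not do a substring check.
    return target.lower() in heard.lower()

def run_dialog(utterances):
    """Walk a list of heard phrases through wake -> intent -> contact -> confirm."""
    state = "IDLE"
    recipient = None
    for heard in utterances:
        if state == "IDLE" and sounds_like(heard, "alexa"):
            state = "AWAKE"
        elif state == "AWAKE" and sounds_like(heard, "send message"):
            state = "WHO"           # device asks "To whom?"
        elif state == "WHO":
            match = next((c for c in CONTACTS if sounds_like(heard, c)), None)
            if match:
                recipient = match
                state = "CONFIRM"   # device asks "<name>, right?"
        elif state == "CONFIRM":
            if sounds_like(heard, "right"):
                return f"message sent to {recipient}"
            state = "IDLE"          # anything else cancels the send
    return "no message sent"

# Background chatter that happens to hit every branch in order:
print(run_dialog(["...alexa...", "...send message...", "...bob...", "...right..."]))
```

The point of the sketch is how short the failure chain is: four consecutive misheard fragments are enough to walk the machine from idle to a sent message, which matches Amazon's "unlikely string of events" framing.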