How to keep your kid from ordering four pounds of cookies with Amazon’s Alexa

Because this is a thing we have to deal with now
An Amazon Echo in a kitchen: the scene of a future cybercrime, in which a child orders enough candy to fill the entire house and the headline reads "Amazon won't turn over Alexa evidence in child candy case." Courtesy of Amazon

Handing over purchasing power to a six-year-old is probably unwise. If I could talk to a magical voice in my house that would send me anything I asked for, I would have abused that power as a child. Heck, I probably still would. But that’s exactly what Amazon’s Alexa does, at least by default.

This week Brooke Neitzel, a 6-year-old living in Dallas, Texas, ordered a $160 dollhouse and 4 pounds of cookies through her family’s Echo Dot. Brooke loves dollhouses and sugar cookies, so naturally she chatted with Amazon’s virtual assistant, Alexa, about just that.

“Can you play dollhouse with me and get me a dollhouse?” she asked. Alexa was happy to oblige, and a few days later the dollhouse and cookies showed up on her doorstep. To Brooke it must have seemed pretty magical, but her parents weren’t so excited. It took them a while to figure out what had happened because they weren’t in the room when Brooke accidentally (or maybe deviously) ordered the items. It wasn’t until her mom checked the Amazon app’s history that she realized what went down.

It’s pretty easy to fool Alexa like this: it wakes whenever it hears its name, its voice recognition isn’t perfect, and it will respond to any voice, not just its owner’s.
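
To get a feel for why, here’s a toy, text-based sketch of “close enough” matching. Real wake-word detectors score audio, not spelled-out words, and the threshold below is invented, but the failure mode is the same: anything that scores near the wake word counts as the wake word.

```python
# A toy illustration of why wake words misfire -- not how Amazon's
# detector actually works. Real systems match audio, not text.
from difflib import SequenceMatcher

WAKE_WORD = "alexa"
THRESHOLD = 0.65  # hypothetical tolerance; looser means more false wakes

def wakes_on(heard: str) -> bool:
    """Return True if a heard phrase is 'close enough' to the wake word."""
    return SequenceMatcher(None, heard.lower(), WAKE_WORD).ratio() >= THRESHOLD

for phrase in ["Alexa", "Alexis", "a Lexus", "election"]:
    print(f"{phrase!r} -> {'wakes' if wakes_on(phrase) else 'ignored'}")
```

Tighten the threshold and you get fewer false wakes but more ignored commands; loosen it and your Echo starts answering to car brands.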

When NPR’s Weekend Edition did a story about Alexa, multiple listeners wrote in to say that the broadcast triggered their own Alexas. Roy Hagar’s reset his thermostat to 70 degrees, Jeff Finan’s started playing an NPR News summary, and Marc-Paul Lee’s simply acted up.

Those incidents were all triggered by saying Alexa’s name, which, though inconvenient, is at least the command that’s supposed to wake it. But since Alexa is always listening, it can sometimes get confused and respond to something that isn’t its name at all. The podcast Dear Hank and John, run by the internet-famous Vlogbrothers duo Hank and John Green, had one listener write in to say that Alexa had sent them a giant plush teddy bear in response to something John had said. The listener had asked Alexa to play the most recent episode of Dear Hank and John, and Alexa somehow interpreted a portion of the podcast (in which John discussed his refusal to use a Mac keyboard, and also Kenny Loggins albums) as “do you like being here with your big bear?” That might not seem like a command to you, but apparently Alexa thinks it means “please order me a huge teddy bear.”

John went on to jokingly ask Alexa to order copies of his book The Fault in Our Stars and to play the Alvin and the Chipmunks Christmas album. He got at least a handful of listeners’ Alexas to respond.

There is an easy way to fix this problem: set up a confirmation code in your Alexa app. From then on, Alexa will ask you for the code before using your credit card to buy anything. You can also turn off voice ordering altogether, but what’s the point of a voice-activated Amazon assistant if you can’t tell it to order things from Amazon?
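
For the curious, here’s a minimal sketch of the gate a confirmation code puts in front of voice orders. This is a toy model, not Amazon’s implementation; the class, method, and code value are all invented for illustration.

```python
# A minimal sketch of a confirmation-code gate for voice purchases --
# a toy model, not Amazon's implementation. All names are hypothetical.
import secrets

class VoicePurchaseGate:
    def __init__(self, confirmation_code: str):
        self.confirmation_code = confirmation_code

    def try_purchase(self, item: str, spoken_code: str = "") -> str:
        if not spoken_code:
            # Without a code, the order never reaches the credit card.
            return f"To order {item}, please say your confirmation code."
        if secrets.compare_digest(spoken_code, self.confirmation_code):
            return f"Ordering {item} now."
        return "That code doesn't match. Cancelling the order."

gate = VoicePurchaseGate(confirmation_code="4321")
print(gate.try_purchase("a $160 dollhouse"))          # asks for the code
print(gate.try_purchase("a $160 dollhouse", "1234"))  # wrong code, blocked
print(gate.try_purchase("4 pounds of cookies", "4321"))  # parent-approved
```

The point is simply that the order never touches your credit card until the right code is spoken, and a wrong code cancels it outright.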

Sure, a smart kid could just learn the confirmation code. But since Amazon keeps a record of every command it hears, parents can still keep tabs on anything their kids try to order. Your mom doesn’t really have eyes in the back of her head, but she does have a perpetually listening AI assistant.
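
In the same spirit, here’s a toy sketch of what “keeping tabs” amounts to: scanning a saved command history for order-like requests. The history entries and keywords below are invented; the real record lives in the Alexa app.

```python
# A toy sketch of reviewing a command history for purchase attempts.
# The history format and keyword list are invented for illustration.
ORDER_WORDS = ("order", "buy", "get me", "purchase")

history = [
    "play the dollhouse song",
    "can you play dollhouse with me and get me a dollhouse",
    "what's the weather",
    "order four pounds of sugar cookies",
]

for cmd in history:
    if any(word in cmd for word in ORDER_WORDS):
        print("possible order attempt:", cmd)
```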