Amazon’s Echo is a robot that sits in your house and listens. The virtual personal assistant can be summoned into action by saying its name, Alexa, and will then act on commands, like ordering a dollhouse and cookies when asked to do so by a too-clever kindergartener. And because it works by listening, Alexa is an always-on surveillance device, quietly storing snippets of information. Which has placed a particular Echo unit in an uncomfortable role: possible witness to a murder.
On Nov. 22, 2015, Victor Collins was found dead at the home of James Bates in Bentonville, Arkansas. The night before, Bates had invited friends, including Collins, over to watch a football game, and after Bates reported Collins' death, police collected some evidence of a struggle from the scene. Still, there is more potential evidence police would like to use in the case: audio recorded by the Echo, which could illuminate more about what transpired that night.
That evidence is held by Amazon, as data on the company's servers, and to gain access to it, police filed a search warrant in December 2015. For over a year, Amazon responded only in part to the requests, providing police with subscriber information for the account. Police also tried to access the suspect's cellphone as a way into his Echo account, but were unable to do so.
On Feb. 17, 2017, Amazon filed a motion to quash the warrant for the recordings from the Echo, arguing that such a search violates First Amendment and privacy rights. So does Alexa, the program that speaks through the Echo on behalf of Amazon, actually have privacy rights?
“What Amazon’s doing is drawing on a line of cases that say there is a connection between freedom of expression, which is protected by the First Amendment, and privacy. That connection is that when you have government surveillance—especially of intellectual activity, let’s say listening to music or reading books or buying books or even using the search engine,” says Margot Kaminski, a professor at Ohio State University’s Moritz College of Law, who specializes in law and technology, “that surveillance implicates intellectual freedom in a way that’s important for free expression.”
Given the weight of precedent, it’s likely this case won’t be decided on whether Alexa itself has speech rights. The heart of the matter, as Amazon frames it, is whether or not a user’s speech with Alexa is protected by the First Amendment.
“The core of their argument is the government shouldn’t get to gather the recording of the user’s intellectual activity—their queries to Alexa, the books they purchased, that sort of stuff—without some kind of heightened protection,” says Kaminski. “Because this is First Amendment activity, we worry about the chilling effect.”
That’s probably where the case will go: whether a warrant is sufficient to override the user’s First Amendment rights. There’s a Supreme Court precedent that bears on this, Zurcher v. Stanford Daily, which held that a warrant was enough for police to collect photographs a student-run newspaper had taken of a protest that turned violent. And even if Alexa is granted full First Amendment protections, it’s not clear that would be sufficient to stop the warrant.
Still, Amazon isn’t just arguing that the search warrant is insufficient because it threatens users’ speech. There are other, broader claims in the motion that, if the court takes them up, could change how the law sees a whole swath of devices.
Let’s back up just a minute. Echo is an internet-connected device, with a microphone and a speaker, that people set up in their homes with the knowledge that it is listening. Once activated, people interact with their Echo units through Alexa. That interaction sounds a lot like two humans having a conversation, but it is in effect one person providing information to an extension of a vast technology company, which can record what is said, store it in files far outside the user’s home, and use that information to play music, search the internet, or even make purchases. To go back to privacy law, we can look at how courts reacted to another technology that took words spoken inside the home and relayed them to someone else outside: the telephone.
In 1928’s Olmstead v. United States, a case in which the head of a bootlegging operation in Seattle objected to the use of evidence obtained by wiretap, the Supreme Court ruled that constitutional privacy protections did not extend to phone calls, and that because the tapped wires were outside Olmstead’s property, the wiretap did not violate his rights through trespass either. That ruling held until 1967, when, in Katz v. United States, the court ruled that the closed door of a private phone booth denoted an expectation of privacy. Since then, the law has largely protected people making calls from places where privacy is expected, like their own homes. That partly covers the privacy implications relevant for Alexa, but only insofar as the court is willing to see a person talking to Amazon through Alexa the same way it sees two people talking on the phone.
“It also matters that there are two human parties to that conversation, because police might be trying to capture information about the suspect in the house but they’re incidentally gathering information about the second person,” says Kaminski. Though that’s not the only limitation to extending wiretap precedents to Echo.
“With Alexa, it looks a little more like typing things into your search engine, and there are grounds for figuring out whether you voluntarily give up your expectation of privacy because you are giving that information to the Googles and Amazons of the world, or if it looks more like the contents of a letter or the contents of a phone call, where you’re not voluntarily giving the information to a communications infrastructure; you’re actually expecting that that stays between you and the person that you’re talking to. That’s where the analogical reasoning gets really difficult,” says Kaminski.
If the court decides to rule on these broader claims, it could shape how we understand internet-of-things devices. How, exactly, will the law treat recording devices, placed inside homes, that users interact with casually and conversationally?
“What we have here is a confluence between the ‘home is paramount’ cases and this other line of cases that say ‘when you share information with a third party, you lose privacy protection in it,’ which is called the third-party doctrine,” says Kaminski. “This is the direct collision of that. You have a situation where you have voluntarily agreed to share information with the company, so you take on an assumption of risk that the company will do something with that information, but at the same time, it’s in your home, which is the quintessentially private environment.”