A new case in Arkansas raises tricky forensic questions
When police in Bentonville, Arkansas, were faced with an unsolved murder last year, they looked to a nearby Amazon Echo. The owner of the house where the murder occurred, and chief suspect in the case, had purchased Amazon’s popular new personal assistant, and police looked to the device’s records for clues as to his guilt, in a case first reported by The Information. Even more ominously, police took the Echo itself into custody, arguing in a court filing that it contained audio files and other information relevant to the case.
That raised an uncomfortable question for Echo owners: could police pull incriminating data directly from the device? If so, it would give investigators an easy way to get data without Amazon’s permission, wildly shifting the balance in San Bernardino-style fights over police access to customer data. For local data, all police (or anyone else) would need is physical possession of the Echo unit, which would be much easier to get than a court order.
The good news for Echo owners is that there’s very little data on the device itself. The hardware includes only 4GB of storage, and that’s mostly taken up by device firmware. The ephemeral data in the device’s 250MB of RAM is more tempting for forensic experts, but that data is quickly wiped by restarting the device. In any case, the Echo has no data ports, and storage can only be reached by physically removing the components or connecting directly to a pinout on the circuit board. The result is delicate enough that most analysts steer clear of the Echo unit entirely.
It’s a lesson Bentonville police seem to have taken to heart. Reached by The Verge, a representative said he wasn’t aware of any attempt to extract data from the Echo itself, declining to comment further.
However, much of the data implicated in the case should be available on the suspect’s phone. Through the companion Echo app, paired phones store text versions of nearly every request made to the device. The phone needs to be unlocked for police to access the data, which is still a challenge with iPhones, as San Bernardino demonstrated. But older Android phones, like the one already in custody in the Bentonville case, present much less difficulty. It’s not a perfect match, and the phone’s storage may be missing recent requests that haven’t had time to cache. Still, most of the data sought by investigators would likely be available using the right tools.
Many of those tools are already on the market. Magnet Forensics offers one of the few tools built specifically to pull data from Amazon’s smart assistant — but the tool is focused entirely on the app, rather than the Echo device. “The cloud’s almost always going to be your best source of information for IoT solutions,” said Jamie McQuaid, forensic consultant at Magnet Forensics. “After that, we found that the mobile phone apps themselves are the next best sources.”
The resulting data isn’t likely to solve any cases by itself, but it provides a surprisingly comprehensive picture of a person’s activities. Each request includes a timestamp showing when it was made, so it could be used to establish when a person was present in the room. Scheduled activities (“remind me to call Dad at 6:30,” for instance) will be accessible in a similar format. The Echo will also trigger accidentally on occasion, mishearing its name and recording a random snippet of conversation as a result. A text version of that conversation would subsequently be cached to the phone. “If I was screaming something in my house and Alexa heard its name, the phone would have a garbled version of that,” said Jonathan Rajewski, a digital forensics professor at Champlain College who studied the Echo last year. The result is certainly a long shot, but a viable one if police are feeling lucky. (Neither Rajewski nor Magnet have encountered any evidence of the device recording without being triggered by the wake word.)
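The kind of record described above — a text transcript paired with a timestamp — lends itself to a simple activity timeline. Here’s a minimal sketch of that idea; the record format and field names are illustrative assumptions, not Amazon’s actual schema:

```python
from datetime import datetime, timezone

# Hypothetical cached Echo requests, modeled on the text-plus-timestamp
# entries described above (field names are illustrative, not Amazon's
# real format).
cached_requests = [
    {"timestamp": "2015-11-22T01:48:31Z", "transcript": "remind me to call dad at 6:30"},
    {"timestamp": "2015-11-22T01:12:04Z", "transcript": "play some music"},
    {"timestamp": "2015-11-22T02:03:10Z", "transcript": "what's the weather tomorrow"},
]

def build_timeline(records):
    """Sort cached requests chronologically, sketching how an analyst
    could establish when someone was present and using the device."""
    parsed = [
        (
            datetime.strptime(r["timestamp"], "%Y-%m-%dT%H:%M:%SZ")
            .replace(tzinfo=timezone.utc),
            r["transcript"],
        )
        for r in records
    ]
    return sorted(parsed)

timeline = build_timeline(cached_requests)
first_seen, last_seen = timeline[0][0], timeline[-1][0]
print(f"Activity window: {first_seen} to {last_seen}")
```

Even this toy version shows why investigators value the cache: sorting a handful of timestamped transcripts immediately yields a window of when the device’s owner was awake and in the room.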
A bigger problem for law enforcement might be the text itself. Audio files for a given request will only be stored on the phone if you’ve specifically cached them. As a result, police would be relying on Alexa’s voice-to-text interpretation skills in the majority of cases. If a defense attorney wanted to claim the transcription had made a mistake — maybe that “murder” transcription was actually “burger” — it might be enough to dismiss the transcripts as unreliable.
Not every home assistant stores that much data on the app. Rajewski also looked at the Google Home app, which stores almost no local data at all. The app functions largely as a gateway to the server, sidestepping the kind of evidence the Echo leaves in its app. You can verify this yourself by trying out the Google Home app in airplane mode: without a connection, it can’t display more than a loading screen. The result is that, in a similar case, the only source for Google Home queries would likely be Google itself.
In the end, Echo owners are faced with a familiar trade-off: a useful gadget in exchange for a lot more personal data, held by companies and accessible to law enforcement. As with cloud storage and webmail, the data lives largely on remote servers controlled by companies. In this case, Amazon provided only basic customer data, but anything held in the cloud will be susceptible to a warrant request. As a result, any request you make to Siri or Alexa will generally be treated like emails, Dropbox files or Google searches — hard for criminals to access, but available to any law enforcement officers with reasonable suspicions and an agreeable judge. That might not be the ideal result for privacy-minded Echo owners, but it’s the best they’re likely to get.