Amazon’s ‘Echo Look’ Could Snoop a Lot More Than Just Your Clothes

Posted on May 18, 2017 in IT News

The new Amazon Echo Look seems like a logical enough extension of Alexa, the company’s AI-powered digital assistant. Previously, Alexa lived inside speakers. Now, it’s in a camera. That progression belies just how much more the Echo Look could know about you than other Alexa hardware does, especially if Amazon ever unleashes the full power of its machine-learning smarts.

Amazon envisions the Echo Look as a way to get fashion advice. Command it to take a photo of you, repeat across various clothing options, and get a recommendation of what you should wear. Along the way, Amazon will also suggest clothing you might want to buy. From Amazon. Including, one imagines, from one of Amazon’s in-house lines of clothing.

In other words, the Echo Look feels like the culmination of a process that started with the words CAMERA and COMMERCE scrawled on a dry-erase board. That alone shouldn’t surprise. Amazon exists, after all, to sell things. But the Look also marries Amazon’s powerful machine learning technology with a camera designed to take regular pictures of you and your surroundings—with no guarantees that in the future, it’ll stick to just your clothes.

Got the Look

As a consumer product, the $200 Echo Look surely makes sense for some people. Maybe a lot of people! Alexa makes for a great digital assistant, and we could all use a little sartorial wisdom now and then. It’s important, though, that those people know exactly what they’re getting.

“A lot of consumers see the convenience and don’t think about the long-term records that are being kept,” says Peter Swire, who specializes in privacy law at the Georgia Tech Scheller College of Business.

In the Look’s case, those records include not just the audio from your Alexa requests, but also however many photographs it takes in, presumably, your bedroom, along with what you’re wearing in those photos. It compares different outfits for you as well, and knows which of its recommendations you do or don’t end up buying. That’s already a ton of data, and that’s before you get to the fact that it also sees your body and face, and whatever objects happen to be in the background. That matters in particular because the Echo Look uses machine learning, technology that can not only identify what an object is, but also what it’s like.

“In a simplistic way, we’re going from being able to give things names or nouns to being able to give them adjectives,” says Louis-Philippe Morency, a machine learning expert at Carnegie Mellon University. “We can describe not just that it’s a shoe, but it’s a red shoe.”

Or in a human’s case, not just a face, but one that’s smiling.

There are a few small comforts here. First, Amazon explicitly does not provide “interest-based ads” (the kind that your interactions with an Echo or Look might generate) to third parties. You can also opt out of Amazon itself serving up personalized ads, and you can delete the photos the Look takes of you at any time. Last, Amazon confirmed to WIRED that the Echo Look will only use its machine learning smarts to identify outfits, not expressions or your bed frame or the slightly torn KISS poster on your wall.

That’s right now, though. The future’s not as clear. And unless Amazon explicitly narrows its scope, the Look could someday know so much more.

Private Parts

The Echo Look suffers from two dovetailing issues: the overwhelming potential for invasive data collection, and Amazon’s lack of a clear policy on how it might prevent that.

On that first part? Machine learning is smart. Very, very smart, and very good at knowing what things are. You’ve seen this already if you use, say, Google Photos, in which you can type in “dog” to see every photo of a good boy you’ve ever uploaded to the cloud. You didn’t label that photo a dog. Machine learning did.
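To make that mechanic concrete, the kind of labeling Google Photos does can be approximated today with an off-the-shelf classifier. The sketch below is illustrative only: it uses a pretrained torchvision model and a hypothetical photo.jpg, and it is not how Google or Amazon actually build their systems.

```python
import torch
from PIL import Image
from torchvision import models

# Load a pretrained ImageNet classifier (weights download on first use).
weights = models.ResNet50_Weights.DEFAULT
model = models.resnet50(weights=weights)
model.eval()

preprocess = weights.transforms()  # resize, crop, and normalize as the model expects

img = Image.open("photo.jpg").convert("RGB")  # hypothetical image file
batch = preprocess(img).unsqueeze(0)          # add a batch dimension

with torch.no_grad():
    probs = model(batch).softmax(dim=1)[0]

# Print the five most likely labels, e.g. "golden retriever: 87%".
labels = weights.meta["categories"]
top5 = probs.topk(5)
for p, idx in zip(top5.values, top5.indices):
    print(f"{labels[idx]}: {p:.1%}")
```

The point is not the specific model; it is that attaching a label like “dog” to an unlabeled photo is now a few lines of commodity code, which is exactly why the question of what a camera is allowed to label matters.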

Think, then, of what a camera like Look could do should its outfits-only mandate fall by the wayside. It could notice that you are low on toilet paper. It could tell that you only buy blue bedsheets. It would know about KISS. And it could use all of that to try to sell you toilet paper, bedsheets, and a Gene Simmons bobblehead doll.

Then peer a few years into the future, as the technology matures. Think of a Look that evaluates not just your cardigan, but your mood, or even your body. If Target could figure out which of its customers were pregnant in 2012 based on their shopping history, what might Amazon be able to tell based on that plus changes to your posture?

Machine learning can’t yet do that in enough detail to be alarming.

“We’re not yet at the point of being able to infer everything about someone, what they think or what they are internally feeling,” says Morency. But as computer scientists work more closely with psychologists, linguists, and cognitive scientists, the ability to infer will continue to crystallize. “It’s not just a mapping between an image and a label, like an adjective and a noun. We’re really looking at a more complex human feeling and emotion.”

And despite those industry-wide limitations, Amazon does advertise its own image-recognition technology, which it calls Rekognition, as being able to “use facial analysis” to “easily track user sentiment.” Again, it doesn’t currently apply that to Echo Look. But that doesn’t mean it never will.
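That facial analysis is already an ordinary developer API. A minimal sketch, assuming AWS credentials are configured, the boto3 SDK is installed, and a hypothetical photo.jpg is on disk, might look like the following; Amazon says the Echo Look does not currently run this kind of analysis on its photos.

```python
import boto3

# Assumes AWS credentials and a default region are already configured.
rekognition = boto3.client("rekognition")

with open("photo.jpg", "rb") as f:  # hypothetical local image
    response = rekognition.detect_faces(
        Image={"Bytes": f.read()},
        Attributes=["ALL"],  # request emotions, smile, age range, and more
    )

for face in response["FaceDetails"]:
    # Emotions come back as labels with confidence scores.
    top = max(face["Emotions"], key=lambda e: e["Confidence"])
    print(f"Dominant emotion: {top['Type']} ({top['Confidence']:.1f}%)")
    print(f"Smiling: {face['Smile']['Value']}")
```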

Fine Print

There does not appear to be a separate privacy policy for Echo Look. When asked for one, Amazon directed WIRED to an FAQ page that focuses more on the process of taking, storing, and deleting photos than it does on how it plans to use its machine learning capabilities now or in the future.

“Amazon doesn’t say anywhere in any kind of clear language what the risks are,” says Susan Liautaud, an ethicist at Stanford University. “Risks more generally about what it might accidentally pick up, and the privacy risk.”

Those kinds of explicit guarantees matter because, as Swire points out, “companies in the US are responsible for keeping their privacy promises. If the promises are not in the published policy, the likelihood of enforcement is much lower. What that means is consumers may not have any basis for legal complaint if the scope of machine learning shifts over time.”

As machine learning advances, those questions will apply to more and more facets of our lives. They’re not limited to Amazon by any means. But they’re still worth asking every time we invite AI into our lives.

“If people don’t know what is recorded, or when it is recorded, and what information is recorded, then it brings more anxiety and more questions,” says Morency of machine learning generally. “The challenge for us moving forward is to be more transparent, to be able to explain better what data is recorded.”

For Echo Look buyers, the answer today is simple: your clothes. They should also know, though, that it’s just a software update away from potentially becoming much more complex.