Amazon Wants To Put A Camera In Your Bedroom To Watch You Dress
The Echo Look will mine your mirror selfies and judge your style. What's unclear is how else this data will be used.
Amazon, the company that built an internet-connected microphone to eavesdrop on your living room, now wants to also put a camera in your bedroom.
The Echo Look is the latest creepy “smart home” device to feature the online retail juggernaut’s voice-activated digital assistant, Alexa. But Amazon designed it as much more than just a tool for hands-free mirror selfies — it’s meant to watch you get dressed so that it can use artificial intelligence to judge your outfits and offer fashion tips.
Using a feature called StyleCheck, the Echo Look can send your selfies to the cloud, where a combination of style-conscious machine learning algorithms and human “fashion specialists” will rate your outfit and offer advice on how to adjust your look.
All those pictures will be stored on Amazon’s servers permanently unless the user manually deletes them, meaning the company will be free to mine data about your physical appearance, including your wardrobe, facial expressions, body type, and more.
“With this data, Amazon won’t be able to just sell you clothes or judge you. It could analyze if you’re depressed or pregnant and much else,” technology sociologist Zeynep Tufekci noted in a thread on Twitter.
That should be especially nerve-racking to privacy-conscious consumers, since Amazon’s business model revolves around using customer data to more effectively convince you to buy stuff. Intimate data about a person’s appearance can easily be used in ways that manipulate their negative body image and reinforce harmful, unrealistic beauty standards — especially for women, who have long been targeted by marketing specifically designed to prey on bodily insecurities.
When contacted by Vocativ, an Amazon spokesperson did not answer questions about what kinds of data the company will collect from user-uploaded photos, how that data will be used, and whether it will be sold or shared with third parties. The spokesperson refused to provide specific details on how Amazon’s machine learning algorithms work to judge a user’s appearance, saying only that they “take into account how the clothes fit, which colors look best, seasonality, how the outfits are styled and what’s on trend right now.”
The company also initially declined to say how it plans to avoid the well-documented tendency of algorithms to inherit harmful cultural and racial biases. Last year, researchers used an algorithm to judge a beauty contest, and almost all of the contestants it rated most attractive were white.
UPDATE April 26, 2017, 3:30 PM: This article was updated to include statements from an Amazon spokesperson.
UPDATE April 27, 2017, 10:13 AM: This article was updated to clarify that StyleCheck is a feature of the Echo Look, not a separate app.