Fashion

Amazon’s Style-Coaching Camera Is Designed For Boring White People

The Echo Look doesn't just watch your body. It also gives you algorithmic style choices.

Apr 28, 2017 at 3:45 PM ET

There are plenty of reasons to be wary of the Echo Look, Amazon’s new camera-equipped, voice-activated “smart home” appliance. It sits in your bedroom, it’s always connected to the internet, and it captures intimate data about your physical body so that Amazon’s mysterious algorithms can judge your outfits and persuade you to buy more stuff.

But some fashion and artificial intelligence experts are particularly concerned about the Echo Look’s “StyleCheck” feature, which lets users send full-body selfies and videos to the cloud and receive near-instantaneous feedback and style recommendations. Unless you’re white and have a taste for basic, department store-style fashion, they say, it’ll probably be pretty useless.

The online retail giant says StyleCheck rates outfits on a percentage scale using a combination of machine learning algorithms and human "fashion specialists," which sounds like another role for the digital sweatshop workers of Amazon's Mechanical Turk. But Amazon won't explain how those algorithms and their human counterparts actually work, even though style and fashion are notoriously subjective.

As with any machine learning system, the example data used to train StyleCheck's algorithms will have an enormous impact on which styles and color combinations it deems "good" and what recommendations it hands back to the user. But does that training data include a diversity of people and styles, or does it simply stick to the fashion industry's interpretation of what looks good?
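It's impossible to know what StyleCheck's model actually looks like, since Amazon won't say. But the dynamic those experts worry about is easy to sketch. In the toy scorer below, every feature, number, and outfit is hypothetical; the point is simply that a system like this rates an outfit by how closely it resembles the examples it was trained on, so anything outside that training set scores poorly no matter how good it looks.

```python
# Purely illustrative: Amazon has not disclosed how StyleCheck works, and every
# feature and outfit below is invented. A scorer like this rates an outfit by
# how much it resembles its training examples, so looks outside that training
# set score badly regardless of how stylish they actually are.

# Toy "outfit" features: (color mutedness, pattern boldness, silhouette looseness), each 0..1.
TRAINING_OUTFITS = [
    (0.7, 0.1, 0.3),  # muted, conventional department-store looks
    (0.6, 0.2, 0.4),
    (0.8, 0.1, 0.2),
]

def style_check(outfit):
    """Return a 0-100 score based on distance to the nearest training outfit."""
    def dist(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    nearest = min(dist(outfit, example) for example in TRAINING_OUTFITS)
    return round(max(0.0, 1.0 - nearest) * 100)

print(style_check((0.7, 0.15, 0.3)))  # resembles the training data -> scores high
print(style_check((0.2, 0.9, 0.9)))   # a bold look the model never saw -> scores near zero
```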

When contacted by Vocativ, an Amazon representative would only say that the Echo Look’s ratings for pictures uploaded using StyleCheck “take into account how the clothes fit, which colors look best, seasonality, how the outfits are styled, and what’s on trend right now.”

“Customers are never judged against other customers, only themselves wearing different clothes,” the spokesperson told Vocativ. “In addition, our training data includes a wide variety of body shapes, skin tones, etc.”

But Caroline Sinders, a former street fashion photographer who researches AI and user-software interaction, highly doubts the Echo Look — or any machine learning system that measures style — would be useful beyond a specific class of mainstream, fashion-forward consumers. That’s because a likely source of the fashion algorithm’s understanding of “what’s on trend” is the larger fashion industry, she says — including Amazon’s massive inventory of products.

“We all have different models of dressing well, so what is the definition of ‘good’ here?” Sinders told Vocativ. “If you’re training models based off what is selling, you create a feedback loop of sameness where things look like what’s already out there.”
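That feedback loop is simple enough to simulate. In the hypothetical sketch below, a recommender weights its suggestions by past sales and each suggestion becomes a new sale; run it for a while and whichever looks happened to lead early keep pulling further ahead. The styles and numbers are invented, but the amplifying dynamic is the one Sinders describes.

```python
# A toy simulation of the feedback loop: recommend in proportion to what has
# already sold, count each recommendation as a new sale, repeat. All styles
# and numbers here are invented for illustration.
import random

random.seed(0)

styles = ["minimalist", "streetwear", "vintage", "avant-garde"]
sales = {s: 25 for s in styles}  # start from an evenly balanced sales history

for week in range(500):
    total = sum(sales.values())
    # The recommendation (and resulting sale) is weighted by past sales.
    pick = random.choices(styles, weights=[sales[s] / total for s in styles])[0]
    sales[pick] += 1

print(sales)  # whichever styles got lucky early have pulled ahead and keep pulling ahead
```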

Relying on that data would also ignore the fact that the fashion world is still predominantly white. That's a recurring problem for machine learning projects: AI systems trained on datasets that underrepresent people of color and minorities have repeatedly been shown to produce harmful, biased results.

When researchers trained an AI system to judge a beauty contest in 2016, for example, the vast majority of the top 40 contestants it selected were white. The reason comes down to the lack of diversity in the training data: try googling a term like "professional look" and see how many of the stock photos show white people wearing business clothes.

Even if the algorithm’s training data is diverse, different cultures and groups can have vastly different senses of style that would be hard for an algorithm to pick up on without making insensitive or offensive recommendations.

“You still have the issue of trying to culturally tune the recommendations in a way that’s sensitive without relying on stereotypes,” Christo Wilson, an assistant professor at Northeastern University’s College of Computer and Information Science, told Vocativ. “Even assuming they have training data that is diverse and representative of many groups, machine learning tends to end up training for a kind of ‘middle’ — and in terms of cultural sensitivity, that’s just a stereotype. So this is gonna be super complicated for them.”

Similar problems will almost certainly arise for transgender and gender-nonconforming users. Amazon says users can choose whether the Echo Look makes “male” or “female” recommendations during the device’s setup process. But what if the user is transitioning and wants different suggestions from one day to the next, depending on whether they’re presenting as masculine or feminine? What if they don’t identify as male or female, or their sense of style simply doesn’t gel with the clothes in department stores and fashion magazines?

“What do you recommend to someone who isn’t white and doesn’t like white people clothing?” said Sinders. “I don’t think Amazon can do that.”