Robots

Meet The New Member Of Your Family — The Social Robot

Robots like Kuri, Pepper, and Jibo are becoming popular, but they come with ethical and privacy concerns

Photo Illustration: R. A. Di Ieso
May 25, 2017 at 5:08 PM ET

When you hear the word “robot,” you may envision something like the Terminator, Wall-E, or even Rock ‘Em Sock ‘Em Robots. But robots are no longer just the stuff of sci-fi movies or children’s toys — there’s a new class of robots meant to engage with humans in everyday activities and eventually be considered part of the family.

Social robots — usually defined as AI systems with a physical embodiment that interact and communicate naturally with humans and learn from their environments — are mostly being used to entertain kids, help elderly people, and monitor the house. Models like Kuri, Jibo, and Pepper are not new, but they’ve become more popular recently in countries like Japan, Korea, and the U.S., in part because they’re becoming more affordable.

But while social robots often resemble iconic characters from shows like “Lost in Space” and “The Jetsons” and look cute and innocent, they are also constantly gathering personal information about us. Some experts warn about the privacy implications.

“I think you have to think of them not as robots, but as devices — just like any other device that collects information, stores information, and uses that information to respond,” said Michael Kaiser, executive director of National Cyber Security Alliance, a nonprofit promoting the safe and secure use of the internet. “Sometimes we get caught up in the robot word as opposed to the notion of, this is a data gathering device that obviously does more than that.”

“Your home is your castle,” Kaiser said. “It’s a common understanding that what happens behind your four walls is private, right? How does a robot change that if it needs to interact with the outside world in order to be effective?”

Take Mayfield Robotics, for example. Its cute Kuri robot — which looks as if Wall-E and EVE from Pixar’s 2008 animated film had a baby — can set reminders, play music, and read bedtime stories. But it also monitors your home, capturing photos and streaming live video with the HD camera installed behind its eyes.

With all the technology it’s equipped with, this adorable household helper could theoretically pose a risk to the whole family’s privacy. Kaiser said people should pay attention to these devices’ terms of service and be aware of all the personal information they send manufacturers.

In this case, Mayfield’s privacy policy states that the only data collected from the consumer is the information they input on the website. The company’s policy also states that information is not stored or tracked for anyone under age 13; if such data is inadvertently collected, Mayfield says it will immediately delete it. All of that is somewhat reassuring, but for users over the age of 13, the company can still share the data they input with third parties for advertising purposes.

Consumers, of course, may have trouble parsing all of this information. Most privacy policies are filled with jargon, and those for social robots are no exception. One solution to this problem: Companies could ask consumers for more explicit consent and give them the ability to control their information — even when it’s shared with third parties. “What we really see the consumers want is, they want as much transparency they can get in a consumable form,” Kaiser said. “They want to know how their personal information is being used and who, if anyone, it’s being shared with.”

While Mayfield gets decent marks for transparency, that’s not always the case with other manufacturers. Companies like VTech, Mattel, and Genesis Toys have been scrutinized for collecting data on families through internet-connected toys and not being clear about how they use and share that information.

Beyond these privacy concerns, these new “members of the family” also raise ethics questions for some experts. Sinziana Gutiu, a privacy lawyer at the Office of the Information and Privacy Commissioner for British Columbia, warned there are dangers in children and the elderly forming deep emotional bonds with robots. Ultimately, that might permanently change the way we engage socially as a society. “Humans have this natural tendency to anthropomorphize — to give human qualities to things they very well know are not human,” she said.

Gutiu noted that examples of bonds between children and AI have already made headlines. Earlier this year, a news story made the rounds about a little girl who accidentally ordered a dollhouse and some cookies through her parents’ Amazon Echo. Upon inspecting the conversation between the toddler and the smart speaker, her parents learned that she had ended it with, “I love you, Alexa,” as if the AI could reciprocate such feelings.

“There’s a lot of excitement about the convenience of technology and simplifying things, but when we simplify things, sometimes we take away really important life experiences and human characteristics,” Gutiu said.

Of course, there’s no doubt that social robots will introduce a new type of relationship to humans. What’s scary is the amount of influence they could have over humans, especially as AI keeps advancing, Gutiu said. AI is already being used in courtrooms, in superstores such as Lowe’s, and in medicine. (IBM’s Watson, for example, has been dubbed “the world’s best diagnostician.” But should that ever replace seeing a human doctor?) And many times, robots learn to reflect our own society’s prejudices and biases, as when Microsoft’s AI chatbot quickly learned to be racist while “learning” from Twitter users.


Right now, social robots are becoming more affordable: Kuri costs $699, and Jibo, which became the most popular Indiegogo project of summer 2014, costs $749. Pepper can be rented for under $1,000 as well. Those prices are comparable to a new iPhone 7’s, so it’s no wonder these robots are reaching a threshold of mass appeal.

As social robots become more widely used, Gutiu said, lawmakers and the companies developing this technology need to think about policies that protect consumers in the long term. One risk is writing laws that are too technology-specific, since technology changes so fast that such rules quickly become outdated. While the technology is still nascent, Kaiser believes it’s important to hold manufacturers accountable from the get-go.

“The early purveyors of this technology, the people who bring out this first model have a huge, huge responsibility to get it right because they are going to set the tone for what happens going forward,” he said. “It’s a great responsibility to be the innovator and the groundbreakers when it comes to these kind of devices that are so driven by our personal information.”
