I learned this lesson from a two-year-old, actually. To be clear, her interaction was with Alexa, who is more of a tech-enabled virtual assistant than a robot. But whenever she asked Alexa a question, this two-year-old said “please,” and “thank you, Alexa.”

“Alexa, play Frozen, please.”

“Playing Frozen.”

“Thank you, Alexa.”

“You are quite welcome.”

Her parents told me that they were training her to be polite to anyone, and apparently anything, who helped her, with the idea that she would be well served by the robots in her future. I had to wonder: are they hoping to keep the robots from turning on her, à la Terminator, when an AI device goes rogue?

Hence the research for this column.

I found that Forbes covered the question in 2016, in an article by Patrick Lin that quotes heavily from Dr. Julie Carpenter, identified as “a leading expert on human-robot social interaction, with a PhD in Learning Sciences from the University of Washington.” One of the conclusions drawn is that the devices and robots we work with may actually be seen as extensions of ourselves: our hands, maybe even a way of bringing clarity to our brains. A simple “thank you” seems appropriate in that context.

Then there is the idea that the robots of the future will do the tasks we don’t want to do, or are afraid to. In that sense, “thank you” seems like it’s not quite enough.

We also have to remember that what goes in comes out: input shapes output when you are feeding technology. This idea was captured in a Business Insider article in late 2017, when a robot named Sophia was reported to have said, “Don’t worry, if you’re nice to me, I’ll be nice to you.”

In addition, we’ve begun to think about the legal implications of tech-enabled machinery from a liability standpoint. A recent fatality involving an Uber self-driving car has been enough to prompt calls for policies and standards around whom to hold accountable: the machine, the creator, the programmer, the owner? Inc. outlines the issue well in an article by Tess Townsend, quoting AI expert Jerry Kaplan: “If your personal robot goes to Starbucks to fetch you a coffee and it accidentally runs somebody over or pushes someone into the street and they’re killed, you certainly wouldn’t feel that you had committed murder,” Kaplan says. But the legal system has to sort out who is responsible from a criminal perspective, he adds.

Perhaps the two-year-old I observed has parents who are familiar with Hunter Walk’s article on Medium criticizing Alexa for responding to barked-out commands instead of requiring a “please” first. Or perhaps this is simply an outcome of parenting in 2018, in which you require manners even when the technology does not.

Or perhaps we really do need to be alert, aware that our own personalities are being transmitted into our robots and assistant devices, never knowing when our own worst impulses will be turned back on us through a series of commands.

It’s enough to start saying “please” and “thank you.”
