When I was younger, when my siblings and I would come home from school, we were required to nourish our minds for an hour (study, homework, read, do math practice, whatever we were feeling that day) and then we were banished from the house until dinner.
We had to go outside and create our own fun. We rode bikes to friends' houses, we went "fishing" in the creek, and sometimes, before we left the house, we'd search the couch for loose change, walk to our favorite corner store, and share a bag of Skittles.
Our neighborhood was a safe one — it was one of those ideal 90s neighborhoods where our house was seated on the end of a cul-de-sac so there was little traffic and there were enough kids on the street to field two kickball teams.
Every parent on the street was allowed to reprimand us, and there were rarely any locked doors. As a 10-year-old, it felt like ultimate freedom. But with that freedom came a very important lesson about strangers and what to do if we were ever approached by one.
I'm sure stranger danger is still taught by parents and schools alike, but we've gone from "don't talk to strangers online" and "don't get in strangers' cars" to going online to request that a stranger drive us somewhere.
With the advancement of technology has come a readiness to bring strangers into our homes, and most of those invitations go to the personal assistants many households can't seem to function without.
Alexa, Google Home, Bixby, or whatever assistant you may use is, essentially, a stranger you are willingly bringing into your home.
Just yesterday I had a conversation with a college kid who didn't know that the microphones on those devices are always on, as is also true of the Facebook, Instagram, and Facebook Messenger apps.
In a recent article, Rachel Botsman (BOTSman, hmmmm) describes the experience her three-year-old daughter had with Alexa.
Over the course of the interactions, her daughter asks the bot a few silly questions, requests a few items to be bought, and asks Alexa for a few opinions. Botsman ultimately sums up her daughter's experience by saying, "Today, we are no longer trusting machines just to do something, but to decide what to do and when to do it. The next generation will grow up in an age where it is normal to be surrounded by autonomous agents, with or without cute names."
I'm not a mother, and I'm definitely old enough to be extremely skeptical of machines (I, Robot, anyone?), but the effects smart bots will undoubtedly have on future generations have me genuinely concerned. Right now it seems as harmless as asking those assistants to order more toilet paper, check the weather, or see which movies are screening, but what will it become in the future?
In an MIT experiment cited in the Botsman article, 27 children, aged three to 10, interacted with Alexa, Google Home, Julie (a chatbot), and, finally, Cozmo (a robot in the form of a toy bulldozer), all of which are AI devices or toys.
The study concluded that almost 80 percent of the children thought that Alexa would always tell the truth.
Let me repeat that — 80 PERCENT OF THE KIDS BELIEVED THAT AIs, CREATED BY COMPANIES THAT WANT TO SELL PRODUCTS, WILL ALWAYS TELL THE TRUTH.
The study went on to conclude that some of the children believed they could teach the devices something useful, like how to make a paper plane, suggesting they felt a genuine, give-and-take relationship with the machines.
All of these conclusions raise the question: how can we teach kids (and some adults, if we're being honest) about security and privacy with regard to new technology? How do we teach kids about commercialism, and that, as innocent as they may seem, not every device was designed altruistically?
We are quickly approaching an age where the strangers we introduce our kids to aren't the lurkers in the park with the missing dog or the candy in the van, but rather a robotic voice that can tell a joke, give you the weather, and order $70M+ worth of miscellaneous stuff.
So now, it's on us. Children of our own or not, we have to start thinking about best practices for teaching children when it is appropriate to trust a computer. If the five-year-olds with smart devices are any indicator, teaching kids to be stingy with their trust in AIs will be an uphill battle.
This story was first published here in October of 2017.