How millions of kids are being shaped by Alexa and her siblings
Source: Michael S. Rosenwald
"We like to ask her a lot of really random things," said Emerson Labovich, left, a fifth-grader in Bethesda, Md., who pesters Alexa with her older brother Asher, right. Mom Laura Labovich is in the background. Bill O'Leary The Washington Post
As millions of American families buy robotic voice assistants to turn off lights, order pizzas and fetch movie times, children are eagerly co-opting the gadgets to settle dinner table disputes, answer homework questions and entertain friends at sleepover parties.
Many parents have been startled and intrigued by the way these disembodied, know-it-all voices - Amazon's Alexa, Google Home, Microsoft's Cortana - are affecting their kids' behavior, making them more curious but also, at times, far less polite.
In just two years, the promise of the technology has already exceeded the marketing come-ons. The disabled are using voice assistants to control their homes, order groceries and listen to books. Caregivers to the elderly say the devices help with dementia, reminding users what day it is or when to take medicine.
For children, the potential for transformative interactions is just as dramatic - at home and in classrooms. But psychologists, technologists and linguists are only beginning to ponder the possible perils of surrounding kids with artificial intelligence, particularly as they traverse important stages of social and language development.
"How they react and treat this nonhuman entity is, to me, the biggest question," said Sandra Calvert, a Georgetown University psychologist and director of the Children's Digital Media Center. "And how does that subsequently affect family dynamics and social interactions with other people?"
With an estimated 25 million voice assistants expected to sell this year at $40 to $180 - up from 1.7 million in 2015 - there are even ramifications for the diaper crowd.
Toy giant Mattel recently announced the birth of Aristotle, a home baby monitor launching this summer that "comforts, teaches and entertains" using AI from Microsoft. As children get older, they can ask or answer questions. The company says, "Aristotle was specifically designed to grow up with a child."
Boosters of the technology say kids typically learn to acquire information using the prevailing technology of the moment - from the library card catalogue, to Google, to brief conversations with friendly, all-knowing voices. But what if these gadgets lead children, whose faces are already glued to screens, further away from situations where they learn important interpersonal skills?
It's unclear whether any of the companies involved are even paying attention to this issue.
Amazon did not respond to a request for comment. A spokeswoman for the Partnership on AI, a new organization that includes Google, Amazon, Microsoft and other companies working on voice assistants, said nobody was available to answer questions.
"These devices don't have emotional intelligence," said Allison Druin, a University of Maryland professor who studies how children use technology. "They have factual intelligence."
Children certainly enjoy their company, referring to Alexa as if she were just another family member.
"We like to ask her a lot of really random things," said Emerson Labovich, a fifth-grader in Bethesda, Md., who pesters Alexa with her older brother Asher.
This winter, Emerson asked her almost every day for help counting down the days until a trip to The Wizarding World of Harry Potter in Florida.
"She can also rap and rhyme," Emerson said.
Today's children will be shaped by AI much like their grandparents were shaped by new devices called television. But you couldn't talk with a TV.
Ken Yarmosh, a 36-year-old Northern Virginia app developer and founder of Savvy Apps, has multiple voice assistants in his family's home, including those made by Google and Amazon. (The Washington Post is owned by Amazon founder Jeffrey P. Bezos, whose middle name is Preston, according to Alexa.)
Yarmosh's 2-year-old son has been so enthralled by Alexa that he tries to speak with coasters and other cylindrical objects that look like Amazon's device. Meanwhile, Yarmosh's now 5-year-old son, in comparing his two assistants, came to believe Google knew him better.
"Alexa isn't smart enough for me," he'd say, asking random questions that his parents couldn't answer, like how many miles it is to China. ("China is 7,248 miles away, " Google Home says, "as the crow flies.")
In talking that way about a device plugged into a wall, Yarmosh's son was anthropomorphizing it - which means to "ascribe human features to something," Alexa happily explains. Humans do this a lot, Calvert said. We do it with dogs, dressing them in costumes on Halloween. We name boats. And when we encounter robots, we - especially children - treat them as near equals.
In 2012, University of Washington researchers published results of a study involving 90 children interacting with a life-size robot named Robovie. Most kids thought Robovie had "mental states" and was a "social being." When Robovie was shoved into a closet, more than half felt it wasn't fair. A similar emotional connection is taking hold with Alexa and other assistants - even for parents.
"It's definitely become part of our lives," said Emerson's mother, Laura Labovich, who then quickly corrected herself: "She's definitely part of our lives."
The problem, Druin said, is that this emotional connection sets up expectations for children that devices can't or weren't designed to meet, causing confusion, frustration and even changes in the way kids talk or interact with adults.
Yarmosh's son thought Alexa couldn't understand him, but it was the algorithms that couldn't grasp the pitch in his voice or the way children formulate questions. Educators introducing these devices into classrooms and school libraries have encountered the same issue.
"If Alexa doesn't understand the question, is it Alexa's fault or might it be the question's fault?" said Gwyneth Jones, a librarian who uses Amazon's device at Murray Hill Middle School in Laurel, Md. "People are not always going to get what they are saying, so it's important that they learn how to ask good questions."
Naomi S. Baron, an American University linguist who studies digital communication, is among those who wonder whether the devices, even as they get smarter, will push children to value simplistic language - and simplistic inquiries - over nuance and complex questions.
Asking Alexa, "How do you ask a good question?" produces this answer: "I wasn't able to understand the question I heard." But she is able to answer a simple derivative: "What is a question?"
"A linguistic expression used to make a request for information," she says.
And then there is the potential rewiring of adult-child communication.
Although Mattel's new assistant will have a setting forcing children to say "please" when asking for information, the assistants made by Google, Amazon and others are designed so users can quickly - and bluntly - ask questions. Parents are noticing some not-so-subtle changes in their children.
In a blog post last year, a California venture capitalist wrote that his 4-year-old daughter thought Alexa was the best speller in the house. "But I fear it's also turning our daughter into a raging a------," Hunter Walk wrote. "Because Alexa tolerates poor manners."
To ask her a question, all you need to do is say her name, followed by the query. No "please." And no "thank you" before asking a follow-up.
"Cognitively I'm not sure a kid gets why you can boss Alexa around but not a person," Walk wrote. "At the very least, it creates patterns and reinforcement that so long as your diction is good, you can get what you want without niceties."
Jones, the librarian, has witnessed the digital equivalent of everybody asking a question at the same time.
"You all are being really pushy," she'll say, as Alexa declares over and over that she doesn't understand. "You're confusing her. One at a time, just like a person."
The personal yet transactional nature of the relationship is appealing to children and teenagers. Parents (including this reporter) have noticed that queries previously made to adults are shifting to assistants, particularly for homework - spelling words, simple math, historical facts.
Or take the weather, particularly in winter. Instead of asking Mom or Dad the temperature that day, children just go to the device, treating the answer as gospel.
Upside: No more fights over what the temperature will really be and what's appropriate to wear. Downside: Kids will go to their parents less, with both sides losing out on timeworn interactions.
"There can be a lot of unintended consequences to interactions with these devices that mimic conversation," said Kate Darling, an MIT professor who studies how humans interact with robots. "We don't know what all of them are yet."
But most researchers, educators and parents - even some kids - already agree that these devices need to be put in their place, just like a know-it-all sibling.
Jones, the librarian, puts Alexa away for a couple of weeks at a time, so her students don't rely on her too much. Yarmosh, who recently launched a project curating online videos for kids, is keeping the assistants out of his children's rooms. Emerson and her brother take a school playground approach.
"Alexa," they'll say, "you're such a butt."