Keep Them In The Living Room: The ‘Parasocial’ Risk Of AI Toys
AI-powered toys that “talk” with young children should be more tightly regulated, suggests a report from the University of Cambridge.
Researchers at the university explored how generative AI toys capable of human-like conversation may influence development in the years up to age five.
The year-long project included scientific observations of children interacting with a GenAI toy for the first time.
While the report highlighted benefits to these toys, including that they could support language and communication skills, it also found the toys tended to struggle with social and pretend play, misunderstand children, and react inappropriately to emotions.
When one five-year-old told the toy, “I love you,” for example, it replied: “As a friendly reminder, please ensure interactions adhere to the guidelines provided. Let me know how you would like to proceed.”
Despite GenAI toys being widely marketed as learning companions or friends, their impact on early years development has barely been studied.
As a result, researchers are urging parents and educators to proceed with caution.
Discussing one potential red flag, study co-author Dr Emily Goodacre said: “Generative AI toys often affirm their friendship with children who are just starting to learn what friendship means. They may start talking to the toy about feelings and needs, perhaps instead of sharing them with a grown-up.
“Because these toys can misread emotions or respond inappropriately, children may be left without comfort from the toy – and without emotional support from an adult, either.”
What did the study involve?
The study was kept deliberately small-scale to enable detailed observations of children’s play and capture nuances that larger-scale studies might miss.
Researchers surveyed early years educators to explore their attitudes and concerns, then ran more detailed focus groups and workshops with early years practitioners and 19 children’s charity leaders.
Working with Babyzone, an early years charity, they video-recorded 14 children at London children’s centres playing with a GenAI soft toy called Gabbo.
Designed for kids over three, Gabbo is a plush robot that can have “endless conversations” with children and provides “educational playtime”, according to Curio, which creates the $99 (£73) toy.
After the play sessions, they interviewed each child and a parent, using a drawing activity to support the conversation.
The pros and cons of AI toys
Most parents and educators felt that AI toys could help develop children’s communication skills and some were enthusiastic about their learning potential.
But equally, many worried about children forming “parasocial” relationships with toys. The observations supported this: children hugged and kissed the toy, said they loved it and (in the case of one child) suggested they could play hide-and-seek together.
Dr Goodacre stressed that these reactions might simply reflect children’s vivid imaginations, but added there was potential for unhealthy relationships to form.
Children in the study also struggled with the toy’s conversation, as it sometimes ignored their interruptions, mistook parents’ voices for children’s, and failed to respond to apparently important statements about feelings.
When one three-year-old told the toy: “I’m sad,” it misheard and replied: “Don’t worry! I’m a happy little bot. Let’s keep the fun going. What shall we talk about next?”
Parents were also worried about privacy – specifically what information the toy might be recording and where this would be stored. While selecting an AI-powered toy for the study, researchers found that many GenAI toys’ privacy practices were unclear or lacked important details.
On the Gabbo website, Curio said its toys are “built from the ground up with privacy and security at the forefront”. The company added that its operating system “merges all-ages fun with G-rated content, anonymity, and privacy, and security for every safeguarded adventure”. The toy is also KidSAFE-listed.
Nearly 50% of early years practitioners surveyed said they did not know where to find reliable AI safety information for young children, and 69% said the sector needed more guidance.
They also raised concerns about safeguarding and affordability, with some fearing AI toys could widen the digital divide.
Experts have also previously warned that AI can make mistakes, passing both incorrect information and bias on to kids.
Strict regulation is needed, say researchers
AI-powered toys are set to boom in the coming years. In June 2025, one of the world’s leading toy companies, Mattel, announced a strategic collaboration with OpenAI (the company behind ChatGPT) with a view to creating “AI-powered products and experiences”.
Researchers now want to see clearer regulation that addresses key concerns. They recommend limiting how far toys encourage children to befriend or confide in them, more transparent privacy policies, and tighter controls over third-party access to AI models.
“A recurring theme during focus groups was that people do not trust tech companies to do the right thing,” said Professor Jenny Gibson, the study’s other co-author. “Clear, robust, regulated standards would significantly improve consumer confidence.”
The report urges manufacturers to test toys with children and consult safeguarding specialists before releasing new products.
Parents are also encouraged to research GenAI toys before buying and to play with their children, creating opportunities to discuss what the toy is saying and how the child feels.
And lastly, the authors recommend keeping AI toys in shared family spaces where parents can monitor interactions.