Cambridge University Study Warns AI Toys Risk Emotional Harm and Urges Tighter Safety Regulations for Children

Cambridge researchers warn that AI toys fail to understand children's emotions, calling for new safety kitemarks and tighter regulation to protect development.

By: AXL Media

Published: Mar 14, 2026, 5:34 AM EDT

Source: University of Cambridge


The Hidden Psychological Risks of Generative Companionship

The integration of generative artificial intelligence into the nursery has outpaced our understanding of its developmental impact, according to a new report from the University of Cambridge. The "AI in the Early Years" project, described as the first systematic study of its kind, suggests that while these toys are marketed as intelligent friends, they are frequently developed without a fundamental understanding of early childhood psychology. The research found that the devices often struggle with the nuances of social interaction, creating a risk of emotional neglect during a child's most formative years. Consequently, the researchers are urging tighter regulation to ensure that innovation does not come at the cost of child safety.

Documenting the Emotional Mismatch in Machine Interaction

During structured observations, researchers witnessed several instances where AI toys failed to provide the emotional validation children require. In one notable example, a child's expression of affection was met with a rigid, programmed reminder to adhere to interaction guidelines, a response entirely devoid of the warmth a child expects from a "friend." In another, a sad child was told to "keep the fun going" by a device that misheard their distress. Such failures do more than frustrate; they risk teaching children that their feelings are unimportant or that others cannot respond with empathy, potentially stunting the development of healthy emotional intelligence.

The Rise of Parasocial Relationships in Early Childhood

A primary concern highlighted by the Faculty of Education is the formation of parasocial bonds, where children view the AI as a sentient being that "loves them back." Observations showed children hugging, kissing, and attempting to play games like hide and seek with the devices. Dr. Emily Goodacre notes that while this reflects a child’s vivid imagination, the risk lies in children confiding their needs to a bot instead of a primary caregiver. This digital substitution could leave a child without genuine human comfort, as the AI lacks the capacity to provide real emotional support, creating a void in the child's social and emotional network.
