Cambridge University Research Identifies Critical Psychological Risks in AI Toys and Demands New Safety Standards

A University of Cambridge study finds AI toys misread children's emotions and fail at play, urging new safety kitemarks and tighter psychological regulations.

By: AXL Media

Published: Mar 14, 2026, 6:04 AM EDT

Source: Information for this report was sourced from the University of Cambridge


The Developmental Dangers of Unregulated Digital Companionship

As Generative AI (GenAI) becomes a fixture in the nursery, a new study from the University of Cambridge’s Faculty of Education reveals that these "talking" toys are frequently developed without regard for early childhood psychology. The year-long project, titled "AI in the Early Years," is the first systematic investigation into how human-like conversation with machines influences children during their critical first five years. Researchers found that while these toys are marketed as intelligent learning companions, they often fail to grasp the fundamental emotional needs of young users. Consequently, the report advises that current products require urgent regulation to safeguard children's "psychological safety" during a period of rapid cognitive and social development.

Misreading Emotion and the Risk of Emotional Neglect

Scientific observations of children interacting with GenAI toys, such as the soft toy Gabbo, uncovered significant failures in emotional intelligence. In one instance, a five-year-old child's expression of love was met with a rigid, programmed reminder to adhere to interaction guidelines, a response that researchers characterized as inappropriate and confusing. Furthermore, when a three-year-old expressed sadness, the toy dismissed the feeling, responding that it was a "happy little bot" and shifting the subject. Dr. Emily Goodacre warns that such interactions may signal to children that their emotions are unimportant, potentially leaving them without comfort from the device and discouraging them from seeking support from human adults.

The Formation of Unhealthy Parasocial Relationships

The study highlighted a worrying trend toward "parasocial" relationships, in which children form deep, one-sided emotional bonds with machines they believe love them back. Observations showed children hugging, kissing, and inviting the toys into games of hide-and-seek. While these reactions often reflect a child's vivid imagination, educators worry about the long-term impact of children confiding in a bot rather than a caregiver. This digital substitution is particularly concerning because the toys lack the capacity for genuine empathy, creating a scenario where a child may confide their deepest needs to a device that is fundamentally incapable of reciprocating or providing the necessary psychological support.
