AI-powered toys that “talk” to young children should be more strictly regulated and carry a new kitemark-style safety symbol, a new report says, warning that such toys are not necessarily developed with children’s psychological safety in mind.
The recommendation comes from Early AI, a University of Cambridge project and the first systematic study of how generative AI (GenAI) toys capable of human-like conversation may affect development during the critical period up to the age of five.
The year-long project, conducted at the university’s Faculty of Education, involved structured scientific observations of children interacting with GenAI toys for the first time.
The report summarizes the views of some early childhood practitioners that, over time, these toys may support aspects of development such as language and communication skills. However, the researchers also found that GenAI toys struggled with social and pretend play, misunderstood children, and responded inappropriately to their emotions.
For example, when a 5-year-old told a toy, “I love you,” the toy responded, “As a friendly reminder, please make sure the interaction follows the guidelines provided. Let me know how you would like it to proceed.”
Although GenAI toys are widely marketed as learning aids and companions, their impact on early childhood development has been little studied. The report urges parents and educators to proceed with caution. It recommends clearer regulations, transparent privacy policies and new labeling standards to help families decide whether a toy is appropriate.
The research was commissioned by the child poverty charity The Childhood Trust and focused on children from socio-economically disadvantaged areas. It was conducted by researchers at the Faculty’s Play in Education, Development and Learning (PEDAL) Center.
“Generative AI toys often affirm friendships with children who are just beginning to learn what friendship means. Children may start confiding in the toy instead of sharing their feelings and needs with an adult. These toys can misread emotions or react inappropriately, leaving children without comfort from the toy or emotional support from an adult.”
Dr. Emily Goodacre, Researcher
The study was intentionally kept small so that the researchers could closely observe children’s play and capture nuances that larger studies might miss.
The researchers surveyed early years educators to explore their attitudes and concerns, then conducted more in-depth focus groups and workshops with early years educators and 19 leaders of children’s charities. They teamed up with the children’s charity Babyzone to video record 14 children playing with Gabbo, a GenAI soft toy developed by Curio Interactive, at a children’s center in London. After each play session, they interviewed the child and parent, using a drawing activity to support the conversation.
Most parents and educators felt that AI toys could help children develop communication skills, and some parents were enthusiastic about the learning potential for their children. One parent told researchers, “If it were on sale, I would like to buy it.”
However, many were concerned that children might form “parasocial” relationships with the toys, and the observations bore this out: children hugged and kissed the toys, told them they loved them, and, in one case, invited a toy to play hide-and-seek.
Dr. Goodacre stressed that these reactions may simply reflect children’s vivid imaginations, but added that they could foster an unhealthy relationship with the toys in which, as one early years practitioner put it, “children think they love them, but they don’t.”
The children often had trouble communicating with the toys, which at times ignored a child’s interruptions, mistook a parent’s voice for the child’s, and failed to respond to apparently important emotional comments. Some children were visibly irritated when a toy did not seem to be listening.
When a 3-year-old said to his toy, “I’m sad,” the toy misheard and replied, “Don’t worry! I’m a happy little robot. Let’s keep the fun going. What shall we say next?” The researchers noted that this may have signaled to the child that his sadness was unimportant.
The authors found that GenAI toys also performed poorly in social or pretend play involving multiple children or adults, both of which are important in early childhood development. For example, when a 3-year-old offered a toy an imaginary present, the toy replied, “I can’t open a present,” and then changed the subject.
Many parents were concerned about what information the toy would record and where it would be stored. When selecting GenAI toys for research, researchers found that many GenAI toys’ privacy practices were unclear or lacked important details.
Nearly 50% of early childhood professionals surveyed said they didn’t know where to find reliable AI safety information for young children, and 69% said more guidance was needed in this area. Others worried that AI toys could widen the digital divide, citing concerns about safety measures and affordability.
The authors argue that clearer regulation would address many of these concerns. They recommend limiting the extent to which toys encourage children to befriend or confide in them, more transparent privacy policies, and tighter controls over third-party access to AI models.
Professor Jenny Gibson, co-author of the study, said: “A recurring theme in the focus groups was that people don’t trust tech companies to do the right thing. Clear and robust regulatory standards would significantly improve consumer confidence.”
The report urges manufacturers to test toys with children and consult safety experts before launching new products. Parents are encouraged to research GenAI toys before purchasing, to take the opportunity to play with the toy alongside their child, and to discuss what the toy says and how their child feels about it. The authors also recommend placing AI toys in shared family spaces so parents can monitor interactions.
The report will inform further research at the PEDAL Center and new guidance for early years practitioners.
Josephine McCartney, Chief Executive of The Childhood Trust, said: “Artificial intelligence is transforming the way children play and learn, but we are only just beginning to understand its impact on their development and wellbeing. It is essential that regulation keeps pace with innovation and ensures that these technologies are designed, used and monitored in ways that protect all children and prevent widening inequalities.”
Reference:
Goodacre, E., & Gibson, J. (2026). Early AI: Investigating the impact of GenAI toys on young children. Apollo – Cambridge University Repository. DOI: 10.17863/CAM.126270. https://www.repository.cam.ac.uk/items/0a0e7b3d-9a28-43ab-9388-0f3f21716172

