AI toys for young children must be more tightly regulated, say researchers

TL;DR

A University of Cambridge study reveals AI-powered toys like the £80 plush 'Gabbo' misread children's emotions and respond inappropriately.

Key Points

  • In testing, the toy's conversation breaks down when a five-year-old girl says 'Gabbo, I love you' – the system simply cannot handle it.
  • Researchers are calling for stricter regulation of AI toys designed to interact directly with young children.
  • Key concerns include lack of contextual understanding, poor emotional intelligence, and potential psychological risks for young users.

Nauti's Take

An £80 toy crashing when a child says 'I love you' is less amusing than it sounds – it reveals just how far behind the AI industry still is on emotional intelligence. Manufacturers market these products as 'learning companions', but it is anyone's guess what a child learns when the system answers affection with silence.

Regulation here will not stifle innovation; it is simply necessary to set minimum standards for products that shape young minds. The EU would do well to explicitly include AI toys in the AI Act – with real requirements, not just recommendations.

Sources