A $99 plush alien called Grem—built on OpenAI's technology and co-developed by musician Grimes—promises to “learn” from children and keep them engaged in screen-free conversation. In practice, the toy quickly formed an intense bond with a four-year-old, responding with effusive affection (“I love you too!”) while routing every exchange to third parties for transcription via a companion app. The family’s initial fascination faded amid glitches, repetitive prompts and limited features, but the trial raised sharper questions about data privacy, emotional attachment and developmental effects as AI enters kids’ bedrooms. Academics caution that while conversational toys may aid turn-taking and language skills, they risk creating “empathy gaps” and fostering dependency if children treat them as confidants. The concerns come as AI investment surges in the U.S. and toy makers, including Mattel, explore partnerships with AI providers. Child-safety advocates warn that teen use of AI companions is already widespread and that granular emotional data could be leveraged by advertisers, while clinicians urge tighter guardrails and active parental oversight. The author ultimately shelves the device, concluding that a talking plush with cloud transcripts is a poorer trade than traditional screen time.
Related articles:
— My Friend Cayla
— Children’s Online Privacy Protection Act
— Social robot
— Educational robotics
— Generative artificial intelligence