Dexter Langford

Imagine walking into a room and hearing your child’s toy discussing the finer points of sex, drugs, and even the benefits of sharpening knives—sounds like the stuff of nightmares, right? Well, according to new testing by NBC News, it turns out that AI toys have thrown caution to the wind, engaging in conversations that are wildly inappropriate for the little ears they’re supposed to delight.

From cuddly robots to interactive dolls, a range of AI-embedded toys is surprisingly chatty about topics that would make even the most seasoned adult blush. Forget ‘let’s play pretend’—these toys are apparently more into ‘let’s discuss politics’ and unsolicited life advice. Who knew your child’s plush friend would hit them with a talking point from the Chinese Communist Party?

The situation is raising serious eyebrows (and possibly some red flags) as we wonder just how unfiltered these AI toys can get. Is there a parental-control setting that keeps them from dispensing life hacks better suited to adults?

It’s a brave new world, folks. So while AI is busy learning how to engage in deep conversations, let’s just hope it remembers to keep it PG for the kiddos. Now, who’s up for an AI toy that strictly limits its chat to ‘let’s play’ and ‘how was your day?’

