Companion dolls are being marketed to the elderly as a solution to loneliness. These aren't your grandmother's toy dolls. They talk back, remember conversations, and respond to emotional cues using natural language processing. The pitch is straightforward: aging populations are lonely, and robot companions don't get tired.
The problem is in the implementation. These dolls need to listen constantly to work properly. That means recording conversations, processing them through cloud servers, and storing data about a senior's health, finances, and family relationships. The target demographic often lacks the technical literacy to understand what they're signing up for, let alone configure privacy settings.
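To make the privacy surface concrete, here is a minimal sketch of that pipeline, assuming a hypothetical device with no local wake-word gate. All the names here (`CloudTranscriber`, `run_device`) are illustrative, not any vendor's real API:

```python
# Hypothetical sketch of an always-on companion-device pipeline.
# Illustrative only -- not a real vendor API.

from dataclasses import dataclass, field

@dataclass
class CloudTranscriber:
    """Stand-in for a cloud speech service: everything sent here
    leaves the home and is retained server-side."""
    stored: list = field(default_factory=list)

    def transcribe(self, audio_chunk: bytes) -> str:
        self.stored.append(audio_chunk)  # retention happens by default
        return f"<transcript of {len(audio_chunk)} bytes>"

def run_device(mic_chunks, cloud: CloudTranscriber):
    """Always-on loop: with no local filtering, every chunk of room
    audio -- health talk, finances, family matters -- is uploaded."""
    transcripts = []
    for chunk in mic_chunks:
        transcripts.append(cloud.transcribe(chunk))
    return transcripts

cloud = CloudTranscriber()
day_of_audio = [b"morning chat", b"call with bank", b"doctor visit notes"]
run_device(day_of_audio, cloud)
print(len(cloud.stored))  # every conversation now lives on the server
```

The point of the sketch is that nothing in the loop distinguishes a command to the doll from a private phone call; without on-device filtering, the default is total capture, and the user-facing "privacy settings" sit on top of a pipeline built to record everything.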
We've seen this before. In 2017, Germany's telecom regulator banned the "My Friend Cayla" doll after researchers showed its unsecured Bluetooth pairing let anyone in range listen in on children, and even speak through the doll. The same class of vulnerability exists in elder care devices, but the stakes are higher: a compromised doll in a senior's home could be used for financial scams or psychological manipulation.
The industry is moving fast to solve a real problem. Loneliness is a genuine health risk, and the elderly population keeps growing. But putting always-on microphones in the homes of people least equipped to evaluate the risks trades one problem for another that may be worse.