Canva's Magic Layers feature is supposed to do one thing: separate flat images into editable components. Instead, it started replacing the word "Palestine" with "Ukraine" in user designs. X user @ros_ie9 discovered the bug when the tool changed "cats for Palestine" to "cats for Ukraine," while related terms like "Gaza" were left alone. Canva spokesperson Louisa Green told The Verge the company has fixed the issue and is adding checks to prevent it from happening again.
Why is a segmentation tool rewriting text at all?
Magic Layers is marketed as non-destructive. It should identify visual elements and separate them, not generate new content. But this bug suggests Canva's pipeline runs OCR to detect text and then passes it through an LLM during layer separation. Biases in that LLM's training data caused it to swap one country name for another without asking.
That's a real problem as creative tools build in more AI. Canva is pushing hard against Adobe's AI suite, and Magic Layers is central to that effort. But when your tool silently rewrites user content based on model training you don't fully understand, trust breaks down fast. It's the same oversight problem autonomous AI agents face when they manage complex systems. A Hacker News commenter suggested the model should verify that recombining the layers reproduces the original image exactly, and throw an error if it doesn't.
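That round-trip check is straightforward to sketch. Here's a minimal version assuming the separator emits RGBA layers that alpha-composite bottom-to-top back into the original RGB image; Canva's internal formats aren't public, so the interface here is hypothetical:

```python
import numpy as np

def verify_roundtrip(original: np.ndarray, layers: list[np.ndarray]) -> None:
    """Recombine separated layers and confirm they match the original.

    `original` is an H x W x 3 uint8 RGB image; each layer is an
    H x W x 4 uint8 RGBA image, composited bottom-to-top.
    Raises ValueError if the separation was lossy.
    """
    canvas = np.zeros(original.shape, dtype=np.float64)
    for layer in layers:
        rgb = layer[..., :3].astype(np.float64)
        alpha = layer[..., 3:4].astype(np.float64) / 255.0
        # Standard "over" compositing: new pixel on top of canvas
        canvas = rgb * alpha + canvas * (1.0 - alpha)
    recombined = np.round(canvas).astype(np.uint8)
    if not np.array_equal(recombined, original):
        diff = int(np.count_nonzero(recombined != original))
        raise ValueError(f"layer separation is lossy: {diff} channel values differ")
```

A tool that ran a check like this after every separation would have caught the Palestine/Ukraine swap immediately: the regenerated text layer would not have matched the source pixels.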
That's the minimum bar. Canva missed it.