Each disc flips with an electromagnetic pulse. 3,528 of them clicking in sequence sounds like rain on a window.
Nine AlfaZeta panels in a 3x3 grid give you 84x42 individually addressable discs. The display refreshes at 25-60fps. A 24V Mean Well supply powers the panels; an NVIDIA Jetson Orin Nano handles computation.
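That geometry maps cleanly to a flat frame buffer: one byte per disc, panels tiled 3x3. A minimal sketch using the build's dimensions (the helper names here are mine, not the project's):

```javascript
// Display geometry: a 3x3 grid of 28x14-disc AlfaZeta panels = 84x42 discs.
const PANEL_W = 28, PANEL_H = 14;
const COLS = 3, ROWS = 3;
const WIDTH = PANEL_W * COLS;   // 84
const HEIGHT = PANEL_H * ROWS;  // 42

// One byte per disc: 0 = dark side, 1 = flipped to the bright side.
const frame = new Uint8Array(WIDTH * HEIGHT); // 3,528 discs

function setDisc(x, y, on) {
  frame[y * WIDTH + x] = on ? 1 : 0;
}

// Flip a diagonal line across the whole display.
for (let y = 0; y < HEIGHT; y++) setDisc(y * 2, y, true);
```

Every frame you push is just this array changing state, 3,528 discs at a time.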
The tech traces back to 1963. Kenneth A. Taylor filed a Ferranti patent for the original mechanism. Now someone's wired it up with modern machine learning.
Node.js drives the display through an npm library called flipdisc. PixiJS, Three.js, Matter.js, and GSAP handle rendering, physics, and animation. Google's MediaPipe provides gesture recognition from an IMX708 camera feed. An Expo mobile app lets you control what's playing.
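Whatever the renderers draw has to be squashed to one bit per disc before it reaches the hardware. A hedged sketch of that step: `toDiscFrame` is my own name, and the `display.send` call in the comment stands in for however the flipdisc library actually accepts frames.

```javascript
// Convert an RGBA pixel buffer (e.g. read back from a PixiJS or Three.js
// canvas) into a 1-bit disc frame by luminance threshold.
// Names are illustrative, not the flipdisc library's actual API.
const WIDTH = 84, HEIGHT = 42;

function toDiscFrame(rgba, threshold = 128) {
  const frame = new Uint8Array(WIDTH * HEIGHT);
  for (let i = 0; i < frame.length; i++) {
    const r = rgba[i * 4], g = rgba[i * 4 + 1], b = rgba[i * 4 + 2];
    // Rec. 601 luma weights: perceived brightness of the pixel.
    const luma = 0.299 * r + 0.587 * g + 0.114 * b;
    frame[i] = luma >= threshold ? 1 : 0;
  }
  return frame;
}

// Per frame, roughly: read pixels from the renderer, threshold, send.
// const rgba = renderer.extract.pixels(stage);
// display.send(toDiscFrame(rgba));
```

The threshold is the whole aesthetic decision: discs are binary, so dithering or a well-chosen cutoff is what makes motion read well at 84x42.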
The next step is multimodal AI: voice, video, and images processed at the same time. The ML pipeline spawns Python scripts from Node.js and talks to them over ZeroMQ IPC. It's a practical bridge between Python's ML ecosystem and JavaScript rendering.
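The bridge is simple enough to sketch. Assuming a request/reply socket and a JSON envelope; the worker script name, port, and message shape below are stand-ins, not the project's actual protocol:

```javascript
import { spawn } from "node:child_process";

// Spawn the Python ML worker as a child process (script name is hypothetical).
function startWorker() {
  return spawn("python3", ["vision_worker.py"], { stdio: "inherit" });
}

// JSON envelope for messages crossing the Node <-> Python boundary.
function makeRequest(task, payload) {
  return JSON.stringify({ task, payload, ts: Date.now() });
}

// One round trip over ZeroMQ req/rep. Needs the `zeromq` npm package;
// the dynamic import keeps this sketch loadable without it installed.
async function queryWorker(task, payload) {
  const zmq = await import("zeromq");
  const sock = new zmq.Request();
  sock.connect("tcp://127.0.0.1:5555"); // port is an assumption
  await sock.send(makeRequest(task, payload));
  const [reply] = await sock.receive();
  return JSON.parse(reply.toString());
}
```

Req/rep keeps the contract obvious: Node asks, Python answers, and neither side needs to know the other's language beyond the JSON envelope.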
Flipdiscs remain stubbornly niche. They're mostly found in transportation signage, and LEDs are replacing them in many installations. Panels are hard to source and expensive, sold to businesses rather than consumers. But for interactive art like this computer vision project, where you don't want screen glow, they're hard to beat.
The author open-sourced the flipdisc library for AlfaZeta and Hanover boards. Anyone wanting to experiment now has a real starting point.