Vadim Drobinin wanted venison. That goal sent him to a shooting range near Edinburgh, where he spent Wednesday evenings lying on a mat, shooting at targets, then walking downrange to score them with brass plug gauges. The shooting was fun. The plug-pushing ritual was not. So Drobinin, an iOS engineer by trade, built a computer vision app to do it automatically.

The technical problem is harder than it looks. Bullet holes are negative space. Object detectors are trained to find things that exist in an image, not gaps where material was removed. Apple's Vision framework failed immediately, tagging the target's center dot and random ring fragments as bullet holes. Drobinin's solution combines two approaches: OpenCV handles the geometric structure of the target (finding rings, measuring radii), while a fine-tuned YOLOv8 model exported to CoreML detects the actual holes. The geometry work draws from a 2012 paper by Rudzinski and Luckner at Warsaw University of Technology that achieved 99% detection on clean ISSF targets. But NSRA cards have printed score numbers on the rings, and .22 bullets leave ragged tears instead of the cookie-cutter holes that air rifles produce. Drobinin had to adapt.

Professional electronic scoring systems like Sius Ascor use acoustic sensors behind the target and cost thousands per lane. They're standard at Olympic competitions and offer sub-millimeter accuracy. But they demand permanent installation and maintenance. A phone app won't replace that infrastructure. What it can do is give practice ranges and amateur shooters a tool that doesn't require buying dedicated hardware. Some Hacker News commenters questioned whether automating a quick manual task makes sense, reflecting the [public's reluctance toward automation](/news/2026-04-23-the-people-do-not-yearn-for-automation). Fair point. But Drobinin's project shows how accessible computer vision has become. Fine-tune a model, export to CoreML, combine with classical CV techniques, and you've got a specialized tool running on the phone in your pocket.
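The "classical CV" half of that combination is mostly plain geometry: once the ring centre and the card's scale are known, a hole's score follows from its radial distance. A minimal sketch with hypothetical ring dimensions; real NSRA spacing and plug-gauge rules differ:

```python
from math import hypot

# Illustrative card geometry, not official NSRA dimensions.
RING_STEP_MM = 8.0   # assumed radial width of each scoring band
MAX_SCORE = 10

def score_shot(hole_px, centre_px, px_per_mm):
    """Score one hole from its radial distance to the ring centre.

    Simplification: scores the hole's centre point. Real plug gauging
    credits the higher ring whenever the gauge touches the dividing
    line, so a production scorer would add the gauge radius first.
    """
    dist_mm = hypot(hole_px[0] - centre_px[0],
                    hole_px[1] - centre_px[1]) / px_per_mm
    return max(MAX_SCORE - int(dist_mm // RING_STEP_MM), 0)

# A dead-centre shot scores 10; at 2 px/mm, a hole 40 px out
# is 20 mm from centre, i.e. two bands down.
assert score_shot((200, 200), (200, 200), px_per_mm=2.0) == 10
assert score_shot((200, 240), (200, 200), px_per_mm=2.0) == 8
```

The hard part isn't this arithmetic; it's getting reliable hole centres out of ragged .22 tears, which is where the fine-tuned YOLOv8 model comes in.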