Millions of UK iPhone users woke up to find their devices treating them like children. Apple's iOS 26.4 update automatically enables web content filtering and AI-powered "Communication Safety" tools that restrict access to websites and blur content in messaging apps. The only way out is proving you're an adult through Apple's verification system, which accepts credit cards, driver's licenses, or Apple accounts opened before 2008. Passports aren't accepted. According to Big Brother Watch, roughly one in three UK adults don't have a credit card, and one in five lack a driver's license.
The privacy group, led by director Silkie Carlo, is blunt: this isn't required by UK law. The Online Safety Act 2023 targets websites and online services, not operating systems. Apple has imposed similar demands only in South Korea and Singapore. Big Brother Watch wants the ID checks dropped entirely and the tools made optional.
Ofcom, the UK regulator enforcing the Online Safety Act, has been pushing for device-level age verification through closed technical working groups. In 2023, Apple publicly threatened to pull FaceTime and iMessage from the UK rather than weaken encryption under the Investigatory Powers Act. Now it is implementing OS-level content filtering anyway. The apparent trade-off: broader content moderation in exchange for keeping encryption standards intact.
The bigger concern is precedent. If operating systems become internet gatekeepers, device-level controls could spread worldwide; several countries are already considering age verification for social media. And the exclusion problem is real: older adults, low-income households, and disabled people are the least likely to hold acceptable ID. Big Brother Watch warns the checks could accelerate the push for a national digital ID system in the UK, something most of the public opposes. Meanwhile, some users are refusing to install the update, which means going without security patches. Apple risks making its users less safe while trying to protect children.