A smarter approach to age verification? Apple thinks so
Apple’s new age verification system shifts responsibility to device manufacturers—echoing elements of China’s approach.
Last year, before taking long service leave, I wrote an article for The Strategist about what Australia could learn from China’s approach to youth screen-time restrictions. My argument was simple: whatever social media platforms are doing to keep under-13s off their services isn’t working, and the responsibility for enforcing age limits should be distributed across the entire digital ecosystem—from app developers to app stores and device manufacturers.
Now, Apple has just announced a new approach to age verification that does exactly that. Instead of leaving the burden on social media companies, Apple is shifting it to the device itself, allowing developers to ask an iPhone for an age range—4+, 9+, 13+, 16+, or 18+—without collecting personal data. This aligns closely with what I argued: that device manufacturers and app stores are in the best position to enforce age restrictions, rather than expecting social media platforms to handle it alone.
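For developers, the mechanics might look something like the sketch below. To be clear, this is a hypothetical illustration based only on the public description of the feature, not Apple's actual API: the `AgeRange` enum, the `requestDeclaredAgeRange()` call, and the stub behaviour are all invented names. The point is simply that the app receives a coarse bracket set up by a parent on the device, never a birthdate or an ID document.

```swift
import Foundation

// Hypothetical sketch of a device-level age-range check. All names here
// are illustrative, not Apple's real API.
enum AgeRange {
    case fourPlus, ninePlus, thirteenPlus, sixteenPlus, eighteenPlus
}

// Stub standing in for the OS-provided service. On a real device this
// would be backed by the parent's setup of the child's phone, and the
// user (or parent) could decline to share it, hence the optional.
func requestDeclaredAgeRange() async -> AgeRange? {
    return .thirteenPlus
}

// Example gate: a social app deciding which experience to load,
// without ever seeing who the user actually is.
func configureExperience() async {
    guard let range = await requestDeclaredAgeRange() else {
        applyRestrictedDefaults()   // no age signal: most restrictive mode
        return
    }
    switch range {
    case .fourPlus, .ninePlus:
        applyRestrictedDefaults()   // under 13: block sign-up entirely
    case .thirteenPlus, .sixteenPlus:
        enableTeenSafeguards()      // 13 to 17: limited features by default
    case .eighteenPlus:
        enableFullExperience()
    }
}

func applyRestrictedDefaults() { print("Restricted mode") }
func enableTeenSafeguards()    { print("Teen safeguards on") }
func enableFullExperience()    { print("Full experience") }

// Top-level await works in a main.swift file or a Swift script.
await configureExperience()
```

The design choice worth noticing is that the richest answer an app can ever get is a bracket, which is exactly why no ID check is needed on the app's side.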
This is a significant shift. As Benedict Evans points out in his latest newsletter, the current model is deeply flawed: platforms like Instagram and Snapchat need to determine if a user is both over 13 and under 18, but younger teens often don’t have any official ID to prove their age. Apple’s approach sidesteps this problem by verifying age at the device level and making parents responsible for setting up their child’s phone correctly.
It’s an elegant solution—at least for apps. The problem, as Evans also notes, is that Apple doesn’t allow adult content apps in the App Store, meaning that websites like OnlyFans won’t benefit unless Apple extends the system to Safari. That raises the question of whether Google will adopt a similar approach and, if so, whether it will bring the system to Chrome as well as Android devices.
When I wrote about this issue, I was being half cheeky by pointing to China as a role model, but also half serious. While China’s authoritarian social engineering approach isn’t something Australia should replicate, its structural enforcement mechanism—forcing collaboration between app stores, device makers, and app developers—offers a useful template. Apple’s new proposal shows that this isn’t just some abstract idea; it’s a practical, scalable approach.
Of course, the details still matter. Will Apple make this API mandatory for all apps targeting younger users? Will Google follow suit? Will governments mandate its use for social media platforms? These are the next big questions policymakers need to grapple with.
But one thing is clear: the idea that social media platforms alone should be responsible for keeping kids off their services is outdated. The responsibility needs to be distributed across the entire tech ecosystem. That’s what I argued in my original article, and Apple’s move is just the latest evidence that this is the right approach.
I’ll be watching closely to see how this plays out, but for now, it’s good to see that the debate is finally moving in the right direction.
Read my original piece here: Digital spinach: What Australia can learn from China’s youth screen-time restrictions