One step closer to a Universal Age API
Maximilian Bleyleben
An initiative from Meta shows the way
Just a few days after I wrote about the idea of a Universal Age API, Meta doubled down on its own proposal for industry collaboration, citing its new approach to age assurance in the Quest VR ecosystem as a model. It is a very relevant (if incomplete) example, and one worth taking seriously.
Given that age assurance luminaries and practitioners are descending on Manchester this week (with a few football fans arriving early) for the Global Age Assurance Standards Summit, I thought it worthwhile to push on this idea a bit more and hopefully trigger a discussion...
Meta's Quest VR Age Assurance Model
Meta used the launch of parent-supervised Quest accounts for preteens (10-12) to enforce (because it can) a method of exchanging age information with app developers. In doing so, it is making its Quest store the hub for age data that determines what apps a user can access (and could presumably be used by those developers to adapt their services). In its simple approach, there are two APIs:
In this case, Meta uses age categories specifically relevant to the Quest audience: 10-12, 13+, and mixed (10+). They not only determine app access, but also help Meta configure platform-level settings:
Teens aged 13 to 17 will have more privacy settings turned on by default and can be monitored through parental supervision tools. Preteens aged 10 to 12 have even more restrictive settings turned on, with only parents or guardians able to change privacy settings.
This is exactly the kind of age-based adaptation that age-appropriate design codes and new online safety laws require from operators.
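As a concrete (and entirely hypothetical) illustration, here is a sketch of how a user's age category might map to platform-level defaults, using the groupings and behaviours quoted above. None of these identifiers come from Meta's actual API; they are my own labels for the purpose of the sketch:

```python
from dataclasses import dataclass

# Hypothetical category labels mirroring the Quest groupings described
# above (10-12 preteens, 13+ teens); not Meta's actual identifiers.
PRETEEN = "preteen_10_12"
TEEN = "teen_13_plus"

@dataclass
class PlatformDefaults:
    private_by_default: bool        # extra privacy settings on by default
    parental_supervision: bool      # supervision tools available
    user_can_change_privacy: bool   # False => only a parent/guardian can

# Illustrative mapping from a user's age category to platform-level
# defaults, following the behaviour quoted from Meta above.
DEFAULTS = {
    PRETEEN: PlatformDefaults(True, True, False),
    TEEN: PlatformDefaults(True, True, True),
}

def defaults_for(category: str) -> PlatformDefaults:
    """Return the default settings an app could apply for a category."""
    # Fall back to the most restrictive profile for unknown categories.
    return DEFAULTS.get(category, DEFAULTS[PRETEEN])
```

The point of the sketch is simply that once a shared category vocabulary exists, app-level adaptation becomes a lookup rather than a fresh age check.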
An immediate problem arises, which Meta seeks to pre-empt: handling conflicting age information, whether from users re-entering their age at launch or via a return API call from an app developer that had previously age-gated the user. Meta says it will reverify age using a 'hard' method such as a credit card or ID (presumably the parent's) if it receives a contradictory signal.
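The reconciliation rule Meta describes can be sketched as follows. The function and return values are my own illustration of the decision logic, not Meta's API:

```python
# A minimal sketch of the adjudication step described above: when a
# developer's age signal contradicts the platform's record, escalate to
# a 'hard' verification method. All names here are illustrative.

HARD_METHODS = ("credit_card", "id_document")  # presumably the parent's

def reconcile(platform_category: str, developer_category: str) -> str:
    """Decide how to handle an incoming age signal from a developer."""
    if developer_category == platform_category:
        return "accept"  # signals agree; keep the existing category
    # Conflicting signals: Meta says it will reverify using a hard
    # method rather than silently trusting either side.
    return "reverify:" + "|".join(HARD_METHODS)
```

Trivial as it looks, agreeing on who wins when signals conflict, and what evidence breaks the tie, is the hard part of making any such exchange interoperable across platforms.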
I welcome Meta’s openness with this experiment and hope it will continue to share learnings, in particular around its process for adjudicating conflicting age signals, which is one of the key challenges to solve in building a Universal Age API.
A Phased Approach
In this case, Meta’s ulterior motive is to promote legislation that might force app stores to be responsible for age verification and even centralised parental consent management. The latter may not be feasible (or even desirable), but the simple idea of making age signals interoperable is a powerful potential solution to 90% of the age assurance problem. Here’s how we get there from here:
By getting the industry to take these relatively simple steps — with the tacit support of solution-oriented regulators — we can:
It’s good that the professionals are meeting in Manchester to drive forward the necessary debate on standards, effectiveness and interoperability. While the video game console makers seem to be thinly represented at the event, I do hope there will be space to discuss the simpler things we can do by reusing existing age signals in the ecosystem.
This article first appeared on my Substack. If you like it and want timely delivery of future posts directly in your inbox, please consider subscribing.
[1] We will still need plenty of innovation in how actual age verification for u18s is done, creating opportunities for vendors of age estimation and age verification tools. I am not at all downplaying the importance of making those more accurate, more accessible, more inclusive, more secure and more private.
[2] Choice is the way to address many of the criticisms of age verification, including access/inclusion, as well as user concerns about privacy. The fact is people worry about different things, and choice can address that. When verified parental consent platform Kids Web Services made available to parents outside the US a choice of three methods to verify their adulthood — credit card, document ID scan, facial age estimation — more than 65% chose facial age estimation because it is more private than the more traditional methods. This was confirmed across millions of verifications in Europe, Japan, Mexico and elsewhere.
[3] The Digital Services Act — which is now in effect in the EU — requires online platforms to proactively protect u18s from online dangers and risks, including harassment, bullying and misinformation, and it bans profile-based advertising to u18s. It also kicked off a separate workstream on an EU Code of age-appropriate design.
The UK’s Online Safety Act — which is gradually getting more meat on its bones from Ofcom — will require platforms to enforce age limits and implement age-checking measures, and prevent children from accessing harmful and age-inappropriate content.