How will age verification work in the Metaverse?

If you’ve ever gamed with a VR headset, watched a concert on Fortnite, or played Roblox, then you’ve already encountered what’s starting to be known as the Metaverse. And while every tech guru’s got a pet definition of what that actually means, at the moment it’s used to describe the place where the digital world meets the physical one – an online space where your avatar can interact with others, buy things and have experiences. 


Of course, the Metaverse is only just beginning. Experts say its current state is comparable to the internet in the 80s and 90s. And, just like the early internet, nobody can predict with any certainty what the Metaverse might look like. It could be anything from a unified, Ready Player One-style alternative digital universe to personalised, private spaces for specific groups to hang out – or anything in between.


But it’s already making the wrong sort of headlines. Within the first ten minutes of her Metaverse experience, reporter Yinka Bokinni encountered sexual harassment and racism. As she points out in her piece for The Guardian: “I just found myself asking: do I feel like I could keep myself safe in this environment? Could the average user? Could children? And at the moment: no, I don’t think they can… While using a profile I’d set to 13 years old, I was able to access all sorts of things I shouldn’t have.”


She’s just one of many. In February, Dame Rachel de Souza, the UK’s Children’s Commissioner, called on big tech to ensure that their Metaverse platforms have appropriate age verification tools. “Are you telling me that Mark Zuckerberg, with all his fantastic engineers, can’t keep children safe?” she told the BBC.


In fact, there’s currently no age verification in the Metaverse beyond standard social media age restrictions. Facebook, for example, requires users to be 13 before they can create an account. The same standard applies to users of the Meta Quest (formerly Oculus) VR headset, which Meta, Facebook’s parent company, owns. Once a child has that headset on, there are no controls on which spaces they can access.


Big brands are keen to stake a claim in this digital territory: Disney, Nike, and Balenciaga are just a few who have already made their presence felt. But the lack of sufficient age verification and identity controls in the Metaverse is set to be a huge challenge.


It’s easy to see how, for example, a Disney-branded Metaverse could be used by people seeking to exploit and harm children. Likewise, adult-orientated brands with a presence in the Metaverse should be ensuring that their content is not accessible to those who are underage. That’s already happening: the BBC discovered a Metaverse app that let children into a virtual strip club, downloadable from an app store to the Meta Quest headset with no age checks. The potential for brand damage in both cases is massive – and the damage may already be under way.


So, what are the solutions? There’s talk of a universal digital identity for the Metaverse: the equivalent of a passport or driving licence in the real world. This would verify age and identity for everyone, across the whole range of Metaverse applications, ensuring that nobody can access age-inappropriate content – and that everyone you meet is who they say they are. 


But, of course, that would mean all Metaverse app providers working together and agreeing on one universal form of ID. Right now, that’s a fantasy – as you’ll know if you’ve ever tried to log into your iCloud account using your Google details. It seems far more likely that we’ll end up with a host of different providers: the big tech companies currently working on Metaverse apps are each developing their own digital ID and age verification solutions.


What the result will look like depends on many factors: primarily, just what the Metaverse ends up being. But one thing’s for sure: brands looking to operate within this new and potentially hugely lucrative space need to keep the age verification issue front of mind.
