Is this the biggest change in age verification?

Towards the end of 2021, Ofcom published a set of new “appropriate measures” that will have an enormous impact on Video Sharing Platforms (VSPs) such as TikTok, Snapchat, Twitch and OnlyFans. This significant regulatory change will raise awareness of age verification among both businesses and the general public. As a result, a range of well-known platforms will require an age verification process for their users, ensuring that platforms protect under-18s from viewing videos and adverts containing sensitive material.

The regulations are set out in Ofcom’s VSP Framework, which describes measures that operators of VSPs should take to protect their users from harmful material.

What is a VSP?

A VSP, or Video Sharing Platform, is an online video service that allows users to upload and share videos, and whose principal purpose or essential functionality is the “provision of videos to members of the public.”

VSPs are similar to on-demand video platforms, but there are some key differences. For example, the operator of an on-demand video platform may have complete control over what content appears on their platform. The operator of a VSP may not control what videos are on their platform, but they will have a degree of control over how those videos are presented. For example, they may have control over algorithms that determine what videos a visitor to the platform will encounter first.

Why do VSPs need regulation?

VSPs have transformed how we interact with each other and access entertainment, often in positive ways. However, research indicates that seven in ten users have experienced something potentially harmful while using them. Ofcom has been given new duties to address this.

Ofcom’s VSP framework designates two kinds of harmful material. Relevant Harmful Material includes material such as incitement to violence against particular groups, homophobic or racist content, and child sexual exploitation material. Restricted Material includes material which has been, or would be, given an R18 certificate, or any other material that may impair the physical, mental or moral development of under-18s.

How can VSPs comply with Ofcom’s VSP framework?

Ofcom’s approach to regulating VSPs will differ from their approach to regulating TV or radio. Instead of monitoring individual pieces of content, Ofcom will monitor the systems that VSPs have in place to protect their users from harm. The actions required from VSPs will therefore depend on the nature of the service they offer. Ofcom will also consider the size and capabilities of VSPs when taking regulatory action.

Whilst Ofcom’s approach is generally not prescriptive, there are certain measures that Ofcom considers central to achieving its main requirements, and it expects these measures to be acted upon.

One “main requirement” is protecting under-18s from adult material. Implementing robust age verification is considered central to meeting this requirement. Ofcom has stated:

“We believe that the age verification sector is mature and competitive enough to offer a range of solutions to adult VSPs… We are satisfied that adequate, easily integrated age verification options are available.”

Ofcom’s Enforcement Powers

Ofcom has the power to send enforcement notifications requiring VSP providers to take specified actions. Additionally, Ofcom may impose financial penalties of up to £250,000 or 5% of qualifying revenue, whichever is greater. In the most serious cases of non-compliance, Ofcom also has the power to suspend or restrict a service.

How Can AgeChecked Help?

AgeChecked offers age verification solutions for online adult content including video and images. Our solutions keep customer data private while ensuring the onboarding journey is seamless. If you distribute age-restricted content online, it might be time to get in touch with our team today.

Get in touch to find out more. We are always happy to help you remain compliant and can offer a tailored approach to suit your business.
