A New Role for the UK’s Watchdog Ofcom
Facebook, Twitter, Instagram and Snapchat have, until now, enjoyed the right to self-regulate according to self-defined community rules. A string of failures to uphold their own standards of care has, however, led to calls for independent regulation of the tech giants.
In the UK, the tragic death of Molly Russell, following her exposure to graphic content on Instagram, proved a tipping point. The UK government published the Online Harms White Paper in 2019, setting out its intention to protect online users. One of the proposals was to appoint a regulator to ensure that social media companies recognise and act upon the ‘duty of care’ they owe to their users.
Why Choose Ofcom as the UK Regulator?
The Culture Secretary, Nicky Morgan, announced that Ofcom would be appointed to the role of regulator. Ofcom was chosen because of its track record as regulator for ‘broadcasting, telecoms and other related industries’. It now has the task of enforcing a ‘duty of care’ and protecting users from “harmful and illegal terrorist and child-abuse content”. Dame Melanie Dawes will move from the Ministry of Housing, Communities and Local Government to become CEO of Ofcom in March 2020.
What do the Critics Say?
The government has been criticised both by campaigners for free speech and by MPs who fear that the regulator won’t be given the ‘muscle’ it needs to enforce change. Despite calls for the government to criminalise any failure in the duty of care, final decisions won’t be made until the spring. Conservative MP Julian Knight stated:
“The regulator must take a muscular approach and be able to enforce change through sanctions that bite. That means more than a hefty fine. It means having the clout to disrupt the activities of businesses that fail to comply and, ultimately, the threat of a prison sentence for breaking the law.”
There is, at present, no requirement for social media companies to apply age verification to their platforms.