The Government has today responded to the technology giants' collective failure to adequately moderate and remove terror and abuse content on their platforms with a new 'Online Harms' white paper. The document places the onus to proactively seek out and remove child sexual exploitation, self-harm, abuse, revenge porn and terrorist propaganda content squarely on the shoulders of the technology companies; failure to do so could result in fines, blocks or even complete shutdown. A new industry regulator has also been proposed, possibly funded by an industry levy, with senior managers held liable for breaches.

Jeremy Wright, the DCMS [Digital, Culture, Media and Sport] Secretary, said the era of self-regulation for online companies was over, as they had failed to make any meaningful progress:

"Voluntary actions from industry to tackle online harms have not been applied consistently or gone far enough." He was supported by Home Secretary Sajid Javid, who also highlighted the fact that the Government had been forced to step in because of this dereliction of duty:

"Despite our repeated calls to action, harmful and illegal content - including child abuse and terrorism - is still too readily available online. Put simply, the tech companies have not done enough to protect their users and stop this shocking content from appearing in the first place. Our new proposals will protect UK citizens and ensure tech firms will no longer be able to ignore their responsibilities."

He said the tech giants and social media companies had a moral duty "to protect the young people they profit from", a point the NSPCC has been making since 2017 with repeated calls for a legal duty of care to be placed on social networks: "Time's up for the social networks. They've failed to police themselves and our children have paid the price."

The White Paper proposals include:

  • An independent regulator, funded by a tech industry levy, to hold internet companies to account.
  • A "code of best practice" that social networks and internet companies must follow.
  • As well as Facebook, Twitter and Google, the rules would apply to messaging services such as Snapchat and cloud storage services.
  • The regulator will have the power to fine companies and 'name and shame' those that break the rules.
  • Possible fines for individual company executives.
  • Making search engines remove links to offending websites.
  • Possible blocking of harmful websites or stopping them from being listed by search engines.

The likes of Facebook, Twitter and Google only have themselves to blame for such draconian measures, given the number of issues and their repeated inability to respond adequately, which suggests a certain indifference. The most recent example was the horrendous mass shooting in New Zealand, which was live-streamed on Facebook for 17 minutes; the footage was only removed when local police drew attention to it. However, copies had already been created and circulated on YouTube and Twitter, with horrific footage still online up to three weeks later. The AI filters Facebook uses to deal with such situations failed to flag the video, and Zuckerberg has refused to implement a broadcast delay on live-streamed videos to prevent a recurrence.

The White Paper is long overdue, and is a step in the right direction, but it does raise some significant questions too:

  • Is there to be a new organisation to regulate the internet, bearing in mind what a huge task this is?
  • What measures will it be able to take?
  • Is it to focus entirely on the giant social networks, or on smaller organisations as well, such as message boards?
  • And how is it going to regulate material that is not illegal but may still be considered harmful?
  • Who defines issues such as freedom of speech and censorship?

The next stage for the government is to consult on the proposals, so it will be some time before many of these measures are actually put in place. Finding a balance that makes the online world a safer place, without introducing censorship or an impossible regulatory task, is going to be the biggest challenge, as TechUK points out: "The government must be clear about how trade-offs are balanced between harm prevention and fundamental rights."