
The Online Safety Bill - Does It Go Far Enough?

Posted: 28th May 2021 by Barkle & Leacock

In May, the UK government released its long-awaited Online Safety Bill, which aims to provide a framework for identifying and removing harmful content from the internet. Greta Barkle and Guevara Leacock of the Data Protection team at BCL Solicitors analyse the draft Bill and offer their insights as to what it means for companies operating in the UK.

Eagerly awaited, the draft Online Safety Bill has finally been published, delivering on the government’s manifesto commitment to make the UK “the safest place in the world to be online”. The Bill has its genesis in the Online Harms White Paper, published over two years ago in response to widespread concern at the malign underbelly of the internet. But following passionate lobbying by stakeholders, is the result a Bill which has tried so hard to please all interested parties that it ends up satisfying no-one?

Elusive duty of care

The cornerstone of the Bill is a new ‘duty of care’ placed on service providers to protect individuals from ‘harm’. It will apply both to providers based in the UK and – nebulously – to those having ‘links’ here. In the government’s sights is the gamut of illegal and legal online content, from child sexual exploitation material and terrorist activity to cyber-bullying and trolling.

The ‘duty of care’ will apply to search engines and providers of internet services which allow individuals to upload and share user-generated content. In practical terms, this net will catch social media giants such as Facebook as well as less high-profile platforms such as public discussion forums.

As regards illegal content, the duty will force all in-scope companies to take proportionate steps to reduce and manage the risk of harm to individuals using their platforms. High risk ‘category 1’ providers – the big tech titans with large user-bases or which offer wide content-sharing functionality – will have the additional burden of tackling content that, though lawful, is deemed harmful, such as the encouragement of self-harm and misinformation.

Adding a further level of complexity, the regulatory framework will apply to both public communication channels and services where users expect a greater degree of privacy, such as online instant messaging services and closed social media groups.

Quite how service providers will be expected to meet these onerous new obligations is not specified in the Bill; instead, they must wait for full Codes of Practice to be issued.

Rabbits from the hat

Sensitive to public pressure, the government has built on early iterations of its proposals to include new measures addressing concerns raised during the consultation process over freedom of expression, democratic debate, and online scams.

The initial release of the Online Harms White Paper triggered a furore over the potential threat to freedom of speech, with campaigners fearing the proposals would have a chilling effect on public discourse as service providers self-censored rather than face swingeing regulatory penalties for breaches in relation to ill-defined harms. In response to such concerns, service providers will be expected to have regard for the importance of protecting users’ rights to freedom of expression when deciding on and implementing their safety policies and procedures.

Concern has been building for some time about the influence which the largest social media companies potentially wield over political debate and the electoral process. This was seen most starkly in the US during the recent presidential election, where some platforms may have felt like a political football in their own right. While there are only distant echoes of that here, the role which social media plays in UK democratic events has attracted attention and, in a nod to this, the government has proposed a new duty on category 1 providers to protect “content of democratic importance”. In what might euphemistically be described as opaque, such content is defined as “content that is, or appears to be, specifically intended to contribute to democratic political debate in the United Kingdom…” Service providers affected might well be left scratching their heads about quite how they are supposed to interpret and satisfy this obligation, and it is to be hoped that the eventual Codes of Practice will provide some much-needed clarity. Absent such guidance, the risk is that they will be pilloried by all sides.


Following a vocal campaign from consumer groups, industry bodies and Parliamentarians, the government appears to have capitulated to pressure to include measures bringing online scams within the scope of the Bill. eCommerce fraud is estimated to be up 179% over the last decade, with romance scams alone resulting in UK losses of £60 million in 2019/20. All service providers will be required to take measures against these illegal online scourges. Commentators have noted, though, that frauds committed via online advertising, cloned websites and emails will remain outside the Bill’s ambit, leaving many investors still vulnerable to the lure of sham investment schemes.

A fierce watchdog?

This ground-breaking regulatory regime will be enforced by a ‘beefed-up’ Office of Communications (‘Ofcom’) which will wield an arsenal of new powers including fines and, in the last resort, business disruption measures. Penalties of up to £18 million or 10% of annual global turnover (whichever is the greater) will be at the regulator’s disposal. Those calling for senior management liability will, however, be disappointed; the Bill will not impose criminal liability on named senior managers of in-scope services, though the Secretary of State has reserved the power to introduce such liability in the future.

Conclusion

It remains to be seen how the tension between online safety on the one hand, and freedom of expression and democratic debate on the other, will play out. Service providers and Ofcom alike will no doubt have their plates full trying to decipher just how to moderate lawful but harmful online content whilst also ensuring that users’ freedom of expression and democratic debate are not adversely affected.

 

Greta Barkle, Associate

Guevara Leacock, Legal Assistant

BCL Solicitors LLP

Address: 51 Lincoln's Inn Fields, London WC2A 3LZ

Tel: +44 (0)20 7430 2277

Fax: +44 (0)20 7430 1101

Email: law@bcl.com

 

BCL Solicitors LLP is a market-leading London-based firm specialising in domestic and international corporate and financial crime, tax investigations and litigation, financial regulatory enforcement, corporate manslaughter and health and safety offences, disciplinary proceedings, serious and general crime, as well as the associated areas of anti-money laundering and anti-corruption compliance and risk management. BCL is consistently top-ranked by Chambers & Partners and The Legal 500, and recognised in Who’s Who Legal and in GIR’s list of the world’s leading investigations firms.

Greta Barkle is a New Zealand-qualified lawyer specialising in business crime, regulatory investigations, extradition and cybercrime. Prior to joining BCL Solicitors LLP, she worked on a range of complex disputes, including claims against the New Zealand Police and Government Communications Security Bureau in relation to dawn raids and unlawful surveillance of communications.

Guevara Leacock is a Legal Assistant in BCL’s Corporate and Financial Crime Team. He joined the firm in 2019 from another leading criminal defence practice where he also specialised in white-collar and business crime.
