Feature

Will The Online Safety Bill Impact Your Rights?

By Hannah Shewan Stevens, Freelance Journalist, 23 Sep 2021
Credit: Chad Madden / Unsplash


The internet is an untamed beast with sparse rules and regulations, so comprehensive moderation – whether desirable or not – seems an impossible task. Nonetheless, it is one with which the UK government is currently attempting to grapple. 

The introduction of a Bill to keep people safe online was announced in 2019 with the Online Harms White Paper, which set out aims to pressure social networks to take on more responsibility for harm caused on their sites. It focused on enforcing high standards on social networks, which would face legal consequences if they did not adhere to the rules.

The Bill would empower the government to take action against, for example, a social networking site that claimed to be child friendly but failed to remove harmful imagery from the online environment. 

Only the biggest social networks would have to abide by the regulations, and Ofcom was suggested as the relevant regulator. Generally, the concept was welcomed as a positive development amid the chaos of internet communications, but critics are concerned that the current draft could infringe upon the right to freedom of expression, protected by Article 10 of the Human Rights Act.

Overall, what is now the draft Online Safety Bill, which is due to be discussed in an oral evidence session in Parliament today, falls short of many of the expectations placed upon it.

What does the Bill propose to do?

According to the government’s website, the draft Bill “establishes a new regulatory framework to tackle harmful content online”, delivering on the administration’s manifesto commitment to “make the UK the safest place in the world to be online, while defending free expression”. 

The Bill outlines a new duty of care for online platforms toward their users, requiring them to take action against both illegal and legal harmful content. 

It gives Ofcom the authority to fine social media firms and tech companies that are deemed to have failed in their duty of care. Any platform that neglects this duty will be liable to be fined up to £18 million or 10% of its annual turnover. Ofcom will also publish codes of practice for companies to follow.

“One of the things that struck me reading through this Bill is that quite a lot of power to make amendments is given to the secretary of state (SoS),” said Simon Spence QC, a barrister. “Most acts of parliament contain some sort of ability for a minister to make changes, but one of the things I did note is that the SoS can amend it if it’s not thought to be complying with government policy. I don’t think I’ve ever seen something worded that way in an act of parliament previously. It would mean that if they didn’t think Ofcom were truly representing the government’s policy on online security, the SoS can effectively override them and change it.”

The Bill also tasks large platforms with a series of other vaguely outlined responsibilities, including demonstrating that they can simultaneously protect posts “of democratic importance” and manage others that are lawful but harmful. The scope of the Bill appears to encompass direct messages via social media and video calling platforms like Zoom. It defines “harmful” as content that risks “a significant adverse physical or psychological impact” on someone with “ordinary sensibilities”. 

News publishers’ own websites, including comment sections, are exempt. Section 14 sets out protections for journalism, with an expedited appeal process for takedowns if the material is considered journalistic.

What are the Bill’s flaws?

One of the most significant criticisms of the Bill is its vagueness. It does not clearly define what “harmful” content actually means or what “ordinary sensibilities” are.

“Who decides what’s harmful and what isn’t?” asked Spence. “What happens if people disagree about it? Who makes the ultimate decision? When something is as grey and open to interpretation as that, I do think you’ve got a potential problem. In all statutes, the overriding thing is that they’re clear and unambiguous, and I think there’s a lack of clarity and an ambiguity to this, which is not what you’d expect to see in an act of parliament.”

The Bill’s definition of physical and psychological harm is also vague enough that it will be difficult to enforce with clarity. 

“What is interesting is that the definition of physical and psychological harm specifically excludes harm caused by financial impact,” added Spence. “That does seem to me to be playing into the hands of commercial interest. If something is said or done publicly that causes significant financial harm to an individual and they, for example, have a nervous breakdown as a result, why is that any less serious than someone making a direct comment about them in public?”

Additionally, its reference to people of “ordinary sensibilities” could be troublesome for courts and service providers to interpret. 

“To what extent do you consider the child’s circumstances and identity?” asked Spence. “If you’re the child of a wealthy family with no concerns, then people saying things about you may not have the same impact as they would on an equally robust child from a poor background. You can imagine any set of scenarios where the very specific comment communicated will affect child A but not child B, and both may be of ‘ordinary sensibilities’.”

How could the Bill affect freedom of expression?

Although the Bill includes protective clauses for journalists, some fear it does not go far enough to protect them, or our freedom of expression. The Bill has also come under fire for being essentially unnecessary, because harmful and illegal content, like child pornography and racist abuse, is already illegal under other legislation.

“No one should be making a pre-judgement about whether journalistic content should be read by the public,” said Peter Wright, the editor emeritus of DMG Media, which publishes prominent UK newspapers. “It certainly shouldn’t be commercial organisations overseen by a state regulator.”

Tying heavy penalties to content regulation would impact freedom of expression, according to the UN special rapporteur on freedom of expression. The Committee to Protect Journalists (CPJ) revealed that Turkey uses similar legislation to suppress freedom of speech by preventing news from circulating online.  

The inclusion of private messages in the Bill may undermine digital privacy and anonymity protections. While the Bill does not ban end-to-end encryption – which prevents anyone but the sender and recipient from reading a private message – or anonymous internet use, it is not clear how companies that are unable to read direct messages are expected to meet its requirements for managing their contents.

Who else has criticised it?

Multiple organisations have criticised the Bill for its potential to erode freedom of expression and suppress marginalised voices online. Critics argue that the proposed “duty of care” requirement is too vague and poorly defined, meaning it could be used to remove content from marginalised communities, such as sexual health information for LGBTQ+ people.

A letter signed by Stephen Fry and other LGBTQ+ campaigners said: “This Bill, as it stands, would provide a legal basis for this censorship. Moreover, its vague wording makes it easy for hate groups to put pressure on Silicon Valley tech companies to remove LGBTQ+ content and would set a worrying international standard.”

Incentivising platforms to manage content under the threat of financial penalty could lead to the curtailment of freedom of expression, as providers decide what users are allowed to share.  

Conversely, the NSPCC argues that the Bill does not go far enough, saying it has “substantive weaknesses” in its current form. New figures obtained by the charity showed that the number of online child sexual offences has risen from 5,458 in 2016/17 to 9,736 in 2020/21. 

“The Government has a once-in-a-generation chance to deliver a robust but proportionate regulatory regime that can truly protect children from horrendous online harms,” said Sir Peter Wanless, chief executive of the NSPCC. “But, as it stands, there are substantive weaknesses in its plans which cannot be overlooked. The draft Bill fails to prevent inherently avoidable abuse or reflect the magnitude and complexity of online risks to children.”

Is the Bill salvageable? 

With widespread criticism blocking its way, it’s likely the Bill will undergo significant changes before it is able to pass through both houses of parliament.  

“The regulatory powers given to the SoS should be subject to the scrutiny of parliament,” said Spence, “because if parliament decided that the SoS was acting unlawfully, they would then be able to say ‘you can’t do that’. As I read it, although parliament has the power of scrutiny over Ofcom’s codes of practice, there doesn’t appear to be any scope for parliament to oversee any amendments made to the codes of practice by the SoS, which is a potential area of concern.”

Regardless of whether it’s enacted, the Bill itself appears too vague to have the teeth it needs to take on social network giants, and it lacks sufficient protections for vulnerable groups. 

“The overall aim of the Bill is one to be applauded, but I do think there is some devil in the detail,” continued Spence. “I hope that there’ll be some changes made to it, to clarify some aspects of it. One of the interesting things is that they say that codes of practice must be easy to understand and so forth, but the very act they’re introducing is anything but.”

About The Author

Hannah Shewan Stevens, Freelance Journalist

Hannah Shewan Stevens is an NCTJ-accredited freelance journalist, editor, speaker and press officer based in Birmingham. She acted as EachOther's Interim Editor from Summer 2021 to January 2022. Her areas of interest are broad-ranging but the topics she is most passionate about are disability, social justice, sex and relationships and human rights. Hannah believes in using her own voice and elevating others to create meaningful change in the world. She is also a sex columnist for The Unwritten and has recently completed her first accreditation in delivering Relationships and Sex Education.
