A deep dive into the Online Safety Bill
The Online Safety Bill (the ‘Bill’), which began life as a Green Paper in 2017, is drawing ever closer to Royal Assent and is currently at the Report Stage in the House of Lords. This controversial and ambitious bill runs to 268 pages – hence the protracted period of intensive (yet necessary) scrutiny and discussion. There will undoubtedly be final amendments made to the Bill before it comes into force, so please note that some elements of the following round-up may be subject to a degree of change in the near future.
1. How and why has this bill come about?
At present, the internet is barely regulated on a mass scale, and a wide variety of harms are therefore inflicted on many users (particularly children, women and vulnerable adults) with alarming frequency. The Bill endeavours to curb this extensive suffering, whilst attempting to preserve the free speech that some critics suggest will inevitably be endangered by its enactment.
2. How will this be accomplished?
Unlike existing legislation, which targets individual online users for specific offences committed, this Bill takes the reverse view (creating corporate liability) and places the onus on the platforms instead (primarily through imposing duties of care), so that companies will have to self-regulate their content and act accordingly.
This way, the Bill can sweep up a huge number of offences and methods of regulation under one umbrella, without having to create separate statutes for each new online offence.
Scope
Those in scope will fall into three primary categories:
- Regulated user-to-user services (i.e. providers of internet services which allow content to be directly uploaded and encountered by users of the service, such as social media platforms).
- Regulated search services (i.e. services which enable users to search multiple websites or databases, such as search engines, including ‘combined’ services which also cross over into the definition above).
- Providers of internet services on which pornographic content (that is not user generated) is published or displayed.
Territorial jurisdiction
The territorial reach of the Bill covers those services in scope which “have links with the UK” and are therefore regulated; the criteria for this are that the platform is capable of being used in the UK by individuals and that there are reasonable grounds to believe there is a material risk of significant harm to individuals in the UK. This captures any company based in the UK, as well as any company based outside the UK whose platform has a significant number of UK users.
Duties of care
Each entity that falls within the scope of the Bill will be categorised according to the type of platform it is, and will owe a range of respective duties of care (some overlapping) that are essentially geared towards two main objectives: the prevention of harm to users (with specific duties regarding children), balanced against protecting the right to freedom of speech.
- The duties of care relating to the prevention of harm are compartmentalised into:
- firstly, illegal content which must be removed (see below); and
- secondly, taking an active role to combat high-risk (yet not illegal) content, such as (but not limited to): the promotion of suicide, eating disorders, cyberbullying and certain pornographic content. The platforms have a duty to actively restrict, monitor and/or remove this second type of content.
- The platforms also have proactive duties of care to uphold journalistic integrity, protect news publisher content and content of democratic importance, and safeguard freedom of expression and privacy.
Offences
- Illegal content (listed in the Schedules to the Bill) is an assortment of offences that are already covered by existing legislation, yet which are transferable to online activity and thus relevant to the Bill. These include (but are not limited to): assisting suicide, threats to kill, public order offences, offences in relation to drugs and firearms, sexual offences, fraudulent advertising, financial crime offences, child sexual abuse, terrorism and assisting illegal immigration.
- The Bill also creates several new offences, such as: ‘communications offences’ (i.e. false communications offence, threatening communications offence, offences of sending or showing flashing images electronically and sending a photograph or film of genitals), ‘revenge porn’ offences and the promotion of self-harm.
Tools to fulfil duties of care and protect users
- In order to carry out their duties, the platforms in scope are expected to tighten access to their content to protect users from encountering certain content which may be harmful to them. The primary tool for this will be the implementation of extremely stringent age verification restrictions, so that users (such as children) must prove their age before being granted access to certain content (such as on pornographic websites).
- The platforms must carry out regular risk assessments; maintain transparent and enforced terms of business and policies in relation to content and access; operate a robust reporting procedure; and fulfil the duty to actively identify, mitigate and/or remove high-risk content and illegal content.
- A mechanism in the Bill known as the ‘triple shield’ imposes a threefold duty on platforms to protect users; it requires that those in scope: a) remove any content that is banned by the platform’s own terms and conditions, b) remove illegal content, and c) provide ‘user empowerment tools’ (such as filters) on their websites, which give users active control over content they may wish to block.
Exempt content
There are a limited number of communications and content types that fall outside the reach of the Bill. These are: internal business communications (such as internal company messages), emails, MMS and SMS communications, live aural communications, limited functionality communications (such as posting a “like” or an emoji) and news content that is posted on a recognised “news publisher” platform.
3. Enforcement and guidance
The Office of Communications (‘Ofcom’) has been appointed as the regulatory body for enforcement of the Bill. Ofcom is tasked with issuing comprehensive guidance to those in scope to actively help companies fulfil their duties under the Bill. This guidance is due to be issued following Royal Assent, and it will drill down into the specific harms and offences that are to be monitored and addressed by the platforms. Companies must be able to demonstrate to Ofcom that the guidance provided is being fully adhered to.
- Ofcom has been granted the following enforcement powers in its role as regulator:
- The issue of provisional notices and confirmation decisions to companies.
- Publication of enforcement action that Ofcom has taken.
- Business disruption measures, such as: blocking a platform’s access to financing companies, advertisers and other service providers which work with it, restricting revenue and UK access to its website, and issuing court proceedings for non-compliance.
- Fines of up to £18 million or 10% of worldwide revenue (whichever is greater) for the most severe breaches of the Bill.
- Individual criminal liability can be imposed on senior management who do not comply with Ofcom’s requests.
- Should a coroner request it, Ofcom has the power to access the social media accounts of any deceased child whom the coroner has reason to believe may have suffered the effects of online harm, where the coroner wishes to review this data (this amendment is undoubtedly derived from the tragic case of Molly Russell).
- Companies in scope must also pay fees to Ofcom, as well as provide reports and findings of relevant incidents to the regulator.
4. What next and what companies should do
- It is imperative to track the progress of the Bill and make a note of the key dates that will be relevant to companies which may be affected by it.
- The biggest asset to any company falling within the scope of the Bill will be the guidance published by Ofcom. Companies should keep a watchful eye on Ofcom’s website and consider any changes that will need to be made to their business.
- The protections and policies that Ofcom will surely require will undoubtedly take some time to perfect; it would therefore be prudent to start drafting these measures in advance of the Bill coming into force, which will likely be before the end of 2023.
- The types of measures and tools likely to be required by the guidance include the following:
- Risk assessments
- Record keeping
- Transparency reports
- Updated terms of service
- User empowerment tools
- New policy guidance
- Age verification restrictions
- Security of access to certain content
- Re-evaluation of algorithms and content control.
5. Controversy
The Bill continues to be widely debated, with the main contenders being those pushing for the safety of children online (such as children’s charities) and those deeply concerned by what they see as a clear threat to freedom of expression and all that that implies.
The main areas of contention under discussion are summarised below:
- The Bill includes powers to require platforms to break end-to-end encryption so that messages can be scanned (primarily) for suspected child abuse material. This has been excoriated by businesses (such as WhatsApp) as an Orwellian violation of the right to privacy under Article 8 ECHR. At least 80 organisations have mounted a joint protest in an attempt to remove this element of the Bill, and many companies have threatened to leave the UK market if the Bill is passed with this section included.
- The most widely used tool to prevent children from accessing unsavoury content will undoubtedly be strengthened age verification measures. These do not appear too contentious at a glance, yet in reality they will require users to upload a troubling amount of personal data simply to prove their age. Companies such as Wikipedia are outraged at this measure being imposed on their users, since their ethos is built upon free and unfettered access to information for all users. Again, Wikipedia (and others like it) have suggested that they will withdraw their platforms from the UK should they be required to adhere to this proposal.
- Another problem with the inevitable surge in personal (and financial) information that will have to be uploaded to pass age restrictions is that the rate of fraud will predictably rise with it. Opportunistic tech criminals savvy enough to intercept this information will certainly be looking to cash in on the wave of data they can exploit for financial gain.
- Critics of the Bill have argued that the penalties for non-compliance are draconian, and that the subjectivity of “harmful content” will tacitly encourage unnecessary platform-led censorship of content, as well as self-censorship by users of the platforms, who will not want to risk their content being removed, monitored or reported. A further issue is that platforms will unofficially take on the role of judge, jury and executioner of content within their remit, which (if the content is not actually illegal) should arguably be left to the public to post freely. Algorithms used to ferret out such content on a mass scale will be unable to detect nuance, sarcasm, humour, political satire and facetiousness; this could easily lead to over-moderation of any dubious content.
- The paradoxical attempt to control potentially “harmful” content whilst simultaneously protecting freedom of expression has been widely disparaged (particularly amongst journalists) as an unworkable, mutually exclusive ambition to balance Article 8 and Article 10 ECHR that threatens democracy. It has been suggested that enactment of the Bill will be followed by a consequential surge in litigation over the legitimacy of content.
- The Bill excludes news-focussed websites in an attempt to ameliorate this very problem; however, this may result in platforms proclaiming themselves to be “news publishers” solely to place themselves beyond the reach of Ofcom, leaving them free to disseminate any information they wish. If this were to happen, it has been argued, the UK could become a hub of disinformation, with spurious propaganda sites controlling the narrative.
- Those attempting to fight what they consider to be the most concerning threats to free speech are caught between Scylla and Charybdis, forced to navigate between two perils: the safety of children is clearly of the utmost importance, yet protecting the right to freedom of expression is arguably similarly vital. How does one proclaim that freedom of expression is of more importance than children’s safety? Critics must therefore tiptoe around this very delicate point, whilst attempting to point out the elephant in the room.
Contact our regulatory law solicitors today
If you have any questions regarding any of the issues raised in this article, please do not hesitate to contact our specialist Regulatory Law team by using our online enquiry form or by calling 0330 191 5713.