On 1 February 2023 the Online Safety Bill received its Second Reading in the House of Lords. The debate that followed, covering both concerns about the Bill and the areas where change is sought, provides an insight into what we can expect as the Bill moves through the House of Lords. Key themes included:

  • A universal priority: there was consensus, cutting across party lines, that the Bill should be finalised as soon as possible.
  • In terms of the forthcoming focus in the House of Lords, we can expect discussion around (among other areas):
    • Future-proofing the legislation: how best to ensure effective ongoing parliamentary scrutiny of the Bill – a key element of the success of this non-static "living legislation" – as well as ensuring that powers granted to the Secretary of State do not undermine Ofcom's regulatory independence.
    • Stress testing the "triple shield": how best to address "legal but harmful" content online and whether the "triple shield" is sufficient to protect users, particularly vulnerable adults, from a range of serious harms that fall below the "illegal content" threshold. This is likely to include discussion of risk assessments and platform terms of service.
    • The Bill's priority – child protection online: ways to further strengthen the protections for children in the Bill and how to identify priority content that is harmful to children.
    • Further senior manager liability: senior managers being made personally liable for a wider range of failures of compliance, for example where they have consented to or connived in ignoring enforceable requirements, risking serious harm to children.
    • Bolstering protections for women and girls: the most appropriate way to enhance these protections in the Bill, including addressing the harmful but lawful nature of misogynistic abuse online.

These themes will be discussed at length at the Committee Stage, the next stage of the legislative process in the House of Lords, which we do not expect to take place any earlier than April 2023. We also expect the time for considering the Bill to be extended until Autumn 2023 to coincide with the delayed King's Speech, extending the likely time frame for finalising the legislation.

In the meantime, those in scope of both the UK Online Safety Bill and the EU Digital Services Act (which came into force in November 2022) should consider how any compliance programme can be carried out most efficiently given the differences, overlap and disparity in timing between the two regimes; in practice, applying the higher of the two standards across both jurisdictions may be the most workable approach.

Some commentators, pointing to the landmark ruling in the Molly Russell case and to the fact that the EU Digital Services Act is already in force, have argued that the Bill comes too late. Either way, we can expect it to remain a legislative priority.

The end is more in sight: the Bill, a universal priority

Over the long and winding four years since its inception in the 2019 Online Harms White Paper, and the "revolving door of four Prime Ministers, [five Culture Secretaries] and seven changes in Secretary of State" (Baroness Merron), it is fair to say that the Bill has been met with significant delays, weighty debate (particularly around the appropriate balance between regulation, innovation and freedom of speech) and much criticism along the way.

This was acknowledged by the Lords, alongside the fact that "there are still matters to discuss" (Baroness Kidron), a desire all around to "improve the Bill", and a "little work and a bit of polishing" still to do (Lord Stevenson). However, there was a universal consensus that the need for the legislation has not diminished and that the Lords should "get it on the statute book as soon as possible" (Baroness Merron), so that the "long-delayed implementation can start".

Unusually, there was broadly thought to be no political divide to this consensus. In his closing speech, Lord Parkinson (the recently appointed Parliamentary Under-Secretary of State at the Department for Culture, Media and Sport) urged the House to "consider these matters in detail and without party politics intruding", and to do so on the basis of "collaboration, co-operation and, on occasion, compromise". This was accompanied by an assumption from Lord Stevenson that the government would adopt the same approach.

Living legislation: challenges of ongoing scrutiny and future proofing

As with most legislation seeking to regulate the digital arena, the Bill aims to be technology-neutral, to cater for the changing technological landscape as well as for new threats and challenges not yet envisaged. In light of this, and the range of secondary legislation and secondary legislative powers anticipated under the Bill, we can expect discussion at the Committee Stage around how best to ensure effective ongoing parliamentary scrutiny – a key element of the success of this non-static "living legislation" (Lord Inglewood) and of the codes of practice that will complement it.

Some Lords championed a Joint Committee (modelled on the Joint Committee on Human Rights) to scrutinise digital regulation across the board and reduce the risk of fragmented oversight across a range of committees. Others supported the creation of an independent online safety ombudsman, although some argued that the cost and practicalities of such a role would be prohibitive.

Entwined with the question of appropriate oversight by Parliament and elected government is the extent of the far-reaching powers granted to the Secretary of State under the Bill. The Lords raised concerns that these powers, particularly the Secretary of State's ability to propose directions relating to Ofcom's codes of practice, could undermine Ofcom's regulatory independence and, in turn, freedom of expression and the legitimacy of the regime.

In his opening speech, Lord Parkinson indicated that the government intend to bring forward two amendments to the existing power: firstly, replacing the "public policy" remit with a more defined list of reasons for which a direction can be made; and secondly, making it clear that this element of the power can only be used in exceptional circumstances.

We can expect related debate at the Committee Stage aimed at ensuring that strong safeguards are in place so that the use of this power is transparent and proportionate. That said, given the broad nature of online harms, the Lords acknowledged that there may be areas beyond Ofcom's expertise and remit as a regulator, such as national security, where a Secretary of State direction would be valuable.

Is the "Triple Shield" sufficient? The true impact of removing "harm to adults" duty

One area tested at length in the Commons centred on how best to address "legal but harmful" content online.

It is expected that the Lords will continue to home in on this area at the Committee Stage, particularly the true impact of the so-called "triple shield" and whether it is sufficient to protect adult users, especially vulnerable adults, from a range of serious harms that fall below the criminal threshold (e.g. targeted online abuse and dangerous health misinformation). Refer to our previous blog for further information on the government's recent related amendments to the Bill in the House of Commons.

Whilst the toggle filters under the "triple shield" have the potential to allow adults to curate and control their experience online (including the option for adult users to avoid seeing categories of harmful material), some of the Lords argued that the harmful material will still exist, and will still have the potential to influence others, unless the government compel an "auto-on" default filter setting.

In his opening speech, Lord Parkinson indicated that the government will bring forward amendments to require Category 1 services (the largest platforms, giving rise to the most risk) to publish a summary of their risk assessments for both illegal content and material that is harmful to children. It is hoped that this will increase transparency about illegal and harmful content on in-scope services and, in turn, ensure that Ofcom can undertake its regulatory role effectively.

We can expect the Lords to explore related themes, such as whether these risk assessments should be made public and subject to minimum standards, and whether they should also be conducted in respect of material that is harmful to adults (given that the risk assessment itself would not negatively impact freedom of speech). There is, however, a danger that making risk assessments publicly available could flag platform weaknesses and, in turn, enable workarounds that abuse vulnerabilities in the protections platforms have in place.

The "triple shield" elevates the importance of a platform's own terms of service; with providers needing to enforce their own terms around moderation of user-generated content and the potential of being subject to significant fines for not doing so. There is a fear that this change could have a perverse incentive, with the unintentional risk that providers strip back their terms of service more than they otherwise would to ease the compliance burden. Perhaps because of this issue, the Lords are expected to consider whether there is a need for minimum standards for platform terms or whether this issue could be addressed under Ofcom's subsequent codes of practice.

Another related area which has received little attention to date is end-to-end encryption. Whilst the Bill places duties on in-scope service providers to moderate illegal content and content that is harmful to children on their platforms, providers that offer end-to-end encryption arguably have an incentive to remove or weaken that encryption in order to comply with this duty. This is another topic likely to be discussed further in the Lords; at Second Reading, some of the Lords reiterated the need to protect encryption and to ensure that any such duty would not lead to "a requirement for continual surveillance".

Child protection priority: Further tightening, priority harmful content and age assurance

Following the removal from the Bill of the statutory duties around content that is legal but harmful to adults, there has been a real shift towards prioritising protections for children.

We can expect further discussions around strengthening these protections. These are likely to include amendments to name the Children's Commissioner for England as a statutory consultee for Ofcom's code of practice on children's online safety, ensuring that children and young people are adequately accounted for during implementation.

In the medium term, we also envisage further discussion around what the lists of "material that is harmful to children" will look like. Whilst both "primary priority" content that is harmful to children and "priority" content that is harmful to children are due to be set out in secondary legislation (with examples provided in Ofcom guidance as well), we are likely to see pressure from the Lords for the government to indicate what these lists will comprise as early as possible, along with further detail around the age assurance requirements intended to prevent those under the age of 18 from accessing such content.

Senior management: Further extension of liability to be discussed

As the Bill currently stands, individual senior managers can be held criminally liable and face a fine for failing to take all reasonable steps to prevent the platform from failing to comply with Ofcom's information notice. Senior managers can also face imprisonment, a fine or both for failing to take all reasonable steps to prevent the platform committing the offences of providing false information, encrypting information or destroying information in response to an information notice.

In his opening speech, Lord Parkinson confirmed that the government have acknowledged the need for senior managers to be made personally liable for a wider range of compliance failures. This was accompanied by a government commitment to consider carefully an amendment designed to capture instances where "senior managers have consented to or connived in ignoring enforceable requirements, risking serious harm to children". The amendment would aim to hold senior managers to account for their actions regarding the safety of children, "without jeopardising the UK's attractiveness as a place for technology companies to invest in and grow". It is thought that any such offence would be based on similar legislation recently passed in the Republic of Ireland, with relevant precedent in other sectors in the United Kingdom also being looked at carefully.

Strengthening protections for women and girls: New offences, adoption of a code of practice and new statutory consultees

Perhaps the topic that featured most heavily in the media following the Second Reading was that of misogynistic abuse and whether further protections for women and girls are required in the Bill.

Lord Parkinson suggested that women and girls are "disproportionately affected by online abuse". He acknowledged that services in scope of the Bill will need to seek out and proactively remove "priority" illegal content (i.e. terrorism content, child sexual exploitation and abuse related content and content that amounts to one of the priority offences listed in Schedule 7 to the Bill) and that some of the priority offences may disproportionately affect women and girls.

This is, however, currently thought to be insufficient to combat areas such as (arguably subjective) online misogynistic abuse which, while harmful to women and girls, is legal. Lord Parkinson indicated that the government will strengthen existing protections, for example by listing controlling or coercive behaviour as a priority offence. We can also expect the Victims' Commissioner and the Domestic Abuse Commissioner to be named as statutory consultees for Ofcom's codes of practice. This will require Ofcom to consult both commissioners ahead of drafting and amending the codes of practice and should ensure that victims are better protected, with women's and girls' voices heard clearly in developing the legislation.

Baroness Morgan, in particular, has also called for a specific Ofcom code of practice for violence against women and girls online, likely to be based on a draft code that Carnegie UK has developed with a coalition of campaigning groups.

Next steps: Timing for the road ahead and compliance approach

The key themes mentioned at the Second Reading will be discussed at length at the Committee Stage in the House of Lords – the next milestone in the legislative process, involving a detailed line-by-line examination of the Bill. Whilst a date for the Committee Stage has yet to be confirmed, given the amount of other legislation passing through the Lords we do not expect it to take place any earlier than April 2023.

In addition, the Bill is due to lapse on 17 March 2023, twelve months on from its original First Reading on 17 March 2022. If that happens, the legislation would fall away and the legislative process would need to start again. However, we expect the carry-over motion to be extended so that the Bill can be considered until Autumn 2023 – this would coincide with the delayed end of the current 2022-23 parliamentary session and the delayed King's Speech.

In the meantime, for those in scope of both the UK Online Safety Bill and the EU Digital Services Act (which came into force in November 2022), it is also worth considering how any compliance programme can be carried out most efficiently – given the differences and potential overlap between the two regimes as well as the disparity in timing between them. Operationally, the most practical solution may require organisations to apply the higher of the two standards for consistency across both jurisdictions.

With the landmark ruling in the Molly Russell case clearly identifying that the negative effects of online content contributed to her death, and with the EU Digital Services Act already in force, some commentators have argued that the Bill is already too late.

Either way, we can expect the Bill to remain a legislative priority as the year progresses.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.