The UK government recently published an interim update on its consultation on proposed legislation to tackle online harms. The legislation will impose a new duty of care on companies that facilitate the sharing of user-generated content to protect users from harmful online content and activity.

The update is low on policy detail but, using its own words, indicates the "direction of travel" in a number of key areas. A full report will be published in the spring.

The scope

The new rules will apply to businesses that provide services or functionality on websites operated in the UK which facilitate the sharing of user-generated content or user interaction, through comments, forums, video sharing and the like. To be in scope, a business must operate its own website (including mobile platforms) with functionality that enables users to share content or interact with one another.

The duty

"Online harms" is the collective term for an array of harmful content and activity. The focus is often on terrorism and child sexual abuse, but revenge porn, hate crime, harassment, promotion of self-harm, disinformation, trolling and the sale of illegal goods are all covered. We are told that the regulator will produce guidance detailing how each category of harm should best be addressed.

A clear division is made between illegal and legal content. Illegal content will need to be removed expeditiously and systems will need to be in place to minimise the risk of it appearing in the first instance. With regard to legal content, the UK government is concerned not to curb freedom of expression unduly. Accordingly, businesses will be required to state explicitly what content and behaviour they deem to be unacceptable on their sites and enforce their policies consistently and transparently.

The proposed legislation will provide that child users, in particular, are to be protected. Companies will be expected to use a range of tools including age-assurance and age-verification technologies to prevent children from accessing age-inappropriate content and to protect them from other harms.

In line with the approach in recent legislation combating bribery, tax evasion and money laundering, the focus is on ensuring that companies adopt appropriate and effective systems and processes.

Proportionality

The update is littered with references to proportionality, with assurances given to small businesses that the legislation will not be overly burdensome on them. Risk will be the main differentiator, followed by size and resources. What this may mean in practice is a multi-tiered system with the potential for real uncertainty as to proper implementation. Early engagement with the regulator will be vital for stakeholders.

Sanctions

In addition to fines, the government proposed enforcement by way of senior management liability and business disruption measures. This caused widespread concern among social media companies. We are told that the government is considering the responses to this aspect of the proposal – an indication that sanctions may not be as draconian as originally proposed.

The regulator

The Office of Communications, known as Ofcom, has been named as the government's choice of regulator. As it already oversees TV and radio, as well as postal, phone and broadband services, there is certainly logic in the choice. This is, however, an entirely new ball game. Whether it will be able to cope effectively with this enormously complex and vast new brief remains to be seen.

Interim codes of practice

The government expects companies to take action now to tackle harmful content and activity on their services. Voluntary interim codes of practice on tackling terrorist and child sexual exploitation and abuse content and activity will be produced in the coming months.
