The Government's much-anticipated response to the Online Harms White Paper consultation has finally been released, but those seeking clarity at this stage may be left scratching their heads. It appears to represent no more than an indication of the Government's direction of travel under considerable media pressure to ensure that "something must be done", write BCL partner Julian Hayes and solicitor Greta Barkle.

In April 2019, plans were announced to establish a broad new statutory duty of care for social media companies and platform providers to tackle widespread concerns about a host of online issues, from terrorist and child sexual abuse content to cyberbullying and trolling. After receiving over 2,400 consultation responses, some of which pilloried the proposals as a threat to free speech, legislative momentum appeared to wane. With the election of Boris Johnson in December, though, the issue was resurrected in the first Queen's Speech of the new administration. Issuing its response to the consultation, the Government has reaffirmed its ambitious commitment to making the United Kingdom the "safest place in the world to be online."

The Government intends to realise its aim by imposing a broad duty of care on those affected to take reasonable steps to keep users safe and prevent others from coming to harm as a direct consequence of activity on their services. The duty will apply to companies facilitating the sharing of user-generated content or user interactions. This will include platforms allowing comments, chat forums or video sharing, but will exclude business-to-business services, where the risk of online harm is perceived to be much lower.

The new duty is set to be supervised by a beefed-up Ofcom, although the suite of powers it will wield remains unclear. The original enforcement proposals ranged from the commonplace, such as warnings, notices and fines, to the more draconian, such as business disruption and ISP blocking. Most eye-catching amongst the proposals were the requirement to nominate a UK representative against whom enforcement action could 'bite' and the imposition of a senior-management liability regime. Some expressed concern that this would unduly penalise third parties rather than those actually uploading the harmful content, and discussions with industry highlighted potentially negative investment consequences for the UK tech sector. Concern was also raised that heavy-handed enforcement risked companies "over-blocking" user-generated content to avoid penalties, with a chilling effect on public discourse, particularly where vaguely defined harms such as "disinformation" are concerned. No firm conclusions about enforcement powers have yet been reached, leading to criticism that the proposals are being watered down, but the Government intends to say more about this in the spring.

The original White Paper took aim at twenty-three broad online harms, some illegal, others not. The Government has now decided that companies in scope will be required to implement systems to minimise illegal uploads and to remove such content "expeditiously" where it does appear. Platform providers will continue to decide for themselves what type of legal content is acceptable on their services, leaving to them decisions about appropriateness which they may be reluctant to make and which would be better taken by others.

Freedom of speech issues also arose within the consultation over private messaging. Encryption has long been a concern of law enforcement and the security services, whose work it complicates: criminals are well known to make initial contact on public forums before "going dark", that is, retreating to the seclusion of one-to-one or group messaging. Recognising the sensitivity of any intrusion on private communications, the Government has been consulting on the types of channels and forums which should fall under the duty of care: for which types of private messaging should platform providers be held responsible? Those advocating on behalf of children cited the risk of grooming and the sharing of child abuse imagery when arguing in favour of including encrypted services within the scope of the duty. The risk, though, is that once the principle of inviolable end-to-end encryption is breached, little prevents repressive political regimes across the world from requiring access to private messaging for more sinister purposes. Overall, those responding to the Online Harms consultation opposed including private communication services in the regulator's purview, and the Government has not yet expressed a formal view on the issue, although statements from the Home Secretary make her feelings on it clear.

As the Government admits, its response is just one step towards the development of a comprehensive package of measures to address a range of online harms: it is a policy process in motion, not a finished product. While the consultations with civil society and the tech industry represent an intelligent attempt to build consensus for the proposals, there will inevitably be difficulties when detailed proposals are brought forward later in the year, when freedom of speech concerns are likely to resurface. For now, making the UK the world's safest place to be online must wait a little longer.

The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.