Online Safety Bill: Bishop of Chelmsford supports Bishop of Oxford’s amendments on safety and risk

On 25th May 2023, the House of Lords debated the Online Safety Bill in committee. The Bishop of Chelmsford spoke in support of amendments to the bill tabled by the Bishop of Oxford, Lord Clement-Jones, and Lord Colville of Culross, which would place new duties on Ofcom to assess risk and monitor online safety:

My Lords, I shall speak in favour of Amendments 195, 239 and 263, tabled in the names of my right reverend friend the Bishop of Oxford, the noble Lord, Lord Clement-Jones, and the noble Viscount, Lord Colville of Culross, who I thank for his comments.

My right reverend friend the Bishop of Oxford regrets that he is unable to attend today’s debate. I know he would have liked to be here. My right reverend friend tells me that the Government’s Centre for Data Ethics and Innovation, of which he was a founding member, devoted considerable resource to horizon scanning in its early years, looking for the ways in which AI and tech would develop across the world. The centre’s analysis reflected a single common thread: new technologies are developing faster than we can track them and they bring with them the risk of significant harms.

This Bill has also changed over time. It now sets out two main duties: the illegal content duty and the children duty. These duties have been examined and debated for years, including by the joint scrutiny committee. They are refined and comprehensive. Risk assessments are required to be “suitable and sufficient”, which is traditional language from 20 years of risk-based regulation. It ensures that the duties are fit for purpose and proportionate. The duties must be kept up to date and in line with any service changes. Recent government amendments now helpfully require companies to report to Ofcom and publish summaries of their findings.

However, in respect of harms to adults, in November last year the Government suddenly took a different tack. They introduced two new groups of duties as part of a novel triple shield framework, supplementing the duty to remove illegal harms with a duty to comply with their own terms of service and a duty to provide user empowerment tools. These new duties are quite different in style to the illegal content and children duties. They have not benefited from the prior years of consultation.

As this Committee’s debates have frequently noted, there is no clear requirement on companies to assess in the round how effective their implementation of these new duties is or to keep track of developments. The Government have changed this Bill’s system for protecting adults online late in the day, but the need for risk assessments, in whatever system the Bill is designed around, has been repeated again and again across Committee days. Even at the close of day eight on Tuesday, the noble Lords, Lord Allan of Hallam and Lord Clement-Jones, referred explicitly to the role of risk assessment in validating the Bill’s systems of press reforms. Surely this persistence across days and groups of debate reflects the systemically pivotal role of risk assessments in what is, after all, meant to be a systems and processes rather than a content-orientated Bill.

But it seems that many people on many sides of this Committee believe that an important gap in risk assessment for harms to adults has been introduced by these late changes to the Bill. My colleague the right reverend Prelate is keen that I thank Carnegie UK for its work across the Bill, including these amendments. It notes:

“Harms to adults which might trickle down to become harms to children are not assessed in the current Bill”.

The forward-looking parts of its regime need to be strengthened to ensure that Parliament and the Secretary of State review the new ways in which harms manifest as technology races along, and to ensure that they then have the right advice for deciding what to do about them. To improve that advice, Ofcom needs to risk assess the future and then report its findings.

As the Committee can see, Amendment 195 is drawn very narrowly, out of respect for concerns about freedom of expression, even though the Government have still not explained how risk assessment poses any such threat. Ofcom would be able to request information from companies, using its information-gathering powers in Clause 91, to complete its future-proofing risk assessment. That is why, as Carnegie again notes,

“A risk assessment required of OFCOM for the purposes of future proofing alone could fill this gap”

in the Bill’s system,

“without even a theoretical threat to freedom of expression”.

Amendment 239 would require Ofcom to produce a forward-looking report, based on a risk assessment, to inform the Secretary of State’s review of the regime.

Amendment 263 would complete this systemic implementation of risk assessment by ensuring that future reviews of the regime by the Secretary of State include a broad assessment of the harms arising from regulated services, not just regulated content. This amendment would ensure ongoing consideration of risk management, including whether the regime needs expanding or contracting. I urge the Minister to support Amendments 195, 239 and 263.

Hansard


Extracts from the speeches that followed:

Lord Allan of Hallam (LD): Some of the issues in this group of amendments will range much more widely than simply the content we have before us in the Online Safety Bill. The right reverend Prelate the Bishop of Chelmsford is right to flag the question of a risk assessment. People are flagging to us known risks. Once we have a known risk, it is incumbent on us to challenge the Minister to see whether the Government are thinking about those risks, regardless of whether the answer is something in the Online Safety Bill or that there needs to be amendments to wider criminal law and other pieces of legislation to deal with it.

Some of these issues have been dealt with for a long time. If you go back and look at the Guardian for 9 May 2007, you will see the headline,

“Second Life in virtual child sex scandal”.

That case was reported in Germany about child role-playing in Second Life, which is very similar to the kind of scenarios described by various noble Lords in this debate. If Second Life was the dog that barked but did not bite, we are in quite a different scenario today, not least because of the dramatic expansion in broadband technology, for which we can thank the noble Baroness, Lady Harding, in her previous role. Pretty much everybody in this country now has incredible access, at huge scale, to high-speed broadband, which allows those kinds of real-life, metaverse-type environments to be available to far more people than was possible with Second Life, which tended to be confined to a smaller group.

Lord Knight of Weymouth (Lab): Some months ago, I went to a Speaker’s Lecture given by Stuart Russell, who delivered the Reith Lectures on AI. He talked about the programming of an AI-powered vacuum cleaner that was asked to clear up as much dirt as possible. What then plays out is that the vacuum cleaner picks up a bit of dirt from the carpet, spews it out and picks it up again, because that is the way of maximising the intent of the programming. It is very difficult to anticipate the behaviour of AI if you do not get the instructions exactly right, and that is the core of what we are worried about. Again, when I asked ChatGPT to give me some guidance on a speaking note for this question, it was quite helpful in also guiding me towards an embedded danger of bias and inequity. The AI is trained by data; we know a certain amount about the bias of data, but it is difficult to anticipate how that will play out as the AI feeds on and generates its own data.

The equity issues that can then flow are something that we need to be confident that this legislation will be able to deal with. As the right reverend Prelate the Bishop of Chelmsford reminded us, when the legal but harmful elements of the Bill were taken out between draft stage and publication, we lost the assessment of future risk that was previously in place, which I think was an unintended consequence of taking those things out. It would be great to see those back, as Amendments 239 and 195 from the right reverend Prelate the Bishop of Oxford suggest. The reporting that the noble Baroness, Lady Finlay, is proposing in her amendments is important in giving us as Parliament a sense of how this is going. My noble friend Lord Stevenson tabled Amendment 286 to pay particular regard to the metaverse, and I support that.

Lord Parkinson of Whitley Bay (Con, DCMS): Clause 159 requires the Secretary of State to undertake a review into the operation of the regulatory framework between two and five years after the provisions come into effect. This review will consider any new emerging trends or technologies, such as AI, which could have the potential to compromise the efficacy of the Bill in achieving its objectives. I am happy to assure the noble Viscount, Lord Colville of Culross, and the right reverend Prelate the Bishop of Chelmsford that the review will cover all content and activity being regulated by the Bill, including legal content that is harmful to children and content covered by user-empowerment tools. The Secretary of State must consult Ofcom when she carries out this review.