
Online Safety Bill - Sitting 12 (Afternoon)

16 June 2022

Proposing MP: Herne Bay and Sandwich
Type: Public Bill Committee

At a Glance

Issue Summary

The sitting covers amendments to clauses on technology for identifying child sexual exploitation and abuse (CSEA) content. Kirsty Blackman raises concerns about reports to the National Crime Agency, proactive scanning, and future-proofing the technology. Roger Gale discusses amendments aimed at strengthening provisions on harmful content, including whether Ofcom should consider the 'presence' rather than the 'prevalence' of such content when taking enforcement action, and how to assess the risks posed by individuals who produce, publish, or disseminate illegal content. He also addresses clauses 107, 108, and 109, which cover Ofcom's guidance on identifying harmful content and its annual reporting obligations, as well as amendments to improve data access requirements for online service providers. Members call for clearer provisions on financial penalties for organisations that contravene safety regulations, in particular how fines could be used to support victims of online harm, and for civil society and researchers to have access to tech-company data so the bill's effectiveness can be assessed. Discussion of new clause 11, on supply chain risk assessment duties for content moderation, raises the importance of protecting social media moderators from trauma and supporting them in their role, and concerns about the bill's impact on subcontractors of social media companies, particularly those working in other countries.
MP Alex Davies-Jones supports clauses 112 to 117 of the Online Safety Bill, which empower Ofcom to issue confirmation decisions and impose penalties on regulated services that breach enforceable requirements.

Action Requested

Alex Davies-Jones asks the Minister to clarify how technologies like hash matching and PhotoDNA can be used once the Online Safety Bill receives Royal Assent. She also requests clarification of the Secretary of State's role in setting minimum standards of accuracy for detecting terrorism and child sexual exploitation content.

Key Facts

  • Ofcom will have power to direct companies to use accredited technology to identify and remove child sexual exploitation and abuse (CSEA) content.
  • The Internet Watch Foundation provides “hashes” of previously identified CSEA material to prevent upload on platforms.
  • PhotoDNA, created in 2009 by Microsoft and Professor Hany Farid, enables detection even when images have been digitally altered, with a failure rate of between one in 50 billion and one in 100 billion.
  • The NSPCC has raised concerns about proactive scanning.
  • Ofcom is allowed to take action against individual providers under clause 83.
  • A risk register or notice serving multiple companies with common functionalities could be considered.
  • Amendment 36 requires Ofcom to consider the presence of harmful content.
  • Amendment 37 adds a requirement for Ofcom’s risk assessment to include risks to adults and children from illegal content.
  • Amendments 39, 40, and 38 are similar to amendments 36 and 37 but apply to different clauses in the bill.
  • Amendments 35, 36, 39, and 40 aim to replace 'prevalence' with 'presence'.
  • Amendment 37 seeks to widen Ofcom's criteria when deciding on section 103 powers.
  • Clause 104(2)(f) already requires Ofcom to consider the risk of harm in the UK.
  • Clause 107 requires Ofcom to issue guidance on identifying harmful content.
  • Clause 108 mandates Ofcom's annual reporting.
  • Clause 109 ensures definitions of terrorism and child sexual exploitation content are consistent across the bill.
  • The Online Safety Bill includes provisions for Ofcom to fine organisations that fail to prevent harm online.
  • Victim-support organisations argue for independent advocacy and funding support from penalties levied on platforms violating safety rules.
  • Maria Miller proposes a hypothecation of financial penalties into a fund for victim support.
  • Amendment 52 would require Ofcom to produce a code of practice on access to data.
  • The amendments aim to improve understanding of the online environment through better data availability.
  • EU’s Digital Services Act requires platforms to share data with independent researchers immediately.
  • The amendments aim to accelerate data sharing provisions under the Online Safety Bill.
  • Clause 64 compels social media firms to become more transparent through transparency reports.
  • Clause 136 covers researchers' access to information, with Ofcom required to provide guidance.
  • New clause 11 sets out duties for assessing risks in a provider’s supply chain.
  • The clause includes a duty to assess risks to persons employed by contractors moderating content.
  • Contractors are often 'off book' from platforms, creating unaccountable supply chains.
  • Social media moderators experience trauma that can lead to PTSD.
  • Platforms provide little data on human moderation practices.
  • Meta promised but failed to submit figures on human content moderators.
  • New clause 11 seeks additional safety measures for employees moderating harmful content.
  • The amendment aims to impose standards on subcontractors employed by companies within the scope of the Online Safety Bill.
  • Subcontractors mentioned include those from Kenya working for Facebook.
  • Ofcom, as a telecommunications and communications regulator, lacks expertise in regulating employment conditions globally.
  • Clauses 112 to 117 set out processes around confirmation decisions by Ofcom.
  • Clause 113 outlines steps a person may be required to take for compliance or breach remediation.
  • Subsection (5) of clause 113 allows immediate action in cases involving an information duty.
  • Clauses cover powers when regulated providers fail risk assessments and children’s access assessments.
  • Labour welcomes the three-month timeframe for redoing assessments under clause 115.
  • Clause 116 establishes that Ofcom may require use of proactive technology on public content only.
  • Subsection (7) of clause 116 allows setting requirements for reviewing used technologies.
  • Clauses allow Ofcom to impose financial penalties as part of confirmation decisions under clause 117.