
Online Safety Bill - Sitting 5

07 June 2022

Proposing MP
Roger Gale (Herne Bay and Sandwich)
Type
Public Bill Committee

At a Glance

Issue Summary

The sitting opens with Clause 1 of the Online Safety Bill, which provides a high-level overview of the Act and sets expectations for future debates. Roger Gale outlines key definitions in the Bill and the scheduling of debates on several clauses and schedules. MPs raise concerns about the Bill's broad definitions and its lack of future-proofing for emerging technologies such as AI, virtual reality and the metaverse, as well as safety issues in online gaming. Maria Miller raises concerns about end-to-end encryption being within the scope of the Bill and its impact on child safety, and asks whether the Bill covers nudification software designed to sexually harass women. Gale also addresses the Bill's application to a range of online platforms and technologies, technical issues with the live feed of the debate, and a change of venue for the afternoon sitting. MPs question the Bill's effectiveness and clarity, particularly regarding Scottish-specific offences in schedule 7 and the need for a standing committee to scrutinise implementation, and Gale explains the procedural aspects of debating and amending clauses.
The debate then turns to the requirement for regulated online platforms to designate a senior manager as an illegal content safety controller responsible for ensuring compliance with risk assessments and content safety duties; senior management liability for child safety on online platforms; and the Bill's enforcement provisions, specifically amendments 69 and 70 on senior management liability for non-compliance with information notices. Further amendments concern duties of care for social platforms and search services, regulated companies' responsibilities for illegal content risk assessments and cross-platform risks, transparency and accountability in risk assessments, directing users to other content, collaboration among companies, and the production of child sexual abuse material. The Committee discusses Clause 8, which sets out risk assessment duties for illegal content on user-to-user services, before Roger Gale announces the adjournment.

Action Requested

Roger Gale acknowledges the introduction of the clause and calls on the shadow Minister to start proceedings. There are no specific actions proposed beyond continuing with the debate.

Key Facts

  • Clause 1 provides a high-level overview of the Online Safety Bill.
  • The Opposition welcomes the Bill in principle but expresses concerns about its drafting and approach.
  • Dan Carden apologises for missing evidence sessions due to contracting COVID.
  • Roger Gale outlines the key definitions in the Bill.
  • Debates are scheduled for Clause 3, Schedules 1 and 2, and Clause 4.
  • The Government estimates 25,100 platforms could be caught under the new regime.
  • Approximately 180,000 platforms could potentially fall within scope of the Bill.
  • 12 million out of 18.4 million child sexual abuse reports made by Facebook in 2019 related to private channels.
  • The MP highlights concerns about private messaging and the use of AI to find illegal content.
  • The Bill needs to be future-proofed for emerging technologies like metaverse and virtual reality.
  • Concerns are raised about online gaming safety, with specific mention of apps like Fortnite and Roblox.
  • The Internet Watch Foundation underlined the importance of end-to-end encryption being in scope of the Online Safety Bill.
  • Nudification software, which is designed for sexual harassment, alters photographs to make the subject appear completely naked and works only on images of women.
  • Schedule 2 has been expanded to include commercial pornography.
  • The Online Safety Bill is tech-agnostic.
  • Snapchat did not exist when the White Paper was conceived.
  • App stores are not in scope of the Bill, but apps with user-to-user features or conveying pornography may be regulated by Ofcom.
  • The Government has a counter-disinformation unit focusing on Russia/Ukraine conflict.
  • The National Security Bill includes provisions criminalising foreign interference.
  • There are technical issues with the live feed of the Online Safety Bill debate.
  • The Committee will move to Committee Room 9 for the afternoon sitting.
  • Members need to take their papers and laptops with them due to the room not being locked between sittings.
  • Schedule 7 of the Online Safety Bill lacks Scotland-specific provisions.
  • Amendment 126 discusses variations in legislation for Scotland, Northern Ireland, and England/Wales.
  • Clause 6 introduces a duty of care for online service providers.
  • The Online Safety Bill is described as a complex piece of legislation.
  • There is an unusual amount of crossover between different clauses in this bill, making debates more challenging.
  • The Chair takes a relaxed view of stand part debates.
  • Amendment 69 requires providers of regulated user-to-user services to name an individual as the provider's illegal content safety controller, responsible for ensuring compliance with risk assessments under sections 8 and 9 of the Bill.
  • The amendment aims to embed online safety regulation at board level and make senior management accountable for harms caused through their platforms.
  • The Financial Conduct Authority uses personal accountability regimes to deter harmful behaviour in financial services.
  • The enforcement provisions include senior management criminal liability for falsifying or withholding information with a prison sentence of up to two years.
  • Financial penalties can reach 10% of global revenue, which can amount to billions in certain cases.
  • There are 'unplugging' powers allowing Ofcom to disconnect companies from operating in the UK if compliance is not met.
  • Amendments aim to create duties for publishing and supplying illegal content risk assessments to Ofcom.
  • Labour supports the duty of care approach but seeks further transparency in implementation.
  • The government asserts its measures are ahead of similar legislation being developed by the European Union.
  • Amendment 14 seeks to ensure that regulated companies’ boards or senior staff have responsibility for illegal content risk assessments.
  • Amendment 25 requires the risk assessment to take into account the risk of the production of illegal content.
  • Amendment 19 incorporates a requirement to consider cross-platform risk.
  • Amendments aim to include duties to prevent the production of illegal content and collaborate with other companies to mitigate risks.
  • Labour welcomes moves to compel user-to-user services to provide illegal content risk assessments.
  • Amendment 10 aims to increase transparency around illegal content risk assessments.
  • Amendment 14 would require boards or senior managers to approve risk assessments related to adults.
  • Amendment 25 aims to hold regulated services accountable for online sexual exploitation of children.
  • The Philippines is a source country for livestreamed child sexual exploitation.
  • PhotoDNA cannot detect newly produced child sexual exploitation material.
  • Frida was groomed on Facebook and migrated to WhatsApp, leading to abuse at age 13.
  • Amendment 20 relates to being directed to other content, impacting services like Discord and Facebook.
  • Amendment 21 addresses collaboration among companies, highlighting a case where Facebook employed an outside company to harm TikTok's reputation.
  • Amendment 30 relates to the production of child sexual abuse material, 75% of which is self-generated by children or young people coerced into producing it.
  • Clause 8 sets out risk assessment duties for illegal content on user-to-user services.
  • Companies have three months to carry out initial risk assessments after schedule 3 is voted on.
  • There are concerns that Ofcom would be overwhelmed by receiving 25,000 risk assessments from all companies in scope.
  • Roger Gale (Herne Bay and Sandwich) (Conservative) announces the adjournment.
  • The Committee will reconvene this afternoon in Committee Room 9.