Online Safety Bill - Sitting 15

23 June 2022

Proposing MP
Herne Bay and Sandwich
Type
Public Bill Committee

At a Glance

Issue Summary

Roger Gale discusses the repeal of the video-sharing platform (VSP) regime and consequential amendments to existing legislation under the Online Safety Bill. Turning to clauses 173 to 176, he outlines powers for the Secretary of State to update the fraudulent advertising offences, the exemptions from the regime, the list of education and childcare services, and the schedules of priority offences, so that the Bill can adapt to future changes. Concerns are raised about the breadth of these powers; amendments to the lists of priority offences and exemptions would be made by statutory instrument (SI) under the affirmative procedure, preserving parliamentary oversight. On clauses 181 to 188, amendment 76 is proposed to clarify the definition of 'content' in clause 189, and Gale responds to Alex Davies-Jones's speech on these clauses. Kirsty Blackman raises concerns about clause 181 and amendment 76 regarding the definitions of 'user' and 'content', and Gale addresses the interpretation of terms in clause 189. A new clause amends section 2 of the Obscene Publications Act 1959 to give Ofcom and its employees a defence against liability for publishing obscene articles when carrying out online safety duties. On commencement and transitional provisions, an amendment is proposed to bring Part 5 into force three months after the Act is passed.
Speakers recall the delays and ultimate failure to implement age verification for online pornography under the Digital Economy Act 2017, and urge the Minister to address these issues and work with Labour to improve Part 5 of the Bill, which is intended to protect children from commercial pornography. Gale also discusses Government new clauses and schedules on Ofcom's recovery of its initial costs and the payment of additional fees into the Consolidated Fund. The remainder of the debate concerns a proposed statutory user advocacy body representing children's interests: Barbara Keeley argues for a statutory user advocate to protect children from online harms, and Kirsty Blackman stresses the importance of a government-funded advocacy body to protect users online, particularly children. The Minister emphasises the Bill's existing protections and victim-support mechanisms, cautioning against creating additional statutory bodies with overlapping responsibilities and suggesting that existing mechanisms, such as the super-complaints process, are sufficient to protect children's interests online.

Action Requested

The statement describes legislative changes without proposing new actions: repealing the VSP regime and amending sections of the Digital Economy Act and the Protection of Children Act to create a clearer regulatory framework.

Key Facts

  • Clause 170 repeals the video-sharing platform (VSP) regime.
  • The Online Safety Bill applies to a wider range of online platforms than the VSP regime.
  • Clause 172 amends section 1B of the Protection of Children Act 1978 to create a defence for Ofcom staff against criminalisation while discharging their duties.
  • Roger Gale is discussing clauses 173 to 176 of the Online Safety Bill.
  • He suggests considering these clauses together for stand part purposes.
  • The clause allows for updating the list of fraudulent offences in section 36.
  • Clause 174 includes exemptions to keep the Bill targeted and proportionate.
  • Clause 175 updates the list of categories of education and childcare providers.
  • Clause 176 provides powers to amend schedules related to priority criminal, child sexual exploitation, and terrorism offences.
  • Clause 173 gives the Secretary of State power to amend fraud offences related to fraudulent advertising.
  • Clause 174 allows the Secretary of State to make regulations exempting certain content or services from the regulatory regime.
  • Clause 175 grants powers to amend descriptions of education and childcare in England, with criteria for amendments.
  • Clause 176 gives power to amend terrorism offences (schedule 5), child sexual exploitation and abuse content offences (schedule 6), and priority offences (schedule 7).
  • Clauses 173 to 175 are ordered to stand part of the Bill.
  • Amendment 126 ensures consultation with Scottish Ministers or Northern Ireland's Department of Justice before making regulations that affect Scotland or Northern Ireland only.
  • Clauses discuss powers to amend Schedules 5, 6 and 7.
  • Clauses 181 to 188 are being discussed.
  • Amendment 76 aims to clarify the definition of 'content' in clause 189.
  • The amendment inserts 'but not limited to' after 'including' to broaden the scope of what is considered content.
  • Roger Gale advises that it would be courteous for the SNP spokesperson to speak to their amendment first.
  • The discussion is about amendments and clauses in the Online Safety Bill.
  • Clause 181 concerns the definition of 'user'.
  • Amendment 76 would amend the definition of 'content' in clause 189.
  • The amendment suggests adding 'but not limited to' before a list of content types.
  • Amendment 111 is not claimed as it has been tabled by a non-Committee member.
  • Labour seeks further amendments during Report stage if clarification is required.
  • Clause 49 refers to 'one-to-one live aural communications', while clause 189 uses 'oral' in the definition of content.
  • New clause amends section 2 of the Obscene Publications Act 1959.
  • It creates a defence for OFCOM, their employees, or those assisting in exercising online safety functions against publishing obscene articles.
  • The amendment ensures that the Bill applies to relevant parts of the UK.
  • Amendment 49 seeks to bring Part 5 of the Online Safety Bill into force three months after it is passed.
  • The amendment aims to provide protections for children from online harms as soon as possible.
  • A BBC 'Panorama' investigation highlighted that social media algorithms may promote violent content to vulnerable young people.
  • The Digital Economy Act 2017 received Royal Assent in April 2017.
  • Part 3 of the DEA was intended to be in force by Easter 2019 but faced delays due to administrative issues.
  • In October 2019, the Government announced it would not commence part 3 concerning age verification for online pornography.
  • Part 3 of the Digital Economy Act 2017 has been cancelled due to delays.
  • Ofcom will produce guidance on meeting the duties in part 5, but a three-month timeframe may not allow proper consultation.
  • The Online Safety Bill aims to provide comprehensive protection against user-to-user and commercial pornography aimed at children.
  • New clause 42 introduces new schedule 2 allowing OFCOM to charge fees to providers for recovery of initial costs.
  • The period over which set-up costs can be recovered is specified as between three and five years.
  • Additional fees charged under new schedule 2 must be paid into the Consolidated Fund.
  • Only 14% of 12 to 15-year-old children have ever reported content, according to Ofcom's evidence.
  • A survey of 2,000 children found that 50% had seen harmful content in the past month, and 40% had tried but failed to have content about themselves removed.
  • Andy Burrows from NSPCC highlighted the importance of an advocacy body acting as an early warning system for identifying new areas of harm.
  • New clause would allow the Secretary of State to appoint an existing or new body as statutory user advocate.
  • The levy on regulated companies aligns with the 'polluter pays' principle in part 6.
  • Research shows that 88% of UK adults support a requirement for an independent body protecting children's interests online.
  • Almost 80% of young people aged 11 to 25 have never heard of the Bill.
  • Kirsty Blackman has been online since she was eight years old.
  • Girlguiding conducts an annual survey on girls' experiences of the internet which shows overwhelmingly negative experiences but also provides solace during isolation.
  • Ofcom cannot rely solely on charities and third sector organisations without adequate funding.
  • The Online Safety Bill includes strong protections for children.
  • Clause 17 allows parents or responsible adults to raise content-reporting claims on behalf of children.
  • Clause 18 requires complaints procedures to be easy to access and use for children.
  • Clause 140 enables organisations like the NSPCC to raise super-complaints with Ofcom.
  • The Children’s Commissioner acts as a statutory advocate for children.
  • The Children's Commissioner has statutory duties set out in section 2 of the Children Act 2004.
  • The victims budget was approximately £300 million for the current financial year.
  • Clause 140 allows groups to bring super-complaints, covering more than just children's issues.
  • The Children's Commissioner in Scotland prioritises protecting human rights over online safety.
  • Dame Rachel de Souza is doing a fantastic job advocating for children in both offline and digital spheres.
  • A YoungMinds survey showed that many young people have not heard of the Online Safety Bill.
  • A survivor of online grooming felt isolated and had no one on their side when seeking help.
  • The Children's Commissioner cited her own survey showing a large proportion of 2,000 children failed to get content about themselves removed.