Online Safety Bill - Sitting 7

09 June 2022

Proposing MP: Neath
Type: Public Bill Committee

At a Glance

Issue Summary

The statements cover the following aspects of the Online Safety Bill:

  • Clauses 17 and 27, which require companies to establish content reporting mechanisms so users can report illegal or harmful content, together with related complaint provisions.
  • Amendments to allow individuals who are not classified as 'affected persons' to make complaints about search content, and a new clause requiring the Secretary of State to publish a report assessing options for dealing with appeals about individual complaints.
  • The importance of effective complaint mechanisms and victim support, including the recognition of victims and the potential need for an ombudsman or third-party complaints process.
  • The need for social media companies to have technical specifications and systems in place to comply with the Bill's requirements.
  • Concerns about the effectiveness of individual complaint resolution mechanisms compared with the Bill's proposed systemic approach to online safety, and the need for a review mechanism to assess the effectiveness of new systems and processes.
  • Clauses 19 and 29, which place duties on social media companies regarding freedom of expression and privacy.
  • Record-keeping and review duties on online platforms to address identified harms and support regulatory decisions.
  • An amendment to remove the child use test from the children's access assessment, on the grounds that the test could leave problematic platforms outside the scope of child protection.
  • Concerns about clause 31(3), which defines when services must meet child safety duties based on the number, or expected number, of users who are children. Christina Rees discusses the Bill's impact on children's safety online.

Action Requested

Chris Philp proposes ensuring that anyone affected by illegal or harmful content can report it, even if they are not direct users of the service. He acknowledges the need for proactive measures against priority illegal content and reactive measures otherwise.

Key Facts

  • Clause 17 applies to user-to-user services.
  • Clause 27 applies to search services.
  • Non-users who are victims or part of targeted groups can report content under these clauses.
  • Labour Members are supportive of additional duties in the Bill.
  • Image abuse has increased during the pandemic, including through revenge porn.
  • The number of cases reported to the revenge porn helpline more than doubled between 2019 and 2020.
  • Amendments 78 and 79 allow those who do not fit the definition of 'affected person' to make complaints about search content.
  • New clause 1 requires the Secretary of State to publish a report on redress for individual complaints within six months of the Act's commencement.
  • The report must assess which body should be responsible for handling appeals and provide options for funding the system.
  • The MP mentions the revenge porn helpline and its importance in supporting victims.
  • There is a suggestion to earmark fines from Ofcom specifically for victim support services.
  • New clause 1 proposes an ombudsman-type service to handle unresolved complaints.
  • Clause 18 sets a statutory duty for social media platforms to act on reports of illegal and harmful content.
  • Ofcom can fine social media firms up to 10% of their global revenue or disconnect services if they fail to comply.
  • Super-complaints mechanism allows bodies like the NSPCC to raise systemic issues with social media platforms.
  • Facebook has tens of millions of users in the UK (30-40 million).
  • The volume of complaints generated is vast.
  • The proposed solution involves systemic changes and processes enforced by Ofcom.
  • The Bill provides no right of appeal through an ombudsman against decisions by social media companies.
  • Clause 149 requires a comprehensive review between two and five years after Royal Assent.
  • The super-complaint mechanism is available immediately to address systemic issues affecting groups of users.
  • The review covers minimising harms to individuals, enforcement powers, and information-gathering activities.
  • Clause 19 establishes a new legal duty on social media companies to have regard to freedom of expression.
  • Category 1 service providers must proactively assess the impact of their policies on freedom of expression and privacy.
  • The 'have regard' clause is not determinative; it requires platforms to consider these factors but does not override all other considerations, such as child safety.
  • Record-keeping and review duties are crucial for monitoring platforms' responses to harm.
  • Transparency requirements under clause 64 oblige Ofcom to publish appropriate information publicly.
  • Amendment proposed to ensure regulated companies’ boards or senior staff have responsibility for children’s risk assessments.
  • The amendment aims to remove subsection (3) from clause 31, which applies a child use test.
  • The age-appropriate design code by the Information Commissioner’s Office requires high levels of data protection and privacy for services likely to be accessed by children.
  • Evidence shows that children have been able to access problematic platforms like Telegram and OnlyFans.
  • Clause 31(3) of the Online Safety Bill is discussed.
  • The speaker met with NSPCC to discuss the implications of clause 31(3).
  • 'A significant number' can refer either to an absolute number or a percentage of users.
  • The concern is whether platforms will count unregistered users accessing content through third-party services.
  • There is concern that some platforms do not meet criteria for child user protection but still pose a high risk to children.
  • Video content accessed via third-party means like WhatsApp would fall under WhatsApp's responsibility according to the Bill.
  • Drafting deficiencies exist in certain amendments, such as amendment 22.