
Online Safety Bill - Sitting 2

13 December 2022

Proposing MP: Angela Eagle (Wallasey)
Type: Public Bill Committee

At a Glance

Issue Summary

Angela Eagle discusses the Online Safety Bill in its second Public Bill Committee sitting, focusing on the removal of clauses 12 and 13, which covered adults' risk assessment duties and protections against legal but harmful content. She addresses companies' duties to remove illegal content and enforce their terms of service, the balance between freedom of expression and protection from illegal content online, and amendments 9 to 15 on user empowerment tools and the categories of content they cover, noting confusion caused by the Minister arguing against an amendment that had not yet been moved. She raises concerns about the exclusion of harmful health content and climate change disinformation from the proposed list of regulated content, and argues that the Bill fails to tackle extremist and radicalising content that sits below the criminal threshold, drawing on personal experience of the dangers of online radicalisation to call for stronger regulation of social media platforms.
The debate also covers amendments 102 and 103, which would require platforms to hide certain legal but harmful content by default for adult users, and amendment 101, which would ensure that online safety features are accessible and understandable for adult users with learning disabilities. On terms of service, Eagle discusses new clauses intended to ensure platforms adhere to their own policies on content removal and user suspension, and to enhance transparency, accountability, and free speech protections for users of category 1 services, while raising concerns about Government new clauses on the responsibilities of large or risky service providers. Further discussion covers amendments clarifying the meaning of 'restricting access to content', amendments to clause 20 to strengthen protections for freedom of expression and privacy, record-keeping and review duties for online platforms, the designation of priority categories of harmful content under clause 56, and transparency reporting, where she argues that annual reports should be at least biannual and made publicly available, with Ofcom overseeing them.
Finally, the Committee considers amendments 72 to 75, which clarify the Schedule 8 definitions of relevant content, consumer content, and regulated user-generated content, and the importance of bringing voice chat within the Bill's scope to protect children using online gaming platforms.

Action Requested

Eagle urges reconsideration of the removal of clauses 12 and 13 from the Bill, to ensure that the work done up to this point is not undermined and that specific protections for adults remain in the legislation. She also highlights the importance of considering vulnerable young adults with learning disabilities or autism spectrum disorders, who remain at risk beyond the age of 18.

Key Facts

  • The removal of clauses 12 and 13 from the Bill is considered a fatal error by Eagle.
  • These clauses are crucial for legislating forms of harm that are not illegal but are harmful nonetheless.
  • There has been a U-turn in policy with 15% of the Bill coming back to Committee, which is unprecedented.
  • Companies must remove illegal content from platforms.
  • Social media companies have a duty to enforce their terms of service.
  • A fine of up to £18 million or 10% of global annual turnover, whichever is greater, can be imposed for non-compliance.
  • Criminal liability is attached if companies do not put things right or share information with Ofcom.
  • The Minister asserts that the vast majority of platforms have set higher bars than what was originally proposed in the Bill.
  • Platforms can face intervention from Ofcom if they do not adhere to their terms of service.
  • The Online Safety Bill includes adult safety risk assessments covering racially and religiously motivated harassment.
  • Amendments 9 to 14, together with amendment 15, focus on enhancing user empowerment tools.
  • Amendment 15 defines categories of content relevant to the duty in subsection (2) including suicide, self-harm, eating disorders, abusive content targeting race, religion, sex, sexual orientation, disability, or gender reassignment.
  • Amendments aim to provide adult users with greater control over their online experience while protecting freedom of expression.
  • The Minister is currently debating an amendment that has not yet been moved.
  • The hon. Member for Pontypridd will move the amendment on behalf of the Opposition.
  • The government's prototype list includes online abuse, harassment, circulation of intimate images without consent, content promoting self-harm and eating disorders, legal suicide content, and harmful health content that is demonstrably false.
  • Amendment (a) to amendment 15 aims to reinsert harmful health content into the Bill.
  • Carnegie’s early written evidence highlighted the threat of climate change disinformation.
  • The Centre for Countering Digital Hate reported that climate-sceptic content gained significantly more engagement than authoritative sources during COP26.
  • Labour Front-Bench amendments address climate change denial and misinformation about vaccinations.
  • Kirsty Blackman criticises a toggle system under which people must actively opt out of seeing harmful content.
  • Damian Collins supports the introduction of shields as an additional safety measure in the Bill.
  • The Online Safety Bill fails to tackle the issue of dangerous online content that sits just below the criminal threshold.
  • HM Prison and Probation Service report indicates the internet's role in radicalizing those convicted of extremist offences.
  • The speaker's personal involvement stems from her sister's murder by a far-right extremist.
  • John was radicalised online at age 15.
  • White nationalist groups target younger recruits through Call of Duty gaming tournaments.
  • ISIS uses social media to reach a large-scale global audience and select vulnerable individuals.
  • Social media platforms need unique regulation due to their distinct nature from real-life interaction.
  • Amendment 102 forces platforms to have safety tools 'on' by default.
  • Amendment 103 allows individuals to disapply certain safety features.
  • The amendments aim to protect users from priority harms such as abuse and hate speech.
  • Labour disagrees with placing the responsibility on users rather than platforms.
  • The amendments aim to ensure user empowerment tools are enabled by default and easily accessible.
  • The proposed changes seek to prevent a two-tier internet system where some users are unaware of harmful content visible to others.
  • Legal but harmful content can act as a gateway to radicalisation and extremism.
  • Amendments 102 and 103 suggest hiding such content by default for adult users.
  • The Government removed the adult legal but harmful duties from the Bill's initial proposal.
  • Amendment 101 proposes a new subsection (6A) in clause 14 of the Online Safety Bill.
  • Clause 14(4) mandates providers to make all enforcement and protection tools available to all adult users, including those with learning disabilities.
  • Ofcom is required to consult experts on vulnerable adults when producing guidance for user verification duties.
  • The amendments aim to reset the relationship between platforms and users regarding terms of service.
  • Ofcom will oversee companies' systems for discharging duties related to content removal or user suspension based on terms of service.
  • Risk assessments and codes of practice may be in place to ensure compliance with freedom of speech obligations.
  • New clause 4 requires providers to ensure terms of service are clear, accessible, and consistently enforced.
  • New clause 5 mandates Ofcom to publish guidance on compliance with new duties.
  • Video-sharing platforms may be temporarily exempted from the new terms of service duties.
  • Labour opposes the Government's current approach as it places too much responsibility on users rather than platforms.
  • The focus should be on how harmful content spreads online, not just taking down harmful content.
  • Government new clause 5 is supported due to its requirement for Ofcom guidance.
  • Amendments clarify restrictions on user access to content.
  • Down-ranking is not considered restricting access to content under the new amendments.
  • Users can complain about inappropriate down-ranking.
  • Amendments 28, 29, and 31 require providers to have particular regard for freedom of expression and privacy when implementing safety measures.
  • Amendments 36 and 37 apply these protections specifically to search services.
  • Ofcom can take enforcement action against non-compliant platforms.
  • Clause 20 aims to provide balancing provisions for online safety while considering freedom of expression and privacy.
  • Labour expresses concerns over the inconsistency in platforms' approaches towards achieving this balance.
  • Elon Musk's takeover of Twitter has led to changes such as disbanding the trust and safety council, which included experts working on tackling harassment and child sexual exploitation.
  • Labour does not seek to amend the clause but reiterates the significance of requiring in-scope services to publish their risk assessments.
  • Written ministerial statements on the Online Safety Bill previously indicated an amendment would require large platforms to publish summaries of their risk assessments for illegal content and harmful material to children.
  • Details about publishing will be scrutinized in the Lords but more clarity is requested from the Minister before then.
  • Clause 21 provides Ofcom with the power to exempt certain types of services from record-keeping and review duties.
  • Service providers can take alternative measures to comply with safety duties if they are innovative and appropriate for their business models and technological contexts.
  • Providers must have particular regard to freedom of expression and users' privacy when implementing safety measures.
  • Clause 56 deals with designating priority categories of harm for children and adults.
  • The Secretary of State must consult Ofcom before making regulations under clause 56.
  • Amendments 42 to 45 have removed definitions related to adult safety duties and legal but harmful content from the Bill.
  • Labour advocates for mandatory transparency reporting by providers.
  • Reports are currently required annually; the speaker argues that biannual reports should be a bare minimum.
  • The Bill lacks clarity on how transparency reports will be made publicly available.
  • There is concern about the inflexibility of annual reporting requirements, given how rapidly major changes can occur on digital platforms.
  • Researchers outside Ofcom are considered crucial for analysing transparency reports and identifying issues.
  • Ofcom will publish a notice specifying required information for annual transparency reports.
  • Category 1, 2A, and 2B services must produce these reports annually.
  • The format, manner, and deadline for reporting are determined by Ofcom.
  • Amendments 72 to 75 clarify definitions within Schedule 8 of the Online Safety Bill, to ensure consistency and support for Ofcom's requirements.
  • Amendment 72 defines relevant content for Schedule 8; amendments 73 and 75 define consumer content and regulated user-generated content respectively.
  • One-to-one live aural communications are exempted from regulation because of concerns about telephony services, but this exemption raises issues about voice chat in games and on social media platforms.
  • Children primarily use voice chat for in-game communication.