
Online Safety Bill - Sitting 10

14 June 2022

Proposing MP: East Dunbartonshire
Type: Public Bill Committee

At a Glance

Issue Summary

John Nicolson proposes a new clause and amendment to criminalise the online encouragement or assistance of suicide, addressing concerns about the Bill's treatment of harmful content relating to suicide and self-harm; he later withdraws the amendment on assurances from the Minister. The Committee discusses adding priority illegal content offences from Northern Ireland and Scotland to Schedule 7, and the omission of human trafficking from that schedule, with an amendment proposed to make it a priority illegal content. Members raise concerns about the Bill's ability to protect children from harmful content, particularly service design features that can lead to addiction or harm. Kirsty Blackman questions the interaction of subsections (4)(c) and (5) in clause 53, noting the exclusion of harmful financial content such as gambling sites and loot boxes. Debate also covers the Bill's definitions of harm, including epilepsy trolling and media literacy duties, and an amendment to designate health-related misinformation and disinformation as priority content harmful to adults, prompting discussion of vaccine disinformation and the counter-disinformation unit within DCMS. John Nicolson remains concerned about the Bill's list of harms.
Alex Davies-Jones argues for a wider consultation process before the Secretary of State makes regulations under clauses 53 and 54. The Committee considers the clause on user identity verification, including concerns about verification processes and online anonymity; clause 59, which requires regulated services to report child sexual exploitation and abuse (CSEA) content to the National Crime Agency (NCA), alongside duties on proactive detection; and Chris Philp's amendments to clause 62 on the maximum term of imprisonment for either-way offences. Barbara Keeley proposes amendments to increase the frequency of transparency reports under clause 64 from annual to biannual and, through amendment 55, to enhance the content of transparency reporting by social media companies. Further amendments and new clauses address provider-generated and user-generated pornographic content, focusing on age verification and consent to protect victims of sexual exploitation, against the background of existing criminal law on child pornography and non-consensual image sharing and proposed changes from the Law Commission.
The sitting closes with a list of organisations and individuals that submitted supplementary written evidence to the Committee on the Online Safety Bill.

Action Requested

Nicolson seeks to add provisions under the Suicide Act 1961 to penalise those who send messages encouraging or assisting others to inflict serious physical harm upon themselves. He emphasises that the Online Safety Bill must address suicide-promoting content on smaller sites as well as larger ones.

Key Facts

  • New clause 36 seeks to criminalise the encouragement or assistance of suicide online.
  • A Samaritans supporter shared an experience in which extensive online research played a significant role in multiple suicide attempts and an eventual death.
  • Between 2011 and 2015, 151 patients who died by suicide were known to have visited websites that encouraged suicide or shared information about methods of harm; 82% of these patients were aged over 25.
  • The Bill does not mandate risk assessments based exclusively on risk, potentially allowing harmful content on smaller sites to slip through.
  • In July 2021, the Law Commission for England and Wales recommended creating a new offence of 'encouragement or assistance' of serious self-harm with malicious intent.
  • Currently, there are no provisions in the Bill to create an offence of assisting or encouraging self-harm.
  • John Nicolson withdraws his amendment.
  • The Minister agrees with the sentiment behind creating a new offence for encouraging or assisting serious self-harm.
  • The proposed offence is under final consideration by the Law Commission and Ministry of Justice.
  • Amendment 116 adds an offence under section 13 of the Criminal Justice Act (Northern Ireland) 1966.
  • The amendment aims to capture all criminal offences in other parts of the UK, ensuring uniform application across service providers regardless of location.
  • Any new Scottish or Northern Irish offences will be added to Schedule 7 by regulations following consultation with devolved authorities.
  • Amendment 90 would insert an offence under section 2 of the Modern Slavery Act 2015 into Schedule 7.
  • The amendment seeks to classify human trafficking as a priority illegal content on digital platforms.
  • The BBC and The Wall Street Journal have uncovered how traffickers use Instagram, Facebook and WhatsApp to advertise, sell, and co-ordinate the trafficking of young women.
  • Meta (Facebook) took only 'limited action' until Apple Inc. threatened to remove Facebook's products from the App Store unless it cracked down on human trafficking.
  • Facebook lacks moderators who speak many of the languages needed for content moderation.
  • Online grooming of young girls has increased by 60% in the last three years, with four in five victims being girls.
  • A survey found that 91% of children say loot boxes are available in games they play and 40% have paid to open them.
  • Research shows regular exposure to conventionally perfect body images can be damaging to children's mental health.
  • Subsection (4)(c) includes harmful content affecting an appreciable number of children.
  • Subsection (5) excludes illegal and potentially financially impactful content.
  • Concerns are raised about future-proofing clause 53 against new emerging harms.
  • Epilepsy trolling is covered by clause 150 because it causes psychological harm.
  • Clause 187 defines harm as physical or psychological and includes indirect actions leading to harm.
  • Media literacy duties are already covered under the Communications Act 2003.
  • Ofcom updated its media literacy policy in December, going beyond the Bill's initial provisions.
  • Primary priority content and harmful priority content can be updated via statutory instruments.
  • Clause 53 excludes illegal content and certain financial offences like gambling regulated by the Gambling Commission.
  • Loot boxes will be addressed through online advertising legislation overseen by Julia Lopez.
  • Amendment 83 seeks to insert a new clause into the Online Safety Bill regarding priority content harmful to adults.
  • Health-related misinformation and disinformation are not currently certain to be included as priority content under the existing framework.
  • In October 2021, one in five of the most critically ill covid-19 patients were unvaccinated pregnant women, a consequence of mixed messages about vaccine safety.
  • Labour is concerned about the overall aim of defining harm under the Government’s approach.
  • Health misinformation and disinformation are missing from the Online Safety Bill.
  • Estimates suggest that the number of anti-vaccination social media accounts increased by 25% between 2019 and the pandemic.
  • An Ofcom survey found that 28% of respondents had come across false or misleading information about covid-19.
  • Data from OpenSAFELY platform shows vaccine uptake disparities among ethnic groups in the UK.
  • Facebook, Twitter, and YouTube have implemented measures to address vaccine misinformation.
  • Over the past two years, DCMS has worked with other Departments to develop an operational response to disinformation.
  • A counter-disinformation unit within DCMS identifies and works with social media firms to remove misinformation.
  • The unit's focus during the pandemic was on covid, but it has since shifted to address the Russia-Ukraine conflict.
  • Ministers have engaged directly with social media companies regarding flagrant Russian disinformation.
  • John Nicolson finds himself not entirely reassured about the provisions of the Bill.
  • Amendment is pressed to a vote but negatived (Ayes 5, Noes 8).
  • Alex Davies-Jones moves amendment 62 requiring consultation with stakeholders before making regulations under clauses 53 or 54.
  • Amendment 62 requires the Secretary of State to consult other stakeholders before making regulations.
  • Clause 56 will force Ofcom to carry out reviews every three years assessing harmful content on user-to-user services.
  • The Minister confirms that research and consultation with stakeholders are already ongoing, but opposes formalising this in law due to potential delays.
  • The Labour party welcomes the addition of user verification duties in the revised Bill.
  • Clean Up the Internet has campaigned for a verification requirement process.
  • Research shows more than one in four people are put off posting on social media due to fear of abuse from anonymous posters.
  • Clause 59 requires regulated services to report all detected CSEA content.
  • BT is planning to use the Internet Watch Foundation’s hash list for proactive detection of CSEA.
  • Hashing technology assigns a unique string of letters and numbers (a hash) to each image; uploads matching the hashes of known illegal images can then be blocked, preventing access to them.
  • The Bill includes duties for companies to proactively prevent and detect CSEA.
  • Clause 103 gives Ofcom the power to mandate the use of certain technologies in fighting CSEA and terrorism.
  • Clause 59 requires companies to report identified harmful content to the National Crime Agency.
  • Amendments 1 to 5 relate to sentencing penalties for either-way offences in England and Wales.
  • The amendments bring the Bill into line with changes implemented by the Judicial Review and Courts Act 2022.
  • The term 'general limit in a magistrates’ court' is used to account for future regulatory changes.
  • Amendment 54 seeks to change the frequency of transparency reports from annual to biannual.
  • Clause 64(3) states that companies must publish transparency reports as specified in the notice issued by Ofcom.
  • The amendment aims to ensure platforms stay responsive to emergent risks and promote a culture focused on safety.
  • Amendment 55 would require Ofcom to request transparency reports from large companies.
  • Meta moderates content in only 70 languages, despite more than 3 billion people using Facebook monthly worldwide.
  • Frances Haugen testified that platforms can easily report additional data on language safety systems with a single line of code change.
  • Clause 64 requires social media platforms to publish transparency reports annually.
  • Amendment 54 proposes increasing reporting frequency to twice a year.
  • Amendment 55 seeks to specify additional topics for the transparency reports.
  • Schedule 8 grants Ofcom powers to enforce and monitor compliance with the Bill's requirements.
  • Amendment 114 aims to impose duties on verifying age and consent for regulated provider pornographic content.
  • It requires verification of individuals' adulthood and consent before publication.
  • The amendment also mandates removal of content if consent is later withdrawn.
  • Amendments tabled by Diana Johnson aim to address age verification and consent.
  • Pornhub faced lawsuits from victims whose abuse was posted on the site.
  • Leigh Nicol's personal experience of image-based sexual abuse is highlighted.
  • Support for including women and girls in the Online Safety Bill.
  • The Protection of Children Act 1978, Criminal Justice Act 1988 and Coroners and Justice Act 2009 criminalise child pornography.
  • Clause 68 imposes a legal duty on platforms to use age verification measures to prevent children from accessing pornographic content.
  • The Law Commission is working on proposed offences for intimate image abuse without consent.
  • Supplementary written evidence was submitted by: Full Fact (OSB69); the Care Quality Commission (CQC); Oxford University's Child-Centred AI initiative, Department of Computer Science; the British Retail Consortium (BRC); Claudine Tinsman, doctoral candidate in Cyber Security at the University of Oxford; the British Board of Film Classification (BBFC); the Advertising Standards Authority; and YoungMinds.
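The hash-list matching described under Key Facts can be sketched in a few lines. This is an illustrative simplification: services such as the Internet Watch Foundation's hash list rely on perceptual hashes (e.g. PhotoDNA) that tolerate resizing and re-encoding, whereas this sketch uses exact SHA-256 digests; the hash values and function name here are hypothetical.

```python
import hashlib

# Hypothetical hash list of known illegal images. The entries are
# SHA-256 digests of placeholder byte strings, not real image data.
known_hashes = {
    hashlib.sha256(b"example-image-1").hexdigest(),
    hashlib.sha256(b"example-image-2").hexdigest(),
}

def is_known_image(image_bytes: bytes) -> bool:
    """Digest the uploaded bytes and check them against the hash list."""
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

# An upload whose digest appears on the list is flagged for blocking;
# an unseen image passes through.
assert is_known_image(b"example-image-1") is True
assert is_known_image(b"unseen-image") is False
```

Because the platform stores only digests, it can block known material without retaining the images themselves; a perceptual hash would additionally catch near-duplicates that exact digests miss.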