Online Safety Bill - Sitting 6
07 June 2022
Type
Public Bill Committee
At a Glance
Issue Summary
The sitting covers amendments on illegal content risk assessments and safety duties under the Online Safety Bill, including the treatment of cross-platform risk and user-to-user service functionalities in relation to illegal content, and clarifies related clauses on online safety measures. Christina Rees discusses amendments aimed at strengthening regulatory measures against child abuse 'breadcrumbing' and cross-platform grooming, at addressing cross-platform risks, and at ensuring companies take proactive steps to prevent child sexual exploitation and abuse (CSEA). Amendment 28 would strengthen protections for children online by requiring regulated services to assess their level of risk based on blocking and detecting CSEA content. Further amendments concern children's risk assessments, publication requirements, and consideration of the production of illegal content. Speakers address the need for robust risk assessments and accountability within technology companies, the importance of making those risk assessments public, the Bill's provisions protecting children from illegal content, proper governance by platforms, and the effectiveness of fines and personal liability in ensuring online safety. Amendments are also discussed that would require Ofcom to assess risks of harm particularly affecting people with certain characteristics or membership of multiple groups, addressing intersectionality. 
Further amendments aim to strengthen the Bill's protection of children from sexual and physical abuse via online services, particularly sexual exploitation and abuse. Christina Rees discusses clause 12 (adults' risk assessment duties) and amendments on user empowerment duties and identity verification, including a requirement for Ofcom to produce guidance on user identity verification for providers of Category 1 services and amendments that would tackle online abuse and disinformation through user verification. Other amendments concern user identity verification and the protection of content important to democracy, focusing on the safety of individuals involved in UK elections and on online abuse directed at politicians and their staff, with speakers emphasising the need for protection and fair democratic engagement. Christina Rees discusses clause 16 stand part and proposes new clause 7, which would require the Secretary of State to publish a report on the effectiveness of clauses 15 and 16. She raises concerns about clauses 15 and 16, which require platforms to handle journalistic and democratically important content, and about how those clauses would be interpreted and implemented. 
The statement lists organisations and individuals who have submitted evidence to the Online Safety Bill committee.
Action Requested
The speaker supports the amendments that aim to ensure regulated companies' boards or senior staff have responsibility for illegal content risk assessments, require the risk assessment to consider cross-platform risks, and include measures to prevent the production of illegal content. The speaker also highlights existing transparency provisions in clause 64 that mandate annual reports from larger companies to Ofcom.
Key Facts
- Amendments 10, 14, 25, 19, 20, 26, 18, 21, 30, and 31 are discussed.
- Clause 64 mandates annual transparency reports from larger companies to Ofcom.
- The transparency reports cover the incidence of illegal content and of content harmful to children and adults.
- The Bill includes a requirement for platforms to consider illegal content 'by means of the service'.
- Clause 8(5)(d) requires services to risk-assess functionalities that facilitate the presence or dissemination of illegal content.
- Ofcom will provide guidance on how companies can comply with these duties, including measures related to senior-level engagement.
- Amendments 18, 21, 30, and 31 address aspects of online safety measures.
- The Bill obliges service providers to take comprehensive measures against CSEA content.
- Ofcom can intervene if service providers do not engage in appropriate collaborative behaviour.
- The Bill does not capture the range of ways child abusers use social networks.
- In Q1 2021, there were 6 million interactions with accounts used for connecting abusers.
- Abusers exploit features like Facebook’s algorithmic friend suggestions to contact children before moving communication to other platforms.
- The amendments seek to ensure regulated companies' boards or senior staff have responsibility for illegal content risk assessments.
- Amendments proposed include requirements to consider cross-platform risk and content which could facilitate or aid the discovery or dissemination of child sexual exploitation and abuse (CSEA) content.
- Amendment 28 seeks to require regulated services to assess their level of risk based on blocking and detecting CSEA content.
- Clause 8 of the Bill requires risk assessments to cover priority illegal content, including CSEA offences listed in schedule 6.
- The government believes clause 59 already mandates reporting CSEA content detected by platforms to the National Crime Agency.
- Amendment 15 proposes a duty for the children's risk assessment to be approved by either the board or a named senior manager.
- Amendment 27 requires the children’s risk assessment to consider the production of illegal content.
- Amendments 16 and 13 propose duties in clause 25 similar to those proposed for clause 10.
- Clause 10 introduces a duty for user-to-user services likely to be accessed by children to conduct risk assessments.
- The Joint Committee recommended that the risk assessment should be approved at board level, which was rejected by the Government.
- In October 2021, 60 child protection organisations urged Meta to publish its risk assessments but were refused.
- The Philippines is identified as a hotspot for livestreamed child sexual abuse and the UK is the third-largest consumer of such abuse globally.
- The Bill includes an obligation for platforms to reduce risk of illegal content, including user-generated content.
- Search firms must prevent users from finding illegal content.
- Companies can be fined up to 10% of global revenue or disconnected if they fail to protect children.
- Governance arrangements are covered in clause 10(6)(h) and require Ofcom approval.
- Companies like Meta (Facebook) have a global turnover exceeding $71.5 billion.
- The Bill includes personal criminal liability for information provision, with up to two years' imprisonment.
- Fines can reach up to $7 billion per set of breaches.
- Clause 64 (noted above) mandates annual transparency reports from larger companies to Ofcom.
- Amendments 72, 73, 74, and 75 address intersectionality issues.
- Amendment 71 requires Ofcom to assess risks of harm affecting people with certain characteristics or membership in groups as part of its risk register.
- The amendments aim to ensure that the Bill considers increased vulnerability arising from being a member of multiple high-risk groups.
- Amendments aim to protect children from sexual and physical abuse through online services.
- Amendment 29 is proposed in clause 11, page 10, line 20.
- Amendment 33 is proposed in clause 26, page 26, line 18.
- UK law enforcement received 97,727 industry reports relating to online child abuse in 2021, a 29% increase from the previous year.
- Online grooming offences reached record levels in England and Wales in 2020-21.
- The number of sexual communications with a child offences in England and Wales increased by almost 70% in three years.
- Amendment 12 creates a duty to publish the adults' risk assessment.
- The amendment requires proactive supply of the risk assessment to Ofcom.
- Amendment 46 seeks to clarify that users should be able to determine whether other users are verified or not.
- Amendment 47 proposes adding a definition for 'Identity Verification' to the terms defined in the Online Safety Bill.
- Ofcom must produce guidance on user identity verification for providers of Category 1 services.
- The guidance must address effectiveness, privacy and security, accessibility, time-frames for disclosure to law enforcement, transparency, and user appeal mechanisms.
- Before producing the guidance, Ofcom must consult the Information Commissioner, the Digital Markets Unit, technologically expert persons, and user representatives.
- Amendment 46 aims to empower people to use information about verification to make judgments about account reliability.
- Clause 14 of the Bill requires platforms to enable adult users to filter out non-verified accounts.
- The Minister contends that clause 14(6) already enables users to see who is verified when filtering non-verified users.
- Amendment 105 seeks to insert language into Clause 15 about ensuring the safety of people involved in UK elections.
- Amendment 106 proposes Ofcom issuing a code of practice for providers of Category 1 and 2A services regarding compliance with election safety duties.
- The amendments aim to strengthen protections for democratic processes through legal obligations on service providers.
- MPs receive over 7,000 abusive or hate-filled tweets per month.
- The University of Salford study from 2020 found high levels of abuse directed at MPs on Twitter.
- Two Members of Parliament have been murdered in the last six years, a point raised in the context of abuse and threats directed at politicians.
- MPs face significant levels of abuse on social media platforms.
- The Online Safety Bill includes duties for social media firms to remove illegal content and prevent it proactively.
- Clause 150 updates the Malicious Communications Act, creating a new harmful communications offence.
- Clause 16 stand part is discussed.
- New Clause 7 requires a report on the extent to which Category 1 services have fulfilled their duties under Clauses 15 and 16.
- The report must analyse the effectiveness against foreign state actors, extremist groups, and misinformation sources.
- The Secretary of State must lay the report before Parliament within one year of the Act being passed.
- Platforms are required to define 'journalistic' content, a task critics argue they are unsuited to perform.
- Individuals could exploit the duties outlined by masquerading as journalists or claiming democratic importance.
- The Bill's definitions are broad and vague, potentially allowing hate speech under certain conditions.
- Far-right figures such as Tommy Robinson self-define as journalists to exploit loopholes.
- New clause 7 would require a report reviewing clauses 15 and 16 for effectiveness.
- Clauses 15 and 16 deal with content of democratic importance and journalistic importance respectively.
- Ofcom will issue codes of practice to specify how platforms should implement these provisions.
- The balancing exercise requires platforms to weigh content that is vilely racist, antisemitic, or misogynistic against its claimed democratic context.
- OSB39 General Insurance.
- OSB40 Epilepsy Society.
- OSB41 Free Speech Union.
- OSB42 Graham Smith.
- OSB43 Center for Data Innovation.
- OSB44 Samaritans.
- OSB45 End Violence Against Women coalition, Glitch, Refuge, Carnegie UK, 5Rights, NSPCC and Professors Lorna Woods and Clare McGlynn (joint submission).
- OSB46 Sky.
- OSB47 Peter Wright, Editor Emeritus, DMG Media.
- OSB48 Graham Smith (further submission).
- OSB49 CARE (Christian Action Research and Education).
- OSB50 Age Verification Providers Association (supplementary submission).
- OSB51 Legal Advice Centre at Queen Mary, University of London and Mishcon de Reya LLP (joint submission).
- OSB52 Google UK (supplementary submission).
- OSB53 Refuge (supplementary submission).
- OSB54 Reset (supplementary submission).
- OSB55 Public Service Broadcasters (BBC, Channel 4, and Channel 5).
- OSB56 Which?.
- OSB57 Professor Corinne Fowler, School of Museum Studies, University of Leicester.
- OSB58 Independent Media Association.
- OSB59 Hacked Off Campaign.
- OSB60 Center for Countering Digital Hate.