Online Safety Bill - Sitting 3
15 December 2022
Type
Public Bill Committee
At a Glance
Issue Summary
Paul Scully presents Government amendments to the Online Safety Bill concerning the designation of Category 1 services. These include technical amendments to ensure OFCOM can proactively identify and regulate emerging high-reach, high-influence companies before they reach Category 1 status, broadened criteria for Category 1 designation, and a power for Ofcom to impose Category 1 duties on user-to-user services posing a high risk of harm regardless of the number of users. Following the removal of the adult safety duties from the Bill, new transparency and free speech duties are introduced for Category 1 services. Further amendments address definitions and conditions for regulated services, OFCOM's enforcement powers and transparency in guidelines, the treatment of video-sharing platforms during transition, and content definitions, including Amendment 58 to Clause 203 and the distinction between oral and aural communications. Scully also discusses the need for a comprehensive and timely review of the regulatory regime and the implementation timeline after Royal Assent. New clause 1 requires OFCOM to produce and publish guidance for providers on content harmful to children and on the user empowerment duties; other new clauses address user access restrictions and provider responsibilities.
Additional new clauses relate to online safety measures for Category 1 services: defining terms for regulated content, listing emerging Category 1 services, and child user empowerment duties requiring providers to include features that increase children's control over harmful content. MPs debate the Bill's provisions for protecting children online, including proposals to introduce senior management liability and criminal offences for non-compliance with children's safety duties. Scully addresses concerns about these new offences and about the Bill's potential impact on innovation and business attractiveness in the UK, and closes by addressing the recommittal of the Bill to Committee in a festive and appreciative tone.
Action Requested
The Government proposes amendments that require OFCOM to create a list of services approaching the Category 1 threshold to ensure prompt regulation. This addresses concerns about rapid growth in tech companies and potential delays in classification.
Key Facts
- Clause 82 is amended to include new conditions for user-to-user services.
- OFCOM will publish a public list of services that reach at least 75% of the Category 1 user-number threshold and have at least one Category 1 functionality.
- The amendments aim to ensure OFCOM can designate companies as Category 1 swiftly in response to rapid growth.
- Amendments aim to change the approach to category 1 designation after removal of adult safety duties from the Bill.
- Category 1 services will be designated based on functionalities enabling easy and wide dissemination of user-generated content.
- The requirement for a number of users threshold remains unchanged.
- Amendments broaden criteria for selecting companies likely to fall into category 1.
- The Secretary of State will make the regulations, but Ofcom will carry out the objective, evidence-based designation process.
- Process includes consultations at every stage and is subject to parliamentary scrutiny via statutory instruments.
- Amendment allows Ofcom to impose Category 1 duties regardless of user numbers.
- Kiwi Farms is cited as an example of a small platform that caused significant harm.
- Research found that more than half of mainstream pornography involving black women also involves violence.
- The adult safety duties have been removed from the Online Safety Bill.
- New transparency and free speech duties are being introduced for category 1 services.
- A watchlist amendment will allow Ofcom to monitor companies approaching the category 1 threshold.
- Amendment 93 defines 'regulated user-generated content' for the purposes of Schedule 11.
- Amendments 76 to 78 and consequential amendments (87, 88, 89, 90) relate to OFCOM’s advice on regulations under paragraph 1(1), (2) or (3).
- Amendment 50 in Clause 87 includes new duties related to terms of service.
- Ofcom will have the ability to direct companies to take specific steps for compliance, issue fines, and apply for business disruption measures.
- Criminal proceedings can be instituted against senior managers who fail to ensure company compliance with information notices.
- The criminal offence related to senior management responsibility will commence two months after Royal Assent.
- The Online Safety Bill requires a review two to five years after the regulatory regime comes into force.
- Other legislation like the Digital Economy Act 2017 is already outdated due to rapid technological changes.
- Consultation will be necessary to assess the effectiveness of the legislation.
- Amendment ensures services regulated under Part 4B of the Communications Act 2003 are not required to comply with new duties during transition.
- Paul Scully provides clarification on livestreaming platforms and their inclusion in the video-sharing service regime.
- The definition of 'content' is intended to be indicative rather than exhaustive.
- Amendment 58 removes a definition no longer required in Clause 203.
- The new definition of 'general limit in a magistrates' court' is now included in the Interpretation Act 1978.
- Oral means speech only, while aural includes all sounds heard on voice calls, including music.
- The legislation requires some activity that can only be carried out after Royal Assent.
- Since November 2020, Ofcom has regulated harmful content through the video-sharing platform regulatory regime.
- In December 2020, interim codes of practice on terrorist content and sexual exploitation were published.
- Safety by design guidance was published in June 2021.
- New clause 1 places a duty on OFCOM to produce and publish guidance for providers of user-to-user regulated services.
- Guidance will include examples of content considered harmful to children or relevant to the user empowerment duty in clause 14.
- OFCOM must consult with appropriate persons before producing any guidance under this section.
- New Clause 3 imposes a duty on providers not to take down content, restrict user access, or suspend/ban users except in accordance with the terms of service.
- The clause allows exceptions for compliance with duties protecting individuals from illegal content and children from harmful content.
- New Clause 4 further clarifies duties regarding terms of service and includes definitions for criminal/civil liability and fraudulent advertisements.
- The new clauses aim to ensure providers' terms of service are clear and applied consistently.
- Providers must allow users and affected persons to easily report content or user actions that they consider relevant.
- OFCOM must publish guidance for Category 1 service providers to assist them in complying with the duties.
- The new clause requires OFCOM to assess regulated user-to-user services that likely meet the Category 1 threshold conditions.
- Services must reach at least 75% of the UK user-number threshold and have one or more of the functionalities specified in the regulations.
- OFCOM must publish a list containing details about each service meeting these criteria.
- The statement sets out duties to empower child users in relation to Category 1 services.
- Providers are required to include features that reduce harmful content exposure or alert users about harmful content proportionate to risks identified by assessments.
- All features must be made available to all child users and clearly specified in terms of service.
- The proposed amendment is based on clause 14 of the Bill before amendments were made.
- Subsection (8) would enable child users to use features that allow them to approve only people they know and block others.
- Subsection (9) requires services to include features enabling children to filter out private messages from non-verified or adult users.
- The Bill includes provisions requiring service providers to conduct comprehensive risk assessments.
- Service providers must implement age-appropriate protections against harmful content as outlined in clause 10(6)(e).
- Providers are required to take measures, where proportionate, to meet child-safety duties under clause 11(4)(f).
- Senior managers will not be personally liable under current legislation unless they fail to comply with information requests or willingly mislead the regulator.
- A survey indicates that 82% of UK adults support appointing a senior manager as responsible for children's safety on social media sites.
- The concept of consent or connivance in other Acts, such as the Theft Act 1968 and the Health and Safety at Work etc. Act 1974, would be applied to determine criminal liability under new clause 9.
- The Online Safety Bill includes financial penalties of up to £18 million or 10% of qualifying revenue, whichever is higher.
- In 2021, 97% of child sexual abuse material showed female children; the Internet Watch Foundation took down a record-breaking 252,000 URLs featuring images of children being raped, with seven out of ten cases involving children aged 11 to 13.
- Apple has paused an update that would have automatically scanned for child sex abuse material following privacy and security concerns.
- The new clause would establish criminal offences punishable through fines or imprisonment.
- It aims to hold companies accountable for user safety, especially younger users.
- Concerns are raised about potential unintended consequences such as excessive content removal.
- The Bill mandates regular risk assessments by companies and the implementation of systems to mitigate those risks.
- Companies could face fines up to 10% of their global turnover for non-compliance.
- Senior tech executives can already be held criminally liable for failing to ensure their company complies with Ofcom’s information requests.
- The Bill aims to protect users without stifling innovation or making businesses overly cautious.
- The recommittal process is described as unusual and involves discussions previously had.
- Colleagues from across the House have supported amendments, particularly those from the SNP.
- The room used for debates is noted for not being freezing.