Online Safety Bill - Sitting 16
28 June 2022
Type
Public Bill Committee
At a Glance
Issue Summary
This sitting considered a series of new clauses to the Online Safety Bill. These included a duty on regulated companies to disclose to Ofcom significant changes or breaches affecting their safety duties; duties to distinguish and clearly label paid-for advertisements, to verify advertisers, to report on content protection measures, and to produce guidance on user identity verification; and a requirement for Category 1 services to submit risk assessments to Ofcom and publish them, to ensure transparency and accountability. Members debated the liability of platforms for supply-chain failures amounting to a breach of specified duties, and new clauses that would strengthen Ofcom's role in promoting media literacy for regulated user-to-user and search services, building on its existing duties under the Communications Act 2003 and on the importance of media literacy in combating misinformation and disinformation online. Concerns were raised about algorithmic prompts in search functions that can surface harmful and discriminatory content related to protected characteristics. The Committee also debated new clause 45, which would require a report from the Secretary of State on steps taken to tackle disinformation, alongside calls for Ofcom to be more proactive in identifying and responding to misinformation incidents during crises and other serious information incidents.
Members further discussed the need for increased transparency and research publication by social media companies; the information powers granted to Ofcom under clause 85(1); and the need to address violence against women and girls (VAWG), including the Bill's provisions to protect women and girls from online abuse. Other proposals considered were new clause 24, which would allow users to bring civil proceedings against providers for breaches of duties under Part 3 of the Act; a new clause requiring Ofcom to publish an annual report on the effectiveness of the Act; new clause 26, which would require a report on the harms of synthetic media content; and the timetable on which the Bill is expected to report.
Action Requested
The minister opposes the new clause, arguing that the Bill's existing provisions already mandate strong information disclosure, and that the number of entities potentially in scope of this clause is far larger than in comparable regulated sectors, posing risks of over-disclosure or misuse. He also notes that specific provisions, such as clause 85, already allow Ofcom to request the information it needs from companies.
Key Facts
- The new clause proposes a duty for regulated user-to-user services to disclose significant changes and breaches impacting safety duties to OFCOM.
- Clause 85 of the Bill allows Ofcom to require information through an information notice, with penalties including fines of up to 10% of global revenue or service discontinuation.
- Up to 25,000 companies may be within scope of this new clause, significantly larger than the financial services sector regulated by the FCA.
- New Clause 5 proposes distinguishing paid-for advertisements from other content with clear labelling.
- New Clause 6 introduces an advertisement verification process requiring advertisers to demonstrate authorization by a UK regulatory body.
- New Clause 7 mandates the Secretary of State to publish a report reviewing Clauses 15 and 16 within one year after Act passage.
- New Clause 8 requires OFCOM to produce guidance on user identity verification, considering accessibility for vulnerable users and competition.
- Each new clause was read but negatived in committee votes.
- The Bill requires platforms to carry out a risk assessment under Part 3.
- Platforms must submit these assessments to Ofcom and publish them on their websites.
- Clause 13(2) already mandates summarising risk findings in terms of service.
- Clause 11 sets obligations for specifying child safety risks in terms of service.
- Clause 13 outlines duties related to adult users' risk assessments and summaries.
- Clause 64 gives Ofcom power to specify transparency requirements for platforms.
- New clause 13 seeks to impose liability on a provider where a company providing regulated services on its behalf does not comply with the duties in the Bill.
- Clause 180 of the Bill ensures legal certainty and clarity over which companies are subject to duties.
- The Committee voted against new clause 13, with Ayes 6 and Noes 9.
- New clauses require OFCOM to prepare a strategy within six months of the clause coming into force.
- The strategy must include steps to achieve specific objectives and an explanation of why these measures are effective.
- OFCOM must consult persons with experience in media literacy, the advisory committee on disinformation and misinformation, and other relevant individuals before publishing or revising the strategy.
- Research from Ofcom found that one-third of internet users are unaware that online information may be inaccurate or biased.
- About 61% of social media users who say they are confident in judging whether online content is true in fact lack the necessary skills.
- Clause 103, which proposed a new media literacy duty for Ofcom, was dropped from the final Online Safety Bill.
- New clauses 14, 15, and 16 would introduce a stronger media literacy duty on Ofcom with specific objectives.
- Ofcom has a statutory duty to promote media literacy under the Communications Act 2003.
- The Government published their own online media literacy strategy about a year ago with funding.
- Ofcom's approach to online media literacy was published at the end of last year.
- The issue involves derogatory terms related to protected characteristics appearing in search results.
- Search companies such as Google address these issues when they are highlighted, but by then the damage has already been done.
- Algorithmic prompts can cause significant harm and are the responsibility of internet companies.
- The new clause would exclude all algorithmic prompts for any term associated with a protected characteristic under the Equality Act 2010.
- New clause 45 proposes a report from the Secretary of State on steps taken to tackle disinformation.
- The first report must be submitted within six months of the Act being passed.
- Subsequent reports are required at least once every three months thereafter.
- The current Bill focuses on regulating the day-to-day online environment.
- Clause 146 gives the Secretary of State powers in special circumstances but is too slow for real-time incidents.
- Ofcom could introduce a system allowing public reporting and convening response groups during crises.
- The Bill includes powers under clause 146 for the Secretary of State to direct Ofcom in matters of public health, safety or national security.
- Clause 135 mandates Ofcom's transparency report on harmful content prevalence and severity.
- A counter-disinformation unit (CDU) is already operational but full details cannot be disclosed due to security risks.
- New clause 19 aims to allow Ofcom to request research from regulated services for better regulation of platforms.
- New clause 19 would give Ofcom the power to request all research that companies hold on a topic specified by Ofcom.
- Clause 85 allows Ofcom to require any information from companies for online safety functions, including research.
- Companies that lie or suppress information face criminal offences with fines up to 10% of global revenue and two years in prison.
- Chris Philp discusses the power of Ofcom under clause 85(1) of the Online Safety Bill.
- The clause allows Ofcom to request any relevant information from companies without specifying particular reports.
- Companies failing to comply could be prosecuted as per clause 92.
- A new clause on violence against women and girls is being read for the first time.
- GREVIO published a report on the digital dimension of violence against women and girls in October 2021.
- Ellesha, a survivor of image-based sexual abuse, was informed by police that they could not access Pornhub where videos were uploaded.
- The Law Commission's recommendations on intimate image abuse will be released on July 7.
- The Domestic Abuse Act 2021 introduced increased penalties for stalking and harassment.
- Automatic early release from prison for violent and sex offenders has been ended.
- Schedule 7 of the Bill includes priority offences such as intentional harassment, alarm or distress, harassment, and stalking.
- Clause 18(2) mandates that platforms must operate a complaints procedure that is easy to access and use by children and provides appropriate action.
- New Clause 24 would enable users to bring civil claims against providers for breaches of Part 3 duties.
- The Joint Committee recommended developing a bespoke route in courts for user redress.
- Government argues existing law already allows individuals to seek redress for negligence or breach of contract.
- New clause 25 would require Ofcom to publish an annual report on the operation of its regulatory functions under the Act.
- The report must include an assessment of the Act's continued effectiveness in reducing harm online.
- It will also detail the volume of content removed by category 1 services, the exercise of OFCOM powers, and reports received from regulated services.
- New clause 26 requires the Secretary of State to publish a report on harms caused by synthetic media content within six months of the Bill being passed.
- Synthetic media content includes deepfake technology which poses significant threats and challenges, especially in the entertainment industry.
- The Screen Actors Guild estimates that 96% of deepfakes are pornographic and depict women.
- The Committee hopes to report the Online Safety Bill this afternoon.
- The sitting was adjourned until 2 PM.
- Mr. Gale thanked members for their courtesy and the staff for their support.