
Online Safety Bill - Sitting 4

26 May 2022

Proposing MP
Herne Bay and Sandwich
Type
Public Bill Committee

At a Glance

Issue Summary

This sitting of the Online Safety Bill Committee, chaired by Roger Gale (MP for Herne Bay and Sandwich), took oral evidence from a series of witnesses. Stephen Almond of the Information Commissioner's Office addressed the Bill's approach to end-to-end encryption, the balance between privacy and child safety, regulatory co-operation between the ICO and Ofcom, and the flexibility needed to future-proof the legislation. Sanjay Bhandari of Kick It Out and Lynn Perry of Barnardo's discussed anonymity and the verification of online identities, the Bill's provisions for protecting children, and whether its enforcement powers are adequate to hold companies and senior executives accountable. Eva Hartshorn-Sanders and Poppy Wood raised the challenges of regulating algorithmically promoted harmful content, shortcomings in risk assessments and transparency, concerns that the clause on content of democratic importance could impede implementation and shelter disinformation, and the Bill's provisions on super-complaints, complaints procedures, and researchers' access to data.
Owen Meredith and Matt Rogerson debated the Bill's scope with respect to comments sections on news publisher platforms and its impact on journalistic content and recognised news publishers. Tim Fassam, Rocio Concha, and Martin Lewis examined the Bill's weaknesses on online scams and fraudulent advertising, including a possible right to redress for victims, easier reporting of scam advertisements, and questions about thresholds and enforcement mechanisms. Finally, Frances Haugen, a former Facebook employee, gave evidence on the challenges platforms such as Meta face in implementing the Bill, covering the limitations of AI moderation, the need for human moderators, data transparency, and child safety, before Roger Gale thanked her, closed the session, and confirmed the Committee's next meeting date.

Action Requested

Stephen Almond from the Information Commissioner's Office emphasizes that the Bill maintains a balanced approach towards encryption, advocating for technological innovation to address risks without compromising privacy. He also highlights the importance of transparency in data protection and supports Ofcom's role in issuing technology notices.

Key Facts

  • The ICO welcomes the Online Safety Bill.
  • End-to-end encryption provides security but can be a safe harbour for malicious actors.
  • Ofcom will have powers to issue technology notices to regulated services if necessary.
  • Stephen Almond urges incentivising companies to develop technological innovations without compromising privacy.
  • The children’s code and Ofcom's video-sharing platform regime were introduced last year.
  • Regulatory co-operation between ICO and Ofcom ensures harmonised introduction of regimes.
  • Stephen Almond recommends retaining flexibility within the Online Safety Bill to address future harms.
  • Roger Gale is chairing the Committee hearing.
  • The witnesses include Sanjay Bhandari from Kick It Out and Lynn Perry from Barnardo’s.
  • Technical difficulties delayed Lynn Perry's testimony.
  • The Online Safety Bill aims to address issues like racial abuse after sporting events.
  • Barnardo’s, a children's charity, is concerned about the risk of exposure to harmful content for young people.
  • Lynn Perry mentions the need to protect children from online grooming and predatory behaviour.
  • There are concerns about criminal exploitation targeting young people through digital platforms.
  • Lynn Perry works with more than 380,000 children and young people annually.
  • The ability to bring significant evidence of concern through super-complaints is welcomed by organizations representing children's rights.
  • Sanjay Bhandari highlights the need for improved transparency reporting from social media firms.
  • Eva Hartshorn-Sanders is the head of policy at the Centre for Countering Digital Hate.
  • Poppy Wood is the UK director of Reset.tech.
  • The session discusses concerns about tackling online abuse, particularly through direct messaging and end-to-end encrypted platforms.
  • There are significant enforcement powers in the Bill, including director liability and enforcement on companies.
  • Eva Hartshorn-Sanders refers to clause 149 for a review after two years.
  • A report by the Centre for Countering Digital Hate found that 90% of anti-Muslim hate content reported was not acted upon.
  • Poppy Wood addresses algorithmic promotion of harmful content.
  • The risk assessment processes are criticized for not adequately mitigating business models that promote harmful content.
  • There is a need for transparency and independent audits of platforms' reporting data by Ofcom.
  • The "content of democratic importance" clause makes the Bill hard to implement.
  • Facebook took four years to identify and take down a Russian/Iranian-backed page promoting falsehoods in support of Scottish independence.
  • Misinformation on social media poses a real threat to elections and democracies, leading to death threats sent to election officials.
  • Instagram fails to act on 90% of inappropriate content flagged to it.
  • The Online Safety Bill includes provisions for super-complaints and proper complaints procedures (clause 18).
  • Clause 136 requires Ofcom to issue a report within two years on whether researchers should get access to data.
  • Roger Gale apologises for ending the session without further Q&A opportunities.
  • Owen Meredith from News Media Association gave testimony on the need for clarity in the drafting of the journalistic content exemption.
  • Matt Rogerson from Guardian Media Group expressed concerns about ambiguity in the definition of journalistic content and its potential impact on legal but harmful content categories.
  • Comments sections on news publisher platforms are subject to regulation by Independent Press Standards Organisation (IPSO).
  • There is a concern that the definition of journalistic content in the Bill may be too narrow, potentially excluding international issues.
  • Owen Meredith and Matt Rogerson both agree that the current definitions for recognised news publishers in the Bill set a high bar but need clarity.
  • The Online Safety Bill includes protections for journalistic content and recognised news publishers.
  • Clause 50 defines 'recognised news publisher'.
  • Ofcom will develop guidance on the definition of a newspaper against the criteria set out.
  • Tim Fassam, Rocio Concha, and Martin Lewis are providing testimony.
  • The Personal Investment Management & Financial Advice Association (PIMFA) is represented by Tim Fassam.
  • Which? is represented by Rocio Concha, director of policy and advocacy.
  • Martin Lewis from MoneySavingExpert will join the session later.
  • Search engines are not subject to the same duties as social media platforms.
  • Boosted content is user-generated but should be treated like paid-for advertising.
  • Priority illegal content requires transparency reports but fraudulent advertising does not.
  • Roger Gale invites questions from Government Back Benchers.
  • Alex Davies-Jones asks about a right to redress for victims of online scams.
  • Roger Gale is seeking input on whether to include an ombudsman in the Online Safety Bill.
  • Martin Lewis had to sue Facebook for defamation to stop scam adverts.
  • £3 million was provided by Facebook to set up Citizens Advice Scam Action.
  • London Capital & Finance caused individuals to lose over £650 million, expected to reach £1 billion in costs.
  • Tech platforms are suggested to be part of the financial services compensation scheme architecture.
  • The current Bill does not provide an easy tool for users to report fraudulent advertising.
  • There is a suggestion to include complaints from individuals affected by accounts that imply association with legitimate firms but do not directly claim affiliation.
  • Ofcom needs additional resources to enforce the expanded scope of the Bill which now includes fraud and fraudulent advertising.
  • The Online Safety Bill includes provisions to mitigate scam advertisements but does not specify thresholds for what is acceptable.
  • Martin Lewis had to sue Facebook due to the prevalence of scam ads, highlighting the need for clearer enforcement mechanisms.
  • Ofcom can impose fines of up to 10% of global revenue as a deterrent against non-compliance with the Bill's provisions.
  • Roger Gale welcomes Frances Haugen.
  • Frances Haugen is a former Facebook employee.
  • The session is part of the examination related to the Online Safety Bill.
  • AI systems are limited in distinguishing between content of democratic importance and hate speech.
  • Meta did not disclose the number of human moderators hired, indicating a need for mandatory accountability laws.
  • Transparency around processes that keep children under age 13 off platforms is vital for child safety.
  • Facebook could be required to publish estimates on underage users' presence annually.
  • AI struggles to identify harmful content effectively; Facebook's own document showed it only took down 0.8% of violent content.
  • Current standards for what qualifies as journalistic publication are too broad.
  • Anonymity on social media can be exploited for disinformation campaigns.
  • The Committee will meet again on Tuesday 7 June at 9:25 am in Committee Room 14.
  • No specific actions or policy changes are proposed.