
Online Safety Bill - Sitting 8

09 June 2022

Proposing MP: Neath
Type: Public Bill Committee

At a Glance

Issue Summary

Christina Rees and Barbara Keeley discuss amendments and new clauses to the Online Safety Bill aimed at preventing online fraud and misleading advertising. These include ensuring that the fraudulent advertising duties cover advertising that targets users, requiring paid-for advertisements on search engines and Category 2A services to be clearly distinguished from other content, and requiring advertisers to demonstrate authorisation by UK regulatory bodies before their adverts are accepted. The debate also covers the removal of fraudulent advertisements from Category 2A services, misleading adverts for debt help services, the proposed addition of further offences under consumer protection regulations, and the protection of vulnerable individuals online.

Members then turn to animal welfare: the filming and sharing of animal cruelty on social media, the difficulty of distinguishing abusive animal images from content intended to raise awareness about welfare issues, and amendments to bring animal welfare offences within the Bill's definition of illegal content. Christina Rees proposes a new clause requiring Ofcom to report on the use of proactive technology to identify child sexual exploitation and abuse (CSEA) content in private messaging, raises concerns about the Bill's provisions for tackling online child abuse on private channels, and argues for a specific code of practice addressing violence against women and girls. Further discussion covers the use of proactive technologies to identify illegal content without infringing privacy rights, existing and proposed measures to protect children from CSEA, the timeline and process for Ofcom to prepare and issue codes of practice, deadlines for their publication, transparency in enforcement, and written responses submitted by various organisations.

Action Requested

The speaker proposes amendments and new clauses requiring that paid-for advertisements on Category 2A services be clearly distinguished from other content and that advertisements undergo a verification process. The amendments aim to protect users from misleading advertising by requiring providers to use systems that distinguish paid ads from other content, display them with specific labelling, and verify advertisers' authorisation through UK regulatory bodies.

Key Facts

  • Amendment 23 proposes inserting 'that targets users' after 'service'.
  • New clause 5 introduces a duty for Category 2A service providers to clearly distinguish paid-for advertisements from other content.
  • New clause 6 mandates an advertisement verification process for relevant advertisements on Category 2A services, requiring advertisers to demonstrate authorisation by UK regulatory bodies.
  • The Financial Conduct Authority estimates that fraud costs the UK up to £190 billion a year, with 86% committed online.
  • Reported incidents of scams and fraud have increased by 41% since before the pandemic.
  • Three in ten online scam victims felt depressed as a result of being scammed.
  • Paid-for advertisements can target specific geographical locations from anywhere in the world.
  • New clause 5 would ensure search engines provide clear messaging about paid-for advertisements.
  • New clause 6 proposes a duty on search engines to verify adverts before acceptance.
  • Companies must demonstrate authorisation by a UK regulatory body for their ads to appear.
  • Viagogo, a secondary ticketing website, has been breaching consumer protection laws and selling non-existent tickets.
  • STAR (Society of Ticket Agents and Retailers) is responsible for regulating secondary ticketing practices.
  • The Bill aims to tackle fraudulent advertising that affects vulnerable individuals.
  • Clause 3(5)(b) of the Bill specifies that UK users form one of the target markets for the service, regardless of the number of users.
  • New clause 5 raises points about regulating online advertising which are considered outside the scope of this bill and will be addressed via other programs.
  • Amendment 45 seeks to align fraudulent advertising provisions for Category 2A services with those for Category 1 services.
  • Government amendments 91 to 94 are also being discussed.
  • Amendment 44 proposes adding further offences under Part 3 of the Consumer Protection from Unfair Trading Regulations 2008.
  • StepChange reported 72 adverts last year for misleading and harmful practices related to debt advice.
  • 15% of people searching for StepChange and other debt advice charities online are routed away by deceptive adverts.
  • Part 3 of the Consumer Protection from Unfair Trading Regulations 2008 creates offences relating to misleading or aggressive practices.
  • Government amendments 91 to 94 make duties on search firms in clause 35 equivalent to those on user-to-user firms in clause 34.
  • Amendment 44 seeks to add offences under Part 3 of the Consumer Protection from Unfair Trading Regulations 2008 but is rejected due to regulatory overlap concerns.
  • Clause 36 is ordered to stand part of the Bill without amendment 44.
  • Amendments 63 and 64 propose adding safeguards for monitoring and detecting cruelty towards humans and animals in user-to-user services and search services.
  • Amendment 60 seeks to bring offences related to animals within the definition of illegal content under the Online Safety Bill.
  • Amendment 59 adds specific animal welfare offences from various acts to the list of priority offences in Schedule 7.
  • In 2020, there were nearly 500 reports of animal cruelty on social media, more than twice the figure reported for 2019.
  • The majority of these incidents appeared on Facebook.
  • The hon. Member raises concerns about children being exposed to abusive animal images online.
  • Examples given include distressing posts about the Yulin dogmeat festival and beagles used in laboratory experiments.
  • 81% of RSPCA frontline officers think more abuse is being caught on camera.
  • Nearly half think that more cases are appearing on social media.
  • One in five officers said hurting animals to become popular on social media is a main cause of cruelty.
  • Videos posted online include a magpie thrown across the road, a dog kicked by a woman, and cockerels forced to fight.
  • Footballer Kurt Zouma's attack on his cat, filmed and shared on social media, prompted calls for tougher penalties for filming animal abuse and posting it online.
  • Police discovered 182 videos with graphic animal cruelty during an investigation in Burnley.
  • The new clause requires Ofcom to consult organisations with expertise in tackling child sexual exploitation and abuse (CSEA).
  • Ofcom must produce a report within six months of the Act being passed.
  • The report will examine whether proactive technology should be used in private messaging to identify CSEA content.
  • The Online Safety Bill restricts Ofcom's ability to require proactive technology to identify or disrupt online child abuse in private messaging.
  • Research shows that 12 million of the 18.4 million child sexual abuse reports made by Facebook in 2019 related to content shared on private channels.
  • NSPCC data indicates that girls are victims in 83% of recorded offences involving online grooming.
  • Clause 37 requires Ofcom to produce codes of practice on illegal content and fraudulent advertising.
  • Women are disproportionately targeted by online abuse on social media platforms.
  • The End Violence Against Women Coalition, Glitch, Carnegie UK Trust, NSPCC, 5Rights, Professors Clare McGlynn and Lorna Woods drafted a violence against women and girls code of practice.
  • The technology exists to scan data without preventing encryption.
  • AI will identify illegal material and transfer it to the National Crime Agency (NCA).
  • Ofcom is required to justify the use of proactive technologies under new clause 20.
  • Ofcom is required to identify, assess and mitigate CSEA risks.
  • Clause 103(1) now uses 'necessary and proportionate' instead of 'persistent and prevalent'.
  • Accredited technologies can be updated by Ofcom at any time as needed.
  • Codes of practice must cover measures for compliance with relevant duties, including illegal content and safety concerns for children and adults.
  • The amendment proposed by Alex Davies-Jones would require Ofcom to prepare draft codes of practice within six months.
  • Chris Philp states that £88 million has been provided in last year's spending review for Ofcom to cover this and next financial year (2022-23 and 2023-24).
  • After these years, Ofcom will fund itself through raised fees.
  • The Bill gives Ofcom legal power to impose fees on regulated companies under clauses 70-76.
  • Labour supports clause 42 but urges Ofcom not to delay the process of publishing or amending codes of practice.
  • The audio-visual media services directive became UK law in September 2020 and came into force in November 2020, with formal guidance issued by October 2021.
  • Labour is concerned about subsection (6) of clause 43 regarding the Secretary of State's powers to prevent disclosure for national security reasons.
  • Clause 45 allows service providers to take alternative measures if they comply with recommended measures in codes of practice.
  • Violence against women and girls, including online abuse and harassment, is not mentioned in the Bill.
  • Ofcom has to designate priority categories of harmful content through secondary legislation.
  • Ofcom needs time for consultation exercises which are expected to last 12 months rather than six months.
  • The road map document, outlining timelines and plans, will be published before the summer recess.
  • Badger Trust submitted OSB61.
  • Lego submitted OSB62.
  • End Violence Against Women Coalition (EVAW) submitted OSB63.
  • Hacked Off Campaign submitted further material regarding clause 50 as OSB64.
  • Office of the City Remembrancer, on behalf of the City of London Corporation and City of London Police, submitted OSB65.
  • Juul Labs submitted OSB66.
  • Big Brother Watch, ARTICLE 19, Open Rights Group, Index on Censorship, and Global Partners Digital submitted OSB67.
  • News Media Association submitted supplementary material as OSB68.