Miscellaneous
24 February 2026
Lead MP
Layla Moran
Debate Type
General Debate
Tags
No tags
Other Contributors: 60
At a Glance
Layla Moran raised concerns about protecting children from harmful online content in the House of Commons. A government minister responded. Other MPs also contributed.
How the Debate Unfolded
MPs spoke in turn to share their views and ask questions. Here's what each person said:
Lead Contributor
Opened the debate
Initiated a debate on the need for urgent legislation to protect children from harmful online content. Emphasised the importance of age-appropriate regulation, cross-party consensus, and public consultation.
Jim Shannon
DUP
Strangford
Expressed support for Layla Moran's initiative and for phone-free schools in Northern Ireland, citing Education Minister Paul Givan's pilot scheme, and advocated measures to prevent children from being harassed and exposed to inappropriate content.
Anna Dixon
Lab
Shipley
Agrees with the need to protect children but suggests a full public consultation is necessary, citing examples such as Australia's approach through groups like “36 Months”.
Gareth Snell
Lab Co-op
Stoke-on-Trent Central
Questioned the procedural validity of debating an unpublished Bill for Second Reading. Inquired about the substance of what is being debated, given that a detailed Bill has not yet been published.
Andrew Cooper
Lab
Mid Cheshire
Acknowledged the importance of protecting children online but expressed concern over the focus on age-gating rather than addressing algorithmic addiction and reinforcing prejudices. Questioned the efficacy of a 16-year-old age limit.
Lola McEvoy
Lab
Darlington
Expressed concerns about the practical implementation of an age-appropriateness register for content, including issues around self-published material and management costs.
Paul Holmes
Con
Hamble Valley
Argued that the Liberal Democrat motion lacks detailed proposals compared to a previous Conservative motion. Questioned the rationale behind seeking control of the Order Paper for future debate without presenting substantive legislation first.
Caroline Nokes
Con
Romsey and Southampton North
Asserted that Members will be asked to vote on an orderly motion, not detailed proposals. Emphasised the separation between voting on procedural motions and debating substantive Bills.
Chi Onwurah
Lab
Newcastle upon Tyne Central and West
Called for clarity in debate substance given that the motion focuses primarily on process rather than policy. Advocated for discussions about protections for children, not procedural details.
Ellie Chowns
Green
North Herefordshire
Chowns questions why the motion is being rushed without substantive proposals or consultation. She suggests that the government should emphasise hearing parents' voices rather than pushing through potentially flawed legislation.
Kirsty Blackman
SNP
Aberdeen North
Blackman expresses confusion about the substance of the proposed Bill, questioning whether it is based on proposals from the House of Lords or a new set of proposals. She calls for clarity regarding the content and necessity of such legislation.
Paul Holmes
Con
Hamble Valley
Holmes questions why the Liberal Democrats did not publish their drafted Bill before proposing it, highlighting the need for transparency and principled debate on policy proposals.
Bobby Dean
Lib Dem
Carshalton and Wallington
Dean addresses confusion in the Chamber about the clarity of Liberal Democrats' proposals. He suggests that while the official opposition may agree with substance, they are contorting themselves to find reasons not to support the motion.
Anna Dixon
Lab
Shipley
Dixon emphasises the importance of consultation in finding effective solutions and criticises rushing legislation when a longer process is necessary for ensuring long-term effectiveness. She supports using the consultation process rather than moving forward hastily.
Joy Morrissey
Con
Beaconsfield
Morrissey stresses the importance of thorough consultation outside Parliament before bringing forward legislation, arguing that this is the proper procedural way to debate and assess Bills. She calls for a long consultation process followed by joint work on the Bill.
Wera Hobhouse
Lib Dem
Bath
Hobhouse requests clarification from Caroline Nokes regarding procedural aspects of the motion, particularly concerning the scope and timing of the proposed legislation.
Kanishka Narayan
Lab
Vale of Glamorgan
Responding to the motion, Kanishka Narayan argued against both the procedural approach and the substantive content proposed by the Liberal Democrats. He emphasised that the Government's approach involves a short, sharp consultation allowing diverse voices to be heard, including those of children themselves. He highlighted the Online Safety Act 2023 as a foundational step, complemented by further actions such as criminalising non-consensual intimate images and setting legal duties on tech companies to remove such content within 48 hours, and urged the Liberal Democrats to engage with the consultation process.
Wera Hobhouse
Lib Dem
Bath
During an intervention, Wera Hobhouse emphasised the urgency of taking action based on the requests from constituents and raised concerns over perceived delays in implementing necessary measures for online safety.
Victoria Collins
Lib Dem
Harpenden and Berkhamsted
In an intervention, Victoria Collins highlighted the Liberal Democrats' efforts to push for concrete proposals on online safety over the past few years, aiming to work collaboratively towards solutions.
Caroline Voaden
Lib Dem
South Devon
During interventions, Caroline Voaden questioned the Minister about the timing of launching the consultation and requested a clearer timeline for future legislative actions post-consultation.
Anna Dixon
Lab
Shipley
In an intervention, Anna Dixon emphasised the importance of incorporating research findings, such as those from the Born in Bradford study, into decision-making to address known harms associated with social media use among children.
Natasha Irons
Lab
Croydon East
During an intervention, Natasha Irons highlighted recent actions taken by the Government and raised concerns about content on platforms like YouTube that may not fit neatly into existing definitions of social media, questioning if such platforms will be scrutinised in the consultation.
Ellie Chowns
Green
North Herefordshire
In an intervention, Ellie Chowns conveyed widespread parental and youth concerns about online harms from social media engagement. She urged for quicker action on launching the consultation and bringing forward legislation to address these issues.
Caroline Nokes
Con
Romsey and Southampton North
Defended her party's stance on the importance of consultation over immediate legislation, highlighting concerns about the Liberal Democrats' approach. She emphasised that while protecting children online is vital, rushing through legislation without proper scrutiny could lead to unintended consequences.
Julia Lopez
Con
Hornchurch and Upminster
Stressed the importance of protecting children from online harms but criticised the Liberal Democrats' approach as a distraction. She highlighted that an amendment in the Lords proposing no child under 16 should have access to harmful social media will return for debate, having gained cross-party support.
Natasha Irons
Lab
Croydon East
Emphasised that the consultation process is about seeking consensus with parents and children outside of Parliament. She advocated for a holistic approach to digital childhood rather than solely focusing on banning social media for under-16s.
Kirsty Blackman
SNP
Aberdeen North
Questioned the practicality and scope of proposed bans, particularly noting that a Lords amendment would not apply in Scotland. She highlighted inconsistencies in the Conservatives' previous actions regarding online safety legislation.
Gareth Snell
Lab Co-op
Stoke-on-Trent Central
Raised concerns about the impact of potential bans on young people's ability to connect with friends, especially where alternatives such as youth clubs are lacking. He questioned how enforcement would be managed and who would bear responsibility if parents cannot enforce the ban.
Sam Carling
Lab
North West Cambridgeshire
Highlighted the importance of listening to constituents' concerns about proposed solutions and engaging with the consultation process rather than criticising without offering alternatives.
Wera Hobhouse
Lib Dem
Bath
Challenged Conservative MPs on their stance, questioning why they cannot support the motion for a more immediate approach to legislating against online harms.
Caroline Nokes
Con
Romsey and Southampton North
Informs the House that the debate must conclude by 7 o'clock and mentions there are more than ten Members who wish to speak, indicating she will allow wind-ups from 6:40 pm.
Chi Onwurah
Lab
Newcastle upon Tyne Central and West
I am grateful to the Liberal Democrats for bringing forward this debate on protecting children from online harms, although I remain uncertain as to the measures they are proposing. This debate is happening up and down the country, in homes and at school gates—indeed, wherever people gather—so it is right that we debate it here. If the Conservatives had done something during their critical 14 years of power, our children would be better protected now, but they did not, so it falls to us to take action. I am going to speak about three things: online platforms, their history and approach; the work of my Select Committee, the Science, Innovation and Technology Committee, on algorithms; and the work of the Committee on digital childhood, all within the context of protecting children from online harms.
Chi Onwurah
Lab
Newcastle upon Tyne Central and West
The key online players range in age from pre-teen—TikTok was founded in 2016—to their late 20s, as Google was founded in 1998. In human terms, these platforms are just entering or leaving adolescence, and it shows. As hon. Members across the House may have heard me mention, I am an engineer—chartered, as it happens; thanks for asking—and my last job before entering this place was head of telecoms technology for Ofcom. I remember meeting people from a US platform around 2005 who stated they could not understand why discussions were still ongoing about Government involvement in their activities. Unfortunately for all of us, the Conservative-Lib Dem Government of 2010 and their successors shared the view that Government should not be a part of it, which is how we arrived in 2024—20 years later—without online harms regulation, while at the same time the use of social media has exploded.
Chi Onwurah
Lab
Newcastle upon Tyne Central and West
I support a consultation to improve our understanding of AI impacts. Parents are in a difficult position and should not have to be technology experts, but unfortunately there is so much pressure in the online world that they seem to need to be. The number of social media users has gone from practically zero to four fifths of the population. I have worked with the Molly Rose Foundation, a charity established by the Russell family after their daughter took her own life at 14 following exposure to self-harm content online; I have spoken to bereaved parents of children bullied to death online, and to the Internet Watch Foundation about horrific images of child exploitation. The fact that the Conservatives did nothing in all those years is, in my view, political negligence.
Chi Onwurah
Lab
Newcastle upon Tyne Central and West
Our Committee’s inquiry found that our online safety regime should be based on principles that remain sound in the face of technological development. Social media has many important contributions but also significant risks, which can evolve with technology. We identified five key principles for building public trust: public safety, free and safe expression, responsibility on the part of platforms, control, and transparency. The Committee recommended regulating the advertising-based business model so that amplification would not be incentivised in harmful ways.
Chi Onwurah
Lab
Newcastle upon Tyne Central and West
The Science, Innovation and Technology Committee will hold a session on social media age restrictions to feed into the Government’s consultation on measures to keep children safe online. We will hear from experts and representatives of those with direct experience of harms, seek evidence from Australia, and gauge the strength of evidence for and against an age-based ban on social media. Our larger inquiry on the neuroscience of digital childhood aims to find out how young people spending their formative years online affects their brains and what actions Governments should take.
Chi Onwurah
Lab
Newcastle upon Tyne Central and West
I call myself a tech evangelist, but I also know what an AI engineer costs: over £100,000 a year. Tech companies will not put them to work on protecting children unless incentives are in place. The alternative to good regulation is not no regulation but bad regulation. More regulation is coming, with new rules from US states and Spain's Prime Minister describing social media as a 'failed state'.
Monica Harding
Lib Dem
Esher and Walton
I have four children born between 2004 and 2011. Facebook began in 2004; TikTok began in 2016. If this evolution were the industrial revolution, we would be at around the spinning jenny stage, with AI chatbots as the next destination. Those chatbots are terribly dangerous for our children and need to be regulated now, within the Online Safety Act.
Matt Rodda
Lab
Reading Central
My hon. Friend, who chairs the Select Committee, is making an excellent speech with detailed exploration of these issues. Part of the challenge here is that we as parents are struggling to catch up with this revolution, which is gaining speed all the time. Perhaps my hon. Friend would highlight some of the challenges that parents face.
Bobby Dean
Lib Dem
Carshalton and Wallington
The greatest minds in the world are now working out the circuitry of our brains and driving content towards us so that we look at screens for longer to sell more ads. Does she agree that one of those principles should relate not only to content but also to the addictive nature of these platforms?
Natasha Irons
Lab
Croydon East
Channel 4 is regulated by Ofcom, responsible for the editorial standards of its content and beholden to them. Does she agree that we should hold social media companies just as responsible for the content they put out on their platforms as any broadcaster?
Danny Chambers
Lib Dem
Winchester
Paid tribute to the staff at Leigh House in Winchester for caring for people with eating disorders. Stressed that AI chatbots, which are often integrated into social media, can provide mental health support but also pose significant risks, such as giving advice on how to lose weight or gain access to harmful drugs for individuals with eating disorders. Highlighted research indicating that children may not understand the nature of chatbots and could be receiving potentially dangerous medical advice from them. Recommended focusing on principles rather than specific regulations due to rapid development in AI technologies. Emphasised the importance of regulating AI chatbots by requiring regular reminders that they are not human or qualified to give medical advice, similar to the US's GUARD Act. Urged swift action to prevent harm caused by unregulated chatbot interactions.
Emily Darlington
Lab
Milton Keynes Central
This week is Eating Disorders Awareness Week, and we must remember the acceleration of online harms. We have heard horrific accounts of ChatGPT giving young people diets of 600 calories per day. The promotion of such content is now a category 1 offence. Protecting our children and young people online is extremely important. The Online Safety Act was an important step forward, but it has not been fully implemented by Ofcom, it is not proactive enough, and it is too dependent on what social media companies themselves tell Ofcom.

I have done my own consultation with more than 500 14 to 16-year-olds across my Milton Keynes Central constituency. Some 91% of them have a phone, and 80% have social media profiles. However, young people consider YouTube and Roblox to be social media, and neither is covered by the Australian model. Additionally, 74% of those 14 to 16-year-olds spend two to seven hours online a day. We know this from the science; to be clear, it is not an opinion.

A ban is a blunt tool that essentially raises the flag of surrender to social media platforms and declares that there is no way of making social media safe. That is essentially what the Conservatives did when the Online Safety Act 2023 was passed: they said, "We cannot go far enough, so we are going to roll back. It is about free speech." No, it is not about free speech. Freedom of speech was written into law in this country and spread around the world, so we understand how to protect it and limit its harm. A ban would also create a cliff edge at 16: all of a sudden the protections no longer apply, and young people go into a world that is not safe.

We must fully and properly implement the Online Safety Act 2023, and that must be done at speed. We need to make safe spaces for children online, ensure content is tied to ratings we already understand as parents, consider in-app purchases for young children, change addictive platform algorithms, and talk to those behind iOS and Android.
Christopher Vince
Lab Co-op
Harlow
Mr Vince made a friendly intervention, raising concerns about the impact of social media on people with eating disorders during Eating Disorders Awareness Week.
Sam Carling
Lab
North West Cambridgeshire
Commends the hon. Member for Twickenham on bringing forward a debate about online harms. Describes social media as a 'wild west' due to its harmful content, grooming of children, addictive features, and fake AI-generated content. Raises concerns over the inefficacy of blanket bans aimed at young people and suggests a functionality-based approach instead. Advocates for better enforcement of age restrictions and clearer definitions of what constitutes social media.
Judith Cummins
Lab
Bradford South
Advocates for keeping contributions short to ensure all members can speak, advising hon. Members to limit their speeches to between five and six minutes.
Claire Young
Lib Dem
Thornbury and Yate
As a society, we are raising the first generation of children who spend less time outdoors than prisoners do. Ministry of Justice guidelines state that all prisoners in the UK should have a minimum of one hour in fresh air each day, yet research shows many children do not meet even this threshold due to screen usage. The Centre for Social Justice found up to 800,000 children under five are using social media. Adolescent social media use is three to five hours daily, growing by 50% over the past decade. Children encounter harmful content frequently and face declining attention spans, disrupted sleep, and reduced engagement with the physical world due to excessive screen time. It is crucial to regulate access to harmful social media for under-16s while maintaining access to helpful platforms such as Childline. A film-style age rating system could address these concerns.
Sojan Joseph
Lab
Ashford
The gradual increase in mental health conditions among young people is partly due to social media, although austerity and cuts to NHS services are also factors. Social media can cause depression, anxiety from cyber-bullying, and exposure to harmful content. Locking phones away at school improves academic performance and behaviour; some schools use sealed pouches or brick phones for students. Local studies show positive impacts on concentration and social interaction when smartphones are locked away during the day. The play 'Generation FOMO' highlights the impact of smartphones and social media, and its performances have been well-received by teachers, young people, and parents. I welcome the Government's announcement of a consultation to explore further measures for child safety online.
Liz Jarvis
Lib Dem
Eastleigh
Expresses concern about the lack of regulation and oversight by tech companies regarding children's online safety. She highlights issues such as harmful content, addictive algorithms, mental health impacts, and inadequate enforcement measures in schools. Emphasises the need for cross-Government approaches to mental health support, expanded public services response, and rigorous examination of age verification systems to protect privacy.
Gareth Snell
Lab Co-op
Stoke-on-Trent Central
I will confine my comments to three themes: policy, the impact of social media on young people's access to information and news, and procedure. The debate has highlighted concerns about social media's influence on children, families, and society. As a parent, I understand the challenges of monitoring what my 15-year-old daughter does online. Instead of banning all social media, we need to help young people navigate misinformation and regulate content creators who spread hate and division. Disconnecting young people from social media is not a solution; it does them a disservice, as they draw much of their information from these platforms. We must also consider the future impact on employment, where technology use is essential. Lastly, I argue against the proposed rapid legislative process: complex legislation requires thorough consultation.
Christopher Vince
Lab Co-op
Harlow
Young people at Mark Hall Academy in Harlow expressed concerns about a potential social media ban, emphasising the importance of platforms like WhatsApp and highlighting that not all social media is perceived as problematic by young people. It is crucial for these voices to be heard during the Government's consultation process.
Scott Arthur
Lab
Edinburgh South West
I argue that proposing a rapid legislative process is impractical and anti-democratic. Constituents should have the opportunity to influence our thinking and voting on such matters through discussions both in the Chamber and with them directly.
Susan Murray
Lib Dem
Mid Dunbartonshire
Murray highlighted the dangers of social media platforms that track users' behaviour and promote extreme content for profit. She urged the Government to work with the Liberal Democrats to introduce age ratings for social media, arguing that this would help keep children safe online. Murray emphasised the importance of accountability in the online space and called for a clear framework to ensure companies are held responsible for their actions.
Kirsty Blackman
SNP
Aberdeen North
Blackman expressed frustration at the lack of understanding among MPs of children's access to social media. She criticised both the Government and the Liberal Democrats for their inconsistent positions on the issue, highlighted concerns about the amendment proposed to the Children's Wellbeing and Schools Bill, and called for more consultation and expertise in addressing the online dangers children face.
Caroline Voaden
Lib Dem
South Devon
Calls for urgency in legislative change to protect children online, citing lack of government action since the Secretary of State's announcement. Points out that children face significant risks from online harm, including grooming and exposure to violent content. Cites a 2025 survey by Internet Matters indicating two-thirds of children experience harm online, with one-fifth encountering violent content and over a quarter contacted by strangers. Advocates for age ratings similar to film classifications for social media platforms, banning harmful social media for under-16-year-olds. Emphasises the importance of balancing protection from online dangers with access to beneficial online spaces, such as support forums for bereaved children.
Mike Martin
Lib Dem
Tunbridge Wells
Intervenes to highlight local efforts in Tunbridge Wells where secondary schools are smartphone-free and primary schools are discussing similar measures with parents.
Victoria Collins
Lib Dem
Harpenden and Berkhamsted
Ms Collins expressed her shock at the procedural discussions surrounding online safety measures, noting that issues related to social media harms have been discussed for years. She highlighted specific instances of student concerns about mental health impacts of social media use, such as self-harm and misogyny. Emphasising the need for action, she noted that awareness has grown over time, with experts such as Dr Kaitlyn Regehr highlighting the reactive nature of current legislation like the Online Safety Act. She advocated a proactive approach with stricter age restrictions on social media platforms, proposing a minimum age of 16 or 18 for social media use and film-style age ratings to hold tech companies accountable.
Ian Murray
Lab
Edinburgh South
The use of a procedural motion for this serious debate is unfortunate. The debate highlights the importance of proper consultation to allow consideration of various views, given the contradictory statistics about children's online experiences and risks. He emphasises that while most 12-17 year olds benefit from being online, there are significant issues with child sexual exploitation and abuse, including an 860% increase in obscene publication offences over a decade. The Online Safety Act requires services to assess the risks of child sexual abuse material, but he criticises those who want to dismantle it.
Gareth Snell
Lab Co-op
Stoke-on-Trent Central
Said he uses Snapchat only to communicate with his daughter and supports stricter measures to protect children online. He emphasised the importance of getting procedural motions right for such serious issues, and stressed that parents should not feel alone in making decisions about their children's online safety.
Kanishka Narayan
Lab
Vale of Glamorgan
He said that secondary legislation will be introduced within months to implement the outcomes of the consultation, ensuring that action can be taken quickly through a primary legislative vehicle, and defended the Government's approach to tackling online safety issues.
Chi Onwurah
Lab
Newcastle upon Tyne Central and West
She expressed disappointment at the absence of Reform Members from the debate, suggesting a lack of commitment to protecting children from obscene abuse, and emphasised the need for serious action to protect children online.
Government Response
The Government are committed to engaging in a short, sharp consultation to hear diverse voices including children themselves. They have already taken steps like criminalising non-consensual intimate images and will introduce legislation by summer to remove harmful content within 48 hours of reporting.