For Parents · Prevent · Online Safety

Online Radicalisation: How Extremist Content Reaches Young People — and What Parents Can Do

Extremist groups have become sophisticated users of social media, gaming platforms and encrypted messaging apps. This parent-focused guide explains the mechanics of online radicalisation, the platforms most frequently exploited, and the practical steps families can take to protect young people.

✍️ By The Safeguard Hub Team 📅 April 2026 · Last reviewed April 2026 ⏱ 11 min read Part of The Safeguard Hub Articles Series

Key finding: The Internet Watch Foundation and Home Office analysis found that online extremist content plays a role in a growing proportion of UK Prevent referrals. Ofcom (2024) found that 4% of 8–17 year olds have encountered "violent extremist content" online — rising to 7% of 16–17 year olds.

How Online Radicalisation Works

Online radicalisation rarely follows a single pathway. Instead, it involves a gradual process of exposure, engagement and ideological adoption. Researchers describe the following common patterns:

  • The "rabbit hole" effect: Recommendation algorithms on YouTube, TikTok and similar platforms can expose young people to progressively more extreme content after initial engagement with edge content. A search for historical conflict can lead step-by-step to extremist propaganda.
  • Grooming by recruiters: Just as with sexual exploitation, extremist recruiters actively seek out vulnerable young people in online spaces — gaming platforms, Discord servers, forums. They build a relationship before introducing ideology.
  • Echo chambers: Private Telegram channels, Discord servers and forums create closed ideological spaces where extreme views are normalised and challenge is absent.
  • Memes and "edgy" content: Extremist groups deliberately use humour, irony and meme culture to introduce radicalising ideas in a non-threatening format — particularly effective with young men seeking in-group identity.

Platforms Most Frequently Exploited

Higher Risk Platforms

  • Telegram: Encrypted channels with minimal moderation — a primary hub for far-right and Islamist extremist material in the UK
  • Discord: Server-based communities that are difficult to monitor — recruiters use them to move young people from mainstream platforms into more extreme spaces
  • 4chan / 8chan: Anonymous imageboards where extremist and incel content proliferates; a radicalising influence for some young men
  • YouTube: Despite moderation, the recommendation algorithm can lead viewers from gateway content toward extremist material

Gaming and Other Platforms

  • Online gaming (voice chat): In-game voice channels are largely unmoderated — recruiters use them to build relationships with vulnerable players
  • TikTok/Instagram: Algorithmic feeds can expose users to radicalising "edgy" content, particularly material linked to incel ideology and the far right
  • Rumble/Odysee: Alternative video platforms with lighter moderation, hosting content removed from YouTube

What Parents Should Watch For

  • Secretive online activity, particularly in encrypted apps or on platforms you don't recognise
  • New online "friends" who are significantly older or whose identity cannot be verified in real life
  • Increasingly extreme views expressed in conversation — including dehumanising language about groups
  • Interest in weapons, conflict or extremist groups — expressed in writing, artwork or online activity
  • Withdrawal from previous friends and interests in favour of exclusive online community
  • Sharing or laughing at memes that mock or dehumanise minority groups

What Parents Can Do

  1. Have the conversation early: Discuss extremism and terrorism in age-appropriate terms before exposure occurs. Young people who have a framework for critical thinking are more resistant to radicalising narratives.
  2. Know what platforms they're using: Ask, don't demand. Regular check-ins about online activity — who they talk to, what content they enjoy — are more effective than surveillance.
  3. Enable parental controls and content filters: Router-level filters (BT Parental Controls, Sky Broadband Shield), device-level controls (Apple Screen Time, Google Family Link), and platform safety settings all provide layers of protection.
  4. Teach critical thinking: Challenge conspiracy theories and "us vs them" narratives gently. Ask "where did you hear that?" and model evidence-based reasoning.
  5. Report online extremist content: The Home Office's ACT Early website (actearly.campaign.gov.uk) allows confidential reporting of concerns about a person. Suspected terrorist or extremist material can also be reported via the gov.uk online reporting tool or directly to the platform itself.

Sources: Ofcom (2024). Children's media use and attitudes report 2024. ofcom.org.uk. | Home Office (2023). Prevent Duty Guidance for England and Wales. gov.uk. | Internet Watch Foundation (2024). IWF Annual Report 2024. iwf.org.uk. | CREST (2023). The Gateway: Experiences of online radicalisation. crestresearch.ac.uk. | Ofcom (2023). Online Safety Act 2023: Illegal content duties. ofcom.org.uk. | ACT Early, Home Office (2024). actearly.campaign.gov.uk.


Related Resources

Prevent & Radicalisation Hub → · Parents' Corner → · Online Grooming Hub → · All Articles →