TL;DR
👉 An estimated 95% of online content isn't indexed by standard search engines.
👉 The US government invented the dark web - for all the right reasons.
👉 The deep and dark web often hide content tied to child exploitation, grooming, fraud, extremist activity, and reputational risk.
👉 Standard background checks miss these hidden signals - more should be done to raise awareness of the dark web.
Where it all began…
Ironically, the dark web’s roots are in government research…
In the mid-1990s, the US Naval Research Laboratory - led by scientists like Paul Syverson, David Goldschlag, and Michael Reed - developed "onion routing" technology to protect military communications through layers of encryption – similar to the layers of an onion.
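The core idea is simple: the sender wraps a message in one encryption layer per relay, and each relay peels off exactly one layer, so no single relay sees both who sent the message and what it says. A minimal sketch of that layering (using a toy XOR "cipher" purely for illustration - real onion routing uses proper public-key cryptography, and the relay keys here are made up):

```python
from itertools import cycle

def xor_layer(data: bytes, key: bytes) -> bytes:
    # Toy symmetric "cipher" (XOR keystream) - NOT real encryption.
    # Applying the same key twice removes the layer, which is all
    # we need to illustrate wrapping and peeling.
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

def wrap(message: bytes, relay_keys: list[bytes]) -> bytes:
    # The sender adds layers innermost-first, so the first
    # relay's layer ends up on the outside of the "onion".
    for key in reversed(relay_keys):
        message = xor_layer(message, key)
    return message

def route(onion: bytes, relay_keys: list[bytes]) -> bytes:
    # Each relay in turn peels exactly one layer; only the
    # final relay recovers the plaintext.
    for key in relay_keys:
        onion = xor_layer(onion, key)
    return onion

# Hypothetical three-relay circuit.
keys = [b"relay-one", b"relay-two", b"relay-three"]
packet = wrap(b"hello from the sender", keys)
assert packet != b"hello from the sender"          # wrapped packet is obscured
assert route(packet, keys) == b"hello from the sender"  # all layers peeled
```

The point of the sketch is the structure, not the cipher: intermediate relays only ever handle a still-wrapped packet, which is what gives onion routing its anonymity.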
This research evolved into The Onion Router (Tor) project, whose first version went live in 2002. The US Navy released Tor as a free, public tool in 2004, and by 2006 it had transformed into a non-profit initiative supported by human rights organisations and the Electronic Frontier Foundation.
The original purpose was noble: providing secure communications for journalists, activists, and dissidents living under oppressive regimes.
However, Tor's powerful anonymity features became a double-edged sword. While protecting legitimate users seeking privacy, it also attracted those with criminal intent. The dark web now hosts not just drugs and fraud operations, but also child abuse rings, hate forums, and terrorist propaganda.
One study estimated that roughly 40% of dark web sites host illicit content – everything from illegal pornography to hitman services. Another analysis found 57% of dark web sites involved illicit material like child exploitation, weapons, or extremist communications.
This duality became starkly evident in 2011 with the emergence of 'Silk Road' – a dark web marketplace where Tor's anonymity combined with Bitcoin payments to create the internet's most notorious black market. Before the FBI shut it down in 2013, Silk Road had facilitated over a billion dollars in transactions for drugs, counterfeit documents, and other illegal items, demonstrating how rapidly legitimate technology can be repurposed for criminal enterprise.
…new versions of the Silk Road are still popping up all the time.
Further reading:
🔗 Policing the Dark Web (Wikipedia)
So what’s all this got to do with recruiting in schools?
Recruiting in education goes beyond qualifications or experience - it is fundamentally about granting trust.
The dark web serves as a refuge for child sexual exploitation, grooming, sextortion gangs, and other criminal activity targeting children. Granting trust without proper due diligence into this dark and elusive corner of the internet therefore poses significant risks.
Consider these alarming statistics:
The UK's National Crime Agency and Home Office estimate tens of thousands of UK-based perpetrators are active online, with millions of dark web accounts globally involved in child sexual abuse.
Ofcom confirms that online grooming, abuse, and the sharing of child sexual abuse material (CSAM) are growing rapidly.
The Internet Watch Foundation (IWF) has identified a significant and growing threat from AI technology being exploited to produce child sexual abuse material (CSAM).
The IWF's report "AI and the Production of Child Sexual Abuse Imagery" highlights a critical new dimension to this crisis: as technology evolves, so do perpetrators' methods, making collective vigilance and proactive safeguarding measures more important than ever.
Further reading:
🔗 UK Home Office – Crime and Policing Bill: child sexual abuse material factsheet
🔗 IWF - AI and the Production of Child Sexual Abuse Imagery
Why ethical technological innovation must now step up
Most schools do everything that is asked of them: DBS checks, references, social media reviews, safer recruitment training. But the game has changed, and now some of the most serious reputational and safeguarding threats exist online - just not on the internet we can see.
This is the gaping blind spot.
The deep and dark web are where pseudonymous identities thrive. Where abusers and radicals organise. Where individuals who present well on paper - and in person - may actually be someone else entirely, hidden behind layers of anonymity.
The challenge?
Many school leaders, Designated Safeguarding Leads, HR teams, and regulators remain unaware that this risk layer exists - let alone how to access or assess it.
This must change.
We need:
👉 Greater awareness among senior leadership that effective "online vetting" extends far beyond a simple Google search.
👉 More innovation in ethical risk detection - tools that respect privacy, avoid overreach, detect bias, and focus specifically on role-relevant behaviours and harm signals.
👉 Wider availability of these emerging services across safeguarding-critical sectors: education, social care, healthcare, and charity leadership.
👉 Early involvement of regulators and government bodies to help co-champion the evolution of statutory guidance - rather than merely reacting after harm occurs.
Final Thought
Adapting to the evolving landscape of online risk must become a leadership priority. Let's work together to ensure our systems, regulations, standards, and statutory requirements keep pace with these critical changes.