TL;DR
👉 HR teams experience "search anxiety" - knowing they might miss critical information
👉 Surface web searches can be subject to unconscious bias and can miss potential risk signals - certainly those hidden on the deep/dark web
👉 KCSIE 2025 recommends online searches during candidate vetting but provides no structured framework
👉 Schools lack safe, affordable tools to conduct comprehensive online vetting
Statutory Guidance vs. Blind Spots
KCSIE 2025 advises:
“Schools and colleges should consider carrying out an online search as part of their due diligence on shortlisted candidates. This may help identify any incidents or issues that have happened and are publicly available online…” (KCSIE 2025, Part 3: Safer Recruitment)
That “online search” phrase is broad and very difficult to deliver against.
What it means in practice for schools is “search the surface web”. That’s the 5% of the internet that is indexed by search engines.
For entirely legitimate reasons, the deep and dark web (the other 95% of the internet) have remained beyond the skill and budget of schools. This unregulated, unindexed, easy-to-hide digital black hole is where some of the highest-risk behaviours often lurk: child sexual exploitation, radicalisation, hate speech, extremist associations, and plenty more.
The Anatomy of Search Anxiety
A typical online check during recruitment consists of entering a candidate's name into a search engine - most likely Google - maybe adding a location; skimming the first few pages; possibly clicking through to a few social media platforms: Facebook, Instagram, TikTok, LinkedIn. If nothing shows up, the box gets ticked.
School HR professionals generally lack training in open-source intelligence (OSINT) techniques.
Typing a name into a search engine (for most of us it will be Google) can return too much noise - irrelevant information with inconclusive associations or same-name red herrings - or nothing at all, creating a subjective evaluation process.
The mental burden mounts: "Have I found the right person? Have I missed something? Should I keep looking?"
This unstructured approach inevitably creates uncertainty - what we call "search anxiety" - particularly when the process feels like merely checking a compliance box without proper guidelines.
The anxiety intensifies with the knowledge that significant risks likely exist in online spaces beyond your search capabilities, leaving you with the uncomfortable awareness that your investigation is fundamentally incomplete from the start…
Why the Matthew Smith case should give us pause
In 2022, Matthew Smith became deputy head of pastoral care at Thomas’s Battersea - a school attended by Prince George and Princess Charlotte. Months after being hired and settling in, Smith was arrested for sharing child abuse material on the dark web and for paying £65,000 over the internet for the sexual abuse of children in India, offering payments in exchange for indecent images and videos. He was found in possession of over 120,000 indecent images. Smith was jailed for 12 years.
DBS checks didn’t flag him. References were likely positive. And while Smith’s offending was eventually uncovered by the National Crime Agency, it’s worth asking: “could any school HR process have spotted this?” Probably not - not if it relied solely on DBS checks and a Google search.
Much of Smith’s activity likely took place in online spaces invisible to standard searches: encrypted chat rooms, forums, and the dark web. These are the places school HR teams cannot reach, and they mark the gulf between what compliance guidance asks for and the reality of where harm signals often live.
More reading on this case:
🔗 Crown Prosecution Service - Sentence
🔗 BBC - East Dulwich: Primary school teacher jailed for child abuse
🔗 ITV - Paedophile teacher at Prince George's old London school jailed over child sex abuse
From Anxious Guess to Firm Framework
Search anxiety isn't a sign of incompetence - it's evidence that the current process is no longer fit for purpose.
Closing this gap requires action on several fronts:
👉 Senior leadership must recognise that effective online vetting goes well beyond basic Google and social media searches.
👉 We need innovation in ethical risk detection - tools that lawfully surface relevant red flags from deeper online layers while preserving privacy and eliminating bias.
👉 These services should be widely available across safeguarding-critical sectors: education, social care, sports organisations, healthcare, charities, and NGOs.
👉 Regulators and government bodies must get involved earlier to help shape statutory guidance, rather than reacting with ambiguous guidance after harm has occurred.
Wrap-up / POV
Search anxiety isn’t a sign your HR or safeguarding team is failing - it shows they've been given a job without the proper tools.
The surface web is only a fraction of the internet. The deep and dark web harbour some of the most serious risks - areas current school processes, skillsets and tools cannot safely or affordably access.
To truly protect children and communities, we must stop pretending a quick Google search suffices. This means reimagining guidance, embracing innovation, and building safeguarding processes that see further, sooner, and more fairly than our current capabilities allow.




