You Can’t Triage Terror with Tick-Boxes
- eross435
- Aug 26
Make it as simple as possible—but not simpler
On 26 August 2025, the UK's Safeguarding Minister admitted something many frontline workers already knew: the main tool used to decide which domestic abuse victims get urgent help, the DASH (Domestic Abuse, Stalking and Harassment and Honour-Based Violence) questionnaire, “doesn’t work.”
Since 2009, this 27-question yes/no checklist has been used by police, health services, and social workers to assess safety. But research shows it often fails to spot the most serious cases, wrongly classifying them as low or medium risk. That means victims are left without protection—and in some cases, families are left grieving.
Start with People, Not Paperwork
Real safety doesn’t live in tick-boxes—it lives in relationships.
Every couple has a unique dynamic, shaped by:
Triggers: break-ups, money worries, alcohol, jealousy
Beliefs: “I own you,” “No one will believe you,” “I can’t live without you”
Emotional control: how well someone can empathise, manage emotions, and reflect
Behaviour patterns: controlling/submissive, demanding/withdrawing, gaslighting/appeasing
These patterns tell us far more about safety than a checklist ever could. Spotting them early helps us act before the situation escalates.
Why DASH Doesn’t Work
DASH tries to flatten complex human behaviour into simple data. It misses key warning signs, ignores context, and sets services up to fail.
So why is it still used?
Cheap – quick to fill in, needs little training
Fast – looks efficient, but delays urgent help
Defensive – lets systems hide behind process: “The DASH was done. The score was medium. We followed protocol.”
Tick-box forms may comfort institutions—but they don’t protect survivors.
When Forms Become a Game
Survivors often adapt their answers to protect themselves:
Trying to guess what will get help—or avoid punishment
Tweaking answers to avoid escalation, or exaggerating to be believed
Holding back details out of fear—of eviction, losing children, or retaliation
Giving “expected” answers because they know what the system wants to hear
These aren’t lies. They’re survival tactics. But they make the data unreliable—and dangerous if treated as fact.
Suicide Risk Forms Have the Same Problem
This isn’t just about domestic abuse. Suicide risk forms fail in similar ways.
The NICE guideline NG225 is clear:
Do not use risk scales to predict suicide or repeat self-harm
Do not label people as “low,” “medium,” or “high” risk
Instead, professionals should carry out a full biopsychosocial assessment (looking at someone’s physical health, mental wellbeing, and social situation), build a shared understanding of risk, and work with the person—not just assign a number.
Why? Because research shows these scales don’t actually predict outcomes.
Einstein Had It Right
Einstein is often credited with saying: “Make everything as simple as possible, but not simpler.”
The current system oversimplifies. It strips away the very details that signal danger.
We’re not measuring customer satisfaction. We’re trying to understand whether someone might be harmed, or die. Treating both jobs with the same blunt, survey-style tools is not just ineffective; it’s unethical.

A Better Way: Human-Centred, Dynamic, and Context-Rich
Here’s what a better approach could look like:
Start with the people
Map triggers, beliefs, emotional control, and relationship patterns.
Use thoughtful, open-ended questioning
Ask questions that help people explore their own experiences and beliefs. Follow the survivor’s story—not a checklist.
Ask the next right question
Use smart systems to guide interviewers through trusted questions—not random AI guesses.
Capture and analyse the full story
With consent, transcribe interviews and use language tools to spot themes, emotions, behaviours, and escalation (see the sketch after this list).
Explain risk—don’t rate it
Build a clear, shared understanding of risk, just like NG225 recommends.
Use AI carefully
Only for summarising, spotting contradictions, and prompting follow-ups. Never to make decisions or assign scores.
Trigger action—not just paperwork
Risk should lead to real help, not just a ticked box.
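To make the “capture and analyse” step above concrete, here is a minimal sketch in Python of how flagged themes could be surfaced for a human reviewer rather than turned into a score. The theme names, cue phrases, and function names are illustrative assumptions only, not Akumen’s actual tooling or any published clinical vocabulary; in practice this step would sit behind consent, information governance, and professional oversight.

```python
# Minimal sketch: flag transcript passages by theme for human review.
# The lexicon below is illustrative only; a real deployment would use a
# clinically informed vocabulary and consent-governed data handling.

from dataclasses import dataclass, field

# Hypothetical theme lexicon: theme name -> cue phrases to look for.
THEME_CUES = {
    "ownership beliefs": ["i own you", "you belong to me", "can't live without you"],
    "isolation / disbelief": ["no one will believe you", "no one will help you"],
    "escalation": ["strangle", "choke", "threatened to kill", "weapon"],
    "known triggers": ["break-up", "breakup", "money", "drunk", "jealous"],
}

@dataclass
class FlaggedExcerpt:
    theme: str
    cue: str
    excerpt: str

@dataclass
class ReviewPack:
    """Everything is surfaced for a human reviewer; no score is computed."""
    excerpts: list = field(default_factory=list)

    def add(self, theme, cue, excerpt):
        self.excerpts.append(FlaggedExcerpt(theme, cue, excerpt))

    def summary(self):
        lines = []
        for theme in THEME_CUES:
            hits = [e for e in self.excerpts if e.theme == theme]
            if hits:
                lines.append(f"{theme} ({len(hits)} excerpt(s) to review):")
                lines.extend(f'  - "{e.excerpt}"  [cue: {e.cue}]' for e in hits)
        return "\n".join(lines) or "No cues matched; review the full transcript anyway."


def flag_themes(transcript_lines):
    """Return a ReviewPack of excerpts that contain known cue phrases."""
    pack = ReviewPack()
    for line in transcript_lines:
        lowered = line.lower()
        for theme, cues in THEME_CUES.items():
            for cue in cues:
                if cue in lowered:
                    pack.add(theme, cue, line.strip())
    return pack


if __name__ == "__main__":
    transcript = [
        "He kept saying 'I own you' whenever I mentioned leaving.",
        "It always gets worse when he's been drinking and we argue about money.",
        "Last week he said no one will believe you if you go to the police.",
    ]
    print(flag_themes(transcript).summary())
```

The point of the design is in what it does not do: the output is a set of excerpts grouped by theme for a person to read, and no “low/medium/high” label or numeric score is ever produced, in line with the NG225 point above.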
Guardrails That Matter
Consent and safety first
Use data ethically and only when needed
Keep human oversight
Audit for bias and performance
Make sure systems can link up and work together
The Cultural Shift We Need
This isn’t about replacing professionals with machines. It’s about giving them better tools to hear the full story, spot patterns, and act quickly.
Thoughtful questioning respects survivors
Mapping common behaviour patterns respects psychology
Forensic analysis respects truth
Controlled AI respects uncertainty
Dynamic systems respect life
DASH once gave agencies a shared language. But that language is now too basic, too rigid.
If we truly want to save lives, we need a bold shift:
“Ask the right question, at the right time. Get a truthful answer. Analyse it properly. Let human-guided systems trigger the right action. But don’t oversimplify.”
The truth—and the risk—lies in the nuance, the patterns, and the stories. Not in tick-boxes.
If you would like to talk to our Head of Mental Health about how Akumen can support your organisation, please use the Contact Us page on this website.