
Rethinking Suicide Risk Assessment in the UK: A New Era in Suicide Prevention?

  • eross435
  • Apr 28
  • 3 min read

In the fight against one of the UK’s most pressing mental health crises, two very different paths are emerging—and they’re both leading us toward a common goal: saving lives.


Suicide remains one of the most complex and tragic public health challenges of our time. Each year, thousands of families are affected by a loss that is often preventable—but only with the right support, at the right time. Now, a fresh conversation is taking shape around how we assess the risk of suicide, and two voices are shaping that dialogue.


One is NHS England, whose 2025 Staying Safe from Suicide guidance places human connection at the core of care. The other is Akumen: our 2023 whitepaper explores how artificial intelligence combined with human connection might transform risk detection and early intervention in suicide prevention. The two share common values—but diverge in execution.

 

A Shared Mission: Compassion Over Checklists

In a show of alignment, both Akumen and NHS England agree on one crucial point: traditional suicide risk assessments are broken (read previous Akumen blogs which touch on this point).

For years, clinicians have relied on outdated tools—like tick-box questionnaires and static “low/medium/high” risk labels. But those tools, experts say, lack the depth needed to capture the dynamic, emotional reality of someone in distress.

“We’re not numbers,” says one mental health practitioner. “We’re stories. And stories shift.”

That belief forms the heart of both approaches. Akumen uses AI-powered narrative analysis to read between the lines of what people say, picking up on emotional patterns that might signal risk. Akumen recognises the power of AI to provide unbiased, "always listening" support that follows a strict rule-based approach developed by humans. It is a tool that cannot think freely, but it is consistently reliable, a valuable addition to any therapist's toolkit, and it opens up early access to services with ease. Meanwhile, NHS England is urging clinicians to focus less on scoring, and more on listening—deeply, consistently, and compassionately.
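Akumen's actual rule set is not published, so the Python sketch below is purely illustrative: the patterns, weights, threshold, and the flag_risk_signals function are all assumptions invented for this post, not Akumen's method. What it is meant to show is the shape of a strict rule-based approach: humans write and weight every rule, and the output is evidence for a clinician to audit, never a decision.

```python
# Illustrative only: a toy, human-authored rule set for surfacing phrases
# that *might* warrant a clinician's attention. Patterns, weights, and
# threshold are invented assumptions, not a validated clinical instrument.
import re
from dataclasses import dataclass

# Each rule pairs a pattern with a clinician-assigned weight; nothing is learned.
RISK_RULES = [
    (re.compile(r"\b(hopeless|no way out|can't go on)\b", re.I), 3),
    (re.compile(r"\b(burden|better off without me)\b", re.I), 3),
    (re.compile(r"\b(given up|tired of everything)\b", re.I), 2),
    (re.compile(r"\b(alone|nobody|isolated)\b", re.I), 1),
]

@dataclass
class Flag:
    phrase: str   # the matched text, quoted back to the reviewer
    weight: int   # clinician-assigned severity weight

def flag_risk_signals(transcript: str):
    """Scan a transcript against the rule set and return (score, flags).

    The function only surfaces evidence; deciding what the score means
    is left entirely to the human who audits the flags."""
    flags = []
    for pattern, weight in RISK_RULES:
        for match in pattern.finditer(transcript):
            flags.append(Flag(match.group(0), weight))
    score = sum(f.weight for f in flags)
    return score, flags

# Usage: the output is a prompt for a clinician, never a verdict.
score, flags = flag_risk_signals(
    "I feel hopeless, like everyone would be better off without me."
)
print(score, [f.phrase for f in flags])
```

Because every rule is written and weighted by a person, the system behaves exactly the same way every time—the "consistently reliable" quality described above—at the cost of never inferring anything a human did not anticipate.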

 

Two Paths Toward the Same Goal

Here’s where the paths diverge:

Aspect    | Akumen's Approach                                                                                                             | NHS England's Approach
Use of AI | Emotion analytics + machine learning + human oversight                                                                       | No AI; human relationships only
Method    | Hybrid: humans teach the machine, AI flags risk, humans audit the findings, then a human-led therapeutic conversation begins | Fully human-led therapeutic conversation
Vision    | A national AI support tool for crisis teams                                                                                  | Strengthened support through cross-sector collaboration

Akumen's system was developed with input from psychotherapists and trained AI ethics professionals. It's not meant to replace humans—but to support them by identifying signals of distress that might otherwise go unnoticed, especially during emergency calls to 999 or 111. Akumen believes that, to improve services, any solution has to be scalable and accessible. AI tools can enable an individual to begin capturing their information and story while awaiting an appointment, which can take months. This could mean reaching a support or treatment plan much sooner than traditional 1:1 methods alone.

 

NHS England, however, remains firmly rooted in the human-to-human model. “There’s no replacement for the power of a therapeutic conversation,” their guidance insists.


Roadblocks and Realities

Neither model is perfect.


Akumen’s AI may face pushback from clinicians wary of technology interfering with care. It also depends on data literacy and trust in a system that—let’s face it—many may not fully understand yet.

On the other hand, NHS England’s approach risks falling short in today’s overburdened system. With staff shortages and rising demand, how feasible is a fully relational model without data-driven support?

“There’s only so much one person can pick up in a 30-minute consult,” says a frontline GP. “What if AI could help us listen better—not less?”

 

The Future Is (Probably) Hybrid

So why not both?


Mental health experts are increasingly calling for a collaborative approach—one where AI serves as an early-warning system, helping clinicians triage and tailor support without losing the human touch.

Imagine an NHS emergency operator, quietly supported by Akumen’s AI, spotting subtle signs of suicidal distress mid-call and escalating help in real-time. Imagine therapists using machine insights to gently explore topics that might otherwise go unspoken. The possibilities are lifesaving.
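To make that division of labour concrete, here is a minimal, hypothetical sketch of the loop in Python. Nothing in it reflects a real Akumen or NHS interface; CallSegment, on_transcript_segment, the review queue, and the operator functions are all invented names. The structure is the point: the AI only queues a flag, and only a confirmed human decision escalates the call.

```python
# Hypothetical sketch of the hybrid loop: AI scores each live transcript
# segment, a human audits every flag, and only a human decision escalates.
# All names and thresholds here are illustrative assumptions.
from dataclasses import dataclass
from queue import Queue

@dataclass
class CallSegment:
    call_id: str
    text: str

review_queue: Queue = Queue()  # flagged segments awaiting human audit

def on_transcript_segment(segment: CallSegment, score_fn, threshold: int = 4) -> None:
    """AI step: score the segment; above-threshold items are queued for a
    human reviewer, never acted on automatically."""
    score = score_fn(segment.text)
    if score >= threshold:
        review_queue.put((segment, score))

def operator_confirms(segment: CallSegment, score: int) -> bool:
    """Stand-in for the operator's review screen; a real system would show
    the flagged phrases and let the human decide."""
    print(f"[review] call {segment.call_id}: score {score} -> {segment.text!r}")
    return True

def drain_review_queue(escalate) -> None:
    """Human step: walk the queue and escalate only what the operator confirms."""
    while not review_queue.empty():
        segment, score = review_queue.get()
        if operator_confirms(segment, score):
            escalate(segment.call_id)

# Toy usage with a stand-in scorer (a real one might wrap the rule-based
# flagger sketched earlier in this post).
toy_score = lambda text: 5 if "hopeless" in text.lower() else 0
on_transcript_segment(CallSegment("call-001", "I feel hopeless tonight"), toy_score)
drain_review_queue(lambda call_id: print(f"[escalate] {call_id} routed to crisis team"))
```

Notice that the AI never touches the escalation path directly: it can only add items to the review queue, which keeps the therapeutic decision where NHS England's guidance says it belongs—with a person.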

“We’re not talking about robots replacing therapists, we’re talking about technology that amplifies empathy.”


A New Era in Suicide Prevention?

As the UK navigates this pivotal moment in suicide prevention, the message from both camps is clear: the old tools aren’t enough. People in crisis deserve more—more nuance, more care, more innovation.


Now the question is: how will we bridge the best of both worlds?


Whether through the empathetic eyes of a clinician or the analytical lens of AI, the goal remains the same: to understand and protect those at risk, and to prevent avoidable deaths.



Should technology have a role in suicide prevention?

  • Yes

  • No

  • Perhaps



📣 Have thoughts on AI and mental health? Join the conversation: email us at Eross@akumen.co.uk

 
 
 
