How we do discovery service assessments in DHSC

Categories: Capability, Service assessments, Services and products

[Image: a post-it note with the question "does this meet the service standard?"]

Part of my role as a product manager at the Department of Health and Social Care (DHSC) is to help assure that ‘digital’ things being made in health and social care meet certain standards. Whether it’s a new website or an app, they all must try to meet the cross-government service standard (an updated service standard is coming soon).

Why does anything need to be assessed?

Assurance of digital products and services has been mandated by the Treasury and Cabinet Office for a number of years. This includes submitting spend control requests, and progress is often checked by a panel. The intention is to catch things that might be off-track or duplicating effort, and to intervene so that the things government produces are valuable and safe for users.

I don’t like to see them as tests to pass – I would hope they feel more like a collection of people taking a second look at things and giving advice where needed.

Assessments in DHSC

I’ve only attended 2 assessments in DHSC so far – one that I was shadowing, the other where I was an assessor, and both at the end of the discovery phase. Before I joined, colleagues had been piloting discovery assessments – the hope being that if anything needs a little nudge in the right direction, we can do that earlier in the process rather than having to deal with it at a later assessment or in a future spend control. The assessments have been less formal than others I have seen previously at the Government Digital Service (GDS). It’s important they are not a burden to teams trying to deliver things.

The format

Our team is testing out the following format:

  • at the end of a discovery phase, the team being assessed sets up a time and place and invites 2 assessors, such as a colleague and myself
  • in advance we send out a list of questions that they are likely to face to make explicit the things we are interested in hearing about
  • on the day, we have a presentation of around 30 minutes from the team about what they have done and learned
  • then we have another hour to go through any questions we may have that weren’t answered during the presentation. These are grouped around:
    • understanding user needs
    • doing ongoing user research
    • the team
    • using agile methods
    • success measures and KPIs
  • after the chat, the assessors go off and discuss what they have heard (I wrote these down as hopes and fears for the product) and, if needed, create a list of recommendations for the team

The hope is that the team gets something valuable out of the experience, beyond just presenting to a pair of interested outsiders.

Questions asked

In my experience the sessions are much more a conversation than a question-and-answer interview. I’m usually completely fascinated by the problems teams are trying to solve and very empathetic to the challenges of delivering things. I love hearing the team's story and all of their findings.

While some questions are to be expected, I’ve also heard some great questions from colleagues that were off-script. Some of my favourites have been:

  • "what new thing did you learn from doing the discovery that you did not know before?" – I like this because you want teams to challenge their assumptions and learn new things in discovery, and this question gets to the root of whether they did
  • "what is the team doing to upskill one another?" – I like this because in the health and social care world we currently bring in digital professionals with vast experience to help fill roles in teams. We should be getting every inch of their knowledge passed on to people who are new to delivering digital products/services
  • "how are you going to share your learnings/findings with others in health and social care?" – I like this because if we are paying for research, then we should be sharing it and letting others learn from it too

We are sympathetic and here to help

While we do our best to make the sessions not feel like a test, the reality is that some teams do go into them with trepidation. The materials we share and the language we use are often unintentionally a bit jargony. Many of the teams, especially in discovery, have never done this before – they are new to delivering anything digital and are learning on the job. We will keep working on making these sessions as accommodating as possible, and inclusive of everyone who comes to present, whatever their level of experience.

What to expect from us

Our team wants to give constructive recommendations and wants teams to deliver great things. Nobody is there to scold. We will always do our best to help teams link up with other teams we think could help in the health world and beyond. If the team does decide to pursue an alpha, then we are also there to help them get it approved. If we aren’t doing that, then we’d love the feedback too, so we can improve.

If you need an assessment, want to give us feedback, or would like to contact us for any other reason, please get in touch.
