Whilst ‘user needs’ is a term used increasingly often in government, for some the concept remains shrouded in mystery – perhaps a way of concealing some kind of dark art.
But this feeling isn’t unique to government. When I worked at LinkedIn, I rolled out the European pilot for what is now known as the ‘voice of the customer’ programme. Although user research had always been core at a company-wide level, it was new to the area of the business in which I worked. As a result, a fair amount of stakeholder engagement was needed before we could get it off the ground and embedded into our way of working.
The LinkedIn programme was complex to kick off and commit to, but the premise, when stripped back, was actually pretty simple. Like the idea of user needs more generally, it went something like this:
- Talk to your users about how they’re currently solving the problem and understand what they need
- Make or adapt products or services to meet that need
- See the usage of your products or services increase
It’s with good reason then that ‘user first’ is the founding principle of the government’s digital by default service standard.
We recently invited Leisa Reichelt along to our ALB digital leaders meeting to find out more about their user first approach in practice.
Leisa explained that they use a four-step process to identify the needs users have for a particular service. Below is a brief summary:
Step 1: educated guessing
This is where everyone in the team notes down any user needs they can think of based on their assumptions and past experiences. Volume is key in this initial stage.
Step 2: apply data
This is where data is applied to see whether there’s evidence to support each of the guesses; user needs aren’t considered ‘real’ unless there’s proof.
Data comes in a variety of flavours: market research, secondary research, web analytics (for example, if people are visiting content on your site or searching for related terms, it demonstrates that they have a need and are coming to you to try to fulfil it), call centre staff experiences and so on.
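To make the ‘apply data’ step a little more concrete, here’s a loose sketch in code of checking guessed needs against on-site search analytics. Everything in it – the search log, the related terms and the evidence threshold – is invented for illustration; it isn’t a tool or method GDS actually uses:

```python
from collections import Counter

def evidence_for_needs(search_log, related_terms, threshold=10):
    """For each guessed need, decide whether the search log offers
    enough supporting evidence (an arbitrary count threshold here)."""
    counts = Counter()
    for query in search_log:
        q = query.lower()
        for need, terms in related_terms.items():
            if any(term in q for term in terms):
                counts[need] += 1
    return {need: counts[need] >= threshold for need in related_terms}

# Hypothetical data: 12 passport-related searches, 1 unrelated one
searches = ["renew my passport", "passport renewal form"] * 6 + ["dog licence"]
guessed_needs = {
    "renew a passport": ["passport"],
    "register a birth": ["birth"],
}
print(evidence_for_needs(searches, guessed_needs, threshold=10))
# {'renew a passport': True, 'register a birth': False}
```

The unsupported guess (‘register a birth’) isn’t thrown away – as the next paragraph notes, it’s a candidate for some lightweight research.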
The service manual can provide you with a more comprehensive overview of user research techniques.
You may find that a number of needs are left on the table with no supporting evidence. This should be seen as a positive – the fewer needs you have, the better your chance of meeting them – but if there are unsubstantiated needs which are believed to be genuine, some lightweight research can be carried out.
Step 3: talk to real people
This is where you should get out and talk to the real users who will be accessing your service, as data alone can’t always be trusted. Analytics, for example, could show high engagement with particular content or a service, leading you to conclude that all is well. Talking to users, though, may reveal something entirely contradictory: that the service is very confusing, and the time users spend on it is time spent trying to decipher what they’re seeing.
User stories are the output of your research and should inform your development schedule. You can find out more about how these should be crafted in the service manual.
Step 4: testing your design
This is where a chunk of development work has been delivered and you test it with real users. This allows you to confirm whether you’ve met their needs or whether some tweaking is still required.
User research isn’t about ticking boxes or jumping through hoops but about ensuring your site or service has the very best chance of success. And, as Leisa said, doing no user research just isn’t pragmatic – you should always do something.
Hopefully this post has pulled back the curtain a little on this ‘dark art’. If you have further needs, please flag them and we’ll do our best to meet them.