
https://digitalhealth.blog.gov.uk/2019/06/14/phe-uk-national-screening-committee-alpha-re-assessment/

PHE UK National Screening Committee Alpha Re-assessment

Categories: Alpha, Assurance, Service assessments

[Image: text saying "Service Assessment" in DHSC approved brand colours]

From: NHSX

Assessment date: 6 June 2019

Stage: Alpha

Result: Met

Service provider: Public Health England

Service description

The UK National Screening Committee advises ministers and the NHS in the 4 UK countries about all aspects of health screening and supports implementation of screening programmes.

The Committee has a list of 109 policy recommendations, which are reviewed on a triennial basis. Its recommendations are based on evidence reviews, which are commissioned and managed by the evidence team, and on the views of external stakeholders gathered during public consultations. The service supports this work by displaying the full list of policy recommendations and active consultations to the public.

Service users

The users of this service are:

  • Evidence review managers: commission and oversee evidence reviews
  • Policy stakeholders: access screening information and make statements representing their organisation's perspective
  • Interested members of the public: access screening information online

1. Understand user needs

Decision

The team met point 1 of the Standard.

What the team has done well

  • Usability testing in almost every sprint in alpha
  • In the alpha period after the first assessment, the team used an agency to recruit users with access needs
  • Tested admin interface with users at their place of work
  • Tested outside of England, in Wales and Scotland
  • Included staff users in almost every sprint
  • Tested with members of the public who weren’t familiar with the service

What the team needs to explore

  • It would be useful to understand the end-to-end journey for the user: what other services do they interact with to complete their wider goal? Is there anything the team learnt in their research that could be shared with these other services? Are there blockers to completing the user’s wider goal that the team could influence?
  • Explore how accessible and easy to understand the content written by the policy team is likely to be. The team could test this content with users and have the policy content writers observe. Also, are the plain English summaries produced by third parties accessible and easy to understand? If this hasn’t been tested, it would be worth exploring rather than assuming.

2. Do ongoing user research

Decision

The team met point 2 of the Standard.

What the team has done well

  • The team have a thorough plan for ongoing user research and a dedicated user researcher

What the team needs to explore

  • The team mentioned they plan to do unmoderated testing (giving the prototype to users and collecting feedback later). This is a great way to reach more people. However, to get more useful results it would help to give users structure on what to look for, so that you don’t end up with opinions or vague reviews. You also have to weigh this evidence against data you have collected through observation. For example, you may find that evidence collected in an unmoderated test is ambiguous or confusing; if so, you could hold shorter follow-up interviews with users.
  • It appeared that the user researcher had carried out and analysed some sessions on their own. Sometimes this is unavoidable, but we would recommend that at least one other person attends to take notes, that there is at least a remote viewing option, and that as many team members as possible are involved in analysis

3. Have a multidisciplinary team

Decision

The team met point 3 of the Standard.

What the team has done well

  • The panel were pleased that content designers had been engaged several times throughout the extended alpha
  • The team explained that the content designers came from outside the team, which helped ensure there was not a conflict of interest in the research and design process
  • The team are creating helpdesk guidelines for assisted digital users and offer a postal form as an alternative route of engagement

What the team needs to explore

  • Following recommendations from the last assessment, where the content designer was doing some of the user research, the current user researcher chose to be less involved with content design reviews. We would recommend that the user research and content design roles are held by different people within the team and that they work closely together to set research goals, analyse findings and agree on actions. The panel were pleased to hear that the team plan to bring in a separate content designer as part of private beta

7. Understand security and privacy issues

Decision

The team met point 7 of the Standard.

What the team has done well

  • The panel were pleased to hear that the team had plans to redirect the existing URL to reduce phishing potential (see the sketch after this list)
  • The team mentioned that they are working on a GDPR statement on how user data will be used
  • The team plan on using existing PHE login infrastructure
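
To illustrate the redirect described above, the sketch below shows one way to permanently redirect traffic from a legacy address to the new service while preserving the requested path. It assumes a Node/Express front end and uses a placeholder hostname; neither is confirmed by the assessment, so treat it as illustrative only.

```typescript
// Illustrative sketch only: permanently redirect every request arriving at the
// legacy hostname to the equivalent page on the new service, so old links and
// users are not left on a look-alike address. The hostname is a placeholder.
import express from "express";

const app = express();

app.use((req, res) => {
  // 301 tells browsers and search engines that the move is permanent.
  res.redirect(301, `https://new-screening-service.example.gov.uk${req.originalUrl}`);
});

app.listen(3000);
```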

What the team needs to explore

  • The team needs to test the service with penetration testers and make sure all the content is sanitised (see the sketch below)
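
As a starting point for the sanitisation recommendation, the sketch below strips unexpected HTML from user-submitted text (for example, consultation comments) before it is stored or displayed. It uses the sanitize-html npm package with an example allow-list; the function name and the allowed tags are assumptions, not the team's actual rules.

```typescript
// Illustrative sketch only: clean user-submitted text before storing or
// rendering it, keeping a small allow-list of harmless tags and blocking
// javascript: links. The allow-list shown here is an assumption.
import sanitizeHtml from "sanitize-html";

export function cleanSubmission(raw: string): string {
  return sanitizeHtml(raw, {
    allowedTags: ["p", "br", "ul", "ol", "li", "strong", "em", "a"],
    allowedAttributes: { a: ["href"] },
    allowedSchemes: ["https", "mailto"],
  });
}

// Example: the script tag and the onclick attribute are stripped out.
console.log(cleanSubmission('<p onclick="x()">Comment</p><script>x()</script>'));
// -> <p>Comment</p>
```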

12. Make sure users succeed first time

Decision

The team met point 12 of the Standard.

What the team has done well

  • The panel were impressed that the team have timed task completion for admin users and have reduced this from about 60 minutes to about 5 minutes

What the team needs to explore

  • The team needs to test the service on mobile and tablet. Even though the current site isn’t accessed regularly on mobile, a lot of their users are repeat visitors and may have learned to avoid using their mobile. New users, the public in particular, will expect to be able to access content and complete tasks on their preferred device
  • Make changes based on the usability testing, for example the screen magnification issue with the checkboxes. If this is a design system pattern, the team could suggest a change to the design system by contacting the team that maintains it

13. Make the user experience consistent with GOV.UK

Decision

The team met point 13 of the Standard.

What the team has done well

  • The team plan to run workshops with the policy team who write the content so that they write in the GOV.UK style
  • They have engaged with other government teams who are delivering services for internal users. Keep doing this!

What the team needs to explore

  • Make a plan for ongoing engagement with policy team content writers to make sure they are still writing in the GOV.UK style
  • If there are health terms not covered by the GOV.UK style guide, use the NHS content style guide to supplement it
  • PHE staff should be able to get access to cross-gov Slack and the Google groups. Contractors without a government or NHS email address could ask questions through PHE staff in their team, or PHE should consider giving PHE email addresses to contract staff so that they can engage with the cross-gov community
  • It would improve the service to have a dedicated content designer in the team for beta; this doesn’t need to be a full-time role if that isn’t necessary. This role could also lead on introducing content design methods to policy teams

14. Encourage everyone to use the digital service

Decision

The team met point 14 of the Standard.

What the team has done well

  • Users have the option of using email, letter or telephone to complete their task
  • The helpline is on every page and the helpdesk can guide users to the right content
  • The team had analysed the number of email sign-ups that came from outside England and matched this to the number of registered charities outside England
  • The product manager regularly reviews helpdesk reports

What the team needs to explore

  • It appeared that the team have engaged with the helpdesk but not carried out any training with them. We’d recommend training them on the new website and having regular engagement going forward

16. Identify performance indicators

Decision

The team met point 16 of the Standard.

What the team has done well

  • The team have gathered some benchmark KPIs for comparison

What the team needs to explore

  • Add event tags to key behaviours and actions on the site so that you can interrogate this data in Google Analytics (see the sketch after this list)
  • Look at gathering data on task completion or click rate on the old site and comparing it to the same data on the new site. This would be useful when comparing changes in the behaviour of mobile users or users from different traffic sources
  • It might be useful to gather qualitative data from internal users on the old site and compare it to the new site once it is in beta. This could be in the form of observing people using the site, or asking for their feedback on specific tasks
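
To show what event tagging could look like in practice, here is a minimal sketch using gtag.js to record clicks on links to policy recommendations. The CSS selector, event name and data attribute are illustrative assumptions about the page, not the team's actual markup or naming scheme.

```typescript
// Illustrative sketch only: record a Google Analytics event each time a user
// opens a policy recommendation, so the behaviour can be interrogated later.
// The selector, event name and data attribute are placeholders.
declare function gtag(
  command: "event",
  eventName: string,
  params?: Record<string, string | number>
): void;

document.querySelectorAll<HTMLAnchorElement>("a.recommendation-link").forEach((link) => {
  link.addEventListener("click", () => {
    gtag("event", "open_recommendation", {
      recommendation: link.dataset.condition ?? "unknown",
    });
  });
});
```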

17. Report performance data on the performance platform

Decision

The team met point 17 of the Standard.

What the team has done well

  • The service is registered with the performance platform team and the product manager has been speaking with them

What the team needs to explore

  • Following a review of other PHE services on the platform, we recommend that the team look to add the service to the site
