https://digitalhealth.blog.gov.uk/2020/02/07/phe-evaluation-service-beta-assessment/

PHE Evaluation Service Beta Assessment

Categories: Assurance, Beta, Service assessments

Text saying "Service Assessment" with DHSC approved brand colours

From: NHSX

Assessment date: 9th January 2020

Stage: Beta

Result: Met

Service provider: PHE

Service description

The service aims to enable Public Health England and the wider health system to demonstrate the impact, cost-effectiveness and benefit of digital health products to public health. It comprises an online guide as well as workshops and a community.

Service users 

  1. Delivery teams that are making digital health products in PHE.
  2. Delivery teams that are making digital health products in the wider public sector.
  3. Delivery teams that are making digital health products in the private and tertiary sector.

1. Understand user needs 

Decision

The team has met point 1 of the Standard.

What the team has done well

  • Conducting user research with people using different parts of the service at different stages in the product development cycle. Evaluating health outcomes happens over years, so the team found a pragmatic solution to test the end-to-end service.
  • Refocusing and redeveloping their personas where evidence indicated.
  • Reframing the users from specific roles to types of users, based on user research.
  • Iterative testing of online guidance and content.

What the team needs to explore

  • The team presented evidence from user research that explored and tested hypotheses. These linked to the problem of how to help PHE colleagues and external partners who evaluate the health outcomes of digital interventions, covering the three user groups identified above. The team knew the product could be useful for other user groups but, constrained by time and capacity, focused on the main three. The assessors agree with this judgement, but further research to explore these different needs must be conducted if the service is intended for a wider user group.
  • The team needs to be mindful of their focus. They should either ensure user research covers this wider use, or limit their focus to what the current service has capacity for. Significant further user research would need to be conducted to ensure that all contexts were understood. It is strongly recommended that the team ensures there is clarity about the breadth of scope and the target audience.
  • Face-to-face assistance is available, as a proportion of the service is non-digital. However, further work is needed on routes into the service and the guides, separate from the information provided online and via the email inbox.

2. Do ongoing user research

Decision

The team has met point 2 of the Standard.

What the team has done well

  • There was good evidence of how the team has tested and iterated their service based on findings.
  • The team is essentially using their own service to plan how they will evaluate it, running a workshop to develop their own logic model as a basis for their research plan.

What the team needs to explore

  • Research with people with access needs. A Digital Accessibility Centre (DAC) audit, including usability testing, was conducted, but the service also needs to conduct user research with people with access needs who are not expert digital users. It is strongly recommended that the team uses the wider set of users available in public beta to reach this group.
  • There is currently a vacancy for a user researcher within the team. This will need to be filled to ensure that user research continues to support public beta, and will be essential should the service scope be widened (as discussed under point 1).

3. Have a multidisciplinary team

Decision

The team has met point 3 of the Standard.

What the team has done well

  • The team is co-located and multidisciplinary.
  • The team has all the roles expected for this stage of the service.
  • The team is made up of permanent PHE members.
  • The core team includes subject matter expertise from the wider organisation along with external advisors, and demonstrates strong multidisciplinary working: academic evaluation and public health experts work alongside designers, user researchers and digital technology professionals.
  • The wider team has senior-level buy-in and support, with sponsorship at director level within the organisation.

What the team needs to explore

  • During public beta, the team should explore how user research and feedback will be captured and evaluated once the service is live.

4. Use agile methods

Decision

The team has met point 4 of the Standard.

What the team has done well

  • The team demonstrated the use of the scrum framework.
  • The team is co-located with a dedicated project space.
  • The team uses collaboration tools to support ways of working.
  • The team demonstrated holding retrospectives, identifying improvements and implementing them.
  • The team works in the open: it has held 26 open show and tell sessions, published multiple blog posts and attended multiple conferences where work has been shared and feedback collected.

What the team needs to explore

  • During public beta the team should consider how the agile and iterative ways of working demonstrated can be maintained when the service moves to live.

5. Iterate and improve frequently

Decision

The team has met point 5 of the Standard.

What the team has done well

  • The team demonstrated iterating the service during the alpha and private beta phases. An example is the move away from step-by-step guides, where the team revised its initial design patterns following user feedback.

6. Evaluate tools and systems

Decision

The team has met point 6 of the Standard.

What the team has done well

  • The team explored a number of platforms including GOV.UK, internal PHE hosting, and bespoke hosting and publishing tools. The team tested with users and selected the appropriate tool and system based on user feedback and the constraints of the platforms.

7. Understand security and privacy issues

Decision

The team has met point 7 of the Standard.

What the team has done well

  • The team considered the security aspects of the service and the privacy implications. As the service will be hosted on GOV.UK, the security elements are met.

What the team needs to explore

  • Should the team progress with community arrangements, they should consider the privacy aspects of these communities, including how personal data may be stored and how individuals are contacted.

8. Make all new source code open

Decision

The team has met point 8 of the Standard.

What the team has done well

  • All code is open and available.

9. Use open standards and common platforms

Decision

The team has met point 9 of the Standard.

What the team has done well

  • The team uses GOV.UK and, as such, is using an open and common platform.

What the team needs to explore

  • The team should work with GDS and the GOV.UK team, and feed back any constraints of the platform and the limitations these have imposed on this service.

10. Test the end-to-end service

Decision

The team has met point 10 of the Standard.

What the team has done well

  • The team tested all elements of the service individually during private beta and tested end to end with four pilot teams. 

11. Make a plan for being offline

Decision

The team has met point 11 of the Standard.

What the team has done well

  • As this service is non-transactional, the impact of being offline is negligible. A plan to make the information available on request is in place.

12. Make sure users succeed first time

Decision

The team has met point 12 of the Standard.

What the team has done well

  • The team showed how they tested the service at different points in the end-to-end journey. They gave evidence of why it is difficult to test the full end-to-end service within the short timescale for service development, because health outcomes are measured over years.

What the team needs to explore

  • More work should be conducted to understand the assisted digital journey. The assessors did not see evidence that the team could explain alternative paths in the service and demonstrate that they worked for users.
  • The team needs to test with users with access needs, separate from those who tested the service via the Digital Accessibility Centre, who are likely to be expert users.

13. Make the user experience consistent with GOV.UK design patterns

Decision

The team has met point 13 of the Standard.

What the team has done well

  • The service sits on GOV.UK. It is built using the GOV.UK prototype and is published through Whitehall Publisher, so it is consistent with GDS design patterns.

What the team needs to explore

  • The team should work with GDS and the GOV.UK team, and feed back any constraints of the platform and the limitations these have imposed on this service.

14. Encourage everyone to use the digital service

Decision

The team has met point 14 of the Standard.

What the team has done well

  • The team has a communications plan in place.
  • This is both a digital and non-digital service and each part complements the other.

15. Collect performance data

Decision

The team has met point 15 of the Standard.

What the team has done well

  • The team has a clear set of measures that will be used to evaluate the performance of the service, and has set out how data will be collected.
  • The measures are a good mixture of qualitative and quantitative, which will provide a rich set of data to learn from.
  • The team identified that the ultimate aim of the service is to increase the consideration given to evaluation during the design of a service; measuring whether this makes services more effective will be a long-term ambition.

16. Identify performance indicators

Decision

The team has met point 16 of the Standard.

What the team has done well

  • The team is clear about what its performance indicators are and how to measure them.
  • The team will be evaluating the service using its own evaluation method.

17. Report performance data on the Performance Platform

Decision

The team has met point 17 of the Standard.

What the team has done well

  • Not applicable, as this is a non-transactional service.

18. Test with the minister

Decision

The team has met point 18 of the Standard.

What the team has done well

  • The team has tested up to director level.

What the team needs to explore

  • The team should make sure they test with all senior stakeholders during beta.

Recommendations

  • During public beta the team needs to understand the assisted digital journey. The assessors did not see evidence that the team could explain alternative paths in the service and demonstrate they worked for users.
  • During public beta and before progressing to live, the team should do specific user testing with users with access needs, separate from those who tested the service via the Digital Accessibility Centre, who are likely to be expert users. The assessment team understood this was difficult during private beta due to small user numbers, so it should be given more focus in public beta.
  • The team should work with GDS and NHS Digital to explore how this service could join up with the GDS Service Standards and the NHS Apps Library Standards. The assessment team felt this would be beneficial as end users could be reassured if health apps followed an assured evaluation process.
  • GDS, and specifically the GOV.UK team, should take on board feedback with regard to the challenges of publishing on GOV.UK.
