
https://digitalhealth.blog.gov.uk/2016/12/01/nice-medtech-beta-service-assessment/

NICE META tool - Beta service assessment

Categories: Service assessments

Department / Agency:

National Institute for Health and Care Excellence 

Date of Assessment:
26th September 2016

Assessment Stage:
Beta

Result of Assessment:
Not passed

Lead Assessor:
M.Harrington

Service Manager:
M.Hope

Digital Leader:
K.Farrington


Assessment Report

Overview 

After consideration, the assessment panel have concluded that the MedTech Early Technical Assessment (META) service has not met the required standards expected of a Beta service and has not passed this assessment.

The service team have undertaken a significant amount of work to get the service to its current position, and it was positive to see this presented at the assessment. The panel were impressed with the technology decisions made, and there was a clear understanding of the user needs on the facilitation side of the service. However, there are clear gaps in understanding the needs of those expected to make submissions to the service, and questions around cost and the charging model which need to be answered before moving to a public Beta.

User research is key to the success of the service. Launching something that hasn’t been tested, and not expecting any significant changes, is a risk and not in line with the GDS development phases. Beta is still a phase of learning and development, and NICE must ensure that there is a team available to continue to develop and iterate until the service is Live. Similarly, good user research will give the team the evidence needed to focus on the important things and not be subject to last-minute changes or ideas from stakeholders which are not based on user needs.

Though the service did not pass the assessment, there are positives to take from it, and the suggestions in this report are within NICE's capability to complete. The panel would like to thank the service team for trialling the new assessment approach with the Department of Health, the open way they approached the assessment, and their input to the retrospective session.

The service is easy to use and meets user needs

Criteria: The service team should be able to confidently state the problem they're solving, who their users are, what needs those users have, how they researched these needs and how the service is meeting the needs of users.

Assessor comments: The amount of user research so far is not in line with what is expected of a Beta service. The service itself has been presented at conferences and other NICE events, but no users have been observed actually trying to use the service. The assessment panel expect the service team to have tested the service before moving to Beta.

The user needs of the service itself are still unclear. Little has been done to understand the wider user journey between health organisations, academic institutions and technology companies: when does someone start using the service, how do they find it, and where does it sit amongst other support?

Recommendations:

  • Undertake user research with future users of the service, ideally with an experienced user researcher
  • Document the user needs the service is meeting
  • Have a charging model in place for Beta
  • Understand the full user journey, including how this fits with existing NICE content and services, academic institutions etc
  • Have a plan in place for how user research will continue during Beta

 

The service can be quickly improved

Criteria: The service team have funding and people available for continued iteration of the service towards meeting the live standard. There is a clear understanding of what running a service requires and how to respond to feedback/issues/data. The code/service can be iterated quickly, inexpensively and regularly. There is a single owner responsible for the service.

Assessor comments: The implementation of the service means that the team in NICE can make changes to the forms, copy, questions etc. and create new forms/services when needed. No releases or downtime are required for this. This flexibility means the team can remain responsive and can iterate the service. There is a plan in place to ensure that the handover to live service management (when appropriate) is smooth; however, there will need to be clear roles for those expected to change and improve the service so that it continues to meet user needs through Beta and into Live.
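
As an illustration only (a hypothetical sketch, not a description of NICE's actual implementation), a data-driven form definition along the following lines is one common way to achieve this kind of no-release change; all names and fields below are invented for illustration.

```typescript
// Hypothetical sketch (not NICE's actual implementation) of a data-driven
// form definition: because the questions are data rather than code, the
// team can edit copy or add questions without a release or downtime.
interface Question {
  id: string;
  label: string;
  type: "text" | "textarea" | "radio";
  options?: string[]; // only used by radio questions
  required: boolean;
}

interface FormDefinition {
  slug: string; // e.g. "meta-submission" (illustrative)
  title: string;
  questions: Question[];
}

// Editing this object (or the database record it stands in for) changes the
// live form; a generic renderer builds the page from it at runtime.
const metaSubmissionForm: FormDefinition = {
  slug: "meta-submission",
  title: "META early technical assessment submission",
  questions: [
    { id: "tech-name", label: "Name of the technology", type: "text", required: true },
    { id: "evidence", label: "Summary of existing evidence", type: "textarea", required: true },
    {
      id: "stage",
      label: "Stage of development",
      type: "radio",
      options: ["Concept", "Prototype", "On the market"],
      required: true,
    },
  ],
};
```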

Recommendations:

  • A team in place for Beta to continue to iterate and improve the service to meet user needs, including someone experienced in the discipline of user research.
  • A clear understanding of who is responsible for managing the forms/content/charging model once Live to ensure this continues to meet user needs
  • A clear understanding of who will be responsible for monitoring analytics (Hotjar/GA) and other available data to improve the service
  • A transition plan to ensure that code is understood in the live service team (Some of this has already started)

 

The service is safe and secure

Criteria: The technology choices are sensible, the team have avoided lock-in, expensive contracts or solutions which cannot be regularly iterated. The service is safe, with data held at appropriate levels and has undergone necessary tests (e.g. a penetration test)

Assessor comments: The team had a clear rationale for their technology choices and it was good to hear that these had helped speed the development of the service. Moving to IaaS has been managed exceptionally well and the usage of Docker/Rancher was particularly impressive. It was also positive to hear how the use of some new technologies was being shared more widely with colleagues in NICE during the firebreak period. The re-use of the authentication module for the service has sped up development; however, few complaints or issues being raised does not mean the service is necessarily working well. The management of accounts and profiles also needs careful monitoring, and NICE should understand the risks to the service from this.

Recommendations:

  • Have a risk log pertaining to the service to understand threats and vulnerability concerns
  • Undergo a penetration test and ensure security is baked into development; https://www.owasp.org/ may be of particular interest (see the sketch after this list)
  • Have plans in place to take the system offline
  • Ensure knowledge about the system is somewhere that others can find and use, e.g. update READMEs and increase test coverage across the system
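
To illustrate the "security baked into development" recommendation, here is a minimal sketch assuming a Node/Express stack (the report does not confirm the service's framework); the packages, limits and port shown are illustrative only, not part of the META service.

```typescript
// A minimal sketch of security baked into development, assuming a
// Node/Express stack (an assumption, not confirmed by the report).
// The middleware addresses common OWASP concerns: missing security
// headers, request flooding and oversized payloads.
import express from "express";
import helmet from "helmet";
import rateLimit from "express-rate-limit";

const app = express();

// helmet() sets sensible defaults for security headers
// (Content-Security-Policy, Strict-Transport-Security, X-Content-Type-Options, ...)
app.use(helmet());

// Throttle repeated requests from a single IP to limit brute-force and abuse
app.use(
  rateLimit({
    windowMs: 15 * 60 * 1000, // 15-minute window
    max: 100,                 // at most 100 requests per IP per window
  })
);

// Reject oversized JSON bodies before they reach the submission handlers
app.use(express.json({ limit: "100kb" }));

app.listen(3000);
```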
