
https://digitalhealth.blog.gov.uk/2025/02/21/nhse-innovation-service-live-service/

NHSE Innovation Service Live service

Categories: Assurance, Live, Service assessments

From: DHSC
Assessment Date: 29/06/2023
Stage: Live
Result: Met
Service Provider: NHS England

Service description 

The NHS Innovation Service helps people with innovative ideas in healthcare (‘innovators’) to develop and spread their idea in the UK. It brings information and support that already exists into one central place, making it easier to find relevant, practical information.   

Innovators can also create a shared online record of their innovation and be connected with the appropriate organisation(s) for coordinated support. This reduces the need to fill out multiple forms and allows support organisations to collaborate using one centralised record. Eleven key organisations now provide support to innovators via the NHS Innovation Service, including NICE, MHRA, NHS Supply Chain and the AHSNs.  

Service users 

  • Small businesses  
  • Industry   
  • NHS staff
  • Academics and researchers
  • Patients and carers

Report contents

  1. Understand users and their needs
  2. Solve a whole problem for users
  3. Provide a joined-up experience across all channels
  4. Make the service simple to use
  5. Make sure everyone can use the service
  6. Have a multidisciplinary team
  7. Use agile ways of working
  8. Iterate and improve frequently
  9. Create a secure service which protects users’ privacy
  10. Define what success looks like and publish performance data
  11. Choose the right tools and technology
  12. Make new source code open
  13. Use and contribute to open standards, common components and patterns
  14. Operate a reliable service

1. Understand users and their needs 

Decision 

The service met point 1 of the Standard

What the team has done well 

The panel was impressed that: 

  • the team has actively involved user researchers throughout the Beta Phase to gather ongoing insights and feedback from users 
  • the team has conducted user research and utilised analytics
  • the team has incorporated user feedback and insights into the design process 
  • the team has implemented collaboration tools like Mural and Slack to facilitate communication and knowledge sharing within the team, enhancing the efficiency of user research activities 
  • the team has conducted basic accessibility testing, released an accessibility statement and obtained an accessibility certificate, demonstrating a commitment to inclusivity

What the team needs to explore 

Before their next assessment, the team needs to: 

  • ensure that the service maintains compliance with the latest accessibility standards 
  • ensure there are systems in place after launch to regularly gather user insights throughout the Live Phase 
  • continue to ensure inclusivity in user research by actively engaging participants with diverse backgrounds, disabilities, and accessibility needs 
  • develop a comprehensive outreach strategy to reach and engage new users, ensuring the service is accessible and well-suited to their needs 
  • develop a roadmap for ongoing user research initiatives, including ambitions such as scoping transactions effectively and creating or joining a service community 

2. Solve a whole problem for users 

Decision 

The service met point 2 of the Standard

What the team has done well 

The panel was impressed that: 

  • the team has continued to iterate on the end-to-end user journey throughout public beta 
  • the team has made changes based on feedback to address common pain points 
  • the team has a good understanding of how their users made use of the service in the wider context of innovation and partnership with the NHS   

What the team needs to explore 

Before their next assessment, the team needs to: 

  • explore the reasons for dissatisfaction with the service in greater detail and address where possible in future iterations 
  • ensure the service sets expectations at an appropriate level for engagement from both the support organisations and participants    

3. Provide a joined-up experience across all channels  

Decision 

The service met point 3 of the Standard

What the team has done well 

The panel was impressed that: 

  • the team has continued to do good work on ensuring users had a joined-up experience regardless of their organisation of origin 
  • the team has made changes in the innovator journey to allow for multiple collaborators     
  • the team has improved flexibility for accessors and needs assessors through account switching

What the team needs to explore 

Before their next assessment, the team needs to: 

  • ensure there is a level of coherency between user journeys within the service and direct contact with support organisations   

4. Make the service simple to use 

Decision 

The service met point 4 of the Standard 

What the team has done well 

The panel was impressed that: 

  • the team understands the design standards that need to be met and has been using the GDS service manual where the NHS service manual doesn’t have a relevant component or pattern for their user needs 
  • the team is using user research and performance data to iterate designs, such as the example the team gave of using completion data to identify design issues where completion was taking excessively long, and so improve submission rates
  • the team is testing some of the non-standard patterns that are used in the service 
  • the team is thinking about how elements outside of their direct control such as supporting organisations’ communication and response times can be improved, and is testing the use of triggered reminders and notifications to improve the user experience 
  • the team recognises the competing user needs that are presented by innovators, needs assessors and supporting organisations and is considering ways that this can be managed as more supporting organisations are onboarded who may request additional questions to be added to the innovation record   

What the team needs to explore 

Before their next assessment, the team needs to: 

  • refer to the recommendations provided in the beta report, in which the team were asked to “review where the service adheres or departs from the GOV.UK design community principles of ‘start with what exists’”
  • refer to the recommendations of the previous beta report to ‘consider where the team can contribute evidence-based changes with the design community - particularly in the area of administrative interfaces where the service may be able to help fill an existing gap in research-led design’ 
  • for non-standard styles, components and patterns, ensure that the team can demonstrate how they have started with what exists and why they have deviated from the standard, based on evidenced research. The panel understands that there is some evidence to indicate that users are at times unclear where they are in the journey and what they need to do next. The panel notes that there are several non-standard navigational patterns that may not be as widely familiar or accessible to some user groups. For example, incomplete sections are indicated visually by a red cross; this element is typically used on NHS.uk as part of a ‘do and don’t’ list to indicate something a user is advised not to do, and may cause confusion. The team is advised to test a standard task list pattern before deviating from it.  
  • continue to develop a framework and clear criteria to help them make and document consistent design decisions where the team need to prioritise competing user needs in the question set, for example by using a question protocol. The service design needs to meet the needs of supporting organisations by collecting enough information in the innovation record without having so many questions that it becomes a prohibitively high barrier that excludes potential innovators, especially those with access needs such as lower digital literacy or English as a second language. The risk here is that potential innovators may either opt to avoid the service entirely by going direct to a supporting organisation or be put off pursuing their innovation entirely. 
  • ensure that usage of the supporting guidance informs continuous service improvements, and that the guidance doesn’t replace the need to think about how to solve pain points at point of need through design and tech decisions  

5. Make sure everyone can use the service   

Decision 

The service met point 5 of the Standard

What the team has done well 

The panel was impressed that: 

  • the team has considered how the service needs to meet the needs of users with access needs and has included users with access needs in user research 
  • the team demonstrated a commitment to accessibility and acknowledged the challenges experienced in recruiting users with a diverse range of access needs  
  • the team recognised the accessibility issues presented by Hotjar and is moving away from use of this tool 
  • the team has carried out an accessibility audit and has fixed all issues to ensure the service meets WCAG 2.1 AA 
  • the team is providing some assisted digital support  

What the team needs to explore 

Before their next assessment, the team needs to:     

  • consider how users without JavaScript enabled will be able to use the service and think about how to monitor the impact of this technical decision on these users. Currently, users who do not have JavaScript cannot create an account or use the service 
  • continue to carry out user research with users with a range of access needs, beyond the user group that has signed up for research 
  • consider how potential innovators may be excluded from this and other pathways due to high barriers to entry and how the service can support these, for example by increasing the assisted digital proposition 
  • ensure that there is a plan for meeting WCAG 2.2 this year    

6. Have a multidisciplinary team 

Decision 

The service met point 6 of the Standard

What the team has done well 

The panel was impressed that: 

  • the team includes a good mix of skills in the form of DDaT (digital, data and technology) professionals and subject matter experts from within the organisation 
  • thought has been given to the requirements of the service moving forward into live and the team plan has been adjusted accordingly   

What the team needs to explore 

Before their next assessment, the team needs to: 

  • make provisions to include performance analyst expertise to assist with refining the KPIs and data collection  

7. Use agile ways of working 

Decision 

The service met point 7 of the Standard

What the team has done well 

The panel was impressed that: 

  • the team has put in place a well-thought-out structure to enable various sub-teams to effectively work together     
  • the team has ensured there is a variety of agile ceremonies based on their needs
  • the team has frequent communication with the wider organisation and stakeholders  

8. Iterate and improve frequently 

Decision 

The service met point 8 of the Standard

What the team has done well 

The panel was impressed that: 

  • the team clearly laid out how many parts of the user journey have been iterated based on feedback during public beta
  • the team has a well-defined process for capturing needs and developing subsequent features     
  • the team has spent time highlighting successes and identifying lessons learned       

9. Create a secure service which protects users’ privacy 

Decision 

The service met point 9 of the Standard

What the team has done well 

The panel was impressed that: 

  • the team has a clear understanding of the data capture requirements and has implemented solutions to meet them 
  • the team has ensured the service provides a role-based access control (RBAC) mechanism for Innovators, Accessors, Needs Assessment, User Admin and Support teams 
  • the team has completed an IT health check (ITHC) and penetration testing, and remediation activities are in flight 
  • the team has designed the service database to hold only name, email and organisation information; no patient or health records are held in the database 
  • the team has designed and built the service with due security considerations for data, infrastructure and application components 
  • the team has made sure that the infrastructure build is automated using Terraform, which also supports a secure deployment process 
  • the team's service architecture indicates that Azure WAF is implemented for OWASP protection, and this protection against threat vectors includes DDoS attacks   

What the team needs to explore 

Before their next assessment, the team needs to: 

  • ensure that the Terraform (infrastructure as code) build is secured so that vulnerabilities are not introduced through the code. For example, running a linter such as TFLint in the pipelines ensures code errors are identified during the build process, prior to deployment 
  • ensure that infrastructure configurations are checked for vulnerabilities by running scanning tools, for example Checkov, in the automation pipelines. This ensures that infrastructure vulnerabilities are identified during the promotion process from build to test and deployment
  • ensure that application code is scanned for vulnerabilities using static analysis tools, for example SonarQube 
  • implement scan profiles for Gitleaks to ensure repository assets do not contain secrets and keys 
  • ensure that key and secret storage and management is implemented in Azure Key Vault (a minimal sketch follows this list) 
  • extend the existing RBAC provision with a data loss prevention (DLP) tool to obfuscate sensitive fields such as name, email and commercially sensitive information. The Innovation Service captures personally identifiable information (name, email and organisation) alongside commercially sensitive information, so these fields need additional protection
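
To illustrate the Key Vault recommendation, a minimal sketch in Terraform (matching the team's existing infrastructure-as-code approach) is shown below. The resource names, resource group and secret are illustrative assumptions rather than the service's actual configuration, and the pipelines that deploy code like this are where tools such as TFLint, Checkov and Gitleaks would run.

    terraform {
      required_providers {
        azurerm = {
          source  = "hashicorp/azurerm"
          version = "~> 3.0"
        }
      }
    }

    provider "azurerm" {
      features {}
    }

    data "azurerm_client_config" "current" {}

    # Illustrative resource group name.
    data "azurerm_resource_group" "innovation" {
      name = "rg-innovation-service"
    }

    # Central vault for the keys and secrets used by the service.
    resource "azurerm_key_vault" "innovation" {
      name                      = "kv-innovation-service"
      location                  = data.azurerm_resource_group.innovation.location
      resource_group_name       = data.azurerm_resource_group.innovation.name
      tenant_id                 = data.azurerm_client_config.current.tenant_id
      sku_name                  = "standard"
      purge_protection_enabled  = true
      enable_rbac_authorization = true
    }

    # Hold the database password as a Key Vault secret rather than in
    # application settings or source code.
    variable "database_password" {
      type      = string
      sensitive = true
    }

    resource "azurerm_key_vault_secret" "db_password" {
      name         = "database-password"
      value        = var.database_password
      key_vault_id = azurerm_key_vault.innovation.id
    }

Application components can then read secrets from the vault at runtime (for example through Key Vault references or a managed identity), so credentials never need to live in configuration files or the repository.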

10. Define what success looks like and publish performance data 

Decision 

The service met point 10 of the Standard. 

What the team has done well 

The panel was impressed that: 

  • the team has created a clear dashboard capturing several key metrics for the service 
  • the team has worked to adapt the mandatory KPIs in the context of the service in a logical manner 

What the team needs to explore 

Before their next assessment, the team needs to: 

  • continue working on improving data capture where there is missing or inaccurate data 

11. Choose the right tools and technology 

Decision 

The service met point 11 of the Standard

What the team has done well 

The panel was impressed that: 

  • the team has implemented open standards and technologies, for example Wagtail CMS, Node.js and Terraform 
  • the team's choice of hosting platform is suitable, as it is one of the three major providers, allowing reuse of existing capabilities and expertise within the team 
  • the team has ensured there are integrations with GOV.UK Notify, API Management and Application Insights, reusing existing patterns and capabilities 
  • the team has decoupled the application layer to allow asynchronous and rapid deployments 
  • the team has implemented a cloud-first approach with automation of the development process 
  • the team has built monitoring capabilities into the solution by implementing Azure Monitor 
  • the team has ensured the service has analytical capability built into the solution by deploying Google Analytics 

What the team needs to explore 

Before their next assessment, the team needs to: 

  • explore a containerised solution using Azure Container Instances, for technical and cost efficiency in comparison to Azure App Service (a minimal sketch follows)   
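
As a starting point for that comparison, the sketch below shows what a container-based deployment could look like in Terraform, reusing the team's existing tooling. The registry, image name, sizing and networking values are assumptions for illustration only; provider configuration is as in the section 9 sketch.

    # Illustrative resource group name.
    data "azurerm_resource_group" "innovation" {
      name = "rg-innovation-service"
    }

    # A single container group running the service's Node.js application
    # image from a hypothetical container registry.
    resource "azurerm_container_group" "innovation_api" {
      name                = "ci-innovation-service"
      location            = data.azurerm_resource_group.innovation.location
      resource_group_name = data.azurerm_resource_group.innovation.name
      os_type             = "Linux"
      ip_address_type     = "Public"
      dns_name_label      = "innovation-service"
      restart_policy      = "Always"

      container {
        name   = "api"
        image  = "exampleregistry.azurecr.io/innovation-api:latest"
        cpu    = "1"
        memory = "1.5"

        ports {
          port     = 443
          protocol = "TCP"
        }
      }
    }

Whether this is actually cheaper depends on traffic patterns: an App Service plan is billed for as long as it is allocated, while container groups are billed for the compute they reserve, so the comparison should be modelled against real usage before committing.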

12. Make new source code open 

Decision 

The service met point 12 of the Standard

What the team has done well 

The panel was impressed that: 

  • the team has deployed Azure DevOps, including Azure Repos (Git), Azure Pipelines and Azure Artifacts, ensuring automation of the CI/CD process 
  • the team's deployment approach also provides an open source code base, supporting transparency and ease of deployment  

What the team needs to explore 

Before their next assessment, the team needs to: 

  • update the code base to include back-end updates. During the assessment process, the team informed the panel that there had been recent updates to the service's back-end applications. The codebase for this update is yet to be integrated into the CI/CD pipeline and repository; the team should ensure that this is done as soon as possible 
  • ensure that the Azure DevOps system is used actively for future updates and deployments. Deployments and updates should go live with new features integrated via a feature branch 

13. Use and contribute to open standards, common components and patterns 

Decision 

The service met point 13 of the Standard

What the team has done well 

The panel was impressed that: 

  • the team has implemented open standards, common components and patterns for service development, enabling reuse and seamless integration with existing services, tools and technologies, for example Terraform, GOV.UK Notify, Wagtail CMS, Node.js, the “4 Amigos” approach and a “keep it simple” architecture 

What the team needs to explore 

Before their next assessment, the team needs to: 

  • further explore open standard tools, technologies and patterns that can be utilised to improve the Service build, test and deployment processes 
  • explore open standard technologies and tools for application and infrastructure code base error and vulnerability scans, as mentioned in section 9  

14. Operate a reliable service 

Decision 

The service met point 14 of the Standard

What the team has done well 

The panel was impressed that: 

  • the team has put in place a service operating model, which is currently live and includes a service maintenance page   
  • the team has provided a disaster recovery model, which has been tested by the team and includes RTO and RPO metrics 
  • the team has explored a business continuity plan for four scenarios, covering Azure SQL Database, Azure Blob Storage, Azure PostgreSQL and Azure Active Directory B2C 
  • the team has shown that backups of Azure SQL Database and Blob Storage are provided for local hardware failures, data corruption or deletion, datacentre outages, and upgrade or maintenance errors. This should enable the team to respond effectively to these issues when they occur   

What the team needs to explore 

Before their next assessment, the team needs to: 

  • explore application resilience contingency plans. The service technical team has provided resilience and contingency plans for database and storage failures. However, it has not explored the provision of a high availability (HA) architecture for the service applications, as the current hosting of the service is in one Azure zone only. In the instance of a zonal failure, the team would have to move the service to another zone, which is not currently set up. The team should explore deploying the service in a secondary zone and keeping it “warm” to mitigate loss of service due to zonal failures (see the sketch after this list) 
  • ensure that there is a governance structure and cadence for disaster recovery (DR), for example a partial DR test every six months and a full DR test every twelve months. For each test, the RTO and RPO metrics and a data loss review should be included in the DR report 
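
One way to address the zonal resilience point, sketched below in Terraform under the assumption that the applications run on Azure App Service, is a zone-redundant plan that spreads instances across availability zones so a single zonal failure does not take the service down. The plan name, SKU and instance count are illustrative; provider configuration is as in the section 9 sketch.

    # Illustrative resource group name.
    data "azurerm_resource_group" "innovation" {
      name = "rg-innovation-service"
    }

    # Zone-redundant App Service plan. Zone balancing needs a SKU that
    # supports it (for example Premium v3) and more than one instance.
    resource "azurerm_service_plan" "innovation" {
      name                   = "asp-innovation-service"
      location               = data.azurerm_resource_group.innovation.location
      resource_group_name    = data.azurerm_resource_group.innovation.name
      os_type                = "Linux"
      sku_name               = "P1v3"
      worker_count           = 3
      zone_balancing_enabled = true
    }

An alternative, closer to the “warm” secondary suggested above, is a second deployment in a paired region behind a traffic-routing service; either way, the choice should be driven by the RTO and RPO targets already defined in the DR plan.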
