https://digitalhealth.blog.gov.uk/2022/11/11/nhse-standards-directory-beta-assessment/

NHSE Standards Directory Beta Assessment

Categories: Assurance, Beta, Service assessments

Text saying "Service Assessment" with DHSC approved brand colours

From: DHSC
Assessment Date: 14/07/2022
Stage: beta
Result: met
Service Provider: NHS England with Marvell Consulting

Previous assessment reports 

Service description 

The service brings together different types of information standards needed to support interoperability across health and adult social care. It allows users such as healthcare providers and IT suppliers to find out which standards to use to collect, format and exchange data in real time between systems and across organisational boundaries. The directory makes it easy to browse content that is otherwise spread across many sites. It helps users understand which standards apply to their use case and provides key information such as dependencies between standards and whether they have been assured or endorsed by professional bodies.

Service users 

Primary users are people who adopt standards

This group includes: 

  • suppliers building technology products 
  • health and social care providers 
  • regional bodies such as integrated care systems (ICS) 

Secondary user groups include: 

  • creators and owners of information standards such as the Professional Record Standards Body (PRSB) 
  • assurance boards, such as the Data Alliance Partnership Board (DAPB) 
  • procurement teams 

Report contents

  1. Understand users and their needs
  2. Solve a whole problem for users
  3. Provide a joined-up experience across all channels
  4. Make the service simple to use
  5. Make sure everyone can use the service
  6. Have a multidisciplinary team
  7. Use agile ways of working
  8. Iterate and improve frequently
  9. Create a secure service which protects users’ privacy
  10. Define what success looks like and publish performance data
  11. Choose the right tools and technology
  12. Make new source code open
  13. Use and contribute to open standards, common components and patterns
  14. Operate a reliable service

1. Understand users and their needs

Decision 

The service met point 1 of the Standard, with strong recommendations.

What the team has done well 

The panel was impressed that: 

  • the team was able to show a nuanced understanding of the different organisations and user types that need to find standards affecting organisations in England  
  • the team iterated the service's design with proxy users with differing access needs and gave good examples of improvements made as a result of this research  
  • the team was able to demonstrate common user scenarios based on research in private beta  
  • the user needs were well documented, categorised and linked, giving the project team an understanding of what needed to be prioritised  

What the team needs to explore 

Before their next assessment, the team needs to: 

  • test with a large proportion of users who are unfamiliar with the service to ensure their goals are met; this includes users looking for standards outside of England 
  • continue to find and test with genuine users with access needs and show where and how they looked for these users. This includes users with temporary disabilities  
  • ensure internal users with differing accessibility needs can use the CKAN (an open-source platform for sharing data) product to maintain the service in live 

2. Solve a whole problem for users

Decision 

The service met point 2 of the Standard, with strong recommendations.

What the team has done well 

The panel was impressed that: 

  • the team continues to develop this service within the wider standards environment, helping users to not only find but also contribute to and keep in touch with emerging standards    
  • the team is making contact with other services in this area and joining things up for users  

What the team needs to explore 

Before their next assessment, the team needs to: 

  • prioritise testing with users who may be looking for other standards, such as accessibility standards or standards that relate to other UK jurisdictions, and how they will signpost those users to appropriate services  

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard. 

What the team has done well 

The panel was impressed that: 

  • the team continues to collaborate with other teams working in this space and has explored how they might join up different services  
  • the team has planned channels of communication, including search engine optimisation and signposting   
  • the team has considered how they will maintain and update content, including setting up a system to monitor changes and has produced guidance for the permanent team   
  • the team has spoken to standards owners and gives users the option to contact them for support

What the team needs to explore 

Before their next assessment, the team needs to: 

  • test user journeys, starting where users are such as with search or signposting    
  • review analytics to understand and improve user journeys, including referrals to and from other services   
  • explore the use of page dates to better understand how users interpret whole-page reviews versus interim updates, as well as version control in the standards   

4. Make the service simple to use

Decision 

The service met point 4 of the Standard. 

What the team has done well 

The panel was impressed that: 

  • the team has iterated and simplified the service following user testing, including changing the service name and making it easier for users to deep link to the content they need   
  • the team has introduced a help and resources page and has cut back and simplified guidance     
  • the team has used the NHS design system and built on NHS or GOV.UK work in areas where patterns have not been published

What the team needs to explore 

Before their next assessment, the team needs to: 

  • prioritise testing with users who have accessibility needs or cognitive differences and iterate accordingly  
  • test with users unfamiliar with the standards and subject matter   
  • explore further how well users are succeeding first time, including when they follow mail links to other services

5. Make sure everyone can use the service

Decision 

The service met point 5 of the Standard. 

What the team has done well 

The panel was impressed that: 

  • the team has carried out automated accessibility testing and has had an external audit against WCAG 2.1 AA, and has fixed a number of issues  
  • the team has tested the service with 8 proxy users with cognitive, motor, vision or hearing impairments

What the team needs to explore 

Before their next assessment, the team needs to: 

  • fix remaining issues identified in the accessibility audit and re-audit the service  
  • test further with users with access needs or cognitive differences, for example, exploring internal staff networks  
  • check how accessible CKAN is and improve its accessibility  
  • use their position to influence standards owners to move from PDF to HTML 

6. Have a multidisciplinary team

Decision 

The service met point 6 of the Standard. 

What the team has done well 

The panel was impressed that: 

  • the planned public beta team includes key team members from private beta   
  • the team has successfully recruited an in-house product manager and metadata content manager with plans to support effective onboarding and future handover      
  • the team includes a variety of skills across the Digital Data and Technology (DDaT) job family with embedded support from policy and the NHS England Standards & Interoperability team 

What the team needs to explore 

Before their next assessment, the team needs to: 

  • continue to support the effective onboarding of the in-house product manager and incoming metadata content manager 
  • establish clear plans for an effective handover after public beta      
  • continue to dedicate time to upskilling and building the capabilities of the in-house team

7. Use agile ways of working

Decision 

The service met point 7 of the Standard. 

What the team has done well 

The panel was impressed that: 

  • the team has used agile ways of working and frequently collaborated with the service owner   
  • the team has ensured effective governance with support from the service owner, weekly risk meetings and use of a steering group    
  • the team has ensured that user research was a team sport, with team members observing research and supporting research activities such as prioritisation and analysis   
  • the team has engaged a variety of other services and shared feedback with them

What the team needs to explore 

Before their next assessment, the team needs to: 

  • continue to use agile ways of working to iterate quickly while collaborating with other service teams and in-house team members  

8. Iterate and improve frequently

Decision 

The service met point 8 of the Standard. 

What the team has done well 

The panel was impressed that: 

  • the team has demonstrated how they have iterated and improved different areas of the service, including the home page, search, the roadmap and filters and labels  
  • the team has made sure iterations were driven by findings from user research 

What the team needs to explore 

Before their next assessment, the team needs to: 

  • prioritise testing and iterating the service to better meet the needs of users with accessibility needs. This includes internal and external users across the service  
  • use key performance indicators (KPIs), metrics and feedback from users to inform future iterations within Public Beta  
  • test and iterate new areas, for example around future standards  

9. Create a secure service which protects users’ privacy

Decision 

The service met point 9 of the Standard. 

What the team has done well 

The panel was impressed that: 

  • the team have planned an external penetration test prior to public beta and will address issues before any release into public beta  
  • the team has implemented appropriate consent and cookie handling within the service for analytics  
  • the team have looked at requirements and handling of information for a data protection impact assessment (DPIA) and committed to updating the need for a DPIA as the service evolves  
  • the team have implemented a web application firewall (WAF) as part of their private beta build to mitigate common threats, attacks and exploits, with basic protection for elements of their service      
  • the team have further developed monitoring and automation with AWS as well as StatusCake monitoring and user monitoring, and have some ideas about management and alerting within the team

What the team needs to explore 

Before their next assessment, the team needs to: 

  • address any findings from their penetration testing prior to release to public beta and show how they re-evaluate any vulnerabilities over time  
  • ensure that the team and product collateral evolve as the service develops in public beta, such as making sure they continue to consider DPIA  
  • think about how the transition for support from a security and patching perspective would work; especially how to ensure that a wider group than the team would handle urgent issues or notifications in future live operation   
  • consider the value that additional services for checking the vulnerability status and security of their product, for example National Cyber Security Centre (NCSC) Web Check, would bring  
  • ensure their ongoing arrangements for testing and support are in place as they move towards a live service and handover 

10. Define what success looks like and publish performance data

Decision 

The service met point 10 of the Standard. 

What the team has done well 

The panel was impressed that: 

  • the team made sure user satisfaction and completion rate are partially captured, with additional measurements to be captured in public beta  
  • there was a clear problem statement articulating what the service is trying to solve that links to KPIs about the completion of various user journeys, and feedback on whether users found the documentation they needed  
  • the team have added, or plan to add, additional feedback capture methods, such as Hotjar  
  • the team have a good technical framework for capturing discrete service performance data to overlay with other feedback and analytics

What the team needs to explore 

Before their next assessment, the team needs to: 

  • clearly define success, with analytics linked to desired service outcomes; explore specific ways to show a user got what they needed, such as a “was this page useful” question on the page  
  • identify relevant metrics to measure whether users are succeeding first time including when they follow links from the site to other services  
  • explore performance indicators beyond the 4 mandatory KPIs  
  • ensure that search engine optimisation (SEO) behaviour and performance is considered as a key metric, for example measuring how many people find the service organically  
  • implement the planned improvements to analytics and logging as outlined in their plan, being prepared to change approach or prompts as more data is gathered and real-world users access the service  

11. Choose the right tools and technology

Decision 

The service met point 11 of the Standard. 

What the team has done well 

The panel was impressed that: 

  • the team has evolved the use of their public cloud components from alpha, continuing to use Amazon Elastic Kubernetes Service (EKS) and managed databases to deliver a manageable solution  
  • the team have justifiably maintained a single region architecture to minimise cost, while clearly understanding recovery steps and limitations     
  • the team have continued to develop open source components to deliver their service and main product architecture  
  • the team have mostly automated deployment and release pipelines and plan to continue to manage their service in an automated way       
  • the team have used a combination of appropriate tools and technology to manage their service across a mixture of roles, such as notifications in Slack channels for all users to see 

What the team needs to explore 

Before their next assessment, the team needs to: 

  • continue to explore updating the helm chart and deployment scripts for the service to address identified issues allowing for redeployment with minimal intervention    
  • continue to consider how appropriate links are made for the service with other areas of digital guidance, to ensure best placement in search results and to offer the clearest journey to the service such as other NHS standards or service manual  
  • address potential technology limitations of the admin (CKAN) interfaces to ensure they are accessible and brandable for future development of the service  
  • consider exploring other sign-in methods or role-based access control that could work with other platforms in the future, where this is appropriate for support or delegated management of content items  
  • consider how to address the issue of some linked information ageing or changing, potentially by flagging it 

12. Make new source code open

Decision 

The service met point 12 of the Standard. 

What the team has done well 

The panel was impressed that: 

  • the team have continued to make source code open using GitHub      
  • the team have kept configuration and secrets closed to ensure security      
  • the team have fed back some enhancements to other repositories and products 

What the team needs to explore 

Before their next assessment, the team needs to: 

  • ensure that their public repository is up to date  
  • consider what will happen to maintenance of the code and administration of the repository as the service is transitioned. There should be a clear management process for any supplier-to-product-team handovers 

13. Use and contribute to open standards, common components and patterns

Decision 

The service met point 13 of the Standard. 

What the team has done well 

The panel was impressed that: 

  • the team has used the NHS design system and has built on NHS and GOV.UK work where the components and patterns they are using have not been published  
  • the team has shared learnings with the NHS community, via GitHub and at design huddles and has linked up with other teams   
  • the team has continued to share and feedback to the CKAN community about the basic changes needed from this service

What the team needs to explore 

Before their next assessment, the team needs to: 

  • continue to share learnings and contribute to the NHS and GOV.UK design communities  

14. Operate a reliable service

Decision 

The service met point 14 of the Standard. 

What the team has done well 

The panel was impressed that:    

  • the team has made rational decisions about having single-region deployments for public beta, appropriate to the cost and recovery time of the service, which could be changed as the service progresses through beta or in response to differing future needs    
  • the team have understood their recovery procedure and testing scenarios  
  • the team have undertaken an appropriate level of work looking at load testing and performance from baselines of other similar services across government  
  • the team have a combination of infrastructure performance monitoring and logging and basic service availability checks to alert the team to service failures or performance issues  
  • the technology components and scaling selected should cover a range of different scenarios that can be further refined as public beta progresses 

What the team needs to explore 

Before their next assessment, the team needs to: 

  • fully consider documenting the service availability, Recovery Time Objective (RTO) and Recovery Point Objective (RPO) metrics and targets that the service is designed to meet, so these are clearly understood, relate to the features and decisions taken in design, and support testing
  • consider implementing redirects to inform the public when the service is under maintenance or has planned downtime or similar, to avoid the service appearing to be unavailable        
  • consider how support for CKAN would be maintained and adopted if the service was fully transitioned to public servants with limited supplier involvement      
  • refine, develop and further articulate testing scenarios beyond the basic load, health and release checks already developed. The team should consider testing functionality and critical-path scenarios or process checks that cover a wider range of scenarios
