From: NHSX
Assessment date: 29 July 2021
Stage: Alpha
Result: Met
Service provider: NHS Digital
Service description
This Care Identity Management (CIM) service allows NHS and care staff to be registered for a ‘Care Identity’ – a digital identity that can then be associated with the health and care organisations they work for, assigned access controls that enable appropriate access to clinical systems and patient information, and assigned authentication tokens that allow them to perform multi-factor authentication to these clinical and patient record systems.
Service users
This service is for:
- Registration Authority Managers
- Registration Authority Agents
- Registration Authority Sponsors
- Registration Authority ID checkers
- Health and Care professionals
Report contents
- Understand users and their needs
- Solve a whole problem for users
- Provide a joined-up experience across all channels
- Make the service simple to use
- Make sure everyone can use the service
- Have a multidisciplinary team
- Use agile ways of working
- Iterate and improve frequently
- Create a secure service which protects users’ privacy
- Define what success looks like and publish performance data
- Choose the right tools and technology
- Make new source code open
- Use and contribute to open standards, common components and patterns
- Operate a reliable service
1. Understand users and their needs
Decision
The service met point 1 of the Standard.
What the team has done well
The panel was impressed that:
- although for discovery and the majority of alpha there was not a dedicated user research role in the team, the whole team demonstrated a good understanding of service user needs. User input has been sought throughout using different methods (surveying, interviewing, observation) and via testing
- the wider team has been involved and invited to feed into user research and observe sessions
- there is now a user researcher role in the team that will carry through to beta. The initial activity demonstrated, such as empathy mapping and assessing gaps in generative versus evaluative evidence, will ensure user needs are considered and kept central to further development and prioritisation in the service
- the team demonstrated how they built on their understanding of their users, identifying key differences in different areas of the system, such as Clinical Commissioning Groups
- there was a good demonstration of understanding key pain points, such as providing unique identifiers, getting and finding the right information, avoiding duplicate entries, processing large intakes, and not always being able to complete the process in one sitting
- the team also showed how research and understanding user needs influenced design iterations of the ‘create user’ journey and were confident they had reached a saturation point of feedback for this piece of the work
What the team needs to explore
Before their next assessment, the team needs to:
- continue to develop their understanding of the breadth and depth of different users relevant for this service user journey. When a key project driver is to replace legacy technology there can be a risk that technical considerations overshadow a focus on user needs and being open to a full range of solutions to user problems or opportunities. This is somewhat reflected in the user needs as written in the project brief, where the solution is partly referenced in the user need. For example, “to create and manage care identity profiles” or “to generate a report.” These could be considered as actions the user needs to take in the system as designed. What is the underlying thing they are trying to achieve in language they would use?
- continue to build on the knowledge of the range of users and their needs in private beta, including ensuring involvement of users with different kinds of accessibility needs. The team acknowledges gaps in understanding and the complexity of some users, such as sponsors, as well as difficulties engaging clinical users
- continue to involve service users with research and testing methods that ensure a strong focus on user behaviour as opposed to stated preference. For example, stated preference for single screen versus multi-screen data capture is less important than understanding workarounds users might be taking because they don’t have time to complete the form in a single session
- ensure key metrics refer back to user needs
2. Solve a whole problem for users
Decision
The service met point 2 of the Standard.
What the team has done well
The panel was impressed that:
- a lot of thought had gone into how the CIM service fits into the ecosystem of different systems making up the wider user journeys
- a “heatmap” has been developed to highlight where there are gaps in the user journey which aren’t necessarily owned by this service team
- there was a very good understanding of the other service teams working on other components of the bigger picture and these were all connected through a deliberate communication structure
What the team needs to explore
Before their next assessment, the team needs to:
- continue to work in the open and remain connected to other service teams and areas of the organisation
3. Provide a joined-up experience across all channels
Decision
The service met point 3 of the Standard.
What the team has done well
The panel was impressed that:
- there was consideration and a realistic plan for how the team will communicate with users across channels to make them aware of the ongoing development of CIM (and CIS2)
- operations staff are playing a key part in user testing and are being effectively communicated with
- the team are sharing outputs and progress as regularly and as early as possible, with show and tells, all hands meetings, peer reviews and one-to-ones across the organisation
- engagement and data from service support help desks provide insight about issues and variations at a local level
What the team needs to explore
Before their next assessment, the team needs to:
- continue to engage across organisations, with frontline users and at a local level
4. Make the service simple to use
Decision
The service met point 4 of the Standard.
What the team has done well
The panel was impressed that:
- the team has used NHS design patterns and followed best practice
- several rounds of user testing and iteration have been carried out to make the “create user” journey as simple as possible
- the team made significant improvements, addressing a number of pain points in the existing service, and directly responding to user needs, for example changes to the user journey for duplicate checking to incorporate it into a step by step flow
What the team needs to explore
Before their next assessment, the team needs to:
- continue to test with users and iterate the design as new features are added to ensure they are of a similarly high standard and that they don’t adversely affect any work already done. As more features are added, ensure complexity is not added so the service remains simple to use
5. Make sure everyone can use the service
Decision
The service met point 5 of the Standard.
What the team has done well
The panel was impressed that:
- the team has so far followed accessibility best practice in the design of the service
- the team has a good awareness of the standards it needs to meet
- the team has considered the digital literacy of their users and the context in which they work, and has designed the service accordingly
- they have developed a testing strategy to cover browser technology in use across the NHS
- guidance and support are provided to hospital IT staff who support the user base
What the team needs to explore
Before their next assessment, the team needs to:
- carry out user research with assistive technology users and users with accessibility needs, as already planned
- commission a full accessibility report, as already planned
- integrate accessibility testing into the development and build phase
6. Have a multidisciplinary team
Decision
The service met point 6 of the Standard.
What the team has done well
The panel was impressed that:
- the team is multidisciplinary, covering all the roles required, including embedded subject matter expertise and experience - they have worked together to resolve issues and collectively agree on feature refinement
- the team has been covering user researcher responsibilities but has recently recruited a full-time senior user researcher to carry out a gap analysis, review research to date and lead research in private beta
- a programme board has recently been set up for the Senior Responsible Owner (SRO) to govern the delivery of this service
What the team needs to explore
Before their next assessment, the team needs to:
- ensure the SRO is fully engaged with and is championing the project at a senior level
- make sure that any work done with contractors is on a sustainable basis
7. Use agile ways of working
Decision
The service met point 7 of the Standard.
What the team has done well
The panel was impressed that:
- the team has embedded agile ways of working in the delivery of this project
- the team carries out fortnightly retros where they identify issues and collectively agree ways to resolve them
- the team includes stakeholders in sprint planning, allowing visibility about what the team is working on
What the team needs to explore
Before their next assessment, the team needs to:
- continue working in an agile way and demonstrating the benefits of agile to stakeholders
8. Iterate and improve frequently
Decision
The service met point 8 of the Standard.
What the team has done well
The panel was impressed that:
- the team has quickly built prototypes to be able to test and learn fast
- the team has carried out 20 user interviews and testing sessions over 3 sprints and have used findings to continuously improve and iterate the prototypes
What the team needs to explore
Before their next assessment, the team needs to:
- continue to test the service with users - including users with accessibility needs - and iterate
9. Create a secure service which protects users’ privacy
Decision
The service met point 9 of the Standard.
What the team has done well
The panel was impressed that:
- the team has developed the service using existing standards for authentication
- by temporarily using an off-the-shelf product to develop the private beta, the team has simplified the storage of transient data
- penetration testing has been completed on the private beta service
- AWS logs are ingested by a central security operations centre
What the team needs to explore
Before their next assessment, the team needs to:
- address some of the complexity in the current architecture - this has been acknowledged, and the team presented their plans in the assessment
- move, as planned, to utilising existing Care Identity Service data stores to avoid re-visiting the challenges and risks associated with storing data in this service
10. Define what success looks like and publish performance data
Decision
The service met point 10 of the Standard.
What the team has done well
The panel was impressed that:
- the team has identified and prioritised user pain points and needs and are focused on testing how the service can resolve these
- the team has baselined the current process, enabling service improvements to be tracked over time - time to complete, number of interactions etc.
What the team needs to explore
Before their next assessment, the team needs to:
- put in place clear objectives for the service with measurable and quantifiable key results (consider using the OKR format)
- consider using web analytics to track service performance
11. Choose the right tools and technology
Decision
The service met point 11 of the Standard.
What the team has done well
The panel was impressed that:
- for the technically complex parts of the service, the team has used an off-the-shelf product for private beta rather than developing this themselves, with the intention of removing it as the service develops over time
- by separating the front end and consuming APIs that will eventually be open to others, the team has both created an open service for others to consume and hidden from end users the complexity of removing the temporary private beta architecture
What the team needs to explore
Before their next assessment, the team needs to:
- ensure a robust plan is followed for decommissioning the temporary private beta architecture
- look to use a certified OIDC client for the integration with the OIDC provider, reducing the risk of custom development introducing bugs
- investigate any opportunity to support progressive enhancement - it is noted that the service has a JavaScript constraint due to smartcard authentication, which is not something the team can control
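To illustrate why the panel recommends a certified OIDC client rather than custom development, the sketch below shows just a few of the per-request values an OpenID Connect relying party must generate, store and later verify correctly: `state` (CSRF protection), `nonce` (ID token replay protection) and a PKCE code challenge (RFC 7636). This is a hypothetical, self-contained illustration using only the Python standard library, not NHS code; a certified client library handles all of this, plus token validation, for you.

```python
# Illustrative sketch only: the security parameters a certified OIDC client
# manages for you when building an authorization request.
import base64
import hashlib
import secrets
from urllib.parse import urlencode


def build_authorization_request(authorize_endpoint: str, client_id: str,
                                redirect_uri: str) -> tuple[str, dict]:
    """Build an OIDC authorization request URL with state, nonce and PKCE.

    Returns the URL plus the secrets the client must retain to validate
    the response later (state, nonce, PKCE code verifier).
    """
    state = secrets.token_urlsafe(32)     # CSRF protection, checked on callback
    nonce = secrets.token_urlsafe(32)     # bound into the ID token, checked on receipt
    verifier = secrets.token_urlsafe(64)  # PKCE code verifier (RFC 7636)

    # S256 code challenge: base64url(SHA-256(verifier)) without padding
    digest = hashlib.sha256(verifier.encode("ascii")).digest()
    code_challenge = base64.urlsafe_b64encode(digest).rstrip(b"=").decode("ascii")

    params = {
        "response_type": "code",
        "client_id": client_id,
        "redirect_uri": redirect_uri,
        "scope": "openid profile",
        "state": state,
        "nonce": nonce,
        "code_challenge": code_challenge,
        "code_challenge_method": "S256",
    }
    return f"{authorize_endpoint}?{urlencode(params)}", {
        "state": state,
        "nonce": nonce,
        "code_verifier": verifier,
    }
```

Getting any one of these wrong (for example, reusing a nonce or skipping state verification) weakens the flow, which is the kind of subtle bug a certified implementation is designed to prevent.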
12. Make new source code open
Decision
The service did not meet point 12 of the Standard.
What the team needs to explore
Before their next assessment, the team needs to:
- as part of the move from internal GitLab to GitHub, make a case for using public repositories
- review which parts of the service can be made open source
13. Use and contribute to open standards, common components and patterns
Decision
The service met point 13 of the Standard.
What the team has done well
The panel was impressed that:
- the service is being developed using the NHS design system
- the service is using existing standards for authentication and identity verification
What the team needs to explore
Before their next assessment, the team needs to:
- contribute new patterns developed for staff facing services to the NHS design system
- consider possible standards for the new external user management APIs e.g. System for Cross-domain Identity Management (SCIM)
- use, or develop, an open source OIDC client for the integration to the OIDC provider
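To make the SCIM suggestion concrete, the sketch below builds a minimal SCIM 2.0 User resource as defined by the core schema in RFC 7643 – the kind of payload a standards-based user management API would accept. This is an illustrative example with made-up values, not a representation of the CIM API.

```python
# Illustrative sketch only: a minimal SCIM 2.0 User resource (RFC 7643),
# the kind of payload a SCIM-based user management API would accept.
import json


def scim_user(user_name: str, given_name: str, family_name: str,
              email: str, active: bool = True) -> dict:
    """Build a minimal SCIM core User payload."""
    return {
        "schemas": ["urn:ietf:params:scim:schemas:core:2.0:User"],
        "userName": user_name,
        "name": {"givenName": given_name, "familyName": family_name},
        "emails": [{"value": email, "primary": True}],
        "active": active,
    }


# Hypothetical example user, not real data
payload = scim_user("jsmith", "Jane", "Smith", "jane.smith@example.nhs.uk")
print(json.dumps(payload, indent=2))
```

Adopting a standard like SCIM would let consumers of the external user management APIs reuse existing client tooling rather than integrating against a bespoke schema.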
14. Operate a reliable service
Decision
The service met point 14 of the Standard.
What the team has done well
The panel was impressed that:
- the team already has experience in running national critical infrastructure, providing authentication services to the NHS
- development practices are in place to reduce risks to deployment - linking to code standards, test gates triggered in the deployment pipeline, and so on
- the private beta service has multiple deployments to provide high availability
- external dependencies are well understood, and the team has developed the private beta to fail safely if there is a problem with downstream dependencies
What the team needs to explore
Before their next assessment, the team needs to:
- move to the proposed architecture for public beta - this addresses issues that were noted and understood by the team in the private beta architecture
- ensure the temporary service components are not left to linger as the new functionality is developed on the public beta architecture