From: NHSX
Assessment date: 4 November 2021
Reassessment date: 8 December 2021
Stage: Alpha
Result: conditionally met when reassessed
Service provider: National Institute for Health Research (NIHR) with PA Consulting
Conditions of the met result:
- demonstrate involvement of users with different accessibility needs
- work with GDS and NHSX to reconcile the friction between your own brand and the design system
Service description
Digital Trial Engagement (DTE) is an opt-in service available to studies on the National Institute for Health Research (NIHR) Clinical Research Network (CRN) Portfolio and NIHR funded studies. Members of the public can apply to take part in studies and study teams can speed up recruitment from this pool of pre-screened volunteers.
Studies will use the service to manage communication between themselves and study participants. This will improve the experience of taking part in research for participants and bring the administration of study communication into one place for the study team. This improved communication will lead to a better participant experience, increased retention of participants and greater adherence to protocol.
Service users
- Potential research volunteer
- Enrolled study participant
- Researcher in the central study team
- Researcher from a study site
- DTE service manager
Report contents
- Understand users and their needs
- Solve a whole problem for users
- Provide a joined-up experience across all channels
- Make the service simple to use
- Make sure everyone can use the service
- Have a multidisciplinary team
- Use agile ways of working
- Iterate and improve frequently
- Create a secure service which protects users’ privacy
- Define what success looks like and publish performance data
- Choose the right tools and technology
- Make new source code open
- Use and contribute to open standards, common components and patterns
- Operate a reliable service
1. Understand users and their needs
Decision
The service did not meet point 1 of the Standard.
What the team has done well
The panel was impressed that:
- the team has a dedicated user researcher who ensures the team understand and make use of insights via playbacks
- user pain points, needs, and assumptions based on discovery research for study team and research participant users were well summarised and described
- the team has conducted user testing with research participants in alpha and demonstrated how user feedback was used in iterating the prototype wireframes, including design changes to ensure people don’t miss key content, breaking down data capture, and changes to the account set-up and screener process to reduce friction that could cause frustration and lead to drop-off
- there are plans for ways of working between design and development cycles in beta and where research fits within this
What the team needs to explore
Before their next re-assessment, the team needs to:
- demonstrate research participant user needs have been met with the iterated prototypes as this round of testing was noted as still being collated and could not be fully assessed
- demonstrate research participant user needs for in-study communication have been met, as this is referred to in the brief as within the scope of this assessment but was not demonstrated as having been tested with users beyond a feasibility investigation into the use of GOV.UK Notify
- demonstrate involvement of study team and researcher users in the design and iteration of prototypes that meet their needs because whilst designs were shown in the assessment, user testing had not yet taken place due to challenges in recruitment and could therefore not be fully assessed
- demonstrate how the team has understood and iterated on users’ journeys with input from users; although the team have iterated on the prototype designs in alpha, it was noted that journeys were reviewed with the product owner, tech input and SMEs, but not with users until a later stage
- consider involving users with accessibility needs in user testing rounds to understand any service barriers for different users; whilst an accessibility audit is helpful, it should ideally not be relied upon in the absence of testing with users, and although this service is thought of as an addition to existing offline processes, it is important that any new service considers digital exclusion so that users are not unnecessarily excluded from a channel or service of choice
- consider how best to overcome challenges in researcher user recruitment; whilst plans to form a test group that sees multiple versions of a prototype were mentioned, consideration should be given as to whether this might bias testing due to users’ growing familiarity with prior versions and project aims
Before their next assessment, the team needs to:
- consider the different types of users within the broadly defined categories and demonstrate an understanding of where their needs are considered within the service
- consider additional key user groups in the journey and process, such as GPs, and what they need of this service to ensure it meets project objectives
- consider ways to involve the team in directly observing user research sessions
- demonstrate how the approach solves the user problems identified in discovery, such as making recruitment faster or improving the research participant experience; whilst the team has shown some evidence of this in their iteration in alpha, it’s important to test for and demonstrate behaviour and goals rather than rely overly on preference or expectation feedback about screens or processes
Reassessment
Decision
The service met (pending demonstrating involvement of users with different accessibility needs) point 1 of the Standard.
What the team has done well
The panel was impressed that:
- the team has a dedicated user researcher who ensures the team understand and make use of insights via playbacks and is now involving the team directly as observers using a rota
- user pain points, needs, and assumptions based on discovery research for study team and research participant users were well summarised and described and prior workshops explored different potential solutions to the problems and opportunities identified
- the team has conducted user testing with research participants and researcher users in alpha and demonstrated how user feedback was used in iterating the prototype wireframes and in feature prioritisation and development
- the team have demonstrated an understanding of their key service journeys and how these fit into the wider landscape of ways in which people become involved in research
- the team have plans to begin user testing with users with different access needs and for a focus on this in beta
- there are plans for ways of working between design and development cycles in beta and where research and priorities fit within this
What the team needs to explore
Before their next assessment, the team needs to:
- consider the different types of users within the broadly defined categories and demonstrate an understanding of where their needs are considered within the service
- consider additional key user groups in the journey and process, such as GPs, and what they need of this service to ensure it meets project objectives
- demonstrate clear understanding of the user journeys for self-referral and for researchers that include elements of the journey prior to, or after, the usage of the service itself that could be critical to the successful outreach, awareness, and rollout of the service into live
- demonstrate how the service and prioritised features solve the user problems identified in discovery, such as making recruitment faster or improving the research participant experience; whilst the team has shown some evidence of this in their iteration in alpha, it’s important to test for and demonstrate behaviour and goals rather than rely overly on preference or expectation feedback about screens
2. Solve a whole problem for users
Decision
The service met point 2 of the Standard.
What the team has done well
The panel was impressed that:
- the team had well-defined user needs for the problem to be addressed, considering the upstream (participant engagement) alongside the downstream (researcher requirements)
- the team showed an appreciation of the wider issues faced by the service in terms of ensuring it provides an integrated end-to-end solution for both of the key user types
What the team needs to explore
Before their next assessment, the team needs to:
- place the work in the context of the wider roadmap for the service given the full scope of the service
- ensure they are clear about what is out of scope for the MVP of this product
3. Provide a joined-up experience across all channels
Decision
The service did not meet point 3 of the Standard.
What the team has done well
The panel was impressed that:
- the team were aware of the existing non-digital service and were not looking to replace the service but provide a complementary service
- within the service the proposed solution would integrate with existing processes where appropriate and not force all communications down a digital route but ensure the service offered by researchers met the needs of participants
What the team needs to explore
Before their next re-assessment, the team needs to:
- determine how the existing recruitment channels and this new development will together enable recruitment of a wider range of participants
- ensure the designs of the two parts of the service are aligned, showing a service that provides a joined-up experience for both the researchers and the participants
- demonstrate through service mapping how the end-to-end service operates to meet the needs of all potential participants in research
Reassessment
Decision
The service met point 3 of the Standard.
What the team has done well
The panel was impressed that:
- the team had a clear view of how the existing recruitment methodologies could be used in conjunction with the new service leading to a wider range of participants. This should be seen as a key indicator of success for the project
- the team were aware of the existing non-digital service and were not looking to replace the service but provide a complementary service
- within the service the proposed solution would integrate with existing processes where appropriate and not force all communications down a digital route but ensure the service offered by researchers met the needs of participants
- the team had ensured that the service had a uniformity around design for both the participant and researcher portals giving the overall service a more joined up look and feel
What the team needs to explore
Before their next assessment, the team needs to:
- consider the wider issues of integration of some of the non-digital journeys into the service to enable participants to be added to the service through multiple points in the recruitment process, subject to the needs of the individuals
- look at the user needs of other recruitment channels to determine if they can be incorporated into the wider service design, for example those individuals recruited directly through GPs who are digitally enabled
- develop a clear service map incorporating the touch points with the existing non-digital service for recruitment for research
4. Make the service simple to use
Decision
The service did not meet point 4 of the Standard.
What the team has done well
The panel was impressed that:
- the enrolment service used the GOV.UK and NHS.UK design patterns and was easy to follow after the iteration to one question per page
- the team demonstrated understanding of how the product fits into the wider context and system constraints
- the team showed how prototype designs were iterated based on user feedback to make them easier for research participants to use
What the team needs to explore
Before their next re-assessment, the team needs to:
- ensure the user interface for the researchers is more in line with GOV.UK standards; this should be a priority, and the lack of user testing with this group is an issue that needs to be addressed
- demonstrate how the team has understood and iterated on users’ journeys; although the team have iterated on the prototype designs in alpha, they have not demonstrated how the chosen solution design was determined, what other service designs were considered to meet the identified user needs, and how the service itself (not just the screens) has been iterated and developed as the team’s understanding has evolved throughout the project
- demonstrate designs for in-study communication, which was noted as in scope for this assessment but not demonstrated
Before their next assessment, the team needs to:
- demonstrate understanding of existing and new user journeys, not just service maps, including offline elements
Reassessment
Decision
The service has provisionally met point 4 of the Standard.
What the team has done well
The panel was impressed that:
- the team had understood the value that GOV.UK and NHS.UK design patterns could bring to their service. They are building their service with components that have been refined by the research and design community across government. By harnessing the power of these patterns, they are increasing the efficiency of government spending, as they do not have to duplicate iteration that has been done in the past
- front-end coding expertise had been brought into the team, and their approach would be to make the code semantically the same as GOV.UK design patterns. This means they can start with components that are already accessible, reducing rework and duplication
- the team demonstrated understanding of how the product fits into the wider context and system constraints
- in-service communication has been explored and iterated during the reassessment phase, and the team has built their understanding of the risks and issues that they will need to tackle in the next phase
- the team showed a textbook example of design pattern iteration. Beginning with a left-hand navigation pattern, they showed how their research with users had moved them away from a linear process, resulting in a switch to a tile pattern which would allow users to pick and choose
What the team needs to explore
Before their next assessment, the team needs to:
- demonstrate how they are meeting the intention of the government design principles, point 9: “Be consistent, not uniform”. The existence of a separate design component system within NIHR is concerning for the following reasons:
  - though paid for by public monies, it is not publicly available, so the government digital community cannot benefit from it or contribute toward it
  - it sets a precedent for fragmentation of government design standards, which could lead to other services fragmenting further
  - no evidence has been supplied to show what benefits a different design system brings
- work with GDS and NHSX to reconcile this point of friction between design systems, so that together we can create a fair and transparent process for assessment
- demonstrate that they are meeting the instruction of the government design principles, point 9: “When we find patterns that work we should share them, and talk about why we use them.” The team should show that they have contributed their own research to the wider design community and NHS community. Expert and staff-facing systems are very relevant within the government digital design profession, and we know it’s hard to design complex systems for expert users. There is not nearly enough research logged on GitHub in these areas, so while the team are ‘doing the hard work to make it simple’ they must share that work so others benefit from it
- continue semantic alignment of front-end code with established patterns
- continue looking at the in-study communication aspect of the service, to understand how users will be notified and kept in touch. This is a clear driver for the service to provide a standard way of communicating, particularly at the end of a trial
5. Make sure everyone can use the service
Decision
The service met point 5 of the Standard.
What the team has done well
The panel was impressed that:
- consideration is being given to accessibility within alpha, with an external accessibility audit planned
- standard patterns from the GOV.UK design system have been used for participant registration, ensuring the components used have been widely tested
What the team needs to explore
Before their next assessment, the team needs to:
- consider testing with users with accessibility needs for both the participant population and the researcher population
- show how this work fits into wider recruitment for trials and what actions they have taken to align the systems presently in place
6. Have a multidisciplinary team
Decision
The service met point 6 of the Standard.
What the team has done well
The panel was impressed that:
- the team showed clear evidence that a multidisciplinary team had been involved in the development of the service to date, with a wide range of individuals involved in providing evidence to the assessment panel
- there was also clear evidence of engagement across the team and with other teams within NIHR to ensure wider NIHR objectives are being addressed whilst the service is being developed
What the team needs to explore
Before their next assessment, the team needs to:
- ensure the newly recruited staff are integrated into the team, and in particular that the content designer is able to address the issues with aligning content across the service
7. Use agile ways of working
Decision
The service met point 7 of the Standard.
What the team has done well
The panel was impressed that:
- the team have adopted agile ways of working; this was shown to be embedded in the team, with all members contributing and having an equal say in the development
- the agile methodology was not all about daily stand-ups; it was evident from the presentation that the team operates together rather than just at the stand-ups, with a well-integrated team consisting of internal staff and contractors
What the team needs to explore
Before their next assessment, the team needs to:
- ensure the whole team is involved in the user research rather than relying on feedback from the user researcher
8. Iterate and improve frequently
Decision
The service met point 8 of the Standard.
What the team has done well
The panel was impressed that:
- the changes to the participant enrolment service were clear to see from their initial versions, showing the service is being developed iteratively based on user feedback
What the team needs to explore
Before their next assessment, the team needs to:
- ensure the researcher “portal” is developed in a similar manner to the participant enrolment service, with ongoing user research enabling it to be iterated to meet the stated user needs
9. Create a secure service which protects users’ privacy
Decision
The service met point 9 of the Standard.
What the team has done well
The panel was impressed that:
- the service is being developed using encryption of data at rest and in transit using appropriate protocols and methodologies
- the service is using 2FA for the authentication
- the service is using appropriate key vault and certificate management, as well as monitoring tools for security
- the team is working with the Data Protection Officer on the Data Protection Impact Assessment (DPIA) and planning for the penetration test
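The report notes the service uses 2FA but does not describe the mechanism. Purely as an illustrative sketch (not the team’s implementation), one common 2FA building block is the time-based one-time password (TOTP) defined in RFC 6238, which derives short-lived codes from a shared secret and the current time:

```python
# Illustrative only: a generic RFC 6238 TOTP, a common 2FA building block.
# This is not taken from the service under assessment.
import base64
import hmac
import struct


def totp(secret_b32: str, at_time: int, digits: int = 6, step: int = 30) -> str:
    """Derive a TOTP code from a base32 shared secret at a given Unix time."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = struct.pack(">Q", at_time // step)  # 8-byte big-endian time counter
    digest = hmac.new(key, counter, "sha1").digest()
    offset = digest[-1] & 0x0F  # dynamic truncation per RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)


# RFC 6238 test vector: secret "12345678901234567890", time 59 -> "287082"
print(totp("GEZDGNBVGY3TQOJQGEZDGNBVGY3TQOJQ", 59))
```

In practice a service would verify the submitted code against the current time window (and usually the adjacent windows, to allow for clock drift) rather than generating codes itself.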
What the team needs to explore
Before their next assessment, the team needs to:
- ensure a DPIA is completed
- ensure that the security risk assessment, threat and vulnerability tests, and penetration tests are all completed
- publish the approved cookie and privacy policies
10. Define what success looks like and publish performance data
Decision
The service met point 10 of the Standard.
What the team has done well
The panel was impressed that:
- the service owner clearly articulated the purpose of the service and showed a clear understanding of the expected benefits and the KPIs required to be measured to show success
- the linkage of the service with wider organisational aims was captured within the KPIs proposed in the documentation provided
What the team needs to explore
Before their next assessment, the team needs to:
- determine the targets that the service is aiming to achieve for the proposed KPIs; these may be in the business case but could be detailed alongside the KPI information
11. Choose the right tools and technology
Decision
The service met point 11 of the Standard.
What the team has done well
The panel was impressed that:
- the team is using the right tools and technologies for development and agile delivery
- the service is using appropriate tests and CI/CD tools for development and release management
- the team has engaged with the security architect and is using secure-by-design principles
12. Make new source code open
Decision
The service met point 12 of the Standard.
What the team has done well
The panel was impressed that:
- the service team has plans to publish the code in the open on GitHub
- the service team is using open source tools for development
13. Use and contribute to open standards, common components and patterns
Decision
The service met point 13 of the Standard.
What the team has done well
The panel was impressed that:
- the service is using APIs for interoperability
- the team is using common components like GOV.UK Notify, and NHS internal systems like CPMS and RTS
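As a hedged illustration of how a common component like GOV.UK Notify is typically driven: Notify’s v2 REST API accepts a JSON body naming a pre-built message template plus per-message personalisation values. The template UUID and personalisation field names below are invented for illustration and are not from the service:

```python
# Sketch of the JSON body for GOV.UK Notify's "send email" endpoint
# (POST /v2/notifications/email). Template ID and field names are placeholders.
import json


def build_notify_email(email_address: str, template_id: str,
                       personalisation: dict) -> str:
    """Build the JSON request body for a GOV.UK Notify email notification."""
    return json.dumps({
        "email_address": email_address,
        "template_id": template_id,
        # Values slotted into ((placeholders)) in the Notify template
        "personalisation": personalisation,
    })


body = build_notify_email(
    "participant@example.com",
    "00000000-0000-0000-0000-000000000000",  # placeholder template UUID
    {"study_name": "Example Study", "next_visit": "12 January 2022"},
)
```

In a real integration the team would more likely use the official Notify client library for their platform, which wraps this request and handles the API key and authentication.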
14. Operate a reliable service
Decision
The service met point 14 of the Standard.
What the team has done well
The panel was impressed that:
- the solution was being developed with strong governance, including around technical design, which will ensure the service is built in a reliable manner
- the service is using a CI/CD pipeline and can release new features without any downtime
- the service will be using wider NHS business-as-usual (BAU) services for support
What the team needs to explore
Before their next assessment, the team needs to:
- develop a maintenance and outage page
- develop and publish the workflows for escalation and workarounds for when the service is down