
https://digitalhealth.blog.gov.uk/2022/01/21/nice-ai-regulation-service-alpha-assessment/

NICE AI Regulation Service alpha assessment

Categories: Alpha, Assurance, Service assessments


From: NHSX
Assessment date: 18 October 2021
Stage: Alpha
Result: Met
Service provider: NICE with BJSS

Service description

Data-driven technology solutions such as artificial intelligence (AI) are growing in the health and social care market at a significant rate. People developing these technologies (developers) and those looking to use them (adopters) often struggle to find and understand the regulations on safety, efficacy, and data use and protection. Developers also want to understand how to demonstrate real-world cost-effectiveness (known as health technology assessment, or HTA), and adopters want guidance on how to assess and compare this across different solutions.

The “AI Regulation Service”, developed by the multi-agency advisory service for AI in health and social care (MAAS Programme), will consolidate this information in one streamlined website and enable 1:1 guidance where required, so that the journey to safe and value-adding implementation of these technologies is simplified for both developers and adopters.

Service users

  • Developers of AI and data-driven technologies in the context of health and care. The user archetypes identified, which can also be combined to generate more nuanced descriptions of needs, are:
    • Established companies with experience developing technology solutions in health and care already, including regulatory and HTA experience
    • Tech natives, with strong experience in AI and data-driven technologies but limited experience of health and social care and of its regulations and market
    • Healthcare natives, with strong experience in health and care but limited experience in AI and data-driven technologies. These are typically start-ups or spin-outs from academia or a clinical context, and tend to be clinician-led.
  • Adopters of AI and data-driven technologies in the context of health and care
    • Commissioners focused on delivering value at a large population level
    • Secondary care, focused on meeting the needs of their patients
    • Primary care and social care, looking to make local impact and improvements

Report contents

  1. Understand users and their needs
  2. Solve a whole problem for users
  3. Provide a joined-up experience across all channels
  4. Make the service simple to use
  5. Make sure everyone can use the service
  6. Have a multidisciplinary team
  7. Use agile ways of working
  8. Iterate and improve frequently
  9. Create a secure service which protects users’ privacy
  10. Define what success looks like and publish performance data
  11. Choose the right tools and technology
  12. Make new source code open
  13. Use and contribute to open standards, common components and patterns
  14. Operate a reliable service

1. Understand users and their needs

Decision

The service met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • the programme report, assessment presentation and show and tells showcase good quality research documentation for a discovery and alpha stage
  • user needs are drawn from multiple sources, such as user interviews and quotes, secondary research, and stakeholder interviews and workshops
  • the approach to discovery, including stakeholder and MAAS partner engagement workshops alongside in-house expertise, has enabled the team to map the end-to-end user journeys for different user groups
  • within the team, several colleagues were involved in user research and the user engagement process
  • findings are regularly shared with the wider stakeholders and team via show and tells
  • the team has processes in place for reaching into networks of relevant professional users 
  • the team calculated an approximate market size based on healthcare service engagement; this analysis could continue further by estimating the volumes of individuals with various skill types working in the different AI-related roles
  • appropriate and relevant personas and sub-personas have been created
  • the team have defined simple user flows identifying core ‘must’ legal requirements within the journey
  • pivoting in alpha: the team discounted building their own platform with account logins and so on, seeing it as unnecessary when a similar one (the NHS Innovation Service) already exists
  • there have been multiple rounds of concept testing and research with different user groups. The findings are leading to some further iteration of the service proposition

What the team needs to explore

Before their next assessment, the team needs to:

  • bring a dedicated user research (UR) role into the team, to create a research strategy that supports ongoing discovery of users' needs and to ensure adequate research documentation
  • structure research planning more clearly, for example: how are the team prioritising the requirements for design and research? What are the plans for the next few sprints? Can the team demonstrate research recommendations flowing through into service, and demonstrate evidence that KPIs have improved?
  • provide evidence of how the personas, sub-personas and user needs are being used to prioritise and drive the team’s work. Adopter sub-personas could be refined further by considering archetypes around shared behaviours, goals or motivators, and pain points or barriers, rather than by healthcare organisational type
  • conduct full end-to-end user testing across all the journey channels and touchpoints: for example, from the user experiencing an initial problem, through finding the service, identifying content, navigating between channels and registering if required, to completing transactions
  • alongside task-based testing, the team could use a wider range of qualitative methods (like diary studies) to understand how users engage with this service and complete journeys that require returning over a longer period of time: for example, to understand how supporting information is gathered and processed before it is sent to the NHS Innovation transactional service forms
  • continue regular testing with specific user groups, and consider broadening the recruitment approach to help ensure a diverse range of feedback is received (for example, creating user panels, using B2B recruiters, incentivising participants, online community engagement channels, popular development websites)
  • demonstrate intent to research and understand wider user groups than the current personas. This includes users who are not already providing services in the healthcare space, as there is a risk of research bias towards the needs of users who already know how to navigate public sector regulations. It also includes international users who may intend to export solutions to the UK. This intent should be included in a research plan that targets user groups and the assumptions or hypotheses they plan to test
  • have a plan for regular and continuous tracking of user satisfaction and usability KPIs across various service touchpoints, including a plan for gathering metrics around the demographics of users (for example, on-site feedback, follow-up surveys, analytics)

2. Solve a whole problem for users

Decision

The service met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • the team focussed on a solution that can be provided quickly: for example, content, guidance and signposting links to paid advice services
  • the team identified pain points for developers and adopters navigating the AI regulation ecosystem. They discovered the current ecosystem favoured experienced developers who had greater understanding and dedicated resources for navigating AI regulations
  • the team aimed to reduce barriers to entry and reduce problems for tech-native and healthcare-native developers
  • the team has fairly clear service goals: in the short term, ensure people know where to go to start and find support; in the longer term, change behaviours for developers, ensuring they do not waste time using the wrong resources
  • the design team operates in a complex regulatory environment with many partner organisations and stakeholders. The AI regulations are also currently under review, so there is some uncertainty about exactly what content the team must provide to users in future. The design approach and templates support rapid changes based on evolving policy requirements and user needs

What the team needs to explore

Before their next assessment, the team needs to:

  • over the short term, the team could consider whether any of the issues can be solved at source, for example by removing or updating outdated, legacy or duplicated content across the various gov.uk, regulatory and MAAS partner sites
  • clarify their role and scope in terms of supporting simplifying the wider complex set of regulations. For example how might they work with stakeholders, policy makers and regulators to share evidence that supports simplifying end-to-end journeys for users
  • through private beta, the team should conduct discovery research to understand the role and benefit of providing ‘expert advice... paid services’ links within this service. How do users feel about these paid services? What problems do the paid services solve? Can any of the knowledge be made more accessible to SMEs?
  • explore several options beyond content-only pages of content templates and guidance to meet users’ needs. It would be good to see evidence of the team trying multiple UI patterns, for example checkboxes, other triage processes, or other toolkits and frameworks that might enable smaller AI development teams or SMEs to adopt best practices more quickly
  • continue monitoring the quantitative usability measures captured during moderated research (task completion and SUS); this will be important at a larger scale as the amount of content and features on the service grows
  • clarify how they are planning to measure success for their main service goals. For example, how do the team plan to measure behaviour change and adoption of the wider regulations?

3. Provide a joined-up experience across all channels

Decision

The service met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • the team found that users will mostly discover the AI regulations and guidance by Googling. Following this, the team are already planning SEO work for the new site to ensure search engines identify relevant content (a small sitemap sketch follows this list). The team are also providing signposting between the several AI sites that NHSX and MAAS manage
  • the team mentioned having communication channels established for reaching non-digital users via newsletters (for example, if users are not aware of the new site) 
  • the team intends to join up with the NHS Innovation service to provide advice to users. This is mapped in the service blueprint and it will prevent the duplication of features that already exist such as account creation
  • the team is collaborating with linked services and have dedicated relationship managers for each linked service
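
As a small illustration of that SEO groundwork, the sketch below shows how a sitemap could be exposed on the proposed Wagtail stack (see point 11), using Wagtail's documented sitemap view. Its use in this particular service is an assumption, not something the team described.

```python
# urls.py fragment: publish an XML sitemap of live pages so search
# engines can index the guidance content. Assumes the Wagtail stack
# proposed under point 11; requires "django.contrib.sitemaps" in
# INSTALLED_APPS.
from django.urls import path
from wagtail.contrib.sitemaps.views import sitemap

urlpatterns = [
    # Served at /sitemap.xml, generated from the published page tree.
    path("sitemap.xml", sitemap),
]
```

Submitting that sitemap through search engine webmaster tooling would then complement the signposting between the NHSX and MAAS sites.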

What the team needs to explore

Before their next assessment, the team needs to:

  • document how this new service links and integrates with the NHS Innovation Service. Clarify the roles of both services and how the user experience flows across multiple channels
  • consider a wider comms strategy for beta: for example, how do users know if a regulation has changed after they initially visited the site, if they don’t have an account? Do users need any type of notification?
  • the service blueprint should be expanded to begin before entering the service. For example, the team should explore what other websites or services AI developers and adopters regularly use, and whether there are opportunities to share content, increasing awareness and creating engagement beyond search and outbound mail campaigns. There is a risk that newsletters only reach user groups who are already engaged and follow changes in the regulations

4. Make the service simple to use

Decision

The service met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • the team articulated a clear vision of success for this service during alpha: to provide clarity of regulation. Having established that, their approach will be to concentrate on getting the content right, and then funnel users to the NHS E&I Innovation service, where users can create an account
  • the team demonstrated good awareness of wider NHS business drivers, such as Mission 2: Joined-up services that solve whole problems and span multiple departments. By passing on the complexity of user account creation, the team hope to avoid duplication and concentrate resource on creating clear content
  • the panel were extremely impressed with the comprehensive and robust content design approach. The team documented activities and deliverables from high level strategy, through information architecture and a detailed taxonomy. They analysed a huge amount of existing content and condensed it down into a clear and detailed report
  • they involved subject-matter experts in co-creating a vision of a simple service, showcasing a wide range of stakeholder management skills including workshop planning and facilitation
  • the team demonstrated awareness of design system patterns, both NHS and gov.uk

What the team needs to explore

Before the next assessment, the team needs to:

  • continue developing their service in alignment with gov.uk and NHS design patterns
  • continue the work they are doing to push the boundaries of established design patterns, so that they can support the user needs that they are uncovering. At the same time keep clearly documented evidence, and share findings with the research and design communities when appropriate
  • the expander component is used frequently - the team should continue assessing its relevance through beta
  • maintain and build on the relationships they established with peers in adjoining services, continuing to work across organisational boundaries to understand how to create a simple, streamlined service with minimal duplication
  • the iteration of ‘Must’ vs ‘Should’ to ‘Legal requirement’ vs ‘Best practice’ was well researched and documented. This is a piece of work that will have resonance with other government departments, and the team should share it across the content profession in the NHS and gov.uk during beta

5. Make sure everyone can use the service

Decision

The service met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have completed an initial round of testing with 5 users who had various access needs (via the Digital Accessibility Centre, DAC)
  • the team have plans for a ‘service backdoor’ and email support channel for users who cannot use the transactional service
  • they have used a plain English content strategy, following gov.uk and NHS standards
  • they have made impressive progress in understanding how language causes barriers to understanding this specific subject matter, and developing a bespoke glossary to help new users navigate this complex area

What the team needs to explore

Before their next assessment, the team needs to:

  • continue their access needs research; many professional users in the NHS landscape (developers, adopters, and so on) have a wide variety of access needs. Many NHS users also work in challenging contexts and environments, potentially with slower connections or older hardware or software. Keeping more detailed records of research participants and service user demographics is recommended going into beta
  • continue using a plain English content strategy, following gov.uk and NHS standards, and incorporate a process for testing that language continues to be accessible
  • continue building the bespoke glossary and share across the content profession

6. Have a multidisciplinary team

Decision

The service met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • the core team had a blend of permanent and contractual team members with a variety of digital skills and expertise
  • contractual team members helped upskill permanent team members in digital skills and expertise. This, alongside the Programme Recommendation Report, will help the transfer of skills and knowledge to future delivery teams
  • the Programme Recommendation Report details a proposed plan for the core team in private beta
  • the core team frequently collaborated with the working group (MAAS partners) to access specialised expertise
  • the three designers involved had significant experience in running and delivering user research on projects 

What the team needs to explore

Before their next assessment, the team needs to:

  • recruit a specialised user researcher into the core team. This role is currently shared across multiple team members, and without dedicated user researchers the team are at risk of expert bias
  • continually work with a blended team of in-house talent and externally commissioned experts. This collaboration is crucial for service delivery
  • continue to upskill and recruit permanent staff for key roles, such as delivery manager, to build in-house digital skills and expertise

7. Use agile ways of working

Decision

The service met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • the core team used recognised agile ways of working to inspect and adapt. For example using daily stand-ups, weekly retrospectives and sprint showcases
  • the core team used Kanban to visualise the work in progress
  • the core team used fortnightly sprints with sprint goals
  • there was frequent collaboration with MAAS partners, who attended workshops. To capture unbiased opinions, the core team occasionally met with individual MAAS partners for feedback
  • team members were assigned as ‘relationship managers’ with the linked services
  • team members shadowed user research sessions

What the team needs to explore

Before their next assessment, the team needs to:

  • expand the audience for private beta showcases and show and tells to include representatives from the six services with links to the AI Regulation Service 
  • continually develop the digital skills and expertise of permanent team members, especially in agile ways of working and user-centred design
  • use private beta to test governance. The Programme Recommendation Report proposes a new governance structure for post-beta. It is recommended that the service team establishes the governance arrangements sooner and ensures they are consistent with agile governance principles

8. Iterate and improve frequently

Decision

The service met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • the team demonstrated iterations of information architecture and design that were informed by emergent learnings. These iterations were driven by some user research, stakeholder workshops, concept testing and accessibility sessions
  • content such as the guidance pieces was iterated. The panel was especially impressed by findings regarding the language of ‘must/should’, which was iterated to ‘Legal requirement/best practice’

What the team needs to explore

Before their next assessment, the team needs to:

  • complete end-to-end user testing to identify any insights to drive further iterations. This should include testing how users find and access the service and, if desired, how they use this service to access the NHS Innovation Service
  • continue to use sprints and sprint goals to efficiently iterate the service
  • clearly document any learnings from feedback cycles, for example user testing and show and tells, to evidence the rationale for future improvements

9. Create a secure service which protects users’ privacy

Decision

The service met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had considered security implications for the service at alpha. In their planning, they had identified the need for security boundaries and the likely threat vectors that might affect their service when built
  • the team had proposed a security architecture that could accommodate encryption at rest and in transit (a sketch of the transport-security side follows this list)
  • the team have made sensible recommendations for consideration during beta for technology and operational security (for example, automated AV, perimeter firewall)    
  • in proposing to re-use elements of the existing Innovation service, the team had considered the benefits of reusing Identity Management and secure logon and joining up logging, security alerting and configuration, which should provide more assurance of a secure and monitored service 
  • the team had given some consideration to the types and sensitivity of data that the service might handle   
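
To illustrate the encryption-in-transit half of that architecture, here is a minimal sketch of the transport-security settings Django exposes, assuming the Wagtail/Django stack proposed under point 11; encryption at rest would sit with the hosting services (for example, Azure storage and database encryption) rather than in application settings. These are standard Django settings, but their use here is an assumption rather than the team's documented configuration.

```python
# settings.py fragment: enforce encrypted transport for all traffic.
# Standard Django settings; their use here is illustrative, not the
# team's documented configuration.

SECURE_SSL_REDIRECT = True      # redirect any plain-HTTP request to HTTPS
SESSION_COOKIE_SECURE = True    # never send the session cookie over HTTP
CSRF_COOKIE_SECURE = True       # same for the CSRF cookie

# HTTP Strict Transport Security: browsers refuse plain HTTP for a year.
SECURE_HSTS_SECONDS = 31536000
SECURE_HSTS_INCLUDE_SUBDOMAINS = True
SECURE_HSTS_PRELOAD = True
```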

What the team needs to explore

Before their next assessment, the team needs to:

  • in beta, the team should, as planned, ensure that they check their deployed services with an appropriate set of tools at varying stages of deployment, for example Web Check and Azure vulnerability scanning, in addition to code coverage and automated testing
  • complete a Data Protection Impact Assessment (DPIA) if applicable, or amend documentation based around the existing Innovation service; the team must check and begin the process of updating or completing a form early in any beta delivery
  • ensure that their new content service publishes appropriate cookie consent notices for analytics (a minimal consent-gating sketch follows this list). If any further integration or registration for users were added, the service would need a greater focus on retention and security of access for end user interactions
  • engage with the NHS security and platform team to conduct vulnerability assessments, including the suggested penetration test
  • consider the security aspects of the CMS content and administrative accounts in terms of reporting for reset, in addition to planned service desk support interactions
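
To make the cookie consent point concrete: assuming the service is built on Wagtail (the Django-based CMS proposed under point 11), one hedged approach is a context processor that only exposes an analytics flag once the visitor has opted in. This is a minimal sketch; the cookie name, module path and template variable are illustrative assumptions, not the team's implementation.

```python
# core/context_processors.py -- hypothetical module name.
def analytics_consent(request):
    """Expose whether the visitor has consented to analytics cookies.

    Assumes a hypothetical 'cookies_policy' cookie that a consent
    banner sets to 'accepted'; until then no analytics script renders.
    """
    return {
        "analytics_allowed": request.COOKIES.get("cookies_policy") == "accepted"
    }
```

The processor would be registered under `TEMPLATES[0]["OPTIONS"]["context_processors"]`, and the base template would wrap its analytics `<script>` tag in `{% if analytics_allowed %} ... {% endif %}` so nothing loads before consent is recorded.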

10. Define what success looks like and publish performance data

Decision

The service met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had identified the intended outcomes of this service and what success looks like in the short term and long term. Some of the KPIs, such as ‘unique users staying longer than 30 seconds on guidance pages’, could help measure the short-term goal of helping users access key information in one place (a sketch of how such a metric might be computed follows this list)
  • the team had obviously given some thought to the definition and use of KPIs, although they were bounded by only the four mandatory main areas
  • the team had thought about how to capture some of the metrics to inform business use and how users might use the service    
  • the team were aware of different tools, methods and platforms that could support the capture of performance and user information, including between services and have made some sensible recommendations 
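
To illustrate the dwell-time KPI mentioned above, the sketch below counts, from a hypothetical page-view log, the unique users who spent longer than 30 seconds on a guidance page. The event structure, URLs and field names are assumptions for illustration only; the report does not specify the analytics pipeline at this level of detail.

```python
from datetime import datetime

# Hypothetical page-view events: (user_id, url, enter_time, exit_time).
events = [
    ("u1", "/guidance/data-protection", "2021-10-18T09:00:00", "2021-10-18T09:00:45"),
    ("u2", "/guidance/clinical-safety", "2021-10-18T09:05:00", "2021-10-18T09:05:10"),
    ("u1", "/about", "2021-10-18T09:10:00", "2021-10-18T09:12:00"),
]

def engaged_guidance_users(events, threshold_seconds=30):
    """Count unique users with at least one guidance-page view
    lasting longer than the threshold (the KPI's 30 seconds)."""
    engaged = set()
    for user_id, url, entered, left in events:
        if not url.startswith("/guidance/"):
            continue  # the KPI only covers guidance pages
        dwell = (datetime.fromisoformat(left)
                 - datetime.fromisoformat(entered)).total_seconds()
        if dwell > threshold_seconds:
            engaged.add(user_id)
    return len(engaged)

print(engaged_guidance_users(events))  # -> 1 (only u1 qualifies)
```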

What the team needs to explore

Before their next assessment, the team needs to:

  • further refine the data points suggested so that a range of measures are developed beyond the current propositions; they should continue to develop the specific metrics as the service is built
  • clarify how they are planning to measure success for their longer term service goals. For example, consider how to measure developers accessing key information to make safer products.
  • think about how they can articulate the success of the service, especially in a “Cost Per Transaction” model, when not all interactions may have a clearly capturable end-point
  • think about how the evidence captured end-to-end across service journeys will be used, and who will have visibility of metrics
  • map and be able to articulate feedback routes for key performance indicators to relevant parties, who may not only be within the service team
  • think about how performance data and service desk contacts can be fed back into the service alongside feedback, surveys or emails

11. Choose the right tools and technology

Decision

The service met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • the team had made a thorough investigation of possible technology options as part of their alpha phase, using well defined criteria to explain their recommendation   
  • the team have recommended an appropriate cloud-based environment to host their solution(s), making use of native tooling and technology where appropriate (for example, Azure’s managed PostgreSQL)
  • the proposed solution uses a mixture of open source software and reuse of common components for hosting their service, proposing to reuse existing technology approaches that are common in the NHS, such as Wagtail 
  • their recommendations have considered the eventual supportability and cost of a solution    
  • the team are proposing to deploy using infrastructure as code (IaC) managed through Azure DevOps (see the sketch after this list)
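
As a concrete, heavily hedged illustration of the IaC approach: the report says only that deployment will be managed through Azure DevOps, so the sketch below uses Pulumi's Python SDK as a stand-in to show what declaring Azure resources as code looks like. The tool choice, resource names and region are all assumptions, not the team's actual stack.

```python
"""Minimal infrastructure-as-code sketch (illustrative only).

Uses Pulumi's Python SDK as a stand-in; the team's actual pipeline is
managed through Azure DevOps and may use different tooling entirely.
"""
import pulumi
from pulumi_azure_native import resources, storage

# A resource group to hold the service's Azure resources (name assumed).
rg = resources.ResourceGroup("maas-ai-reg-rg", location="uksouth")

# A storage account, e.g. for static assets (configuration assumed).
sa = storage.StorageAccount(
    "maasaistorage",
    resource_group_name=rg.name,
    sku=storage.SkuArgs(name="Standard_LRS"),
    kind="StorageV2",
)

# Exported outputs can feed later pipeline stages.
pulumi.export("resource_group", rg.name)
```

Whichever tool the team standardises on, the point is the same: environments are declared, versioned and reviewable rather than hand-built.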

What the team needs to explore

Before their next assessment, the team needs to:

  • the team must make sure that they are able to utilise and share the existing cloud environment as proposed in alpha. They must firmly agree access to the environment and ways of working with other product teams, or their approach to the beta build will not be successful and will require rethinking
  • consider further investigating the use of shared (claims-based) logon and Azure AD (AAD) integration for Wagtail editor roles within their CMS service, so that user accounts and account access have better visibility and security protection from cloud services (a configuration sketch follows this list)
  • continue to refine and embed the proposed monitoring and analytics for their service deployment
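
One hedged sketch of what AAD-backed logon for Wagtail editors could look like, using the social-auth-app-django package as one possible route. The package choice, login URL and credential placeholders below are illustrative assumptions; the team may well pick a different mechanism.

```python
# settings.py fragment -- one possible route to AAD logon for editors,
# via the social-auth-app-django package (an assumption, not the
# team's confirmed choice).
INSTALLED_APPS += ["social_django"]

AUTHENTICATION_BACKENDS = [
    # Azure AD OAuth2 backend shipped with social-core.
    "social_core.backends.azuread.AzureADOAuth2",
    # Keep the model backend as a break-glass fallback for local accounts.
    "django.contrib.auth.backends.ModelBackend",
]

# App registration credentials (placeholders; hold real values in a
# secret store, never in source control).
SOCIAL_AUTH_AZUREAD_OAUTH2_KEY = "<application-client-id>"
SOCIAL_AUTH_AZUREAD_OAUTH2_SECRET = "<client-secret>"

# Send unauthenticated editors into the AAD login flow (URL assumes the
# default social-auth URL patterns are mounted at the site root).
LOGIN_URL = "/login/azuread-oauth2/"
```

Centralising editor logon this way gives the visibility the panel asks for: account creation, disablement and access reviews happen in AAD rather than in the CMS.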

12. Make new source code open

Decision

The service met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have proposed to commit changes to open source (for example, Git) repositories for any beta development    
  • the team have suggested open principles in their development and are aware of how they can make appropriate information available whilst retaining control of sensitive code and secrets   

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure that at beta, the intent outlined at alpha is implemented, with appropriate open repositories available and relevant code published
  • ensure that any published or available repositories are adequately maintained and documented

13. Use and contribute to open standards, common components and patterns

Decision

The service met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • the team is using common components from the NHS design kit     
  • the team have communicated with other teams and groups, and suggested that they will provide feedback to others (for example, the Wagtail community)
  • the team intend to share components and service elements with other product teams    

What the team needs to explore

Before their next assessment, the team needs to:

  • ensure ways of working are agreed with the other relevant product teams    
  • continue to work in a shared way across product teams and consider including and feeding back to the wider community (for example, reference to design patterns or content findings) as they refine their service

14. Operate a reliable service

Decision

The service met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • the team have thought about using native backup, regions and some measures to support a highly available service that could scale   
  • the team have a strategy for development to production releases
  • the team is planning on using IaC and release pipelines that could support the release of new features with limited downtime and provide a disaster recovery (DR) approach
  • the team have proposed some measures and support ideas that would allow monitoring of the service from a non-functional perspective (a minimal health-endpoint sketch follows this list)
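
As a small illustration of non-functional monitoring on the proposed stack, here is a hedged sketch of a health endpoint that an uptime monitor could poll. The endpoint path and the single database check are illustrative assumptions; the report does not specify the monitoring design.

```python
# A minimal health-check view sketch (Django, matching the proposed
# Wagtail stack); endpoint name and checks are illustrative assumptions.
from django.db import connection
from django.http import JsonResponse
from django.urls import path

def healthcheck(request):
    """Report basic service health for an external uptime monitor."""
    checks = {}
    try:
        with connection.cursor() as cursor:
            cursor.execute("SELECT 1")  # verify the database responds
        checks["database"] = "ok"
    except Exception:
        checks["database"] = "error"

    healthy = all(v == "ok" for v in checks.values())
    return JsonResponse(
        {"status": "ok" if healthy else "degraded", "checks": checks},
        status=200 if healthy else 503,
    )

urlpatterns = [path("healthcheck/", healthcheck)]
```

An external monitor polling this endpoint gives the non-functional signal the team describe, and further checks (cache, queues, downstream services) can be added to the dictionary as the service grows.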

What the team needs to explore

Before their next assessment, the team needs to:  

  • refine and implement appropriate recovery, deployment and monitoring of service components as proposed in alpha, and start to think about documenting service recovery and support actions
  • confirm with other product teams that they can utilise shared resources, support and the platform, to provide certainty about the reliability and performance of the innovation service component of the service
  • confirm and agree the proposed release cadence and ways of working with the Innovation Service team, to ensure appropriate releases

2 comments

  1. Comment by Rob Lang posted on

    It's concerning that AI ethics isn't mentioned anywhere in this document.