From: DHSC
Assessment date: 3 March 2022
Stage: beta
Result: met
Service provider: NHS Digital
Previous assessment reports
No previous assessment reports have been published. All previous assessments have been against the old 18-point standard.
Service description
The Buying Catalogue is a digital marketplace which enables NHS clinical IT systems buyers to browse, discover, compare, and create orders for centrally assured clinical IT systems.
Currently, the service consists of two main components: a catalogue of solutions and an order form.
Users can search and filter solutions in the catalogue based on the procurement framework they’re available from, or on the Capabilities they map to. Capabilities are the way solutions demonstrate how they help meet business needs, and they help users find the solutions that best meet their requirements.
As well as supporting short-listing, the catalogue of solutions contains detailed information about each individual solution with the right level of detail for informed decision making. To offer a clinical IT system to the market through the Buying Catalogue, a supplier must have already demonstrated to NHS Digital’s GP IT Futures programme that the system has met NHS Digital’s requirements specification for each type of system. This demonstration is assured through a centrally operated assessment process, the outcome of which is clearly presented to users as part of each system’s ‘listings page’ on the Buying Catalogue. Note that the Buying Catalogue itself is not used to drive, manage, or enable this demonstration process.
The order form component allows users to start a transaction by creating an order summary for a procurement online. The tool has a triage section that provides information on the procurement route a user needs to take and how they will fund their order. It has been designed as a step-by-step process to enable users to complete an order efficiently.
Contextual support is integrated into the order form allowing users to get just-in-time information and complete the next steps. After completing an order form, users can review instructions on the next steps needed to complete the contractual documentation and formal process of a procurement off-catalogue.
Service users
The primary users of this service are:
- Clinical Commissioning Group (CCG) Buyers: CCG Buyers are responsible for most clinical IT system procurements on behalf of general practice and primary care. These users will use the full extent of buying functionality on the Buying Catalogue, including functionality that requires a log in to complete and place orders.
Secondary users for the service are:
- NHS Commissioning Support Unit (CSU) Buyers: CSU Buyers are specialists responsible for procurement on behalf of CCG organisations, as well as for the service management and delivery of the procured services. They are usually involved only in the procurement stage of the process, rather than pre-procurement stages such as browsing and comparison, and they only use the call-off order functionality of the service.
- Contributors: NHS professionals, such as GP Practice Managers, who are engaged as contributors by Buyers. Contributors have a responsibility for local IT systems and therefore a professional interest in understanding the range of systems available to procure through the Buying Catalogue. They provide their input and requirements to Buyers so that Buyers can procure the right IT solutions on their behalf. Note that these users will only use ‘open’ features of the Buying Catalogue, that is, features that do not require a login.
- Internal users: NHS Digital’s GP IT Futures programme users are responsible for administrative tasks, support and financial management of the Buying Catalogue service.
Report contents
- Understand users and their needs
- Solve a whole problem for users
- Provide a joined-up experience across all channels
- Make the service simple to use
- Make sure everyone can use the service
- Have a multidisciplinary team
- Use agile ways of working
- Iterate and improve frequently
- Create a secure service which protects users’ privacy
- Define what success looks like and publish performance data
- Choose the right tools and technology
- Make new source code open
- Use and contribute to open standards, common components and patterns
- Operate a reliable service
1. Understand users and their needs
Decision
The service met point 1 of the Standard.
What the team has done well
The panel was impressed that:
- the team has a strong understanding of all the end users, their different types, characteristics, current pain points, and needs
- the team has managed to source and recruit a strong user base to work with iteratively, despite the difficulty of finding users in the process of purchasing clinical information systems
- the team is using a good mix of research methods to iteratively gather user feedback, and to evaluate and prioritise service delivery
- the team acknowledges the range and volume of user needs to address and is working hard to put in place a clear scope for the service
What the team needs to explore
Before their next assessment, the team needs to:
- be clear upfront, and communicate to users and site visitors, exactly what this service can and cannot do, and how the Buying Catalogue works. For example:
- the review and preparation of answers requires collaboration with multiple contributors involved in the buying process, which happens outside of a system designed for a single user
- the actual financial transaction does not happen in the system
- where the catalogue of solutions comes from and how it is updated to ensure a fair and open market
- include in the research plan how they intend to evaluate the end-to-end service from users’ perspective, especially that of the primary users. For example, if the buying process requires a nominated third-party specialist, how does the service meet the actual buyer’s need?
- provide evidence in the next assessment that the current approach of alphabetical browsing is the optimal experience
- consider different research methods to effectively test and measure users’ mental model of browsing the catalogue for a better experience
- clearly identify the most common user goals and needs when browsing the catalogue and make that option prominent
- put in place a regular review and triangulation of feedback from the different sources, such as feedback forms and Hotjar, for comprehensive and iterative insight gathering and prioritisation. It is currently not clear how the different sources of insight are brought together systematically
- continue to identify and learn from similar existing services within GDS
2. Solve a whole problem for users
Decision
The service met point 2 of the Standard.
What the team has done well
The panel was impressed that:
- the team has identified and clearly signposted significant dependencies. They have put in place practical activities to share findings and keep up to date with stakeholders in other teams
- the team has used exploratory research to understand the pain points for users and identify their problems to solve
What the team needs to explore
Before their next assessment, the team needs to:
- continue exploration and elaboration of the boundaries of their service, where it cuts across other, related services
- show evidence of progress on a path to resolution, for these cross-boundary blockers and impediments
3. Provide a joined-up experience across all channels
Decision
The service met point 3 of the Standard.
What the team has done well
The panel was impressed that:
- the team has investigated and mapped the online and offline processes, and set them within the wider context of procurement
- the team has carried out workshops and co-creation sessions, and used the results to inform the service design
- the team is monitoring quantitative data via a range of tools, including Adobe Analytics and Hotjar. This was being used to make evidence-based changes to the service
What the team needs to explore
Before their next assessment, the team needs to:
- continue investigations, and expand mapping of the user experience across all channels
- validate the end-to-end procurement process, beyond the research and order phases, such as the planned digital tender and response journeys, with similar service teams such as Crown Commercial Service
- monitor users as they progress through different levels of support, including 1st, 2nd and 3rd line, and iterate the support model
4. Make the service simple to use
Decision
The service met point 4 of the Standard.
What the team has done well
The panel was impressed that:
- the team has used common design patterns, and iterated them
- the team’s user research findings showed improvements in the key usability measures of efficiency, effectiveness and satisfaction, from the previous solution
What the team needs to explore
Before their next assessment, the team needs to:
- look at how the service adheres to content guidelines, particularly with the use of capitalisation
- look again at the choice of design patterns for the more complex pages, for example the manage order, filter solutions and detailed product templates. Are they following established baseline design patterns? Have the designs been iterated based on user research? Where designs differ from established design patterns, have these findings been shared with the wider GOV.UK and NHS design and research communities, via GitHub, Slack or other means? At the next assessment, the team must be able to provide evidence for departures from common patterns
- consider carrying out peer reviews of the service with content and design peers across NHS digital services, to ensure alignment with established design patterns
- research the impact on users of using Confluence in this service and share findings with the wider programme. These findings can help identify improvements to make the service easy to use. The team should consider:
- how do users transition to and from the Buying Catalogue website and Confluence?
- are there any blockers?
- how is compatibility with the NHS design system reconciled?
5. Make sure everyone can use the service
Decision
The service met point 5 of the Standard.
What the team has done well
The panel was impressed that:
- the team has used accessibility personas developed by GOV.UK to help direct hypotheses about accessibility profiles
- the team has carried out both internal and external accessibility audits, including testing with assistive technologies
- the team has made sure there is an accessibility statement with up-to-date information, and backlog items already planned to address issues
What the team needs to explore
Before their next assessment, the team needs to:
- continue with regular internal accessibility audits
- show how outstanding and new accessibility points have been actioned as per plan
- continue to integrate accessibility testing into the design, development and test phases
- monitor, and if possible mitigate, Confluence’s accessibility issues as acknowledged in the accessibility statement
6. Have a multidisciplinary team
Decision
The service met point 6 of the Standard.
What the team has done well
The panel was impressed that:
- the team is multi-disciplinary and has considered recruitment for a potential analytics skills gap
- the team will remain consistent moving into public beta
- the team is a blend of contractors and in-house team members, with project documentation held on NHS servers
- the team’s Programme Head keeps close to the progress made and the work in flight
What the team needs to explore
Before their next assessment, the team needs to:
- plan dedicated time for an effective handover if the delivery partner is offboarded
- plan recruitment for a performance analyst
- focus on collaborating continually with the multiple dependency teams across the wider end-to-end service. The service team would benefit from dedicated time with dependency teams to share knowledge and findings for holistic service progress
- plan how the service team will be structured after public beta
7. Use agile ways of working
Decision
The service met point 7 of the Standard.
What the team has done well
The panel was impressed that:
- the team has maintained close links to dependencies with other teams
- the team iterates their ways of working with frequent opportunities to inspect and adapt
- the team uses robust means of prioritisation, including ‘3 amigos’ sessions
- the team has direct access to wider programme Senior Leadership Team for escalating issues or blockers
- the team is using agile ways of working to frequently iterate and deliver
What the team needs to explore
Before their next assessment, the team needs to:
- continue to inspect and adapt the team’s ways of working throughout public beta
- use show and tell sessions and dedicated collaboration time with dependency teams to collaborate and playback findings
8. Iterate and improve frequently
Decision
The service met point 8 of the Standard.
What the team has done well
The panel was impressed that:
- the service underwent multiple iterations driven by user research findings
- the team has established feedback loops via Hotjar and more to help drive future iterations
- the service team have identified areas for future improvement, such as the filter function design, and have planned future research and improvements in their roadmap
What the team needs to explore
Before their next assessment, the team needs to:
- establish hypotheses on what the service team expects to see during public beta and test against these hypotheses
- continue to use agile ways of working to promote frequent delivery and learning cadence
- continue to iterate based on closed feedback loops to better meet user needs
9. Create a secure service which protects users’ privacy
Decision
The service met point 9 of the Standard.
What the team has done well
The panel was impressed that:
- the team has dramatically simplified the architecture, relying on public cloud concepts and reducing the team’s areas of concern
- the team keeps captured user-inputted data to a minimum and ensures it does not contain sensitive information
- the team follows good practice, such as static code analysis in their build pipeline and secrets management
What the team needs to explore
Before their next assessment, the team needs to:
- reconsider whether authentication for Buyers and listing-maintainers needs to be custom for this service. Assess the use of existing services such as NHSmail for GP staff and NHS Digital’s AzureAD for internal staff
- address the two-factor authentication (2FA) penetration test finding. It is important to note that this would likely be addressed by moving to an existing authentication solution
10. Define what success looks like and publish performance data
Decision
The service met point 10 of the Standard.
What the team has done well
The panel was impressed that:
- the team intends to capture the four mandatory Key Performance Indicators (KPIs)
- the team identified additional KPIs to measure progress against desired outcomes such as helping improve the availability of information for users to support decision making
- the team has considered what success looks like for the delivery of a fully digitalised service. This was linked to the wider programme outcomes of helping achieve a healthy market and supporting market success
- the team is able to measure success within key parts of the service with analytics and log events
What the team needs to explore
Before their next assessment, the team needs to:
- establish visibility of KPIs for the service team and wider stakeholders from the programme. This transparency will allow broader team members to identify relevant trends and patterns that could impact other aspects of the end-to-end service
- consider documenting the specific, observable changes in user behaviour that result from this digitised service, for example changes in market prices due to the service’s transparency. This would be a useful case study to share in the open with teams across government
- explore metrics that provide feedback for users outside of the current programme user groups, for example metrics on the success of listings for sellers
11. Choose the right tools and technology
Decision
The service met point 11 of the Standard.
What the team has done well
The panel was impressed that:
- the team has built the service on open source technology
- the team has dramatically simplified the architecture, relying on public cloud concepts, reducing the team’s areas of concern, cost and risk of change, and simplifying transition from the supplier
What the team needs to explore
Before their next assessment, the team needs to:
- look to increase the sustainability of the service by calculating and tracking the carbon footprint
- monitor the current dependency on Confluence and explore alternatives that can meet the service standard
12. Make new source code open
Decision
The service met point 12 of the Standard.
What the team has done well
The panel was impressed that:
- the team is hosting code on an open repo: https://github.com/nhs-digital-gp-it-futures/GPITBuyingCatalogue
- the team has reduced complexity by consolidating its number of repos, and has maintained public archives
What the team needs to explore
Before their next assessment, the team needs to:
- move to NHS Digital’s GitHub account: https://github.com/nhsdigital
13. Use and contribute to open standards, common components and patterns
Decision
The service met point 13 of the Standard.
What the team has done well
The panel was impressed that:
- the team is making use of services such as GOV.UK Notify
- the team has used design system components to move to HTML and reduce the dependency on PDFs for complex content
What the team needs to explore
Before their next assessment, the team needs to:
- contribute to the design system community their alternative use of components and new patterns as covered in point 4
- monitor the current dependency on Confluence and explore alternatives that can meet the service standard
14. Operate a reliable service
Decision
The service met point 14 of the Standard.
What the team has done well
The panel was impressed that:
- the team has ensured the service is covered by extensive testing, which is integrated into build pipelines
- the team have a plan for, and have tested, service recovery including backup and cloud region failover
- the team captures web performance metrics, enabling them to make improvements to the user experience
What the team needs to explore
Before their next assessment, the team needs to:
- ensure the service can sustain outages of external services such as GOV.UK Notify
- look at releasing code when it is ready, rather than waiting for bundled releases