From: DHSC
Assessment date: 1 September 2022
Reassessment date: N/A (reassessment conducted by email)
Stage: beta
Result: met after reassessment
Service provider: DHSC
Service description
The service provides trust staff with the ability to digitally report building faults in a private finance initiative (PFI) hospital to the helpdesk, via desktop, tablet or mobile. The aim of the service is to make it easier for staff to report faults, which in turn incentivises the PFI company to rectify them. The service also provides enough information to allow helpdesk operators to enter those faults into the FM helpdesk system so that they can be rectified in line with the PFI contract.
Service users
This service is for:
- reporters - trust staff who want to report building faults
- operators and facility managers - the helpdesk operators and facility managers who enter the faults onto the FM helpdesk system
- administrators - the DHSC (and BSA) teams who configure the tool for trusts
Report contents
- Understand users and their needs
- Solve a whole problem for users
- Provide a joined-up experience across all channels
- Make the service simple to use
- Make sure everyone can use the service
- Have a multidisciplinary team
- Use agile ways of working
- Iterate and improve frequently
- Create a secure service which protects users’ privacy
- Define what success looks like and publish performance data
- Choose the right tools and technology
- Make new source code open
- Use and contribute to open standards, common components and patterns
- Operate a reliable service
1. Understand users and their needs
Decision
The service did not meet point 1 of the Standard.
What the team has done well
The panel was impressed that:
- the team has made sure that the reporting site is easy for users to access, requiring no log-in or app download
- the team has had over 200 faults reported by 9 trusts during private beta
- the team had regular contact with reporter and FM manager users during private beta, offering several ways to provide feedback, such as calls with the product owner, surveys and a feedback link in the product
What the team needs to explore
Before their reassessment, the team needs to:
- produce a plan, created by the user researcher with involvement from the team, to show how in public beta they will:
  - carry out regular rounds of usability testing with more users and user types, and use this evidence to iterate the content, design and user flow
  - recruit users with accessibility needs and fill gaps in their current sample
  - use a variety of appropriate user research methods to observe users using the service and gather actionable feedback on design, content and user flow
Before their next assessment, the team needs to:
- carry out multiple rounds of usability testing and iteration with FM operators. No usability testing was carried out with this user group during private beta
- carry out many more rounds of usability testing with the reporter user group. We recommend one round of usability testing and iteration per sprint in beta. Only one round with 10 users, and one iteration based on that evidence, was carried out in private beta
- carry out more structured usability testing and user research to identify flaws in the design and content. This research should be led by the user researcher or service designer, with the rest of the team observing whenever possible
- move beyond the informal, self-reported feedback the private beta process relied on, such as calls with the product manager and surveys
- consider content testing techniques, for example highlighter testing, contextual inquiry and observational studies in the workplace
- do more contextual and in-person observation of users and reduce the reliance on self-reports of behaviour
- be more focused with their sampling of users for research. The team needs to consider the characteristics of the trust and users who have been involved in private beta, and make sure to carry out research with any types of user or trust that have not been covered
- carry out more usability testing with users who have accessibility needs. If this is not possible from the current user base, the team can instead speak to people in similar roles who have access needs. We recommend contacting disability networks within hospital trusts, and looking beyond the team's private beta trust contacts
Reassessment
The team sent the user research plan for public beta to the panel for review, as recommended in the beta assessment report. The panel reviewed it and thought it was largely positive. Some follow-up questions, owing to various absences, were not resolved before the key assessor became unavailable.
As a result, the team has been given permission to progress with their public beta, and the decision on this point remains incomplete.
2. Solve a whole problem for users
Decision
The service met point 2 of the Standard.
What the team has done well
The panel was impressed that:
- the team has clearly articulated the benefits, namely that the service will increase the quality of facilities by allowing better reporting and monitoring of faults, and the activities in private beta have given the team confidence that those benefits can be realised
- the team has used private beta intelligently to explore the relationship between the public and private partnerships, and create a strategy to mitigate tensions and provide benefits to both parties
- the team’s approach has pivoted to a simple, email-based service, which provides practical integration with third-party systems, using existing people and processes
- the team has used private beta to test a simple, minimal product, which has given them confidence that the service can scale up successfully
- the team has a good understanding of how fault resolution is monitored and reported, and how this process could be used to improve hospital environments directly
- the team has communicated outwards and is exploring whether this work could help other government organisations that have similar facilities management contracts and are seeking a better way to report faults
What the team needs to explore
Before their next assessment, the team needs to:
- define the user needs, goals and pain points of the Facilities Management provider staff clearly, and represent them as a core user type
- continue to build relationships with FM providers, and seek out smaller companies to ensure the service can integrate with their processes as well as it does with those of the multinationals
- continue communicating with other organisations, so findings from this service might increase savings or quality across government
3. Provide a joined-up experience across all channels
Decision
The service met point 3 of the Standard.
What the team has done well
The panel was impressed that:
- the team has looked wider than just the reporting interface, mapping the crossover points between systems, people and processes, and how they come together across multiple organisations to find, log, and fix a fault
- the team has made observations in some local environments to understand how users would become aware of the service
- the team has clearly articulated simplified journeys for key user groups in the context of the digital and phone system workflows
- the team has placed focus on ensuring this is a discrete, modular service which can easily operate alongside the existing fault reporting phone lines and other systems PFIs use, whilst neither relying on nor being constrained by them
What the team needs to explore
Before their next assessment, the team needs to:
- provide clear evidence from usability testing of the live beta service (not the prototype), covering all parts of the service across all user types, both frontstage and backstage, and showing how the systems and processes map across the whole service
- use public beta to test the best approaches for entry points to the service. The team has identified that all environments are different, but has not yet explored whether there are common approaches, content or channels that could be offered to individual trusts as templates or best-practice recommendations
4. Make the service simple to use
Decision
The service did not meet point 4 of the Standard.
What the team has done well
The panel was impressed that:
- the team has pivoted since their alpha assessment to create a service that fits with existing Facilities Management processes and third-party systems. They have moved from an API-based, tech-driven approach to an email-based, human-oriented service that is simple, intuitive and comprehensible to all user types. The work done on the service design during private beta is a great exemplar of this point of the Standard
- the team has designed the service to work with multiple device types, giving users choice and flexibility to suit their circumstances
- the team has employed a simple reporting interface with no need for log-ins or accounts
- the team has designed the service to be maintainable across multiple organisations: they have created a simple maintenance interface and put thought into which parts are configurable, who will configure them, and how that process will scale
- the team has added photo uploads to the service, which provide valuable insight to back office staff
What the team needs to explore
Before their reassessment, the team needs to:
- arrange a peer review by the BSA interaction design profession, to identify instances where the existing design departs from the correct usage of established design patterns. The design assessors and observers on the panel identified multiple areas of divergence from patterns
- standardise the language used in the service where patterns exist. For example, the error summary component (https://design-system.service.gov.uk/components/error-summary/) suggests the heading ‘There is a problem’ rather than ‘Please resolve the following’ (a sketch follows this list)
- provide evidence of recommendations from both content and interaction design professions
- present a plan for addressing content and interaction design recommendations
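To illustrate the error summary recommendation above, here is a minimal sketch of collecting validation errors into the shape the GOV.UK Design System error summary component expects, with its standard ‘There is a problem’ heading. The field names and messages are hypothetical, not taken from the service.

```typescript
// A sketch only: field names and messages are hypothetical.
// titleText and errorList mirror the parameters of the GOV.UK Design
// System error summary component.
interface ErrorSummaryItem {
  text: string; // message shown to the user
  href: string; // anchor of the form field the message relates to
}

interface ErrorSummaryViewModel {
  titleText: string;
  errorList: ErrorSummaryItem[];
}

function buildErrorSummary(errors: ErrorSummaryItem[]): ErrorSummaryViewModel {
  return {
    titleText: 'There is a problem', // the pattern's standard heading
    errorList: errors,
  };
}

// Example: a missing fault description on the reporting form
const summary = buildErrorSummary([
  { text: 'Enter a description of the fault', href: '#fault-description' },
]);
console.log(summary.titleText); // 'There is a problem'
```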
Reassessment
Decision
The service met point 4 of the Standard.
What the team has done well
The panel was impressed that:
- the team has clearly collaborated with designers in the BSA, and benefited from the help available to improve their understanding of, and adherence to, design patterns
- the team has a clear plan to apply those design changes to the service
What the team needs to explore
Before their next assessment, the team needs to:
- show how content and design changes have been incorporated into the service and researched with users
- show how they have approached content design and interaction design as 2 different specialisations
- show how they have planned and implemented research-led content design and interaction design which adheres to standards
- engage with other members of the user-centred design community across government, and demonstrate what they have learned from and contributed to that community
- continue testing the usability and iterating the design of the service frequently across all user types, including Facilities Management and administrators
- continue questioning and testing their approach, to see where future changes in technology or supplier strategy might alter the solution
- baseline efficiency, effectiveness and satisfaction of the existing service with both end users and backstage users, to measure the usability of the new service against the old
5. Make sure everyone can use the service
Decision
The service met point 5 of the Standard.
What the team has done well
The panel was impressed that:
- the team has made sure that the service is available to those who lack digital skills or internet access, as it uses existing infrastructure that supports both phone and email
- the team has incorporated automated accessibility testing tools into their delivery (illustrated in the sketch below)
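The report does not name the team’s tooling; as one hedged illustration, the open-source pa11y library can run this kind of automated check in a delivery pipeline. The URL below is a placeholder, not the service’s real address.

```typescript
// Illustrative only: the URL and the team's actual tooling are assumptions.
// pa11y scans a page against WCAG 2 AA and returns a list of issues.
import pa11y from 'pa11y';

async function checkAccessibility(url: string): Promise<void> {
  const results = await pa11y(url, { standard: 'WCAG2AA' });
  if (results.issues.length > 0) {
    for (const issue of results.issues) {
      console.error(`${issue.code}: ${issue.message} (${issue.selector})`);
    }
    process.exitCode = 1; // fail the pipeline run so issues are fixed early
  }
}

checkAccessibility('https://fault-reporting.example/report'); // placeholder
```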
What the team needs to explore
Before their next assessment, the team needs to:
- show that they have considered the order of questions on the page, and researched it with users. Currently, the page that asks for a trust email address comes at the end of the service, so a user without a valid email will have spent time documenting issues that cannot then be reported through the service
- have the service assessed for accessibility issues that cannot be found by automated tests, for example by a third-party accessibility testing service
- use descriptive error messages that state the problem and the solution. For example, ‘Server error 413: Request too long’ does not tell the user what triggered the problem or what to do about it (a sketch follows this list)
- test the service with people who have accessibility needs; examples include users with large screens, keyboard-only access, dyslexia, or a temporary disability or accessibility need
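To illustrate the error message point above, here is a minimal sketch, assuming an Express-style Node service with a hypothetical upload limit and wording, of translating a raw 413 response into a message that states both the problem and the solution.

```typescript
// A sketch only: the service's real stack, limit and wording are assumptions.
import express, { Request, Response, NextFunction } from 'express';

const app = express();
const MAX_UPLOAD_MB = 10; // hypothetical limit, for illustration

// Error handler: turn a raw 'Server error 413: Request too long' into a
// message that names the problem and tells the user what to do next.
app.use((err: Error & { status?: number }, req: Request, res: Response, next: NextFunction) => {
  if (err.status === 413) {
    res
      .status(413)
      .send(
        `The photo you selected is too big. ` +
          `Choose a photo smaller than ${MAX_UPLOAD_MB}MB and try again.`
      );
    return;
  }
  next(err);
});

app.listen(3000);
```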
6. Have a multidisciplinary team
Decision
The service met point 6 of the Standard.
What the team has done well
The panel was impressed that:
- the team has appointed a Product Owner from Policy to help define project needs in line with overarching strategic objectives and steer the service through the entire agile lifecycle
- the team includes two Technical Leads in the team structure: one specifically for this project and another from the NHSBSA team, which is providing the technical infrastructure for this service and will own system management going forward
- the team has distinguished between the roles of User Researcher and Service Designer, with separate people taking on these responsibilities
- the team’s members felt valued and have enjoyed being part of the project
What the team needs to explore
Before their next assessment, the team needs to:
- ensure the contributions of all members of the MDT are given equal weight. In particular, it’s important that User Researchers are supported, encouraged and empowered to champion the voices of current and future service users
- include a dedicated Interaction Design or Content Design role within the team, who should iterate the service in line with design standards and patterns where possible (if research suggests other components would be preferable, these should be proposed via https://design-system.service.gov.uk/community/propose-a-component-or-pattern/)
7. Use agile ways of working
Decision
The service met point 7 of the Standard.
What the team has done well
The panel was impressed that:
- the team’s ways of working have been informed by agile ceremonies and UCD methodologies, which have been embedded into each sprint
- the team’s plan for transitioning into public beta adheres to, and is underpinned by, these principles
- the team uses weekly sprint reviews alongside retrospectives, in response to feedback from the previous assessment on the need to increase reflective practices
What the team needs to explore
Before their next assessment, the team needs to:
- consider varying the people hosting agile ceremonies, for example, not having the Product Manager run every stand-up, to promote and embed diversity of thought and practices
8. Iterate and improve frequently
Decision
The service met point 8 of the Standard.
What the team has done well
The panel was impressed that:
- the team has made significant efforts to shorten feedback loops and enable rapid assumption and idea testing
- the team has developed multiple versions of all three elements of the service (1.0, 1.1 and 1.2) in response to the research
- the team has put into place a backlog for the move into public beta
What the team needs to explore
Before their next assessment, the team needs to:
- develop new ways of testing and iterating the content and interaction design of the service
9. Create a secure service which protects users’ privacy
Decision
The service met point 9 of the Standard.
What the team has done well
The panel was impressed that:
- the team has identified the risks of having a service without authentication, and implemented levels of protection to prevent abuse
- the team has created a separation of security concerns within the system, utilising serverless patterns, so that only components accessible by authenticated users can access submitted data
- the team’s use of a well-tested common component, BSA’s file upload, provided mitigation for one of the riskiest parts of the end user service
- the team’s deployment pipeline includes dependency and code quality scans
What the team needs to explore
Before their next assessment, the team needs to:
- be aware of, and ready to respond to, sensitive information being uploaded, either by improving the mechanism for reporting and redacting incidents or by focusing on preventing them
- investigate bot detection further, such as honeypots, to prevent abuse of the service (see the sketch below)
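As a hedged sketch of the honeypot technique mentioned above (the route and decoy field name are hypothetical): an extra input is rendered in the form but hidden from human users with CSS; automated form-fillers tend to complete it, so a non-empty value flags the submission as likely bot traffic.

```typescript
// A sketch only: the route and decoy field name are hypothetical.
import express, { Request, Response } from 'express';

const app = express();
app.use(express.urlencoded({ extended: false }));

app.post('/report-a-fault', (req: Request, res: Response) => {
  // 'contact-fax' is a decoy input hidden off-screen in the form template;
  // real users never see or fill it.
  if (req.body['contact-fax']) {
    // Accept quietly but discard the data, so bots get no useful signal.
    res.status(200).send('OK');
    return;
  }
  // ...continue with normal fault-report handling
  res.redirect('/report-a-fault/confirmation');
});

app.listen(3000);
```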
10. Define what success looks like and publish performance data
Decision
The service met point 10 of the Standard.
What the team has done well
The panel was impressed that:
- the team has established a solid set of KPIs which overcome some of the difficulties of continuous evaluation caused by the long-term nature of PFI contracts
- the team’s chosen definitions of success cascade high level strategic objectives into the specific intentions of this service
- the team has assigned appropriate metrics to each KPI, and has planned how and where these will be shared
What the team needs to explore
Before their next assessment, the team needs to:
- consider including the workload impact on DHSC and BSA of managing and delivering this service in public beta as a metric of success. Some questions to ask: how many requests are coming in, and to which teams? How long do requests take to answer? Are there commonalities in requests that could enable streamlining?
11. Choose the right tools and technology
Decision
The service met point 11 of the Standard.
What the team has done well
The panel was impressed that:
- the team has made use of a mature technology stack from NHSBSA
- the team has avoided expensive integration with proprietary interfaces, changing the alpha plan to integrate with the facilities provider’s API, over which they would have had no future control
12. Make new source code open
Decision
The service met point 12 of the Standard.
What the team has done well
The panel was impressed that:
- the team is working in an open repository: https://gitlab.com/nhsbsa/dhsc/dhsc-hospital-fault-reporting
13. Use and contribute to open standards, common components and patterns
Decision
The service met point 13 of the Standard.
What the team has done well
The panel was impressed that:
- the team has been able to rapidly develop the service due to the use of the NHSBSA tech-stack and support from the BSA engineering team
- the team has reduced duplication of effort through the use of both internal shared components, such as the BSA file upload service, and external ones, such as GOV.UK Notify (see the sketch below)
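As an illustration of the external common component named above, here is a minimal sketch of sending a notification through GOV.UK Notify’s Node client. The template ID, personalisation keys and environment variable are placeholders; whether the team calls Notify in exactly this way is an assumption.

```typescript
// A sketch only: template ID, personalisation keys and env var are placeholders.
import { NotifyClient } from 'notifications-node-client';

const notifyClient = new NotifyClient(process.env.NOTIFY_API_KEY as string);

async function emailFaultToHelpdesk(helpdeskEmail: string, faultRef: string): Promise<void> {
  await notifyClient.sendEmail(
    'TEMPLATE-ID-PLACEHOLDER', // email template configured in Notify
    helpdeskEmail,
    {
      personalisation: { fault_reference: faultRef },
      reference: faultRef, // lets the team trace the message in Notify later
    }
  );
}
```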
What the team needs to explore
Before their next assessment, the team needs to:
- have their prototypes reviewed by the NHSBSA interaction design profession and the NHSBSA content design profession
- have evidence of recommendations from both professions
- have evidence of the changes on their backlog, to be addressed before their next assessment
14. Operate a reliable service
Decision
The service met point 14 of the Standard.
What the team has done well
The panel was impressed that:
- the team has retained the telephone service, and by designing their service to operate alongside, but not be conditional on, the telephone channel, there is no single point of failure: should the digital service stop working for any reason, people can use the telephone service, and vice versa
- the team has ensured that there is a maintenance support plan in place with NHSBSA
- the team has plans in place for potential support issues, showing a forward-thinking approach to future issues
What the team needs to explore
Before their next assessment, the team needs to:
- plan for support to scale as the service grows, specifically around supporting the facilities managers when additional sites are brought online