
From: DHSC
Assessment date: 09/02/2024
Reassessment date: 06/06/2024
Stage: Alpha reassessment
Result: Met at reassessment with conditions
Service provider: DHSC
Service description
The Skills Record provides a standardised and permanent way to record the training and skills of members of the adult social care workforce that can be shared with current or new employers.
Service users
This service is for…
- Care workers – people who are paid to support someone receiving care
- Employers – people who employ care workers and run care provision
- Training providers – organisations that create, deliver and assess care worker training
Report contents
- Understand users and their needs
- Solve a whole problem for users
- Provide a joined-up experience across all channels
- Make the service simple to use
- Make sure everyone can use the service
- Have a multidisciplinary team
- Use agile ways of working
- Iterate and improve frequently
- Create a secure service which protects users’ privacy
- Define what success looks like and publish performance data
- Choose the right tools and technology
- Make new source code open
- Use and contribute to open standards, common components and patterns
- Operate a reliable service
1. Understand users and their needs
Decision
The service did not meet point 1 of the Standard.
What the team has done well
The panel was impressed that:
- the team has done user research in person, in a care facility
- the team understands similar services within the sector, including clear reasons for developing a different service as duplicating or combining with the Digital Staff Passport wouldn’t have been appropriate
- the team ensures involvement from the whole team in the user research process
What the team needs to explore
Before their next assessment, the team needs to:
- articulate how this service addresses the wider, possibly strategic, problem of ‘improving the experience of working in adult social care’
- create a research plan that explores the administrator role and their needs, plus the usability of their interactions with the service in detail
- conduct in-depth user research with awarding bodies, to understand their needs in detail
- understand the impact that a service that makes the ‘employer.... the gateway to care worker users’ and supports ‘monitoring of sector skills’ may have on care workers
- explore how a care worker can share ‘untrusted’ or ‘unverified’ qualifications that aren’t currently in the Skills Record, alongside ‘verified’ skills that are
- explore with employers how ‘trusted’ an ‘unverified’ qualification is, if it is in the Skills Record, or outside of it as it is currently
- explore assisted digital support routes through user research, including scenarios where users of any type do not have internet access or a smartphone
- assess whether user needs uncovered during Alpha warranted a change in the design. It may be that they did and the team iterated as a result; however, this was not evidenced to the panel
Reassessment:
Decision
The service met point 1 of the Standard.
What the team has done well
The panel was impressed that:
- the team has a deeper knowledge of their primary users, particularly care workers and how the types of care they provide may affect their needs
- the team has a deeper knowledge of their users’ unhappy paths and were able to articulate them
- the team has explored and articulated how a care worker would own their own record, and researched how approving entries made by an employer would work
- the team increased their efforts to research with users with access needs and English as a Second Language
- the team has understood and can articulate the value of the ‘We Care’ brand for care workers, the primary users of their service
What the team needs to explore
Before their next assessment, the team needs to:
- explore and articulate care receivers as tertiary users of their service in more depth
- plan for user research where a care receiver, or someone acting on their behalf, is employing a care worker as a personal assistant, particularly the offline routes to providing qualifications, as these employers are most likely to be adversely affected by a poor offline service
- research the effectiveness of their support routes, including support guides. Research, design, and implement additional support routes if needed
- explore partnerships with community organisations that support users with digital access, to recruit users with low or no digital skills
2. Solve a whole problem for users
Decision
The service did not meet point 2 of the Standard.
What the team has done well
The panel was impressed that:
- the team understands some adjacent services and how this service may need to integrate with them
- the team has considered user flows and identified 3 main user groups
What the team needs to explore
Before their next assessment, the team needs to:
- identify any areas of prejudice and injustice that adversely impact users, such as suitable candidates not being able to provide a ‘Skills Record’ through this service at interview, and seek to mitigate this through the design of the service
- give an insight into how the testing of prototypes went and what was learned from it, including the subsequent iterations and the unhappy paths that must be considered to mitigate fraud, abuse and harm to users or the public. For example, explore what needs to be in place to stop a carer or an employer from using a skills record for employment should the carer cause harm to a patient in their care and be dismissed from the profession. The service must be considered as solving a whole problem while mitigating further ones; at this point, solving has only been explored within the interaction design of the application's happy path
- include plans to create and recruit the role of administrator or super user as part of service design, as the team mentioned these will be paid employees in a role that does not yet exist
- explore the entire user journey from start to end and how this service supports users in happy and unhappy paths, for example when a carer needs something added to or amended on their record. A plan for how research and design would improve this for the 3 main user groups would show awareness of it. For example, 56% of the workforce does not have social care qualifications recorded: is recording the problem, or is lack of training the problem? If it is lack of training, then the service is not addressing the problem
- map out how the assisted digital route would work for all 3 user groups. It is not appropriate to assume all users will have access to online applications; though this is a digital service, these users must be considered or there is a risk of excluding user groups. Research should include barriers to accessing the service, not just the perceived value of the service for care workers: are there barriers in the context of use, in the devices and internet access required, or in still needing access to the service and a printer for the assisted digital route? The team needs more participants at level 6 and lower on the digital inclusion scale
- validate their assumption that all user groups should be using the same service. It may be that a bespoke view of the same underlying data and tailored features are more appropriate given the user groups’ different tasks, goals and motivations
Reassessment:
Decision
The service met point 2 of the Standard.
What the team has done well
The panel was impressed that:
- the team worked hard to ensure that its primary users, care workers, were empowered by the service and able to manage their own skills record including adding their own training
- the team had built a relationship with the quality assurance team to validate the skills record and build trust for users by providing high-quality validated training
- the team articulated the wider ‘We Care’ ecosystem that the service is part of, including how multiple services would address the core problem of the experience of working in social care
What the team needs to explore
Before their next assessment, the team needs to:
- explore whether this service as part of the wider adult social care ecosystem addresses the core problem of improving the experience of working in care. This may be done from a more strategic perspective, by working with other service teams in the ‘We Care’ programme
- widen the scope of research with users of the service, to ensure that the problem has been solved for all users. For example, research has been heavily focused on users who work with the elderly, whereas adult social care also covers care for people with disabilities, neurodiversity, mental health issues and substance abuse problems. Assessors are concerned that research does not reflect those who care for these groups. The needs of these care workers may differ because of the context they work in, and if those roles, needs and unhappy paths are not explored, the service cannot protect those receiving this type of care from harm
3. Provide a joined-up experience across all channels
Decision
The service did not meet point 3 of the Standard.
What the team has done well
The panel was impressed that:
- the team has completed some initial workshops with care home staff to understand the user needs of those working in a care home
- the team has included some key partners, resources and activities for future scope in a business model canvas
- the team has mapped risk assumptions, though it would be beneficial to see how these will be measured against actual research findings
What the team needs to explore
Before their next assessment, the team needs to:
- explore a service blueprint to identify the entire process of the service, its delivery, and each stage performed by the different roles or props the service will touch. The team should consider how service design can support their investigation into how far and wide the service will go, for example the interactions and service steps, and how and when it can touch all service users and beyond. Further user journeys of those affected by the service would also support this; consider Health Care Professionals, all types of care staff, and agencies such as DWP and the Care Quality Commission. Go as far as possible to show these have been identified and how this will inform research, where appropriate, in future
- clarify and show what the mentioned assisted digital routes are. A walkthrough of them, including plans to research them, would help assessors understand how all users currently manage their work and how this service might improve their current systems, for example how an employer manages training and qualifications without an online service at present
Reassessment:
Decision
The service met point 3 of the Standard.
What the team has done well
The panel was impressed that:
- the team’s designers carefully considered how the service could be trusted, producing designs validated with a “mark of trust” from care providers, thereby reinforcing primary users’ belief in the service
- there were offline routes for providing and receiving a Skills Record
What the team needs to explore
Before their next assessment, the team needs to:
- widen their scope further again with tertiary users, including with those in receipt of care and Health Care Professionals
- for example, assessors asked in the previous assessment that Health Care Professionals be identified for their role in a user's journey. The team explained in this assessment that an HCP would not necessarily look at a skills record for GDPR reasons, but HCPs do have a part in validating some training of care workers where appropriate. The purpose of this service is to make it easier for a care worker to maintain and share their Skills Record, so all channels through which users interact to do this must be understood. This will ensure a care worker's experience is joined up across all channels, including in support of happy paths, and will rule out potential issues and unhappy paths for all users
- plan for user research to understand additional signposting routes and their effectiveness (see also point 5)
4. Make the service simple to use
Decision
The service did not meet point 4 of the Standard.
What the team has done well
The panel was impressed that:
- the team considered the use of GDS toolkits, patterns and components that supports familiarity for users in government and healthcare services for the online journey
- the team plans to test with users with access needs moving into beta
- the team's research and design sub-teams understand each other and work well together
What the team needs to explore
Before their next assessment, the team needs to:
- explore how the ‘We Care’ service and the tools within it, such as the Skills Record are named and communicated to all users of the service, thus ensuring it can be found, accessed and used
- consider whether one application that “fits all” is the way to solve every user's problem, or whether a different application for each user group may better meet every user need. Research should validate this either way
- explore how using the carer record as proof of qualifications for any current or potential employer empowers the carer or instils trust in their own skills record. Allowing only the employer to control the carer's record could leave carers open to unfair treatment should the record be used against them, or qualifications be withheld or not updated. It is unclear at this point what benefit this offers the carer, and whether the carer's application could be used by them to manage their progression and career
- explore how the service could further benefit employers and training providers. It was not clear what benefit or value the application would have to them. For example, what design could be implemented to benefit training providers and employers? Explore the systems they use to manage staff and courses already and take learnings from how they could support this service. Could the service support managing their records, link up, have signposting and incident reporting?
Reassessment:
Decision
The service met point 4 of the Standard.
What the team has done well
The panel was impressed that:
- the team has made considered use of design patterns and components from government and NHS toolkits to make the service recognisable to its users
- the team have clearly shown how their learning in research has informed parts of the design to ensure the users of the service can understand it
What the team needs to explore
Before their next assessment, the team needs to:
- appoint a content designer. This will support the team in making sure the service is simple to use. An issue with content not being clearly understood by users was identified and discussed by the team as part of research and design so this must be in place before the team enters private beta
5. Make sure everyone can use the service
Decision
The service did not meet point 5 of the Standard.
What the team has done well
The panel was impressed that:
- the team has carried out research with some user groups and understands the main pain points associated with the online service, though a plan outlining how this will develop further through public beta and beyond would be beneficial to see
- the team has identified other services under the ASC wider data set for future scope and collaboration
- the team has considered how they might engage with users with both regional and national comms support
What the team needs to explore
Before their next assessment, the team needs to:
- ensure that they are highlighting all they have been working on and achieved since discovery. Questioning uncovered a lot of work that the panel would otherwise have been unaware of, such as artefacts supporting exploration of service and user needs, for example user journeys and research plans with wider user groups
- provide more examples of how the research carried out has influenced the design. At times it felt that the research with ‘diverse’ (or ‘non-typical’) users was yet to be considered and would not influence future design. This is a concern because it implies the service is expected to work for all and will move into private beta when it is unlikely to solve a whole problem for all users at this stage
- show plans to solve the problem as a service and not just within interaction design. As in point 3, service design needs to be fully explored; the team should understand and articulate the difference between user experience design, interaction design and service design
- explore fully how users will encounter this service. This has not been acknowledged through the service name, which must be revisited with research in line with the ‘Naming your service’ guidance in the GDS service manual. It is not appropriate to prioritise the name over the users: the service name is the main identifier for a user of how it will support their need. More consideration is needed of how the service will be signposted by GOV.UK, the NHS and other organisations within the healthcare space
- content design needs to be brought into the user-centred design space in accordance with GDS and DfE guidance to meet the Service Standard
- as in points 2 and 4, the team needs to investigate how an assisted digital route would work for all 3 user groups. To assume all users will have access to online applications deliberately excludes those lacking digital skills or internet access from using the service. Assisted digital routes should be in place to cover such gaps, support those user groups and meet the Service Standard
Reassessment:
Decision
The service met point 5 of the Standard.
What the team has done well
The panel was impressed that:
- the team increased the volume of user research with participants with access needs and low digital skills, given the difficulties with participant recruitment
- the team has ensured that every research participant has contributed to research of the naming of the service. This is great practice, and we encourage this to continue beyond private beta and to have a content design contribution as soon as possible
- the team have not mandated the digital route so those users without access to a digital application can still use a paper route
- the team have conducted exploratory design desk research with other government departments, to see how users may be supported in this context
What the team needs to explore
Before their next assessment, the team needs to:
- explore signposting in much more detail. Though the team has clearly established a way to signpost from one government ‘working in adult social care’ website, the reach must go further in order to enter private beta. Care providers and employment agencies (NHS, government and private) should also be considered before the move into private beta
- be confident that support routes will work in context before putting them in place. The team has shown plans for supporting users of the digital service through a back-office function and user guides; however, neither has been designed and researched for this specific context. Data from attitudinal research such as concept testing, and analytics on existing similar routes, is a potential way this could be done. These support routes must be in place to support the team's private beta pilot and to address any issues with them before the service goes public
6. Have a multidisciplinary team
Decision
The service met point 6 of the Standard.
What the team has done well
The panel was impressed that:
- aside from content design, the team had the right mix of professions to deliver their alpha
What the team needs to explore
Before their next assessment, the team needs to:
- onboard a content designer into the team to iterate the content, ensuring it is appropriate for a wide range of users including those with English as a second language
7. Use agile ways of working
Decision
The service met point 7 of the Standard.
What the team has done well
The panel was impressed that:
- the team created a team charter to underscore their commitment to agile ways of working and respecting each other’s work
- the team plans and manages their work through sprint planning and stand ups
- the team does open show and tells at the end of every sprint to share their work with stakeholders and others
What the team needs to explore
Before their next assessment, the team needs to:
- stress-test their escalation procedures. Although they have decision making escalation routes these have never been used. They should revisit their escalation routes to ensure they can resolve any issues in a timely and definitive way
- ensure their processes and procedures for working as a single team made up of people from three different organisations are robust and equitable
8. Iterate and improve frequently
Decision
The service did not meet point 8 of the Standard.
What the team has done well
The panel was impressed that:
- the team has produced the Figma prototype design based on wireframes built from feedback and insights gathered from initial workshops
What the team needs to explore
Before their next assessment, the team needs to:
- evidence how they have clearly iterated and continually improved their design in response to user needs. A design log or design history recording each iteration and how design influenced a change could be considered to support this along with highlighting the same in their visual user flows. It was difficult to see how research influenced the user flows, or the demo shown to the panel
- conduct more usability testing to inform the application design iteration. The sample size and user groups do not form a large enough basis for identifying how this service will provide value for its users. If the team follow recommendations in point 1, they will be in a better position to conduct research and show frequent iteration. This will give a clear value in what the service is trying to achieve
- consider a feedback loop of research and design iterations in private beta and have a plan on how that will work
Reassessment:
Decision
The service met point 8 of the Standard.
What the team has done well
The panel was impressed that:
- the team have a substantial number of iterative designs informed by user research and recorded in a design log
- the close working relationship between researchers and designers was apparent, which no doubt supported the service team in meeting this standard
- the designer clearly knew the user needs and articulated that perfectly while discussing their iterative process
What the team needs to explore
Before their next assessment, the team needs to:
- showcase iterations based solely on research with users who may need further support. There is an impressive amount of design iteration and research with groups such as users with English as a second language and users with access needs, and the team were able to answer the panel's questions, though this could have been addressed up front
9. Create a secure service which protects users’ privacy
Decision
The service met point 9 of the Standard.
What the team has done well
The panel was impressed that:
- the team has enhanced data integrity and user trust by implementing the W3C Verifiable Credentials standard
- the team has robustly secured the service perimeter with Cloudflare's WAF and DDoS protection
- the team has implemented secure, scalable authentication using OAuth2 client credentials with signed JWTs
- the team has effectively reduced potential attack surfaces, using rotating AWS access keys and multi-stage Docker images
- the team ensured reliable identity verification by adopting GOV.UK One Login as an external IdP
- the team has enhanced network security through architectural separation into public and private networks with a NAT gateway
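The client credentials flow with signed JWTs mentioned above can be illustrated with a minimal, dependency-free sketch. This is not the team's implementation: the client ID, token URL and the use of HS256 here are illustrative assumptions; a production service would normally sign the client assertion with an asymmetric key (for example RS256), so the authorisation server never holds the client's signing secret.

```python
import base64
import hashlib
import hmac
import json
import time

def b64url(data: bytes) -> str:
    # Base64url encoding without padding, as required by the JWS spec
    return base64.urlsafe_b64encode(data).rstrip(b"=").decode()

def make_client_assertion(client_id: str, token_url: str, secret: bytes) -> str:
    # Build the JWT a client presents to the token endpoint in the
    # OAuth2 client credentials grant. HS256 keeps this sketch
    # dependency-free; real deployments would typically use RS256.
    header = {"alg": "HS256", "typ": "JWT"}
    now = int(time.time())
    claims = {
        "iss": client_id,   # the client asserts its own identity
        "sub": client_id,
        "aud": token_url,   # audience: the token endpoint being called
        "iat": now,
        "exp": now + 300,   # short-lived assertion (5 minutes)
    }
    signing_input = (
        b64url(json.dumps(header, separators=(",", ":")).encode())
        + "."
        + b64url(json.dumps(claims, separators=(",", ":")).encode())
    )
    signature = hmac.new(secret, signing_input.encode(), hashlib.sha256).digest()
    return signing_input + "." + b64url(signature)

# Hypothetical client and endpoint, for illustration only
assertion = make_client_assertion(
    "skills-record-client", "https://auth.example.gov.uk/token", b"demo-secret"
)
```

The short `exp` window and the `aud` claim pinned to the token endpoint are what limit replay of a stolen assertion.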
What the team needs to explore
Before their next assessment, the team needs to:
- conduct penetration testing to rigorously evaluate the system's defences against potential vulnerabilities
- develop strategies to mitigate lost-notification scenarios, ensuring reliable message delivery
- implement a verifiable onboarding process for employer admins and training assessors to strengthen access control and verification processes
- enhance data privacy measures during transfer and at rest, exploring technologies beyond HTTPS and JWTs for added security layers
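One common way to address the lost-notification recommendation is an "outbox" pattern: notifications are recorded durably alongside the business change, then delivered by a worker that retries idempotently. The sketch below is an assumption about how this could work, not the team's design; the `Outbox` class and the `ConnectionError` retry trigger are illustrative.

```python
import time
from dataclasses import dataclass, field

@dataclass
class Outbox:
    # In production the pending queue would be a database table written
    # in the same transaction as the business change; a background
    # worker then drains it, so a crash never silently drops a message.
    pending: dict = field(default_factory=dict)
    delivered: set = field(default_factory=set)

    def enqueue(self, msg_id: str, payload: str) -> None:
        self.pending[msg_id] = payload

    def drain(self, send, max_attempts: int = 3) -> None:
        # Retry each message; `send` must be idempotent on msg_id so a
        # retry after a timeout cannot cause a duplicate notification.
        for msg_id, payload in list(self.pending.items()):
            for attempt in range(max_attempts):
                try:
                    send(msg_id, payload)
                except ConnectionError:
                    time.sleep(0)  # placeholder for exponential backoff
                    continue
                self.delivered.add(msg_id)
                del self.pending[msg_id]
                break

# Simulate a transient failure: the first attempt raises, the retry succeeds
attempts = {"n": 0}
def flaky_send(msg_id: str, payload: str) -> None:
    attempts["n"] += 1
    if attempts["n"] == 1:
        raise ConnectionError("transient network error")

outbox = Outbox()
outbox.enqueue("msg-1", "Your skills record was updated")
outbox.drain(flaky_send)
```

Messages that exhaust their attempts stay in `pending`, so nothing is lost silently and a later drain (or an alert) can pick them up.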
10. Define what success looks like and publish performance data
Decision
The service met point 10 of the Standard.
What the team has done well
The panel was impressed that:
- the team has done a detailed exploration of success measures, aside from the mandatory ones, and has proposed methodologies for their measurement
What the team needs to explore
Before their next assessment, the team needs to:
- validate their estimate of the type and level of digital support required through user research. Using DfE's ‘how many people’ tool is a good first step, but the team has not mentioned a way to consolidate or back up this estimate with user research and actual data from their service
- provide a comprehensive assessment of operational resilience, define Recovery Point Objective (RPO) and Recovery Time Objective (RTO) for the project and measure the service against it
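Defining an RPO and RTO becomes actionable once the backup schedule is checked against the targets. The figures below are hypothetical, purely to illustrate the kind of calculation the panel is asking the team to make and then measure the service against.

```python
# Hypothetical targets and schedule, for illustration only
rpo_target_minutes = 60        # at most 1 hour of data loss is acceptable
rto_target_minutes = 240       # service must be restored within 4 hours

backup_interval_minutes = 240  # snapshots taken every 4 hours
restore_drill_minutes = 90     # measured duration of the last restore drill

# Worst-case data loss is the full interval between backups
worst_case_data_loss = backup_interval_minutes
meets_rpo = worst_case_data_loss <= rpo_target_minutes   # fails: back up more often
meets_rto = restore_drill_minutes <= rto_target_minutes  # passes: drill within target
```

The useful discipline is the last line of each pair: the RTO claim is only credible if it comes from a timed restore drill, not an estimate.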
11. Choose the right tools and technology
Decision
The service met point 11 of the Standard.
What the team has done well
The panel was impressed that:
- the team has demonstrated an effective blend of reliability and scalability through adopting AWS technologies like ECS and RDS
- the team has utilised Doppler for secrets management, streamlining secure configuration across environments
- the team has illustrated a commitment to secure and standardised user authentication through integration with GOV.UK One Login and proactive consideration of the UK Digital Identity and Attributes Trust Framework
What the team needs to explore
Before their next assessment, the team needs to:
- enhance email communications by exploring GOV.UK Notify further for customised email branding, enriching user interaction
- expedite the decision on a cloud provider to minimise potential rework and streamline development efforts
- evaluate the impact of integrating multiple third-party solutions on system complexity and maintenance overhead
- consider exploring more cloud-native solutions that could offer similar benefits with potentially lower operational complexity
- investigate additional tools and technologies that might provide enhanced performance or security benefits without significantly increasing the learning curve for new team members
12. Make new source code open
Decision
The service met point 12 of the Standard.
What the team has done well
The panel was impressed that:
- the team recognises the importance of open-source code for transparency and community collaboration
What the team needs to explore
Before their next assessment, the team needs to:
- make a detailed plan for managing community contributions, ensuring the security of the open-source code, and establishing a clear governance model for community-driven development
- provide all the associated design and support documents so that the code can be reused
- develop a clear, actionable plan for open sourcing the code, including timelines and guidelines for contributions
- establish processes to ensure that open-sourced code remains maintainable, secure, and up to date
13. Use and contribute to open standards, common components and patterns
Decision
The service met point 13 of the Standard.
What the team has done well
The panel was impressed that:
- the team's engagement with existing open standards, including exploring the possibility of reusing the Digital Staff Passport, showcases a proactive approach to leveraging and contributing to shared digital solutions within the government ecosystem
What the team needs to explore
Before their next assessment, the team needs to:
- contribute actively back to the open-source community and open standards to foster a collaborative ecosystem
- expand the use of common components and design patterns to further align with industry best practices and enhance service interoperability
14. Operate a reliable service
Decision
The service met point 14 of the Standard.
What the team has done well
The panel was impressed that:
- the team's implementation of performance testing and planning for benchmarking services with realistic data volumes are strong steps toward ensuring service reliability
- the team’s architectural design, including the use of AWS services and Cloudflare for resilience and scalability, underpins a reliable infrastructure
What the team needs to explore
Before their next assessment, the team needs to:
- develop and prove a comprehensive disaster recovery plan that includes detailed strategies for data backup, system redundancy, and quick recovery in case of service disruption
- enhance real-time monitoring and alerting mechanisms to promptly identify and address performance issues or security incidents
- understand the urgency of access for users, and design and test appropriate notifications for planned and unplanned downtime