https://digitalhealth.blog.gov.uk/2019/04/15/nice-comment-collection-private-beta-assessment/

NICE Comment Collection Private Beta Assessment

Categories: Assurance, Beta, Service assessments

[Image: "Service Assessment" in DHSC approved brand colours]

The report from the private beta assessment for NICE’s comment collection service.

 

Assessment by: Department of Health and Social Care

Assessment date: 3 April 2019

Stage: Private beta

Result: Met

Service provider: NICE

Service description

‘Comment collection’ provides a way for NICE stakeholders (people whom the guidance affects) to comment directly on the documents produced when developing guidance. It also provides a way for teams within NICE to request these comments, set up the consultations, and collate the comments for response in a way that avoids repetitive manual handling.

Service users

Primary user groups are NICE guidance producers, coordinators and external commenters (organisations and the public).

1. Understand user needs

Decision

The team met point 1 of the Standard.

What the team has done well

The panel was impressed that:

  • The team understand the needs of internal and external users of the service.
  • They are capturing insights from users in a variety of ways, including usability testing, survey responses, Hotjar recordings and calls from the enquiry handling team.

What the team needs to explore

Before their next assessment, the team needs to:

  • Carry out usability testing more regularly and split this into rounds. For example, test with a group of participants, analyse and iterate (Round 1). Then test the iterated journey with a group of different participants (Round 2).
  • Conduct observations and test usability with internal users who receive and respond to comments (NICE staff) and those who modify the guidance based on comments (Advisory Committee Members).
  • Establish a robust way of capturing and responding to internal feedback, making sure issues are logged, analysed and prioritised in the UX and design work.
  • Make sure that public and external users are not deprioritised against internal users, particularly those who may find it hard to influence NICE guidance because of a lack of easy access or digital skills.
  • Consider paying incentives and/or using a recruitment agency to recruit participants who meet the criteria and aren’t already familiar with NICE guidance. This saves the user researcher time on recruitment, encourages participation and helps make sure the right people take part.

2. Do ongoing user research

Decision

The team met point 2 of the Standard.

What the team has done well

The panel was impressed that:

  • The user researcher acknowledged the need to do more focused usability testing going forward and they are looking at better approaches to analysis.
  • The whole team have been involved in user research, observing and taking notes.
  • The team plan to test more things they are aware need fixing, such as navigating from the overview page to the main content.
  • There is another accessibility audit coming up.

What the team needs to explore

Before their next assessment, the team needs to:

  • Conduct more accessibility testing with representative users who have hearing, motor and cognitive impairments such as dyslexia.
  • Work out how to capture survey feedback from external users earlier in the journey. The survey has only a small number of responses and is only available once the user has completed a comment, which could mean it is only capturing responses from those who have been successful.
  • In usability testing, provide each user group with relevant tasks and scenarios. For example, for a first-time user it would be good to explore how they arrive at the correct place, rather than starting all users from the overview page. What would they type in, and where would this take them? Different scenarios will help test this.
  • Do more contextual user research with internal users and observe them using the service.

3. Have a multidisciplinary team

Decision

The team met point 3 of the Standard.

What the team has done well

The panel was impressed that:

  • There is one service manager who leads multiple teams working on different parts of a service and/or code base.
  • The team were able to respond to ad hoc requests with mock-ups and quick testing, or by working closely with users, demonstrating to senior people that users were happy with the service.
  • The team are co-located and pair up across disciplines, for example developers with user researchers. It’s good that developers observe user research sessions.
  • There appeared to be a representative mix of genders in the team.

What the team needs to explore

Before their next assessment, the team needs to:

  • Review whether it’s a good idea for content to be covered by a user researcher. Are there other content designers in NICE who could support?

4. Use agile methods

Decision

The team met point 4 of the Standard.

What the team has done well

The panel was impressed that:

  • The team work well together pairing up to approach problems across User Research, UX, Design, Testing and Development.
  • User research revealed findings that the team didn’t expect, which challenged their initial assumptions. One example was the use of icons: the team thought these would help users, but during usability testing participants didn’t understand what they meant. The team have responded by replacing the icons in subsequent iterations.
  • The team have regular show and tells, sharing their work with the rest of the team.

What the team needs to explore

Before their next assessment, the team needs to:

  • Consider ways of capturing issues from different sources in a centralised place. This will provide a holistic view of the service and help to prioritise what should be worked on when. For example, the user researcher has kept a thorough log from usability testing; when issues arise from survey responses, the enquiry handling calls and so on, the team could combine these into one place to see the impact and frequency of issues across multiple sources.

5. Iterate and improve frequently

Decision

The team met point 5 of the Standard.

What the team has done well

The panel was impressed that:

  • The service manager was able to explain the lifecycle from user research through to design and development.
  • The team are internal staff and work closely together in the same space.
  • The prototype has undergone various iterations throughout beta, responding to needs from user research.

What the team needs to explore

Before their next assessment, the team needs to:

  • Make sure user research is happening in some capacity during every sprint.
  • Once something has been tested, analyse, iterate and test again. Try to avoid iterating during testing; this is what the next round is for.
  • Do mobile testing on different devices to understand the look and feel, page layout and usability. From the script, it looks like mobile was tested via a computer screen, which means the usability on a mobile device (for example, selecting text and leaving comments) may not have been accurately represented. If the participant has an iPhone or iPad, they can record their own screen, or the screen can be mirrored to a computer using Lookback, Reflector or QuickTime Player while they interact with the mobile device itself.

6. Evaluate tools and systems

Decision

The team met point 6 of the Standard.

What the team has done well

The panel was impressed that:

  • There was a detailed justification for each technology choice.
  • Monitoring was in place.
  • Sensible and mature technology has been chosen.

What the team needs to explore

Before their next assessment, the team needs to:

  • Explore how they plan to upgrade components when new releases become available.

7. Understand security and privacy issues

Decision

The team met point 7 of the Standard.

What the team has done well

The panel was impressed that:

  • There was a strong emphasis on privacy and user security.

What the team needs to explore

Before their next assessment, the team needs to:

  • Consider whether personal information needs to be in the (internal) XLSX download.
  • Look at whether automated tools could be used to reduce spam and remove sensitive personal details (a sketch of one approach follows below).
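
To illustrate the second point, here is a minimal sketch of what automated redaction could look like. It is not the team’s implementation: the patterns and the `redactComment` function are assumptions for illustration only, and a production version would need more careful patterns, testing and human review.

```typescript
// Minimal sketch only: redact common patterns of personal data from a
// free-text comment before it reaches the internal XLSX download.
// The patterns below (email, rough UK phone number) are illustrative
// assumptions, not a production-ready or exhaustive list.
const REDACTION_PATTERNS: Array<[RegExp, string]> = [
  // Email addresses
  [/[\w.+-]+@[\w-]+\.[\w.-]+/g, "[email removed]"],
  // Rough UK phone numbers: +44 or 0 prefix, then grouped digits
  [/(?:\+44\s?|0)\d{2,4}[\s-]?\d{3,4}[\s-]?\d{3,4}/g, "[phone removed]"],
];

export function redactComment(text: string): string {
  // Apply each pattern in turn to the comment text
  return REDACTION_PATTERNS.reduce(
    (redacted, [pattern, replacement]) => redacted.replace(pattern, replacement),
    text,
  );
}

// Example:
// redactComment("Call me on 020 7946 0123 or jane@example.com")
//   -> "Call me on [phone removed] or [email removed]"
```

Spam reduction is a separate problem, and would more likely rely on an existing filtering service or simple rate limiting than on patterns like these.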

8. Make all new source code open

Decision

The team met point 8 of the Standard.

What the team has done well

The panel was impressed that:

  • All the code was available on GitHub.
  • A suitable license had been chosen.
  • Security of API keys was well managed.

What the team needs to explore

Before their next assessment, the team needs to:

  • Understand how they can open up other internal components that they used.

9. Use open standards and common platforms

Decision

The team met point 9 of the Standard.

What the team has done well

The panel was impressed that:

  • The site is in HTML5 and works across all modern browsers.

What the team needs to explore

Before their next assessment, the team needs to:

  • Consider the reliance on closed formats like DOCX and XLSX. Even though these are for internal use only, they still restrict the technology choices available to the team.
  • Ensure that all PDFs comply with PDF/A, or republish them as HTML5 as set out in GDS guidance.

10. Test the end-to-end service

Decision

The team met point 10 of the Standard.

What the team has done well

The panel was impressed that:

  • There is a good range of pre-production environments.
  • The service was well tested across multiple devices, including with automated testing.

11. Make a plan for being offline

Decision

The team met point 11 of the Standard.

What the team has done well

The panel was impressed that:

  • The team had a sensible backup strategy.
  • There are good business processes for dealing with unexpected extensions.
  • There is an external-facing helpdesk able to raise issues directly in JIRA.

What the team needs to explore

Before their next assessment, the team needs to:

  • Test their backups and perform a recovery exercise.
  • Consider whether teams other than ops need to get alerts.

12. Make sure users succeed first time

Decision

The team met point 12 of the Standard.

What the team has done well

The panel was impressed that:

  • In usability testing, the team found that there was little difference between first-time users and experienced users in their ability to complete tasks.
  • There is assisted digital support in place via the enquiry handling team who are trained and experienced.
  • The team recognise the need for more complex features to be tested, such as the need for larger tables and different types of consultation.

What the team needs to explore

Before their next assessment, the team needs to:

  • Ensure the HTML editor is accessibility tested.
  • Test on different screen sizes and devices themselves (see point 5).
  • Test guidance teams uploading different types of document.
  • Test all edge cases in the journey, for example when the user submits a form with an error, uploads something in the wrong format, or tries to log in with the wrong account, to make sure the experience works in all scenarios (see the test sketch below).
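
As a sketch of what automated coverage of those edge cases could look like, the tests below use Node’s built-in test runner against a hypothetical comment-submission API. The routes, payload shapes and expected status codes are assumptions for illustration, not the service’s actual interface.

```typescript
// Edge-case tests against a hypothetical comment service. The /comments
// and /uploads routes, payloads and status codes are illustrative
// assumptions; the real service's API may differ. Requires Node 18+.
import { test } from "node:test";
import assert from "node:assert/strict";

const BASE = process.env.SERVICE_URL ?? "http://localhost:3000";

test("rejects an empty comment with a validation error, not a 500", async () => {
  const res = await fetch(`${BASE}/comments`, {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ documentId: "doc-1", text: "" }),
  });
  assert.equal(res.status, 400);
});

test("rejects an upload in an unsupported format", async () => {
  const form = new FormData();
  form.append("file", new Blob(["..."], { type: "image/bmp" }), "notes.bmp");
  const res = await fetch(`${BASE}/uploads`, { method: "POST", body: form });
  assert.equal(res.status, 415); // Unsupported Media Type
});
```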

13. Make the user experience consistent with GOV.UK

Decision

The team met point 13 of the Standard.

What the team has done well

The panel was impressed that:

  • NICE has a design system that is based on existing GDS components.
  • There is a team who own the design system, but the service teams are responsible for designing and testing new components.

14. Encourage everyone to use the digital service

Decision

The team met point 14 of the Standard.

What the team has done well

The panel was impressed that:

  • The team are working with internal users in private beta to make sure the service meets their needs and they can use it.
  • Sponsors from internal users’ teams come to showcases and report back.

What the team needs to explore

Before their next assessment, the team needs to:

  • Explore how external users (the public) with low digital access and ability can use the service.

15. Collect performance data

Decision

The team met point 15 of the Standard.

What the team has done well

The panel was impressed that:

  • The team has goals set up for the full user journey and can track drop-off at different stages.

What the team needs to explore

Before their next assessment, the team needs to:

  • Track user journeys across all parts of the NICE domain (the team have this on their roadmap), so that a move to another NICE site is not treated as an ‘exit’ (see the sketch after this list).
  • Track how people use the service over multiple sessions.
  • Make sure there is someone in the team who can access and report on analytics regularly. This can be the responsibility of someone already in the team.
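
On the first point, if the team’s analytics tool is Google Analytics, cross-domain measurement is usually handled with the linker feature. The sketch below shows the gtag.js form of this; the tracking ID and domain list are placeholders, and it assumes Google Analytics is in use at all.

```typescript
// Sketch of cross-domain measurement with gtag.js, so that moving between
// NICE sites is stitched into one journey rather than counted as an exit.
// The tracking ID and domain list are placeholders. In a real page, gtag
// is provided by the Google tag snippet; declared here for completeness.
declare function gtag(...args: unknown[]): void;

gtag("config", "UA-XXXXXXX-1", {
  linker: {
    // Decorate outbound links to these hostnames so the session carries over
    domains: ["nice.org.uk", "comments.nice.org.uk"],
  },
});
```

With the linker configured, links between the listed hostnames carry the session across, so the journey is reported as one visit rather than an exit and a new entry.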

16. Identify performance indicators

Decision

The team met point 16 of the Standard.

What the team has done well

The panel was impressed that:

  • The team have used creative methods, such as a time-and-motion study, to benchmark current performance.
  • There is a sponsorship group who have agreed KPIs.

What the team needs to explore

Before their next assessment, the team needs to:

  • Look at whether a cost/benefit analysis on potential new features will help them prioritise the roadmap.

17. Report performance data on the Performance Platform

Decision

The team met point 17 of the Standard.

18. Test with the minister

Decision

The team met point 18 of the Standard.

What the team has done well

The panel was impressed that:

  • The team will be demoing the service to their board on multiple occasions.

Conditions before the next assessment

  • Do further rounds of usability testing with external users and iterate the service.
  • Test service with internal and external users who have access needs.
  • Test on different devices and screen sizes.

 
