The Comment Collections service gives users who engage with NICE the means to comment and offer opinions on NICE content.
Service users broadly fall into 4 categories, which can be further sub-divided: NICE guidance producers, members of the public, health and social care professionals and academics, and representatives from industry.
The service extends and makes more consistent the experience of NICE users commenting on guidance, and simplifies the process by which NICE guidance teams collect, aggregate and respond to the comments users raise.
Currently 4 ways exist for users to respond to consultations. Some teams also offer more than one way to respond within the same consultation, making it difficult to track responses, aggregate them and respond to them.
Department / Agency:
National Institute for Health and Care Excellence (NICE)
Date of Assessment:
11 October 2017
Assessment Stage:
Alpha
Result of Assessment:
Pass
Lead Assessor:
J.Morley
Service Manager:
M.Hope
Digital Leader:
I.O'Neill
Assessment Report
Outcome of service assessment
After considering all the evidence submitted by the team during the Alpha assessment, the panel are happy to conclude that the NICE comment collection tool is ready to pass into private Beta on the proviso that the following conditions are met:
- The team produce a plan for publishing the finalised MVP code (once fully developed) openly within the first 6 weeks of Beta
- The team award the contract for the accessibility audit
- The team contact GDS to receive guidance about acceptable KPIs once performance and cost analysis is completed by NICE analysts.
User Needs and Usability
The team have clearly put a considerable amount of time and effort into developing an understanding of their users. This began in Discovery, during which face-to-face interviews and experience-mapping exercises were conducted with internal users and 5 external organisations to develop a set of 6 personas: 3 internal to NICE and 3 external, ensuring that the needs of the full range of service users identified in workflow-mapping exercises have been considered throughout Alpha. These personas have also been used to identify the key user needs, including:
- The ability to consolidate comments
- Choice of a range of response formats
- Clear publication of consultation dates
- The ability to share comments internally
- The ability to edit comments before submitting
All of these user needs have been explored during Alpha, where the focus of user research has shifted to guerrilla testing, primarily conducted with internal stakeholders either in person or remotely using Zoom.
These user research sessions have been interactive, with users set a series of 4 tasks to complete on the prototype site. This gave the team a chance to observe whether users could complete the tasks without guidance, and subsequently to make changes removing any identified blockers, for example changing the icons when it was highlighted that these were causing confusion.
The team were able to clearly articulate a number of changes made following this user research, including: adding a comment review function, creating a central hub page and navigation panel, and adding the ability to tag and filter comments.
The overall response to these changes has been positive; however, the team is aware that significant further improvements will be required during private Beta. For example, a number of external users identified a need to share comments within their organisation for moderation purposes before submitting the final response. This is currently supported by allowing users to export comments as an Excel file, but the team are looking to introduce a more seamless solution within Beta.
The panel were reassured that the majority of user needs were being heard and acted upon, with only one functionality request not yet met: the ability to share comments more broadly (e.g. outside an individual organisation) and to ‘like’ comments made by others. The team will explore this during Beta to see whether it needs to be included in the MVP or is a secondary need that can be met as part of continuous improvement work.
Team
During Alpha the team consisted of:
- Service Delivery Manager
- Business Analyst
- UX Designer
- Information Architect
- UX Developer
- Developer
- Tester
Alpha has very much been a team effort, with work divided using a triaging system that identifies the most appropriate person to solve a problem in keeping with the collective understanding of the desired outcomes, rather than dividing work along strict disciplinary lines.
By taking this approach the Service Delivery Manager has been able to ensure that no one person has taken on too much responsibility, by giving specific functions additional capacity from within the team. This is most evident in the user research work, which has been led by the Business Analyst but has involved every member of the team at some point.
This collective approach does not, however, mean the team lacks clear governance. Although the whole team is involved in stakeholder meetings to promote a collective understanding of the direction of travel, the delivery manager has the final say when necessary and acts as the conduit to the sponsors' board, where any strategically significant decisions are made - such as the decision to hire a dedicated user researcher for Beta.
Continuous Improvement
The team work in an agile way to ensure that there is enough flexibility to respond to changing project needs.
The project is set up to run in two-week sprints, with the goals and subsequent outcomes documented on the project blog for stakeholder communication purposes. Daily stand-ups are held with the developers to discuss what has been through testing, what is on the kanban board, what needs to be added that day, and which functions require further user research. There is also an overarching theme for each week to ensure there is a deliverable for the weekly stakeholder meetings and fortnightly showcases.
To ensure that the team can respond to ad hoc changes to the plan, the team is co-located and also uses a range of remote tools, such as Miggle, Jira and Slack, to keep the lines of communication open at all times.
This way of working will continue into private Beta, for which a release plan chunked into two-week sprints has been developed. With the help of the patient involvement programme, user testing will happen within each sprint, initially with internal users and then with a targeted group of external users. The testing will cover the prototype itself from a UX and UI perspective, and will also test concepts using sketches and wireframes.
Throughout Alpha these mixed methods have allowed the team to take a fast-turnaround approach to prototyping, already producing 200 different iterations of the prototype in response to the outcomes of user research. The panel are confident that the team have the capability and capacity to continue this throughout Beta, especially as the software is custom built, which provides a greater degree of autonomy and flexibility.
Technology
The panel were pleased to observe how much thought had been given to the technology stack to ensure a consistent user experience across NICE services, maintain up-time, ensure user needs are met and minimise disruption should the service fail.
The team are currently using two test environments: Test and Alpha. Alpha is the primary test environment and comprises a web server and a RavenDB database server hosted on AWS using an EC2 instance. The only difference between this set-up and the intended live environment is scale, with the intention being to increase the number of servers for live.
The prototype used in Alpha is not mobile friendly, as user research showed that those responding to consultations primarily use desktop, so mobile browsers were deprioritised. However, all NICE sites support all browsers and are mobile/tablet friendly, and the plan is to design the Beta site mobile-first.
To further ensure the team are prepared for Beta and go-live, the service is already managed as if it were live. StatusCake is used to monitor the status of the app 24/7, and a support team is available should anything break. Should the service go down completely, there is a cascade plan in place to alert the necessary stakeholders, and an online status page will be displayed on the site. The ops support team will help resolve the underlying technical issues.
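As an illustration of how this kind of external monitoring typically hooks in, the sketch below shows a minimal health-check endpoint that a monitor such as StatusCake could poll. The route, checks and framework wiring are assumptions for illustration, not the team's actual implementation.

    using System;
    using Microsoft.AspNetCore.Mvc;

    [ApiController]
    [Route("status")]
    public class StatusController : ControllerBase
    {
        // Polled by the external monitor; anything other than a 200 response
        // (or a timeout) would trigger the alerting cascade.
        [HttpGet]
        public IActionResult Get()
        {
            // A fuller check would also verify the database connection and
            // any downstream NICE APIs before reporting healthy.
            return Ok(new { status = "healthy", checkedAt = DateTime.UtcNow });
        }
    }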
Data Security
The panel were reassured that the team had carefully considered potential threats to the service and taken appropriate actions to mitigate them. For example, users will log in with their NICE account, a single sign-on solution offered across all NICE services with clear eligibility criteria. In addition, the comments tool will use roles and associated permissions to further restrict access.
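A minimal sketch of how roles and permissions might gate access in a C# controller, assuming the single sign-on middleware has already authenticated the NICE account; the role names and routes are illustrative, not NICE's actual scheme.

    using Microsoft.AspNetCore.Authorization;
    using Microsoft.AspNetCore.Mvc;

    [ApiController]
    [Route("consultations/{id}/comments")]
    public class CommentsController : ControllerBase
    {
        // Any signed-in NICE account holder with the commenter role may post.
        [HttpPost]
        [Authorize(Roles = "Commenter")]
        public IActionResult Add(int id, [FromBody] string text) => Ok();

        // Viewing the aggregated responses is limited to guidance-team staff.
        [HttpGet]
        [Authorize(Roles = "GuidanceTeam")]
        public IActionResult List(int id) => Ok();
    }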
The panel did raise the potential for malicious users to add an excessive number of comments to consultations. The team felt this threat was limited, as NICE accounts are closely moderated and the service will not be widely accessible to the public. However, it was recognised that limited authentication takes place when NICE accounts are created, so the team agreed that the potential for CAPTCHA to protect against a spam attack could be investigated during Beta. The panel highly recommend that the team follow through on this action.
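To make the suggestion concrete, the sketch below verifies a CAPTCHA token server-side against Google reCAPTCHA's siteverify endpoint, one of several services the team could evaluate during Beta; the key handling and wiring are illustrative assumptions.

    using System.Collections.Generic;
    using System.Net.Http;
    using System.Text.Json;
    using System.Threading.Tasks;

    public static class CaptchaCheck
    {
        static readonly HttpClient Http = new();

        // Returns true if the token submitted with the comment form passes
        // verification; a failed check would block the submission.
        public static async Task<bool> VerifyAsync(string secret, string token)
        {
            var form = new FormUrlEncodedContent(new Dictionary<string, string>
            {
                ["secret"] = secret,   // server-side reCAPTCHA key
                ["response"] = token,  // token posted from the comment form
            });
            var resp = await Http.PostAsync(
                "https://www.google.com/recaptcha/api/siteverify", form);
            using var doc = JsonDocument.Parse(
                await resp.Content.ReadAsStringAsync());
            return doc.RootElement.GetProperty("success").GetBoolean();
        }
    }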
Code
The service makes use of an existing publications tool, an in-house developed document converter that converts Word to HTML. This tool was carried over from another service and can be reused elsewhere. The additional comments functionality has now been brought in-house, with NICE owning the IP, having initially been inspired by the open-source software Hypothesis. In both instances the backend is written in C#, with a React frontend consuming the API, hosted on AWS, ensuring a high degree of portability and limited vendor lock-in.
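The in-house converter itself is not public, but a minimal sketch of a Word-to-HTML conversion in C#, here using the Open-Xml-PowerTools library as a stand-in, shows the general shape of what such a publications tool does; the file names and settings are illustrative.

    using System.IO;
    using System.Xml.Linq;
    using DocumentFormat.OpenXml.Packaging;
    using OpenXmlPowerTools;

    class DocConverter
    {
        static void Main()
        {
            // Open the source Word document and convert its body to HTML.
            using var doc = WordprocessingDocument.Open("guidance.docx", true);
            var settings = new HtmlConverterSettings { PageTitle = "Consultation draft" };
            XElement html = HtmlConverter.ConvertToHtml(doc, settings);
            File.WriteAllText("guidance.html", html.ToString());
        }
    }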
Although inspired by open-source software, the code is not currently published openly, but the team expressed that they would be happy to make it so. The panel recommends that the team take steps during Beta to better understand how the code could be used by other government departments or bodies, and how best to go about this.
Open Standards and Common Platforms
The panel were impressed by the extensive amount of research the team had conducted to ensure existing solutions were used where possible in the application.
The majority of the service builds on existing NICE systems, pulling in data via APIs. The only new functionality added during Alpha was the ability for users to comment on consultations. The team conducted an in-depth evaluation of the tools used by a number of government organisations to deliver this functionality, to assess whether an existing solution could be reused. These included:
- The Department of Health's Citizen Space
- Scotland's participatory platform
- Leapfrog's Make It Stick
However, none of the evaluated existing government solutions met all of NICE's requirements, such as specific security requirements. In contrast, the open-source peer-review software Hypothesis was found to be mature and a close match in terms of features.
Initially the team installed Hypothesis as a browser extension without creating a custom installation. However, it was ultimately found that Hypothesis used a deprecated version of Angular for its frontend, which the team felt left the software too exposed, so they took the decision to build their own version of the Hypothesis solution, better aligned with their security requirements.
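Hypothesis-style tools anchor each annotation to the text it refers to, typically with W3C Web Annotation selectors. A minimal sketch of how a comment might carry such an anchor in a re-implementation follows; the names are illustrative, not the team's schema.

    // A TextQuoteSelector locates the annotated passage by its exact text
    // plus surrounding context, so the anchor survives small markup changes.
    public record TextQuoteSelector(string Exact, string Prefix, string Suffix);

    public record Comment(
        string ConsultationId,    // the consultation document being annotated
        string Author,
        string Body,
        TextQuoteSelector Anchor);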
Building the solution in-house gives the team full flexibility to make changes to the software to meet user needs and prevents vendor lock-in.
Vendor lock-in is further prevented by the fact that the application is modularised and every element could be replaced by a different solution if required. In addition, there are ongoing discussions across NICE about making the data storage solution more portable.
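A sketch of the kind of seam this modularisation implies: with comment storage behind an interface, the RavenDB-backed implementation could be swapped for another store without touching the rest of the application. These are illustrative names, not the team's actual code.

    using System.Collections.Generic;
    using System.Threading.Tasks;

    public sealed record StoredComment(string ConsultationId, string Author, string Body);

    public interface ICommentStore
    {
        Task SaveAsync(StoredComment comment);
        Task<IReadOnlyList<StoredComment>> ForConsultationAsync(string consultationId);
    }

    // One class implements this against RavenDB today; a differently backed
    // implementation could replace it behind the same interface if required.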
Accessibility
As the prototype developed during Alpha is an extension and improvement of an existing service, designed to make the existing process more streamlined, a number of assisted digital channels are already in place and will not be replaced during Beta: users can respond offline or call the NICE enquiry handling team to be assisted through the process. In addition, patient groups are encouraged to comment via the charities who support them through the response process.
Over the past 5 years the telephony option has dealt with only 8 occurrences where respondents needed digital support. 5 of these related to IT issues and 3 to digital literacy issues, which were resolved by the telephone agent making the comments on behalf of the user.
During this period of monitoring, approximately 20 consultations were released for comment per month, indicating that the need for assisted digital support is relatively low. The panel were, however, pleased to hear that the team recognise the need could be higher than currently recorded, and that the process changes resulting from the new comments functionality could increase it. The team are taking steps to re-assess this need during Beta:
- The NICE market research team will be actively recruiting research participants with assisted digital needs
- A tender has gone out for a company to provide a complete accessibility audit
- The team will be engaging with a couple of consultations being run in November that will have respondents who are supported by facilitators.
The panel feel assured that this programme of work will ensure all assisted digital needs are identified and met.
Design
The team has had both a UX designer and a front-end developer in post throughout Alpha and plans to retain these key roles throughout Beta.
Whilst the comments tool is not built directly on GOV.UK design patterns, it is based on NICE styles, which are built on Bootstrap and follow GOV.UK design principles. In addition, there is currently a NICE-wide project to develop a NICE design system that will enhance the current Bootstrap implementation. Initially this will be static, but the eventual plan is for the design system to act as a library that generates pull requests, reducing the need for manual updates to every service whenever an individual design pattern changes.
The panel are confident that this work will ensure a consistently high-quality user experience across all NICE services, including the comments collection tool, and would recommend connecting with the team working on the GDS design library to share learning.
Performance Monitoring
The service has not been registered with the GDS Performance Platform, as a number of internal NICE dashboards are already used for performance monitoring purposes. Additionally, the service will not drive an increase in uptake, as the number of people and agencies asked to respond to consultations is set. The team was, however, able to describe in great detail the work currently underway to set appropriate KPIs for the service.
A cost analyst has been employed by NICE and has spent the past month analysing the existing consultation response service, looking at the time taken to complete specific tasks in order to provide baselines against which the new service can be measured. During Alpha this work was finalised for tasks completed by internal users, with initial estimates showing that the new tool reduces the time taken by NICE analysts to collate comments by half a day per consultation; it will now be completed for tasks carried out by external users.
In addition, the NICE performance analyst has undertaken an initial review of the existing service using Google Analytics and Hotjar, and is now devising a plan to track dwell times and individual journey paths for the new service during Beta. This plan will be used to devise the KPIs against which the service will be measured.
The panel and the team also discussed more qualitative performance indicators: for example, gauging improvements in the quality and usefulness of comments received through the tool by monitoring how the number of comments that result in changes to the guidance under consultation increases over time. Both parties agree that this is out of scope for the MVP, but the panel recommend that further thought is given to it for the purposes of continuous improvement.
Summary
Overall the panel were very impressed by the quality and quantity of work undertaken by the NICE team during Alpha, and felt that the majority of the digital by default criteria had been met. The panel also felt that the team demonstrated an exemplary ability to see the tool as part of a bigger whole in terms of overall service improvement, and were already thinking beyond MVP development, showing a commendable commitment to long-term continuous improvement.