
https://digitalhealth.blog.gov.uk/2016/06/21/hfea-clinic-portal/

HFEA clinic portal - service assessment

Categories: Service assessments

Thanks to HFEA for answering questions against the standard honestly and clearly, and to colleagues from Ministry of Justice and NHS Blood & Transplant for joining DH on the assessment panel.

The clinic portal allows clinics to submit, obtain and manage clinic information and allows the HFEA to give clinics performance data. Clinics will access alerts, guidance and news via the portal. Inspection reports and other compliance activities will be published here.

HFEA are redesigning the clinic portal to combine existing and enhanced functionality and make it easier to use.

Department / Agency:
Human Fertilisation and Embryology Authority (HFEA)

Date of Assessment:
12 May 2016

Assessment Stage:
Beta

Result of Assessment:
Pass

Lead Assessor:
L.Scott

Service Manager:
C.Hall

Digital Leader:
A. Bye


Assessment Report

Outcome of service assessment

The panel decided that, on balance, the service was ready to pass into a public beta phase. There is still a significant amount of work to do, and adjustments to make, to ensure that the service adheres to the standard for live operation. The team have made a lot of progress since the alpha assessment and are motivated to keep up this level of improvement throughout the next phase of the service redesign.

Recommendations

User needs and assisted digital

  1. The whole team needs to be involved in ongoing user research, including the development team at the supplier. Take the opportunity to go and observe users in context using the service in public beta.
  2. Find ways to make the navigational paths more efficient for your everyday power users.
  3. Collect feedback on how personalisation (saving favourite documents, putting together your own dashboard, etc.) can support your users.
  4. Ensure the icons and labelling in the to-do list are understood by users, and gather evidence to demonstrate this.
  5. Data and numbers are needed to justify design decisions - ensure you use this kind of evidence to back up user research observations.
  6. Ensure you have a way of collecting feedback (a banner could be an option) from users who view the beta service.
  7. Don’t forget to make use of the personas and update them if necessary.

The team

  1. Establish a plan for continued development and a managed service once the current delivery partner leaves.
  2. Ensure you have funding for, and access to, specialist roles in future - for example, user research. Whilst it’s great that the team is learning some of the principles and practices, expert help will be needed when using research to make service design decisions.

Technology, security and standards

  1. Keep the current deployment automation in place.
  2. Increase test coverage - 50% is acceptable at the current stage, but it will not be for a live service.
  3. Introduce explicit regression testing and smoke-testing for releases (a minimal smoke-test sketch follows this list).
  4. Keep in mind the danger of over-engineering for requirements which do not need to be accounted for yet.
  5. Produce a document covering the risks considered, the likelihood and impact of each threat, and the mitigation in place.
  6. Make code open now, and code in the open from now on.
  7. Get analytics data on browsers and devices and design accordingly.
  8. Produce an explicit plan for disaster recovery.
  9. Consider a fallback offsite backup facility. Regularly test both local and offsite backups.
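
On the smoke-testing recommendation above, a release smoke test can be as simple as checking that a handful of critical pages respond after each deployment. The sketch below is illustrative only - the base URL and paths are placeholders, not the real portal routes.

```python
# Minimal release smoke test: check that critical pages respond after a deploy.
# BASE_URL and CRITICAL_PATHS are placeholders, not the real portal routes.
import sys
import requests

BASE_URL = "https://beta.clinic-portal.example"  # placeholder
CRITICAL_PATHS = ["/", "/login", "/to-do", "/inspection-reports"]  # placeholders

def smoke_test() -> bool:
    """Return True only if every critical page returns HTTP 200."""
    ok = True
    for path in CRITICAL_PATHS:
        try:
            response = requests.get(BASE_URL + path, timeout=10)
            status = "OK" if response.status_code == 200 else f"FAIL (HTTP {response.status_code})"
            ok = ok and response.status_code == 200
        except requests.RequestException as exc:
            status = f"FAIL ({exc})"
            ok = False
        print(f"{path}: {status}")
    return ok

if __name__ == "__main__":
    sys.exit(0 if smoke_test() else 1)
```

Running something like this (or an equivalent in the team's own toolchain) as the last step of the deployment pipeline gives a quick pass/fail signal before a release is declared done.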

Design and content

  1. Test and measure whether users understand the meaning of certain words and acronyms (red, green status...).
  2. Plan for how to improve the interaction design as you gather more evidence during beta when you’ll get a higher volume of users.
  3. Review the order of the navigational elements against evidenced user needs.
  4. Obtain data and evidence on what browsers and devices your users are using, and design accordingly. Analytics from the existing service may help here - a short summary sketch follows this list.
  5. Run a heuristic analysis of the interface design elements with focus on usability, interactive elements and design language consistency against the GDS design patterns.
  6. Resolve JavaScript issues prior to public beta launch. Currently the service requires JavaScript for some critical tasks - e.g. viewing what’s required on the to-do list.
  7. Check that capitalisation consistently follows sentence case across the site, and avoid using full caps for anything apart from acronyms.
  8. Ensure that the responsive design actually works on mobile devices: e.g. the burger menu doesn’t work in Chrome on a smaller screen without a refresh.
  9. Consider testing a more appropriate way of visualising percentages and other data on the dashboard.
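
On the browsers-and-devices recommendation above, once an export of browser data is available, summarising it does not need specialist tooling. The sketch below assumes a CSV export with "browser" and "sessions" columns - the column names and file name are assumptions, not the real analytics schema.

```python
# Summarise browser share from an analytics CSV export.
# Assumes columns named "browser" and "sessions"; adjust to the real export.
import csv
from collections import Counter

def browser_share(csv_path: str) -> None:
    """Print each browser's share of sessions, largest first."""
    sessions_by_browser = Counter()
    with open(csv_path, newline="") as f:
        for row in csv.DictReader(f):
            sessions_by_browser[row["browser"]] += int(row["sessions"])
    total = sum(sessions_by_browser.values())
    for browser, sessions in sessions_by_browser.most_common():
        print(f"{browser}: {sessions / total:.1%} ({sessions} sessions)")

if __name__ == "__main__":
    browser_share("analytics_export.csv")  # placeholder file name
```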

Analysis and benchmarking

  1. Work out a plan for measuring the service against the 4 mandated KPIs (where these are relevant), and communicate it. An illustrative completion-rate sketch follows this list.
  2. Plan to collect, analyse and act on any other meaningful metrics that will show whether the service is making things better.
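
The four mandated KPIs are cost per transaction, user satisfaction, completion rate and digital take-up. As an illustration only, completion rate could be estimated from the portal's own event logs once "started" and "completed" events are recorded - the event names and file format below are assumptions, not the service's real schema.

```python
# Illustrative completion-rate calculation from a JSON-lines event log.
# Event names and the log format are assumed, not the portal's real schema.
import json

def completion_rate(log_path: str) -> float:
    """Completed submissions as a proportion of submissions started."""
    started = completed = 0
    with open(log_path) as f:
        for line in f:
            event = json.loads(line)
            if event.get("event") == "submission_started":
                started += 1
            elif event.get("event") == "submission_completed":
                completed += 1
    return completed / started if started else 0.0

if __name__ == "__main__":
    print(f"Completion rate: {completion_rate('portal_events.jsonl'):.1%}")
```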

Testing with the Minister

  1. Identify the Minister with portfolio for this area and make plans to demonstrate the service.
