
HFEA clinic portal - live service assessment

Categories: Service assessments

The clinic portal allows clinics to submit, obtain and manage clinic information and allows the HFEA to give clinics performance data. Clinics will access alerts, guidance and news via the portal. Inspection reports and other compliance activities will be published here.

HFEA are redesigning the clinic portal to combine existing and enhanced functionality and make it easier to use.

Department / Agency:
Human Fertilisation and Embryology Authority (HFEA)

Date of Assessment:
21 November 2016

Assessment Stage:

Result of Assessment:

Lead Assessor:


Service Manager:

Digital Leader:

Assessment report

The HFEA clinic portal service is seeking permission to be branded a live, Digital by Default service (on an exempted URL and not on the domain).

Outcome of service assessment

After consideration the assessment panel has concluded the HFEA clinic portal service has shown sufficient evidence of meeting the standard, and should go live as a service. This means the service can now remove its beta branding.

We’d like to congratulate HFEA on this outcome. The service team has worked hard to achieve the standard and worked well with their delivery partner Reading Room to introduce an agile, service design approach to this product and to HFEA. The panel was also impressed by the effort and improvement the HFEA team put in to reach the live standard during beta development.

HFEA should discuss with DH Digital the plans for extending this service to handle patient identifiable data to agree how the development of this next phase should be assured.   


The service was assessed against all 18 points of the Digital by Default Service Standard. We asked questions based on the prompts and evidence for assessors supplied by GDS; this document sets out the questions and the evidence sought for the alpha, beta and live phases, and we asked questions from the live section.

The panel concluded that the service currently meets the requirements of the standard for a live service. We viewed this proportionately - this is a clinician-facing service with a closed set of users and a fairly low volume of transactions.

User needs and assisted digital

Assessor comments:

We saw much improvement here since the beta assessment. The team make use of personas, have a regular cadence of user research and development, and have made design changes based on evidence of user need, eg the simplified search functionality. They use lab testing, user feedback, focus groups and a stakeholder panel to glean user needs and test improvements.

They have evidence and understanding of lower priority unmet needs and have a backlog of stories for continuous improvement, which they expect to continue with once live. The team have picked up user research skills from their supplier but still expect to buy in specialist expertise in future.

They will continue with the cycle of research and analysis > development > testing > development > release in live operation. The team have a closed user group and, after research and investigation, have found no users with assisted digital needs. They have a channel in place for assisted digital - a business support team which can be reached by telephone and email.


Recommendations:

  1. Put more work into understanding the end-to-end user journey
  2. Better link up content with services - join up the to-do list with transactions.
  3. Re-look at start pages - do users have all the context they need to start a service?
  4. Consider a push mechanism for notifications rather than forcing users to check the ‘to-do’ list
  5. Check against accessibility criteria for colouring (light grey on white background, yellow on white background)
  6. Make a clearer plan for ongoing user research
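Point 5 can be checked mechanically with the WCAG relative luminance formula. The sketch below is a minimal illustration; the hex values are assumed examples, not the service’s actual palette.

```python
# Sketch: WCAG 2.x contrast-ratio check for grey-on-white and yellow-on-white
# combinations like those flagged above. Hex values are illustrative assumptions.

def relative_luminance(hex_colour: str) -> float:
    """Relative luminance of an sRGB colour, per the WCAG 2.x definition."""
    r, g, b = (int(hex_colour.lstrip("#")[i:i + 2], 16) / 255 for i in (0, 2, 4))

    def channel(c: float) -> float:
        # Linearise each sRGB channel before weighting
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

    return 0.2126 * channel(r) + 0.7152 * channel(g) + 0.0722 * channel(b)

def contrast_ratio(fg: str, bg: str) -> float:
    """Contrast ratio (lighter + 0.05) / (darker + 0.05); AA body text needs >= 4.5."""
    lighter, darker = sorted(
        (relative_luminance(fg), relative_luminance(bg)), reverse=True
    )
    return (lighter + 0.05) / (darker + 0.05)

# Light grey on white and yellow on white both fall well below the 4.5:1 threshold
print(round(contrast_ratio("#cccccc", "#ffffff"), 2))
print(round(contrast_ratio("#ffff00", "#ffffff"), 2))
```

Running checks like this in a test suite catches contrast regressions before they reach users.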

The team

Assessor comments:

There is now an internal service team in place with responsibility for this product, albeit not full-time. The main roles are covered with the product owner and business analyst working closely with an internal IT team to feed stories drawn from user research into a backlog for continuous improvement.

The team uses content and analytics specialists from other teams in the organisation. The main skillset missing is user research - the team have developed some of these skills and will buy specialists in. Significant skills transfer has taken place between the supplier and the technology team.

The team continue to use agile, prioritising as a team in fortnightly planning, and adapting their plans and processes rapidly to meet changing demands.


Recommendations:

  1. Ensure that the product/service owner has full ownership of the service and can make decisions
  2. Keep working in an agile way once the service is in continuous improvement, releasing changes early and often
  3. Link up with cross-government communities of practice for support

Security, privacy, tools and standards

Assessor comments:

The team works in a sprint cycle and deployments are linked to that. Fully automated deployments are coming. The service uses Microsoft Azure, Umbraco and a rules engine for the forms. The in-house IT team are trained and confident developing on this stack.

The team have improved test coverage.

The team are planning further phases of this project that will manage patient identifiable data. This will have a significantly higher risk profile than the current service.

The software development kit is open - the team don’t have a public git repository as yet. The team are confident in the ease with which they could switch to a different identity provider.

The team have a plan in place with minimal interruption for users for transition to the new service and decommissioning the legacy service. This is backed up with a communications plan.

There are some disaster recovery plans in place.


Recommendations:

  1. Follow the relevant government guidance on handling patient identifiable data, including NCSC’s cloud security guidance and government security classifications.
  2. Implement fully automated, zero downtime deployments.
  3. Produce an explicit disaster recovery plan.


Design

Assessor comments:

The team have evidence to show that the service can be completed by users unaided. They have uncovered a backlog of user needs for the online forms - especially around completing and understanding them - and for the minimum viable product for launch have concentrated solely on visual improvements and some validation.

The service team need to agree content changes on these forms with various internal teams, including legal. The team are keen to improve the way they identify where to improve the user journey.

The delivery partner, Reading Room, have reviewed the service bearing the GDS design patterns / content style guide in mind. The team have made some choices based on research to amend these patterns.

Statistics show 10% of users are on mobile devices - these are almost certainly HFEA staff. Some functionality relies on JavaScript being enabled.


Recommendations:

  1. Use progressive enhancement - don’t rely on JavaScript for critical functions.
  2. Check links are meaningful - multiple links saying “download PDF” are difficult for screen reader users to navigate through.
  3. Link up content and services, eg a reminder of renewal should take users to the form - not end on the “action point” text describing it.
  4. Feed back learning into the cross-government community, and draw from that community, which is solving and sharing answers to difficult user experience challenges.
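The repeated-link-text problem flagged for screen reader users can be caught automatically. The sketch below scans page markup for identical visible link text using only the standard library; the sample HTML is invented for illustration.

```python
# Sketch: flag links whose visible text repeats verbatim (eg several
# "download PDF" links), which screen reader users hear as indistinguishable.
from collections import Counter
from html.parser import HTMLParser

class LinkTextCollector(HTMLParser):
    """Collects the visible text of every <a> element on a page."""

    def __init__(self):
        super().__init__()
        self.in_link = False
        self.texts = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.in_link = True
            self.texts.append("")

    def handle_endtag(self, tag):
        if tag == "a":
            self.in_link = False

    def handle_data(self, data):
        if self.in_link:
            self.texts[-1] += data.strip()

def repeated_link_texts(html: str) -> list[str]:
    """Return link texts that appear more than once in the markup."""
    collector = LinkTextCollector()
    collector.feed(html)
    return [text for text, n in Counter(collector.texts).items() if n > 1]

# Invented sample page fragment with two ambiguous links
sample = (
    '<a href="/report-2015.pdf">download PDF</a>'
    '<a href="/report-2016.pdf">download PDF</a>'
    '<a href="/guidance">read the guidance</a>'
)
print(repeated_link_texts(sample))  # → ['download PDF']
```

A check like this could run against rendered pages in a build pipeline, prompting the team to make each link text distinct (eg “download 2015 report (PDF)”).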

Analysis and benchmarking

Assessor comments:

There is no offline equivalent service, aside from a couple of legacy forms. There is support in place for users with a dedicated business support team. The team are using Google Analytics for data analysis and have support on interpretation from their in-house content team.

The team are measuring the 4 mandatory KPIs and reporting on the performance platform. In terms of bespoke ways to ensure they are making things better, the team’s main plan is to measure the number and nature of interventions the business support team needs to make. They’re aware that their users want as few dealings with the regulatory agency as possible.


Recommendations:

  1. Ensure that insights from performance against KPIs and other metrics are used to make development decisions.
  2. Make use of the data to inform questions and tasks for ongoing usability tests.
  3. Exclude the internal HFEA IP addresses from the analytics traffic.
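In Google Analytics itself, internal traffic is usually excluded with a view filter. The same idea can be applied when analysing exported data, as in the sketch below; the IP range and visit records are invented examples.

```python
# Sketch: excluding visits from internal office IP ranges when analysing
# traffic data exported from an analytics tool. The network below is a
# hypothetical example range, not HFEA's real address space.
import ipaddress

INTERNAL_NETWORKS = [ipaddress.ip_network("203.0.113.0/24")]  # assumed office range

def is_internal(ip: str) -> bool:
    """True if the visitor IP falls inside any internal office network."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in INTERNAL_NETWORKS)

# Invented export rows: one staff visit, one external clinic user
visits = [
    {"ip": "203.0.113.10", "page": "/portal"},
    {"ip": "198.51.100.7", "page": "/portal"},
]
external_visits = [v for v in visits if not is_internal(v["ip"])]
print(len(external_visits))  # → 1
```

Filtering staff traffic out this way keeps metrics like the 10% mobile figure from being skewed by internal use.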

Testing with the Minister

Assessor comments:

The team have a new minister, Nicola Blackwood, who is aware of HFEA’s digital development work. The minister will be testing the new website and find-a-clinic tool. For this specific internal tool, the HFEA Chief Executive is engaged in the project and has already tested it a number of times. The team are in contact with the HFEA Chair, who will also test the service.

