
HRA corporate website - Beta service assessment

Categories: Service assessments

The HRA website is a key resource for the organisation, providing information, guidance, policies, consultations, calls for comment and resources. It is often the first point of contact that researchers have with the HRA and our services.

The functions and duties of the HRA are set out in the Care Act.

The HRA is creating a new website that meets the needs of users and is future-proof: one that is responsive to emerging technologies and trends and that can be built on, limiting the need for future large-scale redevelopments.

The current website has grown organically as the HRA has taken on new services and responsibilities. It is dense and complex to navigate, holding around 400 pages and over 1,000 documents. The user journeys are unclear and result in a large number of unnecessary contacts by users via phone and email.

The HRA has a duty to publish guidance on best practice in the management and conduct of health and social care research, and on the requirements that researchers conducting such research are subject to.

Department / Agency:

Health Research Authority (HRA)

Date of Assessment:
4 September 2017

Assessment Stage:

Result of Assessment:
Not pass

Lead Assessor:

Service Manager:

Digital Leader:

Assessment Report

Outcome of service assessment

After careful consideration, the panel have concluded that the HRA website is not yet ready to move to public beta because:

  • Further face to face user testing is required on the private beta site
  • Assisted digital support should be tested before launching the site publicly
  • KPIs for the existing HRA site should be benchmarked to measure success in public beta

Conditions for moving to public beta

  1. Before public beta, the team should conduct further face to face user research with a range of user types. Where possible this should be done by a user researcher, however:
    The team understands that there are budget constraints for user research. We suggest that the team contact a local organisation such as a university to carry out some face to face usability testing with new PhD students. The organisation might also be able to provide access to low digital and assisted digital students, those less familiar with the HRA, and those who have English as a second language. The panel suggest HRA observe student users trying to complete tasks and use the various interactive search facilities to see if they succeed first time (e.g. date fields). New health and social care PhD students would also be part of the most challenging user group of ‘new applicants/researchers’, who the team need to research in more depth.
  2. HRA should test assisted digital support with those supporting the site over the phone to identify any challenges these staff may face supporting the new site.
  3. HRA to benchmark the KPIs for the existing site so that it is clear what success looks like in public beta

Recommendations to prepare for live assessment

  • HRA should start the process and have a plan to acquire, or arrange knowledge transfer of, the following skills so that they can be retained once the supplier and the product delivery manager leave the project:
    • User research skills
    • Product delivery skills
    • Analytical skills
  • HRA to put in place a continuous improvement plan that includes how and when research will be conducted and when data will be fed back into the team to identify and prioritise improvements and updates in public beta and beyond. This should also include who in the team is responsible for the elements of continuous improvement.
  • HRA should implement GOV.UK common tools and patterns, where helpful
  • HRA to do a full accessibility audit in accordance with GDS standards
  • HRA to contact GDS to determine if they need a performance platform dashboard
  • HRA to create a channel shift plan
  • HRA comms team to create a plan to feed user research back to the HRA IRAS team so that valuable learning about the usability of IRAS can be actioned

User Needs and Usability

During Discovery and Alpha the team have clearly developed a good insight into their users, their needs, how they use the service, the pain points and the users who have the most challenging needs (‘new applicants/researchers’).

Some survey feedback has been provided from 17 users, but little face to face usability testing has been carried out in Beta. The Service Delivery Manager and Communications Manager have carried out some user research at a conference and events, but largely the team has not been able to observe users using the site and iterate based on those observations.

In addition testing has not yet been carried out with a full range of end users including those with low or no ability to use the digital service and those who have particular access needs.

The majority of people who have provided feedback on the service are current online users, so the team do not yet know whether less digitally minded users and non-subject experts are able to use the Beta service successfully.

The team have reviewed data from their own telephone support line and attended the NHS R&D forum, and do not believe that they have identified ‘assisted digital’ groups of users in their customer base, due to the nature of the role. However, as those who carried out testing in beta were mostly recruited online, it is difficult to judge whether users with low or no digital skills are potential users of this site.

The panel suggest that using the digital inclusion task-specific questions could help identify these groups of users, and the digital inclusion scale could then be used to map them.

The panel were pleased to see that the team have been running familiarisation sessions with HRA staff. However, the panel felt that in order to proceed to public beta, testing with telephone support staff should be carried out to identify any challenges of supporting the new site.


Team

The panel were impressed with the dedication and knowledge of the HRA team and their passion to provide an improved, future-proofed site for their users. The team working on the build were skilled in their areas of expertise and the supplier brought in to manage hosting and maintenance had been carefully chosen.

In Alpha and private beta the team have contracted their supplier to supply the following roles:

  • Account Director
  • Project Manager
  • UX Architect
  • Lead Designer
  • Lead Front End Developer
  • Lead Wagtail Developer


  • Product delivery manager (contracted until November)

Following the launch of the site the team will consist of:

  • Communications Manager
  • Communications Officer
  • Head of Communications

Although there is a UX Architect on the project for public beta, the assessment panel felt that not having a dedicated user researcher will impact the team’s ability to carry out research in public beta.

Continuous Improvement

The panel were pleased to see that the way the site has been built allows the technical flexibility to change and improve the site when required.

The panel felt that during public beta the team would benefit from having a plan for skills and knowledge transfer between the supplier and the comms team for product delivery, analytical and user research skills to support the continuous improvement of the site once the supplier and the product delivery manager leave the project.

It was felt by the assessment team that before proceeding to live, the team need to consider how they will maintain and action their backlog of user needs based on continuous research, engagement and analytics. The team should also be clear on who will be responsible for monitoring and maintaining this once the product delivery manager leaves the organisation.

The panel also felt that for planned additions after public Beta it would be important to continue to carry out further user research to ensure that the bespoke content for students, step by step guides, updated REC content and updates feed are meeting user needs (e.g. what should be in them, where would users expect to find them, which groups of users are using them etc.).

It was mentioned that analytics data will be reviewed every 3 months to look at emerging trends, mobile uptake etc. The panel thought this should be an integral part of the ongoing user research plan, and during public Beta and the initial months of Live this data should be analysed more regularly to measure success, identify issues and continue to iterate the service.


Technology

As a completely new service there are no constraints caused by legacy systems. The technology stack is also very short, making it possible to make changes regularly without major constraints. In addition, the hosting provider was able to confirm that they have experience of using all the tools and systems used to build the service, and will therefore have no issue taking over the service once it moves into public beta and live. It will be possible to assess whether this hosting provider is offering value for money by assessing against the SLA.

A variety of systems are in place to ensure real-time monitoring of the health of the service, and the panel were confident in the supplier's approach to responding to issues. Especially reassuring is the support provided by the hosting provider during beta, with a series of dedicated support developers available to respond to emergencies 24/7 and a helpdesk that is constantly monitored during office hours.

The set-up of the testing environment makes it very flexible and updates can be deployed within as little as 15 minutes. There has been a browser strategy threaded throughout the build cycle meaning that all browsers have been tested at all stages. The hosting on AWS means that there are auto-scaling options which builds confidence in the ability to scale.

The risks associated with the service being temporarily unavailable are very small, as users will still be able to access the research application, which is the most important function. In addition, there is a clear plan if the site does go down: people will still be able to call, CloudFront will automatically display an HRA-branded contact page with this information, and social media will also be used to pass on the message.
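The downtime behaviour described above maps onto CloudFront's custom error response feature, which serves a static page from the CDN when the origin fails. The sketch below is illustrative only: the error codes, page path and TTL are assumptions, not settings taken from the report.

```python
# Illustrative CloudFront CustomErrorResponses fragment, in the shape
# accepted by boto3's update_distribution DistributionConfig. All paths,
# codes and TTLs here are hypothetical.
custom_error_responses = {
    "Quantity": 2,
    "Items": [
        {
            "ErrorCode": 502,  # bad gateway: origin returned an invalid response
            "ResponsePagePath": "/errors/contact.html",  # branded contact page
            "ResponseCode": "200",
            "ErrorCachingMinTTL": 60,  # re-check the origin after a minute
        },
        {
            "ErrorCode": 503,  # service unavailable: origin is down
            "ResponsePagePath": "/errors/contact.html",
            "ResponseCode": "200",
            "ErrorCachingMinTTL": 60,
        },
    ],
}
```

Serving the error page from the distribution itself (for example, from an S3 origin) keeps it available even while the main origin is down.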

There has also been considerable thought given to mitigating the chances of the site going down. The hosting has a high level of resilience due to the failover set-up; all static data is automatically backed up by Amazon and volatile data is backed up overnight.

Data Security

The website holds no high-risk data, meaning there are no outstanding legal requirements and it will only be subject to risks that generally affect websites, such as automated attacks - it is not believed that the service will be exposed to targeted attacks. The AWS Web Application Firewall will be used to filter out such attacks, including DDoS attacks. Internal governance structures ensure that the SIRO is made aware of any risks and any actions taken to mitigate these.

The cookie policy is reasonable, ensuring that no personally identifiable or unnecessary data is stored. There is a clear explanation of how users can block Google Analytics cookies.


Open Source

The site is built on Wagtail, which is entirely open source. The vast majority of the site uses code that already existed in Wagtail, and all the code is available on GitHub, so any new code that has been developed is open to all. NHSD standards were used to guide style elements and NHS Gloucestershire is already making use of the code.

Open Standards and Common Platforms

The site platform is completely open source, meaning there is no provider lock-in risk. However, the lack of resource in the internal team means there is considerable dependence on the hosting and maintenance supplier. As the code is open and the tools and platforms used are very common, the risk that the hosting could not be handed over to another provider is small, but it would benefit all involved if a more concerted effort to transfer knowledge and skills was made. In the meantime the HRA team should ensure someone with technical understanding is engaged when major decisions are being made, so that someone can represent HRA’s best interests to the supplier and prevent any unnecessary spend or complexity being added to the arrangements.

As the site is not hosted on GOV.UK there are fewer open standards available than there would be otherwise. However, better use could be made of these to limit the chance of duplicated effort - for example, it was noted in the assessment that GDS are also looking into creating code that updates users of the template library when updates are made to the standards. This solution could be used to alert users when guidance is updated. It is also possible that GOV.UK Notify could be used for this.

Where connections to other websites are needed (for example, pulling in research summaries that are hosted elsewhere) this is managed through an API, which keeps things open and shareable. In some instances the same information (e.g. guidance for filling in research applications) is required by multiple regulators; the information is agreed by all parties, although it is not yet centralised. There is potentially an opportunity to review this, but it is outside the scope of this assessment.
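As a sketch of the kind of API integration described, the function below merges locally held page records with research summaries fetched from an external service. The field names (`research_id`, `text`) and record shapes are assumptions for illustration, not the HRA's actual schema.

```python
def attach_summaries(local_pages, remote_summaries):
    """Attach externally hosted research summaries to local page records,
    matching on a shared identifier. Field names are illustrative."""
    by_id = {s["research_id"]: s for s in remote_summaries}
    return [
        # Pages without a matching summary keep a None placeholder so the
        # template layer can fall back to a "summary unavailable" message.
        {**page, "summary": by_id.get(page["research_id"], {}).get("text")}
        for page in local_pages
    ]

# Example with stubbed data (in practice remote_summaries would come
# from a JSON API response):
pages = [{"research_id": "R1", "title": "Study A"},
         {"research_id": "R2", "title": "Study B"}]
summaries = [{"research_id": "R1", "text": "Lay summary for study A."}]
merged = attach_summaries(pages, summaries)
```

Keeping the join in one small function means the external API can change (or be centralised later) without touching the page templates.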


Accessibility

The panel were impressed with the HRA’s commitment to having a fully accessible site.

Torchbox has carried out an internal accessibility audit on the private Beta website to meet Level AA Conformance to Web Content Accessibility Guidelines 2.0 and to ensure users with visual or motor impairments are able to use the site effectively.

Following the assessment the HRA provided further details of their accessibility testing, which has been thorough.

GDS guidelines recommend a full accessibility audit to validate WCAG 2.0 compliance and therefore the panel propose that this is done before the site moves to live.


Design

A Lead Designer and Lead Front End Developer have both been involved in the development so far, and will be part of public Beta. A content designer has not been involved and there is no budget for one; however, the Communications team has been involved in reviewing some of the content and implementing a new content strategy, which includes a new style guide and a governance process for content creation and publication going forward.

The service is responsive to mobile devices, although on interactive webpages date format error messages were not visible. In public Beta it will be important to capture feedback from mobile devices to ensure that issues such as these are identified and fixed in a timely manner.

The panel recommends that before moving to live, GOV.UK design patterns should be used to help the site have a consistent user experience to other government sites.

Performance Monitoring

There are clear plans in place for monitoring the performance of the site through the use of Google Analytics and Hotjar. The team should continue to monitor the success of user journeys within the site and have a plan to act on analysis of the performance data as part of their continuous improvement plan.

The team has a set of KPIs to measure the new site; however, there does not appear to be a set of benchmarks from the existing HRA site to compare the new site to when it is launched publicly. The assessment team felt that this would be a useful set of metrics to have before commencing public beta testing, so the team could establish where improvements need to be made.
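Benchmarking along these lines can be as simple as recording current values for each KPI and comparing beta figures against them. A minimal sketch, with entirely hypothetical metric names and numbers:

```python
def kpi_changes(benchmark, beta):
    """Percentage change of each beta KPI against the existing-site
    benchmark. Positive means the metric went up; whether that is good
    depends on the KPI (fewer helpline calls is an improvement)."""
    return {
        name: round(100 * (beta[name] - value) / value, 1)
        for name, value in benchmark.items()
        if name in beta
    }

# Hypothetical monthly figures for the existing site vs. the beta site
benchmark = {"helpline_calls": 1200, "task_completion_rate": 0.62}
beta = {"helpline_calls": 900, "task_completion_rate": 0.75}
changes = kpi_changes(benchmark, beta)  # helpline_calls: -25.0 (down 25%)
```

Capturing the benchmark before launch is the important step; without it, the beta figures have nothing to be measured against.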

In terms of channel shift, the HRA team identified reducing unnecessary calls to the helpline as a KPI. The team will monitor call volumes to identify a shift away from non-digital channels to the website. However, there does not yet appear to be a plan in place for increasing digital take-up during public beta, so the panel recommends that the service create a channel shift strategy before live.

In terms of mandatory measurements, the team are working with GDS to establish if / how the four mandatory KPIs could be measured and what other KPIs might be useful to publicly display.


Summary

Overall the panel were impressed with the HRA’s high-quality team and felt that most of the points of the assessment had been met. The panel felt that the team had also demonstrated exemplary ways of working when collaborating and partnering with a supplier. DH will support HRA to identify how extra capability can be acquired to continuously improve the site, and will support HRA to action the conditions to move to public beta.

Notes from Alpha Checkpoint Meeting:

After speaking to my colleagues in the Digital Strategy Team, we believe you have done enough to pass Alpha subject to the following conditions:

  • HRA must obtain formal approval from CO before moving into any Public Beta phase.
  • HRA must plan for a GDS-led Service Standard assessment at the end of Private Beta.
  • Any extension to hosting contract for current website will be subject to spend controls (HRA will be required to submit a new spend control form)
  • HRA should identify users with assisted digital needs (if any) and speak to users with assisted digital needs at the beginning of Private Beta.
  • HRA to attend DH show and tell to share learning and experience during Private Beta.
  • HRA must test the content further at the beginning of Private Beta.

It's always good to start preparing for the service assessment now. One of the things GDS will pick up on is the structure of your team, so I would be clear about your specific roles and how you expect to work in the future. GDS will also expect the HRA team to do multiple prototypes and to explain your reasoning behind your changes from prototype to prototype. Here are some additional comments to help:

  • The service manual has a guide on assisted digital. I would read that and join the assisted digital community, where you can discuss issues related to assisted digital and digital take-up.
  • The HRA have a content heavy website, you should test the content further. As an example, I would try different variations on copy and test with users. I would also put all the content within a prototype (instead of lorem ipsum) and test it. When I looked at your prototype, it wasn't clear who the HRA were. Be sure to test the prototype with users who are less familiar with the HRA. GDS will expect you to have tested the content thoroughly and to show evidence for this during an assessment.
  • There is a Slack community for content. You will need to sign up with your government account and add the content channel. I can tell you how to do this if you get stuck.
  • I would run a design critique session. You can read more about it here and here. As an example, one thing I noticed in your prototype was that the links were not underlined.
  • I would re-use common design patterns. I can appreciate you have NHS branding, but you can use some of the learnings noted in the link.
