
Fleming Fund - Beta service assessment


In 2016 the UK government announced the development of the Fleming Fund, a £265 million project to improve laboratory capacity for diagnosis and surveillance of antimicrobial resistance (AMR) in low- and middle-income countries. The budget is taken from Official Development Assistance (ODA) and will support those countries most acutely affected by the spread of AMR. The project was assigned to the Department of Health and Social Care, which is working collaboratively with relevant United Nations agencies and other development partners. An MVP website was launched in April 2018.

The Fleming Fund identified 3 objectives for having a website:

  1. Provide priority audiences with a clear understanding of the Fleming Fund and its components.
  2. Enable the Fleming Fund to be transparent about its activities and spend:
    • Who is the Fleming Fund working with?
    • Where is the Fleming Fund working?
    • What capacity is the Fleming Fund building?
  3. Provide the AMR community with access to resources developed by the Fleming Fund.

The users of this service are:

  • Potential grant applicants to the Fleming Fund - either governments, NGOs, international NGOs or local stakeholders.
  • Specialists - academics/researchers and those working in the field of AMR.
  • Officials in other donor countries.
  • Journalists.

Department / Agency
Department of Health and Social Care

Date of Assessment:
25 July 2018

Assessment Stage:
Beta

Result of Assessment:
Met
Lead Assessor:
C. Pattinson

Service Manager:
Penny Walker-Robertson

Digital Leader:
Iain O’Neil

Assessment report

Outcome of service assessment

After careful consideration, the panel have concluded that the service met the standard because:

  • The team have substantially improved on what is currently available.
  • The team know their users and their needs.
  • The team know there is more to research and that the service will need to iterate.


The Fleming Fund team are replacing their current service, which is already publicly available. In a condensed phase they demonstrated that they understand their users and their needs. They have made sound technical choices and understand that they will need to continue to test, monitor and iterate the service moving forward.

While more needs to be done, the new service is clearly an improvement on the current site. The panel saw that it is better for users, has improved content, is more accessible and aligns with GOV.UK standards, and that the team is committed to continuing to improve and iterate.

In the future the team must:

  • Monitor the service, test with users and iterate. This is especially true for the application journey: if it is found to be sub-optimal for users, they should make sure it is improved. The panel want the end-to-end journey to be as frictionless as possible.
  • Run new discovery phases for any extra services they may wish to add if the scope of the site grows. For example, an e-learning feature was mentioned and should be suitably researched before any development work is commissioned.
  • Make clear what the licensing arrangements for the service should be, so that the team can release code to their public repository.
  • Do another test of the full user journey with the Chief Medical Officer at the earliest possible opportunity.

Ahead of going live the service team must:

  • Identify the correct licence to use to release code to their public repository.
  • Review the remaining content before publishing, as discussed in the assessment.

As part of ongoing development the service team must:

  • Understand the user journey end-to-end and capture pain points. No users have yet been through the full process of discovering, applying for and being granted funding, and doing so may reveal a need to improve the process, based on evidence. The Fleming Fund has staff in the field working with users, and this is a great opportunity to capture user feedback and observe people using the product in person.
  • Produce a continuous improvement plan for the team to follow. Fleming Fund staff intend to quality assure the service end-to-end. A calendar of quantitative and qualitative methods needs to be considered and scheduled. Baseline metrics for performance and satisfaction need to be captured so that improvements can be made and monitored.
  • Implement a process to capture feedback from users and share it with the product team. There is a risk that capturing feedback and making sense of it becomes a huge, challenging task. With a clear plan of when, what and how feedback will be captured, and the right tools to document and share it, the team can consolidate and review on a regular basis. User research insights must drive ongoing product development.
  • Make sure that the resources are available to move quickly and iterate the service.
  • Carry through with their plans to monitor the deployment and running of the service in live, to check their assumptions on availability, load and instance specification.
  • Do another test of the full user journey with the Chief Medical Officer at the earliest possible opportunity.
  • Continue to be aware, and communicate to key stakeholders, that if the functionality and scope of the site increase (for example, to take on some or all of the application process, or to add collaboration or community features), many of their infrastructure, monitoring, tooling and testing decisions will need to be reassessed, along with the team structure and whether to operate as separate services at differing levels of maturity.
  • Ease the mandatory email step for users with assisted digital needs.
  • Use insights from measuring performance and their KPIs to inform future improvements.

Service assessment

User needs

The Fleming Fund team were relatively new to user research before embarking on the design and build of this new service. Ahead of finding a supplier, some user research was conducted to understand more about the problem space. The service team identified a resourcing gap for ongoing user research and collaborated with a supplier to lead user research during the design and development phases of the project. The supplier involved the team in the process of designing, learning and testing, to educate them, and build capacity for future user research activity once the service goes live.

It is clear how user interviews informed the development of key personas. It’s also apparent that these personas have evolved over time, as more research was conducted. And it’s expected these personas will develop further as the service matures. The value of this artefact will be vital for ensuring that the team, suppliers, and stakeholders remain close to their users.

With an understanding of who the users were, evidence-based user stories were written. This helped drive the initial information architecture. Usability testing was conducted early to validate site structure decisions, and make changes ahead of prototyping.

Usability testing was conducted remotely as a mechanism to validate assumptions. Content design was a key activity during the website redesign, and testing was carried out to ensure plain English language and an appropriate tone of voice were adopted. There’s a need to put the new service in front of users to understand more about how they interact, and ensure the journey is clear.


Team

The Fleming Fund team were paired with a supplier. The team had the right composition for the current scope of the service. The panel were confident that collectively the team knew how to build an appropriate service for their intended users.

The civil servants in the team embraced digital ways of working with support from a member of the DHSC digital team and the suppliers. They clearly knew and were passionate about the work of the Fleming Fund and the panel commends their effort in learning new skills and ways of working.

They used tools such as Slack and Trello and followed typical agile ceremonies. Despite teething problems they seemed to have worked well together.

Credit must be given to the team for engaging with senior stakeholders and with the wider AMR community. They spoke of getting a minister to show public support for the Fleming Fund to raise awareness, and researched where best to publish open data for the AMR community in the future.


Technology

The team have selected WordPress as a content management system and are hosting in AWS, which gives them a great deal of flexibility as well as monitoring and alerting capability. Their decision to rely on a single instance in AWS seems well reasoned: the service is currently informational, holds no data, offers no features that require high availability, and the anticipated load is not expected to exceed current specification limits. The team also have adequate processes for backing up data, and are aware that these assumptions will need to be closely monitored on the live system. They were able to demonstrate that the means, ability and resources are available to make changes to the cloud infrastructure should the need arise.

They also have a couple of minor changes left on the backlog relating to deployment and automation, and these are on track. They have adopted tools and working practices that allowed them to iterate the service effectively in the pre-live phase, and they have effective means of prioritising and communicating tasks, and of creating and deploying new content once the service goes live. With the new architecture in place, new content can be deployed with no loss of service for users. The planned live phase will continue with no structural changes or new functionality (apart from frequent content changes), though there are epics on the backlog for possible future iterations: adding application process functionality to the site, and adding collaboration functionality.

The technical choices made, the tooling used to manage change, the monitoring capability, and the evaluations of data and privacy issues and risks all make good sense given the current scope of live functionality.

The team are making both their user research findings and source code open and reusable and were able to state that the intellectual property remains with the Fleming Fund team.

They are in contact with other ODA projects and have good lines of communication with these teams, which have many similarities with the Fleming Fund. This gives them opportunities to share learnings, and potentially code, with these teams and with a wider audience. They have some final checking to do on the right licensing to state in their public repositories, but the team demonstrated an awareness of the issues and are able to make these final determinations.

The team are able to test the site in a like-for-like environment, and the ease with which they can deploy to a new environment is fit for their current purposes. Similarly, their plan for the eventuality of the service going offline (making sure that regional coordinators' processes are in place to cover this) is proportionate to the planned non-transactional nature of the site. This, and many of the team's current assumptions, will need to be revisited should this change; the team demonstrated full awareness of this, have the means to make it happen, and know the dates and planned feedback activities at which this decision might be made.


Design

The service has an approved GOV.UK exemption. The team are still using many GOV.UK patterns and followed content best practice. It is a clear improvement on the current site.

The team did demonstrate how they were continuously learning and iterating on findings from research. For instance, the team has a low bandwidth version of the site for those known users who are in areas with a poor connection.

The team demonstrated good knowledge of, and commitment to, accessibility testing. They carried out multiple accessibility tests on the service using their own devices. A useful addition would be to test with real users with accessibility needs, rather than testing only in-house.

The assisted digital journey is nearly there. There are regional coordinators in the regions the Fleming Fund targets, who are on hand to help with any queries and to administer the service if a user cannot do so online. The one issue is that discovering assisted digital support before applying (which requires emailing the Fleming Fund) was not obvious, and should be made clearer to the user.

The team's scope was limited to some specific journeys rather than the full end-to-end service the Fleming Fund may one day become. One major reason for this is that many grantees have not yet completed their grants and are not ready to report or share their findings; eventually this will come into scope. The panel's biggest concern was the application journey. It requires further monitoring and working with previous suppliers. The team seemed committed to making changes if they gather the necessary evidence to do so.


Analytics

Unfortunately, the team could not retrieve benchmark analytics from the current site. They are getting some data from the existing supplier that manages most of the service (but no longer the website), which should continue to be monitored - for example, the quality of applications.

The team are using Google Analytics on the site and have someone capable of monitoring the site's performance. The panel expects this to be done regularly, not intermittently. The team are also embracing GCS measurements for engagement.


Recommendations

Below are further recommendations for the project team. These are not conditions for going live:

  • Consider user needs before embarking on any future development. A number of large features were discussed on the product roadmap. It is not clear whether these features are based on assumptions or linked to user needs. By capturing evidence now, the team can rethink how valuable these features are, and prioritise when to deliver them.
  • Work more in the open and share their work publicly. They can be proud of their approach and hard work, and the panel would encourage the team to share their story with a wider audience.
  • Closely monitor their tooling for prioritising, creating, allocating, proofing, deploying and managing content, as the frequency and breadth of content and the number of content contributors may grow over time - especially since raising awareness of, and engagement with, the Fleming Fund are among the goals of the service.
  • Conduct a usability test with users with accessibility needs.
