
https://digitalhealth.blog.gov.uk/2014/11/05/intranet-testing-testing-and-more-testing/

Intranet: Testing, testing and more testing

Posted by: Sarah Wood, Posted on: 5 November 2014 - Categories: Capability, Insight


Dilbert cartoon depicting joke about user testing
Courtesy of Dilbert.com

You’ve probably heard that the Department of Health (DH) recently launched a new intranet, which has been well received across the Department, by both general staff and our business units. Part of this success story was our approach to testing.

Apologies in advance for such a long post, but we’ve had a few enquiries about how we ran our testing and what lessons we learned. This is my attempt to oblige.

As an agile digital team, running an agile intranet project, I always wanted our testing model to reflect the principles of agile.

Here’s how Wikipedia describes ‘Agile testing’:

Agile testing is a software testing practice that follows the principles of agile software development. Agile testing involves all members of a cross-functional agile team, with special expertise contributed by testers, to ensure delivering the business value desired by the customer at frequent intervals, working at a sustainable pace.

Agile development also recognises that testing is not a separate phase that you do at the end of a project, but an integral part of software development which needs to be done throughout the iterative process.

With those principles in mind, the first thing we did was to identify the areas of testing we felt were *really* needed for this project (outlined later). That done, we then had to decide how to deliver this programme of testing within the project timeline. What followed was a myriad of meetings with teams and individuals in the department, some digital services procurement, lots of internal communications and calling in of favours, and plenty of cajoling.

The end result was a virtual test team made up of approximately 300 DH staff members. The majority of the team were volunteers who responded to adverts placed on internal comms channels or our digital champions group. Others had been involved in our intranet discovery and alpha projects and some people/teams were invited to bring specialist knowledge and expertise (eg. hands-on, contacts or assurance) in areas such as IT security, IT systems integration, accessibility, policy making, design etc.

We also produced a couple of pieces of traditional IT documentation. Not exactly agile per se, but useful to us and reassuring to our IT department:

  • Test approach - a detailed outline of how we would run our testing. This included our testing methods (which I will describe shortly), our entry and exit criteria for each stage of testing, roles and responsibilities, risks and mitigations etc.
  • Testing plan/schedule - mapped out exactly when various tests would run, who would need to be involved and when, the timing of testing comms, when we’d collate and analyse testing results, periods of bug fixing, retesting etc. All set against the major project milestones.

And here’s a whistle-stop tour of how we delivered each of our testing areas:

Validation Testing

Validation testing ensured delivered functionality worked as intended and met the user stories defined at the beginning of the project or sprint.

We used 5 validation testers to test all delivered functionality at the end of each sprint. To make results collation easier, we used a single common spreadsheet for testers to complete. The spreadsheet contained the user stories, a description of how the functionality was meant to work, a results column for testers to mark each test as a ‘Pass’, ‘Fail’ or ‘Partial fail’ and another column for notes.
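
For a flavour of the collation step, here’s a purely illustrative Python sketch that merges per-tester spreadsheets (exported as CSV) into a pass/fail summary per user story. The folder, file and column names are hypothetical, not our actual template:

```python
# Illustrative only: collate per-tester results CSVs into one summary.
# Folder, file and column names are hypothetical.
import csv
from collections import Counter
from pathlib import Path

summary = {}  # user story -> Counter of 'Pass' / 'Fail' / 'Partial fail'

for path in Path("sprint-results").glob("tester-*.csv"):
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            story = row["User story"]
            result = row["Result"]  # 'Pass', 'Fail' or 'Partial fail'
            summary.setdefault(story, Counter())[result] += 1

for story, counts in sorted(summary.items()):
    print(story, dict(counts))
```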

The testers were chosen from the intranet project team as we felt that validation testers needed a good understanding of the user stories. And, as a lot of functionality needed to be tested in both the front and back end of the website, we felt that experience using WordPress was also a necessity.

To strengthen our testing further, we also varied the locations of our testers and the devices they used, and ran all tests on both of DH’s supported browsers.

Results were logged with our developers for investigation and any defects found were assessed, categorised (P1, P2, P3 or P4), prioritised (H, M, L), assigned a target resolution date and added to a definitive bug list.
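
As an illustrative sketch only (the field names are invented, not our actual bug list columns), a defect record using those categories and priorities might look like this:

```python
# Hypothetical defect record mirroring the triage described above.
from dataclasses import dataclass
from datetime import date
from enum import Enum

class Category(Enum):
    P1 = 1  # most severe
    P2 = 2
    P3 = 3
    P4 = 4  # least severe

class Priority(Enum):
    H = "High"
    M = "Medium"
    L = "Low"

@dataclass
class Defect:
    summary: str
    category: Category
    priority: Priority
    target_resolution: date
    status: str = "Open"

# Example entry on the definitive bug list (invented for illustration)
bug_list = [
    Defect("Comment button has no keyboard focus state",
           Category.P2, Priority.H, date(2014, 9, 1)),
]
```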

Integration Testing

Integration testing ensured delivered functionality worked across all DH browsers and devices, and with other IT systems and business applications.

As 2 of our core user needs were remote access to the intranet and availability on mobile devices, integration testing was an integral part of our testing programme.

We approached our IT Testing team for their expertise and 2 professional system testers were allocated to the intranet project. They identified and performed 37 tests using the 11 supported DH devices and the 2 DH supported browsers. Results were fed back to the Digital team and logged with our developers for investigation. Any defects found were treated as described previously.
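
For illustration, the device/browser matrix our system testers worked through can be enumerated like this. The device and browser names are placeholders, not DH’s actual supported list:

```python
# Illustrative only: enumerate every device/browser combination to cover.
from itertools import product

devices = [f"device-{n}" for n in range(1, 12)]  # 11 supported devices
browsers = ["browser-A", "browser-B"]            # 2 supported browsers

matrix = list(product(devices, browsers))
print(f"{len(matrix)} device/browser combinations")
for device, browser in matrix:
    print(f"Run integration tests on {device} using {browser}")
```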

Accessibility Testing

We felt that the new intranet needed to meet 2 areas of accessibility compliance. The first was the Digital by Default Service Standard which stipulates a minimum W3C AA accessibility rating and the second, was that the intranet needed to work for our internal assistive technology users. We therefore ran 2 sets of accessibility testing:

  1. Code testing - The Digital team tested the intranet code with the W3C Validator, a reliable low-cost tool used by developers and organisations to validate HTML, CSS and accessibility against W3C and WCAG global web standards. Our developers were already using another low-cost accessibility tool called Total Validator for their checks, so we thought it best to perform our testing using a different tool (see the sketch after this list for how such a check can be automated).
  2. User testing - To ensure the new intranet worked for internal assistive technology users, we got in touch with DH’s Business Support and Accessibility Manager. She provided us with a list of the 6 assistive technologies supported by DH and a selection of users we could approach for testing. We had planned to ask testers to explore the intranet and to provide them with testing scripts for testing specific tasks, based on top user stories or areas where we had concerns about accessibility eg. functionality which required visual cues such as commenting or voting. After discussing this approach with some of the testers, it soon became apparent that testing scripts would not be viable for testers with certain disabilities. In these cases we provided a description of the functionality we wanted tested.
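
The validation itself was done with the tools above, but as a purely illustrative sketch, a similar markup check could be scripted against the W3C Nu HTML Checker’s JSON interface. The intranet URL below is a placeholder, and this is an assumption about how one might automate it, not what we actually ran (a real intranet page wouldn’t be reachable by the public checker, so you’d post the HTML itself or run the checker locally):

```python
# Illustrative sketch: querying the W3C Nu HTML Checker's JSON interface
# for a single page. The page URL is a placeholder, not a real DH address.
import json
from urllib.parse import urlencode
from urllib.request import Request, urlopen

page = "https://intranet.example/homepage"  # hypothetical intranet page
query = urlencode({"doc": page, "out": "json"})
req = Request(
    f"https://validator.w3.org/nu/?{query}",
    headers={"User-Agent": "markup-check-sketch"},  # service expects a UA
)

with urlopen(req) as resp:
    report = json.load(resp)

# Print only the errors; the checker also returns info and warning messages
for msg in report.get("messages", []):
    if msg.get("type") == "error":
        print(msg.get("message"))
```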

All bugs/issues were reported back to our developers for fix during this project. As compliance with the Government Digital by Default Service Standard and W3C AA accessibility was stipulated as a requirement of the project, any non-compliance had to be rectified before the project could be closed.

Regression Testing

To mitigate the risk of introducing new bugs or regressions through the iterative development process and on-going bug fixing, we decided to introduce a period of regression testing towards the end of the project. This happened prior to penetration testing and involved repeating all the validation tests from all the sprints.

To bring a fresh and objective approach to this testing, the work was undertaken by our 2 professional system testers who had previously performed integration testing. They knew enough about the system not to require training, but had not been involved in any previous validation testing. Any defects found were treated as described previously.

User Acceptance Testing (UAT)

As a team running an agile project, we always put user needs at the heart of what we were trying to achieve with the new intranet. And end user testing was key not only to determining whether we had achieved our project objectives, but also to how the new intranet would be received in DH.

We decided early on that we wanted to get testers on the site and testing as soon as possible. That would enable us to fix as many issues and bugs as possible before the launch and to begin making the new intranet a part of normal business life as soon as we could. And, in doing so, avoid the ‘big launch’ expectations that have been the downfall of many an IT or digital project.

We took 3 approaches to UAT:

  1. General feedback - general exploration and bug logging
    Straight after beta launch we issued a testing invitation to 20 testers (chosen from 200+ testing volunteers) to gauge the initial response to the new intranet and identify any major issues that could impact testing, such as problems logging in. The testing invitation informed testers that the beta site had been released and how to get to it, explained what beta meant (eg. that this isn’t a finished product – managing expectations) and explained how testers could get involved in the 3 approaches to UAT and when each was happening. We also included a link to a known bugs/issues page (on the new intranet) and links to a dedicated Yammer group and email inbox we had set up for testers to provide general feedback and log bugs.
    The feedback from the 20 testers was favourable and no major issues were identified, so we emailed the same testing invitation to the rest of the 200+ testing group.
  2. Soft/informal UAT - specific but unscripted user experience tests
    Soft/informal UAT took place over a 3-week period and targeted user experience. All 200+ testers were included by default, with an option to opt out. This was all explained to testers in the original testing invitation. Each Monday, for 3 consecutive weeks, an email containing a short 5-minute test and a link to an email feedback form was sent to testers. The testers were given until the end of the following Friday to complete each test and submit their feedback form. They did not have to commit to completing all 3 tests. All 3 tests were based on the top user stories. Test 1 covered navigation and downloading files, Test 2 covered the personalised location news functionality and multimedia, and Test 3 covered searching for the most popular content.
    Segmentation information was also captured in the test feedback forms so that a good cross-section of DH staff participation could be ensured. We could then send out supplementary tests to groups we felt were under-represented in the results.
  3. Hard/formal UAT - scripted tests for specific end-to-end processes
    Hard/formal UAT took place over a one-week period and targeted a number of core end-to-end processes, such as password authentication, commenting and moderation, and editorial workflow. Testers were informed of the approach to ‘Hard’ UAT in the original testing invitation and asked to volunteer for a 30-minute test by filling in a short email form. Again, segmentation information was requested in the volunteer forms so a good cross-section of DH staff participation could be ensured.

The results of all soft and hard testing were analysed by the Digital team and all bugs/issues/enhancements were reported back to the developers for fix during the project or captured for future assessment and possible inclusion in the product icebox.
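
As a hypothetical sketch of the segmentation check described above (the segment fields and sample records are invented for illustration):

```python
# Illustrative only: spot under-represented groups in UAT feedback.
from collections import Counter

# One record per submitted feedback form (invented sample data)
responses = [
    {"tester": "t1", "grade": "SEO", "location": "Leeds"},
    {"tester": "t2", "grade": "G7", "location": "London"},
    {"tester": "t3", "grade": "SEO", "location": "London"},
]

for field in ("grade", "location"):
    counts = Counter(r[field] for r in responses)
    print(field, dict(counts))  # low counts suggest a supplementary test
```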

Penetration testing

For those who don’t know, penetration testing looks for security vulnerabilities in IT systems. As 2 of our core user needs were remote access to the intranet and availability on mobile devices, it was essential that the new intranet underwent penetration testing.

Penetration testers are sometimes referred to as ethical hackers, and penetration testing is a specialism we don’t have in the Digital team. So we arranged a meeting with our IT security team, the subject matter experts for this specialism in DH.

It was soon established that we couldn’t perform this testing in house and that due to resourcing issues, the IT security team wouldn’t be able to project manage this work for us. It was therefore agreed that the Digital team would procure a specialist supplier to perform the testing and that our IT security team would assist us in 2 ways:

  • Support supplier procurement
    set the standards required of a penetration tester (eg. CESG CHECK accreditation) and take part in the supplier sift.
  • Provide testing assurance
    set the exit criteria for the testing, sign off the final testing report and, if necessary (it wasn’t), present the results to the intranet steering board.

Conclusion

By taking an agile "whole-team" approach to testing, we were able to "bake quality" into our product. And, in the process of doing so, we built bridges and forged new relationships with areas of the business and staff who were previously sceptical about, or concerned by, the agile approach or ICT projects in general.

Civil servants, as with staff from many large organisations, are not used to being involved in ICT projects. More often than not, they are used to coming into work one day to a new intranet or business application and then having to figure out how to do business as usual with it. We were pleasantly surprised by the enthusiasm and the number of staff who wanted to be involved in the project and testing. And a large part of the new intranet’s success is down to the hard work of our many testers, who helped us to iron out the majority of bugs before the launch and who continue to provide us with good ideas for enhancements and test new functionality for us.

Our collaborative, transparent and continuous approach to testing and the early involvement of testers in the project, also helped to foster what some of our volunteers have described as a ‘feeling of shared ownership’ of the new intranet. I have no doubt that this ‘feeling’ was fundamental in helping us to avoid what has become a common problem with new product launches, where a lack of end user involvement and testing leads to a build-up of user expectation, which combined with a ‘big launch’ can spell disaster – think HealthCare.gov.

That’s not to say everything ran smoothly. Testing was largely missed from the original project scope, which meant that it wasn’t fully factored into the project timeline and no resource or budget had been allocated for it. This made my job, as the person brought in to fix the problem by devising and leading a testing approach, very difficult. The creation of our virtual testing team was as much about resource as it was about agile. There were other tests that I planned but couldn’t deliver because of time constraints, and tests that I might have run differently if we’d had more time. For instance, we were still running UAT during penetration testing, which is not best practice. We also required daily reports and conference calls during penetration testing so that our developers could start fixing immediately, as we didn’t have time to wait for the full report.

Assembling our virtual test team also took a lot of time. If we had factored this need and started this engagement earlier on in the project, we would have had more time for testing. We were very lucky that staff were enthused and wanted to take part, and that areas of the business were prepared to share resource for this project.

So a definite lesson for our team is to ensure that the full lifecycle of a product (including business as usual, once the product is live) is considered in the original scoping of a project, and that the minimum needed is agreed upon and built into the future model.

If you have any questions about our new intranet, or about the testing, do feel free to drop me a line or comment here.


2 comments

  1. Comment by Dr Tim Chadborn posted on

    Hi Sarah,

Your testing was extensive!

    Are you interested in A/B testing? I lead the PHE Behavioural Insights Team (Dan Berry leads the equivalent team in DH), where we have some experience in this area. It could improve the efficiency of the Intranet and Internet site and we could learn from it too for wider application in the health and social care system.

    Please get in touch if you're interested in collaborating.

    Tim Chadborn
    Behavioural Insights Lead, PHE


      Comment by Sarah Wood posted on

      Hi Tim

      We'd love to do some A/B testing on the new intranet and it would be great to collaborate. Dan did a presentation for us a little while back on some of the A/B testing his team worked on for organ donation on GOV.UK. Really interesting stuff. We've just finished sprint planning for our pre-xmas sprint but will begin planning for the next sprint in January. That would be a great time for you to come in as once we've identified what the priorities are for that sprint, we'll need to flesh out the user stories and define what we want to develop and I think that's where some A/B testing would come in very useful. You might also have some other ideas for things we could test. Can I drop you a line at the beginning of January?

      Thanks
      Sarah