How we measure our delivery and why it’s useful

If you measure it, you know if you’re making it better. That’s why at the Canadian Digital Service we like to have a measure of where we are before we start iterating. That way, we know how far we’ve come after improving a product or service.

For the past year, a team made up of both Canadian Digital Service (CDS) and Veterans Affairs Canada (VAC) staff has worked together to build Find benefits and services. Together, we conducted design research to identify the needs of Veterans and their families.

We focused on collecting metrics called key performance indicators (KPIs). These KPIs helped us assess whether the product was working, based on what Veterans told us. We also tracked indicators of agile product development and human-centred design, all based on the Government of Canada Digital Standards, which set out how government services should work and be assessed.

Key Performance Indicators (KPIs)

From our in-depth user interviews in discovery, we identified three indicators to test the product against: completion rate, confidence, and comprehension. Along the way, we tested against these KPIs to measure the impact of each change.

The following metrics are taken from our final usability test with 31 participants. We explain why we chose to test for each of these KPIs, how we knew the test met the goal, and what the average score was following our rounds of testing.

Completion rate

Why: Veterans reported they needed guidance while searching for benefits. Completion rate lets us understand if Veterans can find benefits and are taking their next steps to learn more or apply.

Score: 79.3% of participants reported finding benefits relevant to their needs.

Confidence ranking

Why: Veterans reported they need better-set expectations and clearer information during the process.

Score: Participants gave an average score of 7.4 out of 10 on how confident they were in their results.


Comprehension

Why: Veterans reported they experience information overload, disconnect with VAC’s language, and stigma when deciding to apply.

Score: 1 in 4 participants reported a single unclear word.
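The three KPI scores above are simple aggregates over per-participant records from a usability session. Here is a minimal sketch in Python, with hypothetical field names and made-up responses (not the team’s actual data):

```python
from dataclasses import dataclass

# Illustrative usability-test records; the fields and values below are
# our own assumptions, not CDS/VAC session data.
@dataclass
class Response:
    found_relevant_benefit: bool  # completion: found a benefit for their needs?
    confidence: int               # self-reported confidence in results, 0-10
    unclear_words: int            # number of words the participant flagged

responses = [
    Response(True, 8, 0),
    Response(True, 7, 1),
    Response(False, 6, 0),
    Response(True, 9, 0),
]

n = len(responses)
completion_rate = sum(r.found_relevant_benefit for r in responses) / n
avg_confidence = sum(r.confidence for r in responses) / n
unclear_share = sum(r.unclear_words > 0 for r in responses) / n

print(f"Completion rate: {completion_rate:.1%}")         # 75.0%
print(f"Average confidence: {avg_confidence:.1f}/10")    # 7.5/10
print(f"Reported an unclear word: {unclear_share:.0%}")  # 25%
```

With 31 participants instead of four, the same arithmetic yields the scores reported above.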

*A part of comprehension is the reading level of our content. We measured the reading level of our one-line descriptions to ensure that Veterans can quickly scan the content, get a sense of the benefit, and see if it’s one for them.

The average reading level:

  • English: grade 8.5
  • French: grade 9
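The post doesn’t say which readability tool the team used, but for English content a common approximation is the Flesch-Kincaid grade formula: grade = 0.39 × (words per sentence) + 11.8 × (syllables per word) - 15.59. A rough sketch follows; the syllable counter is a crude vowel-group heuristic, and French requires an adapted formula, so treat the numbers as approximate:

```python
import re

def count_syllables(word: str) -> int:
    # Crude heuristic: count runs of consecutive vowels; real readability
    # tools use dictionaries or more careful rules.
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def flesch_kincaid_grade(text: str) -> float:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (0.39 * len(words) / len(sentences)
            + 11.8 * syllables / len(words)
            - 15.59)

# A short, plain one-line description scores a low grade.
print(flesch_kincaid_grade(
    "This benefit helps you pay for home care. Apply online or call us."))
```

Scanning each one-line description this way makes it easy to flag content that drifts above the target grade level.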

Other metrics

To ensure the product met the Government of Canada Digital Standards, we conducted regular accessibility audits, tracked deployment figures, and ran usability tests.

Design with users

Description: Research with users to understand their needs and the problems we want to solve. Conduct ongoing testing with users to guide design and development.

Score: We did research and usability testing with 106 participants, including 81 participants over four sessions from the beta phase.

Iterate and improve frequently

Description: Develop services using agile, iterative, and user-centred methods. Continuously improve in response to user needs.

Score: Throughout beta, the delivery team averaged 17 production deployments a week.

Build accessibility in from the start

Description: Services should meet or exceed accessibility standards. Users with distinct needs should be engaged from the outset to ensure what is delivered will work for everyone.

Score: The product was 89% compliant against Web Content Accessibility Guidelines (WCAG) 2.1 AA.
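A compliance percentage like this is typically the share of applicable WCAG success criteria that passed the audit. A minimal sketch, with illustrative criterion IDs and results (not the team’s actual audit data):

```python
# Hypothetical audit tally: each applicable WCAG 2.1 AA success
# criterion is marked pass (True) or fail (False).
audit_results = {
    "1.1.1 Non-text Content": True,
    "1.3.1 Info and Relationships": True,
    "1.4.3 Contrast (Minimum)": False,
    "2.4.7 Focus Visible": True,
}

compliance = sum(audit_results.values()) / len(audit_results)
print(f"{compliance:.0%} compliant")  # 75% compliant
```

Repeating the tally each audit cycle shows whether accessibility is improving release over release.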

Collaborate widely

Description: Create multidisciplinary teams with the range of skills needed to deliver a common goal.

Score: VAC’s delivery team of 8 represented 8 disciplines (product management, delivery management, development, DevOps, business analysis, design, design research, content design).

Passing the torch

VAC is now gearing up for a public release by the end of April, and is continuing the beta phase until then. Our testing to date has been in a moderated setting, and while we have strong data telling us that Find benefits and services works for Veterans, we won’t know for sure until the tool goes live. But that’s the value of an iterative approach: you get feedback early and often to limit risk.

We’re honoured to have been able to work with VAC and be a part of serving Veterans better. We can’t wait to see how the VAC team responds to the feedback Veterans give the tool once it’s “in the wild” and is being used by thousands of people. We loved working with the delivery team and are excited to see them continue to explore and improve on the service, with Veterans and their families at the centre.

Interested in seeing if this product can help your users find benefits?

We’ve made a generic version of this product that is available on GitHub and can be adapted by any department for their specific use case!

If you’re looking to take the code - or any other part of the process - and hit a snag along the way, feel free to open a GitHub issue or contact us by email. Alternatively, you can find us both on Twitter at @sidewalkballet and @stevieraytalbot.