Mixing our methods: How we improved usability with qualitative and quantitative design research

In every industry, there’s risk involved in doing something new. So when we come to the table with new methods for delivering digital services in government, we understand why our partners in the public service are cautious, or even hesitant, to jump on board right away.

How do we build trust with our partners and make sure that the services we develop together are actually usable and appealing to people? We lean on the strength of evidence.

Recap of qualitative research

Over the last few months, while working on the Royal Canadian Mounted Police (RCMP) “Report a Cybercrime” tool, we tried a new research tactic to build an evidence base: mixed methods design research.

What does this mean? Well, you may recall that during early qualitative research sessions, we spoke with potential users of the service to understand their needs. Based on the findings of this research, we iterated on the prototype to meet those needs.

What changes did we make?

  • We simplified the form layout and reduced the number of questions per page.
  • We added emotionally reassuring language to decrease shame, telling victims, “you’re not alone, you’re doing the right thing.”
  • We added information on strategies to recoup losses and protect victims in the future.

While we improved the prototype through that qualitative work, we still needed to test the impact of these changes on a larger segment of the population. That’s where quantitative research comes in.

Confidence through quantitative research

Where qualitative research involves fewer participants but surfaces rich stories, quantitative research involves surveying a large number of participants to determine the prevalence of each experience. We mixed the two methods by using quantitative research to test the impact suggested by early qualitative findings.

To do that, we compared the existing reporting service against two new prototypes. We split an online panel of people who have been affected by cybercrime into three groups and asked each group to complete an incident report using one of the three versions of the service. Afterwards, we asked them for input on service usability and satisfaction. Using a scoring method we developed with our partners at the RCMP, we looked at which of the three versions had a higher completion rate, produced higher-quality reports, and was more usable.
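The comparison above is a classic between-groups experiment. As a rough illustration, here is a minimal sketch of that kind of analysis in Python; the participant data and the summary metrics are hypothetical placeholders, not the RCMP’s actual scoring method or results.

```python
# Hypothetical between-groups comparison: one panel group per version
# of the service, scored on completion rate and usability.
# All data below is illustrative, not real study results.

def summarize(responses):
    """Return (completion rate, mean usability score among completions)."""
    completed = [r for r in responses if r["completed"]]
    rate = len(completed) / len(responses)
    usability = sum(r["usability"] for r in completed) / len(completed)
    return rate, usability

# Toy responses: each dict is one participant's session with one version.
groups = {
    "existing form": [{"completed": True, "usability": 58},
                      {"completed": False, "usability": 0},
                      {"completed": True, "usability": 62}],
    "prototype 1":   [{"completed": True, "usability": 71},
                      {"completed": True, "usability": 69},
                      {"completed": False, "usability": 0}],
    "prototype 2":   [{"completed": True, "usability": 79},
                      {"completed": True, "usability": 75},
                      {"completed": True, "usability": 81}],
}

for name, responses in groups.items():
    rate, usability = summarize(responses)
    print(f"{name}: completion {rate:.0%}, mean usability {usability:.1f}")
```

A real analysis would also test whether the differences between groups are statistically significant, rather than just comparing raw averages.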

We learned valuable information from this exercise:

  • Both prototypes were rated as being more usable and were more likely to be completed than the existing form.
  • Participants were more likely to want to use Prototype 2 (the one with reassuring language) than the other versions.
  • Prototype 2 and the existing form gave the same quality of reports.

This information ultimately helped us and the RCMP decide to use Prototype 2 as the foundation of the reporting service going forward. That means deviating from what was used in the past and trying something new.

A partner’s perspective

When we presented our proposed service (Prototype 2) and the findings on both the experience of using the service and its impact, decision makers saw the value in understanding and addressing victims’ needs.

Chris Lynam is one of those decision makers. Chris, who is Director General of the National Cybercrime Coordination Unit at the RCMP, shared his thoughts on the value of using design research to deliver a service that meets the needs of cybercrime victims.

What has been the most surprising or interesting thing that you’ve learned about cybercrime reporters since beginning this partnership with CDS?

Chris: The project and the partnership have provided previously unknown insight into why people report (or don’t report) cybercrimes or frauds to police.

The research showed that those who report can generally be grouped into several types. For example, some report for altruistic reasons, so that others will not be similarly victimized, while others are seeking specific follow-up actions by police in response to what has happened to them. Clearly understanding what motivates victims to report, and what they are expecting, has been fundamental to properly designing the new public reporting system so that it can best meet victims’ needs.

How did findings from the quantitative experiment change your perspective on the project?

Chris: The quantitative experiment was extremely useful in confirming that pages with less densely structured forms could achieve higher completion rates than existing approaches, while still obtaining reports of similar quality.

What was one helpful thing you learned about reporters of a cybercrime in the Alpha phase?

Chris: The research was particularly useful for understanding the level of shame many people feel when they have become a victim of a cybercrime. It gave me a new appreciation of an aspect that I had not fully considered before.

Many victims conveyed they felt foolish that they were duped and this in itself was an impediment to reporting to police (i.e., they were embarrassed and did not want to revisit those feelings of shame). One elderly victim conveyed that they did not want to even tell their family they had been victimized as they worried that their family would question their ability to maintain their independence.

Understanding how the victim is feeling has allowed us to try to build an online reporting system that uses reassuring language to convince victims they should not feel any shame, and that if they report, police will be better able to pursue suspects and help prevent others from being victimized.

Power in numbers

Designing services and getting them exactly right for the people who use them is challenging. No single method can answer every pressing question. Qualitative research is sometimes dismissed as anecdotal, while quantitative research may lack the social context needed to have an impact and be memorable. Evidence becomes stronger, and design decisions become clearer, when we combine methods.

And as we’re seeing in this RCMP product, where there are numbers to support it, trying new things can feel far less risky.