
Learning from the people who want to use our reporting service. (But might not use it now.)

Our raison d’être at the Canadian Digital Service (CDS) is putting people first in order to make government services easier for them to use. So naturally, to help us better understand how we can do that, we talk a lot about conducting research with people who use our services.

But there’s also another group we can learn from: people who don’t use our service. These could be people who have used the service before, or who attempted to use a similar service in the past, but would opt out of one in the future.

To learn more, I sat down with Mel Banyard, a Design Researcher who is working with the Royal Canadian Mounted Police (RCMP) to build the public-facing side of a cybercrime reporting tool.

Can you give an example of why a person might not report a cybercrime to the police, even if they’ve been a victim?

Mel: In an earlier phase of research, we discovered some overarching reasons why, like victims being unsure if what they experienced was actually a crime, not knowing what their options are for reporting, or having felt disappointed or intimidated by law enforcement in the past.

But I would say that shame is the biggest barrier we need to consider while designing and developing this service. It’s the most common reason we heard for why people wouldn’t report a cybercrime, or actually haven’t reported a crime in the past.

If you feel ashamed of what happened to you, it could be because you think it’s your fault: you did the wrong thing online, you aren’t smart enough to use the internet, or you aren’t super computer-savvy. That shame keeps people from reporting because they are embarrassed and afraid. They feel that if they go to the police, the police will judge them.

It even prevents people from telling their friends and family, who could potentially be a reliable source of information and support. After speaking with police officers, we learned that senior citizens may be afraid of telling their families about an incident because they feel it might make people think that they can no longer protect and care for themselves. If you’re already vulnerable, becoming a victim may lead to further isolation.

People who don’t report, or “non-reporters,” are not a static group. Feeling embarrassed is a universal thing any of us could experience at any point in our life. Yes, we are designing for people who would report, but everyone has the potential to become the person who says “I wouldn’t report,” or “I won’t report in the future.” We hope to surface their needs early on to try to prevent those feelings that lead to not reporting.

What have you learned about building a service that people don’t have to use, but choose to use, like a cybercrime reporting tool?

Mel: Even though we’re building a reporting tool, reporting in itself is not a need for cybercrime victims. We learned that, for victims, reporting is a means to an end: getting support, emotional reassurance, and guidance on how to prevent this from happening again.

There also isn’t a lot of education around cybercrime right now, which is challenging because you have to design for people who are experiencing a thing that they might not have the words to describe.

There’s value in learning from people who don’t use your service, whether because they’ve had a bad experience in the past, don’t know how to reach it, or for other reasons. It has helped highlight areas of opportunity where we can deliver other relevant services and features alongside the ability to report a cybercrime to the police.

Instead of building a service only for people who know what cybercrime is, understand that it’s a criminal offence, or have the confidence and trust to report to the police, we’re trying to build something that everyone can use.

How are you and the team using this information to make the experience better for people to report?

Mel: We put together a document that highlights all the reasons why people don’t report. We listed all those bad experiences and barriers, and we asked ourselves, “What can we learn from these? How can we look at each of those points of discontent and turn them into positive experiences, or prevent them in the future?”

We took the data from the existing reporting experience and created design recommendations in the form of a checklist. This checklist became the standard for a good reporting experience, which our designers, developers, and product manager relied on throughout prototyping.

A screenshot of the checklist which reads, “Provide a clear definition of what a cybercrime is, or what types of cybercrimes will be accepted and processed within the system. Reassure people that they are in the right place to report a scam, an online fraud, online harassment, etc. Provide examples of what kinds of cybercrimes can be reported.”

We’re currently conducting validation and usability tests to ensure that we’re helping victims effectively report a cybercrime while also delivering emotional reassurance and guidance. Our next steps include exploring how we can design based on the recommendations we received from interviews and testing.
