Building for learning and iteration with our web search bar
If helpful information exists on a website but no one is able to find it, does that information really exist?
That’s one of the key questions we explored on the Canadian Digital Service (CDS) website when we implemented a new site-wide search bar feature. Here’s what we found.
Building for learning and iteration
At CDS, we publish blogs, guides, reports, and job opportunities to support government teams with their digital service delivery. Our goal in building this search bar was to make it as easy as possible for people to find those relevant resources quickly.
In keeping with the CDS value of “build for learning and iteration”, we know that a product is never fully “done”. So, our small but mighty website team released the new search feature knowing it would require continuous testing and improvement.
As a lifelong learner, I was excited to get the new web search feature in front of people to see how it was helping - but I was even more excited to learn about how it wasn’t, and what we could improve.
Getting feedback with limited team capacity
Typically, a product release is followed by various forms of analysis and usability testing. But as a team of 3 (an interaction designer, a developer, and myself covering content design and product management while our Product Manager was on language training), we didn’t have the capacity to create a full-fledged research plan while also maintaining the day-to-day functions of the website.
So we had to be a little more resourceful and start small, looking for quick wins: light-lift opportunities that would improve the usability of the tool.
I knew one of the problems we were trying to solve was making it easier for CDS staff to find published content (such as blogs, guides, reports, or job opportunities) on our website so they could share it with people outside of CDS. This felt like an audience we could get quick, helpful feedback from. It would also let us improve the search bar in a tangible way for one of its target user groups.
We put out the call to our internal user group, and that same week we were interviewing people about their experiences using the search bar.
How we got quick feedback
On Monday, our team put together a pared-down usability testing facilitation plan, scoped to focus on our target user group - internal users.
The next day, we reached out to a few people from CDS who regularly use the website to join us for 15 minute testing sessions. During these sessions, we observed them using the search bar and asked 3 probing questions from the facilitation guide:
- Q1: Tell us about the problem you were trying to solve the last time you used the search feature.
- Q2: Please go to the homepage and share your screen with us. Can you walk us through the steps you took to try to solve that problem using the search feature?
- Q3: Did this solve your problem?
Our interaction designer, our developer, and I took turns facilitating, note-taking, and observing each of the research sessions (we find it’s a helpful practice to have everyone hear the direct feedback from users whenever possible).
By the end of the week, we had a few pages of observations and feedback to work with. There was plenty of qualitative data that I could analyze and use to recommend a few ways to improve the usability of the search bar.
Users don’t (and shouldn’t) know our website inside and out
We learned that our search bar worked perfectly! …if you already knew the exact title of the thing you were looking for - which is rarely the case when people use a search function.
For example, “Evan” (not our participant’s real name) got a request from a department outside of CDS to share more about how and why CDS uses Mailchimp software for our newsletters. Naturally, Evan turned to the search bar. But when he typed in “mailchimp”, 0 results showed up.
“Guess we don’t have any helpful information to share about Mailchimp”, Evan might have thought.
The problem is, we did have information about Mailchimp - quite a bit, actually. But if Evan wasn’t able to find it, did that even matter?
Had Evan known that he actually needed to search “communications” or “data” to match the title of this blog post where we talked about our newsletter, he’d have been able to find the information he needed. But why would he search for “communications” or “data”? He was looking for information on Mailchimp.
Quick improvements
After observing a few similar instances where users searched with the keywords that came naturally to them - instead of the specific title we gave the content - we realized our search parameters were too narrow to capture the kinds of words users may actually be searching for.
At the time, our search feature was only pulling keywords from content titles and descriptions, which proved very limiting. After some discussion with the team to explore the feature’s technical capabilities, we decided to open the keyword search up to the body of the content as well.
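To illustrate the change, here’s a minimal sketch in TypeScript. The Page type, field names, sample content, and simple substring matching are all hypothetical - this isn’t our actual implementation, just the before-and-after idea:

```ts
// Hypothetical content record; the field names are assumptions for this sketch.
type Page = {
  title: string;
  description: string;
  body: string;
};

// Before: only titles and descriptions were searched.
function searchTitlesAndDescriptions(pages: Page[], query: string): Page[] {
  const q = query.toLowerCase();
  return pages.filter(
    (p) =>
      p.title.toLowerCase().includes(q) ||
      p.description.toLowerCase().includes(q)
  );
}

// After: the body of the content is searched too, so a query only needs to
// appear somewhere in the content - not in the title we happened to write.
function searchFullContent(pages: Page[], query: string): Page[] {
  const q = query.toLowerCase();
  return pages.filter(
    (p) =>
      p.title.toLowerCase().includes(q) ||
      p.description.toLowerCase().includes(q) ||
      p.body.toLowerCase().includes(q)
  );
}

// A made-up example page: “Mailchimp” appears only in the body.
const pages: Page[] = [
  {
    title: "How we use data to improve our communications",
    description: "Lessons from running our newsletter",
    body: "…we use Mailchimp to send our newsletter to subscribers…",
  },
];

console.log(searchTitlesAndDescriptions(pages, "mailchimp").length); // 0
console.log(searchFullContent(pages, "mailchimp").length); // 1
```

The whole change comes down to that one extra condition on the body text.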
Now, if Evan searches “Mailchimp”, he’ll find the information he was looking for, without needing to know how we title our resources.
Thanks to quick feedback from our colleagues, we were able to implement an equally quick fix to improve the usability of the search bar. But we know this is just the beginning when it comes to continuous improvement.
Now we need your help
This first touchpoint with an internal user group was valuable. Our next step is to solicit feedback from people outside of CDS who want to use the search bar to find the things they need. And that’s where we need your help.
If you’re someone who has used our website, we’d love to hear from you. You can sign up to participate in future website testing here.