Case Study: How Content Testing Reduced Customer Support Requests and Ineligible Scholarship Applications

What does research look like in a content design context? Where does it fit into the content design process? Who does the research, and who’s responsible for sharing the findings? Which format is best for sharing content design research findings? When do you know you’ve done “enough research” and can get to work?

I didn’t set out to write a first paragraph with the 5 or 6 W’s and none of the answers, but here we are and we’re going to 📦 unpack all that 📦 in a case study.

Content testing case study

My role

  • Led content discovery and planning for a core workflow redesign, ensuring the right activities were prioritized to uncover users’ needs
  • Collaborated with product managers and engineers (Usually, I collaborate closely with visual designers as well, but we were working with a robust design system with clearly defined templates, modules, and brand guidelines.)

Defining terms

Fellowships are funded academic (or professional) opportunities that usually last 2-5 years. For example, UNESCO offers more than 450 fellowships to “continue your studies, pursue a research topic or set up an innovative project.”

Fellowship recipients are called fellows. They’re selected based on their potential and prior achievements. Contrary to popular belief, you wouldn’t greet them by saying “hi fellows.”

According to UserTesting, “content testing determines whether your target audience can find, understand, and comprehend your content. Done well, content testing exactly pinpoints to which words, phrases, and content people respond. It starts early in the UX process and reoccurs whenever new content is implemented.”

Context

I worked with a company we’ll call FG which stands for Fellowships Galore.

FG is a tech company and its research organization offers PhD fellowships. FG has funded more than 200 students. Fellows receive 2 years of paid tuition, an annual salary, and the opportunity to work on cool shit (state-of-the-art technology).

My approach

I redesigned the fellowship application process to fit users’ mental models and support client goals, drawing on research findings, analytics insights, and content best practices.

After talking to stakeholders, I identified these main goals:

  • Reduce the volume of avoidable support emails – FG received thousands of questions about the fellowship application process via email, most of which were already answered on the website dedicated to the fellowship program. FG staff had to dedicate hours every day to responding to these inquiries.
  • Reduce the volume of ineligible applications

Hypotheses

FG received thousands of already-answered questions via email, which told me the fellowship program pages weren’t meeting user needs. My initial hypotheses about why:

  • important answers lived in the Frequently Asked Questions (FAQ) section, but people weren’t looking for them there
  • information on deadlines and application documents used vague, ambiguous language

Analytics

I reviewed web analytics data, looking at:

  • traffic acquisition – how were people finding the site and the fellowship program specifically?
  • page visits – how many people were finding the site? Which were the most common initial entry points? Which pages were visited the most?
  • search terms – what were people typing on search engines to get to this site? What were people searching for on the site itself?
  • time spent on page
  • device breakdown

I like being able to find and understand this type of data on my own (even if only at a high level), but not everyone is comfortable pulling it themselves, nor should they have to be. Many companies have dedicated SEO and analytics team members. If that’s the case where you work, they likely have a well-crafted analytics strategy and can help you interpret website performance data in context, without it feeling like you’re reading charts in another language. Analysts make powerful UX partners and I love working with them.
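
If you do end up pulling the data yourself, here’s a minimal sketch (in Python with pandas) of the kind of slicing I mean. The file name and column names are assumptions for illustration, not the export schema of any particular analytics tool.

```python
import pandas as pd

# Hypothetical analytics export: one row per page per traffic source per device.
# The file and columns ("page", "source", "visits", "seconds_on_page", "device")
# are invented for illustration.
df = pd.read_csv("fellowship_site_export.csv")

# Page visits: which pages are visited the most?
top_pages = df.groupby("page")["visits"].sum().sort_values(ascending=False)

# Traffic acquisition: how are people finding the site?
by_source = df.groupby("source")["visits"].sum().sort_values(ascending=False)

# Time spent on page, averaged per page
time_on_page = df.groupby("page")["seconds_on_page"].mean().round(1)

# Device breakdown
by_device = df.groupby("device")["visits"].sum()

print(top_pages.head(10), by_source, time_on_page.head(10), by_device, sep="\n\n")
```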

Crafting the content testing plan

Based on analytics data, conversations with stakeholders, and content best practices (like “don’t hide important information in FAQs”), I put together 2 different versions of the fellowship program page. I used Userfeel, a user-testing tool that offers unlimited screener questions (for example, you can exclude people who don’t have a Master’s Degree), moderated/unmoderated tests, and assistance with participant recruitment.

4 participants completed tasks on the first version, and 4 on the second. Tasks included finding:

  1. the deadline for applications
  2. who was eligible to apply
  3. what people need to include in their application
  4. if someone studying Applied Statistics could apply

Research participants had to rate the process of finding this information from 1 (Simple) to 5 (Complex) and explain why they chose that rating. I could see their screens and hear them explain their actions or expectations (“Oh, I was expecting B to be in D, but it’s actually in Y”).

If you’re wondering, I included the fourth task to get a better understanding of how people in unrelated fields navigated the site. Was it clear, say, to biology students that they weren’t eligible?
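
With 4 participants per version, ratings like these are a conversation starter rather than a statistic, but tabulating them still helps. Here’s a tiny sketch of how you might do that; every score below is invented for illustration, not an actual result from this study.

```python
from statistics import mean

# Ratings from 1 (Simple) to 5 (Complex), one score per participant.
# All numbers are made up for illustration.
ratings = {
    ("version 1", "find the application deadline"): [2, 3, 2, 4],
    ("version 2", "find the application deadline"): [1, 1, 2, 1],
    ("version 1", "find who is eligible to apply"): [4, 5, 3, 4],
    ("version 2", "find who is eligible to apply"): [2, 1, 2, 2],
}

for (version, task), scores in ratings.items():
    print(f"{version} | {task}: mean {mean(scores):.2f} (n={len(scores)})")
```

Pair the numbers with what participants say out loud; with a sample this small, the explanations carry more weight than the means.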

Content testing findings and outcomes

(Before and after screenshots of the fellowship program page)

  • We had a term in the in-page navigation called Fellowship Details. This term seemed like an afterthought to participants. They didn’t realize this was the bread and butter of application requirements. Details didn’t have enough information scent. This item was renamed to Eligibility Criteria.
  • We had a term in the in-page navigation called Application Dates. Participants completed the “Find the deadline for applications” task accurately and quickly. This tells us this phrase is effective.
  • We had a term in the in-page navigation called Available Fellowships. Participants completed the “Find if someone studying Applied Statistics could apply” task accurately and quickly. This tells us this phrase is effective. However, we also rewrote About, the program intro, and the Available Fellowships section to specify that these fellowships were aimed at tech-adjacent students. We wanted to ensure that would be obvious to anyone who would spend a few seconds on the site.
  • After adding How to Apply in the in-page navigation and rewriting copy to be clear and actionable (here’s what your application must include, here are the steps involved), participants were able to complete the “What do people need to include in their application?” task accurately and quickly.
  • Funding information generated a lot of interest, so we added Funding details to the in-page navigation.
  • Crucial fellowship application answers (how to apply, who’s eligible) lived under FAQs. This was the last item on the page and in the in-page navigation. Most participants didn’t look there. We removed the FAQs section, incorporating frequent questions like “Is this open to internationals?” and “Do you provide funding for undergrads?” under relevant sections like Eligibility Criteria and Funding details.
  • The motion design team created a how-to video outlining the steps across different platforms (the website, email, SurveyMonkey).

Content testing results

I’m super glad to report that we reached our stated goals. Sometime after we launched the new and improved version, the fellowship program manager at FG told us they were really happy with our work. Imagine all the time (and, let’s be honest, mental energy) the team got back once those thousands of emails stopped arriving – time they could spend on applications from eligible candidates!

I enjoyed working on this project and helping get more $$$ and opportunities in the hands of hard-working students.

Before I say au revoir, I can’t possibly overstate just how much Erika Hall’s book Just Enough Research has shaped my approach to research.

“There are many, many ways of classifying research, depending on who is doing the classification. Researchers are always thinking up more classifications. Academic classifications may be interesting in the abstract, but we care about utility, what helps get the job done.”

I’ve always thought research “sounded” cool, but I’m not a trained researcher and I’ve never held a title that had “research” in it. Erika’s book gave me the permission I didn’t know I needed to not care about any of that and just do the damn thing. Nike who?!

Her description of research also really resonated with me: “Research is simply systematic inquiry. You want to know more about a particular topic, so you go through a process to increase your knowledge.”

Framed this way, research doesn’t sound intimidating at all and I feel rather excited to dive in. What do I want to know about the topic I’m researching? How can I acquire this knowledge?

If you care about research and pragmatism and practical advice, I highly recommend getting her book and subscribing to her fortnightly newsletter.
