Case Study

The best way to get user feedback? Don’t ask!

Modern audiences demand rich content and are harder than ever to please. Measuring satisfaction and impact online is equally daunting, as few visitors have the patience for feedback surveys. ParsonsTKO has developed a solution to both problems, using a unique blend of interactive tools and analytics.

Earlier this year, we worked with the Committee for Economic Development of The Conference Board to develop a unique educational tool called the Paystub Education Project (PEP). Built with support from the Peterson Foundation, PEP’s goal is to educate the public about their connection to federal finances through the lens of their own personal paychecks. In this way, every American who receives a paystub from their employer can begin to connect the dots between the line items they see deducted from their salary and the federal taxes and spending they read about in the news, recognizing that they are all connected.

A major goal for this project was to collect data about how audiences react to the tool and to understand whether the content engages its audience and makes an impact.

The problem

The challenge with an educational website like PEP is that you often get only one chance to interact with each visitor: a single visit must simultaneously entertain, educate, and elicit feedback. If your content doesn’t appeal to your audience, many will abandon it before they’ve absorbed your message. That entertainment must be balanced, however, with meaningful content that truly educates your visitors. And to continually improve and validate the success of your work, you must also find a way to listen to your visitors without driving them away.

Three traditional feedback methods that fall short

Three traditional methods of capturing feedback are pop-ups, tabs, and forms embedded at the bottom of content.

There are many direct approaches to soliciting feedback from visitors, each with its own drawbacks:

  • Passive links or tabs inviting users to give feedback are often missed or ignored.
  • End-of-site surveys may never be reached, and raise the question of where the “end” of your site actually is.
  • Pop-up questionnaires often disrupt the user, causing them to abandon the experience.

With no “good” off-the-shelf options for collecting user feedback, we worked with PEP’s editorial team to develop a unique solution to address this need.

The solution

Having identified and clearly defined the need to measure and qualify engagement early in the project, we looked for opportunities to gather feedback without needing to ask for it. The questions we wanted to ask (e.g., How compelling was our content? What did the user learn? What are the user’s opinions about the subject matter?) are not only interesting to us as the site’s creators, but also extremely relevant to the site’s users, and part of what we want them to think about while consuming our content. As a result, we saw an opportunity to ask these questions inline with the content, in the form of interactive features that help tell the story and challenge the user to think critically at the moment those thoughts are most relevant.

Key user engagement features include:

  • Interactive graphs that visualize data relevant to the content
  • Mini-quizlettes that challenge the user’s assumptions and reveal additional information about the current topic
  • Single question polls that collect the user’s opinion about what they are reading

From the user’s perspective, each of these features advances the narrative, offers a chance to test their knowledge, and gives them a low-stakes way to react to what they are reading. Along the way, however, we gain a big-picture view of which features users choose to interact with, and how they respond to different questions.
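To illustrate the underlying mechanism (this is a minimal sketch, not PEP’s actual code; the names `recordPollResponse` and `analyticsQueue` are hypothetical), an inline poll can capture each answer as a structured analytics event the moment the visitor responds, with no separate survey step:

```javascript
// Hypothetical sketch: capture a visitor's answer to an inline poll as a
// structured analytics event, without interrupting the reading experience.
// In production, queued events would be flushed to an analytics backend.

const analyticsQueue = [];

function recordPollResponse(pollId, answer) {
  // Each interaction becomes a structured event that can later be
  // aggregated to see which features visitors engage with and how
  // they respond to different questions.
  const event = {
    type: "poll_response",
    pollId,
    answer,
    timestamp: Date.now(),
  };
  analyticsQueue.push(event);
  return event;
}

// Example: a single-question opinion poll embedded in the narrative.
recordPollResponse("federal-spending-priorities", "education");
```

Because the event fires as part of the content interaction itself, the response rate tracks engagement with the story rather than willingness to fill out a form.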

By placing interactive features in the content to help guide the educational process of the website, and then collecting data on how those features are used, we are able to learn what people think about our content and what our content makes them think about, without ever interrupting their process. This approach removes the distraction of traditional methods and collects answers passively by making the question part of the storytelling. Where most feedback tools ask users about their experience, we made user feedback a part of that experience.

The results

All good theories must face the test of practice, and we were pleased to find during the initial beta test of the new website that user engagement with these features exceeded our expectations and vastly outperformed traditional user feedback methods. Engagement with our inline, interactive questions was ten times the expected response rate for a more traditional feedback form placed at the end of content. Moreover, we found this approach was more fun, as it encouraged visitors to become active participants in the content and drew a large fraction of the audience deep into the equivalent of 60 printed pages of content!

This higher volume of feedback helps site developers and content creators make better decisions about how to design new content and improve existing content. Being able to gather audience responses to specific questions and illustrate the depth of engagement also helps demonstrate the website’s contribution to its educational mission. This is the power of analytics in action!

Want to have this type of engaging feedback collection tightly integrated with your own website?

Ask me how!