Typical response rates

Like many researchers, you're probably wondering:

"What's a typical response rate for a customer survey?"
"What rate can I get if I use the Web?"
"Only 50 people answered my survey—is that normal?"

Looking at a table of "typical" response rates is somewhat like reading a BMI table—it provides a frame of reference, but doesn't by itself change what you should do. This is particularly true of surveys, where typical can cover substantial ground:

  • Employees: 60-90%
  • Customers and members: 5-40%
  • General public: 1-20%

Surveys are a balancing act of data quality, respondent ease, time, and money. The response rate is no different—to make things wonderful for the respondent, you may have to reduce the number of questions you ask or blow your budget. The Web is not inherently better than paper or telephone, or vice versa; it's a matter of finding the best match for your respondents and executing that method well.

So if typical doesn't matter, what does?

Just this: whether the data is serving your decision-making needs. Depending on your organization, you may require a statistically reliable sample, or simply enough people that you feel comfortable moving ahead. Remember, the goal is not to get a lot of people to answer but to get a representative group of respondents. You want the middle ground along with the people who are ticked off or those willing to stand on their head for a sweepstakes.
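If you do go the statistically reliable route, the target count usually comes from a standard sample-size formula rather than from any "typical" response rate. Here's a minimal sketch of that arithmetic in TypeScript; the function name and the example numbers (a 2,000-member list, a 20% response rate) are purely illustrative, not figures from this article:

```typescript
// Hypothetical helper: completed surveys needed to estimate a proportion
// within a chosen margin of error, with a finite population correction.
// n0 = z^2 * p * (1 - p) / e^2, then n = n0 / (1 + (n0 - 1) / N)
function requiredCompletes(
  populationSize: number,    // e.g. members on your mailing list
  marginOfError = 0.05,      // +/- 5 percentage points
  confidenceZ = 1.96,        // z-score for 95% confidence
  expectedProportion = 0.5   // 0.5 is the most conservative choice
): number {
  const n0 =
    (confidenceZ ** 2 * expectedProportion * (1 - expectedProportion)) /
    marginOfError ** 2;
  const n = n0 / (1 + (n0 - 1) / populationSize); // finite population correction
  return Math.ceil(n);
}

// A membership list of 2,000 needs about 323 completed surveys for
// +/- 5 points at 95% confidence; at a 20% response rate, that means
// inviting most of the list.
console.log(requiredCompletes(2000)); // 323
```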

When respondents decide whether to take (or finish) your survey, they're asking themselves two questions:

"What's in it for me?"
"How much work is it?"

There's a pretty clear indicator when the effort is greater than their interest—you'll see respondents abandon the survey.

Increasing their self-interest

There are several factors that will make people more willing to spend time on your survey. When respondents are really involved, it's even possible to get a good response rate on a 40-minute survey.

  • Ongoing relationships, such as employees, repeat customers, and members, especially if they're used to hearing from you through newsletters or other channels
  • Heightened emotion—sometimes delight, often irritation
  • Trust that you'll actually do something with the answers
  • Curiosity about the survey results/what other people think
  • Direct rewards or drawings
  • Survey makes sense—they understand why you want their opinion
  • Anonymity when possible, and trustworthy handling of their contact information when it's not

Minimizing effort

An astonishing number of surveys ask respondents to jump through hoops. While some hoops (like length) are research trade-offs, many surveys can simply benefit from a dose of usability.

  • Right medium, such as Web for office staff but paper or kiosks for factory employees
  • Inviting them at a time that's convenient, and/or with advance notice
  • As short as possible, with accurate time estimates and progress indicators
  • Easy password log-in (if any) to start the survey
  • Option to pause the survey and return later (though sometimes forcing a single session gets more completions)
  • Conventional questions and scales, with minimal filler words
  • Few or no required responses or formats
  • Good error messages for when they make a mistake
  • Technology (such as JavaScript) that enhances a survey but doesn't become a barricade if respondents don't have it enabled (see the sketch after this list)
  • Clean layouts with a readable font size

Comments

Hi, I was wondering where you got the percentages that you have on the "typical response rate" webpage. I am writing a report about an online survey of an organization's members and want to use your 5% to 40% figure, since I can't find anything that speaks directly to that audience anywhere else, but it did spark my curiosity. You may reply to Rachel Holbert at [email hidden by admin]. Thanks very much.

Hi Rachel,

I wish I had a stack of studies to hand you, but the ranges I quoted are simply my own experiences across a broad range of client projects over the years. When I wrote this article, I wasn't trying to pull together a meta-analysis of response rates, but instead hoping to get readers past a generic "good/bad" percentage and into the factors they can control to move their response rate up or down.

Since you're looking for membership in particular, you may want to drop ASAE a line to see if they have any numbers:
http://www.asaecenter.org/

Despite being a quantitative industry, survey research is often amazingly short of hard data justifying our own best practices. Some information does come out of the academic side, but those studies tend to be smaller due to cost constraints. Most of the research and the latest tech tend to be in the private sector, where the results and methodology are confidential or proprietary. It's also where clients are unlikely to sponsor a "split test" comparing the recommended configuration against an alternative, to see whether the version their consultant thinks isn't quite as good really does perform worse ;-)

Here's one of the exceptions if you're ever curious: http://www.practicalsurveys.com/books/questionsandanswers.php

