Online surveys have emerged as an efficient way for nonprofits to gather data from their constituents, donors – and even their peers – entirely through the Web. At their core, these systems include tools for defining and customizing survey questions, and tools for tracking and downloading the responses. Most modern systems, such as SurveyMonkey or Zoomerang, have additional useful features, including templates, tools for publicizing your surveys, or the ability to filter and search through the results. For recommendations on which online survey system to choose, articles at TechSoup and Wikipedia have good lists of software and features.
This article, however, assumes you have already chosen a system. What remains is much more important – and much more often overlooked. The success of an online survey depends less on the tool than on how you define what you’re looking for, how you relate to your target audience, and how well you execute the details. This article describes these factors and gives you tips to follow to ensure you get the richest, most reliable data possible – and, while we’re at it, the most satisfied respondents, too.
I. Engage Your Audience
Be realistic about how motivated and available your intended survey-takers are. If you’re not sure, take the time to run a test with 10% of your list before investing more resources.
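If your contact list lives in a spreadsheet or database export, drawing that 10% pilot group can be automated. The sketch below assumes a simple Python list of email addresses (the `contacts` list and addresses here are hypothetical placeholders, not from any particular survey tool):

```python
import random

# Hypothetical contact list; in practice, load addresses from your
# mailing-list export.
contacts = [f"person{i}@example.org" for i in range(200)]

def draw_test_sample(contact_list, fraction=0.10, seed=42):
    """Return a random subset of contacts for a pilot run of the survey."""
    k = max(1, round(len(contact_list) * fraction))
    rng = random.Random(seed)  # fixed seed keeps the pilot group reproducible
    return rng.sample(contact_list, k)

pilot = draw_test_sample(contacts)
print(len(pilot))  # 20
```

The fixed random seed means you can regenerate the same pilot group later, which is handy when you exclude those people from the full send.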
With an audience that’s already motivated, an incentive for completion may be optional. Otherwise, be prepared to offer a knick-knack, a white-paper download, or a raffle entry for a larger prize. While responses from only the most motivated constituents can be informative, you want to hear from as wide a spectrum of your audience as possible.
If your survey software allows, break up your survey into multiple pages if it’s longer than two to three screenfuls. Be sure to indicate “page 1 of N” at the top, so that your respondents won’t get intimidated and give up, and can set aside an appropriate chunk of time. This has the added benefit of saving the responses on any completed pages, even if respondents do give up partway through. Also be conscious of length: if you go beyond three pages of this size, you’ll get fewer responses, many of which will be incomplete. Be prepared to cut survey questions mercilessly.
In any case, it is essential to arrange questions in a way that maximizes engagement – and that maximizes the useful data you retain if a respondent abandons the survey partway through. To do this:
1. Be sure to have introductory text at the beginning of the survey to provide context and assurances of privacy.
2. Ask a couple of open-ended questions early on, to make sure the respondent feels heard. (But be careful not to overdo this; an entire page of text boxes will scare off most respondents.)
3. Group questions together by subject. Make sure that if there are multiple questions for which some respondents would select “N/A,” these questions are next to each other and can be skipped together.
4. Ask for demographic information (name, address, income,…) and any more personal questions that might make the respondent uncomfortable at the end of the survey. Hopefully they’ll be more at ease with the survey by that point; if not, you’ll still have their earlier answers.
5. Make sure that the “done” message or “landing page” of the survey thanks respondents, reminds them of the importance and privacy of their responses, and if applicable, asks them to refer friends to the survey as well.
II. Elicit Meaningful, Accurate Data
As you plan your survey, be as clear and specific as possible about what it is that you’re looking for. Write it down. Run it by coworkers. As you create and test the survey, refer back to this purpose to make sure the questions you ask will give you data that meets it. Otherwise, you risk not only vague questions and meandering surveys, but useless data.
To be sure that the questions you ask will give you the data you want, you need to ask them in the right format. There are two general goals here: first, for your respondents to answer on topic, and second, for the answers to be stored in a way that allows as much summarization and analysis as possible.
Closed Question Types
In terms of online surveys, a closed question type is one where, instead of typing in a response, the respondent clicks an answer that’s already visible on the screen. Because these questions are quicker for respondents to answer, and easier for the preparer to graph, total, and manipulate, they get used more often than open-ended questions. Note: for multiple-choice questions, be sure to include options like “don’t know,” “N/A,” and/or “other” as appropriate.
Multiple choice/choose one is the most restrictive question type, but that makes it the easiest to analyze. The subtypes below illustrate the range of useful information you can elicit in this format. Sometimes you’ll have a choice between presenting these as a drop-down menu versus “radio buttons” that display at all times – go with the latter for ease of use if at all possible.
– Subtype: “Please select the option that best describes…” (but consider “choose multiple” if many of your respondents won’t fit into only one box)
– Subtype: A rating scale which allows the respondent to indicate how much they agree with an adjective or statement; best with 5 options, without labels other than the extremes (e.g., 1=strongly disagree, 5=strongly agree)
– Subtype: Never/Rarely/Sometimes/Often/Always (but it’s better to use more meaningful labels, such as “Every Day” instead of “Often,” which the respondent may interpret differently than you intend)
– Subtype: Numerical ranges, such as 0-1 / 2-4 / 5-9 / 10+ (but be sure that the ranges don’t overlap – people who want to answer “5” will be confused if their options include both “2-5” and “5-10”). You also get more accurate results when ranges start on a round number (e.g., don’t use 101-200 as an option; you’ll get fewer errors with 100-199 instead).
– Subtype: Yes/No questions allow you to require the respondent to click either Yes or No, so you can be sure of their response. Use this instead of a built-in “Yes/No” option that uses a checkbox – otherwise, you’ll be left scratching your head as to whether respondents chose to leave the answer unchecked, or if they didn’t see the question.
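The non-overlapping-range rule also matters when you later bucket raw numbers from your downloaded results. A minimal sketch, using the example ranges from above (the edges here are illustrative, not from any particular survey system):

```python
def bucket(value, edges=(0, 2, 5, 10)):
    """Assign a numeric answer to a non-overlapping range label.

    edges (0, 2, 5, 10) yields the labels "0-1", "2-4", "5-9", "10+".
    Each range is inclusive at the bottom and exclusive at the top,
    so an answer like 5 matches exactly one bucket, never two.
    """
    for lo, hi in zip(edges, edges[1:]):
        if lo <= value < hi:
            return f"{lo}-{hi - 1}"
    return f"{edges[-1]}+"

print(bucket(5))  # "5-9"
```

Writing ranges half-open in code mirrors the advice above: a respondent (or a script) never has to guess which of two overlapping buckets a boundary value belongs to.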
Multiple choice/choose many is a good alternative when you’re looking for a full spread of responses, or when it becomes obvious that a “choose one” question won’t elicit accurate answers. The downside is that you can’t compare the answers as easily – they won’t sum to 100%, and generally it will be more difficult to make comparisons between answer choices (because “both” will have been possible). These generally appear as checkboxes on the survey.
– Best for: Any open-ended question with “Check all that apply” appended, and a clear set of possible answers
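When you tally “choose many” responses, compute each option’s share of respondents rather than a share of total checks – the percentages can legitimately exceed 100% in total. A small sketch, with made-up option names standing in for a real survey export:

```python
from collections import Counter

# Hypothetical export: each respondent's set of checked options.
responses = [
    {"newsletter", "workshops"},
    {"workshops"},
    {"newsletter", "workshops", "consulting"},
]

counts = Counter(option for r in responses for option in r)
n = len(responses)
for option, c in counts.most_common():
    print(f"{option}: {c / n:.0%} of respondents")
# Shares are per-respondent, so they can total well over 100%.
```

Reporting “100% of respondents checked workshops” is meaningful; reporting checkbox counts as if they summed to 100% is not.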
Deciding Between These
Let’s say you’re asking for a respondent’s job role. While it’s possible that a respondent’s role spans two departments, most of the time that will be distracting noise in data analysis. It is almost always better to let the respondent determine which one is most relevant. Thus, phrase the question as “Please select the box that most appropriately describes your position,” and use “choose one.” Only when a full inventory is necessary should you use “choose many” – for example, “Which of the following services would you be interested in?”
Open Question Types
Open-ended questions are questions that permit an unconstrained response. They generally allow richer detail in areas that you as the survey creator couldn’t have anticipated, and can convey to the respondent that their true opinions are being recorded and heard. They’re more difficult to process, however – you can’t total the results or graph them; the most you can do is painstakingly assign responses to subjective categories, or else cherry-pick a few to quote in a written report.
Essay questions (sometimes called “multi-line text”) are the most open – they allow the respondent to type a full paragraph or more.
– Best for: connecting with the respondent early in a survey, eliciting detailed information, and/or when expecting a small number of respondents
Text boxes (or “single-line text”) are partially open – they allow only a few words, and so you’ll only find a small amount of the unexpected in them.
– Best for: providing an “other” option to multiple choice, for cases that demand a brief response (perhaps “How did you hear about us?”), or for standard demographic data like Name, Address, etc.
Other Question Types
These more specialized question types are not available with all online survey packages.
Numeric inputs (or “single-line: numeric”) are barely open – they allow only numbers, but they don’t restrict the choice of number. If you’re trying to be very precise, it can be better to ask rating questions on a 0-100 scale rather than multiple choice. Warning: some survey systems will ignore or reject anything other than the digits 0-9. For example, make sure you can successfully input and download “$10,222.64” before publishing a question that asks for a large dollar value.
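If your survey system stores the raw text rather than rejecting it, you can clean currency symbols and separators yourself after downloading. A hedged sketch (the `to_number` helper is this article’s own, not part of any survey package):

```python
import re

def to_number(raw):
    """Strip currency symbols, commas, and other non-numeric characters
    before parsing a respondent's answer.

    Returns None when nothing numeric survives, so bad rows can be
    flagged for hand review instead of silently becoming zero.
    """
    cleaned = re.sub(r"[^0-9.\-]", "", raw or "")
    try:
        return float(cleaned)
    except ValueError:
        return None

print(to_number("$10,222.64"))  # 10222.64
```

Returning `None` for unparseable answers (like “N/A”) keeps them visibly distinct from genuine zeros in your totals.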
Rating grids can be thought of as an extension of the “multiple choice/choose many” question type. Instead of checking yes or no for each option, you could ask respondents to rank their top three preferences out of a list of ten. Be careful: this gets unwieldy quickly with too many options, and the resulting data can be difficult to analyze.
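One workable way to summarize top-three rankings is a simple points scheme (first place counts most). This is a sketch on invented option names, and the 3/2/1 weighting is just one reasonable choice, not a standard built into survey tools:

```python
from collections import defaultdict

# Hypothetical export: each row is one respondent's top-three picks, in order.
rankings = [
    ["email help", "website", "training"],
    ["website", "email help", "fundraising"],
    ["website", "training", "email help"],
]

# Points scheme: 1st place = 3 points, 2nd = 2, 3rd = 1.
scores = defaultdict(int)
for row in rankings:
    for place, option in enumerate(row):
        scores[option] += 3 - place

for option, pts in sorted(scores.items(), key=lambda kv: -kv[1]):
    print(option, pts)
```

Collapsing rankings to a single score per option loses some nuance, but it turns an unwieldy grid into a sortable list you can actually report on.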
III. Further Tips and Process
Do not design your survey within the web interface of your survey software. Revisions and reordering will be frustratingly slow and awkward. It is much better to write out questions with pencil and paper, or at most a word processing package. Only when the survey is nearly ready to go, and you have at least verified which question types are best to use, should you transfer the questions into the online survey framework.
– Whenever possible, arrange a test run of your survey with three to six people who are similar to your respondent pool. Arrange to talk with them afterwards, and make sure they do most of the talking.
– Be sure to start early. The surest way to end up with embarrassing typos, ineffective questions or broken invitation links is to be rushed. Starting early allows you to review and test your survey. Just because these services make it technically possible to put out a rushed survey doesn’t mean that you should.
– Almost every online survey involves sending invitations by email. Test this email – its subject line, text, and links – as you would any email announcement. You especially want to make sure that a link to the survey itself shows in the first screenful of email text.
– Be careful about sending reminder emails. If your survey software has an option to send reminder emails only to people who haven’t responded yet, use it. In any case, don’t send more than two reminders, or you’ll alienate your constituents.
– On a final page, next to any demographic questions, give respondents the opportunity to get more involved in your organization – for example, with a checkbox allowing them to subscribe to an e-Newsletter.