Hi, my name is Sarah, and I love taking surveys. It’s true. Ever since I was a kid. By filling out surveys and comment cards, I’ve won theater tickets, hotel rooms, and free meals. I believe in the value of feedback! So, it irks me when I see poorly designed surveys because they waste time, energy, and important opportunities to learn and improve!
There are many instances in which nonprofits seek to gather valuable information and feedback from donors, volunteers, staff, and clients, and a survey is often the most efficient way to do so.
When it comes to outcomes in particular, many nonprofits struggle to find standardized measurement tools that are accessible, affordable, and applicable to their population, context, or program design. Instead, they opt to design their own measurement tools so that they can answer their own evaluation questions and report to stakeholders. While these home-grown tools often lack the rigor of standardized assessments, they can be right-sized, feasible options.
Prioritize Planning
The number one mistake I see organizations make in their efforts to gather information is to skip or shortchange the thinking that should guide their data collection. Before you can select a tool or design a survey, you must be clear on what you want to know. In my last two posts (here and here), I shared tips and tools for articulating clear, focused, measurable indicators for your most meaningful outcomes. Those tips apply to any data-gathering effort, not just those related to program outcomes. Before you ask people to give you their time and feedback, be clear on what you want to know, why you want to know it, and how you plan to use it.
It’s All in the Question
Once you’re clear on what you want to measure and why, then, and only then, can you start designing the survey.
When writing questions or items, remember these basic rules:
- One thing at a time. Avoid double-barreled questions that include “and” or “or.” Respondents may have a different answer for each separate idea you’ve bundled into a single question.
- Carefully craft choices. When writing multiple-choice items, be sure that the options you provide are exhaustive and mutually exclusive. Everyone needs to find an appropriate response in your list (or have the option of selecting “other”), and only one choice should fit (unless you allow respondents to select more than one).
- Watch your words. Be sure to use neutral, unbiased language. If you use words that imply value (good or bad), people will be less likely to answer honestly – instead, they’ll give you the socially desirable or acceptable response. Also, use words that are familiar and common.
- Be specific. If there is more than one way to interpret your question, your data will be less valid and less useful.
- Ask answerable questions. Don’t expect people to accurately recall something that happened a year ago, estimate how many times they did a common task in a month, or share deep dark secrets for the first time in a survey.
For some more advanced tips, check out this resource from the Harvard University Program on Survey Research. And for an in-depth resource, check out this Survey Guide from the Office of Quality Improvement at the University of Wisconsin-Madison.
Take it for a Test Drive
Lastly, before you roll out your survey, take it for a test drive.
First, have someone who wasn’t involved in designing it take it and give you feedback. Ask if the questions were clear, if the answer choices were appropriate, if it took too long, and if there’s anything else you should have asked.
Second, imagine the results you might get. Would they be specific enough to be actionable? Would you be confident enough in them to stand behind them? What follow-up questions would you ask when interpreting the data? Use your answers to these questions to go back and fine-tune your survey. Focus or clarify your questions. Thinking a few steps ahead will help you get the most useful data.
One More Thing
I’d be remiss if I didn’t mention that perfectly good surveys can gather useless data if they are implemented poorly. Pay attention to how, where, and when you’re administering surveys. The method you choose, who is in the room, how the page is laid out, the time of day, and how you collect completed responses (among many other things) can all impact the completeness, accuracy, honesty, and volume of responses you get.