This post was originally written for the Early Career Climate Forum and posted in May 2016.
I recently finished an online survey of agricultural advisors in Texas, Oklahoma, Kansas, and Colorado about seasonal forecasting for winter wheat farmers. Online surveys are everywhere these days, and with free tools like SurveyMonkey or Google Forms, anyone can conduct a survey. Preparing and conducting a survey for research, however, is no small endeavor and requires careful consideration. My survey, for example, took 3 months to plan and another 3 to conduct. Here are 6 tips on how to get the most out of your efforts.
1. Survey or something else?
Ask yourself: What information am I interested in, and is a survey the best method to get it? Surveys work well for quantitative information that can be ranked, listed, counted, or compared on like–dislike scales. But surveys are not good for qualitative research, like descriptions of events, detailed anecdotes, or open-ended conversations. For this kind of information, personal interviews are the better tool. Interview research – conducting, recording, transcribing, and analyzing conversations – is also much more labor-intensive, one reason why interview studies generally have fewer participants than survey-based studies. The pros and cons of qualitative approaches are nicely laid out in Berg (1998; full references at the end of this post), a seminal book on qualitative methods. Babbie (2014) provides the basics of survey research (chapter 9) and quantitative analysis (chapter 14).
Also, discuss your approach with faculty, experienced coworkers, and your committee. I decided that a survey followed by a small number of interviews would give me the best of both worlds – a large, quantitative dataset to analyze and detailed information to explain some of the most interesting survey results, all while being time-efficient.
2. Survey methods
You’ve established that a survey is your method of choice. But which survey method should you choose? Many surveys today are conducted online as opposed to via phone or snail mail, and for obvious reasons: Online surveys are easy to disseminate – via email or social media – and they are cheap or free to produce (try Google Forms or SurveyMonkey).
Online surveys also deliver results instantly and in digital format, avoiding the errors that creep in when mailed responses are digitized. They also have lower labor costs than phone surveys and higher response rates: about 25% for online surveys versus 8 to 12% for phone surveys, according to FluidSurveys.
Online surveys, as convenient as they are, can create biases.
Your target population might not all have internet or social media access, or you might not have a complete email list. These biases can lower the explanatory power and generalizability of the survey results. They can't always be avoided, or avoiding them would increase costs; in any case, such limitations should at least be mentioned in the publication. My survey, too, faced the problem of internet bias, but instead of changing my method I changed the survey population. Instead of surveying farmers, a group that doesn't use computers and the internet much, I surveyed cooperative extension agents: agricultural advisors with desk jobs, internet access, and publicly available email addresses. They are also in contact with many farmers in their county and thus can, to a good degree, speak for them. I couldn't ask them quite the same questions I would have asked farmers, but that was a compromise I was willing to make.
3. Survey Design
Ask yourself again: What am I interested in? This should help you decide which question formats are best: matrices, multiple choice, open-ended text boxes, Likert scales, images and sketches …? Textbooks can give you some direction, but think critically about what you read in papers. Was that really the best way to answer the research question in that particular case, or was it just convenient? Could I do it differently and get better, more robust results? Discuss your ideas with your committee or peer researchers.
My survey was modeled after focus group and interview research of corn farmers in the Midwestern U.S., which I adapted to fit my time budget and to answer my research questions. Surveying also means explaining differences in responses and, often, trying to confirm or reject a hypothesis. Why did some participants answer in this or that way? Because of their income, their level of education, their geographic location — whatever it is, make sure you ask about it in your survey in order to later cross-tabulate answers and analyze them for significant differences.
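As a minimal sketch of that cross-tabulation step – the column names and responses below are made up for illustration, not drawn from my survey – pandas and scipy can tabulate answers by a background variable and test whether groups differ significantly:

```python
# Illustrative only: toy data and hypothetical column names.
import pandas as pd
from scipy.stats import chi2_contingency

# One row per participant: home state and whether they use seasonal forecasts
responses = pd.DataFrame({
    "state": ["TX", "TX", "KS", "KS", "OK", "OK", "CO", "CO"],
    "uses_forecasts": ["yes", "no", "yes", "yes", "no", "no", "yes", "no"],
})

# Cross-tabulate answers by state...
table = pd.crosstab(responses["state"], responses["uses_forecasts"])
print(table)

# ...and run a chi-square test of independence on the contingency table
chi2, p, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```

With real data you would swap in the demographic variables you asked about (income, education, location, and so on) and interpret the p-value with the usual caveats about small cell counts.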
Also, think about the order of your questions and whether you really need to ask all of them. People might be okay spending 10 or 15 minutes on your survey, but too many questions will leave them frustrated and tired. Keep it succinct, but still ask everything you need. Let people know at the beginning how long the survey will take (pretests can help you estimate that) and include a progress bar if you can.
4. Question Language
By now you probably see that developing a survey takes some time. Even after you have weighed the pros and cons of each question format, phrasing, testing, and refining your questions can take weeks or even months. Which words should you avoid? The farming community in the Southern Great Plains, for example, doesn't like terms like “sustainability” (which many associate with more government regulation) or “climate change”, for obvious reasons, so I tried to avoid them.
Jargon is okay to use, but make sure people understand what you mean. Consult experts to fine-tune the wording. Make sure questions are unambiguous, easy to understand, and check that answer choices cover every possibility. Again, pretesting can reveal most of these issues before you release your survey. The easier you make it for your participants, the more likely they will finish your survey.
5. IRB Approval
Getting your survey approved by your Institutional Review Board (IRB, also called an Independent Ethics Committee, IEC) is required for all research on human subjects (meaning survey, medical, psychological, and other research on humans) that is intended for publication. In general, the IRB's job is to make sure you treat your participants fairly, protect their information, and don't harm the reputation of your university. The University of Oklahoma produced a series of short videos explaining the IRB approval process.
Expedited IRB approval for low-risk studies, like mine with the agricultural advisors, can be dealt with in a week. If your survey population includes children, prisoners, or pregnant women (so-called “vulnerable populations”), a full panel review is necessary, which can take months, and reviewers might ask you to explain and justify just about every detail of your survey. Some studies need approval by multiple IRBs – for example, studies of Native American tribes, which may have their own IRB process. Last but not least, make sure your survey is finalized before you submit it for IRB approval. Even small changes, for example in the wording of questions, have to be approved again.
6. Distributing your online survey
Congrats! Your survey is IRB-approved and ready to go. Now you need to get it out there. Depending on your target population, this can be a challenge for several reasons. For my survey of agricultural advisors, for example, I couldn't spread it via Facebook or Twitter: I wouldn't know who took the survey, nor would social media be the best way to reach my target population, and my results would become meaningless. In my case, personal emails and email lists were the only method that made sense.
There are several ways to increase the number of responses. Connect with your survey population by attending their meetings and introducing yourself. Reach out to trade publications and ask if they would report on your research and the survey you are conducting in their circulation area. When they do, you can link to the coverage in your survey invitation. My research was covered by the Kansas Farm Bureau and the Texas Farm Bureau, which I mentioned in survey reminder emails. Especially for out-of-state surveys, this can build trust among people who are otherwise unfamiliar with you.
“Local champions” – people well known and respected by your target group – can also help you boost your response rate. I asked state and regional extension directors to send out invites and reminders on my behalf. Seeing their name in the inbox (as opposed to mine) most likely made people more willing to take time out of their busy schedules for my survey. The one time I sent out reminders myself, I got zero responses.
But local champions are busy people, too. Provide them with email templates, a list of email addresses (semicolon-separated, so it can be copy-pasted into an email address field), and a PDF with information about your research that they can attach. Also, ask them to copy you on their emails; that way you know the email actually went out, and when. Collaborating with local champions means extra work for you, but it is worth the effort. Without them I would not have gotten the response rate I did: after three months of surveying and several rounds of emails, it stood at just over 40%. And as a nice side effect, several people said they were very interested in having me present my results.
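That semicolon-separated list is trivial to generate rather than assemble by hand. A few lines of Python – the addresses below are made up for illustration – produce a string that pastes straight into an email field:

```python
# Illustrative only: made-up addresses standing in for a real contact list.
addresses = [
    "agent.one@example.edu",
    "agent.two@example.edu",
    "agent.three@example.edu",
]

# Join with "; " so the result pastes directly into a To/BCC field
to_field = "; ".join(addresses)
print(to_field)
# agent.one@example.edu; agent.two@example.edu; agent.three@example.edu
```

In practice you would read the addresses from your contact spreadsheet instead of typing them out.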
Lastly, timing is critical. Think about times when people are easier to get hold of. Winter wheat advisors have a lower workload during the cold months of the year, before temperatures rise in spring and farm work picks up again, so they had more time for my survey in winter and early spring.
Babbie, E. R. (2014). The Basics of Social Research (6th ed.). (especially chapters 9 and 14)
Berg, B. (1998). Qualitative Research Methods for the Social Sciences (3rd ed.).
Edit (August 2017):
Pew Research Center published a comprehensive guide to designing public opinion surveys.