UC Blue Ash College

Survey Tips

Quick Tips for Survey Administration and Design

  • If you are trying to measure an attitude, concept, or behavior, there’s a good chance someone has done this before.
  • During your research, pay attention to how others are measuring the concept you want to measure. They may have already tested the reliability and validity of a measure. An identical question also allows for comparisons across surveys.
  • If possible, administer your surveys to a few potential respondents (or your friends) to get feedback.
  • Ask others to think out loud as they answer the survey questions, and ask “What does that mean to you?”
  • How many people you survey depends on several factors, such as the expected response rate, the level of accuracy you require, the sizes of any sub-groupings you will need to look at (e.g., if you want to look at your data by gender or student class year), and the number of response options in your questions.
  • Sample size calculators can help provide estimates of how many responses are needed to maintain an acceptable margin of error and confidence level.
  • Surveys can be administered by mail, phone, in-person, or online, and there are different questionnaire considerations for each mode.
  • If an interviewer is asking the questions, think about how the question will sound.
  • If you use an interviewer, give clear instructions to the interviewer for how to address respondent questions.
  • If a respondent is filling out the survey on paper or over the Internet, think about how the question will look.
  • If the respondent cannot read the response options, keep the options concise and have the interviewer read them as part of the question (e.g., “How worried are you about losing your job in the near future: A LOT, SOMEWHAT, or NOT MUCH AT ALL?”).
  • Plan the timing of your survey to minimize interference with other surveys. You should also consider other events or holidays that are going on and how they may affect responses.
  • To help increase response rates, it’s often helpful to share the following with potential respondents in a cover letter or introductory text:
    • The purpose of the survey.
    • Why they have been selected.
    • Who they may contact if they have questions about the survey.
    • How the information will be handled (including whether identifiers will be kept).
    • Who will have access to the data.
    • To whom summaries of the survey will be provided.
  • Survey responses can be impacted by previous questions.
  • If a respondent reads the survey, provide a title for each section. If an interviewer reads a survey, write smooth verbal transitions.
  • It’s usually best to start a survey with general questions that will be easy for a respondent to answer.
  • It’s usually best to ask any sensitive questions, including demographics (especially income), near the end of the survey.
  • If you are asking a series of similar questions, randomizing the order respondents hear them can improve your data.
  • Open-ended questions ask respondents to answer in their own words.
  • Open-ended questions allow the greatest variety of responses, but they are time-consuming to ask and require substantial work to analyze.
  • Closed-ended questions ask the respondent to choose from a limited number of responses provided to them.
  • Closed-ended questions, when well designed, ensure that respondents interpret questions the same way.
  • Respondents are more likely to skip an open-ended question than a closed-ended one.
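The sample-size tip above (maintaining an acceptable margin of error and confidence level) can be sketched numerically. This is a minimal illustration, not part of the original tips: it uses the standard margin-of-error formula for a proportion, n = z²·p(1−p)/e², with an optional finite-population correction. The function name and default values are assumptions chosen for the example.

```python
import math

def sample_size(margin_of_error, confidence_z=1.96, proportion=0.5, population=None):
    """Estimate how many completed responses are needed.

    n = z^2 * p * (1 - p) / e^2, optionally adjusted with the
    finite-population correction. proportion=0.5 is the most
    conservative (largest sample) assumption.
    """
    n = (confidence_z ** 2) * proportion * (1 - proportion) / margin_of_error ** 2
    if population is not None:
        # Finite-population correction: smaller populations need fewer responses.
        n = n / (1 + (n - 1) / population)
    return math.ceil(n)

# 95% confidence (z = 1.96), +/-5% margin of error:
print(sample_size(0.05))                    # -> 385
# Same target, drawn from a student body of 1,000:
print(sample_size(0.05, population=1000))   # -> 278
```

Note that this is the number of completed responses; divide by your expected response rate to estimate how many people to invite.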

  • Generally, including a middle category yields better data.
  • Points on the scale should be labeled with clear, unambiguous words.
  • Try using scales that provide balance and avoid bias.
  • Be consistent with assigning value; that is, positive items are usually scored 'high' and negative items are usually scored 'low.'
  • A good survey question measures the underlying concept it is intended to tap.
  • It doesn’t measure other concepts.
  • It means the same thing to all respondents.
  • Avoid technical terms and jargon. Words used in surveys should be easily understood by anyone taking the survey.
  • Avoid vague or imprecise terms.
  • Define things very specifically.
  • Avoid complex sentences.
  • Provide reference frames.
    • Make sure all respondents are answering questions about the same time and place.
    • For example, if you ask: “How often do you feel sad?” some people might provide an answer about their life’s experience, while others might only be thinking about today.
    • Usually, it’s better to provide a reference frame: “How often have you felt sad during the past week?”
    • Don’t ask “How good is the economy these days?” and assume everyone is thinking about the same economy.
    • It’s better to ask “How good is the national economy these days?” or “How good is the economy in your community these days?”
  • Make sure scales are ordinal.
    • If you are using a rating scale, each point should be clearly higher or lower than its neighbors for all respondents.
    • For example, don’t ask: “How many jobs are available in your town: Many, a lot, some, or a few?”
    • It’s not clear to everyone that “a lot” is less than “many.”
    • A better scale might be: “A lot, some, only a few, or none at all.”
  • Avoid double-barreled questions.
    • Questions should measure one thing.
    • Double-barreled questions try to measure two (or more) things at once.
    • For example: “Do you think we should lower taxes and spending?” Respondents who support only one of these might be confused.
  • Answer choices should anticipate all possibilities.
    • If a respondent could have more than one response to a question, it’s best to allow for multiple choices.
    • If the categories you provide don’t anticipate all possible choices, it’s often a good idea to include an “Other-Specify” category.
  • If you want a single answer, make sure your answer choices are mutually exclusive and cover all possible responses.
    • If you are measuring something that falls on a continuum, word your categories as a range, such as 1-4, 5-10, etc.
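The mutually-exclusive-and-exhaustive rule above can be illustrated in code. This is a minimal sketch; the bracket labels and dollar cut points are hypothetical, chosen only to show why ranges such as “$25,000–$49,999 / $50,000–$99,999” are safer than overlapping ones like “$25,000–$50,000 / $50,000–$100,000.”

```python
def income_bracket(value):
    """Map a numeric answer to exactly one category.

    The brackets are mutually exclusive (no value falls into two)
    and exhaustive (every value falls into one). Cut points are
    hypothetical, for illustration only.
    """
    if value < 25000:
        return "Under $25,000"
    elif value < 50000:
        return "$25,000 - $49,999"
    elif value < 100000:
        return "$50,000 - $99,999"
    return "$100,000 or more"

# A boundary answer like $50,000 lands in exactly one bracket:
print(income_bracket(50000))  # -> $50,000 - $99,999
```

With overlapping ranges, a respondent earning exactly $50,000 could honestly pick either of two answers, which muddies the data; non-overlapping ranges remove that ambiguity.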
Sources

  • Dillman, D. A., Smyth, J. D., & Christian, L. M. Internet, Phone, Mail, and Mixed-Mode Surveys: The Tailored Design Method (4th ed.). Hoboken, NJ: John Wiley & Sons, 2014.
  • Fowler, F. J., Jr. Improving Survey Questions: Design and Evaluation (Applied Social Research Methods Series, Vol. 38). Thousand Oaks, CA: SAGE Publications, 1995.
  • Robinson, J. P., Shaver, P. R., & Wrightsman, L. S. (Eds.). Measures of Political Attitudes. San Diego, CA: Academic Press, 1999.
  • Sudman, S., & Bradburn, N. Asking Questions: A Practical Guide to Questionnaire Design. San Francisco, CA: Jossey-Bass, 1982.


Contact Information

Muntz Hall - Room 150D
Phone: 513-558-9444
Email: ucbair@ucmail.uc.edu