23 Survey Best Practices & Real Examples

February 10, 2020 13 min read

Cameron Johnson

Survey Best Practices

Conducting surveys is a smart business practice that enables you to keep your finger on the pulse of your organization. Surveys allow you to get critical feedback from the people who matter most – your customers and employees. Figuring out survey best practices takes time, but the payoff for your business is well worth it.
You can use this feedback to make better business decisions that will ultimately strengthen your relationship with your audience and boost your organization’s success and bottom line.
These days you don’t even have to spend a whole lot on printing out paper questionnaires to distribute to survey takers. Whether you have plans to introduce a new line of products, expand your services, or organize a large event, an online survey offers a cost-effective and easy alternative to finding out whether your plans are viable.
Implementing survey best practices at every stage will ensure that you design a survey that asks all the right questions. This is most crucial to getting concrete answers that you can act on.
That’s why we’ve put together this A-Z survey best practices guide! We’ll walk you through each of the following topics so that you can design and send out perfectly crafted surveys to your target audience.

Effective Surveys Begin with Best Practices

The first thing you need to think about is how you are going to develop and design your survey.
Survey development is a science. Every aspect of your survey, from the number of questions you ask to the order in which you ask them, will trigger a different response.
A haphazardly designed survey with no thought to the structure, order, or the number of questions will give you random results. You won’t get anything concrete that you can act on.
Understanding and implementing survey design best practices is a key factor in creating an optimal survey.
These survey guidelines will do just that.

1. Identify Your Target Audience

Before you write out your first question, you need to be crystal clear about your target audience. Start with a broad audience, and drill it down as much as you can.
Which of these describes your target audience?

  • Middle-aged corporate executives
  • Young families interested in a holistic lifestyle
  • Seniors looking for a solution to mobility issues

[Image: choosing the best target audience to survey]
Every person thinks, feels and talks differently. To maximize your marketing endeavors, and increase your odds of getting the right feedback, your questions must be tailored to match the language of your audience.
Keeping your target demographic in mind at every stage of your survey development will help you phrase your questions accurately.

Question to ask yourself: Who is my target audience?

2. Have a Clear Objective in Mind

You’ve identified your target audience. Now, what is it that you hope to learn from them?

  • Are you conducting a survey to test the market for a new product that you want to introduce?
  • Do you want to get feedback that you can use to improve your customer service?
  • Are you looking for an objective assessment of your post-transaction follow-up process?

Conducting one survey to get feedback on multiple issues can be counterproductive. Not only will it skew the results, but you will have missed a golden opportunity to get answers to your core questions. Chances are you won’t be able to do much with the answers that you get back.
For a survey to be effective, limit the questions to the core issue and nothing else. This will ensure that you will get the actionable data you need to make strategic decisions.

Question to ask yourself: What is my ONE objective for conducting this survey?

3. Keep Your Questionnaire Short

Admit it, don’t you groan inwardly when you are presented with a list of questions a mile long? Your customers or employees are likely to react the same way.

Question Count | Average Seconds Spent Per Question | Total Survey Completion Time
1              | 75                                 | 1 min 15 sec
2              | 40                                 | 2 min
3–10           | 30                                 | 2–5 min
11–15          | 25                                 | 5–7 min
16–25          | 21                                 | 7–9 min
26–30          | 19                                 | 9–10 min

Don’t torture your survey takers with too many questions. Do them a favor, and keep your questionnaire short.
Most studies indicate that surveys that take 5 minutes or less to complete tend to get the best response rate.
Anything longer than 11 minutes is too long and is likely to result in higher abandonment rates or, worse, random answers. With random answers, you will end up spending a whole lot of time and money doing all the wrong things.
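To sanity-check your own questionnaire length, you can turn the per-question averages in the table above into a rough estimator. This is an illustrative sketch only – the function name and the flat per-question rate assumed beyond 30 questions are our own, not part of any survey tool:

```python
# Rough completion-time estimator based on the per-question averages above.
# Assumption: surveys longer than 30 questions keep the last band's rate.

def estimate_completion_seconds(question_count: int) -> int:
    """Estimate total completion time in seconds for a survey."""
    # (upper bound of band, average seconds per question), from the table above
    bands = [(1, 75), (2, 40), (10, 30), (15, 25), (25, 21), (30, 19)]
    for upper, secs_per_question in bands:
        if question_count <= upper:
            return question_count * secs_per_question
    return question_count * 19  # beyond 30 questions: last band's rate

print(estimate_completion_seconds(8))   # 8 questions -> 240 s (4 min)
print(estimate_completion_seconds(20))  # 20 questions -> 420 s (7 min)
```

If the estimate lands much past the 5-minute sweet spot, that is your cue to trim questions before sending anything out.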

Question to ask yourself: How long will it take to complete my survey?

4. Keep Those Answer Options Brief

Give your reader 3–5 options to choose from. Anything more will confuse the reader and increase completion time, giving them cause to abandon the survey altogether.
Here is a typical survey question that follows this rule:
[Image: survey best practice example – short questions]
If five options are not enough to cover the likely answers, you may be asking the wrong question, or the question may be phrased poorly.
Re-examine the question and see how you can rephrase it.

Question to ask yourself: Can I rephrase or break down the questions to limit the possible answers?

5. Survey Question Order Matters

It may sound surprising, but your question order does make a difference. Place your most complicated, most challenging question first, and you are sure to scare away the survey taker.
A crucial best practice for surveys is to make sure that the first couple of questions are easy and interesting. This hooks the customer and gets them enthusiastic about jumping right in.
Once they get into the flow, they may even enjoy the challenge of answering the more difficult questions.

Question to ask yourself: Do the questions go from easy to progressively more difficult?

6. Where To Place the Profile Questions

Most surveys get this wrong by placing profile-related questions right at the beginning.
When you place these “intrusive” questions at the beginning, a reluctant respondent may prefer not to participate in the survey at all.
Someone who has already taken the time, however, to go through the full questionnaire will be more open to sharing their personal information.
[Image: survey best practice example – asking for personal info at the end]
Regardless of where you place personal questions, keep them to the bare minimum. Only ask questions that will help influence your decisions – nothing more.

Question to ask yourself: Are my profile questions placed at the end of the survey?

With the basic survey design best practices covered, you are ready to start compiling the questions that will form your survey.
Related: The Top 75 SaaS Companies of the Decade (2010–2020)

Survey Best Practices for Writing Effective Questions

This really is the crux of your questionnaire. To get accurate, actionable feedback, your questions must be simple, unambiguous, close-ended, objective, and placed in the right order.
It’s equally important to provide just the right number of response options – not too many, not too few.
There’s a lot to think about when you are compiling questions for your survey, but don’t let that overwhelm you.
These survey guidelines will help you get those questions, and answers, dead right every time.

1. Keep It Simple

When it comes to best practices for surveys, this is the number one rule. Keep those survey questions as simple as possible. You want the survey takers to answer those questions, not get overwhelmed or confused.
Trim down the questions until they are straightforward, easy to understand, and even easier to answer.

2. Reframe Those Double-Barreled Questions

  • What do you think about the appearance and taste of our waffles?
  • How did you like the entertainment and food at the annual gala?

Take another look at the two questions above. Both are classic examples of double-barreled questions.
These are questions that seem straightforward enough at first, but they are in fact two distinct questions in one.
What if the survey taker loved the taste of your waffles, but did not quite appreciate the way they looked?
Or what if they enjoyed the entertainment, but were not too enthused about the food that was served?
It would be difficult to answer either of the questions above. And even if the respondent did tick one of the options provided, the feedback would be inaccurate.
When asking about two unrelated subjects, you must break them up into two separate questions. This will allow respondents to answer each question independently so the feedback you get is spot-on.
[Image: survey best practice example – double-barreled question]

3. Use Open-Ended Questions Sparingly

Open-ended questions require complete, meaningful answers.
For example, “What can we do to improve the design of our product?” is an open-ended question that requires the respondent to put some thought into their answer.
While that may be a crucial question that you need to be answered in your survey, asking too many open-ended questions can result in survey fatigue.
Wherever and whenever you can, reframe your questions so they are close-ended and require a simple, straightforward ‘yes’ or ‘no’ response.
[Image: survey best practice example – open-ended questions]
This way respondents are more likely to plow through close-ended questions and complete your survey. They’ll feel good about it, too.

4. Delete Those Extraneous Questions

It’s not uncommon for surveys to ask the same question in two or more different ways.
Think about how many times you’ve been asked for your zip code, and then asked which state you live in.
Asking for your date of birth and then your age is another mistake that you will often come across when taking surveys.
However, just because it is common does not make it a survey best practice. It can in fact be very frustrating.
After you’ve compiled your list of questions, go through them meticulously. Are there two questions that essentially ask the same thing? Think about how you can reframe and optimize one of them, and delete the other.

5. Phrase Your Questions As Precisely As Possible

Avoid ambiguity when phrasing your questions. You want the survey taker to decipher your questions at a glance. They shouldn’t have to spend time trying to figure out what you are asking.
Using double negatives in a question can create a certain amount of confusion in the reader’s mind.
Take this question for example:

“Do you oppose not allowing the board to pass the act on road safety?”

If the survey-taker answers ‘yes’, does it mean they oppose or support the decision?
Instead of spending time trying to decipher the meaning of the question, most readers will simply choose an answer randomly. A quick, thoughtless response is often the easiest way out for them.

6. Phrase Your Response Categories As Precisely As Possible

Choosing the right words in your response categories is absolutely crucial.
Avoid using subjective metrics such as:

  • ‘sometimes’
  • ‘often’
  • ‘a lot’
  • ‘usually’
  • ‘rarely’

For example, you may think going out to the movies once a month or consuming fast food twice a week is ‘rarely’, while someone else may consider that ‘a lot’.
Using objective metrics instead such as ‘once a month’ or ‘more than once a week’ will give you more accurate results.

7. Avoid Biasing the Response

This happens when you ask leading questions that tend to nudge a person to answer one way or another.
[Image: survey best practice – biasing the response]
Take these questions for example:

  • “Did you enjoy our fantastic new menu?”
  • “Don’t you agree that our post-sales service is brilliant?”

Using words like “fantastic” or “brilliant” in the above questions creates a certain picture in the reader’s mind and nudges them toward ticking ‘yes’ in response.
A better way to frame these same questions would be:

  • “How would you rate our new menu?”
  • “How would you rate our post-sale service?”

Dictating the responses will not give you the answers you need; it will only give you the answers you like.
There’s not much you can do with that in terms of improvement.

8. Multiple Choice Questions Should Be Mutually Exclusive

Here’s a classic example of a multiple-choice question that’s not mutually exclusive:
Question: “What did you eat for breakfast?”
Answer: Choose any one of the options below:

  • Cereal
  • Waffles
  • Pancakes

What if the respondent ate cereal AND pancakes? Or maybe none of the above applies because they ate scrambled eggs and bacon for breakfast?
[Image: mutually exclusive example in a survey]
There is no way to answer the above question accurately.
If you have to include a question such as this in your questionnaire, you must provide a ‘none of the above’ option. That’s the only way to get an accurate answer.
Let’s take another example, if only because this happens more often than you can imagine.
Question: “Which age range do you belong to?”
Answer: Choose any one of the options below:

  • 20-30
  • 30-40
  • 40-50

Which answer should a 30-year-old choose? Should it be the 20-30 range or the 30-40 range?
When faced with options that are not mutually exclusive, respondents will end up choosing an answer at random. Some will tick the first option and others will go with the second, which ultimately does not give you the accurate data that you are looking for.
To get comprehensible, actionable data, you must provide options that are mutually exclusive.
The correct answer options to the above question would be:

  • 21-30
  • 31-40
  • 41-50

Now the 30-year-old respondent knows exactly which option to tick, without resorting to guesswork.
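One quick way to verify that a set of numeric ranges is mutually exclusive is to check that every value maps to exactly one bucket. A minimal sketch, using a hypothetical `age_bucket` function with labels mirroring the corrected options above:

```python
# Illustrative check that age buckets are mutually exclusive and exhaustive.
# The function and labels are hypothetical, not from any specific survey tool.

def age_bucket(age: int) -> str:
    """Map an age to exactly one non-overlapping bucket label."""
    buckets = [(21, 30, "21-30"), (31, 40, "31-40"), (41, 50, "41-50")]
    for low, high, label in buckets:
        if low <= age <= high:
            return label
    return "Other"  # catch-all so no respondent is left without an option

# With non-overlapping bounds, a 30-year-old maps to exactly one option:
print(age_bucket(30))  # 21-30
print(age_bucket(31))  # 31-40
```

Because each bucket’s lower bound starts one past the previous upper bound, no age can satisfy two ranges at once – which is exactly the property the corrected answer options give your respondents.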

9. Be Consistent with the Formatting

This is one of those small but critical best practices for surveys that is often overlooked.
Consistent formatting makes it easier for the respondents to breeze through the questionnaire and actually complete it. It also makes it easier for you to assess the results. So what does consistent formatting entail?
It’s simple really. Try as much as possible to use the same scale for all questions.
Do not use a three-point scale for some questions, a five-point scale for a few others, and a seven-point scale for the remaining.
[Image: survey error – inconsistent formatting]
Aim for a 3- to 5-point scale. Fewer than 3 points may not offer enough options, and more than 5 is excessive.

Survey Distribution Best Practices 

You’ve designed and developed the perfect questionnaire using tried and tested survey design best practices; it’s now time to send it out to your survey takers.
When it comes to distribution best practices, there are several factors you have to keep in mind:

  1. Send readers an invite to participate in the survey. Emphasize that you are collecting the data for their benefit.
  2. Be clear about the time expectations in the invite. This shows that you respect their time.
  3. Reassure survey takers that their data is secure and that you will not be sharing it with any third parties. This is especially important if there are any personal questions included in the survey.
  4. If you have to, send out one reminder to the recipients – only one. People rarely need more than one reminder; those who have not filled out the survey by then are probably not interested in doing so.
  5. Be sure to thank each and every person who took the survey. Take this one level up by sending personalized thank you emails to those who offered personal praise. More importantly, send out personalized thank you emails to those who provided negative feedback. You need to address their concerns and reassure them that you are taking their feedback seriously.

Related: 25 Ways to Maximize Productivity While Working From Home

Survey Best Practices for Interpreting the Results

When you conduct an online survey, it is important to have a plan in place to interpret and measure the results. There really isn’t much point in going through all the trouble if you are not sure how to use the feedback that you’ve received.
So, where do you start with interpreting the results?

1. Start with a Quick Review

Doing a quick review of the feedback will help you get an overall picture of the results.
Once you’ve scanned through the results and have a broad idea of the feedback you’ve received, it is time to start looking for patterns within the responses.
It feels great to read positive feedback about your products and services, but that’s not what you should be focusing on.
The primary goal of conducting any survey is to ascertain which areas you should improve. To do this, focus more on the negative feedback you receive – those are the areas that need attention.

2. Create a Visual Representation of the Feedback

Using visual formats such as graphs, charts, and word clouds can help you get a better understanding of how the responses measure up against each other.
[Image: survey analytics – visual representation]
You can see at a glance whether the feedback is overwhelmingly negative or positive, and which specific questions received mostly negative or positive feedback.
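Whatever charting tool you use, the first step is usually a simple tally of responses per option. A minimal sketch of that aggregation step using Python’s standard library, with hypothetical sample data:

```python
from collections import Counter

# Hypothetical responses to a single survey question.
responses = ["Positive", "Negative", "Positive", "Neutral", "Positive", "Negative"]

# Counter produces the per-option totals that a bar chart or
# word cloud is ultimately built from.
tally = Counter(responses)
print(tally.most_common())  # [('Positive', 3), ('Negative', 2), ('Neutral', 1)]
```

Sorting by count (as `most_common()` does) makes it immediately obvious which sentiment dominates – the same at-a-glance reading a chart gives you.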

3. Decide on a Plan of Action

You’ve identified the patterns in the feedback and reinforced those patterns through visual representations. Both of these should give you a pretty good idea of what you need to do next.
Make a list of issues your audience is dissatisfied with and their level of dissatisfaction.
Now, make a plan on how you will go about addressing these issues.
If you can designate several employees to address all issues simultaneously, great.
If not, focus on resolving the most grievous issues first and work down the list.

Choosing the Best Survey Tool

Choosing the right online survey software is vital to developing and designing a survey that hits the right note and gets you those crucial answers.
The Nextiva Surveys product is designed to make it easier than ever to develop a robust survey that will give you valuable insight into what your customers are thinking.
[Image: Nextiva Surveys survey tool]
With a tool such as Nextiva, you can collect feedback from your audience, easily export your findings, and interpret your results. This will help you make the right business decisions.
Related: How to Develop a Winning CRM Strategy for 2020 & Beyond


Applying these survey best practices will help you get the most out of your online surveys. A survey that is designed using tried and true best practices will help you make more informed decisions about how you can improve your products and services.

Cameron Johnson

Cameron Johnson was a market segment leader at Nextiva. Along with his well-researched contributions to the Nextiva Blog, Cameron has written for a variety of publications including Inc. and Business.com. Cameron was recently recognized as Utah's Marketer of the Year.
