Two common responses you’ve probably noticed survey writers use are “don’t know” and “not applicable.” While many researchers and survey authors include “don’t know” or similar options to give respondents a wide range of answers to choose from, these options can hurt the quality of the collected data.
These answers let respondents disengage from certain topics because they can avoid sharing their thoughts altogether. Surveys that feature “don’t know” answer options can negatively affect results in several ways. Read on to learn how “don’t know” response options influence survey data and when they belong on a questionnaire.
One of the primary reasons researchers continue providing “don’t know” responses on survey items is to give respondents a way to avoid making choices about something they have no opinion on or no experience with.
In theory, there is nothing wrong with this. In certain scenarios, “don’t know” responses can point organizations toward areas to address in future marketing efforts. In practice, however, this approach can produce low-quality, unreliable data, because many respondents select “don’t know” or “not applicable” simply to avoid thinking about the topic or reading the question at all. Respondents may also have clarifying, rich commentary to add but no opportunity to share it.
How you resolve “don’t know” answers depends on your data and needs: you might remove the option altogether, or add a follow-up question that captures more useful detail. Either way, the goal is to glean the most from your survey.
Not providing an “other” response in your surveys can mean missing out on meaningful data from respondents with unfavorable opinions. Many respondents may hold a negative attitude toward a product or service they have no first-hand experience with.
Allowing respondents to select “other” or “not applicable” and then giving them a way to explain why they feel that way can give you valuable insights into why consumers have negative assumptions about your product or service. For example, it’s possible that consumers avoid purchasing a product because it looks cheaply made or inconvenient to use. Providing them the option to tell you why they are avoiding your product might give you valuable information about how consumers perceive your brand.
Surveys that let respondents disengage from thinking about the answer are unlikely to yield fruitful results and helpful information. In general, it’s valuable to give users a chance to explain themselves, even if they have limited experience with your service or product.
There are instances where including “don’t know” and “not applicable” suits the survey’s design. Surveys with questions about sensitive information should give users an “out”; forcing respondents to provide information they are not comfortable sharing can hurt the completion rate.
Questions involving brands and products should be constructed so that respondents rate only brands and products they know. Simply asking for a rating on something the respondent knows nothing about, without first asking about their level of familiarity, can skew your data: the respondent may have no opinion because they have no experience with what you’re asking about, yet they are forced to give a rating anyway.
Your audience might choose a “don’t know” answer for a variety of reasons. You can minimize these occurrences by removing bias from your questions and keeping personal bias out of their wording. In addition to including an “other” response, learning how to reduce survey bias helps you get more accurate results from your audience.
One way to reduce “don’t know” responses is to add a follow-up question asking respondents why they chose that answer. A 2018 study of non-response answers in behavioral surveys found that a prompt encouraging users to answer again after a “don’t know” response can successfully reduce the number of “don’t know” answers. The follow-up can surface why they skipped the question, or whether another option stood out to them.
If your organization needs assistance in building surveys with impactful questions and answer choices to obtain the most accurate data possible, IntelliSurvey can help. We provide an array of services to help you take the guesswork out of knowing which questions to ask and which to avoid in your market studies.
IntelliSurvey offers a variety of products to help companies conduct effective surveys and large multi-market studies. If you’re interested in collaborating, please contact us for more information.