Asking survey questions that measure what you are trying to measure
10/18/12 / David Kennedy
It’s a common problem in market research – asking one question and inferring the answer to another.
Sometimes it’s unavoidable – when writing a survey, you can’t show your hand and let participants know the information you are really after. Too often, however, interpretations and decisions are skewed not by faulty data, but by misinterpretation of what the questions actually asked.
Case in point: as MotorTrend noted in a recent blog post, the J.D. Power Quality Ratings may not properly gauge quality. This is especially problematic considering how widely the ratings are reported in the media.
The problem is that the survey measures many ergonomic issues alongside quality issues. Questions such as “Is the radio easy to see, reach, and operate?” make up much of the survey. But how easy something is to do isn’t necessarily a measure of quality if it still works as designed (e.g., maybe the GPS’s operation isn’t intuitive, but it does work). J.D. Power defends its survey by saying it measures quality “as defined and reported by consumers.”
Defining your questions clearly for your respondents – and placing the results in perspective – is critical to ensuring everyone answers and interprets the research in the same way. In general, asking for a specific measure will get you a specific answer, and asking for a general measure will get you a general answer. Compare “How satisfied are you with the quality of X car?” (general measure) with “How satisfied are you with the fuel efficiency of X car?” (specific measure).
Or as MotorTrend says, “check the surveys, comparison-drive the cars, and then decide for yourself.”