We’ve probably all received some type of payment for participating in research before, whether it was for completing a survey, participating in a focus group or online community, or some other form of research.
But should we be paying people for their participation? While it would be nice if people were intrinsically motivated to take surveys, the fact is people are busy, and while your survey is very important to you, it may not be to them.
Among the reasons to include an incentive:
- To say thanks. Participating takes time, and whether the participant is a current customer or not, showing appreciation for their time is a nice gesture.
- Boost response. How many people do you need to invite to get one willing participant? Offering an incentive can reduce this number. Sure, sometimes you could just send out even more research invitations, but many times you’re going to be limited.
- Boost quality. In research talk, this means reducing non-response bias. In other words, if only 1% of people take your survey, are those 1% different from the 99% who didn’t? And if so, are the results really representative of your entire audience? The higher your response rate, the more confidently you can say your results aren’t suffering from this type of bias. Corona regularly tests incentives and their impact on response rates and data quality. On more than one occasion we’ve seen not only a boost in overall response, but a boost in the very type of respondent we were hoping to reach most, including tough audiences such as customers or donors who had left our client, unhappy customers, and so on. The reason we conducted those surveys was to find out how our clients could improve, and an incentive provided the boost those audiences needed to be willing to give their feedback.
- Lower cost. While this may seem contradictory, if including an incentive increases willingness to participate enough, then sampling and recruiting may become easier and therefore less expensive.
So, what type of paid incentive is best? The answer is, “it depends.” The incentive should be tailored to the audience and what’s being asked of them. The incentive should have broad appeal so as not to inadvertently bias the results due to one group being significantly more attracted to the incentive than another group.
What incentives have you tried? Are there ones you found particularly effective in boosting response?
Yesterday, Denver Mayor Michael Hancock revealed Denver’s first cultural plan in 25 years. This strategic plan, written by Corona Insights in partnership with Denver Arts & Venues, will fuel the next era for our city’s art, culture and creativity. What a treat it was to attend the press conference, see the final printed plan and hear firsthand the excitement felt by city leaders and residents.
Corona leveraged its expertise in strategy, data and market research to serve its hometown. The result? A community-centered plan designed to achieve a seven-part vision. From finding more art around every corner, to learning over a lifetime, to supporting local artists, Denverites hunger for more art.
What can you do? Go to www.imaginedenver2020.org and check out the plan. There will be an official release party on Thursday at 6pm. Come early to see a presentation of the research behind the plan by Corona Insights that starts at 4pm. Then stay tuned to see how you can get involved.
Businesses, governments, and nonprofits often ask those who come into contact with them how satisfied they are with X. You’ve undoubtedly been asked this yourself in the past, and perhaps you’ve even run your own customer feedback (often dubbed Voice of the Customer) program. Doing so is smart, as it can uncover problem areas and provide a chance to resolve lingering issues.
However, what are you measuring when you ask someone whether they are satisfied? There are two general areas in which we measure satisfaction:
- Transaction- or event-based. This is when we ask someone how satisfied they are with a recent interaction with an employee, a service received, or another specific action between your organization and the customer.
- Relationship. This is when we ask how satisfied they are overall with their relationship with your organization.
The former helps us diagnose very specific issues and uncover unresolved issues with their last interactions. The latter gives us a snapshot of the organization overall and not only the most recent interaction.
The challenge is that organizations often use a transaction survey as a measurement of the broader relationship. Depending on the nature of the interaction, asking people about their overall satisfaction may be appropriate, but often a question like “How satisfied were you with your purchase/service call/donation?” is used as a proxy for overall organizational satisfaction. The issue here is that someone can have one bad experience, and that won’t necessarily translate to low satisfaction overall.
For example, suppose you called your cell phone company and had a bad experience on the phone: it took too long to reach a human, you had to be transferred multiple times, and so on. A survey of that experience would likely show dissatisfaction. However, you may still be happy with the quality of service and the price you pay. A survey about your overall relationship may show positive results, even if slightly dented by the recent episode.
Both types of satisfaction surveys have their place, and knowing what you’re trying to measure can help you refine the question you ask and how you interpret the results. Better insights start with better data that start with better design.
In an upcoming blog post we’ll discuss the different ways to report satisfaction and which may be best for your needs.
Here at Corona, we strive to help our clients maximize the value of their research budgets, often by suggesting solutions that get the job done faster, better, or at a reduced cost. In survey research, developing an accurate sampling frame (i.e., a list of the study population and their contact information) is instrumental for success, but sometimes developing or acquiring a sampling frame can be time consuming, expensive, or impractical. Using a cluster sampling technique is one potential solution that can save time or money while maintaining the integrity of the research and results.
What is cluster sampling? Cluster sampling, as the name implies, groups your total study population into many small clusters, typically defined by a proximity variable. For example, street blocks in a neighborhood are clusters of households and residents; schools represent clusters of employees who work in the same school district. The main difference between simple random sampling and cluster sampling is that instead of selecting a random sample of individuals, you select a random sample of clusters. This approach provides a representative sample that is appropriate for the use of inferential statistics that draw conclusions about the broader population.
How to use cluster sampling: First, make sure the nature of your research question is compatible with cluster sampling; if your analysis will require completely independent respondents, then this is probably not the best approach. Second, consider the configuration of your population; you must be able to group people by defined boundaries, such as city blocks or office building floors. After grouping your population into small clusters, use a random number generator to draw a random sample of clusters (rather than a sample of individuals). Typically, every individual in the selected clusters is sampled, although you can combine this with other techniques such as stratified or systematic sampling. As long as 1) you can match every person in the population with a cluster, 2) you have an appropriate person-to-cluster ratio, and 3) you have a complete list of clusters, you can use these groupings as a sampling shortcut.
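The steps above can be sketched in a few lines of Python. This is a minimal illustration of one-stage cluster sampling using a made-up population of households grouped by block; all names and sizes here are invented for illustration:

```python
import random

# Hypothetical population: 12 city blocks, each a cluster of households.
# In practice, the cluster list would come from maps or administrative records.
population = {
    f"block_{i}": [f"block_{i}_household_{j}" for j in range(random.randint(8, 15))]
    for i in range(12)
}

def draw_cluster_sample(clusters, n_clusters, seed=None):
    """Select n_clusters at random, then take every unit in each selected cluster."""
    rng = random.Random(seed)
    # The random draw happens at the cluster level, not the individual level.
    selected = rng.sample(sorted(clusters), n_clusters)
    # One-stage cluster sampling: survey everyone in the selected clusters.
    return {c: clusters[c] for c in selected}

sample = draw_cluster_sample(population, n_clusters=4, seed=42)
for block, households in sample.items():
    print(block, len(households))
```

Note that the full list of individuals is only needed within the selected clusters, which is exactly why this approach can save so much effort when a complete sampling frame is impractical.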
When might cluster sampling be useful? Cluster sampling is useful when you don’t have enough resources to develop a complete sampling frame or when it takes significant effort to distribute or collect surveys (such as going door-to-door). For example, if we wanted to survey bus riders within a city, it would be impractical to develop a list of all bus riders on any given day, let alone to find our random sample of individuals and give them all surveys. Cluster sampling allows us to select a random sample of bus routes and times, and then survey everyone on those buses. Although individual clusters may not be representative of the population as a whole, when you select enough clusters at random, your sample as a whole will be representative.
Potential problems: Cluster sampling should be applied with caution, and there are some disadvantages compared to a simple random approach. It is better to sample more, smaller clusters than fewer, larger clusters. For example, for a nationwide survey it is better to cluster by counties than by states. If your clusters are too few and too large, you might draw a sample that does not adequately represent the population. The size and homogeneity of each cluster, along with your desired final sample size, also affect the viability of cluster sampling.
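A standard way to quantify the "more, smaller clusters" advice is the design effect for one-stage cluster sampling, DEFF = 1 + (m − 1)ρ, where m is the average cluster size and ρ is the intraclass correlation (how alike people within the same cluster are). The sketch below, using an assumed ρ of 0.05 purely for illustration, shows how larger clusters shrink the effective sample size of a fixed number of respondents:

```python
def design_effect(avg_cluster_size, icc):
    """Design effect for one-stage cluster sampling: DEFF = 1 + (m - 1) * rho."""
    return 1 + (avg_cluster_size - 1) * icc

# Same total sample of 600 respondents, same assumed within-cluster
# correlation (0.05): compare many small clusters to a few large ones.
for m in (10, 30, 100):
    deff = design_effect(m, icc=0.05)
    # Effective n: what the clustered sample is "worth" vs. simple random sampling.
    effective_n = 600 / deff
    print(f"cluster size {m:>3}: DEFF = {deff:.2f}, effective n = {effective_n:.0f}")
```

Under these assumed numbers, 600 respondents drawn in clusters of 100 carry roughly the statistical weight of only about 100 independent respondents, while clusters of 10 preserve most of the sample's value.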
At Corona, we start fresh with each research project, and we are full of solutions that can help maximize the value of your research budget and resources. If you are struggling with how to reach your population of interest, give us a call; maybe we can shed some light on the situation.
When clients come to Corona and ask us to help them find answers to their most difficult questions, we typically take a quantitative or qualitative approach to our research. Sometimes, however, we use a combination of both methods. As you might imagine, there can be lots of value in bringing the two types of data together, and a mixed methods approach can offer a powerful resource to inform and illuminate the answers that our clients seek.
But how, exactly, do we determine when it is appropriate to use a combination approach? Although it is essential to understand a client’s goals and the research questions that need to be answered as a first step to any project, it is also important to consider the purpose of potentially combining qualitative and quantitative data. There are several reasons to choose a mixed methods approach, four of which are described below:
- Enriching data – A client could choose to enrich their understanding of a topic or issue by using qualitative work to collect information not obtained by quantitative methods, or vice versa. Let’s say, for example, we have a client who puts on a special event each year. Our client wants to have more information about the kinds of people who attend that event, in addition to understanding how attendees’ experiences might be improved. To achieve these goals, we could conduct intercept surveys during the event to obtain basic demographic information about attendees. Then, Corona could conduct follow-up interviews with select attendees to understand their experiences at the event and how they can be improved.
- Examining findings – We might generate hypotheses from quantitative work that will also be tested using a qualitative approach, or vice versa. Recently, in an effort to evaluate the ways in which school meals are served, Corona conducted in-depth interviews with school administrators. From these interviews, we generated hypotheses that we later tested in a survey with a larger pool of the same audience.
- Explaining findings – In the event that there are unanticipated findings from quantitative work, we might recommend a qualitative approach to understand these results. For instance, Corona conducted a survey with college students to understand the types of beverages they drank. After analyzing the survey, we had some surprises regarding how frequently certain beverages were consumed. In follow-up focus groups, we were able to understand the reasons behind consuming certain beverages, including the situations and activities involved.
- Triangulation – This approach uses qualitative data to confirm or refute results found from quantitative data, or vice versa. The main idea behind this approach is that we can be more confident in a result if different methods produce the same result. Take, for example, a client who wants to choose a new logo. We might test potential logos through a survey with the client’s target audience to see which one they prefer. Then, Corona could conduct an online bulletin board with different participants to see if people pick the same logo as their favorite. If both audiences pick the same logo, our client can find more assurance that the chosen logo will resonate with a broader audience.
Overall, the goal of designing research is to use whatever combination of tools best answers our clients’ questions, and it’s important to remember that we’re not limited to surveys, focus groups, and interviews. In a world where we are constantly gathering information from countless sources (e.g., Census data, sales data, website analytics), the potential combinations are endless and might seem a little overwhelming. That’s why we’re here to help guide our clients so that they can find the most comprehensive, defensible data to drive their decision making.
There is more to qualitative research design than meets the eye. Much should be taken into account in developing a questionnaire designed to elicit in-depth qualitative feedback from research participants. For the sake of brevity, I’ll hit on a few conceptual highlights.
For starters, a researcher will consider the background and context for the research project. This includes:
- Overriding research question (in line with the research objective): This can serve as a kind of compass to which all questions can point. For example, if it’s a customer satisfaction-type objective, then all questions should elicit responses that in some way inform a client’s understanding of satisfaction as well as possible actions to take.
- Audience: Even with a target respondent audience that has similar characteristics, there will still be subpopulations. If we’re talking with a group of home remodelers about a building material and some have recently completed a project while others are going to in the near future, this will impact how questions are framed so that they are relevant to all.
Then, keep in mind function and form for a qualitative questionnaire.
- Function: A primary function will be to elicit candid and in-depth responses. Questions are generally designed in an open-ended fashion to encourage explanation and reasoning for answers (versus simple “yes” or “no” or other closed-ended responses). However, there may still be well-timed, strategically placed “pulse-checking” questions that are helpful to uncover where consensus and main themes in responses may lie (especially in a group research setting).
- Form: The number of questions should be the right amount to allow for in-depth feedback, and for an opportunity for a moderator to probe within the allotted timeframe. Typical questionnaires may cover four to five high level topic areas. In sequencing, topic areas may go from broad to narrow, and then questions within topic areas also generally go from broad to narrow.
Now, you’re ready for question design. As mentioned, questions must be properly and logically sequenced, but a few general question types may include the following:
- Exploratory: These are generally open-ended questions designed to elicit unaided responses. Thus, these tend to be more “big picture” and “top of mind” and capture attitudes, opinions and perceptions very well. (e.g., What are your first thoughts when you hear the term _____? Or, what would a library of the future look like?)
- Explanatory: These are questions that prompt the audience to respond to some kind of stimulus. This could be reactions to a product, concept, marketing message or positioning statement, creative piece, etc. These may also gather more in-depth or “why” explanations to first-phase quantitative findings, such as survey findings. Word games or list ranking-type exercises can be fun and informative here in group sessions. Again, these questions effectively capture attitudes, opinions and perceptions.
- Behavior: Although observational, real-time or experiential research methods may assist in understanding behaviors best, researchers may still make behavioral queries in more controlled research settings such as online or in-person focus groups or interviews. Surprisingly, these questions are sometimes more difficult to answer than those asking about attitudes or opinions. When asking about behaviors, questions may encourage specific examples and stories in order to illustrate actual decisions or choices. “How” versus “what” questions may also stimulate respondents’ recall or intentions related to their behaviors. Finally, some simulated exercises may work well as the closest thing to actual observation.
In summary, qualitative questionnaires can look pretty simple when finished, but a lot of thinking and consideration goes into proper design.
With tremendous pride and a full heart, Karla Raines presented the Denver Commission on Cultural Affairs (DCCA) with IMAGINE 2020: Denver’s Cultural Plan at their January 2014 meeting. The commission had been strong proponents of a refreshed cultural plan for Denver. These volunteers served as Corona’s creative muse throughout the 15-month process. They held firm in their belief that Denver needed a research-driven plan that was strategic by design and held forth a bold vision for Denver. Their insistence that the process be community-driven resulted in a cultural plan that speaks to the aspirations and expectations of Denver residents.
Corona was happy to host the commission’s monthly meeting in its downtown Denver office (pictured below). A celebratory toast and freshly baked cookies from Maggie and Molly’s Bakery capped off the event.
Stay tuned! IMAGINE 2020 will be revealed to the public in early March. Visit ImagineDenver2020.org for more information.
Pictured: Ginger White, Deputy Director of Denver Arts & Venues, addresses the Denver Commission on Cultural Affairs, A&V staff and Team Corona.
We were honored to work with the Colorado Municipal League (CML) for the fifth year on their annual report (PDF), State of Our Cities and Towns. Starting this year, the organization has decided to do a deep investigation into a different issue each year, beginning with transportation, and you can expect future reports on other important issues facing towns and cities. To read a quick overview of transportation issues in Colorado, you can view their annual summary report (PDF).
There are nearly 16,000 miles of city streets in Colorado … and every mile is essential to deliver groceries to the store on the corner, get children to school, and connect people to work and home.
To complement the report, CML also produced an easy to digest video highlighting the key themes from this year’s survey findings. The short video is a great way to communicate the important information to a statewide audience.
- To learn more about the Colorado Municipal League, visit their website.
- Interested in reading previous years’ reports?
An important part of Corona’s front-end consulting involves helping clients pick the best research approach, method, and instrument. In-person focus groups are still very relevant for qualitative research, and their takeaway value lies in the discussion. But what do you do when you need qualitative feedback that doesn’t require as much interactive discussion? What if you want to hear more from each individual and be able to compare their opinions against each other? Interviews are a great way to achieve this, but telephone or in-person interviews sometimes pose logistical and time challenges. If media or messaging is to be tested, these challenges may be compounded. In cases like these, an online bulletin board may be an appropriate tool.
Online bulletin boards enable participants to log in at times and from locations that are convenient to them, answer questions posted by a moderator, and read and respond to other participants’ comments. Software platforms have a “board” that can display text, graphics, images, video, and websites, and can even accept video uploaded by participants as stimuli for others to consider.
Usually, 15-30 participants will give feedback over a 3-7 day period by logging in and answering questions and responding to others. In essence, the result is rich qualitative feedback from a good sample size in a short amount of time.
Here’s an example of a question one might ask on an online bulletin board:
Please rank the following Corona logos “1,” “2,” and “3,” with “1” being your favorite and “3” being your least favorite. Why did you order them the way you did?
Image testing, like that above, is a great example of a use for an online bulletin board. The platform gives the participant immediate access to the visual. Similarly, in an online bulletin board, respondents are a bit more isolated in their responses than during a focus group and tend to focus on their own ideas rather than the rest of the group. This can be a great way to get an affordable, relatively small sample of around 30 respondents to help point image development in the right direction: you can stack their responses for a decent overall picture while also getting explanatory feedback from a “Why?” follow-up question.
Corona Insights facilitates the entire bulletin board process, helps design and optimize research approaches and methods, and provides analysis that helps answer organizations’ important business questions. For marketers and those considering using a bulletin board, here are some criteria for when one would be an appropriate tool:
- When you care less about group interaction and more about having in-depth qualitative data. In a real-time group, each person gets only 10-12 minutes of air time, but the longer time frame of an online bulletin board invites rich, detailed information.
- When you want to facilitate different modes of response besides just text or voice. Bulletin boards also invite the use of mark-ups and video responses, providing more flexibility with different stimuli and alternative ways to respond.
- When your participants are busy, geographically dispersed or logistically challenged. Unlike a face-to-face group, it is possible to reach difficult recruits and overcome geographic limitations. The cost of travel is the major source of savings in using online bulletin boards.
Prior blogs I’ve written have established the link between company-level strategic planning, which includes competitive strategy, and marketing organization planning. I’ve also discussed several analytical tools used for competitive strategy, including the SWOT analysis (Strengths-Weaknesses-Opportunities-Threats).
I was recently reading through a LinkedIn group’s lively dialogue on the topic of SWOT analysis. One theme that arose is that some in the competitive intelligence (CI) community feel they may be somewhat unfairly losing out to “information professionals” or vendors who are hired and then apparently proceed to merely populate a SWOT framework with regurgitated, existing data. This is viewed by the CI community as a shortcut to using the framework in a more pure or dynamic sense – that is, as a basis for strategic thinking and actual competitive analysis. CI professionals also subtly make the point that the kind of thinking necessary for this analysis relies upon experience, knowledge, and training, and is ultimately backed by good old-fashioned intellectual horsepower in order to best serve a client in making important decisions.
Fair enough. Overall, however, I think a combination of reliable and relevant data that is accurately analyzed and then used by experts and their clients for smart strategic planning and decision making is a great approach.
Additionally, I think it’s important to step back and understand where a SWOT analysis lives as part of a larger strategic process. This includes examining what is really necessary for an effective SWOT as well as how a SWOT analysis informs the strategic process going forward. (I’ll explain below in the specific context of a marketing executive’s perspective, but the general principle holds for any other line-function executive or business manager who may find him/herself undertaking such a process.)
WHERE SWOT ANALYSIS COMES FROM AND QUESTIONS IT ANSWERS:
A thorough internal situation analysis examines a long and strategic list of internal areas that apply at the company or marketing organization level. This is best accomplished with the help of reliable existing data and in some cases primary, original data in order to reveal the “SW”. The SW then provides key insights to answer a marketing executive’s important question, “What products or services are we best positioned to offer given our business’s strengths?”
And likewise, a thorough, data-driven analysis of a strategic list of external factors at both the industry and the market level naturally reveals the “OT”. The OT produced then answers a marketing executive’s question, “What is the trending of the industry, market and customers we serve (or potentially serve) and how attractive is the current competitive business environment?”
WHERE SWOT ANALYSIS LEADS:
Once the answers to the questions above are clear, then comes an opportunity for smart recommendations and guidance. Logical next steps include: 1) making “invest” or “divest” decisions for business units or product areas; 2) establishing realistic marketing goals for each unit or area; and 3) developing strategies to achieve the goals in #2, whether by pursuing new and/or existing markets with current or next-generation products.
And this is just the beginning of the fun for a marketing executive.