Research reveals that accuracy is one of the top criteria that senior marketing professionals look for in market research. But what exactly is “accurate” market research, and how do you ensure it’s part of your efforts?
A Digsite Sprint is an agile consumer insights community platform that helps you find and directly engage with a highly targeted audience: in this case, 25 senior marketers at consumer product companies. The interactive, social media-style interface allows participants to answer survey questions, share ideas, images and video, and respond to each other’s comments.
In this 3-part blog post series, we’re focusing on each of the desired attributes of market research, detailing what each term actually means when applied in real life, and why they’re so important to marketing professionals. In our previous blog post, we detailed "actionable" research. Now, we turn our attention to accuracy.
Missing the mark on accurate market research can have a tremendous impact on organizations. For innovation projects, it can lead new product teams to focus on building features that don’t ultimately add value to the customer. For branding projects, it can lead to shortfalls in market awareness or adoption. And for customer experience projects, it can lead to lost opportunities relative to competition. Let's dig more deeply into what accurate research truly means, some key mistakes companies make, and how you can avoid them.
What is Accurate Market Research?
Accuracy can mean many things to different people. For marketers who are not trained market researchers, “accurate market research” would include data that reflects their customers and helps improve the success of new campaigns, products and services. Most claims related to accuracy (and the ones many marketers rely upon) are associated with surveys among large groups of people. We are lured into comfort by “95% confidence level” or “error range of +/- 3%.”
While these claims provide reassurances that the research is accurate, they don’t actually capture many of the sources of inaccuracy in market research. In fact, I would argue that some of the most inaccurate research is actually from large-scale surveys, as opposed to more targeted qualitative research.
Interested in learning more about a tool for producing “accurate insights”? Click here to learn more about Digsite Sprints!
Why Were the 2016 Election Polls Inaccurate?
A great example of this was the 2016 election polling. Thousands of people were surveyed and error ranges were given, but these assurances still didn’t mean that the results accurately predicted the outcome of the election.
Why? It’s because these assurances (the high confidence level, the small error range) only measured reliability: the likelihood that you would get the same answer if you randomly selected a different group from the same population.
In the case of the election, even if those polls were reliable in regards to the population they were polling, they failed to predict the outcome correctly because other errors were introduced into their survey methods. One of the biggest issues was failing to poll the right audience.
Larger Sample Sizes Don’t Equal Better Accuracy
Let’s explore the concept of the right audience even further, because it’s truly the difference-maker when it comes to accuracy. One of the most frequent questions I get as a market researcher is, "How many people do I need to speak with to get accurate responses?" The sample size you need is NOT based on the size of the population (unless your target is very small). The sample size you need depends on how similar or different the opinions of your target audience are.
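You can see why population size drops out of the math by looking at the standard margin-of-error formula for a simple random sample. Here is a minimal Python sketch (the function name and the illustrative numbers are my own, not from any study mentioned here): the finite-population correction barely moves the result once the population is much larger than the sample.

```python
import math

def margin_of_error(n, p=0.5, z=1.96, population=None):
    """Margin of error for a proportion at ~95% confidence.

    n: sample size; p: assumed proportion (0.5 is the worst case);
    population: if given, apply the finite-population correction,
    which only matters when the population itself is small.
    """
    moe = z * math.sqrt(p * (1 - p) / n)
    if population is not None:
        moe *= math.sqrt((population - n) / (population - 1))
    return moe

# The same 1,000-person sample, two wildly different population sizes:
print(margin_of_error(1000, population=100_000))     # roughly 0.031
print(margin_of_error(1000, population=10_000_000))  # roughly 0.031
```

Note what the formula does and doesn’t contain: sample size `n` and the spread of opinion `p(1-p)` drive the error range, while population size is nearly irrelevant. And none of it protects you from surveying the wrong audience in the first place.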
If you get a group of similar people together, and they all agree on something, then you can feel with a high degree of confidence that their response is valid. It’s representative of that specific audience. This is how we arrived at the three attributes of marketing research that are important to marketers. Our poll showed unanimous agreement on the attributes Actionable, Accurate and Affordable.
And the verbatim responses all indicated similar thoughts and feelings. We can use a technique like triangulation to further enhance the accuracy of comments like these:
"It has to be accurate for it to matter."
"We use research a lot to help with decision making so findings needs to be accurate & actionable."
"Access to timely data with a solid methodology leads to the best actionable results."
"I don't want to spend money on data that is not accurate."
"Research and insights don't matter if they are not actionable or accurate."
"If it is not accurate there is no point in doing it."
While the sample size of our Digsite Sprint may not be large enough to use statistical testing for accuracy, the fact that we’re tapping into insights of a homogeneous crowd allows us to trust the accuracy of the responses.
Mistakes That Can Impact Marketing Research Accuracy
So how do you ensure you don’t make the same mistake as the election pollsters? You need research that accurately reflects the needs of your target market. Here are a few major issues that can stymie those accuracy efforts.
1. Recruiting too narrow of an audience
Thinking back to the election, many of the online polls had thousands of responses. However, because many were generated from either a conservative or liberal news organization, the results only reflected a small segment of the voting population. Companies often make the same mistake, overlooking the people who will be making the specific decision.
For example, let’s say you are developing a new lawn mower. You want to know if potential customers would be willing to spend an extra $250 for the new features on the new mowers. You could talk to your current customers, your dealers, or competitive buyers for feedback. Deciding which of these audiences best reflects the target customer is critical. Choose the wrong one, and the answer won't be an accurate representation of your target. In fact, doing qualitative research among a small group may yield more accurate results than one large study among the wrong group.
2. Recruiting too broad of an audience
In consumer products, many of us were raised with the idea of talking to a “nationally representative” population. However, when I conducted validation research that compared the results from “representative” samples to in-market sales, it indicated this approach wasn’t very effective. Why? To have accurate research results, you benefit the most from talking to the people who are most likely to impact your sales.
For many product categories, some iteration of the 80/20 rule applies. If 80% of sales come from 20% of customers, your research should focus on those more influential customers. For example, a new concept test I conducted for a food company had results from a “nationally representative” sample of 300 people. Of that sample, about 100 were people who had bought the product category in the last six months - the influential buyers. When I looked at the in-market results, the purchase interest among those 100 people much more accurately reflected the in-market success of the new product than the purchase interest among the other 200 people who rarely went down that aisle in the grocery store.
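The 80/20 logic above boils down to a simple filter: score the concept among the subgroup that actually buys the category, rather than blending them with everyone else. A minimal Python sketch with made-up data (these numbers are purely illustrative, not the actual study results):

```python
# Hypothetical concept-test responses: purchase intent on a 1-5 scale,
# plus whether each respondent bought the category in the last six months.
responses = [
    {"intent": 5, "category_buyer": True},
    {"intent": 4, "category_buyer": True},
    {"intent": 2, "category_buyer": False},
    {"intent": 1, "category_buyer": False},
    {"intent": 4, "category_buyer": True},
    {"intent": 2, "category_buyer": False},
]

def mean_intent(rows):
    """Average purchase intent for a group of respondents."""
    return sum(r["intent"] for r in rows) / len(rows)

buyers = [r for r in responses if r["category_buyer"]]
non_buyers = [r for r in responses if not r["category_buyer"]]

print(f"All respondents: {mean_intent(responses):.2f}")   # blended score
print(f"Category buyers: {mean_intent(buyers):.2f}")      # the number to trust
print(f"Non-buyers:      {mean_intent(non_buyers):.2f}")
```

The blended score averages away the signal: enthusiastic category buyers and indifferent non-buyers cancel out, while the buyer-only cut is the one that tracks in-market results.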
3. Mistaking attitudes for behaviors
We all know that our attitudes do not actually reflect our behaviors. We may believe in protecting our planet, but throw away a recyclable item because there isn’t a recycling bin nearby. Or we may believe in eating healthy, but indulge in fast food on a business trip. Getting accurate research results means recognizing that our situations impact our decisions as much as our attitudes do.
For example, when we conducted a focus group on a new food product, we knew people were overstating their need for a product when they said “this would be great for camping.” Given that the respondents went camping infrequently, we realized they were coming up with a situation when they could purchase the product rather than telling us how it fit with their everyday needs.
Asking questions can also be tricky if you’re not thinking about questions in the customer’s terms. “You need the mindset and language of the consumer,” said Jennifer Cooper, a market researcher and owner of BuyerSynthesis. “They might just think about your product once a year, but you think about it every day.”
4. Underestimating outside influences on behavior
One common research technique I have used is to have participants complete the sentence “when I ___, I will use ___ instead of ___ because ___.” The idea was to always understand the context of the decision and the alternatives they are choosing from. This can be important for accurate research. Rather than simply asking someone if they like an idea or what they like and dislike about it, start by asking them what they are doing now. Help them provide answers in the specific context of their most recent experience. Otherwise, they may give you feedback either thinking about how other people would use the product, or about a situation that is uncommon.
5. Putting people in a situation where they aren’t giving honest answers
Many analysts believe one of the main reasons the 2016 election polls were wrong was that people didn’t feel comfortable admitting they were voting for Trump because they didn’t want to be accused of being sexist or racist. In fact, issues with people trying to please the interviewer (what psychologists call social desirability bias) have been well documented, particularly for phone and in-person research.
This can be made worse if the person asking the questions introduces their own biases. You need to be detached in terms of what you’re searching for, and not try to simply validate your own thinking. “You have to be open to surprises,” Cooper said.
Cooper also noted that it’s important to provide a safe, secure environment where the participant can trust you and what you’re trying to achieve.
“Even when the end client makes defensive comments, it’s up to you as the researcher to treat each person you interview with respect, and to realize that even inaccurate comments they make originate from their experiences and guide their decisions,” she said.
Participants can tell when you have internally dismissed them, which will not only yield inaccurate results, but taint participants’ views on participating in future research. “In a larger sense, every time you do research, you are setting the stage for the next research situation, so you need to pay it forward,” Cooper urged.
6. Overwhelming people with surveys that are too long and have too many questions
Quantitative surveys are running into a number of issues. As Ray Poynter of NewMR alluded to in this post, surveys that are too long are a major problem. Poynter notes that anything that takes over 20 minutes is “bound for the insight scrap heap.”
“Over-surveying” is also an issue, according to Jennifer Cooper. When people have been bombarded with surveys, they eventually tune them out. “You’re left with only the people who really like the product, or the ones who had a problem,” she said.
There are two problems with this: first, your non-response error will get bigger and bigger as you get responses only from those people who have something extreme to say. Second, the people you are surveying will become inherently different from your actual target.
“You’re muddying the waters,” Cooper said.
Finding the Right People Isn’t Easy
To achieve accurate results, the biggest challenge is to find the right participants for your research project. So how do you find these people? It’s an ongoing struggle for the industry, especially when focusing on getting large numbers of responses. You’re very much at the mercy of the recruit.
With large samples, researchers tend to focus on closed-ended questions, which can be “faked” by robots or respondents who aren’t your target. And it takes too much time to ask open-ended questions or understand the context of responses.
Recruiting the right people is an area where agile qualitative holds a distinct advantage over quantitative. You tend to narrow in on a highly relevant audience, and can easily tell by the responses whether someone is truly a relevant contributor. While you have fewer responses, the ability to ensure that each response is accurate is much greater. If finding the right people is the key to achieving accuracy, then it’s up to the industry to continue to foster new ideas and methods to make it happen.
What Can You Do Right Now to Improve the Accuracy of Your Research?
Now that you're aware of some pitfalls related to accuracy, how do you improve the accuracy of your research?
First, always start with a clear hypothesis on what you expect to find and how that might be different across your target audience. This will help you hone in on who you really need to talk to so you don’t define your audience too narrowly or too broadly. It will also give you or your research partner a good start on determining how many people you need to talk to in order to get accurate results.
Second, focus your research at all stages on understanding consumers’ beliefs in the context of what they are doing now. It is very difficult to influence behavior, and focusing on understanding their situation and decision-making context can help participants be more accurate in their responses to new ideas. Finally, consider agile online qualitative research as an alternative to in-person qualitative research or surveys.
Online qualitative tools not only allow you to recruit the right target audience, but also give you the opportunity to dig deeper and truly understand the “why” behind an answer. Participants are more likely to be honest in their answers when they are responding in the comfort of their own environment, without someone they are trying to please. Your team is more likely to act on the results because they can see real people’s responses and ask follow-up questions as needed.
For example, Sub-Zero used Digsite to improve the accuracy of their field trials by converting from in-person interviews to online qualitative. They got the necessary feedback and field trial results to improve initial quality and speed to market. Get the full case study to read more.
Considering how many researchers are taking advantage of agile, online qualitative research tools like Digsite, inaccurate research is a fate few are willing to chance.
Ready to learn more or get started? Download our free ebook, Agile Research Guide: How Consumer Product Teams Can Innovate Faster.