Sunday, July 24, 2016
Lies, Damn Lies, and B2B Marketing Surveys
Research surveys and their associated reports have become important marketing tools for many kinds of B2B companies. They are particularly popular with companies that provide marketing services and marketing technology solutions. Many of these companies conduct or sponsor surveys fairly frequently, and they feature the resulting reports in their marketing efforts, which are usually directed at B2B marketers.
Survey reports can be valuable sources of information about industry trends and emerging marketing practices and technologies, but they can also be misleading. So it's important for B2B marketers to carefully evaluate the research reports they encounter. It's always a good idea to approach any research report with a critical eye and a healthy dose of skepticism. As Mark Twain once wrote, "There are three kinds of lies: lies, damned lies, and statistics."
In my work, I review lots of survey reports. I make extensive use of survey results and research reports when I'm developing content resources for clients, and I frequently describe survey findings in this blog. Over the years, I've developed a mental checklist of questions that I ask and things that I look for when I'm reviewing a research report. In this post, I'll discuss the first two questions that I ask when I start to review a survey report.
What is the Agenda of the Research Firm and/or the Research Sponsor?
Because many companies now view surveys and survey reports as marketing tools, I always begin by assuming that a research report has been designed to support a marketing agenda. The existence of a marketing purpose doesn't automatically mean that a survey or report is flawed or has no value. It does, however, put me on high alert for indications of bias in the design of the survey or in the presentation of the results.
Are the Survey Respondents a Representative Sample of the Target Population?
Sampling is a complex subject, and I won't attempt to explain it fully in this post. The most important thing to remember is this: If the survey respondents are not a representative sample, the results of the survey cannot be "projected" to the overall target population, because the people who chose to respond may differ in systematic ways from the people who didn't. In essence, those results are only valid for the group of people who actually responded to the survey.
Unfortunately, some report authors either ignore this limitation or use language in the report that fails to make the limitation clear. For example, I recently reviewed a survey report that deals with the use and benefits of "predictive marketing analytics" by/for B2B companies. The report was based in part on a survey of 150 B2B marketers, and it does not claim that those respondents are a representative sample.
Forty-nine percent of the survey respondents said their companies are using predictive marketing analytics, but the report contains the following statement:
"Forty-nine percent of companies are currently using predictive marketing analytics." (Emphasis added)
The problem is that this statement isn't supported by the survey results: the respondents aren't a representative sample of marketers at all types of B2B companies, so a finding about 49% of respondents can't be restated as a finding about 49% of companies.
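To make the distinction concrete, here's a minimal simulation sketch. Every number in it is invented for illustration (the report provides no population data): it assumes that only 20% of all B2B companies actually use predictive analytics, but that users are four times more likely than non-users to answer a vendor's survey on the topic. Under those assumptions, a 150-respondent sample can easily report a usage rate near 50% that tells you very little about the overall population.

```python
# Hypothetical illustration -- all rates below are invented assumptions,
# not figures from the report discussed above.
import random

random.seed(42)

POPULATION_SIZE = 100_000
TRUE_ADOPTION_RATE = 0.20          # assumed "real" rate in the full population
RESPONSE_RATE_IF_USER = 0.08       # assumed: users are more willing to respond
RESPONSE_RATE_IF_NON_USER = 0.02   # assumed: non-users mostly ignore the survey
SAMPLE_SIZE = 150                  # same sample size as the survey in the report

# Build the population: True = uses predictive analytics, False = does not
population = [random.random() < TRUE_ADOPTION_RATE for _ in range(POPULATION_SIZE)]

# Simulate self-selection: each company independently decides whether to
# respond, with users responding more often than non-users
willing = [
    uses for uses in population
    if random.random() < (RESPONSE_RATE_IF_USER if uses else RESPONSE_RATE_IF_NON_USER)
]

# Take the first 150 willing respondents as the survey sample
respondents = willing[:SAMPLE_SIZE]
observed_rate = sum(respondents) / len(respondents)

print(f"True adoption rate in the population:    {TRUE_ADOPTION_RATE:.0%}")
print(f"Adoption rate among survey respondents:  {observed_rate:.0%}")
# The respondent figure lands near 50% even though the assumed population
# rate is 20%, so projecting it to "companies" would badly overstate adoption.
```

The sketch isn't a claim about what the real adoption rate is; it simply shows why a percentage of self-selected respondents and a percentage of all companies can be very different numbers.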
Very few of the surveys I review are based on representative samples, so this issue almost always exists. Such surveys can still be useful, but they are also easily misused, because we humans tend to focus on the specific survey findings and forget the limitation.
In a future post, I'll discuss several other issues that B2B marketers need to watch for when they're reviewing survey reports.
Illustration courtesy of Daniel Oines via Flickr CC.