Research surveys and survey reports have become important marketing tools for many B2B companies, including providers of marketing technologies and marketing-related services. Many of these companies conduct or sponsor surveys and feature the resulting reports in their marketing programs. As a result, many B2B marketers are now both producers and consumers of research-based content.
Survey reports can be valuable sources of information about business trends and practices, emerging technologies, customer attitudes and a wide range of other subjects. But survey results and reports can also be unreliable and/or misleading.
The availability of free or inexpensive, user-friendly survey tools has made it easier for marketers to create and conduct their own surveys. Unfortunately, those same tools also make it easy to design and field surveys that don't produce reliable results.
As producers, marketers obviously want potential customers to view their survey results and reports as credible and reliable. As consumers, marketers increasingly rely on survey results when making important decisions, so it's important for them to evaluate the survey reports they encounter carefully. It's always wise to approach any survey report with a critical eye because, as the saying popularized by Mark Twain goes, "There are three kinds of lies: lies, damned lies, and statistics."
In my work, I review lots of survey reports. I make extensive use of survey results and other research studies when I'm developing content for clients, and I frequently discuss survey findings in this blog. Over the years, I've developed a mental checklist of things I look for when reviewing a survey report.
I'm planning to devote three posts to this topic. In this post, I'll discuss some of the basic things I look for when I'm reviewing a survey report. My next two posts will discuss issues that can affect the validity of survey findings and/or the credibility of survey reports.
My Starting Mindset
Whenever I begin reviewing a survey report produced or sponsored by a business enterprise, I assume the survey was conducted to support a marketing agenda. Having a marketing purpose doesn't necessarily mean the research is flawed, but it does put me on alert for indications of bias in the design of the survey and/or in the presentation of the findings.
Survey Methodology
Many survey reports briefly describe the research in the introductory section, but a thorough report will also include a detailed description of the methodology used.
For a survey of business professionals relating to a business subject, I look for the methodology description to include at least the following:
- Sample size (the number of responses the survey received; see the quick sketch after this list for why this number matters)
- When the survey responses were collected
- How the survey responses were collected (e.g. online, telephone)
- How potential survey participants were selected
- If appropriate, how survey respondents were qualified
- Job role/job function
- Industry verticals/types of companies represented
- Company sizes represented
- Geographic locations of respondents
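Sample size deserves particular attention because it puts a hard limit on how precise any reported percentage can be. As a rough illustration (my own sketch, not something you'll find in a typical survey report), the snippet below computes the worst-case margin of error for a reported percentage, assuming a simple random sample and a 95% confidence level. Keep in mind that most B2B surveys rely on convenience samples, so the real uncertainty is usually larger than this formula suggests.

```python
import math

def margin_of_error(sample_size, proportion=0.5, z=1.96):
    """Approximate margin of error for a reported survey percentage.

    Assumes a simple random sample and a 95% confidence level (z = 1.96).
    proportion=0.5 gives the worst-case (widest) margin, which is the
    conservative check to apply when reading a survey report.
    """
    return z * math.sqrt(proportion * (1 - proportion) / sample_size)

# Example: a survey report based on 200 responses
n = 200
moe = margin_of_error(n)
print(f"{n} responses -> margin of error of about +/-{moe * 100:.1f} percentage points")
```

For example, a survey with 200 respondents carries a margin of error of roughly plus or minus 7 percentage points, which is worth remembering when a report treats a 4-point difference between two answers as meaningful.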