Sunday, September 11, 2016

The Dark Side of Marketing Performance Measurement


"Yes, a Jedi's strength flows from the Force. But beware of the dark side."

For the past several years, marketers have faced growing pressure to prove the value of their activities and programs. As a result, they are placing greater emphasis on measuring the performance of marketing tactics, channels, and programs, and some marketing leaders are allocating budgets and basing marketing mix decisions on performance measures.

Overall, this has been a positive development. It's hard to argue that marketers shouldn't track and measure the performance of their activities, and use performance metrics to guide marketing investments. Common sense says that this approach should lead to better marketing decisions.

But there's a potential dark side to the current fixation on marketing performance measurement. The problem arises when the ability to measure a marketing activity becomes the primary criterion for determining its value.

When taken to the extreme, this way of thinking can lead marketers to choose marketing tactics based largely on ease of measurement. As a recent blog post put it, "While marketers once accepted as fact that they didn't know which half of their budget was wasted, today they've done a 180 and believe that if it can't be measured, it's not worth doing."

I can understand why marketers are tempted to think this way. After all, in an environment where proving the value of your work can mean the difference between keeping or losing your job, marketing methods that are easily measured can appear to be the safe choice.

But making measurability the prime criterion for determining value is short-sighted and ultimately dangerous. It's a classic example of the McNamara Fallacy at work. The fallacy is named for Robert McNamara, the US Secretary of Defense during the Vietnam War, and refers to his metrics-driven approach to managing the war effort. The term was coined by the noted social scientist Daniel Yankelovich, who described it this way:

"The first step is to measure whatever can easily be measured. This is OK as far as it goes. The second step is to disregard that which can't be easily measured or to give it an arbitrary quantitative value. This is artificial and misleading. The third step is to presume that what can't be easily measured really isn't important. This is blindness. The fourth step is to say that what can't be easily measured really doesn't exist. This is suicide."

Ironically, some of our efforts to improve marketing performance measurement have also exacerbated its dark side. For example, most marketers are now focused on measuring the impact of marketing activities on revenues. So, we're now constructing complex attribution models in an attempt to assign revenue dollars to specific marketing activities.

Measuring the performance of marketing activities that produce quick results is relatively easy. It's much harder to measure the performance of marketing activities that may not bear fruit for months or even years. For example, the content that you're creating and publishing this year can produce a positive impression in the mind of a potential buyer, and that impression can influence a buying process that won't even begin for two years. Likewise, some of the sales you're closing this year are due, at least in part, to the marketing activities and programs that you ran in 2014 and 2015.

In a recent interview, David Cote, the CEO of Honeywell, described the importance of long-term effects in these terms: "You do well this year, not because of what you're doing this year, but because of what you did in the previous 5 years."

Marketing activities with long gestation periods, and those whose impacts are several steps removed from the final buying decision, can be very difficult to measure. But many of these activities are vitally important for marketing success. Unfortunately, when we fixate on measurability, we can end up under-investing in these critical marketing activities.

As Albert Einstein purportedly wrote on his blackboard: "Not everything that counts can be counted, and not everything that can be counted counts."

Illustration courtesy of Kory Westerhold via Flickr CC.
