Metrics maze: marketers struggle to measure effectiveness

New research suggests a big drop in marketing effectiveness. Does this expose fundamental problems, or could flimsy metrics be to blame?
Office workers pore over graphs at a table

The Data & Marketing Association (DMA) turned up a disturbing insight in 2021. After three years of consistent performance in the marketing sector, effectiveness suddenly slumped. What happened? 

The data came through the DMA’s awards, which have received more than 1,000 entries since 2017. The association found that in 2021, effectiveness fell by a massive 23%.

Tim Bond, director of insight at the DMA, says: “The pandemic was always going to be a curveball and we weren’t sure what would happen. But last year we were encouraged by the increase we saw. This year [the drop] is potentially of concern”.

He offers a potential explanation for the sudden and steep drop. “During the pandemic, when brands were hyper-conscious [of] reduced budget, there were added measures put in place to make sure they were getting bang for their buck. When that spend returned, some of those hygiene factors may have dropped.”

Paul Sinclair, marketing director at Zen Internet, is not surprised by the data. The concern itself isn’t new, he says, but it’s useful to have statistics that confirm the sector’s fears.

“Part of it is the proliferation of data, more and more digital channels, TV and radio becoming digitised. There’s more data to be looked at and analysed and many marketers or leaders have taken their eye off the ball for the ones that really matter.”

Mythical marketing metrics

So could the problem be traced to the methods companies use to measure their effectiveness? 

The DMA research highlights four categories to determine marketing effectiveness: business effects (profit, market share); brand effects (awareness, consideration); response effects (leads, bookings); and campaign delivery effects (reach, impressions).

The research found that marketers entering the awards used a total of 170 different metrics, with campaign delivery metrics accounting for 41% of the measures used and business, brand or response effects making up the remaining 59%. In fact, only 6% of the effectiveness measures seen in the research related to business effects.

Campaign delivery effects are often termed vanity metrics: numbers that are easy to obtain, often cited in the millions – views, clicks, audience – and that grab attention while offering little insight into business outcomes. So why are they even used?


Vanity metrics have a place, says Jamie Irving, global head of digital marketing at Boden, but he notes that “it’s about quality, not quantity after all”. They should be secondary or wrapped into a single KPI to be effective, Irving says.

“Ultimately, the most important metric is the one which will shift business outcomes. If we view vanity [metrics] as guardrails and acknowledge the impact they’ll have on the metric you need to hit, then you are in control of the performance you need to deliver. That should help maintain effectiveness.”
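To make the idea of ‘wrapping’ vanity metrics into a single KPI more concrete, the sketch below is a minimal, hypothetical illustration: a few delivery-level numbers are normalised against targets and blended into one composite score, with the business outcome carrying most of the weight. The metric names, targets and weights are assumptions for illustration only, not figures from the DMA, Boden or Zen Internet.

```python
# Hypothetical sketch: fold "vanity" delivery metrics into one composite KPI
# anchored to a business outcome (here, revenue per pound of spend).
# All metric names, targets and weights are illustrative assumptions.

# Illustrative targets and weights; the business outcome dominates the score.
TARGETS = {"revenue_per_pound": 4.0, "click_through_rate": 0.02, "reach": 1_000_000}
WEIGHTS = {"revenue_per_pound": 0.7, "click_through_rate": 0.2, "reach": 0.1}

def composite_kpi(metrics: dict, weights: dict) -> float:
    """Blend normalised metrics into a single 0-1 score, capping each at its target."""
    score = 0.0
    for name, weight in weights.items():
        value = metrics.get(name, 0.0)
        score += weight * min(value / TARGETS[name], 1.0)  # guardrail: no credit beyond target
    return score

campaign = {"revenue_per_pound": 3.1, "click_through_rate": 0.018, "reach": 850_000}
print(f"Composite KPI: {composite_kpi(campaign, WEIGHTS):.2f}")  # ~0.81
```

The shape matters more than the numbers: the delivery metrics act as capped guardrails while the business outcome dominates the score, so a campaign cannot look healthy on reach and clicks alone.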

Campaign effects have a certain immediacy: deploy an ad and see the number of clicks only an hour later. That’s seductive, but not necessarily helpful, and certainly not on its own. The DMA’s report shows that medium-term activity – campaigns between four and 12 months in duration – was squeezed in 2021, even though such campaigns are generally seen as optimal for driving ROI. With the desire to pin immediate profit to marketing spend, that change could have a negative impact on ROI multipliers.

Campaign effects can be useful for media planning, but for anything more profound “they’re not fit for purpose,” Bond says. It’s best to take a blended approach, prioritising business, brand and response effects while matching them to business realities such as category and budget. This gives a far truer picture of effectiveness.

“At the most basic level, logic dictates that if we’re choosing to take an action there will be a return, whether it’s a shift in awareness, consideration or sales. Therefore there are only really three metrics that could matter,” Irving says. “What changes is the timescale we’re looking at and the need for proxies to have an ‘early read’. It’s the early read that’ll be the killer if not everyone is aligned to the true metrics.” 

Long-term commitment to brand building

Sinclair acknowledges that others may feel pressure from on high, which can lead marketers to grasp at the quick-fix straw of campaign delivery data. “We [have] no short-term shareholder demands to meet. [But] I often report back to the board quarterly with metrics that give us confidence – the type of customers we’re acquiring, understanding if we’re keeping existing customers happy.”

One of the key implications to come out of the DMA’s research is that there should be a renewed focus on brand building. Brand building places less weight on immediate outcomes, but it improves ROI by stimulating future demand. It can also insulate companies from the price-promotion race to the bottom, by encouraging consideration based on brand values rather than cost.

However, as brand effects tend to be more complex and costly to measure, the research suggests companies will have to commit more resources to the area if they’re to see long-term results. 

What is certain is that 170 metrics is too many. Marketing departments are going to have to rationalise their KPIs to find those that give a true picture of the state of their business, and reduce their dependency on the quick, reassuring hit of sky-high but useless campaign effects. 

Sinclair suggests that marketers stop using vanity metrics as a proxy for a scientific approach to measurement and actually understand what it means to gain real insight. “We’re commercial people within the organisation so we need to get comfortable with data sets, how they link to pricing and so on. I’m joined at the hip with the CFO.”

There is already something of a market correction. Famously, challenger bank Starling pulled its ads from Facebook and Instagram over privacy concerns, yet several weeks later reported that the move “hasn’t caused a noticeable decline”.

Irving believes recent moves to retire the third-party cookie could be a step in the right direction for refocusing that conversation. “By default, business metrics [will] come more to the fore than they may have done in recent years and the data we need to use to make decisions [will] become more reliable.”

For marketers and their colleagues in finance, sales and even the C-suite, it may feel like there’s a need to brush up on the basics. To this end, the DMA, in partnership with the DMA Media Council, has recently launched a Marketing Framework 101. This will provide best-practice guidance on how to measure business outcomes and understand the impacts of marketing across different timescales.

The framework will also include a glossary to define terms around measurement so that everyone is at least using the same terminology. Siloed media planning creates siloed media measurement and distorts marketing effectiveness. The report suggests that it leaves marketers trying to justify marketing spend “with one hand tied behind their backs”. 

The glossary, the DMA hopes, will help organisations understand what is meant by various terms across the media measurement process and ‘de-silo’ the process of measurement. Armed with a more rigorous approach to effectiveness, initiatives that seek to prove their successes on the awards circuit may, in future, truly be worthy of a gong.