A major advertiser nearly cut branded search. An incrementality experiment proved it was driving over £1B in revenue through a conversion path attribution had missed entirely. The principle: measure twice before you cut once.
Case Study - 7 min read - MASS Analytics
There’s a principle borrowed from carpentry that applies with precision to marketing: measure twice, cut once. Before you remove something, a beam or a budget line, you want to be certain of what it’s doing. Some things look removable until you take them out and the ceiling falls in.
Branded search is one of marketing’s most reliably misunderstood channels on the question of incrementality. In attribution models it tends to look impressive: high last-click conversion rates, strong ROAS reported by the platform. But attribution measures the last step of a journey, not whether the journey would have happened without the ad. The real incrementality question is: are these customers buying because of our search ad, or would they have found us anyway?
The Decision That Almost Happened
A Fortune 100 technology company was running over £70 million in annual branded search spend. The platform numbers looked solid. But internal pressure to cut costs had put the channel under review. The question: is this spend genuinely incremental, or are we paying to capture customers who were already coming?
It’s exactly the right incrementality question to ask. And it has exactly one reliable method for answering it: a controlled experiment.
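The arithmetic behind such an experiment is simple: compare conversions in regions that saw the ads against matched holdout regions that did not. A minimal sketch, with entirely hypothetical numbers (the article does not disclose the company's actual figures), might look like this:

```python
# Illustrative only: computing incremental lift from a simple holdout test.
# All numbers below are hypothetical; a real experiment also needs power
# analysis and significance testing before any budget decision.

def incremental_lift(conv_treatment, conv_control, spend):
    """Return incremental conversions, lift vs control (%), and
    cost per incremental conversion."""
    incremental = conv_treatment - conv_control
    lift_pct = incremental / conv_control * 100
    cpic = spend / incremental if incremental > 0 else float("inf")
    return incremental, lift_pct, cpic

# Treatment regions see the branded search ads; matched controls do not.
inc, lift, cpic = incremental_lift(conv_treatment=12_000,
                                   conv_control=9_000,
                                   spend=150_000.0)
print(f"Incremental conversions: {inc}")                 # 3000
print(f"Lift vs control: {lift:.1f}%")                   # 33.3%
print(f"Cost per incremental conversion: £{cpic:.2f}")   # £50.00
```

If the control group converts nearly as often as the treatment group, the channel is capturing demand that already existed; a large gap is evidence the spend is genuinely incremental.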
Why Attribution Missed It
Platform attribution measures the digital journey from ad impression to online conversion. It cannot see offline conversions: phone calls, in-store visits, or purchases made on a different device. Branded search was triggering behaviour that completed outside the digital funnel entirely. From attribution’s perspective, the channel looked marginal. The incrementality experiment revealed it as critical.
This is the structural limitation that incrementality measurement is designed to address. The question isn’t whether attribution is doing its job, because it is. The question is whether the conversions it tracks represent the full causal story. For branded search at this company, they represented a small fraction of it.
The real risk is not measuring incrementality. It’s making budget decisions on channels you don’t understand, and cutting the ones that were genuinely driving growth.
A Second Example: Proving MMM Recommendations Work
Incrementality experiments aren’t only used to evaluate individual channels. They’re also a powerful tool for validating whether MMM-driven optimisation recommendations actually deliver in the real world, independent of the team that built the model.
One of our clients, a major retailer, ran exactly this kind of validation. They implemented a structured A/B test across their store network: 10% of locations adopted an MMM-optimised marketing strategy; 90% continued with their usual approach. Before the experiment, the cost difference between the two groups was negligible. After the optimised strategy was implemented, the difference shifted significantly in the treatment group’s favour: lower costs with maintained performance.
- 10% of the store network ran the MMM-optimised strategy (the treatment group)
- A clear cost differential emerged in favour of the treatment group after optimisation
- Independent validation: the experiment confirmed the MMM’s recommendations worked in practice
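The logic of this before/after, treatment/control comparison is a textbook difference-in-differences. A minimal sketch with invented figures (the article reports only that the pre-period gap was negligible and a clear gap emerged afterwards):

```python
# A minimal difference-in-differences sketch of the store-network test.
# The per-store cost figures below are hypothetical, for illustration only.

def diff_in_diff(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Change in treatment-group cost net of the change in control-group
    cost over the same period."""
    return (treat_post - treat_pre) - (ctrl_post - ctrl_pre)

# Average marketing cost per store, before and after the strategy switch.
effect = diff_in_diff(treat_pre=100.0, treat_post=88.0,
                      ctrl_pre=100.0, ctrl_post=99.0)
print(f"Cost effect of MMM-optimised strategy: {effect:+.1f} per store")
# A negative value means treatment-group costs fell relative to control.
```

Subtracting the control group’s change strips out market-wide movement (seasonality, macro shocks) that would otherwise be mistaken for an effect of the new strategy.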
The lesson from both examples is the same: incrementality measurement removes the guesswork from high-stakes decisions. Whether you’re evaluating a channel for cuts or validating an optimisation strategy, the experiment provides the causal evidence that model outputs and platform reporting simply cannot.
Channels where incrementality experiments reveal what attribution misses
- Branded search — capturing offline conversions and cross-device completion paths
- TV and connected TV — brand equity effects and delayed purchase decisions
- Radio and audio — in-store and phone behaviour with no digital trace
- Out-of-home — awareness effects invisible to any digital funnel
- Retail media — halo effects on in-store purchase beyond the platform’s reporting
