The third day of the ATTRIBUTION & ANALYTICS ACCELERATOR, a conference organized by the Advertising Research Foundation, revolved around the deprecation of cookies and the loss of identity solutions. Presenters from the USA, Australia, and the UK took turns showcasing the variety of solutions that helped them deliver actionable results without relying on cookies.

Jim Spaeth, from Sequent Partners, kicked off the day by reminding the audience that the first two days had focused on how attribution and advanced analytics have expanded to measure the marketing ecosystem and estimate the value of advertising more completely. He then highlighted how badly that expansion was needed, and how the first two days had also served as a demonstration of the progress the industry has made in that regard.

Despite the variety of solutions offered that day, there was a consensus that could be summed up by what Dan Eadon said during his presentation: “There should not be fear of cookies disappearing, we should run towards it with modern prediction machines.” Dr. Ramla Jarrar from MASS Analytics built on this statement during the panel discussion, adding that the other struggle the attribution industry needed to face head-on was data standards.

The March to Analytics Maturity

The first presentation of the day, “The March to Analytics Maturity”, did not directly address the issue at hand; instead, it paved the way perfectly by asking how to take analytics from being a science experiment and integrate it into planning, brand strategy, and media decisions, suggesting the need for a level of maturity that allows the industry to be resilient in the face of cookies’ disappearance. Neustar’s Marc Vermut suggested that achieving this level of maturity requires a change of culture. He added that the key elements of analytics excellence include not just data and analytics, but also people, culture, and processes.

More Realistic Assumptions

Dan Eadon, from Optus, led the next presentation, “Predicting Audience Exposure From Privacy Screened Google ADH Data For Cross-Channel Media Impact Measurement”, and built it around a juxtaposition of the simplicity of consumer behavior in the 1950s and the complexity it has since developed with the evolution of media and the emergence of digital channels. Based on this comparison, he stressed that it is unrealistic to assume that exposure is the same across consumers, journeys, and contexts. He also argued that it is unrealistic to assume that all tactics within a medium are equally effective, or that consumers respond the same way across journeys and contexts. The final assumption he deemed unrealistic was that user-level data is representative. To address these issues, he suggested using mixed levels of data (aggregate and panel) in different models, in a way that captures the heterogeneous impacts of different touchpoints.
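The talk did not share implementation details, but the idea of fitting separate models at the aggregate and panel levels, each capturing a different slice of touchpoint impact, can be sketched roughly as follows. All data, channel names, and effect sizes here are invented for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)

# --- Aggregate level: weekly channel spend vs. total sales ---
weeks = 104
spend = rng.uniform(0, 100, size=(weeks, 2))            # e.g. TV, digital
sales = 500 + spend @ np.array([2.0, 3.5]) + rng.normal(0, 20, weeks)

X_agg = np.column_stack([np.ones(weeks), spend])
beta_agg, *_ = np.linalg.lstsq(X_agg, sales, rcond=None)

# --- Panel level: individual exposure counts vs. purchase ---
n = 5000
exposures = rng.poisson(lam=[1.5, 3.0], size=(n, 2))    # exposures per person
logit = -2.0 + exposures @ np.array([0.3, 0.15])
purchase = rng.random(n) < 1 / (1 + np.exp(-logit))

# Linear probability model as a simple stand-in for logistic regression
X_pan = np.column_stack([np.ones(n), exposures])
beta_pan, *_ = np.linalg.lstsq(X_pan, purchase.astype(float), rcond=None)

print("aggregate channel effects:", np.round(beta_agg[1:], 2))
print("panel exposure effects:   ", np.round(beta_pan[1:], 4))
```

Each model answers a different question (how spend moves total sales vs. how repeated exposure moves an individual's purchase probability), which is one way to capture heterogeneous touchpoint impacts without user-level identity data.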

CMM is the New Recipe

The next presentation, “Cookies Are Crumbling, is CMM the New Recipe?”, followed suit in that the presenter, Dr. Ramla Jarrar, acknowledged the complexity of consumer behavior and the need to account for it in ways that do not threaten consumers’ privacy. She introduced the audience to Consumer Marketing Mix (CMM), explaining that it is “a form of Marketing Mix Modeling that has the customer as a core unit measurement instead of total sales”. She added that CMM enabled MASS Analytics to build more targeted marketing and media plans, which in turn yielded more actionable insights for the brands they model. The case study Dr. Jarrar presented showcased how they managed to isolate an issue one of their clients was facing by creating segments based on consumer behavior and building a separate model for each of them.
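Dr. Jarrar did not publish the model itself, but the core idea of fitting one model per behavioral segment rather than a single model on total sales can be sketched as below; the segment names, channels, and effect sizes are all hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
weeks = 104
spend = rng.uniform(0, 100, size=(weeks, 2))  # two hypothetical media channels

# Invented segments with different responsiveness to each channel
true_effects = {"deal_seekers": [3.0, 0.5], "brand_loyals": [0.5, 2.0]}

models = {}
for segment, effects in true_effects.items():
    # Simulated weekly sales for this segment
    sales = 200 + spend @ np.array(effects) + rng.normal(0, 15, weeks)
    X = np.column_stack([np.ones(weeks), spend])
    beta, *_ = np.linalg.lstsq(X, sales, rcond=None)
    models[segment] = beta[1:]  # estimated per-channel effect for this segment

for segment, beta in models.items():
    print(segment, "channel effects:", np.round(beta, 2))
```

A single model on total sales would average these two response patterns together; fitting per segment is what lets the modeler see that each channel works on a different audience, which is the kind of isolation the case study described.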

Granularity is the Silver Lining

Similar to what preceded them, the last two presentations of the day, “Back In The Mix – Marketing Mix Modeling Evolved For Tactical Advantage In A Digitally Fragmented World” and “The Silver Lining To The Crumbling Cookie: Turning To Better Means Of Measurement”, had at their core the key phrases “a need for granularity” and “cookies are crumbling”. Tim Ferris and Keith Wulinsky focused on the evolution of Marketing Mix Modeling toward more granularity to “yield untapped opportunities within marketing plans”. They argued that speed, and granularity of media and geography, are paramount to a successful granular mix, citing the example of Dunkin’ breaking the TV channel down into smaller elements like cable and broadcast. DISQO’s Stephen Jepson, on the other hand, focused on providing a new source of granularity after the disappearance of cookies. He introduced the audience to what he described as one of the US’s largest consumer audiences: a solution through which customers agreed to share not only their opinions, but also their ad exposure data and consumer journeys. Jepson added that they did not just run traditional brand lift studies; they combined those with control-vs-exposed lift, search behavior, site visitation, and shopping behavior. This allowed their client Nestle to fill in the gaps in cross-media and cross-channel measurement.
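The arithmetic behind a control-vs-exposed lift measurement is simple: compare the conversion rate of people who saw the ad with that of a matched control group. The sketch below uses made-up counts, not any DISQO or Nestle figures:

```python
# Hypothetical counts: conversions among ad-exposed vs. matched control users
exposed_users, exposed_conv = 10_000, 520
control_users, control_conv = 10_000, 400

exposed_rate = exposed_conv / exposed_users
control_rate = control_conv / control_users

# Relative lift: how much more often exposed users converted than controls
lift = exposed_rate / control_rate - 1

print(f"exposed rate {exposed_rate:.2%}, "
      f"control rate {control_rate:.2%}, lift {lift:.1%}")
```

With these invented numbers the exposed group converts at 5.2% versus 4.0% for the control, a relative lift of 30%; in practice the same comparison can be extended to search, site visitation, and shopping behavior as described in the talk.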


The third day of the ATTRIBUTION & ANALYTICS ACCELERATOR conference was a day of solutions: the speakers all agreed that the measurement industry should not tiptoe around the threat of cookies’ decay, and they certainly did not lean toward overly complicated alternatives like Clean Rooms. Instead, they chose to showcase how independent the measurement industry can be from the inevitable fate of cookies. That independence takes the form of an evolved Marketing Mix Modeling based on granularity, automation through ML, and calibration through experiment analysis.