By Pravat Dash, VP Analytics, Ipsos MMA

Challenges and Opportunities

Marketing measurement focuses on quantifying the incremental effect of marketing and related commercial activities on business results (in terms of revenue, profit, units, or share) and on upper-funnel measures like brand awareness or search activity. Two techniques are typically utilized: Marketing Mix Modeling (MMM), an econometric approach, and multi-touch attribution (MTA).

These techniques are most effective when combined. Marketing Mix Modeling is well suited to measuring and validating the incrementality of each channel (including non-marketing factors that influence consumer decisions). MTA can deliver highly granular insights within specific marketing channels or targeting platforms. While MMM and MTA are sometimes seen as separate, disconnected analytic solutions, using them as disconnected capabilities too often produces inaccurate insights: separating the two introduces omitted-variable bias, in which some activities receive more credit than they deserve while others receive too little.
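
To make the omitted-variable point concrete, the short simulation below (a minimal sketch using NumPy and statsmodels, with entirely made-up data) shows how leaving a correlated driver such as TV out of a model inflates the credit given to the channel that remains, such as paid search.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 200

tv = rng.gamma(shape=2.0, scale=50.0, size=n)       # simulated weekly TV spend
search = 0.5 * tv + rng.normal(0.0, 10.0, size=n)   # search activity partly driven by TV
sales = 100 + 0.8 * tv + 0.3 * search + rng.normal(0.0, 20.0, size=n)

# Correctly specified model: both drivers included
full = sm.OLS(sales, sm.add_constant(np.column_stack([tv, search]))).fit()

# Mis-specified model: TV omitted, so search absorbs credit it does not deserve
biased = sm.OLS(sales, sm.add_constant(search)).fit()

print(f"search effect, TV included: {full.params[2]:.2f}")   # close to the true 0.3
print(f"search effect, TV omitted:  {biased.params[1]:.2f}")  # inflated by TV's omitted effect
```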

Combining MMM and MTA for a Unified Approach

The answer lies in their synergies. Unified Marketing and Commercial Measurement combines MMM with Agile Attribution capabilities. This unified approach pairs econometric modeling with AI/ML to create a transparent, integrated, predictive view of how marketing will drive incremental sales performance. Marketers gain the stability of a Marketing Mix Model and its ability to measure incrementality, along with granular, daily, in-platform reads from MTA (or Agile Attribution).

The analytics embedded in an integrated MMM and MTA (Agile Attribution) capability rely on advanced regression-based techniques, which makes them dependent on high-quality, low-latency data to realize their combined benefits. That dependency begins with the data inputs to both. This article focuses on the requirements for establishing a foundation for accurate measurement through data acquisition, integration, governance, and compliance programs.

Data Driven & Fact-based Decision-Making

Everyone has heard the phrase “data-driven, fact-based decision-making.” Establishing a marketing measurement program often requires a shift in philosophy to bring that phrase to life. In short, “what got you here might not get you there” in today’s data-driven environment. Many companies and executives got to where they are without leveraging data-driven, statistically derived insights, because the depth and accuracy available today simply did not exist then. But make no mistake, they do now.

Today, if you are not leveraging data-driven analytics to make better decisions, you are at a competitive disadvantage. Yet even in today’s increasingly data- and analytics-driven world, not everyone takes the time to embrace these advantages, let alone identify the underlying data requirements. Those requirements include ensuring the necessary data exists and that it can be organized and harmonized to be sufficiently holistic, reliable, fast, and high quality to power decisions at the speed the organization needs to activate them.

When it comes to leveraging data and analytics to create a competitive advantage, the phrase “garbage in, garbage out” applies. Every day we read in the news about financial and competitive advantages achieved by combining predictive analytics with AI/ML to improve decision-making. However, without an effective data strategy supporting the identification, cleansing, integration, and management of your data, building a marketing analytics stack and its supporting solutions will be challenging at best. All of this needs to be thought through, constructed, and mapped so that the data you collect aligns with the problems you are trying to solve. Simply dumping everything into a database or data lake, without regard to how it will be used, rarely solves the problem.

Attributes of Data Management for Marketing Measurement

Marketing measurement programs use advanced algorithmic techniques to explain the effects of changing a set of “independent variables” (TV, digital, radio, print, newspaper, price, promotion, etc.) on one or more “dependent variables” (sales, revenue, market share, etc.). Independent variables may also be called “predictors” or “explanatory variables.” They represent market stimuli that may influence consumer demand for your products or services. Dependent variables represent the KPI (key performance indicator) or business outcome you use to measure organizational performance.
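
As a simple illustration of this framing, the sketch below fits an ordinary least squares model of a dependent variable on several independent variables. The file name and column names are hypothetical, and pandas and statsmodels are assumed to be available; this is a stylized example, not a production MMM specification.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical weekly dataset with one row per week
df = pd.read_csv("weekly_brand_data.csv")

# Dependent variable (sales) regressed on independent variables (media, price, promotion)
model = smf.ols(
    "sales ~ tv_spend + digital_spend + radio_spend + price + promo_flag",
    data=df,
).fit()

print(model.summary())  # each coefficient estimates that driver's incremental effect
```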

Coverage

Successful measurement programs determine what the most relevant dependent and independent variables need to be. The dependent variable is usually tied to how an organization measures short and long-term performance. Short-term performance is typically straightforward as low-latency and high-quality data are normally available to quantify business performance. It is also important to include sufficient upper funnel KPI data, such as brand awareness, if that data is available, to measure the impact of brand on loyalty and longer-term sales.

The goal of setting independent variables is to quantify the key factors affecting consumer demand. That said, coverage of all factors is an unachievable objective in most cases given the infinite number of influences that exist. So, the key is to capture those that are material to measuring marketing investment. While no data is perfect, your predictive analytics solution provider should be able to account for and control for missing data.

However, the more coverage you have, the more closely the predicted results will track actual outcomes. The relevant factors vary by industry sector. Paid, owned, and earned media touchpoints; direct marketing channels; pricing and promotional activity; distribution; product lifecycle and innovation; event marketing; salesforce; competitive signals; and exogenous effects such as weather, macroeconomic conditions, and category trends are all variables that consistently need to be considered in the marketing mix.

When starting an engagement, erring on the side of “more vs. less” in terms of relevant data sources is the best option. During the testing and pre-modeling phase, variables that are less relevant in terms of investment and incrementality fall out, which helps ensure both accuracy and speed of delivery through efficient management of data inventories and of relationships with third-party data partners.

Granularity

Granularity refers to the level of detail at which data is provided, including periodicity, the time granularity of the data. Best-in-class measurement uses daypart or daily data, or event-level data streams (transactions, click-streams, etc.), though event-level streams come with some caveats. Monthly data sources may be acceptable, especially for slower-moving signals, but that level of aggregation tends to hide the variation that time-series analysis fundamentally requires. Less granular data yields less detailed results relative to the business questions that can be addressed.
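
The sketch below, using simulated daily data and pandas, illustrates how monthly aggregation smooths away the very variation a time-series model needs; the numbers are illustrative only.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(7)
days = pd.date_range("2024-01-01", periods=180, freq="D")

# Simulated daily sales with a weekly cycle plus noise
daily = pd.Series(1000 + 300 * np.sin(2 * np.pi * np.arange(180) / 7)
                  + rng.normal(0, 80, 180),
                  index=days, name="sales")

# The same series summarised as monthly averages
monthly = daily.resample("MS").mean()

print(f"daily std deviation:           {daily.std():.0f}")
print(f"monthly-average std deviation: {monthly.std():.0f}")  # most of the variation is gone
```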

Other important data dimensions (product, geography, target, placement attributes, etc.), and granularity requirements are driven by the following factors:

  • Key business questions
  • Availability of standardized platform and system feeds
  • Consistency of taxonomy naming conventions for each source, particularly for media placement attributes

More granular data enables more detailed and specific insights. As noted earlier, the granularity of insight is tied to the granularity of the available data, so the ideal approach is to build out the data platform from the bottom up. This supports insight and value creation at both the strategic and tactical levels. Sometimes, due to data origination or metadata coding issues, sourcing data at a granular level involves a significant trade-off in the latency, frequency, or quality of the data stream. This can add fragility to the measurement system, which needs to be understood and evaluated as data is sourced. The associated costs also need to be weighed against the projected incremental value that will be achieved.

Latency

Data latency refers to the time it takes for data to be processed and made available for use after it has been captured. Latency is affected both by how often data is updated for processing and by when the data is provided.

When the data is provided can have a material impact on its recency and on how relevant it will be once it is processed and made available. If you have a source with daily periodicity, can that source be updated every day? If so, how many hours, minutes, seconds, or milliseconds will it take? Give this thoughtful consideration, because that speed may come with a cost. “When do you really need that data?” is the question that optimizes the value equation. Aligning data sources in a way that enables the analytics supporting key business needs and objectives, while producing a clear case for incremental value, is an important element of a data strategy that meets key business goals.
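
One pragmatic way to operationalize this thinking is a simple freshness check that compares each feed’s latest delivery against its agreed cadence. The sketch below is illustrative, with hypothetical source names, dates, and SLA values.

```python
from datetime import date

# Illustrative feed inventory: (source, latest date delivered, agreed maximum lag in days)
feeds = [
    ("paid_search_platform", date(2024, 6, 30), 1),
    ("tv_post_logs",         date(2024, 6, 25), 7),
    ("syndicated_pos",       date(2024, 6, 10), 14),
]

as_of = date(2024, 7, 1)
for name, latest, max_lag in feeds:
    lag = (as_of - latest).days
    status = "OK" if lag <= max_lag else "LATE"
    print(f"{name:22s} lag={lag:2d}d  sla={max_lag:2d}d  {status}")
```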

Usually the source with the highest latency will establish the baseline for the minimum possible latency in measurement. However, there are exceptions to this based on an organization’s measurement needs. Experienced marketing measurement providers will help identify how to manage various levels of data latency to meet business objectives as well as source low-latency proxies for potentially high-latency sources.

Another aspect of solving data latency is identifying how many of your platforms (ad-tech, POS, CRM, data lakes, etc.) are supported for native integration by your marketing measurement provider. Support from your partner can help to reduce the latency and increase the frequency while minimizing the time required from your internal teams and partners.

Quality

Data quality is, as you would expect, critical to successful analytic, measurement, planning and investment optimization programs.

Technical data reviews, assessments, and validations authenticate data pipelines by comparing input to output to ensure there are no errors in the ingestion or processing pipeline. This is the starting point, but given the diversity and volume of data processed today, additional scrutiny is required.
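
A minimal example of such an input-versus-output reconciliation is sketched below; the file names and the “spend” and “date” columns are hypothetical, and the specific checks would be tailored to each pipeline.

```python
import pandas as pd

raw = pd.read_csv("raw_media_feed.csv")               # extract as delivered by the source
processed = pd.read_csv("processed_media_feed.csv")   # output of the ingestion pipeline

checks = {
    "row_count_match": len(raw) == len(processed),
    "spend_total_match": abs(raw["spend"].sum() - processed["spend"].sum()) < 0.01,
    "date_range_match": (raw["date"].min() == processed["date"].min()
                         and raw["date"].max() == processed["date"].max()),
}

failed = [name for name, ok in checks.items() if not ok]
if failed:
    raise ValueError(f"Pipeline validation failed: {failed}")
print("Input and output reconcile on rows, totals, and date range.")
```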

More complex issues must also be addressed, including gaps in data capture, missing data, and the handling of dynamic product categorization, campaign attributes, and placement taxonomies. Implementing sophisticated automation, quality assessments, and comparisons, supported by human review, to identify anomalies and potential quality issues during initial data onboarding is crucial. This should be supplemented by natural language processing (NLP) techniques applied by individuals intimately familiar with the business context so that issues are identified and rectified effectively.
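
As one simple illustration of an automated screen applied during onboarding, the sketch below flags daily values that sit far outside a rolling baseline so a human can review them; the feed name and column names are hypothetical, and real programs would layer on richer rules.

```python
import pandas as pd

# Hypothetical daily spend feed being onboarded
df = pd.read_csv("daily_channel_spend.csv", parse_dates=["date"]).sort_values("date")

# Flag days that sit far outside a rolling 28-day baseline for human review
rolling = df["spend"].rolling(window=28, min_periods=14)
zscore = (df["spend"] - rolling.mean()) / rolling.std()

df["flag_for_review"] = zscore.abs() > 4
print(df.loc[df["flag_for_review"], ["date", "spend"]])  # queue these rows for review
```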

A common theme across clients and industries is the need to validate data that is not actively being used. Even with a sophisticated marketing data management and quality assurance hub, until business users adopt the data, it should be considered ‘at risk.’ Frequency of use and adoption by business users is the essential indicator of source data quality you can trust, and that trust comes through a rigorous process of quality management and proof of value to cross-functional business stakeholders. It is very risky to assume that data sitting in warehouses, lakes, hubs, or any other system not actively consumed by business users is accurate enough for cross-functional business needs, especially when it is being used in analytics.

Historical Sufficiency

To capture market changes such as shifts in consumer preferences, the introduction of new competitors, regulatory impacts, and long-term brand and seasonal effects, two to three years of data is typically required. Experienced providers will help fill in missing history using alternative sources, including historical benchmarks, synthetic and surrogate data, and various imputation techniques.
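
The sketch below illustrates one of the simpler imputation techniques, time-based interpolation of a short gap in a simulated weekly series; longer gaps would typically be filled from benchmarks, surrogate data, or model-based approaches, as noted above.

```python
import numpy as np
import pandas as pd

# Simulated ~3 years of weekly spend with a short reporting gap
idx = pd.date_range("2022-01-03", periods=156, freq="W-MON")
spend = pd.Series(np.random.default_rng(1).gamma(2.0, 50.0, 156), index=idx)
spend.iloc[40:43] = np.nan  # three missing weeks

# Bridge short gaps by time-based interpolation; longer gaps would need other methods
filled = spend.interpolate(method="time", limit=4)
print(f"missing weeks before: {int(spend.isna().sum())}, after: {int(filled.isna().sum())}")
```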

Ensuring that historical data has been collected and recorded consistently over the entire period is also important. Inconsistent data can lead to incorrect conclusions, lowering the model’s reliability and diminishing internal trust and buy-in.

Governance and Compliance

Privacy regulations have become a significant factor in the MMM/MTA industry and can be difficult for inexperienced practitioners to navigate. Regulations vary considerably by market and sector, and even within a market they can be vague in their application. Techniques that avoid collecting individual-level data can materially reduce both the regulatory and compliance burden and the risk to the measurement program.

In addition to compliance and regulatory concerns, individual-level data has emerged as a significant source of data quality and coverage challenges. There has been a lot in the news about the deprecation of third-party cookies and associated digital media trackers and, more recently, about Google giving the third-party cookie an indefinite stay of execution.

However, leaders in the space have been well aware for years that third-party cookies are largely unreliable for measurement purposes outside of some very narrow use cases. The problem is that many browsers and devices (notably Safari on Apple devices) have already blocked these cookies. The result is that cookie-based measurement approaches rely on a small and potentially biased sample that can compromise measurement accuracy.

Alternatively, you can focus on sophisticated methodologies combined with transparent validation processes that provide detailed insights from “micro-aggregate” data sources. This data environment changes daily, so keeping your finger on the pulse and adapting to signal gain or loss in real time is important.

Next Steps for Adopting MMM/MTA Solutions

With the evolution of leading data and analytics platforms’ ability to measure “true incrementality” (versus basic ROI or MROI metrics) in terms of incremental sales, companies are recognizing the need to invest in supporting data strategies and data management platforms. By doing so, integrated MMM/MTA solutions can deliver measurable growth in revenue, brand equity, and share. Getting there means understanding and adhering to the best practices outlined in this article and fostering a culture that implements them.

  1. Prioritize data quality and documentation:
    Implement a comprehensive approach that is understood and bought into across functions. The approach should include integrating and centralizing data from various sources, ensuring accuracy and consistency through rigorous data cleaning processes, and maintaining up-to-date datasets to capture current market trends. Focus on collecting granular data, validating data through cross-verification techniques, and maintaining thorough documentation of data sources and transformations.
  2. Leverage the data science organization to ensure continuous improvement:
    Regularly review data quality metrics and performance, encourage feedback from team members to identify areas for improvement, and implement iterative improvements based on data-driven insights. Invest in training and development programs to ensure that resources are equipped with the latest skills and knowledge in data management. Ensure data practices are aligned with business goals, leading to more effective and accurate marketing mix modeling outcomes.
  3. Partnerships are key:
    Constantly look to identify and leverage data partnerships with external data providers and AdTech and MarTech industry partners to enhance the scope and accuracy of the measurement data. Strive to incorporate diverse datasets that may not be available internally, such as third-party marketing platform data, consumer behavior data, competitive intelligence, and media consumption patterns. Regularly review partnership outcomes to maximize the value derived from these collaborations.
  4. Stay current on privacy trends:
    Actively engage with industry networks and regulatory bodies to receive updates on new laws and best practices. Collaborate with legal and compliance teams to ensure alignment with current regulations to support granular low-latency measurement. Implement privacy-enhancing technologies to maintain adherence to privacy standards.