Incrementality Testing: The Fundamentals

May 22, 2025

Let’s hit pause for a moment. Set aside what you’re working on, tune out any distractions, then answer this question: In this very moment, what is your absolute #1 goal as a marketer?

Chances are, the answer came to you quickly. If you said “Improve overall marketing efficiency,” you certainly aren’t alone. In our recent industry survey, 61% of respondents said the same thing. You also aren’t alone if you said growth in ecommerce (36%), growth in retail (21%), or growth on Amazon (8%). 

Whether you’re after efficiency or growth (or both), incrementality testing is an increasingly essential tool in the modern marketer’s toolkit. After all, incrementality testing does something 2010s measurement darlings like last-click attribution never could: It measures causation. Is your marketing causing more customers to convert?

Using control groups, incrementality tests isolate the actual impact of your campaigns — and filter out the customers who would have converted anyway. While that might sound technical, it’s a surprisingly intuitive process once you grasp the fundamentals — so let’s dive in. 

What is incrementality testing?

Incrementality testing uses a control group to isolate and measure the causal impact of your marketing campaign. 

What’s a control group? It’s just a portion of your audience that sees no ad campaign. So you might compare conversion rates among the 80% of people who saw your ad (test group) versus the 20% who didn’t (control group). If those who saw your ad convert at a higher rate, your marketing campaign is incremental. 
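To make that 80/20 read-out concrete, here's a minimal Python sketch. The audience sizes and conversion counts are made-up illustration numbers, not benchmarks:

```python
# Minimal sketch of an incrementality read-out.
# All figures below are hypothetical.

def incremental_lift(test_conversions, test_size, control_conversions, control_size):
    """Compare conversion rates between the ad-exposed (test) group
    and the unexposed (control) group."""
    test_rate = test_conversions / test_size
    control_rate = control_conversions / control_size
    absolute_lift = test_rate - control_rate       # extra conversions per person
    relative_lift = absolute_lift / control_rate   # lift as a % of baseline
    return absolute_lift, relative_lift

# 80% of the audience saw the ad, 20% did not.
abs_lift, rel_lift = incremental_lift(
    test_conversions=2_400, test_size=80_000,
    control_conversions=500, control_size=20_000,
)
print(f"absolute lift: {abs_lift:.3%}, relative lift: {rel_lift:.1%}")
# prints: absolute lift: 0.500%, relative lift: 20.0%
```

If the test group converts at 3% and the control at 2.5%, the campaign drove a 20% relative lift; the remaining 2.5% would have converted anyway.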

Incrementality testing functions much like a randomized controlled trial in drug development: a pharmaceutical company gives its drug to one group of subjects and a placebo to the rest to test the drug’s true effectiveness.

For more on the basics of incrementality testing, visit Incrementality School. Nope, no homework at this school — just snackable info about the ins and outs of incrementality, designed by Haus experts.

Causality: the heart of incrementality testing

Traditional measurement tools like last-click or multi-touch attribution look at patterns in your data and assign credit. Naturally, a lot of what they measure is statistical noise. After all, a spike in conversions could happen for various reasons — seasonality, channel overlap, or external events. 

But if you accidentally attribute that spike to your new campaign, then decide to double down…you’re lighting ad dollars on fire. 

“If you don’t have a sense of the incrementality of your channels, you’re likely spending a lot of money that is not driving any actual value for the company,” says Haus Measurement Strategist Nick Doren. “You’re paying for things that are already likely to convert anyway, so you’re hurting the bottom line of the company.”

Incrementality testing measures causal impact so that you can invest confidently in channels that drive real results. Working with an incrementality partner who takes test design seriously only elevates that confidence. That might mean using synthetic controls, as Haus does, so that your control group mirrors your treatment group more exactly.

Wondering which incrementality partner is right for your brand? Our incrementality testing buyer’s guide offers some useful guidance. 

Incrementality vs. A/B testing: What’s the difference?

If you’re a marketer, you’re probably familiar with A/B testing. It measures performance between two variants of the same experience — two landing pages, two email subject lines, etc. 

A/B testing is a trusted tool for many marketers — but it shouldn’t be confused with incrementality testing. Incrementality testing differs by measuring whether the experience or campaign had any effect. It does so by comparing performance with your campaign versus performance without your campaign. 

|  | Incrementality Testing | A/B Testing |
| --- | --- | --- |
| Primary purpose | Measures causal lift | Compares two creative variants to see which one resonates best |
| Key question | “Would this conversion have happened without this treatment?” | “Which variant of this treatment drives more conversions?” |
| Scope & structure | Tests full campaigns or channels via treatment vs. holdout | Tests specific elements — both groups see an experience, just with different versions |
| Complexity & cost | Requires bigger samples, deeper analysis, and data science expertise | More DIY — faster, cheaper, ideal for quick creative tweaks |
| Limitations & attribution | Sidesteps flawed last-click models and measures pure lift | Often relies on platform attribution, so you see relative wins, not net-new impact |

While these methods are different, they’re complementary. You might use incrementality testing to figure out whether Meta ASC moves the needle for your brand, then use A/B testing to pinpoint the more effective Meta campaign creative. For more on how A/B testing and incrementality testing differ (but work together), check out our deep dive on the topic.

Questions that incrementality testing can answer

Incrementality testing shines when you need clear answers about campaign performance. Here are a few common use cases:

Look for an incrementality partner with talented growth experts who can help you get the most out of your measurement strategy. 

“We’ll push our customers,” says Haus Measurement Strategist Alyssa Francis. “We’re not afraid to tell them straight up, ‘That won’t be a valuable learning for you — we recommend this instead.’ We find things really click when you run interesting tests.”

At Haus, brands are empowered to push the status quo and answer the big questions — then they use those insights to improve the bottom line.

Ready to see Haus in action?

Discover how Haus can help you drive incremental returns on your investments

Get A Demo

Distinct ad campaigns require distinct incrementality tests

When designing an experiment, you must decide how many variables you want to test and whether or not you need a control group. The nature of the question you’re trying to answer will determine these structural details. 

Let’s dive into different structures for holdout tests on the Haus platform.

2-cell experiment with a control group

What it does: Offers a controlled environment for assessing the causal impact of a specific marketing treatment by comparing it against a control group

When to use it: To measure the impact of your ads on a key channel compared to a group receiving no ads

Example: Ritual ran a series of 2-cell tests with a holdout when testing TikTok as a new channel

2-cell experiment without a control group

What it does: Essentially an A/B test in which two variables are tested against one another. You’ll learn whether cell A or cell B performs better (but won’t measure incrementality)

When to use it: You already know the channel is incremental, and you want to optimize its performance

Example: Testing two different types of creative for your new Meta campaign

3-cell experiment with a control group

What it does: Two treatments are tested against a control to measure which treatment is most incremental (while the control measures the incrementality of each treatment)

When to use it: You’re trying to determine optimal spend level or want to compare tactics to determine which is the most incremental

Example: Caraway ran a 3-cell test with a holdout on Google to see whether PMax with Branded or PMax without Branded was more incremental

3-cell experiment without a control group

What it does: Compares three variables against one another to learn which performs best. Doesn’t measure the incrementality of these three variables since a control is required to identify business lift

When to use it: You want to know which treatment is the most impactful or better understand the diminishing marginal returns curve

Example: FanDuel ran a 3-cell without a holdout to determine whether they had room to spend more on YouTube efficiently or if they would hit their diminishing marginal returns curve

The typical process for incrementality testing

The incrementality testing journey often starts with a question. This “question” is more aptly known as your hypothesis. For instance, “We think our audience is on TikTok and feel it could be an effective channel for us” is a clear hypothesis. You can then design and run an incrementality experiment to test this hypothesis.

Here’s how the process typically breaks down:

  1. Define the hypothesis. What exactly do you want to know? This might mean confirming a hunch or finally getting clarity on a gray area that’s confused your team for a while.

  2. Design the test. Considering your business goals and challenges, you’ll decide how long to run your test, how big of a holdout you’ll need, and how many treatment and control groups you need. For instance, a diminishing returns test often calls for three groups, while a new channel test often calls for two.

  3. Run the experiment. Your test will be most effective if you work with a partner with a dedicated team of econometricians and machine learning experts who can clean your data thoroughly and use the latest frontier methods in causal inference to isolate variables precisely.

  4. Measure the lift. The fun part is diving into results and finally getting answers to those nagging questions. Most crucially, you’ll want to look at incremental lift, which compares conversion rates in your treatment group against those in your control group.

  5. Act on the results. With your results, you can confidently reallocate budget, kill ineffective tactics, and scale what’s working. (We were wrong. This is the fun part.)
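As a back-of-the-envelope illustration of step 4, here's how you might compute the lift along with a rough confidence interval using a simple normal approximation. Real geo experiments rely on more sophisticated causal inference, and every number here is hypothetical:

```python
import math

def lift_with_confidence(test_conv, test_n, ctrl_conv, ctrl_n, z=1.96):
    """Absolute lift between treatment and control conversion rates,
    with an approximate 95% confidence interval (normal approximation)."""
    p_t, p_c = test_conv / test_n, ctrl_conv / ctrl_n
    lift = p_t - p_c
    # Standard error of the difference between two proportions.
    se = math.sqrt(p_t * (1 - p_t) / test_n + p_c * (1 - p_c) / ctrl_n)
    return lift, (lift - z * se, lift + z * se)

lift, (lo, hi) = lift_with_confidence(2_400, 80_000, 500, 20_000)
print(f"lift {lift:.3%}, 95% CI [{lo:.3%}, {hi:.3%}]")
```

If the interval excludes zero, you have reasonable evidence the campaign is incremental; if it straddles zero, you may need a bigger holdout or a longer test.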

Example: Testing Google PMax across three cells

Hypothesis: A DTC brand wanted to understand how effective Google’s Performance Max (PMax) campaigns were — and whether increasing spend would drive incremental lift.

Design: They ran a 3-cell experiment:

  • Cell A (Control): No PMax spend.
  • Cell B (Baseline): Current PMax budget.
  • Cell C (High spend): 50% higher PMax budget.

Result:

  • Cell B delivered a moderate lift over Cell A, proving PMax was adding value.
  • Cell C showed a stronger lift, but with diminishing returns starting to appear.

The test confirmed that PMax was incremental — but also highlighted the sweet spot for spend. By scaling judiciously, the brand avoided wasting dollars on the tail end of the curve.

The impact? They reallocated budget to hit that performance sweet spot — boosting overall efficiency without overspending.
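Here's a hypothetical sketch of that 3-cell read-out in Python. The spend and conversion figures are invented, but they show how marginal efficiency can fall between cells B and C:

```python
# Hypothetical read-out of a 3-cell PMax test.
# Spend and conversion numbers are invented for illustration.

cells = {
    "A (control, no PMax)": {"spend": 0,      "conversions": 10_000},
    "B (baseline budget)":  {"spend": 50_000, "conversions": 11_500},
    "C (+50% budget)":      {"spend": 75_000, "conversions": 12_100},
}

baseline = cells["A (control, no PMax)"]["conversions"]
prev_spend, prev_conv = 0, baseline
for name, cell in cells.items():
    if cell["spend"] == 0:
        continue
    incremental = cell["conversions"] - baseline  # lift vs. the holdout
    # Conversions gained per extra dollar relative to the previous cell.
    marginal = (cell["conversions"] - prev_conv) / (cell["spend"] - prev_spend)
    print(f"{name}: +{incremental} incremental conversions, "
          f"{marginal:.4f} conversions per marginal dollar")
    prev_spend, prev_conv = cell["spend"], cell["conversions"]
```

In this made-up data, cell C still lifts conversions, but each marginal dollar buys fewer of them than in cell B (0.024 vs. 0.030) — the diminishing returns curve starting to bite.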

Practical results require a clear experiment design

Not all incrementality testing tools are created equal. Some rely on black-box attribution or dated methods — so setup drags on and your results stay murky.

When you’re evaluating platforms, look for:

How Haus designs incrementality tests

At Haus, we build tests around synthetic control: an advanced modeling method that constructs control groups whose characteristics nearly mirror your test group’s. This leads to results that are 4x more precise than those produced by matched-market tests.
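For intuition, here's a toy version of the synthetic-control idea: fit weights so that a blend of untreated "donor" geos tracks the treated geo's pre-period sales, then use that blend as the counterfactual during the test. The data and the plain least-squares fit below are illustrative only; production systems fit far richer models:

```python
# Toy illustration of the synthetic-control idea.
# Everything here is invented for intuition.
import numpy as np

# Weekly pre-period sales for three untreated "donor" geos (rows = weeks).
donors_pre = np.array([
    [100.0,  90.0, 110.0],
    [102.0,  95.0, 108.0],
    [ 98.0,  93.0, 112.0],
    [101.0,  91.0, 109.0],
])
# Pre-period sales for the treated geo. (Constructed so that a
# 0.5/0.3/0.2 blend of the donors matches it exactly.)
treated_pre = donors_pre @ np.array([0.5, 0.3, 0.2])

# Fit donor weights on the pre-period with ordinary least squares.
weights, *_ = np.linalg.lstsq(donors_pre, treated_pre, rcond=None)
print(np.round(weights, 3))  # recovers the 0.5 / 0.3 / 0.2 blend

# During the test window, the weighted donor blend is the counterfactual;
# observed treated-geo sales minus this blend is the incremental lift.
donors_test = np.array([[103.0, 92.0, 111.0], [99.0, 94.0, 107.0]])
counterfactual = donors_test @ weights
```

The appeal over a single matched market is that a weighted blend of many geos can track the treated geo far more closely than any one lookalike market.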

Haus isn’t just the most precise solution. It’s also a true incrementality partner. This means you’ll work with a dedicated team of Measurement Strategists who help you make the most of your learnings, then turn them into improved business outcomes.

Ready to see it in action? Check out our incrementality use cases or get started with our step-by-step guide.

Frequently Asked Questions about Incrementality Testing

What is incrementality testing, and how does it use a control group to measure campaign impact?

Incrementality testing measures the causal impact of a marketing campaign by comparing conversion rates between a test group that sees the ads and a control group that does not, isolating the lift driven by the campaign.

Why is causality — the ability to measure true lift — important compared to traditional attribution methods like last-click?

Traditional attribution assigns credit based on correlation and often misattributes seasonal spikes or overlapping channels, whereas incrementality testing filters out conversions that would have happened anyway, preventing wasted ad spend and revealing true performance.

How do 2-cell and 3-cell experiment structures differ, and when would you use each on the Haus platform?

A 2-cell test with a control group compares one treatment to no ads (ideal for new channel validation), while a 3-cell test with a control lets you compare two treatment levels against a holdout (perfect for testing spend levels or tactic comparisons).

What are the main steps of a typical incrementality testing process from start to finish?

First you define a clear hypothesis, then design the test (holdout size, duration, cells), run the experiment using robust data-cleaning and modeling, measure the incremental lift, and finally reallocate budget or optimize tactics based on the results.

