
Proving True Marketing Impact With Incrementality Testing

  • 3.2x — Incremental ROAS
  • 22% — Organic cannibalization exposed
  • 30% — Budget allocation improvement

2. Context

At Shikho, our Adjust-attributed CPA looked healthy. Meta reported strong ROAS. Google showed efficient cost-per-install. But I had a nagging question: were our paid ads actually driving incremental conversions, or were they just claiming credit for users who would have found us organically anyway?

In Bangladesh's edtech market, no company had ever run an incrementality study. The tools and consultants that exist in Singapore or New York aren't available here. If this was going to happen, I had to design and execute it myself.

3. Approach

Phase 1 — Geo-Holdout Incrementality Test

We split Bangladesh into test and control regions at the division level, paused all paid spend in the control regions for 2–3 weeks while holding everything else constant, and measured the gap between the expected conversion loss (projected from historical performance) and the actual conversion decline. That gap is the true incremental impact of paid media.

Geo-Holdout Test Design

  • Test regions — paid active: normal campaign spend
  • Control regions — paid paused: organic-only baseline
  • Gap between test and control = true incremental impact of paid media
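The arithmetic behind the holdout comparison is simple enough to sketch. All figures below are illustrative placeholders (not Shikho's actual numbers), and the region names and the attributed-conversion count are assumptions for the example:

```python
# Sketch of the geo-holdout math. All figures are illustrative, not real data.

control_regions = {
    # region: (forecast conversions from historical baseline, observed during holdout)
    "region_a": (1000, 840),
    "region_b": (800, 700),
}

forecast = sum(f for f, _ in control_regions.values())   # what we expected organically + paid
observed = sum(o for _, o in control_regions.values())   # what actually happened with paid paused

# Conversions actually lost when paid media was paused = truly incremental
incremental = forecast - observed

# Conversions the attribution tool had credited to paid in these regions (assumed)
attributed_to_paid = 400

# Attributed conversions that would have happened anyway = organic cannibalization
cannibalized = attributed_to_paid - incremental
cannibalization_rate = cannibalized / attributed_to_paid

print(incremental, round(cannibalization_rate, 2))
```

The key output is the cannibalization rate: the share of platform-attributed conversions that the holdout shows would have converted organically.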

Phase 2 — Media Mix Model

Built a regression-based MMM using Meta's open-source Robyn framework, covering 18+ months of weekly data across all paid channels, controlling for exam-season seasonality (SSC in February, HSC in August), competitive activity, and organic growth trends. Calibrated with incrementality test priors.
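Robyn itself is an R framework, but the core mechanics — adstock-transforming spend, then regressing weekly conversions on transformed spend plus seasonality controls — can be sketched in a few lines. Everything here (decay rate, channel names, synthetic data) is an assumption for illustration, not the production model:

```python
# Minimal MMM sketch: geometric adstock + OLS on synthetic weekly data.
# Not Robyn; just the regression idea it is built on.
import numpy as np

def adstock(spend, decay=0.5):
    """Carry over a decaying fraction of past spend into each week."""
    out = np.zeros(len(spend))
    carry = 0.0
    for i, s in enumerate(spend):
        carry = s + decay * carry
        out[i] = carry
    return out

rng = np.random.default_rng(0)
weeks = 52
meta = rng.uniform(50, 150, weeks)      # weekly Meta spend (synthetic)
google = rng.uniform(30, 100, weeks)    # weekly Google spend (synthetic)
exam_season = np.zeros(weeks)
exam_season[5:9] = 1                    # e.g. an SSC exam window dummy

# Synthetic conversions with known "true" effects, so we can check recovery
y = (2.0 * adstock(meta) + 1.2 * adstock(google)
     + 300 * exam_season + 500 + rng.normal(0, 20, weeks))

X = np.column_stack([adstock(meta), adstock(google), exam_season, np.ones(weeks)])
coef, *_ = np.linalg.lstsq(X, y, rcond=None)
# coef[:3] ≈ incremental conversions per unit of adstocked spend / per season week
```

Calibrating with incrementality-test priors means constraining or rescaling these channel coefficients so the model's implied lift matches what the geo-holdout measured.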

Phase 3 — Incrementality-Calibrated Attribution

Created a hybrid model that adjusts Adjust's last-touch attribution data with geo-lift coefficients — producing "incrementality-adjusted CPA" for each channel. This became the basis for all quarterly budget allocation decisions.
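The adjustment step can be sketched as scaling each channel's last-touch conversions by its geo-lift coefficient before computing CPA. Channel names, spend, and coefficients below are hypothetical:

```python
# Sketch of incrementality-adjusted CPA. All numbers are hypothetical.

channels = {
    # channel: (monthly spend, last-touch attributed conversions, geo-lift coefficient)
    "meta": (4000, 800, 0.78),    # 22% of attributed conversions were cannibalized
    "google": (3000, 500, 0.92),
}

adjusted = {}
for name, (spend, conversions, lift_coef) in channels.items():
    reported_cpa = spend / conversions                  # what the dashboard shows
    adjusted_cpa = spend / (conversions * lift_coef)    # cost per truly incremental conversion
    adjusted[name] = (round(reported_cpa, 2), round(adjusted_cpa, 2))

print(adjusted)
```

A channel with a low lift coefficient can look cheap on reported CPA and expensive on adjusted CPA, which is exactly the signal that drives reallocation.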

4. Results

  • True incremental ROAS: 3.2x — roughly 29% below the 4.5x platform-reported ROAS
  • Discovered that 22% of attributed conversions were organic cannibalization — these users would have converted without paid ads
  • One channel believed to drive 25% of installs was actually contributing less than 8% incrementally
  • Redirected $4K/month from cannibalized spend to genuinely incremental channels
  • Budget allocation efficiency improved 30% through MMM-guided reallocation
  • Ran 4 incrementality studies total over the 24-month period, each refining the model

5. Key Learnings

This was the most career-defining project I've run. Most growth marketers at my level have never designed an incrementality study. Doing it at a startup, without a data science team, in a market where the concept barely exists — that's the proof-point. Noom lists incrementality testing as a job requirement. Booking.com's entire measurement philosophy is built on it. I didn't wait to join those companies to learn it — I built it myself.

The uncomfortable truth about attribution: platform-reported numbers are a sales pitch, not a measurement. The gap between what Meta tells you and what's actually incremental can be 30–40%. If you're not testing for it, you don't know what your real CAC is.