Reality Check: Validation for Marketers
How do you know whether your marketing is actually working? How do you separate genuine results from busywork, hype, and activity that doesn't move the needle?
First Principles Thinking
Don't accept "best practices" at face value. Ask "Why?" relentlessly. Goal: "Increase traffic 20%." Why? → "More leads." Why more leads? → "Sales needs them." Are current leads converting? → "Actually, no." The real problem is traffic quality, not volume.
The Bullshit Police
"10,000 impressions!" — Did any lead to a sale? "Engagement up 200%!" — From whom? "5x ROAS!" — By which attribution model? If you can't draw a line from a metric to a business outcome, that metric is noise.
Data Scepticism
Attribution problems: Every platform claims credit for the same conversion. Platform self-reporting: Platforms have no incentive to tell you their ads aren't working. Survivorship bias: Case studies show the successes, not the hundred failures. Correlation ≠ Causation.
MER: Marketing Efficiency Ratio
MER = Total Revenue ÷ Total Marketing Spend. No attribution model needed. Just: is the engine profitable? MER is the vital sign.
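The formula above can be sketched in a few lines of Python. The numbers are illustrative placeholders, not real benchmarks:

```python
def mer(total_revenue: float, total_marketing_spend: float) -> float:
    """Marketing Efficiency Ratio: blended revenue per dollar of total
    marketing spend, across ALL channels -- no attribution model needed."""
    if total_marketing_spend <= 0:
        raise ValueError("total_marketing_spend must be positive")
    return total_revenue / total_marketing_spend

# Illustrative: $250k revenue on $50k total spend
print(mer(250_000, 50_000))  # 5.0 -> $5 of revenue per $1 spent
```

Because MER divides total revenue by total spend, it sidesteps the "every platform claims the conversion" problem: it measures the whole engine, not any one channel.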
LTV/CAC
CAC: Total cost to acquire one customer. LTV: Total revenue from one customer over their lifetime. LTV:CAC of 3:1 or better is generally healthy. Below that, you're paying too much or not retaining well enough.
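A minimal sketch of the ratio, with made-up example figures. The LTV model here is the simplest revenue-based version; a stricter one would multiply by gross margin:

```python
def cac(total_acquisition_cost: float, new_customers: int) -> float:
    """Customer Acquisition Cost: total cost to acquire one customer."""
    return total_acquisition_cost / new_customers

def ltv(avg_order_value: float, orders_per_year: float,
        years_retained: float) -> float:
    """Simple revenue-based lifetime value per customer."""
    return avg_order_value * orders_per_year * years_retained

# Illustrative: $12k spend for 100 customers; $60 orders, 3x/year, 2 years
ratio = ltv(60, 3, 2) / cac(12_000, 100)
print(f"LTV:CAC = {ratio:.1f}:1")  # 3.0:1 -- right at the healthy threshold
```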
Small Bets & MVP Testing
Validate before you scale. Every dollar scaling an unvalidated idea is a dollar gambled. Every dollar testing is a dollar invested in learning. This maps to Q3 in the 4-Quadrant model.
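One way to make "validate before you scale" concrete is a simple decision gate on a capped test budget. The budget and threshold below are hypothetical illustrations, not recommendations:

```python
TEST_BUDGET = 500     # hypothetical: dollars risked on learning, not scaling
MIN_TEST_MER = 2.0    # hypothetical profitability bar for the test

def verdict(test_revenue: float, test_spend: float) -> str:
    """Scale only ideas that clear the bar on a small, capped test spend."""
    observed_mer = test_revenue / test_spend
    return "scale" if observed_mer >= MIN_TEST_MER else "kill or iterate"

print(verdict(1_400, TEST_BUDGET))  # 2.8x -> scale
print(verdict(600, TEST_BUDGET))    # 1.2x -> kill or iterate
```

The point is the structure, not the numbers: a predefined bar, set before the test runs, keeps the decision honest.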