Is your conversion rate lying to you?

Probably not. But your conversion rate may not tell the whole truth.
Conversion rate is the metric most eCommerce teams check first. It's visible, it's simple, and it's in the name of the discipline most of us work in (CRO, which, if we're being honest, is named after the wrong thing).
The problem isn't that CR is a bad metric; it's that it's incomplete.
And when everyone optimises for it, the number may look good while more important metrics quietly get worse.
What CR actually measures, and what it doesn't
CR tells you the percentage of visitors who bought.
That's it. It doesn't tell you:
- Whether those buyers were profitable
- Whether they'll ever come back
- Whether the offer that converted them cost more than it was worth
- Whether the sale would have happened anyway without a discount
It's an activity metric. It measures what happened. It doesn't tell you whether what happened was good.
That gap between activity and outcome is why you need to look beyond the dashboard for the metrics that actually reflect business health.
The activity vs outcome problem
Most eCommerce reporting is built around activity metrics. Numbers that are easy to read and satisfying to report.
Sessions. Orders. Revenue. Conversion rate.
All useful. None of them, on their own, tells you whether the business is actually moving in the right direction.
Here is a scenario that plays out regularly:
A brand applies a sitewide discount to hit a monthly target. Sessions hold steady. CR climbs. Revenue hits the number. The team reports a good week.
But margins have compressed. Customers who were going to buy at full price got a discount they didn't need. Some of those new buyers were deal-seekers who won't return without another offer.
The dashboard showed green. The business got weaker.
Outcome metrics ask a different question. Not what happened, but what was the net effect? The shift is small. The difference to your bottom line isn't.
Four metrics that answer the right question
1. Revenue per Session (RPS)
RPS combines CR and AOV into one number. It tells you how much money your store generates per visit, which is a much more honest measure of store health than CR alone.

Here's a simple example of why it matters:
Your store runs a 20% sitewide promotion for a week. Traffic holds at 10,000 sessions. CR climbs from 2% to 2.8%. Most teams would celebrate.
But AOV drops from £85 to £62 because shoppers are gravitating toward cheaper items.
- Before: 10,000 × 2% × £85 = £17,000 revenue. RPS: £1.70
- During: 10,000 × 2.8% × £62 = £17,360 revenue. RPS: £1.74
Revenue is marginally up. CR is meaningfully up. But margins have compressed, and the actual gain is negligible.
Now imagine a targeted campaign that shows exit offers only to shoppers genuinely at risk of leaving, without touching the full-price journey for everyone else.
CR moves from 2% to 2.3%. AOV holds at £83.
10,000 × 2.3% × £83 = £19,090 revenue. RPS: £1.91
Lower CR uplift. Higher RPS. Better outcome.
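The arithmetic above is simple enough to sanity-check in a few lines. A minimal sketch, using the article's own figures:

```python
# Revenue per Session: RPS = conversion rate x average order value.
# The figures below come straight from the worked example above.

def revenue_per_session(conversion_rate: float, aov: float) -> float:
    """Expected revenue generated by a single visit."""
    return conversion_rate * aov

# Baseline week: 2% CR at an £85 AOV.
baseline = revenue_per_session(0.02, 85)

# Sitewide 20% promotion: CR jumps, AOV drops.
sitewide_promo = revenue_per_session(0.028, 62)

# Targeted exit offers: smaller CR lift, AOV nearly holds.
targeted = revenue_per_session(0.023, 83)

print(f"Baseline RPS: £{baseline:.2f}")        # £1.70
print(f"Sitewide RPS: £{sitewide_promo:.2f}")  # £1.74
print(f"Targeted RPS: £{targeted:.2f}")        # £1.91
```

Multiply any of these by session volume and you get the revenue figures quoted above; the point is that RPS surfaces the difference that CR alone hides.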
Measuring the right thing means tracking the value a visit actually generates, not just the surface numbers.
2. Incremental revenue
This one asks whether your offer actually changed what a shopper did.
If a customer was going to buy anyway and you showed them a 15% discount, that discount didn't drive revenue; it reduced it. The sale was already happening. You just gave away the margin.
Incremental revenue strips that out. It measures sales that genuinely resulted from an intervention, not those that occurred alongside it.
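One common way to estimate incrementality is a holdout test: withhold the offer from a random control group and compare what each group actually spent. A minimal sketch; the group sizes and revenue figures here are hypothetical, not from the article:

```python
def incremental_revenue(treated_revenue: float, treated_size: int,
                        control_revenue: float, control_size: int) -> float:
    """Revenue above what the control group suggests would have happened anyway."""
    # Per-shopper spend in the control group, scaled up to the treated group,
    # estimates the baseline: sales that were already going to happen.
    expected_baseline = (control_revenue / control_size) * treated_size
    return treated_revenue - expected_baseline

# Hypothetical holdout: 9,000 shoppers saw the offer, 1,000 randomly did not.
lift = incremental_revenue(
    treated_revenue=16_200, treated_size=9_000,
    control_revenue=1_600,  control_size=1_000,
)
print(f"Incremental revenue: £{lift:,.0f}")  # £16,200 - £14,400 = £1,800
```

The design choice that matters is the random holdout: without it, you're comparing shoppers who chose to engage with an offer against shoppers who didn't, which tells you about intent, not incrementality.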
IT Luggage moved away from blanket offers and started using targeted, behavioural campaigns to reach shoppers who were genuinely at risk of leaving.
The result was a +16% uplift in conversion rate, driven by incremental activity rather than sitewide discounting.
3. Margin per order
Not all orders are equal.

A £100 sale at 40% margin contributes £40; a £120 sale at 15% contributes £18. Most dashboards won't show you that automatically.
Tracking margin per order, even roughly, forces a more honest conversation about which products to promote, which offers to run, and which campaigns are actually worth the budget.
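Even a rough version of this calculation is enough to reframe the conversation. A minimal sketch of the comparison above:

```python
def margin_per_order(order_value: float, margin_rate: float) -> float:
    """Gross margin contributed by a single order, in pounds."""
    return order_value * margin_rate

# The comparison from the article: the smaller order at a
# healthier margin contributes more than twice as much.
a = margin_per_order(100, 0.40)
b = margin_per_order(120, 0.15)
print(f"£100 at 40%: £{a:.0f} margin")  # £40
print(f"£120 at 15%: £{b:.0f} margin")  # £18
```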
4. Repeat purchase rate
This is the one most teams underweight when they're focused on acquisition, and it might be the most revealing metric of all.

A customer who buys once and never returns is not the same as a customer who buys three times a year. They look identical in a weekly revenue report. Over the course of 12 months, the difference in value is enormous.
Repeat purchase rate tells you whether your customers are loyal or simply responding to deals. In other words, whether your growth is durable.
Brands with high discount dependency often see this metric quietly deteriorate. Because the customers they're attracting respond to price, not the brand. Remove the promotion, and they're gone.
If your repeat purchase rate is low, no amount of CR optimisation will fix the underlying problem.
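Computing the metric itself is straightforward if you can export orders keyed by customer. A minimal sketch with hypothetical data; real input would come from your platform's order export:

```python
from collections import Counter

def repeat_purchase_rate(order_customer_ids: list[str]) -> float:
    """Share of customers who placed more than one order in the period.

    Input is one customer ID per order, so a customer with three
    orders appears three times.
    """
    orders_per_customer = Counter(order_customer_ids)
    repeaters = sum(1 for n in orders_per_customer.values() if n > 1)
    return repeaters / len(orders_per_customer)

# Hypothetical 12 months of orders across six customers:
# c2 bought twice and c4 bought three times; the rest bought once.
orders = ["c1", "c2", "c2", "c3", "c4", "c4", "c4", "c5", "c6"]
print(f"Repeat purchase rate: {repeat_purchase_rate(orders):.0%}")  # 33%
```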
Why traffic source makes CR even harder to trust
There's one more reason to stop treating CR as a headline number: it's almost impossible to interpret without knowing where your traffic came from.
Paid and organic traffic convert at very different rates. Not because one is better, but because they arrive with different intent.
A shopper who typed your brand name into Google is close to making a purchase. A shopper who clicked a broad awareness ad on Meta may never have heard of you before. Average them together and you get a number that accurately describes neither.
This matters for promotions. A paid campaign driving high volumes of cold traffic will drag your sitewide CR down, even if underlying store performance is unchanged. A week with strong branded search volume will inflate it, for reasons that have nothing to do with your promotional strategy.
The fix is straightforward: segment CR by traffic source and report them separately. Blended CR is a number. Segmented CR is insight.
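The segmentation itself is trivial once sessions and orders are tagged by source. A minimal sketch with hypothetical numbers, showing how the blended figure describes no actual segment:

```python
def conversion_rates_by_source(sessions: dict[str, int],
                               orders: dict[str, int]) -> dict[str, float]:
    """CR per traffic source, plus the blended figure for comparison."""
    rates = {src: orders.get(src, 0) / n for src, n in sessions.items()}
    rates["blended"] = sum(orders.values()) / sum(sessions.values())
    return rates

# Hypothetical week: branded search converts at 8%, cold paid
# social at 0.5%. The blended 2.4% describes neither.
sessions = {"branded_search": 2_000, "paid_social": 6_000, "organic": 2_000}
orders   = {"branded_search": 160,   "paid_social": 30,    "organic": 50}

for source, cr in conversion_rates_by_source(sessions, orders).items():
    print(f"{source:>15}: {cr:.1%}")
```

Shift the traffic mix toward paid social and the blended number falls with no change in any segment's performance, which is exactly why it can't be read as a headline figure.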
With all this in mind, is CR worth tracking at all?
Yes, just not in isolation.
CR is useful, but as a primary metric, it drives the wrong incentives: more discounts, cheaper products, lower-intent traffic, and customers who only return for sales.
Brands that grow sustainably use a balanced mix of outcome metrics, not just CR. To drive better results, track RPS, incremental revenue, margin per order, and repeat purchase rate side by side.
The question worth asking of any promotion isn't "did it move the CR?"
It's "did it change behaviour that wouldn't have changed otherwise?"
Answering that question honestly is where the most valuable insights are. Focus your analysis there.

