Why Optimising for Conversion Rate May Not Increase Your Revenue
So, you’ve run a couple of tests and noticed a 30% increase in your conversion rate. Where’s that 30% extra revenue per month? Google Website Optimizer told you the test won with 98% “confidence”. Come on!
Believe it or not, there can be a number of things at play here. Let’s take a look at them:
You’re optimising for micro conversions
You may have heard of one of the oldest tricks in the book for running tests on sites with limited traffic: optimise for a step early in the sales funnel, i.e. a micro conversion. The catch is that more micro conversions don’t always mean more macro conversions (completed sales). In fact, getting more people to click through to your checkout could just mean you’re pissing more of them off once they get there.
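To make that concrete, here’s a rough back-of-the-envelope sketch in Python (the visitor numbers and rates are made up purely for illustration): the micro conversion jumps, yet completed sales go backwards.

```python
# Made-up numbers: the variation lifts clicks to checkout (the micro conversion)
# but annoys people once they get there, so fewer of them complete the purchase.
visitors = 10_000

# Original page: 5% reach checkout, 40% of those complete the sale.
original_sales = visitors * 0.05 * 0.40   # 200 completed sales

# "Winning" variation: 8% reach checkout (a 60% lift in the micro conversion),
# but only 24% of those complete the sale.
variation_sales = visitors * 0.08 * 0.24  # 192 completed sales

print(original_sales, variation_sales)    # 200.0 192.0 -- fewer macro conversions
```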
You’re not optimising for revenue/margin
A lot of people in the industry are totally focussed on conversion rate. This metric (as important as it may be) is only part of the picture. Of course people are more likely to convert at $50 than at $100, but unless the conversion rate rises enough to cover the smaller order value, you end up worse off on revenue.
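As a quick illustration (made-up figures again): say dropping the price from $100 to $50 lifts the conversion rate from 2% to 3%. The conversion rate report looks great, but revenue per visitor is down.

```python
# Revenue per visitor = conversion rate * average order value
price_a, cr_a = 100, 0.02   # original price point
price_b, cr_b = 50, 0.03    # cheaper price point with a higher conversion rate

print(price_a * cr_a)  # 2.0 -- $2.00 revenue per visitor
print(price_b * cr_b)  # 1.5 -- $1.50 revenue per visitor, despite a 50% lift in conversion rate
```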
You’re bringing in more tyre-kickers
Marketing Experiments has an interesting little model about conversion rate and lead quality. Basically, they propose that as conversion rate rises, lead quality falls. This makes a hell of a lot of sense: many of your visitors may not be motivated to make a purchase at all; they just find it unnaturally easy to convert and take your free offer. Whilst this results in more conversions, it may lower your average order value.
Markitechture explains this concept very nicely here.
You need to brush up on stats
Just because your test finished with a 30% improvement and a 98% chance to win doesn’t mean that 30% will hold true.
What you need to understand is that the 30% improvement describes what happened in that specific situation - not what will hold true in the future. So before you can begin to rely on those figures, make sure of a few things:
1. You run the test over a normal period, without any crazy offers or events (e.g. Christmas, Easter, Mother’s Day) and not just over a weekend
2. You run the test under normal conditions, without modifying your traffic (“Let’s only send branded traffic through the test!” Durrr…)
3. You don’t check your results and call a winner too early. I won’t go into the intricacies of why this is an issue, but here’s an example: even if one page is ahead by 50% after two days and the stats say you can be 98% confident it’s a winner, that doesn’t mean 50% is the performance increase you can expect - it just means there’s only about a 1 in 50 chance you’d see a gap that big if the two pages actually performed the same. The true lift could be far smaller (see the sketch below).
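To see why an early “98% confident” result can flatter you, here’s a minimal sketch using a simple two-proportion normal approximation (not necessarily what your testing tool computes, and the two-day sample sizes are invented): the point estimate says 50%, but the plausible range around it is huge.

```python
import math

def norm_cdf(z):
    """Standard normal CDF via the error function."""
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))

# Invented two-day result: small samples, big apparent lift.
n_a, conv_a = 800, 40   # original: 5.0% conversion rate
n_b, conv_b = 800, 60   # variation: 7.5% conversion rate (a "50% improvement")

p_a, p_b = conv_a / n_a, conv_b / n_b
se = math.sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
z = (p_b - p_a) / se

print(f"observed lift: {(p_b - p_a) / p_a:.0%}")                 # 50%
print(f"confidence it beats the original: {norm_cdf(z):.1%}")    # ~98% -- looks great...
low, high = (p_b - p_a) - 1.96 * se, (p_b - p_a) + 1.96 * se
print(f"95% interval for the lift: {low / p_a:+.0%} to {high / p_a:+.0%}")
# ...but the interval runs from roughly +3% to +97%, so don't bank on "50%".
```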
For more details about stats and statistical significance, read this article by Evan Miller.
Anyway, I hope that sheds some light on conversion rate optimisation and its (not always faithful) relationship with revenue. So next time you see a test increasing conversion rates by 200%, be wary!
I recently wrote a short article about one factor of conversion rate optimisation - better buttons:
http://www.danielduckworth.com.au/blog/increase-conversion-rates-with-better-buttons/