Today's post is a response to a seemingly simple question, which is: how high should your conversion rate be? What's a good conversion rate, what should you be aiming for?
There's a lot of talk about optimizing your conversion rates, but it would be very helpful to have some benchmark figures for what counts as good or bad to begin with, right?
Watch the video below to see my answer:
The Truth About 'Conversion Rate'
Everything mentioned in the video is also why I don't like blanket statements about conversion rates such as "orange buttons convert 10% better than green ones". Not only is the statement insincere, it also hides the more important factor of what conversion is being measured and how that translates into actual value for your business.
What About Opt-In Conversion Rates & Thrive Leads!?
If you know my products, then you might be shouting "hypocrisy!" at this point. After all, one of my flagship products - Thrive Leads - is all about building your mailing list. One of the big features in Thrive Leads is that you can A/B test everything - and the A/B tests measure the opt-in conversion rate (the number of people who opt in divided by the number of people who see the form).
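To make that metric concrete, here's a minimal sketch of the calculation. The function name and all the numbers are made up for illustration - they don't come from Thrive Leads itself:

```python
def opt_in_conversion_rate(opt_ins, form_views):
    """Opt-in conversion rate: people who opted in divided by people who saw the form."""
    if form_views == 0:
        return 0.0
    return opt_ins / form_views

# Hypothetical example: 2,000 visitors saw the form and 150 opted in.
rate = opt_in_conversion_rate(150, 2000)
print(f"{rate:.1%}")  # → 7.5%
```

Note that the division goes opt-ins over views - getting it the other way around would give you a number greater than 1 for any form that converts at all.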
To see how this fits with the answer I gave in the video above, there are two things we need to consider:
1) The Tech Problem
The ideal solution for A/B testing your list building would indeed measure your value per visitor rather than the opt-in conversion rate. Unfortunately, the nature of email marketing makes this technically difficult. You'd have to track visitors who opted in through different variations of your form all the way through to revenue - and with email marketing, that might happen days or weeks later.
It would also mean integrating the testing with your shopping cart or payment processor so that information about who buys what and for how much can be passed back to the A/B test running on your opt-in forms. That's the kind of thing you usually need to hire a developer for.
So, while testing all the way through to revenue would be better, it's simply not practical for most users.
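To show why value per visitor would be the better metric if the tracking were feasible, here's a hedged sketch. Everything here is hypothetical - the point is simply that a variation with a lower opt-in rate can still win on revenue:

```python
def value_per_visitor(total_revenue, visitors):
    """Revenue attributed to a variation, divided by the visitors who saw it."""
    return total_revenue / visitors if visitors else 0.0

# Hypothetical end-to-end results: variation A converts fewer people into
# subscribers, but those subscribers go on to buy more weeks later.
variation_a = value_per_visitor(total_revenue=1200.0, visitors=1000)  # $1.20 per visitor
variation_b = value_per_visitor(total_revenue=900.0, visitors=1000)   # $0.90 per visitor
```

Collecting the `total_revenue` figure per variation is exactly the hard part described above: it requires tying each sale back to the opt-in form variation the buyer originally saw, possibly weeks earlier.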
2) Comparing Conversion Rates in an A/B Test
The second point is that comparing the opt-in conversion rate of two variations in an A/B test is different from comparing the conversion rate on your site to the conversion rate on a different site or to an "average" conversion rate of other sites.
If we determine that the conversion rate on your site is higher than the conversion rate on my site, we haven't actually learnt anything useful, because it's an apples-to-oranges comparison (or it may well be - our sites differ in too many ways for us to know).
However, if you run an A/B test, showing two versions of the same opt-in form to the same audience on the same website and with the same goal, then finding out that one of them has a higher conversion rate is actually useful.
The only blind assumption we are making in this case is that getting more people onto your mailing list is always better. This is an assumption and it's not always true. But the chances are good that an opt-in form with a higher conversion rate is better for your business overall than one with a lower conversion rate.
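The like-for-like comparison described above can be sketched in a few lines. The numbers are invented, and a real A/B testing tool would also check statistical significance before declaring a winner - this sketch only compares raw rates:

```python
def conversion_rate(opt_ins, views):
    """Opt-ins divided by form impressions."""
    return opt_ins / views if views else 0.0

# Hypothetical A/B test: two versions of the same opt-in form, shown to
# the same audience, on the same website, with the same goal.
variant_a = conversion_rate(opt_ins=90, views=1500)   # 6.0%
variant_b = conversion_rate(opt_ins=120, views=1500)  # 8.0%

winner = "A" if variant_a > variant_b else "B"
print(winner)  # → B
```

Because everything except the form itself is held constant, the difference in rates is actually attributable to the form - which is what makes this comparison useful where the cross-site comparison wasn't.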
Over to You
As you can tell, there's quite a rabbit hole to follow down from the seemingly simple question this post started out with.
What other questions do you have about conversion rates, visitor value and A/B testing? Let me know by leaving a comment below!