Starting Your First A/B Test? Brace for Disappointment!
A/B testing is a cornerstone of building a truly effective online business and something we always encourage our users to do. However, you might find the experience of A/B testing quite frustrating, especially if you're new to it and you came in with stories from spectacular case studies (like this one) buzzing in your head.
In today's post, you'll discover why A/B testing results can often seem disappointing and what you need to do to get good value out of them anyway.
Learn to Love Small Wins
"Slow and steady wins the race" is a platitude, but it certainly applies to A/B testing.
It's exceedingly rare to see a huge win on any single test, and if you expect to turn your business around with a single split test, you're in for disappointment.
Even when testing does result in a dramatic win (like the 200%+ increase in lead conversions from this post), it's often the outcome of a series of tests, many of which may have been inconclusive.
What's more, spectacular-looking case studies are often just plain wrong. Two common problems you'll see when you take a closer look are:
- The test wasn't run for long enough, so the result is just random noise, not a real performance difference.
- The control is extremely bad. It's not difficult to double the conversions on a horribly designed landing page that takes 2 minutes to load. A case study is really only interesting if the control already performs decently.
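To see why the first problem is so common, here's a quick simulation sketch (standard-library Python only; the 5% conversion rate, 200-visitor test size and 30% "lift" threshold are illustrative assumptions, not numbers from any real test). It runs many short A/B tests where both variants are *identical*, and counts how often random chance alone produces what looks like a big win:

```python
import random

random.seed(7)  # fixed seed so the run is reproducible

TRUE_RATE = 0.05  # both "variants" convert at exactly 5%
VISITORS = 200    # a short, underpowered test: only 200 visitors per variant

def simulate_conversions(rate, n):
    """Count conversions among n visitors, each converting independently at `rate`."""
    return sum(random.random() < rate for _ in range(n))

# Run the same underpowered "test" many times and see how often
# pure chance produces a 30%+ apparent lift between identical pages.
trials = 1000
big_lifts = 0
for _ in range(trials):
    a = simulate_conversions(TRUE_RATE, VISITORS)
    b = simulate_conversions(TRUE_RATE, VISITORS)
    if a > 0 and (b - a) / a >= 0.30:
        big_lifts += 1

print(f"{big_lifts / trials:.0%} of short tests showed a 30%+ 'win' by pure chance")
```

With samples this small, a substantial fraction of tests between two identical pages will show a seemingly impressive lift, which is exactly why short tests make for misleading case studies.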
To benefit from A/B testing in the long run, make testing a habit (you should always have some test running somewhere), get used to waiting 2 weeks or more before drawing conclusions from a test, and learn to love the small wins.
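If you want a rough sense of why tests need to run for weeks, you can estimate the required traffic with a standard sample-size approximation. This is a generic statistical sketch, not a formula from any particular testing tool; the z-values correspond to the conventional 95% confidence / 80% power defaults, and the 5% baseline and 20% lift are assumed example numbers:

```python
def sample_size_per_variant(base_rate, relative_lift, alpha_z=1.96, power_z=0.84):
    """Approximate visitors needed per variant to detect `relative_lift`
    over `base_rate`, using the two-proportion z-test approximation
    (defaults: 95% confidence, 80% power)."""
    p1 = base_rate
    p2 = base_rate * (1 + relative_lift)
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    return round((alpha_z + power_z) ** 2 * variance / (p2 - p1) ** 2)

# Detecting a 20% relative lift on a 5% baseline conversion rate:
n = sample_size_per_variant(0.05, 0.20)
print(n)  # roughly 8,000 visitors per variant
```

A page getting a few hundred visitors a day would need weeks to collect 8,000+ visitors per variant, which is why "wait at least 2 weeks" is sensible advice for most sites.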
There is one exception that comes to mind: scarcity.
I've used time constraints and limited offers for many years now, but the effect never ceases to surprise me. Adding scarcity is an almost frighteningly reliable way to produce a big spike in conversions and sales. That's why we created Thrive Ultimatum.
As cool as it is to make use of scarcity, it shouldn't be the only thing you rely on. A benefit you get from A/B testing is insight: the winning variations can tell you things about how your audience thinks and feels and what matters to them. That's something you don't get from using scarcity. All it tells you is that people make a decision when you light a fire under their butts.
If you want to improve your website and tap into the power of A/B testing, the bottom line is this: realize that many tests are inconclusive or only lead to marginal improvements. And don't let that discourage you. Instead, take those small wins and keep on testing.
If you're not sure where to get started, check out this post about the first 3 things you should optimize and test on any landing page.
You can also check out the Rapid Landing Pages course (which is free) to learn a simple framework for building conversion-optimized pages, as well as a framework for what to test and in what order.
What has your experience with A/B testing been like? Any big wins or frustratingly inconclusive results? Let us know by leaving a comment below!