Following the amazing community response to our recent post where we shared the secrets of how we improved our conversion rate by a whopping 25%, we figured you’d want some more practical, actionable tips for running successful A/B tests on your website.
So today it’s time to dive a little deeper into our 7 golden rules for running successful A/B tests.
This will help you to:
- Hit the ground running with your A/B tests.
- Make informed decisions that have a real impact on your conversion rate.
- Avoid mistakes that will invalidate your entire test.
- Adopt an ‘always be testing’ approach so your business continues to grow.
You’ll want to bookmark this post and give it another read every time you start a new A/B test!
1. Set it and forget it... resist the urge to make decisions before you have enough data.
We get it, you’re excited that your new A/B test idea could boost conversions.
That’s great... we’re excited for you too!
But try not to check in on the test progress every day. You’ll be tricked into seeing patterns that don’t really exist yet.
For example, it’s only natural to think you’ve hit the conversion jackpot if one page variant massively outperforms the other for 3 consecutive days. So why not go with the obvious winner and turn off the test early?
Because the pattern doesn’t matter if you haven’t yet collected enough data to produce a solid confidence score (or “chance to beat the original” score). In some ways, the confidence score is really the only thing that matters.
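If you’re curious why those early streaks can’t be trusted, here’s a minimal simulation sketch in plain Python. It assumes two identical variants and made-up traffic numbers, so by construction there is no real winner to find:

```python
import random

# A minimal simulation, assuming two identical variants that both
# convert at a true rate of 5%. Visitor numbers are made up for
# illustration. Because the variants are identical, any "winning
# streak" we observe is pure noise.
random.seed(42)
TRUE_RATE = 0.05
VISITORS_PER_DAY = 200
DAYS = 14
TRIALS = 1000

trials_with_streak = 0
for _ in range(TRIALS):
    a_conv = b_conv = 0
    streak = best_streak = 0
    for _ in range(DAYS):
        a_conv += sum(random.random() < TRUE_RATE for _ in range(VISITORS_PER_DAY))
        b_conv += sum(random.random() < TRUE_RATE for _ in range(VISITORS_PER_DAY))
        # Count consecutive days on which variant B leads overall.
        if b_conv > a_conv:
            streak += 1
            best_streak = max(best_streak, streak)
        else:
            streak = 0
    if best_streak >= 3:
        trials_with_streak += 1

print(f"A 3+ day 'winning streak' appeared in "
      f"{100 * trials_with_streak / TRIALS:.0f}% of simulated tests")
```

Run it and you’ll see that a seemingly convincing multi-day streak still shows up in a large share of tests where there is, by construction, nothing to find.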
2. Test bigger changes first.
Imagine you’re a car engineer tasked with improving a car’s top speed.
What would you test first?
- Different types of windshield glass with better aerodynamic properties?
- Or different types of engines?
It’s obvious that a completely different engine will tell you sooner, and with less data, whether it’s an improvement on the original design.
Although a different type of windshield glass might help, you’ll need to collect tons of test drive and wind tunnel data before you can see a reliable pattern emerge among all the randomness.
In this example, the new engine test would let you make speed improvements much faster.
In fact, you could probably pack a few more big tests into the same time it would take you to run just the windshield test.
A/B testing your website is exactly the same – bigger, more obvious changes will improve your conversion rate faster (and with more confidence) than smaller, cosmetic changes.
Discover several examples of BIG difference A/B tests you can run on your website by checking out Hanne’s guide to low traffic A/B testing.
3. Try to obtain a 90% or higher confidence score before choosing a winner. Greater than 95% is best.
We keep coming back to this concept of a confidence score – or “chance to beat original” as it’s called in Thrive Optimize. It’s important, so let’s recap...
The confidence score tells you the chance that your winning variant will continue to outperform the other variant.
Let’s say you’re 7 days into testing whether a sales page with a video above the fold generates more sales than one without.
So far, your A/B testing tool (we use Thrive Optimize) shows the video variant produces 2% more conversions. That’s great!
But wait... the confidence score is only 65%.
This means that – given the data collected so far – the video variant is predicted to beat the original variant only 65% of the time.
An easier way to think about this is: you need to collect more visits and sales to give your A/B testing tool enough data to feel confident it’s found the real winning variant.
A confidence score of 70-89% means your A/B testing tool is starting to see some regular patterns, but not reliably enough yet to be sure.
A confidence score of 90% or more is a good place to be before you decide to make a move.
A confidence score of 95% or more means you’ve almost certainly found your winner.
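To make the concept concrete, here’s a rough sketch of how a “chance to beat the original” score can be estimated. It uses a common Bayesian approach (Beta posteriors plus Monte Carlo sampling) with made-up visitor and sales numbers; we’re not claiming this is the exact formula Thrive Optimize runs under the hood:

```python
import random

# A rough sketch of how a "chance to beat the original" score can be
# estimated: model each variant's true conversion rate with a Beta
# posterior (uniform prior) and sample from both. This is a common
# Bayesian approach, not necessarily the exact formula Thrive Optimize
# uses; the visitor and sales numbers below are made up.
def chance_to_beat_original(visitors_a, conversions_a,
                            visitors_b, conversions_b,
                            samples=100_000):
    wins = 0
    for _ in range(samples):
        rate_a = random.betavariate(conversions_a + 1,
                                    visitors_a - conversions_a + 1)
        rate_b = random.betavariate(conversions_b + 1,
                                    visitors_b - conversions_b + 1)
        wins += rate_b > rate_a
    return wins / samples

# Original: 1,000 visitors, 50 sales. Video variant: 1,000 visitors, 51 sales.
print(f"{chance_to_beat_original(1000, 50, 1000, 51):.0%}")
```

Notice that a tiny lead over a small sample scores barely above 50%. Only more visitors and conversions can push the score into that 90-95% zone.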
4. Change NOTHING else during the test.
A/B tests are like serious relationships. They demand total commitment and exclusivity.
Once you hit that test button, you’re in it for the long haul.
You can’t add new elements, switch colors on a whim, update the text, or change anything else on your test variants.
But what if you make the same change on both variants? Surely that won’t affect the comparative testing, will it?
Yes, it can.
Many of your customers will visit your sales page multiple times before they make a purchase, and they will each be at a different stage of their individual conversion journey.
You may make a well-meaning change at a critical moment for some visitors, leading to an increase or decrease in conversion rate... which will affect the overall test results.
Your A/B test works best when there is total consistency across visitors, and across their return visits.
5. Don’t run ads or other campaigns to the variants mid-way through the test.
If you add or remove a source of traffic mid-way through your A/B test, you can invalidate the results.
If you increase or decrease the budget on an ad campaign mid-way through your A/B test, you can invalidate the results.
If you start or stop a special offer mid-way through your A/B test, you can invalidate the results.
You get the idea!
Every change in traffic source or quality will affect your conversion rate, so it’s important to avoid sudden changes that affect people earlier in your sales funnel.
Of course, we know this is not always possible if your A/B test requires a longer time to collect enough data. In these cases, at least try to ensure any traffic changes you make equally affect both variants at the same time.
6. A/B testing is an iterative process... Once you’ve identified a clear winner, test something else.
It's super exciting to identify a change that can boost your conversion rate by a significant amount, especially because you know you’ll be making more sales from that point onwards!
But we encourage you to treat A/B testing as a long-term iterative process.
Even if you only eke out a 0.25 percentage-point improvement from an A/B test, it’s still a success for your business. Four of those little successes, and you’ve added a whole percentage point to your conversion rate!
So plan, test, decide, repeat... until all those small successes combine to create one huge success.
This is the exact same strategy used by the most successful sports managers and athletes around the world. It’s called “the aggregation of marginal gains” (or incremental gains).
In other words, always be testing something.
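To see what those marginal gains can add up to, here’s a toy calculation using entirely made-up numbers for traffic, baseline conversion rate, and sale value:

```python
# A toy "aggregation of marginal gains" calculation, using entirely
# made-up numbers for monthly traffic, baseline conversion rate, and
# average sale value.
visitors = 10_000     # monthly visitors (hypothetical)
rate = 2.0            # baseline conversion rate, in percent (hypothetical)
sale_value = 50.0     # average revenue per sale, in dollars (hypothetical)

for test in range(1, 5):
    rate += 0.25      # each winning test adds 0.25 percentage points
    sales = visitors * rate / 100
    print(f"After test {test}: {rate:.2f}% conversion rate, "
          f"{sales:.0f} sales, ${sales * sale_value:,.0f}/month")
```

Four “tiny” wins take this hypothetical store from 200 to 300 monthly sales, a 50% jump in revenue.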
7. Save your time and sanity by enabling automatic winner selection.
Automatic what now?
Once you get the A/B testing bug, you can become a little... well, obsessed... with improving your conversion rates.
You’ll find yourself ‘just quickly checking the results’ every few days, like a child waiting for the cookies in the oven to be ready.
But as an online business owner, your time and energy are better focused on other areas that can grow your audience, subscribers, sales and revenue.
That’s why Thrive Optimize comes with automatic winner selection, a handy feature that automatically turns off the losing variant once it has collected enough data to reliably identify the long-term winner.
So you really can just set it and forget it, safe in the knowledge that Thrive Optimize will do the rest.
Always be testing. Always be learning.
With the tools available in Thrive Suite, you have everything you need to run A/B tests on your website.
You’ll uncover surprising insights about your audience’s unique behaviors, and each discovery will help your online business grow.
But only if you take action.
If you’re reading this post and thinking “my website is too small to A/B test anything” or “I’ll get around to A/B testing after I do X, Y and Z”, then you’re not taking action.
You’re making excuses.
There IS something worthwhile you can test to improve your website.
Set it up today so your A/B test can run while you’re focusing on growing other parts of your business!
How do I know the confidence level?
Most serious A/B testing tools will include a confidence score.
In Thrive Optimize, it’s called “chance to beat the original”, as that makes more sense to average users.
Hi David,
Is there a way to use it with the following scenario?
My landing pages are on WP with Thrive.
The button on the landing page goes to a checkout page on another website.
But when I look at Thrive Optimize it wants me to say what WP page it goes to.
Is there a way to A/B test my landing pages that go to an external checkout page?
I’m just trying to determine which landing page gets clicked through the most.
Thank you.
Hey Trish, unfortunately this can’t be done yet, but it’s a highly requested feature and it’s on our radar. In order to do this, we’d have to capture the link clicks on the page before the visitor is sent off-site. I’ll spare you the technical details, but it’s surprisingly difficult for such a simple thing. Fortunately, we’ve already figured out how we’ll add this feature, but there’s no ETA at this stage.
Chalk up one more vote for this from me. I’ve been using Thrive for a few years now and this has always been one of the biggest pain points. I can’t actually run A/B tests effectively. 🙁