No, that isn’t a clickbait title... it’s the real deal.
We really did instantly boost our sales page conversion rate by a staggering 25% with one simple change. This isn’t a sensational claim built around a rogue sales spike either – it’s a consistent, reliable increase in daily sales, with rock-solid proof.
Want to know which secret sauce we added?
Want to know how you can do the same?
Of course you do!
Let’s get started...
Here at Thrive Themes, we’re fanatical about building conversion-focused websites.
Read any of our posts or watch our videos, and you’ll hear us harping on about testing every element of your sales pages: the headline, copywriting, layout, pricing, calls-to-action and pretty much anything else that can be swapped out and analyzed.
Because it works.
And because you have all the tools you need within Thrive Suite –
the exact same tools we used to improve sales by 25%.
Enough chit-chat. Let’s break down how we set up the sales page A/B test, and then deep dive into the results.
What is an A/B Test?
An A/B test compares the conversion rates of slightly different versions of the same sales page, so you can identify the winning version and improve future sales performance.
We call these versions A and B, or the Control and the Test.
No matter how many variants you’re testing at the same time, it’s still just called an A/B test.
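In concrete terms, the "winner" is simply the variant with the higher conversion rate over the same period, and the improvement is usually quoted as relative lift. A minimal sketch, using hypothetical visitor and sales counts (not our actual test data):

```python
def conversion_rate(conversions: int, visitors: int) -> float:
    """Fraction of visitors who converted (e.g. bought the product)."""
    return conversions / visitors

# Hypothetical counts, for illustration only
rate_a = conversion_rate(220, 10_000)  # control converts at 2.2%
rate_b = conversion_rate(275, 10_000)  # test converts at 2.75%

# Relative lift of the test variant over the control
lift = (rate_b - rate_a) / rate_a      # 0.25, i.e. a 25% improvement
```

Note that lift is measured relative to the control: going from 2.2% to 2.75% is only about half a percentage point in absolute terms, but a 25% relative improvement in sales.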
We’re always talking about the power of using testimonials as social proof for your sales pages. They let your happy customers share positive experiences and results, which – in theory – should encourage new visitors to buy your product.
So we were a little embarrassed to discover we’d actually forgotten to add testimonials to our Thrive Architect sales page!
With so much historical conversion rate data, this was a perfect opportunity for an A/B test... both to improve our revenue AND to share the results on the blog so you can improve your website too.
Here’s the question we set out to answer...
“Will adding testimonials to an existing sales page have a noticeable impact on conversion rate?”
Setting up the A/B Test
To get started with the test, we used Thrive Optimize, our powerful WordPress plugin for A/B testing landing pages. I won’t go through the technical setup here, as you can find all that in our guide: Create Your First A/B Test Using Thrive Optimize.
We sent 50% of traffic to the original sales page, and 50% to the new variant with added testimonials.
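Thrive Optimize handles the traffic split for you, but if you’re curious how a 50/50 split typically works under the hood: most tools take a stable hash of a visitor identifier, so each visitor is bucketed randomly yet always lands on the same variant when they return. A rough sketch of the idea (the `visitor_id` parameter is illustrative, not Thrive Optimize’s actual internals):

```python
import hashlib

def assign_variant(visitor_id: str) -> str:
    """Deterministically bucket a visitor into variant 'A' or 'B'.

    Hashing keeps the split stable: a returning visitor always sees
    the same page, which keeps the test data clean.
    """
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

# Over many visitors, the split approaches 50/50
counts = {"A": 0, "B": 0}
for i in range(10_000):
    counts[assign_variant(f"visitor-{i}")] += 1
```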
Our original Thrive Architect sales page converted at about 2.2%, pretty standard for a public sales page with a variety of traffic sources. We were hoping to see the new variant produce a consistent improvement over 2.2%.
Rather than check in constantly, we enabled the Automatic Winner Settings – this automatically disables the losing variant once your A/B test has collected enough data to say, with confidence, that the winning page is a long-term winner.
... And then we let the test run for 6 weeks.
Drum roll, please....
After 6 weeks of equal visitor traffic, the sales page A/B test results looked like this:
[Results table comparing Sales Page A and Sales Page B, including Revenue per Visitor]
Every important metric saw a significant improvement thanks to the addition of a few testimonials!
Here’s the daily conversion rate of the control (blue) and test (green) variants:
How do we know this 25% improvement came from the testimonials?
Aside from adding the 2 testimonials under the video, we made no other changes to the sales page over the 6 weeks.
The page variants were served to a random 50-50 distribution of visitors, meaning that seasonality and other external factors affected both landing page variants equally.
How confident are we that the winning variant will consistently perform better?
Oh we’re extremely confident.
The data shows a 98.13% chance that the testimonial variant will beat the original variant.
A confidence score of greater than 95% means we can be fairly certain that the winning variant will continue to outperform the original, even with more visitors, sales and time.
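Thrive Optimize calculates this confidence score for you. For the statistically curious, one common way to estimate the probability that variant B’s true conversion rate beats variant A’s is a two-proportion z-test. Here’s a sketch using hypothetical counts – not our actual test data, and not necessarily the exact formula Thrive Optimize uses internally:

```python
import math

def prob_b_beats_a(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Approximate probability that variant B's true conversion rate
    exceeds variant A's, via a two-proportion z-test."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)              # pooled rate
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 0.5 * (1 + math.erf(z / math.sqrt(2)))         # normal CDF

# Hypothetical counts: the test variant should clear ~95% confidence
# before we declare a winner
confidence = prob_b_beats_a(220, 10_000, 275, 10_000)
```

With identical conversion rates the function returns 0.5 (a coin flip), and the score only climbs toward 1.0 as the gap between variants grows and more data accumulates – which is exactly why early results shouldn’t be trusted.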
Why was the conversion rate for both variants significantly higher at the start?
The first few days of an A/B test are always volatile. Randomness, return visitors and other oddities usually produce erratic spikes and valleys. You always need some time to get through that initial randomness.
No matter how much traffic you’re getting, we always recommend ‘setting and forgetting’ your A/B tests. Come back after at least two weeks – the minimum needed to smooth out day-of-week effects – so you’re not trying to make important business decisions without enough data to produce consistent results.
Why is there such a big difference in conversion rate during the first week?
This is the classic pattern of many conversion rate A/B tests!
The first few weeks are a wild ride (see above), and it’s very tempting to make important decisions based on early results.
Resist this temptation!
As more data is collected, most A/B tests start to converge, with variant performance getting closer and closer together. Eventually they’ll either merge (meaning there’s no discernible difference) or one variant will maintain a slight lead (meaning you’ve just increased your conversion rate!).
Thrive’s Golden Rules of A/B Testing
I bet you’re feeling the urge to run some A/B tests of your own.
It’s easy with Thrive Optimize (which we used to run our tests), but there are also some important ‘rules’ that help you avoid the common pitfalls first-time A/B testers learn the hard way.
1. Set it and forget it... resist the urge to make decisions before you have enough data.
2. Test bigger changes first. Bigger, more obvious changes will improve your conversion rate faster (and with more confidence) than smaller, cosmetic changes.
3. Try to reach a confidence score of 90% or higher before choosing a winner. Greater than 95% is best.
4. Change NOTHING else during the test.
5. Don’t run ads or other campaigns to the variants during the test, unless they will run consistently. Changes in traffic sources and quality will affect your conversion rate.
6. A/B testing is an iterative process... once you’ve identified a clear winner, test something else.
7. Save your time and sanity by enabling automatic winner selection.
Ready to Run Your First Sales Page A/B Test?
Split testing your sales page doesn’t have to be complicated, and you absolutely do not need to redesign massive sections to see consistent improvements in conversion rate.
We added TWO testimonials and boosted our conversion rate by 25%!
Here’s what I want you to do: post your sales page in the comments below, and the Thrive team will offer suggestions on what you could easily A/B test.
It’s a great chance to show everyone your website and get valuable feedback from the conversion rate experts here at Thrive.