Much like the titan Prometheus stole fire from the Greek gods to bring civilization to humanity...
...Thrive Themes just stole figurative fire from those YUGE online companies that dominate the internet.
And just like mythological Prometheus, we want to share that competitive fire with you, the solo-preneur.
Monsters like Google and Amazon not only have massive market share and capital advantages, but they also leverage expensive A/B testing technologies – run by the world's nerdiest split testing geeks – to squeeze every last drop of conversion juice from their webpages.
Well, that powerful A/B testing advantage is now at your fingertips!
Simple, fast and scary affordable landing page, sales page and lead generation page A/B testing can be harnessed on your own WordPress website through our new Thrive Optimize plugin.
Now you can steal this A/B testing fire and wield it like a conversion optimization flame thrower...as long as you implement the following 10 A/B Testing Principles to compound your gains over time, just like the current titans of the internet do!
1. State a clear hypothesis for every A/B test:
If you don’t have a clear idea of what you’re testing, how will you know if an A/B test succeeds or fails?
To avoid aimlessly running A/B tests you can’t learn anything from, always write down the hypothesis you want to test first. Then design your A/B test around trying to answer that clearly stated hypothesis.
Your Hypothesis Creation Equation:
Having trouble trying to decide what hypothesis to test? Try this formula to help focus your A/B testing efforts:
"Changing [element] from X to Y will [result] due to [rationale/research]"
Randomly spitballing A/B tests is unhelpful and will cause you frustration, missed learnings and slow growth.
2. Never A/B test more than one variable at a time:
The number of elements you will A/B test simultaneously...is one.
Let your clearly stated hypothesis guide what single variable is best to test and hopefully answer that hypothesis.
Design your testing variation in a way that only significantly alters a single variable from your control.
For example, if you decide to test a 2 column vs. 3 column pricing table, don't also test different value propositions between the two variations at the same time. If there's a significant change in conversions, how will you know what element was actually responsible for the difference?
Testing a single variable will help you avoid attributing causation to the wrong elements and prevent harmful decisions in the future based on incorrect interpretations.
3. Always test two conspicuously different versions of your single variable:
Do your conversion rates really depend on what color your call-to-action buttons are?
No...likely not. And even if button color does matter, the impact is minimal.
More importantly, button color tests won’t teach you anything about your customers.
Accept that most of your A/B testing results will cause negligible changes and offer inconclusive results. But if you ever want to score the occasional big win, you must test conspicuously different variations of your chosen test variable.
We call this the Paperclip vs. Bowling Ball Strategy:
Think about it like this...
Ask people to guess how much a paperclip and a bowling ball each weigh and you'll get a clear difference in everyone's estimates. People are really good at differentiating between two obviously different things.
But ask people to guess how much a carton of milk and a bottle of soda each weigh and the estimates won't show a clear separation in the data. People are pretty bad at telling the difference between two similar things.
The Lesson: Between your control and variation, present 2 conspicuously different versions of the element you're testing to increase the chances of achieving a clear and useful answer to your hypothesis.
Use this Paperclip vs. Bowling Ball analogy to help yourself design A/B tests that attempt to clearly answer your hypothesis like in the following examples:
- Does a 2 column or 3 column pricing table convert better?
- Does an extremely long sales page convert better than a short sales page?
- Does [Value Proposition A] or [Value Proposition B] convert better?
After you state a clear hypothesis and decide on a single variable to test, make sure your A/B test variations are conspicuously different so it's usually clear what's driving the final results.
Short vs. Long Video Sales Page Example:
When you A/B test landing pages, make the single variable you're testing conspicuously different so you can clearly see whether it has a strong effect on your results. Using the above Thrive Architect templates as an A/B test example, your hypothesis could be: "Changing the length of my sales page from a short video sales page to a long-copy video sales page will increase my conversions due to the high complexity of my product (i.e. customers require more information before purchasing)." If the longer sales page does measurably better than the short sales page, it's a strong indicator that the hypothesis is valid. Notice that the video, overall design and headlines between the two variations are identical.
4. Never stop A/B tests before achieving statistical significance:
Do not allow yourself to stop any A/B test before enough time has passed or sufficient conversions have been collected to pick statistically significant winners.
Random chance must be paid its proper respect: never end an A/B test early just because one variation appears to be the winner.
It’s common that the leading variation at the start of a test is not the winning variation over time.
A/B testing results from the Thrive Leads dashboard.
Notice how the green "No Opt-out" variation initially outperforms the yellow and blue variations. The underperforming blue "No thanks" variation was ended on August 2nd once it hit statistical significance as a clear loser. However, the yellow "Wasting time" variation eventually outperformed the green "No Opt-out" variation by a small conversion rate margin. Small result differences like this are much more common than the big wins showcased in your typical case study post.
You must let the random testing noise seen at the start of most tests die down before selecting your winner. Otherwise you risk selecting losing variations that will compound your conversion losses over time.
In A/B testing tools like Thrive Optimize, Thrive Leads and Thrive Headline Optimizer, we recommend letting your A/B tests run for at least 2 weeks and collect a minimum of 100 conversions – not impressions – before selecting a winner.
To be confident that your testing results are real and not random chance, your “Chance to beat original” should be greater than 95% (your variation wins) or less than 5% (your control wins) – after the minimum time and conversion thresholds have been met.
Screenshot of an A/B testing result showing the "Chance to beat original" at 94.59% and a conversion rate improvement of +26.44% after 186 conversions. This result has passed the 100 conversion minimum but is still just shy of our recommended 95% statistical significance threshold, so the test should keep running.
Statistical Significance Explained:
A test variation's "Chance to beat original" percentage, which is usually referred to as the statistical significance, basically asks: "how sure can we be that the difference we're seeing is not random chance?"
A 95% or greater "Chance to beat original" is the minimum statistical significance we suggest achieving before declaring your variation a winner.
A 95% "Chance to beat original" means there's roughly a 95% probability that your variation's lead is real rather than random chance – in other words, only about a 1-in-20 chance you'd see a difference this large if the two versions actually performed the same. The closer this "Chance to beat original" is to 100%, the better.
Conversely, a variation with a "Chance to beat original" of 5% or less means the control has a 95% or greater chance of being the real winner.
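If you're curious where a "Chance to beat original" number can come from, here's a minimal sketch in Python using a normal approximation of the difference between two conversion rates. This is only an illustration – it is not necessarily the exact calculation the Thrive plugins run – and the visitor and conversion numbers below are made up.

from math import erf, sqrt

def chance_to_beat_original(control_visitors, control_conversions,
                            variation_visitors, variation_conversions):
    # Observed conversion rates for control and variation
    p_c = control_conversions / control_visitors
    p_v = variation_conversions / variation_visitors
    # Standard error of the difference between the two observed rates
    se = sqrt(p_c * (1 - p_c) / control_visitors + p_v * (1 - p_v) / variation_visitors)
    # How many standard errors the variation leads (or trails) the control by
    z = (p_v - p_c) / se
    # Normal CDF of z: approximate probability the variation truly beats the control
    return 0.5 * (1 + erf(z / sqrt(2)))

# Hypothetical example: 2,000 visitors per variation, 100 vs. 126 conversions
print(round(chance_to_beat_original(2000, 100, 2000, 126) * 100, 2))  # roughly 96%

In practice you never need to compute this yourself – the dashboards shown above report it for you – but seeing the inputs (visitors and conversions per variation) makes it clear why small samples produce unreliable percentages.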
Of course, if you want to remove your human emotions from the process altogether, all Thrive A/B testing plugins have automatic winner settings you can use to set-and-forget your A/B tests without ever having to check in on them!
5. Always be A/B testing:
Your A/B tests don’t need to be super complicated.
If you use every A/B test as an opportunity to learn something about your customers, the value you gain from increased conversions and engagement will compound over time.
To do this, commit to running A/B tests every time you publish a new page on your website. You don't have to spend more than a few extra minutes to accomplish this each time.
With Thrive Optimize, you just clone your control variation in 1-click, quickly modify your variation page and then start running your A/B test!
Just make sure to ask yourself:
Will this A/B test give me new information about my customers? No? Then there's a better A/B test to run. #ThriveOptimize
6. Avoid perfection and run “good enough” A/B tests instead:
Don’t ever let yourself waste an A/B testing opportunity because you don’t have the PERFECT testing idea.
There is no perfect A/B test.
The best you can do is test ideas that are "good enough" to learn something meaningful about your customers.
Take what you currently know about your audience, use that knowledge to develop a useful hypothesis, create an A/B test to answer that hypothesis, and start collecting data.
For better or for worse, take what you learn and let it guide your future A/B testing ideas.
The compounding benefit of this rapid feedback loop will lead you towards the promised land of increased sales and online business growth!
7. Don't be disappointed by inconclusive test results:
You must accept the fact that many of your A/B tests will be inconclusive.
This does not mean that such A/B tests are a waste of time. Indeed, they will provide you with many learnings you should appreciate!
Inconclusive A/B tests simply mean that the element behind your hypothesis didn't really matter, and that line of testing should be de-prioritized in favor of more promising experiments.
These inconclusive learnings can be quite beneficial as they will provide you with new insights and shift your thinking towards more valuable ideas.
Celebrate and learn from your inconclusive tests just like you would from a big conversion optimization win.
8. Don't freak out about missed conversions during your A/B tests:
Your human emotions must be set aside while running A/B tests.
Although it’s easy to let your emotions run wild while monitoring low performing variations, you must not indulge your impulse to end tests too soon.
It's hard to ride the emotional roller coaster of extrapolating how many leads or sales you're missing while a "losing" variation keeps running. However, you must reach that recommended benchmark of 95% statistical significance before naming any winners or losers.
A/B testing results from the Thrive Leads dashboard.
An A/B testing example showing the control losing to the variation initially only to become the statistically significant winner over time. Don't underestimate the phenomenon of random chance affecting the initial stages of your test. Wait at least 2 weeks with at least 100 conversions before selecting a winner.
The foolproof solution to sidestep irrational emotions during A/B testing is to use the set-it-and-forget-it automatic winner settings Thrive Optimize, Thrive Leads and Thrive Headline Optimizer provide.
9. Make sure your conversion funnel experience is congruent:
Realize that the results of your A/B tests are not strictly dependent on your copy and page layouts alone.
The congruence between your Facebook or Google ads and your webpages plays an important role in your overall conversion rates.
Thus, you must always take your traffic sources into account when launching any A/B test.
For example, do not run a Facebook ad A/B test while also running a landing page A/B test – with both running at once, you can't tell which change is responsible for any difference in conversions.
You can listen to Shane and our in-house Facebook ad expert Dave discuss this congruence issue on the ActiveGrowth podcast here.
If your customer’s funnel experience blends seamlessly, your conversion rates will benefit. But if any part of your ad to website handoff clashes, expect conversion rates to suffer.
10. Don't blindly follow the A/B testing claims of your marketing gurus:
Have you ever been excited by case study headlines like:
- How A Single A/B Test Increased Conversions by 336%
- How AMD Used A/B Testing to Achieve 3600% Increase in Social Sharing
- How Server Density A/B Tested Pricing Plans and Increased Revenue by 114%
Do you just blindly follow A/B testing recommendations based on such claims?
Or do you use case studies to help generate your own A/B testing ideas that fit within the context of your own online business?
Remember that there's a publication bias in case studies: only the impressive results get shared, because those are the headlines that get clicks. The more typical single-digit gains or inconclusive results never make it to press.
The reality is that small wins can compound into significant conversion rate gains over time, but only if you make A/B testing a habit every time you publish.
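As a hypothetical illustration of that compounding: if each of twelve successful tests over a year lifts your conversion rate by just 5%, the combined effect is 1.05 ^ 12 ≈ 1.80 – roughly an 80% higher conversion rate overall, even though no single test produced a headline-worthy result.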
Do not underestimate the benefit of small, but frequent A/B testing wins compounded over time. The other benefit to frequent A/B testing? You're more likely to land a big conversion optimization win at some point. If you don't test, don't expect anything to change...
If you learn to appreciate your more frequent, but small A/B testing gains, expect compound conversions and business growth as a result!
Now It's Your Turn
If you're in need of an A/B testing tool that is both affordable and super simple to use, Thrive Optimize is the fire from the gods you've been waiting for.
Go forth and learn those valuable lessons about your customers you can use to compound your sales – and profit – over time!
And if you want some help getting started creating a useful A/B test that follows these 10 A/B testing principles, post your questions and testing ideas in the comment section below so the Thrive Themes Team can help you with some rapid feedback!
Is Thrive Optimize also part of the thrive membership deal (all plugins included)? Or do I have to get it separately?
Thrive Optimize is definitely included in our Membership.
Alternatively, you can buy the Thrive Architect + Thrive Optimize bundle if you don’t already have Thrive Architect. Thrive Optimize is a paid add-on to our Thrive Architect front-end builder plugin.
I love this. Now entrepreneurs can a/b test, just like the big players. Thrive Themes is awesome!
I hope the plugin is going to work on posts in the future, and not just pages.
Also, I would love additional goals. For example, clicks on external links. Or even better, clicks to a specific domain, like Amazon. Thousands of your customers who use Thrive for Amazon affiliate sites would do anything to get that feature.
Thanks Emil!
This is just the first version of Thrive Optimize so expect improvements in the future based on feature requests like yours!
I just wish it worked with other builders that weren’t architect
Why would anyone want to use another builder? ;-P
Thrive Optimize is an add-on to Thrive Architect so it’s not compatible with other builders.
I won’t bore you with past experience in sales & advertising. Let me offer some simple advice;
1. Go to whichtestwon.com and sign up for their newsletter. It won’t cost anything and you’ll be able to see (and guess) which A/B test won (and why). If you’re interested in A/B testing…you’ll love this.
2. To help simplify the idea of testing; test one element only. When I ran tests, I would often change only the color of the buy button. Don’t allow yourself to be intimidated.
I’m a fan of the one element theory of testing when you’ve narrowed down to a page that’s performing well. To give you an example of why I did it that way: race car teams change one element at a time when they really have their car “dialed in.” Here’s why they do this…if they make 10 changes and the car is faster, which change made the difference? It’s not an absolute, it’s just a helpful way of understanding testing.
Good luck
Mike
Hi Mike,
Principle #2: Never A/B test more than one variable at a time – is definitely in agreement with your “test one element at a time” suggestion.
For most solopreneurs however, testing buy button colors probably isn’t very useful. It’s definitely an easy element to test, but often an inconsequential conversion optimization lever. It also won’t teach you anything useful about your customers.
It’s better to focus on testing bigger CRO levers that follow Principle #3: Always test two conspicuously different versions of your single variable – using the Paperclip vs. Bowling Ball Strategy explained in the post.
Using Principle #1: State a clear hypothesis for every A/B test – will help solopreneurs prioritize testing their biggest CRO levers first.
Matt,
Thanks for your reply. I owe you an apology…I skimmed the article (was in a major rush).
I came back later and read the article in full. Realized then that I’d only repeated what you’d said.
Thanks for help & patience.
Cheers
Mike
It’s all good Mike…many thanks for reading and commenting on the article!
Your single element testing suggestion is on point and always worth reiterating.
Most people get so eager to make improvements in whatever they’re doing (whether for website CRO, race car performance, or health) that they change too many variables at once and miss the opportunity to identify actual cause and effect relationships.
i am very excited about this plugin
what about the Ultimate theme that you promised us, when will it be available?
we are just waiting for it to subscribe to all of your themes and plugins and migrate everything to your ecosystem
Thanks Mohamad!
We’re working hard behind the scenes on the new theme. We hope to show you a sneak peek really soon!
A very good read, thank you. I am a Thrive member but have not yet made the jump from Builder to Architect. When I do (soon), I will embrace the A/B testing available with this plugin.
One question: can the plugin handle conversions that are external links? For example, a visitor clicking an affiliate link on my site that brings them to another site? Thank you!
Thanks Anon!
To answer your question, “link click” is not a conversion goal you can A/B test at the moment.
You can use Thrive Optimize if at some point your visitor comes back to your site (e.g. an after-purchase thank you page), but not if people leave your website never to return once they click your external link or button.
I like this guy. good exposure. stuff we always overlook or are left out from.
Cheers Kamarul!