Did you know that the more you can control your need to pee, the better you are at saving money (source), and if you study ethics, you’re more likely to steal (source)?
Also, intelligent people are more likely to binge drink (source). But don’t expect it to get you any brownie points when police find you at 1:00pm belligerently yelling “I’M SMARTER THAN YOU!” at a fire hydrant because it won’t give you extra ketchup for your french fries…
The point is: it is POSSIBLE for two seemingly random things to be correlated, and if you don't understand why, all your A/B testing is doomed to fail.
These bizarre correlations are what I think about every time I get the question: “what button color gets the highest conversions?”
When I hear this question, I know the person asking it doesn’t fully understand A/B testing and will likely spend a lot of time testing the wrong things.
Sure, it is POSSIBLE that changing a button from green to blue increases conversions. But I’m not going to be convinced until I can see the raw data. And even then, I still wouldn’t change all the buttons on my website to blue.
In this post I’m going to show you why you shouldn’t be worrying about things like button color and what you should be focusing on instead. This will allow you to quickly take advantage of A/B testing for your landing pages and A/B testing for your opt-in forms to grow your audience and business.
Data Is Your Master (Obey Your Master)
If I told you to find out if people like french fries or onion rings better, you might start by walking down the street and asking the next 5 people you see. Three of them say “onion rings.”
That’s a majority, but have you really learned anything?
Would you be confident to say that, in general, people prefer onion rings over french fries? I hope not.
Every experiment runs into the issue of imperfect measurement. This is because there's randomness in tracking, randomness in people's behavior and randomness in uncontrolled (and perhaps unknown) external factors.
Maybe one of the people you asked just finished eating onion rings. That sweet and salty goodness is still fresh on their mind, so they answer "onion rings", even though, on average, their eating habits would show they definitely prefer (the obviously far inferior) french fries.
Or maybe the Worldwide Onion Ring Convention is being held just around the corner. As a result, there are many more die-hard onion ring fans walking around than on most days.
(Or perhaps the person you asked just finished reading this blog post, which has an obvious agenda to promote onion rings—the food of gods—and this subconsciously affects their decision.)
The solution to this problem?
Ask more people (gather more data). The more people you ask, the more confident you can be about where people in general stand on the issue of french fries vs. onion rings.
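If you want to put a rough number on that intuition, here's a minimal sketch in Python (assuming the statsmodels library is installed, and assuming a made-up scenario where 60% of respondents always pick onion rings). It shows how the range of plausible "true" preference rates shrinks as you ask more people:

```python
# A rough sketch with hypothetical numbers: how sure can we be about the true
# onion ring preference rate after asking 5, 50, or 500 people?
from statsmodels.stats.proportion import proportion_confint

for n_asked in (5, 50, 500):
    n_onion_rings = int(n_asked * 0.6)  # pretend 60% always answer "onion rings"
    low, high = proportion_confint(n_onion_rings, n_asked, alpha=0.05)
    print(f"Asked {n_asked:>3} people: true preference plausibly {low:.0%} to {high:.0%}")

# With 5 people, the interval covers most of the range from "fries win" to
# "onion rings win". With 500, it tightens to a few points around 60%.
```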
But how many more people do you need to ask to overcome the imperfect measurement issue? Well, that depends on how big the difference between the two things you’re testing actually is.
This brings us to the next problem:
How Much Data Do You Need?
Let's do another thought experiment. This time, imagine asking people to judge the weight of two different objects.
First up: a paper clip and a bowling ball. The hypothesis we propose is: the bowling ball is heavier.
The results from asking 10 people to guess the weights might look like this:
We didn’t ask many people, and everyone answered slightly differently on what they thought each item weighed. But because the two items are so drastically different, we can tell from these results that people generally find a bowling ball to be heavier than a paper clip.
Next, we repeat this same test, but with two new items:
Next up: a quart of milk and a liter of soda. The hypothesis we propose is: the liter of soda is heavier.
In this experiment, it’s a lot more difficult to come to a conclusion with the results, since the answers are overlapping.
We could ask more people to add more data to this test. Maybe then we could come to a conclusion about whether people in general guess that a liter of soda is heavier than a quart of milk, but even if we ask 100 people or more, we still might not get a highly confident answer.
Remember...
The more people we ask, the longer the test will take, and in the end we might not learn anything other than that humans just aren’t very good at distinguishing small differences in weight.
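If you want to see this play out in numbers, here's a quick simulated version of both experiments (Python, assuming numpy and scipy are available, with completely made-up guesses). The point isn't the exact values; it's how differently the two comparisons behave with only 10 guesses each:

```python
# Simulate 10 weight guesses per item (all numbers are hypothetical).
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng()

# Paper clip vs. bowling ball: the true weights are wildly different,
# so even noisy guesses separate cleanly.
paper_clip = rng.normal(loc=5, scale=2, size=10)          # guesses in grams
bowling_ball = rng.normal(loc=5500, scale=1500, size=10)

# Quart of milk vs. liter of soda: the true weights differ by a few percent,
# so the guesses overlap heavily.
milk = rng.normal(loc=980, scale=150, size=10)
soda = rng.normal(loc=1040, scale=150, size=10)

print(ttest_ind(bowling_ball, paper_clip))  # p-value: essentially zero
print(ttest_ind(soda, milk))                # p-value: usually nowhere near 0.05
```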
Milk, Soda & Button Colors
Testing different button colors is an example of testing a small change, where the difference in conversions you can measure is probably going to be very small. It's like our soda vs. milk example above.
This means you’re going to need a lot more data and time to get significant results.
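To give you a feel for the scale of "a lot more data", here's a hedged sketch using statsmodels' power calculations. The conversion rates are hypothetical, not from any real test:

```python
# How many visitors per variation to detect a lift with 80% power at alpha = 0.05?
from statsmodels.stats.proportion import proportion_effectsize
from statsmodels.stats.power import NormalIndPower

baseline = 0.05  # a hypothetical 5% conversion rate

tests = {
    "button color tweak (5.0% -> 5.2%)": 0.052,
    "completely new offer (5.0% -> 7.0%)": 0.07,
}

for name, lifted_rate in tests.items():
    effect = proportion_effectsize(lifted_rate, baseline)
    n = NormalIndPower().solve_power(effect_size=effect, alpha=0.05, power=0.8)
    print(f"{name}: ~{round(n):,} visitors per variation")

# The tiny tweak lands in the tens of thousands of visitors per variation;
# the big change needs only around a thousand.
```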
When you’re just starting out, it’s a much better use of time to perform bowling ball vs. paper clip type tests. They will give you better insights into your customers and are a much more effective use of your time to grow your business in the early stages.
We'll look at some examples of effective tests you can use immediately, but first, I want to address another issue I see with people new to A/B testing: following case studies...
Why A/B Testing Case Studies Are (Almost) Useless
You read about a case study where a change to a landing page made a HUGE difference in conversions, so you go ahead and apply the exact same change to your own site, right away.
Makes sense, right?
Wrong.
This approach doesn't work because A/B testing is context dependent.
This means results are extremely specific to the audience and business.
For example, this case study found that 88% of the clicks on the website they were testing happened below the fold.
So let's say you read this result and promptly move all the call to action buttons on your site below the fold. What happens?
If your product is, say, a 99 cent iPhone app, your sales might plummet.
The business in the case study is a university, where students are trying to decide if they should enroll.
One of the major differences here is the price point.
When people are looking to spend thousands of dollars (as they are when deciding on a college education), most of them need A LOT of information before they decide to buy. The visitors in the case study were scrolling below the fold to read testimonials and gather more info, so it makes sense to offer a call-to-action button at the bottom.
The same almost certainly won't apply for a low-priced product like a $0.99 iPhone app.
I get it: when you’re new, it’s easy to get caught up in the hype of A/B testing when you see results like:
But remember, the results are context dependent, so use case studies for ideas on WHAT you can test, but don’t copy them expecting to get the same results.
Conversion Optimization Without the Time Wasting Mistakes
People often think A/B testing is too complicated and intimidating, and as you’ve just seen, there are some nuances to consider. But let’s look at some methods you can apply today to make A/B testing quick and easy.
Step 1: Do You Have Enough Data?
Use this calculator to get a general idea of how long you need to test based on how many conversions you get per day.
As a general guideline, you should test for at least 14 days. Use the calculator (or our automatic winner settings), and if you’re not anywhere close to being able to get significant results in 2 weeks, go to step 2. Otherwise, skip to step 4.
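If you'd rather sanity-check the calculator's answer yourself, the arithmetic behind it is simple. Here's a back-of-the-envelope sketch (all the numbers are hypothetical, so plug in your own):

```python
# Rough test-duration estimate from your own traffic numbers.
visitors_per_day = 400        # hypothetical traffic to the page being tested
variations = 2                # A and B
needed_per_variation = 6_000  # hypothetical output of a sample size calculator

days_needed = needed_per_variation / (visitors_per_day / variations)
print(f"Estimated test duration: {days_needed:.0f} days")  # 30 days in this example

# Way past two weeks? That's your cue for Step 2 below.
```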
Step 2: Get More Traffic
If you have a low traffic website there are 3 ways you can quickly optimize it:
- Only fix what’s broken.
- Run usability tests.
- Create 1 funnel and drive traffic to it.
Here's a quick guide explaining this approach in more detail: How To Optimize a Website with Low Traffic
Step 3: Deliver Something Good Enough. Quickly.
Initially, the goal shouldn't be to create the perfect landing page or website.
The goal should be to build something effective and get it published as soon as possible. Even if you don't have any traffic to run tests on, you can follow some basic guidelines and best practices to ensure what you publish is good enough to start with.
Here is a great FREE course that will teach you how to build highly effective landing pages to increase your conversion rate for any market. The course will show you how to use a principle called rapid implementation - one of the most important principles you can learn as an online entrepreneur.
Remember:
Your chances of ever creating something great can be predicted by your ability to DELIVER something simple.
Step 4: Test Paper Clips vs. Bowling Balls
Focus on testing high-impact, big changes rather than small tweaks like button colors.
Big changes are more likely to lead to a big difference in conversion rates and thus lead to significant results (and significant insights into your customers) faster.
Here are ideas of high-impact items you can test:
- Does [Free Report A] or [Free Report B] convert better?
- Does an extremely long sales page convert better than a short sales message?
- What value proposition or overall sales message performs better?
Set up A/B tests for a few of these high-impact items. Even if you don’t have much traffic and don’t get an answer quickly, testing big changes means you’ll get a clear answer eventually.
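And when the results do come in, checking whether the difference is real doesn't require anything fancy. Here's a minimal sketch (assuming statsmodels, with made-up numbers) for a "Free Report A vs. Free Report B" style test:

```python
# Did Free Report B really out-convert Free Report A, or is it just noise?
from statsmodels.stats.proportion import proportions_ztest

optins = [120, 165]      # hypothetical opt-ins for A and B
visitors = [2000, 2000]  # hypothetical visitors for A and B

z_stat, p_value = proportions_ztest(optins, visitors)
print(f"A: {optins[0] / visitors[0]:.1%}, B: {optins[1] / visitors[1]:.1%}, p = {p_value:.3f}")

# A p-value well under 0.05 suggests the gap is probably real, not random noise.
```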
Again, this free course is great for getting A/B test results by learning how to deploy landing pages in record time.
Step 5: Keep Testing But Realize...
It’s important to have an idea of what A/B testing is actually like as you continue.
This could be an entire course by itself, so I’ll keep it to a broad overview, with a few examples to give you a taste.
A/B Testing is a Marathon, Not a Sprint
It can be an extremely effective tactic to grow your business, but it's a long-term tactic, not an overnight fix.
For example, Fiverr saw a 457% increase in landing page registrations, but it took 10 months, over 400 tests, and at least 500,000 unique visitors for each of those tests.
Huge Wins Aren’t The Norm
When you start to get enough traffic and enough data, you can switch to more focused testing. That means taking what you’ve learned works and testing small adjustments, like headlines, to see if you can improve your conversion rates further.
You can even take the testing to the extreme and test small details.
For example, the lingerie company Adore Me is obsessively diligent in its A/B testing, even testing how the different hand positions its models use affect sales.
It’s important to realize that when you get to this point, you’re not going to be seeing HUGE wins and drastic improvements.
So, realize that small increments can add up over time, but sometimes you’ll spend a lot of time and resources without increasing your conversions at all.
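For what it's worth, the compounding math behind "small increments add up" is real. Here's a quick sketch with hypothetical numbers:

```python
# Ten separate, modest 3% wins don't just add up to 30%; they compound to a bit more.
lift_per_win = 0.03
wins = 10
overall = (1 + lift_per_win) ** wins - 1
print(f"Overall lift after {wins} wins: {overall:.0%}")  # about 34%
```

The catch, of course, is that each of those wins has to actually materialize, which brings us back to the main point: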
Try to avoid chasing after small details and tweaks. No matter how much traffic you get, testing big changes is always more valuable. Even the mighty Google, famously data-driven and analytical as they are, doesn’t have enough traffic for some tests.
Watch the video below if you want to see WP Engine’s Jason Cohen rip apart Google’s famous test of 41 shades of blue for their buttons.
Your Testing Advantage
You’re now armed with more insight into how to perform A/B tests and optimize your site than most business owners will ever have.
Keep in mind that, in the end, A/B testing doesn’t have to be hard if you’re using the right tools (Thrive Leads for your opt-in forms & Thrive Optimize for your landing pages), and it isn’t so much about technical details as it is a way to help you understand your visitors and users.
One way of looking at it is by asking yourself: will this test give me new information about who my customers are? If the answer is “no”, there’s probably a better test you can run.
If you have any questions or feedback about this post or want to chime in on the onion rings vs. french fry war, please let me know by leaving a comment!
And also:
LONG LIVE TEAM ONION RINGS!
Nice one Dave 🙂
It’s so nice to see some unfortunately uncommon, ‘common’ sense brought to bear on this subject.
Anybody who has spent any time with statistics and the heuristics that govern our unconscious behaviour will know that between ‘priming’ (what precedes something conditions our response to it) and a small data pool, all sorts of conclusions can be deduced.
It is not without good reason that the adage “Lies, damned lies, and statistics” was popularised by Twain.
Thanks again for shedding a strong light on this.
Steve
Such a good quote!
Glad you enjoyed the post! Statistics is definitely a powerful tool that is used poorly (and wrongly) so often, but it can be a hard concept to fully grasp as well. It was my goal (and hope) to make it easier to understand so it can become more common sense.
Thank you so much for the feedback, I appreciate it 🙂
Great article. Helpful and takes away some of the pressure for a newbie like me. And now I want onion rings.
Glad you enjoyed the article, Lauren!
Goals:
1) Take the pressure away for newbies
2) Induce onion rings cravings
Check aaaaaaaaaaaaaand check 🙂
Thanks for the feedback, Lauren 🙂
What an awesome article, and I’d agree with Steve: total common sense brought to the table that both veterans and newcomers to A/B split-testing can digest and implement.
The kind of post I love.
Tony (your new fan) B
A new fan?! That’s what I love to hear 🙂
Glad you enjoyed the post, Tony! And I’m glad you found it digestible and actionable.
Cheers,
Dave (and now you have a new fan) D
I KNEW there was a good reason I like to drink and smoke “stuff”. I’m an intellectually superior being.
Seems like I should’ve already known that…
Actually, I’m much more intelligent when I’m drinking. You can even ask me.
Hahahaha drunk Scott knows what he’s talking about. Tell that intellectually superior being I say hello 🙂
Definitely Onion Rings!
My man, William! Good choice 🙂
For me it’s Belgian fries, not French fries 🙂
Good article Dave D, love to see more of that.
Also good to see more and more variety in the authors blogging here.
Ah yes, frieten met mayonnaise (fries with mayonnaise) are super tasty!
Thanks so much for the feedback name twin! I’m glad you enjoyed the article and I’m glad you’re enjoying the new authors popping up, because there will be more 🙂
Great spin on a topic too rarely discussed in length!
Thanks Jay 🙂
Well, I still prefer french fries :).
Nice article! And like Lauren, this information makes me feel better!
Cheers
Stoked the post made you feel better, but kinda sad I couldn’t turn you into an Onion Ring Convert. I guess that means more O-Rings for me 🙂
Thanks for the great feedback, Jorge!
Seems like we agree on so many things! I also die a little inside each time people ask me whether placing the button on the top left or middle left of the image will drive better conversions. Lol.
Haha, ah that’s another good example! I’m glad (but also sad) we can relate to each other on this, so my hope with the post was to help educate more people on the topic! Glad you enjoyed the post Jason 🙂
Thoroughly enjoyed reading this article. I have never done A/B Testing but after going through the Rapid Landing Page building course and reading through this article, I can’t wait to go testing.
Onion Rings, FTW! Alas, we don’t have a lot of joints that serve them here in the Philippines.
Glad you found it helpful, but sad at your lack of access to delicious onion rings…