Discover How You Can A/B Test Even With Low Traffic


Traffic doesn’t matter.

Please don’t boo me off stage just yet... Give me a few minutes to explain why this is.

When we talk about A/B testing, the very first concern people always have is: Do I have enough traffic?

Harry's comment on the Thrive Optimize A/B testing webinar sums it up perfectly:

"This sounds all very well, but the requirement is that you need a lot of traffic on your website to have enough data to do A/B testing." 

And I understand where he’s coming from, but the full story is more nuanced than that...


The 3 Concepts You Need to Understand About A/B Testing

To fully grasp the nuances of A/B testing, there are 3 concepts you need to understand.

  1. Randomness
  2. The paperclip vs. the bowling ball
  3. Conversion Delta

#1 It Might Just Be Completely Random

Imagine testing a lead generation landing page. Your original page (or as it’s called in A/B testing terms: the control version) has a video on it and the test variation doesn’t. Your A/B testing tool will randomly show one or the other version to your visitors.

Person 1 sees the version without video and signs up.
Person 2 sees the version with video and does not sign up.
Person 3 sees the version with video and does not sign up.
Person 4 sees the version without video and signs up.
Person 5 sees the version without video and doesn’t sign up.
Person 6 sees the version with video and signs up.

All of a sudden, the test variation without a video has twice as many conversions as the original control version…

But at this point, it could just be completely random!

Maybe persons 1 and 4 would have signed up even on the video landing page. That’s why you need enough data: to cut through the noise of randomness.

For now, just understand that there will ALWAYS be randomness in any test; more data reduces the noise and improves your results.
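If you want to see this noise with your own eyes, here’s a minimal simulation sketch (plain Python, just for illustration, not part of any Thrive tool) where both versions convert at exactly the same true rate:

```python
import random

# Both versions have the SAME true conversion rate: 40%.
# Any difference we observe between them is pure randomness.
TRUE_RATE = 0.40

def simulate(visitors):
    """Send `visitors` people to each version and count signups."""
    control = sum(random.random() < TRUE_RATE for _ in range(visitors))
    variation = sum(random.random() < TRUE_RATE for _ in range(visitors))
    return control, variation

for n in (3, 30, 3000):
    control, variation = simulate(n)
    print(f"{n} visitors per version: control={control}, variation={variation}")
```

Run it a few times: with 3 visitors per version, one page will often look like it “wins” by a wide margin; with 3,000, the two counts end up nearly identical.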

Which brings us to the second principle…

#2 The Paperclip and the Bowling Ball

Can you guess the weight of the paperclip? How about the bowling ball?

If you ask 10 people to guess the weight of the bowling ball and the weight of the paperclip, their answers might look something like this:

There’s no doubt which one is heavier...

If you’re using that data to decide which one is heavier, there will be little doubt: the bowling ball is significantly heavier than the paperclip.

If you repeat the same experiment with an apple and an orange, people’s guesses will overlap so much that it’s impossible to conclude from the data which one is heavier, no matter how many people you ask.

The same holds true for your A/B tests.

The bigger the difference in performance between the two versions, the sooner you’ll find out whether it’s a real difference or simply randomness.

Note: What we refer to here as a real difference is also called “statistically significant”. This means that some complicated math has proven that there is a >95% chance the result is not due to pure randomness.

Let’s go back to our lead generation page example and assume the original control page with a video got 1,000 visitors, just like the test version without video.

Signups on the control version (with video)    Signups on the test version (without video)    Random or statistically significant?
10                                             15                                             Random
10                                             20                                             Statistically significant
100                                            150                                            Statistically significant
200                                            220                                            Random

Random or statistically significant results with 1,000 visitors per page

As you can see, the total number of conversions is less important than the magnitude of the difference in conversions.

The good thing is that you don’t need to understand all the complicated math behind this… But you do need to remember: the bigger the difference in the number of conversions, the sooner you’ll get a statistically significant result instead of one based on randomness.
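If you’re curious what that math looks like, here’s a minimal sketch in plain Python of the kind of check an A/B calculator runs under the hood. The `is_significant` helper is hypothetical (not part of Thrive Optimize), and it uses a one-sided two-proportion z-test, which is how many simple A/B calculators report confidence; a two-sided test would be stricter:

```python
from math import sqrt
from statistics import NormalDist

def is_significant(conv_a, conv_b, n_a, n_b, confidence=0.95):
    """One-sided two-proportion z-test, a common check behind
    A/B testing calculators (hypothetical helper for illustration)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)            # overall signup rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = abs(p_b - p_a) / se                             # std errors apart
    p_value = 1 - NormalDist().cdf(z)                   # chance it's randomness
    return p_value < 1 - confidence

# The four rows of the table above, 1,000 visitors per page:
for a, b in [(10, 15), (10, 20), (100, 150), (200, 220)]:
    verdict = "statistically significant" if is_significant(a, b, 1000, 1000) else "random"
    print(f"{a} vs {b} signups: {verdict}")
```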

Now that you understand this, you know why we cry a little each time someone asks us which button color performs best…

We shed tears because changing a button color is NOT a bowling ball vs. paperclip change, and you’ll need a ridiculous amount of data to filter out the randomness.

That’s why we always advocate making BIG differences in your variations.

Examples of big changes are:

  • Video vs. no video
  • Short animated video vs. long talking head video
  • Completely different value proposition
  • Different prices
  • Different (opt-in) offer
  • ...

And these are small changes:

  • Changing a button color
  • Swapping out one word in your headline
  • Anything a friend who doesn’t know your page by heart wouldn’t notice.

#3 Conversion Delta

Now that you understand the above principles it’s time to prove why traffic is not what matters… Conversions are!

If we take the exact same data as above but get only 100 visitors instead of 1,000 to each page, the split between statistically significant and random results doesn’t change.

Signups on the control version (with video)    Signups on the test version (without video)    Random or statistically significant?
10                                             15                                             Random
10                                             20                                             Statistically significant

Random or statistically significant results with 100 visitors per page

Which means that even if you only have 100 visitors to each page, if you’re seeing 10 signups on one and 20 signups on the other, your A/B test is valid and you just increased your conversion rate significantly.

I know this is hard to believe, but don’t take my word for it, test it yourself with this A/B testing calculator.
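Or reuse the hypothetical `is_significant` sketch from above with only 100 visitors per page:

```python
# Same check as before, now with only 100 visitors per page:
print(is_significant(10, 20, 100, 100))   # True: 10% vs. 20% is a real lift
print(is_significant(10, 15, 100, 100))   # False: could still be randomness
```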

An Important Note on Randomness Because of Timing

Have you ever held a yard sale? If you have, you know you’ll sell way more early in the morning (when the buyers are hunting bargains) than in the afternoon (when the families are out on a Sunday trip).

The same is true for your landing pages. You’ll get different types of visitors depending on the time of the day and the day of the week.

That’s why we urge you to keep your test running for at least 2 weeks, even if you see a statistically significant result before that.


How do you feel right now? Did A/B testing just become a possibility for your landing pages?

Now, you might have noticed that if you’re only getting 100 people to a page and you need 20 of them to become subscribers, you’ll need a much higher conversion rate (20%) than if you were getting 1,000 visitors to that page (2%)…
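If you like, you can estimate in advance how many visitors a given lift needs. Here’s a rough back-of-the-envelope sketch using the standard two-proportion sample-size formula (the `visitors_needed` helper is, again, just an illustration, not a Thrive feature):

```python
from math import ceil, sqrt
from statistics import NormalDist

def visitors_needed(p_control, p_variation, confidence=0.95, power=0.80):
    """Rough visitors needed PER VARIATION to reliably detect the
    difference between two conversion rates (standard two-proportion
    sample-size formula; hypothetical helper for illustration)."""
    z_alpha = NormalDist().inv_cdf(confidence)    # one-sided, as above
    z_beta = NormalDist().inv_cdf(power)
    variance = p_control * (1 - p_control) + p_variation * (1 - p_variation)
    return ceil((z_alpha + z_beta) ** 2 * variance / (p_control - p_variation) ** 2)

print(visitors_needed(0.10, 0.20))   # bowling-ball lift: ~155 visitors each
print(visitors_needed(0.10, 0.11))   # paperclip lift: ~11,600 visitors each
```

Note how the bowling-ball lift gets by on a couple hundred visitors while the paperclip lift needs over ten thousand: the same lesson as the tables above.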

So let’s have a look at where to start in order to get to those mouthwatering conversion rates that you’ll need to make up for low traffic.

How to Get Enough Conversions

If you’re looking at your website stats and you’re not seeing a significant number of conversions on the page you want to test, here are 3 things you can do immediately: 

1) Test Higher Up in the Funnel

On your sales pages, you might only get a few sales a month. But what about your lead generation landing page?

Giving away something for free will always get more conversions than selling something.

That’s why you should start by testing the pages that get the most conversions.

And you know what the beauty of this is? You’ll be getting more leads by optimizing your lead generation landing page, which increases the chances you’ll sell more and eventually get to enough conversions to start A/B testing your sales pages!

If you don’t have a proper lead generation landing page in place, you can follow this step-by-step tutorial.

2) Make a (better) Opt-in Offer 

If you have a lead generation landing page or an opt-in offer on your site but you’re not seeing any conversions, chances are your opt-in offer isn’t good.

Here’s how you can improve it.

3) Turn Your Homepage into a Lead Generation Page


The quickest way to see a conversion boost is to turn your homepage into a lead generation page.

This means placing your opt-in offer and a lead generation element at the top of your homepage.

Naturally, your homepage is one of the most visited pages on your site and adding that lead generation element can make all the difference in your conversions.

What If It’s Really Just You and Your Mum?

If you’re just starting out and your website really gets no visitors, you’ll have to fix that first.

Here’s how to improve a low traffic website in 3 simple steps (and which mistakes to avoid).

What's Holding You Back?

You made it all the way through! Now, I would like to know, is there still something that’s holding you back? Or are you ready to give A/B testing a chance?

P.S.: If you're looking for an ultra-usable tool to start A/B testing your pages, check out Thrive Optimize. It allows you to set up a new test in under a minute!

by Hanne  February 15, 2018



Comments

  • You guys constantly mention A/B testing of landing pages. But what about the other content pages on a website? Aren’t they worth A/B testing? Thrive Leads does A/B testing. What is the difference?

    Furthermore, it seems to me that there’s some confusion in the definition of a landing page. Is it any page that has been selected as the target of an ad, or is it a page with only one call to action and no links at all? If the latter, then “landing” is probably not the right word to describe it.

    • Hi Javier,

      Sure, testing your opt-in forms is definitely a good idea and we talk a lot about that too (as you can see in this article or this article).
      About the term landing page, I assure you that it’s not something we invented… A landing page is indeed any page that has a specific call to action and (usually) no other links (such as a top menu or sidebar). It’s not only a page you would send paid traffic to (even though that can be the case).

      • Thank you for your response, but what I was asking is why Thrive Optimize is for landing pages only (that’s what the product header says). Not for any other content pages? Why not?

      • No, that’s not planned. Are you using Headline Optimizer? That will already give you the 80/20 to increase the engagement on your blog posts!

  • This is a wonderfully simple, but clear, explanation of the benefits of ‘meaningful’ split-testing – no matter how much traffic your site is getting.

    Makes a lot of sense, and I shall seriously consider purchasing Thrive Optimise, and adding it to Thrive Architect purchase. Thank you Hanne 🙂

  • Thank you so much Hanne.

    I love it when you guys base your explanations on scientific methods, and I totally agree with the statistically significant data with a confidence level of 95% and an alpha of 5%.

    Uhmmm, which got me thinking (even when I have not tried the plugin yet) that it would be advantageous to have those kind of settings as selection tools inside the plugin, just to make sure what confidence level people want to choose to run their A/B tests based on statistically significant data.

    I´ll give it a try…

    Take care Hanne.

    Luis.

    • Hi Luis,
      In the plugin, you’ll see the “automatic winner settings”. Here you can choose the confidence level you want, how much time, and how many conversions 🙂

  • Another great tutorial and simplified explanation on yet another great addition to the Thrive suite. I’m sure that when I get into a position to use this it will be perfect for my needs and will help greatly.

    Nevertheless, like so many here, my mind is also focussed with anticipation on something else Thrive related and I find myself coming back to the Thrive blog section to see if there are any further developments and updates on the elusive ‘Elephant in the room’ that is…… the new Thrive Theme.

    Hanne, Shane, Stephanie!!! Can you please give us something? Screenshot? Teaser? Taster? Insights? Anything?
