Starting Your First A/B Test? Brace for Disappointment!

A/B testing is a cornerstone of building a truly effective online business and something we always encourage our users to do. However, you might find the experience of A/B testing quite frustrating, especially if you're new to it and you came in with stories from spectacular case studies (like this one) buzzing in your head.

In today's post, you'll discover why A/B testing results can often seem disappointing and what you need to do to get good value out of them anyway.


Learn to Love Small Wins

"Slow and steady wins the race" is a platitude, but it certainly applies to A/B testing.

It's exceedingly rare to see a huge win on any single test, and if you expect to turn your business around on a single split test, you're in for disappointment.

Even when testing does result in dramatic wins - like the 200%+ increase in lead conversions from this post - it's often the result of a series of tests, many of which may have been inconclusive.

What's more, spectacular-looking case studies are often just plain wrong. Two common problems you'll see when you take a closer look are:

  1. The test wasn't run for long enough, so the result is actually just randomness and not a real performance difference.
  2. The control is extremely bad. It's not difficult to double the conversions on a horribly designed landing page that takes 2 minutes to load... A case study is really only interesting if the control already performs decently.

To benefit from A/B testing in the long run, make testing a habit (you should always have some test running somewhere), get used to waiting 2 weeks or more before drawing conclusions from a test, and learn to love the small wins.
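To get a feel for why short tests mislead, here's a minimal stdlib-only sketch of a two-proportion z-test (the conversion counts below are made-up numbers for illustration, not from any real test):

```python
from math import sqrt, erf

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: returns (relative lift, two-sided p-value) for B vs. A."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Standard normal CDF via the error function
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return (p_b - p_a) / p_a, p_value

# The same 50% relative lift, at two different sample sizes:
print(ab_significance(10, 500, 15, 500))      # p > 0.05: inconclusive noise
print(ab_significance(100, 5000, 150, 5000))  # p < 0.05: now it means something
```

The observed lift is identical in both cases; only the sample size decides whether it's evidence or randomness, which is exactly why stopping a test early makes case-study numbers untrustworthy.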


There is one exception that comes to mind: scarcity.

I've used time constraints and limited offers for many years now, but the effect never ceases to surprise me. Adding scarcity is an almost frighteningly reliable way to produce a big spike in conversions and sales. That's why we created Thrive Ultimatum.

As cool as it is to make use of scarcity, it shouldn't be the only thing you rely on. A benefit you get from A/B testing is insight: the winning variations can tell you things about how your audience thinks and feels and what matters to them. That's something you don't get from using scarcity - all it tells you is that people make a decision when you light a fire under their butts.

Your Turn!

If you want to improve your website and tap into the power of A/B testing, the bottom line is this: realize that many tests are inconclusive or only lead to marginal improvements. And don't let that discourage you. Instead, take those small wins and keep on testing.

If you're not sure where to get started, check out this post about the first 3 things you should optimize and test on any landing page.

You can also check out the Rapid Landing Pages course (which is free) to learn a simple framework for building conversion-optimized pages, as well as a framework for what to test and in what order.

What has your experience with A/B testing been like? Any big wins or frustratingly inconclusive results? Let us know by leaving a comment below!


Author: Shane Melaugh

Shane Melaugh is a co-founder and the CEO of Thrive Themes. When he isn't plotting new ways to create awesome WordPress themes & plugins, he likes to geek out about camera equipment and medieval swords. He also writes about startups and marketing here.

  • Robin C says:

    Signed up for the course twice but no confirmation email. It is not in Spam Experts quarantine. I receive other mails from you fine.

    • Shane Melaugh says:

      Hi Robin,

      Sorry for that. I just updated the form and tested it again. It’s definitely working for me. I will send you an email with the course link, so you don’t have to try again.

  • Matt says:

    Hi Shane, great work with putting this together!

  • Lee says:

    What an elegant explanation. You always add such value to your content.

  • Ian B says:

    Worth noting when you look at other people’s spectacular results that:

    a) They may have run 100 tests with no or small results, but they’re only going to blog about the big winners
    b) Even if they do their tests properly then with a 95% significance there’s still a 5% chance that this result was pure chance
    c) The significance level relates to the winning arm being better than the losing arm – but not necessarily by the percentage increase recorded. In other words, if you run a test and A beats B by 50% with a 95% significance, that means that there is a 95% chance that A is better than B – but it might only be better by 1% or 2% – not necessarily the full 50%


    • Shane Melaugh says:

      Very good points, yes. Especially the second one can be infuriating to the human brain, I think. It’s just difficult for us to understand randomness in this way, but it’s totally true.
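Ian's points about only the big winners getting blogged about can be illustrated with a quick simulation. All numbers here are hypothetical: a 2% baseline conversion rate, a true lift of only 5%, and a publication filter that only keeps tests where B "won" by 30% or more:

```python
import random

def simulate_winners_curse(true_lift=0.05, base_rate=0.02, n=2000, trials=2000):
    """Run many A/B tests where B is genuinely 5% better than A, keep only
    the impressive-looking winners, and return their average observed lift."""
    random.seed(42)
    observed = []
    for _ in range(trials):
        conv_a = sum(random.random() < base_rate for _ in range(n))
        conv_b = sum(random.random() < base_rate * (1 + true_lift) for _ in range(n))
        # Only "publish" tests where B beat A by 30% or more
        if conv_a and conv_b > 1.3 * conv_a:
            observed.append((conv_b - conv_a) / conv_a)
    return sum(observed) / len(observed)

# The average published lift is far above the true 5%
print(simulate_winners_curse())
```

Every test that survives the filter reports at least a 30% lift, even though the real improvement is 5%. That selection effect alone explains a lot of spectacular case-study numbers.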

  • Johannes D says:

    I like it. It’s refreshing hearing an honest opinion about something we deal with every day! My question for you: Do you have a rule of thumb as to how long you will let your tests run? Or how many impressions/conversions you are going for until you make a decision?

    • Shane Melaugh says:

      Hi Johannes,

      I generally let a test run for at least 2 weeks. If you watch any kind of stats on your site, you’ll see that there’s a weekly cycle, just like there are daily cycles. So, you might get more traffic or more sales on some days of the week than on others.

      Because of that, any test that runs for less than a week is more likely to produce false results.

      For conversions, a good rule of thumb is to let the test run until you have 100 conversions per contender in the test. So, if you’re testing 3 different pages, leave the test running until you have at least 300 total conversions.

      But as with all things, these rules are very context dependent.
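Shane's rule of thumb translates into traffic requirements with simple arithmetic. A rough sketch (the 2% conversion rate is an assumed example value, not a benchmark):

```python
def visitors_needed(conversions_per_variant=100, conversion_rate=0.02, variants=2):
    """Estimate total visitors required to reach ~100 conversions per variant,
    given an assumed conversion rate (2% here) and an even traffic split."""
    per_variant = conversions_per_variant / conversion_rate
    return int(per_variant * variants)

print(visitors_needed())            # 2 variants at 2%: 10000 visitors
print(visitors_needed(variants=3))  # 3 pages at 2%: 15000 visitors
```

This is why low-traffic pages can take weeks to produce a conclusive result, and why the 2-week minimum and the 100-conversion rule usually point in the same direction.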

  • Mehdi says:

    Is this post indicating a new coming plugin… ? :)

  • Mike Baker says:

    Are you not going to make an A/B split testing feature for your landing pages then?

    • Shane Melaugh says:

      Don’t worry, we are. :)

      But when you use it, it will be good to know that not every test you run will be a smashing success.

  • Quentin P says:

    An A/B test can go horribly wrong if you target the wrong audience in the first place. This is why I rely solely on paid traffic to test. It’s the only way I know the right traffic is being sent in the first place (organic can really upset the results). That’s my tuppence worth anyway. Great article Shane. It’s great you’ve highlighted this issue.

    • Shane Melaugh says:

      Thanks for your comment, Quentin! Paid traffic can definitely help create more of a “hermetically sealed” environment for your tests.

  • Bryce M says:

    Wonderful insights Shane! Thanks for bringing this up

  • Larry Rampulla says:

    Thrive Themes rocks. You reach out to your customers with Tips, Tricks and Traps to avoid. Your engine is running on all cylinders. You communicate well offering high touch high value as a good business model to follow. Marketing only works when businesses commit to a campaign run over time. Anyone expecting immediate results will always be disappointed. I brag on Thrive Themes having a great product made better by offering explanations and examples showcasing how and why to best use it.

  • Tamás says:

    What if the users in the first period of the test have different habits than the users at the end?

    • Shane Melaugh says:

      Hello Tamás,

      This is why tests need to be run for long enough periods of time. What you see in this typical graph is the randomness (big differences in the beginning), which is reduced the longer the test runs. By running a test long enough, we can ensure that the sample of people included in the test (and their behavior) is representative of our visitors in general.

  • darlene says:

    I’m looking at “Done For You” sales and/or marketing funnel companies. The two I like only work with ClickFunnel, and I don’t want to switch from Thrive to CF.

    Is there anyone on your team, or do you know of a reputable company I can hire?

    thx, d

  • Jakob D says:

    I recently ran a test comparing ads + affiliate links against just affiliate links, expecting to see dramatic results… but I didn't. Not exactly an A/B split test, but a test nonetheless, and one I expected something more conclusive from than what I actually got…

    • Shane Melaugh says:

      Hi Jakob,

      Yeah, even apart from A/B tests, I think experimenting with different strategies on your site is always a good idea. It’s good to question “does this work?” instead of just assuming that it does.

  • Tony says:

    I will consider buying Thrive Themes. It really is great!

  • AC says:

    I’m not sure what your point of this article is? disappointment from small gains? how is that disappointment? can you elaborate

    • Shane Melaugh says:

      Hello AC,

      I don’t know if I can explain it better than the way I did in the video. The points are to A) not prematurely stop tests, B) not expect huge wins all the time, like you generally read about in case studies and C) make a habit out of continually testing, so that small wins can accumulate over time.
