Starting Your First A/B Test? Brace for Disappointment!

Shane Melaugh

If you want to build a truly effective online business, A/B testing is a must: it's what leads to truly kick-ass pages and conversion rates.

Unfortunately, the experience of testing can be frustrating, especially if you're new to it and you came in with stories from spectacular case studies (like this one) buzzing in your head. We'd love to experience 200% conversion increases every time we run a test, but the reality of conversion optimization isn't quite so exciting.

In today's post, you'll discover why A/B testing results can often seem disappointing and what you need to do to get good value out of them anyway.


Learn to Love Small Wins

"Slow and steady wins the race" is a platitude, but it holds true for A/B testing.

It's exceedingly rare to see a huge win on any single test. That means that if you expect to turn your business around on a single split-test, you're in for disappointment.

Even when testing does result in dramatic wins - like the 200%+ increase in lead conversions from this post - it's often the result of a series of tests, many of which may have been inconclusive.

What's more, spectacular-looking case studies are often just plain wrong. Two common problems you'll often see when you take a closer look are:

  1. The test wasn't run for long enough, so the result is actually just randomness and not a real performance difference.
  2. The control is extremely bad. It's not difficult to double the conversions on a truly awful landing page. A case study is really only interesting if the control - the original version of a page that we're trying to improve - already performs well.
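The first problem above is easy to demonstrate with a quick simulation. This is just a sketch with made-up numbers: two variants that convert at exactly the same rate, tested on far too few visitors:

```python
import random

random.seed(42)

TRUE_RATE = 0.05   # both variants convert at exactly the same rate
VISITORS = 200     # an underpowered test, stopped far too early
TRIALS = 10_000

big_fake_wins = 0
for _ in range(TRIALS):
    # simulate two arms of an A/B test with zero real difference
    a = sum(random.random() < TRUE_RATE for _ in range(VISITORS))
    b = sum(random.random() < TRUE_RATE for _ in range(VISITORS))
    # count runs where the "winner" appears to beat the loser by 30%+
    if min(a, b) > 0 and max(a, b) >= 1.3 * min(a, b):
        big_fake_wins += 1

print(f"{100 * big_fake_wins / TRIALS:.0f}% of identical-variant tests "
      f"showed a 30%+ 'lift' by pure chance")
```

Even though both variants are identical, a sizable share of these short tests shows what looks like a dramatic win. That's exactly the kind of "result" an underpowered case study can end up reporting.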

What's the takeaway from this?

To benefit from A/B testing in the long run, make it a habit. Ideally, you should always have some test running somewhere. Also, get used to waiting for 2 weeks or more before drawing conclusions from a test. And finally: learn to love the small wins.

Low-Friction Testing

All of the above led us to create Thrive Optimize, our A/B testing plugin for WordPress. Many other testing tools were already available, some even for free. However, we found that setting up and starting a test could often be maddeningly complicated, involving countless steps, switching back and forth between different dashboards and copy-pasting chunks of code.

Thrive Optimize is not the most feature-rich or complicated testing tool available. And that's on purpose. Thrive Optimize is the fastest, most frictionless way for you to start an A/B test on your WordPress site.

Because not every test will lead to a big win and because you need to make testing a habit, not something that happens on rare occasions only, we emphasize this "frictionless" aspect. With Thrive Optimize, you can start your tests quickly and painlessly. This makes it easier to test regularly and consistently. And that's how you get those small wins stacking up over time.

Are There Exceptions?

There is one exception that comes to mind: scarcity.

I've used time constraints and limited offers for many years now, but the effect never ceases to surprise me. Adding scarcity is an almost frighteningly reliable way to produce a big spike in conversions and sales. That's why we created Thrive Ultimatum.

As cool as it is to make use of scarcity, it shouldn't be the only thing you rely on. A benefit you get from A/B testing is insight: the winning variations can tell you things about how your audience thinks and feels and what matters to them. That's something you don't get from using scarcity - all it tells you is that people make a decision once you light a fire under their butts.

Your Turn!

If you want to improve your website and tap into the power of A/B testing, the bottom line is this: realize that many tests are inconclusive or only lead to marginal improvements. And don't let that discourage you. Instead, take those small wins and keep on testing.

If you're not sure where to get started, check out this post about the first 3 things you should optimize and test on any landing page.

You can also check out the Rapid Landing Pages course (which is free) to learn a simple framework for building conversion-optimized pages, as well as a framework for what to test and in what order.

And finally, grab your copy of Thrive Optimize to take the pain and complication out of A/B testing.

What has your experience with A/B testing been like? Any big wins or frustratingly inconclusive results? Let us know by leaving a comment below!


by Shane Melaugh  April 12, 2018



Leave a Comment

  • Signed up for the course twice but no confirmation email. It is not in Spam Experts quarantine. I receive other emails from you fine.

    • Hi Robin,

      Sorry for that. I just updated the form and tested it again. It’s definitely working for me. I will send you an email with the course link, so you don’t have to try again.

  • Worth noting when you look at other people’s spectacular results that:

    a) They may have run 100 tests with no or small results, but they’re only going to blog about the big winners
    b) Even if they do their tests properly then with a 95% significance there’s still a 5% chance that this result was pure chance
    c) The significance level relates to the winning arm being better than the losing arm – but not necessarily by the percentage increase recorded. In other words, if you run a test and A beats B by 50% with a 95% significance, that means that there is a 95% chance that A is better than B – but it might only be better by 1% or 2% – not necessarily the full 50%
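Point (c) can be made concrete with a small calculation. This is a sketch with made-up numbers (2,000 visitors per arm, converting at 3% vs 4.5%), using a normal-approximation confidence interval for the difference between two proportions:

```python
import math

# Made-up test result: B appears to beat A by a 50% relative lift
visitors_a, conv_a = 2000, 60   # A converts at 3.0%
visitors_b, conv_b = 2000, 90   # B converts at 4.5% -> observed +50% lift

p_a, p_b = conv_a / visitors_a, conv_b / visitors_b
diff = p_b - p_a

# Standard error of the difference between two proportions
se = math.sqrt(p_a * (1 - p_a) / visitors_a + p_b * (1 - p_b) / visitors_b)

# 95% confidence interval for the absolute difference
low, high = diff - 1.96 * se, diff + 1.96 * se

print(f"Observed relative lift: {100 * diff / p_a:.0f}%")
print(f"95% CI for the relative lift: "
      f"{100 * low / p_a:.0f}% to {100 * high / p_a:.0f}%")
```

The interval excludes zero, so the result is "significant at 95%" - yet the true lift could plausibly be closer to 10% than to the observed 50%.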


    • Very good points, yes. Especially the second one can be infuriating to the human brain, I think. It’s just difficult for us to understand randomness in this way, but it’s totally true.

  • I like it. It’s refreshing hearing an honest opinion about something we deal with every day! My question for you: Do you have a rule of thumb as to how long you will let your tests run? Or how many impressions/conversions you are going for until you make a decision?

    • Hi Johannes,

      I generally let a test run for at least 2 weeks. If you watch any kind of stats on your site, you’ll see that there’s a weekly cycle, just like there are daily cycles. So, you might get more traffic or more sales on some days of the week than on others.

      Because of that, any test that runs for less than a week is more likely to produce false results.

      For conversions, a good rule of thumb is to let the test run until you have 100 conversions per contender in the test. So, if you’re testing 3 different pages, leave the test running until you have at least 300 total conversions.

      But as with all things, these rules are very context dependent.
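That rule of thumb translates into a simple duration estimate. Here's a sketch with hypothetical traffic numbers (swap in your own site's figures):

```python
import math

# Hypothetical site numbers - replace with your own
daily_visitors = 400
conversion_rate = 0.04   # ~4% baseline conversion rate
variants = 3             # control + 2 challengers

# Rule of thumb from above: ~100 conversions per contender
conversions_needed = 100 * variants
expected_daily_conversions = daily_visitors * conversion_rate
days = math.ceil(conversions_needed / expected_daily_conversions)

# Never stop before 2 full weekly cycles, even with plenty of data
days = max(days, 14)

print(f"Plan to run this test for roughly {days} days")
```

With these example numbers, the 100-conversions rule dominates and the test should run for about 19 days; on a higher-traffic page, the 2-week minimum would be the binding constraint instead.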

    • Don’t worry, we are. 🙂

      But when you use it, it will be good to know that not every test you run will be a smashing success.

  • An A/B test can go horribly wrong if you target the wrong audience in the first place. This is why I rely solely on paid traffic to test. It’s the only way I know the right traffic is being sent in the first place (organic can really upset the results). That’s my tuppence worth anyway. Great article Shane. It’s great you’ve highlighted this issue.

    • Thanks for your comment, Quentin! Paid traffic can definitely help create more of a “hermetically sealed” environment for your tests.

  • Thrive Themes rocks. You reach out to your customers with Tips, Tricks and Traps to avoid. Your engine is running on all cylinders. You communicate well offering high touch high value as a good business model to follow. Marketing only works when businesses commit to a campaign run over time. Anyone expecting immediate results will always be disappointed. I brag on Thrive Themes having a great product made better by offering explanations and examples showcasing how and why to best use it.

    • Hello Tamás,

      This is why tests need to be run for long enough periods of time. What you see in this typical graph is the randomness (big differences in the beginning), which is reduced the longer the test runs. By running a test long enough, we can ensure that the sample of people who were included in the test (and their behavior) is representative of our visitors in general.

  • Hello!

    I’m looking at “Done For You” sales and/or marketing funnel companies. The two I like only work with ClickFunnel, and I don’t want to switch from Thrive to CF.

    Is there anyone on your team, or do you know of a reputable company I can hire?

    thx, d

  • I recently did a test comparing ads + affiliate links against just affiliate links, expecting to see dramatic results… but I didn’t. Not exactly an A/B split test, but a test nonetheless, and one I expected something more conclusive from than what I actually got…

    • Hi Jakob,

      Yeah, even apart from A/B tests, I think experimenting with different strategies on your site is always a good idea. It’s good to question “does this work?” instead of just assuming that it does.

  • Shane,
    I’m not sure what the point of this article is. Disappointment from small gains? How is that disappointment? Can you elaborate?

    • Hello AC,

      I don’t know if I can explain it better than the way I did in the video. The points are to A) not prematurely stop tests, B) not expect huge wins all the time, like you generally read about in case studies and C) make a habit out of continually testing, so that small wins can accumulate over time.

    • Good stuff! Of course the big wins do happen and it’s wonderful when they do. Sometimes, we have to have the patience to start many tests before a big win hits.

  • I love it! Shane, this article is literally the thoughts that have been running through my head for the past few months. A/B testing has to be done just right (as you mentioned in your article) for the results to be taken seriously.

    • Thank you for your comment, Jabir! Yes, it’s not all fun and games with A/B testing, but it definitely pays off in the long run. 🙂

  • Really great post. It’s years later, but it still feels fresh to read. I never comment, but I appreciate this post.

  • Great article Shane! One question though: We’re testing two different lead magnets (and thus landing pages). One is converting at around 50% and another is converting at around 47%. We have had 1000+ leads for each of the magnets. Statistically speaking, there is no significant difference. What should we do in this case?

    • Abandon the test and start a new one. If you don’t get a clear winner after 2+ weeks and hundreds of conversions, the best move is to start a new test.
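For what it's worth, the numbers in the question bear this out. A quick two-proportion z-test sketch - the visitor counts here are assumed, since only the lead counts were given:

```python
import math

# Assumed counts roughly matching the question:
# ~2000 visitors per page, converting at 50% vs 47%
n_a, conv_a = 2000, 1000   # 50.0%
n_b, conv_b = 2000, 940    # 47.0%

p_a, p_b = conv_a / n_a, conv_b / n_b
p_pool = (conv_a + conv_b) / (n_a + n_b)

# Pooled standard error and two-sided p-value
se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
z = (p_a - p_b) / se
p_value = math.erfc(abs(z) / math.sqrt(2))

print(f"z = {z:.2f}, p = {p_value:.3f}")
```

With a p-value above 0.05, neither page can be declared the winner at 95% confidence, so starting a fresh test with a bolder variation is the better use of that traffic.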

  • Hi Shane!

    I found this video especially useful! I was expecting a “clear winner” from my first A/B test results, and then I started to realize how this works.

    Great content, clear and honest.

  • Great points to keep in mind with A/B testing. It’s about incremental growth; 200% gains are nice, but you don’t need those kinds of numbers to have a tangible impact on performance or revenue.

  • Hey Shane!

    It’s great advice. I especially liked the point that you need to learn to love small wins. Tell me, how long should I run an A/B test? 2 weeks? Is this definitely enough?

    • 2 weeks is not definitely enough, no. It’s the minimum time we recommend, even if you have enough data before the 2 weeks are up. However, you often need to keep a test running for longer, if you don’t have a clear winner yet.

  • I did A/B testing on one of my affiliate sites recently and the results seemed shocking at first. But once I found what users want from my page, I implemented those changes and saw immediate gains in conversions!

  • Yes continuous A/B testing will get your site where you need it. We are constantly testing to improve and we will continue to do so! Thanks for the article!

  • I have used Ultimatum to good effect, and I’m going to start using Optimize to A/B test some landing pages next.
    Question: Is it possible to combine the two, and A/B test a page that an Ultimatum campaign is running on (or vice versa)?
    If not, then which is best to start with?

  • “Slow and steady wins the race… it’s exceedingly rare to see a huge win on any single test.”


    If I had a penny for every time a marketer gave up after only one or two tests… I’d be rich.

    Patience and trusting that your desired outcome will eventually come if you continue to persist is what will make you a successful digital marketer.

  • Great insights! When we first started with A/B testing, I was expecting larger results, based on previous case studies I’ve read. But you’re right: slow and steady! Thanks for sharing.

  • I think A/B testing is something a lot of website owners skimp on. There are so many variables that it can seem overwhelming to try to figure out what is increasing conversions.
