Feature Update: Get Better A/B Test Results, Faster!

A/B testing is a core feature in Thrive Leads as well as in Thrive Headline Optimizer. We're proud that we didn't just bolt a half-baked split-testing feature onto those plugins and call it a day. Instead, we built a refined testing engine with automation features and safeguards against prematurely ending tests.

With today's update, we've taken our A/B testing features another step forward.


Faster Results for Free

You can now manually intervene in running tests and eliminate weak variations while the test keeps running. To do this, click the elimination icon in an A/B test report.

While this manual option is a nice addition, the real power of this update lies in the Automatic Winner feature. Activate Automatic Winner and the test will eliminate the weakest variations on its own, while the test keeps running. That directs more of your traffic to the strongest variations, and with more traffic concentrated on fewer variations, the test can reach a conclusive result faster.

The more variations you test at the same time, the more valuable this new feature is.

In short, it means that:

  • Less traffic is wasted on weak variations.
  • The strongest among strong variations is identified sooner.
  • The test ends sooner and shows the most confident winning variation.
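To make the idea concrete, here's a minimal sketch of this kind of elimination logic. This is an illustration only, not the plugin's actual engine: it assumes a simple Bayesian "chance to beat control" heuristic, and the function names and thresholds are made up for the example.

```python
import random

def chance_to_beat(control, variant, samples=10_000):
    """Estimate P(variant's true rate > control's true rate) by sampling
    Beta posteriors built from each variation's conversions/impressions."""
    wins = 0
    for _ in range(samples):
        c = random.betavariate(control["conversions"] + 1,
                               control["impressions"] - control["conversions"] + 1)
        v = random.betavariate(variant["conversions"] + 1,
                               variant["impressions"] - variant["conversions"] + 1)
        if v > c:
            wins += 1
    return wins / samples

def eliminate_weak(variations, min_conversions=50, cutoff=0.05):
    """Keep the control (first variation) and drop any challenger that has
    gathered enough data yet still has less than a `cutoff` chance of
    beating the control. Survivors keep receiving traffic."""
    control, *rest = variations
    keep = [control]
    for v in rest:
        if v["conversions"] >= min_conversions and \
           chance_to_beat(control, v) < cutoff:
            continue  # eliminated: its traffic share goes to the survivors
        keep.append(v)
    return keep
```

The `min_conversions` guard is the safeguard mentioned above: a variation is never eliminated before it has gathered a minimum amount of data, no matter how badly it appears to be doing early on.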

What do you need to do to get all these advantages? Nothing!

Other than updating your Thrive Leads and Thrive Headline Optimizer plugins to the latest version and using Automatic Winner in your tests, there's nothing to do. You get these benefits automatically.


P.S.: If you want to see a real-world example of just how powerful A/B testing can be, check out this case study where we used Thrive Leads to boost opt-in conversions by over 200%!

P.P.S.: If you want to A/B test your landing pages, check out Thrive Optimize, the premium add-on for Thrive Architect that enables A/B testing from within your WordPress dashboard.

Author: Shane Melaugh

Shane Melaugh is a co-founder of Thrive Themes. When he isn't plotting new ways to create awesome WordPress themes & plugins, he likes to geek out about camera equipment and medieval swords. He also writes about productivity here.

  • Ed says:

    I like the update. More refined. I do have a nagging question about changing headlines though. Is it still important to choose headlines that have Google keywords in them that are reflective of my research for the right keywords? …or do I ignore the keywords? I know if I send a notice of a blog to my list it probably doesn’t matter as much as Google finding my blog and filing it based on keywords etc.

    • Shane Melaugh says:

      Hi Ed,

      The more important headline for SEO purposes is your post’s meta title and this will remain unchanged, independent of the headline test running.

      Of course, an argument can be made that the post title itself is also an SEO factor, but personally, I would never compromise the post title for the sake of SEO. In my opinion, the post title (and the content, for that matter) should be 100% made for your readers, with SEO considerations being a distant after-thought at most.

  • Evan says:

    I thought you were going to say that you added an A/B testing feature to TCB. Got my hopes up haha

  • Anon says:

    Awesome. Avid user of Thrive Leads and Thrive Headline Optimizer.

    Over the past year, Leads has 4x-ed my subscribers and Headlines has done the same for my engagement.

    Keep the improvements coming, we love them.

    – Anonymous Entrepreneur

    • Shane Melaugh says:

      Thank you for your comment! It’s awesome to hear that you’re getting such good results with our products. 🙂

  • Joe Stronsick says:

    shoot I came to this after an email which led me to believe it was split testing for TCB…

    • Amanda E says:

      Me too!

    • Shane Melaugh says:

      Hello Joe,

      A/B testing will probably never be a part of TCB, but it will become available for landing pages. And in general, we will bring more A/B testing capabilities into our products, over time.

  • Don says:

    For those asking about split testing in TCB, I wonder if you can create that result by using Google Content Experiments plug-in in conjunction with TCB. Might be worth a few minutes of testing.

    • Shane Melaugh says:

      Hello Don,

      Thanks for your comment. Currently, I recommend this method for testing landing pages and other site pages.

      I can’t really envision A/B testing as part of Thrive Content Builder, because that would imply testing individual elements and that would mean multi-variate testing. Multi-variate testing is something that sounds exciting, but is a waste of time for 99% of businesses, so I’m not eager to add it to our products.

      Testing on a page level is something we’ll definitely bring into our suite of tools, though.

  • Quentin P says:

    Another super clever and simple idea. Thanks Shane.

  • Robert Botha says:

    That’s awesome. It was becoming very cumbersome having to restart the tests all the time.

  • David L says:

    From what I understand, this update allows us to reach statistical significance faster, by ‘turning off’ under-performing variants earlier, and splitting the traffic between fewer variants.

    My own experience with A/B testing shows a LOT of fluctuation until enough data is collected, so I would recommend only declaring variants to be ‘losers’ (and turning them off) if you have ~1000 impressions, at least a few weeks’ duration, and <5% chance to beat original.

    Otherwise there's a risk you'll disable variants too early which could be perfectly good performers.

    • Shane Melaugh says:

      Hi David,

      That’s exactly right and this is what the automatic winner settings are for. You set those minimum thresholds to make sure no variation is eliminated or picked as the winner before enough data is gathered. I don’t recommend using impressions as a threshold criterion, though. Using total conversions as a threshold is a lot more reliable and less dependent on the average conversion rate across all variations.

      • David L says:

        Cheers Shane. As a general rule, do you think 50 conversions is a good threshold to aim for?

      • Shane Melaugh says:

        50 total conversions per variation in the test is a reasonable minimum. More is always better, though.
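Shane's preference for a conversion-count threshold over an impression threshold has a simple statistical basis: the relative uncertainty of a measured conversion rate shrinks with the number of conversions collected, not with raw impressions. A rough back-of-envelope sketch (general statistics, nothing specific to the plugin):

```python
import math

def relative_error(conversions, impressions):
    """Relative standard error of an estimated conversion rate:
    sqrt(p * (1 - p) / n) / p, where p = conversions / impressions."""
    p = conversions / impressions
    return math.sqrt(p * (1 - p) / impressions) / p

# The same 1,000-impression threshold gives very different precision
# depending on the underlying conversion rate...
print(relative_error(100, 1000))  # 10% rate -> ~0.09
print(relative_error(10, 1000))   # 1% rate  -> ~0.31

# ...while a fixed conversion count gives roughly the same precision
# regardless of the rate (relative error is roughly 1/sqrt(conversions)):
print(relative_error(50, 500))    # 10% rate, 50 conversions -> ~0.13
print(relative_error(50, 5000))   # 1% rate,  50 conversions -> ~0.14
```

In other words, "wait for N conversions" normalizes the noise level across tests with very different conversion rates, which is why it makes a more reliable stopping threshold than "wait for N impressions."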

  • Mike says:

    Excellent update…as usual!

    One question; is there an article that shows the “best practices” for determining the Automatic Winner settings?

    I know we’re after statistical significance, just want to make sure I’m not creating settings that are too low. I assume the settings seen in the video would be a good start?


    Portland, Oregon

    • Shane Melaugh says:

      Hi Mike,

      Thanks for your comment! When you activate Automatic Winner, the default settings are created as a good baseline. If you change nothing about them, it’s going to be perfect for most test cases.

  • Joe Stronsick says:

    I am in total support of an A/B split tester for pages… I hope it comes soon. Many in my field have given up on Thrive for Click Funnels for this reason. But I love Thrive and the stuff you put out. Please Oh Please get a split tester for pages.

  • Richard says:

    I liked the update. Thanks for this update. This is now more user friendly.

  • Matthew P says:

    Can you split test between 2 landing pages created from your templates?
