Why Sorting Templates by Conversion Rate is Just… the Worst Idea Ever

Every once in a while, someone will ask us: why is there no way to sort landing page templates in Thrive Architect by conversion rate?

We get this question because one of our competitors advertises sorting landing pages by conversion rate as a great feature and advantage of their tool. To clarify: we're not talking about finding the highest converting landing pages among those you've used on your own site. We're talking about sorting all landing page templates based on the average conversion rates of all users.

At first glance, this might seem like a good idea. I mean, who wouldn't want to be able to pick the highest converting template to start with?

In this post, I'll explain why sorting templates by conversion rate is wrong on, like, SO MANY levels...


How to Pick the Fastest Running Shoe

Let's forget about landing pages for a moment. Let's say you're in the market for a new running shoe. And you come across a store that features a list of all of their running shoes, sorted by speed.

In a running shoe, faster is better, so this is great, right? You can pick among the fastest shoes, using this list.

But wait a minute - how is this data collected? How does this list come about?

You ask a store clerk and she tells you that the data comes from all the customers' real world use of the shoes. The store has aggregate data that shows how fast people are going in each of the shoes and can average out all those values to sort the shoes by speed.

Here's the problem: the only thing we know is the shoe model and the average speed among thousands of people using the shoe.

What we don't know includes:

  • The age of the runners.
  • The distance the shoes are used for (100m dash and marathon data are all lumped together).
  • The terrain the shoes are used on.
  • The athleticism of the wearers (data from professional athletes and hobby runners are lumped together).
  • And many more factors...

As you can see, the factors we don't know make a much bigger difference than the factors we do know.

Sure, a better shoe model might help you run a bit faster, but the difference in speed between a senior citizen on a leisurely Sunday run and a professional 100m sprinter during a record attempt is greater than the difference between individual shoe models, by many orders of magnitude.

The differences in average speed between shoe models are therefore determined far more by factors we don't know and can't control for than by the shoes themselves.

And that, unfortunately, makes our list of shoe models sorted by speed completely useless.

What Shoes and Page Templates Have in Common

Okay, why am I talking about running shoes?

All the problems mentioned in the shoe example apply to landing page templates as well. It's just a bit more abstract and more difficult to visualize than with shoes.

Here's what we could know and measure about how landing page templates perform:

  • Number of visitors and conversions.
  • The average conversion rate.
  • How many people use each template.
  • The broad category of the templates (e.g. sales pages and opt-in pages).

What we can't know and control for includes:

  • The copy used on the page.
  • Images, elements added, elements removed, changes made to the template (we could know them but not quantify them).
  • The type and quality of the offer being presented.
  • The price of products being sold.
  • The "temperature" of traffic being sent to the page (think: traffic from an email list of fans vs. cold traffic from PPC ads).
  • And many more factors...

We have the same problem as with the hypothetical shoe example above. The difference that these unknown factors make is vastly greater than the difference a template makes.

Example Comparison

A shoe worn by a professional athlete in a race will go much faster than a shoe worn by a senior citizen going for a stroll, regardless of the shoe model. A landing page selling a $5 product to an audience of long-time fans will have a vastly higher conversion rate than a landing page selling a $2,000 product to cold traffic, regardless of the template used.

Can't We Categorize?

An opt-in page for a free offer will always have a higher conversion rate than a sales page. So, what if we split up the templates by categories? We can compare sales pages to sales pages, opt-in pages to opt-in pages and so on. Doesn't this remove some of the randomness and lead to a useful apples-to-apples comparison?

No. Here are some examples to illustrate why:

Example 1

  • Offer 1: sales page for a simple, universally appealing product priced at $1, built with Template A. CR: 6.9%
  • Offer 2: sales page for a complex, niche professional product priced at $4,000, built with Template B. CR: 0.72%

Template A is used on a page with almost 10x the conversion rate (CR) of Template B. But does that really mean Template A is better?

Example 2

  • Offer 3: opt-in page offering "Subscribe to our newsletter", built with Template C. CR: 1.2%
  • Offer 4: opt-in page offering a valuable, relevant free course, built with Template D. CR: 15.8%

Template D performs much better, but how much does that have to do with the template and how much with the difference between the two offers?

Example 3

  • Offer 5: webinar registration page with excellent copy, crafted by a pro, built with Template E. CR: 19.2%
  • Offer 6: webinar registration page with boring, generic copy and typos everywhere, built with Template F. CR: 1.4%

Template F seems to be a lot worse, but what would happen if the copywriting were elevated to a much higher level? Would the difference between Templates E and F still exist?

Copywriting, pricing, the quality and value of an offer presented on the page: these are all factors that can't be controlled for and that make a huge difference to the performance of a page.

What if You Have a Ton of Data?

If we gather data from tens of thousands, hundreds of thousands or even millions of users, doesn't the noise average out, leaving us with a useful list?

No. More data of this kind doesn't average out in any meaningful way. This is a classic big data problem: when we have a lot of data and we run models on it, it always seems like something relevant is happening. But you can measure what is effectively just noise and still find patterns in it. That doesn't mean you're measuring anything relevant or getting any real information out of the data.
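To make that concrete, here's a minimal sketch in plain Python (all numbers are hypothetical): fifty templates that share exactly the same "true" conversion rate still produce a confident-looking ranking, purely from random noise.

```python
import random

random.seed(42)

TRUE_RATE = 0.03    # every template converts at exactly 3% "in reality"
NUM_TEMPLATES = 50  # hypothetical number of templates
VISITORS = 2000     # hypothetical traffic per template

observed = []
for t in range(NUM_TEMPLATES):
    # every visitor converts with the same probability, no matter the template
    conversions = sum(random.random() < TRUE_RATE for _ in range(VISITORS))
    observed.append((f"Template {t + 1}", conversions / VISITORS))

# rank templates by observed conversion rate, highest first
observed.sort(key=lambda pair: pair[1], reverse=True)

print("Top 3:", [(name, f"{rate:.2%}") for name, rate in observed[:3]])
print("Bottom 3:", [(name, f"{rate:.2%}") for name, rate in observed[-3:]])
```

Run it and the "best" templates will typically show rates close to 4% while the "worst" sit near 2%, even though, by construction, no template is actually better than any other.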

You can think of it like this: measurements of what matters most to the conversion rate of a page are either imprecise or non-existent. You can't add up many imprecise measurements and average them out to reach a precise measurement.

Imagine that you have a scale made for weighing people. It's made to measure many kilograms/pounds and it's precise down to about 100 grams (0.22 lbs). If you try to weigh a needle, which weighs less than 1 gram, this scale won't give you any useful data.

And what if you weigh the needle 1,000 times and calculate the average? It makes no difference at all. You still won't be any closer to knowing the true weight of the needle.
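Here's the same point as a tiny Python sketch (the numbers are made up): a scale that rounds everything to the nearest 100 grams reads zero for a 0.8 gram needle every single time, so averaging a thousand readings still gives zero.

```python
# A scale that can only resolve weights to the nearest 100 grams.
def scale_reading(true_weight_grams: float) -> float:
    return round(true_weight_grams / 100) * 100

NEEDLE_WEIGHT = 0.8  # grams (hypothetical)

readings = [scale_reading(NEEDLE_WEIGHT) for _ in range(1000)]
average = sum(readings) / len(readings)

print(average)  # 0.0 -- averaging 1,000 readings tells us nothing about the needle
```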

What if We Filtered Data by Account?

What if we compared the conversion rates of different templates only within individual accounts and then ranked the templates overall, based on their relative performance within each account?

In other words, this would look at the relative performance of templates used by the same user. This would eliminate some of the randomness. We can assume that the quality of copywriting will be roughly the same across the board, on pages made by the same user. We can also assume that the same user is likely to have similar offers in the same market, used across all pages.

It would present us with a new problem, though: small sample size. Most users will never use enough different templates and send enough traffic to all of them to provide a meaningful ranking.
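To give a rough sense of the sample size problem (with hypothetical numbers), here's the statistical margin of error on a single template that received 500 visitors:

```python
import math

# Hypothetical numbers: one user sends 500 visitors to a template
# and gets 15 conversions, i.e. an observed 3% conversion rate.
visitors = 500
conversions = 15
rate = conversions / visitors

# 95% margin of error, using the normal approximation to the binomial
margin = 1.96 * math.sqrt(rate * (1 - rate) / visitors)

print(f"Observed rate: {rate:.1%}")
print(f"95% confidence interval: {rate - margin:.1%} to {rate + margin:.1%}")
```

An interval from roughly 1.5% to 4.5% is far wider than any realistic difference between two decent templates, so rankings built on numbers like these within a single account would be mostly noise.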

Plus, it doesn't eliminate most of the other problems, such as comparing templates that have both been customized to the point where they have nothing in common with the original template anymore.

Think of this: the same user can load the same template on two pages and make changes to them, to run an A/B test. Presumably, one version will win the A/B test. So, the template is at the same time worse and better than itself. How do we translate that into useful data for a "rank by conversion rate" list?

This just illustrates, once again, that factors we can't control make a greater difference than which template was chosen.

Where Our Focus Lies

By now, I hope I've convinced you that any attempt to rank templates by conversion rate is hopeless. That's why we don't do it. And frankly, measuring this data and advertising the results as a useful feature would be dishonest.

Instead, there are two important factors we focus on, for our templates:

1) Built-In Conversion Best Practices

There are some things - not many, but some - that reliably lead to higher conversions. These are the kinds of things that almost all high converting pages have in common. For example:

  • A large, attention-grabbing headline at the top.
  • Clearly visible, high contrast buttons/calls to action.
  • A strong focus on a single call to action or a very small number of calls to action.
  • A clear visual hierarchy.
  • High contrast text in a crisp, readable font.
  • Not having a big, slow, animated slider.

All of our templates come with these factors built in, so that if you do nothing but change the text and tweak the design, you have a great page. The conversion basics are taken care of.

2) Rapid Implementation

If you want optimal conversion rates, you need to test. And we try to make creating and testing pages easier by making our templates rapidly customizable.

The more time you have to spend on making your page look right, the less time you'll have to create A/B test variations and to focus on the rest of your business. This is why rapid implementation is extremely important to us.

What's Your Take?

What's your take on this topic? Were you hoping for a "sort by conversion rate" feature for our templates? Did I change your mind?

Let me know by leaving a comment!

Shane

Author: Shane Melaugh

Shane Melaugh is a co-founder and the CEO of Thrive Themes. When he isn’t plotting new ways to create awesome WordPress themes & plugins, he likes to geek out about camera equipment and medieval swords. He also writes about startups and marketing here.
