The law of unintended consequences states that every human endeavor will generate some result that was not, and could not have been, foreseen. The law applies to hypothesis testing as well.

In fact, Brian Cugelman introduced me to an entire spectrum of outcomes that is helpful when evaluating AB testing results. Brian was talking about unleashing chemicals in the brain, and I’m applying his model to AB testing results. See my complete notes on his Conversion XL Live presentation below.

Understanding the AB Testing Results Map

In any test we conduct, we are trying everything we can to drive toward a desired outcome. Unfortunately, we don’t always achieve the outcomes we want or intend. For any test, our results fall along two spectrums that define four general quadrants.

Map of possible outcomes from hypotheses.

On one axis we ask, “Was the outcome as we intended, or was there an unintended result?” On the other axis we ask, “Was it a negative or positive outcome?”

While most of our testing seeks to achieve the quadrant defined by positive, intended outcomes, each of these quadrants gives us an opportunity to move our conversion optimization program a step forward.

I. Pop the Champagne, We’ve Got a New Control

With every test, we seek to “beat” the existing control, the page or experience that is currently performing the best of all treatments we’ve tried. When our intended outcome is a positive outcome, everyone is all smiles and dancing. It’s the most fun we have in this job.

In general, we want our test outcomes to fall into this quadrant (quadrant I), but not exclusively. There is much to be learned from the other three quadrants.

II. Testing to Lose

Under what circumstances would we actually run an AB test intending to see a negative outcome? That is the question of Quadrant II. A great example of this is adding “Captcha” to a form.

CAPTCHA is an acronym for “Completely Automated Public Turing test to tell Computers and Humans Apart”. We believe it should be called, “Get Our Prospects to Do Our Spam Management For Us”, or “GOPDOSMFU”. Businesses don’t like to get spam. It clogs their prospect inboxes, wastes the time of sales people and clouds their analytics.

However, we don’t believe that the answer is to make our potential customers take an IQ test before submitting a form.

These tools inevitably reduce form completion rates, and not just for spam bots.

CAPTCHAs reduce spam, but at what cost?

So, if a business wants to add a CAPTCHA to a form, we recommend understanding the hidden costs of doing so. We’ll design a test with and without the CAPTCHA, fully expecting a negative outcome. The goal is to understand how big the negative impact is. Usually, it’s too big.

In other situations, a design feature that is brand oriented may be proposed. Often a design decision that enhances the company brand will have a negative impact on conversion and revenue. Nonetheless, we will test to see how big the negative impact is. If it’s reasonable, then the loss of revenue is seen as a marketing expense. In other words, we expect the long-term revenue from a stronger brand message to offset the loss of short-term revenue.

These tests are like insurance policies. We do them to understand the cost of decisions that fall outside of our narrow focus on website results. The question is not, “Is the outcome negative?” The question is, “How negative is the outcome?”

III. Losers Rule Statistically

Linus Pauling once said, “You can’t have good ideas without having lots of ideas.” What is implied in this statement is that most ideas are crap. Just because we call them test hypotheses doesn’t mean that they are any more valuable than rolls of the dice.

When we start a conversion optimization process, we generate a lot of ideas. Since we’re brilliant, experienced, and wear lab coats, we brag that only half of these ideas will be losers for any client. Fully half won’t increase the performance of the site, and many will make things worse.

Most of these fall into the quadrant of unintended negative outcomes. The control has won. Record what we learned and move on.

There is a lot to be learned from failed tests. Note that we call them “inconclusive” tests as this sounds better than “failed”.

If the losing treatment reduced conversion and revenue, then you learn something about what your visitors don’t like.

Just like a successful test, you must ask the question, “Why?”.

Why didn’t they like our new background video? Was it offensive? Did it load too slowly? Did it distract them from our message?

Take a moment and ask, “Why,” even when your control wins.

IV. That Wasn’t Expected, But We’ll Take the Credit

Automatic.com was seeking a very specific outcome when we designed a new home page for them: more sales of the adapter and app that connects a smartphone to the car’s electronic brain. The redesign did achieve that goal. However, there was another unintended result. There was an increase in the number of people buying multiple adapters rather than just one.

We weren’t testing to increase average order value in this case. It happened nonetheless. We might have missed it if we didn’t instinctively calculate average order value when looking at the data. Other unintended consequences may be harder to find.

This outcome usually spawns new hypotheses. What was it about our new home page design that made more buyers decide to get an adapter for all of their cars? Did we discover a new segment, the segment of visitors that have more than one car?

These questions all beg for more research and quite possibly more testing.

When Outcomes are Mixed

There is rarely one answer for any test we perform. Because we have to create statistically valid sample sizes, we throw together some very different groups of visitors. For example, we regularly see a difference in conversion rates between visitors using the Safari browser and those using Firefox. On mobile, we see different results when we look only at visitors coming on an Android than when we look at those using Apple’s iOS.

3.9% more Android users converted with this design, while 21% fewer iPhone users converted.

Android users liked this test but iPhone users really did not.

In short, you need to spend some time looking at your test results to ensure that you don’t have offsetting outcomes.
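Here’s a minimal sketch, in Python with made-up numbers, of the kind of segment breakdown we’re describing. The segment names and counts are assumptions for illustration only; the point is that a flat combined result can hide one segment winning while another loses.

```python
# Hypothetical per-segment readout for one treatment vs. its control,
# used to spot offsetting outcomes hidden in the combined numbers.
visitors = {
    "android_control": 4100, "android_treatment": 4050,
    "ios_control": 5200,     "ios_treatment": 5150,
}
conversions = {
    "android_control": 205, "android_treatment": 212,
    "ios_control": 312,     "ios_treatment": 247,
}

def rate(key):
    return conversions[key] / visitors[key]

for segment in ("android", "ios"):
    control = rate(f"{segment}_control")
    treatment = rate(f"{segment}_treatment")
    lift = (treatment - control) / control
    print(f"{segment}: control {control:.2%}, treatment {treatment:.2%}, lift {lift:+.1%}")

# Combined, the lift can look flat even though one segment improved
# and the other declined.
combined_control = (conversions["android_control"] + conversions["ios_control"]) / (
    visitors["android_control"] + visitors["ios_control"])
combined_treatment = (conversions["android_treatment"] + conversions["ios_treatment"]) / (
    visitors["android_treatment"] + visitors["ios_treatment"])
print(f"combined lift: {(combined_treatment - combined_control) / combined_control:+.1%}")
```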

The Motivational Chemistry and the Science of Persuasion

Here are my notes from Brian Cugelman’s presentation that inspired this approach to AB testing results. He deals a lot with the science of persuasion.

My favorite conclusions are:

“You will get more mileage from ANTICIPATION than from actual rewards.”

“Flattery will get you everywhere.”

I hope this infographic generates some dopamine for you, and that your newfound intelligence produces serotonin during your next social engagement.

Motivational Chemistry infodoodle by Brian Cugelman at ConversionXL Live

After reading our Ultimate A/B Testing Guide, we thought you might want to take advantage of a few creative approaches to landing page AB testing. They go beyond basics to improve your digital marketing ROI.

There are a surprising number of AB tests smart marketers can run on their landing pages to ensure they’re getting the highest conversion rates possible. You already know you can experiment with different landing page elements for conversion rate optimization. Headlines, subheads, calls to action, and design elements can all be considered when building your list of testing hypotheses.

Now, let’s kick it up a notch with these nine approaches for landing page AB Testing that may inspire you to create your own unique AB tests.

1. Create Targeted Landing Pages for AB Testing

Let’s say you are launching an advertising campaign, and you want to know which page will convert best: a generic one or one highly targeted to the audience and message. For this split test, you could have ad variations that use your generic landing page and one or more targeted landing page versions.

For example, if Shopify did an AB test between their generic vs. targeted landing pages in an advertising campaign, the control would apply to anyone looking for an eCommerce solution.

Shopify’s homepage with a generic target audience would be the control in this AB test.

For a variation or version of a landing page, they could use their page that targets booksellers.

This landing page addresses the ecommerce needs of a specific audience.

To maximize your conversions with this test, analyze your customer database to determine who your best customers are and create targeted landing pages focused on those specific groups of customers. If you do a quick Google search for Shopify’s landing pages you can see they have created targeted landing pages for their top customer segments.

Different landing pages convert different demographics and interests.

You can use these as examples of how to create targeted landing pages for your top customer groups and demographics.

2. Landing Page AB Testing: Experiment with Animated Headlines

Headlines can make or break your landing page, as they are the first words that capture your visitor’s attention. It’s a simple AB test that can make a dramatic impact on conversions, and it can be done using most AB testing tools.

The key to the headline AB test is to change nothing but the headline. For example, you can see how ActiveCampaign changed from a simple headline about their features…

ActiveCampaign’s headline about features.

…to a better headline about the benefits their customers can expect when using their service.

Headline about benefits.

This change proved to be a winner, as they have kept it since 2015 with a constantly changing message about their benefits. Therefore, test your headline.

This animated headline with benefits proved to be a winner.

3. Landing Page Split Test Idea: Vary Featured Homepage Products

In most cases, your homepage will be the most popular landing page on your website. Hence, it will be a page where you should do extensive AB testing.

One test you can do to see if you can increase conversions is simply changing your featured product.

Nest does this by swapping out its popular monitoring camera (formerly Dropcam)…

Nest featuring its Nest Cam on the homepage.

…with its popular thermostat controller.

Nest featuring its thermostat on its homepage.

This test is currently running (as noted by the ?alt=1 at the end of their homepage URL), so we’ll have to see which one wins in the end.

In some cases, you may want to change your featured product based on the one that is currently getting featured coverage in the media, and on whether that coverage is positive or negative.

4. Creative Landing Page AB Testing Idea: Explore Different Stories

Do stories resonate with your customers, and if so, which stories translate into the most sales? Find out through AB testing. Apple did this in the past by running multiple campaigns on their website, social media, and television ads featuring stories about musicians…

Apple’s landing page story appealing to musicians

…explorers…

Apple’s story appealing to explorers

…environmentalists…

Apple’s story appealing to environmentalists

…parents…

Apple’s story appealing to parents

…and many other page visitors. The goal was to show how their products could help tell everybody’s story, no matter what they did or how much of an impact they made in the world.

Apple ultimately went back to a homepage focused on its latest products, but without AB testing, Apple couldn’t just assume that approach would convert the highest.



5. AB Testing your Landing Page: Shift the Focus to One Product at a Time

Landing pages have better conversion rates when there is one, clear call-to-action; sometimes that CTA is buying a product. But when your top landing page is your homepage, you can’t focus on just one product, right?

Maybe you can. In the past, Logitech had a pretty standard homepage that offered up all of its products all at once to homepage visitors.

Logitech’s original, more traditional homepage

But now, Logitech gives visitors a tour through their top products, one at a time.

Logitech’s new homepage focusing on a single product

In a few moments, you are completely immersed in a particular product and its main benefit, thanks to this new landing page. It’s definitely a way of landing page AB testing you will want to try if you have a few products you can highlight in this fashion.

6. Transform Your Hero Image to Video

You know that great image of your product that you use to compel people to sign up or buy? Why not convert it into a video that dives even deeper into exactly what happens with that feature of your product? MailChimp made that change with their landing page, going from a screenshot of their newsletter designer…

Mailchimp’s landing page with an image of newsletter editor

…to a video of how their newsletter designer worked.

Video of their newsletter editor increased Mailchimp’s landing page conversion rate.

Instead of hoping that an image would convince visitors that their newsletter designer was easy to use, as the landing page claimed, the video was right there to prove it.

While their current landing page features a different, shorter animation, it still demonstrates the ease of use of their newsletter designer, suggesting that since 2012, video and animation on the landing page have beaten a screenshot for conversions.

7. Try a Change of Scenery: AB Testing Landing Page Copy vs Image

Sometimes it’s not text or functionality that will make your site convert better, it’s simply imagery that matches the story of your value proposition. Zillow experimented with this idea by changing the background of its search.

One variation was a neighborhood overview with home sale prices, which actually contradicts the line below the image about looking for rentals.

Zillow’s background showing homes with their prices isn’t relevant to people looking for rental property, especially with a message at the bottom of the page specifically written for renters.

Another variation used an image of a specific home, which could appeal to both for sale and for rent searchers.

Zillow’s new image says “home” whether you’re buying or renting.

It seems that they have stuck with the individual home view as it works with what most searchers are looking for.

8. Landing Page AB Testing Basics: Rearrange the Elements

It may not be your product, your service, your copy, your CTA button, your colors, or other elements on your page that are lowering your conversions. It may simply be the order in which they are presented.

Just like when you rearrange all the furniture in your house because it just doesn’t quite feel right, you might want to do the same with your landing page.

Take AgoraPulse, for example. They went from this…

AgoraPulse home page before AB testing

…to this.

AgoraPulse home page after the split test

It’s easy to see why the latter layout works. It flows right into starting your free trial after a simple and convincing headline and subhead. And for visitors still not convinced they should convert, there’s a simple video plus bullet points to convince them to click that call to action button.

9. Copy Your Competitors

The most creative AB tests might be ones you don’t run on your own website. In addition to AB testing tools, there are tools that will alert you when your competitors make changes to their websites, potentially based on their own landing page AB tests.

Rival IQ monitors your competitors’ websites to see if changes have been made to them recently. The entry level plan allows you to track up to 15 companies. You’re able to track each company’s website design history along with their social media accounts, organic search rankings, and paid search traffic.

In the website history dashboard, you can view a variety of web pages from your competitors’ websites.

Depending on how long the company has been in Rival IQ’s database, you can get a couple of years’ worth of design history for each company.

ActiveCampaign’s design history over a few years

When you click on a particular month, you see a breakdown of when particular changes occurred, along with the ability to click on a particular design to see the full landing page.

Zeroing in on the competitor’s design changes.

This will give you an idea of what AB tests a competitor has run in the past. Based on the length of time a competitor has stuck with a particular design, you will know which test was the presumed winner, that is, the one that was statistically significant in increasing their conversions, be it for their homepage, pricing page, features page, or other significant landing pages.

ActiveCampaign's pricing page history

ActiveCampaign’s pricing page history

In addition, you can sign up for email alerts when your competitors make major changes to their website. This will let you know when your competitors have run new tests on their website and made changes based on their results. Or you may even see the tests themselves in action as the pages change from their original to alternative versions.

If you have a lot of competitors, and you’re not sure which to monitor, use the BuiltWith Google Chrome extension. It will help you find out if a particular competitor’s website is using AB testing software. Chances are, the ones that are will be the ones that will be making frequent changes.

BuiltWith browser extension for analytics and tracking

What are your unique approaches to AB testing your landing page?

If you’ve already done the standard AB tests on your landing pages, then we hope that these additional split test ideas will further help you increase website conversions, just as they have for the brands and online businesses mentioned here.

Have you been running or reading about some interesting AB tests? We’d love to hear about them in the comments.

AB Testing is only effective when you’re testing something meaningful. This is especially true on small-screen devices we call smartphones, or “Mobile” generically.

Talia Wolf believes that the root of every conversion = Human Behavior. We certainly wouldn’t argue with her. She also believes that emotion is at the core of human behavior. Her strategy for designing web pages that leverage emotional triggers was one of our favorites at ConversionXL Live.

I took notes on her presentation and share them here as an instagraphic.

AB Testing Inspiration using Emotional Triggers Infographic

Four Steps to AB Testing Inspiration

The infographic covers the four steps of her process.

  1. Emotional Competitor Analysis
  2. Emotional SWOT Analysis
  3. Emotional Content Strategy
  4. Testing

Emotional Competitor Analysis

According to Wolf, this step helps you understand “where the market is emotionally”. It also shows you where you fit.

Choose ten to fifteen competitors (or as many as you can) and rate each one by four criteria.

  • What their base message is.
  • How they use color.
  • How they use images.
  • What emotional triggers they appeal to.

One of our Content Scientists was recently looking for a new Wacom graphics tablet. She likes to doodle. All of the retailers offered the tablet at the same price, so the only differentiator would be message, color, images, and emotional triggers.

Here are some of the sites she visited before buying.

Best Buy communicates service and trust on its product pages.

Message: Best Buy’s message is trust and safety. They offer star ratings, price match guarantees, free shipping and more to signal safety.

Color: The dark blue color of their site says “Trust” and “Logic” but may also say “Coldness” and “Aloofness”.

Images: In an ecommerce environment, high-resolution images are usually helpful to buyers. Interestingly, they offer pictures of all sides of the box.

Emotional Triggers: Trust us to sell the right product at the right price.

Rakuten communicates emotions of spontaneity and action on its product pages.

Message: Shopping is entertaining! We sell lots of things, and we just give you the facts. This is an informational presentation with detailed headings and stocking status.

Color: Red is excitement, passion. It can also mean aggression and stimulation.

Images: Use of icons (promotions, shipping, credit card). Limited product images.

Emotional Triggers: Spontaneity. You’ve found the product. Take action.

Emotional S.W.O.T.

SWOT stands for Strengths, Weaknesses, Opportunities and Threats. It’s a common way to generate market strategies in almost any context. Wolf asks us to consider these from an emotional standpoint.

As the infographic shows, the strengths and weaknesses pertain to our business. Do we have a strong message? Are we using color and images powerfully? What emotional triggers are we tripping? The opportunities and threats relate to the industry we are in. Our emotional competitor analysis helps us define these.

Emotional Content Strategy

When we look at our emotional strengths and weaknesses, we can ask the question, “How do we want to make our customers feel?” This helps us define our emotional content strategy and related hypotheses.

In our examples above, Best Buy wrote their own product summary description. Rakuten uses the manufacturer-supplied copy and images. Best Buy communicates “Trust” by focusing on service. Rakuten focuses on “Act Now” with availability and price information. If we wanted to find a unique emotional content strategy, we might focus on building relationships. Messaging and images might show employees who care and customers who are happy.

AB Testing

Regardless of how much research we do, we can never be sure we’ve hit the right combination until we do a test. Wolf recommends creating two treatments to test.

The first is based on our competitors’ approaches. The second is based on our research on emotional triggers and content. Each combines five aspects:

  1. Emotions
  2. Elements
  3. Words
  4. Visuals
  5. Color

If emotion is at the heart of purchases, then understanding how to integrate emotional triggers into our persuasive designs is critical to success.

It is true that we can learn important things from an “inconclusive” AB test. But that doesn’t mean we like inconclusive tests. Inconclusive tests occur when you put two or three good options out for an AB test, drive traffic to these options and — meh — none of the choices is preferred by your visitors. When that happens, one of two things is true:

  1. Our visitors like the page the way it is (we call this page the “control”), and reject our changed pages.
  2. Our visitors don’t seem to care whether they get the control or the changed pages.

Basically, it means we tried to make things better for our visitor, and they found us wanting. Back to the drawing board.
Teenagers have a word for this.
It is a sonic mix of indecision, ambivalence, condescension and that sound your finger makes when you pet a frog. It is less committal than a shrug, less positive than a “Yes,” less negative than a “No” and is designed to prevent any decision whatsoever from being reached.
It comes out something like, “Meh” It is a word so flaccid that it doesn’t even deserve any punctuation. A period would clearly be too conclusive.
If you’ve done any testing at all, you know that your traffic can give you a collective “Meh” as well. We scientists call this an inconclusive test.
Whether you’re testing ad copy, landing pages, offers or keywords, there is nothing that will deflate a conversion testing plan more than a series of inconclusive tests. This is especially true when your optimization program is young. Here are some things to consider in the face of an inconclusive test.

1. Add Something Really Different To The Mix

Subtlety is not the split tester’s friend. Your audience may not care if your headline is in 16 point or 18 point font. If you’re getting frequent inconclusive tests, one of two things is going on:

  1. You have a great “control” that is hard to beat, or
  2. You’re not stretching enough

Craft another treatment, something unexpected and throw it into the mix. Consider a “well-crafted absurdity” a la Groupon. Make the call to action button really big. Offer something you think your audience wouldn’t want.

2. Segment Your Test

We recently spent several weeks of preparation, a full day of shooting, and thousands of dollars on talent and equipment to capture some tightly controlled footage for video tests on an apparel site. This is the sort of test that is “too big to be inconclusive.” Moreover, video is currently a very good bet for converting more search traffic.
Yet, our initial results showed that the pages with video weren’t converting significantly higher than the pages without video. Things changed when we looked at individual segments, however.
New visitors liked long videos, while returning visitors liked shorter ones. Subscribers converted at much higher rates when shown a video recipe with close-ups on the products. Visitors who entered on product pages converted better with one kind of video, while those coming in through the home page preferred another.
It became clear that, when lumped together, one segment’s behavior was cancelling out gains by other segments.
How can you dice up your traffic? How do different segments behave on your site?
Your analytics package can help you explore the different segments of your traffic. If you have buyer personas, target them with your ads and create a test just for them. Here are some ways to segment:

  • New vs. Returning visitors
  • Buyers vs. prospects
  • Which page did they land on?
  • Which product line did they visit?
  • Mobile vs. computer
  • Mac vs. Windows
  • Members vs. non-members

3. Measure Beyond the Click

Here’s a news flash: we often see a drop in conversion rates for a treatment that has higher engagement. This may be counter-intuitive. If people are spending more time on our site and clicking more — two definitions of “engagement” — then shouldn’t they find more reasons to act?
Apparently not. Higher engagement may mean that they are delaying. Higher engagement may mean that they aren’t finding what they are looking for. Higher engagement may mean that they are lost. So, if you’re running your tests to increase engagement, you may be hurting your conversion rate. In this case, “Meh” may be a good thing.
In an email test we conducted for a major energy company, we wanted to know if a change in the subject line would impact sales of a smart home thermostat. Everything else about the emails and the landing pages was identical.
The two best-performing emails had very different subject lines, but identical open rates and click-through rates. However, sales for one of the email treatments were significantly higher. The winning subject line had delivered the same number of clicks, but had primed the visitors in some way, making them more likely to buy.
If you are measuring the success of your tests based on clicks, you may be missing the true results. Yes, it is often more difficult to measure through to purchase, subscription or registration. However, it really does tell you which version of a test is delivering to the bottom line. Clicks are only predictive.

4. Print A T-shirt That Says “My Control Is Unbeatable”

Ultimately, you may just have to live with your inconclusive tests. Every test tells you something about your audience. If your audience didn’t care how big the product image was, you’ve learned that they may care more about changes in copy. If they don’t know the difference between 50% off and $15.00 off, test offers that aren’t price-oriented.
Make sure that the organization knows you’ve learned something, and celebrate the fact that you have an unbeatable control. Don’t let “Meh” slow your momentum. Keep plugging away until that unexpected test that gives you a big win.

This was adapted from an article that appeared on Search Engine Land.

Here are six tips for getting your A/B testing right. These were captured at Affiliate Summit West 2016 and presented by Digital Marketer’s Justin Rondeau.

Focus on Process Not Hacks

Don’t just try what others say works. Have a process that allows you to know your MARKET.

Your A/B Testing effort should focus on process.

Measure Multiple Metrics that Matter

Measure the right metrics for the part of the funnel you’re testing.

You’ll track different kinds of A/B testing metrics depending on where your visitors are in the sales funnel.

Use Analytics to Identify Problems

Don’t just test anything. Use analytics to identify problem pages.

Take the Guesswork out of A/B Testing

Fix What’s Broken. Only Test What’s Ambiguous

If it’s broke, don’t bother testing it. Just fix it.

Test persuasive and intuitive issues. Sometimes test usability. Otherwise, just fix the problem.

Schedule a Finite Time to Stop

Don’t expect your tests to just run until they win or lose. Testing has an opportunity cost.

Conversion Optimization is about meeting user expectations.


This instagraphic was captured live by Brian Massey of Conversion Sciences.
 
Applying Optimization Fundamentals Infodoodle from Justin Rondeau’s Affiliate Summit West 2016 presentation.



Here’s a common question: “How do you increase conversions when you only get a small amount of traffic?”

The first answer is, go get more traffic.

The closer your conversions are to zero, the closer your conversion optimization efforts will be to guessing.

You can do statistical optimization using split testing if you have enough conversions, but this usually comes with more traffic.

The second answer is to get more conversions so you can do conversion optimization to get more conversions. Which came first, the conversion or the optimizer?

This last point is, of course, the proverbial “rub.”

Here’s how to get started if you are running low-traffic websites.

Get Accurate Data

Be sure your analytics is set up properly. I offer an analytics setup checklist to help with Google Analytics. You’ll want to avoid blind spots such as overlay windows, tabbed content, and subdomains on separate analytics accounts.

You’re going to need a good source of data when you start picking things to test.

Compare your analytics data to a secondary dataset. Compare lead conversions to your CRM. Compare transactions to those reported in your accounting system. Your analytics should be within 15% of reality. Don’t be afraid to install a secondary analytics package to verify your main analytics setup.
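As a rough sketch of that sanity check, here are a few lines of Python comparing monthly lead counts from analytics against a CRM export. The numbers are hypothetical; only the 15% tolerance comes from the guideline above.

```python
# Hypothetical sanity check: conversions reported by analytics vs. the CRM.
analytics_leads = 188   # leads reported by the analytics package for the month
crm_leads = 204         # leads recorded in the CRM for the same period

discrepancy = abs(analytics_leads - crm_leads) / crm_leads
print(f"Discrepancy: {discrepancy:.1%}")

if discrepancy > 0.15:
    print("More than 15% off: audit your tracking before trusting test data.")
else:
    print("Within tolerance.")
```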




Get Some Qualitative Data

Low-traffic websites need to get more qualitative data. Right now, the one-stop shop for qualitative data is Hotjar. It offers click-tracking, session recording and feedback surveys. For alternatives, check out ConversionDashboard.com.

Low-traffic Websites Use Serial Tests

If you don’t have the conversions to do split testing, you’ll want to do serial testing. This simply means making a single small change to your site and letting it run for at least two weeks. Since you have solid analytics (see above), you can see if there is an improvement in performance.
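Here’s a minimal sketch of a serial-test readout, assuming you’ve pulled the conversion numbers for the two weeks before and the two weeks after the change from your analytics. The figures are hypothetical.

```python
# Hypothetical serial (pre/post) test: compare the two weeks before the
# change with the two weeks after it, using the same analytics metric.
before = {"sessions": 2140, "conversions": 47}
after = {"sessions": 2075, "conversions": 61}

rate_before = before["conversions"] / before["sessions"]
rate_after = after["conversions"] / after["sessions"]
change = (rate_after - rate_before) / rate_before

print(f"Before: {rate_before:.2%}  After: {rate_after:.2%}  Change: {change:+.1%}")
# With so few conversions, treat this as a directional signal, not proof;
# seasonality and traffic mix can move these numbers on their own.
```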

Measure More Than Conversions

There are some predictive metrics that you can use to gauge the performance of your serial tests.

  1. Bounce rate on landing pages
  2. Click-through-rate on key internal pages
  3. Add-to-cart for ecommerce sites
  4. Form completion percent
  5. Abandonment rate (basically the opposite of the last two)

Time on page, time on site, and pages per visit are to be taken with a grain of salt. Increasing these may correlate with lower conversion rates.

Start with the Message

Nothing works until your value proposition is strong. I recommend testing changes to your value proposition. I’ve done hundreds of free strategy consultations over the years. Most of the time, I ask the consultee to tell me about their business. Typically, I get a concise, clear statement of the offering and value.

Rarely does this clarity appear on the website.

Sit with a copywriter and tell your story. Then, don’t edit them. Whatever they come up with, try it.

You should also test:

  1. Headline
  2. Call-to-action button text
  3. Pictures. If you can’t write a meaningful caption for an image, change it.
  4. Add sub-headlines
  5. Add bulleted lists

Don’t bury the lead. A great headline — called the “lead” — is the core of a strong value proposition. Often the headline that would best “grab” a reader is buried somewhere in the copy.
Find the headline that gets visitors to read your value proposition, and you’ll have the cornerstone of conversion in place.

Look for Big Wins

You’re going to have to find what we call “big wins.” This means that your change increased conversions by more than 50%. Rich Page wrote on low-traffic testing. My comment on his post was as follows:

You can also do split testing with fewer than 100 conversions. You just need really big wins. If you have a treatment with 20 conversions and another with 40 conversions, a 100% difference is something you can probably bank on, even with such small numbers. However, if one treatment got 20 conversions and the other got 30, that 50% increase is too close to the margin of error and shouldn’t be considered an improvement (even though it feels like a win).

Technically, it’s OK to make a treatment with, say, a 30% increase the new control. Just know that you’re not likely to continue to see such an increase with small transaction amounts.
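If you want to check that math yourself, here is a rough two-proportion z-test in Python. The visitor counts are assumptions; the conversion counts mirror the 20-versus-40 and 20-versus-30 examples above.

```python
from statistics import NormalDist

# Rough two-proportion z-test for sanity-checking "big wins" on low-traffic sites.
def z_test(conv_a, visitors_a, conv_b, visitors_b):
    p_a = conv_a / visitors_a
    p_b = conv_b / visitors_b
    pooled = (conv_a + conv_b) / (visitors_a + visitors_b)
    se = (pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided p-value
    return z, p_value

z, p = z_test(20, 1000, 40, 1000)   # a 100% lift on hypothetical traffic
print(f"100% lift: z={z:.2f}, p={p:.3f}")   # clears the usual 0.05 bar

z, p = z_test(20, 1000, 30, 1000)   # a 50% lift on the same traffic
print(f"50% lift:  z={z:.2f}, p={p:.3f}")   # does not
```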

Ditch Your Outliers

You’re going to have to eliminate “outliers” in your data. Outliers include extreme orders in ecommerce sites and rushes of leads from activities such as email blasts and bursts of word of mouth.
For an ecommerce site, you should look at orders that are one or two standard deviations away from the mean.
So, what does that “mean?”
Here are two weeks of daily sales data for a site that gets about one sale per day.

There are two obvious outliers: One day with no sales in the first week, and one with $160 in sales the second week. Statistically, a 16% increase is irrelevant, but the point is driven home when you calculate the standard deviation range.
For this data, an outlier will be lower than $27.90 or higher than $86.89.

When we remove outliers we see a drop in sales of six percent. This is statistically uninteresting as well, but illustrates how outliers can affect results.
If you’d like to see how I calculated the min and max, download Example of Outliers-Conversion Scientist-low-traffic post.
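If you’d rather compute the bounds in code than in a spreadsheet, here’s a minimal Python sketch of the same method: flag any day that falls more than a chosen number of standard deviations from the mean. The daily figures below are made up for illustration; the actual spreadsheet is in the download above.

```python
from statistics import mean, stdev

# Hypothetical two weeks of daily sales (in dollars) for a site that gets
# roughly one sale per day.
daily_sales = [58, 60, 0, 55, 62, 59, 61, 57, 63, 160, 56, 60, 58, 62]

k = 1  # the post suggests one to two standard deviations; pick based on how noisy your data is
avg, sd = mean(daily_sales), stdev(daily_sales)
low, high = avg - k * sd, avg + k * sd
print(f"Mean ${avg:.2f}, outlier bounds ${low:.2f} to ${high:.2f}")

cleaned = [s for s in daily_sales if low <= s <= high]
removed = len(daily_sales) - len(cleaned)
print(f"Removed {removed} outlier day(s); cleaned mean ${mean(cleaned):.2f}")
```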

Don’t Let it Run

Split testing can be done on low-transaction sites. However, don’t let the test run for more than, say, six weeks. The results just aren’t reliable. There are too many other variables mucking with your data over such long timeframes.

Always Be Testing

Just because you have few transactions per month doesn’t mean you can’t be learning. In fact, not learning may well be the reason you have few transactions per month. Never stop trying things, and use good data to decide what you keep and what you throw away.
Feature image by Shaun Garrity via Compfight cc and adapted for this post.

Today’s question is at the heart of AB testing: “How do you decide what elements of a site to test?” We call these test ideas “hypotheses.”
But a better question is, “How do you determine what NOT to test?”

It’s relatively easy to come up with ideas that might increase your conversion rate. We typically come up with fifty, seventy-five, one-hundred or more ideas for each of our client sites. Filtering through this list is the hard part.


The Five Steps

In this week’s podcast, I take you through the five steps we use to determine what to test on a website.

  • Step One: Look for Evidence
  • Step Two: Rate the Traffic
  • Step Three: How Hard is it to Test?
  • Step Four: What does experience tell you?
  • Step Five: Bucket the Winners

We’re pretty good at picking low-hanging fruit. Last year 97% of our clients continued working with us after our initial six-month Conversion Catalyst program that uses this approach.

Each of our hypotheses gets an ROI score using the following formula:

ROI = Evidence + Traffic Value + History – Level of Effort

Once we’ve ranked all of our hypotheses, we classify them into buckets.

The top ten hypotheses reveal an interesting pattern when you bucket them.

Bucketing Your Hypotheses

I also talk about how we classify hypotheses into buckets.

  1. User Experience: For hypotheses that would alter the layout, design, or other user interface and user experience issues.
  2. Credibility and Authority: For hypotheses that address trust and credibility issues of the business and the site.
  3. Social Proof: For hypotheses that build trust by showing others’ experiences.
  4. Value Proposition: For hypotheses that address the overall messaging and value proposition. Quality, availability, pricing, shipping, business experience, etc.
  5. Risk Reversal: For hypotheses that involve warranties, guarantees and other assurances of safety.

This helps us understand what the primary areas of concern are for visitors to a site. Are there a lot of high-ranked hypotheses for Credibility and Authority? Then we need to focus on building trust with visitors.
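As a minimal sketch, here’s how the ROI score and the bucketing might look in Python. The 1-to-5 scales and the example hypotheses are placeholders for illustration only.

```python
# Illustrative only: the 1-5 scales and example hypotheses are placeholders.
hypotheses = [
    {"name": "Rewrite home page headline",   "bucket": "Value Proposition",
     "evidence": 4, "traffic_value": 5, "history": 3, "effort": 1},
    {"name": "Add reviews to product pages", "bucket": "Social Proof",
     "evidence": 3, "traffic_value": 4, "history": 4, "effort": 3},
    {"name": "Add money-back guarantee",     "bucket": "Risk Reversal",
     "evidence": 2, "traffic_value": 3, "history": 3, "effort": 2},
    {"name": "Redesign checkout layout",     "bucket": "User Experience",
     "evidence": 2, "traffic_value": 5, "history": 2, "effort": 5},
]

# ROI = Evidence + Traffic Value + History - Level of Effort
for h in hypotheses:
    h["roi"] = h["evidence"] + h["traffic_value"] + h["history"] - h["effort"]

# Rank, then look at which buckets dominate the top of the list.
for h in sorted(hypotheses, key=lambda x: x["roi"], reverse=True):
    print(f'{h["roi"]:>3}  {h["bucket"]:<18} {h["name"]}')
```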

There’s much more detail in the podcast and my Marketing Land column 5 Steps to Finding the Hidden Optimization Gems.

As a Conversion Scientist, I used my background in Conversion Rate Optimization and Landing Pages to create the first draft of my OkCupid profile, the landing page of me. I utilized the chemistry of a successful landing page formula to make sure I hit all the known conversion points. OkCupid’s setup limits the types of tests I can run. We’ll be doing pre/post testing, so I started by putting my best page up, letting it run for two weeks and calculating my “pre” conversion rate.

This is a key piece of knowledge for any business ready to test – know your base conversion rate.

During the first 14 days my profile was live, I had 104 visitors with nine messages. Those nine messages resulted in four qualified leads. My starting overall conversion rate is 8.65%. My qualified lead conversion rate is 3.84%.

My first stop in testing was a critique with an expert in landing pages. Lucky for me, I work for one. Sometimes it’s difficult to assess your own work, so calling in an outside expert is always a great place to start.

The Conversion Scientist, Brian Massey, was nice enough to do one of his famous live critiques. In his video critique he pointed out blind spots and a few things that might be troubling.

If you’re not ready to call in an expert, there are tools you can use to give you a better sense of what might be happening. As a Conversion Scientist, I always start with analytics, click-tracking heatmaps, and screen capture sessions. These data points allow me to come up with a hypothesis list.

When creating a hypothesis list for a client, analytics is always the first stop. It allows me to identify key pages and performance metrics. I look at landing pages, all pageviews, audience channels and conversion metrics for each. This is where I start to see patterns and look for what pages I should be testing.

Questions to ask when looking at analytics:

  • Where are visitors coming from?
  • Which pages are they landing on?
  • Which pages get the highest traffic?
  • What are the key pages in the funnel?
  • Are there pages with high exit or bounce rates?

I use this data to compile a list of key pages I want to look at more closely.

With OkCupid — and most landing pages — it’s pretty easy to know what to target. Visitors are coming from /match or /quickmatch pages and coming to my profile landing page.

Once I know what pages I will focus on, I switch to another set of tools. Heatmaps and session recordings provide a lot of insight into where visitors are getting hung up. The data these tools generate is a hotbed for hypothesis generation.

They allow me to see if a key call-to-action is in a blind spot or if something on my page is getting surprise attention. Check out the Conversion Lab for a list of awesome conversion tool options.



Even though OkCupid won’t let me install Crazy Egg or Hotjar, I’m still going to treat my dating landing page like I would a client’s website when I start the optimization process. I make a list of hypotheses I think could improve the conversion rate and come up with a plan of action about how to test each one.

Normally the resources I can install on a client’s website inform the hypothesis list and the recommendations I come up with, so I have to be creative by relying on my own experience and on an expert’s opinion, namely Brian Massey.

Here are a few hypotheses from his analysis.

I create a list of hypotheses to test when I begin optimizing.

Brian’s critique gave me some great ideas on what to test. I know that my copy needs a bit of work, as does my landing page’s scannability. This is the first hypothesis I’m testing:

Hypothesis: If I change the copy to be about the visitor instead of myself, and improve scannability with bold text and paragraph breaks, I can improve conversions.

I carefully changed all of the “I” statements and made them about the visitor. I also added more paragraph breaks and highlighted key words in my text allowing a visitor to more easily scan my profile.

My revised profile

When testing, it’s important to isolate as many variables as possible, so for now the copy is the only thing I changed. I could have swapped out my headshot for a party shot, but if I see an increase in conversion rate, I won’t know if it’s the photo or the copy that’s improving my numbers.

For our testing purposes, my primary goal will be to beat my qualified lead conversion rate of 3.84%, but I will be tracking my overall conversion rate and visitor count as well.

I’m going to want to test more than one hypothesis to get this profile just right. For my next test, I’ll focus on images. Choosing the right images is vital to the success of a landing page, maybe even more so on this particular type of landing page. Since my next test will focus on images, I did some research, scouring the internet for articles from online dating experts, and determined the best profile photos were a smiling woman looking at the camera, showing some skin but not too much skin.

I had a small selection of photos I thought would fit the bill, so I decided to take an informal poll of men that fit the type I was looking for: I asked a bunch of my guy friends to help me choose a photo. The photo of me in a black sleeveless dress smiling warmly at the camera was the clear winner. I filled out the rest of my profile photos with a variety of activities and a few shots of me dressed up a bit to show that while I may wear a lab coat to work, I do clean up okay for a night on the town.

This first test isn’t about the images, but after Brian’s critique, I knew that my images might not be saying what I wanted them to say. For this initial pre/post test, I left the photo winners from my poll as they were but added captions to clarify what I wanted the viewer to get from each image.

I’ve shared what I was doing when this photo was taken and also indicated that it’s a fairly recent photo.

With my changes made and my visitor count ticking up, there’s nothing to do but wait and see. We’ll check back in a week (and I’ll look every day in between) to see how my text changes have fared. With any luck (or in my case, with science), I’ll have upped that 3.8% conversion rate.

How many goals do you set when you’re designing a split test for your website?

We’re goal-crazy here in the lab at Conversion Sciences. It is not unusual for our tests to have dozens of goals. Why is that?

We see split testing as a data collection activity, not a tool that gives us answers. It’s not like Wikipedia. The split-testing software on the market today is amazingly agile when it comes to tracking, targeting and snooping on visitor behavior. We certainly want to track transactions, revenue and leads. But we learn so much more from our tests.


In my new Marketing Land column The Multi-Goal Magic Of Split Testing Software, I describe how we use some of these goals to find sweet spots in a website.

  • Find out how to “light up” a funnel that is invisible to analytics.
  • Discover which pages are most influential in converting.
  • Segment your audience based on their behaviors.

You can listen to the column or read it for yourself.

The Mobile Web is still in its infancy.  Today, alleged “mobile best-practices” are nothing more than successful desktop strategies scaled to a smaller screen.  But people behave differently on small-screen devices than they do when they are sitting at a computer.

Conversion Sciences has begun to see what Mobile Web 2.0 will look like. Having completed dozens of mobile design split tests, we have seen key trends begin to show themselves. Much of what we have learned flies in the face of conventional beliefs.

This is why we test.

Some of our customers now have higher converting mobile sites than desktop sites.

Our approach to mobile design is controversial because, as scientists, we can’t just accept traditional wisdom at face value.  We need evidence.

Joel Harvey reveals the results of dozens of tests we’ve completed. Insights are based on real tests. No gut instinct here. Watch Mobile 2.0: Judgment Day to learn what he has discovered. He shares:

  • Can mobile websites convert better than desktop?
  • How to increase mobile conversion rates.
  • What is poison to your mobile conversion rate.
  • How iPhone and Android visitors act differently.

Watch the replay on demand in its glorious entirety.

Don’t ignore your mobile traffic. It can be a real revenue generator sooner than you think.
