In this post, we share nine creative approaches to AB testing your landing page: tests that go beyond the basics and expand on what you’re already doing. They might even inspire you to come up with your own imaginative AB tests.

There are a surprising number of AB tests smart marketers can run on their landing pages to ensure they’re getting the highest conversion rates possible. You already know you can experiment with different headlines, subheads, call to action text and colors, and you know to introduce design elements that draw the eye where you want visitors to look.

1. Create Targeted Landing Pages for AB Testing

This is an AB test you would apply at specific times, such as when you launch an advertising campaign. Your advertising campaign would be targeted to a specific audience, and your ad variations would use your generic landing page and your targeted landing page.

For example, if Shopify did an AB test between their generic vs. targeted landing pages in an advertising campaign, the control would apply to anyone looking for an eCommerce solution.

Shopify’s homepage with a non-specific target would be the control in this AB test

For a variation, they could use their landing page targeted to booksellers.

This landing page has a specific group of users in mind

To maximize your conversions with this test, analyze your customer database to determine who your best customers are and create targeted landing pages focused on those specific groups of customers. If you do a quick search of Shopify’s targeted landing pages on Google, you can see they have done this and created targeted landing pages for their top customer groups.

There are different landing pages to accommodate different demographics and groups of users

You can use these as examples of how to create targeted landing pages for your top customer groups and demographics.

2. Experiment with Animated Headlines

Headlines can make or break your landing page, as they are the first words that capture your visitor’s attention. Testing an animated headline against a static one is a simple AB test that can make a dramatic impact on conversions, and it can be done using most AB testing tools, including Marketizator.

The key to the headline AB test is to change nothing but the headline. For example, you can see how ActiveCampaign changed from a simple headline about their features…

ActiveCampaign’s headline about features

…to a better headline about the benefits their customers can expect when using their service.

ActiveCampaign’s headline about benefits

This change proved to be a winner, as they have kept it since 2015 with a constantly changing message about their benefits.

This animated headline with benefits proved to be a winner

3. Vary Featured Homepage Products

In most cases, your homepage will be the most popular landing page on your website. Hence, it will be a page where you should do extensive AB testing. One test you can do to see if you can increase conversions is simply changing your featured product. Nest does this by swapping out its popular monitoring camera (formerly Dropcam)…

Nest featuring its Nest Cam on the homepage

…with its popular thermostat controller.

Nest featuring its thermostat on its homepage

At the time of writing, this test was still running (as noted by the ?alt=1 that appears at the end of their homepage URL on occasion), so we’ll have to see which one wins in the end. In some cases, you may want to change your featured product based on which one is currently getting coverage in the media and whether that coverage is positive or negative.

4. Creative AB Testing Landing Page Idea: Explore Different Stories

Do stories resonate with your customers, and if so, which stories translate into the most sales? Find out through AB testing. Apple did this in the past by running multiple campaigns on their website, social media, and television ads featuring stories about musicians…

Apple’s story appealing to musicians

…explorers…

Apple’s story appealing to explorers

…environmentalists…

Apple’s story appealing to environmentalists

…parents…

Apple’s story appealing to parents

…and many other customer groups. The goal was to show how their products could aid in the telling of everyone’s story, no matter what you did or how much of an impact you made in the world. Apple ultimately went back to a homepage focused on its latest products, but without AB testing, Apple couldn’t just assume that approach would convert best.



5. AB Testing your Landing Page: Shift the Focus to One Product at a Time

Landing pages have better conversion rates when there is one clear call-to-action; sometimes that CTA is buying a product. But when your top landing page is your homepage, you can’t focus on just one product, right?
Maybe you can. In the past, Logitech had a pretty standard homepage that offered up all of its products all at once to homepage visitors.

Logitech’s original, more traditional homepage

But now, Logitech gives visitors a tour through their top products, one at a time.

Logitech’s new homepage focusing on a single product

In a few moments, you are completely immersed in a particular product and its main benefit, thanks to this new landing page. It’s definitely a way of AB testing your landing page you will want to try if you have a few products you can highlight in this fashion.

6. Transform Your Hero Image to Video

You know that great image of your product that you use to compel people to sign up or buy? Why not convert it into a video that dives even deeper into exactly what happens with that feature of your product? MailChimp made that change with their landing page, going from a screenshot of their newsletter designer…

Mailchimp’s landing page with an image of newsletter editor

…to a video of how their newsletter designer worked.

Mailchimp’s landing page with a video of newsletter editor

Instead of hoping that an image would convince visitors that their newsletter designer was easy to use, as the landing page claimed, the video was right there to prove it.
While their current landing page features a different, shorter animation, it still demonstrates the ease of use of their newsletter designer, suggesting that since 2012, video and animation on the landing page have beaten a screenshot for conversions.

7. Try a Change of Scenery when AB Testing your Landing Page

Sometimes it’s not text or functionality that will make your site convert better, it’s simply imagery that matches the story of your value proposition. Zillow experimented with this idea by changing the background of its search. One variation was a neighborhood overview with home sale prices, which actually contradicts the line below the image about looking for rentals.

Zillow’s background showing homes with their prices isn’t relevant to people looking for rental property, especially with a message at the bottom of the page specifically written for renters.

Another variation used an image of a specific home, which could appeal to both for sale and for rent searchers.

Zillow’s new image says “home” whether you’re buying or renting

It seems that they have stuck with the individual home view as it works with what most searchers are looking for.

8. Rearrange the Elements

It may not be your product, your service, your copy, your colors, or other elements on your page that are lowering your conversions. It may simply be the arrangement. Just like when you rearrange all the furniture in your house because it just doesn’t quite feel right, you might want to do the same with your landing page. Take AgoraPulse, for example. They went from this…

AgoraPulse homepage before

…to this.

AgoraPulse homepage after

It’s easy to see why the latter layout works. It flows right into starting your free trial after a simple and convincing headline and subhead. And for visitors still not convinced they should convert, there’s a simple video and bullet points to persuade them to click that call-to-action button.

9. Copy Your Competitors

The most creative AB tests might be ones you don’t run on your own website. In addition to AB testing tools, there are tools that will alert you when your competitors make changes to their websites, potentially based on their own AB testing experiments.

Rival IQ monitors your competitors’ websites to see if changes have been made to them recently. The entry-level plan allows you to track up to 15 companies. You’re able to track each company’s website design history along with their social media accounts, organic search rankings, and paid search traffic.

In the website history dashboard, you can view a variety of web pages from your competitor’s websites.

Depending on how long the company has been in Rival IQ’s database, you can get a couple of years’ worth of design history from each company.

ActiveCampaign’s design history over a few years

When you click on a particular month, you see a breakdown of when particular changes occurred, and you can click on a particular design to see the full landing page.

ActiveCampaign’s homepage design history over November and December 2014

This will give you an idea of what AB tests a competitor has run in the past. And based on how long a competitor has stuck with a particular design, you can infer which test was the presumed winner in terms of increasing conversions for their homepage, pricing page, features page, or other significant pages.

ActiveCampaign’s pricing page history over November and December 2014

In addition, you can sign up for email alerts when your competitors make major changes to their website. This will let you know when your competitors have run new tests on their website and made changes based on their results. Or you may even see the tests themselves in action as the pages change from their original to alternative versions.

If you have a lot of competitors, and you’re not sure which to monitor, you can use the BuiltWith Google Chrome extension to find out if a particular competitor’s website is using AB testing software. Chances are, the ones that are will be the ones that will be making frequent changes.

BuiltWith browser extension for analytics and tracking

What are your imaginative approaches to AB testing your landing page?

If you’ve already done the standard AB tests on your landing pages and found the best headlines, subheads, and call to action buttons, then we hope that these additional creative AB tests will further help you increase conversions on your website, as they may have for the brands mentioned here. Have you been running or reading about some interesting AB tests? We’d love to hear about them in the comments.

About the Author

Kristi Hines is a freelance writer, blogger, and social media enthusiast. You can follow her latest tweets about business and marketing @kikolani or posts on Facebook @kristihinespage to stay informed.
Feature image by Serge Saint via Compfight cc and adapted for this post.

AB testing is only effective when you’re testing something meaningful. This is especially true on the small-screen devices we call smartphones, or “mobile” generically.
Talia Wolf believes that the root of every conversion is human behavior. We certainly wouldn’t argue with her. She also believes that emotion is at the core of human behavior. Her strategy for designing web pages that leverage emotional triggers was one of our favorites at ConversionLX Live.
I took notes on her presentation and share them here as an instagraph infographic.

AB Testing Inspiration using Emotional Triggers Infographic

AB Testing Inspiration using Emotional Triggers Infographic

Four Steps to AB Testing Inspiration

The infographic covers the four steps of her process.

  1. Emotional Competitor Analysis
  2. Emotional SWOT Analysis
  3. Emotional Content Strategy
  4. Testing

[sitepromo]

Emotional Competitor Analysis

According to Wolf, this step helps you understand “where the market is emotionally”. It also shows you where you fit.
Choose ten to fifteen competitors (or as many as you can) and rate each one by four criteria.

  • What their base message is.
  • How they use color.
  • How they use images.
  • What emotional triggers they appeal to.

Our Content Scientist Trina Bolfing was recently looking for a new Wacom graphics tablet. She likes to doodle. All of the retailers offered the tablet at the same price, so the only differentiator would be message, color, images, and emotional triggers.
Here are some of the sites she visited before buying.

Best Buy communicates service and trust on its product pages.


Message: Best Buy’s message is trust and safety. They offer star ratings, price match guarantees, free shipping, and more to show safety.
Color: The dark blue color of their site says “Trust” and “Logic” but may also say “Coldness” and “Aloofness”.
Images: In an ecommerce environment, high-resolution images are usually helpful to buyers. Interestingly, they offer pictures of all sides of the box.
Emotional Triggers: Trust us to sell the right product at the right price.
 
Rakuten communicates emotions of spontaneity and action on its product pages.


Message: Shopping is entertaining! We sell lots of things and just give you the facts. This is an informational presentation with detailed headings and stocking status.
Color: Red is excitement, passion. It can also mean aggression and stimulation.
Images: Use of icons (promotions, shipping, credit card). Limited product images.
Emotional Triggers: Spontaneity. You’ve found the product. Take action.

Emotional S.W.O.T.

SWOT stands for Strengths, Weaknesses, Opportunities and Threats. It’s a common way to generate market strategies in almost any context. Wolf asks us to consider these from an emotional standpoint.
As the infographic shows, the strengths and weaknesses pertain to our business. Do we have a strong message? Are we using color and images powerfully? What emotional triggers are we tripping? The opportunities and threats relate to the industry we are in. Our emotional competitor analysis helps us define these.

Emotional Content Strategy

When we look at our emotional strengths and weaknesses, we can ask the question, “How do we want to make our customers feel?” This helps us define our emotional content strategy and related hypotheses.
In our examples above, Best Buy wrote their own product summary description. Rakuten uses the manufacturer-supplied copy and images. Best Buy communicates “Trust” by focusing on service. Rakuten focuses on “Act Now” with availability and price information. If we wanted to find a unique emotional content strategy, we might focus on building relationships. Messaging and images might show employees who care and customers who are happy.

AB Testing

Regardless of how much research we do, we can never be sure we’ve hit the right combination until we do a test. Wolf recommends creating two treatments to test.
The first is based on our competitors’ approaches. The second is based on our research on emotional triggers and content. Each combines five aspects:

  1. Emotions
  2. Elements
  3. Words
  4. Visuals
  5. Color

Examples

For some examples, I recommend checking out Talia Wolf’s deck on slideshare. She focuses on dating applications to help explain how emotional triggers can be laid out onto the screen.
[slideshare id=60398911&doc=riu8gdgr2kappdszvmpv-signature-c4bdb7a72292ae78b59833171bcb37ff0713fbf1792cee73ad4bdb0a8cf2928f-poli-160403034845]
If emotion is at the heart of purchases, then understanding how to integrate emotional triggers into our persuasive designs is critical to success.
[signature]

It is true that we can learn important things from an “inconclusive” AB test. But that doesn’t mean we like inconclusive tests. Inconclusive tests occur when you put two or three good options out for an AB test, drive traffic to these options and — meh — none of the choices is preferred by your visitors. An inconclusive result means one of two things:

  1. Our visitors like the page the way it is (we call this page the “control”), and reject our changed pages.
  2. Our visitors don’t seem to care whether they get the control or the changed pages.

Basically, it means we tried to make things better for our visitor, and they found us wanting. Back to the drawing board.
Teenagers have a word for this.
It is a sonic mix of indecision, ambivalence, condescension and that sound your finger makes when you pet a frog. It is less committal than a shrug, less positive than a “Yes,” less negative than a “No” and is designed to prevent any decision whatsoever from being reached.
It comes out something like, “Meh” It is a word so flaccid that it doesn’t even deserve any punctuation. A period would clearly be too conclusive.
If you’ve done any testing at all, you know that your traffic can give you a collective “Meh” as well. We scientists call this an inconclusive test.
Whether you’re testing ad copy, landing pages, offers or keywords, there is nothing that will deflate a conversion testing plan more than a series of inconclusive tests. This is especially true when your optimization program is young. Here are some things to consider in the face of an inconclusive test.

1. Add Something Really Different To The Mix

[pullquote]Subtlety is not the split tester’s friend.[/pullquote] Your audience may not care if your headline is in 16 point or 18 point font. If you’re getting frequent inconclusive tests, one of two things is going on:

  1. You have a great “control” that is hard to beat, or
  2. You’re not stretching enough

Craft another treatment, something unexpected, and throw it into the mix. Consider a “well-crafted absurdity” a la Groupon. Make the call to action button really big. Offer something you think your audience wouldn’t want.

2. Segment Your Test

We recently spent several weeks of preparation, a full day of shooting, and thousands of dollars on talent and equipment to capture some tightly controlled footage for video tests on an apparel site. This is the sort of test that is “too big to be inconclusive.” After all, video is currently a very good bet for converting more search traffic.
Yet, our initial results showed that the pages with video weren’t converting significantly higher than the pages without video. Things changed when we looked at individual segments, however.
New visitors liked long videos while returning visitors liked shorter ones. Subscribers converted at much higher rates when shown a video recipe with close-ups on the products. Visitors who entered on product pages converted for one kind of video while those coming in through the home page preferred another.
It became clear that, when lumped together, one segment’s behavior was cancelling out gains by other segments.
How can you dice up your traffic? How do different segments behave on your site?
Your analytics package can help you explore the different segments of your traffic. If you have buyer personas, target them with your ads and create a test just for them. Here are some ways to segment:

  • New vs. Returning visitors
  • Buyers vs. prospects
  • Which page did they land on?
  • Which product line did they visit?
  • Mobile vs. computer
  • Mac vs. Windows
  • Members vs. non-members
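The segment-level cancellation described above can be sketched with invented numbers. Everything below (the segment names, visitor counts, and conversion counts) is hypothetical, chosen only to show how a pooled result can come back “meh” while each segment shows a real effect:

```python
# Hypothetical illustration: a pooled A/B result can look flat ("meh")
# while individual segments show real, opposing effects.
# All numbers below are invented for the sketch.

results = {
    # segment: (visitors, conversions) for the control and video variants
    "new visitors":       {"control": (5000, 150), "video": (5000, 200)},
    "returning visitors": {"control": (5000, 250), "video": (5000, 200)},
}

def rate(visitors, conversions):
    return conversions / visitors

# Pooled: the two segments cancel each other out.
pooled = {}
for variant in ("control", "video"):
    v = sum(results[s][variant][0] for s in results)
    c = sum(results[s][variant][1] for s in results)
    pooled[variant] = rate(v, c)

print(f"pooled control: {pooled['control']:.1%}, pooled video: {pooled['video']:.1%}")
# Both pool to 4.0% -- an "inconclusive" test.

# Per segment, the video variant clearly wins one group and loses the other.
for segment, variants in results.items():
    ctrl = rate(*variants["control"])
    vid = rate(*variants["video"])
    print(f"{segment}: control {ctrl:.1%} vs video {vid:.1%}")
```

The point of the sketch: always break an inconclusive result down by segment before declaring a tie.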

[sitepromo]

3. Measure Beyond the Click

Here’s a news flash: we often see a drop in conversion rates for a treatment that has higher engagement. This may be counter-intuitive. If people are spending more time on our site and clicking more — two definitions of “engagement” — then shouldn’t they find more reasons to act?
Apparently not. Higher engagement may mean that they are delaying. Higher engagement may mean that they aren’t finding what they are looking for. Higher engagement may mean that they are lost. So, if you’re running your tests to increase engagement, you may be hurting your conversion rate. In this case, “Meh” may be a good thing.
In an email test we conducted for a major energy company, we wanted to know if a change in the subject line would impact sales of a smart home thermostat. Everything else about the emails and the landing pages was identical.
The two best-performing emails had very different subject lines, but identical open rates and click-through rates. However, sales for one of the email treatments were significantly higher. The winning subject line had delivered the same number of clicks, but had primed the visitors in some way, making them more likely to buy.
If you are measuring the success of your tests based on clicks, you may be missing the true results. Yes, it is often more difficult to measure through to purchase, subscription or registration. However, it really does tell you which version of a test is delivering to the bottom line. Clicks are only predictive.

4. Print A T-shirt That Says “My Control Is Unbeatable”

Ultimately, you may just have to live with your inconclusive tests. Every test tells you something about your audience. If your audience didn’t care how big the product image was, you’ve learned that they may care more about changes in copy. If they don’t know the difference between 50% off and $15.00 off, test offers that aren’t price-oriented.
Make sure that the organization knows you’ve learned something, and celebrate the fact that you have an unbeatable control. Don’t let “Meh” slow your momentum. Keep plugging away until that unexpected test that gives you a big win.
[signature]
This was adapted from an article that appeared on Search Engine Land.

Here are six tips for getting your A/B testing right. These were captured at Affiliate Summit West 2016 and presented by Digital Marketer’s Justin Rondeau.

Focus on Process Not Hacks

Don’t just try what others say works. Have a process that allows you to know your MARKET.

Your A/B Testing effort should focus on process.

Measure Multiple Metrics that Matter

Measure the right metrics for the part of the funnel you’re testing.

You’ll track different kinds of metrics depending on where your visitors are in the sales funnel.

Use Analytics to Identify Problems

Don’t just test anything. Use analytics to identify problem pages.

Take the Guesswork out of A/B Testing

Fix What’s Broken. Only Test What’s Ambiguous

If it’s broke, don’t bother testing it. Just fix it.

Test persuasive and intuitive issues. Sometimes test usability. Otherwise, just fix the problem.

Schedule a Finite Time to Stop

Don’t expect your tests to just run until they win or lose. Testing has an opportunity cost.

Conversion Optimization is about meeting user expectations.


This instagraphic was captured live by Brian Massey of Conversion Sciences.
 
Applying Optimization Fundamentals Infodoodle from Justin Rondeau’s Affiliate Summit West 2016 presentation.


[sitepromo]
[signature]

Here’s a common question: “How do you increase conversions when you only get a small amount of traffic?”

The first answer is, go get more traffic.

The closer your conversions are to zero, the closer your conversion optimization efforts will be to guessing.

You can do statistical optimization using split testing if you have enough conversions, but this usually comes with more traffic.

The second answer is to get more conversions so you can do conversion optimization to get more conversions. Which came first, the conversion or the optimizer?

This last point is, of course, the proverbial “rub.”

Here’s how to get started if you are running low-traffic websites.

Get Accurate Data

Be sure your analytics is set up properly. I offer an analytics setup checklist to help with Google Analytics. You’ll want to avoid blind spots such as overlay windows, tabbed content, and subdomains on separate analytics accounts.

You’re going to need a good source of data when you start picking things to test.

Compare your analytics data to a secondary dataset. Compare lead conversions to your CRM. Compare transactions to your accounting system. Your analytics should be within 15% of reality. Don’t be afraid to install a secondary analytics package to verify your main analytics setup.
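As a sketch of that sanity check, here is a small hypothetical comparison function. The 15% tolerance comes from the text above; the function name and the example counts are invented:

```python
# Hypothetical sketch: flag when analytics drifts more than 15% from a
# trusted secondary source (CRM leads, accounting transactions).

def within_tolerance(analytics_count, source_of_truth_count, tolerance=0.15):
    """Return True if the analytics count is within `tolerance` of the trusted count."""
    if source_of_truth_count == 0:
        return analytics_count == 0
    drift = abs(analytics_count - source_of_truth_count) / source_of_truth_count
    return drift <= tolerance

# 92 leads in analytics vs 100 in the CRM: 8% drift, acceptable.
print(within_tolerance(92, 100))   # True
# 80 transactions in analytics vs 100 in accounting: 20% drift, investigate.
print(within_tolerance(80, 100))   # False
```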



Get Some Qualitative Data

Low-traffic websites need to get more qualitative data. Right now, the one-stop-shop for qualitative data is HotJar. It offers click-tracking, session recording and feedback surveys. For alternatives, check out the ConversionDashboard.com.

Low-traffic Websites Use Serial Tests

If you don’t have the conversions to do split testing, you’ll want to do serial testing. This simply means making a single small change to your site and letting it run for at least two weeks. Since you have solid analytics (see above), you can see if there is an improvement in performance.
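A serial test boils down to comparing the conversion rate before and after the change. A minimal sketch, with invented traffic and conversion numbers:

```python
# Minimal sketch of a serial (pre/post) comparison: make one change,
# wait at least two weeks, then compare conversion rates.
# The visitor and conversion counts below are invented for illustration.

def conversion_rate(conversions, visitors):
    return conversions / visitors if visitors else 0.0

before = conversion_rate(conversions=24, visitors=1200)  # two weeks pre-change
after = conversion_rate(conversions=33, visitors=1100)   # two weeks post-change

lift = (after - before) / before
print(f"before: {before:.2%}, after: {after:.2%}, lift: {lift:+.0%}")
```

Remember that serial tests are exposed to seasonality and other time-based effects, which is why solid analytics and outlier handling (below in this post) matter so much.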

Measure More Than Conversions

There are some predictive metrics that you can use to gauge the performance of your serial tests.

  1. Bounce rate on landing pages
  2. Click-through-rate on key internal pages
  3. Add-to-cart for ecommerce sites
  4. Form completion percent
  5. Abandonment rate (basically the opposite of the last two)

Time on page, time on site, and pages per visit are to be taken with a grain of salt. Increasing these may correlate with lower conversion rates.

Start with the Message

Nothing works until your value proposition is strong. I recommend testing changes to your value proposition. I’ve done hundreds of free strategy consultations over the years. Most of the time, I ask the consultee to tell me about their business. Typically, I get a concise, clear statement of the offering and value.

Rarely does this clarity appear on the website.

Sit with a copywriter and tell your story. Then, don’t edit them. Whatever they come up with, try it.

You should also test:

  1. The headline
  2. Call-to-action button text
  3. Pictures (if you can’t write a meaningful caption for an image, change it)
  4. Adding sub-headlines
  5. Adding bulleted lists

Don’t bury the lead. A great headline — called the “lead” — is the core of a strong value proposition. Often the headline that would best “grab” a reader is buried somewhere in the copy.
Find the headline that gets visitors to read your value proposition, and you’ll have the cornerstone of conversion in place.

Look for Big Wins

You’re going to have to find what we call “big wins.” This means that your change increased conversions by more than 50%. Rich Page wrote on low-traffic testing. My comment on his post was as follows:

You can also do split testing with fewer than 100 conversions. You just need really big wins. If you have a treatment with 20 conversions and another with 40 conversions, a 100% difference is something you can probably bank on, even with such small numbers. However, if one treatment got 20 conversions and the other got 30, that 50% increase is too close to the margin of error and shouldn’t be considered an improvement (even though it feels like a win).

Technically, it’s OK to make a treatment with, say, a 30% increase the new control. Just know that you’re not likely to continue to see such an increase with small transaction amounts.
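One hedged way to check the 20-versus-40 intuition above is a two-proportion z-test. The visitor counts (2,000 per treatment) are an assumption for the sketch, since the quote only gives conversion counts:

```python
import math

# Rough sketch of the "big win" intuition: with few conversions, only a
# large difference clears the noise. Assumes 2,000 visitors per treatment,
# which is an invented number.

def two_proportion_z(c1, n1, c2, n2):
    """z-score for the difference between two conversion rates."""
    p1, p2 = c1 / n1, c2 / n2
    p = (c1 + c2) / (n1 + n2)  # pooled conversion rate
    se = math.sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
    return (p2 - p1) / se

# 20 vs 40 conversions (a 100% lift): comfortably past the ~1.96 cutoff
# for 95% confidence.
print(two_proportion_z(20, 2000, 40, 2000))
# 20 vs 30 conversions (a 50% lift): below the cutoff, inside the noise.
print(two_proportion_z(20, 2000, 30, 2000))
```

With the assumed traffic, the 100% lift clears the significance threshold while the 50% lift does not, which matches the advice in the quote.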

Ditch Your Outliers

You’re going to have to eliminate “outliers” in your data. Outliers include extreme orders in ecommerce sites and rushes of leads from activities such as email blasts and bursts of word of mouth.
For an ecommerce site, you should look at orders that are one or two standard deviations away from the mean.
So, what does that “mean”?
Here is two weeks of daily sales data for a site that gets about one sale per day.

There are two obvious outliers: One day with no sales in the first week, and one with $160 in sales the second week. Statistically, a 16% increase is irrelevant, but the point is driven home when you calculate the standard deviation range.
For this data, an outlier will be lower than $27.90 or higher than $86.89.

When we remove outliers we see a drop in sales of six percent. This is statistically uninteresting as well, but illustrates how outliers can affect results.
If you’d like to see how I calculated the min and max, download Example of Outliers-Conversion Scientist-low-traffic post.
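The outlier filter can be sketched with Python’s statistics module. The daily sales figures below are invented (not the ones from the post’s spreadsheet), and one standard deviation is used where the post suggests one or two:

```python
import statistics

# Hedged sketch of the outlier filter described above: flag daily sales
# more than K standard deviations from the mean, then compare averages
# with and without those outlier days. All figures are invented.

daily_sales = [55, 60, 48, 0, 52, 58, 61, 50, 57, 160, 54, 59, 49, 56]

K = 1  # the post suggests one or two standard deviations
mean = statistics.mean(daily_sales)
stdev = statistics.stdev(daily_sales)  # sample standard deviation

low, high = mean - K * stdev, mean + K * stdev
outliers = [s for s in daily_sales if not (low <= s <= high)]
cleaned = [s for s in daily_sales if low <= s <= high]

print(f"bounds: {low:.2f} to {high:.2f}")
print(f"outliers: {outliers}")
print(f"mean before: {mean:.2f}, mean after: {statistics.mean(cleaned):.2f}")
```

Here both the zero-sale day and the $160 spike fall outside the bounds and get removed before the pre/post comparison.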

Don’t Let it Run

Split testing can be done on low-transaction sites. However, don’t let the test run for more than, say, six weeks. The results just aren’t reliable. There are too many other variables mucking with your data over such long timeframes.

Always Be Testing

Just because you have few transactions per month doesn’t mean you can’t be learning. In fact, not learning may well be the reason you have few transactions per month. Never stop trying things, and use good data to decide what you keep and what you throw away.
Feature image by Shaun Garrity via Compfight cc and adapted for this post.

Today’s question is at the heart of AB testing: “How do you decide what elements of a site to test?” We call these test ideas “hypotheses.”
But a better question is, “How do you determine what NOT to test?”

It’s relatively easy to come up with ideas that might increase your conversion rate. We typically come up with fifty, seventy-five, one-hundred or more ideas for each of our client sites. Filtering through this list is the hard part.


The Five Steps

In this week’s podcast, I take you through the five steps we use to determine what to test on a website.

  • Step One: Look for Evidence
  • Step Two: Rate the Traffic
  • Step Three: How Hard is it to Test?
  • Step Four: What does experience tell you?
  • Step Five: Bucket the Winners

We’re pretty good at picking low-hanging fruit. Last year 97% of our clients continued working with us after our initial six-month Conversion Catalyst program that uses this approach.

Each of our hypotheses gets an ROI score using the following formula:

ROI = Evidence + Traffic Value + History – Level of Effort

Once we’ve ranked all of our hypotheses, we classify them into buckets.
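A minimal sketch of that scoring and ranking in Python; the hypotheses and the 1–5 ratings are hypothetical, not from an actual client engagement.

```python
# Each hypothesis is rated on assumed 1-5 scales (illustrative only).
hypotheses = [
    # (name, evidence, traffic_value, history, effort)
    ("Shorten the lead form",   4, 5, 3, 1),
    ("Add trust badges",        3, 4, 4, 1),
    ("Redesign the navigation", 2, 5, 2, 5),
]

def roi_score(evidence, traffic_value, history, effort):
    """ROI = Evidence + Traffic Value + History - Level of Effort."""
    return evidence + traffic_value + history - effort

# Rank hypotheses from highest to lowest ROI score.
ranked = sorted(hypotheses, key=lambda h: roi_score(*h[1:]), reverse=True)
for name, *ratings in ranked:
    print(f"{roi_score(*ratings):>3}  {name}")
```

Note that effort is subtracted: a high-impact idea that requires a full redesign can still land at the bottom of the list.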

The top ten hypotheses reveal an interesting pattern when you bucket them.

Bucketing Your Hypotheses

I also talk about how we classify hypotheses into buckets.

  1. User Experience: For hypotheses that would alter the layout, design, or other user interface and user experience issues.
  2. Credibility and Authority: For hypotheses that address trust and credibility issues of the business and the site.
  3. Social Proof: For hypotheses that build trust by showing others’ experiences.
  4. Value Proposition: For hypotheses that address the overall messaging and value proposition. Quality, availability, pricing, shipping, business experience, etc.
  5. Risk Reversal: For hypotheses that involve warranties, guarantees, and other assurances of safety.

This helps us understand what the primary areas of concern are for visitors to a site. Are there a lot of high-ranked hypotheses for Credibility and Authority? We need to focus on building trust with visitors.
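Tallying the top-ranked hypotheses per bucket is a simple count. The hypotheses below are hypothetical examples, not client data.

```python
from collections import Counter

# Hypothetical top-ranked hypotheses, each tagged with its bucket.
top_hypotheses = [
    ("Add customer testimonials",       "Social Proof"),
    ("Show security seals at checkout", "Credibility and Authority"),
    ("Clarify the free-shipping offer", "Value Proposition"),
    ("Add a money-back guarantee",      "Risk Reversal"),
    ("Display press mentions",          "Credibility and Authority"),
]

bucket_counts = Counter(bucket for _, bucket in top_hypotheses)

# The most common bucket suggests the visitors' primary area of concern.
print(bucket_counts.most_common(1))  # → [('Credibility and Authority', 2)]
```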

There’s much more detail in the podcast and my Marketing Land column 5 Steps to Finding the Hidden Optimization Gems.

As a Conversion Scientist, I used my background in conversion rate optimization and landing pages to create the first draft of my OkCupid profile: the landing page of me. I used the chemistry of a successful landing page formula to make sure I hit all the known conversion points. OkCupid’s setup limits the type of test I can do, so we’ll be doing pre/post testing. I started by putting my best page up, letting it run for two weeks, and calculating my “pre” conversion rate.

[dating-series]

[pullquote]This is a key piece of knowledge for any business ready to test – know your base conversion rate.[/pullquote]
During the first 14 days my profile was live, I had 104 visitors with nine messages. Those nine messages resulted in four qualified leads. My starting overall conversion rate is 8.65%. My qualified lead conversion rate is 3.84%.
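Those base rates are simple ratios; here is a quick check in Python using the numbers above.

```python
# First two weeks of the profile: visitors, messages, and qualified leads.
visitors = 104
messages = 9
qualified_leads = 4

overall_rate = messages / visitors * 100
qualified_rate = qualified_leads / visitors * 100

print(f"Overall conversion rate: {overall_rate:.2f}%")  # 8.65%
print(f"Qualified lead rate: {qualified_rate:.2f}%")    # 3.85% (3.84% when truncated)
```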
My first stop in testing was a critique with an expert in landing pages. Lucky for me, I work for one. Sometimes it’s difficult to assess your own work, so calling in an outside expert is always a great place to start.
The Conversion Scientist, Brian Massey, was nice enough to do one of his famous live critiques. In his video critique he pointed out blind spots and a few things that might be troubling.
If you’re not ready to call in an expert, there are tools you can use to give you a better sense of what might be happening. As a Conversion Scientist, I always start with analytics, click-tracking heatmaps, and screen capture sessions. These data points allow me to come up with a hypothesis list.
When creating a hypothesis list for a client, analytics is always the first stop. It allows me to identify key pages and performance metrics. I look at landing pages, all pageviews, audience channels and conversion metrics for each. This is where I start to see patterns and look for what pages I should be testing.
Questions to ask when looking at analytics:

  • Where are visitors coming from?
  • Which pages are they landing on?
  • Which pages get the highest traffic?
  • What are the key pages in the funnel?
  • Are there pages with high exit or bounce rates?

I use this data to compile a list of key pages I want to look at more closely.
With OkCupid, as with most landing pages, it’s pretty easy to know what to target. Visitors come from the /match or /quickmatch pages and land on my profile page.
Once I know which pages I will focus on, I switch to another set of tools. Heatmaps and session recordings provide a lot of insight into where visitors are getting hung up. The data these tools generate is a hotbed for hypothesis generation.
They allow me to see if a key call-to-action is in a blind spot or if something on my page is getting surprise attention. Check out the Conversion Lab for a list of awesome conversion tool options.
[sitepromo]
Even though OkCupid won’t let me install Crazy Egg or Hotjar, I’m still going to treat my dating landing page like I would a client’s website when I start the optimization process. I make a list of hypotheses I think could improve the conversion rate and come up with a plan of action about how to test each one.
Normally the resources I can install on a client’s website inform the hypothesis list and the recommendations I come up with, so I have to be creative by relying on my own experience and on an expert’s opinion, namely Brian Massey.
Here are a few hypotheses from his analysis.

I create a list of hypotheses to test when I begin optimizing.


Brian’s critique gave me some great ideas on what to test. I know that my copy needs a bit of work, as does my landing page’s scannability. This is the first hypothesis I’m testing:
Hypothesis: If I change the copy to be about the visitor instead of myself, and improve scannability with bold text and paragraph breaks, I can improve conversions.
I carefully changed all of the “I” statements and made them about the visitor. I also added more paragraph breaks and highlighted key words in my text allowing a visitor to more easily scan my profile.

My revised profile


When testing, it’s important to isolate as many variables as possible, so for now the copy is the only thing I changed. I could have swapped out my headshot for a party shot, but if I see an increase in conversion rate, I won’t know if it’s the photo or the copy that’s improving my numbers.
For our testing purposes, my primary goal will be to beat my qualified lead conversion rate of 3.84%, but I will be tracking my overall conversion rate and visitor count as well.
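When the post-period numbers come in, a two-proportion z-test is one way to judge whether the new rate actually beats the old one. This sketch uses only the Python standard library; the post-period counts are hypothetical, since the test was still running when this was written.

```python
from math import erf, sqrt

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal-CDF tail probability, doubled for a two-sided test.
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Pre period: 4 qualified leads from 104 visitors (3.84%).
# Post period: hypothetical 9 leads from 110 visitors (8.18%).
p = two_proportion_p_value(4, 104, 9, 110)
print(f"p = {p:.3f}")  # at low traffic, even a big jump may not be significant
```

This ties back to the low-traffic lesson above: with ~100 visitors per period, an apparent doubling of the conversion rate can still fail to reach significance.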
I’m going to want to test more than one hypothesis to get this profile just right, so my next test will focus on images. Choosing the right images is vital to the success of a landing page, maybe even more so on this particular type of landing page. I did some research, scouring the internet for articles from online dating experts, and determined the best profile photos show a smiling woman looking at the camera, showing some skin but not too much.
I had a small selection of photos I thought would fit the bill, so I decided to take an informal poll of men who fit the type I was looking for: I asked a bunch of my guy friends to help me choose a photo. The photo of me in a black sleeveless dress, smiling warmly at the camera, was the clear winner. I filled out the rest of my profile photos with a variety of activities and a few shots of me dressed up a bit to show that while I may wear a lab coat to work, I do clean up okay for a night on the town.
This first test isn’t about the images, but after Brian’s critique, I knew that my images might not be saying what I wanted them to say. For this initial pre/post test, I left the photo winners from my poll as they were but added captions to clarify what I wanted the viewer to get from each image.
I’ve shared what I was doing when this photo was taken and also indicated that it’s a fairly recent photo.


With my changes made and my visitor count ticking up, there’s nothing to do but wait and see. We’ll check back in a week (and I’ll look every day in between) to see how my text changes have fared. With any luck (or in my case, with science), I’ll have upped that 3.84% conversion rate.

[dating-series]

[signature]

How many goals do you set when you’re designing a split test for your website?

We’re goal-crazy here in the lab at Conversion Sciences. It is not unusual for our tests to have dozens of goals. Why is that?

We see split testing as a data collection activity, not a tool that gives us answers. It’s not like Wikipedia. The split-testing software on the market today is amazingly agile when it comes to tracking, targeting, and snooping on visitor behavior. We certainly want to track transactions, revenue, and leads. But we learn so much more from our tests.



In my new Marketing Land column The Multi-Goal Magic Of Split Testing Software, I describe how we use some of these goals to find sweet spots in a website.

  • Find out how to “light up” a funnel that is invisible to analytics.
  • Discover which pages are most influential in converting.
  • Segment your audience based on their behaviors.

You can listen to the column or read it for yourself.

The Mobile Web is still in its infancy. Today, alleged “mobile best practices” are nothing more than successful desktop strategies scaled to a smaller screen. But people behave differently on small-screen devices than they do when they are sitting at a computer.

Conversion Sciences has begun to see what Mobile Web 2.0 will look like. Having completed dozens of mobile design split tests, key trends have begun to show themselves. Much of what we have learned flies in the face of conventional beliefs.

This is why we test.

Some of our customers now have higher converting mobile sites than desktop sites.

Our approach to mobile design is controversial because, as scientists, we can’t just accept traditional wisdom at face value.  We need evidence.

Joel Harvey reveals the results of dozens of tests we’ve completed. Insights are based on real tests; no gut instinct here. Watch Mobile 2.0: Judgment Day to learn what he has discovered. He shares:

  • Can mobile websites convert better than desktop sites?
  • How to increase mobile conversion rates.
  • What is poison to your mobile conversion rate.
  • How iPhone and Android visitors act differently.

Watch the replay on demand in its glorious entirety.

Don’t ignore your mobile traffic. It can be a real revenue generator sooner than you think.

The fight for online leads and sales has traditionally been fought at the search engine. That is changing.

Web analytics, bid management, competitive intelligence, ad testing and ad management tools are all common staples of any serious paid search effort. Return on ad spend (ROAS) is being tracked all the way through the sign up or purchase process and ad strategies are being adjusted accordingly.

Quietly, the battle for online leads is moving to a new front. This new front is measured by revenue per visit and its kissing cousin, conversion rate. Like the tide that floats all boats, website optimization is being seen as the way to reduce all marketing costs by dropping the acquisition cost of new prospects and customers.

Why do we say this is happening quietly? That is the conclusion we came to when examining an unusual data set from SpyFu.com. We were able to determine which businesses had conversion optimization tools installed on their website. This, we reasoned, gave us a pretty good idea of which businesses would dominate in the world of online marketing — assuming they were actually using the tools.



In this month’s podcast, based on the Marketing Land column Data Exposes Scandalously Low Adoption Of Conversion Optimization Tools, Brian the Conversion Scientist explores the usage of conversion optimization tools for two industry segments: Higher Education and B2B Software.

In one report, 73% of businesses are spending between $500 and $5000 per month on paid search ads. Almost a quarter are spending between $5000 and $50,000 per month. Yet, only 14% of businesses have at least one website optimization tool installed.

Who will be the winners on this new front? Where does your business fit in these statistics?

To get the most out of his column, download one of the free reports that share all of the data he uses.

In these reports you will learn:

  • Why your team needs time to review analytics.
  • Why businesses with smaller ad budgets should focus more on acquisition costs.
  • How to decrease your Search Ad costs.
  • Why you shouldn’t invest in social media sharing.