Do you know how much money you’re losing to shopping cart abandonment?
Do you know how much more money you could be earning with an optimized checkout experience?
The statistics don’t lie. The average shopping cart abandonment rate currently stands at nearly 70%. That means 7 out of 10 highly qualified leads – people who like your product enough to click “Add to cart” – are being lost during the checkout process for one reason or another.
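If you want to put a number on what that means for your own store, the math is simple. Here's a back-of-the-envelope sketch in Python – every figure below is a hypothetical placeholder, so plug in your own traffic and order values:

```python
# Back-of-the-envelope cost of cart abandonment.
# All inputs are hypothetical placeholders -- substitute your store's numbers.
monthly_checkouts_started = 1_000   # visitors who clicked "Add to cart"
abandonment_rate = 0.70             # the ~70% industry average cited above
average_order_value = 80.00         # dollars per completed order

abandoned = monthly_checkouts_started * abandonment_rate
revenue_lost = abandoned * average_order_value

# Even a modest win matters: suppose optimization recovers 10% of abandoners.
recovered_revenue = abandoned * 0.10 * average_order_value

print(f"Lost to abandonment: ${revenue_lost:,.0f}/month")      # $56,000/month
print(f"Recovering 10% adds: ${recovered_revenue:,.0f}/month") # $5,600/month
```

Run it with your real numbers before reading on – it makes the rest of this post much more concrete.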
There is no better place to focus your optimization efforts than the checkout process. Today, we’re going to cover the most common reasons customers abandon ship during checkout and review the anatomy of an optimized checkout experience with the help of this incredible infographic from our friends at SurePayroll.
 
How to Build an Effective Shopping Cart for Your eCommerce Site
Let’s look at some of the key takeaways for eCommerce store owners and optimizers. If your goal is to create an optimized checkout experience, the following points are a must-read.

The Top 4 Reasons Customers Abandon Shopping Carts

There are a lot of reasons a given visitor might abandon your website during checkout. Using the data listed above, we can see some major themes about what most influences cart abandonment.

1. Extra Costs & Price Ambiguity

According to consumers, the #1 reason for cart abandonment BY FAR is hidden costs that don’t show up until they have begun the checkout process. In a similar vein, the 4th most cited reason was that consumers were unable to ascertain the total cost of the transaction before starting checkout.
What this tells us is that consumers want to know EXACTLY what to expect when they begin checkout and they absolutely do not want any new information thrown at them along the way.
In some niches, costs like taxes or shipping might be expected and acceptable, while in other niches, they will be considered new information. In all cases, however, extra fees and other costs that aren’t disclosed ahead of time will often result in cart abandonment.

2. Overly Complicated Checkout Process

As we see in the infographic, consumers hate complicated checkouts. They don’t want to create an account. They don’t want to fill in layers of unnecessary information. They don’t want to jump through 5 rounds of hoops.
They want to pay you money, get their stuff, and leave.
The longer and more complicated your checkout process is, the more likely primed buyers are to cancel their transaction instead of paying you.

3. Checkout Has Errors

If you can’t complete checkout… you can’t complete checkout.
Usability testing and eliminating errors should always be your #1 priority. It’s simple. There’s no guesswork or strategy involved. If you’ve been dragging your feet on this, you are literally throwing away money.
Eliminate checkout errors now.

4. Lack Of Trust In The Website Or Brand

Trust is a very important piece of ecommerce. Thanks to the Wild-West-like landscape of the internet’s opening decade, many consumers have a deeply ingrained level of mistrust towards any brand or website they haven’t already bought from.
While the landscape is much cleaner today, and consumers have many levels of protection in place, those feelings of mistrust tend to surface during the checkout process.
When it’s time to actually pay money, consumers want to be confident that they will get what they paid for. During this process, any signals that can cause skepticism will likely result in cart abandonment. Even just a lack of positive trust signals can be enough to cause abandonment.
It’s important to plant a continuous stream of encouragement, proof, and other trust signals within your checkout process.

How To Reduce Abandonment & Create An Optimized Checkout Experience

Now that we’ve discussed some of the primary reasons consumers abandon ship during checkout, let’s discuss how to reduce abandonment and create an optimized checkout experience for your customers.

1. Tell Users What To Expect Before Checkout

One of the best ways to reduce cart abandonment is to tell users exactly what to expect before checkout and then avoid springing new information on them during the checkout process.
For some niches, this is as simple as including the full price on the landing page. For other niches, this can be more complicated.
The travel industry has gotten pretty good at this. In situations where numerous fees apply, they include all costs in the price users see and make a point of clarifying that the rate covers everything:

If the cost will depend on a future variable, then you can either clarify that a future cost is still on the table – “Shipping cost not included” – or you can change your business model to eliminate variables for the user.

2. Simplify And Streamline The Checkout Process

The checkout process should feel simple and intuitive. Sometimes there are unnecessary steps or requirements that need to be eliminated, but at the same time, putting everything on one page isn’t always the answer.
In the example below, the two-step checkout on the left converted better than the one-step checkout on the right. This is likely due to the one-page checkout overwhelming the visitor with 15+ lines of data to enter, while the 2-part checkout broke things down into more manageable sections.

When it comes to optimizing your checkout experience, it’s very important that you incorporate a proven A/B testing framework. Simpler is usually better, but the specifics of implementation can be tricky to predict.
Here are some great examples to pull from: 8 Ecommerce Testing Examples You Should Have Tried Already
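When you do run those tests, it helps to check that a winning variant isn't just noise. Below is a minimal sketch of a two-proportion z-test in Python – the visitor and conversion counts are made up for illustration, and in practice you'd likely reach for a stats library or your testing tool's built-in significance report:

```python
from math import sqrt

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test for an A/B test.

    conv_a / conv_b: conversions in each variant
    n_a / n_b:       visitors in each variant
    Returns (lift, z), where lift is B's relative change over A.
    """
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)                 # pooled rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))   # standard error
    z = (p_b - p_a) / se
    lift = (p_b - p_a) / p_a
    return lift, z

# Hypothetical numbers: variant B (two-step checkout) vs. A (one-step).
lift, z = two_proportion_z(conv_a=300, n_a=10_000, conv_b=360, n_b=10_000)
# |z| > 1.96 is roughly significant at the 95% level; here z is about 2.4,
# so a 20% observed lift on this sample size would clear the bar.
```

The point of the sketch is the sanity check itself: a big observed lift on a small sample can still be chance, so look at the z-score (or your tool's p-value) before declaring a winner.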

3. Eliminate All Checkout Errors

As I mentioned before, this is really simple conceptually. Eliminate errors.
Remember that while things might run smoothly in Chrome on your desktop computer, errors can pop up when you change browsers and devices, and there are a lot of different devices in use out there.
One of Conversion Sciences’ clients had a strange error that only appeared for Internet Explorer visitors. The spinning “thinking” icon appeared after the visitor selected their state. But instead of disappearing after updating checkout information, a new spinning icon was added. This happened every few seconds.

This checkout error only happened on Internet Explorer. Every few seconds, a spinning icon was added to the page.


An A/B test revealed that this error was costing them $1,500,000 per year. Ouch.
Here’s an in-depth look at some of the shopping cart bugs that can pop up in your code.

4. Build Trust Throughout The Checkout Process

It’s easy to forget that “conversion” isn’t the moment someone clicks “Buy Now” on your landing page.
It’s not a true conversion until the money has been transferred.
Accordingly, we need to view the checkout process as an extension of the landing page, and just like we want to establish and build trust on the landing page and the preceding funnel, we want to continue building trust during the checkout process itself.
There are many ways to do this, but one of my favorites can be seen below:

In this example from SamCart, the user is shown a phone number for immediate customer service, as well as four customer testimonials with pictures.
Instead of just hoping the user goes through with the purchase, SamCart is actively building trust on the checkout page by displaying other customers who were happy with their purchase and reminding the user that if they have any trouble, help is only a call away.

Conclusion

The checkout experience is one of the more complicated optimization puzzles you’ll tackle, which is why sound A/B testing is such an instrumental part of the process.
I hope today’s infographic has provided you with some insightful ideas for how to optimize your website’s checkout process and increase revenue for your business.

Listen. We both know that “best practices” don’t mean much.

… right?

If you aren’t A/B testing, you are leaving a ton of money on the table.

True.

BUT here’s the deal.

A/B testing is a time-consuming method of optimization. It’s effective, but if you can simply click “edit” and make an obvious improvement, start there.

This is why “best practices” can be so powerful. They let you apply what others have learned and get quick wins when nothing is working. Plus, when you begin implementing a solid A/B testing framework, best practices can give you some great hypotheses for your first round of tests.

In today’s post, we’ll cover 21 proven best practices, backed up by case studies, statistics, and data.

Let’s begin.

#1: Start with a great headline to boost conversions 41%

The headline is perhaps the most crucial element of your entire landing page. Why? Let’s ask David Ogilvy, famous advertising revolutionary, his thoughts on headlines:

“On the average, five times as many people read the headline as read the body copy. When you have written your headline, you have spent eighty cents out of your dollar.”

If this has changed in the digital age, it has only become more pronounced. Now, let me hammer it home with a case study…

BettingExpert is an online betting forum where tipsters can share their experience and tips. They ran an A/B test on their headlines with three variations: one with a question, one with a benefit, and one utilizing loss aversion.

Example of headline AB test

Three similar headlines on this page delivered very different results.

As you can see, the benefit headline (which spoke directly to its target reader’s dreams and aspirations) boosted conversions by 41.14%.

The takeaway here? Focus on the benefit and write amazing headlines.

#2: Add strong testimonials and social proof to be 12X more trusted

Have you ever seen a testimonial like this:

“Awesome service! Definitely recommend!” – Jane Dough

You may even have used something like that in the past.

The thing is, social proof works. Studies show nearly 70 percent of online consumers look at a review prior to making a purchase, and reviews are 12 times more trusted than product descriptions and sales copy from manufacturers.

But, weak social proof can harm your conversion rates.

Frankly, most customers will write a weak review or testimonial, even though they mean well. Derek Halpern from Social Triggers actually increased conversion rates by 102% by removing social proof.

So, use social proof. But you’re probably better off writing a strong testimonial yourself, then getting permission from your customer to confirm it matches their experience. This ensures it will work in your favor.

Pro Tip: Use exact numbers in your testimonials if you can. This works due to a principle in psychology known as Ambiguity Aversion, which states that humans prefer known risks over unknown risks (i.e. we like to know what we’re getting into). Robbie Richards does this well on his blog.

Landing page testimonial example

Your testimonials should be written by you with the agreement of your customer.

#3: Write action-oriented copy to increase clicks by 93%

If you’ve ever written a paper in high school, your English teacher probably told you to write in “active” voice, not “passive” voice. Why?

Because passive voice has a weak quality, is bland, and can be boring. Active language excites, energizes, and drives action. See what I did with the previous two sentences?

It turns out that your English teacher was right. Here’s why.

The company L’Axelle sells underarm sweat pads and ran an A/B test on the product landing page. The original page used a passive headline attempting to integrate the benefit: “Feel fresh.” The second used direct language and a strong verb: “Put an end to sweat marks!” With language like this, the exclamation point is probably redundant.

Version A

Version A of landing page with passive call to action.

Version B

Version B of landing page with active call to action.

That simple change in copy led to a 93% increase in clicks, for a total conversion rate of 38.3%.

It goes to show – landing page copy matters. Make your copy action-oriented.

#4: Use contrasting CTA colors to grow sales 35.81%

This seems so mundane and simple (and sometimes, it is). However, a simple change in the CTA button color can have surprisingly large effects on landing page conversion rates.

Wanna know how? Here’s a case study:

A major eCommerce site that sells hand-painted porcelain wanted to grow their business (who doesn’t?). They decided to get Unbounce to help them out.

Unbounce came in and made one super simple change: They made the “ADD TO CART” button green instead of blue. The result?

A 35.81% increase in sales (yes, sales, not just clicks).

AB Test in which button color was changed

Increasing the visibility of the call to action increased conversions.

Now, why did this work? I don’t think it’s because green is a particularly compelling color.

Changing the CTA color worked because the green gave the button some contrast.

The blue didn’t stand out at all, whereas the green pops. Our focus, then, is not on particular colors being better for CTAs than others, but on ensuring a color contrast to draw the eye.

Wondering what color to use? Try picking the opposite color (from your brand’s or landing page’s main color) on this color wheel:

Designers use the color wheel to select complementary – and conflicting – colors.

In other words, if your main color is yellow, try a blue or purple CTA. If your main color is green, try a purple or red CTA.

#5: Command 31.03% more people to click with actionable CTA copy

You had to have seen this coming – if action-oriented copy in your headlines and body increases click-through rates, of course it works in your call to action (CTA) copy as well. The CTA is usually the button or link the visitor must click.

But, don’t take my word for it. Let’s take a look at a case study:

WriteWork offers essays and writing guides for students. Their original checkout page CTA (shown below) simply said “Create My Account”. Who wants to create another account?

However, when they changed the text to say “Create Account & Get Started”, they saw a nice 31.03% increase in conversions. Not too shabby, eh?

AB Test in which the call to action button was changed

New call to action copy delivered a significant conversion increase in this test.

The verdict? Make your CTA copy actionable, and tell your customer exactly what will happen when they click it.

#6: Use faces, but not near a CTA (unless they’re a top industry influencer)

The human brain is very drawn to faces and eyes. We have a tendency to look at faces before anything else on a web page. This can be a good thing… or it can hurt conversions.

When it comes to using faces on a landing page, they can add credibility and trust. However, they can also distract the reader from a message or CTA.

To get around this, only use faces of people the user is bound to recognize and trust (such as an authority in your industry) near CTAs. For maximum effect, have them looking at, and potentially even pointing to, your CTA.

Of course, you still want to put pictures of you and your team to help build trust – just don’t place them near the call-to-action.

In a case study, Medalia Art was able to boost clicks 95.4% by replacing the images of art on their home page with images of the artists.

The images of the artists increased conversions in this art case study.

#7: Format like a boss

Just as formatting makes your blog posts more engaging, so too does formatting make your landing pages easier to navigate and understand.

What do I mean by formatting?

  • Use bulleted lists to state your key benefits.
  • Use images to give the eyes a rest from reading text.
  • Utilize white space to avoid extra noise and distraction.
  • Include headings and subheadings to break up your page.
  • Use directional cues (like arrows) to point the viewer’s eyes to your CTA.

Great formatting makes your landing page easier to skim–and you know most of your visitors are only going to skim–making the most important points immediately apparent.

Just as people prefer better-looking people, people prefer better-looking websites because they associate beauty with perceived trust and credibility.

Want an example? Basecamp redesigned their landing page and found a 14% increase in conversions.

Basecamp landing page redesign AB test

Basecamp tested formatting in this landing page AB test.

Here’s one more example, for good measure: Swedish company Unionen saw a 15.9% click-through boost when they bullet-pointed their benefits:

Version A:

AB test version with block of text

Version A: A big block of text at the key call to action.

Version B (15.9% Increase):

AB test version with bulleted text

Version B: You don’t have to read Swedish to know that bulleted text and white space wins.

#8: This is 2017… use a spellchecker already

Having grammatical or spelling errors in your copy can seriously hurt conversion rates. It makes you appear unprofessional at best, and like a scam at worst.

Want a real-life example of how badly a small mistake can harm your business? Take a look at this case study from Practical Ecommerce on a website selling tights – correcting their spelling from “tihgts” to “tights” on their product category page shot conversions up 80%.

Example of bad spelling on a landing page

Bad spelling can destroy visitor trust.

In a world where tools like Grammarly and built-in spell checkers exist, there’s just no excuse not to have immaculate grammar and spelling. Take an extra ten minutes to read through your page to ensure no errors get through.

Pro Tip: I actually like to read my writing out loud at least once. This helps me catch any errors and get a better idea of the flow and overall sound of things.

#9: Consider adding multiple CTAs

Multiple CTAs?! Are you crazy?

Before you scroll down and leave me a nasty comment, hear me out. I’m not talking about having a variety of buttons and forms leading to different places.

Rather, on longer pages, you should have multiple buttons and/or opt in forms that lead to the same outcome.

Having more than one chance for the customer to opt in allows them to scroll through and click at their own pace. If they don’t click your above-the-fold CTA, for example, they’ll have another chance in the middle or at the end of the article.

That said, too many buttons can cause your visitors to get decision fatigue, becoming tired of too many choices and leaving the page.

So, short rule of thumb? Place multiple CTAs on long pages, and a single one on short pages.

#10: Ditch the sharing buttons (unless you only have one other option)

One of the 21 persuasion techniques for conversion optimization is something called the “Hobson’s +1 choice effect.” This effect essentially states that having more than two choices can cause anxiety and negative feelings, but we do still want to have a choice.

As such, if you only have one option on your landing page (the CTA), adding a “Tweet this” button can help, according to the choice effect. However, if you already have multiple offers, CTAs, or links, social buttons can add to the noise and reduce conversions.

In one case from Taloon.com, removing social sharing buttons from their product pages increased conversions by 11.9%.

AB test in which removing social sharing buttons increase conversion rate

Removing social sharing buttons increased conversion rate.

However, I’d like to point out two key elements here:

  1. They had four social sharing buttons instead of one (like “Share this product” or “Tweet this”), creating too many distractions.
  2. They also had many other choices on these pages, like clicking through to a separate category or page on the website, putting them well above that ideal two-choice limit.

I’ve already said it, but I’ll say it again: you have to test these things to find out exactly what works for your product, audience, and business.

#11: Highlight your guarantees to build trust

Purchasing is an emotional decision, which is then backed by logic. Therefore, once you’ve sold someone emotionally on your product or service, you must then provide them with logical reasons to actually get through the checkout page.

One way you can do that is to highlight your guarantees.

A money-back guarantee is an amazing way to get people to commit. It’s truly risk-free. If they don’t like it, they can get their money back.

Neil Patel increased sales of his Traffic System course by 21% when he highlighted a 30-day money back guarantee.

It doesn’t have to be money back, either. Other guarantees you can try include:

  • A risk-free trial period.
  • A low-price guarantee (where you’ll refund them if they find a higher price).
  • A forever guarantee (where you’ll replace the product for life).

Don’t be afraid to test different guarantees, just as we talked about testing different offers. You may find a free trial – or a $1 trial to avoid credit card complications – converts better than the money-back guarantee.

Pro Tip: Another way to build trust is by adding an SSL certificate to your site. That’s the green lock that says “secure” next to it. This shows your visitors their information is safe.

#12: Use the inverted pyramid method (keep the most important stuff at the top of the page)

The inverted pyramid is a writing style coined by journalists. It means keeping all of the key benefits and most important takeaways at the top of the page, then getting into the details as you get further down the page.

It looks like this:

The structure of the inverted pyramid.

(Source)

So, your attention-grabbing claims and statistics should be used at the top of the page to get visitors engaged, then your body copy, as you go down the page, should build anticipation for your product, at which point you give your CTA.

Of course, not all landing pages will be long enough to use the inverted pyramid method, but for longer pages it works wonders. After all, only about half of your visitors will ever reach the bottom of your page – you need to entice them.

Percent of article content viewed.

Use the other best practices mentioned in this post, like formatting and imagery, to ensure you have the most important stuff first.

#13: Add related imagery and videos for 80% more conversions

Images aren’t just for formatting. They can be used to convey your main benefits and to help users understand what your product or service is about.

One study by eyeviewdigital.com even found that using video on landing pages can increase conversions by 80%. Check out their case studies if you’re interested in learning more.

When it comes to images and video, however, there are two things to keep in mind:

  1. Don’t overuse them. White space is your friend.
  2. Make the images relevant. Stock photos usually work against you.

If you’re in need of some images to add to your site, check out Unsplash. They have free high-res photos anyone can use. You can also use a tool like Canva to edit the images. For free.

Check out KISSmetric’s guide to creating unique landing page videos for more help on the video side of things.

#14: Remove any extra links

You’ve probably heard this tip before. “Remove navigation links so your visitors have to make a decision.”

However, with only 16% of all landing pages following this practice, does it really work?

HubSpot tested it to find out. They created two landing pages: One with a navbar, social sharing links, and footer, and one without any of the three.

Example of hubspot landing page with navigation

Navigation should not be necessary on a complete landing page.

The results? Up to a 28% increase in conversions. They even tracked every change and put the results in a nice little chart:

Results of landing page AB test in which navigation hurt conversion rate

In this case study, navigation hurt conversions on most pages or didn’t help at all.

In other words, it’s worth a shot to remove extra links. It may not always work, and it may not be worth losing the clicks to other parts of your website, but it has the potential to increase sign-ups.

#15: Keep your landing page consistent with your brand and ads

The first job of the landing page is to keep the promise made in an ad, email, social post or link. Any variation can cost you conversions.

Consistency is key to a great many things, from blogging to getting fit. It’s necessary to succeed, and people love seeing it.

Especially on landing pages.

I can think of no better example than Optimizely’s case study on their PPC ads. They ran two tests:

In the first test, the headline was kept the same regardless of the ad copy they used. In the second, they matched the landing page headline to the ad copy.

Test A:

Example of landing page that serves different offers

Test A: One landing page attempts to keep three different promises. Unsuccessfully.

Test B:

Example of using different landing pages for each offer

Test B: These landing pages keep the specific promises made in each ad.

The results? A 39.1% increase in conversions.

Of course, the headline isn’t the only thing you should keep consistent. Also try to:

  • Match the colors of your ad/brand with your landing page.
  • Use similar images and design.
  • Use similar (and even an exact match) of your ad copy on your landing page.

Keeping things consistent ensures people aren’t confused when navigating your site and get what they expected when they click your ads.

#16: Achieve a 214% increase in conversion rate by asking for more information

One of the landing page best practices you often hear is to reduce the number of form fields as much as possible. It’s true, this reduces friction for the customer and has been shown to increase conversion rates.

BUT (there’s always a catch, isn’t there?), asking for more information better qualifies your leads and, in many cases, shows them you’re capturing the information you actually need.

Let me give you an example.

Advanced Grass is an artificial grass solution. They were able to achieve a 214% increase in conversions by splitting up their lengthy opt-in form into two parts: contact information and qualifying information.

Part 1:

Part 1 of the multi-step lead generation form.

Part 2:

Example of step 2 of a multi-step funnel

Part 2 of the multi-step lead generation form.

By simply splitting their form into two parts, they are taking advantage of the psychological principle of commitment and consistency, well known in the marketing world thanks to Dr. Robert Cialdini’s book Influence: The Psychology of Persuasion.

Basically, Cialdini found that people are more likely to take additional steps toward something if they’ve already committed to the first step. In Advanced Grass’s case, users have already committed by entering their contact info, so they’re more likely to fill in the few extra details.

Additionally, asking for the right information builds trust. How could the company give you an accurate quote for artificial grass without knowing how many square feet you need or what kind of project it is?

The bottom line? Ask for more info, but split your form into two steps.

#17: 10X clicks by testing different offers

Sometimes in our landing pages, we’re focusing on entirely the wrong offer. Maybe people don’t want a ‘free trial’ or ‘free consultation’. What about a free eBook or a free tool, instead?

I didn’t pull those examples out of my you-know-where. WordStream actually increased conversions 10 times over by changing their offer from a “Free Trial” of their software to giving away a free tool they created: The Google AdWords Performance Grader.

WordStream found success by changing their offers.

Of course, creating your own tool requires time, capital, and finding a good developer, unless you happen to be one. Here are some other offers you can try:

  • A free ebook or video explaining how your software works (alongside a free trial, of course)
  • A free case study detailing how others have used your software to maximize their business.
  • A template or worksheet helping your visitors accomplish something specific.

The key is to offer something highly relevant to what your software does and that’s very valuable, meaning you didn’t slop it together in five minutes. Put some thought into it.

#18: Boost engagement by 102% using scarcity

If you’ve been building landing pages for any amount of time, you’ve surely heard of using scarcity as a tactic to increase sign ups.

Using scarcity means limiting resources in order to get your visitors to take action right away.

For example, putting an expiration date on a coupon, limiting an offer to a certain number of customers, and announcing that you only have a few items left in stock are all versions of scarcity.

Going back to Cialdini, scarcity is one of his psychological principles of persuasion. People place more value on that which is limited.

Let’s look at a case study by KISSmetrics:

Hiten Shah decided to reduce the free trial period from 30 days to 14 days – and he found a 102% boost in engagement. In other words, twice as many people took action and used the free trial during the 14-day period as during the 30-day period.

Kissmetrics changed the length of their free trial

More people signed up during the 14-day free trial than the 30-day trial.

So, use scarcity on your own pages by including a countdown timer on your page or offering a limited number of products.

#19: Pay attention to “the fold” to lift conversions 220%

You think I’m going to tell you to put your CTA above the fold, don’t you?

Well, you shouldn’t necessarily do that, even though it’s one of the most commonly repeated landing page best practices.

Instead, pay attention to the fold. While there is research that supports above-the-fold CTAs, there is research against it as well.

Most engagement happens right at or just below the fold. Let me explain…

Most engagement happens right at the fold or just below it.

As you can see in the chart above, people view the topmost area of the page the least, and view the area “just above the fold” the most (i.e. right where you begin to need to scroll).

Joanna Wiebe of Copy Hackers and Airstory had this to say:

“Don’t cram everything above the fold. Countless tests and scroll- / click-tracking studies have shown that visitors are willing to scroll… as long as they know there’s something to scroll down for. (So don’t create a false-bottom.) Don’t prevent people from exploring your content by making assumptions about their use behaviors.” (via Copy Hackers)

So, where should you really place your opt-in form or CTA?

It depends on the complexity of your offer (and thus, the amount of information needed to explain it). This chart by KISSmetrics explains it perfectly:

Offer complexity affects the placement of your call to action on the page.

Let me give you a more direct example. Marketing Experiments tested one of their client’s CTA placements above- and below-the-fold. Below-the-fold actually resulted in 220% more conversions, likely due to the complexity of their product.

Example of placing a call to action near the bottom increases conversions

In this case, placing the call to action near the bottom of the page increased conversions.

(Source)

What did we learn? Above-the-fold isn’t always best – test your CTA placement.

#20: Don’t rely on these landing page best practices. Test.

All of the best practices on this list can (and probably have) been broken with exceptions at one point or another. Like I said in the very beginning of this post – best practices make a lot of assumptions. Use them, but don’t be afraid to go against them.

In the words of Mark Twain:

“When you find yourself on the side of the majority, it’s time to pause and reflect.”

I’ll leave you with one final case study to prove just how important it is to a/b test your landing pages. Convert Verve, who’s examples you saw in some of the practices above, ran a simple test on the checkout page of one of their clients – removing the green arrow pointing to their CTA button.

As it turns out, removing that green arrow actually reduced conversion rates by 12.29%. Who would have thought? Of course, if you followed along above, it makes sense – removing the arrow reduces the chance for the button to draw the eye.

Example of AB test in which removing arrow lowered conversions

Removing the arrow reduced conversion.

So, in parting: Always test your landing pages, and don’t be afraid to go against best practices once in awhile.

Conclusion

Landing page best practices are just that – best practices. We can take what’s worked for others and copy it for our own use, but ultimately, it comes down to trying different things.

As Mark Zuckerberg says, “Move fast and break things.” Follow expert advice to make your first page the best it can be, then start experimenting.

Now, what did I miss? There are surely more landing page tips & tricks out there I didn’t cover here. Drop me a comment and let me know.

And, if you found even one thing useful about this article, please take a moment to share it.

bill

2017 is just around the corner, and that means a new year with a fresh batch of goals and milestones.
If you increased your website’s conversion rate by 10%, how would that affect your business’ overall growth this year? How would that accelerate your career or revolutionize your bottom line?
Now’s the time to get optimization efforts in motion, and we’re excited to hear about what you have planned for 2017. Leave us a comment and let us know what you’re up to!
In the meantime, here’s a quick recap of Conversion Sciences’ 10 most popular articles from 2016.

  1. 5 Elements of Persuasive Writing that Make Your Posts Takeoff
  2. 7 Best Practices for Using Exit-Intent Popovers, Popups
  3. 5 Tactics for Increasing Your Telephone Sales
  4. 12 Rules for Maximizing Conversions from AdWords
  5. 10 Value Proposition Upgrades That Increased Conversions
  6. AB Testing Statistics: An Intuitive Guide
  7. The 20 Most Recommended AB Testing Tools By CRO Experts
  8. Can Live Chat Increase Conversions?
  9. The Ultimate A/B Testing Guide: By Conversion Sciences
  10. The Proven AB Testing Framework Used By CRO Professionals

And of course, if you’d like to have a group of proven experts handle your CRO efforts in 2017, the Conversion Sciences team is here to help. Our calendar fills up fast this time of year, so don’t put it off.
Contact us right now to schedule a free consultation.
 

How do you choose a Christmas Card for your boss? Better yet, how do you choose one that will get you a CRO budget in 2017? Simply apply these Conversion Principles for a happy new year and a new budget.

We’ve examined a number of holiday cards to determine the one most likely to win you some budget in the coming year. Watch this critique and give your manager the card that will deliver.

Primary Conversion Principle Metrics

Christmas cards are a lot like landing page on the web. They have to appeal to visitors quickly and deliver something meaningful.
Christmas Metrics
We’ll be examining the Christmas Card Graph for each card.
Christmas Card Graph

Lessons Learned

Don’t be too safe.

Playing it safe often means being boring. Open rates will suffer.
 
safe-card-2017

Don’t let your designer make the decisions.

One man’s beautiful design is anothers reading nightmare. Don’t let design get in the way of communicating.
Over Designed Card

Don’t deliver less than you promise.

Making promises is the best way to get people to open your cards — and read your page. However, if you over-promise, you can do more damage than good.
Try Flattery

Use copy that engages the reader.

Our brains need to be challenged to be engaged. Rhymes, humor and alliteration will work to engage the reader and get them to take action.
Rhymes Card

Be interactive.

Sometimes you just have to get them involved to get them interested. Consider asking them to do something on your page.
Budget Secured Card
 
It shouldn’t be a surprise that all of these lessons can be applied to your website and landing pages. This is what we do at Conversion Sciences.
Once you’ve secured that budget, schedule a free Conversion Strategy Session.
Christmas Conversion for the Boss Featured Image

The AB test results had come in, and the result was inconclusive. The Conversion Sciences team was disappointed. They thought the change would increase revenue. What they didn’t know what that the top-level results were lying.

While we can learn something from inconclusive tests, it’s the winners that we love. Winners increase revenue, and that feels good.

The team looked closer at our results. When a test concludes, we analyze the results in analytics to see if there is any more we can learn. We call this post-test analysis.

Isolating the segment of traffic that saw test variation A, it was clear that one browser had under-performed the others: Internet Explorer.

Performance of Variation A. Internet Explorer visitors significantly underperformed the other three popular browsers.

Performance of Variation A. Internet Explorer visitors significantly under-performed the other three popular browsers.

The visitors coming on Internet Explorer were converting at less than half the average of the other browsers and generating one-third the revenue per session. This was not true of the Control. Something was wrong with this test variation. Despite a vigorous QA effort that included all popular browsers, an error had been introduced into the test code.

Analysis showed that correcting this would deliver a 13% increase in conversion rate and 19% increase in per session value. And we would have a winning test after all.

Conversion Sciences has a rigorous QA process to ensure that errors like this are very rare, but they happen. And they may be happening to you.

Post-test analysis keeps us from making bad decisions when the unexpected rears its ugly head. Here’s a primer on how conversion experts ensure they are making the right decisions by doing post-test analysis.

Did Any Of Our Test Variations Win?

The first question that will be on our lips is, “Did any of our variations win?”

There are two possible outcomes when we examine the results of an AB test.

  1. The test was inconclusive. None of the alternatives beat the control. The null hypotheses was not disproven.
  2. One or more of the treatments beat the control in a statistically significant way.

Joel Harvey of Conversion Sciences describes his process below:

[su_note note_color=”#dcf0df” text_color=”#000000″ radius=”10″]

Joel Harvey, Conversion ScientistJoel Harvey, Conversion Sciences

“Post-test analysis” is sort of a misnomer. A lot of analytics happens in the initial setup and throughout full ab testing process. The “post-test” insights derived from one batch of tests is the “pre-test” analytics for the next batch, and the best way to have good goals for that next batch of tests is to set the right goals during your previous split tests.

That said, when you look at the results of an AB testing round, the first thing you need to look at is whether the test was a loser, a winner, or inconclusive.

Verify that the winners were indeed winners. Look at all the core criteria: statistical significance, p-value, test length, delta size, etc. If it checks out, then the next step is to show it to 100% of traffic and look for that real-world conversion lift.

In a perfect world you could just roll it out for 2 weeks and wait, but usually, you are jumping right into creating new hypotheses and running new tests, so you have to find a balance.

Once we’ve identified the winners, it’s important to dive into segments.

  • Mobile versus non-mobile
  • Paid versus unpaid
  • Different browsers and devices
  • Different traffic channels
  • New versus returning visitors (important to setup and integrate this beforehand)

This is fairly easy to do with enterprise tools, but might require some more effort with less robust testing tools. It’s important to have a deep understanding of how tested pages performed with each segment. What’s the bounce rate? What’s the exit rate? Did we fundamentally change the way this segment is flowing through the funnel?

We want to look at this data in full, but it’s also good to remove outliers falling outside two standard deviations of the mean and re-evaluate the data.

It’s also important to pay attention to lead quality. The longer the lead cycle, the more difficult this is. In a perfect world, you can integrate the CRM, but in reality, this often doesn’t work very seamlessly.

[/su_note]

Chris McCormick, Head of Optimisation at PRWD, describes his process:

[su_note note_color=”#dcf0df” text_color=”#000000″ radius=”10″]

chris-mccormickChris McCormick, PRWD

When a test concludes, we always use the testing tool as a guide but we would never hang our hat on that data. We always analyse results further within Google Analytics, as this is the purest form of data.

For any test, we always set out at the start what our ‘primary success metrics’ are. These are what we look to identify first via GA and what we communicate as a priority to the client. Once we have a high level understanding of how the test has performed, we start to dig below the surface to understand if there are any patterns or trends occurring. Examples of this would be: the day of the week, different product sets, new vs returning users, desktop vs mobile etc.

We always look to report on a rough ROI figure for any test we deliver, too. In most cases, I would look to do this based on taking data from the previous 12 months and applying whatever the lift was to that. This is always communicated to the client as a ballpark figure i.e.: circa £50k ROI. The reason for this is that there are so many additional/external influences on a test that we can never be 100% accurate; testing is not an exact science and shouldn’t be treated as such.

[/su_note]

Are We Making Type I or Type II errors?

In our post on AB testing statistics, we discussed type I and type II errors. We work to avoid these errors at all cost.

To avoid errors in judgement, we verify the results of our testing tool against our analytics. It is very important that our testing tool send data to our analytics package telling us which variations are seen by which segments of visitors.

Our testing tools only deliver top-level results, and we’ve seen that technical errors happen. So we can reproduce the results of our AB test using analytics data.

Did each variation get the same number of conversions? Was revenue reported accurately?

Errors are best avoided by ensuring the sample size is large enough and utilizing a proper AB testing framework. Peep Laja describes his process below:

[su_note note_color=”#dcf0df” text_color=”#000000″ radius=”10″]

peep-lajaPeep Laja, ConversionXL

First of all I check whether there’s enough sample size and that we can trust the outcome of the test. I check if the numbers reported by the testing tool line up with the analytics tool, both for CR (conversion rate) and RPV (revenue per visit).

In the analytics tool I try to understand how the variations changed user behavior – by looking at microconversions (cart adds, certain page visits etc) and other stats like cart value, average qty per purchase etc.

If the sample size is large enough, I want to see the results of the test across key segments (provided that the results in the segments are valid, have enough volume etc), and see if the treatments performed better/worse inside the segments. Maybe there’s a case for personalization there. The segments I look at are device split (if the test was ran across multiple device categories), new/returning, traffic source, first time buyer / repeat buyer.

[/su_note]

How Did Key Segments Perform?

In the case of an inconclusive test, we want to look at individual segments of traffic.

For example, we have had an inconclusive test on smartphone traffic in which the Android visitors loved our variation, but iOS visitors hated it. They cancelled each other out. Yet we would have missed an important piece of information had we not looked more closely.

pasted image 0 39

Visitors react differently depending on their device, browser and operating system.

Other segments that may perform differently may include:

  1. Return visitors vs. New visitors
  2. Chrome browsers vs. Safari browsers vs. Internet Explorer vs. …
  3. Organic traffic vs. paid traffic vs. referral traffic
  4. Email traffic vs. social media traffic
  5. Buyers of premium products vs. non-premium buyers
  6. Home page visitors vs. internal entrants

These segments will be different for each business, but provide insights that spawn new hypotheses, or even provide ways to personalize the experience.

Understanding how different segments are behaving is fundamental to good testing analysis, but it’s also important to keep the main thing the main thing, as Rich Page explains:

[su_note note_color=”#dcf0df” text_color=”#000000″ radius=”10″]

rich-pageRich Page, Website Optimizer

Avoid analysis paralysis. Don’t slice the results into too many segments or different analytics tools. You may often run into conflicting findings. Revenue should always be considered the best metric to pay attention to other than conversion rate, after all, what good is a result with a conversion lift if it doesn’t also increase revenue?

The key thing is not to throw out A/B tests that have inconclusive results, as this will happen quite often. This is a great opportunity to learn and create a better follow up A/B test. In particular you should gain visitor feedback regarding the page being A/B tested, and show them your variations – this helps reveal great insights into what they like and don’t like. Reviewing related visitor recordings and click maps also gives good insights.

[/su_note]

Nick So of WiderFunnel talks about segments as well within his own process for AB test analysis:

[su_note note_color=”#dcf0df” text_color=”#000000″ radius=”10″]

nick-soNick So, WiderFunnel

“Besides the standard click-through rate, funnel drop-off, and conversion rate reports for post-test analysis, most of the additional reports and segments I pull are very dependent on the business context of a website’s visitors and customers.

For an ecommerce site that does a lot of email marketing and has high return buyers, I look at the difference in source traffic as well as new versus returning visitors. Discrepancies in behavior between segments can provide insights for future strategies, where you may want to focus on the behaviors of a particular segment in order to get that additional lift.

Sometimes, just for my own personal geeky curiosity, I look into seemingly random metrics to see if there are any unexpected patterns. But be warned: it’s easy to get too deep into that rabbit hole of splicing and dicing the data every which way to find some sort of pattern.

For lead-gen and B2B companies, you definitely want to look at the full buyer cycle and LTV of your visitors in order to determine the true winner of any experiment. Time and time again, I have seen tests that successfully increase lead submissions, only to discover that the quality of the leads coming through is drastically lower; which could cost a business MORE money in funnelling sales resources to unqualified leads.

In terms of post-test results analysis and validation — besides whatever statistical method your testing tool uses — I always run results through WiderFunnel’s internal results calculator which utilizes bayesian statistics to provide the risk and reward potential of each test. This allows you to make a more informed business decision, rather than simply a win/loss, significant/not significant recommendation.”

[/su_note]

In addition to understanding how tested changes impacted each segment, it’s also useful to understand where in the customer journey those changes had the greatest impact, as Benjamin Cozon describes:

[su_note note_color=”#dcf0df” text_color=”#000000″ radius=”10″]

benjamin-cozonBenjamin Cozon, Uptilab

We need to consider that the end of the running phase of a test is actually the beginning of insight analysis.

Why is each variation delivering a particular conversion rate? In which cases are my variations making a difference, whether positive or negative? In order to better understand the answers to these questions, we always try to identify which user segments are the most elastic to the changes that were made.

One way we do it is by ventilating the data with session-based or user-based dimensions. Here is some of the dimension we use for almost every test:

  • User type (new / returning)
  • Prospect / new Client / returning client
  • Acquisition channel
  • Type of landing page

This type of ventilation helps us understand the impact of specific changes for users relative to their specific place in the customer journey. Having these additional insights also helps us build a strong knowledge base and communicate effectively throughout the organization.

[/su_note]

Finally, while it is a great idea to have a rigorous quality assurance (QA) process for your tests, some may slip through the cracks. When you examine segments of your traffic, you may find one segment that performed very poorly. This may be a sign that the experience they saw was broken.

It is not unusual to see visitors using Internet Explorer crash and burn since developers abhor making customizations for that non-compliant browser.

How Did Changes Affect Lead Quality?

Post test analysis allows us to be sure that the quality of our conversions is high. It’s easy to increase conversions. But are these new conversions buying as much as the ones who saw the control?

Several of Conversion Sciences’ clients prizes phone calls and the company optimizes for them. Each week, the calls are examined to ensure the callers are qualified to buy and truly interested in a solution.

In post-test analysis, we can examine the average order value for each variation to see if buyers were buying as much as before.

We can look at the profit margins generated for the products purchased. If revenue per visit rose, did profit follow suit?

Marshall Downey of Build.com has some more ideas for us in the following instagraph infographic.

WTW TLE Post Test Analysis Instagraph Marshall Downy

Revenue is often looked to as the pre-eminent judge of lead quality, but doing so comes with it’s own pitfalls, as Ben Jesson describes in his approach to AB test analysis.

[su_note note_color=”#dcf0df” text_color=”#000000″ radius=”10″]

ben-jessonBen Jesson, Conversion Rate Experts

If a test doesn’t reach significance, we quickly move on to the next big idea. There are limited gains to be had from adding complexity by promoting narrow segments.

It can be priceless to run on-page surveys on the winning page, to identify opportunities for improving it further. Qualaroo and Hotjar are great for this.

Lead quality is important, and we like to tackle it from two sides. First, qualitatively: Does the challenger page do anything that is likely to reduce or increase the lead value? Second, quantitatively: How can we track leads through to the bank, so we can ensure that we’ve grown the bottom line?

You might expect that it’s better to measure revenue than to measure the number of orders. However, statistically speaking, this is often not true. A handful of random large orders can greatly skew the revenue figures. Some people recommend manually removing the outliers, but that only acknowledges the method’s intrinsic problem. How do you define outlier, and why aren’t we interested in them? If your challenger hasn’t done anything that is likely to affect the order size, then you can save time by using the number of conversions as the goal.

After every winning experiment, record the results in a database that’s segmented by industry sector, type of website, geographic location, and conversion goal. We have been doing this for a decade, and the value it brings to projects is priceless.

[/su_note]

Analyze AB Test Results by Time and Geography

Conversion quality is important, and  Theresa Baiocco takes this one step further.

[su_note note_color=”#dcf0df” text_color=”#000000″ radius=”10″]

theresa-baioccoTheresa Baiocco, Conversion Max

For lead gen companies with a primary conversion goal of a phone call, it’s not enough to optimize for quantity of calls; you have to track and improve call quality. And if you’re running paid ads to get those phone calls, you need to incorporate your cost to acquire a high-quality phone call, segmented by:

  • Hour of day
  • Day of week
  • Ad position
  • Geographic location, etc

When testing for phone calls, you have to compare the data from your call tracking software with the data from your advertising. For example, if you want to know which day of the week your cost for a 5-star call is lowest, you first pull a report from your call tracking software on 5-star calls by day of week:

image00

Then, check data from your advertising source, like Google AdWords. Pull a report of your cost by day of week for the same time period:

image01

Then, you simply divide the amount you spent by the number of 5-star calls you got, to find out how much it costs to generate a 5-star call each day of the week.

image02

Repeat the process on other segments, such as hour of day, ad position, week of the month, geographic location, etc. By doing this extra analysis, you can shift your advertising budget to the days, times, and locations when you generate the highest quality of phone calls – for less.

[/su_note]

Look for Unexpected Effects

Results aren’t derived in a vacuum. Any change will create ripple effects throughout a website, and some of these effects are easy to miss.

Craig Andrews gives us insight into this phenomenon via a recent discovery he made with a new client:

[su_note note_color=”#dcf0df” text_color=”#000000″ radius=”10″]

craig-andrewsCraig Andrews, allies4me

I stumbled across something last week – and I almost missed it because it was secondary effects of a campaign I was running. One weakness of CRO, in my honest opinion, is the transactional focus of the practice. CRO doesn’t have a good way of measuring follow-on effects.

For example, I absolutely believe pop-ups increase conversions, but at what cost? How does it impact future engagement with the brand? If you are selling commodities, then it probably isn’t a big concern. But most people want to build brand trust & brand loyalty.

We discovered a shocking level of re-engagement with content based on the quality of a visitor’s first engagement. I probably wouldn’t believe it if I hadn’t seen it personally and double-checked the analytics. In the process of doing some general reporting, we discovered that we radically increased the conversion rates of the 2 leading landing pages as secondary effects of the initial effort.

We launched a piece of content that we helped the client develop. It was a new client and the development of this content was a little painful with many iterations as everyone wanted to weigh in on it. One of our biggest challenges was getting the client to agree to change the voice & tone of the piece – to use shorter words & shorter sentences. They were used to writing in a particular way and were afraid that their prospects wouldn’t trust & respect them if they didn’t write in a highbrow academic way.

We completed the piece, created a landing page and promoted the piece primarily via email to their existing list. We didn’t promote any other piece of content all month. They had several pieces (with landing pages) that had been up all year.

It was a big success. It was the most downloaded piece of content for the entire year. It had more downloads in one month than any other piece had in total for the entire year. Actually, 28% more downloads than #2 which had been up since January.

But then, I discovered something else…

The next 2 most downloaded pieces of content spiked in October. In fact, 50% of the total year’s downloads for those pieces happened in October. I thought it may be a product of more traffic & more eyeballs. Yes that helped, but it was more than that. The conversion rates for those 2 landing pages increased 160% & 280% respectively!

We did nothing to those landing pages. We didn’t promote that content. We changed nothing except the quality of the first piece of content that we sent out in our email campaign.

Better writing increased the brand equity for this client and increased the demand for all other content.

[/su_note]

Testing results can also be compared against an archive of past results, as Shanelle Mullin discusses here:

[su_note note_color=”#dcf0df” text_color=”#000000″ radius=”10″]

Shanelle Mullin, ConversionXL

shanelle-mullinThere are two benefits to archiving your old test results properly. The first is that you’ll have a clear performance trail, which is important for communicating with clients and stakeholders. The second is that you can use past learnings to develop better test ideas in the future and, essentially, foster evolutionary learning.

The clearer you can communicate the ROI of your testing program to stakeholders and clients, the better. It means more buy-in and bigger budgets.

You can archive your test results in a few different ways. Tools like Projects and Effective Experiments can help, but some people use plain ol’ Excel to archive their tests. There’s no single best way to do it.

What’s really important is the information you record. You should include: the experiment date, the audience / URL, screenshots, the hypothesis, the results, any validity factors to consider (e.g. a PR campaign was running, it was mid-December), a link to the experiment, a link to a CSV of the results, and insights gained.

[/su_note]

Why Did We Get The Result We Got?

Ultimately, we want to answer the question, “Why?” Why did one variation win and what does it tell us about our visitors?

This is a collaborative process and speculative in nature. Asking why has two primary effects:

  1. It develops new hypotheses for testing
  2. It causes us to rearrange the hypothesis list based on new information

Our goal is to learn as we test, and asking “Why?” is the best way to cement our learnings.


21 Quick and Easy CRO Copywriting Hacks to Skyrocket Conversions

21 Quick and Easy CRO Copywriting Hacks

Keep these proven copywriting hacks in mind to make your copy convert.

  • 43 Pages with Examples
  • Assumptive Phrasing
  • "We" vs. "You"
  • Pattern Interrupts
  • The Power of Three
  • This field is for validation purposes and should be left unchanged.

Have you ever found yourself in the middle of a conversation or argument and it suddenly hits you that the two of you aren’t talking about the same thing? Then you have that brilliant “aha” moment where you can actually start making some progress.

One workplace conversation that can be particularly tricky is whether your company should redesign its website. It’s important to make sure everyone is talking about the same thing when you talk about redesign because it’s costly, risky, and emotionally charged.

There are a few common reasons companies choose to redesign their websites:

  • The site performs poorly
  • The desire to be “mobile-friendly”
  • The site is “dated”
  • The desire to be “unique”

These reasons have a common denominator: you’re not happy with a very particular aspect of your site. There are many ways you can approach finding a solution to the problem, and we – the universal We – attach the word “redesign” to those solutions even though it means one of many methods are used to get the result we want.

Synonyms for redesign from dictionary.com

Synonyms for redesign from dictionary.com

According to Dictionary.com, the above are some of the more common synonyms for “redesign”. The way Conversion Sciences uses this term is very industry-specific, so it has a certain jargon-y quality. Someone working in marketing at a tech or ecommerce company probably understands our jargon more than their colleagues in other departments.

If you’re that marketing person and you’re trying to convince your boss and other departments that you need conversion optimization, it’s really important that you’re all speaking the same language. You might be experiencing some miscommunication and not even realize it.

What are the different ways each of you might be using the word “redesign”?

Before you dismiss it as juvenile to keep returning to basic, dictionary definitions of “redesign”, make a mental tally of important people who don’t work in marketing, conversion optimization, or graphic design.

  • Your CEO and CFO, maybe your boss
  • Your customer service representatives answering chat, phone calls, and emails
  • Your customers

All of us feel great satisfaction in knowing the real definition, but ultimately being right isn’t helpful if no one understands each other.

A Full Redesign: Starting Over From Scratch

When we say “redesign” in its purest sense, we mean a brand spanking new website. You hired a designer, you have a new color palette and CSS, you completely threw out the old. Every page is new, the entire structure is different.

Redesign can be used to mean a brand spanking new experience

“Redesign” can be used to mean a brand spanking new experience

When Conversion Sciences cautions against redesigns, this is the definition we’re using. We say there are only two good reasons to undertake a website redesign:

  1. You are re-branding or
  2. Your CMS (content management system) is too limiting

When I worked at Westbank Library our website used a proprietary CMS built by the company that built our ILS (integrated library system). An ILS is used to search for books or connect to an online resource or check to see when books are due back. In other words, an ILS isn’t meant to be the platform for a very specific kind of online application.

Westbank's homepage in 2008, built with a CMS that was only intended to be used for online library catalogs

Westbank’s homepage in 2008, built with a CMS that was only intended to be used for online library catalogs (screenshot via the Way Back Machine)

The ILS wouldn’t support some very important non-book-related features:

  • We couldn’t optimize the site for the search engines
  • We couldn’t embed a calendar
  • We couldn’t choose which photos appeared where on the page
  • We couldn’t create customized landing pages for special events
  • We couldn’t make the site ADA compliant
  • We couldn’t add widgets other libraries were using

We needed a new site built on a new CMS, one that met our present-day needs. The only way to do that was to dump the old one. The new website was built using Drupal, and it meant everything was new. The change was necessary and long overdue.

Westbank's new homepage after the from-scratch redesign

Westbank’s new homepage after the from-scratch redesign (screenshot via the Way Back Machine, which is why the images aren’t loading)

We were excited that on smartphones, the phone number was tel-linked and that the site was now searchable without going back to Google. Best of all, we had an actual, legitimate calendar. Before the redesign, the best we could do was make a list of what was going on.

Calendar of events on old site

After the redesign, people could see an actual calendar with clickable events where they could go find more information.

Calendar of events on new site

Without a doubt, the new site was an immense improvement. The lack of functionality on the old site was crippling us.
In this case, a full redesign was justified, but the results weren’t what we had hoped.



Conversion Optimization as Redesign: Making Incremental Changes

When the new site launched, our traffic went through the roof – hundreds of times more people were visiting our website. But since the change was long overdue, people who used the old site for a decade were totally lost.

Dozens of people called us saying “I can’t find anything on this new website, you need to redesign it!” and dozens more sent us angry emails saying the same. With the amount of time we spent working on the new website, it was disheartening to hear. Small public libraries don’t have the resources to do projects like this often – and in some cases, they can’t do projects like it at all. We knew we’d been fortunate, and we were suddenly terrified we had blown our only chance to fix our site. There were very serious discussions of applying for grants, then hiring a new design team to start over.

But after spending time talking to our patrons, we found out what they actually meant by “redesign”.

In one case, a gentleman received an email reminding him to renew his books, with a link to do it online. Before our redesign, that link took him to his library account, where he was automatically logged in on his home computer; all he had to do was click “renew”. After the redesign, the link took him to our homepage, so he had no idea where to go. When we say that your landing pages need to fulfill the promises you’ve made in your ad, this is a great example of what we’re talking about. Instead of redesigning anything, we needed to fix that link.

Another way we knew people were lost was by analyzing how they used the site.

One problem in particular was how people used our site’s search box. All of the searches were for titles of popular books and movies, but the search box wasn’t connected to the online catalog. Our old site had one search box, and its only use was to look for books and movies. Everyone assumed the new search box had the same function, but it didn’t.

Search options on the new website at launch

We used the data from our search box the same way you can use heat maps: to accommodate how your visitors are already using your site. Instead of forcing them to use our search box the way we wanted, we changed it to do what they wanted.

But that change meant our visitors, once again, didn’t have a site search option.

We changed the site search bar to be a catalog search, but it still wasn’t perfect

From this point, we found a widget that gave us a more dynamic search bar. Then we replaced images at the bottom of the page that linked to adult, teen, and children’s programs with widgets featuring new books and the library’s Instagram account. And we featured upcoming events more prominently, moved the contact information into the footer, added navigational links along the top of the page, and worked to make the site ADA compliant. The current homepage design is very different compared to what it was when we first rolled out the new website.

The homepage as it is now

These changes were slow-going, careful, and made one at a time. The redesign 1.0 and current iteration look similar because of branding and tabbed browsing, but for library patrons, these are two very disparate experiences. It is safe to say the new homepage underwent another redesign, but you might hesitate to use that word because the changes didn’t happen all at once.

 

Looking back at synonyms of “redesign”…

“Redesign” can be used to describe incremental changes

The website wasn’t perfect, but there was a lot to work with. We couldn’t start over every time we realized the site could be doing better.

Big Swings as Redesign: Changing Several Variables at Once

We use the term “big swing” to talk about sweeping changes we make on a page. Often these changes are on a page that’s particularly important or special, like a homepage or landing page.

It means we’ve changed several features all at once instead of testing one thing at a time. The downside of this strategy is that no matter how the page performs after the test goes live, we don’t really know why. Even if the page performs at exactly the same conversion rate, we can’t tell whether nothing mattered or our changes simply offset each other.

Big, sweeping changes are exciting when they are successful, and people love to share these kinds of successes. They make great headlines and engaging stories. They give us hope that our big change will work out the way we want, or perhaps even better than we imagine. The problem is that there are usually third variables at play in these stories.

Think about the diet book industry. Every book boasts of its followers’ drastic life improvements due entirely to the diet. But when someone starts to pay attention to what she eats, she may also make other changes like exercising more, quitting smoking, and getting more frequent checkups with her doctor. Was her success really due to the diet book? Or was it purely chance since she made so many other changes? There’s no way to know.

Michael Scott’s Big Swing

Humans have the potential to be rational, logical creatures, but we often fall prey to our emotions when we make decisions, dole out praise, or attach blame. In an episode of The Office, regional manager Michael Scott has the brilliant, big idea to send out paper shipments with five golden tickets tucked into the boxes at random. Each ticket awarded a 10% discount to its recipient.

The promotion quickly goes south when Dunder Mifflin’s largest client receives all five tickets, and there are no disclaimers or expiration dates. Michael arranges for a fall guy who will be fired for the idea, but then finds out this client has decided to send Dunder Mifflin even more business because of the discount. Naturally Michael wants the credit but doesn’t want to be reprimanded for almost bankrupting the company.

Michael Scott dressed as Willy Wonka, presenting his Golden Ticket idea

The Golden Ticket promotion was a big swing because Dunder Mifflin didn’t isolate the variable Michael was hoping to test: will current clients be more loyal to Dunder Mifflin because of a special, one-time-only, 10% discount?

The consequences of the Golden Ticket run the gamut of possible results of big swings:

  • Negative Result: When it seemed like the promotion would put Dunder Mifflin out of business, the responsible party was fired
  • Positive Result: When it became apparent the promotion would solidify a relationship with an important client, the responsible party was publicly commended
  • Neutral Result: Dunder Mifflin lost a huge amount of revenue due to the promotion, then gained more revenue, also due to the promotion

Big Swings at Conversion Sciences

In a staff meeting last week, Conversion Scientist Megan Hoover told us, “We completely redesigned this landing page for our client, and it was a big improvement”. In a different staff meeting, fellow Conversion Scientist Chris Nolan told us, “Our first test was to redesign our client’s homepage, and it was a huge success”.

Conversion Sciences doesn’t do website redesigns; we do conversion optimization. So what did Megan and Chris mean?

  • We switched from two columns to three
  • We wrote a new headline
  • We changed the copy
  • We changed the wording on the call to action

These changes mean they were speaking accurately when they described their big swings as “redesigns”.

“Redesign” describes what we do when we make big swings

We didn’t change the functionality of the page, the page’s purpose, or the CMS. We definitely made some big changes, but we certainly didn’t start from scratch, and all of the changes were very localized to a landing page and a homepage.

It’s worth noting that even though it’s tough to measure results when you make a big swing type of redesign, we still take the risk sometimes because Conversion Sciences has run so many successful tests. We are very good at making educated hypotheses about what kinds of changes will work well together, but we don’t attempt these big changes often. There is a lot of room for error in the big swing.

What is Your Desired End Result?

We covered three approaches to redesign in this post:

  1. Throw-out the old, start from scratch
  2. Incremental changes
  3. Big swings

Let’s return to the most common reasons a company chooses to redesign:

  • The site performs poorly
  • The desire to be “mobile-friendly”
  • The site is “dated”
  • The desire to be “unique”

When you have the conversation at work about redesigning your site, try starting with the end goal.

If you work backwards, the conversation has a good chance of staying on track because it’s likely that everyone wants the same thing, even if it comes out of their mouths sounding very different. I’m willing to bet that everyone wants a home page with lower bounce rates. Everyone wants to reduce cart abandonment rates. Everyone wants more downloads of your industry reports. Everyone wants to sell more merchandise.

Redesigns are seductive. They come with big budgets and a chance to make a visible impact. The question at the heart of my arguments is this: do you need a website redesign, or do you need a website optimization program?

An optimization program can begin delivering results within weeks. Full redesigns take months and months to develop. An optimization program tells you which of your assumptions are good ones. Full redesigns are big gambles.

With a short Conversion Strategy Session, you will be able to make the case for a full redesign or optimization program for your growing online business. Request your free session.
Brian Massey

If you compete online in the retail electronics industry, there is ample opportunity, according to a study completed by Conversion Sciences and Marketizator.

The full report, Optimization Practices of Retail Electronics Websites, can be downloaded for free. It is the latest in our series of industry report cards, which includes reports on Higher Education and B2B Ecommerce.

Who Should Read The Report

The report grades the adoption of key website optimization tools by businesses advertising on “electronics” search keywords. It is meant for managers of websites competing for a slice of the retail electronics market, such as:

  • Retailers of digital cameras, TVs, home theater, and tablets.
  • Retailers of complementary products, such as computers and laptops.

We believe that the lessons learned here can be applied to any online retail business with high-priced, commoditized products.

Why Focus on Website Optimization?

There is a set of tools and disciplines designed to increase the number of sales and leads generated from the traffic coming to a business website. Collectively, they are called website optimization.

In the seasonal online retail space, websites seek to achieve one or more of the following goals:

  • Increase the revenue generated from each visitor, also known as “revenue per visitor.”
  • Reduce shopping cart “abandonment” in which visitors add items to cart, but do not purchase.
  • Increase the average size of each order, or “average order value.”
  • Decrease bounce rates for traffic from paid advertising.
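The four goals above are simple ratios, so they are easy to track month over month. Here is a minimal sketch of how they might be computed from aggregate analytics numbers; all function names and figures are hypothetical, not taken from the report:

```python
# Hypothetical helpers for the four optimization goals above.
# Inputs would come from your analytics or order database.

def revenue_per_visitor(total_revenue: float, visitors: int) -> float:
    """Goal 1: revenue generated per visitor."""
    return total_revenue / visitors

def cart_abandonment_rate(carts_created: int, orders: int) -> float:
    """Goal 2: share of carts that never become completed orders."""
    return 1 - orders / carts_created

def average_order_value(total_revenue: float, orders: int) -> float:
    """Goal 3: average size of each order."""
    return total_revenue / orders

def bounce_rate(single_page_sessions: int, sessions: int) -> float:
    """Goal 4: share of paid sessions that leave after one page."""
    return single_page_sessions / sessions

# Example with made-up monthly numbers:
print(revenue_per_visitor(50_000, 10_000))  # 5.0 dollars per visitor
print(cart_abandonment_rate(1_000, 300))    # 0.7, i.e. 70% abandonment
```

Even rough versions of these numbers make it possible to see whether an optimization effort is moving the needle.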

Website optimization utilizes easily-collected information to identify problems and omissions on these sites that may prevent achievement of these goals.

This information can be collected in several ways:

  • Web analytics tools track a prospect’s journey through a site. Examples include Adobe SiteCatalyst and Google Analytics.
  • Click-tracking tools (also called heat map tools) track where prospects are clicking and how far they are scrolling. This reveals functional problems on specific pages.
  • Screen Recording tools will record visitor sessions for analysis.
  • Split testing, or A/B testing tools allow marketers to try different content and design elements to see which generate more inquiries.
  • Site Performance tools help companies increase the speed with which a website loads. Page speed correlates with conversion performance.
  • Social Analytics track the performance of social interactions relating to the site, such as likes, shares, and social form fills.
  • User Feedback tools provide feedback directly from visitors on the quality of the site and content.

The existence of these tools on a website indicates that the site is collecting important information that can be used to decrease the cost of acquiring new prospects and customers.

This is a strong competitive advantage. Increasing conversion rates decreases acquisition costs, which means:

  • All advertising gets cheaper.
  • Businesses can outperform competitors with bigger advertising budgets
  • Businesses reliant on SEO aren’t as vulnerable to algorithm changes.

This report targets companies investing in search advertising in a variety of formats.

How much are these businesses spending on paid online advertising?

Of the businesses competing for consumer electronics sales, 83% are spending between $500 per month and $5000 per month on paid search ads. See Figure 1.

Fourteen percent are spending between $5000 and $50,000 per month, and only 3% spend more than $50,000.

Figure 1: Range of spending on paid search ads by businesses.

Web Analytics Investments

Of the organizations that spend at least $500 per month on search ads, 75% have some form of Web Analytics installed on their site. Web Analytics is a broad category of web software that in some way measures the behavior of visitors to a site. It includes most of the website optimization tools discussed in this report.

Figure 2: Breakdown of web analytics installations by ad spend.

When we break the list down into categories of spending, we find that the highest-spending organizations are less likely than mid-tier spenders to have web analytics installed (77%), despite having the most to lose.

Google Analytics, a free tool, is the most pervasive analytics package, found on 77% of the sites with analytics. Adobe SiteCatalyst (formerly Omniture) is installed on 4.5% of these sites.

Optimization Software Investments

By looking at the software installed on the websites in the retail electronics marketplace, we can get an idea of how these organizations are investing in the tools of optimization.

This doesn’t tell us how many are making good use of these tools, but indicates how many have the potential to optimize their site.

The graphic in Figure 3 shows that retailers spending more than $50,000 on search ads are the most likely to invest in optimization tools. Of this segment, 24% have at least one of these tools installed vs. 7.7% for the entire industry.

The largest spenders focus investments on A/B testing tools, social analytics and survey feedback solutions.

Figure 3: Adoption rate of Web optimization tools by ad spend.

Use of AB Testing Tools

It is clear from the information presented here that the largest group of retailers – those spending between $500 and $5000 each month on search ads – invest the least in A/B testing tools. They invest most in social media analytics tools, at 6.1%.

The question is this: Do they not have the tools because they don’t budget for website optimization, or because they don’t see optimization as important?

Certainly, both are true for some portion of the sample. However, 75% of all organizations spending at least $500 a month have web analytics installed. At some point, most of the industry came to the conclusion that you must understand the basics of your traffic.

Yet, only 7.7% have at least one website optimization tool installed.

Over 82% of organizations spending between $5,000 and $50,000 have web analytics installed, and 15.6% have some sort of investment in optimization tools.

Recommendations

Give Your Team Time to Review Analytics

Most of the businesses in our review – 75% – have gotten the message that web analytics should be installed on their website. The majority of these have installed Google Analytics, a free package with the capacity to capture the behavior of their visitors.

The value of an analytics database like this is in the insights it can provide. Incentivizing your team to glean insights from this analytics database will guide online investment decisions, increasing the performance of the website.

Businesses with Smaller Ad Spends Should Focus More on Reducing Acquisition Cost

Those businesses with larger ad spends are able to bid more for better placement on their ads. Those with smaller budgets, however, will win by reducing the overall acquisition cost.

Businesses with low acquisition costs get more inquiries for less money. This is the leverage businesses with fewer resources need.

Those businesses that learn to optimize the fastest will gain a cost advantage in paid ad auctions. An investment in free and inexpensive tools, such as click tracking, screen recording and site performance solutions will tip the scales.

Given the low adoption rate of so many of these tools, businesses with few resources are in a position to disrupt their competitors by investing in them.

Leverage Your Comparatively High Purchase Price

For those businesses with higher average order values, small increases in conversion rates will deliver big increases in revenue. In short, it takes less time to get your money back from an investment in website optimization.

This can be seen in the relatively high adoption rate of A/B testing tools by businesses spending between $500 and $5000 per month (21%). While these tools require a more formal discipline, they are very effective at finding increases in conversion rates month after month.

There is still a significant opportunity for businesses spending below $5,000 to drive acquisition costs down with testing.

Decrease Your Search Ad Costs

Google favors sites with better performance. The search engine gives advertisers with more relevant sites ad placement higher on the page. Data indicates that sites with lower bounce rates are given a higher quality score than sites that elicit “pogo-sticking”, that is, sites for which visitors are returning to search results pages quickly.

Website optimization reduces bounce rates by drawing visitors deeper into the site before they jump back to their search results.

Don’t Over Invest in Social Media Sharing

It is telling that social analytics tools have the highest adoption rates among consumer electronics retailers.

Social ads are delivering qualified traffic at a relatively low cost. In our experience, social sharing has not.

Your analytics will reveal if social traffic is delivering new leads and sales for your business. If the results aren’t there, consider using this investment elsewhere.

Begin Adoption Soon

Retail marketers are clearly behind the curve in terms of their adoption of website optimization tools. This creates an opportunity in the market. However, this window will close.

As more businesses begin optimizing, it will become harder and more expensive to compete for prospects online.

The Conversion Scientists are reading some good stuff at the moment. Do you have any to add?

From Venngage – “7 Reasons Why Clicking This Title Will Prove Why You Clicked This Title”

“I don’t know about you, but anytime I see or hear mention of a story about a dog or a cute panda sneezing or a hippo farting, I get excited and immediately need to read or see more.”
The kind of traffic that comes to a “Clickbait” headline is often not well qualified. People come because of the headline’s hook, not because they need a product or service.
Having said that, the psychology of these headlines can be used to draw a more qualified audience to a content piece or landing page. Many of the best-performing headlines we’ve tested are abrupt and unexpected. It’s something they have in common with clickbait headlines: 79% of the ones analyzed in the Venngage study used the element of shock.
So I offer this little study of clickbait headlines. It’s worth the read if only for the dog videos. (Plus it turns out the farting hippo thing is real.)
Read more.

From Medium – “Making a Murderer: 7 Hilarious Things Wrong with Ken Kratz’s Website”

We don’t normally advocate for website redesigns. In fact, we think there are only two good reasons to do them:

  1. Rebranding or repositioning
  2. A poor content management system (CMS)

Kratz’s website might fall into both of those categories.
“If Ken Kratz had a child build his website without his awareness and did not make changes at the fear of hurting their feelings, then that would be a permissible excuse.”
Enough said.
Read more.

From The Washington Post – “The surprising psychology of shoppers and return policies”

“Overall, a lenient return policy did indeed correlate with more returns. But, crucially, it was even more strongly correlated with an increase in purchases. In other words, retailers are generally getting a clear sales benefit from giving customers the assurance of a return.”
It’s counterintuitive that sales increase when you give people more chances to return what they buy, but the data is there. Return policies are important: two thirds of eCommerce shoppers look at them, and these policies are a large part of how consumers choose where to buy what they want.
Read more.

One video can be the source of all kinds of marketing content


Now, we all know that video is a great way for us to tell a story and communicate with our prospects, with our customers, and with suspects – people who may not even know what our business does. But have you thought about how powerful video is in terms of communicating with the rest of the team? A content strategy includes content from a number of different sources – blog posts, infographics, reports, white papers, e-books. Your options are almost unlimited.

Solving the Subject Matter Expert Problem

The biggest challenge is what I call the subject matter expert problem. How do you get the knowledge out of the subject matter experts’ heads — either in your company or in your industry — and get them to turn what they know into this content?
The good news is you’ve got video.
It could be video like the talking head video above. It could be explainer videos that you’ve made, and even something as simple as webinar videos – videos captured while somebody delivered a webinar. The beauty of all of these forms of video is that the subject matter expert has sat down and thought about how they would explain what they want to teach on video.

Business Video Provides Everything Your Content Team Needs

Now, if you take that video and hand it to the people who are producing your content, they should have everything they need: how to lay it out, how to organize the explanations, the thoughts, and the education, plus the graphics that come along with video.
Think about making video that’s not just for your prospects and clients. Think about making it for your internal team, so that one video can cascade into a whole bunch of other kinds of content. This page was ripped from the above video.

The Blue Line is a Metaphor for Conversion

The conversion function is the number of actions taken on an online property divided by the number of visits to that property.
Here is where we find the solid blue line in our websites.
It runs through our sites and our landing pages. It slices our prospects’ mobile phones, their tablets and their computers.
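That definition is just a ratio. As a quick sketch, with made-up numbers:

```python
def conversion_rate(actions: int, visits: int) -> float:
    """Actions taken on a property divided by visits to that property."""
    return actions / visits if visits else 0.0

# A site with 1,000 visits and 70 purchases converts at:
print(conversion_rate(70, 1_000))  # 0.07, i.e. 7%
```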

We Pay for Our Visitors

We charter the digital transportation that will bring people in under the line, these confounding and complex people we call visitors.
This is not an inexpensive undertaking.
We cajole Google with its menagerie of penguins, pandas and hummingbirds. We cast our banners and our ads across the internet, chasing prospects as they surf. We create the content, we share on social, and we send the emails that bring them to us.
We pay their fares, promising them a trip to a place meant for them. Our place.

Our Visitors Want to Convert

They arrive below the line, looking for that solution, that thing that will make them feel better, that product to adorn themselves, that moment of entertainment when they can let go.
The blue line stands as a ceiling to our visitors, and they imagine how things might be different if they could just get up there.
Above the line.
They are always tempted by the exit, the back button, the next search.
It is this blue line that our visitors struggle with, which means that we as online businesses struggle with it, too.
Some will climb. Others will accept the help of friends and strangers.
We create the line. We draw our blue line. Sometimes higher. Sometimes lower.

Conversion Optimization Helps Them Rise Above the Line

It is our duty to help our visitors rise above this line.
We choose the tools that will elevate them.
Will we give them just enough rope to hang themselves?
Will we provide the clear steps to boost them in their efforts?
Will we ask them to make a leap of faith and trust in their agility to spring safely above our blue line?
Will we try to make it effortless using the machinery of our websites to transport them above the line?
And what will drive them to take that leap, to step on, to push the “up” button?
The vision we have for our blue line is one in which many make the journey. They come with their money in hand, ready to spend, ready to engage.
We see them coming with ample intuition and a nourishing supply of common sense, all calibrated by the way we see our business, ourselves and our world.
As it turns out, what we call “common sense” isn’t that common.
These frustrating people we call visitors aren’t like us. They aren’t even like the people we know. They come with their own rules, with their own ideas of beauty and their own sense of how things should work.
They are not here to be manipulated. They are here to be understood.

Why Visitors Leave Your Site

When they are not understood, they seem mesmerized by the exit, transfixed and hypnotized. We paid to bring them here and they, in their flagrant individuality choose not to stay.
In our hubris, we create the quicksand that will trap them. Did our navigation confuse them? Did our words lack clarity? Did we call them to act in the way they like to act?
We are opaque to them, and this is scary. Our visitors fear us like a bad dream on Halloween. Are we lurking behind our website, ready to pounce, to steal from them or, worse, to make them feel stupid and incompetent?
Do we fear being known for who we really are? For it is the unknown that allows our visitors’ imaginations to run to places we did not expect them to go.

A Complex Problem

How are we dealing with this complexity? For this is a complex problem.
How high will we set our line? What distance must these lost souls cover to find their solution?
What have we provided them? Why should they put their fears aside? How will we transport them above the line?
For it is their journey from below the blue line that tells us who they are and who we should be for them.
Our job at Conversion Sciences is Conversion Rate Optimization.
Our job is to get your visitors above the Blue Line.

Find out how Conversion Sciences gets more of your visitors above the Blue Line.

