As a Conversion Scientist, I used my background in conversion rate optimization and landing pages to create the first draft of my OkCupid profile: the landing page of me. I applied the chemistry of a proven landing page formula to make sure I hit all the known conversion points. OkCupid’s setup limits the type of test I can do, so we’ll be doing pre/post testing. I started by putting my best page up, letting it run for two weeks, and calculating my “pre” conversion rate.

This is a key piece of knowledge for any business ready to test – know your base conversion rate.

During the first 14 days my profile was live, I had 104 visitors with nine messages. Those nine messages resulted in four qualified leads. My starting overall conversion rate is 8.65%. My qualified lead conversion rate is 3.84%.
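
For clarity, here’s the arithmetic behind those two numbers as a quick Python sketch (the counts are the ones reported above):

```python
visitors = 104        # profile visitors in the first 14 days
messages = 9          # visitors who sent a message
qualified_leads = 4   # messages that became qualified leads

overall_rate = messages / visitors           # 9 / 104
qualified_rate = qualified_leads / visitors  # 4 / 104

print(f"Overall conversion rate: {overall_rate:.2%}")          # 8.65%
print(f"Qualified lead conversion rate: {qualified_rate:.2%}")  # ~3.85%, reported above as 3.84%
```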

My first stop in testing was a critique with an expert in landing pages. Lucky for me, I work for one. Sometimes it’s difficult to assess your own work, so calling in an outside expert is always a great place to start.

The Conversion Scientist, Brian Massey, was nice enough to do one of his famous live critiques. In his video critique he pointed out blind spots and a few things that might be troubling.

If you’re not ready to call in an expert, there are tools you can use to give you a better sense of what might be happening. As a Conversion Scientist, I always start with analytics, click-tracking heatmaps, and screen capture sessions. These data points allow me to come up with a hypothesis list.

When creating a hypothesis list for a client, analytics is always the first stop. It allows me to identify key pages and performance metrics. I look at landing pages, all pageviews, audience channels and conversion metrics for each. This is where I start to see patterns and look for what pages I should be testing.

Questions to ask when looking at analytics:

  • Where are visitors coming from?
  • Which pages are they landing on?
  • Which pages get the highest traffic?
  • What are the key pages in the funnel?
  • Are there pages with high exit or bounce rates?

I use this data to compile a list of key pages I want to look at more closely.

With OkCupid — and most landing pages — it’s pretty easy to know what to target. Visitors come from the /match or /quickmatch pages and land on my profile landing page.

Once I know what pages I will focus on, I switch to another set of tools. Heatmaps and session recordings provide a lot of insight into where visitors are getting hung up. The data these tools generate is a hotbed for hypothesis generation.

They allow me to see if a key call-to-action is in a blind spot or if something on my page is getting surprise attention. Check out the Conversion Lab for a list of awesome conversion tool options.


Even though OkCupid won’t let me install Crazy Egg or Hotjar, I’m still going to treat my dating landing page like I would a client’s website when I start the optimization process. I make a list of hypotheses I think could improve the conversion rate and come up with a plan of action for how to test each one.

Normally, the tools I can install on a client’s website inform the hypothesis list and the recommendations I come up with. Here, I have to be creative, relying instead on my own experience and on an expert’s opinion: Brian Massey’s.

Here are a few hypotheses from his analysis.

I create a list of hypotheses to test when I begin optimizing.

Brian’s critique gave me some great ideas on what to test. I know that my copy needs a bit of work, as does my landing page’s scannability. This is the first hypothesis I’m testing:

Hypothesis: If I change the copy to be about the visitor instead of myself, and improve scannability with bold text and paragraph breaks, I can improve conversions.

I carefully changed all of the “I” statements and made them about the visitor. I also added more paragraph breaks and highlighted key words in my text, allowing a visitor to scan my profile more easily.


My revised profile

When testing, it’s important to isolate as many variables as possible, so for now the copy is the only thing I changed. I could have swapped out my headshot for a party shot, but if I see an increase in conversion rate, I won’t know if it’s the photo or the copy that’s improving my numbers.

For our testing purposes, my primary goal will be to beat my qualified lead conversion rate of 3.84%, but I will be tracking my overall conversion rate and visitor count as well.

I’m going to want to test more than one hypothesis to get this profile just right. For my next test, I’ll focus on images. Choosing the right images is vital to the success of a landing page, maybe even more so on this particular type of landing page. So I did some research, scouring the internet for articles from online dating experts, and determined that the best profile photos were of a smiling woman looking at the camera, showing some skin but not too much skin.

I had a small selection of photos I thought would fit the bill, so I decided to take an informal poll of men that fit the type I was looking for: I asked a bunch of my guy friends to help me choose a photo. The photo of me in a black sleeveless dress, smiling warmly at the camera, was the clear winner. I filled out the rest of my profile photos with a variety of activities and a few shots of me dressed up a bit to show that while I may wear a lab coat to work, I do clean up okay for a night on the town.

This first test isn’t about the images, but after Brian’s critique, I knew that my images might not be saying what I wanted them to say. For this initial pre/post test, I left the photo winners from my poll as they were but added captions to clarify what I wanted the viewer to get from each image.


I’ve shared what I was doing when this photo was taken and also indicated that it’s a fairly recent photo.

With my changes made and my visitor count ticking up, there’s nothing to do but wait and see. We’ll check back in a week (and I’ll look every day in between) to see how my text changes have fared. With any luck (or in my case, with science), I’ll have upped that 3.8% conversion rate.

How many goals do you set when you’re designing a split test for your website?

We’re goal-crazy here in the lab at Conversion Sciences. It is not unusual for our tests to have dozens of goals. Why is that?

We see split testing as a data collection activity, not a tool that gives us answers. It’s not like Wikipedia. The split-testing software on the market today is amazingly agile when it comes to tracking, targeting, and snooping on visitor behavior. We certainly want to track transactions, revenue, and leads. But we learn so much more from our tests.


In my new Marketing Land column The Multi-Goal Magic Of Split Testing Software, I describe how we use some of these goals to find sweet spots in a website.

  • Find out how to “light up” a funnel that is invisible to analytics.
  • Discover which pages are most influential in converting.
  • Segment your audience based on their behaviors.

You can listen to the column or read it for yourself.
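
To make “dozens of goals” concrete, here’s a hypothetical sketch of the kinds of goals a single test might track. The structure is invented for illustration; it isn’t the configuration syntax of any particular testing tool:

```python
# Hypothetical goal list for one split test (illustration only).
test_goals = [
    {"name": "purchase",             "type": "transaction", "primary": True},
    {"name": "revenue_per_visitor",  "type": "value"},
    {"name": "lead_form_submit",     "type": "event"},
    # Goals that "light up" a funnel invisible to analytics:
    {"name": "reached_pricing_page", "type": "pageview"},
    {"name": "reached_checkout",     "type": "pageview"},
    # Goals for segmenting the audience by behavior:
    {"name": "watched_demo_video",   "type": "event"},
    {"name": "used_site_search",     "type": "event"},
]

primary = [g["name"] for g in test_goals if g.get("primary")]
print(f"{len(test_goals)} goals tracked; primary: {primary}")
```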

The Mobile Web is still in its infancy. Today, alleged “mobile best practices” are nothing more than successful desktop strategies scaled to a smaller screen. But people behave differently on small-screen devices than they do when they are sitting at a computer.

Conversion Sciences has begun to see what Mobile Web 2.0 will look like. Having completed dozens of mobile design split tests, we’ve watched key trends begin to show themselves. Much of what we have learned flies in the face of conventional beliefs.

This is why we test.

Some of our customers now have higher converting mobile sites than desktop sites.

Our approach to mobile design is controversial because, as scientists, we can’t just accept traditional wisdom at face value.  We need evidence.

Joel Harvey reveals the results of dozens of tests we’ve completed. Insights are based on real tests, not gut instinct. Watch Mobile 2.0: Judgment Day to learn what he has discovered. He shares:

  • Whether mobile websites can convert better than desktop sites.
  • How to increase mobile conversion rates.
  • What is poison to your mobile conversion rate.
  • How iPhone and Android visitors act differently.

Watch the replay on demand in its glorious entirety.

Don’t ignore your mobile traffic. It can be a real revenue generator sooner than you think.

The fight for online leads and sales has traditionally been fought at the search engine. That is changing.

Web analytics, bid management, competitive intelligence, ad testing and ad management tools are all common staples of any serious paid search effort. Return on ad spend (ROAS) is being tracked all the way through the sign up or purchase process and ad strategies are being adjusted accordingly.

Quietly, the battle for online leads is moving to a new front. This new front is measured by revenue per visit and its kissing cousin, conversion rate. Like the tide that floats all boats, website optimization is being seen as the way to reduce all marketing costs by dropping the acquisition cost of new prospects and customers.

Why do we say this is happening quietly? That is the conclusion we came to when examining an unusual data set from SpyFu.com. We were able to determine which businesses had conversion optimization tools installed on their website. This, we reasoned, gave us a pretty good idea of which businesses would dominate in the world of online marketing — assuming they were actually using the tools.


In this month’s podcast, based on the Marketing Land column Data Exposes Scandalously Low Adoption Of Conversion Optimization Tools, Brian the Conversion Scientist explores the usage of conversion optimization tools for two industry segments: Higher Education and B2B Software.

In one report, 73% of businesses are spending between $500 and $5,000 per month on paid search ads. Almost a quarter are spending between $5,000 and $50,000 per month. Yet only 14% of businesses have at least one website optimization tool installed.

Who are going to be the winners on this new front? Where does your business fit in these statistics?

Split tests are a favored tool of economists, direct response mailers and website optimizers. The primary challenge when creating a split test is being sure that you’re measuring exactly what you think you’re measuring.

As my grandmother used to say, “The road to bad conclusions is paved with uncontrolled variables.” Yes, we are a geeky family.

When your variables get out of hand, all hell breaks loose, and you end up testing something you didn’t intend to test.

When comparing humans, nothing beats twins for experimenting. I feel sorry for twins because of the number of medical and economic studies that tap them for “controlled” experiments. It must be like living on the Island of Dr. Moreau.

The following YouTube video uses twins to perform an experiment that tests the social axiom that, “Chewing gum isn’t attractive.” Watch the following video and see if you can name some of the uncontrolled variables that may be putting this experiment on the statistical road to invalidity.

This is an art exhibit, not a controlled experiment. However, many who watch it may believe that it provides scientific evidence that we should all go out and pick up a few sticks of Wrigley’s if we want to be more charming. In fact, that is what the makers of Beldent want you to believe.

Is this test really telling us to put our jaws in motion if we’re to be more likable?

Controlling Variables

The artist who created this experimart controlled for appearance quite effectively. Genetically identical twins are used as the control and the variation. They are dressed identically. They wear identical makeup. They are sitting in a neutral position on identical chairs. They have identical facial expressions. The lighting is the same on both participants.

The only apparent difference is that one of them is chewing delicious Beldent gum (Beldent is the South American brand from Trident).

Test subjects sit in front of these two doppelgangers and listen to questions piped in through a headset.

“Which one seems like he has more friends?”

“Which one has more imaginary friends?”

“Which one gets invited to more parties?”

“Which one gets invited to more bridge tournaments?”

“Which of these bosses would give you a raise?”

Participants chose the left or the right by pressing one of two buttons vaguely reminiscent of The Family Feud.

It looks like all variables have been controlled, and that the results can be considered valid.

The Unsuspected Twists that Ruin Experiments

One thing experienced testers learn quickly is that little things will influence your tests more than you would have imagined.

For instance, this experiment could be testing if people favor pressing the left button when they are unsure of their answer. In every situation, the gum chewer is sitting on the left. To control for this, the gum chewer should have been on the left sometimes and on the right sometimes.

This experiment could be comparing gum chewers to people who look like total bummers. In every situation, the control twin sits with a neutral expression; in any context, that reads as boring. To control for this, they could have asked the twins to smile in some trials.

The handlers who set up the test, a test paid for by Beldent, may have unconsciously chosen the more attractive twin of each pair to be the gum chewer. Certainly, the artists wanted Beldent to win. This creates a bias. To control for it, the twins should have taken turns as the gum chewer.
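
Both of those fixes amount to randomizing assignment. Here’s a minimal Python sketch of what randomized assignment looks like; the setup is hypothetical, not the exhibit’s actual protocol:

```python
import random

def assign_trial(twin_a, twin_b):
    """Randomly pick which twin chews and which side each one sits on.

    Hypothetical illustration: over many trials, the gum chewer ends up
    on each side, and in each role, about half the time.
    """
    chewer, control = random.sample([twin_a, twin_b], 2)
    sides = ["left", "right"]
    random.shuffle(sides)
    return {"chewer": (chewer, sides[0]), "control": (control, sides[1])}

print(assign_trial("twin 1", "twin 2"))
```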

The designers of the test attempt to control for age and gender biases. A mix of twins is used: men, women, older, younger.

The people pressing the buttons may represent a skewed sample set, however. The experiment was done in a museum. Only a portion of the total population enjoys museums. Thus, the test subjects are probably not representative of the population as a whole.

Basically,

Beldent can only conclude that chewing gum makes you more attractive to museum-goers when you’re sitting to the left of your boring twin.

A More Rigorous Experiment

To really get a handle on the social benefits of constant mastication, you would design a test that controlled for even more variables.

Participants should be shown only one twin. This, of course, means you don’t need twins at all, just one person from each set: chewing or not chewing, smiling or not smiling.

Questions would have to be reworded.

“Does this person make friends easily?”

“Does this person get invited to parties frequently?”

I would scratch the “bridge tournament” question. You’ve got to be smart to play bridge well. Smart people are likable, right?

This new design would require a larger sample size than the 481 participants in the Beldent artsperiment. You now have sixteen different treatments (men, women, older, and younger, each either chewing or not chewing, and either smiling or not smiling: 4 × 2 × 2 = 16). You would need 1,600 participants or more to get close to statistical significance.
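
That 1,600 figure is a rough rule of thumb, about 100 participants per treatment. A more rigorous route is a standard two-proportion power calculation. Here’s a sketch using the usual normal-approximation formula; the baseline and lift are hypothetical, not numbers from the Beldent exhibit:

```python
from math import ceil, sqrt
from scipy.stats import norm

def sample_size_per_group(p1, p2, alpha=0.05, power=0.8):
    """Participants per group to detect a shift from p1 to p2
    (standard two-proportion normal-approximation formula)."""
    z_a = norm.ppf(1 - alpha / 2)   # two-sided significance threshold
    z_b = norm.ppf(power)           # desired statistical power
    p_bar = (p1 + p2) / 2
    n = ((z_a * sqrt(2 * p_bar * (1 - p_bar))
          + z_b * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2) / (p1 - p2) ** 2
    return ceil(n)

# Hypothetical: detect a lift from 50% to 60% "more likable" votes.
print(sample_size_per_group(0.50, 0.60))  # ~388 per comparison group
```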

Unfortunately, this new experiment wouldn’t deliver such dramatic video footage. But without this rigor, Beldent may be lying to themselves — and to us — about the value of chewing gum.

Imagine Your Web Pages as Twins

Our job at Conversion Sciences is to design website tests for companies, tests that tell us exactly what we want to know about a web page and nothing more. We agonize over the subtle things that introduce bias into our test. We always want to test the right thing.

We create two or more versions of a page that are like identical twins, with only one thing changed. We make sure that the visitors in our tests are representative of the site’s visitors at large.

And we must control our natural human biases. Like the artists who set up the experiment, we want to get wins for our clients. If we give in to this desire, we can find ourselves calling tests too soon, or running with statistically insignificant results that favor our treatment. Even when our testing tools tell us we’ve got a winner, we are skeptical.

It isn’t hard to tell when we’ve made a bad call. If our tests don’t increase the fortunes of our clients, our reputation isn’t worth a p-value. The accounting department’s numbers don’t lie.

Let us test some pages for you. If you have five hundred transactions a month or more, you have the sample size to test your way to more sales, more leads, and more subscribers. And we’ll throw in some chewing gum.

Video: “Beldent Almost Identical” on YouTube

Welcome email tests that will help begin the onboarding process that turns tryers into buyers and buyers into long-term subscribers.

Email is still the most effective strategy for onboarding visitors. By “onboarding” we mean:

  • Getting tryers to use the product so they can become buyers
  • Getting buyers to use the product so they become long-term subscribers
  • Getting repeat buyers to share their appreciation of the product

Yes, email is important to your business. Onboarding can’t be done through Facebook or Twitter. It can’t be done through SMS. Maybe it can be done through direct mail. Maybe.

The first step in these processes is the ubiquitous welcome email. It gives customers a first impression of your business. Guides them through your product. And demonstrates the value that you can bring them. It’s what takes them from trial user to paying user to repeat user to evangelist.

In fact, marketers who use welcome emails find that they have a substantial effect on conversions, with some reporting up to a 50% conversion rate after adding them to their onboarding strategy. Impressive, huh?

Welcome emails aren’t as straightforward as you would think, however. They need to be tested. From timing to subject line, rigorously A/B testing the different aspects of your emails is a surefire way to build the most effective onboarding strategy for your business.

Today, we are going to focus on one aspect of welcome email A/B testing: content.

Content is what entices your user to click-through and act. You need to get it right.

Welcome Email Tests to Engage Customers

Here are five A/B tests you should be doing on your content to optimize your onboarding emails and get users converting from trial to lifetime customers.

1. Test Simple vs. Hyper-Stylized Design

Let’s begin with design.

No matter how well-written your emails are, if the look isn’t right, their effectiveness will be hampered. Emails can be as simple or as flamboyant as you wish. Generally, they are divided into three types:

  1. The first type is E-zine style: flashy and hyper-stylized, with images and bold fonts taking center stage.
  2. Next is SaaS style: cleaner and simpler, yet still professional.
  3. And finally, Personal: no branding, no design, just a straightforward email.

It’s up to you to test what works best for your business.


Will your visitors prefer a stylized email or a simple “personal” email?

An interesting design case study comes from SitePoint, a specialist in content for web developers. After sending out over 40 newsletters, their campaign started to look a little lackluster.

Their initial emails were uncluttered and pared back in design. And they wanted to continue with this look but update it and get more clicks.

So they ran an A/B test.

The first thing they tested was the template, and the results were positive, with an initial 16% rise in click-through rates.

Next they tested images: should they include them or keep it plain text? SitePoint already had a hunch that their customers didn’t care for images and wanted a text-only email. The test proved inconclusive, with 118 vs. 114 clicks in favor of no images.

This inconclusive test suggested that readers neither preferred nor minded images in their welcome email.
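
A quick way to see why 118 vs. 114 is inconclusive is a two-proportion z-test. This sketch assumes, hypothetically, that each variation went to 5,000 recipients; SitePoint didn’t publish the split sizes:

```python
from statsmodels.stats.proportion import proportions_ztest

clicks = [118, 114]   # no-images vs. images, as reported
sends = [5000, 5000]  # hypothetical list sizes (not published)

stat, p_value = proportions_ztest(clicks, sends)
print(f"p-value: {p_value:.2f}")  # ~0.79 here: nowhere near significance
```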

These welcome email tests were just the first round of experimenting for SitePoint. They went back to the drawing board and tested everything again. They experimented with images and templates until they found what worked best.


The winning email was simple, but a little design can go a long way.

The winning email retained the simple look of their original template. It was just updated and more attractive to readers, and, most importantly, it increased their click-through rate.

Contrast this with Wishpond. After extensive testing of their own emails, they discovered images were just what their audience wanted. Using images produced a 60% higher click-through rate than using text alone.

These two contrasting examples illustrate that there is no single best design for all businesses.

There is no one-size-fits-all template.

You need to test to discover what your customers like and what drives results.

2. Test A Single Call to Action

When you send out your welcome emails, we are betting you have one goal in mind: getting customers to use your product.

All too often we see businesses sending emails with multiple links, asking customers to take numerous actions. It’s confusing and will distract your user from your goal.

So here’s a challenge: try restricting your welcome emails to a single call to action.

That’s exactly what Optimizely did.

In 2014 they began rigorously testing all aspects of their emails. One of the tests had a goal of increasing click-throughs on the call to action.

To do this they sent out two emails: the first had only one CTA, while the second had several.


Optimizely tested an email with a single call to action against one with several.

There was one clear winner: the email with only one CTA produced substantially more click-throughs, a 13.3% increase.

Narrowing down your email to one call to action can be a tough task. You have a limited number of onboarding emails to send, yet you have so much to say.

Try removing any unnecessary calls to action from your emails and focus on what you believe is most important.

Ask yourself: what is the single most important thing you want your customer to do after receiving this email? Make that your call to action.

Then test.

3. Test Urgency-Inducing Copy

When sending welcome emails to onboard your users, there are some tactics you can use to convert those trial users into paying customers.

One method is urgency: using a sense of immediacy in your email to get your customer to act now.

MarketingExperiments tested the effects of urgency in their email campaigns.

They planned a web clinic invite and sent out two emails. One was just the simple invite. The other, however, had three extra urgency-inducing words: “Limit 1000 Attendees.”


Urgency may induce more of your email recipients to act.

The email containing the urgency had a 15.4% increase in click-throughs. Pretty impressive, considering the only difference was three words!

When sending welcome emails, urgency can be incredibly valuable.

Here is another example of urgency from Sprout Social.

To get trials to convert to paying customers, they use copy that implies urgency and encourages users to act now.

Urgency can be communicated in many ways.

They use phrases such as “Only 2 days left” and “Time Flies – your trial period is over in just 2 days.” The message shouts, “Act now or you’ll miss out!”

It’s a clever way to optimize your emails and get more customers converting.

4. Testing Email Length (How Long Should a Welcome Email Be?)

When a customer signs up you want to tell them everything about your business.

Explaining every feature and everything you offer in a long-winded email is going to show them the value of your business, right? Well, probably not.

Conversely, saying too little can also be problematic. Customers might feel underinformed and might not act at all.

Research has shown that the average open time for an email is only 15-20 seconds.

With such a small window of time, you need to test how long your emails should be to have the maximum impact.

iMedia Connection decided to carry out a test with two versions of an email promoting an upcoming conference.

One email was verbose, containing all of the information about the conference as well as links to the website.

The other was half the length, with only a short description and a link to a website containing the information.


A bigger open rate doesn’t mean a higher click-through rate.

The shorter email proved to be more appealing. iMedia Connection reported that not only was the open rate higher for the shorter email, at 30% vs. 20%, but its click-through rate was also higher, at 11% vs. 5%.

Short, brief content was the winner here, but that might not always be the case. Getting your email’s length right requires testing.

Good A/B testing will help you find the perfect balance between being informative and being concise.

5. Welcome Email Tests: Test Personalization

Personalization is one of the most effective techniques to increase conversions from emails. Using a customer’s data to appeal to their interests has been proven to work time and time again. And it isn’t as complicated as you may think.

DoggyLoot, an online store, experienced astonishing success when they began personalizing their emails’ content.

They recognized that Rottweiler owners wouldn’t want the same emails as Chihuahua owners. So they began to segment in the simplest way possible.

They began collecting “doggie data” by asking owners one simple question: is their dog small, medium, or large?

Using this data, they created three email segments by dog size. Each segment received an email with products suited to their dogs.


DoggyLoot sent different emails to owners with different sized dogs.
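
The mechanics of this kind of segmentation are simple. Here’s a hypothetical Python sketch; the field names are illustrative, not DoggyLoot’s actual schema:

```python
# Hypothetical sketch: segmenting a list on one signup question.
subscribers = [
    {"email": "a@example.com", "dog_size": "small"},
    {"email": "b@example.com", "dog_size": "large"},
    {"email": "c@example.com", "dog_size": "medium"},
]

segments = {"small": [], "medium": [], "large": []}
for sub in subscribers:
    segments[sub["dog_size"]].append(sub["email"])

# Each segment then gets a campaign with size-appropriate products.
for size, emails in segments.items():
    print(f"{size}: {len(emails)} recipient(s)")
```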

The results were impressive, to say the least. The personalized emails targeted at large-dog owners had a click-through rate 410% higher than the average.

Personalization doesn’t have to be complicated. Just find whatever works for your business.

DoggyLoot just asked the right question at signup, enabling them to segment their audience with relative ease.

Whether you just add a user’s name or build comprehensive buyer personas, testing personalization can be a real asset to your welcome emails.

5 Welcome Email Tests To Turn Tryers into Buyers: Summary

These 5 A/B tests and case studies are guidelines. Some may work for your business while others might make no impact at all.

It is important to focus on how customers are reacting to your email content. Measuring click-throughs and conversions is essential. See what reaches statistical significance, gets users converting, and turns them into lifelong customers. For more advanced A/B tests, read our ebook “Welcome Your First Million Users: The Ultimate Guide to A/B Testing Your Welcome Emails.”

You’ve read the blog posts and you’ve heard from the vendors. A/B testing is a lot more difficult than you might imagine, and you can unintentionally wreak havoc on your online business if you aren’t careful.

Fortunately, you can learn how to avoid these awful A/B testing mistakes from 10 CRO experts. Here’s a quick look at some of their greatest pitfalls:

Joel Harvey, Conversion Sciences: Worst A/B Testing Mistake

“Because of a QA breakdown we didn’t notice that the last 4-digits of one of the variation phone numbers displayed to visitors was 3576 when it should have been 3567. In the short time that the offending variation was live, we lost at least 100 phone calls.”

Peep Laja, ConversionXL: Worst A/B Testing Mistake

“Ending tests too early is the #1 mistake I see. You can’t “spot a trend”, that’s total bullshit.”

Craig Sullivan, Optimise or Die: Worst A/B Testing Mistake

“When it comes to split testing, the most dangerous mistakes are the ones you don’t realise you’re making.”

Alhan Keser, Widerfunnel.com: Worst A/B Testing Mistake

“I had been allocated a designer and developer to get the job done, with the expectation of delivering at least a 20% increase in leads. Alas, the test went terribly and I was left with few insights.”

Andre Morys, WebArts.de: Worst A/B Testing Mistake

“I recommend everybody to do a cohort analysis after you test things in ecommerce with high contrast – there could be some differences…”

Ton Wesseling, Online Dialogue: Worst A/B Testing Mistake

“People tend to say: I’ve tested that idea – and it had no effect. YOU CAN NOT SAY THAT! You can only say – we were not able to tell if the variation was better. BUT in reality it can still be better!”

John Ekman, Conversionista: Worst A/B Testing Mistake

“AB-testing is not a game for nervous business people, (maybe that’s why so few people do it?!). You will come up with bad hypotheses that reduce conversions!! And you will mess up the testing software and tracking.”

Paul Rouke, PRWD: Worst A/B Testing Mistake

“One of the biggest lessons I have learnt is making sure we fully engage, and build relationships with the people responsible for the technical delivery of a website, right from the start of any project.”

Matt Gershoff, Conductrics: Worst A/B Testing Mistake

“One of the traps of testing is that if you aren’t careful, you can get hung up on just seeing what you DID in the past, but not finding out anything useful about what you can DO in the future.”

Michael Aagaard, ContentVerve.com: Worst A/B Testing Mistake

“After years of trial and error, it finally dawned on me that the most successful tests were the ones based on data, insight and solid hypotheses – not impulse, personal preference or pure guesswork.”

Don’t start your next search marketing campaign without the guidance of our free report. Click here to download How 20 Search Experts Beat Rising Costs.


Team, tech, process, and scale: these are the primary components shared by Tania Shershin of HomeAway at the Which Test Won Live Event in Austin, Texas.
We captured the high points of her presentation live in this instagraph infographic.
If you want to create a culture of testing in your organization, here is a roadmap to success.


Questions can be powerful things, if the questions are good and are leading us in the right direction (normally this means toward better results).  When the questions are about the statistical significance of your marketing results, you’ve got a winning combination that will really take you places.

So, what are a couple of good questions that you can use to help you become statistically significant as a digital marketer?  Brian introduces a couple of doozies in How to be a Statistically Significant Marketer (Part 2) on Marketing Land:

Question 1: Are the results I’m seeing from my online marketing telling me what really happened?

Question 2: Are these results “good” enough to predict what will happen in the future?

You will find part one here: Become a Statistically Significant Marketer.

Brian tackles these challenging questions by introducing us to three characters involved in life-like scenarios that will seem all too familiar to some of us.

Finally, he ends with the question that every marketer has to continually ask at every step along the way:  What do I do next based on the results I am seeing?

Here’s another solid question you can ask yourself:  “Self, is it worth 15 minutes of my time to find out how to begin asking the right questions in my marketing efforts?”  We’re going to help you out here and encourage you to go with a resounding “YES” on that one.


21 Quick and Easy CRO Copywriting Hacks to Skyrocket Conversions

21 Quick and Easy CRO Copywriting Hacks

Keep these proven copywriting hacks in mind to make your copy convert.

  • 43 Pages with Examples
  • Assumptive Phrasing
  • "We" vs. "You"
  • Pattern Interrupts
  • The Power of Three

"*" indicates required fields

This field is for validation purposes and should be left unchanged.


My partner Joel Harvey is fond of saying, “My favorite part of a design is the money.” He’s been part of many a web design project. His perspective comes in response to the number of times he’s heard things like:

“I want the design to pop!”

“I want my site’s design to be groundbreaking like nothing else out there!”

“Let’s turn it up a notch on the design.”

“I want the site’s design to reflect the high value of our product.”

In and of themselves, none of the above statements are unworthy pursuits. But if your goal is to increase online sales conversion and fill your coffers to the brim, you will fall woefully short if you believe that web design alone can do the heavy lifting of convincing your visitors to take action. If increasing sales is your goal, the most important person on your split testing team is the accountant.

Designers Don’t Design for the Accountant

A while back, a client sent us a couple of different mocks of some new designs they were entertaining. They asked which one I liked. The first thing I said was, “I like the one that makes you the most money.” Up until that time, their team had been arguing over color palettes, white space, and rounded edges.
When I reminded them of the bigger goal, their conversation evolved. In a clock tick, we were all discussing the quality of content on the pages rather than the design elements. When their offer and call to action were right, everyone seemed to forget about the trivia of the actual design.

Designing For Your Ego

Another client brought us a new landing page campaign they had just launched; they were baffled and disappointed by the early results. They explained that they thought this was the best-designed landing page they had ever done. They had just hired a new graphic designer who “got it,” and even the CEO was impressed with his work. One problem: their paying customers didn’t seem to agree. No doubt, the design was gorgeous. Rich colors, curvy rectangles, sexy images; even the header and body fonts were crisp and clean.
So why wasn’t this campaign working? We had them show us their most recent successful campaign. The design was a tad dated; compared to the new landing page, it looked like the work of a high school hobbyist holed up in the company basement, eating Cheetos and suckling energy drinks.
Still, the comparison immediately revealed the problem with the new landing page. The copy on the old page was much better. The headers screamed the product’s value proposition and benefits. The body copy answered relevant questions and helped the reader imagine themselves buying the product. The call to action button was big, bold, and in your face. The new page looked stunningly attractive but said very little.
To add insult to injury, the hotshot designer was a minimalist with an aversion to big, gawky buttons, so his primary call to action was a tiny button that blended in with the hero image and, by design, was easy to ignore. We instructed them to use the old page copy on the new design (they had to make a few adjustments to make it all fit), and we asked the designer to create a bigger and bolder call to action button. They obliged us, and the new design finally beat the old landing page.

How Much Time Are You Spending With Your Designer vs. Your Banker?

So my lesson is this: beautiful, eye-popping design and effective, profitable web design are two different things. And it is all too easy to mistake eye-popping designs for profitable ones. Split testing will always lead you in the right direction.
Some companies spend more on design than they do on organic SEO, and almost all companies spend more on design than on conversion rate optimization. Search engine spiders don’t evaluate site design, only content and links. And I have yet to see a company design their way into a better conversion rate and a better ROI.
Some companies spend far more time going back and forth about a design element than they do actually testing it. It makes you wonder how far ahead of your competitors you could get if you spent more time and resources on conversion optimization and testing.
So when considering a redesign of your entire site, of a successful landing page, or even a banner ad, do the following:

  • List the things about the page experience (not just the design) that work. Keep those in the new design.
  • Identify what about the experience doesn’t work.
  • Ask why you want to change anything, especially if it is working.
  • Before you launch a radically new design, test what you believe is NOT working about the current design.

Above all, use web designers who deeply understand the web and the principles of conversion. Otherwise, they are just artists, and the value of an artist’s work usually increases only after their demise. Can you wait that long?
