You’ve read the blog posts and you’ve heard from the vendors. A/B testing is a lot more difficult than you might imagine, and you can unintentionally wreak havoc on your online business if you aren’t careful.
Fortunately, you can learn how to avoid these awful A/B testing mistakes from 10 CRO experts who tell all in this Content Verve article. Here’s a quick look at some of the biggest pitfalls they’ve encountered:
Joel Harvey, Conversion Sciences
“Because of a QA breakdown we didn’t notice that the last 4-digits of one of the variation phone numbers displayed to visitors was 3576 when it should have been 3567. In the short time that the offending variation was live, we lost at least 100 phone calls.”
Peep Laja, ConversionXL
“Ending tests too early is the #1 mistake I see. You can’t “spot a trend”, that’s total bullshit.”
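Peep’s warning has a statistical core: you should commit to a sample size before the test starts, rather than stopping when an early “trend” looks good. Here’s a minimal sketch of that calculation using only the Python standard library; the 5% baseline rate and 20% relative lift are illustrative assumptions, not figures from the article:

```python
from math import ceil
from statistics import NormalDist

def sample_size_per_variant(p_base, p_variant, alpha=0.05, power=0.80):
    """Minimum visitors per variant for a two-tailed, two-proportion
    z-test (normal approximation) at the given alpha and power."""
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)   # significance threshold
    z_power = NormalDist().inv_cdf(power)           # sensitivity to the lift
    variance = p_base * (1 - p_base) + p_variant * (1 - p_variant)
    effect = p_variant - p_base
    return ceil((z_alpha + z_power) ** 2 * variance / effect ** 2)

# Detecting a 20% relative lift on a 5% baseline conversion rate:
n = sample_size_per_variant(0.05, 0.06)
print(n)  # thousands of visitors per variant before you can call it
```

Until each variant has seen roughly that much traffic, any “trend” in the dashboard is exactly the noise Peep is talking about.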
Craig Sullivan, Optimise or Die
“When it comes to split testing, the most dangerous mistakes are the ones you don’t realise you’re making.”
Alhan Keser, Widerfunnel.com
“I had been allocated a designer and developer to get the job done, with the expectation of delivering at least a 20% increase in leads. Alas, the test went terribly and I was left with few insights.”
Andre Morys, WebArts.de
“I recommend everybody to do a cohort analysis after you test things in ecommerce with high contrast – there could be some differences…”
Ton Wesseling, Online Dialogue
“People tend to say: I’ve tested that idea – and it had no effect. YOU CAN NOT SAY THAT! You can only say – we were not able to tell if the variation was better. BUT in reality it can still be better!”
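Ton’s point is the statistical distinction between “no effect” and “inconclusive”, and a confidence interval makes it concrete. The sketch below uses only the Python standard library, and the visitor and conversion counts are made-up illustrations:

```python
from math import sqrt
from statistics import NormalDist

def diff_confidence_interval(conv_a, n_a, conv_b, n_b, confidence=0.95):
    """Confidence interval for the difference in conversion rate
    (variation minus control), normal approximation for two proportions."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    se = sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
    z = NormalDist().inv_cdf((1 + confidence) / 2)
    diff = p_b - p_a
    return diff - z * se, diff + z * se

# 10,000 visitors each; control converts 200, variation converts 215:
lo, hi = diff_confidence_interval(200, 10_000, 215, 10_000)
# The interval straddles zero, so the test is "not significant" -- but
# its upper end still allows a meaningful positive lift. That is exactly
# why you can't conclude the variation had no effect.
```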
John Ekman, Conversionista
“AB-testing is not a game for nervous business people, (maybe that’s why so few people do it?!). You will come up with bad hypotheses that reduce conversions!! And you will mess up the testing software and tracking.”
Paul Rouke, PRWD
“One of the biggest lessons I have learnt is making sure we fully engage, and build relationships with the people responsible for the technical delivery of a website, right from the start of any project.”
Matt Gershoff, Conductrics
“One of the traps of testing is that if you aren’t careful, you can get hung up on just seeing what you DID in the past, but not finding out anything useful about what you can DO in the future.”
Michael Aagaard, ContentVerve.com
“After years of trial and error, it finally dawned on me that that the most successful tests were the ones based on data, insight and solid hypotheses – not impulse, personal preference or pure guesswork.”
Get the full article on Content Verve.
Don’t start your next search marketing campaign without the guidance of our free report. Click here to download How 20 Search Experts Beat Rising Costs.