If You Don’t Tell Your Kids About Conversion, Who Will?

No parent relishes having “the talk” about conversion rates. No one wants to tell their kids: “Kids, conversion rates are still only around 2 percent,” or tell them the naked truth about all the sites out there that were never optimized. Or point out that they’re hanging with the wrong crowd when it comes to web development and creation.

But if we don’t tell the kids about the benefits of hanging with fake people—called personas—who will? We have to just pick the right time and be ready to be vulnerable.

When confronted with a difficult question like “Daddy, should I put up a squeeze page?” we can just say: “I just don’t know. Let’s explore that together.”

As hard as this is, it’s better they learn about value propositions from you than off the street. Check out my article on Search Engine Land, “7 Things to Teach Your Children About Conversion,” to learn how to do this the right way.

I’m completing the chapters of my new book, due out this spring. Find out how you can get a free copy of the book when it’s available.

  • Agree with the points you made above, Brian. Quite insightful.

  • VoiceTranscribing

    Thanks for the article, Brian. Yes, more and more of our clients use our transcription service http://voicetranscribing.com to transcribe their podcasts, webinars, and interviews and to generate content for blog posts.

  • Hi Kristi,

    I could respond with a lengthy comment arguing the futility and wastefulness of running A/A, A/A/B, etc. tests, but I already wrote an article on that back in 2014, so I’ll just share it: http://blog.analytics-toolkit.com/2014/aa-aab-aabb-tests-cro/ If you’d like to check it out and, hopefully, respond to it, I think it will be beneficial for the readers of this blog.

    Kind regards,
    Georgi

    • Thanks, Georgiev. When you have a test setup that spans multiple domains, servers, and security features, an A/A test is critical. We have been saved by A/A tests. In response to your excellent article, I ask: which is more wasteful, running a series of A/A tests or running a series of A/B tests that result in the wrong decisions? The latter can impact sales for months or years.

      • Sounds like an unusually complicated test setup there, Brian. What kind of problems did those many A/A tests reveal? Randomization issues? User experience uniformity issues? Statistical engine issues? I’m just thinking there has to be a better way to detect and debug most of these, but the statistical engine ones…

        • We never really found the smoking gun, but we suspected cookie persistence issues, iframe security delays, page load times, etc. We redesigned the approach and verified the setup with an A/A test.
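          For readers curious what “verifying the setup with an A/A test” actually checks: both arms serve the identical page, so their conversion rates should differ only by chance. A minimal sketch in Python, using a standard two-proportion z-test (the visitor and conversion counts below are made up for illustration, not from Brian’s tests):

          ```python
          # A/A sanity check: both arms are identical, so a significant
          # difference in conversion rate suggests a broken setup
          # (cookie persistence, skewed randomization, tracking delays).
          from math import sqrt, erf

          def two_proportion_z(conv_a, n_a, conv_b, n_b):
              """Two-sided z-test for a difference between two conversion rates."""
              p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate
              se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
              z = (conv_a / n_a - conv_b / n_b) / se
              # two-sided p-value via the normal CDF
              p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
              return z, p

          # Hypothetical A/A data: ~2% conversion in each arm.
          z, p = two_proportion_z(conv_a=210, n_a=10_000, conv_b=195, n_b=9_900)
          if p < 0.01:
              print(f"Arms differ (z={z:.2f}, p={p:.4f}): debug the test setup")
          else:
              print(f"No significant difference (z={z:.2f}, p={p:.4f})")
          ```

          The same check applied to the traffic split (visitors per arm against a 50/50 expectation) catches randomization bugs that conversion rates alone can miss.
          
          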

  • nrennie

    Thanks for removing my comments, @bmassey.

    Surely constructive criticism is part of making things better, and pointing out that you excluded a market leader from your “Top tools” list was exactly that?

    So my valid point was why not include Maxymiser? It’s a huge gap in your post.

    • Cut the sarcasm, @nrennie. It’s never appropriate. You commented on the wrong post here. I assume you meant to post on “The Most Recommended AB Testing Tools by Leading Experts”. I’ll reply to your comment there, but we didn’t list Maxymiser because nobody recommended it. Our team used it for one client and found it lacking on several key features.