A/B Testing: What Is It and Why Do It?

You may have heard people talking about A/B testing. In this practice, marketers split off a portion of their mailing list and use it to test different messages, offers and calls-to-action (CTAs) against an existing version or other candidates under consideration. A/B testing helps you learn about your customers and what makes them tick.

For example, do you know which CTA is more likely to motivate your audience? “Act now! The first 500 people receive $100 off!” Or “Act now! The first 500 people receive a 50% discount!” Similarly, do you know if changing that discount affects response? Does an offer of “20% off” pull a higher response than an offer of “15% off”? If so, does the extra 5% discount generate enough additional revenue to justify the deeper discount?
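To make that trade-off concrete, here is a minimal sketch of the break-even arithmetic. All figures (order value, mailing size, response rates) are hypothetical placeholders; your own volumes and margins will differ.

```python
# Break-even check for a deeper discount (hypothetical figures).
# Assumes a $100 average order value and a mailing of 10,000 people.
ORDER_VALUE = 100.0
MAILING_SIZE = 10_000

def revenue(response_rate: float, discount: float) -> float:
    """Net revenue after the discount is applied."""
    buyers = MAILING_SIZE * response_rate
    return buyers * ORDER_VALUE * (1 - discount)

rev_15 = revenue(0.030, 0.15)  # assume 3.0% response at 15% off
rev_20 = revenue(0.038, 0.20)  # assume 3.8% response at 20% off

# The deeper discount wins only if the extra responses outweigh
# the margin given away on every sale.
print(f"15% off: ${rev_15:,.0f} | 20% off: ${rev_20:,.0f}")
```

With these made-up numbers, the extra responses more than cover the deeper discount; with a smaller bump in response rate, they would not. That is exactly the kind of question an A/B test answers with your real data.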

A/B testing can be used to test anything and everything related to your message. Is this image more effective than that one? Does it matter if the piece lands on a Tuesday or a Wednesday? It allows you to ask questions, test and discover answers that may surprise you.

This practice takes work and planning, so you may be wondering if it’s worth it, especially if your marketing efforts are already successful. The answer is: Yes! Who doesn’t want more success? You may be happy with the results of your campaigns, but what if those results could be even better?

Here are a few examples of where A/B testing made a difference:

  1. Company A tested two versions of its marketing piece. The first had a clean, visually appealing design. The second had a design that was less visually appealing but that had better copy blocking—a shorter headline and a subhead that brought out key benefits of the product. Although the second version was less attractive, it increased sign-ups by 38%.
  2. Company B sent a direct mailer that directed people to a landing page. One version of the page had a snappier, more interesting headline and subhead. Yet the “less interesting” version outperformed the more interesting version by 115%. Why? The less interesting version tied back to the original communications, creating recognition and consistency in the messaging throughout the sales funnel.
  3. Company C wanted to test the value of including testimonials in its communications. One version had great copy but no testimonials. The other included testimonials, even though they were located below the fold. The result? The version with testimonials outperformed the other by 34%.
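Lifts like these are only trustworthy if the test groups were large enough. As a rough sketch, a standard two-proportion z-test can tell you whether a measured lift is likely real or just noise. The function name and the sample counts below are our own illustration, not figures from the companies above.

```python
import math

def ab_significance(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-proportion z-test: is B's lift over A real or noise?
    Returns (relative lift, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    lift = (p_b - p_a) / p_a
    return lift, p_value

# Hypothetical split: 5,000 recipients per version,
# 100 sign-ups for A vs. 138 for B (a 38% lift).
lift, p = ab_significance(100, 5000, 138, 5000)
print(f"lift: {lift:.0%}, p-value: {p:.3f}")
```

A common rule of thumb is to treat a p-value below 0.05 as significant; with smaller mailing lists, even a large apparent lift can fail that bar, which is why test planning matters.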

Want to see how much you can improve your own campaign results by testing individual elements? Contact us today for help crafting your own A/B tests!
