Test for Success

Ever wondered what drives response – pictures or words? Red or blue? Flash or plain HTML? A great way to capitalize on the democratic medium of email is to put your burning questions, late-night hunches, and out-of-the-box ideas to the test with an A/B split test! Allow your audience to vote with their clicks and get instant answers that can help drive stronger results!

Follow a few simple guidelines provided in the eec Email Design Roundtable’s A/B Test Checklist and start testing your way to a more engaging email program.

Let no area of your message be safe from scrutiny! The checklist provides test ideas that will help you optimize:

Subject Lines

According to the vast and varied experiences of our very own eec Email Design Roundtable, there are 3 golden rules to follow when executing a successful and insightful test:

Rule #1:
Focus on one key variable at a time. Note before you start the test what key metric you are looking to influence to declare a winner. Subject line testing is generally about getting people to open the email; calls to action are more about clicks and conversion.

There is one caveat to focused decision making in A/B test scenarios – while it is necessary and rewarding to get answers to your burning questions by tracking a measurable change in a single metric, it is important to realize that changing one element can cause unintended side effects:

• When subject line testing, you might focus on the change in open rate to determine which worked better, but also consider post-open actions (did the subject line set the person up to convert in the email?).
• When image testing, keep an eye on your overall file size: does it negatively impact your deliverability?

At the end of the day, email is a direct response medium, so just be clear what you are trying to test/achieve, and make sure your positive results in one area aren’t sabotaging another.

Rule #2:
You MUST use a random distribution for setting up your “A” and “B” audience groups. The sizes of the segments don’t need to be the same if the key metric you are looking to influence is expressed as a “rate”, but they do need to have the same general characteristics to be a fair test (don’t test all buyers in the A group and all prospects in the B group).
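To make Rule #2 concrete, here is a minimal Python sketch of a random split. The function name, seed, and email addresses are illustrative, not from the checklist; the point is that shuffling the whole list before splitting gives both groups the same general characteristics.

```python
import random

def ab_split(audience, seed=42):
    """Randomly assign each recipient to group A or B.

    Shuffling the full list before splitting means buyers and prospects
    land in both groups in roughly equal proportion -- a fair test.
    """
    pool = list(audience)
    rng = random.Random(seed)  # fixed seed so the split is reproducible
    rng.shuffle(pool)
    mid = len(pool) // 2
    return pool[:mid], pool[mid:]

# Illustrative audience of 1,000 addresses
group_a, group_b = ab_split([f"user{i}@example.com" for i in range(1000)])
```

Because the key metric is a rate, the halves would not even need to be equal in size; an uneven split (say, 10% test / 90% holdback) works the same way as long as assignment stays random.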

In fact, if you can’t decide between one hero image and another, do an initial A/B split test with a small percentage of your audience on Monday, then send the winning creative to the remainder on Tuesday.

The initial test will give you enough of a sense of “what worked” to roll out the best variation to the remainder of your list. Be ready to act on what the data tells you – you might be surprised!
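Declaring the winner of that initial test can be as simple as comparing rates. A small Python sketch, with made-up numbers (the variant names and counts are hypothetical, not real campaign data):

```python
def open_rate(opens, sends):
    """Opens divided by sends -- comparing rates lets the two
    test cells be different sizes."""
    return opens / sends

def pick_winner(results):
    """results maps a variant name to an (opens, sends) pair;
    the variant with the highest open rate wins the rollout."""
    return max(results, key=lambda v: open_rate(*results[v]))

# Hypothetical Monday test: B opened at 15% vs A's 12%
winner = pick_winner({"A": (120, 1000), "B": (150, 1000)})
# winner == "B", so B's creative goes to the remainder on Tuesday
```

With small test cells, differences of a point or two may just be noise, so the bigger the initial sample, the more you can trust the rollout decision.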

Rule #3:
Ron Blum of Upromise astutely points out that while the purpose of A/B testing is to find out what works, “don’t assume what works today will work tomorrow… tastes change, people get used to and fatigued by getting the same look-and-feel”.

Continuous testing is the best recipe for continued success.

Advanced A/B Testing

If you are one of those highly evolved, weekly A/B test prodigies and are looking for a new angle on ye olde A/B test, try multi-variate testing on for size.

Not all customer / audience segments behave the same way. As your mailing strategy gets to be more complex, there is no reason to stop A/B testing. In fact, segmenting your audience allows you to exponentially increase the insights provided by your A/B testing!

Take this example from Williams-Sonoma:

In general, we find that including the price for a featured item on the hero image of an email drives clicks and conversions. However, we recently tested the presence of price on an email segmented between customers with a history of spending more than $100 per transaction and customers who tended to spend less than $100 per transaction. We found that the lower-spend customers were more likely to click when the price was NOT provided, whereas the opposite was true for customers who had spent more than $100 with us.

Not only did this test help us drive response rate for all customers, but the insight also helped us develop a strategy for talking to our lower-spend customers that will continue into future campaigns.

In order to set this up correctly, just remember golden rule #2 and make sure you have a “control” group in both segments.

With these four segments:
Low Price A vs Low Price B
High Price A vs High Price B

You can test A vs B in your Low Price segment and see whether the result matches A vs B in your High Price segment.
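The four-cell setup is just Rule #2 applied inside each segment. A hedged Python sketch, assuming a hypothetical `avg_spend` field and a $100 threshold like the Williams-Sonoma example (names and data are illustrative only):

```python
import random
from collections import defaultdict

def segmented_ab_split(audience, segment_of, seed=7):
    """Run the random A/B split separately inside each segment,
    yielding four cells: (low, A), (low, B), (high, A), (high, B).
    Each segment keeps its own fair A-vs-B comparison."""
    rng = random.Random(seed)
    by_segment = defaultdict(list)
    for person in audience:
        by_segment[segment_of(person)].append(person)
    cells = {}
    for seg, members in by_segment.items():
        rng.shuffle(members)  # randomize within the segment first
        mid = len(members) // 2
        cells[(seg, "A")] = members[:mid]
        cells[(seg, "B")] = members[mid:]
    return cells

# Hypothetical audience segmented by average spend per transaction
customers = [{"email": f"u{i}@example.com", "avg_spend": i % 200}
             for i in range(400)]
cells = segmented_ab_split(
    customers, lambda c: "high" if c["avg_spend"] > 100 else "low")
```

Comparing the A-vs-B lift within the low cell against the lift within the high cell is what surfaces insights like the price-display finding above; a single pooled A/B test would have averaged the two effects away.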

Please join us in the pursuit of more perfect email by using our A/B Test Checklist, available in the eec’s Whitepaper Room, and returning to post your results below!

Megan Walsh, Williams-Sonoma
eec Email Design Roundtable Co-Chair