A/B tests drive growth – if you do them right. Qubit found that poorly conducted A/B tests can actually hurt revenue, because they lead to the wrong changes being made. If you’re not A/B testing your emails, you’re not putting your best foot forward. So keep reading to learn some A/B testing best practices.
What is A/B testing?
A/B testing, also known as split testing, is not as complicated as it sounds.
You’re changing a single element of your email (version A) and testing it against another version of the same element in the same email (version B). A subset of your test group gets version A and a different subset gets version B. The best-performing version is your new email and will be deployed to the remainder of your list.
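To make the mechanics concrete, here’s a minimal Python sketch of that split. The 20% test fraction and the fixed random seed are illustrative assumptions, not recommendations from this post:

```python
import random

def split_test_group(subscribers, test_fraction=0.2, seed=42):
    """Randomly split a subscriber list for an A/B test.

    A test_fraction share of the list is shuffled and divided evenly
    between version A and version B; everyone else is held back to
    receive the winning version later.
    """
    rng = random.Random(seed)  # fixed seed so the split is reproducible
    shuffled = list(subscribers)
    rng.shuffle(shuffled)
    test_size = int(len(shuffled) * test_fraction)
    half = test_size // 2
    return {
        "A": shuffled[:half],             # this subset gets version A
        "B": shuffled[half:test_size],    # this subset gets version B
        "holdout": shuffled[test_size:],  # gets the winning version afterwards
    }

groups = split_test_group([f"guest{i}@example.com" for i in range(1000)])
```

With 1,000 subscribers and a 20% test fraction, 100 people see version A, 100 see version B, and the remaining 800 wait for the winner.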
Benefits of A/B Testing
It’s cheap. You can test as many times as you want, whenever you want. And we encourage you to never stop testing.
It’s easy. As long as you do it correctly (and we’ll show you how below), the execution and evaluation processes are simple and accurate.
It’s honest. You’re using your own database of real customers, so you’ll get the best gauge of how your market will react.
A/B Testing Best Practices
1. What can you A/B test? You can test just about any variable, but here are some big ones for email:
- Call to action (Example: “Book Now” vs. “Book My Room”)
- Subject line (Example: “Here’s 15% off your next stay” vs. “Want 15% off your next stay?”)
- Personalization (Example: “Mrs. Smith” vs. “Amy”)
- Images (Example: destination vs. people)
- The specific offer (Example: “Save 20%” vs. “Book 2 nights, get 3rd free”)
- Design (Example: colors, fonts, font sizes, bullet points vs. paragraphs, etc.)
2. Set up a control and variation. The purpose of A/B testing is to see if a variable change will improve email click and conversion rates. You should test the control (the original email) against a variation (the version with the single change you want to evaluate).
3. Only test one variable at a time. One of the most important rules is to only change one element at a time. For example, you don’t want to test a new subject line and a new CTA at the same time. If your conversion rate spikes, you won’t know which new element influenced that change.
4. Don’t make changes mid-test. Don’t make any changes to the test until it is finished; if you interrupt it before it ends, your results are no longer reliable. That includes adding new variables, changing your test group, or changing the number of people who see the control or variation. Altering your goal mid-test will also invalidate your results.
5. Figure out your test group. Two important components make up your test group: the number of people and the type of people.
- Your sample size needs to be a sufficient representation of your email list; otherwise, any decisions you make based on the results may be flawed. The Optimizely sample size calculator is a great way to find the right number: input your current conversion rate and the minimum improvement you’d like to detect, and the calculator returns the optimal test group size you’ll need for your A/B test.
- Segment your test groups. A/B testing needs to be done on like-minded subscribers. Otherwise, your results may be skewed since the engagement rate may differ based on segments.
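If you’d like to see what a calculator like Optimizely’s is doing under the hood, the standard two-proportion power formula can be sketched in plain Python. The 5% significance level and 80% power defaults below are common statistical conventions, not values from this post:

```python
from math import ceil, sqrt
from statistics import NormalDist

def sample_size_per_variant(baseline_rate, min_lift, alpha=0.05, power=0.8):
    """Approximate subscribers needed per variant to detect a lift.

    baseline_rate: current conversion rate (e.g. 0.02 for 2%)
    min_lift: relative improvement to detect (e.g. 0.2 for +20%)
    """
    p1 = baseline_rate
    p2 = baseline_rate * (1 + min_lift)
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance
    z_beta = NormalDist().inv_cdf(power)           # desired statistical power
    p_bar = (p1 + p2) / 2                          # average of the two rates
    numerator = (z_alpha * sqrt(2 * p_bar * (1 - p_bar))
                 + z_beta * sqrt(p1 * (1 - p1) + p2 * (1 - p2))) ** 2
    return ceil(numerator / (p2 - p1) ** 2)

# A 2% conversion rate, hoping to detect a 20% relative lift:
n = sample_size_per_variant(0.02, 0.2)
```

Note how quickly the required size grows for small lifts: detecting a 20% relative improvement on a 2% conversion rate takes tens of thousands of subscribers per variant, which is why segmented lists sometimes can’t support very fine-grained tests.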
6. Create a schedule. Determine a test period (the window between email send time and analysis time) that is long enough to produce reliable, accurate results. Conduct your tests at a consistent time of day and day of the week, and keep seasonality in mind.
How long varies with the number of variations you’re testing: the more variations, the longer the test should run. And if your emails currently get few opens, give your A/B tests a little more time.
7. Analyze the right metrics the right way. Look beyond the one variable you’re changing. When your email open rate goes up, what happens to the number of direct bookings? To visits to your spa or restaurant? An accurate analysis is a comprehensive analysis.
Measure as far down the marketing funnel as possible. Ensure that you use a reliable tool to read your results, such as Google Analytics or campaign performance reports from Revinate Marketing.
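One simple way to check whether a difference between two versions is a real effect rather than noise is a two-proportion z-test. This is a generic statistical check, not a feature of Google Analytics or Revinate Marketing; the numbers in the example are made up:

```python
from math import sqrt
from statistics import NormalDist

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for 'A and B convert at different rates'.

    conv_a / conv_b: number of conversions for each version
    n_a / n_b: number of emails sent for each version
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)  # pooled conversion rate
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))  # two-sided p-value

# 40 conversions from 1,000 sends of A vs. 62 from 1,000 sends of B:
p_value = two_proportion_z_test(40, 1000, 62, 1000)
# p_value < 0.05 here, so the lift is unlikely to be random noise
```

A common convention is to call a result significant when the p-value falls below 0.05; if it doesn’t, the honest conclusion is that the test hasn’t shown a difference yet.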
8. Determine your end goal. What do you want to achieve with this test? With emails, the goal is usually to improve open, click, and conversion rates. Establish a hypothesis to guide your A/B test.
Here’s the best way to create a hypothesis:
- Observe your current trends. What is happening that you want to change?
- Find a possible reason that may be keeping you from your goal.
- What element should you change to fix this?
- What is your goal rate? How will you know when you are successful?
Revinate’s A/B Testing solution
Why are we sharing these best practices? Revinate will soon debut a brand new A/B testing feature, and we want to make sure you have a simple introduction to the process. Stay tuned to learn more about Revinate Marketing’s A/B Testing.
Smarter email marketing
If you’re not already using Revinate to increase direct bookings with smarter email marketing, please reach out to learn more!
The post You’re Doing it Wrong – 8 Ways to A/B Test Emails Correctly appeared first on Revinate.