The biggest difference between successful and unsuccessful digital marketers is that the successful ones continuously run A/B tests.
When you test your campaign, you’ll want to use the A/B split testing methodology mentioned in the key definitions. A/B split testing allows you to compare one campaign element (the control) against a variation of that element.
The result is that you can test which changes to the control improve the click-through rate or the conversion rate.
Many people try to skip this step and design their campaigns around what they think would be most appealing.
But you’re only one data point, and you’re already an insider, so basing a campaign solely on your opinion (or the opinions of a few people in your office) is a prescription for failure.
Here’s a great story about why you don’t want to base your campaign on individual preferences (or previously held assumptions):
It’s a commonly held belief that having a security badge on a website improves conversion rates. (A security badge is a graphic that indicates that the website has security features built in.)
But one designer decided to test whether eliminating a security badge from a landing page would significantly decrease his conversion rate.
So he did an A/B split test where one landing page had the security badge and the other landing page did not have the security badge.
To his great surprise, removing the security badge didn’t decrease his conversion rate. Instead, it actually increased conversions by 400%!
The lesson is that you won’t know how your audience responds to specific changes until you test your way to success. With that in mind, remember that it’s always a good idea to run A/B split tests to optimize your campaigns.
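When you do run a split test, you’ll want to confirm that a lift like this is real rather than random noise. One common way to do that is a two-proportion z-test on the control and variant conversion rates. The sketch below is a minimal example using only Python’s standard library; the visitor and conversion counts are hypothetical numbers chosen to illustrate a 400% lift, not figures from the story above.

```python
from math import sqrt, erf

def ab_significance(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: does variant B's conversion rate
    differ significantly from control A's?"""
    p_a = conv_a / n_a          # control conversion rate
    p_b = conv_b / n_b          # variant conversion rate
    # Pooled rate under the null hypothesis that A and B convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p_a, p_b, z, p_value

# Hypothetical traffic: 1,000 visitors per page, with the badge-free
# variant converting at five times the control rate (a 400% lift).
p_a, p_b, z, p = ab_significance(20, 1000, 100, 1000)
print(f"control {p_a:.1%} vs variant {p_b:.1%}, z={z:.2f}, p={p:.4f}")
```

A tiny p-value (conventionally below 0.05) means the difference is unlikely to be chance, so you can act on the variant with some confidence; with small samples, even a large apparent lift may not clear that bar.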
Correct This Search Marketing Mistake In Two Steps
- Be sure to have the discipline to run A/B split tests for your next campaign.
- Don’t make assumptions about how your target market will respond.