Survival of the Fittest: Build Better Creative by A/B Testing Your Ads

Written by Tim Edmundson

Every marketer wants their creative to be top-notch – but how can you be sure it’s the best it can be? A/B testing. It’s a simple concept: you pit one ad against another to see which is more effective, so you know you’re using the best combination of copy, visuals, and CTAs for your audience. Even Darwin knew a little competition is healthy, and putting your ads up against one another will help you determine which one is a winner.

Let’s explore the ins and outs of effective A/B testing, and how you can tweak your campaign creative to see what works, and what doesn’t.

The Science of A/B Testing

If the scientific method is good enough for the scientists of the world, it’s good enough for us marketers. Testing your ads is very similar to the experiments you may have done back in chemistry class — any successful test begins with a hypothesis and ends with analysis. Here’s a quick rundown of the A/B testing process:

> Determine Your Hypothesis |  Develop an idea of what you expect to see — which ad will perform better and why? What sets that ad apart from the others? By taking stock of your assumptions and then challenging them, you put yourself in a position to evolve your expectations, and build better ads.

> Identify Your Variable | What are you going to change across your ads? Pick a single element to vary: messaging, CTA, and visuals are all fair game.

> Define Success | This should be tied to your goal for your campaign — which ad was more effective at driving the metric you’re most interested in?

> Prepare for Launch | Determine when your ads will launch, who they will target, and how long they’ll run. Then set them live and let the performance data stack up.

> Analyze the Data and Pick the Winner | Now that the results are in, look at your reporting and determine which ads performed best against the definition of success you set earlier.

By following this method, you should be able to surface which types of creative work best. Once you have that information, you can optimize current and future ads to take advantage of your learnings and drive more customer action. But remember, one test is never enough — keep testing your optimized ads and continue to drill deeper into what works. This will ensure your future campaigns incorporate all of your cumulative learnings, and will continue to evolve with your audience.
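When you reach the analysis step, it helps to check that the gap between your two ads is real rather than random noise. Here’s a minimal sketch of that check in Python, using a standard two-proportion z-test; all of the traffic and conversion numbers below are made up purely for illustration:

```python
import math

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Compare the conversion rates of ad A and ad B.

    Returns (rate_a, rate_b, z, p_value), where p_value is the
    two-tailed probability of seeing a gap this large by chance.
    """
    rate_a, rate_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)   # overall rate if the ads were identical
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (rate_b - rate_a) / se
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return rate_a, rate_b, z, p_value

# Hypothetical results: ad A converted 120 of 4,000 viewers, ad B 165 of 4,100
rate_a, rate_b, z, p = two_proportion_z_test(120, 4000, 165, 4100)
winner = "B" if rate_b > rate_a and p < 0.05 else "inconclusive"
```

With these made-up numbers, ad B’s lift clears the conventional p < 0.05 bar; with much smaller samples, the same rates might not — which is why how long you let the test run matters as much as the rates themselves.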

What Should You Test?

Don’t feel like your testing should be limited to just a handful of elements — there are plenty of variables you can pit against one another. These can be as focused as the wording of your CTA, or as broad as the complete visual look and feel of your ad. Here are some variables you can mix up:

> CTA | Try different action-oriented CTAs to see which better prompts your audience to act; “Shop Now,” “Learn More,” and “Get X% Off” are all strong contenders. You can even shift your CTA’s placement in the ad to see which arrangement gets the most clicks.

> Copy | Phrasing matters! Alternate the way you discuss your deal or promotion, product, or brand. And don’t limit your copy adjustments to minor tweaks — testing two completely different lines of copy (a long vs. short version, for example) against one another can reveal what customers find engaging.

> Visuals | Your visual theme can have a heavy impact on your ads’ effectiveness. Test to see what works best for your brand — what colors work best? Should you include people, or not? Test your way to the answers.

> Elements | Adding an element like a countdown timer to your ad can create a sense of urgency. Test to see whether your audience is more likely to visit your product page and convert when you show them that time is working against them.

> Targeting | Your ad is only as effective as your audience is receptive. Test different target segments to see which ones are more interested in clicking and converting.

> Placements | Consider where your ads are running, and mix it up. Your ads may perform better on certain sites, but you’ll never know until you test and analyze the results.

Changing multiple components of an ad at once can get you interesting results, but it muddies the waters when you try to determine which tweak was responsible for the uptick in performance. So if you’re looking for clean data, change only one thing at a time when you test.

That said, if you’re feeling the urge to try completely different approaches and test them against one another — go for it. Large differences in performance can reveal audience attitudes more quickly if one ad resonates and the other doesn’t, giving you a clearer picture of what works. And you can still make one-at-a-time tweaks to the winner to continue optimizing. This is also a good way to kick off a campaign, and can help you determine your overall messaging and visual direction going forward.

Always Be Testing

Once you’ve done your A/B testing and identified what works best for your ads, you’re on the right track — but you’re not done yet. As your campaigns progress, keep testing: create new ads based on the learnings from your previous tests. Take elements that worked in your earlier creative and build a new set of ads, making sure they have a few key differences so you can test them against one another. The work of the A/B tester is never done.

Eventually you’ll develop a knack for what works, and build a collection of ad element combinations that connect with your audience. Although it’s tempting, you can’t just find what works once and run with it forever. The winning combination is a moving target, and you’ll need to keep optimizing to get the best results. Take that approach, and your future campaigns will be set up for success.