Imagine you’re working on a new email campaign and are unsure which subject line will get more people to open the message. Say you’ve got two good options. Instead of tossing a coin or going with your gut, you try them both: half your audience gets version A, the rest get version B. Whichever one performs better wins. That’s the core idea behind an AB test.
An AB test (sometimes written as A/B test) is a structured and straightforward way to compare two versions of something to see which performs better. It could be a web page, a call-to-action button, an ad, an email subject line, or even the colour of a sign-up form. You change one element between A and B, keep everything else the same, and then measure which version leads to more conversions, clicks, signups, or whatever metric matters to your goal.
At its heart, AB testing is about learning what works, based on real behaviour, not assumptions. That’s why it’s such a valuable marketing and product development tool. Instead of relying on opinions or best guesses, you’re running experiments with users to find out what resonates most.
AB tests are usually run using an analytics or marketing platform that can split traffic randomly and track results. Over time, you collect enough data to tell, with statistical confidence, which version performs better. The key detail here is randomness: each person has an equal chance of seeing either version, so the test results are fair and meaningful.
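To make the random split concrete, here is a minimal sketch of how a platform might assign each visitor to a variant. The function name and test name are hypothetical; the key idea is that hashing the user ID gives every person an equal chance of landing in A or B, while repeat visits by the same user always get the same variant.

```python
import hashlib

def assign_variant(user_id: str, test_name: str = "subject-line-test") -> str:
    """Deterministically assign a user to variant A or B with a 50/50 split."""
    # Hash the user id together with the test name so the same user
    # always sees the same variant, but different tests split independently.
    digest = hashlib.sha256(f"{test_name}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100  # a number from 0 to 99
    return "A" if bucket < 50 else "B"

# Every user lands in exactly one group, and repeat visits are stable.
print(assign_variant("user-42"))
```

Because the assignment is deterministic, a returning subscriber never flips between versions mid-test, which would muddy the results.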
Marketers use AB tests to improve email open rates, landing page conversion rates, ad click-throughs, and customer engagement. Product teams use them too, to test features or design changes. If you’ve ever noticed a button move location or a slightly different version of a website, it might have been part of an AB test.
Let’s say you’re responsible for boosting newsletter signups on a blog. You want to test whether changing the wording of the prompt from “Sign up to get updates” to “Join 10,000+ readers who get our monthly tips” makes a difference. You run an AB test and find that the second version lifts signups by 15%. That’s a small change with a measurable result, and now you’ve got data to back your decision.
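How do you know a 15% lift is real and not just noise? A common check is a two-proportion z-test. The visitor and signup counts below are hypothetical numbers chosen to match the 15% lift in the example, not data from the source.

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-proportion z-test: how unlikely is B's lift if A and B were equal?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)          # overall conversion rate
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se

# Hypothetical: 5,000 visitors per variant, 400 signups for A
# vs 460 for B (a 15% relative lift).
z = two_proportion_z(400, 5000, 460, 5000)
print(round(z, 2))  # prints 2.14
```

A z-score above roughly 1.96 means the difference is significant at the 95% confidence level, so in this sketch the lift would be worth acting on.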
Here are a few things to keep in mind with AB testing: change only one element at a time so you know what caused the difference, split your audience randomly so both groups are comparable, and let the test run long enough to collect statistically meaningful data before declaring a winner.
AB testing lets the audience decide in a world where everyone has an opinion on what “should” work. It removes guesswork from the equation and replaces it with evidence, giving marketers a clearer path to better results.