Why A/B testing in Email always falls short & what to do about it
Adam Kitchen
Founder @ Magnet Monster - Klaviyo Elite Agency & Content Army | Scaling Personal Brands for B2B Founders on LinkedIn & X
I've concluded that most A/B testing in email marketing is a waste of time.
Yes, hot take, but hear me out for a second: this is based on working with close to 200 DTC brands over email (see my recommended A/B tests).
A/B testing is generally implemented on the front end of the website to convert first-time visitors into buyers.
Experimentation carries weight here because it can plug the gaps in a leaky bucket, leading to tens of thousands in incremental revenue gains.
With email, that's not the case, and there's a very specific reason why.
Your customers already know who you are!
You don't need to run so many experiments to "convince" them to buy from you again.
You just need to stay top-of-mind with engaging content.
After working with close to 200 brands and auditing nearly 500, guess how many times I've come across a brand with a consistent - and effective - A/B testing protocol in place over email that drives incremental revenue?
ZERO.
I'm not kidding. There have been some decent efforts, but the juice is seldom worth the squeeze.
There are a fair few reasons for this, and they're the core reasons why A/B testing consistently fails to generate anything meaningful over email.
I'm including Holdout Tests over campaigns and send-time optimisation in this bucket as well.
Both of these strategies lead to transient findings that seldom hold over the long term and are influenced by constantly shifting variables that distort their credibility.
Bold statements, I know, but they're backed up by enough research at this stage of my career to make the claim.
With that being said, am I against ALL testing? Am I diminishing the impact a testing culture can have?
Absolutely not. I do believe in testing 1 thing, relentlessly.
Pop-Ups!
There are only two ways you can influence the results of your clients: converting more of their visitors into buyers, and growing their email list.
Pop-ups are unique in that they drive both - they build the list and directly influence converting first-time visitors into buyers, which is where the majority of A/B testing effort should be focused.
Pop-ups tend to fall under the jurisdiction of email marketers, and they're a key lever you should pull to drive conversions and build up the brand's database.
There is a huge number of variables you can test with pop-ups across multiple parts of the customer journey, making testing a non-stop treasure trove of incremental gains.
A few (obvious) ideas I recommend testing monthly:
Test your pop-ups relentlessly with a focus on driving conversions and increasing list growth, and keep a diligent log of your findings.
It's the main focus area that will unlock leverage for you as an email marketer and genuinely move the needle.
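To make that "diligent log" concrete, here's a minimal sketch in Python - with purely hypothetical variant names and traffic numbers, not anything prescribed in this article - of how you might record a monthly pop-up test and sanity-check whether the observed lift is meaningful with a simple two-proportion z-test before declaring a winner.

# Minimal sketch (hypothetical names and numbers): a pop-up test log entry
# plus a two-proportion z-test to check whether the lift is meaningful.
from dataclasses import dataclass
from math import sqrt, erf

@dataclass
class PopupVariant:
    name: str       # e.g. "control: 10% off" vs "test: free shipping"
    visitors: int   # unique visitors who saw the pop-up
    signups: int    # email captures (or purchases) attributed to it

    @property
    def rate(self) -> float:
        return self.signups / self.visitors

def two_proportion_p_value(a: PopupVariant, b: PopupVariant) -> float:
    """Two-sided p-value for the difference in conversion rates."""
    pooled = (a.signups + b.signups) / (a.visitors + b.visitors)
    se = sqrt(pooled * (1 - pooled) * (1 / a.visitors + 1 / b.visitors))
    z = (b.rate - a.rate) / se
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal tail x 2

# Hypothetical monthly log entry: incentive test on the welcome pop-up.
control = PopupVariant("control: 10% off", visitors=8000, signups=640)
variant = PopupVariant("test: free shipping", visitors=8000, signups=720)

p = two_proportion_p_value(control, variant)
print(f"{control.name}: {control.rate:.1%} vs {variant.name}: {variant.rate:.1%} (p = {p:.3f})")

The point isn't the maths - it's that every test gets logged with its traffic, its result, and whether the difference actually cleared a significance bar before you roll it out.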
WATCH & LEARN
Are you prepared for the BIG changes coming to email deliverability in February 2024? Watch this video to understand the basics quickly.
The biggest misconception around email deliverability? It's this.
Optimizing Lifetime Value & Profits Through Klaviyo, CRO & Customer Value Optimization || Owner at Polaris Growth
1y · Here's my view: while I do see sample sizes being small for some brands, making random testing changes without giving them any thought is a common thing I've seen - having a good hypothesis in place will help. Even with low numbers you can still run multiple similar tests, e.g. to test which email layout works best; combining the results of these tests will give you a better sample size. Another example would be to test psychology principles. There are many principles you can test, and rewriting copy around a specific principle will give you better insight into what resonates best with customers. More examples: testing which journeys work better - would sending someone to a landing page, a blog page, or a product page work better? Or even send-time testing, which is also a form of testing. The important thing is to always have a good hypothesis to start with, either from insights in the data, ideas from research, or a hunch that there is something you're missing. Testing can lead to new insights, but you can't ignore statistics. Things I normally see go wrong: testing random shit without giving it much thought, low sample sizes, not taking outliers into account, no hypothesis at all. It's not always about winning, it's also about learning!
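To illustrate the "combine the results of these tests" idea above, here's a rough sketch - with entirely made-up campaign figures - of pooling several small, like-for-like layout tests and then running a single significance check on the pooled counts.

# Rough sketch (hypothetical figures): pool several small layout tests into
# one larger sample, then test the pooled difference once.
from math import sqrt, erf

# (visitors, conversions) per campaign for layout A vs layout B.
layout_a = [(1200, 30), (950, 21), (1400, 33)]
layout_b = [(1150, 38), (1000, 29), (1350, 41)]

visitors_a, conv_a = map(sum, zip(*layout_a))
visitors_b, conv_b = map(sum, zip(*layout_b))

rate_a, rate_b = conv_a / visitors_a, conv_b / visitors_b
pooled = (conv_a + conv_b) / (visitors_a + visitors_b)
se = sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
z = (rate_b - rate_a) / se
p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided p-value

print(f"layout A: {rate_a:.2%} (n={visitors_a}) | layout B: {rate_b:.2%} (n={visitors_b}) | p = {p:.3f}")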
Killer B2B FinTech copywriting and marketing strategies (I generate new leads and revenue for B2B FinTech companies)
1y · 95% of A/B testing is, Adam Kitchen.
CRM marketing, Email Flows, Email Campaigns, HTML, CSS, SQL
1y · I agree with you on A/B testing send times - people open their email whenever they want, and it's ridiculous to read statements like "the most popular day to open email is Thursday at 11am."
Turning one-time shoppers into customers for life!
1y · Could not agree more about focusing on creativity instead of spending all our brain power on A/B testing. Whether you're an email marketer or a front-end developer, making amazing, memorable customer experiences will triumph over adjustments to word placement here or there.
Host of the Experimentation Podcast 'From A to B,' and CRO Manager Passionate about Learning-Focused Experimentation.
1 年"Your customers already know who you are!" .... So Amazon, Walmart, Booking.com don't test? Or in-product testing is a waste by that logic? "After working with close to 200 brands and auditing in the region of nearly 500, guess how many times I've come across a brand with a consistent - and effective - A/B testing protocol in place over email that drives incremental revenue? Zero" Is it not possible they don't know how to run an effective Experimentation program? Why do you blame the tool? If I use a hammer to chop an onion, i don't blame the hammer for being an ineffective tool. It seems like this is an experimentation strategy limitation that you're broad scale saying " you don't need to really test for email". To say "don't worry about email testing - only run popup testing for email" seems very short sighted IMO.