What happens when you have a 25K email list and a campaign for it?

  • By Idan Carmeli
  • 22 Aug, 2017

We ran wild with some A/B tests for one of our client campaigns, and learned some good things, which might surprise even veteran B2B marketers.

By ‘good’ I mean ‘nerdy’ and by ‘things’ I mean ‘experiments’. But let’s start at the beginning. One of our clients, a classic B2B multinational firm, uses data to track which product line would be of interest to a specific contact in their database. They do it so they can send better-targeted information in their campaigns, direct inquiries to the appropriate product people internally, and because I told them to.
Over time, the company’s marketing team has developed increasingly smart tactics for collecting this data point. For example, they run periodic product-line-specific webinars, so every person who expresses interest in a certain webinar is automatically tagged with the associated product line. Still, for a big chunk of their historic data set, this piece of information is missing. To fill the gap, our client concocted a simple tactic: send an email campaign inviting people in its general list of non-tagged records to choose to receive only information relevant to them in the future, and use their selections to fill in the missing data.
So the team took to the drawing board, and the result was an aesthetically pleasing HTML email which I’ve had to blur here to protect sensitive information:
First version: HTML with full intro paragraph and large image links
Naturally, with such a large list, we could not resist the temptation to run a few experiments. So instead of sending the above to the whole list, we sent it to about 10% of it. Once the results started to come in, we set about figuring out which experiments to run.
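If you want to see what such a pilot split looks like in practice, here’s a minimal sketch in Python. The file name and one-address-per-line format are our own assumptions for illustration; the real list lives inside the client’s marketing automation platform, which has its own sampling features:

```python
import random

# Minimal sketch: carve a random ~10% pilot group out of a ~25K contact list.
# "contacts.txt" (one email address per line) is a made-up file name.
random.seed(42)  # fixed seed so the split is reproducible

with open("contacts.txt") as f:
    contacts = [line.strip() for line in f if line.strip()]

pilot_size = int(len(contacts) * 0.10)           # ~2,500 out of 25,000
pilot = set(random.sample(contacts, pilot_size))
holdout = [c for c in contacts if c not in pilot]

print(f"Pilot send: {len(pilot)} recipients, holdout for later tests: {len(holdout)}")
```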
The obvious starting point was the subject line, because our initial experimental shipment was met with an underwhelming 12.3% open rate. But the click-through rate was also unimpressive (6.1%), which meant that even those who did open the email were not excited about its calls to action. So we worked out a different subject line, hypothesized that the original email body’s opening text section was too long and wordy, shortened it, and shipped it out to another random selection of 5% from the main list, which ended up being close to a thousand recipients.
Turns out our alternative subject line performed exactly the same: we were still flatlining at 12.3% opens. However, CTR jumped to 11%, meaning it performed 80% better than the original. Sadly, though, the sample sizes, i.e. the number of people who opened the email and were thus able to decide whether or not to click, did not yield statistical significance. Still, since the average CTR was unimpressive (8.5%), we decided to experiment on both the subject line and the email’s body, as we now had a decent idea about the independent performance of each component.
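If you’d like to sanity-check significance claims like this yourself, a two-proportion z-test on the click counts is enough. The sketch below uses illustrative counts that only roughly match the rates quoted above; they are not our actual raw numbers:

```python
from math import erf, sqrt

def two_proportion_z_test(successes_a, trials_a, successes_b, trials_b):
    """Two-sided z-test for the difference between two conversion rates."""
    p_a, p_b = successes_a / trials_a, successes_b / trials_b
    p_pool = (successes_a + successes_b) / (trials_a + trials_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / trials_a + 1 / trials_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided, normal approx.
    return z, p_value

# Illustrative counts only: ~6.1% CTR among ~307 openers vs ~11% among ~130.
z, p = two_proportion_z_test(19, 307, 14, 130)
print(f"z = {z:.2f}, p = {p:.3f}")  # p is well above 0.05, so no significance
```

With opener samples this small, even a jump from 6.1% to 11% doesn’t clear the bar, which is exactly what we saw.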
So for our next experiment, we hypothesized that text links would work better than images:
Third version: HTML with shorter intro, text links
Turns out we were right! Text links converted 71% better than images, and this time around, we had statistical significance. Yay!
We could have stopped there, but decided, what the heck, our remaining list was still very large, so let’s run another experiment. Our final hypothesis was that a plain text email with text links would perform even better than an HTML email with text links. Can you guess what happened?
Last version: plain text email
Our final shipment yielded the best CTR to date: 17.4%. This was 39% better than the previous result, although, once again, the difference wasn’t statistically significant. Strangely, even though we hadn’t changed the subject line, open rates also increased, from 13.1% to 16.8%, a statistically significant difference considering the number of emails we sent in each test.
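Running the same helper from the earlier sketch on opens over total sends (again with purely illustrative counts, assuming roughly equal send sizes for the last two versions) shows why this difference clears the significance bar while the smaller click-through samples didn’t:

```python
# Reusing two_proportion_z_test from the sketch above, now on opens over sends.
# Counts are illustrative only: ~13.1% vs ~16.8% open rate on ~5,000 sends each.
z, p = two_proportion_z_test(655, 5000, 840, 5000)
print(f"z = {z:.2f}, p = {p:.1e}")  # comfortably below 0.05
```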

So what have we learned?

In terms of statistically significant results:
  • For similarly purposed emails, text links converted better than image links
  • The same subject line earned a higher open rate on plain text emails than on HTML-based ones
Results that fell short of significance, yet were consistent with our past experience, indicated that plain text emails also outperformed HTML emails in terms of click-throughs.
Most importantly, the changes between the initial send and the final ‘winner’ resulted in a significantly better outcome in terms of the campaign objective (in our case, the number of recipients indicating their interest area), and, nerdy stats aside, that is the only thing that truly matters.
