We just reached the 50 campaign / destination milestone! So, with some specific focus on our second month which has also recently ended, we thought we would expose our data … and share what we learned from those 50 campaigns.
- Sharing our list growth and engagement rates from Month 2
- Asking “So flipping what?” – what might be learnt from our data?
- Some specific notes on Churn rates, Unsubscribes, List Fatigue, and Engagement over time
- You can experience EveryDaydream Holiday with a free subscription to our daily destinations email
If you’ve just found this article:
- Why are we sharing all of this information with you? Because we love articles that open the kimono to other web businesses, and we wanted to share our data and experience with those keen to learn from it.
- This is Month 2, Part Two. Month 2, Part One (website traffic and revenue data) is here; Month 1 can be read from Part One or just Part Two.
- EveryDaydream Holiday is a travel entertainment startup – We craft a travel story so real that it feels like your own endless vacation, and deliver it to your inbox five days a week.
Come here from Hacker News? This is a longform post – if you want to ensure there’s a useful HN discussion (like I do), consider upvoting this post now before it drops off the new page.
WANT TO SEE OUR EMAILS IN SITU?
If it helps you to visualise our emails – which differ in design and content length from our daily posts – here are some examples:
- Our very first campaign – Ancient Rome, all the content from our daily post
- Our first template redesign – Marlborough Wine Tour, larger pictures and very little text
- Our current template design – Houston, adding back in more text
My intention here is to first share the raw data, because I know some people find value in this (especially as a comparative exercise). The ‘So What?’ (and WIIFM) comes from our more in-depth look at the data further down – click here to go directly there.
Four days into this month, my co-founder and I were having a text message discussion. At the time, we noted that after Month 1 we had added only 22 new subscribers – not even one a day. He was making a point about consistency – “We need to ask where do we get tomorrow’s subscriber from?”
I made the point that at 200 subscribers, you can statistically expect 1 unsubscribe every single time you send out an email*. Since we send a daily email (weekdays), finding 1 new subscriber each day would keep us steady, at best. My response was “We need to ask where are we getting 50 subscribers tomorrow?”
The next day, we had 55 people sign up.
This graph shows our cumulative number of daily subscribers since we launched on 12/12/2012. You’ll see that the first month was one of steady, though uncertain, growth – we were generally moving upwards, with a particular spike around Christmas**. And then, BANG, on the 17th of January we met my hypothetical 50-subscriber challenge.
The reason for this is detailed in Part 1 – it was our “200 Hours Effort” post going ‘viral’***. The impact of that on our site traffic and subscriber numbers lingered for about a fortnight – during which period we took our eyes off the ball a little. The last fortnight of the month, however, things trail off. That hurt – and we’re addressing that, so next month’s data will be rosier!
- * This is based on some Vision 6 and MailChimp email marketing research, backed up by my own experience over 11 years of email marketing. An unsubscribe rate of 0.5% is about normal.
- ** We’d like to say the Christmas spike was everyone spreading the word. In reality, it was mostly us seeing family.
- *** I’m mindful that we’re sharing this data, not because we’re super massive and successful, but to help those who are also starting out. For us, 50 subscribers and 10,000 visitors in a day is ‘going viral’ – Buzzfeed probably sacks writers who hit numbers that low.
A reminder of the two Engagement Rates we use: “Open” and “Opened Click Through Rate”.
- Open Rate is pretty straightforward – what percentage of people who were sent the email actually opened it.
- Open CTR is the number of people who clicked through as a percentage of how many people OPENED the email.
- This is slightly different to normal CTR, which measures clicks as a percentage of the number sent, and is therefore (in my opinion) less useful.
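To make the three measurements concrete, here’s a minimal sketch in Python – the numbers are invented, purely for illustration:

```python
# Three engagement measurements for a single (invented) campaign.
sent = 200     # emails delivered
opened = 70    # unique opens
clicked = 14   # unique click-throughs

open_rate = opened / sent    # opens as a share of sends
open_ctr = clicked / opened  # clicks as a share of OPENS
normal_ctr = clicked / sent  # clicks as a share of sends

print(f"Open Rate:  {open_rate:.1%}")   # 35.0%
print(f"Open CTR:   {open_ctr:.1%}")    # 20.0%
print(f"Normal CTR: {normal_ctr:.1%}")  # 7.0%
```

Note how the normal CTR can look healthy simply because the Open Rate is high – which is exactly why we prefer Open CTR as an engagement measure.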
Here’s how we trended this month against both measurements.
And then here are two graphs comparing us to Travel and Transportation Industry Averages, provided by our email service provider MailChimp.
These comparisons are not perfect, of course – they lump our daily travel entertainment email in with weekly travel deal sites, hotel newsletters etc. It’s not apples v apples, but it’s what we have.
Our Open Rate, while declining, remains strong. In fact, the closest it came to the average was our Kyoto email, which Gmail (and likely others) marked as spam – days like that sting when you have a daily product, though of course you always have tomorrow.
Our Opened Click Through Rate, also declining, is not as successful. Our struggle here – and we experienced this last month – is finding the balance between providing a self-contained amazing daily email, and driving people to our site to read more. I’ve always believed that a content email ought to have value in and of itself – people don’t want to read emails they know will simply force them to visit a website. Our current email template design, amended to add more text, is part of the reason for declining numbers at the end of this month.
This made it less likely readers would click through to the website, but we believe (and must now test) that they will enjoy the email more as a result. If that’s the case, then Open Rates could be expected to improve.
- Our normal CTR, when graphed, looks more like our Open Rate graph than the Open CTR one.
- If we just used that metric, we would be patting ourselves on the back. But it fails to filter out Open Rate as a factor – more opens mean more click-throughs. So while it’s a useful measurement for gross traffic numbers, it’s misleading and risky to use it as a measure of engagement.
SO FLIPPING WHAT?
Right, Raw Numbers are only useful if you want to put in the effort to compare. So here’s our ‘So What’ – we hope it also gives you some guidance for your email marketing efforts, with the caveat that you can never assume what is the case for one business will apply to another.
IF YOU LEAVE ME, CAN I COME TOO?
We wanted to know 3 things from our Unsubscribe data:
- What is our list churn?
- How long, on average, do readers stay before they unsubscribe?
- Is there a tipping point from the data that can help us avoid unsubscriptions?
1) What is our list churn?

This is a simple metric that asks ‘How long does it take for our list to replace itself?’ Some people will unsubscribe quickly (‘this isn’t what I expected’) and some people will never unsubscribe (thanks mum!), but on average, how long would it take for our entire list to leave?
This is a simple calculation – we’ll do it just for month 2.
How many people left this month as a percentage of the remaining list.
(49 / 166 = 29.51%)
And then how many months would it take for that percentage to reach 100%?
(100% / 29.51% = 3.38)
Based on this data, our list will turn itself over every 3.38 months. That, incidentally, is awful for a business like ours! Our initial goal, not based on experience, was to make that figure 12 months* – we have a long way to go.
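The churn arithmetic above takes only a few lines (figures straight from the post):

```python
# Month 2 list churn.
unsubscribed = 49  # people who left this month
remaining = 166    # list size at the end of the month

monthly_churn = unsubscribed / remaining  # ~29.5% of the list per month
months_to_turn_over = 1 / monthly_churn   # ~3.4 months to replace the list

print(f"Monthly churn: {monthly_churn:.2%}")
print(f"List turns over every {months_to_turn_over:.1f} months")
```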
It’s also important to track that over time. In fact, if we do the analysis for our first 50 campaigns, the churn rate is 8.9 months. I won’t draw any conclusions until we have the data from more months.
Do you know your churn rate?
- * MailChimp notes that the average list loses 1/3 – 1/2 of its subscribers each year – by the definition above, a churn rate of 24-36 months. HOWEVER, I note Newsletter Directory says the average account sends 42 campaigns per annum.
- If I combine those figures, we could deduce an average churn of 84-126 campaigns. We’re on track for ~180 campaigns, which shows either that our content is really engaging or that the flaws in this comparison are gaping!
2) How long, on average, do readers stay before they unsubscribe?
This is also a simple calculation. I’m going to use a median for this, because over time I don’t want long term subscribers or confused subscribers who leave immediately to unduly impact the figure.
For EveryDaydream Holiday, the median length of stay is 13.5 days. (If we used the mean average, by the way, the figure would be 16.8 days. Again, let’s see how both those numbers move over time before drawing any conclusions.)
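To illustrate why the two averages diverge, here’s a sketch with invented tenures (in days) – not our real data:

```python
import statistics

# Invented days-subscribed-before-unsubscribing, with a couple of
# long-stayers skewing the tail.
tenures = [2, 3, 5, 9, 12, 14, 15, 20, 31, 58]

print("median:", statistics.median(tenures))  # resistant to the outliers
print("mean:  ", statistics.mean(tenures))    # dragged upwards by 31 and 58
```

The longer the tail of loyal subscribers grows, the further the mean drifts above the median – which is why we track both.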
The wider data set here is kind of spiky, based largely on our small size at this point in time. (If our list size were 10x as large, 1-2 unsubscribers moving around wouldn’t affect things; here they create spikes and valleys.)
3) Is there a tipping point from the data that can help us avoid unsubscriptions?
I was hoping this data would show us something really interesting – like a lot of subscribers in the first X days, then a lull; or a significant spike after Y days that we might be able to prevent in some engaging way. But there’s nothing so dramatic in the data, based on where we’re at today.
Not being Nate Silver, I’m not going to draw many statistical inferences from this chart or the trendline in it. I will make two points:
- There’s a clear trend downwards – this means that the longer people stay subscribed, the less likely they are to unsubscribe.
- We’ve only had one subscriber stay with us for more than 6 weeks and then later leave.
- Part of the reason for this is that we are a very young website.
You could draw the statistical inference here that if people are subscribed for more than 58 days they will NEVER leave. In reality, only 87 people (about half our current subscribers) have been with us for that long, and that data set is friends and family heavy.
I wanted to overlay on this the length of time current subscribers have been around. But that comparison proved to be unhelpful, because a majority of our readers came in two spikes – at launch (thanks to our pre-launch survey) and when our “200 Hours Effort” post went viral. Again, with time (and list growth) this will be useful.
List fatigue happens when subscribers “become bored with your email content”.
Here’s a graph of reader engagement BY DAY. That is, we normalise all of our subscribers back to Day 1 (the first email they receive) and then compare the daily averages. While it’s useful to know that ‘Toasting Marshmallows over Lava‘ is more engaging than ‘Going to Graceland‘ (which we can measure by the Engagement Rates, above), it’s also really valuable to test what happens to your subscribers over time. When, if ever, does List Fatigue kick in?
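The day-normalisation works roughly like this – the open logs below are invented for illustration; the real ones come out of MailChimp’s per-campaign reports:

```python
from collections import defaultdict

# subscriber -> opened flags, where index 0 is the FIRST email that
# subscriber received (their "Day 1"), regardless of when they joined.
opens_by_subscriber = {
    "a": [1, 1, 0, 1],  # an early subscriber, four campaigns in
    "b": [1, 0, 1],
    "c": [0, 1],
}

day_totals = defaultdict(lambda: [0, 0])  # day -> [opens, sends]
for history in opens_by_subscriber.values():
    for day, opened in enumerate(history, start=1):
        day_totals[day][0] += opened
        day_totals[day][1] += 1

for day in sorted(day_totals):
    opens, sends = day_totals[day]
    print(f"Day {day}: {opens}/{sends} opened = {opens / sends:.0%}")
```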
You can see an element of that in the trend line, and certainly the drop in open rates after the first week. Honestly, I was expecting to see a much more obvious trend, possibly with a plateau towards the end as we reached a consistent level of people happy to be receiving our daily email ‘forever’.
Instead, after the first week, there’s a fairly steady Open Rate band between 30% and 40%.
There are two major dips – at the very end (campaign #50) and at campaign #26. These are actually the same event: they are the campaigns our two big subscriber cohorts had just received when I produced this data. Because we’ve had two dramatic spikes in subscribers, the current position of those cohorts (at 26 campaigns and 50 campaigns respectively) has an outsized impact on the numbers. As more subscribers move through, these will level out in a real-number sense (ie, campaigns 26 and 50 won’t always look that bad); and we’ll always have to keep an eye on those cohorts, because wherever they sit on the day we produce this report, they will likely create a spike upwards or downwards.
One thing we’ve noticed – although anecdotally, unless we pull apart MailChimp to create that report as well – is how many people use the weekends to catch up on their reading. So our gross Open numbers normally go up over the following weekend – that’s part of the story here, with those dips caused by Wed / Thu / Fri emails awaiting a weekend. [Edit: All of those posts did see improvement over the weekend.]
Here’s a graph that shows what percentage of our subscribers open what percentage of their emails. (The X axis runs from 0%, never open, to 100%, always open our campaign; the Y axis shows the percentage of our subscriber base, including those who have unsubscribed, that match that open rate.)
I always expect 0% to be over-represented in these figures, because of the way email marketing software measures Opens – an ‘open’ only registers when a tiny tracking image loads, so anyone reading with images blocked is invisible. It’s possible for someone to open and read every email, and still never show on our reports.
At the other end of the spectrum, there’s another spike. Fully 6% of our subscribers open every single daily email; 14% open in excess of 90%.
That’s the email marketing game. Our overall Open Rates this month, for example, were sitting in the 30%-40% range. One-half to one-third of these are the same people every day – the remaining group varies by day, from people who open a few a week to the 25% of subscribers who open, on average, fewer than one per week.
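The distribution itself is simple to produce once you have each subscriber’s lifetime open percentage – the figures here are invented:

```python
# Bucket each subscriber's lifetime open percentage into deciles,
# to expose the "never open" and "always open" spikes.
per_subscriber_open_pct = [0, 0, 5, 20, 35, 40, 55, 90, 95, 100, 100]

buckets = [0] * 11  # bucket i covers the decile starting at i * 10%
for pct in per_subscriber_open_pct:
    buckets[pct // 10] += 1

total = len(per_subscriber_open_pct)
for i, count in enumerate(buckets):
    if count:
        print(f"{i * 10:>3}%+ of emails opened: {count / total:.0%} of list")
```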
WHAT ELSE WOULD YOU LIKE TO KNOW?
I could play the email marketing data game all week, but there’s an amazing travel newsletter to publish. If you’d like to be part of this data in the future, you can subscribe to our free emails here.
And I would love to know – what other data would you like me to reveal, next time we open the kimono to EveryDaydream Holiday?
Email me – firstname.lastname@example.org – or leave a comment here or on Hacker News.