In my analysis time at work, I’ve been batting around the idea of “unengagement”: a metric that identifies individuals who have received a given number of email campaigns, are still eligible to receive campaigns [1], and have never clicked a link in any of them.
The industry standard, per Epsilon Interactive, defines an engaged user as someone who has opened an email campaign or clicked on a link.
I believe that metric has too much grey area to be useful.
The first issue is the preview pane. Lyris found that 9 in 10 email users have access to a preview pane in their email client, and 7 in 10 actively use it. When someone scrolls through their inbox, a message shown in the preview pane can register as an “open” even if the recipient never actually reads it.
Second, there is no way to accurately measure opens for recipients who read the text-only version of the email, since open tracking typically relies on loading an embedded image, which plain-text email cannot do.
Both of these factors muddy the waters significantly, so I’ve concluded it is better to flip the question and identify users who are “unengaged.”
To qualify under my current system, a recipient must, over the life of the program, have been sent more than six campaigns, have never opted out, and have had fewer than four delivery failures. [2] Additionally, they must never have clicked on a link.
I set this up with the understanding that opening an email is no indicator of engagement, but clicking a link is.
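The qualification rule above can be sketched as a simple predicate. This is a minimal illustration, not the system I actually use; the `Recipient` record and its field names are hypothetical stand-ins for whatever your email platform exposes.

```python
from dataclasses import dataclass


@dataclass
class Recipient:
    """Hypothetical per-recipient stats over the life of the program."""
    sends: int              # number of campaigns sent to this recipient
    opted_out: bool         # has the recipient ever opted out?
    delivery_failures: int  # lifetime failure-to-deliver count
    clicks: int             # total link clicks across all campaigns


def is_unengaged(r: Recipient) -> bool:
    """A recipient is 'unengaged' if they were sent more than six
    campaigns, never opted out, had fewer than four delivery
    failures, and never clicked a link."""
    return (
        r.sends > 6
        and not r.opted_out
        and r.delivery_failures < 4
        and r.clicks == 0
    )


# Sent ten campaigns, still eligible, never clicked: unengaged.
print(is_unengaged(Recipient(sends=10, opted_out=False,
                             delivery_failures=1, clicks=0)))  # True

# Same history but with two clicks: engaged.
print(is_unengaged(Recipient(sends=10, opted_out=False,
                             delivery_failures=1, clicks=2)))  # False
```

Note that opens are deliberately absent from the rule, for the preview-pane and text-only reasons above.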
Ideally, in the future, I’d find a system that showed not only how many times a recipient opened an email but how much time they actively spent looking at it. Then opens could serve as an indicator of engagement, filtered by a threshold on time spent.
How are you measuring the engagement of your mailing list?
1. Eligibility is defined as not having opted out and having had fewer than four delivery failures.
2. In the program at my employer, the threshold for delivery failures is four; after that, the system will no longer send to that recipient without an override during the scheduling process.