September 10, 2019

How to Make Effective Emails With The Power of Science!

[Image: scientists collaborating over beakers full of colorful chemicals]

If you're looking to create an email campaign, you're in the right place, but not the only place! We're going to be talking about how to write the main body of an email campaign here, but you should also look at our article on how to make sure your message makes it into their inbox and gets clicked in the first place: in other words, how to bypass spam filters and get the audience's attention.

Here in part two, we'll assume you've made it into their inbox, and now we're going to work on making it stick, turning your foothold on the reader's attention into a successful email campaign!

Borrowing from Science to Create Successful Email Campaigns

As we mentioned at the top, this is part two on how to create a successful email campaign, from start to finish. Here, we're going to talk about what you do to ensure success while actually writing your campaign, and we're going to borrow some ideas from what scientists call the scientific method.

What this means is that we're going to draw from an established body of knowledge that tells us how to ask specific, targeted questions about how something we do will affect the world. Then we set up a scenario where we do that something, and use the results of that action to answer the question we started with.

Questions → tests → results → answers! That's how it goes. We're going to use that series of steps to ask questions about your efforts to deliver a message to your audience, and then we're going to look at what happens and use that to answer the question.

Also, we're not dismissing the human element in all this; a bit of emotional intelligence goes a long way toward being an effective membership manager. The secrets of effective communication can't be broken down to an exact science; you won't always find your way to an audience without a healthy dose of insight and intuition in the mix, too.

Soft skills are important, but that doesn't mean that an analytical approach won't get you results!

Asking a Question

The first thing we're going to borrow is the idea of a hypothesis: to scientists, this is a specific question that you ask before you run an experiment, and the answer you give to that question based on your best educated guess. You look at the facts and use them to predict something else that might happen or come true, then you run the experiment and see whether your guess was right or wrong.

How does that apply here? Well, we're going to talk about a few different ways to improve your campaign by using certain kinds of tests, but for those tests to tell you what the most successful approach is, you have to define success first.

What does success mean to you? What are you measuring: clicks? Views? Donations? You may want all three to be the highest they can be, but that's not a given, so you need to define success; some of those answers might be more important than others.

Pick the metric(s) that will be most important to you in the context of the changes you're about to make, because you'll need to make that decision in order to judge your results!
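To make this concrete, here's a minimal sketch in Python of turning raw campaign counts into the rates you'd actually compare. The numbers and field names are invented for illustration; your email or membership platform will have its own labels for these figures.

```python
# A minimal sketch: turning raw campaign counts into comparable metrics.
# The numbers and field names here are illustrative, not real campaign data.

campaign = {
    "emails_delivered": 5000,
    "emails_opened": 1450,
    "links_clicked": 380,
    "donations_made": 42,
}

open_rate = campaign["emails_opened"] / campaign["emails_delivered"]
click_rate = campaign["links_clicked"] / campaign["emails_delivered"]
donation_rate = campaign["donations_made"] / campaign["emails_delivered"]

print(f"Open rate:     {open_rate:.1%}")      # views
print(f"Click rate:    {click_rate:.1%}")     # clicks
print(f"Donation rate: {donation_rate:.1%}")  # donations
```

Whichever of these rates you pick as your yardstick, pick it before you send anything, so the test can't quietly change what "winning" means halfway through.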

Running a Test

Now you have to run a test to measure your metric of choice. What does this mean? Well, there are two kinds of tests you could be running: split testing and A/B testing. The two sometimes get used interchangeably, but there's an important difference:

Split testing consists of taking two very different versions of a message, advertisement, or landing page, and exposing two different audiences to one version each, then measuring which one gets a better response.

A/B testing is best done after you've picked the best overall version through split testing: this process means making small tweaks or variations in the text, layout, or graphics, then sending the two versions to two different groups.

Both of these processes involve making a change to your content and then comparing the two versions across two different groups, but one is about big changes and the other is about small ones. It makes sense to find the most successful large-scale change through split testing, then use A/B testing to figure out which details work best.

Be sure not to compare more than two different versions at once in a split test, as the most powerful tests will only check one version against the version you're using already. After you've picked something through split testing, checking several minor variations in the same round of A/B testing is possible, but it becomes harder to interpret the results that way, so be careful!
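If you're curious what the split itself looks like under the hood, here's a rough Python sketch: shuffle your subscriber list and divide it into two comparable groups, one per version. The addresses below are placeholders, and in practice your email or membership platform will usually handle this randomization for you.

```python
import random

# A rough sketch: randomly split a subscriber list into two equal groups,
# one for each version of the message. The addresses are placeholders.
subscribers = [f"member{i}@example.org" for i in range(1, 1001)]

random.shuffle(subscribers)          # randomize so the two groups are comparable
midpoint = len(subscribers) // 2
group_a = subscribers[:midpoint]     # receives version A (your current message)
group_b = subscribers[midpoint:]     # receives version B (the challenger)

print(f"Group A: {len(group_a)} members, Group B: {len(group_b)} members")
```

The shuffle is the important part: if one group is all long-time members and the other is all newcomers, you're measuring the audiences, not the message.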

Understanding the Results

Now we have to circle back to something we discussed earlier: the "hypothesis", or the initial question you set out to answer! You defined a metric earlier (a measurement you'd use to judge the success of your changes) and now it should colour how you read the results.

Did your split test produce more clicks for one version or the other? Did those small A/B tweaks make members more likely to click and donate, or did they leave the results unchanged? Maybe a small change only made a difference in one version of the overall message but not in the other!

You can ask other questions with this information, too, and they become easier to answer depending on the analytics available to you through your membership management tools: did a particular change have more of an effect on one demographic, industry, or group than on another?
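If your tools let you export one row per recipient, a quick sketch like this shows the kind of comparison we mean: overall click rate per group, then the same comparison broken down by segment. The records and segment names here are invented; swap in whatever your analytics actually give you.

```python
# A small sketch of reading the results back: compare click rates between the
# two groups, then break the comparison down by a segment such as membership type.
results = [
    {"group": "A", "segment": "student",      "clicked": True},
    {"group": "A", "segment": "professional", "clicked": False},
    {"group": "B", "segment": "student",      "clicked": True},
    {"group": "B", "segment": "professional", "clicked": True},
    # ... one record per recipient, exported from your email/analytics tool
]

def click_rate(records):
    # Fraction of recipients in these records who clicked (0.0 if empty).
    return sum(r["clicked"] for r in records) / len(records) if records else 0.0

for group in ("A", "B"):
    members = [r for r in results if r["group"] == group]
    print(f"Group {group} overall: {click_rate(members):.1%}")
    for segment in sorted({r["segment"] for r in members}):
        subset = [r for r in members if r["segment"] == segment]
        print(f"  {segment}: {click_rate(subset):.1%}")
```

Keep an eye on group sizes, too: a big swing in a tiny segment is much less trustworthy than a small swing across your whole list.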

Using What You've Learned

Now you can draw some conclusions about how effective your large-scale changes (the ones you split tested) and your small tweaks (the ones you A/B tested) are at reaching certain segments of your membership base and at moving the metric you set out to measure.

What's especially important, once you've identified some trends and drawn some conclusions, is to document your work. You won't want to have to go back and do this analysis again, so write down the things you discover all in one place so that you have a go-to resource for designing content for your membership base!
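A shared spreadsheet is perfectly fine for this. Purely as an illustration, here's a tiny Python sketch that appends each test's outcome to a running CSV log; the file name, columns, and values are all placeholders, not a prescribed format.

```python
import csv
from datetime import date

# A simple sketch of documenting what you learned: append one row per test to a
# shared CSV file so future campaigns can start from past results.
record = {
    "date": date.today().isoformat(),
    "test_type": "split test",
    "change": "shorter subject line",     # placeholder description
    "metric": "click rate",
    "version_a": "4.2%",                  # placeholder results
    "version_b": "5.1%",
    "winner": "B",
    "notes": "Biggest lift among newer members",
}

with open("email_test_log.csv", "a", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=list(record.keys()))
    if f.tell() == 0:          # write the header only if the file is new/empty
        writer.writeheader()
    writer.writerow(record)
```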

You can then apply what you've learned and try another round of testing on the next mail-out, to further improve your results!

Conclusion

We can learn a lot from the way scientists do things; they're using a tried and tested system for figuring out what will achieve a desired result and what won't, and you owe it to yourself to put that system to work for you, too. Science is for everyone, not just scientists, and that includes membership managers. Good luck with the testing!
