Why, How To, and Limitations of A/B Testing
Personal Experience With A/B Testing
A/B testing, also known as split or bucket testing, is a method of comparing two versions of a creation against each other to determine which one performs better. It is ultimately an experiment: two different versions of something, such as a web page, are randomly shown to users, and the results determine which version performed better. By testing, you eliminate guesswork and back your findings with actual data. Running an A/B test lets you test hypotheses, focus on specific changes you want to make, and collect data to see the impact of those modifications. A/B testing is also beneficial because you can keep using it over time to continuously improve the user experience.
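The random split described above can be sketched in a few lines of Python. This is only an illustration of the idea (tools like Mailchimp handle the assignment for you); the function name and the 100-user audience are made up for the example.

```python
import random

def assign_variants(user_ids, seed=42):
    """Randomly split an audience into two equal-sized buckets, A and B.

    Illustrative sketch only: a fixed seed is used so the split is
    reproducible for the example.
    """
    rng = random.Random(seed)
    shuffled = list(user_ids)
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return {"A": shuffled[:half], "B": shuffled[half:]}

# Hypothetical audience of 100 subscribers, numbered 0-99
buckets = assign_variants(range(100))
print(len(buckets["A"]), len(buckets["B"]))  # 50 50
```

Because each user lands in a bucket at random, any difference in opens or clicks between the two versions can be attributed to the versions themselves rather than to who happened to receive them.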
The software we used for our A/B testing was Mailchimp. To run an A/B test in Mailchimp, you begin by creating a template. Next, choose your theme, such as a newsletter, events, or e-commerce. From there, you create a campaign for an email, an ad, or a landing page; for this example, we will use an email. After making your selection, you can import email addresses, specify the sender and subject line, and import the template you created. Once all of this is completed, you can send out your test and begin watching the results come in.
There are a few limitations to A/B testing, though, including time (tests need to run long enough), the number of participants, knowledge (interpretation and statistical skills), and, lastly, preference versus behavior. For our A/B test, we wish we had had a longer time period in which to send our emails, so we could compare results and behaviors across more widely spaced sends. The number and type of participants can also be a limitation: our participants were only first-year marketing students at Western. With a broader and larger pool of participants, our results could have differed from what we received this time. Lastly, we could not tell whether people preferred our newsletter because of the information it provided or whether other factors motivated (or failed to motivate) them to read it. Since our participants were college students themselves, we felt they would be interested in the information we provided.
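The "statistical skills" limitation above is worth making concrete: deciding whether version A really beat version B usually means running something like a two-proportion z-test on the open rates. The sketch below uses only the Python standard library, and the open counts in it are hypothetical numbers invented for the example, not our actual results.

```python
import math

def two_proportion_z(opens_a, sends_a, opens_b, sends_b):
    """Z-statistic comparing two open rates, using a pooled proportion."""
    p_a = opens_a / sends_a
    p_b = opens_b / sends_b
    p_pool = (opens_a + opens_b) / (sends_a + sends_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sends_a + 1 / sends_b))
    return (p_a - p_b) / se

# Hypothetical example: version A opened by 15 of 50, version B by 9 of 50
z = two_proportion_z(15, 50, 9, 50)
print(round(z, 2))
```

With these made-up numbers the z-statistic comes out well below the 1.96 threshold for significance at the 5% level, which is exactly the small-sample problem described above: with few participants, even a visible gap in open rates may not be statistically meaningful.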
In the end, our newsletter was successful but has room for improvement. We had 15 opens and three clicks into our articles. Given the 15 opens, our title and subject line were intriguing but could have been more convincing. As for the three article clicks, we could have rewritten our short summaries to make them more appealing without giving away the entire article, so that readers still want to go on and read more. Attached below is a link to our newsletter, along with a link to my post about other types of software you can use to create your own newsletter.