Improving customer experience with A/B testing

A/B testing is a simple experiment that compares two versions of the same webpage by showing them to live visitors, who are randomly assigned to see either the current version (A) or the updated version (B). By analyzing visitor behavior in each variant, you can determine which one performs better and should therefore be kept. The goal is to test planned changes before rolling them out fully.
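To make "performs better" concrete: a common way to compare the two versions is a two-proportion z-test on their conversion rates. The sketch below uses only Python's standard library, and the visitor and booking figures are hypothetical, purely for illustration:

```python
import math

def conversion_rate(conversions, visitors):
    """Fraction of visitors who completed the goal (e.g., booked a stay)."""
    return conversions / visitors

def two_proportion_z_test(conv_a, n_a, conv_b, n_b):
    """Two-sided z-test for a difference between two conversion rates.

    Returns the z statistic and its two-sided p-value under the
    pooled-proportion null hypothesis (no difference between A and B).
    """
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided normal tail
    return z, p_value

# Hypothetical traffic split: 5,000 visitors per version.
# Version A: 200 bookings; Version B: 260 bookings.
z, p = two_proportion_z_test(200, 5000, 260, 5000)
print(f"CR A: {conversion_rate(200, 5000):.1%}, CR B: {conversion_rate(260, 5000):.1%}")
print(f"z = {z:.2f}, p-value = {p:.4f}")
```

A small p-value (conventionally below 0.05) suggests the difference between the two versions is unlikely to be random noise, so version B's lift can be trusted enough to implement.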

What to test

When testing which version performs better, it is important to choose an appropriate goal, i.e., the metric you aim to improve. As businesses primarily want visitors to take (more) action on their website, they usually aim to increase the conversion rate (CR), e.g., the share of website visitors who actually book a stay at the hotel. You can also test the number of clicks on the pop-up window of your new marketing campaign, or the number of views of your conference amenities page.

Since there are many experiments you could be running, it is natural to ask whether your resources are devoted to the right one, so prioritize your A/B tests. For example, the checkout page is usually given higher priority than the product features page because it is closer to a conversion. Prioritization should also be driven by your expectations of which product updates are most likely to succeed, and by how challenging they would be to implement.

Identify meaningful updates

Heatmaps and Google Analytics data can be examined to identify problems on your website, i.e., the visitors’ pain points. For example, some webpage elements, such as a video demo or a crowded navigation bar, might be distracting users from the main content. Sometimes even a small change, such as relabeling the “call-to-action” button from “Book” to “Check Availability”, or simply repositioning it and changing its visual design, can lead to a significant increase in the number of clicks or even bookings.

Not limited to web pages

When it comes to email marketing campaigns, headlines are particularly worth A/B testing. Before running an actual test, you can estimate how engaging your headline is with a tool such as the Headline Analyser. The time at which an email is received can also affect response rates and may be worth testing.

Expand A/B testing to the amenities

Provide two different sets of toiletries, welcome gifts, or snack baskets, and measure how each affects guests’ experience (you can start by checking whether the toiletries were used or the snacks eaten). This kind of A/B test should take into account different market segments – Millennials, Gen X, and Baby Boomers – which are generally known to have different needs and preferences for in-room amenities.

 
