Airports worldwide are eagerly embracing ecommerce in line with market forces. From the integration of direct booking capabilities for flights and car parking to the provision of a one-stop travel shop for everything the passenger needs, the sky is the limit. But in order to create a passenger experience that’s memorable for all the right reasons, it’s more important than ever for airports to optimise digital interactions with the customer, drawing on methods perfected in retail. This ensures a strong flow of revenue and a customer who is engaged and happy before they even set off on their trip.
So what makes the difference between a successful ecommerce site that converts readily to sales and builds a base of loyal, happy customers, and one that doesn’t?
A/B Testing for airport ecommerce
Split testing, or A/B testing as it’s often known, is a tactic that is recognised across industries. The process of comparing two versions of a web page or marketing campaign against one another with the goal of determining which performs better has long been a staple technique, and it’s now the most popular conversion rate optimisation method after customer journey analysis. According to Invesp, 71% of companies are running at least two tests every month on websites (77%), landing pages (60%), email campaigns (59%) and paid search (58%). The same study showed that 60% of companies believe A/B testing is highly valuable for conversion rate optimisation, and that 1 out of 8 A/B tests has driven significant change for an organisation.
While A/B testing has long been seen as the answer to conversion optimisation, questions around its future were raised when A/B testing platform leader Optimizely announced the closure of its free Starter plan in February 2018. This move was seen by some market analysts as a sign that ‘reality’ had caught up with the ecommerce industry, and that A/B testing was ‘dead’.
The fact is, many A/B tests are set up incorrectly: data is contaminated by external factors, or the test never reaches statistical significance. Others have no sustainable impact on business and growth. Many fail through insufficient data: a sample size must be set in advance, and only one aspect can be tested at a time. Yes, the tests have the potential to make a huge impact, but Optimizely’s decision urges caution – done badly, A/B testing can do more harm than good.
So let’s have a look at the basic steps for creating valid A/B testing, and some of the common mistakes to avoid…
How does split testing work?
When a page or campaign is tested, it exists in two distinct formats. One acts as the ‘control’ page, while the other contains a single variation. This variant, for example a call-to-action button or a header image, is the element being tested. During an A/B test, 50% of site visitors or subscribers are shown Version A, and 50% see Version B.
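The 50/50 split described above can be sketched in a few lines of Python. This is a minimal illustration, not a production implementation: the function and experiment names are hypothetical, and real testing platforms handle assignment for you. Hashing the visitor ID keeps the assignment stable, so a returning visitor always sees the same version.

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str) -> str:
    """Deterministically assign a visitor to Version A or Version B.

    Hashing the visitor ID together with the experiment name gives a
    stable 50/50 split: the same visitor always lands in the same bucket.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100  # 0-99
    return "A" if bucket < 50 else "B"

# The assignment is stable across repeat visits by the same visitor.
print(assign_variant("visitor-42", "checkout-cta"))
```

Deterministic bucketing matters because showing a visitor different versions on different visits would contaminate the results.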
It is important to understand that an A/B test requires more than just creating a variation of the original page. It’s about identifying a problem and providing a hypothesis, backed by data, and then testing a new experience based on that groundwork.
When used properly, tests like this take the guesswork out of important business decisions. Testing brings behavioural data to the foreground, so that every aspect of your digital strategy is founded on insight rather than presumed knowledge about your customers. The results of A/B tests are evaluated based on whether the page containing the variant achieves a higher conversion rate. In this way, the success of proposed changes can be measured and customer preferences understood, while the impact of things that don’t work is limited.
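Comparing the two conversion rates is usually done with a standard two-proportion z-test. The sketch below, using only the Python standard library, shows the idea; the function name is illustrative, and in practice your testing platform or a statistics library would perform this calculation.

```python
from math import sqrt, erf

def conversion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Two-sided p-value for the difference between two conversion rates
    (a standard two-proportion z-test)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Convert the z-score to a two-sided p-value via the normal CDF.
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# 200 conversions from 10,000 control visitors vs 260 from 10,000
# variant visitors: is the uplift statistically significant?
p_value = conversion_z_test(200, 10_000, 260, 10_000)
print(round(p_value, 4))
```

A p-value below your chosen threshold (commonly 0.05) suggests the difference between the versions is unlikely to be random noise.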
Benefits of A/B testing
Because there are so many different elements to an ecommerce site, each site presents a multitude of opportunities for testing. This means there is massive potential for improvement. You might want to complete a full site overhaul, or to optimise your checkout process to add transparency and smooth the booking journey. Either way, changes can be validated by experimenting with one layout element at a time.
The conversion rate of your airport ecommerce site naturally has a direct impact on revenue, which means the results of A/B testing can be measured straightforwardly in terms of the bottom line. These figures are not about lead generation, sign-ups or brochure downloads; they are about sales and only sales. Successful tests can reduce cart abandonment rates and increase basket value. It’s even possible to analyse user behaviour to find out what interests your visitors and what doesn’t.
For example, ReplaceDirect reduced cart abandonment by 25% using A/B testing. Responding to research that showed high or unexpected shipping costs as the top reason for cart abandonment, the Dutch ecommerce business trialled versions of their site that told the customer early on that there would be no shipping costs. The company also tested the checkout page, adding an order summary, complete with total costs and delivery date.
A/B testing and conversion rates
Every page on an ecommerce website prompts visitors to take a desired action. Conversions include:
- Making a purchase: This is the primary and most important conversion for an ecommerce store
- Subscribing to a service
- Submitting a survey response: Useful for customer feedback and advocacy
- Signing up for a newsletter
For each webpage, A/B tests can be used to increase the conversion rate of the desired action. While simply increasing traffic can build sales, A/B testing systematically optimises each page to convert the visitors you already have. Almost any aspect of your site can be A/B tested, and sometimes the smallest change can lead to a game-changing result.
How do I launch an A/B test?
Do your research: Study competitors’ websites and examine internal data to find areas that could potentially be improved with testing.
Ask industry experts for advice. Rezcomm has access to the data of a quarter of a billion passengers worldwide. The insight you can gain by consulting such a specialist in airport ecommerce will help you to develop and optimise many opportunities for ancillary revenue and improved customer service.
Based on your findings, create a hypothesis to be tested. For example, “Our shipping costs are not transparent and this is causing a high instance of cart abandonments, therefore if shipping costs are prominently displayed early on in the customer journey, the buyer is more likely to complete the purchase.” This was precisely the starting point used by ReplaceDirect from which the company reduced cart abandonment by 25%.
Think of A/B testing as a scientific experiment. You must determine the variable and set proper controls to gain an accurate conclusion. In the example above, the variable is the visibility of shipping costs. By comparing conversion rates from the new version of the page with conversion rates from the original, ‘control’ page, you will gain an accurate result. Once you have selected the variable that you will test, all the other elements of the website must be kept identical or results will be unclear.
Don’t run your A/B test indefinitely; set strict parameters determining the length of time and number of visitors needed for an accurate, reliable answer. It is important to ensure that the sample is large enough to produce statistically valid results.
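The required sample size can be estimated before the test starts using the standard formula for comparing two proportions. The sketch below is an illustration, not a substitute for a proper power analysis; the function name is hypothetical, and the default z-values correspond to the common choices of 95% confidence and 80% power.

```python
from math import ceil

def sample_size_per_variant(baseline: float, mde: float,
                            z_alpha: float = 1.96, z_power: float = 0.8416) -> int:
    """Visitors needed in *each* variant to detect an absolute uplift of
    `mde` over a `baseline` conversion rate.

    Defaults assume 95% confidence (z = 1.96) and 80% power (z = 0.8416),
    using the two-proportion sample-size formula.
    """
    p1, p2 = baseline, baseline + mde
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n = (z_alpha + z_power) ** 2 * variance / mde ** 2
    return ceil(n)

# e.g. a 2% baseline conversion rate, hoping to detect a lift to 2.5%
print(sample_size_per_variant(0.02, 0.005))
```

Note how quickly the required sample grows as the effect you want to detect shrinks; this is why low-traffic pages are poor candidates for testing small changes.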
Include aspects of personalisation in your tests. This will help guide the creation of a custom user experience where your site is tailored to the individual user – an aspect of the digital experience that is increasingly vital to customer satisfaction and successful conversion.
A/B best practices
In order to get the most out of A/B testing, tests should be managed according to several ‘best practice’ principles. The trick to accuracy is to keep things simple:
Since each test page is only live for a limited time, use a temporary 302 redirect rather than a permanent 301 redirect.
Don’t serve one set of website content to human visitors and another to robots. This technique, known as ‘cloaking’, is sometimes used to avoid negative SEO impact such as a loss of Google search ranking. However, it will render any tests pointless, because the necessary 50/50 feedback from real customers and prospects is missing.
Run each test for only as long as is necessary to gather enough data to see whether Version A or Version B leads to more conversions. As soon as you have the information you need, implement any changes and move forward to another test.
The business intelligence and analytics functions in your Rezcomm airport ecommerce platform will allow you to easily track any tests in real time, so you can be sure that your key business decisions are based on clear customer data.
Accurate A/B testing needs enough traffic, experience and information to test quickly and with high confidence. It also requires the resources to set up tests correctly. Done badly, it can produce misleading results and lead to misguided business decisions. With the support of Rezcomm’s tools and expertise, there’s every opportunity to achieve sophisticated optimisation that really makes a difference to your airport and its customers.
Rezcomm’s team of experts can guide your airport through the process of split testing every aspect of your ecommerce site and email campaigns. If you have any questions about how you can use our integrated platform to increase revenue and build customer loyalty, contact us for a chat today.