Notes on an A/B test

We often run A/B or multi-variant tests on pages to see if we can maximise our click-through rates, and banner testing is something we do extensively.
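
As a bit of background, the mechanics of a rotation like this can be very simple: bucket each visitor deterministically into a variant, so a returning visitor always sees the same banner, and log impressions and clicks per variant. Here is a minimal sketch of that idea in Python; the variant labels and hashing scheme are illustrative rather than a description of our actual setup:

```python
import hashlib

BANNERS = ["A", "B", "C"]  # illustrative variant labels

def assign_banner(visitor_id: str) -> str:
    """Deterministically bucket a visitor into one of the banner variants.

    Hashing the visitor ID (rather than picking at random on every page
    view) means the same visitor always sees the same banner, keeping
    impressions and clicks attributable to a single variant.
    """
    digest = hashlib.sha256(visitor_id.encode("utf-8")).hexdigest()
    bucket = int(digest, 16) % len(BANNERS)
    return BANNERS[bucket]

print(assign_banner("visitor-12345"))  # e.g. "B"
```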

Earlier this week, one of the hotel merchants we work with was running a 33%-off January sale, and we turned our banner test into an internal competition to see who could guess which sale banner would perform best for the client over two days. Normally we get a lot of correct answers, but this test was a little unusual, if only because everybody got it wrong!

Here are the three sale banners we rotated:

[Banner images: A, B and C]

Each banner was viewed over 500 times, with varying click-through rates: the worst performer came in at 16% and the best at 21.5%. Normally a test like this would need to run for at least 1,000 impressions to give any really conclusive results, but the short time span of the offer meant we had to go on the clicks we had.
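
For anyone wanting to sanity-check numbers like these, a two-proportion z-test is one common way to ask whether a gap in CTR could just be noise. The sketch below compares the best and worst banners, assuming 500 impressions each (the figures above only say "over 500", so the exact counts are my assumption):

```python
from math import erf, sqrt

def two_proportion_z_test(clicks_a, views_a, clicks_b, views_b):
    """Two-sided two-proportion z-test for a difference in CTR."""
    p_a = clicks_a / views_a
    p_b = clicks_b / views_b
    # Pooled click rate under the null hypothesis of no difference.
    p_pool = (clicks_a + clicks_b) / (views_a + views_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / views_a + 1 / views_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Assumed counts: 500 impressions per banner, CTRs of roughly 21.5% and 16%.
z, p = two_proportion_z_test(clicks_a=108, views_a=500, clicks_b=80, views_b=500)
print(f"z = {z:.2f}, p = {p:.3f}")
```

On these assumed figures, the best-versus-worst gap just clears conventional significance (p ≈ 0.02), but the gaps between adjacent banners do not come close, which is exactly why we would normally want more impressions before calling a winner.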

Everyone who answered sent me the same prediction: C. Many of them stated that the ‘Ends 31st’ messaging would create more of a sense of urgency and increase the CTR.

However, the test results said otherwise.

In 1st place – B) ‘2012 Sale – Book by 31st’ – 21.5% CTR

In 2nd place – A) ‘Jan Sale – Book by 31st’ – 18.9% CTR

In 3rd place – C) ‘Jan Sale – Ends 31st’ – 16% CTR

Now, this may have been an anomaly, with not enough data to draw a sure-fire conclusion, but I think the text in ‘B’ summed up the nature of the offer best: the 33% off sale was not restricted to holidays taken in January; the holiday just needed to be booked in January. The ‘2012’ intro to ‘B’ implied that 33% off offers would be available over the whole year. I have no real explanation for why ‘C’ was the worst performer; perhaps the wording ‘Ends 31st’ rather than ‘Book by 31st’ reinforced a perception that the offer applied only to holidays taken in January.

We would normally collate a larger volume of clicks before feeling confident about the results. However, what this test really highlighted to me is that just by running it in the first place we improved the CTR on the banner, since both variations outperformed our ‘original’, and that the traditional ‘proven’ messaging so often assumed to be high-converting should be challenged and tested at every opportunity.