Thursday, July 12, 2012

A-B testing isn’t a-bsolutely right for everyone


Not for the first time, Wired magazine has been responsible for exposing me to an exciting new idea with a range of possible applications beyond its original use.
In this case it’s the notion of A-B testing featured in the June 2012 issue of Wired UK.
Put simply, instead of deciding in advance which of several proposed website designs is best, some firms or organisations put two or more of them live, split the website traffic evenly between them, and wait for the resulting sales/conversion/hits data to show which one the users or customers prefer. Once that’s clear, the winner runs solus. Simples!
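To make that concrete, here’s a minimal sketch in Python of how a site might bucket visitors between two designs and then check whether the gap in conversion rates is more than noise. Everything here is illustrative, not anyone’s production code: the function names, the visitor numbers, and the choice of a simple two-proportion z-test are all assumptions for the sake of the example.

import hashlib
import math

def assign_variant(visitor_id: str, variants=("A", "B")) -> str:
    # Hash the visitor ID so the same visitor always sees the same
    # design (a plain random choice would split traffic evenly too,
    # but wouldn't be "sticky" across repeat visits).
    digest = hashlib.md5(visitor_id.encode("utf-8")).hexdigest()
    return variants[int(digest, 16) % len(variants)]

def z_test(conversions_a, visitors_a, conversions_b, visitors_b):
    # Two-proportion z-test: is the difference between the two
    # conversion rates bigger than chance alone would explain?
    p_a = conversions_a / visitors_a
    p_b = conversions_b / visitors_b
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (p_a - p_b) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided
    return z, p_value

# Made-up numbers: 4,000 visitors saw each design.
z, p = z_test(conversions_a=180, visitors_a=4000,
              conversions_b=230, visitors_b=4000)
print(f"z = {z:.2f}, p = {p:.4f}")  # a small p suggests the gap is real

The principle is just what the paragraph above describes: split evenly, count, compare, and only declare a winner once the numbers say the difference isn’t a fluke.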
It’s a neat idea that could be applied in lots of other areas, such as direct mail (one or more test postcodes could receive different versions of a mailing and the response rates compared), TV and radio advertising, or even newspaper page design (different geographical editions could carry different versions of a limited number of pages).
But there are some areas where you wouldn’t want to use this approach. Principally, those where the audience isn’t homogeneous (that is, not of equal value to you or your client), or where it is a high-value group, such as key accounts, that you can’t risk exposing to anything other than the best possible representation of your organisation or client: a bad impression left by the ‘losing’ design could cost you or them a lot of money.
Doubtless, there are lots more areas where this would work really well. But think carefully before using it.
