A/B Testing: Knowing What Works Doesn’t Tell You Why

04/04/2012

I read a great post today called “Throw Everything you Know About Ads Out the Window”. The author describes how he ran a very simple test of two ads to see which would work better. You can see the two ads here.

The first ad was very professional looking, with good-looking graphics, nice fonts, and a green call-to-action button. The second ad was, in his words, “some shit ad I made in 5 mins in Microsoft Paint”: a hand-drawn picture of a car with the handwritten words “Need for Speed!!! Play free!!” He tested the two ads for 15K impressions each and found the low-tech ad generated a click-through rate of 0.137% versus 0.049% for the more professional-looking ad.

Whoa. That’s quite a difference.
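Before acting on a gap like that, it’s worth checking whether it could plausibly be chance. Here is a minimal sketch of a pooled two-proportion z-test; the click counts (roughly 21 and 7) are my assumption, back-solved from the quoted rates and 15K impressions, not figures reported in the post:

```python
import math

def two_proportion_ztest(clicks_a, n_a, clicks_b, n_b):
    """Pooled two-proportion z-test; returns (z, two-sided p-value)."""
    p_a, p_b = clicks_a / n_a, clicks_b / n_b
    p_pool = (clicks_a + clicks_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))  # two-sided, via normal tail
    return z, p_value

# Assumed counts: 0.137% of 15,000 ≈ 21 clicks; 0.049% of 15,000 ≈ 7 clicks.
z, p = two_proportion_ztest(21, 15_000, 7, 15_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With those assumed counts the difference clears the conventional 5% significance bar, which tells us the effect is probably real — but, as the rest of this post argues, not why it happened.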

His conclusion, that “every idea you have is worth testing no matter how crappy it is,” is a smart one in my opinion, but it’s also trickier than it might sound.

Your Test Results Tell You What Works but Not Why

So we know the second ad generated a better CTR. Now what? Here is a list of reasons the second one might have gotten more clicks:

  1. Novelty – we are inundated with ads and something that looks very different is interesting and click-worthy.
  2. Free – research shows that this is a bit of a magic word for folks, and it appears much more prominently in the second ad.
  3. Less Text – the second ad had much less text and is easier to read.
  4. Single Image – A single image might make the ad easier to process.
  5. Simplicity and Flow – The second ad is much simpler and it flows simply from top to bottom. The “professional” ad is more complex and flows right to left and top to bottom.
  6. Weird Psychology – Maybe that hand-drawn ad reminded us of the doodles we made when we were 5 years old, and a Proustian nostalgia swept over us and gosh darn it, we just had to click!!

OK, so the last one isn’t all that likely, but hey, anything’s possible. So what does this test tell us? It tells us that we can improve our CTR in one (or maybe many) of these ways. It gives us some clues about which hypotheses to test next, but without those follow-up tests, the “why” behind the increased CTR is not clear.

CTR Is Not the Same as Conversions

Another point worth making is that there was no mention of conversions after the clicks. Looking at the possible reasons the CTR might have been higher, I could see some folks clicking just to see what the heck this crazy ad is all about without being serious about taking any further action. While the test proved the second ad generated more clicks, it did not prove the second ad “worked” better from a business perspective.
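To make that concrete, here is a sketch with entirely hypothetical funnel numbers (the post reports none): a higher-CTR ad can still lose once you measure signups per impression, which is closer to what the business cares about.

```python
# Hypothetical funnel numbers -- NOT from the original post -- showing how
# a higher CTR can coexist with worse end-to-end performance.
ads = {
    "professional": {"impressions": 15_000, "clicks": 7,  "signups": 3},
    "ms_paint":     {"impressions": 15_000, "clicks": 21, "signups": 2},
}

for name, f in ads.items():
    ctr = f["clicks"] / f["impressions"]          # click-through rate
    conv = f["signups"] / f["clicks"]             # click -> signup rate
    per_imp = f["signups"] / f["impressions"]     # the metric that pays bills
    print(f"{name}: CTR {ctr:.3%}, signup rate {conv:.1%}, "
          f"signups/impression {per_imp:.4%}")
```

In this made-up scenario the MS Paint ad wins on CTR (three times the clicks) but the professional ad drives more signups per impression, so picking a winner on CTR alone would pick the wrong ad.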