Okay, so right off the bat—I’ll admit it…

A big mistake I made in A/B testing last month cost me a Ferrari California, which, if you’re unfamiliar with the model, carries an MSRP of a mere $192,000.

Admittedly, that looks like a hefty amount of cash (for most individuals, it is), but to most agencies it’s a fairly small amount. However, everyone in the performance marketing space knows that this “paltry” sum can mean the difference between a break-even campaign and a wildly successful one. Let me show you what I did, how we identified it, and how we resolved it.

The $200,000 Mistake

We ran an A/B test between two final landing pages for three weeks. Now, this was a new vertical, and at the time I was still working hard to learn the ins and outs of that specific market…so I’m going to bank on that as my excuse. Anyway…

On Friday we launched the two final versions: one was left-oriented, one was right-oriented, and beyond that we had two color variations on the call-to-action buttons. By Monday morning it looked like we had a clear winner; Test A was converting nearly 60% better than Test B.

So this was Landing Page A—it started off in the lead and we were pretty confident it was going to win.

However, by Wednesday Page A was down 40% against Page B. Take a look at a quick mockup of Page B to get an idea of the difference between the two. By Friday, Page B was still showing a 40% conversion increase over Page A. So we waited a while to see if things evened out, and prepped to set the winner live to all our traffic. Two weeks later, Page B was still the champion.
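In hindsight, this is exactly why an early “clear winner” is so dangerous: a weekend’s worth of traffic is a small (and, as you’ll see, skewed) sample. Here’s a minimal sketch of a standard two-proportion significance check that makes the point. To be clear, the visitor and conversion counts below are made up purely for illustration; I’m not sharing our actual campaign numbers.

```python
# A minimal sketch of a two-proportion z-test, using only the standard
# library (Python 3.8+). The visitor/conversion counts below are
# hypothetical -- they are NOT our campaign's real numbers.
from math import sqrt
from statistics import NormalDist

def two_proportion_p_value(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for the difference between two conversion rates."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    pooled = (conv_a + conv_b) / (n_a + n_b)               # pooled rate
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))  # standard error
    z = (p_a - p_b) / se
    return 2 * (1 - NormalDist().cdf(abs(z)))

# A "Monday morning" look: A converting ~60% better than B (12% vs 7.5%)
# on a weekend's worth of traffic.
print(two_proportion_p_value(conv_a=48, n_a=400, conv_b=30, n_b=400))  # ~0.03
```

A p-value like that looks conclusive, but it only describes the traffic you’ve sampled so far. If your weekend visitors differ from your weekday visitors, the “winner” can flip the moment the mix changes, which is exactly what we watched happen.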

We rationalized that the green button, right alignment, etc. were simply industry best practices, so of course that version was going to be a success.

Needless to say, come Monday morning the conversion rate was down again. We’d seen that before, so we left the page running for the entire week, and sure enough it went right back to where it was supposed to be. Still, I couldn’t shake the lingering feeling that something had gone wrong.

What Went Wrong

To put it simply, the base assumptions we made about the product were incorrect. We presumed that our users fit neatly within one demographic, since the product was geared towards teenagers. But we overlooked the fact that teenager-geared products really have two user groups: teens and parents.

We pored over the data that weekend, trying to make sense of why our weekend traffic seemed to prefer one design while our weekday traffic went an entirely different way. In hindsight, the answer was glaringly obvious. Can you figure it out?

The Answer: Our Weekend Users and Weekday Users Were Different Demographics

It really was that simple: our weekday users tended to be parents looking for information for their kids during a few minutes of downtime at work, while our weekend users were the teenagers themselves, doing their own research.

Once we realized that, we dug a bit deeper and found that our conversion rate at night was lower too. Same story: the teenagers who were home after school weren’t responding as well to our landing page.
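If you want to run this check on your own campaigns, here’s a minimal sketch of the segmentation that exposes the pattern. It assumes a hypothetical per-visit event log (events.csv) with a timestamp, a variant label, and a converted flag; the file and column names are illustrative, not pulled from our actual stack.

```python
# A minimal segmentation sketch, assuming a hypothetical events.csv with
# columns: timestamp, variant ("A"/"B"), converted (0/1).
import pandas as pd

df = pd.read_csv("events.csv", parse_dates=["timestamp"])
df["is_weekend"] = df["timestamp"].dt.dayofweek >= 5          # Sat=5, Sun=6
df["daypart"] = pd.cut(
    df["timestamp"].dt.hour,
    bins=[0, 9, 17, 24],                                      # rough dayparts
    labels=["night/morning", "work hours", "evening"],
    right=False,
)

# Conversion rate per variant within each segment. The blended A-vs-B
# number we were staring at hides exactly this kind of split.
per_segment = (
    df.groupby(["is_weekend", "daypart", "variant"], observed=True)["converted"]
      .mean()
)
print(per_segment)
```

If the per-segment winners disagree with your blended winner, you’re looking at the same trap we fell into.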

The Solution: Dayparted Landing Pages

I’m almost embarrassed at how easy this was to fix. We ran two more tests: one for weekday daytime hours, and one for weekday evenings and weekends. We found that our assumption was absolutely correct: our teen audience and parent audience were interested in different things, not just design-wise but copy-wise. We dayparted our media buys and their respective landing pages, and like magic, everything started moving in the right direction. It’s a newer campaign, but with the right segmentation in play, we’ll easily make $200,000 more on it in May than we did in April if we don’t change anything else.
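For the curious, the scheduling rule itself is trivial. Here’s a rough sketch with hypothetical landing page URLs and schedule boundaries; in reality this lives in your ad platform’s ad-scheduling settings or on your server, but the logic is the same.

```python
# A rough sketch of the dayparting rule, with hypothetical landing page
# URLs -- the actual pages and schedule boundaries are your own.
from datetime import datetime

PARENT_PAGE = "/landing/parents"   # weekday work hours
TEEN_PAGE = "/landing/teens"       # evenings, after school, and weekends

def pick_landing_page(now: datetime) -> str:
    is_weekend = now.weekday() >= 5          # Sat=5, Sun=6
    is_work_hours = 9 <= now.hour < 17
    if not is_weekend and is_work_hours:
        return PARENT_PAGE                   # parents browsing from work
    return TEEN_PAGE                         # teens doing their own research

print(pick_landing_page(datetime(2014, 5, 5, 11)))  # Monday 11am -> /landing/parents
```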

Just to be clear, we love testing. The point I’m trying to make here is that not all tests are created equal. If you don’t clearly define your questions and your segments, at best your tests could be meaningless, and at worst, they could cost you a brand-new Ferrari California.

