
Instructor

Mike Greenfield

Founder, Data Geek, Entrepreneur

Transcript

Lesson: Data-Driven Decisions with Mike Greenfield

Step #5 Frequency: A/B testing vs holistic design

We did a lot of frequency testing at Circle of Moms, and we did it holistically. We didn't just say, "Okay, let's turn on more emails and see if people click more." If you turn on more emails, they're probably going to click more in the short term. The big question is whether, in the long term, there's going to be a drop-off. Let's say we were sending one email a week and we wanted to test what would happen if we sent two. We would turn on two emails for a month, or two or three, and we would see over that longer period what the effect of sending two emails versus one was.
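The long-horizon comparison described above could be sketched like this: track clicks and unsubscribes for each cohort over the whole test window, not just the first week, so a short-term click lift that decays into fatigue still shows up. All cohort names and numbers below are hypothetical, purely for illustration.

```python
def summarize(cohort):
    """Aggregate click and unsubscribe counts over the full test window."""
    return {
        "clicks": sum(week["clicks"] for week in cohort),
        "unsubscribes": sum(week["unsubscribes"] for week in cohort),
    }

# Weekly observations over a four-week test (made-up data).
one_per_week = [
    {"clicks": 120, "unsubscribes": 2},
    {"clicks": 118, "unsubscribes": 1},
    {"clicks": 121, "unsubscribes": 2},
    {"clicks": 119, "unsubscribes": 1},
]
two_per_week = [
    {"clicks": 180, "unsubscribes": 9},   # short-term lift...
    {"clicks": 150, "unsubscribes": 12},
    {"clicks": 130, "unsubscribes": 15},
    {"clicks": 110, "unsubscribes": 18},  # ...that decays as users tire
]

print(summarize(one_per_week))  # {'clicks': 478, 'unsubscribes': 6}
print(summarize(two_per_week))  # {'clicks': 570, 'unsubscribes': 54}
```

With numbers like these, two emails per week "wins" on raw clicks but loses nine times as many subscribers, which is exactly the trade-off a one-week test would hide.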

I don't think there's a one-size-fits-all answer for how much email you can send. Before starting Circle of Moms, I was at LinkedIn. When I was at LinkedIn, we had very little in the way of high-quality email to send people. We could have tested sending one email versus two emails versus three emails, and I think what we would have seen is a low click-through rate for one email, a low click-through rate for two emails, and a low click-through rate for three emails. Nowadays LinkedIn sends a ton of emails. I think they're able to get away with that because the quality is fairly high and they have so much content that they can send people.

It's not that you should send one email, or that you should send two emails. If you have enough high-quality content, you can send more, but you need to test and see whether your second-best piece of content is going to drive a lot of clicks or not.

It's an interesting question, A/B testing versus holistic design. I think there's a camp of designers who are inclined to think that A/B testing means you're only optimizing the button color, only optimizing these tiny little pieces. If you optimize a page in 50 tiny little pieces, you're going to wind up with a whole that's a jumbled mess.

That's certainly a risk, if you A/B test in the wrong way. I think that the right way, in many cases, to A/B test is you have an old design which you think is okay, then you would have something new that is an innovation, that is a big change and you can test that new design against the old design more holistically.

If it's a home page, you're probably not trying to optimize one specific action. Rather than testing one change and saying, "How does it impact our sign-ups or invitations?" you're saying, "We have an old homepage, we have a new homepage. We're going to see how they compare against each other." Chances are, some things are going to be better on the new homepage and some things are going to be worse. If you run that test, you may see that three things are up and two things are down. One of those two things that's down you're okay with, and the other one you're really not okay with.

If you test in that way you can say, "Okay, let's refine our new homepage. Let's see if we can get that number that was down in the new design back up." Then you come up with a new version of the new page, you push that out to users, you see how people engage with it. Then hopefully, ultimately, maybe after a couple of iterations of that, you wind up with a new home page that is generally a better performer than the old home page. There may be some numbers that are down, but it’s not being done one little pixel at a time.
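The holistic comparison above can be sketched as a multi-metric report: instead of optimizing one number, you look at every metric at once and flag which drops you can live with and which ones send you back for another iteration. The metric names, conversion rates, and the `tolerable_drops` parameter are all hypothetical.

```python
def compare(old, new, tolerable_drops=()):
    """Classify each metric as up, down-but-acceptable, or a blocker."""
    report = {}
    for metric in old:
        delta = new[metric] - old[metric]
        if delta >= 0:
            report[metric] = "up"
        elif metric in tolerable_drops:
            report[metric] = "down (acceptable)"
        else:
            report[metric] = "down (needs another iteration)"
    return report

# Made-up conversion rates for the old and new homepage.
old_home = {"signups": 0.041, "invites": 0.012, "logins": 0.30,
            "profile_edits": 0.08, "shares": 0.020}
new_home = {"signups": 0.048, "invites": 0.015, "logins": 0.33,
            "profile_edits": 0.06, "shares": 0.015}

print(compare(old_home, new_home, tolerable_drops={"shares"}))
```

In this made-up run, three metrics are up, "shares" is down but acceptable, and "profile_edits" is the one you'd refine the new design to recover before shipping.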
