Monday, January 30, 2017

the power of humble a/b testing

I often joke that my company is "the engineers running the asylum" because there's not much "product management" as a separate discipline; engineering figures out what to do, and does it. But it's not as bad as all that; we are highly data-driven, and have a solid A/B test system in place.


So one engineer had a thought that "Subscribe" with a pin icon wasn't the clearest UI, and that we might get better results with alternate wording. So we took the top bar of this:

And changed it to that:
A/B tests (called "view versions" here) are enumerated in a Java file, and then we have a page that lists them all, lets the developer say what percentage of traffic should be sent to each one, and has an "assign to me" button that ensures the developer ALWAYS gets put in that group:
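A minimal sketch of what such a Java enumeration of "view versions" might look like; the names, variants, and percentages below are invented for illustration, not the company's actual code:

```java
/**
 * Hypothetical sketch of enumerating A/B tests ("view versions") in Java.
 * Each version carries the share of traffic routed to it, which the admin
 * page described above would let a developer adjust.
 */
public enum ViewVersion {
    PIN_SUBSCRIBE(40),      // original: pushpin icon + "Subscribe"
    BELL_EMAIL_ALERTS(30),  // bell icon + "email alerts" wording
    ENVELOPE_SUBSCRIBE(30); // envelope icon, keeping the word "Subscribe"

    private final int trafficPercent; // percentage of traffic for this version

    ViewVersion(int trafficPercent) {
        this.trafficPercent = trafficPercent;
    }

    public int trafficPercent() {
        return trafficPercent;
    }
}
```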


For each of those, within each traffic percentage (say, 30%), half gets sent to GROUPNAME as the test, and the other half gets sent to GROUPNAME_CONTROL. Then you can pull up logs and look at the results.
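The split described above can be sketched as a stable hash-based bucketing function; this is an assumed illustration (the method, class, and argument names are invented), not the actual implementation:

```java
import java.util.Locale;

/**
 * Minimal sketch of test/control bucketing: a user hashes to a stable
 * bucket in [0, 100); if the bucket falls inside the version's traffic
 * percentage, the first half of that slice is the test group and the
 * second half is its matching control.
 */
public class Bucketer {

    /** Returns GROUPNAME, GROUPNAME_CONTROL, or null if outside the slice. */
    public static String assign(String userId, String groupName, int percent) {
        // Stable pseudo-random value in [0, 100) derived from the user id,
        // salted with the group name so different tests split independently.
        int bucket = Math.floorMod((userId + ":" + groupName).hashCode(), 100);
        if (bucket >= percent) {
            return null; // not in this experiment at all
        }
        String name = groupName.toUpperCase(Locale.ROOT);
        // First half of the slice is the test, second half the control.
        return bucket < percent / 2 ? name : name + "_CONTROL";
    }

    public static void main(String[] args) {
        System.out.println(assign("user-42", "bell_subscribe", 30));
    }
}
```

Salting the hash with the group name keeps a user's assignment in one experiment independent of their assignment in another.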

The results are often counter-intuitive. Stuff that seems like a solid UI win gets poorer results. (As happened here: switching to a bell and "email alerts" from a weird pushpin and "Subscribe" got fewer clicks, and so did trying an email-envelope icon while keeping the word "subscribe".) Or you might be looking at the wrong result. Like, maybe you don't want more sales leads if they're all crappy leads dragging down the overall conversion rate.

Still, it's good that not everything is just developer or product manager guessing...
