Thursday, September 20, 2018

Candy Japan's A/B Testing

The idea behind A/B testing is very simple: randomly assign users to groups A or B, change one thing between the groups, then see whether the change affects some metric of interest.
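The random-assignment step can be sketched in a few lines. This is a generic illustration, not anything from the article; the seeded-hash trick just ensures the same user always lands in the same group across sessions.

```python
import random

def assign_group(user_id: str, seed: int = 42) -> str:
    # Deterministic assignment: seed a private RNG with the user id
    # so repeat visits by the same user get the same group.
    rng = random.Random(f"{seed}:{user_id}")
    return "A" if rng.random() < 0.5 else "B"
```

With both groups assigned, you change one thing for group B and compare the metric of interest between the two.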

Candy Japan recently wrote an article on "A/B testing how to ask YouTubers for product reviews". I really like articles like these, where an author breaks down a real experiment and discusses what they learned. In this case, the experiment was emailing YouTubers to see whether they would be interested in making an "unboxing" video to review Candy Japan's product. The author varied the words used in the email and analyzed how the variations affected the rate of a positive response. In total, 180 YouTubers were emailed.

Considering the amount of effort required to message each YouTuber, the author took the opportunity to A/B test several different "splits" simultaneously. The author, unfortunately, does not discuss his methodology regarding the splits, so I'll assume that each split was even and independent of the others.
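Under that assumption, running several splits at once amounts to a small factorial design: each YouTuber gets an independent coin flip for each split. A minimal sketch (the split names here are my own placeholders, not the author's):

```python
import random

# Hypothetical split names for illustration only.
SPLITS = ["call_to_action", "elevator_pitch", "discount_offer"]

def assign_splits(youtubers, seed=0):
    rng = random.Random(seed)
    # Each split is an independent coin flip, so with 180 YouTubers
    # each variant of each split reaches roughly 90 recipients.
    return {yt: {s: rng.random() < 0.5 for s in SPLITS} for yt in youtubers}
```

The appeal of this design is that one batch of 180 emails yields data on every split simultaneously, at the cost of some noise if the splits interact.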

Some of the results presented were great. For example, including a call to action:
If you would like to receive a review box, please let me know your mailing address.
had the most impact, with the message reducing positive responses by more than 10%. It is a well-known fact in advertising technology that including the price of an item decreases the click-through rate. Similarly, asking for something as sensitive as a person's mailing address in the very first email can come off as creepy.

I did find some of the other results less than convincing. There was less than a 2% difference in positive response rates between including the following "elevator pitch" and not:
I run a site called Candy Japan, which sends surprise boxes of Japanese sweets to members twice a month.
Since the 180 YouTubers would need to be split into two groups of around 90 each, a 2% gap in response rates corresponds to a difference of no more than about 2 positive responses between the groups. That is far too small a difference to justify even the author's toned-down conclusion that "including an elevator pitch of your service may help."
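To see why a 2-response gap carries essentially no signal, here is a quick two-proportion z-test sketch. The counts are hypothetical (the article doesn't give exact numbers); I assume roughly a 28% baseline rate and a 2-response gap between groups of 90.

```python
from math import sqrt

# Hypothetical counts, for illustration only.
n_a, n_b = 90, 90
pos_a, pos_b = 26, 24  # a gap of just 2 positive responses

p_a, p_b = pos_a / n_a, pos_b / n_b
p_pool = (pos_a + pos_b) / (n_a + n_b)

# Standard two-proportion z-test statistic.
se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
z = (p_a - p_b) / se
print(round(z, 2))  # z is about 0.33, far below the 1.96 threshold
```

With a z-statistic this small, the observed gap is entirely consistent with random noise in the group assignment.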

Moreover, the difference between offering viewers a discount vs not:
I can also give your viewers a discount coupon.
was also around 1-2%. It is interesting (but understandable) that the author's takeaway there was that discounts don't really matter -- at least for getting YouTubers interested in making a video. Of course, including a discount might encourage video viewers to become new Candy Japan customers, which is the real goal.

I'm as surprised as the author that the positive response rate was over a quarter. The success is a testament to the author's effort in targeting the right channels. I wonder how results would change if he targeted a wider group of YouTubers, without the initial selection. I also wonder how he assigned channels into groups, and whether there were any correlations between the groups. Experiments like this are so difficult because there are so many features to test, and getting a large sample size is a lot of work.

The author promised a part 2, to test whether this method of advertising yields better results than buying YouTube ads. I'm very curious to see the results.

This blog post was written so that I could follow my own prompt at