A/B testing


Today's Dilbert:

An earlier strip expressing negative opinions about "A-B tests":

If you wonder what Scott Adams means by "A-B testing", read this and this — which seems to align with the standard interpretation of the phrase.

I hope our IRB understands the difference between this and (say) the ABX paradigm used in perception research. Not that Dogbert's attitude towards "A-B Testing" ought to influence them to worry about the effects of such marketing-oriented tests, anyhow…

Another Dilbert strip that attacks testing, from a different angle:



5 Comments

  1. languagehat said,

    March 15, 2014 @ 8:55 am

    Your "standard interpretation" link goes to one of the Dilbert strips you show in the post; was that intentional?

    [(myl) No -- a slip of the mouse, apparently -- it's fixed now.]

  2. Peter Seibel said,

    March 15, 2014 @ 10:30 am

    I'm not really sure what Adams is on about with these strips. The only thing that rings true to me is the third frame of the second strip–the natural tendency for people to go with their own opinions, even in the face of contrary data. But A/B testing doesn't seem to me to be about manipulating people toward a particular outcome. Rather it's about "manipulating" people to do some higher-level thing such as buying books or clicking ads. People doing A/B testing well don't care whether you click the green or orange button; they go with whatever works better, i.e. whatever gets more clicks, if that's what they want.

    [(myl) I'm also puzzled, though it's kind of funny to fantasize that imposing "forced choice" on subjects makes sadistic test-designers cackle maniacally.]

  3. Craig said,

    March 15, 2014 @ 12:10 pm

    I think the idea in the first strip is that if A/B testing reveals irrational preferences (such as people wanting to click an orange button rather than a green one), you can take advantage of that to influence people's choices by associating the option you want them to take with things that are favored by their irrational preferences. Advertisers, for decades, have tried to associate random products with sex to get people to buy them, and political activists like to paint themselves as being "for" something rather than "against" something else ("We're not anti-abortion, we're pro-life!" And isn't everyone in favor of life, in the abstract?) because people generally respond better to something they perceive as positive. Adams seems to be suggesting that there could be other biases discoverable through A/B testing that could be used to influence choices. Naturally he presents this in an exaggerated, comedic fashion (this is a comic strip, after all), but the basic idea doesn't seem far-fetched; it seems rather obvious, actually.

  4. Eric Ringger said,

    March 15, 2014 @ 4:04 pm

    Kohavi et al. make the argument: http://dl.acm.org/citation.cfm?id=1281295

  5. Christophe Chaudey said,

    March 17, 2014 @ 12:41 pm

    Thanks for these funny comics!
    A/B testing is very important if you want to grow your business, especially if you use it to improve your marketing techniques. However, there are a few other uses of A/B testing that are less well known but still very useful. In search engine optimisation, for example, you can test several titles for an article to see which one ranks best.
    Another good example is in UI: if you want to improve the usability of an application, you can do some A/B testing to see which variant is easiest to use, even if both have the same conversion rate. The idea is that you don't test a colour, a placement, or a size, but a process, a way of doing something. You might be looking for a faster way to reach a goal.

    Basically, the whole concept of A/B testing helps you get better results in whatever you are studying. It definitely is something to try, maybe even in real life in some cases :)
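[An aside on the mechanics: the "whatever gets more clicks" comparison the commenters describe usually comes down to a two-proportion z-test on the two variants' conversion rates. A minimal sketch, with entirely made-up counts (not data from any real test):]

```python
from math import sqrt, erfc

def ab_test(conv_a, n_a, conv_b, n_b):
    """Two-proportion z-test: is variant B's conversion rate
    significantly different from variant A's?"""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)            # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    p_value = erfc(abs(z) / sqrt(2))                    # two-sided p-value
    return z, p_value

# Hypothetical numbers: 2400 visitors per arm, 120 vs. 150 clicks.
z, p = ab_test(120, 2400, 150, 2400)
print(f"z = {z:.2f}, p = {p:.3f}")  # the "winner" only counts if p is small
```

[With these invented counts the orange button looks better, but the difference would not clear a conventional 0.05 threshold — which is exactly why the testers, rather than Dogbert's intuition, get the last word.]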
