item 4655617

A/B Testing the Effect of Sender Race on Email Response Rates

24 points | ezl | 13 years ago | blog.ezliu.com

25 comments

[+] dantillberg | 13 years ago
You spammed a million people for this? Enough that over a hundred thousand people spent the time and energy to write an email reply? Please do justice to the collective time you took up with the survey.

Could you break down the data by name and/or gender of both the sender and the respondent? Test multiple hypotheses simultaneously? For example, in page 1002 of the referenced research article (http://faculty.chicagobooth.edu/marianne.bertrand/research/p...), they break down a whole slew of other qualities that may have affected response rate.

It's entirely possible that the name choice for your study has more of an effect than the perceived race association of those names. The "99.9% confidence" cited is really not that, and is subject to various biases which are hard to discern.
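A minimal sketch of the multiple-comparisons point above. The number of name pairs is hypothetical (the article doesn't state it); the point is that a headline "99.9% confidence" weakens quickly once several hypotheses are tested at once:

```python
# Bonferroni correction sketch (hypothetical k): when k comparisons are
# run simultaneously, each individual test must clear a stricter
# per-test alpha for the family-wide confidence claim to hold.
k = 10                       # assumed number of name/gender comparisons
alpha = 0.001                # overall level implied by "99.9% confidence"
per_test_alpha = alpha / k   # each test must now clear 0.0001
print(per_test_alpha)
```

Bonferroni is conservative, but it illustrates why a single-test confidence figure overstates certainty when many name pairs are compared.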

[+] ohashi | 13 years ago
The spamming part really bugged me too. It seemed like he harvested these emails rather than any form of opt-in. This seems like bad science in general.
[+] jules | 13 years ago
Assuming on average 10 minutes was spent by the receiving end, the OP has just killed 0.24 human lives to conduct this study. Or if we value a human life at $7 million [1], the damage is about $1.7 million.

[1] http://en.wikipedia.org/wiki/Value_of_life
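The 0.24-lives figure can be roughly reconstructed. The exact assumptions aren't stated in the comment, but one plausible reading (100,000 replies at 10 minutes each, measured against a hypothetical 40-year working life of ~80,000 hours) lands in the same ballpark:

```python
# Back-of-envelope reconstruction; all inputs are assumptions except
# the $7M value-of-life figure and the reply count from the thread.
replies = 100_000                     # "over a hundred thousand" replies
minutes_each = 10                     # assumed time per reply
hours = replies * minutes_each / 60   # ~16,667 hours of recipients' time
working_life_hours = 40 * 2000        # assumed 40-year working life
lives = hours / working_life_hours    # ~0.21 working lives
value_of_life = 7_000_000             # $7M, per the linked figure
damage = lives * value_of_life        # ~$1.5M
print(round(lives, 2), round(damage))
```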

[+] e-dard | 13 years ago
I get annoyed when I read things like: "A performed 4.9% better than B", when the comparison is between two proportions.

Just say it like it is, without putting a slant on it - A's response rate was 0.6% higher than B's.

I don't care if the results are significantly different, if the difference between the two samples is so small.

[+] ezl | 13 years ago
i don't really understand why this is "putting a slant on it".

citing an absolute difference isn't very useful when you're talking about differences in conversion rates.

    * 0.6% -> 1.2% is a 0.6-point bump.
    * 49.4% -> 50.0% is a 0.6-point bump.
however, in the first case, you're talking about doubling your conversion rate.

it sounds like you're saying that if someone increases their landing page conversion rate by 1 percentage point (from 1% to 2%) you'd rather hear that the conversion rate increased by a point.

the business owner is probably more interested in the fact that their revenues doubled.
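The distinction being drawn here can be made concrete. Note the base rates below (12.2% vs 12.8%) are not from the article; they're back-derived from the thread's 0.6-point absolute and ~4.9% relative figures, so treat them as illustrative:

```python
def absolute_diff(p_a, p_b):
    """Difference in percentage points."""
    return p_a - p_b

def relative_diff(p_a, p_b):
    """Relative lift of A over B."""
    return (p_a - p_b) / p_b

# 12.2% vs 12.8%: a 0.6-point absolute gap is only a ~4.9% relative lift
print(absolute_diff(12.8, 12.2))   # ~0.6 points
print(relative_diff(12.8, 12.2))   # ~0.049

# 0.6% vs 1.2%: the same 0.6-point gap doubles the conversion rate
print(relative_diff(1.2, 0.6))     # ~1.0, i.e. a 100% relative lift
```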

[+] hnr | 13 years ago
This is an important point concerning Effect Sizes: "I don't care if the results are significantly different, if the difference between the two samples is so small". People do get too wrapped up in statistical significance and forget about practical significance.

However, the relative difference (4.9%) is the relevant metric to be looking at as noted by "ezl" in a comment.
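The significance-vs-practical-significance point is easy to demonstrate with a standard pooled two-proportion z statistic. The volumes and rates below are hypothetical (500,000 emails per arm, with the 12.2%/12.8% rates back-derived from the thread's figures): at this scale even a tiny gap is overwhelmingly "significant".

```python
import math

def two_proportion_z(successes_a, n_a, successes_b, n_b):
    """Pooled two-proportion z statistic (textbook formula)."""
    p_a, p_b = successes_a / n_a, successes_b / n_b
    pooled = (successes_a + successes_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Hypothetical: 500k emails per arm, response rates 12.8% vs 12.2%
z = two_proportion_z(64_000, 500_000, 61_000, 500_000)
print(z)  # ~9: far beyond any conventional threshold, for a 0.6-point gap
```

With samples this large, a statistically decisive result says nothing about whether the effect matters in practice.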

[+] jcr | 13 years ago
Eric, the write-up is great, but it would be better if you provided the supporting data. The research is interesting, but sending out a million emails with tracking bugs is a bit, umm, questionable when one considers the time/effort wasted by the recipients. If everyone ran similar experiments, it would make a real mess, so providing the data you collected could also be beneficial in reducing the load.
[+] hammock | 13 years ago
I applaud you for doing the research, and I'll probably end up referencing it in my work at some point. Surely there are a number of holes to poke (as with anything); the one that stands out to me at the moment is that you didn't control for the race of the recipient. I.e. if your overall recipient list was 50% Hispanic, then even though you randomized who got what, you'd still expect the open rate for Hispanic-named senders to be higher.
[+] ezl | 13 years ago
op here. i should own up. this is an apology.

@dantillberg, @jcr, et al: you're right.

i heard about the original study, was curious, but not enough to think much of it. in a previous startup we had a female intern who was getting substantially better response rates than the male founders.

after the recent press about how women in senior roles correlates with startup success i became a bit more curious and wondered if i could craft the perfect "from" field for outgoing emails.

i admit this was aggressive and that i got carried away. that's no excuse.

[+] rdwallis | 13 years ago
I assume the author was just trying to set up the premise before getting to the meat of the article, but the opening paragraph's claim that Americans are more sensitive about discrimination than anybody else is probably false, and more than a little ironic.
[+] biznickman | 13 years ago
File this one under "tests that shouldn't have been conducted in the first place"
[+] slig | 13 years ago
PC aside, why not?
[+] tzs | 13 years ago
You should have also included unconventional names not usually associated with black people, such as Moon Unit, Starshine, Whalesong, and other such "hippy" names, or names associated with poor rural white people.
[+] jarin | 13 years ago
I would also like to see the effect of East Indian names. My name isn't from India (it is from Thailand), but it is often mistaken for being Indian or Arabic.
[+] reinhardt | 13 years ago
Offtopic but I did a double take on this: "Former options trader. Passionate QBASIC developer." Subtle irony or what?
[+] jere | 13 years ago
Frankly, I'm not really surprised that Antonio Banderas commanded a higher response rate.