Some of these false premises are actually admitted to be true in the article... wtf?
Also, a lot of the quotes are not so much against A/B testing as against people who claim A/B testing is a panacea and who recommend the wrong tools to the wrong people (i.e. tools built around static content to people with entirely dynamically generated content).
Basically, the biggest problem with A/B testing is that some of its noisiest proponents have no idea what they're talking about. (This is true of a lot of things.)
The biggest problem I have with A/B or split testing is that it makes it easy for inexperienced people to assume there are multiple right answers to every problem and that they all warrant 'discussion' or testing. Most often this is not the case, and people who jump straight to A/B-testing everything are the sort who don't really have (or trust) any experience and would rather spin their wheels and throw spaghetti at the wall.
That said, sometimes there are two (or more) options that warrant testing (say, a marketing strategy), and then it works great.
Test strategy, don't test everything, don't test design principles.
I've always thought the problem with A/B testing is that it's mostly hill-climbing optimization, which is likely to get stuck at a local optimum.
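The local-optimum worry above can be made concrete with a tiny sketch. This is an illustration, not anything from the article: the score function, peaks, and step size are all invented. Greedy iteration on small neighboring changes (the analogue of incremental A/B tweaks) climbs whichever peak is nearest, even when a much better one exists elsewhere.

```python
def score(x):
    # Invented "conversion rate" landscape with two peaks:
    # a local one at x = 2 (height 0.10) and a global one at x = 8 (height 0.30).
    return (max(0.0, 1.0 - (x - 2) ** 2 / 4) * 0.10
            + max(0.0, 1.0 - (x - 8) ** 2 / 4) * 0.30)

def hill_climb(x, step=0.5, iters=100):
    """Greedy hill climbing: keep the best of {x - step, x, x + step}."""
    for _ in range(iters):
        best = max([x - step, x, x + step], key=score)
        if best == x:
            break  # no neighboring variant beats the current one
        x = best
    return x

print(hill_climb(1.0))  # starts near the small peak and gets stuck at x = 2.0
print(hill_climb(6.0))  # a different starting point reaches the better peak at x = 8.0
```

The same greedy process lands on very different outcomes depending only on where it starts, which is exactly why testing only small local variations can quietly lock a design into a mediocre hilltop.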
This is a good example that disproves many of the false premises mentioned in the OP, as the clear winner of the A/B tests was a bold dynamic design with lots of character, that was completely different from the original designs.
You could A/B test two completely different business models if you had the resources to do so. Your imagination and your bank account are the meaningful limits on what you can test, not the technology.
He addresses exactly that point with False Premise #2. It really depends on how you use A/B testing: if you only use it to test small incremental changes then yes, you're correct, but there's nothing stopping you from testing two totally different features/designs with A/B testing.
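To illustrate the point that the mechanics of A/B testing don't care how different the two variants are: a minimal sketch of a two-proportion z-test using only the standard library. "A" and "B" here could be incremental tweaks or entirely different designs; the traffic and conversion numbers are invented for illustration.

```python
from math import erf, sqrt

def ab_test(conv_a, n_a, conv_b, n_b):
    """Two-sided p-value for a difference in conversion rates
    between variant A (conv_a successes out of n_a visitors)
    and variant B (conv_b successes out of n_b visitors)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled proportion under the null hypothesis that A and B convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Two-sided p-value from the standard normal CDF, via math.erf.
    return 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))

# Invented example: 5.0% vs 6.9% conversion on 2,400 visitors each.
p = ab_test(conv_a=120, n_a=2400, conv_b=165, n_b=2400)
print(round(p, 4))  # a small p-value suggests the difference is real
```

Nothing in the arithmetic knows whether B is a new button color or a completely new business model; as the comment says, the limits are resources and imagination, not the technique.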
This would be 100x more useful if it were "Avoiding common pitfalls of A/B testing", since the author admits that poor usage of A/B testing makes most of the premises true.
Common sense is often only common sense in retrospect!