It’s really hard to accelerate A/B tests: a trustworthy result requires a certain amount of traffic, and you usually can’t conjure more. Unfortunately, it’s also desirable to accelerate A/B testing, to get people excited and to show that you’re making progress. Here are some alternatives:

Try not testing. Seriously. You will have to accept the possibility that the changes you make won’t work. (Which you have to accept anyway when you A/B test, even if failure is less likely.) But can you find some best practices, or do you have historical research, that can help you? For example:

  • Use a library of completed tests, if you have access to one, to give you guidance

  • Follow research-based best practices. Usability sites like the Nielsen Norman Group often have great advice that is grounded in data

  • Do user testing instead of A/B testing. User-testing services will let you record people as they navigate a page or site, which might give you valuable insight that you can use

  • Put your effort elsewhere entirely. Let’s say you have run a bunch of tests on a page that are showing very little impact: in the best case, you get an extra 100 conversions a month. You could also write a great blog post that has a 10% chance of bringing in 1,000 conversions a month. Do that instead.
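The comparison in that last bullet is just an expected-value calculation. A minimal sketch, using the made-up numbers from the example above:

```python
# Hypothetical figures from the example above, not real data.
test_best_case = 100            # extra conversions/month if further tests pan out
blog_success_prob = 0.10        # chance the blog post is a hit
blog_payoff = 1000              # conversions/month if it is

# Expected conversions from writing the post instead of testing.
blog_expected = blog_success_prob * blog_payoff

# The post matches the tests' *best case* in expectation, so in
# expected value it's at least as good, with far more upside.
print(blog_expected)  # 100.0
```

The point isn’t the arithmetic; it’s that the opportunity cost of grinding out marginal tests is easy to underestimate.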

Do more radical tests, and fewer of them. It’s possible that you’ve hit a local maximum, and further tweaks have a high chance of never showing a significant result. Instead of tweaking, redesign something altogether, and test that.

Relax the significance threshold (accept a higher chance of a false positive) or lower the power (accept a higher chance of missing a real effect, i.e., a false negative); trade certainty for time.
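To see how much time that trade actually buys, the standard sample-size approximation for a two-proportion test makes it concrete. This is a rough sketch with made-up numbers (a 5% baseline conversion rate and a one-percentage-point minimum detectable effect), using only the Python standard library:

```python
from statistics import NormalDist

def sample_size_per_arm(p_base, mde, alpha=0.05, power=0.8):
    """Approximate visitors needed per arm for a two-proportion z-test.

    p_base: baseline conversion rate
    mde:    minimum detectable effect (absolute, e.g. 0.01 for one point)
    """
    z_alpha = NormalDist().inv_cdf(1 - alpha / 2)  # two-sided significance
    z_power = NormalDist().inv_cdf(power)
    # Pooled variance of the two arms' conversion rates.
    variance = p_base * (1 - p_base) + (p_base + mde) * (1 - p_base - mde)
    return (z_alpha + z_power) ** 2 * variance / mde ** 2

strict = sample_size_per_arm(0.05, 0.01, alpha=0.05, power=0.8)
relaxed = sample_size_per_arm(0.05, 0.01, alpha=0.10, power=0.7)
print(round(strict), round(relaxed))  # the relaxed test needs far fewer visitors
```

With these numbers, relaxing significance from 5% to 10% and power from 80% to 70% cuts the required sample, and therefore the runtime of the test, by roughly 40%. The flip side is that more of your "winners" will be noise and more real effects will slip past undetected.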

Think hard about whether you are running the right tests. I was once asked to run tests on a page that actually couldn’t be changed for political reasons. I have also been asked to run many, many tests on the same page, to avoid a hard conversation about priorities and strategy. You probably won’t decide the design of a page based on an A/B test. That should be a strategic brand decision.