A/B Testing

A/B testing is a research method that compares two versions of a design to determine which one works better for your users.

Updated on January 27th, 2023


When A/B testing, each version has a single difference compared to the other. You determine which version performs best based on pre-determined objectives or key performance indicators like the number of clicks, time on page, or conversion rate.

An example of two A/B testing variations of a design component

You can A/B test any element of your design. However, the most common candidates are colors, headers, calls to action, images, and layout structure. A/B testing is a great way to get fast, quantitative feedback and can be repeated as often as needed.

Why should you do an A/B test?

The aim of doing A/B testing is to optimize the design of your product. When you do, it can help you achieve business goals quicker while creating a more user-friendly product.

Instead of doing countless usability tests that require a lot of money and time, you can leverage the traffic you already have to reach the same outcome. It's just quicker and cheaper to do.

How to do an A/B test

Doing an A/B test well isn't as easy as it sounds. You need to figure out what to test, how to do it, and what to test for. Here's a step-by-step guide on how to do an A/B test.

1. Decide what you want to test

You can use A/B testing for any element of your design. For example, let's say you're working on an e-commerce project and want to increase the conversion rate.

That's a solid goal for any business owner, as a higher conversion rate will increase the number of people who buy something. You could consider the following elements for your test:

  • The shape, colors, and copy of your call-to-action.

  • Content like headers and paragraphs.

  • Forms: the number of fields, labels, and layout.

  • User flow: the number of steps required to complete a purchase.

  • The layout of the page and how elements are placed.

2. Set up the A/B test

Now that you've decided what element to A/B test, it's time to set the test up. Create two versions of your design, each containing a different variation of the element you want to test. Remember to change only that one element.

It's also important to identify the results you want to get before starting the test. Are you looking to increase the conversion rate, as mentioned in the example above? Or maybe the time your users spend using your product?

Each requires different ways of testing. For conversion rate testing, you can use Hotjar, for example. Testing the time-on-page can be done using analytics tools like Google Analytics.

You can run multiple tests at the same time, but they cannot be dependent on each other. As such, they need to be on separate pages to prevent overlap.
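One common way to split traffic between the two versions is to assign each user to a variant by hashing a stable user ID. This is a minimal sketch, not tied to any particular testing tool; the experiment name and user IDs are hypothetical. Including the experiment name in the hash keeps assignments independent between tests, which supports running multiple non-overlapping tests:

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "cta-color") -> str:
    """Deterministically assign a user to variant A or B.

    Hashing the user ID together with the experiment name means the
    same user always sees the same version, and assignments for one
    experiment don't correlate with assignments for another.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100  # a number from 0 to 99
    return "A" if bucket < 50 else "B"
```

Because the assignment is deterministic, a returning visitor keeps seeing the same variant across sessions, which keeps the collected data consistent.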

3. Start gathering data

Start the A/B test on your website and gather data from actual user sessions. Then, leave each test version online for the same amount of time.

Most A/B tests are online for multiple days or even weeks to gather enough data. That's a requirement if you want to be able to make a relevant conclusion from your test.

Several factors will affect how long it takes to run the test:

  • The number of visitors: the average number of users who visit the pages under test. More traffic means you reach a reliable conclusion sooner.

  • The goal-to-success ratio: how many of these users were already taking the desired action before the test started. Lower baseline rates require more data.

  • The number of variations: A/B testing usually compares only two variations at a time, while multivariate testing can compare many more.

Another way to run your A/B test is to give 25% of your users one version of your design and 25% the other. Then, once you know which version works best, you can give the remaining 50% of your users that version.
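The factors above can be turned into a rough estimate of how many users each variant needs before the test is trustworthy. This is a minimal sketch of the standard sample-size formula for comparing two proportions, assuming the conventional defaults of 95% confidence and 80% power (z-values 1.96 and 0.84):

```python
import math

def sample_size_per_variant(baseline: float, mde: float,
                            z_alpha: float = 1.96, z_beta: float = 0.84) -> int:
    """Rough sample size needed per variant for a two-proportion test.

    baseline: current conversion rate, e.g. 0.05 for 5%
    mde: minimum detectable effect in absolute terms,
         e.g. 0.01 to detect a one-percentage-point lift
    """
    p1, p2 = baseline, baseline + mde
    # Combined variance of the two conversion rates being compared
    variance = p1 * (1 - p1) + p2 * (1 - p2)
    n = ((z_alpha + z_beta) ** 2) * variance / (mde ** 2)
    return math.ceil(n)
```

For example, with a 5% baseline conversion rate and a minimum detectable lift of one percentage point, this gives on the order of 8,000 users per variant; dividing that by your daily traffic tells you roughly how many days the test needs to run.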

4. Analyze your data

First of all, you should only draw conclusions from the test once it has been exposed to a statistically significant number of users. This number depends on your total number of users and the margin of error you're willing to accept.

If you don't follow this "rule," the A/B test could result in an unreliable outcome.

If the experiment confirms the test's hypothesis, you can implement the change. On the other hand, an unexpected result is never a waste of time: it still tells you which variation of your design performs better. However, you might have to rethink the hypothesis and run another test.
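To judge whether the difference between the two versions is statistically significant, one standard approach is a two-proportion z-test. This is a minimal sketch, assuming you have tracked the number of visitors and conversions for each variant; the example figures in the usage note are hypothetical:

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """Return the z-statistic comparing two conversion rates.

    An absolute value above 1.96 corresponds to statistical
    significance at the 95% confidence level.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled conversion rate under the null hypothesis (no difference)
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (p_b - p_a) / se
```

For instance, if version A converted 500 of 10,000 visitors and version B converted 600 of 10,000, the resulting z-statistic exceeds 1.96, so the difference would be significant at the 95% level.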

Useful resources

A/B Testing Guide - VWO

A/B testing - Optimizely

15 of the Best A/B Testing Tools for 2023 - HubSpot


