This guide will walk you through how to use A/B testing in Builder. A/B testing is a data-driven approach to testing and measuring different content and experiences, helping you make smart, impactful improvements to your site or app.
To run an A/B test, first create different variations of a page or piece of content so you can compare metrics between them.
In the Builder editor, click the A/B Tests tab and select Add A/B Test Variation.
You can rename your new variation (titled Variation 1 by default) and adjust the test ratio. Click Add A/B Test Variation again to create more variants for multivariate testing.
Navigating back to the Edit tab, you can toggle between your variations and make adjustments. When ready, click Publish to start the test, or schedule it to publish later. You can learn more about scheduling here.
Once a test is published and receiving traffic, you cannot add or remove variants. This ensures that the results of your test aren't impacted by data from previous tests or earlier versions of the content.
👉Note: The allocation is random based on the test ratio. We track the test group via cookies; however, users who have disabled tracking are not assigned to a test group.
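To make the allocation behavior concrete, here is a rough sketch of ratio-based assignment. This is our own illustration, not Builder's actual implementation; the `pickVariant` name and `Variant` shape are hypothetical.

```typescript
// Illustrative sketch of ratio-based variant assignment (not Builder's code).
// Each variant has a traffic share; shares should sum to 1.
interface Variant {
  id: string;
  share: number; // fraction of traffic, e.g. 0.5 for a 50/50 split
}

// Pick a variant given a random draw in [0, 1). Passing the draw in
// keeps the function deterministic and easy to test.
function pickVariant(variants: Variant[], draw: number): string {
  let cumulative = 0;
  for (const v of variants) {
    cumulative += v.share;
    if (draw < cumulative) return v.id;
  }
  // Fall back to the last variant to guard against rounding error.
  return variants[variants.length - 1].id;
}

// In practice the chosen id would be stored in a cookie so the visitor
// keeps seeing the same variant on subsequent page views.
const variants: Variant[] = [
  { id: 'default', share: 0.5 },
  { id: 'variation-1', share: 0.5 },
];
const assigned = pickVariant(variants, Math.random());
```

Storing the result in a cookie is what keeps each visitor's experience consistent across page views; visitors with tracking disabled never get a cookie, which is why they fall outside the test groups.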
A/B Test Metrics
To see how your test is performing, click the A/B Tests tab, then click the "View Results" button underneath the sliders. This takes you to the Insights tab, where you can see results for the test. Early on you will not see many visitors, because the test has not been running long enough, but after a while you will see statistics on how your variations are doing. It will look like the sample data below.
Note that for conversion metrics on custom tech stacks, you'll need to integrate conversion tracking. For Shopify-hosted stores, this integration is done for you automatically.
👉Note: We calculate conversions based on impressions; a conversion is only trackable for visitors who have first had an impression of Builder content.
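The note above can be sketched in code: a conversion is attributed to whichever variant the visitor was shown, so without a recorded impression there is nothing to credit. This is our own simplified illustration, not Builder's tracking implementation; the function and variable names are hypothetical.

```typescript
// Simplified sketch of impression-based conversion attribution
// (ours, not Builder's implementation).
const impressions = new Map<string, string>(); // visitorId -> variantId
const conversions = new Map<string, number>(); // variantId -> count

function recordImpression(visitorId: string, variantId: string): void {
  impressions.set(visitorId, variantId);
}

// Returns the variant credited with the conversion, or null when the
// visitor never saw Builder content (no impression, nothing to credit).
function recordConversion(visitorId: string): string | null {
  const variantId = impressions.get(visitorId);
  if (variantId === undefined) return null;
  conversions.set(variantId, (conversions.get(variantId) ?? 0) + 1);
  return variantId;
}
```

This is also why conversion totals only reflect visitors who actually saw the tested content, rather than all traffic to your store.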
We can use this data to determine which variation is performing better by comparing the conversion stats between the variations. Depending on what is important to you and your store, you might focus on conversion rate, number of conversions, or total conversion value. Or maybe you are just interested in the number of clicks a particular version got. Whatever metric you choose, make sure the test runs long enough to collect a meaningful number of visitors.
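As a worked example of the comparison above, here is a minimal sketch of picking a winner by conversion rate. The `VariantStats` shape and function names are our own for illustration, not part of Builder's API.

```typescript
// Minimal sketch of comparing variants by conversion rate
// (illustrative only; not Builder's API).
interface VariantStats {
  name: string;
  visitors: number;
  conversions: number;
}

// Conversion rate = conversions / visitors, guarding against zero traffic.
function conversionRate(s: VariantStats): number {
  return s.visitors === 0 ? 0 : s.conversions / s.visitors;
}

// Returns the variant with the highest conversion rate.
function bestByConversionRate(stats: VariantStats[]): VariantStats {
  return stats.reduce((best, s) =>
    conversionRate(s) > conversionRate(best) ? s : best
  );
}

// Example: 30/1000 (3%) vs 45/1000 (4.5%) -- variation-1 wins on rate.
const results: VariantStats[] = [
  { name: 'default', visitors: 1000, conversions: 30 },
  { name: 'variation-1', visitors: 1000, conversions: 45 },
];
const winner = bestByConversionRate(results);
```

The same pattern applies if you rank by total conversion value or clicks instead; just swap the metric used in the comparison.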
Once you have determined the winning variation, it's time to roll it out to all your visitors. If you only want to show the winning variation to new visitors, or want to keep all the variations around to possibly use in the future, adjust the sliders in the A/B Tests tab to the desired ratios. If you would rather remove every variation except the winner, end the test as described in the next section.
Ending an A/B test
When you are ready to end your test, navigate back to the A/B Test tab and click "Choose Winning Variation". This will open a modal that allows you to select a winning variation and remove all others, ending the test.
👉Tip: To run a new A/B test, you can duplicate the page and add new variants. If you duplicate a page before ending the test, the variants will be copied to the new page. If you duplicate after a winner has been chosen, the test variants will not be copied.
Why is our solution different?
- Zero performance impact. Unlike other A/B testing tools, our A/B tests do not block rendering or slow down your site.
- The Builder editor serves as a single source of truth, pairing its easy drag-and-drop interface with a built-in metrics page
- SEO friendly!
- No DOM-blocking manipulation or flashing of unwanted content, ever