A/B Testing

Run site-wide A/B tests in Nyla

Nyla's A/B testing feature enables you to run A/B tests on your site. You set up your test in Nyla's UI, configure a corresponding experiment in GA4 (where you also access the results), and then publish the winning variant in the Nyla App.

Important: Sending A/B test results to GA4 is supported by default through Nyla's native GA4 integration. However, if you load GA4 via Google Tag Manager, you will need to configure your container to send the experiment details to GA4.
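If you're unsure what that container configuration involves, the sketch below shows one common pattern: the page pushes experiment details to the GTM dataLayer, and a GA4 event tag in your container maps them to event parameters. The event name ab_test_impression and the keys experiment_id and variant_id are illustrative assumptions, not Nyla's documented schema, so check your Nyla GA4 integration settings for the exact names.

    // TypeScript sketch (non-module script). Assumes GTM is installed on the page.
    // Merge a dataLayer field into the global Window type.
    interface Window { dataLayer?: Record<string, unknown>[]; }

    window.dataLayer = window.dataLayer || [];

    function reportExperiment(experimentId: string, variantId: string): void {
      // In GTM: add a Custom Event trigger for "ab_test_impression" and a
      // GA4 Event tag that maps these dataLayer variables to event parameters.
      window.dataLayer!.push({
        event: "ab_test_impression",   // hypothetical event name
        experiment_id: experimentId,   // hypothetical parameter key
        variant_id: variantId,         // hypothetical parameter key
      });
    }

    // Example: record that this visitor saw variant "001" of a homepage test.
    reportExperiment("homepage-test", "001");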

It's quick to pick up. The following 25-second screen recording gives you the gist:

[Screen recording: running an A/B test in Nyla]

Running A/B tests in Nyla: 

Important: Your site cannot be reverted to its original state after running an A/B test; the only way to stop a test is to publish a winner. For that reason, always make variant A your control variant (i.e. the state that your site is currently in), so that if you decide to stop the test you can publish variant A and continue with your site exactly as it is.

To run an A/B test you should: 

1. Click A/B tests in the left navigation bar

2. You'll then see a new UI where you can set up your test variants

3. Create up to 3 variants. Your variant A should be the control. 

4. Save and queue each variant in the A/B test. 

5. Once you are ready, move to the publish screen in the A/B test and publish your test.

6. Follow these instructions to set up your experiment in GA4

To access results, view the corresponding experiment in GA4

7. When you are ready to end the test, publish the winner of your A/B test.

Locked entities during an A/B test

If an entity in Nyla (e.g. a template or a page) has an active A/B test running, it will be locked for editing until the test ends, so that the test can run as intended.

Running multiple simultaneous A/B tests in Nyla: 

It is possible to run multiple A/B tests at the same time, although the tests you run will not be mutually exclusive: a visitor may be included in more than one test at once.

Important: When running simultaneous A/B tests, ensure that no page is shared between tests; any crossover will distort your results.

If there is any chance that your A/B tests will cross over, run them sequentially rather than in parallel.

How traffic is divided for simultaneous A/B tests:

Users are assigned a variant for a test when they land on a page running that test, or when a page they visit links to a page running a test.
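As a rough illustration of how sticky assignment like this can work (an assumption-laden sketch, not Nyla's actual implementation), a visitor's variant is typically chosen at random on first exposure and then persisted, for example in a first-party cookie, so they keep the same variant on every visit:

    // Illustrative sketch only, not Nyla's implementation. Shows one common
    // pattern: assign a variant at random on first exposure, then persist it
    // in a first-party cookie so the visitor keeps that variant.
    function getOrAssignVariant(testId: string, variants: string[]): string {
      const cookieName = `ab_${testId}`; // hypothetical cookie name
      const existing = document.cookie
        .split("; ")
        .find((c) => c.startsWith(cookieName + "="))
        ?.split("=")[1];
      if (existing) return existing; // returning visitor keeps their variant

      // First exposure: pick uniformly at random and persist for 30 days.
      const variant = variants[Math.floor(Math.random() * variants.length)];
      document.cookie = `${cookieName}=${variant}; path=/; max-age=${60 * 60 * 24 * 30}`;
      return variant;
    }

    // A visitor landing on a page running the homepage test:
    const homepageVariant = getOrAssignVariant("homepage-test", ["000", "001"]);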

Impact of tests that overlap with one another

If you run two tests on the same page, this will skew the results of both tests.

Let's say you run two A/B tests at the same time, one on your homepage and one on your footer. In both tests, the A variant is assigned 000 and the B variant is assigned 001:

Test 1: A is assigned 000, B is assigned 001

Test 2: A is assigned 000, B is assigned 001

Because the homepage and the footer appear on the same page, users will be assigned either 000 (variant A) or 001 (variant B) for the homepage test.

Because the footer is present on that same page, the same assignment (000 or 001) simultaneously applies to the footer test.

Therefore, users will be in 000 for both tests or in 001 for both tests, which means the split of results will be:

    • Test 1 A + Test 2 A: 50%

    • Test 1 A + Test 2 B: 0%

    • Test 1 B + Test 2 A: 0%

    • Test 1 B + Test 2 B: 50%

This results in an unfair test.

On the other hand, if the tests are run with no crossover between them, users can be assigned 000 on one page running one test and 001 on a different page running another test. Calling the first test's variants A and B and the second test's variants C and D, you'd end up with a split like this, provided you have enough traffic:

    • A-C: 25%

    • A-D: 25%

    • B-C: 25%

    • B-D: 25%
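To make the arithmetic concrete, here is a small self-contained simulation (an illustration of the probability argument above, not Nyla's code) comparing the two scenarios: one shared assignment driving both tests on the same page, versus independent assignments on separate pages.

    // Simulates the two scenarios above. Illustrative only, not Nyla's code.
    function pick(): 0 | 1 {
      return Math.random() < 0.5 ? 0 : 1;
    }

    const N = 100_000;
    const samePage = { bothFirst: 0, bothSecond: 0, mixed: 0 };
    const separatePages = { bothFirst: 0, bothSecond: 0, mixed: 0 };

    for (let i = 0; i < N; i++) {
      // Same page: a single assignment drives both tests, so they always match.
      const shared = pick();
      if (shared === 0) samePage.bothFirst++;
      else samePage.bothSecond++;

      // Separate pages: each test draws its own independent assignment.
      const t1 = pick();
      const t2 = pick();
      if (t1 !== t2) separatePages.mixed++;
      else if (t1 === 0) separatePages.bothFirst++;
      else separatePages.bothSecond++;
    }

    // Expected: samePage ~ 50% / 50% / 0%: the mixed combinations never occur.
    // Expected: separatePages ~ 25% / 25% / 50% mixed (25% each way).
    console.log(samePage, separatePages);

The mixed bucket in the independent case splits evenly between the two mixed combinations (A-D and B-C), giving the 25/25/25/25 split listed above.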

Limitations: 

It is not possible to test app settings*, page settings or site settings via Nyla A/B testing.

*App settings accessed via Apps > Your App cannot be A/B tested; however, content items on pages can be included in A/B tests.

Note: If you are A/B testing changes to a page template, we suggest not creating new pages from that template while your test is running.

New pages created from the template will use the currently published version of the template rather than the changes being tested.