Running an A/B Test

A/B Testing Overview

RMC offers two types of testing: Classic and Targeting Actions.

Classic Testing is an advanced feature built on the existing "Set inner HTML of an HTML Element" targeted action. It allows you to test different JavaScript code for each variation on your website against the default page.
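
For example, a Classic variation is typically a short snippet that rewrites part of the page. The sketch below is purely illustrative; the selector and the replacement copy are assumptions, not part of RMC:

    // Illustrative Classic variation: swap the hero headline.
    // The selector and replacement text are assumptions.
    var headline = document.querySelector('.hero-title');
    if (headline) {
      headline.innerHTML = 'Free shipping on every order';
    }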

Targeting Actions Testing lets you test one or more versions of your Targeted Actions to determine which experience produces the best results with different audiences.


Haven't defined your personalised experiences yet? You may want to create Targeting Actions first to set up your personalisation campaigns. Once you do, you'll be able to test different versions of your targeted experiences to find the messages, offers and content that convert best.



Creating an A/B Test 

  1. Click Target in the top menu bar.

  2. Select A/B Testing.

  3. Click New A/B Testing.



Classic and Targeting Actions Testing 

  1. Select the Test Type: Classic or Targeting Actions. See A/B Testing Overview for details.

    If you choose to test one or more Targeting Actions, select them from the list of actions you have previously created.

    Make sure the A/B Testing option is turned on so that you can see the list of Targeting Actions available for testing. See Testing and Optimising Targeted Actions for more information.

  2. In the Title box, type a name. It is recommended that you use a name that uniquely identifies the A/B test and allows you to recognise it quickly and easily.

  3. Select the Status: Enable or Disable the A/B test on your site.



  4. Add the URL(s) of the Page(s) where you want to run the A/B Testing campaign.

    You can test a single action across multiple pages: an action can be tied to many locations on your site(s) but tested as a single variable.


  5. Select the Conversion Event (Goal). Your conversion goals are the metrics you use to determine whether the variation is more successful than the original version. You can add custom conversion events such as clicks on links, page views, email signups and so on.
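
    Conceptually, a click-based goal fires a conversion event when a visitor clicks the element being measured. The sketch below is illustrative only; the selector, the goal name and the trackConversion helper are assumptions, not RMC's API:

      // Hypothetical sketch of a click-based conversion goal.
      // trackConversion stands in for whatever mechanism records
      // the goal; it is NOT an RMC function.
      function trackConversion(goalName) {
        console.log('conversion recorded:', goalName);
      }

      var signupLink = document.querySelector('#signup-link'); // assumed selector
      if (signupLink) {
        signupLink.addEventListener('click', function () {
          trackConversion('email_signup');
        });
      }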

  6. Enter the Start Date and Time of your A/B Testing campaign.

    You can stop your test once your variation reaches statistical significance, or once the improvement rate is high enough to draw an accurate conclusion.


  7. Select the size of your Control Group as a percentage.

    Control Group selection allows you to choose the percentage of visitors who will not see the changes being tested. This gives you a point of reference by which to judge your conversion goals (improvement rate).
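
    As a rough sketch of the idea (not RMC's internal implementation), assignment might look like the following; the storage key and bucket names are assumptions:

      // Illustrative only: split visitors between control and variation,
      // keeping each visitor in the same bucket across page views.
      function getBucket(controlPercentage) {
        var bucket = localStorage.getItem('ab_bucket'); // key name is an assumption
        if (!bucket) {
          bucket = Math.random() * 100 < controlPercentage ? 'control' : 'variation';
          localStorage.setItem('ab_bucket', bucket);
        }
        return bucket;
      }

      // Example: a 10% control group keeps seeing the original page.
      var bucket = getBucket(10);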


  8. Click Add Variants to create variations.
    1. Adding Variants for Classic Testing: add the JavaScript code for each variation you want to test, as in the illustrative sketch after this list.


    2. Adding Variants for Testing Targeting Actions: add the different Targeted Actions you want to test. 
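
    For Classic Testing, each variant carries its own snippet. A purely illustrative pair, with the selector and button copy as assumptions:

      // Variant A snippet (illustrative): shorter call to action.
      var ctaA = document.querySelector('.cta-button');
      if (ctaA) ctaA.innerHTML = 'Buy now';

      // Variant B snippet (illustrative): emphasise free returns.
      var ctaB = document.querySelector('.cta-button');
      if (ctaB) ctaB.innerHTML = 'Add to basket with free returns';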

       

  9. Use Target Rules to select your audience. If you do not use Target Rules to segment your visitors, your testing campaign will apply to all your site visitors.

    Testing the effects of actions on limited customer segments, rather than on all visitors to your site, saves you time by eliminating irrelevant data and reduces wasted conversions by not showing the actions to customers outside your intended audience. See Targeting Rules for more information.

  10. Use the AND and OR logical operators to include or exclude audiences. Rules added with the OR condition expand your audience (to include any visitor who meets your first set of rules or your second set of rules), while rules added with the AND condition narrow it (by requiring visitors to meet both the first and second sets of rules).
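
    Conceptually, the combined rules evaluate as OR across rule sets and AND within each set, as in this illustrative sketch (the visitor attributes are assumptions):

      // Illustrative only: OR across rule sets, AND within each set.
      function matchesAudience(visitor, ruleSets) {
        return ruleSets.some(function (ruleSet) {
          return ruleSet.every(function (rule) { return rule(visitor); });
        });
      }

      // Example: (returning visitor AND mobile device) OR (cart value over 100)
      var audience = [
        [function (v) { return v.isReturning; }, function (v) { return v.device === 'mobile'; }],
        [function (v) { return v.cartValue > 100; }]
      ];

      matchesAudience({ isReturning: true, device: 'mobile', cartValue: 30 }, audience);    // true
      matchesAudience({ isReturning: false, device: 'desktop', cartValue: 150 }, audience); // true
      matchesAudience({ isReturning: false, device: 'mobile', cartValue: 30 }, audience);   // false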


  11. Click Save Changes to start your test and wait for visitors to participate. Visitors are randomly assigned to either the control group or a variation of your experience. Their interactions with each experience are measured, counted and compared to determine how each performs.



Tracking your A/B Test Results 

To view the results of your A/B Tests:

  1. Go to Target > A/B Testing.  

  2. Click the Actions button next to the A/B test you want to view the report for.

  3. Select Show Report from the drop-down menu.



Analysing Test Data and Drawing Conclusions 

Analyse the A/B test results to see which variation delivered the most conversions compared with the original. RMC presents the test data in the reporting screen, showing how the different versions of your page performed and whether the difference is statistically significant.

 

  • Unique Conversions: the number of conversions based on the defined conversion event (goal).
  • Visitors: the total number of visitors for whom the action was executed.
  • Conversion Rate: conversions as a proportion of the number of visitors.
  • Improvement Rate: the improvement in conversions that the variation generated compared to the original.
  • Statistical Significance: the likelihood that the difference in conversion rates between a given variation and the original is not due to random chance. Statistical significance represents the confidence level. For example, if a variation in your A/B test yields a 95% confidence level, then if you declare it the winner, you can be 95% confident that the observed results are real and not caused by chance. It also means there is a 5% chance that you could be wrong.
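
As a rough illustration of how these metrics relate (not RMC's exact calculation), the sketch below computes the conversion rate, the improvement rate and a standard two-proportion z-test confidence level; all names and numbers are assumptions:

    // Rough sketch of how the report metrics relate; not RMC's exact formulas.
    function conversionRate(conversions, visitors) {
      return conversions / visitors;
    }

    function improvementRate(variantRate, controlRate) {
      return (variantRate - controlRate) / controlRate; // e.g. 0.30 = +30%
    }

    // Two-proportion z-test: the confidence that the variant's conversion
    // rate truly differs from the control's (e.g. 0.95 = 95% significance).
    function statisticalSignificance(convControl, visControl, convVariant, visVariant) {
      var p1 = convControl / visControl;
      var p2 = convVariant / visVariant;
      var pooled = (convControl + convVariant) / (visControl + visVariant);
      var se = Math.sqrt(pooled * (1 - pooled) * (1 / visControl + 1 / visVariant));
      var z = Math.abs(p2 - p1) / se;
      return 2 * normalCdf(z) - 1; // two-tailed confidence level
    }

    // Standard normal CDF (Abramowitz-Stegun approximation).
    function normalCdf(z) {
      var t = 1 / (1 + 0.2316419 * Math.abs(z));
      var d = 0.3989423 * Math.exp(-z * z / 2);
      var p = d * t * (0.3193815 + t * (-0.3565638 + t * (1.781478 + t * (-1.821256 + t * 1.330274))));
      return z > 0 ? 1 - p : p;
    }

    // Example: control converts 100 of 2,000 visitors; a variant converts 130 of 2,000.
    var controlRate = conversionRate(100, 2000);    // 0.05
    var variantRate = conversionRate(130, 2000);    // 0.065
    improvementRate(variantRate, controlRate);      // 0.30 -> +30% improvement
    statisticalSignificance(100, 2000, 130, 2000);  // ~0.96 -> ~96% significance

Note how sample size drives significance: the same +30% improvement measured over far fewer visitors would show a much lower confidence level.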

If there is a clear winner among the variations, go ahead with its implementation. If the test generates a negative result or no result, you can either give it more running time, or use it as a learning experience and create new targeting actions to test.


Why would you see a high Improvement percentage but a Statistical Significance of 0%? It's because your test hasn't had enough visitors yet. As more visitors encounter your variations and convert, you'll see the Statistical Significance increase.



Managing Existing A/B Tests 

The initial page of A/B Testing (Target > A/B Testing) displays a list of previously created A/B Tests.

To manage existing A/B Tests, use the following actions:

  • Edit: opens the A/B test's editing interface, where you can configure an existing A/B test campaign.
  • Enable: enables a disabled A/B test campaign.
  • End: ends a running A/B test campaign.
  • Show Report: shows the A/B test analysis and results.



