Ease of Setup
A/B testing is a core capability built into the Norton Shopping Guarantee (NSG) code you will be deploying on your site. Once the code is deployed, no further effort is required on your end to enable testing. The NSG Experience Management System will automatically manage your traffic during testing.
Methodology
Shoppers today use multiple devices and clear their cookies often, which can lead to a "mixed experience" during an A/B test. The key to a successful A/B test is a methodology that gives you an accurate conversion-impact measurement for the solutions you are testing.
The NSG Experience Management System was designed with two key goals in mind: 1) reduce the "mixed experience" effect to a negligible level, and 2) provide total transparency, so that everyone can see how it works, verify it, and confirm there is no bias in the bucketing mechanism.
At a high level, the NSG Experience Management System works in a predefined and predictable fashion. A new shopper is assigned to a particular experience based on the last 2 digits of their public IP address. The system then locks the shopper into that experience using a combination of techniques, including the shopper's specific IP address and cookies. Together, these layers provide an experience that is much more stable than cookies alone.
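To make this concrete, here is a minimal sketch of that layered assignment, assuming "last 2 digits" means the final IPv4 octet modulo 100; the function names and cookie handling are illustrative, not NSG's actual implementation.

```typescript
// Illustrative sketch only; not NSG's actual implementation.
type Experience = "On" | "Off";

// Last 2 digits of the IP, assumed here to be the final IPv4 octet mod 100,
// e.g. "203.0.113.157" -> 57.
function lastTwoDigits(ip: string): number {
  const lastOctet = ip.trim().split(".").pop() ?? "0";
  return parseInt(lastOctet, 10) % 100;
}

// Layered assignment: a previously stored cookie wins; otherwise the bucket
// is derived from the IP. Persisting the result keeps the shopper in the
// same experience even if their IP changes mid-test.
function assignExperience(
  ip: string,
  bucketFn: (digits: number) => Experience,
  storedCookie?: Experience,
): Experience {
  return storedCookie ?? bucketFn(lastTwoDigits(ip));
}
```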
Transparency
During the test, all visit data is sent directly from the visitor's browser to the analytics platform of your choice (e.g., Google Analytics, Adobe Analytics, VWO, Optimizely). The data does not pass through any Norton Shopping Guarantee systems or filters; it is a direct feed from the browser to your analytics platform. Shoppers are bucketed upon arrival at their first page on the site and remain in that experience throughout the life of the test.
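As a hedged example, with Google Analytics (gtag.js) the assigned experience can be sent directly from the browser as an event parameter; the event and parameter names below are hypothetical, not a prescribed NSG schema.

```typescript
// Hypothetical reporting call; gtag.js is loaded by the standard GA snippet.
declare function gtag(...args: unknown[]): void;

type Experience = "On" | "Off";

// Send the shopper's bucket with each measurement so conversions can be
// segmented by experience in your analytics platform.
function reportExperience(experience: Experience): void {
  gtag("event", "nsg_ab_experience", {
    nsg_bucket: experience, // custom parameter; register it as a custom dimension
  });
}
```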
You will also have full visibility into the NSG Experience Management System and how it assigns shoppers to each bucket. Our On/Off bucket ranges are predefined before the A/B test begins, so you'll know exactly how traffic will be allocated before any data is collected. In all cases, we assign a visitor's experience based on the last 2 digits of their IP address. Our default 50/50 A/B test divides the full spectrum (00-99) as follows:
Highest Density | Middle Density | Lowest Density
00-09 On        | 20-39 On       | 60-79 On
10-19 Off       | 40-59 Off      | 80-99 Off
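Expressed in code, the default split is a simple range lookup over those published buckets (a sketch, not NSG's source):

```typescript
type Experience = "On" | "Off";

// Default 50/50 mapping from the table above:
// On:  00-09, 20-39, 60-79   Off: 10-19, 40-59, 80-99
function defaultBucket(digits: number): Experience {
  if (digits < 0 || digits > 99) throw new RangeError("expected a value in 00-99");
  if (digits <= 9) return "On";    // highest-density group
  if (digits <= 19) return "Off";
  if (digits <= 39) return "On";   // middle-density group
  if (digits <= 59) return "Off";
  if (digits <= 79) return "On";   // lowest-density group
  return "Off";
}
```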
We arrived at the pattern above through statistical analysis of all our existing visitor traffic. The data suggested the 00-99 spectrum fell into 3 density groupings, with IP addresses ending in numbers toward the lower end of the spectrum seeing the highest usage. This follows human nature: system administrators tend to assign IP addresses from natural starting points like .00 or .100 or .200.
Thus, we needed to split the spectrum to account for this skew. The first option was an even/odd split, but we found it non-optimal because of how some ISPs assign dynamic IP addresses: assigned addresses were often drawn from a narrow range, varying by only a couple of numbers. We therefore wanted a split algorithm with a good chance of delivering the same experience if a visitor's IP address changed by only a couple of numbers. We believe we found an optimal solution by creating 3 splits across the spectrum, each divided evenly between On and Off. This provides both a statistically relevant and visually balanced division of the available range, as the hypothetical comparison below illustrates.
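A quick hypothetical comparison shows why: under even/odd, small shifts frequently flip the experience, while the contiguous ranges only flip it when a range boundary is crossed.

```typescript
type Experience = "On" | "Off";

// Even/odd split: parity of the last 2 digits.
const evenOdd = (d: number): Experience => (d % 2 === 0 ? "On" : "Off");

// Contiguous-range split (the default pattern shown above).
const ranged = (d: number): Experience =>
  d <= 9 || (d >= 20 && d <= 39) || (d >= 60 && d <= 79) ? "On" : "Off";

// A dynamic IP reassigned within a narrow range: digits drift from 23 to 26.
console.log(evenOdd(23), evenOdd(26)); // Off On  -> experience flips
console.log(ranged(23), ranged(26));   // On  On  -> experience is stable
```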